Sample records for case series model

  1. Design considerations for case series models with exposure onset measurement error.

    PubMed

    Mohammed, Sandra M; Dalrymple, Lorien S; Sentürk, Damla; Nguyen, Danh V

    2013-02-28

    The case series model allows for estimation of the relative incidence of events, such as cardiovascular events, within a pre-specified time window after an exposure, such as an infection. The method requires only cases (individuals with events) and controls for all fixed/time-invariant confounders. The measurement error case series model extends the original case series model to handle imperfect data, where the timing of an infection (exposure) is not known precisely. In this work, we propose a method for power/sample size determination for the measurement error case series model. Extensive simulation studies are used to assess the accuracy of the proposed sample size formulas. We also examine the magnitude of the relative loss of power due to exposure onset measurement error, compared with the ideal situation where the time of exposure is measured precisely. To facilitate the design of case series studies, we provide publicly available web-based tools for determining power/sample size for both the measurement error case series model as well as the standard case series model. Copyright © 2012 John Wiley & Sons, Ltd.
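
    A simulation-based illustration of the kind of power calculation the abstract describes can be sketched for the standard case series model (one event per case, a single risk window, and exposure timing known exactly); it is not the authors' measurement-error formula, and the function and parameter names below are hypothetical.

    ```python
    # Hedged, simulation-based power sketch for the *standard* self-controlled case
    # series model with one event per case and a single risk window. This is a
    # simplified stand-in for the paper's analytic sample-size formulas and ignores
    # exposure-onset measurement error; all names are hypothetical.
    import numpy as np
    from statistics import NormalDist

    def sccs_power(n_cases, rel_incidence, risk_fraction, n_sim=5000, alpha=0.05, seed=1):
        """Estimate power to detect a relative incidence != 1 with n_cases cases."""
        rng = np.random.default_rng(seed)
        # Probability that a case's single event falls inside the risk window.
        p_in_window = (rel_incidence * risk_fraction) / (
            rel_incidence * risk_fraction + (1.0 - risk_fraction))
        z_crit = NormalDist().inv_cdf(1 - alpha / 2)
        rejections = 0
        for _ in range(n_sim):
            k = rng.binomial(n_cases, p_in_window)          # events inside the risk window
            if k == 0 or k == n_cases:
                continue                                     # log-odds undefined; count as no rejection
            log_ri_hat = np.log(k / (n_cases - k)) - np.log(risk_fraction / (1 - risk_fraction))
            se = np.sqrt(1.0 / k + 1.0 / (n_cases - k))      # SE of the log relative incidence
            if abs(log_ri_hat) / se > z_crit:
                rejections += 1
        return rejections / n_sim

    # e.g. approximate power for 150 cases, relative incidence 2, risk window = 10% of observation
    print(sccs_power(150, 2.0, 0.10))
    ```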

  2. On the equivalence of case-crossover and time series methods in environmental epidemiology.

    PubMed

    Lu, Yun; Zeger, Scott L

    2007-04-01

    The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
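
    The time-series side of the equivalence can be illustrated with a log-linear (Poisson) regression of daily counts on a shared exposure with smooth functions of time, and a quasi-Poisson-style scale to allow overdispersion. The sketch below assumes a hypothetical daily data file with deaths, pm10, temp and date columns.

    ```python
    # Sketch of the time-series side of the equivalence: a log-linear model of daily
    # counts on a shared exposure, with a smooth function of time for confounder control.
    # Column names (deaths, pm10, temp, date) are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("daily_counts.csv", parse_dates=["date"])   # assumed layout
    df["t"] = (df["date"] - df["date"].min()).dt.days

    # Natural cubic splines of time and temperature via patsy's cr(); df chosen by the analyst.
    model = smf.glm("deaths ~ pm10 + cr(t, df=14) + cr(temp, df=4)",
                    data=df, family=sm.families.Poisson())
    fit = model.fit(scale="X2")   # Pearson chi2 scale: quasi-Poisson allowance for overdispersion
    print(fit.summary())
    ```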

  3. An approach to checking case-crossover analyses based on equivalence with time-series methods.

    PubMed

    Lu, Yun; Symons, James Morel; Geyh, Alison S; Zeger, Scott L

    2008-03-01

    The case-crossover design has been increasingly applied to epidemiologic investigations of acute adverse health effects associated with ambient air pollution. The correspondence of the design to that of matched case-control studies makes it inferentially appealing for epidemiologic studies. Case-crossover analyses generally use conditional logistic regression modeling. This technique is equivalent to time-series log-linear regression models when there is a common exposure across individuals, as in air pollution studies. Previous methods for obtaining unbiased estimates for case-crossover analyses have assumed that time-varying risk factors are constant within reference windows. In this paper, we rely on the connection between case-crossover and time-series methods to illustrate model-checking procedures from log-linear model diagnostics for time-stratified case-crossover analyses. Additionally, we compare the relative performance of the time-stratified case-crossover approach to time-series methods under 3 simulated scenarios representing different temporal patterns of daily mortality associated with air pollution in Chicago, Illinois, during 1995 and 1996. Whenever a model, be it time-series or case-crossover, fails to account appropriately for fluctuations in time that confound the exposure, the effect estimate will be biased. It is therefore important to perform model-checking in time-stratified case-crossover analyses rather than assume the estimator is unbiased.

  4. Comparison of time series models for predicting campylobacteriosis risk in New Zealand.

    PubMed

    Al-Sakkaf, A; Jones, G

    2014-05-01

    Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, where the number of notified cases in 2006 was the highest among developed countries. A model or tool that can accurately predict the number of campylobacteriosis cases is therefore needed, because the Microbial Risk Assessment Model previously used for this purpose failed to predict the actual number of cases accurately. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data involves additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. Data from 1998 to 2008 were used to model the time series, with the year 2009 held out of the data set for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. For the annual total in 2010, the prediction from the additive ARIMA intervention model was slightly better than that from the Holt-Winters multiplicative method, the former under-predicting the actual reported cases by only 23. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide good prediction performance for campylobacteriosis risk in New Zealand. The results reported by this study are useful to the New Zealand Health and Safety Authority's efforts in addressing the problem of the campylobacteriosis epidemic. Copyright © 2013 Blackwell Verlag GmbH.
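
    A minimal sketch of the Holt-Winters (multiplicative seasonality) fit with a one-year hold-out, in the spirit of the validation described above; the file and column names are hypothetical, and the competing ARIMA-with-intervention model would additionally need a step (intervention) regressor.

    ```python
    # Rough sketch of the Holt-Winters (multiplicative seasonality) fit with a
    # one-year hold-out. The file name and column are hypothetical.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    cases = pd.read_csv("campy_monthly.csv", index_col="month", parse_dates=True)["cases"]
    train, test = cases[:-12], cases[-12:]          # hold out the final year

    hw = ExponentialSmoothing(train, trend="add", seasonal="mul",
                              seasonal_periods=12).fit()
    forecast = hw.forecast(12)

    mape = np.mean(np.abs(forecast.values - test.values) / test.values) * 100
    print(f"hold-out MAPE: {mape:.1f}%  |  predicted annual total: {forecast.sum():.0f}")
    ```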

  5. Generalized seasonal autoregressive integrated moving average models for count data with application to malaria time series with low case numbers.

    PubMed

    Briët, Olivier J T; Amerasinghe, Priyanie H; Vounatsou, Penelope

    2013-01-01

    With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring interventions' impact, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during "consolidation" and "pre-elimination" phases. Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non Gaussian, non stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than do Gaussian methods and may be more suitable when counts are low.
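
    GSARIMA models are not available off the shelf in common Python libraries, so the sketch below substitutes a simplified observation-driven stand-in: a negative binomial regression of monthly counts on a lagged log-count term and seasonal harmonics. It illustrates the general idea only, not the authors' Bayesian GSARIMA implementation; the file and column names are hypothetical.

    ```python
    # Simplified observation-driven stand-in for the G(S)ARIMA idea: negative binomial
    # regression of monthly counts on a lagged log-count (autoregressive part) plus
    # seasonal harmonics. NOT the authors' Bayesian GSARIMA implementation.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("malaria_monthly.csv")                 # hypothetical file with a 'cases' column
    df["log_lag1"] = np.log(df["cases"].shift(1) + 1)       # +1 guards against log(0) at low counts
    df["month_idx"] = np.arange(len(df))
    df["sin12"] = np.sin(2 * np.pi * df["month_idx"] / 12)
    df["cos12"] = np.cos(2 * np.pi * df["month_idx"] / 12)
    df = df.dropna()

    # alpha is the (fixed) negative binomial dispersion parameter in statsmodels' GLM family
    nb = smf.glm("cases ~ log_lag1 + month_idx + sin12 + cos12",
                 data=df, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
    print(nb.summary())
    ```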

  6. Generalized Seasonal Autoregressive Integrated Moving Average Models for Count Data with Application to Malaria Time Series with Low Case Numbers

    PubMed Central

    Briët, Olivier J. T.; Amerasinghe, Priyanie H.; Vounatsou, Penelope

    2013-01-01

    Introduction With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring interventions’ impact, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during “consolidation” and “pre-elimination” phases. Methods Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non Gaussian, non stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. Results The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. Conclusions G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than do Gaussian methods and may be more suitable when counts are low. PMID:23785448

  7. Case Series Investigations in Cognitive Neuropsychology

    PubMed Central

    Schwartz, Myrna F.; Dell, Gary S.

    2011-01-01

    Case series methodology involves the systematic assessment of a sample of related patients, with the goal of understanding how and why they differ from one another. This method has become increasingly important in cognitive neuropsychology, which has long been identified with single-subject research. We review case series studies dealing with impaired semantic memory, reading, and language production, and draw attention to the affinity of this methodology for testing theories that are expressed as computational models and for addressing questions about neuroanatomy. It is concluded that case series methods usefully complement single-subject techniques. PMID:21714756

  8. Using Time Series Analysis to Predict Cardiac Arrest in a PICU.

    PubMed

    Kennedy, Curtis E; Aoki, Noriaki; Mariscalco, Michele; Turley, James P

    2015-11-01

    To build and test cardiac arrest prediction models in a PICU, using time series analysis as input, and to measure changes in prediction accuracy attributable to different classes of time series data. Retrospective cohort study. Thirty-one bed academic PICU that provides care for medical and general surgical (not congenital heart surgery) patients. Patients experiencing a cardiac arrest in the PICU and requiring external cardiac massage for at least 2 minutes. None. One hundred three cases of cardiac arrest and 109 control cases were used to prepare a baseline dataset that consisted of 1,025 variables in four data classes: multivariate, raw time series, clinical calculations, and time series trend analysis. We trained 20 arrest prediction models using a matrix of five feature sets (combinations of data classes) with four modeling algorithms: linear regression, decision tree, neural network, and support vector machine. The reference model (multivariate data with regression algorithm) had an accuracy of 78% and an area under the receiver operating characteristic curve of 87%. The best model (multivariate + trend analysis data with support vector machine algorithm) had an accuracy of 94% and an area under the receiver operating characteristic curve of 98%. Cardiac arrest predictions based on a traditional model built with multivariate data and a regression algorithm misclassified cases 3.7 times more frequently than predictions from a model that included time series trend analysis and was built with a support vector machine algorithm. Although the final model lacks the specificity necessary for clinical application, we have demonstrated how information from time series data can be used to increase the accuracy of clinical prediction models.
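
    A rough sketch of the model-comparison step described above: a support vector machine trained on multivariate plus trend features, compared with a simple regression baseline by cross-validated ROC AUC. It uses scikit-learn; the feature names and file are hypothetical, and this is not the study's actual pipeline.

    ```python
    # Sketch of the model-comparison step: an SVM on multivariate + trend features
    # versus a regression baseline, scored by ROC AUC. Feature prefixes and the file
    # are hypothetical.
    import pandas as pd
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC
    from sklearn.linear_model import LogisticRegression

    data = pd.read_csv("picu_features.csv")          # one row per case/control
    y = data["arrest"]                               # 1 = cardiac arrest, 0 = control
    X_multivar = data.filter(like="snapshot_")       # multivariate (point-in-time) features
    X_full = pd.concat([X_multivar, data.filter(like="trend_")], axis=1)  # + trend features

    baseline = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

    print("baseline AUC:", cross_val_score(baseline, X_multivar, y, cv=5, scoring="roc_auc").mean())
    print("SVM + trends AUC:", cross_val_score(svm, X_full, y, cv=5, scoring="roc_auc").mean())
    ```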

  9. Hybrid Forecasting of Daily River Discharges Considering Autoregressive Heteroscedasticity

    NASA Astrophysics Data System (ADS)

    Szolgayová, Elena Peksová; Danačová, Michaela; Komorniková, Magda; Szolgay, Ján

    2017-06-01

    It is widely acknowledged in the hydrological and meteorological communities that there is a continuing need to improve the quality of quantitative rainfall and river flow forecasts. A hybrid (combined deterministic-stochastic) modelling approach is proposed here that combines the advantages of modelling the system dynamics with a deterministic model and modelling the deterministic model's forecasting error series with a data-driven model in parallel. Since the processes to be modelled are generally nonlinear and the model error series may exhibit nonstationarity and heteroscedasticity, GARCH-type nonlinear time series models are considered here. The fitting, forecasting and simulation performance of such models has to be explored on a case-by-case basis. The goal of this paper is to develop and test an appropriate methodology from the GARCH family of time series models for model fitting and forecasting applicable to daily river discharge forecast error data. We concentrated on verifying whether the use of a GARCH-type model is suitable for modelling and forecasting a hydrological model error time series on the Hron and Morava Rivers in Slovakia. For this purpose we verified the presence of heteroscedasticity in the simulation error series of the KLN multilinear flow routing model; we then fitted the GARCH-type models to the data and compared their fit with that of an ARMA-type model. We produced one-step-ahead forecasts from the fitted models and again provided comparisons of the models' performance.
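
    A minimal sketch of the error-series modelling step using the Python arch package: test the forecast-error series for ARCH effects, then fit an AR(1) mean with a GARCH(1,1) variance. File and column names are hypothetical.

    ```python
    # Sketch of the error-series step: check for ARCH effects in the deterministic
    # model's simulation errors, then fit an AR(1) mean with GARCH(1,1) variance.
    # File/column names are hypothetical.
    import pandas as pd
    from statsmodels.stats.diagnostic import het_arch
    from arch import arch_model

    err = pd.read_csv("hron_errors.csv", index_col="date", parse_dates=True)["error"].dropna()

    lm_stat, lm_pval, _, _ = het_arch(err)
    print(f"ARCH LM test p-value: {lm_pval:.4f}")     # small p-value -> heteroscedasticity present

    am = arch_model(err, mean="AR", lags=1, vol="GARCH", p=1, q=1)
    res = am.fit(disp="off")
    print(res.summary())
    print(res.forecast(horizon=1).mean.iloc[-1])      # one-step-ahead mean forecast
    ```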

  10. The short-term effects of air pollutants on respiratory disease mortality in Wuhan, China: comparison of time-series and case-crossover analyses.

    PubMed

    Ren, Meng; Li, Na; Wang, Zhan; Liu, Yisi; Chen, Xi; Chu, Yuanyuan; Li, Xiangyu; Zhu, Zhongmin; Tian, Liqiao; Xiang, Hao

    2017-01-13

    Few studies have compared different methods when exploring the short-term effects of air pollutants on respiratory disease mortality in Wuhan, China. This study assesses the association between air pollutants and respiratory disease mortality with both time-series and time-stratified case-crossover designs. The generalized additive model (GAM) and the conditional logistic regression model were used to assess the short-term effects of air pollutants on respiratory disease mortality. Stratified analyses were performed by age, sex, and diseases. A 10 μg/m3 increment in SO2 level was associated with an increase in relative risk for all respiratory disease mortality of 2.4% and 1.9% in the case-crossover and time-series analyses in single pollutant models, respectively. Strong evidence of an association between NO2 and daily respiratory disease mortality among men or people older than 65 years was found in the case-crossover study. There was a positive association between air pollutants and respiratory disease mortality in Wuhan, China. Both time-series and case-crossover analyses consistently reveal the association between three air pollutants and respiratory disease mortality. The estimates of association between air pollution and respiratory disease mortality from the case-crossover analysis displayed greater variation than that from the time-series analysis.
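
    A hedged sketch of the case-crossover side of such an analysis, assuming the data have already been expanded to one row per case day plus its same-month, same-weekday referent days; the column names are hypothetical and this is not the authors' code.

    ```python
    # Sketch of the time-stratified case-crossover side: an already expanded dataset
    # with one row per case day and its referent days, analysed by conditional
    # logistic regression. Column names are hypothetical.
    import pandas as pd
    from statsmodels.discrete.conditional_models import ConditionalLogit

    cc = pd.read_csv("wuhan_casecrossover.csv")
    # assumed columns: case (1 = death day, 0 = referent day), so2, temp, rh,
    # stratum (id grouping each case day with its referent days)

    model = ConditionalLogit(cc["case"], cc[["so2", "temp", "rh"]], groups=cc["stratum"])
    res = model.fit()
    print(res.summary())
    # exp(10 * coefficient on so2) ~ relative risk per 10 ug/m3 increment in SO2
    ```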

  11. The short-term effects of air pollutants on respiratory disease mortality in Wuhan, China: comparison of time-series and case-crossover analyses

    NASA Astrophysics Data System (ADS)

    Ren, Meng; Li, Na; Wang, Zhan; Liu, Yisi; Chen, Xi; Chu, Yuanyuan; Li, Xiangyu; Zhu, Zhongmin; Tian, Liqiao; Xiang, Hao

    2017-01-01

    Few studies have compared different methods when exploring the short-term effects of air pollutants on respiratory disease mortality in Wuhan, China. This study assesses the association between air pollutants and respiratory disease mortality with both time-series and time-stratified-case-crossover designs. The generalized additive model (GAM) and the conditional logistic regression model were used to assess the short-term effects of air pollutants on respiratory disease mortality. Stratified analyses were performed by age, sex, and diseases. A 10 μg/m3 increment in SO2 level was associated with an increase in relative risk for all respiratory disease mortality of 2.4% and 1.9% in the case-crossover and time-series analyses in single pollutant models, respectively. Strong evidence of an association between NO2 and daily respiratory disease mortality among men or people older than 65 years was found in the case-crossover study. There was a positive association between air pollutants and respiratory disease mortality in Wuhan, China. Both time-series and case-crossover analyses consistently reveal the association between three air pollutants and respiratory disease mortality. The estimates of association between air pollution and respiratory disease mortality from the case-crossover analysis displayed greater variation than that from the time-series analysis.

  12. Development of temporal modelling for forecasting and prediction of malaria infections using time-series and ARIMAX analyses: a case study in endemic districts of Bhutan.

    PubMed

    Wangdi, Kinley; Singhasivanon, Pratap; Silawan, Tassanee; Lawpoolsri, Saranath; White, Nicholas J; Kaewkungwal, Jaranit

    2010-09-03

    Malaria still remains a public health problem in some districts of Bhutan despite a marked reduction of cases in the last few years. To strengthen the country's prevention and control measures, this study was carried out to develop forecasting and prediction models of malaria incidence in the endemic districts of Bhutan using time series and ARIMAX. This study was carried out retrospectively using the monthly malaria cases reported from the health centres to the Vector-borne Disease Control Programme (VDCP) and the meteorological data from the Meteorological Unit, Department of Energy, Ministry of Economic Affairs. Time series analysis was performed on monthly malaria cases, from 1994 to 2008, in seven malaria endemic districts. Time series models derived from a multiplicative seasonal autoregressive integrated moving average (ARIMA) approach were deployed to identify the best model using data from 1994 to 2006. The best-fit model was selected for each individual district and for the overall endemic area, and the monthly cases from January to December of 2009 and 2010 were forecasted. In developing the prediction model, the monthly reported malaria cases and the meteorological factors from 1996 to 2008 of the seven districts were analysed. ARIMAX modelling was employed to determine predictors of malaria in the subsequent month. It was found that the ARIMA (p, d, q) (P, D, Q)s model (p and P representing the autoregressive and seasonal autoregressive terms; d and D the non-seasonal and seasonal differencing; q and Q the moving average and seasonal moving average parameters, respectively; and s the length of the seasonal period) for the overall endemic districts was (2,1,1)(0,1,1)12; the modelling data from each district revealed two most common ARIMA models, (2,1,1)(0,1,1)12 and (1,1,1)(0,1,1)12. The forecasted monthly malaria cases from January to December of 2009 and 2010 varied from 15 to 82 cases in 2009 and 67 to 149 cases in 2010, where the population in 2009 was 285,375 and the expected population in 2010 was 289,085. The ARIMAX model of monthly cases and climatic factors showed considerable variations among the different districts. In general, the mean maximum temperature lagged by one month was a strong positive predictor of increased malaria cases in four districts. The number of cases in the previous month was also a significant predictor in one district, whereas no variable could predict malaria cases for two districts. The ARIMA models of time-series analysis were useful in forecasting the number of cases in the endemic areas of Bhutan. There was no consistency in the predictors of malaria cases when using the ARIMAX model with selected lag times and climatic predictors. The ARIMA forecasting models could be employed for planning and managing the malaria prevention and control programme in Bhutan.
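
    The seasonal ARIMA and ARIMAX fits described above can be sketched with statsmodels' SARIMAX, here with the reported (2,1,1)(0,1,1)12 order and a one-month-lagged maximum temperature as the exogenous predictor; the file and column names are hypothetical.

    ```python
    # Sketch of the (2,1,1)(0,1,1)12 seasonal ARIMA fit and an ARIMAX variant with
    # one-month-lagged maximum temperature as an exogenous predictor.
    # File and column names are hypothetical.
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    df = pd.read_csv("bhutan_malaria.csv", index_col="month", parse_dates=True)
    y = df["cases"]
    x = df[["tmax"]].shift(1)        # mean maximum temperature lagged one month

    arima = SARIMAX(y, order=(2, 1, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    print(arima.summary())
    print(arima.forecast(steps=12))  # next 12 months

    arimax = SARIMAX(y.iloc[1:], exog=x.iloc[1:], order=(2, 1, 1),
                     seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    print(arimax.params)             # includes the lagged-temperature coefficient
    ```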

  13. Ambient temperature and coronary heart disease mortality in Beijing, China: a time series study

    PubMed Central

    2012-01-01

    Background Many studies have examined the association between ambient temperature and mortality. However, less evidence is available on the temperature effects on coronary heart disease (CHD) mortality, especially in China. In this study, we examined the relationship between ambient temperature and CHD mortality in Beijing, China between 2000 and 2011. In addition, we compared time series and time-stratified case-crossover models for the non-linear effects of temperature. Methods We examined the effects of temperature on CHD mortality using both time series and time-stratified case-crossover models. We also assessed the effects of temperature on CHD mortality by subgroups: gender (female and male) and age (age >= 65 and age < 65). We used a distributed lag non-linear model to examine the non-linear effects of temperature on CHD mortality up to 15 lag days. We used the Akaike information criterion to assess the model fit for the two designs. Results The time series models had a better model fit than time-stratified case-crossover models. Both designs showed that the relationships between temperature and group-specific CHD mortality were non-linear. Extreme cold and hot temperatures significantly increased the risk of CHD mortality. Hot effects were acute and short-term, while cold effects were delayed by two days and lasted for five days. Older people and women were more sensitive to extreme cold and hot temperatures than younger people and men. Conclusions This study suggests that time series models performed better than time-stratified case-crossover models according to the model fit, even though they produced similar non-linear effects of temperature on CHD mortality. In addition, our findings indicate that extreme cold and hot temperatures increase the risk of CHD mortality in Beijing, China, particularly for women and older people. PMID:22909034

  14. Time-series-based hybrid mathematical modelling method adapted to forecast automotive and medical waste generation: Case study of Lithuania.

    PubMed

    Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras

    2018-05-01

    The aim of the study was to create a hybrid forecasting method that could produce higher-accuracy forecasts than previously used 'pure' time series methods. The latter methods had already been tested on total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated at least a 6% error rate in different cases, and efforts were made to decrease it further. The newly developed hybrid models used a random start generation method to incorporate the advantages of different time-series methods, which helped to increase the accuracy of forecasts by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' abilities to produce short- and mid-term forecasts were tested over different prediction horizons.

  15. Therapeutic Assessment for Preadolescent Boys with Oppositional Defiant Disorder: A Replicated Single-Case Time-Series Design

    ERIC Educational Resources Information Center

    Smith, Justin D.; Handler, Leonard; Nash, Michael R.

    2010-01-01

    The Therapeutic Assessment (TA) model is a relatively new treatment approach that fuses assessment and psychotherapy. The study examines the efficacy of this model with preadolescent boys with oppositional defiant disorder and their families. A replicated single-case time-series design with daily measures is used to assess the effects of TA and to…

  16. The short-term effects of air pollutants on respiratory disease mortality in Wuhan, China: comparison of time-series and case-crossover analyses

    PubMed Central

    Ren, Meng; Li, Na; Wang, Zhan; Liu, Yisi; Chen, Xi; Chu, Yuanyuan; Li, Xiangyu; Zhu, Zhongmin; Tian, Liqiao; Xiang, Hao

    2017-01-01

    Few studies have compared different methods when exploring the short-term effects of air pollutants on respiratory disease mortality in Wuhan, China. This study assesses the association between air pollutants and respiratory disease mortality with both time-series and time-stratified–case-crossover designs. The generalized additive model (GAM) and the conditional logistic regression model were used to assess the short-term effects of air pollutants on respiratory disease mortality. Stratified analyses were performed by age, sex, and diseases. A 10 μg/m3 increment in SO2 level was associated with an increase in relative risk for all respiratory disease mortality of 2.4% and 1.9% in the case-crossover and time-series analyses in single pollutant models, respectively. Strong evidence of an association between NO2 and daily respiratory disease mortality among men or people older than 65 years was found in the case-crossover study. There was a positive association between air pollutants and respiratory disease mortality in Wuhan, China. Both time-series and case-crossover analyses consistently reveal the association between three air pollutants and respiratory disease mortality. The estimates of association between air pollution and respiratory disease mortality from the case–crossover analysis displayed greater variation than that from the time-series analysis. PMID:28084399

  17. Temporal and long-term trend analysis of class C notifiable diseases in China from 2009 to 2014

    PubMed Central

    Zhang, Xingyu; Hou, Fengsu; Qiao, Zhijiao; Li, Xiaosong; Zhou, Lijun; Liu, Yuanyuan; Zhang, Tao

    2016-01-01

    Objectives Time series models are effective tools for disease forecasting. This study aims to explore the time series behaviour of 11 notifiable diseases in China and to predict their incidence through effective models. Settings and participants The Chinese Ministry of Health started to publish class C notifiable diseases in 2009. The monthly reported case time series of 11 infectious diseases from the surveillance system between 2009 and 2014 was collected. Methods We performed a descriptive and a time series study using the surveillance data. Decomposition methods were used to explore (1) their seasonality expressed in the form of seasonal indices and (2) their long-term trend in the form of a linear regression model. Autoregressive integrated moving average (ARIMA) models have been established for each disease. Results The number of cases and deaths caused by hand, foot and mouth disease ranks number 1 among the detected diseases. It occurred most often in May and July and increased, on average, by 0.14126/100 000 per month. The remaining incidence models show good fit except the influenza and hydatid disease models. Both the hydatid disease and influenza series become white noise after differencing, so no available ARIMA model can be fitted for these two diseases. Conclusion Time series analysis of effective surveillance time series is useful for better understanding the occurrence of the 11 types of infectious disease. PMID:27797981
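
    A minimal sketch of the decomposition step for one disease's monthly series: multiplicative seasonal indices plus a linear long-term trend. The file and column names are hypothetical.

    ```python
    # Sketch of the decomposition step: monthly seasonal indices and a linear
    # long-term trend for one disease's reported-incidence series. Names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.tsa.seasonal import seasonal_decompose

    cases = pd.read_csv("hfmd_monthly.csv", index_col="month", parse_dates=True)["incidence"]

    decomp = seasonal_decompose(cases, model="multiplicative", period=12)
    seasonal_indices = decomp.seasonal[:12]          # one index per calendar month
    print(seasonal_indices)

    t = np.arange(len(cases))
    trend_fit = sm.OLS(cases.values, sm.add_constant(t)).fit()
    print("average monthly change:", trend_fit.params[1])   # long-term linear trend slope
    ```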

  18. Development of temporal modelling for forecasting and prediction of malaria infections using time-series and ARIMAX analyses: A case study in endemic districts of Bhutan

    PubMed Central

    2010-01-01

    Background Malaria still remains a public health problem in some districts of Bhutan despite a marked reduction of cases in the last few years. To strengthen the country's prevention and control measures, this study was carried out to develop forecasting and prediction models of malaria incidence in the endemic districts of Bhutan using time series and ARIMAX. Methods This study was carried out retrospectively using the monthly malaria cases reported from the health centres to the Vector-borne Disease Control Programme (VDCP) and the meteorological data from the Meteorological Unit, Department of Energy, Ministry of Economic Affairs. Time series analysis was performed on monthly malaria cases, from 1994 to 2008, in seven malaria endemic districts. Time series models derived from a multiplicative seasonal autoregressive integrated moving average (ARIMA) approach were deployed to identify the best model using data from 1994 to 2006. The best-fit model was selected for each individual district and for the overall endemic area, and the monthly cases from January to December of 2009 and 2010 were forecasted. In developing the prediction model, the monthly reported malaria cases and the meteorological factors from 1996 to 2008 of the seven districts were analysed. ARIMAX modelling was employed to determine predictors of malaria in the subsequent month. Results It was found that the ARIMA (p, d, q) (P, D, Q)s model (p and P representing the autoregressive and seasonal autoregressive terms; d and D the non-seasonal and seasonal differencing; q and Q the moving average and seasonal moving average parameters, respectively; and s the length of the seasonal period) for the overall endemic districts was (2,1,1)(0,1,1)12; the modelling data from each district revealed two most common ARIMA models, (2,1,1)(0,1,1)12 and (1,1,1)(0,1,1)12. The forecasted monthly malaria cases from January to December of 2009 and 2010 varied from 15 to 82 cases in 2009 and 67 to 149 cases in 2010, where the population in 2009 was 285,375 and the expected population in 2010 was 289,085. The ARIMAX model of monthly cases and climatic factors showed considerable variations among the different districts. In general, the mean maximum temperature lagged by one month was a strong positive predictor of increased malaria cases in four districts. The number of cases in the previous month was also a significant predictor in one district, whereas no variable could predict malaria cases for two districts. Conclusions The ARIMA models of time-series analysis were useful in forecasting the number of cases in the endemic areas of Bhutan. There was no consistency in the predictors of malaria cases when using the ARIMAX model with selected lag times and climatic predictors. The ARIMA forecasting models could be employed for planning and managing the malaria prevention and control programme in Bhutan. PMID:20813066

  19. Bayesian dynamic modeling of time series of dengue disease case counts.

    PubMed

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-07-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology shows dynamic Poisson log link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and deviance information criterion statistic (DIC) for model selection. We assessed the short-term predictive performance of the selected final model, at several time points within the study period using the mean absolute percentage error. The results showed the best model including first-order random walk time-varying coefficients for calendar trend and first-order random walk time-varying coefficients for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one or two weeks out-of-sample predictions for most prediction points, associated with low volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health.
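
    A hedged PyMC sketch of one model variant of the kind described above: a Poisson likelihood whose log-rate combines a first-order random-walk calendar trend with a constant coefficient for a single standardized meteorological covariate. It is an illustration only, not the authors' model code, and the data layout is hypothetical.

    ```python
    # Hedged sketch (PyMC) of one dynamic-model variant: weekly dengue counts with a
    # Poisson likelihood, a first-order random-walk time-varying calendar trend and a
    # constant coefficient for one standardized meteorological covariate.
    # Illustration only; names are hypothetical.
    import pandas as pd
    import pymc as pm
    import arviz as az

    df = pd.read_csv("dengue_weekly.csv")
    y = df["cases"].to_numpy()
    temp = ((df["temp"] - df["temp"].mean()) / df["temp"].std()).to_numpy()
    n = len(y)

    with pm.Model():
        sigma_rw = pm.HalfNormal("sigma_rw", 0.1)
        # first-order random-walk intercept (calendar trend)
        trend = pm.GaussianRandomWalk("trend", sigma=sigma_rw,
                                      init_dist=pm.Normal.dist(0, 1), shape=n)
        beta_temp = pm.Normal("beta_temp", 0.0, 1.0)
        mu = pm.math.exp(trend + beta_temp * temp)
        pm.Poisson("y_obs", mu=mu, observed=y)
        idata = pm.sample(1000, tune=1000, target_accept=0.9)

    print(az.summary(idata, var_names=["beta_temp", "sigma_rw"]))
    ```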

  20. Time Series Analysis of Onchocerciasis Data from Mexico: A Trend towards Elimination

    PubMed Central

    Pérez-Rodríguez, Miguel A.; Adeleke, Monsuru A.; Orozco-Algarra, María E.; Arrendondo-Jiménez, Juan I.; Guo, Xianwu

    2013-01-01

    Background In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011 using the software R. Results A total of 15,584 cases were reported in Mexico from 1988 to 2011. The onchocerciasis case data come mainly from the main endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the tendency of onchocerciasis cases two years ahead. According to the ARIMA model predictions, very low case numbers (below 1) are expected between 2012 and 2013 in Chiapas, the last endemic region in Mexico. Conclusion The endemic regions of Mexico evolved from highly onchocerciasis-endemic states to the interruption of transmission due to the strategies followed by the MSH, based on treatment with ivermectin. The extremely low level of expected cases predicted by the ARIMA models for the next two years suggests that onchocerciasis is being eliminated in Mexico. To our knowledge, this is the first study utilizing time series to predict the case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance. PMID:23459370

  21. Nonlinear modeling of chaotic time series: Theory and applications

    NASA Astrophysics Data System (ADS)

    Casdagli, M.; Eubank, S.; Farmer, J. D.; Gibson, J.; Desjardins, D.; Hunter, N.; Theiler, J.

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases, apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases, it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years, methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics, and human speech.
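
    The two steps described above can be sketched in a few lines of NumPy: reconstruct a delay-embedding state space from the scalar series, then forecast one step ahead by averaging the successors of the nearest neighbours of the current state. The embedding dimension, delay and toy data are illustrative choices.

    ```python
    # Sketch of the two steps: delay-coordinate state-space reconstruction, then a
    # local (nearest-neighbour) approximation for a one-step-ahead forecast.
    import numpy as np

    def delay_embed(x, dim, tau):
        """Rows are delay vectors [x_t, x_{t-tau}, ..., x_{t-(dim-1)tau}]."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[(dim - 1 - j) * tau : (dim - 1 - j) * tau + n]
                                for j in range(dim)])

    def knn_forecast(x, dim=3, tau=1, k=5):
        states = delay_embed(x, dim, tau)
        current, history = states[-1], states[:-1]
        successors = x[(dim - 1) * tau + 1 :]            # next observed value after each state
        dists = np.linalg.norm(history - current, axis=1)
        nearest = np.argsort(dists)[:k]
        return successors[nearest].mean()

    # toy demonstration on a noisy logistic map, a textbook chaotic series
    rng = np.random.default_rng(0)
    x = np.empty(500); x[0] = 0.4
    for t in range(499):
        x[t + 1] = 3.9 * x[t] * (1 - x[t]) + rng.normal(0, 1e-3)
    print("forecast:", knn_forecast(x), " true next value ~", 3.9 * x[-1] * (1 - x[-1]))
    ```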

  22. A Time Series Analysis: Weather Factors, Human Migration and Malaria Cases in Endemic Area of Purworejo, Indonesia, 2005–2014

    PubMed Central

    REJEKI, Dwi Sarwani Sri; NURHAYATI, Nunung; AJI, Budi; MURHANDARWATI, E. Elsa Herdiana; KUSNANTO, Hari

    2018-01-01

    Background: Climatic and weather factors are important determinants of the transmission of vector-borne diseases such as malaria. This study aimed to establish the relationships between weather factors and malaria cases in endemic areas of Purworejo during 2005–2014, taking human migration and previous case findings into account. Methods: This study employed ecological time series analysis using monthly data. The independent variables were the maximum temperature, minimum temperature, maximum humidity, minimum humidity, precipitation, human migration, and previous malaria cases, while the dependent variable was positive malaria cases. Three count data regression models, i.e. the Poisson, quasi-Poisson, and negative binomial models, were applied to measure the relationship. The smallest Akaike Information Criterion (AIC) value was used to find the best model, and negative binomial regression was considered the best model. Results: The model showed that humidity (lag 2), precipitation (lag 3), precipitation (lag 12), migration (lag 1) and previous malaria cases (lag 12) had a significant relationship with malaria cases. Conclusion: Weather, migration and previous malaria case factors need to be considered as prominent indicators for projecting increases in malaria cases. PMID:29900134

  23. Testing the effectiveness of family therapeutic assessment: a case study using a time-series design.

    PubMed

    Smith, Justin D; Wolf, Nicole J; Handler, Leonard; Nash, Michael R

    2009-11-01

    We describe a family Therapeutic Assessment (TA) case study employing 2 assessors, 2 assessment rooms, and a video link. In the study, we employed a daily measures time-series design with a pretreatment baseline and follow-up period to examine the family TA treatment model. In addition to being an illustrative addition to a number of clinical reports suggesting the efficacy of family TA, this study is the first to apply a case-based time-series design to test whether family TA leads to clinical improvement and also illustrates when that improvement occurs. Results support the trajectory of change proposed by Finn (2007), the TA model's creator, who posits that benefits continue beyond the formal treatment itself.

  24. Ranking streamflow model performance based on Information theory metrics

    NASA Astrophysics Data System (ADS)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to examine whether information theory-based metrics can be used as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing the time series as strings of symbols, where different symbols corresponded to different quantiles of the probability distribution of streamflow, and this symbol alphabet was used to compute the metrics. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series: streamflow series were less random and more complex than those of precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion process from precipitation to streamflow. The Nash-Sutcliffe efficiency metric increased as model complexity increased, but in many cases several models had efficiency values that were not statistically different from each other. In such cases, ranking models by the closeness of the information theory-based parameters in simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
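
    A hedged sketch of the symbolization step and one of the metrics: the series is mapped to quantile symbols and the mean information gain is taken here as the block-entropy difference H(L+1) - H(L), i.e. the conditional entropy of the next symbol given the previous L; the study's exact definitions may differ, and the file name is hypothetical.

    ```python
    # Hedged sketch: quantile symbolization of a series and a mean-information-gain
    # estimate, computed as the block-entropy difference H(L+1) - H(L).
    # Definitions in the study may differ; the file name is hypothetical.
    import numpy as np
    import pandas as pd
    from collections import Counter

    def symbolize(x, n_symbols=4):
        """Assign each value the index of its quantile bin (0..n_symbols-1)."""
        return np.asarray(pd.qcut(x, q=n_symbols, labels=False, duplicates="drop"))

    def block_entropy(symbols, L):
        blocks = [tuple(symbols[i:i + L]) for i in range(len(symbols) - L + 1)]
        counts = np.array(list(Counter(blocks).values()), dtype=float)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()

    def mean_information_gain(symbols, L=3):
        return block_entropy(symbols, L + 1) - block_entropy(symbols, L)

    flow = pd.read_csv("streamflow_daily.csv")["flow"]
    s = symbolize(flow)
    print("mean information gain:", mean_information_gain(s))
    ```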

  25. Modelling fourier regression for time series data- a case study: modelling inflation in foods sector in Indonesia

    NASA Astrophysics Data System (ADS)

    Prahutama, Alan; Suparti; Wahyu Utami, Tiani

    2018-03-01

    Regression analysis models the relationship between response variables and predictor variables. The parametric approach to regression is very strict in its assumptions, whereas a nonparametric regression model does not require such model assumptions. Time series data are observations of a variable recorded at regular time intervals, so to model a time series by regression the response and predictor variables must first be determined: the response variable is the value at time t (yt), while the predictors are its significant lags. In nonparametric regression modelling, one developing approach is the Fourier series approach. One advantage of the nonparametric Fourier series approach is its ability to handle data with a trigonometric (periodic) pattern. Modelling with a Fourier series requires the parameter K, and the number K can be determined using the Generalized Cross Validation method. In inflation modelling for the transportation, communication and financial services sector, the Fourier series approach yields an optimal K of 120 parameters with an R-squared of 99%, whereas multiple linear regression yields an R-squared of 90%.
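
    A minimal sketch of Fourier-basis regression with K chosen by Generalized Cross Validation, using GCV(K) = n * RSS / (n - p)^2 with p the number of fitted coefficients; the data file and the candidate range of K are hypothetical.

    ```python
    # Sketch of regression on a Fourier basis with K selected by Generalized Cross
    # Validation. File/column names and the candidate K range are hypothetical.
    import numpy as np
    import pandas as pd

    y = pd.read_csv("inflation_monthly.csv")["inflation"].to_numpy()
    t = np.arange(len(y))
    n = len(y)

    def fourier_design(t, K, period=12):
        cols = [np.ones_like(t, dtype=float), t.astype(float)]   # intercept + linear trend
        for k in range(1, K + 1):
            cols.append(np.sin(2 * np.pi * k * t / period))
            cols.append(np.cos(2 * np.pi * k * t / period))
        return np.column_stack(cols)

    best = None
    for K in range(1, 6):                       # candidate numbers of harmonics
        X = fourier_design(t, K)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        gcv = n * rss / (n - X.shape[1]) ** 2
        if best is None or gcv < best[0]:
            best = (gcv, K)
    print("selected K:", best[1], " GCV:", best[0])
    ```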

  26. Nonlinear modeling of chaotic time series: Theory and applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casdagli, M.; Eubank, S.; Farmer, J.D.

    1990-01-01

    We review recent developments in the modeling and prediction of nonlinear time series. In some cases apparent randomness in time series may be due to chaotic behavior of a nonlinear but deterministic system. In such cases it is possible to exploit the determinism to make short term forecasts that are much more accurate than one could make from a linear stochastic model. This is done by first reconstructing a state space, and then using nonlinear function approximation methods to create a dynamical model. Nonlinear models are valuable not only as short term forecasters, but also as diagnostic tools for identifying and quantifying low-dimensional chaotic behavior. During the past few years methods for nonlinear modeling have developed rapidly, and have already led to several applications where nonlinear models motivated by chaotic dynamics provide superior predictions to linear models. These applications include prediction of fluid flows, sunspots, mechanical vibrations, ice ages, measles epidemics and human speech. 162 refs., 13 figs.

  27. BASINS and WEPP Climate Assessment Tools (CAT): Case ...

    EPA Pesticide Factsheets

    This draft report supports the application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. It presents a series of short, illustrative case studies designed to demonstrate the capabilities of these tools for conducting scenario-based assessments of the potential future effects of climate change on water resources.

  28. Forecasting malaria cases using climatic factors in delhi, India: a time series analysis.

    PubMed

    Kumar, Varun; Mangal, Abha; Panesar, Sanjeet; Yadav, Geeta; Talwar, Richa; Raut, Deepak; Singh, Saudan

    2014-01-01

    Background. Malaria still remains a public health problem in developing countries and changing environmental and climatic factors pose the biggest challenge in fighting against the scourge of malaria. Therefore, the study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly cases of malaria slide positives occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic at Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data of monthly mean rainfall, relative humidity, and mean maximum temperature were taken from Regional Meteorological Centre, Delhi. Expert modeler of SPSS ver. 21 was used for analyzing the time series data. Results. Autoregressive integrated moving average, ARIMA (0,1,1) (0,1,0)(12), was the best fit model and it could explain 72.5% variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors for malaria transmission in the study area. Seasonal adjusted factor (SAF) for malaria cases shows peak during the months of August and September. Conclusion. ARIMA models of time series analysis is a simple and reliable tool for producing reliable forecasts for malaria in Delhi, India.

  29. A Case History Introducing the Oregon Ag Seminar Series-Keys to Program and Research-to-Practice Success.

    PubMed

    Harrington, Marcy J; Lloyd, Kirk

    2017-01-01

    This case history of Oregon state's Ag Seminar Series is consistent with the Socio-Ecological Model, demonstrating how policy at a state level can influence an organizational approach with impacts that ultimately influence safety practices on the farm. From modest beginnings, the Ag Seminar Series, offered through a workers compensation insurance company, now serves over 2,300 Oregon farmers annually in English and Spanish. This case offers unique but also replicable methods for educators, insurers, and researchers in safety education, safety motivators, and research-to-practice (r2p).

  30. Bayesian dynamic modeling of time series of dengue disease case counts

    PubMed Central

    Martínez-Bello, Daniel Adyro; López-Quílez, Antonio; Torres-Prieto, Alexander

    2017-01-01

    The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model’s short-term performance for predicting dengue cases. The methodology shows dynamic Poisson log link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and deviance information criterion statistic (DIC) for model selection. We assessed the short-term predictive performance of the selected final model, at several time points within the study period using the mean absolute percentage error. The results showed the best model including first-order random walk time-varying coefficients for calendar trend and first-order random walk time-varying coefficients for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one or two weeks out-of-sample predictions for most prediction points, associated with low volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease, producing useful models for decision-making in public health. PMID:28671941

  31. Two States Mapping Based Time Series Neural Network Model for Compensation Prediction Residual Error

    NASA Astrophysics Data System (ADS)

    Jung, Insung; Koo, Lockjo; Wang, Gi-Nam

    2008-11-01

    The objective of this paper was to design a human bio-signal data prediction system that decreases prediction error by using a two-states-mapping-based time series neural network BP (back-propagation) model. Neural network models trained in a supervised manner with the error back-propagation algorithm are widely applied in industry for time series prediction systems. However, a residual error remains between the real values and the prediction results. Therefore, we designed a two-states neural network model to compensate for this residual error, which could be used in the prevention of sudden death and of metabolic syndrome diseases such as hypertension and obesity. We determined that most of the simulation cases were handled satisfactorily by the two-states-mapping-based time series prediction model. In particular, for small time series sample sizes it was more accurate than the standard MLP model.
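
    A generic sketch of the residual-compensation idea: one network predicts the next value from lagged inputs, a second network is trained on the first model's residuals, and its output is added back to the prediction. This illustrates the two-stage principle only, not the paper's exact two-states mapping scheme; it uses scikit-learn and the data file is hypothetical.

    ```python
    # Generic residual-compensation sketch: a primary MLP predicts the next value
    # from lagged values, a second MLP learns the primary model's residuals, and its
    # output is added back. Not the paper's exact scheme; names are hypothetical.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def make_lagged(x, n_lags=5):
        X = np.column_stack([x[i : len(x) - n_lags + i] for i in range(n_lags)])
        y = x[n_lags:]
        return X, y

    signal = np.loadtxt("biosignal.txt")                      # hypothetical 1-D series
    X, y = make_lagged(signal)
    split = int(0.8 * len(y))

    primary = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
    primary.fit(X[:split], y[:split])
    residuals = y[:split] - primary.predict(X[:split])

    corrector = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    corrector.fit(X[:split], residuals)                       # learn the leftover error

    y_hat = primary.predict(X[split:]) + corrector.predict(X[split:])
    rmse = np.sqrt(np.mean((y[split:] - y_hat) ** 2))
    print("test RMSE with residual compensation:", rmse)
    ```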

  32. Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis.

    PubMed

    Armstrong, Ben G; Gasparrini, Antonio; Tobias, Aurelio

    2014-11-24

    The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case-control (case crossover) format, but this has some limitations. In particular adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters. The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computing and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages. By applying to some real data and using simulations, we demonstrate that conditional Poisson models were simpler to code and shorter to run than are conditional logistic analyses and can be fitted to larger data sets than possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model but when not required this model gave identical estimates to those from conditional logistic regression. Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine stratification.
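
    The stratum-indicator Poisson fit that the abstract says is equivalent to conditional logistic case-crossover analysis can be sketched directly in statsmodels; the conditional Poisson trick then avoids estimating the many stratum parameters (as R's gnm does). File and column names are hypothetical.

    ```python
    # Sketch of the stratum-indicator Poisson fit: daily counts with time-stratified
    # strata (year x month x day-of-week) entered as fixed effects. The exposure
    # coefficient matches the conditional logistic case-crossover estimate.
    # File and column names are hypothetical.
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    df = pd.read_csv("daily_counts.csv", parse_dates=["date"])
    df["stratum"] = (df["date"].dt.year.astype(str) + "-"
                     + df["date"].dt.month.astype(str) + "-"
                     + df["date"].dt.dayofweek.astype(str))

    fit = smf.glm("deaths ~ pm10 + temp + C(stratum)", data=df,
                  family=sm.families.Poisson()).fit(scale="X2")  # X2 scale allows overdispersion
    print(fit.params["pm10"])        # log relative risk per unit pollutant
    ```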

  33. Therapeutic assessment for preadolescent boys with oppositional defiant disorder: a replicated single-case time-series design.

    PubMed

    Smith, Justin D; Handler, Leonard; Nash, Michael R

    2010-09-01

    The Therapeutic Assessment (TA) model is a relatively new treatment approach that fuses assessment and psychotherapy. The study examines the efficacy of this model with preadolescent boys with oppositional defiant disorder and their families. A replicated single-case time-series design with daily measures is used to assess the effects of TA and to track the process of change as it unfolds. All 3 families benefitted from participation in TA across multiple domains of functioning, but the way in which change unfolded was unique for each family. These findings are substantiated by the Behavior Assessment System for Children (Reynolds & Kamphaus, 2004). The TA model is shown to be an effective treatment for preadolescent boys with oppositional defiant disorder and their families. Further, the time-series design of this study illustrated how this empirically grounded case-based methodology reveals when and how change unfolds during treatment in a way that is usually not possible with other research designs.

  34. Forecasting dengue hemorrhagic fever cases using ARIMA model: a case study in Asahan district

    NASA Astrophysics Data System (ADS)

    Siregar, Fazidah A.; Makmur, Tri; Saprin, S.

    2018-01-01

    Time series analysis has been increasingly used in many studies to forecast the number of dengue hemorrhagic fever cases. Since no vaccine exists and public health infrastructure is poor, predicting the occurrence of dengue hemorrhagic fever (DHF) is crucial. This study was conducted to determine the trend and forecast the occurrence of DHF in Asahan district, North Sumatera Province. Monthly reported dengue cases for the years 2012-2016 were obtained from the district health offices. A time series analysis was conducted by autoregressive integrated moving average (ARIMA) modeling to forecast the occurrence of DHF. The results demonstrated that the reported DHF cases showed a seasonal variation. The SARIMA (1,0,0)(0,1,1)12 model was the best model and was adequate for the data. The SARIMA model could be applied to predict the incidence of DHF in Asahan district and to assist with designing public health measures to prevent and control the disease.

  35. Modeling seasonal leptospirosis transmission and its association with rainfall and temperature in Thailand using time-series and ARIMAX analyses.

    PubMed

    Chadsuthi, Sudarat; Modchang, Charin; Lenbury, Yongwimon; Iamsirithaworn, Sopon; Triampo, Wannapong

    2012-07-01

    To study the number of leptospirosis cases in relation to the seasonal pattern, and its association with climate factors. Time series analysis was used to study the time variations in the number of leptospirosis cases. The Autoregressive Integrated Moving Average (ARIMA) model was used in data curve fitting and in predicting future leptospirosis cases. We found that the amount of rainfall was correlated with leptospirosis cases in both regions of interest, namely the northern and northeastern regions of Thailand, while temperature played a role in the northeastern region only. The use of a multivariate ARIMA (ARIMAX) model showed that factoring in rainfall (with an 8-month lag) yields the best model for the northern region, while the model factoring in rainfall (with a 10-month lag) and temperature (with an 8-month lag) was the best for the northeastern region. The models are able to show the trend in leptospirosis cases and closely fit the recorded data in both regions. The models can also be used to predict the next seasonal peak quite accurately. Copyright © 2012 Hainan Medical College. Published by Elsevier B.V. All rights reserved.

  16. State space model approach for forecasting the use of electrical energy (a case study on: PT. PLN (Persero) district of Kroya)

    NASA Astrophysics Data System (ADS)

    Kurniati, Devi; Hoyyi, Abdul; Widiharih, Tatik

    2018-05-01

    Time series data are a series of observations taken or measured at the same time interval. Time series analysis is used to analyse data while taking the effect of time into account. The purpose of time series analysis is to characterise the patterns of a data set and to predict future values based on past data. One of the forecasting methods used for time series data is the state space model. This study discusses the modeling and forecasting of electric energy consumption using the state space model for univariate data. The modeling stage begins with optimal autoregressive (AR) order selection, determination of the state vector through canonical correlation analysis, parameter estimation, and forecasting. The results show that a state space model of order 4 fits the electric energy consumption data with a mean absolute percentage error (MAPE) of 3.655%, which places the model in the very good forecasting category.
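
    Two of the bookkeeping steps mentioned above, choosing an AR order by information criterion and scoring forecasts with MAPE, are sketched below. The series name and the statsmodels-based shortcut are assumptions; the canonical-correlation construction of the state vector itself is not reproduced here.

```python
# Sketch: AR order selection plus MAPE scoring on a hold-out period.
# `consumption` is an assumed pandas Series of monthly electricity use.
import numpy as np
import pandas as pd
from statsmodels.tsa.ar_model import AutoReg, ar_select_order

def ar_order_and_mape(consumption: pd.Series, holdout: int = 12):
    train, test = consumption[:-holdout], consumption[-holdout:]
    sel = ar_select_order(train, maxlag=8, ic="aic")    # information-criterion lag choice
    lags = sel.ar_lags if sel.ar_lags else 0
    pred = np.asarray(AutoReg(train, lags=lags).fit().forecast(steps=holdout))
    mape = float(np.mean(np.abs((test.values - pred) / test.values))) * 100
    return lags, mape   # a MAPE below 10% is conventionally read as highly accurate
```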

  17. A bivariate gamma probability distribution with application to gust modeling. [for the ascent flight of the space shuttle

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Adelfang, S. I.; Tubbs, J. D.

    1982-01-01

    A five-parameter bivariate gamma distribution (BGD) having two shape parameters, two location parameters, and a correlation parameter is investigated. This general BGD is expressed as a double series and as a single series of the modified Bessel function. It reduces to the known special case for equal shape parameters. Practical functions for computer evaluation of the general BGD and of special cases are presented. Applications to wind gust modeling for the ascent flight of the space shuttle are illustrated.

  18. Tuberculosis Case Finding in Benin, 2000–2014 and Beyond: A Retrospective Cohort and Time Series Study

    PubMed Central

    Ade, Serge; Békou, Wilfried; Adjobimey, Mênonli; Adjibode, Omer; Ade, Gabriel; Harries, Anthony D.; Anagonou, Séverin

    2016-01-01

    Objective. To determine any changes in tuberculosis epidemiology in the last 15 years in Benin, seasonal variations, and forecasted numbers of tuberculosis cases in the next five years. Materials and Methods. Retrospective cohort and time series study of all tuberculosis cases notified between 2000 and 2014. The “R” software version 3.2.1 (Institute for Statistics and Mathematics, Vienna, Austria) and the Box-Jenkins 1976 modeling approach were used for time series analysis. Results. Of 246943 presumptive cases, 54303 (22%) were diagnosed with tuberculosis. Annual notified case numbers increased, with the highest reported in 2011. New pulmonary bacteriologically confirmed tuberculosis (NPBCT) represented 78% ± SD 2%. Retreatment cases decreased from 10% to 6% and new pulmonary clinically diagnosed cases increased from 2% to 8%. NPBCT notification rates decreased from 2012 in males, in young people aged 15–34 years, and in the Borgou-Alibori region. There was a seasonal pattern in tuberculosis cases. Over 90% of NPBCT were HIV-tested, with a stable HIV prevalence of 13%. The best-fit ARIMA model predicted a decrease in tuberculosis case finding in the next five years. Conclusion. Tuberculosis case notifications are predicted to decrease in the next five years if the current passive case finding approach is maintained. Additional strategies are needed in the country. PMID:27293887

  19. Modeling and projection of dengue fever cases in Guangzhou based on variation of weather factors.

    PubMed

    Li, Chenlu; Wang, Xiaofeng; Wu, Xiaoxu; Liu, Jianing; Ji, Duoying; Du, Juan

    2017-12-15

    Dengue fever is one of the most serious vector-borne infectious diseases, especially in Guangzhou, China. Dengue viruses and their vector Aedes albopictus are sensitive to climate change, primarily through weather factors. Previous research has mainly focused on identifying the relationship between climate factors and dengue cases, or on developing dengue case models that include some non-climate factors. However, there has been little research addressing the modeling and projection of dengue cases solely from the perspective of climate change. This study considered this topic using long time series data (1998-2014). First, sensitive weather factors were identified through a meta-analysis that included literature review screening, lagged analysis, and collinearity analysis. The key factors were determined to be monthly average temperature at a lag of two months, and monthly average relative humidity and monthly average precipitation at lags of three months. Second, time series Poisson analysis was used with the generalized additive model approach to develop a dengue model based on the key weather factors for January 1998 to December 2012. Data from January 2013 to July 2014 were used to validate that the model was reliable and reasonable. Finally, future weather data (January 2020 to December 2070) were input into the model to project the occurrence of dengue cases under different climate scenarios (RCP 2.6 and RCP 8.5). The longer time series and scientifically selected weather variables were used to develop the dengue model to ensure reliability. The projections suggested that seasonal disease control (especially in summer and fall) and mitigation of greenhouse gas emissions could help reduce the incidence of dengue fever. The results of this study are intended to provide a scientific and theoretical basis for the prevention and control of dengue fever in Guangzhou. Copyright © 2017 Elsevier B.V. All rights reserved.
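
    A minimal sketch of a Poisson generalized additive model driven by the three lagged weather factors named above. The lag choices follow the abstract; the DataFrame layout, column names, and the choice of the pygam package are assumptions for illustration, not the authors' implementation.

```python
# Sketch: Poisson GAM for monthly dengue counts with smooth terms for lagged weather.
# `df` is an assumed DataFrame with columns 'dengue_cases', 'mean_temperature',
# 'relative_humidity' and 'precipitation' indexed by month.
import pandas as pd
from pygam import PoissonGAM, s

def fit_dengue_gam(df: pd.DataFrame):
    X = pd.DataFrame({
        "temp_lag2":   df["mean_temperature"].shift(2),   # 2-month lag
        "rh_lag3":     df["relative_humidity"].shift(3),  # 3-month lag
        "precip_lag3": df["precipitation"].shift(3),      # 3-month lag
    }).dropna()
    y = df.loc[X.index, "dengue_cases"]
    return PoissonGAM(s(0) + s(1) + s(2)).fit(X.values, y.values)  # one smooth per covariate
```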

  20. Association of daily asthma emergency department visits and hospital admissions with ambient air pollutants among the pediatric Medicaid population in Detroit: time-series and time-stratified case-crossover analyses with threshold effects.

    PubMed

    Li, Shi; Batterman, Stuart; Wasilevich, Elizabeth; Wahl, Robert; Wirth, Julie; Su, Feng-Chiao; Mukherjee, Bhramar

    2011-11-01

    Asthma morbidity has been associated with ambient air pollutants in time-series and case-crossover studies. In such study designs, threshold effects of air pollutants on asthma outcomes, which are of potential interest for exploring concentration-response relationships, have been relatively unexplored. This study analyzes daily data on the asthma morbidity experienced by the pediatric Medicaid population (ages 2-18 years) of Detroit, Michigan and concentrations of the pollutants fine particulate matter (PM2.5), CO, NO2 and SO2 for the 2004-2006 period, using both time-series and case-crossover designs. We use a simple, testable and readily implementable profile likelihood-based approach to estimate threshold parameters in both designs. Evidence of significant increases in daily acute asthma events was found for SO2 and PM2.5, and a significant threshold effect was estimated for PM2.5 at 13 and 11 μg m(-3) using generalized additive models and conditional logistic regression models, respectively. Stronger effect sizes above the threshold were typically noted compared with the standard linear relationship; for example, in the time series analysis, an interquartile range increase (9.2 μg m(-3)) in PM2.5 (5-day moving average) had a risk ratio of 1.030 (95% CI: 1.001, 1.061) in the generalized additive models and 1.066 (95% CI: 1.031, 1.102) in the threshold generalized additive models. The corresponding estimates for the case-crossover design were 1.039 (95% CI: 1.013, 1.066) in the conditional logistic regression and 1.054 (95% CI: 1.023, 1.086) in the threshold conditional logistic regression. This study indicates that the associations of SO2 and PM2.5 concentrations with asthma emergency department visits and hospitalizations, as well as the estimated PM2.5 threshold, were fairly consistent across time-series and case-crossover analyses, and suggests that effect estimates based on linear models (without thresholds) may underestimate the true risk. Copyright © 2011 Elsevier Inc. All rights reserved.
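
    A schematic version of a profile-likelihood threshold search for a single pollutant, in the spirit of the approach above but not the authors' exact implementation: for each candidate threshold h, the exposure enters as max(PM2.5 - h, 0) in a Poisson regression and the h with the highest log-likelihood is retained. The DataFrame and column names are assumptions, and confounder terms (time, weather) are omitted for brevity.

```python
# Sketch: grid search over candidate thresholds, keeping the one that maximizes
# the Poisson log-likelihood. `daily` is an assumed DataFrame with columns
# 'asthma_visits' and 'pm25'.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def profile_threshold(daily: pd.DataFrame, grid=np.arange(5.0, 30.0, 0.5)):
    best = None
    for h in grid:
        X = sm.add_constant(np.clip(daily["pm25"] - h, 0.0, None))   # hockey-stick exposure
        fit = sm.GLM(daily["asthma_visits"], X, family=sm.families.Poisson()).fit()
        if best is None or fit.llf > best[1]:
            best = (h, fit.llf, fit)
    return best  # (threshold, log-likelihood, fitted model)
```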

  1. Model for the heart beat-to-beat time series during meditation

    NASA Astrophysics Data System (ADS)

    Capurro, A.; Diambra, L.; Malta, C. P.

    2003-09-01

    We present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a pacemaker, that simulates the membrane potential of the sinoatrial node, modulated by a periodic input signal plus correlated noise that simulates the respiratory input. The model was used to assess the waveshape of the respiratory signals needed to reproduce in the phase space the trajectory of experimental heart beat-to-beat interval data. The data sets were recorded during meditation practices of the Chi and Kundalini Yoga techniques. Our study indicates that in the first case the respiratory signal has the shape of a smoothed square wave, and in the second case it has the shape of a smoothed triangular wave.

  2. A scan statistic for identifying optimal risk windows in vaccine safety studies using self-controlled case series design.

    PubMed

    Xu, Stanley; Hambidge, Simon J; McClure, David L; Daley, Matthew F; Glanz, Jason M

    2013-08-30

    In the examination of the association between vaccines and rare adverse events after vaccination in postlicensure observational studies, it is challenging to define appropriate risk windows because prelicensure RCTs provide little insight on the timing of specific adverse events. Past vaccine safety studies have often used prespecified risk windows based on prior publications, biological understanding of the vaccine, and expert opinion. Recently, a data-driven approach was developed to identify appropriate risk windows for vaccine safety studies that use the self-controlled case series design. This approach employs both the maximum incidence rate ratio and the linear relation between the estimated incidence rate ratio and the inverse of average person time at risk, given a specified risk window. In this paper, we present a scan statistic that can identify appropriate risk windows in vaccine safety studies using the self-controlled case series design while taking into account the dependence of time intervals within an individual and while adjusting for time-varying covariates such as age and seasonality. This approach uses the maximum likelihood ratio test based on fixed-effects models, which has been used for analyzing data from self-controlled case series design in addition to conditional Poisson models. Copyright © 2013 John Wiley & Sons, Ltd.

  3. Modeling Longitudinal Data with Generalized Additive Models: Applications to Single-Case Designs

    ERIC Educational Resources Information Center

    Sullivan, Kristynn J.; Shadish, William R.

    2013-01-01

    Single case designs (SCDs) are short time series that assess intervention effects by measuring units repeatedly over time both in the presence and absence of treatment. For a variety of reasons, interest in the statistical analysis and meta-analysis of these designs has been growing in recent years. This paper proposes modeling SCD data with…

  4. Case Studies Comparing System Advisor Model (SAM) Results to Real Performance Data: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blair, N.; Dobos, A.; Sather, N.

    2012-06-01

    NREL has completed a series of detailed case studies comparing the simulations of the System Advisor Model (SAM) and measured performance data or published performance expectations. These case studies compare PV measured performance data with simulated performance data using appropriate weather data. The measured data sets were primarily taken from NREL onsite PV systems and weather monitoring stations.

  5. Forecasting incidence of dengue in Rajasthan, using time series analyses.

    PubMed

    Bhatnagar, Sunil; Lal, Vivek; Gupta, Shiv D; Gupta, Om P

    2012-01-01

    To develop a prediction model for dengue fever/dengue haemorrhagic fever (DF/DHF) using time series data over the past decade in Rajasthan and to forecast monthly DF/DHF incidence for 2011. A seasonal autoregressive integrated moving average (SARIMA) model was used for statistical modeling. During January 2001 to December 2010, the reported DF/DHF cases showed a cyclical pattern with seasonal variation. The SARIMA (0,0,1)(0,1,1)12 model had the lowest normalized Bayesian information criterion (BIC) of 9.426 and mean absolute percentage error (MAPE) of 263.361 and appeared to be the best model. The proportion of variance explained by the model was 54.3%. Adequacy of the model was established through the Ljung-Box test (Q statistic 4.910 and P-value 0.996), which showed no significant correlation between residuals at different lag times. The forecast for the year 2011 showed a seasonal peak in the month of October with an estimated 546 cases. Application of the SARIMA model may be useful for forecasting cases and impending outbreaks of DF/DHF and other infectious diseases that exhibit a seasonal pattern.
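
    The residual check reported above can be sketched as follows: after fitting the selected SARIMA, the Ljung-Box test is applied to the residuals, and a large p-value indicates no significant residual autocorrelation. The series name and the statsmodels-based code are assumptions for illustration.

```python
# Sketch: fit the selected SARIMA and run a Ljung-Box diagnostic on its residuals.
# `cases` is an assumed monthly pandas Series of DF/DHF counts.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.stats.diagnostic import acorr_ljungbox

def check_sarima_residuals(cases: pd.Series):
    fit = SARIMAX(cases, order=(0, 0, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
    lb = acorr_ljungbox(fit.resid, lags=[12], return_df=True)
    return fit, lb  # inspect lb['lb_pvalue']; p > 0.05 suggests adequate fit
```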

  6. Multivariate exploration of non-intrusive load monitoring via spatiotemporal pattern network

    DOE PAGES

    Liu, Chao; Akintayo, Adedotun; Jiang, Zhanhong; ...

    2017-12-18

    Non-intrusive load monitoring (NILM) of electrical demand for the purpose of identifying load components has thus far mostly been studied using univariate data, e.g., using only whole building electricity consumption time series to identify a certain type of end-use such as lighting load. However, using additional variables in the form of multivariate time series data may provide more information in terms of extracting distinguishable features in the context of energy disaggregation. In this work, a novel probabilistic graphical modeling approach, namely the spatiotemporal pattern network (STPN) is proposed for energy disaggregation using multivariate time-series data. The STPN framework is shown to be capable of handling diverse types of multivariate time-series to improve the energy disaggregation performance. The technique outperforms the state of the art factorial hidden Markov models (FHMM) and combinatorial optimization (CO) techniques in multiple real-life test cases. Furthermore, based on two homes' aggregate electric consumption data, a similarity metric is defined for the energy disaggregation of one home using a trained model based on the other home (i.e., out-of-sample case). The proposed similarity metric allows us to enhance scalability via learning supervised models for a few homes and deploying such models to many other similar but unmodeled homes with significantly high disaggregation accuracy.

  7. Multivariate exploration of non-intrusive load monitoring via spatiotemporal pattern network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chao; Akintayo, Adedotun; Jiang, Zhanhong

    Non-intrusive load monitoring (NILM) of electrical demand for the purpose of identifying load components has thus far mostly been studied using univariate data, e.g., using only whole building electricity consumption time series to identify a certain type of end-use such as lighting load. However, using additional variables in the form of multivariate time series data may provide more information in terms of extracting distinguishable features in the context of energy disaggregation. In this work, a novel probabilistic graphical modeling approach, namely the spatiotemporal pattern network (STPN) is proposed for energy disaggregation using multivariate time-series data. The STPN framework is shown to be capable of handling diverse types of multivariate time-series to improve the energy disaggregation performance. The technique outperforms the state of the art factorial hidden Markov models (FHMM) and combinatorial optimization (CO) techniques in multiple real-life test cases. Furthermore, based on two homes' aggregate electric consumption data, a similarity metric is defined for the energy disaggregation of one home using a trained model based on the other home (i.e., out-of-sample case). The proposed similarity metric allows us to enhance scalability via learning supervised models for a few homes and deploying such models to many other similar but unmodeled homes with significantly high disaggregation accuracy.

  8. Time series regression model for infectious disease and weather.

    PubMed

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
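
    A minimal sketch of the modifications suggested above for an infectious-disease time series regression: a log-transformed lagged case count to absorb contagion-driven autocorrelation, a running sum of past cases as an immunity proxy, and a negative binomial family to handle overdispersion. The DataFrame, column names, lag lengths, and window size are assumptions for illustration.

```python
# Sketch: weather-disease time series regression with SIR-motivated terms.
# `df` is an assumed DataFrame with weekly columns 'cases' and 'rainfall'.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_infectious_tsr(df: pd.DataFrame):
    d = df.copy()
    d["log_lag_cases"] = np.log(d["cases"].shift(1) + 1)                    # autocorrelation term
    d["immune_proxy"] = d["cases"].rolling(26, min_periods=1).sum().shift(1)  # past-case sum
    d["rain_lag2"] = d["rainfall"].shift(2)                                  # lagged exposure
    d = d.dropna()
    model = smf.glm("cases ~ rain_lag2 + log_lag_cases + immune_proxy",
                    data=d, family=sm.families.NegativeBinomial())
    return model.fit()
```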

  9. Model for the respiratory modulation of the heart beat-to-beat time interval series

    NASA Astrophysics Data System (ADS)

    Capurro, Alberto; Diambra, Luis; Malta, C. P.

    2005-09-01

    In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation with the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embeddings of the pre-meditation and control cases have a roughly circular shape, the embedding acquires a polygonal shape during meditation, triangular for the Kundalini Yoga data and quadrangular for the Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of the Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave having a rising branch of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.
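
    The two respiratory drive waveshapes described above, a smoothed square wave (Chi) and a smoothed asymmetric triangular wave with a longer rising branch (Kundalini Yoga), can be generated as in the sketch below. The breathing rate, sampling rate, smoothing window, and asymmetry fraction are illustrative assumptions, not values from the paper.

```python
# Sketch: smoothed square and asymmetric-triangle respiratory drive signals.
import numpy as np
from scipy import signal

def respiratory_waveforms(duration_s=60.0, fs=100.0, breath_hz=0.2, rise_frac=0.7):
    t = np.arange(0.0, duration_s, 1.0 / fs)
    square = signal.square(2 * np.pi * breath_hz * t)
    # sawtooth with width=rise_frac rises for rise_frac of the period, then falls
    triangle = signal.sawtooth(2 * np.pi * breath_hz * t, width=rise_frac)
    smooth = np.hanning(int(fs))          # ~1 s smoothing window
    smooth /= smooth.sum()
    return (t,
            np.convolve(square, smooth, mode="same"),
            np.convolve(triangle, smooth, mode="same"))
```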

  10. FBST for Cointegration Problems

    NASA Astrophysics Data System (ADS)

    Diniz, M.; Pereira, C. A. B.; Stern, J. M.

    2008-11-01

    In order to estimate causal relations, time series econometrics has to be aware of spurious correlation, a problem first mentioned by Yule [21]. To solve the problem, one can work with differenced series or use multivariate models such as VAR or VEC models. In that case, the analysed series will present a long-run relation, i.e. a cointegration relation. Even though the Bayesian literature about inference on VAR/VEC models is quite advanced, Bauwens et al. [2] highlight that "the topic of selecting the cointegrating rank has not yet given very useful and convincing results." This paper presents the Full Bayesian Significance Test applied to cointegration rank selection in multivariate (VAR/VEC) time series models and shows how to implement it using data sets available in the literature as well as simulated data sets. A standard non-informative prior is assumed.

  11. BASINs and WEPP Climate Assessment Tools (CAT): Case ...

    EPA Pesticide Factsheets

    EPA announced the release of the final report, BASINS and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & Non-point Sources (BASINS) and the Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT). The report presents a series of short, illustrative case studies designed to demonstrate the capabilities of these tools for conducting scenario-based assessments of the potential effects of climate change on streamflow and water quality.

  12. Forecasting Hourly Water Demands With Seasonal Autoregressive Models for Real-Time Application

    NASA Astrophysics Data System (ADS)

    Chen, Jinduan; Boccelli, Dominic L.

    2018-02-01

    Consumer water demands are not typically measured at temporal or spatial scales adequate to support real-time decision making, and recent approaches for estimating unobserved demands using observed hydraulic measurements are generally not capable of forecasting demands and uncertainty information. While time series modeling has shown promise for representing total system demands, these models have generally not been evaluated at spatial scales appropriate for representative real-time modeling. This study investigates the use of a double-seasonal time series model to capture daily and weekly autocorrelations to both total system demands and regional aggregated demands at a scale that would capture demand variability across a distribution system. Emphasis was placed on the ability to forecast demands and quantify uncertainties with results compared to traditional time series pattern-based demand models as well as nonseasonal and single-seasonal time series models. Additional research included the implementation of an adaptive-parameter estimation scheme to update the time series model when unobserved changes occurred in the system. For two case studies, results showed that (1) for the smaller-scale aggregated water demands, the log-transformed time series model resulted in improved forecasts, (2) the double-seasonal model outperformed other models in terms of forecasting errors, and (3) the adaptive adjustment of parameters during forecasting improved the accuracy of the generated prediction intervals. These results illustrate the capabilities of time series modeling to forecast both water demands and uncertainty estimates at spatial scales commensurate for real-time modeling applications and provide a foundation for developing a real-time integrated demand-hydraulic model.
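
    One way to capture both the daily (24 h) and weekly (168 h) cycles in hourly demand is sketched below, using harmonic (Fourier) regressors for each period feeding a model with a short ARMA component. This is an illustrative stand-in, not the authors' exact double-seasonal specification; the series name and orders are assumptions, and the log transform follows the study's finding for smaller-scale aggregated demands.

```python
# Sketch: double-seasonal structure via Fourier terms (daily + weekly) plus ARMA(1,1).
# `demand` is an assumed hourly pandas Series of water demand.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fourier_terms(n: int, period: float, k: int, prefix: str) -> pd.DataFrame:
    t = np.arange(n)
    cols = {}
    for j in range(1, k + 1):
        cols[f"{prefix}_sin{j}"] = np.sin(2 * np.pi * j * t / period)
        cols[f"{prefix}_cos{j}"] = np.cos(2 * np.pi * j * t / period)
    return pd.DataFrame(cols)

def fit_double_seasonal(demand: pd.Series, k_daily: int = 3, k_weekly: int = 2):
    y = np.log(demand)                                    # log transform for aggregated nodes
    exog = pd.concat([fourier_terms(len(y), 24, k_daily, "day"),
                      fourier_terms(len(y), 168, k_weekly, "week")], axis=1)
    exog.index = y.index
    return SARIMAX(y, exog=exog, order=(1, 0, 1)).fit(disp=False)
```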

  13. Time series models on analysing mortality rates and acute childhood lymphoid leukaemia.

    PubMed

    Kis, Maria

    2005-01-01

    In this paper we demonstrate the application of time series models in medical research. Hungarian mortality rates were analysed with autoregressive integrated moving average (ARIMA) models, and seasonal time series models were used to examine data on acute childhood lymphoid leukaemia. Mortality data may be analysed by time series methods such as ARIMA modelling. This method is demonstrated by two examples: analysis of the mortality rates of ischemic heart diseases and analysis of the mortality rates of cancer of the digestive system. Mathematical expressions are given for the results of the analysis. The relationships between time series of mortality rates were studied with ARIMA models. Confidence intervals for the autoregressive parameters were calculated by three methods: estimation based on the standard normal distribution, estimation based on White's theory, and estimation for the continuous time case. Analysing the confidence intervals of the first-order autoregressive parameters, we may conclude that the intervals obtained with the continuous time estimation model were much smaller than those obtained with the other estimation methods. We also present a new approach to analysing the occurrence of acute childhood lymphoid leukaemia, in which the time series is decomposed into components. The periodicity of acute childhood lymphoid leukaemia in Hungary was examined using the seasonal decomposition time series method. The cyclic trend of the dates of diagnosis revealed that a higher percentage of the peaks fell within the winter months than in the other seasons. This demonstrates the seasonal occurrence of childhood leukaemia in Hungary.

  14. What Drives Parents? A Case Sensitive Inquiry into Parents' Mode Preferences for the Journey to School

    ERIC Educational Resources Information Center

    Zuniga, Kelly Draper

    2010-01-01

    This dissertation uses a case-sensitive approach to examine an active travel intervention's diverse target population. It builds on a series of travel choice models, and draws key conceptual themes from Chapin's (1974) human activity model, which highlights opportunity-related and propensity-related factors associated with behaviors. The research…

  15. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling

    PubMed Central

    Zhou, Fuqun; Zhang, Aining

    2016-01-01

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests’ features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data. PMID:27792152
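
    A minimal sketch of the variable-importance step described above: rank the time-series MODIS features with a Random Forest and keep roughly the top half for classification. The array names are assumptions, and the 50% cut simply mirrors the abstract's "about a half of the variables" finding; it is not the authors' exact selection rule.

```python
# Sketch: Random Forest feature-importance ranking for band/date subset selection.
# `X` (samples x composite-date features) and `y` (land-cover labels) are assumed arrays.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def select_top_features(X: np.ndarray, y: np.ndarray, keep_frac: float = 0.5):
    rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
    rf.fit(X, y)
    order = np.argsort(rf.feature_importances_)[::-1]      # most important first
    keep = order[: max(1, int(keep_frac * X.shape[1]))]
    return keep, rf.oob_score_   # indices of retained features, out-of-bag accuracy
```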

  16. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    PubMed

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle the two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of Random Forests' features: variable importance, outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about a half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.

  17. Inference of scale-free networks from gene expression time series.

    PubMed

    Daisuke, Tominaga; Horton, Paul

    2006-04-01

    Quantitative time-series observation of gene expression is becoming possible, for example by cell array technology. However, there are no practical methods with which to infer network structures using only observed time-series data. As most computational models of biological networks for continuous time-series data have a high degree of freedom, it is almost impossible to infer the correct structures. On the other hand, it has been reported that some kinds of biological networks, such as gene networks and metabolic pathways, may have scale-free properties. We hypothesize that the architecture of inferred biological network models can be restricted to scale-free networks. We developed an inference algorithm for biological networks using only time-series data by introducing such a restriction. We adopt the S-system as the network model, and a distributed genetic algorithm to optimize models to fit its simulated results to observed time series data. We have tested our algorithm on a case study (simulated data). We compared optimization under no restriction, which allows for a fully connected network, and under the restriction that the total number of links must equal that expected from a scale free network. The restriction reduced both false positive and false negative estimation of the links and also the differences between model simulation and the given time-series data.

  18. Development of an On-board Failure Diagnostics and Prognostics System for Solid Rocket Booster

    NASA Technical Reports Server (NTRS)

    Smelyanskiy, Vadim N.; Luchinsky, Dmitry G.; Osipov, Vyatcheslav V.; Timucin, Dogan A.; Uckun, Serdar

    2009-01-01

    We develop a case breach model for the on-board fault diagnostics and prognostics system for subscale solid-rocket boosters (SRBs). The model development was motivated by recent ground firing tests, in which a deviation of measured time-traces from the predicted time-series was observed. A modified model takes into account the nozzle ablation, including the effect of roughness of the nozzle surface, the geometry of the fault, and erosion and burning of the walls of the hole in the metal case. The derived low-dimensional performance model (LDPM) of the fault can reproduce the observed time-series data very well. To verify the performance of the LDPM we build a FLUENT model of the case breach fault and demonstrate a good agreement between theoretical predictions based on the analytical solution of the model equations and the results of the FLUENT simulations. We then incorporate the derived LDPM into an inferential Bayesian framework and verify performance of the Bayesian algorithm for the diagnostics and prognostics of the case breach fault. It is shown that the obtained LDPM allows one to track parameters of the SRB during the flight in real time, to diagnose case breach fault, and to predict its values in the future. The application of the method to fault diagnostics and prognostics (FD&P) of other SRB faults modes is discussed.

  19. Multivariable and Bayesian Network Analysis of Outcome Predictors in Acute Aneurysmal Subarachnoid Hemorrhage: Review of a Pure Surgical Series in the Post-International Subarachnoid Aneurysm Trial Era.

    PubMed

    Zador, Zsolt; Huang, Wendy; Sperrin, Matthew; Lawton, Michael T

    2018-06-01

    Following the International Subarachnoid Aneurysm Trial (ISAT), evolving treatment modalities for acute aneurysmal subarachnoid hemorrhage (aSAH) have changed the case mix of patients undergoing urgent surgical clipping. To update our knowledge on outcome predictors, we analyzed admission parameters in a pure surgical series using variable importance ranking and machine learning. We reviewed a single surgeon's case series of 226 patients suffering from aSAH treated with urgent surgical clipping. Predictions were made using logistic regression models, and predictive performance was assessed using areas under the receiver operating characteristic curve (AUC). We established variable importance ranking using partial Nagelkerke R2 scores. Probabilistic associations between variables were depicted using Bayesian networks, a machine learning method. Importance ranking showed that World Federation of Neurosurgical Societies (WFNS) grade and age were the most influential outcome prognosticators. Inclusion of only these 2 predictors was sufficient to maintain model performance compared to when all variables were considered (AUC = 0.8222, 95% confidence interval (CI): 0.7646-0.88 vs 0.8218, 95% CI: 0.7616-0.8821, respectively; DeLong's P = .992). Bayesian networks showed that age and WFNS grade were associated with several variables, such as laboratory results and cardiorespiratory parameters. Our study is the first to report early outcomes and formal predictor importance ranking following aSAH in a post-ISAT surgical case series. Models showed good predictive power with fewer relevant predictors than in similar size series. Bayesian networks proved to be a powerful tool for visualizing the widespread association of the 2 key predictors with admission variables, explaining their importance and demonstrating the potential for hypothesis generation.

  20. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.

  1. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    NASA Astrophysics Data System (ADS)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War, nitrate concentrations in European water bodies changed significantly as a result of increased nitrogen fertilizer use and changes in land use. In the last decades, however, as a consequence of the implementation of nitrate-reducing measures in Europe, nitrate concentrations in water bodies have slowly decreased. As a result, the mean and variance of the observed time series also change with time (nonstationarity and heteroscedasticity). In order to detect changes and properly describe the behaviour of such time series, linear models (such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models) are no longer suitable. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the model building and calibration of conceptual water quality models or to select the right calibration periods in order to produce reliable predictions. The objective of this contribution is to analyze two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and to compare their performance with traditional linear models of the ARMA class in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes represented by self-exciting threshold autoregressive (SETAR) and Markov-switching (MSW) models. The analysis showed that, based on the residual sum of squares (RSS), SETAR and MSW models described both datasets better than models of the ARMA class. In most cases the relative improvement of SETAR models over first-order AR models was low, ranging between 1% and 4%, with the exception of the three-regime model for the River Stour time series, where the improvement was 48.9%. In comparison, the relative improvement of MSW models was between 44.6% and 52.5% for two-regime models and from 60.4% to 75% for three-regime models. However, the visual assessment of models plotted against the original datasets showed that, despite a higher RSS, some ARMA models could describe the analyzed time series better than AR, MA and SETAR models with lower RSS values. In both datasets MSW models provided a very good visual fit, describing most of the extreme values.
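
    A minimal sketch of a two-regime Markov-switching autoregression, the MSW class of models that performed best above. The series name and the statsmodels implementation are assumptions for illustration; allowing the variance to switch between regimes is one way to accommodate the heteroscedasticity mentioned in the text.

```python
# Sketch: two-regime Markov-switching AR(1) with regime-dependent variance.
# `nitrate` is an assumed pandas Series of nitrate concentrations.
import pandas as pd
from statsmodels.tsa.regime_switching.markov_autoregression import MarkovAutoregression

def fit_msw(nitrate: pd.Series, k_regimes: int = 2):
    model = MarkovAutoregression(nitrate, k_regimes=k_regimes, order=1,
                                 switching_ar=True, switching_variance=True)
    res = model.fit()
    return res, res.smoothed_marginal_probabilities  # per-regime probabilities over time
```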

  2. Weather variability and the incidence of cryptosporidiosis: comparison of time series poisson regression and SARIMA models.

    PubMed

    Hu, Wenbiao; Tong, Shilu; Mengersen, Kerrie; Connell, Des

    2007-09-01

    Few studies have examined the relationship between weather variables and cryptosporidiosis in Australia. This paper examines the potential impact of weather variability on the transmission of cryptosporidiosis and explores the possibility of developing an empirical forecast system. Data on weather variables, notified cryptosporidiosis cases, and population size in Brisbane were supplied by the Australian Bureau of Meteorology, Queensland Department of Health, and Australian Bureau of Statistics for the period January 1, 1996-December 31, 2004, respectively. Time series Poisson regression and seasonal autoregressive integrated moving average (SARIMA) models were used to examine the potential impact of weather variability on the transmission of cryptosporidiosis. Both the time series Poisson regression and SARIMA models show that seasonal and monthly maximum temperature at a prior moving average of 1 and 3 months were significantly associated with cryptosporidiosis. The results suggest that there may be about 50 more cases a year for each 1°C increase in average maximum temperature in Brisbane. Model assessments indicated that the SARIMA model had better predictive ability than the Poisson regression model (SARIMA: root mean square error (RMSE): 0.40, Akaike information criterion (AIC): -12.53; Poisson regression: RMSE: 0.54, AIC: -2.84). Furthermore, the analysis of residuals shows that the time series Poisson regression appeared to violate a modeling assumption, in that residual autocorrelation persisted. The results of this study suggest that weather variability (particularly maximum temperature) may have played a significant role in the transmission of cryptosporidiosis. A SARIMA model may be a better predictive model than a Poisson regression model for assessing the relationship between weather variability and the incidence of cryptosporidiosis.

  3. Case Studies of Successful Schoolwide Enrichment Model-Reading (SEM-R) Classroom Implementations. Research Monograph Series. RM10204

    ERIC Educational Resources Information Center

    Reis, Sally M.; Little, Catherine A.; Fogarty, Elizabeth; Housand, Angela M.; Housand, Brian C.; Sweeny, Sheelah M.; Eckert, Rebecca D.; Muller, Lisa M.

    2010-01-01

    The purpose of this qualitative study was to examine the scaling up of the Schoolwide Enrichment Model in Reading (SEM-R) in 11 elementary and middle schools in geographically diverse sites across the country. Qualitative comparative analysis was used in this study, with multiple data sources compiled into 11 in-depth school case studies…

  4. Fully Bayesian Estimation of Data from Single Case Designs

    ERIC Educational Resources Information Center

    Rindskopf, David

    2013-01-01

    Single case designs (SCDs) generally consist of a small number of short time series in two or more phases. The analysis of SCDs statistically fits in the framework of a multilevel model, or hierarchical model. The usual analysis does not take into account the uncertainty in the estimation of the random effects. This not only has an effect on the…

  5. A Four-Stage Hybrid Model for Hydrological Time Series Forecasting

    PubMed Central

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782
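
    A heavily simplified sketch of the 'denoising, decomposition and ensemble' idea above is given below, assuming the PyEMD package for EEMD and using a generic neural regressor as a stand-in for the paper's RBF neural network; the lag embedding, one-step-ahead framing, and all names are illustrative simplifications rather than the authors' pipeline.

```python
# Sketch: decompose a series with EEMD, forecast each component one step ahead,
# and sum the component forecasts. `series` is an assumed 1-D numpy array.
import numpy as np
from PyEMD import EEMD                      # pip install EMD-signal (assumed dependency)
from sklearn.neural_network import MLPRegressor

def hybrid_forecast(series: np.ndarray, lag: int = 6) -> float:
    imfs = EEMD().eemd(series)              # intrinsic mode function components
    component_preds = []
    for comp in imfs:
        # lag-embed each component: predict comp[t] from its previous `lag` values
        X = np.column_stack([comp[i:len(comp) - lag + i] for i in range(lag)])
        y = comp[lag:]
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
        component_preds.append(model.predict(comp[-lag:].reshape(1, -1))[0])
    return float(np.sum(component_preds))   # ensemble: sum of component forecasts
```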

  6. A four-stage hybrid model for hydrological time series forecasting.

    PubMed

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of 'denoising, decomposition and ensemble'. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models.

  7. Road safety forecasts in five European countries using structural time series models.

    PubMed

    Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George

    2014-01-01

    Modeling road safety development is a complex task and needs to consider both the quantifiable impact of specific parameters as well as the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and (2) the latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology proves to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
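
    A minimal sketch of the first of the two structural models named above, a local linear trend fitted to an annual fatality-risk series with an unobserved-components framework. The series name and the statsmodels implementation are assumptions; the bivariate latent risk model is not reproduced here.

```python
# Sketch: local linear trend (level + stochastic slope) with medium-term forecasts.
# `fatality_risk` is an assumed annual pandas Series, e.g. log fatalities per vehicle-km.
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

def fit_local_linear_trend(fatality_risk: pd.Series, horizon: int = 10):
    model = UnobservedComponents(fatality_risk, level="local linear trend")
    res = model.fit(disp=False)
    fc = res.get_forecast(steps=horizon)
    return res, fc.predicted_mean, fc.conf_int()   # point forecasts and confidence bands
```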

  8. Effect of temperature and precipitation on salmonellosis cases in South-East Queensland, Australia: an observational study

    PubMed Central

    Barnett, Adrian Gerard

    2016-01-01

    Objective Foodborne illnesses in Australia, including salmonellosis, are estimated to cost over $A1.25 billion annually. Weather has been identified as influencing salmonellosis incidence, as cases increase during summer; however, time series modelling of salmonellosis is challenging because outbreaks cause strong autocorrelation. This study assesses whether switching models are an improved method of estimating weather–salmonellosis associations. Design We analysed weather and salmonellosis in South-East Queensland between 2004 and 2013 using 2 common regression models and a switching model, each with 21-day lags for temperature and precipitation. Results The switching model best fit the data, as judged by its substantial improvement in deviance information criterion over the regression models, less autocorrelated residuals, and control of seasonality. The switching model estimated that a 5°C increase in mean temperature and 10 mm of precipitation were associated with increases in salmonellosis cases of 45.4% (95% CrI 40.4%, 50.5%) and 24.1% (95% CrI 17.0%, 31.6%), respectively. Conclusions Switching models improve on traditional time series models in quantifying weather–salmonellosis associations. A better understanding of how temperature and precipitation influence salmonellosis may identify where interventions can be made to lower the health and economic costs of salmonellosis. PMID:26916693

  9. ASSESSING MULTIMEDIA/MULTIPATHWAY EXPOSURE TO ARSENIC USING A MECHANISTIC SOURCE-TO-DOSE MODELING FRAMEWORK

    EPA Science Inventory

    A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework currently considers five exposure routes: i...

  10. Boolean network identification from perturbation time series data combining dynamics abstraction and logic programming.

    PubMed

    Ostrowski, M; Paulevé, L; Schaub, T; Siegel, A; Guziolowski, C

    2016-11-01

    Boolean networks (and more general logic models) are useful frameworks to study signal transduction across multiple pathways. Logic models can be learned from a prior knowledge network structure and multiplex phosphoproteomics data. However, most efficient and scalable training methods focus on the comparison of two time-points and assume that the system has reached an early steady state. In this paper, we generalize such a learning procedure to take into account the time series traces of phosphoproteomics data in order to discriminate Boolean networks according to their transient dynamics. To that end, we identify a necessary condition that must be satisfied by the dynamics of a Boolean network to be consistent with a discretized time series trace. Based on this condition, we use Answer Set Programming to compute an over-approximation of the set of Boolean networks which fit best with experimental data, and we provide the corresponding encodings. Combined with model-checking approaches, we end up with a global learning algorithm. Our approach is able to learn logic models with a true positive rate higher than 78% in two case studies of mammalian signaling networks; for a larger case study, our method provides optimal answers after 7 minutes of computation. We quantified the gain in prediction precision of our method compared with learning approaches based on static data. Finally, as an application, our method identifies potentially erroneous time-points in the time series data with respect to the optimal learned logic models. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  11. Relating Factor Models for Longitudinal Data to Quasi-Simplex and NARMA Models

    ERIC Educational Resources Information Center

    Rovine, Michael J.; Molenaar, Peter C. M.

    2005-01-01

    In this article we show the one-factor model can be rewritten as a quasi-simplex model. Using this result along with addition theorems from time series analysis, we describe a common general model, the nonstationary autoregressive moving average (NARMA) model, that includes as a special case, any latent variable model with continuous indicators…

  12. [Study on the ARIMA model application to predict echinococcosis cases in China].

    PubMed

    En-Li, Tan; Zheng-Feng, Wang; Wen-Ce, Zhou; Shi-Zhu, Li; Yan, Lu; Lin, Ai; Yu-Chun, Cai; Xue-Jiao, Teng; Shun-Xian, Zhang; Zhi-Sheng, Dang; Chun-Li, Yang; Jia-Xu, Chen; Wei, Hu; Xiao-Nong, Zhou; Li-Guang, Tian

    2018-02-26

    To predict the monthly reported echinococcosis cases in China with the autoregressive integrated moving average (ARIMA) model, so as to provide a reference for the prevention and control of echinococcosis. SPSS 24.0 software was used to construct ARIMA models based on the time series of monthly reported echinococcosis cases from 2007 to 2015 and from 2007 to 2014, respectively, and the accuracies of the two ARIMA models were compared. The model based on the monthly reported cases of echinococcosis in China from 2007 to 2015 was ARIMA (1,0,0)(1,1,0)12; the relative error between reported and predicted cases was -13.97%, with AR(1) = 0.367 (t = 3.816, P < 0.001), SAR(1) = -0.328 (t = -3.361, P = 0.001), and Ljung-Box Q = 14.119 (df = 16, P = 0.590). The model based on the monthly reported cases of echinococcosis in China from 2007 to 2014 was ARIMA (1,0,0)(1,0,1)12; the relative error between reported and predicted cases was 0.56%, with AR(1) = 0.413 (t = 4.244, P < 0.001), SAR(1) = 0.809 (t = 9.584, P < 0.001), SMA(1) = 0.356 (t = 2.278, P = 0.025), and Ljung-Box Q = 18.924 (df = 15, P = 0.217). Different time series of the same infectious disease may thus yield different ARIMA models. It remains to be further verified whether, as more data are accumulated and the prediction period is shortened, the average relative error becomes smaller. The establishment and prediction of an ARIMA model is a dynamic process that needs to be adjusted and optimized continuously according to the accumulated data; meanwhile, full consideration should be given to the intensity of infectious disease reporting work (such as disease censuses and special surveys).

  13. Uniting Mandelbrot’s Noah and Joseph Effects in Toy Models of Natural Hazard Time Series

    NASA Astrophysics Data System (ADS)

    Credgington, D.; Watkins, N. W.; Chapman, S. C.; Rosenberg, S. J.; Sanchez, R.

    2009-12-01

    The forecasting of extreme events is a highly topical, cross-disciplinary problem. One aspect which is potentially tractable even when the events themselves are stochastic is the probability of a “burst” of a given size and duration, defined as the area between a time series and a constant threshold. Many natural time series depart from the simplest, Brownian, case, and in the 1960s Mandelbrot developed the use of fractals to describe these departures. In particular, he proposed two kinds of fractal model to capture the way in which natural data are often persistent in time (his “Joseph effect”, common in hydrology and exemplified by fractional Brownian motion) and/or prone to heavy-tailed jumps (the “Noah effect”, typical of economic index time series, for which he gave Levy flights as an exemplar). Much of the earlier modelling, however, has emphasised one of the Noah and Joseph parameters (the tail exponent mu, or one derived from the temporal behaviour such as the power spectral exponent beta) at the other's expense. I will describe work [1] in which we applied a simple self-affine stable model, linear fractional stable motion (LFSM), which unifies both effects to better describe natural data, in this case from space physics. I will show how we have resolved some contradictions seen in earlier work, where purely Joseph or Noah descriptions had been sought. I will also show recent work [2] using numerical simulations of LFSM and simple analytic scaling arguments to study the problem of the area between a fractional Levy model time series and a threshold. [1] Watkins et al., Space Science Reviews [2005]. [2] Watkins et al., Physical Review E [2009].

  14. Evaluation of the impact on human salmonellosis of control measures targeted to Salmonella Enteritidis and Typhimurium in poultry breeding using time-series analysis and intervention models in France.

    PubMed

    Poirier, E; Watier, L; Espie, E; Weill, F-X; De Valk, H; Desenclos, J-C

    2008-09-01

    In France, salmonellosis is the main cause of foodborne bacterial infection, with serotypes Enteritidis (SE) and Typhimurium (ST) accounting for 70% of all cases. French authorities implemented a national control programme targeting SE and ST in poultry and eggs from October 1998 onwards. A 33% decrease in salmonellosis has been observed since implementation. We designed an evaluation of the impact of this control programme on SE and ST human infections in France. Using monthly Salmonella human isolate reports to the National Reference Centre we defined two intervention series (SE and ST) and one control series comprising serotypes not known to be associated with poultry or eggs. The series, from 1992 to 2003, were analysed using autoregressive moving average (ARMA) models. To test the hypothesis of a reduction of SE and ST human cases >0 after the programme started and to estimate its size, we introduced an intervention model to the ARMA modelling. In contrast to the control series, we found an annual reduction of 555 (95% CI 148-964) SE and of 492 (95% CI 0-1092) ST human infections, representing a 21% and an 18% decrease, respectively. For SE, the decrease occurred sharply after implementation, while for ST it followed a progressive decrease that started early in 1998. Our study suggests a true relation between the Salmonella control programme and the subsequent decrease observed for the two targeted serotypes. For ST, however, the decrease prior to the intervention may also reflect control measures implemented earlier by the cattle and milk industry.
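
    A minimal sketch of the intervention idea follows, under assumptions: a step regressor marking the start of the control programme is added to an ARMA-type model, and its coefficient estimates the post-intervention change. The ARMA orders and the input file are illustrative, not the authors' specification.

```python
# Hedged sketch of intervention analysis with a step ("programme in force") regressor.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical monthly counts of SE isolates, 1992-2003.
se = pd.read_csv("se_monthly.csv", index_col=0, parse_dates=True)["cases"]

# Step dummy: 0 before the programme started, 1 afterwards.
step = pd.Series((se.index >= "1998-10-01").astype(float),
                 index=se.index).to_frame("programme")

res = SARIMAX(se, exog=step, order=(1, 0, 1)).fit(disp=False)

effect_per_month = res.params["programme"]   # estimated level change per month
print("estimated annual change:", 12 * effect_per_month)
```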

  15. Generalized multiplicative error models: Asymptotic inference and empirical analysis

    NASA Astrophysics Data System (ADS)

    Li, Qian

    This dissertation consists of two parts. The first part focuses on extended Multiplicative Error Models (MEM) that include two extreme cases for nonnegative series. These extreme cases are common phenomena in high-frequency financial time series. The Location MEM(p,q) model incorporates a location parameter so that the series are required to have positive lower bounds. The estimator for the location parameter turns out to be the minimum of all the observations and is shown to be consistent. The second case captures series with a nontrivial fraction of zero outcomes and combines a so-called Zero-Augmented general F distribution with a linear MEM(p,q). Under certain strict stationarity and moment conditions, we establish consistency and asymptotic normality of the semiparametric estimators for these two new models. The second part of this dissertation examines the differences and similarities between trades in the home market and trades in the foreign market of cross-listed stocks. We exploit the multiplicative framework to model trading duration, volume per trade and price volatility for Canadian shares that are cross-listed on the New York Stock Exchange (NYSE) and the Toronto Stock Exchange (TSX). We explore the clustering effect, the interaction between trading variables, and the time needed for price equilibrium after a perturbation in each market. The clustering effect is studied through the use of a univariate MEM(1,1) on each variable, while the interactions among duration, volume and price volatility are captured by a multivariate system of MEM(p,q). After estimating these models by a standard QMLE procedure, we exploit the impulse response function to compute the calendar time for a perturbation in these variables to be absorbed into price variance, and use common statistical tests to identify the difference between the two markets in each aspect. These differences are of considerable interest to traders, stock exchanges and policy makers.

  16. Time lag between immigration and tuberculosis rates in immigrants in the Netherlands: a time-series analysis.

    PubMed

    van Aart, C; Boshuizen, H; Dekkers, A; Korthals Altes, H

    2017-05-01

    In low-incidence countries, most tuberculosis (TB) cases are foreign-born. We explored the temporal relationship between immigration and TB in first-generation immigrants between 1995 and 2012 to assess whether immigration can be a predictor for TB in immigrants from high-incidence countries. We obtained monthly data on immigrant TB cases and immigration for the three countries of origin most frequently represented among TB cases in the Netherlands: Morocco, Somalia and Turkey. The best-fitting seasonal autoregressive integrated moving average (SARIMA) model for the immigration time series was used to prewhiten the TB time series. The cross-correlation function (CCF) was then computed on the residual time series to detect time lags between immigration and TB rates. We identified a 17-month lag between Somali immigration and Somali immigrant TB cases, but no time lag for immigrants from Morocco and Turkey. The absence of a lag in the Moroccan and Turkish populations may be attributed to the relatively low TB prevalence in the countries of origin and an increased likelihood of reactivation TB in an ageing immigrant population. Understanding the time lag between Somali immigration and TB disease would benefit from a closer epidemiological analysis of cohorts of Somali cases diagnosed within the first years after entry.
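
    The prewhitening-and-cross-correlation step can be sketched as follows, under assumptions: the SARIMA order shown is illustrative rather than the best-fitting model the authors selected, and the input files are hypothetical.

```python
# Sketch: prewhiten both series with the immigration model, then cross-correlate residuals.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.tsa.stattools import ccf

imm = pd.read_csv("somali_immigration.csv", index_col=0, parse_dates=True)["count"]
tb = pd.read_csv("somali_tb_cases.csv", index_col=0, parse_dates=True)["count"]

# Fit a SARIMA model to the immigration series (order is illustrative) ...
imm_fit = SARIMAX(imm, order=(1, 0, 0), seasonal_order=(0, 1, 1, 12)).fit(disp=False)

# ... and apply the same filter (same parameters) to the TB series,
# so both residual series are "prewhitened" by the immigration dynamics.
tb_resid = SARIMAX(tb, order=(1, 0, 0),
                   seasonal_order=(0, 1, 1, 12)).filter(imm_fit.params).resid

# Cross-correlate the residuals; a peak near lag k would suggest TB cases
# lag immigration by roughly k months.
print(ccf(imm_fit.resid, tb_resid)[:24])
```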

  17. [Automated detection of estrus and mastitis in dairy cows].

    PubMed

    de Mol, R M

    2001-02-15

    The development and testing of detection models for oestrus and mastitis in dairy cows is described in a PhD thesis that was defended in Wageningen on June 5, 2000. These models were based on sensors for milk yield, milk temperature, electrical conductivity of milk, cow activity and concentrate intake, and on combined processing of the sensor data. The models alert farmers to cows that need attention because of possible oestrus or mastitis. A first detection model, for cows milked twice a day, was based on time series models for the sensor variables. A time series model describes the dependence between successive observations. The parameters of the time series models were fitted on-line for each cow after each milking by means of a Kalman filter, a mathematical method for estimating the state of a system on-line. The Kalman filter gives the best estimate of the current state of a system based on all preceding observations. This model was tested for 2 years on two experimental farms, and under field conditions on four farms over several years. A second detection model, for cows milked in an automatic milking system (AMS), was based on a generalization of the first model. Two data sets (one small, one large) were used for testing. The results for oestrus detection were good for both models. The results for mastitis detection varied (in some cases good, in other cases moderate). Fuzzy logic was used to classify mastitis and oestrus alerts with both detection models, to reduce the number of false positive alerts. Fuzzy logic makes approximate reasoning possible, where statements can be partly true or false. Inputs for the fuzzy logic model were alerts from the detection models and additional information. The number of false positive alerts decreased considerably, while the number of detected cases remained at the same level. These models make automated detection possible in practice.
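
    For intuition only, a toy version of the on-line filtering idea is sketched below: a local-level Kalman filter tracks one sensor variable per cow and flags milkings with unusually large innovations. It is far simpler than the thesis models, and the noise variances are assumed values.

```python
# Toy sketch of on-line deviation detection with a local-level Kalman filter.
import numpy as np

def kalman_alerts(yields, q=0.05, r=0.5, z_alert=3.0):
    """yields: sequence of per-milking yields for one cow (assumed values)."""
    level, p = yields[0], 1.0           # state estimate and its variance
    alerts = []
    for t, y in enumerate(yields[1:], start=1):
        p += q                           # predict: random-walk level model
        innovation = y - level
        s = p + r                        # innovation variance
        if abs(innovation) / np.sqrt(s) > z_alert:
            alerts.append(t)             # large deviation -> cow needs attention
        k = p / s                        # Kalman gain
        level += k * innovation          # update state with new observation
        p *= (1 - k)
    return alerts

print(kalman_alerts([11.8, 12.1, 11.9, 12.3, 8.1, 12.0, 12.2]))
```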

  18. Computer aided system engineering and analysis (CASE/A) modeling package for ECLS systems - An overview

    NASA Technical Reports Server (NTRS)

    Dalee, Robert C.; Bacskay, Allen S.; Knox, James C.

    1990-01-01

    An overview of the CASE/A-ECLSS series modeling package is presented. CASE/A is an analytical tool that has improved engineering productivity during ECLSS design activities. A component verification program was performed to assure component modeling validity, based on test data from the Phase II comparative test program completed at the Marshall Space Flight Center. An integrated plotting feature has been added to the program which allows the operator to analyze on-screen data trends or obtain hard-copy plots from within the CASE/A operating environment. New command features in the areas of schematic, output, and model management, and component data editing have been incorporated to enhance the engineer's productivity during a modeling program.

  19. Neural networks and traditional time series methods: a synergistic combination in state economic forecasts.

    PubMed

    Hansen, J V; Nelson, R D

    1997-01-01

    Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important, since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods, so that it overlooks an emerging new phenomenon in the data. In this case, neural networks identify the new pattern, which then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insights that result from a portfolio of forecasts that includes neural networks exceed the understanding generated from strictly statistical forecasting techniques. In this case, the synergy clearly results in the whole of the portfolio of forecasts being more accurate than the sum of the individual parts.

  20. Renormalization group analysis of the Reynolds stress transport equation

    NASA Technical Reports Server (NTRS)

    Rubinstein, R.; Barton, J. M.

    1992-01-01

    The pressure-velocity correlation and return-to-isotropy term in the Reynolds stress transport equation are analyzed using the Yakhot-Orszag renormalization group. The perturbation series for the relevant correlations, evaluated to lowest order in the epsilon-expansion of the Yakhot-Orszag theory, are infinite series in tensor product powers of the mean velocity gradient and its transpose. Formal lowest-order Pade approximations to the sums of these series produce a fast pressure-strain model of the form proposed by Launder, Reece, and Rodi, and a return-to-isotropy model of the form proposed by Rotta. In both cases, the model constants are computed theoretically. The predicted Reynolds stress ratios in simple shear flows are evaluated and compared with experimental data. The possibility of deriving higher-order nonlinear models by approximating the sums more accurately is discussed.

  1. A seasonal Bartlett-Lewis Rectangular Pulse model

    NASA Astrophysics Data System (ADS)

    Ritschel, Christoph; Agbéko Kpogo-Nuwoklo, Komlan; Rust, Henning; Ulbrich, Uwe; Névir, Peter

    2016-04-01

    Precipitation time series with a high temporal resolution are needed as input for several hydrological applications, e.g. river runoff or sewer system models. As adequate observational data sets are often not available, simulated precipitation series are used instead. Poisson-cluster models are commonly applied to generate these series; it has been shown that this class of stochastic precipitation models reproduces important characteristics of observed rainfall well. For the gauge-based case study presented here, the Bartlett-Lewis rectangular pulse model (BLRPM) has been chosen. As certain model parameters have been shown to vary with season in a midlatitude moderate climate, due to different rainfall mechanisms dominating in winter and summer, model parameters are typically estimated separately for individual seasons or individual months. Here, we suggest a simultaneous parameter estimation for the whole year under the assumption that the seasonal variation of parameters can be described with harmonic functions. We use an observational precipitation series from Berlin with a high temporal resolution to exemplify the approach. We estimate BLRPM parameters with and without this seasonal extension and compare the results in terms of model performance and robustness of the estimation.

  2. A Method for Comparing Multivariate Time Series with Different Dimensions

    PubMed Central

    Tapinos, Avraam; Mendes, Pedro

    2013-01-01

    In many situations it is desirable to compare dynamical systems based on their behavior. Similarity of behavior often implies similarity of internal mechanisms or dependency on common extrinsic factors. While there are widely used methods for comparing univariate time series, most dynamical systems are characterized by multivariate time series. Yet, comparison of multivariate time series has been limited to cases where they share a common dimensionality. A semi-metric is a distance function that has the properties of non-negativity, symmetry and reflexivity, but not sub-additivity. Here we develop a semi-metric – SMETS – that can be used for comparing groups of time series that may have different dimensions. To demonstrate its utility, the method is applied to dynamic models of biochemical networks and to portfolios of shares. The former is an example of a case where the dependencies between system variables are known, while in the latter the system is treated (and behaves) as a black box. PMID:23393554

  3. Second-degree Stokes coefficients from multi-satellite SLR

    NASA Astrophysics Data System (ADS)

    Bloßfeld, Mathis; Müller, Horst; Gerstl, Michael; Štefka, Vojtěch; Bouman, Johannes; Göttl, Franziska; Horwath, Martin

    2015-09-01

    The long-wavelength part of the Earth's gravity field can be determined, with varying accuracy, from satellite laser ranging (SLR). In this study, we investigate the combination of up to ten geodetic SLR satellites using iterative variance component estimation. SLR observations to different satellites are combined in order to identify the impact of each satellite on the estimated Stokes coefficients. The combination of satellite-specific weekly or monthly arcs makes it possible to reduce parameter correlations of the single-satellite solutions and leads to alternative estimates of the second-degree Stokes coefficients. This alternative time series might be helpful for assessing the uncertainty in the impact of the low-degree Stokes coefficients on geophysical investigations. In order to validate the obtained time series of second-degree Stokes coefficients, a comparison with the SLR RL05 time series of the Center for Space Research (CSR) is performed. This investigation shows that all time series are comparable to the CSR time series. The precision of the weekly/monthly … coefficients is analyzed by comparing mass-related equatorial excitation functions with geophysical model results and reduced geodetic excitation functions. In the case of …, the annual amplitude and phase of the DGFI solution agree better with three of four geophysical model combinations than other time series. In the case of …, all time series agree very well with each other. The impact of … on the ice mass trend estimates for Antarctica is compared based on CSR GRACE RL05 solutions, in which different monthly time series are used for the replacement. We found differences in the long-term Antarctic ice loss of … Gt/year between the GRACE solutions induced by the different SLR time series of CSR and DGFI, which is about 13% of the total ice loss of Antarctica. This result shows that Antarctic ice mass loss quantifications must be carefully interpreted.

  4. Models for short term malaria prediction in Sri Lanka

    PubMed Central

    Briët, Olivier JT; Vounatsou, Penelope; Gunawardena, Dissanayake M; Galappaththy, Gawrie NL; Amerasinghe, Priyanie H

    2008-01-01

    Background Malaria in Sri Lanka is unstable and fluctuates in intensity both spatially and temporally. Although the case counts are dwindling at present, given the past history of resurgence of outbreaks despite effective control measures, the control programmes have to stay prepared. The availability of long time series of monitored/diagnosed malaria cases allows for the study of forecasting models, with an aim to developing a forecasting system which could assist in the efficient allocation of resources for malaria control. Methods Exponentially weighted moving average models, autoregressive integrated moving average (ARIMA) models with seasonal components, and seasonal multiplicative autoregressive integrated moving average (SARIMA) models were compared on monthly time series of district malaria cases for their ability to predict the number of malaria cases one to four months ahead. The addition of covariates such as the number of malaria cases in neighbouring districts or rainfall were assessed for their ability to improve prediction of selected (seasonal) ARIMA models. Results The best model for forecasting and the forecasting error varied strongly among the districts. The addition of rainfall as a covariate improved prediction of selected (seasonal) ARIMA models modestly in some districts but worsened prediction in other districts. Improvement by adding rainfall was more frequent at larger forecasting horizons. Conclusion Heterogeneity of patterns of malaria in Sri Lanka requires regionally specific prediction models. Prediction error was large at a minimum of 22% (for one of the districts) for one month ahead predictions. The modest improvement made in short term prediction by adding rainfall as a covariate to these prediction models may not be sufficient to merit investing in a forecasting system for which rainfall data are routinely processed. PMID:18460204

  5. Stochastic model stationarization by eliminating the periodic term and its effect on time series prediction

    NASA Astrophysics Data System (ADS)

    Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan

    2017-04-01

    Because time series stationarization has a key role in stochastic modeling results, three methods are analyzed in this study: seasonal differencing, seasonal standardization and spectral analysis, each used to eliminate the periodic effect on time series stationarity. First, six time series, comprising four streamflow series and two water temperature series, are stationarized. The stochastic term for these series obtained with ARIMA is subsequently modeled. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The obtained results indicate that all three methods present acceptable performance overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature, although it performs quite similarly to seasonal differencing, while spectral analysis performed the weakest in all cases. It is concluded that, for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic-effect elimination. Moreover, the monthly water temperature is predicted with more accuracy than the monthly streamflow. The ratio of the average stochastic term to the amplitude of the periodic term obtained for monthly streamflow and monthly water temperature was 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04, respectively. As a result, the periodic term is more dominant relative to the stochastic term for the monthly water temperature series than for the streamflow series.
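
    Two of the three stationarization approaches compared above are easy to illustrate; the sketch below applies seasonal differencing and seasonal standardization to a hypothetical monthly series (the spectral-analysis variant is omitted).

```python
# Sketch of two stationarization methods for a monthly (period-12) series.
import pandas as pd

y = pd.read_csv("monthly_flow.csv", index_col=0, parse_dates=True)["flow"]  # hypothetical file

# 1) Seasonal differencing: y_t - y_{t-12}
diffed = y.diff(12).dropna()

# 2) Seasonal standardization: remove each calendar month's mean and
#    divide by that month's standard deviation.
month = y.index.month
standardized = (y - y.groupby(month).transform("mean")) / y.groupby(month).transform("std")
```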

  6. Syntheses of the current model applications for managing water and needs for experimental data and model improvements to enhance these applications

    USDA-ARS?s Scientific Manuscript database

    This volume of the Advances in Agricultural Systems Modeling series presents 14 different case studies of model applications to help make the best use of limited water in agriculture. These examples show that models have tremendous potential and value in enhancing site-specific water management for ...

  7. A time series model: First-order integer-valued autoregressive (INAR(1))

    NASA Astrophysics Data System (ADS)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. The first-order integer-valued autoregressive model, INAR(1), is constructed with the binomial thinning operator to model such series. INAR(1) depends on the value of the process one period before. The model parameter can be estimated by conditional least squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses a median or a Bayesian forecasting methodology. The median forecasting methodology takes the smallest integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology forecasts h steps ahead by generating the model parameter and the innovation-term parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 to April 2016.
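
    The binomial-thinning recursion that defines INAR(1) can be sketched directly; the parameter values below are illustrative, not estimates from the pneumonia series, and the estimation and forecasting steps are omitted.

```python
# Sketch of the INAR(1) recursion X_t = a ∘ X_{t-1} + e_t, where "∘" is
# binomial thinning (each of the X_{t-1} counts survives with probability a)
# and e_t is a Poisson innovation. Parameter values are illustrative.
import numpy as np

def simulate_inar1(n, a=0.4, lam=2.0, x0=5, seed=0):
    rng = np.random.default_rng(seed)
    x = [x0]
    for _ in range(n - 1):
        survivors = rng.binomial(x[-1], a)        # binomial thinning a ∘ X_{t-1}
        x.append(survivors + rng.poisson(lam))    # plus innovation term
    return np.array(x)

series = simulate_inar1(120)
print(series[:12])
```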

  8. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    PubMed

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm, utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free-of-charge software. © 2017, National Ground Water Association.

  9. Prediction of gastrointestinal disease with over-the-counter diarrheal remedy sales records in the San Francisco Bay Area.

    PubMed

    Kirian, Michelle L; Weintraub, June M

    2010-07-20

    Water utilities continue to be interested in implementing syndromic surveillance for the enhanced detection of waterborne disease outbreaks. The authors evaluated the ability of sales of over-the-counter diarrheal remedies, available from the National Retail Data Monitor, to predict endemic and epidemic gastrointestinal disease in the San Francisco Bay Area. Time series models were fit to weekly diarrheal remedy sales and diarrheal illness case counts. Cross-correlations between the pre-whitened residual series were calculated. Diarrheal remedy sales model residuals were regressed on the number of weekly outbreaks and outbreak-associated cases. Diarrheal remedy sales models were used to auto-forecast one-week-ahead sales. The sensitivity and specificity of signals, generated when observed diarrheal remedy sales exceeded the upper 95% forecast confidence interval, in predicting weekly outbreaks were calculated. No significant correlations were identified between weekly diarrheal remedy sales and diarrheal illness case counts, outbreak counts, or the number of outbreak-associated cases. Signals generated by forecasting with the diarrheal remedy sales model did not coincide with outbreak weeks more reliably than signals chosen randomly. This work does not support the implementation of syndromic surveillance for gastrointestinal disease with data available through the National Retail Data Monitor.

  10. Covariance Function for Nearshore Wave Assimilation Systems

    DTIC Science & Technology

    2018-01-30

    ... covariance can be modeled by a parameterized Gaussian function; for nearshore wave assimilation applications, the covariance function depends primarily on ... In case of missing values in the compiled time series, the gaps were filled by weighted interpolation. The weights depend on the number of the ... averaging, in order to create the continuous time series, filters out the dependency on the instantaneous meteorological and oceanographic conditions.

  11. Case Series in Cognitive Neuropsychology: Promise, Perils and Proper Perspective

    PubMed Central

    Rapp, Brenda

    2012-01-01

    Schwartz & Dell (2010) advocated for a major role for case series investigations in cognitive neuropsychology. They defined the key features of this approach and presented a number of arguments and examples illustrating the benefits of case series studies and their contribution to computational cognitive neuropsychology. In the Special Issue on “Case Series in Cognitive Neuropsychology” there are six commentaries on Schwartz and Dell (2010) as well as a response to the six commentaries by Dell and Schwartz. In this paper, I provide a brief summary of the key points made in Schwartz and Dell (2010) and I review the promise and perils of case series design as revealed by the six commentaries. I conclude by placing the set of papers within a broader perspective, providing some clarification of the historical record on case series and single case approaches, raising some cautionary notes for case series studies and situating both case series and single case approaches within the larger context of theory development in the cognitive sciences. PMID:22746685

  12. BASINS and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications (External Review Draft)

    EPA Science Inventory

    This draft report supports application of two recently developed water modeling tools, the BASINS and WEPP climate assessment tools. The report presents a series of short case studies designed to illustrate the capabilities of these tools for conducting scenario based assessments...

  13. Evolving Curricular Models in Culinary Arts: An Instrumental Case Study of a Technical Field

    ERIC Educational Resources Information Center

    Cossio, Allison

    2016-01-01

    The purpose of this research study was to examine how chefs and other individuals in the food industry understood the field of culinary arts. This study used an instrumental case study with purposeful sampling of multiple cases. Through a series of open-ended interviews using snowball-sampling strategy that concluded with 45 participants sharing…

  14. Evaluation of the impact on human salmonellosis of control measures targeted to Salmonella Enteritidis and Typhimurium in poultry breeding using time-series analysis and intervention models in France

    PubMed Central

    POIRIER, E.; WATIER, L.; ESPIE, E.; WEILL, F.-X.; VALK, H. DE; DESENCLOS, J.-C.

    2008-01-01

    SUMMARY In France, salmonellosis is the main cause of foodborne bacterial infection, with serotypes Enteritidis (SE) and Typhimurium (ST) accounting for 70% of all cases. French authorities implemented a national control programme targeting SE and ST in poultry and eggs from October 1998 onwards. A 33% decrease in salmonellosis has been observed since implementation. We designed an evaluation of the impact of this control programme on SE and ST human infections in France. Using monthly Salmonella human isolate reports to the National Reference Centre we defined two intervention series (SE and ST) and one control series comprising serotypes not known to be associated with poultry or eggs. The series, from 1992 to 2003, were analysed using autoregressive moving average (ARMA) models. To test the hypothesis of a reduction of SE and ST human cases >0 after the programme started and to estimate its size, we introduced an intervention model to the ARMA modelling. In contrast to the control series, we found an annual reduction of 555 (95% CI 148–964) SE and of 492 (95% CI 0–1092) ST human infections, representing a 21% and an 18% decrease, respectively. For SE, the decrease occurred sharply after implementation, while for ST it followed a progressive decrease that started early in 1998. Our study suggests a true relation between the Salmonella control programme and the subsequent decrease observed for the two targeted serotypes. For ST, however, the decrease prior to the intervention may also reflect control measures implemented earlier by the cattle and milk industry. PMID:18047748

  15. Time series modelling to forecast prehospital EMS demand for diabetic emergencies.

    PubMed

    Villani, Melanie; Earnest, Arul; Nanayakkara, Natalie; Smith, Karen; de Courten, Barbora; Zoungas, Sophia

    2017-05-05

    Acute diabetic emergencies are often managed by prehospital Emergency Medical Services (EMS). The projected growth in the prevalence of diabetes is likely to result in rising demand for prehospital EMS that are already under pressure. The aims of this study were to model the temporal trends and provide forecasts of prehospital attendances for diabetic emergencies. A time series analysis of monthly cases of hypoglycemia and hyperglycemia was conducted using data from the Ambulance Victoria (AV) electronic database between 2009 and 2015. Using the seasonal autoregressive integrated moving average (SARIMA) modelling process, different models were evaluated, and the most parsimonious model with the highest accuracy was selected. Forty-one thousand four hundred fifty-four prehospital diabetic emergencies were attended over the seven-year period, with an increase in the annual median monthly caseload between 2009 (484.5) and 2015 (549.5). Hypoglycemia (70%) and people with type 1 diabetes (48%) accounted for most attendances. The SARIMA (0,1,0,12) model provided the best fit, with a MAPE of 4.2%, and predicts a monthly caseload of approximately 740 by the end of 2017. Prehospital EMS demand for diabetic emergencies is increasing. SARIMA time series models are a valuable tool that allows forecasting of future caseload with high accuracy and predicts increasing prehospital diabetic emergencies into the future. The model generated by this study may be used by service providers to allow appropriate planning and resource allocation of EMS for diabetic emergencies.

  16. Using missing ordinal patterns to detect nonlinearity in time series data.

    PubMed

    Kulp, Christopher W; Zunino, Luciano; Osborne, Thomas; Zawadzki, Brianna

    2017-08-01

    The number of missing ordinal patterns (NMP) is the number of ordinal patterns that do not appear in a series after it has been symbolized using the Bandt and Pompe methodology. In this paper, the NMP is demonstrated as a test for nonlinearity using a surrogate framework in order to see if the NMP for a series is statistically different from the NMP of iterative amplitude adjusted Fourier transform (IAAFT) surrogates. It is found that the NMP works well as a test statistic for nonlinearity, even in the cases of very short time series. Both model and experimental time series are used to demonstrate the efficacy of the NMP as a test for nonlinearity.
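
    The NMP statistic itself is straightforward to compute; the sketch below symbolizes a series with Bandt-Pompe ordinal patterns and counts the missing ones (the IAAFT surrogate test built around it is not shown).

```python
# Sketch of the NMP statistic: count ordinal patterns of embedding dimension D
# that never occur in the symbolized series.
import math
import numpy as np

def missing_ordinal_patterns(x, dim=4, delay=1):
    x = np.asarray(x)
    patterns = set()
    for i in range(len(x) - (dim - 1) * delay):
        window = x[i : i + dim * delay : delay]
        patterns.add(tuple(np.argsort(window)))   # ordinal pattern of the window
    return math.factorial(dim) - len(patterns)    # NMP

rng = np.random.default_rng(1)
print(missing_ordinal_patterns(rng.normal(size=500)))          # white noise: few or no missing patterns
print(missing_ordinal_patterns(np.sin(np.arange(500) * 0.3)))  # deterministic signal: many missing patterns
```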

  17. Integrating distributional, spatial prioritization, and individual-based models to evaluate potential critical habitat networks: A case study using the Northern Spotted Owl

    EPA Science Inventory

    As part of the northern spotted owl recovery planning effort, we evaluated a series of alternative critical habitat scenarios using a species-distribution model (MaxEnt), a conservation-planning model (Zonation), and an individual-based population model (HexSim). With this suite ...

  18. Statistical models and time series forecasting of sulfur dioxide: a case study Tehran.

    PubMed

    Hassanzadeh, S; Hosseinibalam, F; Alizadeh, R

    2009-08-01

    This study performed a time-series analysis, frequency distribution analysis and prediction of SO2 levels for five stations (Pardisan, Vila, Azadi, Gholhak and Bahman) in Tehran for the period 2000-2005. Most sites show quite similar characteristics, with the highest pollution in autumn-winter and the least pollution in spring-summer. The frequency distributions show higher peaks at two residential sites. The potential for SO2 problems is high because of high emissions and the close geographical proximity of the major industrial and urban centers. The ACF and PACF are nonzero for several lags, indicating a mixed (ARMA) model, so an ARMA model was used at the Bahman station for forecasting SO2. The partial autocorrelations become close to 0 after about 5 lags, while the autocorrelations remain strong through all the lags shown. The results show that the ARMA(2,2) model can provide reliable, satisfactory predictions for the time series.

  19. Selection of Worst-Case Pesticide Leaching Scenarios for Pesticide Registration

    NASA Astrophysics Data System (ADS)

    Vereecken, H.; Tiktak, A.; Boesten, J.; Vanderborght, J.

    2010-12-01

    The use of pesticides, fertilizers and manure in intensive agriculture may have a negative impact on the quality of ground- and surface-water resources. Legislative action has been undertaken in many countries to protect surface water and groundwater from contamination by surface-applied agrochemicals. Of particular concern are pesticides. The registration procedure plays an important role in the regulation of pesticide use in the European Union. In order to register a certain pesticide use, the notifier needs to prove that the use does not entail a risk of groundwater contamination. Therefore, leaching concentrations of the pesticide need to be assessed using model simulations for so-called worst-case scenarios. In the current procedure, a worst-case scenario is a parameterized pesticide fate model for a certain soil and a certain time series of weather conditions that tries to represent all relevant processes, such as transient water flow, root water uptake, pesticide transport, sorption, decay and volatilisation, as accurately as possible. Since this model has been parameterized for only one soil and one weather time series, it is uncertain whether it represents a worst-case condition for a certain pesticide use. We discuss an alternative approach that uses a simpler model requiring less detailed information about the soil and weather conditions but still representing the effect of soil and climate on pesticide leaching, using information that is available for the entire European Union. A comparison between the two approaches demonstrates that the higher precision the detailed model provides for the prediction of pesticide leaching at a certain site is counteracted by its lower accuracy in representing a worst-case condition. The simpler model predicts leaching concentrations less precisely at a certain site but has complete coverage of the area, so that it selects a worst-case condition more accurately.

  20. Signatures of ecological processes in microbial community time series.

    PubMed

    Faust, Karoline; Bauchinger, Franziska; Laroche, Béatrice; de Buyl, Sophie; Lahti, Leo; Washburne, Alex D; Gonze, Didier; Widder, Stefanie

    2018-06-28

    Growth rates, interactions between community members, stochasticity, and immigration are important drivers of microbial community dynamics. In sequencing data analysis, such as network construction and community model parameterization, we make implicit assumptions about the nature of these drivers and thereby restrict model outcome. Despite apparent risk of methodological bias, the validity of the assumptions is rarely tested, as comprehensive procedures are lacking. Here, we propose a classification scheme to determine the processes that gave rise to the observed time series and to enable better model selection. We implemented a three-step classification scheme in R that first determines whether dependence between successive time steps (temporal structure) is present in the time series and then assesses with a recently developed neutrality test whether interactions between species are required for the dynamics. If the first and second tests confirm the presence of temporal structure and interactions, then parameters for interaction models are estimated. To quantify the importance of temporal structure, we compute the noise-type profile of the community, which ranges from black in case of strong dependency to white in the absence of any dependency. We applied this scheme to simulated time series generated with the Dirichlet-multinomial (DM) distribution, Hubbell's neutral model, the generalized Lotka-Volterra model and its discrete variant (the Ricker model), and a self-organized instability model, as well as to human stool microbiota time series. The noise-type profiles for all but DM data clearly indicated distinctive structures. The neutrality test correctly classified all but DM and neutral time series as non-neutral. The procedure reliably identified time series for which interaction inference was suitable. Both tests were required, as we demonstrated that all structured time series, including those generated with the neutral model, achieved a moderate to high goodness of fit to the Ricker model. We present a fast and robust scheme to classify community structure and to assess the prevalence of interactions directly from microbial time series data. The procedure not only serves to determine ecological drivers of microbial dynamics, but also to guide selection of appropriate community models for prediction and follow-up analysis.

  1. Robust Semi-Active Ride Control under Stochastic Excitation

    DTIC Science & Technology

    2014-01-01

    ... broad classes of time-series models which are of practical importance: the Auto-Regressive (AR) models, the Integrated (I) models, and the Moving Average (MA) models [12]. Combinations of these models result in autoregressive moving average (ARMA) and autoregressive integrated moving average ... Down/Up; 4) Down/Down. These four cases can be written in compact form as Eq. (20), where ... is the Heaviside ...

  2. The Gaussian Graphical Model in Cross-Sectional and Time-Series Data.

    PubMed

    Epskamp, Sacha; Waldorp, Lourens J; Mõttus, René; Borsboom, Denny

    2018-04-16

    We discuss the Gaussian graphical model (GGM; an undirected network of partial correlation coefficients) and detail its utility as an exploratory data analysis tool. The GGM shows which variables predict one another, allows for sparse modeling of covariance structures, and may highlight potential causal relationships between observed variables. We describe its utility in three kinds of psychological data sets: data sets in which consecutive cases are assumed independent (e.g., cross-sectional data), temporally ordered data sets (e.g., n = 1 time series), and a mixture of the two (e.g., n > 1 time series). In time-series analysis, the GGM can be used to model the residual structure of a vector-autoregression (VAR) analysis, also termed graphical VAR. Two network models can then be obtained: a temporal network and a contemporaneous network. When analyzing data from multiple subjects, a GGM can also be formed on the covariance structure of stationary means: the between-subjects network. We discuss the interpretation of these models and propose estimation methods to obtain these networks, which we implement in the R packages graphicalVAR and mlVAR. The methods are showcased in two empirical examples, and simulation studies on these methods are included in the supplementary materials.
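
    The packages named above are R packages and are not reproduced here; as a language-neutral illustration, the sketch below computes the quantity a GGM is built from, partial correlations obtained from the inverse covariance matrix, without the regularization or VAR extensions discussed in the paper.

```python
# Sketch: partial correlations (GGM edge weights) from the precision matrix.
import numpy as np

def partial_correlations(data):
    """data: (n_cases, n_variables) array of cross-sectional observations."""
    precision = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcor = -precision / np.outer(d, d)   # rho_ij = -P_ij / sqrt(P_ii * P_jj)
    np.fill_diagonal(pcor, 1.0)
    return pcor

rng = np.random.default_rng(2)
print(np.round(partial_correlations(rng.normal(size=(200, 4))), 2))
```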

  3. Reliability prediction of ontology-based service compositions using Petri net and time series models.

    PubMed

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the model input of the NMSPN and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy.

  4. Integrating species distributional, conservation planning, and individual based population models: A case study in critical habitat evaluation for the Northern Spotted Owl

    EPA Science Inventory

    Background / Question / Methods As part of the ongoing northern spotted owl recovery planning effort, we evaluated a series of alternative potential critical habitat scenarios using a species-distribution model (MaxEnt), a conservation-planning model (Zonation), and an individua...

  5. PARADIGM: The Partnership for Advancing Interdisciplinary Global Modeling Annual Report - Year 2

    DTIC Science & Technology

    2004-02-01

    ... case (a), when bacteria are able to regenerate ammonium based upon the composition of the dissolved organic pool. The export is also slightly larger ... for diazotrophs and detritus. The addition of diazotrophs and detritus to the model follows the method of Fennel et al. [2002]. Time series of model ...

  6. Comparison of ORSAT and SCARAB Reentry Analysis Tools for a Generic Satellite Test Case

    NASA Technical Reports Server (NTRS)

    Kelley, Robert L.; Hill, Nicole M.; Rochelle, W. C.; Johnson, Nicholas L.; Lips, T.

    2010-01-01

    Reentry analysis is essential to understanding the consequences of the full life cycle of a spacecraft. Since reentry is a key factor in spacecraft development, NASA and ESA have separately developed tools to assess the survivability of objects during reentry. Criteria such as debris casualty area and impact energy are particularly important to understanding the risks posed to people on Earth. Therefore, NASA and ESA have undertaken a series of comparison studies of their respective reentry codes for verification and improvements in accuracy. The NASA Object Reentry Survival Analysis Tool (ORSAT) and the ESA Spacecraft Atmospheric Reentry and Aerothermal Breakup (SCARAB) reentry analysis tools serve as standard codes for reentry survivability assessment of satellites. These programs predict whether an object will demise during reentry and calculate the debris casualty area of objects determined to survive, establishing the reentry risk posed to the Earth's population by surviving debris. A series of test cases has been studied for comparison; the most recent uses "Testsat," a conceptual satellite composed of generic parts, defined with numerous simple shapes and various materials for a better comparison of the predictions of the two codes. This study improves on the others in the series because of increased consistency in modeling techniques and variables. The overall comparison demonstrated that the two codes arrive at similar results: most objects modeled showed close agreement between the two codes, and where differences were significant, they could be explained as a matter of semantics in the model definitions. This paper presents the main results of ORSAT and SCARAB for the Testsat case and discusses the sources of any discovered differences. The results of previous comparisons are discussed to summarize differences between the codes and lessons learned from this series of tests.

  7. Application of the Analog Method to Modelling Heat Waves: A Case Study with Power Transformers

    DTIC Science & Technology

    2017-04-21

    ... Calibration and validation statistics with the use of five atmospheric variables to construct analogue diagnostics for JJA of transformer T2 ... electrical grid as a series of nodes (transformers) and edges (transmission lines) so that basic mathematical analysis can be performed. The mathematics ...

  8. Nonlinear time-series-based adaptive control applications

    NASA Technical Reports Server (NTRS)

    Mohler, R. R.; Rajkumar, V.; Zakrzewski, R. R.

    1991-01-01

    A control design methodology based on a nonlinear time-series reference model is presented. It is indicated by highly nonlinear simulations that such designs successfully stabilize troublesome aircraft maneuvers undergoing large changes in angle of attack as well as large electric power transients due to line faults. In both applications, the nonlinear controller was significantly better than the corresponding linear adaptive controller. For the electric power network, a flexible AC transmission system with series capacitor power feedback control is studied. A bilinear autoregressive moving average reference model is identified from system data, and the feedback control is manipulated according to a desired reference state. The control is optimized according to a predictive one-step quadratic performance index. A similar algorithm is derived for control of rapid changes in aircraft angle of attack over a normally unstable flight regime. In the latter case, however, a generalization of a bilinear time-series model reference includes quadratic and cubic terms in angle of attack.

  9. Statistical distributions of extreme dry spell in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Zin, Wan Zawiah Wan; Jemain, Abdul Aziz

    2010-11-01

    Statistical distributions of annual extreme (AE) series and partial duration (PD) series for dry-spell events are analyzed for a database of daily rainfall records from 50 rain-gauge stations in Peninsular Malaysia, with the recording period extending from 1975 to 2004. The three-parameter generalized extreme value (GEV) and generalized Pareto (GP) distributions are considered to model both series. In both cases, the parameters of the two distributions are fitted by means of the L-moments method, which provides robust estimates of them. The goodness-of-fit (GOF) between the empirical data and the theoretical distributions is then evaluated by means of the L-moment ratio diagram and several goodness-of-fit tests for each of the 50 stations. It is found that for the majority of stations, the AE and PD series are well fitted by the GEV and GP models, respectively. Based on the models that have been identified, we can reasonably predict the risks associated with extreme dry spells for various return periods.
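
    A rough sketch of the distribution-fitting step follows; note that the paper uses L-moment estimation, whereas scipy's fit() below is maximum likelihood and is shown only to illustrate the workflow of estimating a GEV model and reading off return levels. The data file is hypothetical.

```python
# Illustrative sketch: GEV fit to an annual-extreme dry-spell series (MLE, not L-moments).
import numpy as np
from scipy.stats import genextreme

annual_max_dry_spell = np.loadtxt("station_ae_dry_spells.txt")  # one value per year (hypothetical)

params = genextreme.fit(annual_max_dry_spell)     # shape, loc, scale
for T in (10, 50, 100):                           # return periods in years
    level = genextreme.isf(1.0 / T, *params)      # exceeded on average once per T years
    print(f"{T}-year dry spell: {level:.1f} days")
```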

  10. Projects That Matter: Concepts and Models for Service-Learning in Engineering. AAHE's Series on Service-Learning in the Disciplines.

    ERIC Educational Resources Information Center

    Tsang, Edmund, Ed.

    This volume, the 14th in a series of monographs on service learning and academic disciplinary areas, is designed as a practical guide for faculty seeking to integrate service learning into an engineering course. The volume also deals with larger issues in engineering education and provides case studies of service-learning courses. The articles…

  11. Crude oil price forecasting based on hybridizing wavelet multiple linear regression model, particle swarm optimization techniques, and principal component analysis.

    PubMed

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelets and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first selected to decompose an original time series into several subseries at different scales. Then, principal component analysis (PCA) is used to process the subseries data in the MLR for crude oil price forecasting. Particle swarm optimization (PSO) is used to obtain the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily West Texas Intermediate (WTI) crude oil market has been used as the case study. The time series prediction performance of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series.
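
    The wavelet-plus-regression idea can be sketched as follows; the PSO tuning and PCA steps are omitted and this is not the authors' exact pipeline. The input file is hypothetical and the wavelet choice is an assumption.

```python
# Rough sketch: decompose the price series into wavelet bands, then regress
# tomorrow's price on today's band values.
import numpy as np
import pywt
from sklearn.linear_model import LinearRegression

price = np.loadtxt("wti_daily.txt")                # hypothetical daily WTI prices

coeffs = pywt.wavedec(price, "db4", level=3)        # Mallat decomposition
bands = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    bands.append(pywt.waverec(kept, "db4")[: len(price)])  # one subseries per band
X = np.column_stack(bands)

model = LinearRegression().fit(X[:-1], price[1:])   # band values today -> price tomorrow
print("next-day forecast:", model.predict(X[-1:]))
```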

  12. Crude Oil Price Forecasting Based on Hybridizing Wavelet Multiple Linear Regression Model, Particle Swarm Optimization Techniques, and Principal Component Analysis

    PubMed Central

    Shabri, Ani; Samsudin, Ruhaidah

    2014-01-01

    Crude oil prices play a significant role in the global economy and are a key input into option pricing formulas, portfolio allocation, and risk measurement. In this paper, a hybrid model integrating wavelets and multiple linear regression (MLR) is proposed for crude oil price forecasting. In this model, the Mallat wavelet transform is first selected to decompose an original time series into several subseries at different scales. Then, principal component analysis (PCA) is used to process the subseries data in the MLR for crude oil price forecasting. Particle swarm optimization (PSO) is used to obtain the optimal parameters of the MLR model. To assess the effectiveness of this model, the daily West Texas Intermediate (WTI) crude oil market has been used as the case study. The time series prediction performance of the WMLR model is compared with the MLR, ARIMA, and GARCH models using various statistical measures. The experimental results show that the proposed model outperforms the individual models in forecasting the crude oil price series. PMID:24895666

  13. Impact of meteorological factors on the incidence of bacillary dysentery in Beijing, China: A time series analysis (1970-2012).

    PubMed

    Yan, Long; Wang, Hong; Zhang, Xuan; Li, Ming-Yue; He, Juan

    2017-01-01

    The influence of meteorological variables on the transmission of bacillary dysentery (BD) is an under-investigated topic, and effective forecasting models to serve as a public health tool are lacking. This paper aimed to quantify the relationship between meteorological variables and BD cases in Beijing and to establish an effective forecasting model. A time series analysis was conducted in the Beijing area based upon monthly data on weather variables (i.e. temperature, rainfall, relative humidity, vapor pressure, and wind speed) and on the number of BD cases during the period 1970-2012. Autoregressive integrated moving average models with explanatory variables (ARIMAX) were built based on the data from 1970 to 2004. Prediction of monthly BD cases from 2005 to 2012 was made using the established models, and the prediction accuracy was evaluated by the mean square error (MSE). First, temperature with 2-month and 7-month lags and rainfall with a 12-month lag were found to be positively correlated with the number of BD cases in Beijing. Second, the ARIMAX model with covariates of temperature with a 7-month lag (β = 0.021, 95% confidence interval (CI): 0.004-0.038) and rainfall with a 12-month lag (β = 0.023, 95% CI: 0.009-0.037) displayed the highest prediction accuracy. The ARIMAX model developed in this study showed a good fit and precise short-term predictions, which would help government departments take early public health measures to prevent and control possible BD epidemics.

  14. Sound scattering by several zooplankton groups. II. Scattering models.

    PubMed

    Stanton, T K; Chu, D; Wiebe, P H

    1998-01-01

    Mathematical scattering models are derived and compared with data from zooplankton from several gross anatomical groups--fluidlike, elastic shelled, and gas bearing. The models are based upon the acoustically inferred boundary conditions determined from laboratory backscattering data presented in part I of this series [Stanton et al., J. Acoust. Soc. Am. 103, 225-235 (1998)]. The models use a combination of ray theory, modal-series solution, and distorted wave Born approximation (DWBA). The formulations, which are inherently approximate, are designed to include only the dominant scattering mechanisms as determined from the experiments. The models for the fluidlike animals (euphausiids in this case) ranged from the simplest case involving two rays, which could qualitatively describe the structure of target strength versus frequency for single pings, to the most complex case involving a rough inhomogeneous asymmetrically tapered bent cylinder using the DWBA-based formulation which could predict echo levels over all angles of incidence (including the difficult region of end-on incidence). The model for the elastic shelled body (gastropods in this case) involved development of an analytical model which takes into account irregularities and discontinuities of the shell. The model for gas-bearing animals (siphonophores) is a hybrid model which is composed of the summation of the exact solution to the gas sphere and the approximate DWBA-based formulation for arbitrarily shaped fluidlike bodies. There is also a simplified ray-based model for the siphonophore. The models are applied to data involving single pings, ping-to-ping variability, and echoes averaged over many pings. There is reasonable qualitative agreement between the predictions and single ping data, and reasonable quantitative agreement between the predictions and variability and averages of echo data.

  15. dc properties of series-parallel arrays of Josephson junctions in an external magnetic field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewandowski, S.J.

    1991-04-01

    A detailed dc theory of superconducting multijunction interferometers has previously been developed by several authors for the case of parallel junction arrays. The theory is now extended to cover the case of a loop containing several junctions connected in series. The problem is closely associated with high-Tc superconductors and their clusters of intrinsic Josephson junctions. These materials exhibit spontaneous interferometric effects, and there is no reason to assume that the intrinsic junctions form only parallel arrays. A simple formalism of phase states is developed in order to express the superconducting phase differences across the junctions forming a series array as functions of the phase difference across the weakest junction of the system, and to relate the differences in critical currents of the junctions to gaps in the allowed ranges of their phase functions. This formalism is used to investigate the energy states of the array, which in the case of different junctions are split and separated by energy barriers of height depending on the phase gaps. Modifications of the washboard model of a single junction are shown. Next a superconducting inductive loop containing a series array of two junctions is considered, and this model is used to demonstrate the transitions between phase states and the associated instabilities. Finally, the critical current of a parallel connection of two series arrays is analyzed and shown to be a multivalued function of the externally applied magnetic flux. The instabilities caused by the presence of intrinsic serial junctions in granular high-Tc materials are pointed out as a potential source of additional noise.

  16. Comparisons of regional Hydrological Angular Momentum (HAM) of the different models

    NASA Astrophysics Data System (ADS)

    Nastula, J.; Kolaczek, B.; Popinski, W.

    2006-10-01

    In this paper, hydrological excitations of polar motion (HAM) were computed from various hydrological data series (NCEP, ECMWF, CPC water storage and LaD World Simulations of global continental water). The HAM series obtained from these four models and the geodetic excitation function (GEOD) computed from the polar motion COMB03 data were compared in the seasonal spectral band. The results show large differences among these hydrological excitation functions, as well as among their spectra, in the seasonal band. In all cases except the NCEP/NCAR model, the seasonal oscillations of the global geophysical excitation functions (AAM + OAM + HAM) are smaller than those of the geodetic excitation function. This indicates that these models need further improvement, and perhaps not only the hydrological models.

  17. Analysis and generation of groundwater concentration time series

    NASA Astrophysics Data System (ADS)

    Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae

    2018-01-01

    Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.
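
    As a rough illustration of this generation recipe, the following Python sketch builds a series from a time-dependent trend plus order-one autoregressive fluctuations with a time-varying coefficient and amplitude-modulated innovations; the trend, coefficient and amplitude functions are illustrative placeholders, not the values estimated in the study.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        t = np.arange(n)

        # Illustrative ingredients (not the paper's fitted quantities)
        trend = 1.0 * np.exp(-t / 200.0)                  # time-dependent trend
        phi = 0.8 + 0.1 * np.sin(2 * np.pi * t / n)       # time-varying AR(1) coefficient
        amp = 0.05 * (1.0 + t / n)                        # slowly varying noise amplitude

        # Fluctuations around the trend: AR(1) with time-varying coefficient
        # and amplitude-modulated innovations
        fluct = np.zeros(n)
        for k in range(1, n):
            fluct[k] = phi[k] * fluct[k - 1] + amp[k] * rng.standard_normal()

        series = trend + fluct                            # generated concentration-like series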

  18. On the maximum-entropy/autoregressive modeling of time series

    NASA Technical Reports Server (NTRS)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is nothing but a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
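
    The pole-based reading of the AR model described above can be made concrete in a few lines: the poles are the roots of the AR characteristic polynomial, each complex-conjugate pair corresponding to one damped sinusoid, and the ME/AR spectrum is this model evaluated on the unit circle. The AR(2) coefficients below are illustrative, not taken from the paper.

        import numpy as np

        # Illustrative AR(2): x[t] = a1*x[t-1] + a2*x[t-2] + e[t]
        a = np.array([1.5, -0.9])
        sigma2 = 1.0

        # Poles in the z-plane: roots of z^2 - a1*z - a2
        poles = np.roots(np.concatenate(([1.0], -a)))
        freqs = np.angle(poles) / (2 * np.pi)   # frequency of each complex harmonic (cycles/sample)
        damping = np.abs(poles)                 # modulus < 1 means a damped (decaying) component

        # ME/AR spectrum on the unit circle of the z-plane
        w = np.linspace(0.0, np.pi, 512)
        denom = np.abs(1.0 - sum(a[k] * np.exp(-1j * w * (k + 1)) for k in range(len(a)))) ** 2
        spectrum = sigma2 / denom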

  19. Modified superposition: A simple time series approach to closed-loop manual controller identification

    NASA Technical Reports Server (NTRS)

    Biezad, D. J.; Schmidt, D. K.; Leban, F.; Mashiko, S.

    1986-01-01

    Single-channel pilot manual control output in closed-loop tracking tasks is modeled in terms of linear discrete transfer functions which are parsimonious and guaranteed stable. The transfer functions are found by applying a modified superposition time series generation technique. A Levinson-Durbin algorithm is used to determine the filter which prewhitens the input, and a projective (least squares) fit of pulse response estimates is used to guarantee identified model stability. Results from two case studies are compared to previous findings, where the source data are relatively short records, approximately 25 seconds long. Time delay effects and pilot seasonalities are discussed and analyzed. It is concluded that single-channel time series controller modeling is feasible on short records, and that it is important for the analyst to determine a criterion for best time domain fit which allows association of model parameter values, such as pure time delay, with actual physical and physiological constraints. The purpose of the modeling is thus paramount.
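
    A minimal numpy sketch of the prewhitening step named above, assuming the filter is obtained from sample autocorrelations via the Levinson-Durbin recursion; the AR order and the estimation details are illustrative and not necessarily those of the study.

        import numpy as np

        def levinson_durbin(r, order):
            """Solve the Yule-Walker equations for the prediction filter
            [1, a1, ..., ap] from autocorrelations r[0..order]."""
            a = np.zeros(order + 1)
            a[0] = 1.0
            err = r[0]
            for m in range(1, order + 1):
                k = -(r[m] + np.dot(a[1:m], r[m - 1:0:-1])) / err
                a_prev = a.copy()
                a[1:m] = a_prev[1:m] + k * a_prev[m - 1:0:-1]
                a[m] = k
                err *= (1.0 - k * k)
            return a, err

        def prewhiten(x, order=4):
            """Fit an AR filter to the record and return the (approximately
            white) residuals obtained by applying that filter."""
            x = np.asarray(x, float) - np.mean(x)
            r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
            a, _ = levinson_durbin(r, order)
            return np.convolve(x, a, mode="valid")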

  20. Evaluating growth models: A case study using PrognosisBC

    Treesearch

    Peter Marshall; Pablo Parysow; Shadrach Akindele

    2008-01-01

    The ability of the PrognosisBC (Version 3.0) growth model to predict tree and stand growth was assessed against a series of remeasured permanent sample plots, including some which had been precommercially thinned. In addition, the model was evaluated for logical consistency across a variety of stand structures using simulation. By the end of the...

  1. Evaluation of a collaborative model: a case study analysis of watershed planning in the intermountain west

    Treesearch

    Gary Bentrup

    2001-01-01

    Collaborative planning processes have become increasingly popular for addressing environmental planning issues, resulting in a number of conceptual models for collaboration. A model proposed by Selin and Chavez suggests that collaboration emerges from a series of antecedents and then proceeds sequentially through problem-setting, direction-setting, implementation, and...

  2. Scaling Equations for Ballistic Modeling of Solid Rocket Motor Case Breach

    NASA Technical Reports Server (NTRS)

    McMillin, Joshua E.

    2006-01-01

    This paper explores the development of a series of scaling equations that can take a known nominal motor performance and scale it for small and growing case failures. This model was developed for the Malfunction-Turn Study as part of Return to Flight activities for the Space Shuttle program. To verify the model, data from the Challenger accident (STS-51L) were used. The model is able to predict the motor performance beyond the last recorded Challenger data and show how the failed right-hand booster would have performed if the vehicle had remained intact.

  3. A review and reanalysis of Bruno Schulz's "Erkrankungsalter schizophrener Eltern und Kinder [Age at onset of illness in schizophrenic parents and offspring]:" Zeitschrift für die gesamte Neurologie und Psychiatrie, 168, 709-721, 1940.

    PubMed

    Sham, P C; Zerbin-Rüdin, E; Kendler, K S

    1995-01-01

    Nearly all previous evidence of the familial transmission of age at onset of schizophrenia has been in siblings and twins. In his paper, Bruno Schulz examined the age at onset distribution of schizophrenia in affected parent and offspring pairs, using a systematically ascertained series of cases (n = 106), as well as a second series of chronic in-patients (n = 36). The parent-offspring correlation in age at onset, for cases with a definite diagnosis in the systematically ascertained series, was estimated at 0.346 (95% confidence interval 0.134, 0.528). Schulz did not test for differences between the two series and between males and females, but our reanalysis, using correlational methods and a mixed linear model, did not detect any significant differences. These results are consistent with previous findings that age at onset of schizophrenia is influenced by familial factors which may be genetic.

  4. Effect of temperature and precipitation on salmonellosis cases in South-East Queensland, Australia: an observational study.

    PubMed

    Stephen, Dimity Maree; Barnett, Adrian Gerard

    2016-02-25

    Foodborne illnesses in Australia, including salmonellosis, are estimated to cost over $A1.25 billion annually. Weather has been identified as influencing salmonellosis incidence, as cases increase during summer; however, time series modelling of salmonellosis is challenging because outbreaks cause strong autocorrelation. This study assesses whether switching models are an improved method of estimating weather-salmonellosis associations. We analysed weather and salmonellosis in South-East Queensland between 2004 and 2013 using 2 common regression models and a switching model, each with 21-day lags for temperature and precipitation. The switching model best fit the data, as judged by its substantial improvement in deviance information criterion over the regression models, its less autocorrelated residuals, and its control of seasonality. The switching model estimated that a 5 °C increase in mean temperature and a 10 mm increase in precipitation were associated with increases in salmonellosis cases of 45.4% (95% CrI 40.4%, 50.5%) and 24.1% (95% CrI 17.0%, 31.6%), respectively. Switching models improve on traditional time series models in quantifying weather-salmonellosis associations. A better understanding of how temperature and precipitation influence salmonellosis may identify where interventions can be made to lower the health and economic costs of salmonellosis. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  5. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China.

    PubMed

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and the vector autoregressive (VAR) model as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which comprised exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry.
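
    A basic form of the singular spectrum analysis mentioned above can be sketched as follows: embed the series in a trajectory matrix, take its SVD, and reconstruct the leading components by diagonal averaging. The window length and the number of retained components are illustrative choices, not those used in the study.

        import numpy as np

        def ssa_reconstruct(x, window, n_components):
            """Reconstruct the slow part (trend/main oscillations) of a series
            from the leading SVD components of its trajectory matrix."""
            x = np.asarray(x, float)
            n = len(x)
            k = n - window + 1
            traj = np.column_stack([x[i:i + window] for i in range(k)])   # window x k
            u, s, vt = np.linalg.svd(traj, full_matrices=False)
            low_rank = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
            recon = np.zeros(n)
            counts = np.zeros(n)
            for i in range(window):          # diagonal averaging (Hankelization)
                for j in range(k):
                    recon[i + j] += low_rank[i, j]
                    counts[i + j] += 1
            return recon / counts

        # e.g. trend = ssa_reconstruct(monthly_ev_sales, window=12, n_components=2)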

  6. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China

    PubMed Central

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and the vector autoregressive (VAR) model as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which comprised exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry. PMID:28459872

  7. Burden of salmonellosis, campylobacteriosis and listeriosis: a time series analysis, Belgium, 2012 to 2020

    PubMed Central

    Maertens de Noordhout, Charline; Devleesschauwer, Brecht; Haagsma, Juanita A; Havelaar, Arie H; Bertrand, Sophie; Vandenberg, Olivier; Quoilin, Sophie; Brandt, Patrick T; Speybroeck, Niko

    2017-01-01

    Salmonellosis, campylobacteriosis and listeriosis are food-borne diseases. We estimated and forecasted the number of cases of these three diseases in Belgium from 2012 to 2020, and calculated the corresponding number of disability-adjusted life years (DALYs). The salmonellosis time series was fitted with a Bai and Perron two-breakpoint model, while a dynamic linear model was used for campylobacteriosis and a Poisson autoregressive model for listeriosis. The average monthly number of cases of salmonellosis was 264 (standard deviation (SD): 86) in 2012 and predicted to be 212 (SD: 87) in 2020; campylobacteriosis case numbers were 633 (SD: 81) and 1,081 (SD: 311); listeriosis case numbers were 5 (SD: 2) in 2012 and 6 (SD: 3) in 2014. After applying correction factors, the estimated DALYs for salmonellosis were 102 (95% uncertainty interval (UI): 8–376) in 2012 and predicted to be 82 (95% UI: 6–310) in 2020; campylobacteriosis DALYs were 1,019 (95% UI: 137–3,181) and 1,736 (95% UI: 178–5,874); listeriosis DALYs were 208 (95% UI: 192–226) in 2012 and 252 (95% UI: 200–307) in 2014. New actions are needed to reduce the risk of food-borne infection with Campylobacter spp. because campylobacteriosis incidence may almost double through 2020. PMID:28935025

  8. Burden of salmonellosis, campylobacteriosis and listeriosis: a time series analysis, Belgium, 2012 to 2020.

    PubMed

    Maertens de Noordhout, Charline; Devleesschauwer, Brecht; Haagsma, Juanita A; Havelaar, Arie H; Bertrand, Sophie; Vandenberg, Olivier; Quoilin, Sophie; Brandt, Patrick T; Speybroeck, Niko

    2017-09-21

    Salmonellosis, campylobacteriosis and listeriosis are food-borne diseases. We estimated and forecasted the number of cases of these three diseases in Belgium from 2012 to 2020, and calculated the corresponding number of disability-adjusted life years (DALYs). The salmonellosis time series was fitted with a Bai and Perron two-breakpoint model, while a dynamic linear model was used for campylobacteriosis and a Poisson autoregressive model for listeriosis. The average monthly number of cases of salmonellosis was 264 (standard deviation (SD): 86) in 2012 and predicted to be 212 (SD: 87) in 2020; campylobacteriosis case numbers were 633 (SD: 81) and 1,081 (SD: 311); listeriosis case numbers were 5 (SD: 2) in 2012 and 6 (SD: 3) in 2014. After applying correction factors, the estimated DALYs for salmonellosis were 102 (95% uncertainty interval (UI): 8-376) in 2012 and predicted to be 82 (95% UI: 6-310) in 2020; campylobacteriosis DALYs were 1,019 (95% UI: 137-3,181) and 1,736 (95% UI: 178-5,874); listeriosis DALYs were 208 (95% UI: 192-226) in 2012 and 252 (95% UI: 200-307) in 2014. New actions are needed to reduce the risk of food-borne infection with Campylobacter spp. because campylobacteriosis incidence may almost double through 2020. This article is copyright of The Authors, 2017.

  9. Fisher information framework for time series modeling

    NASA Astrophysics Data System (ADS)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the establishment of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter) based model in a vector setting (with least squares constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis, solely in terms of the observed data. Prediction is demonstrated on time series obtained from (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the PhysioNet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.

  10. Modeling Polio Data Using the First Order Non-Negative Integer-Valued Autoregressive, INAR(1), Model

    NASA Astrophysics Data System (ADS)

    Vazifedan, Turaj; Shitan, Mahendran

    Time series data may consist of counts, such as the number of road accidents, the number of patients in a certain hospital, the number of customers waiting for service at a certain time, etc. When the values of the observations are large, it is usual to use a Gaussian Autoregressive Moving Average (ARMA) process to model the time series. However, if the observed counts are small, it is not appropriate to use an ARMA process to model the observed phenomenon. In such cases we need to model the time series data by using a Non-Negative Integer-valued Autoregressive (INAR) process. The modeling of count data is based on the binomial thinning operator. In this paper we illustrate the modeling of count data using the monthly number of Poliomyelitis cases in the United States between January 1970 and December 1983. We applied the AR(1) model, the Poisson regression model and the INAR(1) model, and the suitability of these models was assessed by using the Index of Agreement (I.A.). We found that the INAR(1) model is more appropriate in the sense that it had a better I.A., and it is natural since the data are counts.
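
    The binomial thinning operator underlying the INAR(1) model can be sketched directly; the parameter values below are illustrative, and the moment estimators shown are only one simple way to fit the model, not necessarily the method used in the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_inar1(alpha, lam, n):
            """INAR(1): X_t = alpha o X_{t-1} + eps_t, where 'alpha o' is binomial
            thinning (each count at t-1 survives with probability alpha) and
            eps_t is a Poisson(lam) innovation."""
            x = np.zeros(n, dtype=int)
            x[0] = rng.poisson(lam / (1.0 - alpha))        # start near the stationary mean
            for t in range(1, n):
                x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)
            return x

        x = simulate_inar1(alpha=0.5, lam=1.0, n=200)

        # Simple moment estimates: the lag-1 autocorrelation estimates alpha,
        # and mean = lam / (1 - alpha) then gives lam
        alpha_hat = np.corrcoef(x[:-1], x[1:])[0, 1]
        lam_hat = x.mean() * (1.0 - alpha_hat)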

  11. PM₁₀ exposure and non-accidental mortality in Asian populations: a meta-analysis of time-series and case-crossover studies.

    PubMed

    Park, Hye Yin; Bae, Sanghyuk; Hong, Yun-Chul

    2013-01-01

    We investigated the association between particulate matter less than 10 µm in aerodynamic diameter (PM₁₀) exposure and non-accidental mortality in Asian populations by meta-analysis, using both time-series and case-crossover analyses. Among the 819 published studies searched from PubMed and EMBASE using key words related to PM₁₀ exposure and non-accidental mortality in Asian countries, 8 time-series and 4 case-crossover studies were selected for meta-analysis after exclusion by selection criteria. We obtained the relative risk (RR) and 95% confidence intervals (CI) of non-accidental mortality per 10 µg/m³ increase of daily PM₁₀ from each study. We used Q statistics to test the heterogeneity of the results among the different studies and evaluated publication bias using the Begg funnel plot and the Egger test. Testing for heterogeneity showed significance (p<0.001); thus, we applied a random-effects model. The RR (95% CI) per 10 µg/m³ increase of daily PM₁₀ for the time-series and case-crossover studies combined, the time-series studies only, and the case-crossover studies only was 1.0047 (1.0033 to 1.0062), 1.0057 (1.0029 to 1.0086), and 1.0027 (1.0010 to 1.0043), respectively. The non-significant Egger test suggested that this analysis was not likely to be affected by publication bias. We found a significant positive association between PM₁₀ exposure and non-accidental mortality among Asian populations. Continued investigations are encouraged to contribute to the health impact assessment and public health management of air pollution in Asian countries.

  12. Models Role within Active Learning in Biology. A Case Study

    ERIC Educational Resources Information Center

    Pop-Pacurar, Irina; Tirla, Felicia-Doina

    2009-01-01

    In order to integrate ideas and information creatively, to motivate students and activate their thinking, we have used in Biology classes a series of active methods, among which the methods of critical thinking, which had very good results. Still, in the case of some intuitive, abstract, more difficult topics, such as the cell structure,…

  13. Mechanical energy of the trunk during walking--does the model used influence the results?

    PubMed

    Syczewska, Małgorzata

    2009-01-01

    The paper presents two trunk models. In the first one, the trunk is modelled as a series of seven segments, whose dimensions and inertial properties are parametrically based on body stature and body mass. In the second one, the trunk is modelled as one rigid segment. These models are used to calculate the kinetic energy of the trunk's movement relative to the body centre of mass. The results show that in the case of healthy subjects both models give similar results, but in the case of stroke subjects the simplified model underestimates the amount of energy and does not reflect all phases of gait in which energy is generated.

  14. Reliability Prediction of Ontology-Based Service Compositions Using Petri Net and Time Series Models

    PubMed Central

    Li, Jia; Xia, Yunni; Luo, Xin

    2014-01-01

    OWL-S, one of the most important Semantic Web service ontologies proposed to date, provides a core ontological framework and guidelines for describing the properties and capabilities of web services in an unambiguous, computer-interpretable form. Predicting the reliability of composite service processes specified in OWL-S allows service users to decide whether the process meets the quantitative quality requirement. In this study, we consider the runtime quality of services to be fluctuating and introduce a dynamic framework to predict the runtime reliability of services specified in OWL-S, employing the Non-Markovian stochastic Petri net (NMSPN) and a time series model. The framework includes the following steps: obtaining the historical response-time series of individual service components; fitting these series with an autoregressive moving average (ARMA) model and predicting the future firing rates of service components; mapping the OWL-S process into an NMSPN model; and employing the predicted firing rates as the input of the NMSPN model and calculating the normal completion probability as the reliability estimate. In the case study, a comparison between the static model and our approach based on experimental data is presented, and it is shown that our approach achieves higher prediction accuracy. PMID:24688429
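
    A minimal sketch of the ARMA step in this framework (fitting a historical response-time series and turning forecasts into firing rates), using hypothetical data and the statsmodels ARIMA class; the ARMA(1,1) order and the reciprocal conversion are illustrative assumptions.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        # Hypothetical response-time history (seconds) of one service component
        rng = np.random.default_rng(2)
        response_times = 0.8 + 0.05 * rng.standard_normal(200)

        # ARMA(1,1) is ARIMA with d = 0; forecast the next response times and
        # use their reciprocals as predicted firing rates for the NMSPN input
        fit = ARIMA(response_times, order=(1, 0, 1)).fit()
        forecast = fit.forecast(steps=5)
        predicted_firing_rates = 1.0 / forecast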

  15. A discrete scattering series representation for lattice embedded models of chain cyclization

    NASA Astrophysics Data System (ADS)

    Fraser, Simon J.; Winnik, Mitchell A.

    1980-01-01

    In this paper we develop a lattice-based model of chain cyclization in the presence of a set of occupied sites V in the lattice. We show that within the approximation of a Markovian chain propagator the effect of V on the partition function for the system can be written as a time-ordered exponential series in which V behaves like a scattering potential and chain length is the timelike parameter. The discrete and finite nature of this model allows us to obtain rigorous upper and lower bounds to the series limit. We adapt these formulas to the calculation of the partition functions and cyclization probabilities of terminally and globally cyclizing chains. Two classes of cyclization are considered: in the first model the target set H may be visited repeatedly (the Markovian model); in the second case vertices in H may be visited at most once (the non-Markovian or taboo model). This formulation depends on two fundamental combinatorial structures, namely the inclusion-exclusion principle and the set of subsets of a set. We have tried to interpret these abstract structures with physical analogies throughout the paper.

  16. Modelling inflation in transportation, comunication and financial services using B-Spline time series model

    NASA Astrophysics Data System (ADS)

    Suparti; Prahutama, Alan; Santoso, Rukun

    2018-05-01

    Inflation is an increase in the general price level of the goods and services that constitute the basic needs of society, or a decline in the purchasing power of a country's currency. A significant inflationary increase occurred in 2013, driven by significant increases in several sectors/groups: transportation, communication and financial services; foodstuffs; and housing, water, electricity, gas and fuel. The largest contribution, however, came from the transportation, communication and financial services sector. Inflation in this sector is modelled with a B-spline time series approach, where the response variable is Yt and the predictor is a significant lag (in this case Yt-1). In B-spline time series modelling, the order and the optimal knot points must be determined; the optimal knots are chosen using Generalized Cross Validation (GCV). For inflation in the transportation, communication and financial services sector, the resulting model is a B-spline of order 2 with 2 knots, producing a MAPE of less than 50%.

  17. Mechanical Behavior of Collagen-Fibrin Co-Gels Reflects Transition From Series to Parallel Interactions With Increasing Collagen Content

    PubMed Central

    Lai, Victor K.; Lake, Spencer P.; Frey, Christina R.; Tranquillo, Robert T.; Barocas, Victor H.

    2012-01-01

    Fibrin and collagen, biopolymers occurring naturally in the body, are biomaterials commonly used as scaffolds for tissue engineering. How collagen and fibrin interact to confer macroscopic mechanical properties in collagen-fibrin composite systems remains poorly understood. In this study, we formulated collagen-fibrin co-gels at different collagen-to-fibrin ratios to observe changes in the overall mechanical behavior and microstructure. A modeling framework of a two-network system was developed by modifying our micro-scale model, considering two forms of interaction between the networks: (a) two interpenetrating but noninteracting networks (“parallel”), and (b) a single network consisting of randomly alternating collagen and fibrin fibrils (“series”). Mechanical testing of our gels shows that collagen-fibrin co-gels exhibit intermediate properties (UTS, strain at failure, tangent modulus) compared to those of pure collagen and fibrin. The comparison with model predictions shows that the parallel and series model cases provide upper and lower bounds, respectively, for the experimental data, suggesting that a combination of such interactions exists between the collagen and fibrin in co-gels. A transition from the series model to the parallel model occurs with increasing collagen content, with the series model best describing predominantly fibrin co-gels, and the parallel model best describing predominantly collagen co-gels. PMID:22482659

  18. Variables affecting the financial viability of your practice: a case study.

    PubMed

    Binderman, J

    2001-01-01

    Utilizing the discussion of variables affecting practice financial viability, a case study is considered. The case study reveals the relative impact multiple variables have upon the bottom line, including: practice capacity, percentage of capitation, and fee-for-service in the practice, as well as patient visit rates and patient churning. This article presents basic financial information through a case study model, utilizing a series of worksheets that can be adapted to any practice situation to encourage improved financial viability.

  19. Studies in astronomical time series analysis: Modeling random processes in the time domain

    NASA Technical Reports Server (NTRS)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA model for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
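
    The AR-to-MA transformation mentioned above amounts to feeding a unit impulse through the AR recursion, so the MA weights are the pulse shape of the process; a short sketch with illustrative AR(2) coefficients:

        import numpy as np

        def ar_to_ma(ar_coeffs, n_terms=50):
            """Convert x[t] = sum_k a_k x[t-k] + e[t] into its moving-average
            (pulse response) weights by propagating a unit impulse."""
            p = len(ar_coeffs)
            psi = np.zeros(n_terms)
            psi[0] = 1.0
            for t in range(1, n_terms):
                for k in range(1, min(p, t) + 1):
                    psi[t] += ar_coeffs[k - 1] * psi[t - k]
            return psi

        pulse_shape = ar_to_ma([1.2, -0.6])   # illustrative coefficients, not fitted values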

  20. A Case-Series Test of the Interactive Two-Step Model of Lexical Access: Predicting Word Repetition from Picture Naming

    ERIC Educational Resources Information Center

    Dell, Gary S.; Martin, Nadine; Schwartz, Myrna F.

    2007-01-01

    Lexical access in language production, and particularly pathologies of lexical access, are often investigated by examining errors in picture naming and word repetition. In this article, we test a computational approach to lexical access, the two-step interactive model, by examining whether the model can quantitatively predict the repetition-error…

  1. A multi-tiered time-series modelling approach to forecasting respiratory syncytial virus incidence at the local level.

    PubMed

    Spaeder, M C; Fackler, J C

    2012-04-01

    Respiratory syncytial virus (RSV) is the most common cause of documented viral respiratory infections, and the leading cause of hospitalization, in young children. We performed a retrospective time-series analysis of all patients aged <18 years with laboratory-confirmed RSV within a network of multiple affiliated academic medical institutions. Forecasting models of weekly RSV incidence for the local community, inpatient paediatric hospital and paediatric intensive-care unit (PICU) were created. Ninety-five percent confidence intervals calculated around our models' 2-week forecasts were accurate to ±9·3, ±7·5 and ±1·5 cases/week for the local community, inpatient hospital and PICU, respectively. Our results suggest that time-series models may be useful tools in forecasting the burden of RSV infection at the local and institutional levels, helping communities and institutions to optimize distribution of resources based on the changing burden and severity of illness in their respective communities.

  2. [Application of ARIMA model to predict number of malaria cases in China].

    PubMed

    Hui-Yu, H; Hua-Qin, S; Shun-Xian, Z; Lin, A I; Yan, L U; Yu-Chun, C; Shi-Zhu, L I; Xue-Jiao, T; Chun-Li, Y; Wei, H U; Jia-Xu, C

    2017-08-15

    Objective To study the application of the autoregressive integrated moving average (ARIMA) model to predict the monthly reported malaria cases in China, so as to provide a reference for prevention and control of malaria. Methods SPSS 24.0 software was used to construct ARIMA models based on the monthly reported malaria case time series for 2006-2015 and 2011-2015, respectively. The data of malaria cases from January to December 2016 were used as validation data to compare the accuracy of the two ARIMA models. Results The models of the monthly reported cases of malaria in China were ARIMA(2,1,1)(1,1,0)12 and ARIMA(1,0,0)(1,1,0)12 (seasonal period 12), respectively. The comparison between the predictions of the two models and the actual numbers of malaria cases showed that the ARIMA model based on the 2011-2015 data had a higher forecasting accuracy than the model based on the 2006-2015 data. Conclusion The establishment and prediction of an ARIMA model is a dynamic process that needs to be adjusted continually as data accumulate; in addition, major changes in the epidemic characteristics of infectious diseases must be considered.
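
    A sketch of fitting one of the reported model forms, ARIMA(1,0,0)(1,1,0)12, with the statsmodels SARIMAX class; the synthetic monthly counts below stand in for the actual surveillance series and are purely illustrative.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Hypothetical monthly case counts for 2011-2015 (60 observations)
        rng = np.random.default_rng(3)
        idx = pd.date_range("2011-01", periods=60, freq="MS")
        cases = pd.Series(50 + 10 * np.sin(2 * np.pi * np.arange(60) / 12)
                          + rng.normal(0, 5, 60), index=idx)

        # ARIMA(1,0,0) with seasonal (1,1,0) at period 12
        fit = SARIMAX(cases, order=(1, 0, 0), seasonal_order=(1, 1, 0, 12)).fit(disp=False)
        forecast_2016 = fit.forecast(steps=12)   # to compare against the 2016 validation data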

  3. Convergence and divergence in spherical harmonic series of the gravitational field generated by high-resolution planetary topography—A case study for the Moon

    NASA Astrophysics Data System (ADS)

    Hirt, Christian; Kuhn, Michael

    2017-08-01

    Theoretically, spherical harmonic (SH) series expansions of the external gravitational potential are guaranteed to converge outside the Brillouin sphere enclosing all field-generating masses. Inside that sphere, the series may be convergent or may be divergent. The series convergence behavior is a highly unstable quantity that is little studied for high-resolution mass distributions. Here we shed light on the behavior of SH series expansions of the gravitational potential of the Moon. We present a set of systematic numerical experiments where the gravity field generated by the topographic masses is forward-modeled in spherical harmonics and with numerical integration techniques at various heights and different levels of resolution, increasing from harmonic degree 90 to 2160 (approximately 61 to 2.5 km scales). The numerical integration is free from any divergence issues and therefore suitable to reliably assess convergence versus divergence of the SH series. Our experiments provide unprecedented detailed insights into the divergence issue. We show that the SH gravity field of degree-180 topography is convergent anywhere in free space. When the resolution of the topographic mass model is increased to degree 360, divergence starts to affect very high degree gravity signals over regions deep inside the Brillouin sphere. For degree 2160 topography/gravity models, severe divergence (with several 1000 mGal amplitudes) prohibits accurate gravity modeling over most of the topography. As a key result, we formulate a new hypothesis to predict divergence: if the potential degree variances show a minimum, then the SH series expansions diverge somewhere inside the Brillouin sphere and modeling of the internal potential becomes relevant.

  4. Comparison of INAR(1)-Poisson model and Markov prediction model in forecasting the number of DHF patients in west java Indonesia

    NASA Astrophysics Data System (ADS)

    Ahdika, Atina; Lusiyana, Novyan

    2017-02-01

    The World Health Organization (WHO) has noted Indonesia as the country with the highest number of dengue haemorrhagic fever (DHF) cases in Southeast Asia. There is no vaccine or specific treatment for DHF, so prevention by both government and residents is one of the main available measures. Statistical methods for predicting the number of DHF cases can serve as a reference for such prevention efforts. In this paper, a discrete time series model, specifically the INAR(1)-Poisson model, and a Markov prediction model (MPM) are used to predict the number of DHF patients in West Java, Indonesia. The results show that the MPM is the better model, since it has the smallest mean absolute error (MAE) and mean absolute percentage error (MAPE).

  5. Time series analysis as input for clinical predictive modeling: Modeling cardiac arrest in a pediatric ICU

    PubMed Central

    2011-01-01

    Background Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. Methods We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Results Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9) training models for various data subsets; and 10) measuring model performance characteristics in unseen data to estimate their external validity. Conclusions We have proposed a ten step process that results in data sets that contain time series features and are suitable for predictive modeling by a number of methods. We illustrated the process through an example of cardiac arrest prediction in a pediatric intensive care setting. PMID:22023778

  6. Time series analysis as input for clinical predictive modeling: modeling cardiac arrest in a pediatric ICU.

    PubMed

    Kennedy, Curtis E; Turley, James P

    2011-10-24

    Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9) training models for various data subsets; and 10) measuring model performance characteristics in unseen data to estimate their external validity. We have proposed a ten step process that results in data sets that contain time series features and are suitable for predictive modeling by a number of methods. We illustrated the process through an example of cardiac arrest prediction in a pediatric intensive care setting.
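
    Step 6 above (calculating time series features as latent variables) can be illustrated with a simple windowed summary of a monitored signal; the window length, the chosen features and the hypothetical heart-rate series are illustrative only.

        import numpy as np

        def window_features(signal, window):
            """Summarize a vital-sign series over consecutive windows with simple
            time series features (level, trend slope, variability) usable as
            latent variables in a downstream prediction model."""
            signal = np.asarray(signal, float)
            feats = []
            for start in range(0, len(signal) - window + 1, window):
                seg = signal[start:start + window]
                slope = np.polyfit(np.arange(window), seg, 1)[0]   # deterioration trend
                feats.append([seg.mean(), slope, seg.std()])
            return np.array(feats)                                 # one row per window

        # Hypothetical 5-minute heart-rate means summarized over 1-hour windows
        hr = 120 + np.cumsum(np.random.default_rng(4).normal(0, 0.5, 288))
        features = window_features(hr, window=12)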

  7. 75 FR 68548 - Airworthiness Directives; Airbus Model A318, A319, A320, and A321 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-08

    ...: One case of elevator servo-control disconnection has been experienced on an aeroplane of the A320 family. Investigation has revealed that the failure occurred at the servo-control rod eye-end. Further to... servo-control rod eye-ends. In several cases, both actuators of the same elevator surface were affected...

  8. A Generalized Least Squares Regression Approach for Computing Effect Sizes in Single-Case Research: Application Examples

    ERIC Educational Resources Information Center

    Maggin, Daniel M.; Swaminathan, Hariharan; Rogers, Helen J.; O'Keeffe, Breda V.; Sugai, George; Horner, Robert H.

    2011-01-01

    A new method for deriving effect sizes from single-case designs is proposed. The strategy is applicable to small-sample time-series data with autoregressive errors. The method uses Generalized Least Squares (GLS) to model the autocorrelation of the data and estimate regression parameters to produce an effect size that represents the magnitude of…
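
    A minimal sketch of a GLS fit with autoregressive errors for a two-phase single-case series, using the statsmodels GLSAR class on hypothetical data; the standardized level change shown is a simple d-like quantity, not necessarily the exact effect-size estimator proposed in the article.

        import numpy as np
        import statsmodels.api as sm

        # Hypothetical single-case data: 10 baseline and 10 intervention sessions
        rng = np.random.default_rng(5)
        phase = np.repeat([0, 1], 10)                    # 0 = baseline, 1 = treatment
        y = 5 + 3 * phase + rng.normal(0, 1, 20)         # outcome with a level shift
        X = sm.add_constant(phase.astype(float))

        # GLS with AR(1) errors; iterative_fit alternates between estimating the
        # autocorrelation and the regression coefficients
        results = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=5)
        level_change = results.params[1]
        effect_size = level_change / np.sqrt(results.scale)   # standardized magnitude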

  9. Socialism. Grade Ten, Unit Two, 10.2. Comprehensive Social Studies Curriculum for the Inner City.

    ERIC Educational Resources Information Center

    Malone, Helen

    The socialism unit of the tenth grade level of the FICSS series (Focus on Inner City Social Studies -- see SO 008 271) explores a selected history of socialist thought and the theoretical model of socialism. Three case studies of socialism are explored: Great Britain, Sweden, and Israel. The case studies are designed to answer questions concerning…

  10. Evaluation of the effects of climate and man intervention on ground waters and their dependent ecosystems using time series analysis

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Stefanopoulos, Kyriakos

    2011-06-01

    Groundwaters and their dependent ecosystems are affected both by meteorological conditions and by human interventions, mainly in the form of groundwater abstractions for irrigation needs. This work aims at investigating the quantitative effects of meteorological conditions and human intervention on groundwater resources and their dependent ecosystems. Various seasonal Auto-Regressive Integrated Moving Average (ARIMA) models with external predictor variables were used in order to model the influence of meteorological conditions and human intervention on the groundwater level time series. Initially, a seasonal ARIMA model that simulates the abstraction time series using temperature (T) as the external predictor variable was prepared. Thereafter, seasonal ARIMA models were developed in order to simulate groundwater level time series at 8 monitoring locations, using the appropriate predictor variables determined for each individual case. The spatial component was introduced through the use of Geographical Information Systems (GIS). The proposed methodology was applied in the Neon Sidirochorion alluvial aquifer (Northern Greece), for which a 7-year long time series (i.e., 2003-2010) of piezometric and groundwater abstraction data exists. According to the developed ARIMA models, three distinct groups of groundwater level time series exist; the first one proves to be dependent only on the meteorological parameters, the second group demonstrates a mixed dependence both on meteorological conditions and on human intervention, whereas the third group shows a clear influence from human intervention. Moreover, there is evidence that groundwater abstraction has affected an important protected ecosystem.
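
    A sketch of a seasonal ARIMA model with an external predictor of the kind described above, using the statsmodels SARIMAX class; the monthly groundwater levels, temperatures, and model orders below are hypothetical and purely illustrative.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Hypothetical monthly series: groundwater level (m) and temperature (deg C)
        rng = np.random.default_rng(6)
        idx = pd.date_range("2003-01", periods=96, freq="MS")
        temp = pd.DataFrame({"temp": 15 + 10 * np.sin(2 * np.pi * np.arange(96) / 12)}, index=idx)
        level = pd.Series(30 - 0.05 * temp["temp"].values + rng.normal(0, 0.2, 96), index=idx)

        # Seasonal ARIMA with temperature as the external predictor variable
        fit = SARIMAX(level, exog=temp, order=(1, 0, 0), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
        temp_effect = fit.params["temp"]                          # estimated influence of temperature
        forecast = fit.forecast(steps=12, exog=temp.iloc[-12:])   # reuse last year's temperatures as a stand-in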

  11. Rainfall disaggregation for urban hydrology: Effects of spatial consistence

    NASA Astrophysics Data System (ADS)

    Müller, Hannes; Haberlandt, Uwe

    2015-04-01

    For urban hydrology, rainfall time series with a high temporal resolution are crucial. Observed time series of this kind are very short in most cases, so they cannot be used. In contrast, time series with a lower temporal resolution (daily measurements) exist for much longer periods. The objective is to derive time series with a long duration and a high resolution by disaggregating the time series of the non-recording stations with information from the time series of the recording stations. The multiplicative random cascade model is a well-known disaggregation model for daily time series. For urban hydrology it is often assumed that a day consists of only 1280 minutes in total as the starting point for the disaggregation process. We introduce a new variant of the cascade model, which works without this assumption and also outperforms the existing approach regarding time series characteristics like wet and dry spell duration, average intensity, fraction of dry intervals and extreme value representation. However, in both approaches the rainfall time series of different stations are disaggregated without consideration of the surrounding stations. This results in unrealistic spatial patterns of rainfall. We apply a simulated annealing algorithm that has previously been used successfully for hourly values. Relative diurnal cycles of the disaggregated time series are resampled to reproduce the spatial dependence of rainfall. To describe spatial dependence we use bivariate characteristics like probability of occurrence, continuity ratio and coefficient of correlation. The investigation area is a sewage system in Northern Germany. We show that the algorithm has the capability to improve spatial dependence. The influence of the chosen disaggregation routine and the spatial dependence on overflow occurrences and volumes of the sewage system will be analyzed.
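
    The multiplicative random cascade idea (repeatedly splitting a daily total between two halves, with the 1280-minute day allowing exactly 2^8 splits down to 5-minute intervals) can be sketched as follows; the branching probabilities and the beta-distributed weights are illustrative, not the calibrated values of the study.

        import numpy as np

        rng = np.random.default_rng(7)

        def cascade_disaggregate(daily_total, levels=8, p_first_dry=0.2, p_second_dry=0.2):
            """Split a daily rainfall total into 2**levels intervals
            (1280 min -> 256 x 5 min) by repeated random division."""
            amounts = np.array([daily_total], dtype=float)
            for _ in range(levels):
                u = rng.random(len(amounts))
                # fraction of each amount assigned to the first half-interval
                w = np.where(u < p_first_dry, 0.0,
                             np.where(u < p_first_dry + p_second_dry, 1.0,
                                      rng.beta(2.0, 2.0, len(amounts))))
                amounts = np.column_stack([amounts * w, amounts * (1.0 - w)]).ravel()
            return amounts   # fine-resolution series conserving the daily total

        five_min_series = cascade_disaggregate(24.0)   # e.g. a 24 mm day -> 256 five-minute values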

  12. Task 7: Endwall treatment inlet flow distortion analysis

    NASA Technical Reports Server (NTRS)

    Hall, E. J.; Topp, D. A.; Heidegger, N. J.; McNulty, G. S.; Weber, K. F.; Delaney, R. A.

    1996-01-01

    The overall objective of this study was to develop a 3-D numerical analysis for compressor casing treatment flowfields, and to perform a series of detailed numerical predictions to assess the effectiveness of various endwall treatments for enhancing the efficiency and stall margin of modern high speed fan rotors. Particular attention was given to examining the effectiveness of endwall treatments to counter the undesirable effects of inflow distortion. Calculations were performed using three different gridding techniques based on the type of casing treatment being tested and the level of complexity desired in the analysis. In each case, the casing treatment itself is modeled as a discrete object in the overall analysis, and the flow through the casing treatment is determined as part of the solution. A series of calculations were performed for both treated and untreated modern fan rotors both with and without inflow distortion. The effectiveness of the various treatments were quantified, and several physical mechanisms by which the effectiveness of endwall treatments is achieved are discussed.

  13. Bias correction of risk estimates in vaccine safety studies with rare adverse events using a self-controlled case series design.

    PubMed

    Zeng, Chan; Newcomer, Sophia R; Glanz, Jason M; Shoup, Jo Ann; Daley, Matthew F; Hambidge, Simon J; Xu, Stanley

    2013-12-15

    The self-controlled case series (SCCS) method is often used to examine the temporal association between vaccination and adverse events using only data from patients who experienced such events. Conditional Poisson regression models are used to estimate incidence rate ratios, and these models perform well with large or medium-sized case samples. However, in some vaccine safety studies, the adverse events studied are rare and the maximum likelihood estimates may be biased. Several bias correction methods have been examined in case-control studies using conditional logistic regression, but none of these methods have been evaluated in studies using the SCCS design. In this study, we used simulations to evaluate 2 bias correction approaches (the Firth penalized maximum likelihood method and Cordeiro and McCullagh's bias reduction after maximum likelihood estimation) with small sample sizes in studies using the SCCS design. The simulations showed that the bias under the SCCS design with a small number of cases can be large and is also sensitive to a short risk period. The Firth correction method provides finite and less biased estimates than the maximum likelihood method and Cordeiro and McCullagh's method. However, limitations still exist when the risk period in the SCCS design is short relative to the entire observation period.
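
    For orientation, the basic (uncorrected) SCCS fit can be written as a Poisson regression with person fixed effects and a log-time offset, which is equivalent to the conditional likelihood; the sketch below uses hypothetical person-period data and does not implement the Firth or Cordeiro-McCullagh corrections examined in the study.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical person-period data: one row per person per exposure period,
        # with event counts, period length in days, and a post-vaccination risk indicator
        data = pd.DataFrame({
            "person": [1, 1, 2, 2, 3, 3],
            "risk":   [1, 0, 1, 0, 1, 0],
            "events": [1, 0, 0, 1, 1, 1],
            "days":   [14, 351, 14, 351, 14, 351],
        })

        # Person fixed effects plus a log-time offset reproduce the SCCS
        # conditional Poisson likelihood; exp(coefficient of 'risk') is the
        # incidence rate ratio for the risk period
        fit = smf.glm("events ~ risk + C(person)", data=data,
                      family=sm.families.Poisson(),
                      offset=np.log(data["days"])).fit()
        irr = np.exp(fit.params["risk"])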

  14. Assessment of trend and seasonality in road accident data: an Iranian case study.

    PubMed

    Razzaghi, Alireza; Bahrampour, Abbas; Baneshi, Mohammad Reza; Zolala, Farzaneh

    2013-06-01

    Road traffic accidents and their related deaths have become a major concern, particularly in developing countries. Iran has adopted a series of policies and interventions to control the high number of accidents occurring over the past few years. In this study we used a time series model to understand the trend of accidents and to ascertain the viability of applying ARIMA models to data from Taybad city. This is a cross-sectional study. We used data from accidents occurring in Taybad between 2007 and 2011, obtained from the Ministry of Health (MOH), and applied the time series method with a time lag of one month. After plotting the trend, non-stationarity in variance and mean was removed using a Box-Cox transformation and differencing, respectively. The ACF and PACF plots were used to check stationarity. The traffic accidents in our study had an increasing trend over the five years of study. Based on the ACF and PACF plots obtained after applying the Box-Cox transformation and differencing, the data did not fit a time series model; therefore, neither an ARIMA model nor seasonality was identified. Traffic accidents in Taybad have an upward trend. In addition, we expected an AR, MA or ARIMA model with a seasonal component, yet this was not observed in this analysis. Several reasons may have contributed to this situation, such as uncertainty about the quality of the data, weather changes, and behavioural factors that are not taken into account by time series analysis.

  15. Study on common seasonal signals in GPS time series and environmental loadings using Multichannel Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Gruszczynska, Marta; Rosat, Severine; Klos, Anna; Bogusz, Janusz

    2017-04-01

    Seasonal oscillations in GPS position time series can arise from real geophysical effects and from numerical artefacts. According to Dong et al. (2002), environmental loading effects can account for approximately 40% of the total variance of the annual signals in GPS time series; however, using generally acknowledged methods (e.g. Least Squares Estimation, Wavelet Decomposition, Singular Spectrum Analysis) to model seasonal signals, we are not able to separate real from spurious signals (effects of mismodelling aliased into the annual period, as well as draconitic signals). Therefore, we propose to use Multichannel Singular Spectrum Analysis (MSSA) to determine seasonal oscillations (with annual and semi-annual periods) from GPS position time series and environmental loading displacement models. The MSSA approach is an extension of the classical Karhunen-Loève method and is a special case of SSA for multivariate time series. The main advantage of MSSA is the possibility to extract common seasonal signals for stations from a selected area and to investigate the causality between a set of time series. In this research, we explored the ability of MSSA to separate real geophysical effects from spurious effects in GPS time series. For this purpose, we used GPS position changes and environmental loading models. We analysed the topocentric time series from 250 selected stations located worldwide, delivered from the Network Solution obtained by the International GNSS Service (IGS) as a contribution to the latest realization of the International Terrestrial Reference System (namely ITRF2014, Rebischung et al., 2016). We also examined atmospheric, hydrological and non-tidal oceanic loading models provided by the EOST/IPGS Loading Service in the Centre-of-Figure (CF) reference frame. The analysed displacements were estimated from ERA-Interim (surface pressure), MERRA-land (soil moisture and snow) as well as ECCO2 ocean bottom pressure. We used Multichannel Singular Spectrum Analysis to determine common seasonal signals in two case studies, adopting a 3-year lag window as the optimal window size. We also inferred the statistical significance of the oscillations through the Monte Carlo MSSA method (Allen and Robertson, 1996). In the first case study, we investigated the common spatio-temporal seasonal signals for all stations. For this purpose, we divided the selected stations with respect to the continents. For instance, for stations located in Europe, seasonal oscillations account for approximately 45% of the GPS-derived data variance. A much higher variance of seasonal signals is explained by hydrological loadings, about 92%, while the non-tidal oceanic loading accounted for 31% of the total variance. In the second case study, we analysed the capability of the MSSA method to establish a causality between several time series. Each estimated Principal Component represents a pattern of the common signal for all analysed data. For the ZIMM station (Zimmerwald, Switzerland), the 1st, 2nd and 9th, 10th Principal Components, which account for 35% of the variance, correspond to the annual and semi-annual signals. In this part, we applied the non-parametric MSSA approach to extract the common seasonal signals for GPS time series and environmental loadings for each of the 250 stations, with the clear statement that part of the seasonal signal reflects real geophysical effects.
REFERENCES: 1. Allen, M. and Robertson, A.: 1996, Distinguishing modulated oscillations from coloured noise in multivariate datasets. Climate Dynamics, 12, No. 11, 775-784. DOI: 10.1007/s003820050142. 2. Dong, D., Fang, P., Bock, Y., Cheng, M.K. and Miyazaki, S.: 2002, Anatomy of apparent seasonal variations from GPS-derived site position time series. Journal of Geophysical Research, 107, No. B4, 2075. DOI: 10.1029/2001JB000573. 3. Rebischung, P., Altamimi, Z., Ray, J. and Garayt, B.: 2016, The IGS contribution to ITRF2014. Journal of Geodesy, 90, No. 7, 611-630. DOI: 10.1007/s00190-016-0897-6.

  16. Improvement of downscaled rainfall and temperature across generations over the Western Himalayan region of India

    NASA Astrophysics Data System (ADS)

    Das, L.; Dutta, M.; Akhter, J.; Meher, J. K.

    2016-12-01

    It is a challenging task to create station-level (local-scale) climate change information over the mountainous locations of the Western Himalayan Region (WHR) in India because of limited data availability and poor data quality. In the present study, missing values in the station data were handled with the Multiple Imputation by Chained Equations (MICE) technique. Data from 22 rain gauge stations and 16 temperature stations, with continuous records during 1901-2005 and 1969-2009 respectively, were used as reference stations for developing downscaled rainfall and temperature time series from five commonly available GCMs in the IPCC's different generation assessment reports, namely the 2nd, 3rd, 4th and 5th, hereafter known as SAR, TAR, AR4 and AR5 respectively. Downscaling models were developed using the combined data from the ERA-Interim reanalysis and the GCMs' historical runs (even though the forcings were not identical across generations) as predictors and station-level rainfall and temperature as predictands. Station-level downscaled rainfall and temperature time series were constructed for the five GCMs available in each generation. A regionally averaged downscaled time series comprising all stations was prepared for each model and generation, and the downscaled results were compared with the observed time series. Finally, an Overall Model Improvement Index (OMII) was developed from the downscaling results and used to investigate model improvement across generations as well as the improvement of downscaling results obtained from the Empirical Statistical Downscaling (ESD) methods. In the case of temperature, models have improved from SAR to AR5 over the study area. In almost all the GCMs, TAR shows the worst performance over the WHR according to the different statistical indices used in this study. In the case of precipitation, no model has shown gradual improvement from SAR to AR5 for either interpolated or downscaled values.

  17. Adapting dialectical behavior therapy for outpatient adult anorexia nervosa--a pilot study.

    PubMed

    Chen, Eunice Y; Segal, Kay; Weissman, Jessica; Zeffiro, Thomas A; Gallop, Robert; Linehan, Marsha M; Bohus, Martin; Lynch, Thomas R

    2015-01-01

    Anorexia Nervosa (AN) is associated with excessive self-control. This iterative case series describes the augmentation of Dialectical Behavior Therapy (DBT) for outpatient adult AN with skills addressing emotional and behavioral overcontrol. An overly controlled style is theorized to develop from the transaction between an individual with heightened threat sensitivity and reduced reward sensitivity and an environment that reinforces overcontrol and punishes imperfection. Case Series 1 utilized standard DBT, resulting in retention of 5/6 patients and a body mass index (BMI) effect size increase of d = -0.5 from pre- to post-treatment. Case Series 2, using standard DBT augmented with skills addressing overcontrol, resulted in retention of 8/9 patients with an effect size increase in BMI at post-treatment that was maintained at 6- and 12-month follow-up (d = -1.12, d = -0.87, and d = -1.12). Findings suggest that skills training targeting rigidity and increasing openness and social connectedness warrants further study of this model and treatment for AN. © 2014 Wiley Periodicals, Inc.

  18. Time series analysis of gold production in Malaysia

    NASA Astrophysics Data System (ADS)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

    Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medicine. In Malaysia, gold mining is limited to several areas, such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for gold production in Malaysia, which can then be used to predict Malaysia's future gold production. The Box-Jenkins time series method was used to perform the analysis with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, prediction accuracy is tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best-fitted model, with a MAPE of 3.704%, indicating that the predictions are highly accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
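
    A hedged sketch of the Box-Jenkins workflow described above, using statsmodels; the synthetic series is only a stand-in for the gold-production figures, which are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
production = pd.Series(np.cumsum(rng.normal(50, 10, 40)))   # placeholder annual data

train, test = production[:-5], production[-5:]
fit = ARIMA(train, order=(3, 1, 1)).fit()                   # estimation step
print(fit.summary())                                        # aids diagnostic checking
forecast = fit.forecast(steps=len(test))                    # forecasting step

mape = np.mean(np.abs((test.values - forecast.values) / test.values)) * 100
print(f"MAPE = {mape:.3f}%")
```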

  19. Exploratory wavelet analysis of dengue seasonal patterns in Colombia.

    PubMed

    Fernández-Niño, Julián Alfredo; Cárdenas-Cárdenas, Luz Mery; Hernández-Ávila, Juan Eugenio; Palacio-Mejía, Lina Sofía; Castañeda-Orjuela, Carlos Andrés

    2015-12-04

    Dengue has a seasonal behavior associated with climatic changes, vector cycles, circulating serotypes, and population dynamics. Wavelet analysis makes it possible to decompose a very long time series into calendar time and periods. This is the first time this technique has been used in an exploratory manner to model the behavior of dengue in Colombia. The objective was to explore the annual seasonal dengue patterns in Colombia and in its five most endemic municipalities for the period 2007 to 2012, and roughly annual cycles between 1978 and 2013 at the national level. We carried out an exploratory wavelet analysis using data from all incident cases of dengue per epidemiological week for the period 2007 to 2012, and per year for 1978 to 2013. We used a first-order autoregressive model as the null hypothesis. The effect of the 2010 epidemic was evident in both the national time series and the series for the five municipalities. Differences in interannual seasonal patterns were observed among municipalities. In addition, we identified roughly annual cycles of 2 to 5 years since 2004 at the national level. Wavelet analysis is useful for studying long time series containing changing seasonal patterns, as is the case for dengue in Colombia, and for identifying differences among regions. These patterns need to be explored at smaller aggregate levels, and their relationships with different predictive variables need to be investigated.
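
    The wavelet-versus-red-noise idea can be sketched as follows, assuming PyWavelets is available for the continuous wavelet transform: the power of an observed weekly series is compared pointwise against AR(1) surrogates, which play the role of the first-order autoregressive null hypothesis. The simulated incidence data, the number of surrogates and the pointwise 95% test are illustrative simplifications, not the paper's exact Monte Carlo procedure.

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)
n = 312                                             # six years of epidemiological weeks
t = np.arange(n)
cases = 50 + 30 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 10, n)

scales = np.arange(2, 128)
x = cases - cases.mean()
coef, freqs = pywt.cwt(x, scales, "morl", sampling_period=1.0)
power = np.abs(coef) ** 2                           # scale x time wavelet power

# AR(1) surrogates with the lag-1 autocorrelation of the data (the null model).
phi = np.corrcoef(x[:-1], x[1:])[0, 1]
null_power = []
for _ in range(100):
    e = rng.normal(0, x.std() * np.sqrt(1 - phi ** 2), n)
    s = np.zeros(n)
    for i in range(1, n):
        s[i] = phi * s[i - 1] + e[i]
    c, _ = pywt.cwt(s, scales, "morl", sampling_period=1.0)
    null_power.append(np.abs(c) ** 2)

threshold = np.percentile(np.stack(null_power), 95, axis=0)
significant = power > threshold                     # pointwise exceedance of the red-noise null
```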

  20. Model-based prediction of nephropathia epidemica outbreaks based on climatological and vegetation data and bank vole population dynamics.

    PubMed

    Haredasht, S Amirpour; Taylor, C J; Maes, P; Verstraeten, W W; Clement, J; Barrios, M; Lagrou, K; Van Ranst, M; Coppin, P; Berckmans, D; Aerts, J-M

    2013-11-01

    Wildlife-originated zoonotic diseases in general are a major contributor to emerging infectious diseases. Hantaviruses more specifically cause thousands of human disease cases annually worldwide, while understanding and predicting human hantavirus epidemics pose numerous unsolved challenges. Nephropathia epidemica (NE) is a human infection caused by Puumala virus, which is naturally carried and shed by bank voles (Myodes glareolus). The objective of this study was to develop a method that allows model-based prediction of NE epidemics 3 months ahead. Two data sets, concerning NE cases in Finland and Belgium, were utilized to develop and test the models. In this study, we selected the most relevant inputs from all the available data for use in a dynamic linear regression (DLR) model. The number of NE cases in Finland was modelled using data from 1996 to 2008. The NE cases were predicted from the time series of average monthly air temperature (°C) and the bank vole trapping index using a DLR model. The bank vole trapping index data were interpolated using a related dynamic harmonic regression (DHR) model. Here, the DLR and DHR models used time-varying parameters, and both were based on a unified state-space estimation framework. For the Belgian case, no time series of bank vole population dynamics were available. Several studies, however, have suggested that the bank vole population is related to the variation in seed production of beech and oak trees in Northern Europe. Therefore, the NE occurrence pattern in Belgium was predicted with a DLR model using remotely sensed phenology parameters of broad-leaved forests, together with the oak and beech seed categories and average monthly air temperature (°C), using data from 2001 to 2009. Our results suggest that even without any knowledge about hantavirus dynamics in the host population, the time variation in NE outbreaks in Finland could be predicted 3 months ahead with a 34% mean relative prediction error (MRPE), taking into account solely the population dynamics of the carrier species (bank voles). The time series analysis also revealed that climate change, as represented by the vegetation index, changes in forest phenology derived from satellite images and directly measured air temperature, may affect the mechanics of NE transmission. NE outbreaks in Belgium were predicted 3 months ahead with a 40% MRPE based only on the climatological and vegetation data, in this case without any knowledge of bank vole population dynamics. In this research, we demonstrated that NE outbreaks can be predicted using climate and vegetation data or the bank vole population dynamics, by using dynamic data-based models with time-varying parameters. Such a predictive modelling approach might be used as a step towards the development of new tools for the prevention of future NE outbreaks. © 2012 Blackwell Verlag GmbH.
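
    A minimal sketch of a dynamic linear regression with random-walk (time-varying) parameters, in the spirit of the state-space DLR described above: case counts are regressed on temperature and a trapping index with a hand-written Kalman filter. The synthetic data and the noise variances are illustrative assumptions.

```python
import numpy as np

def dlr_filter(y, X, q=1e-3, r=1.0):
    """Kalman filter for y_t = X_t . beta_t + e_t, with beta_t = beta_{t-1} + w_t."""
    n, k = X.shape
    beta = np.zeros(k)
    P = np.eye(k) * 10.0                      # diffuse initial uncertainty
    Q, R = np.eye(k) * q, r
    fitted, betas = np.zeros(n), np.zeros((n, k))
    for t in range(n):
        P = P + Q                             # time update (random-walk parameters)
        x = X[t]
        S = x @ P @ x + R                     # innovation variance
        K = P @ x / S                         # Kalman gain
        fitted[t] = x @ beta
        beta = beta + K * (y[t] - fitted[t])  # measurement update
        P = P - np.outer(K, x @ P)
        betas[t] = beta
    return fitted, betas

# Toy usage with synthetic monthly inputs (temperature and a vole trapping index).
rng = np.random.default_rng(3)
n = 120
temp = rng.normal(5, 8, n)
voles = np.abs(rng.normal(1, 0.5, n))
X = np.column_stack([np.ones(n), temp, voles])
y = 2 + 0.1 * temp + 3 * voles + rng.normal(0, 1, n)
fitted, betas = dlr_filter(y, X)
```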

  1. Signal detection of adverse events with imperfect confirmation rates in vaccine safety studies using self-controlled case series design.

    PubMed

    Xu, Stanley; Newcomer, Sophia; Nelson, Jennifer; Qian, Lei; McClure, David; Pan, Yi; Zeng, Chan; Glanz, Jason

    2014-05-01

    The Vaccine Safety Datalink project captures electronic health record data including vaccinations and medically attended adverse events on 8.8 million enrollees annually from participating managed care organizations in the United States. While the automated vaccination data are generally of high quality, a presumptive adverse event based on diagnosis codes in automated health care data may not be true (misclassification). Consequently, analyses using automated health care data can generate false positive results, where an association between the vaccine and outcome is incorrectly identified, as well as false negative findings, where a true association or signal is missed. We developed novel conditional Poisson regression models and fixed effects models that accommodate misclassification of adverse event outcome for self-controlled case series design. We conducted simulation studies to evaluate their performance in signal detection in vaccine safety hypotheses generating (screening) studies. We also reanalyzed four previously identified signals in a recent vaccine safety study using the newly proposed models. Our simulation studies demonstrated that (i) outcome misclassification resulted in both false positive and false negative signals in screening studies; (ii) the newly proposed models reduced both the rates of false positive and false negative signals. In reanalyses of four previously identified signals using the novel statistical models, the incidence rate ratio estimates and statistical significances were similar to those using conventional models and including only medical record review confirmed cases. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
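
    Only the standard self-controlled case series likelihood is sketched below: conditional on each case's total event count, the events follow a multinomial across that case's own risk and baseline periods, and the log incidence rate ratio is found by maximum likelihood. The misclassification-adjusted models introduced in the paper are not reproduced; window lengths, baseline rate and the simulated data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# One row per case, one column per period: observation-time length, exposure flag, events.
lengths = np.array([[30.0, 335.0]] * 200)           # 30-day risk window vs rest of the year
exposed = np.array([[1.0, 0.0]] * 200)
rng = np.random.default_rng(4)
true_irr = 2.0
rates = np.exp(np.log(true_irr) * exposed) * lengths
events = rng.poisson(0.01 * rates)                  # simulated event counts

def neg_loglik(beta):
    # Multinomial likelihood of each case's events across its own periods.
    w = lengths * np.exp(beta * exposed)
    p = w / w.sum(axis=1, keepdims=True)
    return -(events * np.log(p)).sum()

fit = minimize_scalar(neg_loglik, bounds=(-5, 5), method="bounded")
print("estimated incidence rate ratio:", np.exp(fit.x))
```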

  2. GSTARS computer models and their applications, Part II: Applications

    USGS Publications Warehouse

    Simoes, F.J.M.; Yang, C.T.

    2008-01-01

    In part 1 of this two-paper series, a brief summary of the basic concepts and theories used in developing the Generalized Stream Tube model for Alluvial River Simulation (GSTARS) computer models was presented. Part 2 provides examples that illustrate some of the capabilities of the GSTARS models and how they can be applied to solve a wide range of river and reservoir sedimentation problems. Laboratory and field case studies are used and the examples show representative applications of the earlier and of the more recent versions of GSTARS. Some of the more recent capabilities implemented in GSTARS3, one of the latest versions of the series, are also discussed here with more detail. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  3. A comparison of performance of several artificial intelligence methods for forecasting monthly discharge time series

    NASA Astrophysics Data System (ADS)

    Wang, Wen-Chuan; Chau, Kwok-Wing; Cheng, Chun-Tian; Qiu, Lin

    2009-08-01

    Developing a hydrological forecasting model based on past records is crucial to effective hydropower reservoir management and scheduling. Traditionally, time series analysis and modeling are used to build mathematical models for generating hydrologic records in hydrology and water resources. Artificial intelligence (AI), as a branch of computer science, is capable of analyzing long and large-scale hydrological data series, and applying AI techniques to hydrological forecasting has become a prominent research topic in recent years. In this paper, autoregressive moving-average (ARMA) models, artificial neural network (ANN) approaches, adaptive neural-based fuzzy inference system (ANFIS) techniques, genetic programming (GP) models and the support vector machine (SVM) method are examined using long-term observations of monthly river flow discharges. Four standard quantitative statistical performance evaluation measures, the coefficient of correlation (R), the Nash-Sutcliffe efficiency coefficient (E), the root mean squared error (RMSE) and the mean absolute percentage error (MAPE), are employed to evaluate the performance of the various models developed. Two case-study river sites are also provided to illustrate their respective performances. The results indicate that the best performance can be obtained by ANFIS, GP and SVM, in terms of different evaluation criteria, during the training and validation phases.
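
    The four evaluation measures named above can be computed with a small numpy helper like the sketch below; the example observed/simulated discharge pair is illustrative.

```python
import numpy as np

def evaluate(obs, sim):
    """Return R, Nash-Sutcliffe E, RMSE and MAPE for an observed/simulated pair."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]                                       # correlation
    e = 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)    # Nash-Sutcliffe
    rmse = np.sqrt(np.mean((obs - sim) ** 2))
    mape = np.mean(np.abs((obs - sim) / obs)) * 100                       # assumes no zero flows
    return {"R": r, "E": e, "RMSE": rmse, "MAPE": mape}

print(evaluate([10.0, 12.0, 9.0, 14.0], [11.0, 11.5, 9.5, 13.0]))
```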

  4. Treatment of Obsessive Compulsive Disorder in Young Children: An Intervention Model and Case Series

    ERIC Educational Resources Information Center

    Ginsburg, Golda S.; Burstein, Marcy; Becker, Kimberly D.; Drake, Kelly L.

    2011-01-01

    This article presents an intervention model for young children with obsessive-compulsive disorder (OCD). The intervention, designed to reduce compulsive behavior and improve parenting practices, was tested using a multiple baseline design with 7 children (M = 6 years old; 57% female) in which participants were randomly assigned to 1, 2, or 3 weeks…

  5. A view on coupled cluster perturbation theory using a bivariational Lagrangian formulation.

    PubMed

    Kristensen, Kasper; Eriksen, Janus J; Matthews, Devin A; Olsen, Jeppe; Jørgensen, Poul

    2016-02-14

    We consider two distinct coupled cluster (CC) perturbation series that both expand the difference between the energies of the CCSD (CC with single and double excitations) and CCSDT (CC with single, double, and triple excitations) models in orders of the Møller-Plesset fluctuation potential. We initially introduce the E-CCSD(T-n) series, in which the CCSD amplitude equations are satisfied at the expansion point, and compare it to the recently developed CCSD(T-n) series [J. J. Eriksen et al., J. Chem. Phys. 140, 064108 (2014)], in which not only the CCSD amplitude, but also the CCSD multiplier equations are satisfied at the expansion point. The computational scaling is similar for the two series, and both are term-wise size extensive with a formal convergence towards the CCSDT target energy. However, the two series are different, and the CCSD(T-n) series is found to exhibit a more rapid convergence up through the series, which we trace back to the fact that more information at the expansion point is utilized than for the E-CCSD(T-n) series. The present analysis can be generalized to any perturbation expansion representing the difference between a parent CC model and a higher-level target CC model. In general, we demonstrate that, whenever the parent parameters depend upon the perturbation operator, a perturbation expansion of the CC energy (where only parent amplitudes are used) differs from a perturbation expansion of the CC Lagrangian (where both parent amplitudes and parent multipliers are used). For the latter case, the bivariational Lagrangian formulation becomes more than a convenient mathematical tool, since it facilitates a different and faster convergent perturbation series than the simpler energy-based expansion.

  6. The Effects of an Intervention Combining Peer Tutoring with Story Mapping on the Text Comprehension of Struggling Readers: A Case Report

    ERIC Educational Resources Information Center

    Grünke, Matthias; Leidig, Tatjana

    2017-01-01

    This single-case study tested a peer tutoring model using a visualizing strategy (story mapping) to teach struggling students better text comprehension. Three teams each consisting of a tutor and a tutee attending a fourth-grade general education classroom participated in the experiment. A short series of observations was carried out before and…

  7. Forecasting malaria incidence based on monthly case reports and environmental factors in Karuzi, Burundi, 1997–2003

    PubMed Central

    Gomez-Elipe, Alberto; Otero, Angel; van Herp, Michel; Aguirre-Jaime, Armando

    2007-01-01

    Background The objective of this work was to develop a model to predict malaria incidence in an area of unstable transmission by studying the association between environmental variables and disease dynamics. Methods The study was carried out in Karuzi, a province in the Burundi highlands, using time series of monthly notifications of malaria cases from local health facilities, data from rain and temperature records, and the normalized difference vegetation index (NDVI). Using autoregressive integrated moving average (ARIMA) methodology, a model showing the relation between monthly notifications of malaria cases and the environmental variables was developed. Results The best forecasting model (R2adj = 82%, p < 0.0001 and 93% forecasting accuracy in the range ± 4 cases per 100 inhabitants) included the NDVI, mean maximum temperature, rainfall and number of malaria cases in the preceding month. Conclusion This model is a simple and useful tool for producing reasonably reliable forecasts of the malaria incidence rate in the study area. PMID:17892540
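
    A hedged sketch of a forecasting model with environmental covariates, using statsmodels ARIMA with an exog term; the lagged-covariate names, the ARIMA order and the synthetic data are illustrative assumptions, not the fitted model of the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
n = 84                                              # seven years of monthly records
env = pd.DataFrame({
    "ndvi_lag1": rng.uniform(0.2, 0.8, n),
    "tmax_lag1": rng.normal(28, 3, n),
    "rain_lag1": rng.gamma(2, 40, n),
})
incidence = (2 + 5 * env["ndvi_lag1"] + 0.02 * env["rain_lag1"]
             + rng.normal(0, 1, n))                 # cases per 100 inhabitants (synthetic)

train = slice(0, n - 12)
model = ARIMA(incidence[train], exog=env[train], order=(1, 0, 1)).fit()
pred = model.forecast(steps=12, exog=env[n - 12:])  # one-year-ahead forecasts
within_4 = np.mean(np.abs(pred.values - incidence[n - 12:].values) <= 4)
print(f"share of forecasts within ±4 cases/100: {within_4:.0%}")
```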

  8. Power Laws, Scale Invariance and the Generalized Frobenius Series:

    NASA Astrophysics Data System (ADS)

    Visser, Matt; Yunes, Nicolas

    We present a self-contained formalism for calculating the background solution, the linearized solutions and a class of generalized Frobenius-like solutions to a system of scale-invariant differential equations. We first cast the scale-invariant model into its equidimensional and autonomous forms, find its fixed points, and then obtain power-law background solutions. After linearizing about these fixed points, we find a second linearized solution, which provides a distinct collection of power laws characterizing the deviations from the fixed point. We prove that generically there will be a region surrounding the fixed point in which the complete general solution can be represented as a generalized Frobenius-like power series with exponents that are integer multiples of the exponents arising in the linearized problem. While discussions of the linearized system are common, and one can often find a discussion of power-series with integer exponents, power series with irrational (indeed complex) exponents are much rarer in the extant literature. The Frobenius-like series we encounter can be viewed as a variant of the rarely-discussed Liapunov expansion theorem (not to be confused with the more commonly encountered Liapunov functions and Liapunov exponents). As specific examples we apply these ideas to Newtonian and relativistic isothermal stars and construct two separate power series with the overlapping radius of convergence. The second of these power series solutions represents an expansion around "spatial infinity," and in realistic models it is this second power series that gives information about the stellar core, and the damped oscillations in core mass and core radius as the central pressure goes to infinity. The power-series solutions we obtain extend classical results; as exemplified for instance by the work of Lane, Emden, and Chandrasekhar in the Newtonian case, and that of Harrison, Thorne, Wakano, and Wheeler in the relativistic case. We also indicate how to extend these ideas to situations where fixed points may not exist — either due to "monotone" flow or due to the presence of limit cycles. Monotone flow generically leads to logarithmic deviations from scaling, while limit cycles generally lead to discrete self-similar solutions.

  9. Estimating parameter values of a socio-hydrological flood model

    NASA Astrophysics Data System (ADS)

    Holkje Barendrecht, Marlies; Viglione, Alberto; Kreibich, Heidi; Vorogushyn, Sergiy; Merz, Bruno; Blöschl, Günter

    2018-06-01

    Socio-hydrological modelling studies that have been published so far show that dynamic coupled human-flood models are a promising tool to represent the phenomena and the feedbacks in human-flood systems. So far these models are mostly generic and have not been developed and calibrated to represent specific case studies. We believe that applying and calibrating these type of models to real world case studies can help us to further develop our understanding about the phenomena that occur in these systems. In this paper we propose a method to estimate the parameter values of a socio-hydrological model and we test it by applying it to an artificial case study. We postulate a model that describes the feedbacks between floods, awareness and preparedness. After simulating hypothetical time series with a given combination of parameters, we sample few data points for our variables and try to estimate the parameters given these data points using Bayesian Inference. The results show that, if we are able to collect data for our case study, we would, in theory, be able to estimate the parameter values for our socio-hydrological flood model.
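
    A generic Metropolis sketch of the Bayesian parameter-estimation step: a toy "awareness decays after a flood" curve with two parameters is fitted to a handful of noisy observations. The model form, priors and data are illustrative assumptions, not the paper's socio-hydrological model.

```python
import numpy as np

rng = np.random.default_rng(6)

def awareness(t, a0, decay):
    return a0 * np.exp(-decay * t)              # toy post-flood awareness curve

t_obs = np.array([1.0, 3.0, 6.0, 10.0, 15.0])   # only a few sampled data points
y_obs = awareness(t_obs, 0.8, 0.15) + rng.normal(0, 0.03, t_obs.size)

def log_post(theta):
    a0, decay = theta
    if not (0 < a0 < 1 and 0 < decay < 1):      # flat priors on plausible ranges
        return -np.inf
    resid = y_obs - awareness(t_obs, a0, decay)
    return -0.5 * np.sum((resid / 0.03) ** 2)   # Gaussian likelihood, known sigma

theta = np.array([0.5, 0.5])
chain, lp = [], log_post(theta)
for _ in range(20000):
    prop = theta + rng.normal(0, 0.02, 2)       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:     # Metropolis acceptance
        theta, lp = prop, lp_prop
    chain.append(theta)
posterior = np.array(chain[5000:])              # discard burn-in
print("posterior means:", posterior.mean(axis=0))
```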

  10. Calculation of Rate Spectra from Noisy Time Series Data

    PubMed Central

    Voelz, Vincent A.; Pande, Vijay S.

    2011-01-01

    As the resolution of experiments to measure folding kinetics continues to improve, it has become imperative to avoid bias that may come with fitting data to a predetermined mechanistic model. Towards this end, we present a rate spectrum approach to analyze timescales present in kinetic data. Computing rate spectra of noisy time series data via numerical discrete inverse Laplace transform is an ill-conditioned inverse problem, so a regularization procedure must be used to perform the calculation. Here, we show the results of different regularization procedures applied to noisy multi-exponential and stretched exponential time series, as well as data from time-resolved folding kinetics experiments. In each case, the rate spectrum method recapitulates the relevant distribution of timescales present in the data, with different priors on the rate amplitudes naturally corresponding to common biases toward simple phenomenological models. These results suggest an attractive alternative to the “Occam’s razor” philosophy of simply choosing models with the fewest number of relaxation rates. PMID:22095854
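
    A minimal sketch of the rate-spectrum idea: a noisy relaxation trace is fitted as a non-negative sum of exponentials on a fixed log-spaced rate grid, with Tikhonov regularization implemented by augmenting the design matrix before a non-negative least-squares solve. The regularization strength is an illustrative choice, and this is only one of the regularization schemes discussed above.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(7)
t = np.linspace(0, 10, 400)
signal = 0.7 * np.exp(-0.5 * t) + 0.3 * np.exp(-5.0 * t) + rng.normal(0, 0.01, t.size)

rates = np.logspace(-2, 2, 100)                  # trial rate grid (1/time)
K = np.exp(-np.outer(t, rates))                  # discrete Laplace-like kernel
lam = 0.05
A = np.vstack([K, np.sqrt(lam) * np.eye(rates.size)])   # Tikhonov augmentation
b = np.concatenate([signal, np.zeros(rates.size)])
amplitudes, _ = nnls(A, b)                       # non-negative, regularized rate spectrum

print("recovered rates near:", rates[amplitudes > 0.05])
```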

  11. A retrospective study of gastric dilatation and gastric dilatation and volvulus in working farm dogs in New Zealand.

    PubMed

    Hendriks, M M; Hill, K E; Cogger, N; Jones, B R; Cave, N J

    2012-05-01

    To present findings from a case series of gastric dilatation (GD) or gastric dilatation and volvulus (GDV) in working farm dogs in New Zealand that were examined at veterinary clinics, and to identify possible risk factors for GD or GDV in working farm dogs in New Zealand using a case-control study. This retrospective study included a case-series and a case-control study. The case series analysed information from 62 case records of GD or GDV in working farm dogs seen between August 2004 and September 2009 at 13 veterinary clinics throughout New Zealand. Cases were classified as GD or GDV if the diagnosis was confirmed by radiography, surgery or post-mortem examination. Details of history and treatment, as well as outcomes, were obtained for each case. For the case-control study, records of 41 working farm dogs with GD or GDV (cases) seen between April 2008 and April 2009, and 82 working farm dogs examined because of trauma over the same period and in the same 13 clinics (controls), were used to model the risk factors for GD or GDV. From the case-series study, 40/62 (65%) cases of GD or GDV that were examined and treated at the veterinary clinics returned to work. Of the 41 dogs where the gastric contents were recorded, 25 (61%) had predominantly food or bones in the stomach, and 26/27 dogs had a history of having eaten meat, bones or scavenged a carcass. The case-control study showed that the significant risk factors for GD or GDV, compared with control dogs presenting with trauma, were breed, age and season. The odds that a case of GD or GDV was a Huntaway, after adjusting for age and season, was 19 times higher than the odds a control was a Huntaway. Gender and bodyweight were not identified as risk factors. A high proportion of farm working dogs with GD or GDV were successfully treated by veterinarians. The risk of a case of GD or GDV being a Huntaway was significantly higher than for a dog presenting as a trauma case. However the influences of the season of the year, climatic factors and nutritional factors on the pathogenesis need to be identified before adequate preventative measures can be recommended.

  12. Effects of linear trends on estimation of noise in GNSS position time-series

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dmitrieva, K.; Segall, P.; Bradley, A. M.

    A thorough understanding of time-dependent noise in Global Navigation Satellite System (GNSS) position time-series is necessary for computing uncertainties in any signals found in the data. However, estimation of time-correlated noise is a challenging task and is complicated by the difficulty in separating noise from signal, the features of greatest interest in the time-series. In this study, we investigate how linear trends affect the estimation of noise in daily GNSS position time-series. We use synthetic time-series to study the relationship between linear trends and estimates of time-correlated noise for the six most commonly cited noise models. We find that the effects of added linear trends, or conversely de-trending, vary depending on the noise model. The commonly adopted model of random walk (RW), flicker noise (FN) and white noise (WN) is the most severely affected by de-trending, with estimates of low-amplitude RW most severely biased. FN plus WN is least affected by adding or removing trends. Non-integer power-law noise estimates are also less affected by de-trending, but are very sensitive to the addition of trend when the spectral index is less than one. We derive an analytical relationship between linear trends and the estimated RW variance for the special case of pure RW noise. Finally, overall, we find that to ascertain the correct noise model for GNSS position time-series and to estimate the correct noise parameters, it is important to have independent constraints on the actual trends in the data.

  13. Effects of linear trends on estimation of noise in GNSS position time-series

    NASA Astrophysics Data System (ADS)

    Dmitrieva, K.; Segall, P.; Bradley, A. M.

    2017-01-01

    A thorough understanding of time-dependent noise in Global Navigation Satellite System (GNSS) position time-series is necessary for computing uncertainties in any signals found in the data. However, estimation of time-correlated noise is a challenging task and is complicated by the difficulty in separating noise from signal, the features of greatest interest in the time-series. In this paper, we investigate how linear trends affect the estimation of noise in daily GNSS position time-series. We use synthetic time-series to study the relationship between linear trends and estimates of time-correlated noise for the six most commonly cited noise models. We find that the effects of added linear trends, or conversely de-trending, vary depending on the noise model. The commonly adopted model of random walk (RW), flicker noise (FN) and white noise (WN) is the most severely affected by de-trending, with estimates of low-amplitude RW most severely biased. FN plus WN is least affected by adding or removing trends. Non-integer power-law noise estimates are also less affected by de-trending, but are very sensitive to the addition of trend when the spectral index is less than one. We derive an analytical relationship between linear trends and the estimated RW variance for the special case of pure RW noise. Overall, we find that to ascertain the correct noise model for GNSS position time-series and to estimate the correct noise parameters, it is important to have independent constraints on the actual trends in the data.

  14. Effects of linear trends on estimation of noise in GNSS position time-series

    DOE PAGES

    Dmitrieva, K.; Segall, P.; Bradley, A. M.

    2016-10-20

    A thorough understanding of time-dependent noise in Global Navigation Satellite System (GNSS) position time-series is necessary for computing uncertainties in any signals found in the data. However, estimation of time-correlated noise is a challenging task and is complicated by the difficulty in separating noise from signal, the features of greatest interest in the time-series. In this study, we investigate how linear trends affect the estimation of noise in daily GNSS position time-series. We use synthetic time-series to study the relationship between linear trends and estimates of time-correlated noise for the six most commonly cited noise models. We find that the effects of added linear trends, or conversely de-trending, vary depending on the noise model. The commonly adopted model of random walk (RW), flicker noise (FN) and white noise (WN) is the most severely affected by de-trending, with estimates of low-amplitude RW most severely biased. FN plus WN is least affected by adding or removing trends. Non-integer power-law noise estimates are also less affected by de-trending, but are very sensitive to the addition of trend when the spectral index is less than one. We derive an analytical relationship between linear trends and the estimated RW variance for the special case of pure RW noise. Finally, overall, we find that to ascertain the correct noise model for GNSS position time-series and to estimate the correct noise parameters, it is important to have independent constraints on the actual trends in the data.
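
    An illustrative simulation only: a random-walk plus white-noise series with a linear trend, showing that de-trending removes precisely the low-frequency power where the random walk dominates, which is the mechanism behind the biases described above. The maximum-likelihood noise estimation used in the study is not reproduced here, and the noise amplitudes are arbitrary.

```python
import numpy as np
from scipy.signal import periodogram, detrend

rng = np.random.default_rng(8)
n = 3650                                             # ten years of daily positions
rw = np.cumsum(rng.normal(0, 0.2, n))                # random walk (mm)
series = 5.0 * np.arange(n) / 365.25 + rw + rng.normal(0, 1.0, n)  # trend + RW + WN

f, p_raw = periodogram(series, fs=365.25)            # frequencies in cycles per year
_, p_det = periodogram(detrend(series), fs=365.25)

low = (f > 0) & (f < 1.0)                            # periods longer than one year
print("low-frequency power, raw      :", p_raw[low].sum())
print("low-frequency power, detrended:", p_det[low].sum())
```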

  15. A lifestyle intervention program for successfully addressing major cardiometabolic risks in persons with SCI: a three-subject case series

    PubMed Central

    Bigford, Gregory E; Mendez, Armando J; Betancourt, Luisa; Burns-Drecq, Patricia; Backus, Deborah; Nash, Mark S

    2017-01-01

    Introduction This study is a prospective case series analyzing the effects of a comprehensive lifestyle intervention program in three patients with chronic paraplegia having major risks for the cardiometabolic syndrome (CMS). Case presentation: Individuals underwent an intense 6-month program of circuit resistance exercise, nutrition using a Mediterranean diet and behavioral support, followed by a 6-month extension (maintenance phase, MP) involving minimal support. The primary goal was a 7% reduction of body mass. Other outcomes included insulin resistance assessed with the HOMA-IR model and plasma levels of fasting triglycerides and high-density lipoprotein cholesterol. All participants achieved the goal of a 7% reduction of body mass and maintained the loss after the MP. Improvements were observed in 2/3 subjects for HOMA-IR and high-density lipoprotein cholesterol. All participants improved their plasma triglyceride risk. Discussion: We conclude that, in a three-person case series of persons with chronic paraplegia, a lifestyle intervention program involving circuit resistance training, a calorie-restrictive Mediterranean-style diet and behavioral support results in clinically significant loss of body mass and effectively reduces component risks for CMS and diabetes. These results were for the most part maintained after a 6-month MP involving minimal supervision. PMID:28382218

  16. Coupling Matched Molecular Pairs with Machine Learning for Virtual Compound Optimization.

    PubMed

    Turk, Samo; Merget, Benjamin; Rippmann, Friedrich; Fulle, Simone

    2017-12-26

    Matched molecular pair (MMP) analyses are widely used in compound optimization projects to gain insights into structure-activity relationships (SAR). The analysis is traditionally done via statistical methods but can also be employed together with machine learning (ML) approaches to extrapolate to novel compounds. The here introduced MMP/ML method combines a fragment-based MMP implementation with different machine learning methods to obtain automated SAR decomposition and prediction. To test the prediction capabilities and model transferability, two different compound optimization scenarios were designed: (1) "new fragments" which occurs when exploring new fragments for a defined compound series and (2) "new static core and transformations" which resembles for instance the identification of a new compound series. Very good results were achieved by all employed machine learning methods especially for the new fragments case, but overall deep neural network models performed best, allowing reliable predictions also for the new static core and transformations scenario, where comprehensive SAR knowledge of the compound series is missing. Furthermore, we show that models trained on all available data have a higher generalizability compared to models trained on focused series and can extend beyond chemical space covered in the training data. Thus, coupling MMP with deep neural networks provides a promising approach to make high quality predictions on various data sets and in different compound optimization scenarios.

  17. A modelling approach to assessing the timescale uncertainties in proxy series with chronological errors

    NASA Astrophysics Data System (ADS)

    Divine, D. V.; Godtliebsen, F.; Rue, H.

    2012-01-01

    The paper proposes an approach to assessment of timescale errors in proxy-based series with chronological uncertainties. The method relies on approximation of the physical process(es) forming a proxy archive by a random Gamma process. Parameters of the process are partly data-driven and partly determined from prior assumptions. For a particular case of a linear accumulation model and absolutely dated tie points an analytical solution is found suggesting the Beta-distributed probability density on age estimates along the length of a proxy archive. In a general situation of uncertainties in the ages of the tie points the proposed method employs MCMC simulations of age-depth profiles yielding empirical confidence intervals on the constructed piecewise linear best guess timescale. It is suggested that the approach can be further extended to a more general case of a time-varying expected accumulation between the tie points. The approach is illustrated by using two ice and two lake/marine sediment cores representing the typical examples of paleoproxy archives with age models based on tie points of mixed origin.

  18. Improving short-term forecasting during ramp events by means of Regime-Switching Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Gallego, C.; Costa, A.; Cuerva, A.

    2010-09-01

    Since wind energy currently can neither be scheduled nor stored at large scale, wind power forecasting has been useful for minimizing the impact of wind fluctuations. In particular, short-term forecasting (characterised by prediction horizons from minutes to a few days) is currently required by energy producers (in a daily electricity market context) and by TSOs (in order to keep the stability/balance of an electrical system). Within the short-term setting, time-series based models (i.e., statistical models) have shown better performance than NWP models for horizons up to a few hours. These models try to learn and replicate the dynamics shown by the time series of a certain variable. When considering the power output of wind farms, ramp events are usually observed, characterized by a large positive gradient in the time series (ramp-up) or a negative one (ramp-down) during relatively short time periods (a few hours). Ramp events may be caused by many different processes, generally involving several spatial scales, from the large scale (fronts, low pressure systems) down to the local scale (wind turbine shut-down due to high wind speed, yaw misalignment due to fast changes of wind direction). Hence, the output power may show unexpected dynamics during ramp events depending on the underlying processes; consequently, traditional statistical models considering only one dynamic for the whole power time series may be inappropriate. This work proposes a Regime-Switching (RS) model based on Artificial Neural Networks (ANNs). The RS-ANN model gathers as many ANNs as dynamics considered (called regimes); a certain ANN is selected to predict the output power, depending on the current regime. The current regime is updated on-line based on a gradient criterion applied to the past two values of the output power. Three regimes are established, concerning ramp events: ramp-up, ramp-down and no-ramp. In order to assess the skill of the proposed RS-ANN model, a single-ANN model (without regime classification) is adopted as a reference model. Both models are evaluated in terms of Improvement over Persistence on a Mean Square Error basis (IoP%) when predicting horizons from 1 time-step to 5. The case of a wind farm located in the complex terrain of Alaiz (north of Spain) has been considered. Three years of available power output data with an hourly resolution have been employed: two years for training and validation of the model and the last year for assessing the accuracy. Results showed that the RS-ANN outperformed the single-ANN model for one-step-ahead forecasts: the overall IoP% was up to 8.66% for the RS-ANN model (depending on the gradient criterion selected to trigger the ramp regime) and 6.16% for the single-ANN. However, both models showed similar accuracy for larger horizons. A locally-weighted evaluation during ramp events for one step ahead was also performed. It was found that the IoP% during ramp-up events increased from 17.60% (single-ANN) to 22.25% (RS-ANN), while during ramp-down events the improvement was from 18.55% to 19.55%. Three main conclusions are derived from this case study. First, it highlights the importance of considering statistical models capable of differentiating the several regimes shown by the output power time series in order to improve forecasting during extreme events like ramps. Second, on-line regime classification based on available power output data did not seem to improve forecasts for horizons beyond one step ahead.
Third, taking into account other explanatory variables (local wind measurements, NWP outputs) could lead to a better understanding of ramp events, improving the regime assessment for further horizons as well. The RS-ANN model slightly outperformed the single-ANN during ramp-down events; if further research reinforces this effect, special attention should be paid to understanding the underlying processes during ramp-down events.
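
    A simplified sketch of the regime-switching idea: each time step is classified as ramp-up, ramp-down or no-ramp from the gradient of the last two observed power values, and a separate small neural network is trained per regime (scikit-learn's MLPRegressor stands in for the ANNs). Thresholds, lags, network sizes and the toy data are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(9)
t = np.arange(2000)
power = np.clip(0.5 + 0.4 * np.sin(2 * np.pi * t / 200) + 0.05 * rng.normal(size=t.size), 0, 1)

lags, thresh = 3, 0.05
X = np.column_stack([power[i:len(power) - lags + i] for i in range(lags)])
y = power[lags:]
grad = X[:, -1] - X[:, -2]                                    # gradient of the last two values
regime = np.where(grad > thresh, 0, np.where(grad < -thresh, 1, 2))

models = {}
for r in (0, 1, 2):                                           # ramp-up / ramp-down / no-ramp
    idx = regime == r
    models[r] = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                             random_state=0).fit(X[idx], y[idx])

def predict_one_step(history):
    """Route the forecast through the network matching the current regime."""
    g = history[-1] - history[-2]
    r = 0 if g > thresh else 1 if g < -thresh else 2
    return models[r].predict(history[-lags:].reshape(1, -1))[0]

print(predict_one_step(power[-lags:]))
```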

  19. Analysis of forecasting malaria case with climatic factors as predictor in Mandailing Natal Regency: a time series study

    NASA Astrophysics Data System (ADS)

    Aulia, D.; Ayu, S. F.; Matondang, A.

    2018-01-01

    Malaria is a contagious disease of global concern. As a public health problem prone to outbreaks, it affects quality of life and the economy and can lead to death. Therefore, this research forecasts malaria cases with climatic factors as predictors in Mandailing Natal Regency. The total number of positive malaria cases from January 2008 to December 2016 was taken from the health department of Mandailing Natal Regency. Climate data such as rainfall, humidity, and temperature were taken from the Center of Statistics Department of Mandailing Natal Regency. EViews ver. 9 was used for the analysis. The autoregressive integrated moving average model ARIMA (0,1,1)(1,0,0)12 is the best model, explaining 67.2% of the data variability in this time series study. Rainfall (P value = 0.0005), temperature (P value = 0.0029) and humidity (P value = 0.0001) are significant predictors of malaria transmission. The seasonal adjustment factor (SAF) shows peaks in malaria cases in November and March.

  20. Passenger car crippling end-load test and analyses

    DOT National Transportation Integrated Search

    2017-09-01

    The Transportation Technology Center, Inc. (TTCI) performed a series of full-scale tests and a finite element analysis (FEA) in a case study that may become a model for manufacturers seeking to use the waiver process of Tier I crashworthiness and occ...

  1. A combinatorial framework to quantify peak/pit asymmetries in complex dynamics.

    PubMed

    Hasson, Uri; Iacovacci, Jacopo; Davis, Ben; Flanagan, Ryan; Tagliazucchi, Enzo; Laufs, Helmut; Lacasa, Lucas

    2018-02-23

    We explore a combinatorial framework which efficiently quantifies the asymmetries between minima and maxima in local fluctuations of time series. We first showcase its performance by applying it to a battery of synthetic cases. We find rigorous results on some canonical dynamical models (stochastic processes with and without correlations, chaotic processes) complemented by extensive numerical simulations for a range of processes which indicate that the methodology correctly distinguishes different complex dynamics and outperforms state of the art metrics in several cases. Subsequently, we apply this methodology to real-world problems emerging across several disciplines including cases in neurobiology, finance and climate science. We conclude that differences between the statistics of local maxima and local minima in time series are highly informative of the complex underlying dynamics and a graph-theoretic extraction procedure allows to use these features for statistical learning purposes.
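
    Only the simplest ingredient of such a framework is sketched below: a statistic comparing the size of local fluctuations around maxima with the size of those around minima, which is close to zero for up-down symmetric noise and departs from zero for skewed dynamics. The statistic itself is an illustrative choice, not the paper's graph-theoretic construction.

```python
import numpy as np

def peak_pit_asymmetry(x):
    """Mean local fluctuation around maxima minus that around minima."""
    x = np.asarray(x, float)
    interior = x[1:-1]
    peaks = (interior > x[:-2]) & (interior > x[2:])       # strict local maxima
    pits = (interior < x[:-2]) & (interior < x[2:])        # strict local minima
    up = (interior[peaks] - 0.5 * (x[:-2][peaks] + x[2:][peaks])).mean()
    down = (0.5 * (x[:-2][pits] + x[2:][pits]) - interior[pits]).mean()
    return up - down

rng = np.random.default_rng(10)
symmetric = rng.normal(size=5000)                 # white noise: asymmetry near 0
skewed = np.exp(0.5 * rng.normal(size=5000))      # skewed noise: clearly asymmetric
print(peak_pit_asymmetry(symmetric))
print(peak_pit_asymmetry(skewed))
```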

  2. Case reports and case series in prehospital emergency care research.

    PubMed

    Patterson, P Daniel; Weaver, Matthew; Clark, Sunday; Yealy, Donald M

    2010-11-01

    Research begins with a clearly stated question, problem or hypothesis. The selection of a study design appropriate to the task is the next key step. This paper provides guidance for the use of case report and case series designs by describing the 'what', 'when' and 'how' of both designs. Also described is the use of case reports and case series study designs in prehospital emergency research and the quality of published literature from 2000 to mid-2008.

  3. 75 FR 28475 - Airworthiness Directives; Airbus Model A330-200 and -300 Series Airplanes, and Model A340-300...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... unsafe condition as: In the door 2 area, the hat-racks are supplied with a basic wire harness which includes "Oxygen Masks" activation. In case of a monument installation, the respective non-used hat-rack...

  4. Characterizations of Social-Based and Self-Based Contexts Associated with Students' Awareness, Evaluation, and Regulation of Their Thinking during Small-Group Mathematical Modeling

    ERIC Educational Resources Information Center

    Magiera, Marta T.; Zawojewski, Judith S.

    2011-01-01

    This exploratory study focused on characterizing problem-solving situations associated with spontaneous metacognitive activity. The results came from connected case studies of a group of 3 purposefully selected 9th-grade students working collaboratively on a series of 5 modeling problems. Students' descriptions of their own thinking during…

  5. Treating the Tough Adolescent: A Family-Based, Step-by-Step Guide. The Guilford Family Therapy Series.

    ERIC Educational Resources Information Center

    Sells, Scott P.

    A model for treating difficult adolescents and their families is presented. Part 1 offers six basic assumptions about the causes of severe behavioral problems and presents the treatment model with guidelines necessary to address each of these six causes. Case examples highlight and clarify major points within each of the 15 procedural steps of the…

  6. Optimal Parameter Selection for Support Vector Machine Based on Artificial Bee Colony Algorithm: A Case Study of Grid-Connected PV System Power Prediction.

    PubMed

    Gao, Xiang-Ming; Yang, Shi-Feng; Pan, San-Bo

    2017-01-01

    To predict the output power of photovoltaic systems, which is nonstationary and random, an output power prediction model for grid-connected PV systems is proposed based on empirical mode decomposition (EMD) and a support vector machine (SVM) optimized with an artificial bee colony (ABC) algorithm. First, according to the weather forecast data sets for the prediction date, time series data of output power on a similar day with 15-minute intervals are built. Second, the time series data of the output power are decomposed into a series of components, including intrinsic mode function components IMFn and a trend component Res, at different scales using EMD. A corresponding SVM prediction model is established for each IMF component and for the trend component, and the SVM model parameters are optimized with the artificial bee colony algorithm. Finally, the prediction results of each model are reconstructed, and the predicted values of the output power of the grid-connected PV system are obtained. The prediction model is tested with actual data, and the results show that the power prediction model based on EMD and ABC-SVM has a faster calculation speed and higher prediction accuracy than the single SVM prediction model and the EMD-SVM prediction model without optimization.
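
    A hedged sketch of the EMD + SVM pipeline (without the ABC parameter search): decompose the series into IMFs, fit one support vector regressor per component on lagged values, and sum the component forecasts. PyEMD is assumed to be available for the decomposition; the lag length, SVR settings and toy data are illustrative.

```python
import numpy as np
from PyEMD import EMD
from sklearn.svm import SVR

rng = np.random.default_rng(11)
t = np.arange(960)                                      # ten days of 15-minute samples
power = np.clip(np.sin(2 * np.pi * t / 96), 0, None) + 0.05 * rng.normal(size=t.size)

imfs = EMD().emd(power)                                 # IMF components plus residual trend

lags = 8
def one_step_design(x):
    """Lagged design matrix for one-step-ahead prediction of a component."""
    X = np.column_stack([x[i:len(x) - lags + i] for i in range(lags)])
    return X, x[lags:]

next_value = 0.0
for comp in imfs:
    X, y = one_step_design(comp)
    model = SVR(C=10.0, epsilon=0.01).fit(X, y)         # one SVR per component
    next_value += model.predict(comp[-lags:].reshape(1, -1))[0]

print("one-step-ahead PV power forecast:", next_value)  # sum of component forecasts
```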

  7. Optimal Parameter Selection for Support Vector Machine Based on Artificial Bee Colony Algorithm: A Case Study of Grid-Connected PV System Power Prediction

    PubMed Central

    2017-01-01

    To predict the output power of photovoltaic systems, which is nonstationary and random, an output power prediction model for grid-connected PV systems is proposed based on empirical mode decomposition (EMD) and a support vector machine (SVM) optimized with an artificial bee colony (ABC) algorithm. First, according to the weather forecast data sets for the prediction date, time series data of output power on a similar day with 15-minute intervals are built. Second, the time series data of the output power are decomposed into a series of components, including intrinsic mode function components IMFn and a trend component Res, at different scales using EMD. A corresponding SVM prediction model is established for each IMF component and for the trend component, and the SVM model parameters are optimized with the artificial bee colony algorithm. Finally, the prediction results of each model are reconstructed, and the predicted values of the output power of the grid-connected PV system are obtained. The prediction model is tested with actual data, and the results show that the power prediction model based on EMD and ABC-SVM has a faster calculation speed and higher prediction accuracy than the single SVM prediction model and the EMD-SVM prediction model without optimization. PMID:28912803

  8. Simulation of Ground Winds Time Series for the NASA Crew Launch Vehicle (CLV)

    NASA Technical Reports Server (NTRS)

    Adelfang, Stanley I.

    2008-01-01

    Simulation of wind time series based on power spectrum density (PSD) and spectral coherence models for ground wind turbulence is described. The wind models, originally developed for the Shuttle program, are based on wind measurements at the NASA 150-m meteorological tower at Cape Canaveral, FL. The current application is for the design and/or protection of the CLV from wind effects during on-pad exposure during periods from as long as days prior to launch, to seconds or minutes just prior to launch and seconds after launch. The evaluation of vehicle response to wind will influence the design and operation of constraint systems for support of the on-pad vehicle. Longitudinal and lateral wind component time series are simulated at critical vehicle locations. The PSD model for wind turbulence is a function of mean wind speed, elevation and temporal frequency. Integration of the PSD equation over a selected frequency range yields the variance of the time series to be simulated. The square root of the PSD defines a low-pass filter that is applied to adjust the components of the Fast Fourier Transform (FFT) of Gaussian white noise. The first simulated time series near the top of the launch vehicle is the inverse transform of the adjusted FFT. Simulation of the wind component time series at the nearest adjacent location (and all other succeeding next nearest locations) is based on a model for the coherence between winds at two locations as a function of frequency and separation distance, where the adjacent locations are separated vertically and/or horizontally. The coherence function is used to calculate a coherence weighted FFT of the wind at the next nearest location, given the FFT of the simulated time series at the previous location and the essentially incoherent FFT of the wind at the selected location derived a priori from the PSD model. The simulated time series at each adjacent location is the inverse Fourier transform of the coherence weighted FFT. For a selected design case, the equations, the process and the simulated time series at multiple vehicle stations are presented.
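
    A minimal sketch of the PSD-filtering step described above: the FFT of Gaussian white noise is shaped by the square root of a target power spectral density and inverted to obtain a simulated turbulence time series, with the variance set by integrating the PSD. The PSD form used here is a generic low-frequency spectrum, not the report's ground-wind model, and the coherence-weighting step for adjacent locations is omitted.

```python
import numpy as np

rng = np.random.default_rng(12)
n, dt = 4096, 0.1                                    # samples and time step (s)
freq = np.fft.rfftfreq(n, dt)

def target_psd(f, sigma2=1.0, f0=0.05):
    """Generic low-frequency turbulence-like spectrum (placeholder form)."""
    return sigma2 / (1.0 + (f / f0) ** (5.0 / 3.0))

white = np.fft.rfft(rng.normal(0, 1, n))             # FFT of Gaussian white noise
shaped = white * np.sqrt(target_psd(freq))           # square root of the PSD as a filter
shaped[0] = 0.0                                      # drop the mean component
series = np.fft.irfft(shaped, n)

df = freq[1] - freq[0]
target_var = target_psd(freq).sum() * df             # variance from integrating the PSD
series *= np.sqrt(target_var) / series.std()         # scale to the PSD-implied variance
```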

  9. Automated Bayesian model development for frequency detection in biological time series.

    PubMed

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure.
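
    The model-comparison idea can be illustrated in a crude form: for each trial frequency, a "constant plus sinusoid" model is fitted by least squares to unevenly sampled data and compared with a constant-only model via BIC. This is only a simplified stand-in for hypothesis comparison over frequencies, not the full Bayesian Spectrum Analysis of the paper.

```python
import numpy as np

rng = np.random.default_rng(13)
t = np.sort(rng.uniform(0, 48, 60))                    # unevenly sampled time points (h)
y = 1.0 + 0.8 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.4, t.size)

def bic(rss, k, n):
    return n * np.log(rss / n) + k * np.log(n)

n = t.size
rss0 = np.sum((y - y.mean()) ** 2)
bic0 = bic(rss0, 1, n)                                 # constant-only model

periods = np.linspace(6, 40, 200)
delta = []
for P in periods:
    X = np.column_stack([np.ones(n), np.sin(2 * np.pi * t / P), np.cos(2 * np.pi * t / P)])
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = rss[0] if rss.size else np.sum((y - X @ beta) ** 2)
    delta.append(bic0 - bic(rss, 3, n))                # > 0 favours an oscillation

best = periods[int(np.argmax(delta))]
print(f"best-supported period: {best:.1f} h, delta-BIC = {max(delta):.1f}")
```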

  10. Automated Bayesian model development for frequency detection in biological time series

    PubMed Central

    2011-01-01

    Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time series often deviate significantly from the requirements of optimality for Fourier transformation. In this paper we present an alternative approach based on Bayesian inference. We show the value of placing spectral analysis in the framework of Bayesian inference and demonstrate how model comparison can automate this procedure. PMID:21702910

  11. Thermal form-factor approach to dynamical correlation functions of integrable lattice models

    NASA Astrophysics Data System (ADS)

    Göhmann, Frank; Karbach, Michael; Klümper, Andreas; Kozlowski, Karol K.; Suzuki, Junji

    2017-11-01

    We propose a method for calculating dynamical correlation functions at finite temperature in integrable lattice models of Yang-Baxter type. The method is based on an expansion of the correlation functions as a series over matrix elements of a time-dependent quantum transfer matrix rather than the Hamiltonian. In the infinite Trotter-number limit the matrix elements become time independent and turn into the thermal form factors studied previously in the context of static correlation functions. We make this explicit with the example of the XXZ model. We show how the form factors can be summed utilizing certain auxiliary functions solving finite sets of nonlinear integral equations. The case of the XX model is worked out in more detail leading to a novel form-factor series representation of the dynamical transverse two-point function.

  12. Epidemiology meets econometrics: using time-series analysis to observe the impact of bed occupancy rates on the spread of multidrug-resistant bacteria.

    PubMed

    Kaier, K; Meyer, E; Dettenkofer, M; Frank, U

    2010-10-01

    Two multivariate time-series analyses were carried out to identify the impact of bed occupancy rates, turnover intervals and the average length of hospital stay on the spread of multidrug-resistant bacteria in a teaching hospital. Epidemiological data on the incidences of meticillin-resistant Staphylococcus aureus (MRSA) and extended-spectrum beta-lactamase (ESBL)-producing bacteria were collected. Time-series of bed occupancy rates, turnover intervals and the average length of stay were tested for inclusion in the models as independent variables. Incidence was defined as nosocomial cases per 1000 patient-days. This included all patients infected or colonised with MRSA/ESBL more than 48h after admission. Between January 2003 and July 2008, a mean incidence of 0.15 nosocomial MRSA cases was identified. ESBL was not included in the surveillance until January 2005. Between January 2005 and July 2008 the mean incidence of nosocomial ESBL was also 0.15 cases per 1000 patient-days. The two multivariate models demonstrate a temporal relationship between bed occupancy rates in general wards and the incidence of nosocomial MRSA and ESBL. Similarly, the temporal relationship between the monthly average length of stay in intensive care units (ICUs) and the incidence of nosocomial MRSA and ESBL was demonstrated. Overcrowding in general wards and long periods of ICU stay were identified as factors influencing the spread of multidrug-resistant bacteria in hospital settings. Copyright 2010 The Hospital Infection Society. Published by Elsevier Ltd. All rights reserved.

  13. Flood mapping in ungauged basins using fully continuous hydrologic-hydraulic modeling

    NASA Astrophysics Data System (ADS)

    Grimaldi, Salvatore; Petroselli, Andrea; Arcangeletti, Ettore; Nardi, Fernando

    2013-04-01

    In this work, a fully-continuous hydrologic-hydraulic modeling framework for flood mapping is introduced and tested. It is characterized by a simulation of a long rainfall time series at sub-daily resolution that feeds a continuous rainfall-runoff model producing a discharge time series that is directly given as an input to a bi-dimensional hydraulic model. The main advantage of the proposed approach is to avoid the use of the design hyetograph and the design hydrograph that constitute the main source of subjective analysis and uncertainty for standard methods. The proposed procedure is optimized for small and ungauged watersheds where empirical models are commonly applied. Results of a simple real case study confirm that this experimental fully-continuous framework may pave the way for the implementation of a less subjective and potentially automated procedure for flood hazard mapping.

  14. Validation of the prognostic value of lymph node ratio in patients with cutaneous melanoma: a population-based study of 8,177 cases.

    PubMed

    Mocellin, Simone; Pasquali, Sandro; Rossi, Carlo Riccardo; Nitti, Donato

    2011-07-01

    The proportion of positive among examined lymph nodes (lymph node ratio [LNR]) has been recently proposed as a useful and easy-to-calculate prognostic factor for patients with cutaneous melanoma. However, its independence from the standard prognostic system TNM has not been formally proven in a large series of patients. Patients with histologically proven cutaneous melanoma were identified from the Surveillance, Epidemiology, and End Results database. Disease-specific survival was the clinical outcome of interest. The prognostic ability of conventional factors and LNR was assessed by multivariable survival analysis using the Cox regression model. Eligible patients (n = 8,177) were diagnosed with melanoma between 1998 and 2006. Among lymph node-positive cases (n = 3,872), most LNR values ranged from 1% to 10% (n = 2,187). In the whole series (≥5 lymph nodes examined) LNR significantly contributed to the Cox model independently of the TNM effect on survival (hazard ratio, 1.28; 95% confidence interval, 1.23-1.32; P < .0001). On subgroup analysis, the significant and independent prognostic value of LNR was confirmed both in patients with ≥10 lymph nodes examined (n = 4,381) and in those with TNM stage III disease (n = 3,658). In all cases, LNR increased the prognostic accuracy of the survival model. In this large series of patients, the LNR independently predicted disease-specific survival, improving the prognostic accuracy of the TNM system. Accordingly, the LNR should be taken into account for the stratification of patients' risk, both in clinical and research settings. Copyright © 2011 Mosby, Inc. All rights reserved.
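    The multivariable analysis described above amounts to adding LNR to a Cox proportional hazards model alongside TNM stage. A minimal sketch with the lifelines library is given below; the data frame, file name and column names are hypothetical, not taken from the study.

```python
import pandas as pd
from lifelines import CoxPHFitter

# hypothetical analysis table: one row per node-positive patient
df = pd.read_csv("melanoma_cases.csv")                     # placeholder file name
df["lnr"] = df["positive_nodes"] / df["examined_nodes"]    # lymph node ratio

cph = CoxPHFitter()
cph.fit(
    df[["months", "died_of_melanoma", "lnr", "stage_iii", "stage_iv"]],
    duration_col="months",                 # follow-up time
    event_col="died_of_melanoma",          # disease-specific death indicator
)
cph.print_summary()   # hazard ratios with 95% CIs; LNR adjusted for TNM stage dummies
```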

  15. Porous and nonporous orbital implants for treating the anophthalmic socket: A meta-analysis of case series studies.

    PubMed

    Schellini, Silvana; Jorge, Eliane; Sousa, Roberta; Burroughs, John; El-Dib, Regina

    2016-01-01

    To assess the efficacy and safety of porous and nonporous implants for management of the anophthalmic socket. Case series meta-analysis was conducted with no language restriction, including studies from: PUBMED, EMBASE and LILACS. Study eligibility criteria were case series design with more than 20 cases reported, use of porous and/or nonporous orbital implants, anophthalmic socket, and treatment success defined as no implant exposure or extrusion. Complication rates from each included study were quantified. Proportional meta-analysis was performed on both outcomes with a random-effects model and the 95% confidence intervals were calculated. A total of 35 case series studies with a total of 3,805 patients were included in the meta-analysis. There are no studies comparing porous and nonporous implants in the anophthalmic socket treatment. There was no statistically significant difference between porous polyethylene (PP) and hydroxyapatite (HA) on implant exposure: 0.026 (0.012-0.045) vs 0.054 (0.041-0.070), respectively, and neither on implant extrusion: 0.0042 (0.0008-0.010) vs. 0.018 (0.004-0.042), respectively. However, there was a significant difference supporting the use of PP when compared to bioceramic implant: 0.026 (0.012-0.045) vs. 0.12 (0.06-0.20), respectively, on implant exposure. PP implants showed lower chance of exposure than bioceramic implant for anophthalmic socket reconstruction, although we cannot rule out the possibility of heterogeneity bias due to the nature and level of evidence of the included studies. Clinical trials are necessary to expand the knowledge of porous and nonporous orbital implants in the anophthalmic socket management.
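    As a sketch of the proportional meta-analysis step, the snippet below pools complication proportions on the logit scale with DerSimonian-Laird random-effects weights and back-transforms the 95% confidence interval. This is a generic implementation under standard assumptions, not the authors' exact procedure, and the example counts are made up.

```python
import numpy as np

def pooled_proportion(events, n):
    """DerSimonian-Laird random-effects pooling of proportions on the logit scale."""
    events, n = np.asarray(events, float), np.asarray(n, float)
    p = (events + 0.5) / (n + 1.0)                        # continuity correction
    y = np.log(p / (1.0 - p))                             # logit-transformed proportions
    v = 1.0 / (events + 0.5) + 1.0 / (n - events + 0.5)   # within-study variances
    w = 1.0 / v                                           # fixed-effect weights
    q = np.sum(w * (y - np.sum(w * y) / w.sum()) ** 2)    # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - np.sum(w ** 2) / w.sum()))
    w_re = 1.0 / (v + tau2)                               # random-effects weights
    mu = np.sum(w_re * y) / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    expit = lambda x: 1.0 / (1.0 + np.exp(-x))
    return expit(mu), expit(mu - 1.96 * se), expit(mu + 1.96 * se)

# made-up exposure counts for a handful of case series
print(pooled_proportion(events=[2, 5, 1, 3], n=[60, 110, 45, 80]))
```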

  16. Employment as a welder and Parkinson disease among heavy equipment manufacturing workers.

    PubMed

    Marsh, Gary M; Gula, Mary Jean

    2006-10-01

    We investigated whether employment as a welder with potential exposure to manganese and other substances is associated with Parkinson disease (PD), parkinsonism or related neurological disorders, or accelerates the age of onset of PD. We selected cases and controls from 12,595 persons ever employed at three Caterpillar Inc. (CAT) plants between 1976 and 2004 with potential to make a medical insurance claim between 1998 and 2004. Cases had filed a claim for 1) PD, 2) "secondary parkinsonism", 3) "other degenerative diseases of the basal ganglia" or 4) "essential and other specific forms of tremor". Cases were grouped by claims: Group 1-claims 1 and 2, and Group 2-claims 1 to 4, and as study period incident (SPI) or prevalent. Each case was matched to two series of 10 controls each on date of case's first claim, year of birth, race and sex. Series I was also matched on plant. Odds ratios (OR) and 95% confidence intervals (CI) for the variable "ever welder in any CAT plant" were: Group 1-SPI Cases: Series I (OR = .76, CI = .26-2.19), Series II (OR = .81, CI = .29-2.25); Group 1-Prevalent Cases: Series I (OR = .82, CI = .36-1.86), Series II (OR = .97, CI = .42-2.23); Group 2-SPI Cases: Series I (OR = 1.03, CI = .57-1.87), Series II (OR = 1.21, CI = .67-2.20); Group 2-Prevalent Cases: Series I (OR = 1.02, CI = .62-1.71), Series II (OR = .86, CI = .51-1.43). Our finding of no statistically significant associations for welding employment was maintained following adjustment for potential confounding and evaluation of possible effect modification. Employment as a welder did not accelerate the age of onset of PD. Our study supported the conclusion that employment as a welder is not associated with Parkinson disease, parkinsonism or a related neurological disorder.

  17. Hydrological time series modeling: A comparison between adaptive neuro-fuzzy, neural network and autoregressive techniques

    NASA Astrophysics Data System (ADS)

    Lohani, A. K.; Kumar, Rakesh; Singh, R. D.

    2012-06-01

    Time series modeling is necessary for the planning and management of reservoirs. More recently, soft computing techniques have been used in hydrological modeling and forecasting. In this study, the potential of artificial neural networks and the neuro-fuzzy system in monthly reservoir inflow forecasting is examined by developing and comparing monthly reservoir inflow prediction models, based on autoregressive (AR), artificial neural networks (ANNs) and adaptive neural-based fuzzy inference system (ANFIS). To take into account the effect of monthly periodicity in the flow data, cyclic terms are also included in the ANN and ANFIS models. Working with time series flow data of the Sutlej River at Bhakra Dam, India, several ANN and adaptive neuro-fuzzy models are trained with different input vectors. To evaluate the performance of the selected ANN and adaptive neural fuzzy inference system (ANFIS) models, comparison is made with the autoregressive (AR) models. The ANFIS model trained with the input data vector including previous inflows and cyclic terms of monthly periodicity has shown a significant improvement in the forecast accuracy in comparison with the ANFIS models trained with the input vectors considering only previous inflows. In all cases ANFIS gives more accurate forecasts than the AR and ANN models. The proposed ANFIS model coupled with the cyclic terms is shown to provide a better representation of monthly inflows for the planning and operation of reservoirs.
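    The cyclic terms mentioned above are typically sine and cosine functions of the month appended to the lagged inflows. The sketch below illustrates the idea on synthetic monthly data, with a plain linear least-squares fit standing in for the ANN/ANFIS models; all settings are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(240)                                   # 20 years of monthly inflow
inflow = 100 + 40 * np.sin(2 * np.pi * months / 12) + 10 * rng.standard_normal(240)

lags = 3
rows, targets = [], []
for t in range(lags, len(inflow)):
    cyc = [np.sin(2 * np.pi * months[t] / 12), np.cos(2 * np.pi * months[t] / 12)]
    rows.append(np.r_[inflow[t - lags:t], cyc, 1.0])      # lagged inflows + cyclic terms + bias
    targets.append(inflow[t])
X, y = np.array(rows), np.array(targets)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)              # linear stand-in for ANN/ANFIS
rmse = np.sqrt(np.mean((X @ coef - y) ** 2))
print(f"in-sample RMSE with cyclic terms: {rmse:.2f}")
```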

  18. Detecting and interpreting distortions in hierarchical organization of complex time series

    NASA Astrophysics Data System (ADS)

    Drożdż, Stanisław; Oświęcimka, Paweł

    2015-03-01

    Hierarchical organization is a cornerstone of complexity and multifractality constitutes its central quantifying concept. For model uniform cascades the corresponding singularity spectra are symmetric while those extracted from empirical data are often asymmetric. Using selected time series representing such diverse phenomena as price changes and intertransaction times in financial markets, sentence length variability in narrative texts, Missouri River discharge, and sunspot number variability as examples, we show that the resulting singularity spectra appear strongly asymmetric, more often left sided but in some cases also right sided. We present a unified view on the origin of such effects and indicate that they may be crucially informative for identifying the composition of the time series. One particularly intriguing case of this latter kind of asymmetry is detected in the daily reported sunspot number variability. This signals that either the commonly used famous Wolf formula distorts the real dynamics in expressing the largest sunspot numbers or, if not, that their dynamics is governed by a somewhat different mechanism.

  19. Genome-wide association study in discordant sibships identifies multiple inherited susceptibility alleles linked to lung cancer.

    PubMed

    Galvan, Antonella; Falvella, Felicia S; Frullanti, Elisa; Spinola, Monica; Incarbone, Matteo; Nosotti, Mario; Santambrogio, Luigi; Conti, Barbara; Pastorino, Ugo; Gonzalez-Neira, Anna; Dragani, Tommaso A

    2010-03-01

    We analyzed a series of young (median age = 52 years) non-smoker lung cancer patients and their unaffected siblings as controls, using a genome-wide 620 901 single-nucleotide polymorphism (SNP) array analysis and a case-control DNA pooling approach. We identified 82 putatively associated SNPs that were retested by individual genotyping followed by use of the sib transmission disequilibrium test, pointing to 36 SNPs associated with lung cancer risk in the discordant sibs series. Analysis of these 36 SNPs in a polygenic model characterized by additive and interchangeable effects of rare alleles revealed a highly statistically significant dosage-dependent association between risk allele carrier status and proportion of cancer cases. Replication of the same 36 SNPs in a population-based series confirmed the association with lung cancer for three SNPs, suggesting that phenocopies and genetic heterogeneity can play a major role in the complex genetics of lung cancer risk in the general population.

  20. Efficient multidimensional regularization for Volterra series estimation

    NASA Astrophysics Data System (ADS)

    Birpoutsoukis, Georgios; Csurcsia, Péter Zoltán; Schoukens, Johan

    2018-05-01

    This paper presents an efficient nonparametric time domain nonlinear system identification method. It is shown how truncated Volterra series models can be efficiently estimated without the need of long, transient-free measurements. The method is a novel extension of the regularization methods that have been developed for impulse response estimates of linear time invariant systems. To avoid the excessive memory needs in case of long measurements or large number of estimated parameters, a practical gradient-based estimation method is also provided, leading to the same numerical results as the proposed Volterra estimation method. Moreover, the transient effects in the simulated output are removed by a special regularization method based on the novel ideas of transient removal for Linear Time-Varying (LTV) systems. Combining the proposed methodologies, the nonparametric Volterra models of the cascaded water tanks benchmark are presented in this paper. The results for different scenarios varying from a simple Finite Impulse Response (FIR) model to a 3rd degree Volterra series with and without transient removal are compared and studied. It is clear that the obtained models capture the system dynamics when tested on a validation dataset, and their performance is comparable with the white-box (physical) models.
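    The first-order (FIR) special case of the regularized estimator can be written compactly: a ridge-type solution whose penalty is the inverse of a kernel encoding smoothness and exponential decay of the impulse response. The sketch below uses a TC-type kernel with hand-picked hyperparameters on synthetic data; it illustrates the principle only, not the paper's multidimensional Volterra extension.

```python
import numpy as np
from scipy.linalg import toeplitz

rng = np.random.default_rng(2)
N, n = 400, 50                                        # data length, FIR length
u = rng.standard_normal(N)                            # excitation signal
g_true = 0.8 ** np.arange(n)                          # "true" impulse response
y = np.convolve(u, g_true)[:N] + 0.1 * rng.standard_normal(N)

# regression matrix of lagged inputs (zero initial conditions assumed)
Phi = toeplitz(u, np.r_[u[0], np.zeros(n - 1)])

# TC kernel prior P_ij = c * rho**max(i, j); hyperparameters fixed by hand here
c, rho, sigma2 = 1.0, 0.9, 0.1 ** 2
idx = np.arange(n)
P = c * rho ** np.maximum.outer(idx, idx)

# regularized (MAP) estimate: (Phi'Phi + sigma2 * P^-1)^-1 Phi' y
g_hat = np.linalg.solve(Phi.T @ Phi + sigma2 * np.linalg.inv(P), Phi.T @ y)
print("relative error:", np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true))
```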

  1. Statefinder diagnostic and constraints on the Palatini f(R) gravity theories

    NASA Astrophysics Data System (ADS)

    Cao, Shu-Lei; Li, Song; Yu, Hao-Ran; Zhang, Tong-Jie

    2018-03-01

    We focus on a series of f(R) gravity theories in Palatini formalism to investigate the probabilities of producing late-time acceleration for the flat Friedmann-Robertson-Walker (FRW) universe. We apply a statefinder diagnostic to these cosmological models for chosen series of parameters to see if they can be distinguished from one another. The diagnostic involves the statefinder pair {r, s}, where r is derived from the scale factor a and its higher derivatives with respect to the cosmic time t, and s is expressed by r and the deceleration parameter q. In conclusion, we find that although two types of f(R) theories: (i) f(R) = R + αRm – βR ‑n and (ii) f(R) = R + α ln R – β can lead to late-time acceleration, their evolutionary trajectories in the r – s and r – q planes reveal different evolutionary properties, which certainly justify the merits of the statefinder diagnostic. Additionally, we utilize the observational Hubble parameter data (OHD) to constrain these models of f(R) gravity. As a result, except for m = n = 1/2 in case (i), α = 0 in case (i) and case (ii) allow the ΛCDM model to exist in the 1σ confidence region. After applying the statefinder diagnostic to the best-fit models, we find that all the best-fit models are capable of going through the deceleration/acceleration transition stage with a late-time acceleration epoch, and all these models turn to the de Sitter point ({r, s} = {1, 0}) in the future. Also, the evolutionary differences between these models are distinct, especially in the r – s plane, which makes the statefinder diagnostic more reliable in discriminating cosmological models.
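    For readers unfamiliar with the statefinder pair, the standard definitions (the usual convention of Sahni and collaborators, assumed here) are written out below; the fixed point {r, s} = {1, 0} quoted in the abstract is the spatially flat ΛCDM case.

```latex
H \equiv \frac{\dot{a}}{a}, \qquad
q = -\frac{\ddot{a}}{aH^{2}}, \qquad
r = \frac{\dddot{a}}{aH^{3}}, \qquad
s = \frac{r-1}{3\left(q-\tfrac{1}{2}\right)}
```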

  2. A lifestyle intervention program for successfully addressing major cardiometabolic risks in persons with SCI: a three-subject case series.

    PubMed

    Bigford, Gregory E; Mendez, Armando J; Betancourt, Luisa; Burns-Drecq, Patricia; Backus, Deborah; Nash, Mark S

    2017-01-01

    This study is a prospective case series analyzing the effects of a comprehensive lifestyle intervention program in three patients with chronic paraplegia having major risks for the cardiometabolic syndrome (CMS). Individuals underwent an intense 6-month program of circuit resistance exercise, nutrition using a Mediterranean diet and behavioral support, followed by a 6-month extension (maintenance phase, MP) involving minimal support. The primary goal was a 7% reduction of body mass. Other outcomes analyzed insulin resistance using the HOMA-IR model, and plasma levels of fasting triglycerides and high-density lipoprotein cholesterol. All participants achieved the goal for 7% reduction of body mass and maintained the loss after the MP. Improvements were observed in 2/3 subjects for HOMA-IR and high-density lipoprotein cholesterol. All participants improved their risk for plasma triglycerides. We conclude that, in a three-person case series of persons with chronic paraplegia, a lifestyle intervention program involving circuit resistance training, a calorie-restrictive Mediterranean-style diet and behavioral support results in clinically significant loss of body mass and effectively reduces component risks for CMS and diabetes. These results were for the most part maintained after a 6-month MP involving minimal supervision.

  3. Whirl measurements on leakage flows in turbomachine models

    NASA Technical Reports Server (NTRS)

    Addlesee, A. J.; Altiparmak, D.; Pan, S.

    1994-01-01

    The beneficial effects claimed for whirl control devices demonstrate that the dynamic behavior of rotors is influenced by the fluid whirl in shaft and balance drum seals. The present paper reports results from two series of experiments, the first on the factors affecting the whirl at the seal inlet, and the second on the variation of whirl velocity along the seal. In both cases the LDA measurement technique required the clearance between the fixed and rotating parts of the models to be substantially greater than occurs in real machines, but the results are indicative nevertheless. Experimental and theoretical results are given for the radial distribution of whirl velocity in the gap between impeller shroud and pump casing. Results of tests with modified stator surfaces are also shown. This work leads naturally into the second series of experiments where some preliminary measurements of velocity distribution in the clearance between a fixed stator and a rotating shaft are reported for a range of inlet whirl conditions.

  4. Spatio-Temporal Trends and Risk Factors for Shigella from 2001 to 2011 in Jiangsu Province, People's Republic of China

    PubMed Central

    Bao, Changjun; Hu, Jianli; Liu, Wendong; Liang, Qi; Wu, Ying; Norris, Jessie; Peng, Zhihang; Yu, Rongbin; Shen, Hongbing; Chen, Feng

    2014-01-01

    Objective This study aimed to describe the spatial and temporal trends of Shigella incidence rates in Jiangsu Province, People's Republic of China. It also intended to explore complex risk modes facilitating Shigella transmission. Methods County-level incidence rates were obtained for analysis using geographic information system (GIS) tools. Trend surface and incidence maps were established to describe geographic distributions. Spatio-temporal cluster analysis and autocorrelation analysis were used for detecting clusters. Based on the number of monthly Shigella cases, an autoregressive integrated moving average (ARIMA) model successfully established a time series model. A spatial correlation analysis and a case-control study were conducted to identify risk factors contributing to Shigella transmissions. Results The far southwestern and northwestern areas of Jiangsu were the most infected. A cluster was detected in southwestern Jiangsu (LLR = 11674.74, P<0.001). The time series model was established as ARIMA (1, 12, 0), which predicted well for cases from August to December, 2011. Highways and water sources potentially caused spatial variation in Shigella development in Jiangsu. The case-control study confirmed not washing hands before dinner (OR = 3.64) and not having access to a safe water source (OR = 2.04) as the main causes of Shigella in Jiangsu Province. Conclusion Improvement of sanitation and hygiene should be strengthened in economically developed counties, while access to a safe water supply in impoverished areas should be increased at the same time. PMID:24416167
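    The ARIMA step above can be reproduced in outline with statsmodels; the sketch below fits a model of illustrative order (not the paper's reported (1, 12, 0)) to a synthetic monthly case-count series and produces a short-horizon forecast.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
idx = pd.date_range("2001-01", periods=132, freq="MS")      # monthly index
cases = pd.Series(
    50 + 20 * np.sin(2 * np.pi * np.arange(132) / 12) + rng.poisson(5, 132),
    index=idx,
)

fit = ARIMA(cases, order=(1, 1, 0)).fit()   # illustrative order, not the paper's
print(fit.summary())
print(fit.forecast(steps=5))                # predict the next five months
```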

  5. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    NASA Astrophysics Data System (ADS)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2018-01-01

    In this study, we extended the application of linear and nonlinear time series models to the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended the previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with the linear AR model. Unlike previous studies that typically specify threshold models using an internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that the time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with an external threshold variable specification produce more accurate forecasts, indicating that the specification of threshold time series models is of crucial importance. For the raw seismic data, the ACD model does not show an improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best forecasting device to model and forecast the raw seismic data of the Hindu Kush region.
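    As an illustration of the threshold class of models compared above, the sketch below fits a two-regime self-exciting threshold AR (SETAR) model by a grid search over candidate thresholds on simulated data. It is a minimal didactic version, not the specification with external transition variables used in the study.

```python
import numpy as np

def fit_setar(x, delay=1):
    """Two-regime SETAR(1): choose the threshold minimising total squared error."""
    start = max(delay, 1)
    y = x[start:]
    lag1 = x[start - 1:-1]                             # AR(1) regressor
    thr_var = x[start - delay:len(x) - delay]          # threshold (transition) variable
    best = (np.inf, None)
    for thr in np.quantile(thr_var, np.linspace(0.15, 0.85, 50)):
        sse = 0.0
        for mask in (thr_var <= thr, thr_var > thr):
            X = np.column_stack([np.ones(mask.sum()), lag1[mask]])
            coef, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
            sse += np.sum((y[mask] - X @ coef) ** 2)
        if sse < best[0]:
            best = (sse, thr)
    return best[1]

rng = np.random.default_rng(4)
x = np.zeros(500)
for t in range(1, 500):                                # simulate a two-regime AR(1)
    phi = 0.3 if x[t - 1] <= 0 else 0.8
    x[t] = phi * x[t - 1] + rng.standard_normal()
print("estimated threshold:", fit_setar(x))
```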

  6. A comparison of economic evaluation models as applied to geothermal energy technology

    NASA Technical Reports Server (NTRS)

    Ziman, G. M.; Rosenberg, L. S.

    1983-01-01

    Several cost estimation and financial cash flow models have been applied to a series of geothermal case studies. In order to draw conclusions about relative performance and applicability of these models to geothermal projects, the consistency of results was assessed. The model outputs of principal interest in this study were net present value, internal rate of return, or levelized breakeven price. The models used were VENVAL, a venture analysis model; the Geothermal Probabilistic Cost Model (GPC Model); the Alternative Power Systems Economic Analysis Model (APSEAM); the Geothermal Loan Guarantee Cash Flow Model (GCFM); and the GEOCOST and GEOCITY geothermal models. The case studies to which the models were applied include a geothermal reservoir at Heber, CA; a geothermal electric power plant to be located at the Heber site; an alcohol fuels production facility to be built at Raft River, ID; and a direct-use, district heating system in Susanville, CA.

  7. GASAKe: forecasting landslide activations by a genetic-algorithms-based hydrological model

    NASA Astrophysics Data System (ADS)

    Terranova, O. G.; Gariano, S. L.; Iaquinta, P.; Iovine, G. G. R.

    2015-07-01

    GASAKe is a new hydrological model aimed at forecasting the triggering of landslides. The model is based on genetic algorithms and allows one to obtain thresholds for the prediction of slope failures using dates of landslide activations and rainfall series. It can be applied to either single landslides or a set of similar slope movements in a homogeneous environment. Calibration of the model provides families of optimal, discretized solutions (kernels) that maximize the fitness function. Starting from the kernels, the corresponding mobility functions (i.e., the predictive tools) can be obtained through convolution with the rain series. The base time of the kernel is related to the magnitude of the considered slope movement, as well as to the hydro-geological complexity of the site. Generally, shorter base times are expected for shallow slope instabilities compared to larger-scale phenomena. Once validated, the model can be applied to estimate the timing of future landslide activations in the same study area, by employing measured or forecasted rainfall series. Examples of application of GASAKe to a medium-size slope movement (the Uncino landslide at San Fili, in Calabria, southern Italy) and to a set of shallow landslides (in the Sorrento Peninsula, Campania, southern Italy) are discussed. In both cases, a successful calibration of the model has been achieved, despite unavoidable uncertainties concerning the dates of occurrence of the slope movements. In particular, for the Sorrento Peninsula case, a fitness of 0.81 has been obtained by calibrating the model against 10 dates of landslide activation; in the Uncino case, a fitness of 1 (i.e., neither missing nor false alarms) has been achieved using five activations. As for temporal validation, the experiments performed by considering further dates of activation have also proved satisfactory. In view of early-warning applications for civil protection, the capability of the model to simulate the occurrences of the Uncino landslide has been tested by means of a progressive, self-adaptive procedure. Finally, a sensitivity analysis has been performed by taking into account the main parameters of the model. The obtained results are quite promising, given the high performance of the model against different types of slope instabilities characterized by several historical activations. Nevertheless, further refinements are still needed for application to landslide risk mitigation within early-warning and decision-support systems.

  8. GASAKe: forecasting landslide activations by a genetic-algorithms based hydrological model

    NASA Astrophysics Data System (ADS)

    Terranova, O. G.; Gariano, S. L.; Iaquinta, P.; Iovine, G. G. R.

    2015-02-01

    GASAKe is a new hydrological model aimed at forecasting the triggering of landslides. The model is based on genetic algorithms and allows thresholds of landslide activation to be obtained from the set of historical occurrences and from the rainfall series. GASAKe can be applied to either single landslides or sets of similar slope movements in a homogeneous environment. Calibration of the model is based on genetic algorithms and provides families of optimal, discretized solutions (kernels) that maximize the fitness function. Starting from the latter, the corresponding mobility functions (i.e., the predictive tools) can be obtained through convolution with the rain series. The base time of the kernel is related to the magnitude of the considered slope movement, as well as to the hydro-geological complexity of the site. Generally, smaller values are expected for shallow slope instabilities with respect to large-scale phenomena. Once validated, the model can be applied to estimate the timing of future landslide activations in the same study area, by employing recorded or forecasted rainfall series. Examples of application of GASAKe to a medium-scale slope movement (the Uncino landslide at San Fili, in Calabria, Southern Italy) and to a set of shallow landslides (in the Sorrento Peninsula, Campania, Southern Italy) are discussed. In both cases, a successful calibration of the model has been achieved, despite unavoidable uncertainties concerning the dates of landslide occurrence. In particular, for the Sorrento Peninsula case, a fitness of 0.81 has been obtained by calibrating the model against 10 dates of landslide activation; in the Uncino case, a fitness of 1 (i.e., neither missing nor false alarms) has been achieved against 5 activations. As for temporal validation, the experiments performed by considering the extra dates of landslide activation have also proved satisfactory. In view of early-warning applications for civil protection purposes, the capability of the model to simulate the occurrences of the Uncino landslide has been tested by means of a progressive, self-adaptive procedure. Finally, a sensitivity analysis has been performed by taking into account the main parameters of the model. The obtained results are quite promising, given the high performance of the model obtained against different types of slope instabilities, characterized by several historical activations. Nevertheless, further refinements are still needed for applications to landslide risk mitigation within early-warning and decision-support systems.

  9. Wavelet-based multiscale performance analysis: An approach to assess and improve hydrological models

    NASA Astrophysics Data System (ADS)

    Rathinasamy, Maheswaran; Khosa, Rakesh; Adamowski, Jan; ch, Sudheer; Partheepan, G.; Anand, Jatin; Narsimlu, Boini

    2014-12-01

    The temporal dynamics of hydrological processes are spread across different time scales and, as such, the performance of hydrological models cannot be estimated reliably from global performance measures that assign a single number to the fit of a simulated time series to an observed reference series. Accordingly, it is important to analyze model performance at different time scales. Wavelets have been used extensively in the area of hydrological modeling for multiscale analysis, and have been shown to be very reliable and useful in understanding dynamics across time scales and as these evolve in time. In this paper, a wavelet-based multiscale performance measure for hydrological models is proposed and tested (i.e., Multiscale Nash-Sutcliffe Criteria and Multiscale Normalized Root Mean Square Error). The main advantage of this method is that it provides a quantitative measure of model performance across different time scales. In the proposed approach, model and observed time series are decomposed using the Discrete Wavelet Transform (known as the à trous wavelet transform), and performance measures of the model are obtained at each time scale. The applicability of the proposed method was explored using various case studies-both real as well as synthetic. The synthetic case studies included various kinds of errors (e.g., timing error, under and over prediction of high and low flows) in outputs from a hydrologic model. The real time case studies investigated in this study included simulation results of both the process-based Soil Water Assessment Tool (SWAT) model, as well as statistical models, namely the Coupled Wavelet-Volterra (WVC), Artificial Neural Network (ANN), and Auto Regressive Moving Average (ARMA) methods. For the SWAT model, data from Wainganga and Sind Basin (India) were used, while for the Wavelet Volterra, ANN and ARMA models, data from the Cauvery River Basin (India) and Fraser River (Canada) were used. The study also explored the effect of the choice of the wavelets in multiscale model evaluation. It was found that the proposed wavelet-based performance measures, namely the MNSC (Multiscale Nash-Sutcliffe Criteria) and MNRMSE (Multiscale Normalized Root Mean Square Error), are a more reliable measure than traditional performance measures such as the Nash-Sutcliffe Criteria (NSC), Root Mean Square Error (RMSE), and Normalized Root Mean Square Error (NRMSE). Further, the proposed methodology can be used to: i) compare different hydrological models (both physical and statistical models), and ii) help in model calibration.
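    The scale-wise performance measure can be sketched as follows: decompose observed and simulated series with an undecimated (à trous-style) wavelet transform and compute a Nash-Sutcliffe score on each detail level. The snippet uses PyWavelets on synthetic series; the wavelet, depth and data are illustrative, and the exact MNSC/MNRMSE definitions of the paper may differ in detail.

```python
import numpy as np
import pywt

def multiscale_nse(obs, sim, wavelet="haar", levels=4):
    """Nash-Sutcliffe efficiency computed on each stationary-wavelet detail level."""
    obs_coeffs = pywt.swt(obs, wavelet, level=levels)       # undecimated (a trous-style)
    sim_coeffs = pywt.swt(sim, wavelet, level=levels)
    scores = []
    for (_, d_obs), (_, d_sim) in zip(obs_coeffs, sim_coeffs):
        scores.append(1 - np.sum((d_obs - d_sim) ** 2)
                        / np.sum((d_obs - d_obs.mean()) ** 2))
    return scores

rng = np.random.default_rng(5)
obs = np.sin(np.linspace(0, 20 * np.pi, 512)) + 0.2 * rng.standard_normal(512)
sim = obs + 0.3 * rng.standard_normal(512)                  # imperfect "model" output
print(multiscale_nse(obs, sim))
```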

  10. [The warning model and influence of climatic changes on hemorrhagic fever with renal syndrome in Changsha city].

    PubMed

    Xiao, Hong; Tian, Huai-yu; Zhang, Xi-xing; Zhao, Jian; Zhu, Pei-juan; Liu, Ru-chun; Chen, Tian-mu; Dai, Xiang-yu; Lin, Xiao-ling

    2011-10-01

    To understand the influence of climatic changes on the transmission of hemorrhagic fever with renal syndrome (HFRS), and to explore the use of climatic factors in warning of HFRS. A total of 2171 cases of HFRS and the synchronous climatic data in Changsha from 2000 to 2009 were collected to build a climate-based forecasting model for HFRS transmission. The Cochran-Armitage trend test was employed to explore the variation trend of the annual incidence of HFRS. Cross-correlation analysis was then adopted to assess the time-lag period between the climatic factors, including monthly average temperature, relative humidity, rainfall and the Multivariate El Niño-Southern Oscillation Index (MEI), and the monthly HFRS cases. Finally the time-series Poisson regression model was constructed to analyze the influence of different climatic factors on HFRS transmission. The annual incidence of HFRS in Changsha between 2000 - 2009 was 13.09/100 000 (755 cases), 9.92/100 000 (578 cases), 5.02/100 000 (294 cases), 2.55/100 000 (150 cases), 1.13/100 000 (67 cases), 1.16/100 000 (70 cases), 0.95/100 000 (58 cases), 1.40/100 000 (87 cases), 0.75/100 000 (47 cases) and 1.02/100 000 (65 cases), respectively. The incidence showed a decline during these years (Z = -5.78, P < 0.01). The results of the Poisson regression model indicated that the monthly average temperature (18.00°C, r = 0.26, P < 0.01, 1-month lag period; IRR = 1.02, 95% CI: 1.00 - 1.03, P < 0.01), relative humidity (75.50%, r = 0.62, P < 0.01, 3-month lag period; IRR = 1.03, 95% CI: 1.02 - 1.04, P < 0.01), rainfall (112.40 mm, r = 0.25, P < 0.01, 6-month lag period; IRR = 1.01, 95% CI: 1.01 - 1.02, P = 0.02), and MEI (r = 0.31, P < 0.01, 3-month lag period; IRR = 0.77, 95% CI: 0.67 - 0.88, P < 0.01) were closely associated with monthly HFRS cases (18.10 cases). Climate factors significantly influence the incidence of HFRS. If the influence of variable autocorrelation, seasonality, and long-term trend were controlled, the accuracy of forecasting by the time-series Poisson regression model in Changsha would be comparatively high, and we could forecast the incidence of HFRS in advance.
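    The time-series Poisson regression structure described above (monthly counts regressed on lagged climate covariates, with effects reported as incidence rate ratios) can be sketched with statsmodels as below. The input file, column names and the omission of seasonality/trend controls are simplifications for illustration, not the authors' full model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# hypothetical monthly surveillance table: one row per month with case counts
# and climate covariates; file name and column names are placeholders
df = pd.read_csv("hfrs_monthly.csv")
df["temp_lag1"] = df["mean_temp"].shift(1)      # 1-month lag, as reported
df["humid_lag3"] = df["rel_humidity"].shift(3)  # 3-month lag
df["rain_lag6"] = df["rainfall"].shift(6)       # 6-month lag
df = df.dropna()

# Poisson regression of monthly counts on lagged climate terms
# (controls for seasonality and long-term trend are omitted for brevity)
X = sm.add_constant(df[["temp_lag1", "humid_lag3", "rain_lag6"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
print(np.exp(fit.params))                       # incidence rate ratios (IRR)
```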

  11. Assessment of a Bidirectional Reflectance Distribution Correction of Above-Water and Satellite Water-Leaving Radiance in Coastal Waters

    DTIC Science & Technology

    2012-01-10

    A model is proposed to correct above-water and satellite water-leaving radiance data for bidirectional effects. The proposed model is first validated with a one-year time series of in situ … … the proposed model over the current one, demonstrating the need for a specific case 2 water BRDF correction algorithm as well as the feasibility of enhancing …

  12. Climatic Variables and Malaria Morbidity in Mutale Local Municipality, South Africa: A 19-Year Data Analysis.

    PubMed

    Adeola, Abiodun M; Botai, Joel O; Rautenbach, Hannes; Adisa, Omolola M; Ncongwane, Katlego P; Botai, Christina M; Adebayo-Ojo, Temitope C

    2017-11-08

    The north-eastern parts of South Africa, comprising the Limpopo Province, have recorded a sudden rise in the rate of malaria morbidity and mortality in the 2017 malaria season. The epidemiological profiles of malaria, as well as other vector-borne diseases, are strongly associated with climate and environmental conditions. A retrospective understanding of the relationship between climate and the occurrence of malaria may provide insight into the dynamics of the disease's transmission and its persistence in the north-eastern region. In this paper, the association between climatic variables and the occurrence of malaria was studied in the Mutale local municipality in South Africa over a period of 19 years. Time series analysis was conducted on monthly climatic variables and monthly malaria cases in the Mutale municipality for the period of 1998-2017. Spearman correlation analysis was performed and the Seasonal Autoregressive Integrated Moving Average (SARIMA) model was developed. Microsoft Excel was used for data cleaning, and statistical software R was used to analyse the data and develop the model. Results show that both climatic variables' and malaria cases' time series exhibited seasonal patterns, showing a number of peaks and fluctuations. Spearman correlation analysis indicated that monthly total rainfall, mean minimum temperature, mean maximum temperature, mean average temperature, and mean relative humidity were significantly and positively correlated with monthly malaria cases in the study area. Regression analysis showed that monthly total rainfall and monthly mean minimum temperature (R² = 0.65), with a two-month lagged effect, are the most significant climatic predictors of malaria transmission in Mutale local municipality. A SARIMA (2,1,2) (1,1,1) model fitted with only malaria cases has a prediction performance of about 51%, and the SARIMAX (2,1,2) (1,1,1) model with climatic variables as exogenous factors has a prediction performance of about 72% in malaria cases. The model gives a close comparison between the predicted and observed number of malaria cases, hence indicating that the model provides an acceptable fit to predict the number of malaria cases in the municipality. To sum up, the association between the climatic variables and malaria cases provides clues to better understand the dynamics of malaria transmission. The lagged effect detected in this study can help in adequate planning for malaria intervention.
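    A SARIMAX fit with climate covariates as exogenous regressors, mirroring the (2,1,2)(1,1,1)12 structure reported above, can be set up with statsmodels as follows; the input file and the choice of lagged covariates are hypothetical placeholders.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# hypothetical monthly data: malaria case counts plus lagged climate covariates
df = pd.read_csv("mutale_malaria.csv", index_col="month", parse_dates=True)
exog = df[["rain_lag2", "tmin_lag2"]]          # two-month lagged rainfall and minimum temperature

model = SARIMAX(
    df["cases"],
    exog=exog,
    order=(2, 1, 2),
    seasonal_order=(1, 1, 1, 12),              # 12-month seasonality
)
fit = model.fit(disp=False)
print(fit.summary())
# out-of-sample prediction requires future exogenous values:
# fit.forecast(steps=6, exog=future_exog)
```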

  13. Assessing the economic impacts of drought from the perspective of profit loss rate: a case study of the sugar industry in China

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Lin, L.; Chen, H.

    2015-02-01

    Natural disasters have enormous impacts on human society, especially on the development of the economy. To support decision making in mitigation and adaption to natural disasters, assessment of economic impacts is fundamental and of great significance. Based on a review of the literature of economic impact evaluation, this paper proposes a new assessment model of economic impact from drought by using the sugar industry in China as a case study, which focuses on the generation and transfer of economic impacts along a simple value chain involving only sugarcane growers and a sugar producing company. A perspective of profit loss rate is applied to scale economic impact with a model based on cost-and-benefit analysis. By using analysis of "with-and-without", profit loss is defined as the difference in profits between disaster-hit and disaster-free scenarios. To calculate profit, analysis on a time series of sugar price is applied. With the support of a linear regression model, an endogenous trend in sugar price is identified, and the time series of sugar price "without" disaster is obtained using an autoregressive error model to separate impact by disasters from the internal trend in sugar price. Unlike the settings in other assessment models, representative sugar prices, which represent value level in disaster-free condition and disaster-hit condition, are integrated from a long time series that covers the whole period of drought. As a result, it is found that in a rigid farming contract, sugarcane growers suffer far more than the sugar company when impacted by severe drought, which may promote the reflections on economic equality among various economic bodies at the occurrence of natural disasters.

  14. GPS IPW as a Meteorological Parameter and Climate Global Change Indicator

    NASA Astrophysics Data System (ADS)

    Kruczyk, M.; Liwosz, T.

    2011-12-01

    This paper focuses on a comprehensive investigation of GPS-derived IPW (Integrated Precipitable Water, also IWV) as a geophysical tool. GPS meteorology is now a widely acknowledged indirect method of atmosphere sensing. First we demonstrate the quality of GPS IPW. The most thorough inter-technique comparisons of directly measured IPW are attainable only for some observatories (note the modest percentage of GPS stations equipped with meteorological devices). Nonetheless we have managed to compare IPW series derived from GPS tropospheric solutions (ZTD mostly from IGS and EPN solutions) with those from independent techniques. The meteorological sources of IPW we used are: radiosoundings, a sun photometer and the input fields of a numerical weather prediction model. Operational NWP models can be treated as a meteorological database from which IWV can be calculated for all GPS stations independently of the network of direct measurements (the COSMO-LM model maintained by the Polish Institute of Meteorology and Water Management was tried). Sun photometer data (CIMEL-318, Central Geophysical Observatory IGF PAS, Belsk, Poland) seem the most genuine source, so we opted for direct collocation of GPS and sun photometer measurements by placing a permanent GPS receiver on the roof of the Belsk Observatory. Next we analyse IPW as a geophysical parameter: IPW demonstrates physical effects related to station location (height, and series correlation coefficient as a function of distance) and weather patterns such as dominant wind directions (in the case of neighbouring stations). The deficiency of surface humidity data for modelling IPW is presented for different climates. This inadequacy, together with the poor representation of humidity in NWP models, strongly encourages investigating the potential for information exchange between numerical models and GPS networks. The second and most important aspect of this study concerns long series of IPW (daily averaged), which can serve as an indicator of climatological information (the role of water vapour in the climate system is hard to exaggerate). Especially intriguing is the distinctive shape of such series in different climates. Long-lasting changes in weather conditions, such as 'dry' and 'wet' years, are also visible. The longer and more uniform our series are, the better the chance of estimating the magnitude of climatological IWV changes. A homogeneous ZTD solution over a long period is a major concern in this approach (problems with GPS processing strategy and reference system changes). In the case of a continental network (the EUREF Permanent Network), reliable data are obtained only after reprocessing. A simple sinusoidal model has been fitted to the IPW series by least squares for selected stations (mainly in Europe but also on other continents - IGS stations), each year separately. Not only the amplitudes but also the phases of the annual signal differ from year to year. Longer IPW series (up to 14 years) searched for a climatological signal sometimes reveal a weak steady trend. The large number of permanent GPS stations, the relative ease of IPW derivation (only surface meteorological data are needed apart from the GPS solution) and the significance of water vapour in the water cycle and global climate make GPS IPW a promising element of global environmental change monitoring.
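    The sinusoidal least-squares fit mentioned above is conveniently done with sine and cosine regressors, so that the annual amplitude and phase follow from the two coefficients; a linear term can absorb a possible trend. The sketch below uses synthetic daily IPW values and illustrative units.

```python
import numpy as np

rng = np.random.default_rng(6)
days = np.arange(5 * 365)                                   # five years of daily IPW
ipw = (15 + 8 * np.sin(2 * np.pi * days / 365.25 + 0.5)
       + 0.001 * days + 2 * rng.standard_normal(days.size))

# design matrix: offset, linear trend, annual sine and cosine
X = np.column_stack([
    np.ones_like(days, dtype=float),
    days.astype(float),
    np.sin(2 * np.pi * days / 365.25),
    np.cos(2 * np.pi * days / 365.25),
])
c, *_ = np.linalg.lstsq(X, ipw, rcond=None)
amplitude = np.hypot(c[2], c[3])            # annual amplitude
phase = np.arctan2(c[3], c[2])              # annual phase
print(f"trend: {c[1] * 365.25:.3f} mm/yr, amplitude: {amplitude:.2f} mm, phase: {phase:.2f} rad")
```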

  15. Outcomes of regenerative endodontic procedures.

    PubMed

    Law, Alan S

    2012-07-01

    The use of regenerative endodontic techniques holds great promise for the treatment of immature teeth with necrotic pulp tissue. Several published case reports and case series have demonstrated radiographic evidence of apical bone healing, increases in root length, and root wall thickness. Although histologic changes have been demonstrated in animal models, histology in human teeth is lacking. A summary of these outcomes is discussed in this article. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Getting started with the model for improvement: psychology and leadership in quality improvement.

    PubMed

    Pratap, J Nick; Varughese, Anna M; Adler, Elena; Kurth, C Dean

    2013-02-01

    Although the case for quality in hospitals is compelling, doctors are often uncertain how to achieve it. This article forms the third and final part of a series providing practical guidance on getting started with a first quality improvement project.

  17. Assessments of higher-order ionospheric effects on GPS coordinate time series: A case study of CMONOC with longer time series

    NASA Astrophysics Data System (ADS)

    Jiang, Weiping; Deng, Liansheng; Zhou, Xiaohui; Ma, Yifang

    2014-05-01

    Higher-order ionospheric (HOI) corrections are proposed to become a standard part of precise GPS data analysis. In this study, we investigate in depth the impacts of HOI corrections on coordinate time series by re-processing GPS data from the Crustal Movement Observation Network of China (CMONOC). Nearly 13 years of data are used in our three processing runs: (a) run NO, without HOI corrections, (b) run IG, in which both second- and third-order corrections are modeled using the International Geomagnetic Reference Field 11 (IGRF11) for the magnetic field, and (c) run ID, the same as IG but with a dipole magnetic model applied. Both spectral analysis and noise analysis are adopted to investigate these effects. Results show that for CMONOC stations, HOI corrections bring an overall improvement. After the corrections are applied, the noise amplitudes decrease, with the white noise amplitudes showing a more remarkable variation. Low-latitude sites are more affected. For different coordinate components, the impacts vary. The results of an analysis of stacked periodograms show that there is a good match between the seasonal amplitudes and the HOI corrections, and the observed variations in the coordinate time series are related to HOI effects. HOI delays partially explain the seasonal amplitudes in the coordinate time series, especially for the U component. The annual amplitudes for all components are decreased for over one-half of the selected CMONOC sites. Additionally, the semi-annual amplitudes for the sites are much more strongly affected by the corrections. However, when the dipole model is used, the results are not as good as with the IGRF model. Analysis of the dipole model indicates that HOI corrections computed with it lead to an increase in noise amplitudes and can generate false periodic signals. When the dipole model is used to model the HOI terms, larger residuals and noise are introduced rather than effective improvements.

  18. Aquatic Trophic Productivity model: A decision support model for river restoration planning in the Methow River, Washington

    USGS Publications Warehouse

    Benjamin, Joseph R.; Bellmore, J. Ryan

    2016-05-19

    In this report, we outline the structure of a stream food-web model constructed to explore how alternative river restoration strategies may affect stream fish populations. We have termed this model the “Aquatic Trophic Productivity model” (ATP). We present the model structure, followed by three case study applications of the model to segments of the Methow River watershed in northern Washington. For two case studies (middle Methow River and lower Twisp River floodplain), we ran a series of simulations to explore how food-web dynamics respond to four distinctly different, but applied, strategies in the Methow River watershed: (1) reconnection of floodplain aquatic habitats, (2) riparian vegetation planting, (3) nutrient augmentation (that is, salmon carcass addition), and (4) enhancement of habitat suitability for fish. For the third case study, we conducted simulations to explore the potential fish and food-web response to habitat improvements conducted in 2012 at the Whitefish Island Side Channel, located in the middle Methow River.

  19. A Bayesian nonparametric approach to dynamical noise reduction

    NASA Astrophysics Data System (ADS)

    Kaloudis, Konstantinos; Hatjispyros, Spyridon J.

    2018-06-01

    We propose a Bayesian nonparametric approach for the noise reduction of a given chaotic time series contaminated by dynamical noise, based on Markov Chain Monte Carlo methods. The underlying unknown noise process (possibly) exhibits heavy tailed behavior. We introduce the Dynamic Noise Reduction Replicator model with which we reconstruct the unknown dynamic equations and in parallel we replicate the dynamics under reduced noise level dynamical perturbations. The dynamic noise reduction procedure is demonstrated specifically in the case of polynomial maps. Simulations based on synthetic time series are presented.

  20. Effect of Inherited Genetic Information on Stochastic Predator-Prey Model

    NASA Astrophysics Data System (ADS)

    Duda, Artur; Dyś, Paweł; Nowicka, Aleksandra; Dudek, Mirosław R.

    We discuss the Lotka-Volterra dynamics of two populations, prey and predators, in the case when the predators possess genetic information. The genetic information is inherited according to the rules of the Penna model of genetic evolution. Each individual of the predator population is uniquely determined by sex, genotype and phenotype. In our case, the genes are represented by 8-bit integers and the phenotypes are defined with the help of the 8-state Potts model Hamiltonian. We showed that during time evolution, the population of the predators can experience a series of dynamical phase transitions which are connected with the different types of the dominant phenotypes present in the population.

  1. Estimating the basic reproduction rate of HFMD using the time series SIR model in Guangdong, China

    PubMed Central

    Du, Zhicheng; Zhang, Wangjian; Zhang, Dingmei; Yu, Shicheng

    2017-01-01

    Hand, foot, and mouth disease (HFMD) has caused a substantial burden of disease in China, especially in Guangdong Province. Based on notifiable cases, we use the time series Susceptible-Infected-Recovered model to estimate the basic reproduction rate (R0) and the herd immunity threshold, understanding the transmission and persistence of HFMD more completely for efficient intervention in this province. The standardized difference between the reported and fitted time series of HFMD was 0.009 (<0.2). The median basic reproduction rate of total, enterovirus 71, and coxsackievirus 16 cases in Guangdong were 4.621 (IQR: 3.907–5.823), 3.023 (IQR: 2.289–4.292) and 7.767 (IQR: 6.903–10.353), respectively. The heatmap of R0 showed semiannual peaks of activity, including a major peak in spring and early summer (about the 12th week) followed by a smaller peak in autumn (about the 36th week). The county-level model showed that Longchuan (R0 = 33), Gaozhou (R0 = 24), Huazhou (R0 = 23) and Qingxin (R0 = 19) counties have higher basic reproduction rate than other counties in the province. The epidemic of HFMD in Guangdong Province is still grim, and strategies like the World Health Organization’s expanded program on immunization need to be implemented. An elimination of HFMD in Guangdong might need a Herd Immunity Threshold of 78%. PMID:28692654
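    The herd immunity threshold quoted above follows from the estimated basic reproduction rate through the standard relation, which for the reported median value gives roughly the stated 78% (a routine check, not an additional result of the paper):

```latex
\mathrm{HIT} = 1 - \frac{1}{R_0} = 1 - \frac{1}{4.621} \approx 0.78
```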

  2. Non-stationary time series modeling on caterpillars pest of palm oil for early warning system

    NASA Astrophysics Data System (ADS)

    Setiyowati, Susi; Nugraha, Rida F.; Mukhaiyar, Utriweni

    2015-12-01

    Oil palm production plays an important role in the plantation and economic sectors in Indonesia. One of the important problems in oil palm cultivation is pests, which damage the quality of the fruit. The caterpillar pest, which feeds on the palm tree's leaves, causes a decline in the quality of palm oil production. An early warning system is needed to minimize losses due to this pest. Here, we applied non-stationary time series modeling, especially the family of autoregressive models, to predict the number of pests based on historical data. We note a distinctive feature of these pest data, i.e., spike values that occur almost periodically. Through simulations and a case study, we find that the selection of the constant factor has a significant influence on the model, allowing it to capture the spike values precisely.

  3. How bootstrap can help in forecasting time series with more than one seasonal pattern

    NASA Astrophysics Data System (ADS)

    Cordeiro, Clara; Neves, M. Manuela

    2012-09-01

    The search for the future is an appealing challenge in time series analysis. The diversity of forecasting methodologies is inevitable and still expanding. Exponential smoothing methods are the launch platform for modelling and forecasting in time series analysis. Recently this methodology has been combined with bootstrapping, revealing good performance. The Boot.EXPOS algorithm, which combines exponential smoothing and bootstrap methodologies, has shown promising results for forecasting time series with one seasonal pattern. For the case of more than one seasonal pattern, the double seasonal Holt-Winters and exponential smoothing methods were developed. A new challenge is to combine these seasonal methods with the bootstrap and carry over a resampling scheme similar to that used in the Boot.EXPOS procedure. The performance of this combination is illustrated for some well-known data sets available in the software.
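    The general idea behind combining exponential smoothing with the bootstrap can be sketched as follows (in the spirit of Boot.EXPOS, not the authors' exact algorithm): fit a Holt-Winters model, resample its residuals to build pseudo-series, refit, and average the resulting forecasts. Data and settings below are synthetic and illustrative.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(7)
idx = pd.date_range("2010-01", periods=96, freq="MS")
y = pd.Series(100 + 20 * np.sin(2 * np.pi * np.arange(96) / 12)
              + 5 * rng.standard_normal(96), index=idx)

def bootstrap_forecast(y, horizon=12, n_boot=50, m=12):
    """Residual-bootstrap ensemble of Holt-Winters forecasts (illustrative)."""
    base = ExponentialSmoothing(y, trend="add", seasonal="add",
                                seasonal_periods=m).fit()
    resid = (y - base.fittedvalues).to_numpy()
    forecasts = []
    for _ in range(n_boot):
        pseudo = base.fittedvalues + rng.choice(resid, size=len(y), replace=True)
        refit = ExponentialSmoothing(pseudo, trend="add", seasonal="add",
                                     seasonal_periods=m).fit()
        forecasts.append(refit.forecast(horizon))
    return pd.concat(forecasts, axis=1).mean(axis=1)   # bootstrap-averaged forecast

print(bootstrap_forecast(y).head())
```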

  4. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
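    A one-dimensional version of the polynomial chaos construction described above can be sketched briefly: sample a standard normal germ, run the (here, stand-in) model at the corresponding parameter values, and regress the outputs on probabilists' Hermite polynomials to obtain the PCE coefficients. All functions and settings below are illustrative assumptions, not the authors' hydrological model.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def model_output(k):
    """Stand-in for a hydrological model run with uncertain parameter k."""
    return np.exp(-0.5 * k) + 0.1 * k ** 2

degree, n_colloc = 5, 200
rng = np.random.default_rng(8)
xi = rng.standard_normal(n_colloc)            # standard normal germ
y = model_output(1.0 + 0.3 * xi)              # uncertain parameter k = 1 + 0.3*xi

Psi = hermevander(xi, degree)                 # probabilists' Hermite basis He_0..He_5
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# the surrogate can now replace the model for cheap Monte Carlo propagation
xi_new = rng.standard_normal(100000)
y_pce = hermevander(xi_new, degree) @ coeffs
print("surrogate mean/std:", y_pce.mean(), y_pce.std())
```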

  5. Causal discovery and inference: concepts and recent methodological advances.

    PubMed

    Spirtes, Peter; Zhang, Kun

    This paper aims to give a broad coverage of central concepts and principles involved in automated causal inference and emerging approaches to causal discovery from i.i.d data and from time series. After reviewing concepts including manipulations, causal models, sample predictive modeling, causal predictive modeling, and structural equation models, we present the constraint-based approach to causal discovery, which relies on the conditional independence relationships in the data, and discuss the assumptions underlying its validity. We then focus on causal discovery based on structural equations models, in which a key issue is the identifiability of the causal structure implied by appropriately defined structural equation models: in the two-variable case, under what conditions (and why) is the causal direction between the two variables identifiable? We show that the independence between the error term and causes, together with appropriate structural constraints on the structural equation, makes it possible. Next, we report some recent advances in causal discovery from time series. Assuming that the causal relations are linear with nonGaussian noise, we mention two problems which are traditionally difficult to solve, namely causal discovery from subsampled data and that in the presence of confounding time series. Finally, we list a number of open questions in the field of causal discovery and inference.

  6. Iterative Procedures for Exact Maximum Likelihood Estimation in the First-Order Gaussian Moving Average Model

    DTIC Science & Technology

    1990-11-01

    (Q + aa′)⁻¹ = Q⁻¹ − Q⁻¹aa′Q⁻¹ / (1 + a′Q⁻¹a). This is a simple case of a general formula called Woodbury's formula by some authors; see, for example, Phadke and … [contents: 2. The First-Order Moving Average Model; 3. Some Approaches to the Iterative …] … the approximate likelihood function in some time series models. Useful suggestions have been the Cholesky decomposition of the covariance matrix and …

  7. Application of information theory methods to food web reconstruction

    USGS Publications Warehouse

    Moniz, L.J.; Cooch, E.G.; Ellner, S.P.; Nichols, J.D.; Nichols, J.M.

    2007-01-01

    In this paper we use information theory techniques on time series of abundances to determine the topology of a food web. At the outset, the food web participants (two consumers, two resources) are known; in addition we know that each consumer prefers one of the resources over the other. However, we do not know which consumer prefers which resource, and if this preference is absolute (i.e., whether or not the consumer will consume the non-preferred resource). Although the consumers and resources are identified at the beginning of the experiment, we also provide evidence that the consumers are not resources for each other, and the resources do not consume each other. We do show that there is significant mutual information between resources; the model is seasonally forced and some shared information between resources is expected. Similarly, because the model is seasonally forced, we expect shared information between consumers as they respond to the forcing of the resources. The model that we consider does include noise, and in an effort to demonstrate that these methods may be of some use in other than model data, we show the efficacy of our methods with decreasing time series size; in this particular case we obtain reasonably clear results with a time series length of 400 points. This approaches ecological time series lengths from real systems.

  8. Estimates of Zenith Total Delay trends from GPS reprocessing with autoregressive process

    NASA Astrophysics Data System (ADS)

    Klos, Anna; Hunegnaw, Addisu; Teferle, Felix Norman; Ebuy Abraha, Kibrom; Ahmed, Furqan; Bogusz, Janusz

    2017-04-01

    Nowadays, near real-time Zenith Total Delay (ZTD) estimates from Global Positioning System (GPS) observations are routinely assimilated into numerical weather prediction (NWP) models to improve the reliability of forecasts. On the other hand, ZTD time series derived from homogeneously re-processed GPS observations over long periods have the potential to improve our understanding of climate change on various temporal and spatial scales. With such time series only recently reaching somewhat adequate time spans, the application of GPS-derived ZTD estimates to climate monitoring is still to be developed further. In this research, we examine the character of noise in ZTD time series for 1995-2015 in order to estimate more realistic magnitudes of the trend and its uncertainty than would be the case if the stochastic properties were not taken into account. Furthermore, the hourly sampled, homogeneously re-processed and carefully homogenized ZTD time series from over 700 globally distributed stations were classified into five major climate zones. We found that the amplitudes of annual signals reach values of 10-150 mm with minimum values for the polar and Alpine zones. The amplitudes of daily signals were estimated to be 0-12 mm with maximum values found for the dry zone. We examined seven different noise models for the residual ZTD time series after modelling all known periodicities. This identified a combination of white noise plus an autoregressive process of fourth order as optimal to match all changes in power of the ZTD data. When the stochastic properties are neglected, i.e. a pure white noise model is employed, only 11 of the 120 trends were insignificant. Using the optimum noise model, more than half of the 120 examined trends became insignificant. We show that the uncertainty of ZTD trends is underestimated by a factor of 3-12 when the stochastic properties of the ZTD time series are ignored, and we conclude that it is essential to properly model the noise characteristics of such time series when interpretations in terms of climate change are to be performed.
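
    The underestimation of trend uncertainty under a wrongly assumed white-noise model can be illustrated with a small Monte Carlo sketch (synthetic AR(1) noise is used here as a simple stand-in for the white-plus-AR(4) model identified in the study; the numbers are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(3)
n, n_sim, phi = 1000, 300, 0.8
t = np.arange(n)
X = np.column_stack([np.ones(n), t])
XtX_inv_11 = np.linalg.inv(X.T @ X)[1, 1]

slopes, naive_se = [], []
for _ in range(n_sim):
    e = np.zeros(n)                       # AR(1) noise, a simple stand-in for white + AR(4)
    for i in range(1, n):
        e[i] = phi * e[i - 1] + rng.normal()
    beta, res, *_ = np.linalg.lstsq(X, e, rcond=None)
    slopes.append(beta[1])                # fitted trend of a trend-free noise series
    naive_se.append(np.sqrt(res[0] / (n - 2) * XtX_inv_11))

print("empirical scatter of fitted trends:", np.std(slopes))
print("mean white-noise standard error   :", np.mean(naive_se))
print("underestimation factor            :", np.std(slopes) / np.mean(naive_se))
```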

  9. The effectiveness of therapeutic assessment with an adult client: a single-case study using a time-series design.

    PubMed

    Aschieri, Filippo; Smith, Justin D

    2012-01-01

    This article presents the therapeutic assessment (TA; Finn, 2007) of a traumatized young woman named Claire. Claire reported feeling debilitated by academic demands and the expectations of her parents, and was finding it nearly impossible to progress in her studies. She was also finding it difficult to develop and sustain intimate relationships. The emotional aspects of close relationships were extremely difficult for her and she routinely blamed herself for her struggles in this arena. The assessor utilized the TA model for adults, with the exception of not including an optional intervention session. The steps of TA, particularly the extended inquiry and the discussion of test findings along the way, cultivated a supportive and empathic atmosphere with Claire. By employing the single-case time-series experimental design used in previous TA studies (e.g., Smith, Handler, & Nash, 2010; Smith, Wolf, Handler, & Nash, 2009), the authors demonstrated that Claire experienced statistically significant improvement correlated with the onset of TA. Results indicated that participation in TA coincided with a positive shift in the trajectory of her reported symptoms and with recognizing the affection she held for others in her life. This case illustrates the successful application of case-based time-series methodology in the evaluation of an adult TA. The potential implications for future study are discussed.

  10. 76 FR 45713 - Airworthiness Directives; Bombardier Inc. Model DHC-8-400 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... MLG extension/retraction system. * * * * * The unsafe condition is loss of control during landing. The... MLG extension/retraction system. This directive is to mandate the incorporation of a new maintenance... in the case of a failure of the normal MLG extension/retraction system. * * * * * The unsafe...

  11. Idea Bank.

    ERIC Educational Resources Information Center

    Science Teacher, 1993

    1993-01-01

    Presents a series of science teaching ideas with the following titles: When Demonstrations Are Misleading, Lasers and Refraction, An Improved Stair-Step Model, Correcting Your Compass, Seeing Is Not Believing, Food Coloring: From the Kitchen to the Lab, Punny Business, Portfolios in Science, Feathers or Gold: A Case for Using the Metric System,…

  12. Education and Gender Egalitarianism: The Case of China

    ERIC Educational Resources Information Center

    Shu, Xiaoling

    2004-01-01

    This study examined Chinese attitudes toward women's careers, marriage rights, sexual freedom, and the importance of having sons using a 1991 national sample of individuals and community-level data and through a series of nested multilevel models. Education influences gender attitudes in multiple ways at both the micro- and macrolevels.…

  13. Standardized residual as response function for order identification of multi input intervention analysis

    NASA Astrophysics Data System (ADS)

    Suhartono, Lee, Muhammad Hisyam; Rezeki, Sri

    2017-05-01

    Intervention analysis is a time series model widely used to describe the effect of an intervention caused by external or internal factors. An example of an external factor that often occurs in Indonesia is a disaster, whether natural or man-made. The main purpose of this paper is to provide the results of theoretical studies on the identification step for determining the order of multi-input intervention analysis, for evaluating the magnitude and duration of the impact of interventions on time series data. The theoretical results showed that the standardized residuals can properly be used as the response function for determining the order of a multi-input intervention model. These results are then applied to evaluate the impact of a disaster in a real case in Indonesia, i.e. the magnitude and duration of the impact of the Lapindo mud on the volume of vehicles on the highway. Moreover, the empirical results showed that the multi-input intervention model can accurately describe and explain the magnitude and duration of the impact of disasters on time series data.

  14. Stochastic Hourly Weather Generator HOWGH: Validation and its Use in Pest Modelling under Present and Future Climates

    NASA Astrophysics Data System (ADS)

    Dubrovsky, M.; Hirschi, M.; Spirig, C.

    2014-12-01

    To quantify the impact of climate change on a specific pest (or any weather-dependent process) at a specific site, we may use a site-calibrated pest (or other) model and compare its outputs obtained with site-specific weather data representing present vs. perturbed climates. The input weather data may be produced by a stochastic weather generator. Apart from the quality of the pest model, the reliability of the results obtained in such an experiment depends on the ability of the generator to represent the statistical structure of real-world weather series, and on the sensitivity of the pest model to possible imperfections of the generator. This contribution deals with the multivariate HOWGH weather generator, which is based on a combination of parametric and non-parametric statistical methods. Here, HOWGH is used to generate synthetic hourly series of three weather variables (solar radiation, temperature and precipitation) required by the dynamic pest model SOPRA to simulate the development of codling moth. The contribution presents results of the direct and indirect validation of HOWGH. In the direct validation, the synthetic series generated by HOWGH (various settings of its underlying model are assumed) are validated in terms of multiple climatic characteristics, focusing on the subdaily wet/dry and hot/cold spells. In the indirect validation, we assess the generator in terms of characteristics derived from the outputs of the SOPRA model fed by the observed vs. synthetic series. The weather generator may be used to produce weather series representing present and future climates. In the latter case, the parameters of the generator may be modified by climate change scenarios based on Global or Regional Climate Models. To demonstrate this feature, the results of codling moth simulations for the future climate will be shown. Acknowledgements: The weather generator is developed and validated within the frame of the projects WG4VALUE (project LD12029 sponsored by the Ministry of Education, Youth and Sports of CR) and VALUE (COST ES 1102 action).

  15. Computer simulation of the activity of the elderly person living independently in a Health Smart Home.

    PubMed

    Noury, N; Hadidi, T

    2012-12-01

    We propose a simulator of human activities collected with presence sensors in our experimental Health Smart Home "Habitat Intelligent pour la Sante (HIS)". We recorded 1492 days of data on several experimental HIS during the French national project "AILISA". On these real data, we built a mathematical model of the behavior of the data series, based on "Hidden Markov Models" (HMM). The model is then played on a computer to produce simulated data series, with added flexibility to adjust the parameters in various scenarios. We also tested several methods to measure the similarity between our real and simulated data. Our simulator can produce large databases which can be further used to evaluate algorithms that raise an alarm in case of loss of autonomy. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
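
    A minimal sketch of the simulation idea, with hypothetical activity states and hand-picked transition and emission matrices (not the AILISA-fitted model):

```python
import numpy as np

rng = np.random.default_rng(4)
states = ["sleep", "kitchen", "living_room"]       # hypothetical activity states
sensors = ["bedroom", "kitchen", "living_room"]    # presence sensors (observations)

A = np.array([[0.90, 0.05, 0.05],     # state transition probabilities
              [0.10, 0.80, 0.10],
              [0.10, 0.10, 0.80]])
B = np.array([[0.95, 0.03, 0.02],     # emission probabilities P(sensor fired | state)
              [0.05, 0.90, 0.05],
              [0.05, 0.05, 0.90]])

def simulate(n_steps, start=0):
    s, obs = start, []
    for _ in range(n_steps):
        obs.append(rng.choice(len(sensors), p=B[s]))
        s = rng.choice(len(states), p=A[s])
    return obs

day = simulate(24 * 60)   # one simulated day at one-minute resolution
print("fraction of minutes per sensor:",
      {sensors[k]: round(day.count(k) / len(day), 3) for k in range(len(sensors))})
```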

  16. Damage classification and estimation in experimental structures using time series analysis and pattern recognition

    NASA Astrophysics Data System (ADS)

    de Lautour, Oliver R.; Omenzetter, Piotr

    2010-07-01

    Developed for studying long sequences of regularly sampled data, time series analysis methods are being increasingly investigated for use in Structural Health Monitoring (SHM). In this research, Autoregressive (AR) models were used to fit the acceleration time histories obtained from two experimental structures: a 3-storey bookshelf structure and the ASCE Phase II Experimental SHM Benchmark Structure, in undamaged and a limited number of damaged states. The coefficients of the AR models were considered to be damage-sensitive features and used as input into an Artificial Neural Network (ANN). The ANN was trained to classify damage cases or estimate remaining structural stiffness. The results showed that the combination of AR models and ANNs is an efficient tool for damage classification and estimation, and performs well using a small number of damage-sensitive features and limited sensors.
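
    The pipeline described here, AR coefficients as damage-sensitive features fed to a neural network classifier, can be sketched as follows (synthetic records and hypothetical model orders, not the benchmark data):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)

def ar_coefficients(x, p=4):
    """Least-squares AR(p) coefficients used as damage-sensitive features."""
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def record(phi, n=1024):
    """Synthetic acceleration record from an AR(2) process with coefficients phi."""
    x = np.zeros(n)
    for i in range(2, n):
        x[i] = phi[0] * x[i - 1] + phi[1] * x[i - 2] + rng.normal()
    return x

# 'undamaged' and 'damaged' structures simulated with slightly different dynamics
X = [ar_coefficients(record([1.6, -0.8])) for _ in range(60)] + \
    [ar_coefficients(record([1.4, -0.7])) for _ in range(60)]
y = [0] * 60 + [1] * 60

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X[::2], y[::2])                       # train on every other record
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```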

  17. [Approximation to the dynamics of meningococcal meningitis through dynamic systems and time series].

    PubMed

    Canals, M

    1996-02-01

    Meningococcal meningitis is subject to epidemiological surveillance due to its severity and the occasional presentation of epidemic outbreaks. This work analyses previous disease models, generates new ones, and analyses monthly cases using ARIMA time series models. The results show that the disease dynamics for closed populations is epidemic, and the epidemic size is related to the proportion of carriers and the transmissibility of the agent. In open populations, the disease dynamics depends on the admission rate of susceptible individuals and the relative admission of infected individuals. Our model considers logistic population growth and carrier admission proportional to population size, generating endemic dynamics. Considering a non-instantaneous system response, greater realism is obtained, establishing that the endemic situation may present dynamics highly sensitive to initial conditions, depending on the transmissibility and the proportion of susceptible individuals in the population. The time series model showed an adequate predictive capacity over horizons no longer than 10 months. The lack of long-term predictability was attributed to local changes in the proportion of carriers or in transmissibility that lead to chaotic dynamics over a seasonal pattern. Predictions for 1995 and 1996 were obtained.

  18. Inter-comparison of time series models of lake levels predicted by several modeling strategies

    NASA Astrophysics Data System (ADS)

    Khatibi, R.; Ghorbani, M. A.; Naghipour, L.; Jothiprakash, V.; Fathima, T. A.; Fazelifard, M. H.

    2014-04-01

    Five modeling strategies are employed to analyze water level time series of six lakes with different physical characteristics such as shape, size, altitude and range of variations. The models comprise chaos theory, Auto-Regressive Integrated Moving Average (ARIMA) - treated for seasonality and hence SARIMA, Artificial Neural Networks (ANN), Gene Expression Programming (GEP) and Multiple Linear Regression (MLR). Each is formulated on a different premise with different underlying assumptions. Chaos theory is elaborated in greater detail, as it is customary to identify the existence of chaotic signals by a number of techniques (e.g. average mutual information and false nearest neighbors), and future values are predicted using the Nonlinear Local Prediction (NLP) technique. This paper takes a critical view of past inter-comparison studies seeking a superior performance, against which it is reported that (i) the performances of all five modeling strategies vary from good to poor, hampering the recommendation of a clear-cut predictive model; (ii) the performances for the datasets of two cases are consistently better with all five modeling strategies; (iii) in the other cases, the performances are poor but the results can still be fit-for-purpose; (iv) the simultaneously good performances of NLP and SARIMA pull their underlying assumptions to different ends, which cannot be reconciled. A number of arguments are presented, including the culture of pluralism, according to which the various modeling strategies facilitate an insight into the data from different vantage points.

  19. Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.

    PubMed

    Durdu, Omer Faruk

    2010-10-01

    In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict boron content in the Büyük Menderes catchment. Initially, Box-Whisker plots and Kendall's tau test are used to identify the trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves the following three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) results of the boron data series, different ARIMA models are identified. The model that gives the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models and the results indicate that the residuals are independent, normally distributed, and homoscedastic. For model validation purposes, the predicted results using the best ARIMA models are compared to the observed data. The predicted data show reasonably good agreement with the actual data. The comparison of the mean and variance of 3 years (2002-2004) of observed vs predicted data from the selected best models shows that the boron models from the ARIMA modeling approach could be used in a safe manner, since the predicted values from these models preserve the basic statistics of the observed data in terms of the mean. The ARIMA modeling approach is recommended for predicting the boron concentration series of a river.
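
    The three-step Box-Jenkins procedure described above can be sketched with statsmodels (synthetic data standing in for the boron series; AIC-based selection over a small order grid and a Ljung-Box diagnostic on the residuals):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(6)
n = 200
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):                      # synthetic ARMA(1,1) "concentration" series
    y[t] = 0.6 * y[t - 1] + e[t] + 0.3 * e[t - 1]
series = pd.Series(y + 5.0)

# identification + estimation: search a small (p, d, q) grid, keep the minimum-AIC fit
best = None
for p in range(3):
    for q in range(3):
        fit = ARIMA(series, order=(p, 0, q)).fit()
        if best is None or fit.aic < best.aic:
            best = fit

print("selected order:", best.model.order, " AIC:", round(best.aic, 1))
# diagnostic checking: residuals should be serially uncorrelated (large Ljung-Box p-value)
print(acorr_ljungbox(best.resid, lags=[12]))
```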

  20. Cognitive neuropsychology and its vicissitudes: The fate of Caramazza's axioms.

    PubMed

    Shallice, Tim

    2015-01-01

    Cognitive neuropsychology is characterized as the discipline in which one draws conclusions about the organization of the normal cognitive systems from the behaviour of brain-damaged individuals. In a series of papers, Caramazza, later in collaboration with McCloskey, put forward four assumptions as the bridge principles for making such inferences. Four potential pitfalls, one for each axiom, are discussed with respect to the use of single-case methods. Two of the pitfalls also apply to case series and group study procedures, and the other two are held to be indirectly testable or avoidable. Moreover, four other pitfalls are held to apply to case series or group study methods. It is held that inferences from single-case procedures may profitably be supported or rejected using case series/group study methods, but also that analogous support needs to be given in the other direction for functionally based case series or group studies. It is argued that at least six types of neuropsychological method are valuable for extrapolation to theories of the normal cognitive system but that the single- or multiple-case study remains a critical part of cognitive neuropsychology's methods.

  1. Impacts of a mass vaccination campaign against pandemic H1N1 2009 influenza in Taiwan: a time-series regression analysis.

    PubMed

    Wu, Un-In; Wang, Jann-Tay; Chang, Shan-Chwen; Chuang, Yu-Chung; Lin, Wei-Ru; Lu, Min-Chi; Lu, Po-Liang; Hu, Fu-Chang; Chuang, Jen-Hsiang; Chen, Yee-Chun

    2014-06-01

    A multicenter, hospital-wide, clinical and epidemiological study was conducted to assess the effectiveness of the mass influenza vaccination program during the 2009 H1N1 influenza pandemic, and the impact of the prioritization strategy among people at different levels of risk. Among the 34 359 medically attended patients who displayed an influenza-like illness and had a rapid influenza diagnostic test (RIDT) at one of the three participating hospitals, 21.0% tested positive for influenza A. The highest daily number of RIDT-positive cases in each hospital ranged from 33 to 56. A well-fitted multiple linear regression time-series model (R2 = 0.89) showed that the establishment of special community flu clinics averted an average of nine cases daily (p = 0.005), and an increment of 10% in the daily mean level of population immunity against pH1N1 through vaccination prevented five cases daily (p < 0.001). Moreover, the regression model predicted five-fold or more RIDT-positive cases if the mass influenza vaccination program had not been implemented, and 39.1% more RIDT-positive cases if older adults had been prioritized for vaccination above school-aged children. Mass influenza vaccination was an effective control measure, and school-aged children should be assigned a higher priority for vaccination than older adults during an influenza pandemic. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Beyond linear fields: the Lie–Taylor expansion

    PubMed Central

    2017-01-01

    The work extends the linear fields’ solution of compressible nonlinear magnetohydrodynamics (MHD) to the case where the magnetic field depends on superlinear powers of position vector, usually, but not always, expressed in Cartesian components. Implications of the resulting Lie–Taylor series expansion for physical applicability of the Dolzhansky–Kirchhoff (D–K) equations are found to be positive. It is demonstrated how resistivity may be included in the D–K model. Arguments are put forward that the D–K equations may be regarded as illustrating properties of nonlinear MHD in the same sense that the Lorenz equations inform about the onset of convective turbulence. It is suggested that the Lie–Taylor series approach may lead to valuable insights into other fluid models. PMID:28265187

  3. Acoustic modeling and eigenanalysis of coupled rooms with a transparent coupling aperture of variable size

    NASA Astrophysics Data System (ADS)

    Shi, Shuangxia; Jin, Guoyong; Xiao, Bin; Liu, Zhigang

    2018-04-01

    This paper is concerned with the modeling and acoustic eigenanalysis of coupled spaces with a coupling aperture of variable size. A modeling method for this problem is developed based on the energy principle in combination with a 3D modified Fourier cosine series approach. Under this theoretical framework, the energy exchange property and acoustically transparent characteristics of the opening are taken into account via the inflow and outflow sound powers through the opening, without any assumptions. The sound pressure in the subrooms is constructed in the form of the three-dimensional modified Fourier series with several auxiliary functions introduced to ensure the uniform convergence of the solution over the entire solution domain. The accuracy of the natural frequencies and mode shapes of three exemplary coupled-room systems is verified against numerical data obtained by the finite element method, with good agreement achieved. The present method offers a unified procedure for a variety of cases because the modification of any parameter from one case to another, such as the size and location of the coupling aperture, is as simple as modifying the material properties, requiring no changes to the solution procedures.

  4. Truncation of Spherical Harmonic Series and its Influence on Gravity Field Modelling

    NASA Astrophysics Data System (ADS)

    Fecher, T.; Gruber, T.; Rummel, R.

    2009-04-01

    Least-squares adjustment is a very common and effective tool for the calculation of global gravity field models in terms of spherical harmonic series. However, since the gravity field is a continuous field function, its optimal representation by a finite series of spherical harmonics is connected with a set of fundamental problems. Particularly worth mentioning here are cut-off errors and aliasing effects. These problems stem from the truncation of the spherical harmonic series and from the fact that the spherical harmonic coefficients cannot be determined independently of each other within the adjustment process in the case of discrete observations. The latter is shown by the non-diagonal variance-covariance matrices of gravity field solutions. Sneeuw described in 1994 that the off-diagonal matrix elements - at least if data are equally weighted - are the result of a loss of orthogonality of Legendre polynomials on regular grids. The poster addresses questions arising from the truncation of spherical harmonic series in spherical harmonic analysis and synthesis. Such questions are: (1) How does the high-frequency data content (outside the parameter space) affect the estimated spherical harmonic coefficients? (2) Where should the spherical harmonic series be truncated in the adjustment process in order to avoid high-frequency leakage? (3) Given a set of spherical harmonic coefficients resulting from an adjustment, what is the effect of using only a truncated version of it?

  5. QSAR studies on carbonic anhydrase inhibitors: a case of ureido and thioureido derivatives of aromatic/heterocyclic sulfonamides.

    PubMed

    Agrawal, Vijay K; Sharma, Ruchi; Khadikar, Padmakar V

    2002-09-01

    QSAR studies on modelling of biological activity (hCAI) for a series of ureido and thioureido derivatives of aromatic/heterocyclic sulfonamides have been made using a pool of topological indices. Regression analysis of the data showed that excellent results were obtained in multiparametric correlations upon introduction of indicator parameters. The predictive abilities of the models are discussed using cross-validation parameters.

  6. Modeling malaria control intervention effect in KwaZulu-Natal, South Africa using intervention time series analysis.

    PubMed

    Ebhuoma, Osadolor; Gebreslasie, Michael; Magubane, Lethumusa

    The change of the malaria control intervention policy in South Africa (SA), the re-introduction of dichlorodiphenyltrichloroethane (DDT), may be responsible for the low and sustained malaria transmission in KwaZulu-Natal (KZN). We evaluated the effect of the re-introduction of DDT on malaria in KZN and suggested practical ways the province can strengthen its already existing malaria control and elimination efforts, to achieve zero malaria transmission. We obtained confirmed monthly malaria cases in KZN from the malaria control program of KZN from 1998 to 2014. The seasonal autoregressive integrated moving average (SARIMA) intervention time series analysis (ITSA) was employed to model the effect of the re-introduction of DDT on confirmed monthly malaria cases. The result is an abrupt and permanent decline of monthly malaria cases (w0 = -1174.781, p-value = 0.003) following the implementation of the intervention policy. The sustained low malaria cases observed over a long period suggest that the continued usage of DDT did not result in insecticide resistance as earlier anticipated. It may be due to exophagic malaria vectors, which render indoor residual spraying not totally effective. Therefore, the feasibility of reducing malaria transmission to zero in KZN requires other reliable and complementary intervention resources to optimize the existing ones. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Climatic Variables and Malaria Morbidity in Mutale Local Municipality, South Africa: A 19-Year Data Analysis

    PubMed Central

    Botai, Joel O.; Rautenbach, Hannes; Ncongwane, Katlego P.; Botai, Christina M.

    2017-01-01

    The north-eastern parts of South Africa, comprising the Limpopo Province, have recorded a sudden rise in the rate of malaria morbidity and mortality in the 2017 malaria season. The epidemiological profiles of malaria, as well as other vector-borne diseases, are strongly associated with climate and environmental conditions. A retrospective understanding of the relationship between climate and the occurrence of malaria may provide insight into the dynamics of the disease’s transmission and its persistence in the north-eastern region. In this paper, the association between climatic variables and the occurrence of malaria was studied in the Mutale local municipality in South Africa over a period of 19 years. Time series analysis was conducted on monthly climatic variables and monthly malaria cases in the Mutale municipality for the period of 1998–2017. Spearman correlation analysis was performed and the Seasonal Autoregressive Integrated Moving Average (SARIMA) model was developed. Microsoft Excel was used for data cleaning, and the statistical software R was used to analyse the data and develop the model. Results show that both the climatic variables’ and malaria cases’ time series exhibited seasonal patterns, showing a number of peaks and fluctuations. Spearman correlation analysis indicated that monthly total rainfall, mean minimum temperature, mean maximum temperature, mean average temperature, and mean relative humidity were significantly and positively correlated with monthly malaria cases in the study area. Regression analysis showed that monthly total rainfall and monthly mean minimum temperature (R2 = 0.65), at a two-month lagged effect, are the most significant climatic predictors of malaria transmission in Mutale local municipality. A SARIMA (2,1,2) (1,1,1) model fitted with only malaria cases has a prediction performance of about 51%, and the SARIMAX (2,1,2) (1,1,1) model with climatic variables as exogenous factors has a prediction performance of about 72% in malaria cases. The model gives a close comparison between the predicted and observed number of malaria cases, hence indicating that the model provides an acceptable fit to predict the number of malaria cases in the municipality. To sum up, the association between the climatic variables and malaria cases provides clues to better understand the dynamics of malaria transmission. The lagged effect detected in this study can help in adequate planning for malaria intervention. PMID:29117114
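
    A rough sketch of fitting a seasonal model with lagged climatic covariates as exogenous regressors, in the spirit of the SARIMAX(2,1,2)(1,1,1) model above (synthetic monthly data and hypothetical coefficients, not the Mutale dataset):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(7)
n = 19 * 12                                          # 19 years of monthly data
month = np.arange(n)
rain = 60 + 40 * np.sin(2 * np.pi * month / 12) + rng.normal(0, 10, n)
tmin = 15 + 6 * np.sin(2 * np.pi * month / 12) + rng.normal(0, 1, n)
# synthetic malaria counts driven by rainfall and minimum temperature at a two-month lag
cases = 20 + 0.3 * np.roll(rain, 2) + 1.5 * np.roll(tmin, 2) + rng.normal(0, 5, n)

df = pd.DataFrame({"cases": cases,
                   "rain_lag2": np.roll(rain, 2),
                   "tmin_lag2": np.roll(tmin, 2)}).iloc[2:]   # drop the wrapped-around rows

model = SARIMAX(df["cases"], exog=df[["rain_lag2", "tmin_lag2"]],
                order=(2, 1, 2), seasonal_order=(1, 1, 1, 12))
res = model.fit(disp=False)
print(res.summary().tables[1])                        # exogenous and (S)ARMA coefficients
next_exog = df[["rain_lag2", "tmin_lag2"]].iloc[[-1]]  # placeholder for next month's covariates
print("one-step-ahead forecast:", round(res.forecast(steps=1, exog=next_exog).iloc[0], 1))
```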

  8. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    PubMed

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  9. Highly comparative time-series analysis: the empirical structure of time series and their methods

    PubMed Central

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  10. An assessment of the ability of Bartlett-Lewis type of rainfall models to reproduce drought statistics

    NASA Astrophysics Data System (ADS)

    Pham, M. T.; Vanhaute, W. J.; Vandenberghe, S.; De Baets, B.; Verhoest, N. E. C.

    2013-12-01

    Of all natural disasters, the economic and environmental consequences of droughts are among the highest because of their longevity and widespread spatial extent. Because of their extreme behaviour, studying droughts generally requires long time series of historical climate data. Rainfall is a very important variable for calculating drought statistics, for quantifying historical droughts or for assessing the impact on other hydrological (e.g. water stage in rivers) or agricultural (e.g. irrigation requirements) variables. Unfortunately, time series of historical observations are often too short for such assessments. To circumvent this, one may rely on the synthetic rainfall time series from stochastic point process rainfall models, such as Bartlett-Lewis models. The present study investigates whether drought statistics are preserved when simulating rainfall with Bartlett-Lewis models. Therefore, a 105 yr 10 min rainfall time series obtained at Uccle, Belgium is used as a test case. First, drought events were identified on the basis of the Effective Drought Index (EDI), and each event was characterized by two variables, i.e. drought duration (D) and drought severity (S). As both parameters are interdependent, a multivariate distribution function, which makes use of a copula, was fitted. Based on the copula, four types of drought return periods are calculated for observed as well as simulated droughts and are used to evaluate the ability of the rainfall models to simulate drought events with the appropriate characteristics. Overall, all Bartlett-Lewis model types studied fail to preserve extreme drought statistics, which is attributed to the model structure and to the model stationarity caused by maintaining the same parameter set during the whole simulation period.
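
    The copula-based return-period calculation can be sketched as follows (synthetic duration/severity pairs, a Gumbel copula whose parameter is derived from Kendall's tau, and an assumed mean inter-event time; none of these values come from the Uccle record):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
# synthetic, positively dependent drought duration D (months) and severity S
z = rng.normal(size=(500, 2)) @ np.array([[1.0, 0.7], [0.0, 0.71]])
D = 3.0 * stats.gamma(2.0).ppf(stats.norm.cdf(z[:, 0]))
S = stats.gamma(2.5).ppf(stats.norm.cdf(z[:, 1]))

tau, _ = stats.kendalltau(D, S)
theta = 1.0 / (1.0 - tau)                      # Gumbel copula parameter from Kendall's tau

def gumbel_copula(u, v, theta):
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

# candidate event: duration > 9 months AND severity > 5 (empirical marginal probabilities)
d, s = 9.0, 5.0
u, v = np.mean(D <= d), np.mean(S <= s)
mu = 0.8                                       # assumed mean inter-event time in years
p_and = 1.0 - u - v + gumbel_copula(u, v, theta)   # P(D > d and S > s)
print(f"tau = {tau:.2f}, theta = {theta:.2f}, 'AND' return period ~ {mu / p_and:.1f} years")
```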

  11. A copula-based assessment of Bartlett-Lewis type of rainfall models for preserving drought statistics

    NASA Astrophysics Data System (ADS)

    Pham, M. T.; Vanhaute, W. J.; Vandenberghe, S.; De Baets, B.; Verhoest, N. E. C.

    2013-06-01

    Of all natural disasters, the economic and environmental consequences of droughts are among the highest because of their longevity and widespread spatial extent. Because of their extreme behaviour, studying droughts generally requires long time series of historical climate data. Rainfall is a very important variable for calculating drought statistics, for quantifying historical droughts or for assessing the impact on other hydrological (e.g. water stage in rivers) or agricultural (e.g. irrigation requirements) variables. Unfortunately, time series of historical observations are often too short for such assessments. To circumvent this, one may rely on the synthetic rainfall time series from stochastic point process rainfall models, such as Bartlett-Lewis models. The present study investigates whether drought statistics are preserved when simulating rainfall with Bartlett-Lewis models. Therefore, a 105 yr 10 min rainfall time series obtained at Uccle, Belgium is used as a test case. First, drought events were identified on the basis of the Effective Drought Index (EDI), and each event was characterized by two variables, i.e. drought duration (D) and drought severity (S). As both parameters are interdependent, a multivariate distribution function, which makes use of a copula, was fitted. Based on the copula, four types of drought return periods are calculated for observed as well as simulated droughts and are used to evaluate the ability of the rainfall models to simulate drought events with the appropriate characteristics. Overall, all Bartlett-Lewis types of models studied fail to preserve extreme drought statistics, which is attributed to the model structure and to the model stationarity caused by maintaining the same parameter set during the whole simulation period.

  12. Numerical predictions and measurements in the lubrication of aeronautical engine and transmission components

    NASA Astrophysics Data System (ADS)

    Moraru, Laurentiu Eugen

    2005-11-01

    This dissertation treats a variety of aspects of the lubrication of mechanical components encountered in aeronautical engines and transmissions. The study covers dual clearance squeeze film dampers, mixed elastohydrodynamic lubrication (EHL) cases and thermal elastohydrodynamic contacts. The dual clearance squeeze film damper (SFD) invented by Fleming is investigated both theoretically and experimentally for cases when the sleeve that separates the two oil films is free to float and for cases when the separating sleeve is supported by a squirrel cage. The Reynolds equation is developed to handle each of these cases and it is solved analytically for short bearings. A rotordynamic model of a test rig is developed for both the single and dual SFD cases. A computer code is written to calculate the motion of the test rig rotor. Experiments are performed in order to validate the theoretical results. Rotordynamics computations are found to agree favorably with measured data. A probabilistic model for mixed EHL is developed and implemented. Surface roughness of gears is measured and processed. The mixed EHL model incorporates the average flow model of Patir and Cheng and the elasto-plastic contact mechanics model of Chang, Etsion and Bogy. The current algorithm allows for the computation of the load supported by an oil film and for the load supported by the elasto-plastically deformed asperities. This work also presents a way to incorporate the effect of the fluid-induced roughness deformation by utilizing the "amplitude reduction" results provided by the deterministic analyses. The Lobatto point Gaussian integration algorithm of Elrod and Brewe was extended for thermal lubrication problems involving compressible lubricants and was implemented in thermal elastohydrodynamic cases. The unknown variables across the film are written in series of Legendre polynomials. The thermal Reynolds equation is obtained in terms of the series coefficients and it is proven that it can only explicitly contain the information from the first three Legendre polynomials. A computer code was written to implement the Lobatto point algorithm for an EHL line contact. Use of the Lobatto point calculation method has resulted in greater accuracy without the use of a larger number of grid points.

  13. Systematic review of the methodological and reporting quality of case series in surgery.

    PubMed

    Agha, R A; Fowler, A J; Lee, S-Y; Gundogan, B; Whitehurst, K; Sagoo, H K; Jeong, K J L; Altman, D G; Orgill, D P

    2016-09-01

    Case series are an important and common study type. No guideline exists for reporting case series and there is evidence of key data being missed from such reports. The first step in the process of developing a methodologically sound reporting guideline is a systematic review of literature relevant to the reporting deficiencies of case series. A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, Embase, Cochrane Methods Register, Science Citation Index and Conference Proceedings Citation index, from the start of indexing to 5 November 2014. Independent screening, eligibility assessments and data extraction were performed. Included articles were then analysed for five areas of deficiency: failure to use standardized definitions, missing or selective data (including the omission of whole cases or important variables), transparency or incomplete reporting, whether alternative study designs were considered, and other issues. Database searching identified 2205 records. Through the process of screening and eligibility assessments, 92 articles met inclusion criteria. Frequencies of methodological and reporting issues identified were: failure to use standardized definitions (57 per cent), missing or selective data (66 per cent), transparency or incomplete reporting (70 per cent), whether alternative study designs were considered (11 per cent) and other issues (52 per cent). The methodological and reporting quality of surgical case series needs improvement. The data indicate that evidence-based guidelines for the conduct and reporting of case series may be useful. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.

  14. A novel hybrid model for air quality index forecasting based on two-phase decomposition technique and modified extreme learning machine.

    PubMed

    Wang, Deyun; Wei, Shuai; Luo, Hongyuan; Yue, Chenqiang; Grunder, Olivier

    2017-02-15

    The randomness, non-stationarity and irregularity of air quality index (AQI) series bring the difficulty of AQI forecasting. To enhance forecast accuracy, a novel hybrid forecasting model combining two-phase decomposition technique and extreme learning machine (ELM) optimized by differential evolution (DE) algorithm is developed for AQI forecasting in this paper. In phase I, the complementary ensemble empirical mode decomposition (CEEMD) is utilized to decompose the AQI series into a set of intrinsic mode functions (IMFs) with different frequencies; in phase II, in order to further handle the high frequency IMFs which will increase the forecast difficulty, variational mode decomposition (VMD) is employed to decompose the high frequency IMFs into a number of variational modes (VMs). Then, the ELM model optimized by DE algorithm is applied to forecast all the IMFs and VMs. Finally, the forecast value of each high frequency IMF is obtained through adding up the forecast results of all corresponding VMs, and the forecast series of AQI is obtained by aggregating the forecast results of all IMFs. To verify and validate the proposed model, two daily AQI series from July 1, 2014 to June 30, 2016 collected from Beijing and Shanghai located in China are taken as the test cases to conduct the empirical study. The experimental results show that the proposed hybrid model based on two-phase decomposition technique is remarkably superior to all other considered models for its higher forecast accuracy. Copyright © 2016 Elsevier B.V. All rights reserved.
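
    Of the building blocks above, the extreme learning machine itself is simple to sketch: a fixed random hidden layer with a least-squares readout. The decomposition stages (CEEMD/VMD) and the DE optimization are omitted here, and the series is synthetic rather than real AQI data:

```python
import numpy as np

rng = np.random.default_rng(9)

class ELMRegressor:
    """Minimal extreme learning machine: random hidden layer, least-squares readout."""
    def __init__(self, n_hidden=50):
        self.n_hidden = n_hidden

    def fit(self, X, y):
        self.W = rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        self.beta = np.linalg.lstsq(H, y, rcond=None)[0]
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# toy "AQI-like" series and a lagged-embedding one-step-ahead forecasting setup
t = np.arange(800)
series = 80 + 20 * np.sin(2 * np.pi * t / 30) + 5 * rng.normal(size=t.size)
lags = 7
X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
y = series[lags:]

split = 600
elm = ELMRegressor().fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((elm.predict(X[split:]) - y[split:]) ** 2))
print("one-step-ahead RMSE on held-out data:", round(float(rmse), 2))
```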

  15. Forecasting landslide activations by means of GA-SAKe. An example of application to three case studies in Calabria (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Iovine, Giulio G. R.; De Rango, Alessio; Gariano, Stefano L.; Terranova, Oreste G.

    2016-04-01

    GA-SAKe - the Genetic-Algorithm based release of the hydrological model SAKe (Self Adaptive Kernel) - allows the timing of activation of landslides to be forecast [1, 2], based on dates of landslide activations and rainfall series. The model can be applied to either a single landslide or a set of similar landslides in a homogeneous context. Calibration of the model is performed through a Genetic Algorithm, and provides families of optimal, discretized solutions (kernels) that maximize the fitness function. The mobility functions are obtained through convolution of the optimal kernels with the rain series. The shape of the kernel, including its base time, is related to the magnitude of the landslide and the hydro-geological complexity of the slope. Once validated, the model can be applied to estimate the timing of future landslide activations in the same study area, by employing measured or forecasted rainfall. GA-SAKe is here employed to analyse the historical activations of three rock slides in Calabria (Southern Italy) threatening villages and main infrastructure. In particular: 1) the Acri-Serra di Buda case, developed within a Sackung, involving weathered crystalline and metamorphic rocks; for this case study, 6 dates of activation are available; 2) the San Fili-Uncino case, developed in clay and conglomerate overlying gneiss and biotitic schist; for this case study, 7 dates of activation are available [2]; 3) the San Benedetto Ullano-San Rocco case, developed in weathered metamorphic rocks; for this case study, 3 dates of activation are available [1, 3, 4, 5]. The obtained results are quite promising, given the high performance of the model for slope movements characterized by numerous historical activations. The results, in terms of shape and base time of the kernels, are compared by taking into account the types and sizes of the considered case studies, and the rock types involved. References [1] Terranova O.G., Iaquinta P., Gariano S.L., Greco R. & Iovine G. (2013) In: Landslide Science and Practice, Margottini, Canuti, Sassa (Eds.), Vol. 3, pp. 73-79. [2] Terranova O.G., Gariano S.L., Iaquinta P. & Iovine G.G.R. (2015). Geosci. Model Dev., 8, 1955-1978. [3] Iovine G., Iaquinta P. & Terranova O. (2009). In Anderssen, Braddock & Newham (Eds.), Proc. 18th World IMACS Congr. and MODSIM09 Int. Congr. on Modelling and Simulation, pp. 2686-2693. [4] Iovine G., Lollino P., Gariano S.L. & Terranova O.G. (2010). NHESS, 10, 2341-2354. [5] Capparelli G., Iaquinta P., Iovine G., Terranova O.G. & Versace P. (2012). Natural Hazards, 61(1), pp. 247-256.
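
    The core operation of this approach, convolving a calibrated kernel with a rainfall series to obtain a mobility function whose peaks mark likely activations, can be sketched as follows (a hypothetical triangular kernel and synthetic rainfall, not the calibrated GA-SAKe kernels):

```python
import numpy as np

rng = np.random.default_rng(10)
n_days = 3 * 365
rain = rng.gamma(shape=0.4, scale=8.0, size=n_days)      # synthetic daily rainfall (mm)

# hypothetical discretized kernel: recent rain weighs more, base time of 90 days
base_time = 90
kernel = np.linspace(1.0, 0.0, base_time)
kernel /= kernel.sum()

# mobility function: causal convolution of the kernel with the rainfall series
mobility = np.convolve(rain, kernel, mode="full")[:n_days]

threshold = np.quantile(mobility, 0.99)
alert_days = np.where(mobility >= threshold)[0]
print("days exceeding the mobility threshold:", alert_days[:10], "...")
```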

  16. Vertical and pitching resonance of train cars moving over a series of simple beams

    NASA Astrophysics Data System (ADS)

    Yang, Y. B.; Yau, J. D.

    2015-02-01

    The resonant response, including both vertical and pitching motions, of an undamped sprung mass unit moving over a series of simple beams is studied by a semi-analytical approach. For a sprung mass that is very small compared with the beam, we first simplify the sprung mass as a constant moving force and obtain the response of the beam in closed form. With this, we then solve for the response of the sprung mass passing over a series of simple beams, and validate the solution by an independent finite element analysis. To evaluate the pitching resonance, we consider the cases of a two-axle model and a coach model traveling over rough rails supported by a series of simple beams. The resonance of a train car is characterized by the fact that its response continues to build up, as it travels over more and more beams. For train cars with long axle intervals, the vertical acceleration induced by pitching resonance dominates the peak response of the train traveling over a series of simple beams. The present semi-analytical study allows us to grasp the key parameters involved in the primary/sub-resonant responses. Other phenomena of resonance are also discussed in the exemplar study.

  17. Wind Field Extractions from SAR Sentinel-1 Images Using Electromagnetic Models

    NASA Astrophysics Data System (ADS)

    La, Tran Vu; Khenchaf, Ali; Comblet, Fabrice; Nahum, Carole

    2016-08-01

    Among available wind sources, i.e. measured data and numerical weather models, the retrieval of wind vectors from Synthetic Aperture Radar (SAR) data/images is particularly preferred owing to the advantages of SAR systems (data available in most meteorological conditions, revisit capability, high resolution, etc.). For this purpose, the retrieval of wind vectors is principally based on empirical (EP) models, e.g. the CMOD series in C-band. Few studies have been reported on the use of electromagnetic (EM) models for wind vector retrieval, since they are quite complicated to invert. However, the EM models can be applied for most cases of polarization, frequency and wind regime. In order to evaluate the advantages and limits of the EM models for wind vector retrieval, we compare in this study the results estimated by the EM and EP models for both cases of polarization (vertical-vertical, or VV-pol, and horizontal-horizontal, or HH-pol).

  18. Therapeutic Assessment of Complex Trauma: A Single-Case Time-Series Study.

    PubMed

    Tarocchi, Anna; Aschieri, Filippo; Fantini, Francesca; Smith, Justin D

    2013-06-01

    The cumulative effect of repeated traumatic experiences in early childhood incrementally increases the risk of adjustment problems later in life. Surviving traumatic environments can lead to the development of an interrelated constellation of emotional and interpersonal symptoms termed complex posttraumatic stress disorder (CPTSD). Effective treatment of trauma begins with a multimethod psychological assessment and requires the use of several evidence-based therapeutic processes, including establishing a safe therapeutic environment, reprocessing the trauma, constructing a new narrative, and managing emotional dysregulation. Therapeutic Assessment (TA) is a semistructured, brief intervention that uses psychological testing to promote positive change. The case study of Kelly, a middle-aged woman with a history of repeated interpersonal trauma, illustrates delivery of the TA model for CPTSD. Results of this single-case time-series experiment indicate statistically significant symptom improvement as a result of participating in TA. We discuss the implications of these findings for assessing and treating trauma-related concerns, such as CPTSD.

  19. Regiospecific Ester Hydrolysis by Orange Peel Esterase - An Undergraduate Experiment.

    NASA Astrophysics Data System (ADS)

    Bugg, Timothy D. H.; Lewin, Andrew M.; Catlin, Eric R.

    1997-01-01

    A simple but effective experiment has been developed to demonstrate the regiospecificity of enzyme catalysis using an esterase activity easily isolated from orange peel. The experiment involves the preparation of diester derivatives of para-, meta- and ortho-hydroxybenzoic acid (e.g. methyl 4-acetoxy-benzoic acid). The derivatives are incubated with orange peel esterase, as a crude extract, and with commercially available pig liver esterase and porcine pancreatic lipase. The enzymatic hydrolysis reactions are monitored by thin layer chromatography, revealing which of the two ester groups is hydrolysed, and the rate of the enzyme-catalysed reaction. The results of a group experiment revealed that in all cases hydrolysis was observed with at least one enzyme, and in most cases the enzymatic hydrolysis was specific for production of either the hydroxy-ester or acyl-acid product. Specificity towards the ortho-substituted series was markedly different to that of the para-substituted series, which could be rationalised in the case of pig liver esterase by a published active site model.

  20. Therapeutic Assessment of Complex Trauma: A Single-Case Time-Series Study

    PubMed Central

    Tarocchi, Anna; Aschieri, Filippo; Fantini, Francesca; Smith, Justin D.

    2013-01-01

    The cumulative effect of repeated traumatic experiences in early childhood incrementally increases the risk of adjustment problems later in life. Surviving traumatic environments can lead to the development of an interrelated constellation of emotional and interpersonal symptoms termed complex posttraumatic stress disorder (CPTSD). Effective treatment of trauma begins with a multimethod psychological assessment and requires the use of several evidence-based therapeutic processes, including establishing a safe therapeutic environment, reprocessing the trauma, constructing a new narrative, and managing emotional dysregulation. Therapeutic Assessment (TA) is a semistructured, brief intervention that uses psychological testing to promote positive change. The case study of Kelly, a middle-aged woman with a history of repeated interpersonal trauma, illustrates delivery of the TA model for CPTSD. Results of this single-case time-series experiment indicate statistically significant symptom improvement as a result of participating in TA. We discuss the implications of these findings for assessing and treating trauma-related concerns, such as CPTSD. PMID:24159267

  1. Bacterial molecular networks: bridging the gap between functional genomics and dynamical modelling.

    PubMed

    van Helden, Jacques; Toussaint, Ariane; Thieffry, Denis

    2012-01-01

    This introductory review synthesizes the contents of the volume Bacterial Molecular Networks of the series Methods in Molecular Biology. This volume gathers 9 reviews and 16 method chapters describing computational protocols for the analysis of metabolic pathways, protein interaction networks, and regulatory networks. Each protocol is documented by concrete case studies dedicated to model bacteria or interacting populations. Altogether, the chapters provide a representative overview of state-of-the-art methods for data integration and retrieval, network visualization, graph analysis, and dynamical modelling.

  2. Queueing Network Models for Parallel Processing of Task Systems: an Operational Approach

    NASA Technical Reports Server (NTRS)

    Mak, Victor W. K.

    1986-01-01

    Computer performance modeling of possibly complex computations running on highly concurrent systems is considered. Earlier works in this area either dealt with a very simple program structure or resulted in methods with exponential complexity. An efficient procedure is developed to compute the performance measures for series-parallel-reducible task systems using queueing network models. The procedure is based on the concept of hierarchical decomposition and a new operational approach. Numerical results for three test cases are presented and compared to those of simulations.

  3. Predictive models of alcohol use based on attitudes and individual values.

    PubMed

    García del Castillo Rodríguez, José A; López-Sánchez, Carmen; Quiles Soler, M Carmen; García del Castillo-López, Alvaro; Gázquez Pertusa, Mónica; Marzo Campos, Juan Carlos; Inglés, Candido J

    2013-01-01

    Two predictive models are developed in this article: the first is designed to predict people's attitudes to alcoholic drinks, while the second sets out to predict the use of alcohol in relation to selected individual values. University students (N = 1,500) were recruited through stratified sampling based on sex and academic discipline. The questionnaire used obtained information on participants' alcohol use, attitudes and personal values. The results show that the attitudes model correctly classifies 76.3% of cases. Likewise, the model for level of alcohol use correctly classifies 82% of cases. According to our results, we can conclude that there are a series of individual values that influence drinking and attitudes to alcohol use, which therefore provides us with a potentially powerful instrument for developing preventive intervention programs.

  4. Identification of Boolean Network Models From Time Series Data Incorporating Prior Knowledge.

    PubMed

    Leifeld, Thomas; Zhang, Zhihua; Zhang, Ping

    2018-01-01

    Motivation: Mathematical models take an important place in science and engineering. A model can help scientists to explain the dynamic behavior of a system and to understand the functionality of system components. Since the length of a time series and the number of replicates are limited by the cost of experiments, Boolean networks, as a structurally simple and parameter-free logical model for gene regulatory networks, have attracted the interest of many scientists. In order to fit into the biological context and to lower the data requirements, biological prior knowledge is taken into consideration during the inference procedure. In the literature, the existing identification approaches can only deal with a subset of possible types of prior knowledge. Results: We propose a new approach to identify Boolean networks from time series data incorporating prior knowledge, such as partial network structure, canalizing property, and positive and negative unateness. Using the vector form of Boolean variables and applying a generalized matrix multiplication called the semi-tensor product (STP), each Boolean function can be equivalently converted into a matrix expression. Based on this, the identification problem is reformulated as an integer linear programming problem to reveal the system matrix of the Boolean model in a computationally efficient way, whose dynamics are consistent with the important dynamics captured in the data. By using prior knowledge, the number of candidate functions can be reduced during the inference. Hence, identification incorporating prior knowledge is especially suitable for the case of small-size time series data and data without sufficient stimuli. The proposed approach is illustrated with the help of a biological model of the network of oxidative stress response. Conclusions: The combination of an efficient reformulation of the identification problem with the possibility to incorporate various types of prior knowledge enables the application of computational model inference to systems with a limited amount of time series data. The general applicability of this methodological approach makes it suitable for a variety of biological systems and of general interest for biological and medical research.
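
    A toy illustration of how prior knowledge shrinks the candidate set (brute-force enumeration over two-input Boolean functions rather than the STP/integer-programming formulation; the transitions and the unateness prior are made up for the example):

```python
from itertools import product

# observed transitions of x1 given (x1, x2) -- a short, incomplete time series
data = [((0, 0), 0), ((0, 1), 0), ((1, 1), 1)]          # the input (1, 0) was never observed

def consistent(table):
    return all(table[inp] == out for inp, out in data)

def negatively_unate_in_x2(table):
    # assumed prior knowledge: raising x2 never raises the output
    return all(table[(a, 1)] <= table[(a, 0)] for a in (0, 1))

tables = [dict(zip(product((0, 1), repeat=2), outs))
          for outs in product((0, 1), repeat=4)]          # all 16 two-input Boolean functions

data_only = [t for t in tables if consistent(t)]
with_prior = [t for t in data_only if negatively_unate_in_x2(t)]
print(len(data_only), "candidates from data alone;", len(with_prior), "after adding the prior")
```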

  5. Scaling symmetry, renormalization, and time series modeling: the case of financial assets dynamics.

    PubMed

    Zamparo, Marco; Baldovin, Fulvio; Caraglio, Michele; Stella, Attilio L

    2013-12-01

    We present and discuss a stochastic model of financial assets dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous autoregressive component and a random rescaling factor designed to embody also exogenous influences. Mathematical properties like increments' stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal, and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance, in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with autoregressive models widely used in finance and the possibility of partially resolving the long- and short-memory components of the volatility, with consistent results when applied to historical series.

  6. Scaling symmetry, renormalization, and time series modeling: The case of financial assets dynamics

    NASA Astrophysics Data System (ADS)

    Zamparo, Marco; Baldovin, Fulvio; Caraglio, Michele; Stella, Attilio L.

    2013-12-01

    We present and discuss a stochastic model of financial assets dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous autoregressive component and a random rescaling factor designed to embody also exogenous influences. Mathematical properties like increments’ stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal, and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance, in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with autoregressive models widely used in finance and the possibility of partially resolving the long- and short-memory components of the volatility, with consistent results when applied to historical series.

  7. Forecasting the number of zoonotic cutaneous leishmaniasis cases in south of Fars province, Iran using seasonal ARIMA time series method.

    PubMed

    Sharafi, Mehdi; Ghaem, Haleh; Tabatabaee, Hamid Reza; Faramarzi, Hossein

    2017-01-01

    To predict the trend of cutaneous leishmaniasis and assess the relationship between the disease trend and weather variables in the south of Fars province using the Seasonal Autoregressive Integrated Moving Average (SARIMA) model. The trend of cutaneous leishmaniasis was predicted using Minitab software and the SARIMA model. Information about the disease and weather conditions was collected monthly, following a time series design, from January 2010 to March 2016. Various SARIMA models were assessed and the best one was selected. The model's fitness was then evaluated based on the normality of the residuals' distribution, the correspondence between the fitted and observed values, and the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). The study results indicated that the SARIMA(4,1,4)(0,1,0)(12) model overall, and the SARIMA(4,1,4)(0,1,1)(12) model in the under-15 and over-15 years age groups, could appropriately predict the disease trend in the study area. Moreover, temperature with a three-month delay (lag 3) increased the disease trend, rainfall with a four-month delay (lag 4) decreased the disease trend, and rainfall with a nine-month delay (lag 9) increased the disease trend. Based on the results, leishmaniasis follows a descending trend in the study area if drought conditions continue, SARIMA models can suitably capture the disease trend, and the disease follows a seasonal pattern. Copyright © 2017 Hainan Medical University. Production and hosting by Elsevier B.V. All rights reserved.
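
    A seasonal ARIMA model of the kind reported here can be fitted with statsmodels; the sketch below uses synthetic monthly case counts (the real surveillance data are not reproduced) and the overall order reported in the abstract.

      import numpy as np
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      # hypothetical monthly case counts standing in for the Jan 2010 - Mar 2016 series
      rng = np.random.default_rng(0)
      months = 75
      cases = np.round(10 + 8 * np.sin(2 * np.pi * np.arange(months) / 12)
                       + rng.normal(0, 2, months)).clip(min=0)

      # SARIMA(4,1,4)(0,1,0)_12, the overall model selected in the abstract
      fit = SARIMAX(cases, order=(4, 1, 4), seasonal_order=(0, 1, 0, 12)).fit(disp=False)
      print(fit.aic, fit.bic)          # information criteria used for model comparison
      print(fit.forecast(steps=12))    # 12-month-ahead forecast of case counts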

  8. A combined treatment approach emphasizing impairment-based manual physical therapy for plantar heel pain: a case series.

    PubMed

    Young, Brian; Walker, Michael J; Strunce, Joseph; Boyles, Robert

    2004-11-01

    Case series. To describe an impairment-based physical therapy treatment approach for 4 patients with plantar heel pain. There is limited evidence from clinical trials on which to base treatment decision making for plantar heel pain. Four patients completed a course of physical therapy based on an impairment-based model. All patients received manual physical therapy and stretching. Two patients were also treated with custom orthoses, and 1 patient received an additional strengthening program. Outcome measures included a numeric pain rating scale (NPRS) and self-reported functional status. Symptom duration ranged from 6 to 52 weeks (mean duration ± SD, 33 ± 19 weeks). Treatment duration ranged from 8 to 49 days (mean duration ± SD, 23 ± 18 days), with number of treatment sessions ranging from 2 to 7 (mode, 3). All 4 patients reported a decrease in NPRS scores from an average (± SD) of 5.8 ± 2.2 to 0 (out of 10) during previously painful activities. Additionally, all patients returned to prior activity levels. In this case series, patients with plantar heel pain treated with an impairment-based physical therapy approach emphasizing manual therapy demonstrated complete pain relief and full return to activities. Further research is necessary to determine the effectiveness of impairment-based physical therapy interventions for patients with plantar heel pain/plantar fasciitis.

  9. Effectiveness of chronic care models: opportunities for improving healthcare practice and health outcomes: a systematic review.

    PubMed

    Davy, Carol; Bleasel, Jonathan; Liu, Hueiming; Tchan, Maria; Ponniah, Sharon; Brown, Alex

    2015-05-10

    The increasing prevalence of chronic disease and even multiple chronic diseases faced by both developed and developing countries is of considerable concern. Many of the interventions to address this within primary healthcare settings are based on a chronic care model first developed by the MacColl Institute for Healthcare Innovation at Group Health Cooperative. This systematic literature review aimed to identify and synthesise international evidence on the effectiveness of elements that have been included in a chronic care model for improving healthcare practices and health outcomes within primary healthcare settings. The review broadens the work of other similar reviews by focusing on the effectiveness of healthcare practice as well as health outcomes associated with implementing a chronic care model. In addition, relevant case series and case studies were also included. Of the 77 papers which met the inclusion criteria, all but two reported improvements to healthcare practice or health outcomes for people living with chronic disease. While the most commonly used elements of a chronic care model were self-management support and delivery system design, there were considerable variations between studies regarding which combination of elements was included, as well as the way in which chronic care model elements were implemented. This meant that it was impossible to clearly identify any optimal combination of chronic care model elements that led to the reported improvements. While the main argument for excluding papers reporting case studies and case series in systematic literature reviews is that they are not of sufficient quality or generalizability, we found that they provided a more detailed account of how various chronic care models were developed and implemented. In particular, these papers suggested that several factors including supporting reflective healthcare practice, sending clear messages about the importance of chronic disease care and ensuring that leaders support the implementation and sustainability of interventions may have been just as important as a chronic care model's elements in contributing to the improvements in healthcare practice or health outcomes for people living with chronic disease.

  10. Chaos and Forecasting - Proceedings of the Royal Society Discussion Meeting

    NASA Astrophysics Data System (ADS)

    Tong, Howell

    1995-04-01

    The Table of Contents for the full book PDF is as follows: * Preface * Orthogonal Projection, Embedding Dimension and Sample Size in Chaotic Time Series from a Statistical Perspective * A Theory of Correlation Dimension for Stationary Time Series * On Prediction and Chaos in Stochastic Systems * Locally Optimized Prediction of Nonlinear Systems: Stochastic and Deterministic * A Poisson Distribution for the BDS Test Statistic for Independence in a Time Series * Chaos and Nonlinear Forecastability in Economics and Finance * Paradigm Change in Prediction * Predicting Nonuniform Chaotic Attractors in an Enzyme Reaction * Chaos in Geophysical Fluids * Chaotic Modulation of the Solar Cycle * Fractal Nature in Earthquake Phenomena and its Simple Models * Singular Vectors and the Predictability of Weather and Climate * Prediction as a Criterion for Classifying Natural Time Series * Measuring and Characterising Spatial Patterns, Dynamics and Chaos in Spatially-Extended Dynamical Systems and Ecologies * Non-Linear Forecasting and Chaos in Ecology and Epidemiology: Measles as a Case Study

  11. Multiresolution analysis of Bursa Malaysia KLCI time series

    NASA Astrophysics Data System (ADS)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using both time and frequency domain analysis, after which prediction can be carried out for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the aid of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), is used to pinpoint special characteristics of the Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of the Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
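
    The wavelet decomposition step can be illustrated with PyWavelets; the sketch below applies a multilevel DWT to a synthetic price series standing in for the KLCI data (MODWT is not part of PyWavelets and would need a different implementation).

      import numpy as np
      import pywt

      # hypothetical daily closing prices: a random walk stands in for the KLCI series
      rng = np.random.default_rng(1)
      prices = 1600 + np.cumsum(rng.normal(0, 5, 1024))
      returns = np.diff(np.log(prices))

      # 4-level discrete wavelet transform of the return series
      coeffs = pywt.wavedec(returns, wavelet='db4', level=4)
      labels = ['approximation A4', 'detail D4', 'detail D3', 'detail D2', 'detail D1']
      for name, c in zip(labels, coeffs):
          print(f"{name}: {len(c)} coefficients, energy {np.sum(c ** 2):.4f}")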

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferraioli, Luigi; Hueller, Mauro; Vitale, Stefano

    The scientific objectives of the LISA Technology Package experiment on board the LISA Pathfinder mission demand accurate calibration and validation of the data analysis tools in advance of the mission launch. The level of confidence required in the mission outcomes can be reached only by intensively testing the tools on synthetically generated data. A flexible procedure allowing the generation of a cross-correlated stationary noise time series was set up. A multichannel time series with the desired cross-correlation behavior can be generated once a model for a multichannel cross-spectral matrix is provided. The core of the procedure comprises a noise coloring, multichannel filter designed via a frequency-by-frequency eigendecomposition of the model cross-spectral matrix and a subsequent fit in the Z domain. The common problem of initial transients in a filtered time series is solved with a proper initialization of the filter recursion equations. The noise generator performance was tested in a two-dimensional case study of the closed-loop LISA Technology Package dynamics along the two principal degrees of freedom.
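
    The frequency-by-frequency factorization idea can be sketched directly in the frequency domain, without the Z-domain filter fit used in the actual pipeline; the cross-spectral model and all parameter values below are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(2)
      N = 4096
      freqs = np.fft.rfftfreq(N, d=1.0)

      def csd_model(f):
          """Toy 2x2 cross-spectral matrix: two 1/f-like channels with 0.6 coherence."""
          f = max(f, 1.0 / N)                    # avoid the DC singularity
          p = 1.0 / f
          return np.array([[p, 0.6 * p], [0.6 * p, p]])

      # independent complex white noise for each channel in the frequency domain
      W = (rng.normal(size=(2, freqs.size)) + 1j * rng.normal(size=(2, freqs.size))) / np.sqrt(2)

      X = np.zeros_like(W)
      for k, f in enumerate(freqs):
          vals, vecs = np.linalg.eigh(csd_model(f))      # frequency-by-frequency eigendecomposition
          X[:, k] = (vecs @ np.diag(np.sqrt(np.maximum(vals, 0)))) @ W[:, k]

      x = np.fft.irfft(X, n=N)                           # two cross-correlated colored time series
      print(np.corrcoef(x[0], x[1])[0, 1])               # close to the prescribed coherence of 0.6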

  13. Scale invariance in chaotic time series: Classical and quantum examples

    NASA Astrophysics Data System (ADS)

    Landa, Emmanuel; Morales, Irving O.; Stránský, Pavel; Fossion, Rubén; Velázquez, Victor; López Vieyra, J. C.; Frank, Alejandro

    Important aspects of chaotic behavior appear in systems of low dimension, as illustrated by the map mod 1. It is indeed a remarkable fact that all systems that make a transition from order to disorder display common properties, irrespective of their exact functional form. We discuss evidence for 1/f power spectra in the chaotic time series associated with classical and quantum examples, the one-dimensional map mod 1 and the spectrum of 48Ca. A Detrended Fluctuation Analysis (DFA) method is applied to investigate the scaling properties of the energy fluctuations in the spectrum of 48Ca, obtained with a large realistic shell model calculation (ANTOINE code) and with a random shell model (TBRE) calculation, as well as in the time series obtained with the map mod 1. We compare the scale invariant properties of the 48Ca nuclear spectrum with similar analyses applied to the RMT ensembles GOE and GDE. A comparison with the corresponding power spectra is made in both cases. The possible consequences of the results are discussed.
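
    Detrended Fluctuation Analysis itself is straightforward to implement; the sketch below is a minimal NumPy version applied to white noise (expected scaling exponent near 0.5), not to the nuclear spectra analysed in the abstract.

      import numpy as np

      def dfa(x, scales):
          """Detrended fluctuation analysis: fluctuation function F(n) for each window size n."""
          y = np.cumsum(x - np.mean(x))                    # integrated profile
          F = []
          for n in scales:
              t = np.arange(n)
              ms = []
              for i in range(len(y) // n):
                  seg = y[i * n:(i + 1) * n]
                  trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                  ms.append(np.mean((seg - trend) ** 2))
              F.append(np.sqrt(np.mean(ms)))
          return np.array(F)

      rng = np.random.default_rng(3)
      x = rng.normal(size=4096)                            # white noise test signal
      scales = np.array([16, 32, 64, 128, 256])
      alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
      print(f"estimated scaling exponent: {alpha:.2f}")    # ~0.5 for uncorrelated noise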

  14. Stereotypes of autism

    PubMed Central

    Draaisma, Douwe

    2009-01-01

    In their landmark papers, both Kanner and Asperger employed a series of case histories to shape clinical insight into autistic disorders. This way of introducing, assessing and representing disorders has disappeared from today's psychiatric practice, yet it offers a convincing model of the way stereotypes may build up as a result of representations of autism. Considering that much of what society at large learns about disorders on the autism spectrum is produced by representations of autism in novels, TV-series, movies or autobiographies, it will be of vital importance to scrutinize these representations and to check whether or not they are, in fact, misrepresenting autism. In quite a few cases, media representations of talent and special abilities can be said to have contributed to a harmful divergence between the general image of autism and the clinical reality of the autistic condition. PMID:19528033

  15. Stereotypes of autism.

    PubMed

    Draaisma, Douwe

    2009-05-27

    In their landmark papers, both Kanner and Asperger employed a series of case histories to shape clinical insight into autistic disorders. This way of introducing, assessing and representing disorders has disappeared from today's psychiatric practice, yet it offers a convincing model of the way stereotypes may build up as a result of representations of autism. Considering that much of what society at large learns about disorders on the autism spectrum is produced by representations of autism in novels, TV-series, movies or autobiographies, it will be of vital importance to scrutinize these representations and to check whether or not they are, in fact, misrepresenting autism. In quite a few cases, media representations of talent and special abilities can be said to have contributed to a harmful divergence between the general image of autism and the clinical reality of the autistic condition.

  16. Produsage in hybrid networks: sociotechnical skills in the case of Arduino

    NASA Astrophysics Data System (ADS)

    De Paoli, Stefano; Storni, Cristiano

    2011-04-01

    In this paper we investigate produsage using Actor-Network Theory with a focus on (produsage) skills, their development, and transformation. We argue that produsage is not a model that determines a change in the traditional consumption/production paradigm through a series of essential preconditions (such as open participation, peer-sharing, or common ownership). Rather, we explain produsage as the open-ended result of a series of heterogeneous actor-networking strategies. In this view, the so-called preconditions do not explain produsage but have to be explained along with its establishment as an actor-network. Drawing on this approach, we discuss a case study of an open hardware project: the Arduino board, and we develop a perspective that maps the skills of human and non-human entities in produsage actor-networks, showing how skills are symmetrical, relational, and circulating.

  17. Contraction Options and Optimal Multiple-Stopping in Spectrally Negative Lévy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamazaki, Kazutoshi, E-mail: kyamazak@kansai-u.ac.jp

    This paper studies the optimal multiple-stopping problem arising in the context of the timing option to withdraw from a project in stages. The profits are driven by a general spectrally negative Lévy process. This allows the model to incorporate sudden declines of the project values, generalizing greatly the classical geometric Brownian motion model. We solve the one-stage case as well as the extension to the multiple-stage case. The optimal stopping times are of threshold-type and the value function admits an expression in terms of the scale function. A series of numerical experiments are conducted to verify the optimality and to evaluate the efficiency of the algorithm.

  18. Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals.

    PubMed

    Hedayatifar, L; Vahabi, M; Jafari, G R

    2011-08-01

    When many variables are coupled to each other, a single case study cannot give thorough and precise information. When these time series are stationary, different methods of random matrix analysis and complex networks can be used. In nonstationary cases, however, the multifractal-detrended-cross-correlation-analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we have extended the MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated to each other. Here, we have calculated the multifractal properties of the coupled time series, and by comparing CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of multifractality and the extent to which our series are coupled to each other. We illustrate the method by selected examples from air pollution and foreign exchange rates.

  19. Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals

    NASA Astrophysics Data System (ADS)

    Hedayatifar, L.; Vahabi, M.; Jafari, G. R.

    2011-08-01

    When many variables are coupled to each other, a single case study cannot give thorough and precise information. When these time series are stationary, different methods of random matrix analysis and complex networks can be used. In nonstationary cases, however, the multifractal-detrended-cross-correlation-analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we have extended the MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated to each other. Here, we have calculated the multifractal properties of the coupled time series, and by comparing CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of multifractality and the extent to which our series are coupled to each other. We illustrate the method by selected examples from air pollution and foreign exchange rates.
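
    For the two-series case, the detrended cross-correlation building block that CDFA generalizes can be sketched as below; the coupled test signals are synthetic and the implementation is a simplified illustration, not the authors' code.

      import numpy as np

      def dcca_f(x, y, n):
          """Detrended cross-correlation fluctuation F_xy(n) for window size n."""
          X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
          t = np.arange(n)
          covs = []
          for i in range(len(X) // n):
              xs, ys = X[i * n:(i + 1) * n], Y[i * n:(i + 1) * n]
              rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # local linear detrending
              ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
              covs.append(np.mean(rx * ry))
          return np.sqrt(abs(np.mean(covs)))

      rng = np.random.default_rng(4)
      common = rng.normal(size=4000)                          # shared driver couples the two series
      x = common + 0.5 * rng.normal(size=4000)
      y = common + 0.5 * rng.normal(size=4000)
      for n in (16, 32, 64, 128):
          print(n, round(dcca_f(x, y, n), 3))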

  20. Fractional Brownian motion time-changed by gamma and inverse gamma process

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Wyłomańska, A.; Połoczański, R.; Sundar, S.

    2017-02-01

    Many real time series exhibit behavior characteristic of long-range dependent data. Very often these time series also have constant time periods and characteristics similar to Gaussian processes, although they are not Gaussian. There is therefore a need to consider new classes of systems to model this kind of empirical behavior. Motivated by this fact, in this paper we analyze two processes which exhibit the long range dependence property and have additional interesting characteristics which may be observed in real phenomena. Both of them are constructed as the superposition of fractional Brownian motion (FBM) and another process. In the first case the internal process, which plays the role of time, is the gamma process, while in the second case the internal process is its inverse. We present their main properties in detail, paying particular attention to the long range dependence property. Moreover, we show how to simulate these processes and estimate their parameters. We propose a novel method based on the rescaled modified cumulative distribution function for estimation of the parameters of the second considered process. This method is very useful in the description of rounded data, like waiting times of subordinated processes delayed by inverse subordinators. Using the Monte Carlo method we show the effectiveness of the proposed estimation procedures. Finally, we present applications of the proposed models to real time series.
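
    The first construction (FBM subordinated to a gamma process) can be simulated exactly at a finite set of random times through the FBM covariance function; the sketch below uses assumed parameter values and is not the simulation scheme of the paper.

      import numpy as np

      rng = np.random.default_rng(5)
      H, n = 0.7, 300                                       # Hurst exponent and number of points (assumed)

      # gamma subordinator: a nondecreasing internal time built from i.i.d. gamma increments
      T = np.cumsum(rng.gamma(shape=1.0, scale=0.01, size=n))

      # exact simulation of FBM at the random times T via its covariance function
      s, t = np.meshgrid(T, T)
      cov = 0.5 * (s ** (2 * H) + t ** (2 * H) - np.abs(t - s) ** (2 * H))
      L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))       # small jitter for numerical stability
      Z = L @ rng.normal(size=n)                            # one path of the time-changed process
      print(Z[:5])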

  1. An autoregressive integrated moving average model for short-term prediction of hepatitis C virus seropositivity among male volunteer blood donors in Karachi, Pakistan

    PubMed Central

    Akhtar, Saeed; Rozi, Shafquat

    2009-01-01

    AIM: To identify a stochastic autoregressive integrated moving average (ARIMA) model for short term forecasting of hepatitis C virus (HCV) seropositivity among volunteer blood donors in Karachi, Pakistan. METHODS: Ninety-six months (1998-2005) of data on HCV seropositive cases (per 1000 donors per month) among male volunteer blood donors tested at four major blood banks in Karachi, Pakistan were subjected to ARIMA modeling. Subsequently, the fitted ARIMA model was used to forecast HCV seropositive donors for months 91-96 to contrast with the observed series for the same months. To assess the forecast accuracy, the mean absolute error rate (%) between the observed and predicted HCV seroprevalence was calculated. Finally, the fitted ARIMA model was used for short-term forecasts beyond the observed series. RESULTS: The goodness-of-fit test of the optimum ARIMA(2,1,7) model showed non-significant autocorrelations in the residuals of the model. The forecasts by ARIMA for months 91-96 closely followed the pattern of the observed series for the same months, with a mean monthly absolute forecast error over 6 months of 6.5%. The short-term forecasts beyond the observed series adequately captured the pattern in the data and showed an increasing tendency of HCV seropositivity, with a mean ± SD HCV seroprevalence of 24.3 ± 1.4 per 1000 per month over the forecast interval. CONCLUSION: To curtail HCV spread, public health authorities need to educate communities and health care providers about HCV transmission routes based on known HCV epidemiology in Pakistan and its neighboring countries. Future research may focus on factors associated with hyperendemic levels of HCV infection. PMID:19340903
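
    The hold-out evaluation described here (fit on the first 90 months, forecast months 91-96, compute the mean absolute error) can be reproduced in outline with statsmodels; the series below is synthetic, not the Karachi blood-bank data.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # hypothetical monthly seroprevalence (per 1000 donors), 96 months
      rng = np.random.default_rng(6)
      series = 20 + 0.05 * np.arange(96) + rng.normal(0, 1.5, 96)

      train, test = series[:90], series[90:]            # hold out the last 6 months
      fit = ARIMA(train, order=(2, 1, 7)).fit()         # the order selected in the abstract
      forecast = fit.forecast(steps=6)
      mae_pct = np.mean(np.abs(forecast - test) / test) * 100
      print(f"mean absolute forecast error over 6 months: {mae_pct:.1f}%")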

  2. Western Michigan University: Quasi-Revolving Fund. Green Revolving Funds in Action: Case Study Series

    ERIC Educational Resources Information Center

    Billingsley, Christina

    2011-01-01

    Western Michigan University has designed an innovative "Quasi-Revolving Fund" model that demonstrates the institution's full commitment to incorporating sustainability into campus operations. The Quasi-Revolving Fund recaptures money from cost-savings, similar to a typical green revolving fund, but it also sources capital from the…

  3. The Industrial Manufacturing Technician Apprenticeship. Work-Based Learning in Action

    ERIC Educational Resources Information Center

    Scott, Geri

    2016-01-01

    This case study, one of a series of publications exploring effective and inclusive models of work-based learning, finds that entry-level occupations in manufacturing have historically been considered unskilled jobs for which little or no training is necessary. As a consequence, employers have experienced high turnover among new-hires, and…

  4. 75 FR 68693 - Airworthiness Directives; Airbus Model A380-800 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-09

    ... may lead to a degraded leak detection capability have been reported. In case of hot air leakage, the... inspection in production and on in-service aircraft, a number of OverHeat Detection System (OHDS... could allow undetected leakage of bleed air from the hot engine/auxiliary power unit causing damage to...

  5. A "Layers of Negotiation" Model for Designing Constructivist Learning Materials.

    ERIC Educational Resources Information Center

    Cennamo, Katherine S.; And Others

    In designing materials for use in a contructivist learning environment, instructional designers still have a role in selecting the situations that may provide a stimulus for knowledge construction and providing features that support students and teachers in using these materials. This paper describes the process of designing a series of case-based…

  6. Statistical Analysis and Time Series Modeling of Air Traffic Operations Data From Flight Service Stations and Terminal Radar Approach Control Facilities : Two Case Studies

    DOT National Transportation Integrated Search

    1981-10-01

    Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...

  7. How Do Students' Behaviors Relate to the Growth of Their Mathematical Ideas?

    ERIC Educational Resources Information Center

    Warner, Lisa B.

    2008-01-01

    The purpose of this study is to analyze the relationship between student behaviors and the growth of mathematical ideas (using the Pirie-Kieren model). This analysis was accomplished through a series of case studies, involving middle school students of varying ability levels, who were investigating a combinatorics problem in after-school…

  8. Renormalization group methods for the Reynolds stress transport equations

    NASA Technical Reports Server (NTRS)

    Rubinstein, R.

    1992-01-01

    The Yakhot-Orszag renormalization group is used to analyze the pressure gradient-velocity correlation and return to isotropy terms in the Reynolds stress transport equations. The perturbation series for the relevant correlations, evaluated to lowest order in the epsilon-expansion of the Yakhot-Orszag theory, are infinite series in tensor product powers of the mean velocity gradient and its transpose. Formal lowest order Pade approximations to the sums of these series produce a rapid pressure strain model of the form proposed by Launder, Reece, and Rodi, and a return to isotropy model of the form proposed by Rotta. In both cases, the model constants are computed theoretically. The predicted Reynolds stress ratios in simple shear flows are evaluated and compared with experimental data. The possibility is discussed of deriving higher order nonlinear models by approximating the sums more accurately. The Yakhot-Orszag renormalization group provides a systematic procedure for deriving turbulence models. Typical applications have included theoretical derivation of the universal constants of isotropic turbulence theory, such as the Kolmogorov constant, and derivation of two equation models, again with theoretically computed constants and low Reynolds number forms of the equations. Recent work has applied this formalism to Reynolds stress modeling, previously in the form of a nonlinear eddy viscosity representation of the Reynolds stresses, which can be used to model the simplest normal stress effects. The present work attempts to apply the Yakhot-Orszag formalism to Reynolds stress transport modeling.
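
    The step of replacing a formal power series by a low-order Padé approximant can be illustrated with SciPy on a toy series; the actual tensor series of the Yakhot-Orszag analysis is not reproduced here.

      import numpy as np
      from scipy.interpolate import pade

      # Taylor coefficients of the toy series 1 - x + x^2 - x^3 + x^4, i.e. the expansion of 1/(1+x)
      coeffs = [1, -1, 1, -1, 1]
      p, q = pade(coeffs, 1)                  # rational approximant with a first-order denominator
      x = 0.5
      print(p(x) / q(x), 1 / (1 + x))         # the Pade form resums the series exactly in this case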

  9. Scapular flap for maxillectomy defect reconstruction and preliminary results using three-dimensional modeling.

    PubMed

    Modest, Mara C; Moore, Eric J; Abel, Kathryn M Van; Janus, Jeffrey R; Sims, John R; Price, Daniel L; Olsen, Kerry D

    2017-01-01

    Discuss current techniques utilizing the scapular tip and subscapular system for free tissue reconstruction of maxillary defects and highlight the impact of medical modeling on these techniques with a case series. Case review series at an academic hospital of patients undergoing maxillectomy + thoracodorsal scapula composite free flap (TSCF) reconstruction. Three-dimensional (3D) models were used in the last five cases. 3D modeling, surgical, functional, and aesthetic outcomes were reviewed. Nine patients underwent TSCF reconstruction for maxillectomy defects (median age = 43 years; range, 19-66 years). Five patients (55%) had a total maxillectomy (TM) ± orbital exenteration, whereas four patients (44%) underwent subtotal palatal maxillectomy. For TM, the contralateral scapula tip was positioned with its natural concavity recreating facial contour. The laterally based vascular pedicle was ideally positioned for facial vessel anastomosis. For subtotal-palatal defect, an ipsilateral flap was harvested, but inset with the convex surface facing superiorly. Once 3D models were available from our anatomic modeling lab, they were used for intraoperative planning of the last five patients. Use of the model intraoperatively improved efficiency and allowed for better contouring/plating of the TSCF. At last follow-up, all patients had good functional outcomes. Aesthetic outcomes were more successful in patients where 3D-modeling was used (100% vs. 50%). There were no flap failures. Median follow-up >1 month was 5.2 months (range, 1-32.7 months). Reconstruction of maxillectomy defects is complex. Successful aesthetic and functional outcomes are critical to patient satisfaction. The TSCF is a versatile flap. Based on defect type, choosing laterality is crucial for proper vessel orientation and outcomes. The use of internally produced 3D models has helped refine intraoperative contouring and flap inset, leading to more successful outcomes. 4. Laryngoscope, 127:E8-E14, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  10. Secondary Organic Aerosol Formation and Organic Nitrate Yield from NO3 Oxidation of Biogenic Hydrocarbons

    DOE PAGES

    Fry, Juliane L.; Draper, Danielle C.; Barsanti, Kelley C.; ...

    2014-09-17

    Here, the secondary organic aerosol (SOA) mass yields from NO3 oxidation of a series of biogenic volatile organic compounds (BVOCs), consisting of five monoterpenes and one sesquiterpene (α-pinene, β-pinene, Δ-3-carene, limonene, sabinene, and β-caryophyllene), were investigated in a series of continuous flow experiments in a 10 m3 indoor Teflon chamber. By making in situ measurements of the nitrate radical and employing a kinetics box model, we generate time-dependent yield curves as a function of reacted BVOC. SOA yields varied dramatically among the different BVOCs, from zero for α-pinene to 38–65% for Δ-3-carene and 86% for β-caryophyllene at a mass loading of 10 μg m⁻³, suggesting that model mechanisms that treat all NO3 + monoterpene reactions equally will lead to errors in predicted SOA depending on each location’s mix of BVOC emissions. In most cases, organonitrate is a dominant component of the aerosol produced, but in the case of α-pinene, little organonitrate and no aerosol is formed.
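
    A kinetics box model of the sort mentioned can be sketched as a two-component ODE system; the rate constant, oxidant level, and yield below are illustrative placeholders, not the measured values.

      import numpy as np
      from scipy.integrate import solve_ivp

      k_no3 = 6.0e-12      # rate constant, cm^3 molecule^-1 s^-1 (assumed)
      no3 = 2.0e9          # NO3 concentration, molecule cm^-3, held constant by continuous injection
      yield_soa = 0.4      # assumed mass fraction of reacted BVOC that condenses as SOA

      def rhs(t, c):
          bvoc, soa = c
          loss = k_no3 * no3 * bvoc            # BVOC consumed by NO3 oxidation
          return [-loss, yield_soa * loss]     # a fixed fraction of the loss appears as aerosol

      bvoc0 = 1.0e11
      sol = solve_ivp(rhs, (0, 3600), [bvoc0, 0.0], max_step=10.0)
      reacted = bvoc0 - sol.y[0]
      print(sol.y[1][-1] / reacted[-1])        # recovered yield: aerosol formed per BVOC reacted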

  11. Optical Variability Signatures from Massive Black Hole Binaries

    NASA Astrophysics Data System (ADS)

    Kasliwal, Vishal P.; Frank, Koby Alexander; Lidz, Adam

    2017-01-01

    The hierarchical merging of dark matter halos and their associated galaxies should lead to a population of supermassive black hole binaries (MBHBs). We consider plausible optical variability signatures from MBHBs at sub-parsec separations and search for these using data from the Catalina Real-Time Transient Survey (CRTS). Specifically, we model the impact of relativistic Doppler beaming on the accretion disk emission from the less massive, secondary black hole. We explore whether this Doppler modulation may be separated from other sources of stochastic variability in the accretion flow around the MBHBs, which we describe as a damped random walk (DRW). In the simple case of a circular orbit, relativistic beaming leads to a series of broad peaks — located at multiples of the orbital frequency — in the fluctuation power spectrum. We extend our analysis to the case of elliptical orbits and discuss the effect of beaming on the flux power spectrum and auto-correlation function using simulations. We present a code to model an observed light curve as a stochastic DRW-type time series modulated by relativistic beaming and apply the code to CRTS data.
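
    The separation problem described here can be explored with a toy simulation: a damped random walk (an Ornstein-Uhlenbeck process) multiplied by a periodic beaming modulation. All parameter values are assumed for illustration, and the modulation is a simple sinusoid rather than the full relativistic Doppler factor.

      import numpy as np

      rng = np.random.default_rng(7)
      n, dt = 2000, 1.0                   # samples and cadence in days
      tau, sigma = 100.0, 0.1             # DRW relaxation time and asymptotic rms (assumed)
      P_orb, amp = 300.0, 0.15            # orbital period (days) and fractional beaming amplitude (assumed)

      # exact discretization of the damped random walk
      rho = np.exp(-dt / tau)
      x = np.zeros(n)
      for i in range(1, n):
          x[i] = rho * x[i - 1] + sigma * np.sqrt(1 - rho ** 2) * rng.normal()

      t = np.arange(n) * dt
      flux = (1 + x) * (1 + amp * np.cos(2 * np.pi * t / P_orb))   # stochastic variability times beaming

      # the periodic modulation appears as excess power near the orbital frequency
      power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
      freqs = np.fft.rfftfreq(n, d=dt)
      print(freqs[np.argmax(power[1:]) + 1], 1 / P_orb)            # compare peak frequency with 1/P_orb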

  12. Secondary Organic Aerosol Formation and Organic Nitrate Yield from NO3 Oxidation of Biogenic Hydrocarbons

    PubMed Central

    2014-01-01

    The secondary organic aerosol (SOA) mass yields from NO3 oxidation of a series of biogenic volatile organic compounds (BVOCs), consisting of five monoterpenes and one sesquiterpene (α-pinene, β-pinene, Δ-3-carene, limonene, sabinene, and β-caryophyllene), were investigated in a series of continuous flow experiments in a 10 m3 indoor Teflon chamber. By making in situ measurements of the nitrate radical and employing a kinetics box model, we generate time-dependent yield curves as a function of reacted BVOC. SOA yields varied dramatically among the different BVOCs, from zero for α-pinene to 38–65% for Δ-3-carene and 86% for β-caryophyllene at mass loading of 10 μg m–3, suggesting that model mechanisms that treat all NO3 + monoterpene reactions equally will lead to errors in predicted SOA depending on each location’s mix of BVOC emissions. In most cases, organonitrate is a dominant component of the aerosol produced, but in the case of α-pinene, little organonitrate and no aerosol is formed. PMID:25229208

  13. Oral lichen planus in childhood: a case series.

    PubMed

    Cascone, Marco; Celentano, Antonio; Adamo, Daniela; Leuci, Stefania; Ruoppo, Elvira; Mignogna, Michele D

    2017-06-01

    Although the exact incidence of pediatric oral lichen planus (OLP) is unknown, the oral mucosa seems to be less commonly involved, and the clinical presentation is often atypical. The aim of the study is to present a case series of OLP in childhood. From our database, we retrospectively selected and analyzed the clinical data of OLP patients under the age of 18 where the diagnosis had been confirmed by histopathological analysis. The case series from our database shows eight patients, four males and four females. The mean (±SD) age at the time of diagnosis of the disease was 13.5 (±2.73) years, ranging in age from 9 to 17. Clinically, a reticular pattern was present in six patients (75%), and the tongue was the most commonly involved oral site (six cases, 75%). We also report the first case of OLP in a 9-year-old girl affected by autoimmune polyendocrinopathy-candidiasis-ectodermal dystrophy. We report the largest case series of pediatric OLP published in literature thus far. Differences in the disease between adults and pediatric patients have been detected, but further investigation and a larger case series are needed to establish any detailed differences in clinical outcomes. © 2017 The International Society of Dermatology.

  14. Superhero‐related injuries in paediatrics: a case series

    PubMed Central

    Davies, Patrick; Surridge, Julia; Hole, Laura; Munro‐Davies, Lisa

    2007-01-01

    Five cases of serious injuries to children wearing superhero costumes, involving extreme risk‐taking behaviour, are presented here. Although children have always displayed behaviour seemingly unwise to the adult eye, the advent of superhero role models can give unrealistic expectations to the child, which may lead to serious injury. The children we saw have all had to contemplate on their way to hospital that they do not in fact possess superpowers. The inbuilt injury protection which some costumes possess is also discussed. PMID:17337680

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aldridge, David F.; Bartel, Lewis C.

    Program LETS calculates the electric current distribution (in space and time) along an electrically energized steel-cased geologic borehole situated within the subsurface earth. The borehole is modeled as an electrical transmission line that “leaks” current into the surrounding geology. Parameters pertinent to the transmission line current calculation (i.e., series resistance and inductance, shunt capacitance and conductance) are obtained by sampling the electromagnetic (EM) properties of a three-dimensional (3D) geologic earth model along a (possibly deviated) well track.
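
    At DC, the leaky transmission line picture reduces to a current that decays exponentially along the casing with attenuation sqrt(R*G); the per-metre values below are assumed for illustration and the sketch is not the LETS algorithm itself.

      import numpy as np

      R = 1.0e-4       # series resistance of the casing, ohm per metre (assumed)
      G = 5.0e-3       # shunt conductance leaking into the formation, S per metre (assumed)
      I0 = 10.0        # current injected at the wellhead, A

      depth = np.linspace(0.0, 2000.0, 201)
      current = I0 * np.exp(-np.sqrt(R * G) * depth)   # DC solution of the leaky-line equations
      print(current[::50])                             # current remaining at 0, 500, ..., 2000 m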

  16. Kernel canonical-correlation Granger causality for multiple time series

    NASA Astrophysics Data System (ADS)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.

  17. Modeling Individual Cyclic Variation in Human Behavior.

    PubMed

    Pierson, Emma; Althoff, Tim; Leskovec, Jure

    2018-04-01

    Cycles are fundamental to human health and behavior. Examples include mood cycles, circadian rhythms, and the menstrual cycle. However, modeling cycles in time series data is challenging because in most cases the cycles are not labeled or directly observed and need to be inferred from multidimensional measurements taken over time. Here, we present Cyclic Hidden Markov Models (CyHMMs) for detecting and modeling cycles in a collection of multidimensional heterogeneous time series data. In contrast to previous cycle modeling methods, CyHMMs deal with a number of challenges encountered in modeling real-world cycles: they can model multivariate data with both discrete and continuous dimensions; they explicitly model and are robust to missing data; and they can share information across individuals to accommodate variation both within and between individual time series. Experiments on synthetic and real-world health-tracking data demonstrate that CyHMMs infer cycle lengths more accurately than existing methods, with 58% lower error on simulated data and 63% lower error on real-world data compared to the best-performing baseline. CyHMMs can also perform functions which baselines cannot: they can model the progression of individual features/symptoms over the course of the cycle, identify the most variable features, and cluster individual time series into groups with distinct characteristics. Applying CyHMMs to two real-world health-tracking datasets (human menstrual cycle symptoms and physical activity tracking data) yields important insights, including which symptoms to expect at each point during the cycle. We also find that people fall into several groups with distinct cycle patterns, and that these groups differ along dimensions not provided to the model. For example, by modeling missing data in the menstrual cycles dataset, we are able to discover a medically relevant group of birth control users even though information on birth control is not given to the model.

  18. Modeling Individual Cyclic Variation in Human Behavior

    PubMed Central

    Pierson, Emma; Althoff, Tim; Leskovec, Jure

    2018-01-01

    Cycles are fundamental to human health and behavior. Examples include mood cycles, circadian rhythms, and the menstrual cycle. However, modeling cycles in time series data is challenging because in most cases the cycles are not labeled or directly observed and need to be inferred from multidimensional measurements taken over time. Here, we present Cyclic Hidden Markov Models (CyHMMs) for detecting and modeling cycles in a collection of multidimensional heterogeneous time series data. In contrast to previous cycle modeling methods, CyHMMs deal with a number of challenges encountered in modeling real-world cycles: they can model multivariate data with both discrete and continuous dimensions; they explicitly model and are robust to missing data; and they can share information across individuals to accommodate variation both within and between individual time series. Experiments on synthetic and real-world health-tracking data demonstrate that CyHMMs infer cycle lengths more accurately than existing methods, with 58% lower error on simulated data and 63% lower error on real-world data compared to the best-performing baseline. CyHMMs can also perform functions which baselines cannot: they can model the progression of individual features/symptoms over the course of the cycle, identify the most variable features, and cluster individual time series into groups with distinct characteristics. Applying CyHMMs to two real-world health-tracking datasets (human menstrual cycle symptoms and physical activity tracking data) yields important insights, including which symptoms to expect at each point during the cycle. We also find that people fall into several groups with distinct cycle patterns, and that these groups differ along dimensions not provided to the model. For example, by modeling missing data in the menstrual cycles dataset, we are able to discover a medically relevant group of birth control users even though information on birth control is not given to the model. PMID:29780976

  19. Volterra series truncation and kernel estimation of nonlinear systems in the frequency domain

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Billings, S. A.

    2017-02-01

    The Volterra series model is a direct generalisation of the linear convolution integral and is capable of displaying the intrinsic features of a nonlinear system in a simple and easy to apply way. Nonlinear system analysis using Volterra series is normally based on the analysis of its frequency-domain kernels and a truncated description. But the estimation of Volterra kernels and the truncation of the Volterra series are coupled with each other. In this paper, a novel complex-valued orthogonal least squares algorithm is developed. The new algorithm provides a powerful tool to determine which terms should be included in the Volterra series expansion and to estimate the kernels, and thus solves the two problems together. The estimated results are compared with those determined using the analytical expressions of the kernels to validate the method. To further evaluate the effectiveness of the method, the physical parameters of the system are also extracted from the measured kernels. Simulation studies demonstrate that the new approach not only can truncate the Volterra series expansion and estimate the kernels of a weakly nonlinear system, but also can indicate the applicability of the Volterra series analysis in a severely nonlinear system case.
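
    The underlying idea of regressing the output on products of lagged inputs can be sketched with an ordinary time-domain least-squares fit; this is a simplified stand-in for the paper's complex-valued orthogonal least squares algorithm, and the test system below is made up.

      import numpy as np

      rng = np.random.default_rng(8)
      N, M = 5000, 3                                    # samples and memory length (assumed)
      u = rng.normal(size=N)

      # toy weakly nonlinear system: a linear FIR part plus one quadratic cross term
      y = 0.8 * u + 0.4 * np.roll(u, 1) + 0.25 * u * np.roll(u, 1)

      # regression matrix of first- and second-order lagged-input products (truncated Volterra model)
      lags = [np.roll(u, k) for k in range(M)]
      cols, names = [], []
      for k in range(M):
          cols.append(lags[k]); names.append(f"h1[{k}]")
      for k in range(M):
          for l in range(k, M):
              cols.append(lags[k] * lags[l]); names.append(f"h2[{k},{l}]")
      Phi = np.column_stack(cols)[M:]                   # drop samples affected by the wrap-around of roll
      theta, *_ = np.linalg.lstsq(Phi, y[M:], rcond=None)
      print(dict(zip(names, np.round(theta, 3))))       # recovers 0.8, 0.4 and 0.25 in the right slots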

  20. Determination of sample size for higher volatile data using new framework of Box-Jenkins model with GARCH: A case study on gold price

    NASA Astrophysics Data System (ADS)

    Roslindar Yaziz, Siti; Zakaria, Roslinazairimah; Hura Ahmad, Maizah

    2017-09-01

    The Box-Jenkins model with GARCH has been shown to be a promising tool for forecasting highly volatile time series. In this study, a framework for determining the optimal sample size using the Box-Jenkins model with GARCH is proposed for practical application in analysing and forecasting highly volatile data. The proposed framework is applied to the daily world gold price series from 1971 to 2013. The data are divided into 12 different sample sizes (from 30 to 10,200 observations). Each sample is tested using different combinations of the hybrid Box-Jenkins - GARCH model. Our study shows that the optimal sample size for forecasting the gold price with this framework is 1,250 observations (a 5-year sample). Hence, the empirical results of the model selection criteria and 1-step-ahead forecasting evaluations suggest that the most recent 12.25% (5 years) of the 10,200 observations is sufficient for the Box-Jenkins model with GARCH, with forecasting performance similar to that obtained using the full 41-year series.
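
    The two-stage fit can be sketched with statsmodels and the arch package: an ARIMA model for the conditional mean followed by a GARCH(1,1) model for the residual variance. The price path and the ARIMA order below are assumptions, since the abstract does not report the selected orders.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from arch import arch_model

      # hypothetical daily gold prices standing in for a 5-year (1250-observation) sample
      rng = np.random.default_rng(9)
      log_price = np.log(400.0) + np.cumsum(rng.normal(0, 0.01, 1250))

      # Box-Jenkins stage: ARIMA for the conditional mean of the log-price series
      arima_fit = ARIMA(log_price, order=(1, 1, 1)).fit()

      # GARCH stage: model the conditional variance of the ARIMA residuals
      resid = arima_fit.resid[1:] * 100                  # rescale to percent to help the optimizer
      garch_fit = arch_model(resid, vol='GARCH', p=1, q=1, mean='Zero').fit(disp='off')
      print(arima_fit.aic, garch_fit.aic)                # criteria of the kind compared across sample sizes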

  1. On the relationship between health, education and economic growth: Time series evidence from Malaysia

    NASA Astrophysics Data System (ADS)

    Khan, Habib Nawaz; Razali, Radzuan B.; Shafei, Afza Bt.

    2016-11-01

    The objective of this paper is twofold: first, to empirically investigate the effects of an enlarged number of healthy and well-educated people on economic growth in Malaysia within the endogenous growth model framework; second, to examine the causal links between education, health and economic growth using annual time series data from 1981 to 2014 for Malaysia. The data series were checked for their time series properties using ADF and KPSS tests. The long-run co-integration relationship was investigated using the vector autoregressive (VAR) method. A vector error correction model (VECM) was applied to investigate the short- and long-run dynamic relationships. Causality analysis was performed using the Engle-Granger technique. The study results showed a long-run co-integration relationship and significantly positive effects of education and health on economic growth in Malaysia. The reported results also confirmed a feedback hypothesis between the variables in the case of Malaysia. The results underscore the policy relevance of human capital (health and education) to Malaysia's growth process. It is thus suggested that policy makers focus on the education and health sectors for sustainable economic growth in Malaysia.
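
    Unit-root and causality checks of this kind can be run with statsmodels; the sketch below uses made-up annual series (34 observations, as in 1981-2014) and a plain bivariate Granger test rather than the full VAR/VECM analysis.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.stattools import adfuller, grangercausalitytests

      # hypothetical annual series standing in for health expenditure and GDP growth
      rng = np.random.default_rng(10)
      health = np.cumsum(rng.normal(0.5, 1.0, 34))
      gdp = 0.6 * np.roll(health, 1) + rng.normal(0, 1.0, 34)     # growth responds to lagged health

      print("ADF p-value (health):", adfuller(health)[1])         # unit-root check
      data = pd.DataFrame({"gdp": gdp[1:], "health": health[1:]})
      grangercausalitytests(data[["gdp", "health"]], maxlag=2)    # does health Granger-cause gdp?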

  2. Coherent and partially coherent dark hollow beams with rectangular symmetry and paraxial propagation properties

    NASA Astrophysics Data System (ADS)

    Cai, Yangjian; Zhang, Lei

    2006-07-01

    A theoretical model is proposed to describe coherent dark hollow beams (DHBs) with rectangular symmetry. The electric field of a coherent rectangular DHB is expressed as a superposition of a finite series of fundamental Gaussian beams. Analytical propagation formulas for a coherent rectangular DHB passing through paraxial optical systems are derived in a tensor form. Furthermore, for the more general case, we propose a theoretical model to describe a partially coherent rectangular DHB. Analytical propagation formulas for a partially coherent rectangular DHB passing through paraxial optical systems are derived. The beam propagation factor (M² factor) for both coherent and partially coherent rectangular DHBs is studied. Numerical examples are given by using the derived formulas. Our models and method provide an effective way to describe and treat the propagation of coherent and partially coherent rectangular DHBs.
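
    The superposition idea can be illustrated numerically: subtracting a narrower Gaussian from a wider one (with different waists along x and y to give rectangular symmetry) produces an intensity profile that vanishes on axis. The waist values below are arbitrary illustrative numbers, not parameters from the paper.

      import numpy as np

      x = np.linspace(-3e-3, 3e-3, 401)
      y = np.linspace(-3e-3, 3e-3, 401)
      X, Y = np.meshgrid(x, y)

      wx_out, wy_out = 1.5e-3, 1.0e-3       # outer waists along x and y (assumed, rectangular symmetry)
      wx_in, wy_in = 0.8e-3, 0.5e-3         # inner waists of the subtracted beam (assumed)

      E = (np.exp(-(X / wx_out) ** 2 - (Y / wy_out) ** 2)
           - np.exp(-(X / wx_in) ** 2 - (Y / wy_in) ** 2))
      I = np.abs(E) ** 2
      print(I[200, 200], I.max())           # the on-axis intensity is zero: a dark hollow profile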

  3. Microscopic Spin Model for the STOCK Market with Attractor Bubbling on Regular and Small-World Lattices

    NASA Astrophysics Data System (ADS)

    Krawiecki, A.

    A multi-agent spin model for changes of prices in the stock market based on the Ising-like cellular automaton with interactions between traders randomly varying in time is investigated by means of Monte Carlo simulations. The structure of interactions has topology of a small-world network obtained from regular two-dimensional square lattices with various coordination numbers by randomly cutting and rewiring edges. Simulations of the model on regular lattices do not yield time series of logarithmic price returns with statistical properties comparable with the empirical ones. In contrast, in the case of networks with a certain degree of randomness for a wide range of parameters the time series of the logarithmic price returns exhibit intermittent bursting typical of volatility clustering. Also the tails of distributions of returns obey a power scaling law with exponents comparable to those obtained from the empirical data.
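
    A stripped-down version of such a simulation is sketched below: Ising-like agents on a Watts-Strogatz small-world graph with couplings redrawn at every step, with the change in magnetisation taken as a stand-in for the log-return. Network size, temperature, and coupling statistics are arbitrary choices, not the values studied in the paper.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(13)
      n_agents, T = 200, 8.0                               # temperature chosen near the ordering transition (assumed)
      G = nx.watts_strogatz_graph(n=n_agents, k=8, p=0.1, seed=1)   # small-world interaction topology
      nbrs = [list(G.neighbors(i)) for i in range(n_agents)]
      spins = rng.choice([-1, 1], size=n_agents)

      magnetisation = []
      for step in range(400):
          J = rng.normal(1.0, 0.5, size=n_agents)          # couplings randomly varying in time
          for i in rng.permutation(n_agents):
              h = J[i] * sum(spins[j] for j in nbrs[i])
              p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))    # heat-bath update
              spins[i] = 1 if rng.random() < p_up else -1
          magnetisation.append(spins.mean())

      returns = np.diff(magnetisation)                     # toy analogue of logarithmic price returns
      print(np.std(returns), np.max(np.abs(returns)))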

  4. Learning in the model space for cognitive fault diagnosis.

    PubMed

    Chen, Huanhuan; Tino, Peter; Rodan, Ali; Yao, Xin

    2014-01-01

    The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using a series of signal segments selected with a sliding window. By investigating the learning techniques in the fitted model space, faulty models can be discriminated from healthy models using a one-class learning algorithm. The framework enables us to construct a fault library when unknown faults occur, which can be regarded as cognitive fault isolation. This paper also theoretically investigates how to measure the pairwise distance between two models in the model space and incorporates the model distance into the learning algorithm in the model space. The results on three benchmark applications and one simulated model for the Barcelona water distribution network confirm the effectiveness of the proposed framework.
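
    The "learning in the model space" idea can be imitated with a much simpler pipeline: fit a small autoregressive model to each sliding window and train a one-class classifier on the healthy coefficient vectors. In the sketch below, AR(2) models and a one-class SVM stand in for the paper's reservoir models and one-class learner, and all signals are synthetic.

      import numpy as np
      from sklearn.svm import OneClassSVM

      rng = np.random.default_rng(11)

      def ar2_signal(n, a1, a2, noise=0.1):
          """Simulate an AR(2) process standing in for a sensor signal."""
          x = np.zeros(n)
          for i in range(2, n):
              x[i] = a1 * x[i - 1] + a2 * x[i - 2] + noise * rng.normal()
          return x

      def fit_ar2(window):
          """Least-squares AR(2) fit: the coefficient vector is the 'model' of the window."""
          X = np.column_stack([window[1:-1], window[:-2]])
          coef, *_ = np.linalg.lstsq(X, window[2:], rcond=None)
          return coef

      healthy = ar2_signal(6000, 0.5, -0.3)
      faulty = ar2_signal(2000, 0.9, -0.2)                 # changed dynamics represent a fault

      win = 200
      healthy_models = np.array([fit_ar2(healthy[i:i + win]) for i in range(0, 5800, win)])
      faulty_models = np.array([fit_ar2(faulty[i:i + win]) for i in range(0, 1800, win)])

      clf = OneClassSVM(nu=0.05, gamma='scale').fit(healthy_models)   # learn in the model space
      print(clf.predict(faulty_models))                    # -1 flags windows whose models look unhealthy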

  5. Parameterisation of rainfall-runoff models for forecasting low and average flows, I: Conceptual modelling

    NASA Astrophysics Data System (ADS)

    Castiglioni, S.; Toth, E.

    2009-04-01

    In the calibration procedure of continuously-simulating models, the hydrologist has to choose which part of the observed hydrograph is most important to fit, either implicitly, through the visual agreement in manual calibration, or explicitly, through the choice of the objective function(s). By changing the objective functions it is in fact possible to emphasise different kinds of errors, giving them more weight in the calibration phase. The objective functions used for calibrating hydrological models are generally of the quadratic type (mean squared error, correlation coefficient, coefficient of determination, etc.) and are therefore oversensitive to high and extreme error values, which typically correspond to high and extreme streamflow values. This is appropriate when, as in the majority of streamflow forecasting applications, the focus is on the ability to reproduce potentially dangerous flood events; on the contrary, if the aim of the modelling is the reproduction of low and average flows, as is the case in water resource management problems, this may result in a deterioration of the forecasting performance. This contribution presents the results of a series of automatic calibration experiments of a continuously-simulating rainfall-runoff model applied over several real-world case-studies, where the objective function is chosen so as to highlight the fit of average and low flows. In this work a simple conceptual model will be used, of the lumped type, with a relatively low number of parameters to be calibrated. The experiments will be carried out for a set of case-study watersheds in Central Italy, covering an extremely wide range of geo-morphologic conditions and for which at least five years of concurrent daily series of streamflow, precipitation and evapotranspiration estimates are available. Different objective functions will be tested in calibration and the results will be compared, over validation data, against those obtained with traditional squared functions. A companion work presents the results, over the same case-study watersheds and observation periods, of a system-theoretic model, again calibrated for reproducing average and low streamflows.
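
    A toy version of the experiment: calibrate a one-parameter linear reservoir against synthetic flows once with a plain squared-error objective and once with squared errors on log-transformed flows (which down-weights floods and emphasises low flows). The model, data, and objectives are illustrative stand-ins for the paper's setup.

      import numpy as np
      from scipy.optimize import minimize_scalar

      rng = np.random.default_rng(14)

      def linear_reservoir(rain, k):
          """One-parameter conceptual model: storage S drains as q = S / k at each step."""
          S, q = 0.0, np.zeros_like(rain)
          for t, p in enumerate(rain):
              S += p
              q[t] = S / k
              S -= q[t]
          return q

      rain = rng.gamma(0.3, 10.0, 1000)
      observed = linear_reservoir(rain, k=15.0) * np.exp(rng.normal(0, 0.2, 1000))  # synthetic "observations"

      mse = lambda k: np.mean((linear_reservoir(rain, k) - observed) ** 2)          # emphasises floods
      mse_log = lambda k: np.mean((np.log(linear_reservoir(rain, k) + 1e-6)
                                   - np.log(observed + 1e-6)) ** 2)                 # emphasises low flows

      k_flood = minimize_scalar(mse, bounds=(1, 100), method='bounded').x
      k_low = minimize_scalar(mse_log, bounds=(1, 100), method='bounded').x
      print(k_flood, k_low)    # the two objectives may settle on different parameter values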

  6. Two approaches to timescale modeling for proxy series with chronological errors.

    NASA Astrophysics Data System (ADS)

    Divine, Dmitry; Godtliebsen, Fred

    2010-05-01

    A substantial part of the proxy series used in paleoclimate research has chronological uncertainties. Any constructed timescale is therefore only an estimate of the true, but unknown, timescale. An accurate assessment of the timing of events in paleoproxy series and networks, as well as the use of proxy-based paleoclimate reconstructions in GCM model scoring experiments, requires the effect of these errors to be properly taken into account. We consider two types of timescale error models corresponding to the two basic approaches to construction of the (depth-) age scale in a proxy series. Typically, the chronological control of a proxy series stemming from all types of marine and terrestrial sedimentary archives is based on the use of 14C dates, reference horizons or their combination. Depending on the prevalent origin of the available fixpoints (age markers), the following approaches to timescale modeling are proposed. 1) 14C dates. The algorithm uses a Markov chain Monte Carlo sampling technique to generate an ordered set of perturbed age markers. Proceeding sequentially from the youngest to the oldest fixpoint, the sampler draws random numbers from the age distribution of each individual 14C date. Every following perturbed age marker is generated such that the condition of no age reversal is fulfilled. The relevant regression model is then applied to construct a simulated timescale. 2) Reference horizons (e.g. volcanic or dust layers, T bomb peak) generally provide absolutely dated fixpoints. Due to the natural variability in sedimentation (accumulation) rate, however, the dating uncertainty in the interpolated timescale tends to grow with the distance to the nearest fixpoint. The (accumulation, sedimentation) process associated with the formation of a proxy series is modelled using a stochastic Lévy process. The respective increments for the process are drawn from the log-normal distribution with the mean/variance ratio prescribed as a site- (proxy-) dependent external parameter. The number of generated annual increments corresponds to the time interval between the considered reference horizons. The simulated series is then rescaled to match the length of the actual core section being modelled. Within each method a multitude of timescales is generated, creating a number of possible realisations of a proxy series or a proxy-based reconstruction in the time domain. This allows consideration of a proxy record in a probabilistic framework. The effect of accounting for uncertainties in chronology on a reconstructed environmental variable is illustrated with two case studies of marine sediment records.
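
    The ensemble idea from approach 1 can be sketched with a simple rejection scheme: sample each dated marker from its error distribution, discard draws with age reversals, and interpolate each accepted draw onto the depth grid. The depths, ages, and errors below are made-up values, and plain rejection sampling replaces the sequential MCMC sampler described above.

      import numpy as np

      rng = np.random.default_rng(15)

      # hypothetical age markers: depths (cm), central ages (yr BP) and 1-sigma dating errors
      depths = np.array([10.0, 60.0, 120.0, 200.0, 300.0])
      ages = np.array([500.0, 1800.0, 3200.0, 5100.0, 7900.0])
      sigma = np.array([40.0, 60.0, 80.0, 90.0, 120.0])

      grid = np.arange(10.0, 301.0)         # depth grid on which the proxy is sampled
      realisations = []
      while len(realisations) < 1000:
          draw = rng.normal(ages, sigma)
          if np.all(np.diff(draw) > 0):                      # enforce the no-age-reversal condition
              realisations.append(np.interp(grid, depths, draw))
      ens = np.array(realisations)

      print(ens.mean(axis=0)[:3])           # ensemble-mean ages near the core top
      print(ens.std(axis=0)[:3])            # dating uncertainty propagated onto the depth grid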

  7. A structured framework for assessing sensitivity to missing data assumptions in longitudinal clinical trials.

    PubMed

    Mallinckrodt, C H; Lin, Q; Molenberghs, M

    2013-01-01

    The objective of this research was to demonstrate a framework for drawing inference from sensitivity analyses of incomplete longitudinal clinical trial data via a re-analysis of data from a confirmatory clinical trial in depression. A likelihood-based approach that assumed missing at random (MAR) was the primary analysis. Robustness to departure from MAR was assessed by comparing the primary result to those from a series of analyses that employed varying missing not at random (MNAR) assumptions (selection models, pattern mixture models and shared parameter models) and to MAR methods that used inclusive models. The key sensitivity analysis used multiple imputation assuming that, after dropout, the trajectory of drug-treated patients was that of placebo-treated patients with a similar outcome history (placebo multiple imputation). This result was used as the worst reasonable case to define the lower limit of plausible values for the treatment contrast. The endpoint contrast from the primary analysis was -2.79 (p = .013). In placebo multiple imputation, the result was -2.17. Results from the other sensitivity analyses ranged from -2.21 to -3.87 and were symmetrically distributed around the primary result. Hence, no clear evidence of bias from missing not at random data was found. In the worst reasonable case scenario, the treatment effect was 80% of the magnitude of the primary result. Therefore, it was concluded that a treatment effect existed. The structured sensitivity framework, in which a worst reasonable case result based on a controlled imputation approach with transparent and debatable assumptions is supplemented by a series of plausible alternative models under varying assumptions, was useful in this specific situation and holds promise as a generally useful framework. Copyright © 2012 John Wiley & Sons, Ltd.

  8. Modelling the effects of booster dose vaccination schedules and recommendations for public health immunization programs: the case of Haemophilus influenzae serotype b.

    PubMed

    Charania, Nadia A; Moghadas, Seyed M

    2017-09-13

    Haemophilus influenzae serotype b (Hib) has yet to be eliminated despite the implementation of routine infant immunization programs. There is no consensus regarding the number of primary vaccine doses and an optimal schedule for the booster dose. We sought to evaluate the effect of a booster dose, given after the primary series, on long-term disease incidence. A stochastic model of Hib transmission dynamics was constructed to compare the long-term impact of a booster vaccination and of different booster schedules after the primary series on the incidence of carriage and symptomatic disease. We parameterized the model with available estimates for the efficacy of the Hib conjugate vaccine and for the durations of both vaccine-induced and naturally acquired immunity. We found that administering a booster dose substantially reduced the population burden of Hib disease compared to the scenario of only receiving the primary series. Comparing the schedules, the incidence of carriage for a 2-year delay (on average) in booster vaccination was comparable to or lower than that observed for the scenario of a booster dose within 1 year after the primary series. The temporal reduction of symptomatic disease was similar in the two booster schedules, suggesting no superiority of one schedule over the other in terms of reducing the incidence of symptomatic disease. The findings underscore the importance of a booster vaccination for continued decline of Hib incidence. When the primary series provides a high level of protection temporarily, delaying the booster dose (still within the average duration of protection conferred by the primary series) may be beneficial to maintain longer-term protection levels and decelerate the decline of herd immunity in the population.

  9. Isolated and synergistic effects of PM10 and average temperature on cardiovascular and respiratory mortality.

    PubMed

    Pinheiro, Samya de Lara Lins de Araujo; Saldiva, Paulo Hilário Nascimento; Schwartz, Joel; Zanobetti, Antonella

    2014-12-01

    OBJECTIVE To analyze the effect of air pollution and temperature on mortality due to cardiovascular and respiratory diseases. METHODS We evaluated the isolated and synergistic effects of temperature and particulate matter with aerodynamic diameter < 10 µm (PM10) on the mortality of individuals > 40 years old due to cardiovascular disease and that of individuals > 60 years old due to respiratory diseases in Sao Paulo, SP, Southeastern Brazil, between 1998 and 2008. Three methodologies were used to evaluate the isolated association: time-series analysis using Poisson regression model, bidirectional case-crossover analysis matched by period, and case-crossover analysis matched by the confounding factor, i.e., average temperature or pollutant concentration. The graphical representation of the response surface, generated by the interaction term between these factors added to the Poisson regression model, was interpreted to evaluate the synergistic effect of the risk factors. RESULTS No differences were observed between the results of the case-crossover and time-series analyses. The percentage change in the relative risk of cardiovascular and respiratory mortality was 0.85% (0.45;1.25) and 1.60% (0.74;2.46), respectively, due to an increase of 10 μg/m3 in the PM10 concentration. The pattern of correlation of the temperature with cardiovascular mortality was U-shaped and that with respiratory mortality was J-shaped, indicating an increased relative risk at high temperatures. The values for the interaction term indicated a higher relative risk for cardiovascular and respiratory mortalities at low temperatures and high temperatures, respectively, when the pollution levels reached approximately 60 μg/m3. CONCLUSIONS The positive association standardized in the Poisson regression model for pollutant concentration is not confounded by temperature, and the effect of temperature is not confounded by the pollutant levels in the time-series analysis. The simultaneous exposure to different levels of environmental factors can create synergistic effects that are as disturbing as those caused by extreme concentrations.
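
    A hedged Python sketch of the time-series strand of this analysis: a Poisson regression of daily death counts on PM10, temperature and their interaction, with a crude seasonal control (the data, variable names and the use of statsmodels are illustrative assumptions; the study's full model also includes smooth functions of time and other confounders).

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Hypothetical daily data: death counts, PM10 (ug/m3), mean temperature (deg C)
      rng = np.random.default_rng(1)
      n = 1500
      df = pd.DataFrame({
          "day": np.arange(n),
          "pm10": rng.gamma(4.0, 12.0, n),
          "temp": 20 + 6 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n),
      })
      df["deaths"] = rng.poisson(np.exp(3.0 + 0.0008 * df["pm10"] + 0.001 * (df["temp"] - 20) ** 2))

      # Poisson regression with a PM10 x temperature interaction term
      fit = smf.glm(
          "deaths ~ pm10 * temp + np.sin(2 * np.pi * day / 365) + np.cos(2 * np.pi * day / 365)",
          data=df, family=sm.families.Poisson(),
      ).fit()
      print(fit.summary().tables[1])   # the pm10:temp row carries the synergistic effect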

  10. GOATS - Orbitology Component

    NASA Technical Reports Server (NTRS)

    Haber, Benjamin M.; Green, Joseph J.

    2010-01-01

    The GOATS Orbitology Component software was developed to specifically address the concerns presented by orbit analysis tools that are often written as stand-alone applications. These applications do not easily interface with standard JPL first-principles analysis tools, and have a steep learning curve due to their complicated nature. This toolset is written as a series of MATLAB functions, allowing seamless integration into existing JPL optical systems engineering modeling and analysis modules. The functions are completely open, and allow advanced users to delve into and modify the underlying physics being modeled. Additionally, this software module fills an analysis gap, allowing for quick, high-level mission analysis trades without the need for detailed and complicated orbit analysis using commercial stand-alone tools. This software consists of a series of MATLAB functions to provide for geometric orbit-related analysis. This includes propagation of orbits to varying levels of generalization. In the simplest case, geosynchronous orbits can be modeled by specifying a subset of three orbit elements. The next case is a circular orbit, which can be specified by a subset of four orbit elements. The most general case is an arbitrary elliptical orbit specified by all six orbit elements. These orbits are all solved geometrically, under the basic problem of an object in circular (or elliptical) orbit around a rotating spheroid. The orbit functions output time series ground tracks, which serve as the basis for more detailed orbit analysis. This software module also includes functions to track the positions of the Sun, Moon, and arbitrary celestial bodies specified by right ascension and declination. Also included are functions to calculate line-of-sight geometries to ground-based targets, angular rotations and decompositions, and other line-of-sight calculations. The toolset allows for the rapid execution of orbit trade studies at the level of detail required for the early stage of mission concept development.

  11. A complete dynamic model of primary sedimentation.

    PubMed

    Paraskevas, P; Kolokithas, G; Lekkas, T

    1993-11-01

    A dynamic mathematical model for the primary clarifier of a wastewater treatment plant is described, which is represented by a general tanks-in-series model to simulate insufficient mixing. The model quantifies successfully the diurnal response of both the suspended and dissolved species. It is general enough that the values of the parameters can be replaced with those applicable to a specific case. The model was verified with data from the Biological Centre of Metamorfosi, in Athens, Greece, and can be used to assist in the design of new plants or in the analysis and output predictions of existing ones.
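
    The tanks-in-series idea can be sketched in a few lines of Python: n equal, completely mixed tanks, each with a mass balance for a suspended species subject to first-order removal. The parameter values here are assumed and purely illustrative; this is not the published model.

      import numpy as np
      from scipy.integrate import solve_ivp

      def tanks_in_series(t, c, q, v, c_in, k_settle):
          """dc_i/dt = (q/v)*(c_{i-1} - c_i) - k_settle*c_i, with c_0 = c_in(t)."""
          upstream = np.concatenate(([c_in(t)], c[:-1]))
          return (q / v) * (upstream - c) - k_settle * c

      n_tanks = 4                          # 1 tank = fully mixed, many tanks = plug flow
      q, v_total = 500.0, 1200.0           # inflow (m3/h) and total clarifier volume (m3)
      v = v_total / n_tanks
      c_in = lambda t: 250 + 80 * np.sin(2 * np.pi * t / 24)   # diurnal influent TSS (mg/L)

      sol = solve_ivp(tanks_in_series, (0, 72), np.full(n_tanks, 250.0),
                      args=(q, v, c_in, 0.35), dense_output=True)
      effluent = sol.sol(np.linspace(0, 72, 300))[-1]          # last tank = clarifier effluent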

  12. The air forces on a systematic series of biplane and triplane cellule models

    NASA Technical Reports Server (NTRS)

    Munk, Max M

    1927-01-01

    The air forces on a systematic series of biplane and triplane cellule models are the subject of this report. The tests consist of the determination of the lift, drag, and moment of each individual airfoil in each cellule, mostly with the same wing section. The magnitude of the gap and of the stagger is systematically varied; not, however, the decalage, which is zero throughout the tests. Certain check tests with a second wing section make the tests more complete and the conclusions more convincing. The results give evidence that the present army and navy specifications for the relative lifts of biplanes are good. They furnish material for improving such specifications for the relative lifts of triplanes. A larger number of factors can now be prescribed to take care of different cases.

  13. Implications on 1+1 D runup modeling due to time features of the earthquake source

    NASA Astrophysics Data System (ADS)

    Fuentes, M.; Riquelme, S.; Campos, J. A.

    2017-12-01

    The time characteristics of the seismic source are usually neglected in tsunami modeling, due to the difference in the time scales of the two processes. Nonetheless, only a few analytical studies have attempted to explain separately the roles of the rise time and the rupture velocity. In this work, we extend an analytical 1+1D solution for the shoreline motion time series from the static case to the dynamic case, by including both rise time and rupture velocity. Results show that the static case corresponds to a limit case of null rise time and infinite rupture velocity. Both parameters contribute to shifting the arrival time, but maximum run-up may be affected by very slow ruptures and long rise times. The analytical solution has been tested for the Nicaraguan tsunami earthquake, suggesting that the rupture was not slow enough to cause the wave amplification needed to explain the high runup observations.

  14. Thermodynamics of Biological Processes

    PubMed Central

    Garcia, Hernan G.; Kondev, Jane; Orme, Nigel; Theriot, Julie A.; Phillips, Rob

    2012-01-01

    There is a long and rich tradition in biology of using ideas from both equilibrium thermodynamics and its microscopic partner theory, equilibrium statistical mechanics. In this chapter, we provide some background on the origins of the seemingly unreasonable effectiveness of ideas from both thermodynamics and statistical mechanics in biology. After describing these foundational issues, we turn to a series of case studies, primarily focused on binding, that are intended to illustrate the broad biological reach of equilibrium thinking. These case studies include ligand-gated ion channels, thermodynamic models of transcription, and recent applications to the problem of bacterial chemotaxis. As part of the description of these case studies, we explore a number of different uses of the famed Monod–Wyman–Changeux (MWC) model as a generic tool for providing a mathematical characterization of two-state systems. These case studies should provide a template for tailoring equilibrium ideas to other problems of biological interest. PMID:21333788
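
    For the two-state (MWC) description mentioned above, the probability of the active state as a function of ligand concentration can be written down in closed form; a minimal Python sketch with generic, purely illustrative parameter values:

      import numpy as np

      def mwc_p_active(c, n, K_A, K_I, L):
          """MWC probability of the active state for a molecule with n binding sites.
          c: ligand concentration; K_A, K_I: dissociation constants in the active and
          inactive states; L: equilibrium constant [inactive]/[active] at zero ligand."""
          active = (1 + c / K_A) ** n
          inactive = L * (1 + c / K_I) ** n
          return active / (active + inactive)

      c = np.logspace(-2, 2, 200)          # ligand concentration (arbitrary units)
      p = mwc_p_active(c, n=2, K_A=1.0, K_I=10.0, L=100.0)
      # p rises sigmoidally from ~1/(1+L) at low ligand towards 1 as ligand favours the active state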

  15. Modeling ecosystem processes with variable freshwater inflow to the Caloosahatchee River Estuary, southwest Florida. II. Nutrient loading, submarine light, and seagrasses

    NASA Astrophysics Data System (ADS)

    Buzzelli, Christopher; Doering, Peter; Wan, Yongshan; Sun, Detong

    2014-12-01

    Short- and long-term changes in estuarine biogeochemical and biological attributes are consequences of variations in both the magnitude and composition of freshwater inputs. A common conceptualization of estuaries depicts nutrient loading from coastal watersheds as the stressor that promotes algal biomass, decreases submarine light penetration, and degrades seagrass habitats. Freshwater inflow depresses salinity while simultaneously introducing colored dissolved organic matter (color or CDOM) which greatly reduces estuarine light penetration. This is especially true for sub-tropical estuaries. This study applied a model of the Caloosahatchee River Estuary (CRE) in southwest Florida to explore the relationships between freshwater inflow, nutrient loading, submarine light, and seagrass survival. In two independent model series, the loading of dissolved inorganic nitrogen and phosphorus (DIN and DIP) was reduced by 10%, 20%, 30%, and 50% relative to the base model case from 2002 to 2009 (2922 days). While external nutrient loads were reduced by lowering inflow (Q0) in the first series (Q0 series), reductions were accomplished by decreasing the incoming concentrations of DIN and DIP in the second series (NP Series). The model also was used to explore the partitioning of submarine light extinction due to chlorophyll a, CDOM, and turbidity. Results suggested that attempting to control nutrient loading by decreasing freshwater inflow could have minor effects on water column concentrations but greatly influence submarine light and seagrass biomass. This is because of the relative importance of Q0 to salinity and submarine light. In general, light penetration and seagrass biomass decreased with increased inflow and CDOM. Increased chlorophyll a did account for more submarine light extinction in the lower estuary. The model output was used to help identify desirable levels of inflow, nutrient loading, water quality, salinity, and submarine light for seagrass in the lower CRE. These findings provide information essential to the development of a resource-based approach to improve the management of both freshwater inflow and estuarine biotic resources.

  16. Statistical variability comparison in MODIS and AERONET derived aerosol optical depth over Indo-Gangetic Plains using time series modeling.

    PubMed

    Soni, Kirti; Parmar, Kulwinder Singh; Kapoor, Sangeeta; Kumar, Nishant

    2016-05-15

    Many studies of Aerosol Optical Depth (AOD) have been carried out using data derived from the Moderate Resolution Imaging Spectroradiometer (MODIS), but the accuracy of satellite data in comparison with ground data derived from the AErosol RObotic NETwork (AERONET) has always been questionable. To address this, a comparative study of comprehensive ground-based and satellite data for the period 2001-2012 is modeled. A time series model is used for the accurate prediction of AOD, and statistical variability is compared to assess the performance of the model in both cases. Root mean square error (RMSE), mean absolute percentage error (MAPE), stationary R-squared, R-squared, maximum absolute percentage error (MaxAPE), normalized Bayesian information criterion (NBIC) and Ljung-Box methods are used to check the applicability and validity of the developed ARIMA models, revealing significant precision in the model performance. It was found that it is possible to predict AOD by statistical modeling using time series obtained from past MODIS and AERONET data as input. Moreover, the results show that MODIS data can be derived from AERONET data by adding 0.251627 ± 0.133589, and vice versa by subtracting. From the forecasts of AOD for the next four years (2013-2017) obtained with the developed ARIMA model, it is concluded that the forecasted ground AOD has an increasing trend. Copyright © 2016 Elsevier B.V. All rights reserved.
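
    A minimal Python sketch of the kind of ARIMA-based prediction described above, applied to a synthetic monthly AOD series (the (1,1,1) order and all data here are illustrative assumptions; the study's model orders follow from its own fitting and diagnostic procedure).

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      # Synthetic monthly AOD series standing in for a 2001-2012 record
      rng = np.random.default_rng(2)
      idx = pd.date_range("2001-01", "2012-12", freq="MS")
      aod = pd.Series(0.4 + 0.15 * np.sin(2 * np.pi * np.arange(len(idx)) / 12)
                      + np.cumsum(rng.normal(0, 0.005, len(idx))), index=idx)

      fit = ARIMA(aod, order=(1, 1, 1)).fit()
      forecast = fit.forecast(steps=48)                       # extend four years beyond the record
      mape = np.mean(np.abs((aod - fit.fittedvalues) / aod)) * 100
      print(f"in-sample MAPE = {mape:.1f}%")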

  17. Complication with Intraosseous Access: Scandinavian Users’ Experience

    PubMed Central

    Hallas, Peter; Brabrand, Mikkel; Folkestad, Lars

    2013-01-01

    Introduction: Intraosseous access (IO) is indicated if vascular access cannot be quickly established during resuscitation. Complication rates are estimated to be low, based on small patient series, model or cadaver studies, and case reports. However, user experience with IO use in real-life emergency situations might differ from the results in the controlled environment of model studies and small patient series. We performed a survey of IO use in real-life emergency situations to assess users’ experiences of complications. Methods: An online questionnaire was sent to Scandinavian emergency physicians, anesthesiologists and pediatricians. Results: 1,802 clinical cases of IO use were reported by 386 responders. Commonly reported complications with establishing IO access were patient discomfort/pain (7.1%), difficulties with penetration of the periosteum with the IO needle (10.3%), difficulties with aspiration of bone marrow (12.3%), and bent/broken needle (4.0%). When using an established IO access, the reported complications were difficulties with injecting fluid and drugs after IO insertion (7.4%), slow infusion (despite use of a pressure bag) (8.8%), displacement after insertion (8.5%), and extravasation (3.7%). Compartment syndrome and osteomyelitis occurred in 0.6% and 0.4% of cases, respectively. Conclusion: In users’ recollection of real-life IO use, perceived complications were more frequent than usually reported from model studies. The perceived difficulties with using IO could affect the willingness of medical staff to use IO. Therefore, user experience should be addressed both in education on how to use IO and in the research and development of IO devices. PMID:24106537

  18. Daily air quality index forecasting with hybrid models: A case in China.

    PubMed

    Zhu, Suling; Lian, Xiuyuan; Liu, Haixia; Hu, Jianming; Wang, Yuanyuan; Che, Jinxing

    2017-12-01

    Air quality is closely related to quality of life. Air pollution forecasting plays a vital role in air pollution warning and control. However, it is difficult to attain accurate forecasts for air pollution indexes because the original data are non-stationary and chaotic. The existing forecasting methods, such as multiple linear models, autoregressive integrated moving average (ARIMA) and support vector regression (SVR), cannot fully capture the information from series of pollution indexes. Therefore, new effective techniques need to be proposed to forecast air pollution indexes. The main purpose of this research is to develop effective forecasting models for regional air quality indexes (AQI) to address the problems above and enhance forecasting accuracy. Therefore, two hybrid models (EMD-SVR-Hybrid and EMD-IMFs-Hybrid) are proposed to forecast AQI data. The main steps of the EMD-SVR-Hybrid model are as follows: the data preprocessing technique EMD (empirical mode decomposition) is utilized to sift the original AQI data to obtain one group of smoother IMFs (intrinsic mode functions) and a noise series, where the IMFs contain the important information (level, fluctuations and others) from the original AQI series. LS-SVR is applied to forecast the sum of the IMFs, and then S-ARIMA (seasonal ARIMA) is employed to forecast the residual sequence of LS-SVR. In addition, EMD-IMFs-Hybrid first forecasts the IMFs separately via statistical models and sums the forecasting results of the IMFs as EMD-IMFs. Then, S-ARIMA is employed to forecast the residuals of EMD-IMFs. To validate the proposed hybrid models, AQI data from June 2014 to August 2015 collected from Xingtai in China are utilized as a test case for the empirical study. In terms of some of the forecasting assessment measures, the AQI forecasting results for Xingtai show that the two proposed hybrid models are superior to ARIMA, SVR, GRNN, EMD-GRNN, Wavelet-GRNN and Wavelet-SVR. Therefore, the proposed hybrid models can be used as effective and simple tools for air pollution forecasting and warning as well as for management. Copyright © 2017 Elsevier Ltd. All rights reserved.
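
    A rough Python sketch of the decompose-then-model idea behind the hybrids (decompose with EMD, model the smoother part with SVR, correct with the residual). It assumes the third-party PyEMD package (PyPI name EMD-signal) and scikit-learn, and it simplifies the paper's LS-SVR/S-ARIMA combination; everything shown here is illustrative.

      import numpy as np
      from PyEMD import EMD                     # assumed third-party package
      from sklearn.svm import SVR

      # Synthetic AQI-like series standing in for the Xingtai data
      rng = np.random.default_rng(3)
      t = np.arange(450)
      aqi = 120 + 40 * np.sin(2 * np.pi * t / 90) + rng.normal(0, 15, t.size)

      imfs = EMD().emd(aqi)                     # intrinsic mode functions (+ residue)
      smooth = imfs[1:].sum(axis=0)             # drop the noisiest (first) IMF
      noise = aqi - smooth

      # SVR on lagged values of the smoothed series, one-step-ahead forecasting
      lags = 7
      X = np.column_stack([smooth[i:smooth.size - lags + i] for i in range(lags)])
      y = smooth[lags:]
      svr = SVR(kernel="rbf", C=10.0).fit(X[:-30], y[:-30])
      pred = svr.predict(X[-30:]) + noise.mean()               # crude residual correction
      rmse = np.sqrt(np.mean((pred - aqi[lags:][-30:]) ** 2))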

  19. A general regression framework for a secondary outcome in case-control studies.

    PubMed

    Tchetgen Tchetgen, Eric J

    2014-01-01

    Modern case-control studies typically involve the collection of data on a large number of outcomes, often at considerable logistical and monetary expense. These data are of potentially great value to subsequent researchers, who, although not necessarily concerned with the disease that defined the case series in the original study, may want to use the available information for a regression analysis involving a secondary outcome. Because cases and controls are selected with unequal probability, regression analysis involving a secondary outcome generally must acknowledge the sampling design. In this paper, the author presents a new framework for the analysis of secondary outcomes in case-control studies. The approach is based on a careful re-parameterization of the conditional model for the secondary outcome given the case-control outcome and regression covariates, in terms of (a) the population regression of interest of the secondary outcome given covariates and (b) the population regression of the case-control outcome on covariates. The error distribution for the secondary outcome given covariates and case-control status is otherwise unrestricted. For a continuous outcome, the approach sometimes reduces to extending model (a) by including a residual of (b) as a covariate. However, the framework is general in the sense that models (a) and (b) can take any functional form, and the methodology allows for an identity, log or logit link function for model (a).
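
    A hedged Python sketch of the continuous-outcome special case noted at the end: fit the population regression (b) of case-control status on covariates, then extend the regression (a) of the secondary outcome by including the residual of (b) as a covariate. The data, variable names and the simplified setting (simulated cohort data rather than an actual case-control sample) are illustrative assumptions, not the author's estimator in full generality.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      n = 2000
      x = rng.normal(size=n)                                         # covariate of interest
      case = rng.binomial(1, 1 / (1 + np.exp(-(-2 + 0.8 * x))))      # case-control outcome
      y = 1.0 + 0.5 * x + 0.7 * case + rng.normal(size=n)            # secondary outcome

      # (b) population regression of case-control status on covariates
      Xb = sm.add_constant(x)
      fit_b = sm.GLM(case, Xb, family=sm.families.Binomial()).fit()
      resid_b = case - fit_b.predict(Xb)

      # (a) regression of the secondary outcome, extended with the residual of (b)
      Xa = sm.add_constant(np.column_stack([x, resid_b]))
      fit_a = sm.OLS(y, Xa).fit()
      print(fit_a.params)        # the coefficient on x targets the population regression of interest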

  20. Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.

    PubMed

    Bühler, Jonas; von Lieres, Eric; Huber, Gregor J

    2018-01-01

    Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples per hour. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples per hour. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase of sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule depending on the required statistical reliability of data acquired by future experiments.

  1. Successful use of right unilateral ECT for catatonia: a case series.

    PubMed

    Cristancho, Pilar; Jewkes, Delaina; Mon, Thetsu; Conway, Charles

    2014-03-01

    Catatonia is a neuropsychiatric syndrome involving motor signs in association with disorders of mood, behavior, or thought. Bitemporal electrode placement electroconvulsive therapy (ECT) is a proven effective treatment for catatonia, and this mode of ECT delivery is the preferred method of treatment in this condition. Studies in major depressive disorder have demonstrated that suprathreshold, nondominant (right) hemisphere, unilateral electrode placement ECT has fewer adverse effects, especially cognitive adverse effects, than bitemporal ECT. This case series describes the use of right unilateral (RUL) ECT in 5 patients with catatonia. Before ECT, all 5 patients in this series initially failed therapy with benzodiazepines and psychotropic medications. Each catatonic patient received a course of 8 to 12 RUL ECT treatments administered every other day. After ECT, 4 of the 5 patients had a full recovery from catatonia. One patient achieved only a partial response to RUL ECT, and no additional benefit was obtained with bitemporal ECT. All patients in this case series tolerated RUL ECT without major adverse effects. This case series illustrates successful use of RUL ECT in patients with catatonia and adds to the early literature demonstrating its effective use in treating this complex condition.

  2. Applications and Comparisons of Four Time Series Models in Epidemiological Surveillance Data

    PubMed Central

    Young, Alistair A.; Li, Xiaosong

    2014-01-01

    Public health surveillance systems provide valuable data for reliable prediction of future epidemic events. This paper describes a study that used nine types of infectious disease data collected through a national public health surveillance system in mainland China to evaluate and compare the performances of four time series methods, namely, two decomposition methods (regression and exponential smoothing), autoregressive integrated moving average (ARIMA) and support vector machine (SVM). The data obtained from 2005 to 2011 and in 2012 were used as modeling and forecasting samples, respectively. The performances were evaluated based on three metrics: mean absolute error (MAE), mean absolute percentage error (MAPE), and mean square error (MSE). The accuracy of the statistical models in forecasting future epidemic disease proved their effectiveness in epidemiological surveillance. Although the comparisons found that no single method is completely superior to the others, the present study indeed highlighted that the SVM outperforms the ARIMA model and the decomposition methods in most cases. PMID:24505382

  3. Validating the WRF-Chem model for wind energy applications using High Resolution Doppler Lidar data from a Utah 2012 field campaign

    NASA Astrophysics Data System (ADS)

    Mitchell, M. J.; Pichugina, Y. L.; Banta, R. M.

    2015-12-01

    Models are important tools for assessing the potential of wind energy sites, but the accuracy of these projections has not been properly validated. In this study, High Resolution Doppler Lidar (HRDL) data obtained with high temporal and spatial resolution at heights of modern turbine rotors were compared to output from the WRF-Chem model in order to help improve the performance of the model in producing accurate wind forecasts for the industry. HRDL data were collected from January 23 to March 1, 2012 during the Uintah Basin Winter Ozone Study (UBWOS) field campaign. The model validation method was based on qualitative comparison of wind field images, together with time-series and statistical analysis of the observed and modeled wind speed and direction, both for case studies and for the whole experiment. To compare the WRF-Chem model output to the HRDL observations, the model heights and forecast times were interpolated to match the observed times and heights. Then, time-height cross-sections of the HRDL and WRF-Chem wind speeds and directions were plotted to select case studies. Cross-sections of the differences between the observed and forecasted wind speeds and directions were also plotted to visually analyze the model performance in different wind flow conditions. The statistical analysis includes the calculation of vertical profiles and time series of bias, correlation coefficient, root mean squared error, and coefficient of determination between the two datasets. The results from this analysis reveal where and when the model typically struggles in forecasting winds at heights of modern turbine rotors, so that in the future the model can be improved for the industry.

  4. COLLABORATE©: A Universal Competency-Based Paradigm for Professional Case Management, Part II: Competency Clarification.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    2013-01-01

    The purpose of this second article of a 3-article series is to clarify the competencies for a new paradigm of case management built upon a value-driven foundation that is applicable to all health care sectors where case management is practiced. In moving forward, the one fact that rings true is that there will be constant change in our industry. As the health care terrain shifts and new influences continually surface, there will be consequences for case management practice. These impacts require nimble clinical professionals in possession of recognized and firmly established competencies. They must be agile to frame (and reframe) their professional practice to facilitate the best possible outcomes for their patients. Case managers can choose to be Gumby™ or Pokey™. This is exactly the time to define a competency-based case management model, highlighting one sufficiently fluid to fit into any setting of care. The practice of case management transcends the vast array of representative professional disciplines and educational levels. A majority of current models are driven by business priorities rather than the competencies critical to successful practice and quality patient outcomes. This results in a fragmented professional case management identity. Although there is an inherent value in what each discipline brings to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served). This model fosters case management's expanding career advancement opportunities, including a reflective clinical ladder.

  5. Simulation-based power calculation for designing interrupted time series analyses of health policy interventions.

    PubMed

    Zhang, Fang; Wagner, Anita K; Ross-Degnan, Dennis

    2011-11-01

    Interrupted time series is a strong quasi-experimental research design to evaluate the impacts of health policy interventions. Using simulation methods, we estimated the power requirements for interrupted time series studies under various scenarios. Simulations were conducted to estimate the power of segmented autoregressive (AR) error models when autocorrelation ranged from -0.9 to 0.9 and effect size was 0.5, 1.0, and 2.0, investigating balanced and unbalanced numbers of time periods before and after an intervention. Simple scenarios of autoregressive conditional heteroskedasticity (ARCH) models were also explored. For AR models, power increased when sample size or effect size increased, and tended to decrease when autocorrelation increased. Compared with a balanced number of study periods before and after an intervention, designs with unbalanced numbers of periods had less power, although that was not the case for ARCH models. The power to detect effect size 1.0 appeared to be reasonable for many practical applications with a moderate or large number of time points in the study equally divided around the intervention. Investigators should be cautious when the expected effect size is small or the number of time points is small. We recommend conducting various simulations before investigation. Copyright © 2011 Elsevier Inc. All rights reserved.
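
    A Python sketch of the simulation idea for one simple scenario: segmented regression with a level change at the interruption and AR(1) errors, with power estimated as the rejection rate over repeated simulated series (the effect parameterisation and the use of plain OLS t-tests are simplifying assumptions relative to the paper's segmented AR models).

      import numpy as np
      import statsmodels.api as sm

      def its_power(n_pre, n_post, effect, rho, sigma=1.0, n_sim=2000, alpha=0.05, seed=0):
          """Power to detect a level change of `effect` (in units of sigma) after the
          interruption, in a segmented regression with AR(1) errors of autocorrelation rho."""
          rng = np.random.default_rng(seed)
          n = n_pre + n_post
          t = np.arange(n)
          step = (t >= n_pre).astype(float)
          X = sm.add_constant(np.column_stack([t, step, step * (t - n_pre)]))
          hits = 0
          for _ in range(n_sim):
              e = np.zeros(n)
              e[0] = rng.normal(0, sigma)
              for i in range(1, n):                                  # AR(1) noise
                  e[i] = rho * e[i - 1] + rng.normal(0, sigma * np.sqrt(1 - rho ** 2))
              y = effect * sigma * step + e                          # flat baseline + level change
              fit = sm.OLS(y, X).fit()
              hits += fit.pvalues[2] < alpha                         # test the level-change term
          return hits / n_sim

      print(its_power(n_pre=24, n_post=24, effect=1.0, rho=0.3))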

  6. Progress Report on the Airborne Metadata and Time Series Working Groups of the 2016 ESDSWG

    NASA Astrophysics Data System (ADS)

    Evans, K. D.; Northup, E. A.; Chen, G.; Conover, H.; Ames, D. P.; Teng, W. L.; Olding, S. W.; Krotkov, N. A.

    2016-12-01

    NASA's Earth Science Data Systems Working Groups (ESDSWG) were created over 10 years ago. The role of the ESDSWG is to make recommendations relevant to NASA's Earth science data systems from users' experiences. Each group works independently, focusing on a unique topic. Participation in ESDSWG groups comes from a variety of NASA-funded science and technology projects, including MEaSUREs and ROSS. Participants include NASA information technology experts, affiliated contractor staff and other interested community members from academia and industry. Recommendations from the ESDSWG groups will enhance NASA's efforts to develop long term data products. The Airborne Metadata Working Group is evaluating the suitability of the current Common Metadata Repository (CMR) and Unified Metadata Model (UMM) for airborne data sets and developing new recommendations as necessary. The overarching goal is to enhance the usability, interoperability, discovery and distribution of airborne observational data sets. This will be done by assessing the suitability (gaps) of the current UMM model for airborne data using lessons learned from current and past field campaigns, listening to user needs and community recommendations, and assessing the suitability of ISO metadata and other standards to fill the gaps. The Time Series Working Group (TSWG) is a continuation of the 2015 Time Series/WaterML2 Working Group. The TSWG is using a case study-driven approach to test the new Open Geospatial Consortium (OGC) TimeseriesML standard and to determine any deficiencies with respect to its ability to fully describe and encode NASA earth observation-derived time series data. To do this, the time series working group is engaging with the OGC TimeseriesML Standards Working Group (SWG) regarding unsatisfied needs and possible solutions. The effort will end with the drafting of an OGC Engineering Report based on the use cases and interactions with the OGC TimeseriesML SWG. Progress towards finalizing recommendations will be presented at the meeting.

  7. Programming a hillslope water movement model on the MPP

    NASA Technical Reports Server (NTRS)

    Devaney, J. E.; Irving, A. R.; Camillo, P. J.; Gurney, R. J.

    1987-01-01

    A physically based numerical model of heat and moisture flow within a hillslope was developed on a parallel-architecture computer, as a precursor to a model of a complete catchment. Moisture flow within a catchment includes evaporation, overland flow, flow in unsaturated soil, and flow in saturated soil. Because of the empirical evidence that moisture flow in unsaturated soil is mainly in the vertical direction, flow in the unsaturated zone can be modeled as a series of one dimensional columns. This initial version of the hillslope model includes evaporation and a single column of one dimensional unsaturated zone flow. This case has already been solved on an IBM 3081 computer and is now being applied to the massively parallel processor architecture so as to make the extension to the one dimensional case easier and to check the problems and benefits of using a parallel architecture machine.

  8. Did case-based payment influence surgical readmission rates in France? A retrospective study

    PubMed Central

    Vuagnat, Albert; Yilmaz, Engin; Roussot, Adrien; Rodwin, Victor; Gadreau, Maryse; Bernard, Alain; Creuzot-Garcher, Catherine; Quantin, Catherine

    2018-01-01

    Objectives To determine whether implementation of a case-based payment system changed all-cause readmission rates in the 30 days following discharge after surgery, we analysed all surgical procedures performed in all hospitals in France before (2002–2004), during (2005–2008) and after (2009–2012) its implementation. Setting Our study is based on claims data for all surgical procedures performed in all acute care hospitals with >300 surgical admissions per year (740 hospitals) in France over 11 years (2002–2012; n=51.6 million admissions). Interventions We analysed all-cause 30-day readmission rates after surgery using a logistic regression model and an interrupted time series analysis. Results The overall 30-day all-cause readmission rate following discharge after surgery increased from 8.8% to 10.0% (P<0.001) for the public sector and from 5.9% to 8.6% (P<0.001) for the private sector. Interrupted time series models revealed a significant linear increase in readmission rates over the study period in all types of hospitals. However, the implementation of case-based payment was only associated with a significant increase in rehospitalisation rates for private hospitals (P<0.001). Conclusion In France, the increase in the readmission rate appears to be relatively steady in both the private and public sector but appears not to have been affected by the introduction of a case-based payment system after accounting for changes in care practices in the public sector. PMID:29391376

  9. What's in a Name? The Incorrect Use of Case Series as a Study Design Label in Studies Involving Dogs and Cats.

    PubMed

    Sargeant, J M; O'Connor, A M; Cullen, J N; Makielski, K M; Jones-Bitton, A

    2017-07-01

    Study design labels are used to identify relevant literature to address specific clinical and research questions and to aid in evaluating the evidentiary value of research. Evidence from the human healthcare literature indicates that the label "case series" may be used inconsistently and inappropriately. Our primary objective was to determine the proportion of studies in the canine and feline veterinary literature labeled as case series that actually corresponded to descriptive cohort studies, population-based cohort studies, or other study designs. Our secondary objective was to identify the proportion of case series in which potentially inappropriate inferential statements were made. Descriptive evaluation of published literature. One-hundred published studies (from 19 journals) labeled as case series. Studies were identified by a structured literature search, with random selection of 100 studies from the relevant citations. Two reviewers independently characterized each study, with disagreements resolved by consensus. Of the 100 studies, 16 were case series. The remaining studies were descriptive cohort studies (35), population-based cohort studies (36), or other observational or experimental study designs (13). Almost half (48.8%) of the case series or descriptive cohort studies, with no control group and no formal statistical analysis, included inferential statements about the efficacy of treatment or statistical significance of potential risk factors. Authors, peer-reviewers, and editors should carefully consider the design elements of a study to accurately identify and label the study design. Doing so will facilitate an understanding of the evidentiary value of the results. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.

  10. The application of MINIQUASI to thermal program boundary and initial value problems

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The feasibility of applying the solution techniques of Miniquasi to the set of equations which govern a thermoregulatory model is investigated. For solving nonlinear equations and/or boundary conditions, a Taylor Series expansion is required for linearization of both equations and boundary conditions. The solutions are iterative and in each iteration, a problem like the linear case is solved. It is shown that Miniquasi cannot be applied to the thermoregulatory model as originally planned.

  11. Testing the Effectiveness of Cognitive Analytic Therapy for Hypersexuality Disorder: An Intensive Time-Series Evaluation.

    PubMed

    Kellett, Stephen; Simmonds-Buckley, Mel; Totterdell, Peter

    2017-08-18

    The evidence base for treatment of hypersexuality disorder (HD) has few studies with appropriate methodological rigor. This study therefore conducted a single case experiment of cognitive analytic therapy (CAT) for HD using an A/B design with extended follow-up. Cruising, pornography usage, masturbation frequency and associated cognitions and emotions were measured daily in a 231-day time series. Following a three-week assessment baseline (A: 21 days), treatment was delivered via outpatient sessions (B: 147 days), with the follow-up period lasting 63 days. Results show that cruising and pornography usage extinguished. The total sexual outlet score no longer met caseness, and the primary nomothetic hypersexuality outcome measure met recovery criteria. Reduced pornography consumption was mediated by reduced obsessionality and greater interpersonal connectivity. The utility of the CAT model for intimacy problems shows promise. Directions for future HD outcome research are also provided.

  12. 75 FR 75870 - Airworthiness Directives; Airbus Model A300 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ... to the aeroplane has two load paths, a Primary Load Path (PLP) and a Secondary Load Path (SLP), which is only engaged in case of PLP failure. Following the design intent, engagement of the SLP leads to... representative flights have demonstrated that, when the SLP is engaged, it does not systematically jam the THSA...

  13. 75 FR 52652 - Airworthiness Directives; Airbus Model A300 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-27

    ... to the aeroplane has two load paths, a Primary Load Path (PLP) and a Secondary Load Path (SLP), which is only engaged in case of PLP failure. Following the design intent, engagement of the SLP leads to... representative flights have demonstrated that, when the SLP is engaged, it does not systematically jam the THSA...

  14. A Behavior-Based Circuit Model of How Outcome Expectations Organize Learned Behavior in Larval "Drosophila"

    ERIC Educational Resources Information Center

    Schleyer, Michael; Saumweber, Timo; Nahrendorf, Wiebke; Fischer, Benjamin; von Alpen, Desiree; Pauls, Dennis; Thum, Andreas; Gerber, Bertram

    2011-01-01

    Drosophila larvae combine a numerically simple brain, a correspondingly moderate behavioral complexity, and the availability of a rich toolbox for transgenic manipulation. This makes them attractive as a study case when trying to achieve a circuit-level understanding of behavior organization. From a series of behavioral experiments, we suggest a…

  15. Outrage Management in Cases of Sexual Harassment as Revealed in Judicial Decisions

    ERIC Educational Resources Information Center

    McDonald, Paula; Graham, Tina; Martin, Brian

    2010-01-01

    Sexual harassment can be conceptualized as a series of interactions between harassers and targets that either inhibit or increase outrage by third parties. The outrage management model predicts the kinds of actions likely to be used by perpetrators to minimize outrage, predicts the consequences of failing to use these tactics--namely backfire, and…

  16. Noncognitive Factors in an Elementary School-Wide Model of Arts Integration

    ERIC Educational Resources Information Center

    Simpson Steele, Jamie

    2016-01-01

    Pomaika'i Elementary School has answered a call to improve education by providing content instruction through the arts. How does school wide arts integration in an elementary setting support students as they transition to middle school? This bounded case study examines the experiences of eight families through a series of interviews with students,…

  17. Energy in an Interdependent World: A Global Development Studies Case Study.

    ERIC Educational Resources Information Center

    Collier, Anne B.

    Part of the Global Development Studies Institute series of model curricula, the teacher guide presents strategies for teaching about energy as a global issue. The unit, intended for students in grades 11-14, is designed for one semester. The overall objective is to promote awareness of and responsibility toward the global community through an…

  18. 75 FR 17884 - Airworthiness Directives; Bombardier, Inc. Model CL-600-2B19 (Regional Jet Series 100 & 440...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-08

    ... product. The MCAI describes the unsafe condition as: Following five reported cases of balance washer screw...-conformity that increased their susceptibility to brittle fracture. Failure of a balance washer screw can result in loss of the related balance washer, with consequent turbine imbalance. Such imbalance could...

  19. Universities and Innovation in a Factor-Driven Economy: The Egyptian Case

    ERIC Educational Resources Information Center

    El Hadidi, Hala; Kirby, David A.

    2015-01-01

    The paper explores the role of universities in innovation in the modern knowledge economy, discusses the Triple Helix model and the entrepreneurial university, and then examines the application of these concepts in Egypt. The study, which specifically addresses the roles of universities in the innovation process in Egypt, is based on a series of…

  20. Developmental Growth in Students' Concept of Energy: Analysis of Selected Items from the TIMSS Database

    ERIC Educational Resources Information Center

    Liu, Xiufeng; McKeough, Anne

    2005-01-01

    The aim of this study was to develop a model of students' energy concept development. Applying Case's (1985, 1992) structural theory of cognitive development, we hypothesized that students' concept of energy undergoes a series of transitions, corresponding to systematic increases in working memory capacity. The US national sample from the Third…

  1. Organizing for Student Success: The University College Model. The First Year Experience Monograph Series No. 53

    ERIC Educational Resources Information Center

    Evenbeck, Scott E.; Jackson, Barbara; Smith, Maggy; Ward, Dorothy

    2010-01-01

    Organizing for Student Success draws on data from more than 50 institutions to provide insight into how university colleges are organized, the initiatives they house, and the practices in place to ensure their effectiveness. Twenty case studies from 15 different campuses offer an in-depth understanding of institutional practice. Ultimately,…

  2. Extending (Q)SARs to incorporate proprietary knowledge for regulatory purposes: A case study using aromatic amine mutagenicity.

    PubMed

    Ahlberg, Ernst; Amberg, Alexander; Beilke, Lisa D; Bower, David; Cross, Kevin P; Custer, Laura; Ford, Kevin A; Van Gompel, Jacky; Harvey, James; Honma, Masamitsu; Jolly, Robert; Joossens, Elisabeth; Kemper, Raymond A; Kenyon, Michelle; Kruhlak, Naomi; Kuhnke, Lara; Leavitt, Penny; Naven, Russell; Neilan, Claire; Quigley, Donald P; Shuey, Dana; Spirkl, Hans-Peter; Stavitskaya, Lidiya; Teasdale, Andrew; White, Angela; Wichard, Joerg; Zwickl, Craig; Myatt, Glenn J

    2016-06-01

    Statistical-based and expert rule-based models built using public domain mutagenicity knowledge and data are routinely used for computational (Q)SAR assessments of pharmaceutical impurities in line with the approach recommended in the ICH M7 guideline. Knowledge from proprietary corporate mutagenicity databases could be used to increase the predictive performance for selected chemical classes as well as expand the applicability domain of these (Q)SAR models. This paper outlines a mechanism for sharing knowledge without the release of proprietary data. Primary aromatic amine mutagenicity was selected as a case study because this chemical class is often encountered in pharmaceutical impurity analysis and mutagenicity of aromatic amines is currently difficult to predict. As part of this analysis, a series of aromatic amine substructures were defined and the number of mutagenic and non-mutagenic examples for each chemical substructure calculated across a series of public and proprietary mutagenicity databases. This information was pooled across all sources to identify structural classes that activate or deactivate aromatic amine mutagenicity. This structure activity knowledge, in combination with newly released primary aromatic amine data, was incorporated into Leadscope's expert rule-based and statistical-based (Q)SAR models where increased predictive performance was demonstrated. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. To what extent does variability of historical rainfall series influence extreme event statistics of sewer system surcharge and overflows?

    PubMed

    Schaarup-Jensen, K; Rasmussen, M R; Thorndahl, S

    2009-01-01

    In urban drainage modelling, long-term extreme statistics have become an important basis for decision-making, e.g. in connection with renovation projects. Therefore it is of great importance to minimize the uncertainties with regard to long-term prediction of maximum water levels and combined sewer overflow (CSO) in drainage systems. These uncertainties originate from large uncertainties regarding rainfall inputs, parameters, and assessment of return periods. This paper investigates how the choice of rainfall time series influences the extreme event statistics of maximum water levels in manholes and CSO volumes. Often, long-term rainfall series from a local rain gauge are unavailable. In the present case study, however, long and local rain series are available. Two rain gauges have recorded events for approximately 9 years at two locations within the catchment. Besides these two gauges, another seven gauges are located within a maximum distance of 20 kilometers from the catchment. All gauges are included in the Danish national rain gauge system, which was launched in 1976. The paper describes to what extent the extreme event statistics based on these nine series diverge from each other and how this diversity can be handled, e.g. by introducing an "averaging procedure" based on the variability within the set of statistics. All simulations are performed by means of the MOUSE LTS model.

  4. Implications on 1 + 1 D Tsunami Runup Modeling due to Time Features of the Earthquake Source

    NASA Astrophysics Data System (ADS)

    Fuentes, M.; Riquelme, S.; Ruiz, J.; Campos, J.

    2018-02-01

    The time characteristics of the seismic source are usually neglected in tsunami modeling, due to the difference in the time scales of the two processes. Nonetheless, only a few analytical studies have attempted to explain separately the roles of the rise time and the rupture velocity. In this work, we extend an analytical 1 + 1 D solution for the shoreline motion time series from the static case to the kinematic case, by including both rise time and rupture velocity. Our results show that the static case corresponds to a limit case of null rise time and infinite rupture velocity. Both parameters contribute to shifting the arrival time, but maximum runup may be affected by very slow ruptures and long rise times. Parametric analysis reveals that runup is strictly decreasing with the rise time, while it is highly amplified in a certain range of slow rupture velocities. For even lower rupture velocities the tsunami excitation vanishes, while for larger ones the solution quickly approaches the instantaneous case.

  5. Implications on 1 + 1 D Tsunami Runup Modeling due to Time Features of the Earthquake Source

    NASA Astrophysics Data System (ADS)

    Fuentes, M.; Riquelme, S.; Ruiz, J.; Campos, J.

    2018-04-01

    The time characteristics of the seismic source are usually neglected in tsunami modeling, due to the difference in the time scales of the two processes. Nonetheless, only a few analytical studies have attempted to explain separately the roles of the rise time and the rupture velocity. In this work, we extend an analytical 1 + 1 D solution for the shoreline motion time series from the static case to the kinematic case, by including both rise time and rupture velocity. Our results show that the static case corresponds to a limit case of null rise time and infinite rupture velocity. Both parameters contribute to shifting the arrival time, but maximum runup may be affected by very slow ruptures and long rise times. Parametric analysis reveals that runup is strictly decreasing with the rise time, while it is highly amplified in a certain range of slow rupture velocities. For even lower rupture velocities the tsunami excitation vanishes, while for larger ones the solution quickly approaches the instantaneous case.

  6. The potential impact of scatterometry on oceanography - A wave forecasting case

    NASA Technical Reports Server (NTRS)

    Cane, M. A.; Cardone, V. J.

    1981-01-01

    A series of observing system simulation experiments have been performed in order to assess the potential impact of marine surface wind data on numerical weather prediction. In addition to conventional data, the experiments simulated the time-continuous assimilation of remotely sensed marine surface wind or temperature sounding data. The wind data were fabricated directly for model grid points intercepted by a Seasat-1 scatterometer swath and were assimilated into the lowest active level (945 mb) of the model using a localized successive correction method. It is shown that Seasat wind data can greatly improve numerical weather forecasts due to better definition of specific features. The case of the QE II storm is examined.

  7. Composite operators in the hopping parameter expansion in the free quark model

    NASA Astrophysics Data System (ADS)

    Kunszt, Z.

    1983-11-01

    I have calculated hopping parameter series of meson and baryon propagators up to O(K^32) in the Wilson formulation of the free quark model. The positions of the branch point singularities have been found with the help of Padé approximants. The values of the positions of the singularities in K agreed with the exact values to within 1-2% in the case of mesons and 4-5% in the case of baryons. It is argued that in QCD at the cross-over region the systematic errors of the method must be even smaller. Part of this work was done while the author was visiting the Rutherford and Appleton Laboratories, UK.

  8. Local Difference Measures between Complex Networks for Dynamical System Model Evaluation

    PubMed Central

    Lange, Stefan; Donges, Jonathan F.; Volkholz, Jan; Kurths, Jürgen

    2015-01-01

    A faithful modeling of real-world dynamical systems necessitates model evaluation. A recent promising methodological approach to this problem has been based on complex networks, which in turn have proven useful for the characterization of dynamical systems. In this context, we introduce three local network difference measures and demonstrate their capabilities in the field of climate modeling, where these measures facilitate a spatially explicit model evaluation. Building on a recent study by Feldhoff et al. [1] we comparatively analyze statistical and dynamical regional climate simulations of the South American monsoon system. Three types of climate networks representing different aspects of rainfall dynamics are constructed from the modeled precipitation space-time series. Specifically, we define simple graphs based on positive as well as negative rank correlations between rainfall anomaly time series at different locations, and such based on spatial synchronizations of extreme rain events. An evaluation against respective networks built from daily satellite data provided by the Tropical Rainfall Measuring Mission 3B42 V7 reveals far greater differences in model performance between network types for a fixed but arbitrary climate model than between climate models for a fixed but arbitrary network type. We identify two sources of uncertainty in this respect. Firstly, climate variability limits fidelity, particularly in the case of the extreme event network; and secondly, larger geographical link lengths render link misplacements more likely, most notably in the case of the anticorrelation network; both contributions are quantified using suitable ensembles of surrogate networks. Our model evaluation approach is applicable to any multidimensional dynamical system and especially our simple graph difference measures are highly versatile as the graphs to be compared may be constructed in whatever way required. Generalizations to directed as well as edge- and node-weighted graphs are discussed. PMID:25856374

  9. Local difference measures between complex networks for dynamical system model evaluation.

    PubMed

    Lange, Stefan; Donges, Jonathan F; Volkholz, Jan; Kurths, Jürgen

    2015-01-01

    A faithful modeling of real-world dynamical systems necessitates model evaluation. A recent promising methodological approach to this problem has been based on complex networks, which in turn have proven useful for the characterization of dynamical systems. In this context, we introduce three local network difference measures and demonstrate their capabilities in the field of climate modeling, where these measures facilitate a spatially explicit model evaluation.Building on a recent study by Feldhoff et al. [8] we comparatively analyze statistical and dynamical regional climate simulations of the South American monsoon system [corrected]. types of climate networks representing different aspects of rainfall dynamics are constructed from the modeled precipitation space-time series. Specifically, we define simple graphs based on positive as well as negative rank correlations between rainfall anomaly time series at different locations, and such based on spatial synchronizations of extreme rain events. An evaluation against respective networks built from daily satellite data provided by the Tropical Rainfall Measuring Mission 3B42 V7 reveals far greater differences in model performance between network types for a fixed but arbitrary climate model than between climate models for a fixed but arbitrary network type. We identify two sources of uncertainty in this respect. Firstly, climate variability limits fidelity, particularly in the case of the extreme event network; and secondly, larger geographical link lengths render link misplacements more likely, most notably in the case of the anticorrelation network; both contributions are quantified using suitable ensembles of surrogate networks. Our model evaluation approach is applicable to any multidimensional dynamical system and especially our simple graph difference measures are highly versatile as the graphs to be compared may be constructed in whatever way required. Generalizations to directed as well as edge- and node-weighted graphs are discussed.

  10. Assessing the economic impacts of drought from the perspective of profit loss rate: a case study of the sugar industry in China

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Lin, L.; Chen, H.

    2015-07-01

    Natural disasters have enormous impacts on human society, especially on the development of the economy. To support decision-making in mitigation of and adaptation to natural disasters, assessment of economic impacts is fundamental and of great significance. Based on a review of the literature on economic impact evaluation, this paper proposes a new assessment model of the economic impacts of droughts by using the sugar industry in China as a case study, which focuses on the generation and transfer of economic impacts along a simple value chain involving only sugarcane growers and a sugar-producing company. A perspective of profit loss rate is applied to scale economic impact. By using "with and without" analysis, profit loss is defined as the difference in profits between disaster-hit and disaster-free scenarios. To calculate profit, analysis of a time series of sugar price is applied. With the support of a linear regression model, an endogenous trend in sugar price is identified and the time series of sugar price "without" disaster is obtained, using an autoregressive error model to separate the impact of disasters from the internal trend in sugar price. Unlike the settings in other assessment models, representative sugar prices, which represent the value level under disaster-free and disaster-hit conditions, are integrated from a long time series that covers the whole period of drought. As a result, it is found that in a rigid farming contract, sugarcane growers suffer far more than the sugar company when impacted by severe drought, which may prompt reflection among various economic actors on economic equality related to the occurrence of natural disasters. Further, sensitivity analysis of the model reveals that the sugarcane purchase price has a significant influence on the profit loss rate, which implies that setting a proper sugarcane purchase price would be an effective way of realizing economic equality in future contract farming practice.
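
    A minimal sketch of the "linear trend plus autoregressive errors" idea used above to recover a disaster-free price path, assuming statsmodels and entirely synthetic monthly prices; the spike location and magnitude are made up for illustration.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 120                                       # synthetic monthly sugar-price series
      t = np.arange(n)
      price = 3000.0 + 8.0 * t + 0.3 * np.cumsum(rng.normal(0, 20, n))
      price[80:92] += 400.0                         # stylised drought-driven price deviation

      X = sm.add_constant(t)
      res = sm.GLSAR(price, X, rho=1).iterative_fit(maxiter=10)    # linear trend, AR(1) errors

      price_without = res.params[0] + res.params[1] * t            # "without disaster" trend
      impact = price[80:92] - price_without[80:92]                  # disaster-attributable gap
      print("trend parameters:", res.params)
      print("mean price deviation during the event:", round(impact.mean(), 1))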

  11. Numerical limitations in application of vector autoregressive modeling and Granger causality to analysis of EEG time series

    NASA Astrophysics Data System (ADS)

    Kammerdiner, Alla; Xanthopoulos, Petros; Pardalos, Panos M.

    2007-11-01

    In this chapter a potential problem with the application of Granger causality based on simple vector autoregressive (VAR) modeling to EEG data is investigated. Although some initial studies tested whether the data support the stationarity assumption of VAR, the stability of the estimated model has rarely (if ever) been verified. In fact, in cases when the stability condition is violated, the process may exhibit random-walk-like behavior or even be explosive. The problem is illustrated by an example.
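
    A short sketch of the stability check the chapter recommends, assuming statsmodels and a synthetic two-channel series standing in for EEG data: after fitting the VAR, the roots of the characteristic polynomial (equivalently, the companion-matrix eigenvalues) indicate whether the estimated process is stable.

      import numpy as np
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(2)
      e = rng.normal(size=(1000, 2))               # innovations for two synthetic "EEG" channels
      data = np.zeros((1000, 2))
      for k in range(1, 1000):                     # a stable VAR(1) process by construction
          data[k] = 0.5 * data[k - 1] + e[k]

      res = VAR(data).fit(maxlags=5, ic='aic')
      print("selected lag order:", res.k_ar)
      # statsmodels reports the roots of the characteristic polynomial:
      # all moduli must exceed 1 for the estimated process to be stable.
      print("root moduli:", np.round(np.abs(res.roots), 3))
      print("stable:", res.is_stable())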

  12. 76 FR 71341 - BASINS and WEPP Climate Assessment Tools: Case Study Guide to Potential Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-17

    ... report presents a series of short case studies designed to illustrate the capabilities of these tools for... change impacts on water. This report presents a series of short case studies using the BASINS and WEPP climate assessment tools. The case studies are designed to illustrate the capabilities of these tools for...

  13. Temporal association between the influenza virus and respiratory syncytial virus (RSV): RSV as a predictor of seasonal influenza.

    PubMed

    Míguez, A; Iftimi, A; Montes, F

    2016-09-01

    Epidemiologists agree that there is a prevailing seasonality in the presentation of epidemic waves of respiratory syncytial virus (RSV) infections and influenza. The aim of this study is to quantify the relationship between RSV activity and influenza virus activity, in order to use the RSV seasonal curve as a predictor of the evolution of an influenza epidemic wave. Two statistical tools, logistic regression and time series, are used for predicting the evolution of influenza. Both the logistic models and the influenza time series consider RSV information from previous weeks. Data consist of influenza and confirmed RSV cases reported in Comunitat Valenciana (Spain) during the period from week 40 (2010) to week 8 (2014). Binomial logistic regression models used to predict the two states of the influenza wave, basal or peak, result in a rate of correct classification higher than 92% with the validation set. When a finer three-state categorization is established, basal, increasing peak and decreasing peak, the multinomial logistic model performs well in 88% of cases of the validation set. The ARMAX model fits the influenza waves well and shows good performance for short-term forecasts up to 3 weeks. The seasonal evolution of the influenza virus can be predicted a minimum of 4 weeks in advance using logistic models based on RSV. It would be necessary to study more inter-pandemic seasons to establish a stronger relationship between the epidemic waves of both viruses.
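
    As an illustrative sketch (not the study's data or model selection), the following fits a binomial logistic regression that predicts a basal/peak influenza state from RSV counts in the previous 1-4 weeks; scikit-learn is assumed and the weekly series are synthetic.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      weeks = 200
      t = np.arange(weeks)
      # RSV wave early each season; the influenza peak follows a few weeks later.
      rsv = rng.poisson(5 + 40 * np.exp(-0.5 * ((t % 52 - 2) / 3.0) ** 2))
      flu_peak = ((t % 52 > 4) & (t % 52 < 12)).astype(int)

      lags = np.column_stack([np.roll(rsv, k) for k in (1, 2, 3, 4)])   # RSV 1-4 weeks back
      X, y = lags[4:], flu_peak[4:]                                      # drop wrapped-around rows
      split = 150
      clf = LogisticRegression(max_iter=1000).fit(X[:split], y[:split])
      print("validation accuracy:", round(clf.score(X[split:], y[split:]), 3))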

  14. Contact problem for an elastic reinforcement bonded to an elastic plate

    NASA Technical Reports Server (NTRS)

    Erdogan, F.; Civelek, M. B.

    1973-01-01

    The stiffening layer is treated as an elastic membrane and the base plate is assumed to be an elastic continuum. The bonding between the two materials is assumed to be either direct adhesion or bonding through a thin adhesive layer, which is treated as a shear spring. The solution for the simple case in which both the stiffener and the base plate are treated as membranes is also given. The contact stress is obtained for a series of numerical examples. In the direct adhesion case the contact stress becomes infinite at the stiffener ends, with a typical square root singularity for the continuum model and a delta-function behavior for the membrane model. In the case of bonding through an adhesive layer the contact stress becomes finite and continuous along the entire contact area.

  15. Testing for nonlinearity in time series: The method of surrogate data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Theiler, J.; Galdrikian, B.; Longtin, A.

    1991-01-01

    We describe a statistical approach for identifying nonlinearity in time series; in particular, we want to avoid claims of chaos when simpler models (such as linearly correlated noise) can explain the data. The method requires a careful statement of the null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. We present algorithms for generating surrogate data under various null hypotheses, and we show the results of numerical experiments on artificial data using correlation dimension, Lyapunov exponent, and forecasting error as discriminating statistics. Finally, we consider a number of experimental time series -- including sunspots, electroencephalogram (EEG) signals, and fluid convection -- and evaluate the statistical significance of the evidence for nonlinear structure in each case. 56 refs., 8 figs.
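
    A compact sketch of the surrogate-data recipe described above, assuming only numpy: phase-randomized surrogates preserve the linear (spectral) structure of the series, and the original's discriminating statistic is ranked against the surrogate ensemble. The time-reversal-asymmetry statistic used here is a common simple choice, not necessarily the one in the report.

      import numpy as np

      rng = np.random.default_rng(4)

      def ft_surrogate(x, rng):
          """Phase-randomized surrogate: same power spectrum, linear-Gaussian structure."""
          xf = np.fft.rfft(x)
          phases = rng.uniform(0, 2 * np.pi, size=xf.size)
          phases[0] = 0.0                    # keep the mean
          phases[-1] = 0.0                   # keep the Nyquist component real (even length)
          return np.fft.irfft(np.abs(xf) * np.exp(1j * phases), n=x.size)

      def stat(x):
          """A crude discriminating statistic: time-reversal asymmetry of increments."""
          return np.mean((x[1:] - x[:-1]) ** 3)

      x = np.empty(2048)                     # original series: noisy logistic map (nonlinear)
      x[0] = 0.4
      for k in range(1, x.size):
          x[k] = 3.9 * x[k - 1] * (1 - x[k - 1])
      x += rng.normal(0, 0.01, x.size)

      s0 = stat(x)
      surr = np.array([stat(ft_surrogate(x, rng)) for _ in range(99)])
      # Rank-based significance: how extreme is the original among the surrogates?
      p = (np.sum(np.abs(surr) >= np.abs(s0)) + 1) / (len(surr) + 1)
      print(f"original statistic {s0:.4g}, surrogate-based p-value {p:.3f}")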

  16. Modeling time-series count data: the unique challenges facing political communication studies.

    PubMed

    Fogarty, Brian J; Monogan, James E

    2014-05-01

    This paper demonstrates the importance of proper model specification when analyzing time-series count data in political communication studies. It is common for scholars of media and politics to investigate counts of coverage of an issue as it evolves over time. Many scholars rightly consider the issues of time dependence and dynamic causality to be the most important when crafting a model. However, to ignore the count features of the outcome variable overlooks an important feature of the data. This is particularly the case when modeling data with a low number of counts. In this paper, we argue that the Poisson autoregressive model (Brandt and Williams, 2001) accurately meets the needs of many media studies. We replicate the analyses of Flemming et al. (1997), Peake and Eshbaugh-Soha (2008), and Ura (2009) and demonstrate that models missing some of the assumptions of the Poisson autoregressive model often yield invalid inferences. We also demonstrate that the effect of any of these models can be illustrated dynamically with estimates of uncertainty through a simulation procedure. The paper concludes with implications of these findings for the practical researcher. Copyright © 2013 Elsevier Inc. All rights reserved.
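
    The Poisson autoregressive model of Brandt and Williams is not part of the standard Python libraries; as a hedged stand-in, the sketch below fits a Poisson GLM with the lagged count as a predictor and contrasts it with plain OLS, to illustrate why the count nature of low-count media series matters; statsmodels is assumed and the series is synthetic.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n = 300
      counts = np.zeros(n, dtype=int)
      for k in range(1, n):                           # low-count series with persistence
          counts[k] = rng.poisson(0.5 + 0.6 * counts[k - 1])

      X = sm.add_constant(counts[:-1])                # previous period's count as predictor
      y = counts[1:]
      poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
      ols_fit = sm.OLS(y, X).fit()                    # ignores the count nature of the outcome
      print("Poisson GLM coefficients:", np.round(poisson_fit.params, 3))
      print("OLS coefficients:        ", np.round(ols_fit.params, 3))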

  17. Evaluating mallard adaptive management models with time series

    USGS Publications Warehouse

    Conn, P.B.; Kendall, W.L.

    2004-01-01

    Wildlife practitioners concerned with midcontinent mallard (Anas platyrhynchos) management in the United States have instituted a system of adaptive harvest management (AHM) as an objective format for setting harvest regulations. Under the AHM paradigm, predictions from a set of models that reflect key uncertainties about processes underlying population dynamics are used in coordination with optimization software to determine an optimal set of harvest decisions. Managers use comparisons of the predictive abilities of these models to gauge the relative truth of different hypotheses about density-dependent recruitment and survival, with better-predicting models given more weight in the determination of harvest regulations. We tested the effectiveness of this strategy by examining convergence rates of 'predictor' models when the true model for population dynamics was known a priori. We generated time series for cases when the a priori model was 1 of the predictor models as well as for several cases when the a priori model was not in the model set. We further examined the addition of different levels of uncertainty into the variance structure of predictor models, reflecting different levels of confidence about estimated parameters. We showed that in certain situations, the model-selection process favors a predictor model that incorporates the hypotheses of additive harvest mortality and weakly density-dependent recruitment, even when that model is not used to generate the data. Higher levels of predictor model variance led to decreased rates of convergence to the model that generated the data, but model weight trajectories were in general more stable. We suggest that predictive models should incorporate all sources of uncertainty about estimated parameters, that the variance structure should be similar for all predictor models, and that models with different functional forms for population dynamics should be considered for inclusion in predictor model sets. All of these suggestions should help lower the probability of erroneous learning in mallard AHM and adaptive management in general.
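
    A stylized sketch (not the actual AHM protocol or mallard models) of how predictor-model weights can be updated each year from an observed population size via Bayes' rule, assuming scipy and Gaussian predictive distributions with made-up parameters.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(6)

      # Two stylised predictor models mapping this year's population (millions) to a
      # predictive mean for next year; both are given the same predictive spread.
      models = {
          "weak density dependence":   lambda n: 0.95 * n + 0.8,
          "strong density dependence": lambda n: 0.80 * n + 1.9,
      }
      sigma = 0.6
      weights = {name: 0.5 for name in models}        # equal prior weights

      pop = 8.0
      for year in range(15):
          observed = models["weak density dependence"](pop) + rng.normal(0, sigma)
          # Bayes update: the new weight of model i is proportional to its old weight
          # times the likelihood of the observation under its predictive distribution.
          like = {name: norm.pdf(observed, loc=f(pop), scale=sigma) for name, f in models.items()}
          z = sum(weights[name] * like[name] for name in models)
          weights = {name: weights[name] * like[name] / z for name in models}
          pop = observed
      print({name: round(w, 3) for name, w in weights.items()})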

  18. [Changes in the diagnosis and therapeutic management of hepatic trauma. A retrospective study comparing 2 series of cases in different periods (1977-1984 vs. 2001-2008)].

    PubMed

    Sánchez-Bueno, Francisco; Fernández-Carrión, Jezabel; Torres Salmerón, Gloria; García Pérez, Rocío; Ramírez Romero, Pablo; Fuster Quiñonero, Matilde; Parrilla, Pascual

    2011-01-01

    We present a series of 146 cases of hepatic trauma (HT) treated in our hospital over a period of 8 years (2001-2008), and compare it with a previous series of 92 cases (1977-1984). The mean age in the current series was 28.6 years and the majority were male. The traumas were mainly closed rather than penetrating, with the most frequent cause being road traffic accidents. The American Association for the Surgery of Trauma (AAST) classification was used to evaluate the grade of the hepatic injury. Associated abdominal and/or extra-abdominal injuries were seen in 79.5% of the patients, with the most frequent being chest trauma, compared to bone fractures in the previous series. The most common associated intra-abdominal injury was the spleen in both series. The most used diagnostic technique in the current series was abdominal CT. Simple peritoneal puncture and lavage (PLP) was the most used examination in the previous series. Non-surgical treatment (NST) was given in 98 cases and surgery was indicated in the remaining 48. In the previous series, 97.8% of patients were operated on. In the current series, of the 15 patients with severe liver injuries, 5 right hepatectomies, 2 segmentectomies and 6 packings were performed, with the remaining two dying during surgery due to hepatic avulsion. The overall mortality was 3.4%, being 1% in the NST group and 8.3% in the surgical patients. In the previous series, the overall mortality was 29.3%. The key factor in opting for NST is confirmed haemodynamic stability, leaving surgical treatment for haemodynamically unstable patients. Copyright © 2011 AEC. Published by Elsevier Espana. All rights reserved.

  19. Climate change and precipitation evolution in Ifran region (Middle Atlas of Morocco).

    NASA Astrophysics Data System (ADS)

    Reddad, H.; Bakhat, M.; Damnati, B.

    2012-04-01

    Climate variability and extreme climatic events pose significant risks to human beings and generate terrestrial ecosystem dysfunctions. These effects are usually amplified by an inappropriate use of the existing natural resources. To face the new context of climate change, a rational and efficient use of these resources - particularly water resources - on a global and regional scale must be implemented. Annual precipitation provides an overall amount of water, but the assessment and management of this water is complicated by the spatio-temporal variation of disturbances (aridity, rainfall intensity, length of dry season...). Therefore, understanding rainfall behavior would at least help to plan interventions to manage this resource and protect the ecosystems that depend on it. Time-series analysis has become one of the major tools in hydrology. It is used for building mathematical models to detect trends and shifts in hydrologic records and to forecast hydrologic events. In this paper we present a case study of the Ifran region, which is situated in the Middle Atlas Mountains in Morocco. This study deals with modeling and forecasting rainfall time series using monthly rainfall data for the period 1970-2005. To determine the seasonal properties of this series we first used the Box-Jenkins methodology to build an ARIMA model, and we then extended the analysis with the Hylleberg-Engle-Granger-Yoo (HEGY) tests. The results of the time series modeling showed the presence of a significant deterministic seasonal pattern and no seasonal unit roots. This means that the series is stationary at all frequencies. The model can be used to predict rainfall in the Ifran region and nearby sites; this prediction is of interest insofar as any information about these variables can contribute to research on coping with climate change. The model does not eliminate precipitation variability, but it supports adaptation to it.
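
    The HEGY seasonal unit-root test is not available in the common Python statistics libraries; as a rough proxy for the "deterministic seasonal pattern" finding, the sketch below regresses synthetic monthly rainfall on month dummies and reports the joint F-test; statsmodels and pandas are assumed and all numbers are invented.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(7)
      months = np.tile(np.arange(1, 13), 36)                        # 36 years of monthly data
      seasonal_mean = 60 + 40 * np.cos(2 * np.pi * (months - 12) / 12)
      rain = rng.gamma(shape=4, scale=seasonal_mean / 4)            # synthetic monthly rainfall

      df = pd.DataFrame({"rain": rain, "month": months})
      fit = smf.ols("rain ~ C(month)", data=df).fit()
      # Joint F-test of the month dummies: a small p-value indicates a
      # deterministic seasonal pattern in the monthly means.
      print("joint F-test p-value:", fit.f_pvalue, " R^2:", round(fit.rsquared, 3))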

  20. A case series investigating acceptance and commitment therapy as a treatment for previously treated, unremitted patients with anorexia nervosa.

    PubMed

    Berman, M I; Boutelle, K N; Crow, S J

    2009-11-01

    The aim of the present study was to evaluate the effectiveness of Acceptance and Commitment Therapy (ACT) for treatment of anorexia nervosa (AN) using a case series methodology among participants with a history of prior treatment for AN. Three participants enrolled; all completed the study. All participants had a history of 1-20 years of intensive eating disorder treatment prior to enrollment. Participants were seen for 17-19 twice-weekly sessions of manualized ACT. Symptoms were assessed at baseline, post-treatment and 1-year follow-up. All participants experienced clinically significant improvement on at least some measures; no participants worsened or lost weight even at 1-year follow-up. Simulation modelling analysis (SMA) revealed for some participants an increase in weight gain and a decrease in eating disorder symptoms during the treatment phase as compared to a baseline assessment phase. These data, although preliminary, suggest that ACT could be a promising treatment for subthreshold or clinical cases of AN, even with chronic participants or those with medical complications.

  1. Machine learning methods as a tool to analyse incomplete or irregularly sampled radon time series data.

    PubMed

    Janik, M; Bossew, P; Kurihara, O

    2018-07-15

    Machine learning is a class of statistical techniques which has proven to be a powerful tool for modelling the behaviour of complex systems, in which response quantities depend on assumed controls or predictors in a complicated way. In this paper, as our first purpose, we propose the application of machine learning to reconstruct incomplete or irregularly sampled time series of indoor radon (²²²Rn). The physical assumption underlying the modelling is that the Rn concentration in the air is controlled by environmental variables such as air temperature and pressure. The algorithms "learn" from complete sections of the multivariate series, derive a dependence model and apply it to sections where the controls are available, but not the response (Rn), and in this way complete the Rn series. Three machine learning techniques are applied in this study, namely random forest, its extension called the gradient boosting machine, and deep learning. For comparison, we apply classical multiple regression in a generalized linear model version. Performance of the models is evaluated through different metrics. The performance of the gradient boosting machine is found to be superior to that of the other techniques. By applying machine learning, we show, as our second purpose, that missing data or periods of Rn series data can be reasonably reconstructed and resampled on a regular grid, if data on appropriate physical controls are available. The techniques also identify to which degree the assumed controls contribute to imputing missing Rn values. Our third purpose, though no less important from the viewpoint of physics, is to identify to which degree physical (in this case environmental) variables are relevant as Rn predictors, or in other words, which predictors explain most of the temporal variability of Rn. We show that the variables which contribute most to the Rn series reconstruction are temperature, relative humidity and day of the year. The first two are physical predictors, while "day of the year" is a statistical proxy or surrogate for missing or unknown predictors. Copyright © 2018 Elsevier B.V. All rights reserved.
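
    A minimal sketch of the imputation idea, assuming scikit-learn and synthetic hourly data in place of the real Rn and environmental records: a gradient boosting regressor is trained on the complete sections and used to fill the gaps, and its feature importances hint at which controls matter.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(8)
      n = 4000                                            # synthetic hourly records
      t = np.arange(n)
      temp = 15 + 10 * np.sin(2 * np.pi * t / (24 * 365)) + 5 * np.sin(2 * np.pi * t / 24)
      humid = 60 - 1.5 * (temp - 15) + rng.normal(0, 5, n)
      press = 1013 + rng.normal(0, 4, n)
      radon = 40 - 1.2 * temp + 0.3 * humid + rng.normal(0, 3, n)   # Rn driven by the controls

      X = np.column_stack([temp, humid, press, t % (24 * 365)])     # includes a day-of-year proxy
      missing = rng.random(n) < 0.3                                  # 30% of Rn values are missing

      gbm = GradientBoostingRegressor().fit(X[~missing], radon[~missing])
      radon_filled = radon.copy()
      radon_filled[missing] = gbm.predict(X[missing])                # impute the gaps
      print("feature importances (temp, humid, press, day-of-year):",
            np.round(gbm.feature_importances_, 3))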

  2. Learning from Multiple Cases: A New Paradigm for Investigating the Effects of Clinical Experience on Knowledge Restructuring and Knowledge Acquisition.

    ERIC Educational Resources Information Center

    Boshuizen, Henny P. A.; Bongaerts, Maureen Machiels; van de Wiel, Margaretha W. J.; Schmidt, Henk G.

    The effects of experience with a series of similar cases on the knowledge restructuring and learning from text were studied in a longitudinal design. Two groups of fourth-year medical students were confronted with a series of cases, part of them having the same underlying disease. The cases were interspersed with fillers, and each set of cases had…

  3. Universal RCFT correlators from the holomorphic bootstrap

    NASA Astrophysics Data System (ADS)

    Mukhi, Sunil; Muralidhara, Girish

    2018-02-01

    We elaborate and extend the method of Wronskian differential equations for conformal blocks to compute four-point correlation functions on the plane for classes of primary fields in rational (and possibly more general) conformal field theories. This approach leads to universal differential equations for families of CFT's and provides a very simple re-derivation of the BPZ results for the degenerate fields ϕ 1,2 and ϕ 2,1 in the c < 1 minimal models. We apply this technique to compute correlators for the WZW models corresponding to the Deligne-Cvitanović exceptional series of Lie algebras. The application turns out to be subtle in certain cases where there are multiple decoupled primaries. The power of this approach is demonstrated by applying it to compute four-point functions for the Baby Monster CFT, which does not belong to any minimal series.

  4. FINITE ELEMENT MODEL FOR TIDES AND CURRENTS WITH FIELD APPLICATIONS.

    USGS Publications Warehouse

    Walters, Roy A.

    1988-01-01

    A finite element model, based upon the shallow water equations, is used to calculate tidal amplitudes and currents for two field-scale test problems. Because tides are characterized by line spectra, the governing equations are subjected to harmonic decomposition. Thus the solution variables are the real and imaginary parts of the amplitude of sea level and velocity rather than a time series of these variables. The time series is recovered through synthesis. This scheme, coupled with a modified form of the governing equations, leads to high computational efficiency and freedom from excessive numerical noise. Two test-cases are presented. The first is a solution for eleven tidal constituents in the English Channel and southern North Sea, and three constituents are discussed. The second is an analysis of the frequency response and tidal harmonics for south San Francisco Bay.
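
    A small sketch of the synthesis step mentioned above, assuming numpy: a sea-level time series is rebuilt from constituent amplitudes and phases. The constituent periods are the standard M2/S2/K1 values, while the amplitudes and phases are made up.

      import numpy as np

      constituents = {      # period (hours) is standard; amplitude (m) and phase (rad) are made up
          "M2": (12.4206, 1.80, 0.3),
          "S2": (12.0000, 0.60, 1.1),
          "K1": (23.9345, 0.25, 2.0),
      }

      t = np.arange(0, 30 * 24, 0.5)                    # 30 days of half-hourly samples, in hours
      eta = np.zeros_like(t)
      for period, amp, phase in constituents.values():  # synthesis: sum the harmonic constituents
          eta += amp * np.cos(2 * np.pi / period * t - phase)
      print("synthesized sea-level range (m):", round(eta.min(), 2), "to", round(eta.max(), 2))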

  5. Conceptual models in exploration geochemistry-The Basin and Range Province of the Western United States and Northern Mexico

    USGS Publications Warehouse

    Lovering, T.G.; McCarthy, J.H.

    1978-01-01

    This summary of geochemical exploration in the Basin and Range Province is another in the series of reviews of geochemical-exploration applications covering a large region; this series began in 1975 with a summary for the Canadian Cordillera and Canadian Shield, and was followed in 1976 by a similar summary for Scandinavia (Norden). Rather than adhering strictly to the type of conceptual models applied in those papers, we have made use of generalized landscape geochemistry models related to the nature of concealment of ore deposits. This study is part of a continuing effort to examine and evaluate geochemical-exploration practices in different areas of the world. Twenty case histories of the application of geochemical exploration in both district and regional settings illustrate recent developments in techniques and approaches. Along with other published reports these case histories, exemplifying generalized models of concealed deposits, provide data used to evaluate geochemical-exploration programs and specific sample media. Because blind deposits are increasingly sought in the Basin and Range Province, the use of new sample media or anomaly-enhancement techniques is a necessity. Analysis of vapors or gases emanating from blind deposits is a promising new technique. Certain fractions of stream sediments show anomalies that are weak or not detected in conventional minus 80-mesh fractions. Multi-element analysis of mineralized bedrock may show zoning patterns that indicate depth or direction of ore. Examples of the application of these and other, more conventional methods are indicated in the case histories. The final section of this paper contains a brief evaluation of the applications of all types of sample media to geochemical exploration in the arid environment of the Basin and Range Province. © 1978.

  6. [Non-Hodgkin's malignant lymphoma with cervicofacial expression. Modulation of the radiotherapy-chemotherapy combination according to the cytological class].

    PubMed

    Bolla, M; Sotto, J J; Sotto, M F; Junien Lavillauroy, C; Bryon, J P; Vrousos, C; Holland, D

    1984-01-01

    An analysis was conducted in March 1983, after a mean follow-up of 40 months, of cases of cervicofacial stages I and II non-Hodgkin's malignant lymphoma in 3 children and 41 adults (mean age: 51 years, range: 6-90 years) treated between 1969 and March 1981. According to the Working Formulation, malignancy was low in 4 cases, intermediate in 24 and high in 13; 3 cases could not be classified retrospectively. Cytologic classification showed 13 of class 1 of low malignancy, 7 of class 2 of high malignancy with leukemic potential, and 16 of class 3 of high malignancy with a course leading to tumor formation. The cavum was involved in 10 cases, the tonsils in 9, the parotids in 1, the uvula in 1, isolated cervical adenopathies in 14, multiple unilateral adenopathies in 3 and bilateral cervical adenopathies in 5 cases. Therapy varied according to the series: in the first series (1969-1975) the 23 cases were treated by radiotherapy alone (40-55 Gy). In the second series (1976-1981) of 21 cases, chemotherapy was given as a function of the cytologic class: prophylactic chemotherapy for 6 months after radiation for classes 1 and 2; initial chemotherapy for 6 weeks, cerebral radiation and intrathecal methotrexate, and maintenance chemotherapy for 3 months for class 3. The failure rate for irradiated zones was identical in the 2 series (less than 10%). The adjusted 5-year survival rate was 60% for series 1 against 70% for series 2 (p = 0.9), and the adjusted remission rate was 43% against 64% (p = 0.8).

  7. CLUSTERnGO: a user-defined modelling platform for two-stage clustering of time-series data.

    PubMed

    Fidaner, Işık Barış; Cankorur-Cetinkaya, Ayca; Dikicioglu, Duygu; Kirdar, Betul; Cemgil, Ali Taylan; Oliver, Stephen G

    2016-02-01

    Simple bioinformatic tools are frequently used to analyse time-series datasets regardless of their ability to deal with transient phenomena, limiting the meaningful information that may be extracted from them. This situation requires the development and exploitation of tailor-made, easy-to-use and flexible tools designed specifically for the analysis of time-series datasets. We present a novel statistical application called CLUSTERnGO, which uses a model-based clustering algorithm that fulfils this need. This algorithm involves two components of operation. Component 1 constructs a Bayesian non-parametric model (Infinite Mixture of Piecewise Linear Sequences) and Component 2, which applies a novel clustering methodology (Two-Stage Clustering). The software can also assign biological meaning to the identified clusters using an appropriate ontology. It applies multiple hypothesis testing to report the significance of these enrichments. The algorithm has a four-phase pipeline. The application can be executed using either command-line tools or a user-friendly Graphical User Interface. The latter has been developed to address the needs of both specialist and non-specialist users. We use three diverse test cases to demonstrate the flexibility of the proposed strategy. In all cases, CLUSTERnGO not only outperformed existing algorithms in assigning unique GO term enrichments to the identified clusters, but also revealed novel insights regarding the biological systems examined, which were not uncovered in the original publications. The C++ and QT source codes, the GUI applications for Windows, OS X and Linux operating systems and user manual are freely available for download under the GNU GPL v3 license at http://www.cmpe.boun.edu.tr/content/CnG. sgo24@cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
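
    As a hedged illustration of the enrichment-testing step (not CLUSTERnGO itself), the sketch below scores hypothetical GO terms for one cluster with the hypergeometric test and applies Benjamini-Hochberg correction; scipy and statsmodels are assumed and all counts are invented.

      from scipy.stats import hypergeom
      from statsmodels.stats.multitest import multipletests

      # Hypothetical counts per term: (genes in the cluster with the term, cluster size,
      #                                genes with the term genome-wide, genome size)
      terms = {
          "GO:0006091": (12, 40, 150, 6000),
          "GO:0006412": (9, 40, 200, 6000),
          "GO:0008150": (30, 40, 4500, 6000),
      }

      pvals = []
      for k, n, K, N in terms.values():
          # P(X >= k) when drawing n genes from N, of which K carry the term.
          pvals.append(hypergeom.sf(k - 1, N, K, n))

      reject, padj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
      for (term, _), p, q, r in zip(terms.items(), pvals, padj, reject):
          print(term, f"p={p:.2e}", f"q={q:.2e}", "enriched" if r else "")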

  8. Time series analysis of InSAR data: Methods and trends

    NASA Astrophysics Data System (ADS)

    Osmanoğlu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cabral-Cano, Enrique

    2016-05-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
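
    A toy sketch of the unwrapping and phase-to-displacement conversion discussed above, assuming numpy, a C-band wavelength of 5.6 cm and a synthetic one-dimensional subsidence history; real InSAR processing unwraps in space and time and is far more involved.

      import numpy as np

      wavelength = 0.056                  # C-band radar wavelength in metres (assumed)
      t = np.arange(60)                   # 60 acquisitions (e.g. months)
      true_defo = -0.004 * t              # synthetic 4 mm/epoch subsidence

      # Interferometric phase wraps to (-pi, pi]; lambda/2 of motion gives a full 2*pi cycle.
      phase = np.angle(np.exp(1j * (-4 * np.pi / wavelength) * true_defo))

      unwrapped = np.unwrap(phase)        # temporal unwrapping along the acquisition axis
      defo = -wavelength / (4 * np.pi) * unwrapped
      rate = np.polyfit(t, defo, 1)[0]
      print(f"recovered first-order deformation rate: {rate * 1000:.2f} mm per epoch")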

  9. Time Series Analysis of Insar Data: Methods and Trends

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  10. Development of a Standalone Thermal Wellbore Simulator

    NASA Astrophysics Data System (ADS)

    Xiong, Wanqiang

    With the continuous development of various sophisticated wells in the petroleum industry, wellbore modeling and simulation have received increasing attention. Especially in unconventional oil and gas recovery processes, there is a growing demand for more accurate wellbore modeling. Despite notable advancements in wellbore modeling, none of the existing wellbore simulators has been as successful as reservoir simulators such as Eclipse and CMG's, and further research on issues such as accurate heat-loss modeling and multi-tubing wellbore modeling is necessary. A series of mathematical equations including main governing equations, auxiliary equations, PVT equations, thermodynamic equations, drift-flux model equations, and wellbore heat-loss calculation equations are collected and screened from publications. Based on these modeling equations, workflows for wellbore simulation and software development are proposed. Research is conducted on the key steps of developing a wellbore simulator: discretization, a grid system, a solution method, a linear equation solver, and computer language. A standalone thermal wellbore simulator is developed using standard C++. This wellbore simulator can simulate single-phase injection and production, two-phase steam injection and two-phase oil and water production. By implementing a multi-part scheme which divides a wellbore with a sophisticated configuration into several relatively simple simulation running units, this simulator can handle different complex wellbores: wellbores with multistage casings, horizontal wells, multilateral wells and double tubing. To improve the accuracy of heat-loss calculations to the surrounding formations, a semi-numerical method is proposed and a series of FLUENT simulations have been conducted in this study. This semi-numerical method involves extending the 2D formation heat transfer simulation to include a casing wall and cement and adopting new correlations regressed in this study. Meanwhile, a correlation for handling heat transfer in a double-tubing annulus is regressed. This work initiates research on heat transfer in a double-tubing wellbore system. A series of validation and test runs were performed for hot water injection, steam injection, real field data, a horizontal well, a double-tubing well, and comparison with the Ramey method. The program also performs well in matching real measured field data and in simulating horizontal wells and double-tubing wells.

  11. A Fault Location Algorithm for Two-End Series-Compensated Double-Circuit Transmission Lines Using the Distributed Parameter Line Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kang, Ning; Gombos, Gergely; Mousavi, Mirrasoul J.

    A new fault location algorithm for two-end series-compensated double-circuit transmission lines utilizing unsynchronized two-terminal current phasors and local voltage phasors is presented in this paper. The distributed parameter line model is adopted to take into account the shunt capacitance of the lines. The mutual coupling between the parallel lines in the zero-sequence network is also considered. The boundary conditions under different fault types are used to derive the fault location formulation. The developed algorithm directly uses the local voltage phasors on the line side of the series compensation (SC) and metal oxide varistor (MOV). However, when potential transformers are not installed on the line side of the SC and MOVs at the local terminal, these measurements can be calculated from the local terminal bus voltage and currents by estimating the voltages across the SC and MOVs. MATLAB SimPowerSystems is used to generate cases under diverse fault conditions to evaluate accuracy. The simulation results show that the proposed algorithm is qualified for practical implementation.

  12. Atmospheric gradients from very long baseline interferometry observations

    NASA Technical Reports Server (NTRS)

    Macmillan, D. S.

    1995-01-01

    Azimuthal asymmetries in the atmospheric refractive index can lead to errors in estimated vertical and horizontal station coordinates. Daily average gradient effects can be as large as 50 mm of delay at a 7 deg elevation. To model gradients, the constrained estimation of gradient parameters was added to the standard VLBI solution procedure. Here the analysis of two sets of data is summarized: the set of all geodetic VLBI experiments from 1990-1993 and a series of 12 state-of-the-art R&D experiments run on consecutive days in January 1994. In both cases, when the gradient parameters are estimated, the overall fit of the geodetic solution is improved at greater than the 99% confidence level. Repeatabilities of baseline lengths ranging up to 11,000 km are improved by 1 to 8 mm in a root-sum-square sense. This varies from about 20% to 40% of the total baseline length scatter without gradient modeling for the 1990-1993 series and 40% to 50% for the January series. Gradients estimated independently for each day as a piecewise linear function are mostly continuous from day to day within their formal uncertainties.

  13. Towards malaria risk prediction in Afghanistan using remote sensing.

    PubMed

    Adimi, Farida; Soebiyanto, Radina P; Safi, Najibullah; Kiang, Richard

    2010-05-13

    Malaria is a significant public health concern in Afghanistan. Currently, approximately 60% of the population, or nearly 14 million people, live in a malaria-endemic area. Afghanistan's diverse landscape and terrain contributes to the heterogeneous malaria prevalence across the country. Understanding the role of environmental variables in malaria transmission can further the efforts of the malaria control programme. Provincial malaria epidemiological data (2004-2007) collected by the health posts in 23 provinces were used in conjunction with space-borne observations from NASA satellites. Specifically, the environmental variables, including precipitation, temperature and vegetation index measured by the Tropical Rainfall Measuring Mission and the Moderate Resolution Imaging Spectroradiometer, were used. Regression techniques were employed to model malaria cases as a function of environmental predictors. The resulting model was used for predicting malaria risks in Afghanistan. The entire time series except the last 6 months was used for training, and the last 6 months of data were used for prediction and validation. Vegetation index, in general, is the strongest predictor, reflecting the fact that irrigation is the main factor that promotes malaria transmission in Afghanistan. Surface temperature is the second strongest predictor. Precipitation is not shown to be a significant predictor, as it may not directly lead to higher larval populations. Autoregressiveness of the malaria epidemiological data is apparent from the analysis. The malaria time series are modelled well, with a provincial average R² of 0.845. Although the R² for prediction has larger variation, the total predicted cases over the 6 months are only 8.9% higher than the actual cases. The provincial monthly malaria cases can be modelled and predicted using satellite-measured environmental parameters with reasonable accuracy. The Third Strategic Approach of the WHO EMRO Malaria Control and Elimination Plan aims to develop a cost-effective surveillance system that includes forecasting, early warning and detection. The predictive and early warning capabilities shown in this paper support this strategy.
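
    An illustrative sketch (not the paper's data or exact model) of regressing malaria cases on environmental predictors while allowing for autoregressiveness, with the last six months held out, assuming statsmodels and synthetic NDVI, land-surface-temperature and case series.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(9)
      months = 48
      ndvi = 0.3 + 0.2 * np.sin(2 * np.pi * np.arange(months) / 12)          # synthetic predictors
      lst = 25 + 8 * np.sin(2 * np.pi * (np.arange(months) - 2) / 12)
      cases = rng.poisson(np.exp(1.0 + 4.0 * ndvi + 0.05 * lst))             # synthetic case counts

      y = np.log1p(cases[1:])                                                # log cases, with a lag term
      X = sm.add_constant(np.column_stack([ndvi[1:], lst[1:], np.log1p(cases[:-1])]))
      split = y.size - 6                                                     # hold out the last 6 months
      fit = sm.OLS(y[:split], X[:split]).fit()
      pred = np.expm1(fit.predict(X[split:]))
      print("held-out predictions:", np.round(pred).astype(int))
      print("held-out actual cases:", np.expm1(y[split:]).astype(int))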

  14. Forecasting and prediction of scorpion sting cases in Biskra province, Algeria, using a seasonal autoregressive integrated moving average model.

    PubMed

    Selmane, Schehrazad; L'Hadj, Mohamed

    2016-01-01

    The aims of this study were to highlight some epidemiological aspects of scorpion envenomations, to analyse and interpret the available data for Biskra province, Algeria, and to develop a forecasting model for scorpion sting cases in Biskra province, which records the highest number of scorpion stings in Algeria. In addition to analysing the epidemiological profile of scorpion stings that occurred throughout the year 2013, we used the Box-Jenkins approach to fit a seasonal autoregressive integrated moving average (SARIMA) model to the monthly recorded scorpion sting cases in Biskra from 2000 to 2012. The epidemiological analysis revealed that scorpion stings were reported continuously throughout the year, with peaks in the summer months. The most affected age group was 15 to 49 years old, with a male predominance. The most prone human body areas were the upper and lower limbs. The majority of cases (95.9%) were classified as mild envenomations. The time series analysis showed that a (5,1,0)×(0,1,1)₁₂ SARIMA model offered the best fit to the scorpion sting surveillance data. This model was used to predict scorpion sting cases for the year 2013, and the fitted data showed considerable agreement with the actual data. SARIMA models are useful for monitoring scorpion sting cases, and provide an estimate of the variability to be expected in future scorpion sting cases. This knowledge is helpful in predicting whether an unusual situation is developing or not, and could therefore assist decision-makers in strengthening the province's prevention and control measures and in initiating rapid response measures.
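
    A short sketch of fitting the reported (5,1,0)×(0,1,1)₁₂ specification and forecasting one year ahead, assuming statsmodels and a synthetic monthly sting series with a summer peak in place of the Biskra surveillance data.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(10)
      months = np.arange(156)                                    # 13 years of monthly data
      # Synthetic stand-in for monthly sting counts: a summer peak plus Poisson noise.
      stings = rng.poisson(200 + 180 * np.sin(2 * np.pi * (months - 3) / 12).clip(0))

      train, test = stings[:144], stings[144:]                   # hold out the final year
      res = sm.tsa.SARIMAX(train, order=(5, 1, 0), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
      forecast = res.forecast(steps=12)
      print("AIC:", round(res.aic, 1))
      print("forecast:", np.round(forecast).astype(int))
      print("actual:  ", test)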

  15. Time series, correlation matrices and random matrix models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vinayak; Seligman, Thomas H.

    2014-01-08

    In this set of five lectures the authors have presented techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis or a minimum-information hypothesis for the description of a quantum system or subsystem. In the former case, various forms of correlation matrices of time series associated with the classical observables of some system are considered. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. By consequence, random correlation matrices have a random component, and corresponding ensembles are used. In the latter case, we use random matrices to describe a high-temperature environment or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.
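
    A minimal numerical illustration of the null-hypothesis role of random matrices, assuming numpy: the eigenvalues of the correlation matrix of pure-noise time series are compared with the Marchenko-Pastur bounds, outside of which genuine correlations would show up.

      import numpy as np

      rng = np.random.default_rng(11)
      N, T = 50, 400                                   # 50 observables, 400 time samples
      x = rng.normal(size=(N, T))                      # pure-noise time series (null hypothesis)

      eigvals = np.linalg.eigvalsh(np.corrcoef(x))     # spectrum of the N x N correlation matrix

      # Marchenko-Pastur bounds for a random correlation matrix with q = T / N.
      q = T / N
      lam_minus = (1 - np.sqrt(1 / q)) ** 2
      lam_plus = (1 + np.sqrt(1 / q)) ** 2
      print(f"empirical eigenvalue range: [{eigvals.min():.2f}, {eigvals.max():.2f}]")
      print(f"Marchenko-Pastur bounds:    [{lam_minus:.2f}, {lam_plus:.2f}]")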

  16. 77 FR 37797 - Airworthiness Directives; Airbus Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-25

    ... Airworthiness Directives; Airbus Airplanes AGENCY: Federal Aviation Administration (FAA), Department of... Airbus Model A330-200 series airplanes; Airbus Model A330-200 Freighter series airplanes; Airbus Model A330-300 series airplanes; Airbus Model A340-200 series airplanes; and Airbus Model A340-300 series...

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eriksen, Janus J., E-mail: janusje@chem.au.dk; Jørgensen, Poul; Matthews, Devin A.

    The accuracy at which total energies of open-shell atoms and organic radicals may be calculated is assessed for selected coupled cluster perturbative triples expansions, all of which augment the coupled cluster singles and doubles (CCSD) energy by a non-iterative correction for the effect of triple excitations. Namely, the second- through sixth-order models of the recently proposed CCSD(T–n) triples series [J. J. Eriksen et al., J. Chem. Phys. 140, 064108 (2014)] are compared to the acclaimed CCSD(T) model for both unrestricted as well as restricted open-shell Hartree-Fock (UHF/ROHF) reference determinants. By comparing UHF- and ROHF-based statistical results for a test set of 18 modest-sized open-shell species with comparable RHF-based results, no behavioral differences are observed for the higher-order models of the CCSD(T–n) series in their correlated descriptions of closed- and open-shell species. In particular, we find that the convergence rate throughout the series towards the coupled cluster singles, doubles, and triples (CCSDT) solution is identical for the two cases. For the CCSD(T) model, on the other hand, not only its numerical consistency, but also its established, yet fortuitous cancellation of errors breaks down in the transition from closed- to open-shell systems. The higher-order CCSD(T–n) models (orders n > 3) thus offer a consistent and significant improvement in accuracy relative to CCSDT over the CCSD(T) model, equally for RHF, UHF, and ROHF reference determinants, albeit at an increased computational cost.

  18. The tails of the satellite auroral footprints at Jupiter

    NASA Astrophysics Data System (ADS)

    Bonfond, B.; Saur, J.; Grodent, D.; Badman, S. V.; Bisikalo, D.; Shematovich, V.; Gérard, J.-C.; Radioti, A.

    2017-08-01

    The electromagnetic interaction between Io, Europa, and Ganymede and the rotating plasma that surrounds Jupiter has a signature in the aurora of the planet. This signature, called the satellite footprint, takes the form of a series of spots located slightly downstream of the feet of the field lines passing through the moon under consideration. In the case of Io, these spots are also followed by an extended tail in the downstream direction relative to the plasma flow encountering the moon. A few examples of a tail for the Europa footprint have also been reported in the northern hemisphere. Here we present a simplified Alfvénic model for footprint tails and simulations of vertical brightness profiles for various electron distributions, which favor such a model over quasi-static models. We also report here additional cases of Europa footprint tails, in both hemispheres, even though such detections are rare and difficult. Furthermore, we show that the Ganymede footprint can also be followed by a similar tail. Finally, we present a case of a 320° long Io footprint tail, while other cases in similar configurations do not display such a length.

  19. Issues in midterm analysis and forecasting 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-07-01

    Issues in Midterm Analysis and Forecasting 1998 (Issues) presents a series of nine papers covering topics in analysis and modeling that underlie the Annual Energy Outlook 1998 (AEO98), as well as other significant issues in midterm energy markets. AEO98, DOE/EIA-0383(98), published in December 1997, presents national forecasts of energy production, demand, imports, and prices through the year 2020 for five cases -- a reference case and four additional cases that assume higher and lower economic growth and higher and lower world oil prices than in the reference case. The forecasts were prepared by the Energy Information Administration (EIA), using EIA's National Energy Modeling System (NEMS). The papers included in Issues describe underlying analyses for the projections in AEO98 and the forthcoming Annual Energy Outlook 1999 and for other products of EIA's Office of Integrated Analysis and Forecasting. Their purpose is to provide public access to analytical work done in preparation for the midterm projections and other unpublished analyses. Specific topics were chosen for their relevance to current energy issues or to highlight modeling activities in NEMS. 59 figs., 44 tabs.

  20. Computer tomography of flows external to test models

    NASA Technical Reports Server (NTRS)

    Prikryl, I.; Vest, C. M.

    1982-01-01

    Computer tomographic techniques for the reconstruction of three-dimensional aerodynamic density fields from interferograms recorded from several different viewing directions were studied. Emphasis is on the case in which an opaque object, such as a test model in a wind tunnel, obscures significant regions of the interferograms (projection data). A method called the Iterative Convolution Method (ICM), existing methods in which the field is represented by series expansions, and analysis of real experimental data in the form of aerodynamic interferograms are discussed.

  1. 75 FR 38009 - Airworthiness Directives; The Boeing Company Model 737-200, -300, -400, -500, -600, -700, -800...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-01

    ... Airworthiness Directives; The Boeing Company Model 737-200, -300, -400, -500, -600, -700, -800, and -900 Series Airplanes; Model 747-400 Series Airplanes; Model 757-200 and 757-300 Series Airplanes; Model 767-200, 767..., -500, -600, -700, -800, and -900 series airplanes; Model 747-400 series airplanes; Model 757-200 and...

  2. A case-association cluster detection and visualisation tool with an application to Legionnaires’ disease

    PubMed Central

    Sansom, P; Copley, V R; Naik, F C; Leach, S; Hall, I M

    2013-01-01

    Statistical methods used in spatio-temporal surveillance of disease are able to identify abnormal clusters of cases but typically do not provide a measure of the degree of association between one case and another. Such a measure would facilitate the assignment of cases to common groups and be useful in outbreak investigations of diseases that potentially share the same source. This paper presents a model-based approach, which on the basis of available location data, provides a measure of the strength of association between cases in space and time and which is used to designate and visualise the most likely groupings of cases. The method was developed as a prospective surveillance tool to signal potential outbreaks, but it may also be used to explore groupings of cases in outbreak investigations. We demonstrate the method by using a historical case series of Legionnaires’ disease amongst residents of England and Wales. PMID:23483594
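
    As a loose sketch of the idea of a pairwise space-time association measure used to group cases (not the paper's model-based measure), the following scores hypothetical case records with an exponential space-time kernel and reports connected groups above a threshold; numpy and networkx are assumed and every parameter is invented.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(12)
      # Hypothetical case records (x km, y km, onset day): two seeded clusters plus sporadic cases.
      cases = np.vstack([
          rng.normal([10, 10, 5], [0.5, 0.5, 2], size=(8, 3)),
          rng.normal([40, 35, 60], [0.5, 0.5, 2], size=(6, 3)),
          rng.uniform([0, 0, 0], [50, 50, 90], size=(10, 3)),
      ])

      def association(a, b, d_scale=2.0, t_scale=7.0):
          """Illustrative space-time association score in (0, 1]; not the paper's model."""
          d = np.hypot(a[0] - b[0], a[1] - b[1])
          dt = abs(a[2] - b[2])
          return np.exp(-d / d_scale - dt / t_scale)

      g = nx.Graph()
      g.add_nodes_from(range(len(cases)))
      for i in range(len(cases)):
          for j in range(i + 1, len(cases)):
              if association(cases[i], cases[j]) > 0.2:          # grouping threshold
                  g.add_edge(i, j)
      groups = [sorted(c) for c in nx.connected_components(g) if len(c) > 1]
      print("likely case groupings:", groups)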

  3. On Sums of Numerical Series and Fourier Series

    ERIC Educational Resources Information Center

    Pavao, H. Germano; de Oliveira, E. Capelas

    2008-01-01

    We discuss a class of trigonometric functions whose corresponding Fourier series, on a conveniently chosen interval, can be used to calculate several numerical series. Particular cases are presented and two recent results involving numerical series are recovered. (Contains 1 note.)

  4. Chemotherapy-induced pulmonary hypertension: role of alkylating agents.

    PubMed

    Ranchoux, Benoît; Günther, Sven; Quarck, Rozenn; Chaumais, Marie-Camille; Dorfmüller, Peter; Antigny, Fabrice; Dumas, Sébastien J; Raymond, Nicolas; Lau, Edmund; Savale, Laurent; Jaïs, Xavier; Sitbon, Olivier; Simonneau, Gérald; Stenmark, Kurt; Cohen-Kaminsky, Sylvia; Humbert, Marc; Montani, David; Perros, Frédéric

    2015-02-01

    Pulmonary veno-occlusive disease (PVOD) is an uncommon form of pulmonary hypertension (PH) characterized by progressive obstruction of small pulmonary veins and a dismal prognosis. Limited case series have reported a possible association between different chemotherapeutic agents and PVOD. We evaluated the relationship between chemotherapeutic agents and PVOD. Cases of chemotherapy-induced PVOD from the French PH network and literature were reviewed. Consequences of chemotherapy exposure on the pulmonary vasculature and hemodynamics were investigated in three different animal models (mouse, rat, and rabbit). Thirty-seven cases of chemotherapy-associated PVOD were identified in the French PH network and systematic literature analysis. Exposure to alkylating agents was observed in 83.8% of cases, mostly represented by cyclophosphamide (43.2%). In three different animal models, cyclophosphamide was able to induce PH on the basis of hemodynamic, morphological, and biological parameters. In these models, histopathological assessment confirmed significant pulmonary venous involvement highly suggestive of PVOD. Together, clinical data and animal models demonstrated a plausible cause-effect relationship between alkylating agents and PVOD. Clinicians should be aware of this uncommon, but severe, pulmonary vascular complication of alkylating agents. Copyright © 2015 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.

  5. Correlation analysis of air pollutant index levels and dengue cases across five different zones in Selangor, Malaysia.

    PubMed

    Thiruchelvam, Loshini; Dass, Sarat C; Zaki, Rafdzah; Yahya, Abqariyah; Asirvadam, Vijanth S

    2018-05-07

    This study investigated the potential relationship between dengue cases and air quality - as measured by the Air Pollution Index (API) - for five zones in the state of Selangor, Malaysia. Dengue case patterns can be learned using prediction models based on feedback (lagged terms). However, the question of whether air quality affects dengue cases has not been thoroughly investigated with such feedback models. This work developed dengue prediction models using the autoregressive integrated moving average (ARIMA) and ARIMA with an exogenous variable (ARIMAX) time series methodologies, with API as the exogenous variable. The Box-Jenkins approach based on maximum likelihood was used for analysis as it gives effective model estimates and predictions. Three stages of model comparison were carried out for each zone: first with ARIMA models without API, then ARIMAX models with API data from the API station for that zone and finally, ARIMAX models with API data from the zone and spatially neighbouring zones. The Bayesian Information Criterion (BIC) gives goodness-of-fit versus parsimony comparisons between all elicited models. Our study found that ARIMA models, with the lowest BIC value, outperformed the rest in all five zones. The BIC values for the zone of Kuala Selangor were -800.66, -796.22, and -790.5229, respectively, for ARIMA only, ARIMAX with a single API component and ARIMAX with API components from its zone and spatially neighbouring zones. Therefore, we concluded that API levels, either temporally for each zone or spatio-temporally based on neighbouring zones, do not have a significant effect on dengue cases.
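
    A small sketch of the ARIMA-versus-ARIMAX comparison by BIC, assuming statsmodels and synthetic weekly dengue and API series in which the cases are driven purely by their own feedback, so the exogenous term should not help.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(13)
      weeks = 200
      api = 60 + 20 * np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 5, weeks)
      dengue = np.zeros(weeks)
      for k in range(1, weeks):                       # cases driven only by their own feedback
          dengue[k] = 5 + 0.8 * dengue[k - 1] + rng.normal(0, 3)

      arima = sm.tsa.SARIMAX(dengue, order=(2, 0, 1), trend="c").fit(disp=False)
      arimax = sm.tsa.SARIMAX(dengue, exog=api, order=(2, 0, 1), trend="c").fit(disp=False)
      print("ARIMA  BIC:", round(arima.bic, 1))        # the lower BIC is preferred
      print("ARIMAX BIC:", round(arimax.bic, 1))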

  6. An Efficient Interval Type-2 Fuzzy CMAC for Chaos Time-Series Prediction and Synchronization.

    PubMed

    Lee, Ching-Hung; Chang, Feng-Yu; Lin, Chih-Min

    2014-03-01

    This paper aims to propose a more efficient control algorithm for chaos time-series prediction and synchronization. A novel type-2 fuzzy cerebellar model articulation controller (T2FCMAC) is proposed. In some special cases, this T2FCMAC can be reduced to an interval type-2 fuzzy neural network, a fuzzy neural network, or a fuzzy cerebellar model articulation controller (CMAC). Thus, the T2FCMAC is a more generalized network with better learning ability, and it is used here for chaos time-series prediction and synchronization. Moreover, this T2FCMAC realizes an un-normalized interval type-2 fuzzy logic system based on the structure of the CMAC. It can provide better capabilities for handling uncertainty and more design degrees of freedom than a traditional type-1 fuzzy CMAC. Unlike most interval type-2 fuzzy systems, the type-reduction of T2FCMAC is bypassed due to the property of the un-normalized interval type-2 fuzzy logic system. This gives T2FCMAC lower computational complexity and makes it more practical. For chaos time-series prediction and synchronization applications, the training architectures with corresponding convergence analyses and optimal learning rates based on the Lyapunov stability approach are introduced. Finally, two illustrative examples are presented to demonstrate the performance of the proposed T2FCMAC.

  7. Hydraulic modeling of riverbank filtration systems with curved boundaries using analytic elements and series solutions

    NASA Astrophysics Data System (ADS)

    Bakker, Mark

    2010-08-01

    A new analytic solution approach is presented for the modeling of steady flow to pumping wells near rivers in strip aquifers; all boundaries of the river and strip aquifer may be curved. The river penetrates the aquifer only partially and has a leaky stream bed. The water level in the river may vary spatially. Flow in the aquifer below the river is semi-confined while flow in the aquifer adjacent to the river is confined or unconfined and may be subject to areal recharge. Analytic solutions are obtained through superposition of analytic elements and Fourier series. Boundary conditions are specified at collocation points along the boundaries. The number of collocation points is larger than the number of coefficients in the Fourier series and a solution is obtained in the least squares sense. The solution is analytic while boundary conditions are met approximately. Very accurate solutions are obtained when enough terms are used in the series. Several examples are presented for domains with straight and curved boundaries, including a well pumping near a meandering river with a varying water level. The area of the river bottom where water infiltrates into the aquifer is delineated and the fraction of river water in the well water is computed for several cases.
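
    The collocation idea described above can be illustrated with a small least-squares fit: a truncated Fourier series is matched to a prescribed boundary value at more collocation points than there are coefficients. The boundary function, number of terms and number of points below are invented for illustration and carry none of the analytic-element machinery of the paper.

    import numpy as np

    n_terms = 8                    # Fourier harmonics (plus a constant term)
    n_points = 50                  # collocation points, more than coefficients
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)

    # Prescribed (made-up) boundary value at the collocation points
    h_boundary = 1.0 + 0.5 * np.cos(theta) - 0.2 * np.sin(3 * theta)

    # Design matrix with columns [1, cos(k*theta), sin(k*theta)], k = 1..n_terms
    cols = [np.ones_like(theta)]
    for k in range(1, n_terms + 1):
        cols.extend([np.cos(k * theta), np.sin(k * theta)])
    A = np.column_stack(cols)

    # Overdetermined system solved in the least-squares sense
    coeffs, _, _, _ = np.linalg.lstsq(A, h_boundary, rcond=None)
    print("maximum boundary misfit:", np.max(np.abs(A @ coeffs - h_boundary)))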

  8. Development of the NEDO implantable ventricular assist device with Gyro centrifugal pump.

    PubMed

    Yoshikawa, M; Nonaka, K; Linneweber, J; Kawahito, S; Ohtsuka, G; Nakata, K; Takano, T; Schulte-Eistrup, S; Glueck, J; Schima, H; Wolner, E; Nosé, Y

    2000-06-01

    The Gyro centrifugal pump, PI (permanently implantable) series, is being developed as a totally implantable artificial heart. Our final goal is to establish a "functional TAH," a totally implantable biventricular assist system (BiVAS) with centrifugal pumps. A plastic prototype pump, the Gyro PI 601, was evaluated through in vitro and in vivo studies as a single ventricular assist device (VAD). Based upon these results, the pump head material was converted to a titanium alloy, and the actuator was modified. The titanium Gyro pumps, the PI 700 series, were also subjected to in vitro and in vivo studies. The Gyro PI 601 and PI 700 series have the same inner dimensions and characteristics, such as the eccentric inlet port, double pivot bearing system, secondary vane, and magnet coupling system; however, the material of the PI 700 differs from that of the PI 601. The Gyro PI series is driven by the Vienna DC brushless motor actuator. The inlet cannula of the right ventricular assist system (RVAS), specially made for this system, consists of 2 parts: a hat-shaped silicone tip biolized with gelatin and an angled wire-reinforced tube made of polyvinyl chloride. The pump-actuator package was implanted into 8 calves in the preperitoneal space, bypassing from the left ventricular apex to the descending aorta for the left ventricular assist system (LVAS) and from the right ventricle to the main pulmonary artery for the RVAS. According to the PI 601 feasibility protocol, 2 LVAS cases were terminated after 2 weeks, and 1 LVAS case and 1 RVAS case were terminated after 1 month. The PI 700 series was implanted into 4 cases: 3 LVAS cases survived long term, 2 of them over 200 days (72-283 days), and 1 RVAS case survived for 1 month and was terminated according to the protocol for a short-term antithrombogenic screening and system feasibility study. Regarding power consumption, the plastic pump cases demonstrated from 6.2 to 12.1 W as LVAS and 7.3 W as RVAS, while the titanium pump cases showed from 10.4 to 14.2 W as LVAS and 15.8 W as RVAS. All cases exhibited low hemolysis. Renal and liver function were maintained within normal ranges in all cases throughout the experimental periods. In the 2 RVAS cases, pulmonary function was maintained normally. No calves demonstrated thromboembolic signs or symptoms throughout the experiments except Case 1 with the plastic pump. However, in the plastic pump cases, bilateral renal infarction was suspected in 2 cases during necropsy, whereas no abnormal findings were revealed in the titanium pump cases. There were also no blood clots inside the PI 700 series pumps. As for the PI 601, the explanted pumps demonstrated slight thrombus formations at the top and bottom pivots except in 1 case. The Gyro PI series, especially the PI 700 series, demonstrated superior performance, biocompatibility, antithrombogenicity, and low hemolysis. The durability of the actuator was also demonstrated. Based on these results, this titanium centrifugal pump is suitable as an implantable LVAS and RVAS. It is likely that the Gyro PI series is a feasible component of the BiVAS functional TAH.

  9. 75 FR 91 - Airworthiness Directives; Bombardier, Inc. (Type Certificate Previously Held by Canadair) Model...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-04

    ... condition as: Two cases of a crack on a ``dry'' ADG (Air Driven Generator) (Hamilton Sundstrand part number in the 761339 series) in the aft area of the strut and generator housing assembly, have been reported... typically 45 days, which is consistent with the comment period for domestic transport ADs. We will post all...

  10. 78 FR 76249 - Special Conditions: Airbus, Model A350-900 Series Airplane; Flight Envelope Protection: Normal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-17

    ..., and accommodates side-by-side placement of LD-3 containers in the cargo compartment. The basic Airbus... availability of this excess maneuver capacity in case of extreme emergency such as upset recoveries or... factor must not be less than: (a) 2.5g for the EFCS normal state with the high lift devices retracted up...

  11. 75 FR 50856 - Airworthiness Directives; Airbus Model A380-800 Series Airplanes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-18

    ... been found on the Droop Nose (DN) 1 master sidestay bracket on the inboard leading edge of an Airbus A380 flight test aeroplane. In case of failure of the master bracket, the sub-master bracket would be... been found on the Droop Nose (DN) 1 master sidestay bracket on the inboard leading edge of an Airbus...

  12. 78 FR 26526 - Magnuson-Stevens Act Provisions; Fisheries off West Coast States; Pacific Coast Groundfish...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-07

    ... base-case model using nine years of an acoustic survey biomass index as well as catches to estimate the scale of the current hake stock. The 2012 acoustic-trawl survey result was a relative biomass 1,380,000... series. The age-composition data from the aggregated fisheries (1975-2012) and the acoustic survey...

  13. Bimodal Bilingual Language Development of Hearing Children of Deaf Parents

    ERIC Educational Resources Information Center

    Hofmann, Kristin; Chilla, Solveig

    2015-01-01

    Adopting a bimodal bilingual language acquisition model, this qualitative case study is the first in Germany to investigate the spoken and sign language development of hearing children of deaf adults (codas). The spoken language competence of six codas within the age range of 3;10 to 6;4 is assessed by a series of standardised tests (SETK 3-5,…

  14. Jobs to Manufacturing Careers: Work-Based Courses. Work-Based Learning in Action

    ERIC Educational Resources Information Center

    Kobes, Deborah

    2016-01-01

    This case study, one of a series of publications exploring effective and inclusive models of work-based learning, finds that work-based courses bring college to the production line by using the job as a learning lab. Work-based courses are an innovative way to give incumbent workers access to community college credits and degrees. They are…

  15. Scaly scalp associated with crusted scabies: case series.

    PubMed

    Anbar, T S; El-Domyati, M B; Mansour, H A; Ahmad, H M

    2007-07-13

    The diagnosis of crusted scabies is becoming more relevant due to the increasing number of immunocompromised patients. To date, more than 200 cases have been reported in the literature. However, crusted scabies appears to be under-diagnosed because of its unusual presentations. In this case series we present the history, clinical manifestations, KOH smear results, and histopathological findings of four patients with crusted scabies. Scaly scalp was a prominent feature of the disease in all cases. Examination and treatment of the scalp of patients with suspected crusted scabies should not be neglected. A KOH smear from the scalp offers a simple and reliable technique for diagnosis.

  16. Accelerating Large Data Analysis By Exploiting Regularities

    NASA Technical Reports Server (NTRS)

    Moran, Patrick J.; Ellsworth, David

    2003-01-01

    We present techniques for discovering and exploiting regularity in large curvilinear data sets. The data can be based on a single mesh or a mesh composed of multiple submeshes (also known as zones). Multi-zone data are typical of Computational Fluid Dynamics (CFD) simulations. Regularities include axis-aligned rectilinear and cylindrical meshes as well as cases where one zone is equivalent to a rigid-body transformation of another. Our algorithms can also discover rigid-body motion of meshes in time-series data. Next, we describe a data model where we can utilize the results from the discovery process in order to accelerate large data visualizations. Where possible, we replace general curvilinear zones with rectilinear or cylindrical zones. In rigid-body motion cases we replace a time series of meshes with a transformed mesh object where a reference mesh is dynamically transformed based on a given time value in order to satisfy geometry requests on demand. The data model enables us to make these substitutions and dynamic transformations transparently with respect to the visualization algorithms. We present results with large data sets where we combine our mesh replacement and transformation techniques with out-of-core paging in order to achieve significant speed-ups in analysis.

  17. Development and Testing of Data Mining Algorithms for Earth Observation

    NASA Technical Reports Server (NTRS)

    Glymour, Clark

    2005-01-01

    The new algorithms developed under this project included a principled procedure for classification of objects, events, or circumstances according to a target variable when a very large number of potential predictor variables is available but the number of cases that can be used for training a classifier is relatively small. These "high-dimensional" problems require finding a minimal set of variables, called the Markov Blanket, sufficient for predicting the value of the target variable. An algorithm, the Markov Blanket Fan Search, was developed, implemented, and tested on both simulated and real data in conjunction with a graphical model classifier, which was also implemented. Another algorithm developed and implemented in TETRAD IV for time series elaborated on work by C. Granger and N. Swanson, which in turn exploited some of our earlier work. The algorithms in question learn a linear time series model from data. Given such a time series, the simultaneous residual covariances, after factoring out time dependencies, may provide information about causal processes that occur more rapidly than the time series representation allows, so-called simultaneous or contemporaneous causal processes. Working with A. Monetta, a graduate student from Italy, we produced the correct statistics for estimating the contemporaneous causal structure from time series data using the TETRAD IV suite of algorithms. Two economists, David Bessler and Kevin Hoover, have independently published applications using TETRAD-style algorithms for the same purpose. These implementations and algorithmic developments were separately used in two kinds of studies of climate data: short time series of geographically proximate climate variables predicting agricultural effects in California, and longer-duration climate measurements of temperature teleconnections.

  18. Probability of misclassifying biological elements in surface waters.

    PubMed

    Loga, Małgorzata; Wierzchołowska-Dziedzic, Anna

    2017-11-24

    Measurement uncertainties are inherent to the assessment of biological indices of water bodies. The effect of these uncertainties on the probability of misclassification of ecological status is the subject of this paper. Four Monte-Carlo (M-C) models were applied to simulate the occurrence of random errors in the measurements of metrics corresponding to four biological elements of surface waters: macrophytes, phytoplankton, phytobenthos, and benthic macroinvertebrates. Long series of error-prone measurement values of these metrics, generated by M-C models, were used to identify cases in which values of any of the four biological indices lay outside of the "true" water body class, i.e., outside the class assigned from the actual physical measurements. The fraction of such cases in the M-C generated series was used to estimate the probability of misclassification. The method is particularly useful for estimating the probability of misclassification of the ecological status of surface water bodies in the case of short sequences of measurements of biological indices. The results of the Monte-Carlo simulations show a relatively high sensitivity of this probability to measurement errors of the river macrophyte index (MIR) and high robustness to measurement errors of the benthic macroinvertebrate index (MMI). The proposed method of using Monte-Carlo models to estimate the probability of misclassification has significant potential for assessing the uncertainty of water body status reported to the EC by the EU member countries according to the WFD. The method can also be readily applied in risk assessment of water management decisions before adopting status-dependent corrective actions.
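
    A minimal sketch of the Monte-Carlo idea, assuming a single index with made-up class boundaries and a Gaussian measurement error: simulated error-prone values are classified, and the fraction falling outside the class of the error-free measurement estimates the probability of misclassification.

    import numpy as np

    rng = np.random.default_rng(42)
    class_bounds = [0.2, 0.4, 0.6, 0.8]   # boundaries between five status classes
    index_measured = 0.63                 # error-free ("true") index value
    sigma_error = 0.05                    # assumed measurement error (std dev)

    true_class = np.searchsorted(class_bounds, index_measured)

    # Long series of error-prone values generated by the M-C model
    simulated = np.clip(index_measured
                        + sigma_error * rng.standard_normal(100_000), 0.0, 1.0)
    sim_classes = np.searchsorted(class_bounds, simulated)

    # Fraction of simulated values classified outside the "true" class
    p_misclass = np.mean(sim_classes != true_class)
    print(f"estimated probability of misclassification: {p_misclass:.3f}")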

  19. COLLABORATE©: a universal competency-based paradigm for professional case management, part i: introduction, historical validation, and competency presentation.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    2013-01-01

    The purpose of this first of a three-article series is to provide context and justification for a new paradigm of case management built upon a value-driven foundation that is applicable to all health care sectors where case management is practiced. In moving forward, the one fact that rings true is that there will be constant change in our industry. As the health care terrain shifts and new influences continually surface, there will be consequences for case management practice. These impacts require nimble clinical professionals in possession of recognized and firmly established competencies. They must be agile enough to frame (and reframe) their professional practice to facilitate the best possible outcomes for their patients. Case managers can choose to be Gumby or Pokey. This is exactly why the time has come for a competency-based case management model, one sufficiently fluid to fit into any setting of care. The practice of case management transcends the vast array of representative professional disciplines and educational levels. A majority of current models are driven by business priorities rather than by the competencies critical to successful practice and quality patient outcomes. This results in a fragmented professional case management identity. While there is inherent value in what each discipline brings to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served). This model fosters case management's expanding career advancement opportunities, including a reflective clinical ladder.

  20. Achieving Good Perioperative Outcomes After Pancreaticoduodenectomy in a Low-Volume Setting: A 25-Year Experience

    PubMed Central

    Chedid, Aljamir D.; Chedid, Marcio F.; Winkelmann, Leonardo V.; Filho, Tomaz J. M. Grezzana; Kruel, Cleber D. P.

    2015-01-01

    Perioperative mortality following pancreaticoduodenectomy has improved over time and is lower than 5% in selected high-volume centers. Based on several large literature series on pancreaticoduodenectomy from high-volume centers, some argue that high annual volumes are necessary for good outcomes after pancreaticoduodenectomy. We report here the outcomes of a low annual volume pancreaticoduodenectomy series after incorporating technical expertise from a high-volume center. We included all patients who underwent pancreaticoduodenectomy performed by a single surgeon (A.D.C.) as treatment for periampullary malignancies from 1981 to 2005. Outcomes of this series were compared to those of 3 high-volume literature series. Additionally, outcomes for the first 10 cases in the present series were compared to those of the 37 remaining cases. A total of 47 pancreaticoduodenectomies were performed over a 25-year period. Overall in-hospital mortality was 2 cases (4.3%), and morbidity occurred in 23 patients (48.9%). Both mortality and morbidity were similar to those of each of the three high-volume center comparison series. Comparison of the first 10 cases with the remaining 37 cases revealed that the latter had lower mortality (20% versus 0%; P = 0.042), fewer tumor-positive margins (50% versus 13.5%; P = 0.024), less frequent use of intraoperative blood transfusion (90% versus 32.4%; P = 0.003), and a tendency toward shorter in-hospital stay (20 versus 15.8 days; P = 0.053). Accumulation of surgical experience and incorporation of expertise from high-volume centers may enable satisfactory outcomes after pancreaticoduodenectomy in low-volume settings whenever referral to a high-volume center is limited. PMID:25875555

  1. Madden–Julian Oscillation prediction skill of a new-generation global model demonstrated using a supercomputer

    PubMed Central

    Miyakawa, Tomoki; Satoh, Masaki; Miura, Hiroaki; Tomita, Hirofumi; Yashiro, Hisashi; Noda, Akira T.; Yamada, Yohei; Kodama, Chihiro; Kimoto, Masahide; Yoneyama, Kunio

    2014-01-01

    Global cloud/cloud system-resolving models are perceived to perform well in the prediction of the Madden–Julian Oscillation (MJO), a huge eastward-propagating atmospheric pulse that dominates intraseasonal variation of the tropics and affects the entire globe. However, owing to model complexity, detailed analysis is limited by computational power. Here we carry out a simulation series using a recently developed supercomputer, which enables the statistical evaluation of the MJO prediction skill of a costly new-generation model in a manner similar to operational forecast models. We estimate the current MJO predictability of the model as 27 days by conducting simulations including all winter MJO cases identified during 2003–2012. The simulated precipitation patterns associated with different MJO phases compare well with observations. An MJO case captured in a recent intensive observation is also well reproduced. Our results reveal that the global cloud-resolving approach is effective in understanding the MJO and in providing month-long tropical forecasts. PMID:24801254

  2. Madden-Julian Oscillation prediction skill of a new-generation global model demonstrated using a supercomputer.

    PubMed

    Miyakawa, Tomoki; Satoh, Masaki; Miura, Hiroaki; Tomita, Hirofumi; Yashiro, Hisashi; Noda, Akira T; Yamada, Yohei; Kodama, Chihiro; Kimoto, Masahide; Yoneyama, Kunio

    2014-05-06

    Global cloud/cloud system-resolving models are perceived to perform well in the prediction of the Madden-Julian Oscillation (MJO), a huge eastward-propagating atmospheric pulse that dominates intraseasonal variation of the tropics and affects the entire globe. However, owing to model complexity, detailed analysis is limited by computational power. Here we carry out a simulation series using a recently developed supercomputer, which enables the statistical evaluation of the MJO prediction skill of a costly new-generation model in a manner similar to operational forecast models. We estimate the current MJO predictability of the model as 27 days by conducting simulations including all winter MJO cases identified during 2003-2012. The simulated precipitation patterns associated with different MJO phases compare well with observations. An MJO case captured in a recent intensive observation is also well reproduced. Our results reveal that the global cloud-resolving approach is effective in understanding the MJO and in providing month-long tropical forecasts.

  3. Obsessive Compulsive Disorder Treatment in Patients with Down Syndrome: A Case Series

    ERIC Educational Resources Information Center

    Sutor, Bruce; Hansen, Mark R.; Black, John L.

    2006-01-01

    In this case series we report four cases of patients with Down syndrome with symptoms consistent with obsessive compulsive disorder. Each patient experienced substantial reduction in compulsive behaviors with pharmacotherapy of an SSRI alone or with the addition of risperidone to SSRI therapy. None of the patients experienced significant side…

  4. Intensive (Daily) Behavior Therapy for School Refusal: A Multiple Baseline Case Series

    ERIC Educational Resources Information Center

    Tolin, David F.; Whiting, Sara; Maltby, Nicholas; Diefenbach, Gretchen J.; Lothstein, Mary Anne; Hardcastle, Surrey; Catalano, Amy; Gray, Krista

    2009-01-01

    The following multiple baseline case series examines school refusal behavior in 4 male adolescents. School refusal symptom presentation was ascertained utilizing a functional analysis from the School Refusal Assessment Scale (Kearney, 2002). For the majority of cases, treatment was conducted within a 15-session intensive format over a 3-week…

  5. Knocked by the shuttlecock: twelve sight-threatening blunt-eye injuries in Australian badminton players.

    PubMed

    Jao, Kathy K; Atik, Alp; Jamieson, Michael P; Sheales, Mariana P; Lee, Matthew H; Porter, Ashley; Roufas, Athena; Goldberg, Ivan; Zamir, Ehud; White, Andrew; Skalicky, Simon E

    2017-07-01

    Non-penetrating ocular injuries from badminton shuttlecocks can result in severe damage and life-long complications. This case series highlights the morbidity of such injuries, particularly in regard to post-traumatic glaucoma. This is a retrospective case series of 12 patients with shuttlecock-related blunt eye injuries sustained during badminton play without eye protection. By approaching colleagues through conference presentations and networking, the authors have attempted to gather all known cases of shuttlecock ocular injury managed in tertiary ocular emergency departments or private ophthalmological clinics in Victoria and New South Wales, Australia in 2015. This is the first multicentre case series to describe badminton-related ocular injuries in Australia. Our case series demonstrates, in particular, long-term glaucoma-related morbidity for patients over a large age range (16 to 77 years), with one patient requiring ongoing management 26 years following their initial injury. The cases reported further add to the literature promoting awareness of badminton-related ocular injury. We encourage player education and advocacy on badminton-related eye injuries and appropriate use of eye protection to reduce associated morbidity. © 2016 Optometry Australia.

  6. The Combined Effect of Periodic Signals and Noise on the Dilution of Precision of GNSS Station Velocity Uncertainties

    NASA Astrophysics Data System (ADS)

    Klos, Anna; Olivares, German; Teferle, Felix Norman; Bogusz, Janusz

    2016-04-01

    Station velocity uncertainties determined from a series of Global Navigation Satellite System (GNSS) position estimates depend on both the deterministic and stochastic models applied to the time series. While the deterministic model generally includes parameters for a linear trend and several periodic terms, the stochastic model is a representation of the noise character of the time series in the form of a power-law process. For both models the optimal choice may vary from one time series to another, and the two models also depend, to some degree, on each other. In the past various power-law processes have been shown to fit the time series, and the sources of the apparent temporally-correlated noise were attributed to, for example, mismodelling of satellite orbits, antenna phase centre variations, the troposphere, Earth Orientation Parameters, mass loading effects and monument instabilities. Blewitt and Lavallée (2002) demonstrated how improperly modelled seasonal signals affected the estimates of station velocity uncertainties. However, in their study they assumed that the time series followed a white noise process with no consideration of additional temporally-correlated noise. Bos et al. (2010) empirically showed for a small number of stations that the noise character was much more important for the reliable estimation of station velocity uncertainties than the seasonal signals. In this presentation we pick up from Blewitt and Lavallée (2002) and Bos et al. (2010) and derive formulas for the computation of the General Dilution of Precision (GDP) in the presence of periodic signals and temporally-correlated noise in the time series. We show, based on simulated and real time series from globally distributed IGS (International GNSS Service) stations processed by the Jet Propulsion Laboratory (JPL), that periodic signals dominate the effect on the velocity uncertainties at short time scales, while beyond four years the type of noise becomes much more important. In other words, for time series long enough, the assumed periodic signals do not affect the velocity uncertainties as much as the assumed noise model. We calculated the GDP as the ratio between two velocity uncertainties: without and with inclusion of seasonal terms with periods equal to one year and its overtones up to the third. To all these cases power-law processes of white, flicker and random-walk noise were added separately. A few oscillations in the GDP can be noticed at integer numbers of years, which arise from the added periodic terms. Their amplitudes in the GDP increase along with the increasing spectral index. Strong oscillation peaks in the GDP are found at short time scales, especially for random-walk processes, meaning that poorly monumented stations are affected the most. Local minima and maxima in the GDP are also enlarged as the noise approaches random walk. We noticed that the semi-annual signal increased the local GDP minimum for white noise. This suggests that adding power-law noise to a deterministic model with an annual term, or adding a semi-annual term under white noise, increases the velocity uncertainty even at points where the determined velocity is not biased.
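
    The ratio defining the dilution of precision can be illustrated with ordinary least squares on a simulated daily position series; only white noise is included here, so the power-law noise cases discussed in the abstract (which require a full noise covariance model) are not reproduced.

    import numpy as np

    rng = np.random.default_rng(1)
    t = np.arange(0.0, 6.0, 1.0 / 365.25)                # daily epochs, in years
    signal = 2.0 * t + 3.0 * np.sin(2 * np.pi * t) + rng.standard_normal(t.size)

    def velocity_sigma(design, y):
        """Formal std. dev. of the velocity parameter from a least-squares fit."""
        coeffs, _, _, _ = np.linalg.lstsq(design, y, rcond=None)
        resid = y - design @ coeffs
        sigma2 = resid @ resid / (y.size - design.shape[1])
        cov = sigma2 * np.linalg.inv(design.T @ design)
        return np.sqrt(cov[1, 1])                        # column 1 holds the trend

    # Model 1: intercept + trend only
    A1 = np.column_stack([np.ones_like(t), t])
    # Model 2: intercept + trend + annual and semi-annual terms
    A2 = np.column_stack([np.ones_like(t), t,
                          np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                          np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])

    gdp = velocity_sigma(A1, signal) / velocity_sigma(A2, signal)
    print(f"dilution of precision (no seasonal / with seasonal): {gdp:.2f}")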

  7. LANDSAT-D MSS/TM tuned orbital jitter analysis model LDS900

    NASA Technical Reports Server (NTRS)

    Pollak, T. E.

    1981-01-01

    The final LANDSAT-D orbital dynamic math model (LSD900), comprising all test-validated substructures, was used to evaluate the jitter response of the MSS/TM experiments. A dynamic forced-response analysis was performed at both the MSS and TM locations for all structural modes considered (through 200 Hz). The analysis determined the roll angular response of the MSS/TM experiments to the excitation generated by component operation. Cross-axis and cross-experiment responses were also calculated. The excitations were analytically represented by seven- and nine-term Fourier series approximations, for the MSS and TM experiments respectively, which enabled linear harmonic solution techniques to be applied to the response calculations. Single worst-case jitter was estimated by variations of the eigenvalue spectrum of model LSD900. The probability of any worst-case mode occurrence was investigated.

  8. Lattice model for water-solute mixtures.

    PubMed

    Furlan, A P; Almarza, N G; Barbosa, M C

    2016-10-14

    A lattice model for the study of mixtures of associating liquids is proposed. Solvent and solute are modeled by adapting the associating lattice gas (ALG) model. The nature of the solute/solvent interaction is controlled by tuning the energy interactions between the patches of the ALG model. We studied three sets of parameters, resulting in hydrophilic, inert, and hydrophobic interactions. Extensive Monte Carlo simulations were carried out, and the behavior of the pure components and the excess properties of the mixtures were studied. The pure components, water (solvent) and solute, have quite similar phase diagrams, presenting gas, low-density liquid, and high-density liquid phases. In the case of the solute, the regions of coexistence are substantially reduced when compared with both the water and the standard ALG models. A numerical procedure was developed in order to obtain series of results at constant pressure from simulations of the lattice gas model in the grand canonical ensemble. The excess properties of the mixtures, volume and enthalpy as a function of the solute fraction, were studied for different interaction parameters of the model. Our model is able to reproduce qualitatively well the excess volume and enthalpy for different aqueous solutions. For the hydrophilic case, we show that the model is able to reproduce the excess volume and enthalpy of mixtures of small alcohols and amines. The inert case reproduces the behavior of large alcohols such as propanol, butanol, and pentanol. For the last case (hydrophobic), the excess properties reproduce the behavior of ionic liquids in aqueous solution.

  9. Time series analysis of temporal trends in the pertussis incidence in Mainland China from 2005 to 2016.

    PubMed

    Zeng, Qianglin; Li, Dandan; Huang, Gui; Xia, Jin; Wang, Xiaoming; Zhang, Yamei; Tang, Wanping; Zhou, Hui

    2016-08-31

    Short-term forecasting of pertussis incidence is helpful for advance warning and for planning resource needs for future epidemics. Utilizing the Auto-Regressive Integrated Moving Average (ARIMA) model and the Exponential Smoothing (ETS) model as alternative models in R software, this paper analyzed data from the Chinese Center for Disease Control and Prevention (China CDC) between January 2005 and June 2016. The ARIMA (0,1,0)(1,1,1)12 model (AICc = 1342.2, BIC = 1350.3) was selected as the best performing ARIMA model and the ETS (M,N,M) model (AICc = 1678.6, BIC = 1715.4) was selected as the best performing ETS model, and the ETS (M,N,M) model, with the minimum RMSE, was finally selected for in-sample simulation and out-of-sample forecasting. Descriptive statistics showed that the number of pertussis cases reported by China CDC increased by 66.20% from 2005 (4058 cases) to 2015 (6744 cases). According to the Hodrick-Prescott filter, there was an apparent cyclicity and seasonality in the pertussis reports. In out-of-sample forecasting, the model forecasted a relatively high number of cases in 2016, which indicates an increasing risk of ongoing pertussis resurgence in the near future. In this regard, the ETS model would be a useful tool in simulating and forecasting the incidence of pertussis, and in helping decision makers take efficient decisions based on advance warning of disease incidence.
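
    A hedged sketch of the model comparison on synthetic monthly counts: statsmodels' ETSModel with multiplicative error and seasonality stands in for the ETS(M,N,M) model, and a seasonal ARIMA(0,1,0)(1,1,1)12 is fitted for comparison; the model orders follow the abstract, everything else is illustrative.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.exponential_smoothing.ets import ETSModel

    rng = np.random.default_rng(7)
    idx = pd.date_range("2005-01-01", periods=138, freq="MS")
    month = idx.month.to_numpy()
    cases = pd.Series(300 * (1 + 0.4 * np.sin(2 * np.pi * month / 12))
                      * (1 + 0.002 * np.arange(idx.size))
                      + rng.normal(0, 20, idx.size), index=idx)

    arima_fit = ARIMA(cases, order=(0, 1, 0), seasonal_order=(1, 1, 1, 12)).fit()
    ets_fit = ETSModel(cases, error="mul", trend=None, seasonal="mul",
                       seasonal_periods=12).fit()

    for name, res in [("seasonal ARIMA", arima_fit), ("ETS(M,N,M)", ets_fit)]:
        rmse = np.sqrt(np.mean((cases - res.fittedvalues) ** 2))
        print(f"{name:15s} AIC={res.aic:9.1f}  BIC={res.bic:9.1f}  RMSE={rmse:7.1f}")

    print(ets_fit.forecast(steps=6))      # out-of-sample forecast, next six months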

  10. Autotransplantation of immature third molars using a computer-aided rapid prototyping model: a report of 4 cases.

    PubMed

    Jang, Ji-Hyun; Lee, Seung-Jong; Kim, Euiseong

    2013-11-01

    Autotransplantation of immature teeth can be an option for premature tooth loss in young patients as an alternative to immediately replacing teeth with fixed or implant-supported prostheses. The present case series reports 4 successful autotransplantation cases using computer-aided rapid prototyping (CARP) models with immature third molars. The compromised upper and lower molars (n = 4) of patients aged 15-21 years were replaced by transplanted third molars using CARP models. Postoperatively, pulp vitality and root development were examined clinically and radiographically. The follow-up period was 2-7.5 years after surgery. The long-term follow-up showed that all of the transplants were asymptomatic and functional. Radiographic examination indicated that the apices developed continuously and that root length and thickness increased. The final follow-up examination revealed that all of the transplants maintained vitality, and the apices were fully developed with normal periodontal ligaments and trabecular bony patterns. Based on long-term follow-up observations, our 4 cases of autotransplantation of immature teeth using CARP models resulted in favorable prognoses. The CARP model assisted in minimizing the extraoral time and the possible injury to Hertwig's epithelial root sheath of the transplanted tooth. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  11. A time series analysis of the relationship of ambient temperature and common bacterial enteric infections in two Canadian provinces

    NASA Astrophysics Data System (ADS)

    Fleury, Manon; Charron, Dominique F.; Holt, John D.; Allen, O. Brian; Maarouf, Abdel R.

    2006-07-01

    The incidence of enteric infections in the Canadian population varies seasonally and may be expected to change in response to global climate change. To better understand the potential impact of warmer temperatures on enteric infections in Canada, we investigated the relationship between ambient temperature and weekly reports of confirmed cases of three pathogens, Salmonella, pathogenic Escherichia coli, and Campylobacter, between 1992 and 2000 in two Canadian provinces. We used generalized linear models (GLMs) and generalized additive models (GAMs) to estimate the effect of seasonal adjustments on the estimated models. We found a strong non-linear association between ambient temperature and the occurrence of all three enteric pathogens in Alberta, Canada, and of Campylobacter in Newfoundland-Labrador. Threshold models were used to quantify the relationship between disease and temperature, with thresholds chosen from 0 to -10°C depending on the pathogen modeled. For Alberta, the log relative risk of weekly Salmonella case counts increased by 1.2%, of weekly Campylobacter case counts by 2.2%, and of weekly E. coli case counts by 6.0% for every degree increase in weekly mean temperature. For Newfoundland-Labrador, the log relative risk for Campylobacter increased by 4.5% for every degree increase in weekly mean temperature.
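
    A minimal sketch of the threshold idea, assuming a Poisson GLM with a log link in which temperature enters only above a chosen threshold; exp(beta) - 1 then approximates the proportional increase in weekly cases per degree. The data, threshold and simple sin/cos seasonal adjustment are invented and do not reproduce the paper's GLM/GAM specification.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    weeks = np.arange(9 * 52)
    temp = 5 + 20 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 3, weeks.size)
    threshold = 0.0
    temp_excess = np.clip(temp - threshold, 0.0, None)      # degrees above threshold
    cases = rng.poisson(np.exp(2.0 + 0.02 * temp_excess))   # simulated weekly counts

    X = pd.DataFrame({"const": np.ones(weeks.size),
                      "temp_excess": temp_excess,
                      # crude seasonal adjustment (one sin/cos pair)
                      "sin52": np.sin(2 * np.pi * weeks / 52),
                      "cos52": np.cos(2 * np.pi * weeks / 52)})
    fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
    beta = fit.params["temp_excess"]
    print(f"~{100 * (np.exp(beta) - 1):.1f}% more weekly cases per degree above threshold")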

  12. Performance of time-series methods in forecasting the demand for red blood cell transfusion.

    PubMed

    Pereira, Arturo

    2004-05-01

    Planning future blood collection efforts must be based on adequate forecasts of transfusion demand. In this study, univariate time-series methods were investigated for their performance in forecasting the monthly demand for RBCs at one tertiary-care, university hospital. Three time-series methods were investigated: autoregressive integrated moving average (ARIMA), the Holt-Winters family of exponential smoothing models, and one neural-network-based method. The time series consisted of the monthly demand for RBCs from January 1988 to December 2002 and was divided into two segments: the older one was used to fit or train the models, and the younger to test the accuracy of predictions. Performance was compared across forecasting methods by calculating goodness-of-fit statistics, the percentage of months in which forecast-based supply would have met the RBC demand (coverage rate), and the outdate rate. The RBC transfusion series was best fitted by a seasonal ARIMA(0,1,1)(0,1,1)(12) model. Over 1-year time horizons, forecasts generated by ARIMA or exponential smoothing lay within the +/- 10 percent interval of the real RBC demand in 79 percent of months (62% in the case of neural networks). The coverage rates for the three methods were 89, 91, and 86 percent, respectively. Over 2-year time horizons, exponential smoothing largely outperformed the other methods. Predictions by exponential smoothing lay within the +/- 10 percent interval of real values in 75 percent of the 24 forecasted months, and the coverage rate was 87 percent. Over 1-year time horizons, predictions of RBC demand generated by ARIMA or exponential smoothing are accurate enough to be of help in planning blood collection efforts. For longer time horizons, exponential smoothing outperforms the other forecasting methods.
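
    A rough illustration of the evaluation scheme on synthetic data: a Holt-Winters model is fitted to the older segment of a monthly demand series, and the held-out year is used to compute the share of months with forecasts within +/-10% of real demand and a coverage rate (months in which a supply equal to the forecast would have met demand); the outdate rate is not modelled here.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(11)
    idx = pd.date_range("1988-01-01", periods=180, freq="MS")
    demand = pd.Series(1000 + 2 * np.arange(idx.size)
                       + 80 * np.sin(2 * np.pi * idx.month.to_numpy() / 12)
                       + rng.normal(0, 40, idx.size), index=idx)

    train, test = demand.iloc[:-12], demand.iloc[-12:]      # last year held out
    fit = ExponentialSmoothing(train, trend="add", seasonal="add",
                               seasonal_periods=12).fit()
    forecast = fit.forecast(12)

    within_10pct = np.mean(np.abs(forecast.values - test.values) / test.values <= 0.10)
    coverage = np.mean(forecast.values >= test.values)
    print(f"forecasts within +/-10%: {within_10pct:.0%}, coverage rate: {coverage:.0%}")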

  13. Can We Speculate Running Application With Server Power Consumption Trace?

    PubMed

    Li, Yuanlong; Hu, Han; Wen, Yonggang; Zhang, Jun

    2018-05-01

    In this paper, we propose to detect the applications running on a server by classifying the observed power consumption series, for the purpose of data center energy consumption monitoring and analysis. The time series classification problem has been extensively studied, with various distance measurements developed; recently, deep learning-based sequence models have also proved promising. In this paper, we propose a novel distance measurement and build a time series classification algorithm hybridizing the nearest neighbor and long short-term memory (LSTM) neural network approaches. More specifically, first we propose a new distance measurement termed local time warping (LTW), which utilizes a user-specified index set for local warping and is designed to be noncommutative and free of dynamic programming. Second, we hybridize the 1-nearest neighbor (1NN)-LTW and LSTM together. In particular, we combine the prediction probability vectors of 1NN-LTW and LSTM to determine the label of the test cases. Finally, using power consumption data from a real data center, we show that the proposed LTW can improve the classification accuracy of dynamic time warping (DTW) from about 84% to 90%. Our experimental results show that the proposed LTW is competitive on our data set compared with existing DTW variants and that its noncommutative feature is indeed beneficial. We also test a linear version of LTW and find that it performs similarly to the state-of-the-art DTW-based method while running as fast as linear-runtime lower-bound methods such as LB_Keogh for our problem. With the hybrid algorithm, we achieve an accuracy of up to about 93% for the power series classification task. Our research may inspire more studies on time series distance measurement and on hybrids of deep learning models with traditional models.
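
    The hybrid decision step can be sketched very simply: the class probability vectors from the 1NN-LTW classifier and the LSTM are combined, and the label with the highest combined score wins. The equal-weight average, the label names and the probability values below are placeholders; neither LTW nor the LSTM is implemented here.

    import numpy as np

    labels = ["web-server", "database", "batch-analytics"]   # hypothetical applications
    p_1nn_ltw = np.array([0.10, 0.70, 0.20])   # probability vector from 1NN-LTW
    p_lstm = np.array([0.25, 0.45, 0.30])      # probability vector from the LSTM

    p_combined = 0.5 * (p_1nn_ltw + p_lstm)    # equal-weight combination (assumed rule)
    print("predicted application:", labels[int(np.argmax(p_combined))])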

  14. A study of pilot modeling in multi-controller tasks

    NASA Technical Reports Server (NTRS)

    Whitbeck, R. F.; Knight, J. R.

    1972-01-01

    A modeling approach, which utilizes a matrix of transfer functions to describe the human pilot in multiple-input, multiple-output control situations, is studied. The approach used was to extend a well-established scalar Wiener-Hopf minimization technique to the matrix case and then study, via a series of experiments, the data requirements when only finite record lengths are available. One of these experiments was a two-controller roll tracking experiment designed to force the pilot to use rudder in order to coordinate and reduce the effects of aileron yaw. One model was computed for the case where the signals used to generate the spectral matrix are error and bank angle, while another model was computed for the case where error and yaw angle are the inputs. Several anomalies were observed to be present in the experimental data. These are defined by the descriptive terms roll up, break up, and roll down. Due to these algorithm-induced anomalies, the frequency band over which reliable estimates of power spectra can be achieved is considerably less than predicted by the sampling theorem.

  15. Rapid amelioration of severe manic episodes with right unilateral ultrabrief pulse ECT: a case series of four patients.

    PubMed

    Sidorov, Alexey; Mayur, Prashanth

    2017-02-01

    The aim of this small case series is to describe four cases of severe mania, where ultrabrief pulse electroconvulsive therapy (ECT) was used as a primary mode of treatment. A retrospective file review was undertaken of four patients identified as having received ultrabrief pulse ECT for severe mania. The outcome measures for treatment efficacy were the Young Mania Rating Scale (YMRS) and Clinical Global Impression (CGI). All the patients showed significant clinical improvement. A comparison of pre- and post-treatment YMRS and CGI scores showed a dramatic decrease in all four cases. However, one patient was shifted to brief pulse ECT due to inadequate response. Ultrabrief pulse ECT may be an effective treatment in cases of severe mania. Due to the very small number of cases in the current case series, no specific conclusions regarding efficacy may be drawn; however, larger, controlled studies would be indicated.

  16. Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting

    NASA Astrophysics Data System (ADS)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    2017-07-01

    In this paper, we propose a new two-stage methodology that combines ensemble empirical mode decomposition (EEMD) with a multidimensional k-nearest neighbor model (MKNN) in order to forecast the closing price and the high price of stocks simultaneously. Modified k-nearest neighbor (KNN) algorithms have increasingly wide application in prediction across many fields. Empirical mode decomposition (EMD) decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs); however, it cannot reveal characteristic information of the signal with much accuracy as a result of mode mixing. Ensemble empirical mode decomposition (EEMD), an improved version of EMD, is therefore used to resolve the weaknesses of EMD by adding white noise to the original data. With EEMD, components with true physical meaning can be extracted from the time series. Utilizing the advantages of EEMD and MKNN, the proposed ensemble empirical mode decomposition combined with multidimensional k-nearest neighbor model (EEMD-MKNN) has high predictive precision for short-term forecasting. Moreover, we extend this methodology to the two-dimensional case to forecast the closing price and high price of four stocks (the NAS, S&P500, DJI and STI stock indices) at the same time. The results indicate that the proposed EEMD-MKNN model has higher forecast precision than the EMD-KNN and KNN methods and ARIMA.
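
    A hedged sketch of the k-nearest-neighbour forecasting step on a lag-embedded series; in the paper this would be applied to each EEMD component (obtainable, for example, with a package such as PyEMD) and the component forecasts summed, whereas here a raw synthetic series is used and the multidimensional extension is omitted.

    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.default_rng(5)
    series = np.cumsum(rng.normal(0, 1, 500))     # synthetic "closing price" series
    n_lags, k = 5, 7

    # Lag embedding: each row holds the n_lags previous values, target is the next one
    X = np.array([series[i:i + n_lags] for i in range(series.size - n_lags)])
    y = series[n_lags:]

    knn = KNeighborsRegressor(n_neighbors=k).fit(X[:-1], y[:-1])
    one_step_forecast = knn.predict(series[-n_lags:].reshape(1, -1))[0]
    print(f"one-step-ahead forecast: {one_step_forecast:.3f} (last value {series[-1]:.3f})")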

  17. Identification of AR(I)MA processes for modelling temporal correlations of GPS observations

    NASA Astrophysics Data System (ADS)

    Luo, X.; Mayer, M.; Heck, B.

    2009-04-01

    In many geodetic applications observations of the Global Positioning System (GPS) are routinely processed by means of the least-squares method. However, this algorithm delivers reliable estimates of unknown parameters and realistic accuracy measures only if both the functional and stochastic models are appropriately defined within GPS data processing. One deficiency of the stochastic model used in many GPS software products is the neglect of temporal correlations of GPS observations. In practice the knowledge of the temporal stochastic behaviour of GPS observations can be improved by analysing time series of residuals resulting from the least-squares evaluation. This paper presents an approach based on the theory of autoregressive (integrated) moving average (AR(I)MA) processes to model temporal correlations of GPS observations using time series of observation residuals. A practicable integration of AR(I)MA models in GPS data processing requires first determining the order parameters of the AR(I)MA processes. In the case of GPS, the identification of AR(I)MA processes can be affected by various factors impacting GPS positioning results, e.g. baseline length, multipath effects, observation weighting, or weather variations. The influences of these factors on AR(I)MA identification are empirically analysed based on a large number of representative residual time series resulting from differential GPS post-processing using 1-Hz observation data collected within the permanent SAPOS® (Satellite Positioning Service of the German State Survey) network. Both short and long time series are modelled by means of AR(I)MA processes. The final order parameters are determined based on the whole residual database; the corresponding empirical distribution functions illustrate that multipath and weather variations seem to affect the identification of AR(I)MA processes much more significantly than baseline length and observation weighting. Additionally, the results of modelling temporal correlations using high-order AR(I)MA processes are compared with those obtained using first-order autoregressive (AR(1)) processes and empirically estimated autocorrelation functions.
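
    The order-identification step can be illustrated by fitting autoregressive models of several candidate orders to a residual series and keeping the order with the lowest AIC; the synthetic AR(2) residuals below stand in for the SAPOS double-difference residuals, and the full ARMA/ARIMA search of the paper is not reproduced.

    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(13)
    n = 2000
    resid = np.zeros(n)
    for t in range(2, n):      # synthetic temporally correlated residuals (AR(2))
        resid[t] = 0.6 * resid[t - 1] - 0.2 * resid[t - 2] + rng.normal(0, 0.003)

    # Fit candidate AR(p) models and keep the order with the smallest AIC
    aic = {p: AutoReg(resid, lags=p).fit().aic for p in range(1, 9)}
    best_order = min(aic, key=aic.get)
    print("AIC by order:", {p: round(v, 1) for p, v in aic.items()})
    print("selected AR order:", best_order)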

  18. Multiple Indicator Stationary Time Series Models.

    ERIC Educational Resources Information Center

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  19. Case series: toxicity from 25B-NBOMe--a cluster of N-bomb cases.

    PubMed

    Gee, Paul; Schep, Leo J; Jensen, Berit P; Moore, Grant; Barrington, Stuart

    2016-01-01

    Background: A new class of hallucinogens called NBOMes has emerged. This class includes the analogues 25I-NBOMe, 25C-NBOMe and 25B-NBOMe. Case reports and judicial seizures indicate that 25I-NBOMe and 25C-NBOMe are more prevalently abused. There have been a few confirmed reports of 25B-NBOMe use or toxicity. Report: Observational case series. This report describes a series of 10 patients who suffered adverse effects from 25B-NBOMe. Hallucinations and violent agitation predominated, along with serotonergic/stimulant signs such as mydriasis, tachycardia, hypertension and hyperthermia. The majority (7/10) required sedation with benzodiazepines. Analytical method: 25B-NBOMe concentrations in plasma and urine were quantified in all patients using a validated liquid chromatography-tandem mass spectrometry (LC-MS/MS) method. Peak plasma levels ranged from 0.7 to 10.1 ng/ml. Discussion: The NBOMes are sought by users because of their hallucinogenic and stimulant effects. They are often sold as LSD or synthetic LSD. Reported cases of 25B-NBOMe toxicity are reviewed and compared to our series. Seizures and one pharmacological death have been described, but neither was observed in our series. Based on our experience with cases of mild to moderate toxicity, we suggest that management should be supportive and focused on preventing further (self) harm. High doses of benzodiazepines may be required to control agitation. Patients who develop significant hyperthermia need to be actively managed. Conclusions: Effects from 25B-NBOMe in our series were similar to previous individual case reports. The clinical features were also similar to effects from other analogues in the class (25I-NBOMe, 25C-NBOMe). Violent agitation was frequently present along with signs of serotonergic stimulation. Hyperthermia, rhabdomyolysis and kidney injury were also observed.

  20. Lenticular neovascularization subsequent to traumatic cataract formation.

    PubMed

    Kabat, Alan G

    2011-09-01

    To report a series of cases involving neovascularization within the human crystalline lens, a normally avascular structure, after ocular trauma. This is a retrospective, consecutive observational case series with review of the prevailing literature. Four individuals with a history of ocular trauma and subsequent cataract development were examined between May 2004 and April 2007. All had hypermature cataracts and intraocular inflammation, presumably secondary to phacolysis; two of the four had concurrent hyphema and ocular hypertension in the involved eye. All subjects in this series were found to display a discrete network of blood vessels within the structure of the crystalline lens, just beneath the anterior lens capsule. Neovascularization of the crystalline lens has received little attention in the ophthalmic literature, having been described only rarely in individual case reports. This manuscript details the first known case series involving lenticular neovascularization, and offers insight into its possible developmental mechanism.

  1. Amniotic fluid embolism mortality rate.

    PubMed

    Benson, Michael D

    2017-11-01

    The objective of this study was to determine the mortality rate of amniotic fluid embolism (AFE) using population-based studies and case series. A literature search was conducted using the two key words 'amniotic fluid embolism (AFE)' AND 'mortality rate'. Thirteen population-based studies were evaluated, as well as 36 case series including at least two patients. The mortality rate in population-based studies varied from 11% to 44%. When nine population-based studies covering over 17 000 000 live births were aggregated, the maternal mortality rate was 20.4%. In contrast, the mortality rate of AFE in case series varied from 0% to 100%, with numerous rates in between. The AFE mortality rate in population-based studies varied from 11% to 44%, with the best available evidence supporting an overall mortality rate of 20.4%. Data from case series should no longer be used as a basis for describing the lethality of AFE. © 2017 Japan Society of Obstetrics and Gynecology.
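
    The aggregation of population-based studies amounts to pooling deaths and cases before dividing, rather than averaging study-level rates; the three studies below are invented numbers used only to show the difference between the two calculations.

    studies = [                  # (AFE cases, AFE deaths) per hypothetical study
        (120, 30),
        (75, 12),
        (40, 6),
    ]
    total_cases = sum(c for c, _ in studies)
    total_deaths = sum(d for _, d in studies)
    pooled = 100 * total_deaths / total_cases
    naive_mean = 100 * sum(d / c for c, d in studies) / len(studies)
    print(f"pooled mortality rate: {pooled:.1f}% (simple mean of rates: {naive_mean:.1f}%)")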

  2. Neonatal medical exposures and characteristics of low birth weight hepatoblastoma cases: a report from the Children's Oncology Group.

    PubMed

    Turcotte, Lucie M; Georgieff, Michael K; Ross, Julie A; Feusner, James H; Tomlinson, Gail E; Malogolowkin, Marcio H; Krailo, Mark D; Miller, Nicole; Fonstad, Rachel; Spector, Logan G

    2014-11-01

    Hepatoblastoma is a malignancy of young children. Low birth weight is associated with significantly increased risk of hepatoblastoma and neonatal medical exposures are hypothesized as contributors. This study represents the largest case-control study of hepatoblastoma to date and aimed to define the role of neonatal exposures in hepatoblastoma risk among low birth weight children. Incident hepatoblastoma cases who were born <2,500 g (N = 60), diagnosed between 2000 and 2008, were identified through the Children's Oncology Group. Controls were recruited through state birth registries (N = 51). Neonatal medical exposures were abstracted from medical records. Subjects from the Vermont Oxford Network were used for further comparisons, as were existing reports on neonatal medical exposures. Case-control comparisons were hindered by poor matching within birth weight strata. Cases were smaller and received more aggressive neonatal treatment compared to controls, and reflected high correlation levels between birth weight and treatments. Similar difficulty was encountered when comparing cases to Vermont Oxford Network subjects; cases were smaller and required more aggressive neonatal therapy. Furthermore, it appears hepatoblastoma cases were exposed to a greater number of diagnostic X-rays than in case series previously reported in the neonatal literature. This study presents the largest case series of hepatoblastoma in <2,500 g birth weight infants with accompanying neonatal medical exposure data. Findings confirm that birth weight is highly correlated with exposure intensity, and neonatal exposures are themselves highly correlated, which hampers the identification of a causal exposure among hepatoblastoma cases. Experimental models or genetic susceptibility testing may be more revealing of etiology. © 2014 Wiley Periodicals, Inc.

  3. Contact problem for an elastic reinforcement bonded to an elastic plate

    NASA Technical Reports Server (NTRS)

    Erdogan, F.; Civelek, M. B.

    1974-01-01

    The contact problem for a thin elastic reinforcement bonded to an elastic plate is considered. The stiffening layer is treated as an elastic membrane and the base plate is assumed to be an elastic continuum. The bonding between the two materials is assumed to be either one of direct adhesion or through a thin adhesive layer which is treated as a shear spring. The solution for the simple case in which both the stiffener and the base plate are treated as membranes is also given. The contact stress is obtained for a series of numerical examples. In the direct adhesion case the contact stress becomes infinite at the stiffener ends with a typical square root singularity for the continuum model and behaving as a delta function for the membrane model. In the case of bonding through an adhesive layer the contact stress becomes finite and continuous along the entire contact area.

  4. Blind tests of methods for InSight Mars mission: Open scientific challenge

    NASA Astrophysics Data System (ADS)

    Clinton, John; Ceylan, Savas; Giardini, Domenico; Khan, Amir; van Driel, Martin; Böse, Maren; Euchner, Fabian; Garcia, Raphael F.; Drilleau, Mélanie; Lognonné, Philippe; Panning, Mark; Banerdt, Bruce

    2017-04-01

    The Marsquake Service (MQS) will be the ground segment service within the InSight mission to Mars, which will deploy a single seismic station on Elysium Planitia in November 2018. The main tasks of the MQS are the identification and characterisation of seismicity and the management of the Martian seismic event catalogue. In advance of the mission, we have developed a series of single-station event location methods that rely on a priori 1D and 3D structural models. In coordination with the Mars Structural Service, we expect to use iterative inversion techniques to revise these structural models and event locations. In order to seek methodological advancements and test our current approaches, we have designed a blind test case using Martian synthetics combined with realistic noise models for the Martian surface. We invite all scientific parties that are interested in single-station approaches and in exploring the Martian time series to participate and contribute to our blind test. We anticipate the test will improve currently developed location and structural inversion techniques, and also allow us to explore new single-station techniques for moment tensor and magnitude determination. The waveforms for our test case are computed employing AxiSEM and Instaseis for a randomly selected 1D background model and an event catalogue that is statistically consistent with our current expectation of Martian seismicity. Realistic seismic surface noise is superimposed to generate a continuous time series spanning 6 months. The event catalogue includes impacts as well as Martian quakes. The temporal distribution of the seismicity in the time series, as well as the true structural model, will not be known to any participating party, including the MQS, until the end of the competition. We provide our internal tools, such as event location codes, a suite of background models, and seismic phase travel times, to support researchers who are willing to use or improve our current methods. Following the deadline of our blind test in late 2017, we plan to combine all outcomes in an article with all participants as co-authors.

  5. A statistical approach for generating synthetic tip stress data from limited CPT soundings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Basalams, M.K.

    CPT tip stress data obtained from a uranium mill tailings impoundment are treated as time series. A statistical class of models that was developed to model time series is explored to investigate its applicability in modeling the tip stress series. These models were developed by Box and Jenkins (1970) and are known as Autoregressive Moving Average (ARMA) models. This research demonstrates how to apply the ARMA models to tip stress series. Generation of synthetic tip stress series that preserve the main statistical characteristics of the measured series is also investigated. Multiple regression analysis is used to model the regional variation of the ARMA model parameters as well as the regional variation of the mean and the standard deviation of the measured tip stress series. The reliability of the generated series is investigated from a geotechnical point of view as well as from a statistical point of view. Estimation of the total settlement using the measured and the generated series subjected to the same loading condition is performed. The variation of friction angle with depth of the impoundment materials is also investigated. This research shows that these series can be modeled by the Box and Jenkins ARMA models. A third-degree autoregressive model, AR(3), is selected to represent these series. A theoretical double exponential density function is fitted to the AR(3) model residuals. Synthetic tip stress series are generated at nearby locations. The generated series are shown to be reliable in estimating the total settlement and the friction angle variation with depth for this particular site.
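
    A hedged sketch of the generation scheme: an AR(3) model is fitted to a standardized tip-stress profile and a synthetic series is simulated from the fitted coefficients; the profile here is synthetic, the residuals are drawn as Gaussian rather than from the double exponential density used in the study, and the multiple-regression step for regional variation is omitted.

    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    rng = np.random.default_rng(21)
    depth_steps = 400
    measured = 5.0 + np.cumsum(rng.normal(0, 0.05, depth_steps))  # synthetic tip stress

    mean, std = measured.mean(), measured.std()
    standardized = (measured - mean) / std

    fit = AutoReg(standardized, lags=3).fit()
    c, phi = fit.params[0], fit.params[1:4]          # constant and AR coefficients
    sigma_e = np.sqrt(fit.sigma2)                    # residual standard deviation

    # Simulate a synthetic standardized series from the fitted AR(3) model
    synthetic = np.zeros(depth_steps)
    for t in range(3, depth_steps):
        synthetic[t] = c + phi @ synthetic[t - 3:t][::-1] + rng.normal(0, sigma_e)
    synthetic_tip_stress = mean + std * synthetic    # back-transform to stress units
    print("synthetic series mean/std:",
          synthetic_tip_stress.mean(), synthetic_tip_stress.std())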

  6. Modeling The Shock Initiation of PBX-9501 in ALE3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leininger, L; Springer, H K; Mace, J

    The SMIS (Specific Munitions Impact Scenario) experimental series performed at Los Alamos National Laboratory has determined the 3-dimensional shock initiation behavior of the HMX-based heterogeneous high explosive, PBX 9501. A series of finite element impact calculations have been performed in the ALE3D [1] hydrodynamic code and compared to the SMIS results to validate the code predictions. The SMIS tests use a powder gun to shoot scaled NATO standard fragments at a cylinder of PBX 9501, which has a PMMA case and a steel impact cover. The SMIS real-world shot scenario creates a unique test-bed because many of the fragments arrive at the impact plate off-center and at an angle of impact. The goal of these model validation experiments is to demonstrate the predictive capability of the Tarver-Lee Ignition and Growth (I&G) reactive flow model [2] in this fully 3-dimensional regime of Shock to Detonation Transition (SDT). The 3-dimensional Arbitrary Lagrangian-Eulerian hydrodynamic model in ALE3D applies the Ignition and Growth (I&G) reactive flow model with PBX 9501 parameters derived from historical 1-dimensional experimental data. The model includes the off-center and angle of impact variations seen in the experiments. Qualitatively, the ALE3D I&G calculations accurately reproduce the 'Go/No-Go' threshold of the Shock to Detonation Transition (SDT) reaction in the explosive, as well as the case expansion recorded by a high-speed optical camera. Quantitatively, the calculations show good agreement with the shock time of arrival at internal and external diagnostic pins. This exercise demonstrates the utility of the Ignition and Growth model applied in a predictive fashion for the response of heterogeneous high explosives in the SDT regime.

  7. An at-site flood estimation method in the context of nonstationarity I. A simulation study

    NASA Astrophysics Data System (ADS)

    Gado, Tamer A.; Nguyen, Van-Thanh-Van

    2016-04-01

    The stationarity of annual flood peak records is the traditional assumption of flood frequency analysis. In some cases, however, as a result of land-use and/or climate change, this assumption is no longer valid. Therefore, new statistical models are needed to capture dynamically the change of probability density functions over time, in order to obtain reliable flood estimation. In this study, an innovative method for nonstationary flood frequency analysis was presented. Here, the new method is based on detrending the flood series and applying the L-moments along with the GEV distribution to the transformed "stationary" series (hereafter, this is called the LM-NS). The LM-NS method was assessed through a comparative study with the maximum likelihood (ML) method for the nonstationary GEV model, as well as with the stationary (S) GEV model. The comparative study, based on Monte Carlo simulations, was carried out for three nonstationary GEV models: a linear dependence of the mean on time (GEV1), a quadratic dependence of the mean on time (GEV2), and a linear dependence of both the mean and the log standard deviation on time (GEV11). The simulation results indicated that the LM-NS method performs better than the ML method for most of the cases studied, whereas the stationary method provides the least accurate results. An additional advantage of the LM-NS method is to avoid the numerical problems (e.g., convergence problems) that may occur with the ML method when estimating parameters for small data samples.
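    To make the LM-NS idea concrete, the sketch below detrends a synthetic annual-maximum series and fits a GEV distribution to the detrended values by the method of L-moments (Hosking's rational approximation). The flood data, the linear trend form and the Python implementation are illustrative assumptions, not the study's code.

```python
# Minimal sketch of the LM-NS approach: detrend, then fit a GEV by L-moments.
import numpy as np
from math import gamma, log

rng = np.random.default_rng(1)
years = np.arange(1960, 2010)
floods = 100 + 0.8 * (years - years[0]) + rng.gumbel(0.0, 15.0, years.size)  # trending peaks

# 1) Remove the (assumed linear) trend in the mean, referenced to the final year.
slope = np.polyfit(years, floods, 1)[0]
detrended = floods - slope * (years - years[-1])

def sample_l_moments(x):
    """First two L-moments and the L-skewness, via probability-weighted moments."""
    x = np.sort(x)
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

def gev_from_l_moments(l1, l2, t3):
    """Hosking's approximation for GEV location, scale and shape from L-moments."""
    c = 2.0 / (3.0 + t3) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c ** 2              # shape (Hosking's sign convention)
    alpha = l2 * k / ((1 - 2 ** (-k)) * gamma(1 + k))
    xi = l1 - alpha * (1 - gamma(1 + k)) / k
    return xi, alpha, k

print(gev_from_l_moments(*sample_l_moments(detrended)))   # 2) GEV fit to the "stationary" series
```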

  8. Assessment of Turbulent Shock-Boundary Layer Interaction Computations Using the OVERFLOW Code

    NASA Technical Reports Server (NTRS)

    Oliver, A. B.; Lillard, R. P.; Schwing, A. M.; Blaisdell, G. A.; Lyrintzis, A. S.

    2007-01-01

    The performance of two popular turbulence models, the Spalart-Allmaras model and Menter's SST model, and one relatively new model, Olsen & Coakley's Lag model, are evaluated using the OVERFLOW code. Turbulent shock-boundary layer interaction predictions are evaluated with three different experimental datasets: a series of 2D compression ramps at Mach 2.87, a series of 2D compression ramps at Mach 2.94, and an axisymmetric cone-flare at Mach 11. The experimental datasets include flows with no separation, moderate separation, and significant separation, and use several different experimental measurement techniques (including laser Doppler velocimetry (LDV), pitot-probe measurement, inclined hot-wire probe measurement, Preston tube skin friction measurement, and surface pressure measurement). Additionally, the OVERFLOW solutions are compared to the solutions of a second CFD code, DPLR. The predictions for weak shock-boundary layer interactions are in reasonable agreement with the experimental data. For strong shock-boundary layer interactions, all of the turbulence models overpredict the separation size and fail to predict the correct skin friction recovery distribution. In most cases, surface pressure predictions show too much upstream influence; however, including the tunnel side-wall boundary layers in the computation improves the separation predictions.

  9. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
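    As one concrete example of the more complex methods mentioned above, the sketch below fits a multilevel (mixed-effects) learning-curve model with surgeon-specific intercepts, using Python's statsmodels. The simulated operation times, the log-linear learning-curve shape and the column names are assumptions, not the report's data or exact specification.

```python
# Minimal sketch: multilevel learning-curve model (operation time vs. log case order,
# random intercept per surgeon) on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for surgeon in range(10):                           # ten operators, as in the cholecystectomy series
    base = rng.normal(90.0, 10.0)                   # surgeon-specific starting time (minutes)
    for case_order in range(1, 41):
        rows.append({"surgeon": surgeon,
                     "case_order": case_order,
                     "op_time": base - 12.0 * np.log(case_order) + rng.normal(0.0, 8.0)})
df = pd.DataFrame(rows)

model = smf.mixedlm("op_time ~ np.log(case_order)", df, groups=df["surgeon"])
result = model.fit()
print(result.summary())   # fixed-effect slope = average learning rate; random effects = surgeon offsets
```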

  10. Hybrid perturbation methods based on statistical time series models

    NASA Astrophysics Data System (ADS)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
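    The sketch below illustrates only the prediction stage of such a hybrid propagator: the error of a simplified analytical theory, sampled at regular epochs, is modelled with an additive Holt-Winters method and the forecast correction is added back to the analytical prediction. The synthetic error series, the 16-epoch period and the statsmodels implementation are assumptions for illustration.

```python
# Minimal sketch: additive Holt-Winters forecast of the "missing dynamics" (propagation error).
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(3)
epochs = np.arange(400)
# Hypothetical along-track error (km) of a low-order analytical theory: drift + periodic term.
error = 0.002 * epochs + 0.5 * np.sin(2 * np.pi * epochs / 16) + rng.normal(0.0, 0.05, epochs.size)

train, horizon = error[:360], 40
hw = ExponentialSmoothing(train, trend="add", seasonal="add", seasonal_periods=16).fit()
correction = hw.forecast(horizon)    # predicted error for the next 40 epochs
# hybrid prediction = analytical prediction over those epochs + correction (analytical part not shown)
```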

  11. Development of time-trend model for analysing and predicting case pattern of dog bite injury induced rabies-like-illness in Liberia, 2014-2017.

    PubMed

    Jomah, N D; Ojo, J F; Odigie, E A; Olugasa, B O

    2014-12-01

    The post-civil war records of dog bite injuries (DBI) and rabies-like-illness (RLI) among humans in Liberia are a vital epidemiological resource for developing a predictive model to guide the allocation of resources towards human rabies control. Although DBI and RLI case numbers are high, they are largely under-reported. The objective of this study was to develop a time-trend model of the case pattern and apply it to derive predictors of the time-trend point distribution of DBI-RLI cases. Retrospective data covering 6 years of DBI distribution among humans countrywide were converted to a quarterly series using the Minimizing Squared First Difference transformation technique. The generated dataset was used to train a time-trend model of the DBI-RLI syndrome in Liberia. An additive deterministic time-trend model was selected due to its performance compared to a multiplicative model of trend and seasonal movement. Model parameters were estimated by the least squares method to predict DBI cases for a prospective 4-year period covering 2014-2017. The two-stage predictive model of the DBI case pattern between 2014 and 2017 was characterised by a uniform upward trend within Liberia's coastal and hinterland Counties over the forecast period. This paper describes a translational application of the time-trend distribution pattern of DBI epidemics reported in Liberia during 2008-2013, on which a predictive model was developed. A computationally feasible two-stage time-trend permutation approach is proposed to estimate the time-trend parameters and conduct predictive inference on DBI-RLI in Liberia.
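    The sketch below shows one way an additive deterministic trend-plus-seasonal model of quarterly counts can be fitted by least squares and extrapolated over a four-year horizon; the quarterly DBI counts, the dummy-variable seasonal terms and the statsmodels implementation are illustrative assumptions rather than the study's actual model.

```python
# Minimal sketch: additive trend + quarterly seasonal model fitted by OLS and extrapolated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
t = np.arange(24)                                   # 24 quarters, e.g. 2008-2013
season = np.tile([0, 1, 2, 3], 6)
cases = 50 + 2.0 * t + np.array([0, 8, 15, 5])[season] + rng.normal(0.0, 5.0, t.size)

X = sm.add_constant(np.column_stack([t, season == 1, season == 2, season == 3]).astype(float))
fit = sm.OLS(cases, X).fit()

t_new = np.arange(24, 40)                           # next 16 quarters, e.g. 2014-2017
season_new = np.tile([0, 1, 2, 3], 4)
X_new = sm.add_constant(
    np.column_stack([t_new, season_new == 1, season_new == 2, season_new == 3]).astype(float))
forecast = fit.predict(X_new)                       # additive trend + seasonal forecast
```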

  12. On the limits of probabilistic forecasting in nonlinear time series analysis II: Differential entropy.

    PubMed

    Amigó, José M; Hirata, Yoshito; Aihara, Kazuyuki

    2017-08-01

    In a previous paper, the authors studied the limits of probabilistic prediction in nonlinear time series analysis in a perfect model scenario, i.e., in the ideal case that the uncertainty of an otherwise deterministic model is due only to the finite precision of the observations. The model consisted of the symbolic dynamics of a measure-preserving transformation with respect to a finite partition of the state space, and the quality of the predictions was measured by the so-called ignorance score, which is a conditional entropy. In practice, though, partitions are dispensed with by considering numerical and experimental data to be continuous, which prompts us to trade off in this paper the Shannon entropy for the differential entropy. Despite technical differences, we show that the core of the previous results also holds in this extended scenario for sufficiently high precision. The corresponding imperfect model scenario will be revisited too because it is relevant for the applications. The theoretical part and its application to probabilistic forecasting are illustrated with numerical simulations and a new prediction algorithm.

  13. Vector Autoregressive Models and Granger Causality in Time Series Analysis in Nursing Research: Dynamic Changes Among Vital Signs Prior to Cardiorespiratory Instability Events as an Example.

    PubMed

    Bose, Eliezer; Hravnak, Marilyn; Sereika, Susan M

    Patients undergoing continuous vital sign monitoring (heart rate [HR], respiratory rate [RR], pulse oximetry [SpO2]) in real time display interrelated vital sign changes during situations of physiological stress. Patterns in this physiological cross-talk could portend impending cardiorespiratory instability (CRI). Vector autoregressive (VAR) modeling with Granger causality tests is one of the most flexible ways to elucidate underlying causal mechanisms in time series data. The purpose of this article is to illustrate the development of patient-specific VAR models using vital sign time series data in a sample of acutely ill, monitored, step-down unit patients and determine their Granger causal dynamics prior to onset of an incident CRI. CRI was defined as vital signs beyond stipulated normality thresholds (HR = 40-140/minute, RR = 8-36/minute, SpO2 < 85%) and persisting for 3 minutes within a 5-minute moving window (60% of the duration of the window). A 6-hour time segment prior to onset of first CRI was chosen for time series modeling in 20 patients using a six-step procedure: (a) the uniform time series for each vital sign was assessed for stationarity, (b) appropriate lag was determined using a lag-length selection criteria, (c) the VAR model was constructed, (d) residual autocorrelation was assessed with the Lagrange Multiplier test, (e) stability of the VAR system was checked, and (f) Granger causality was evaluated in the final stable model. The primary cause of incident CRI was low SpO2 (60% of cases), followed by out-of-range RR (30%) and HR (10%). Granger causality testing revealed that change in RR caused change in HR (21%; i.e., RR changed before HR changed) more often than change in HR causing change in RR (15%). Similarly, changes in RR caused changes in SpO2 (15%) more often than changes in SpO2 caused changes in RR (9%). For HR and SpO2, changes in HR causing changes in SpO2 and changes in SpO2 causing changes in HR occurred with equal frequency (18%). Within this sample of acutely ill patients who experienced a CRI event, VAR modeling indicated that RR changes tend to occur before changes in HR and SpO2. These findings suggest that contextual assessment of RR changes as the earliest sign of CRI is warranted. Use of VAR modeling may be helpful in other nursing research applications based on time series data.
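    The sketch below walks through the six-step VAR/Granger workflow on synthetic HR, RR and SpO2 streams; the simulated signals, lag choices and the statsmodels Portmanteau whiteness test (used here in place of the Lagrange Multiplier test) are assumptions for illustration.

```python
# Minimal sketch of the VAR + Granger-causality workflow on synthetic vital-sign streams.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
n = 2000
rr = 16 + np.cumsum(rng.normal(0.0, 0.02, n))                    # slowly drifting respiratory rate
hr = 80 + 0.8 * np.roll(rr - 16, 2) + rng.normal(0.0, 0.5, n)    # HR follows RR with a short lag
spo2 = 97 - 0.3 * np.roll(rr - 16, 4) + rng.normal(0.0, 0.2, n)
data = pd.DataFrame({"HR": hr, "RR": rr, "SpO2": spo2}).diff().dropna()  # difference to stationarity

print(adfuller(data["RR"])[1])                      # (a) stationarity check (ADF p-value)
model = VAR(data)
order = model.select_order(maxlags=10)              # (b) lag-length selection criteria
results = model.fit(order.aic)                      # (c) construct the VAR model
print(results.test_whiteness().summary())           # (d) residual autocorrelation check
print(results.is_stable())                          # (e) stability of the VAR system
print(results.test_causality("HR", ["RR"], kind="f").summary())  # (f) does RR Granger-cause HR?
```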

  14. Temporal abstraction for the analysis of intensive care information

    NASA Astrophysics Data System (ADS)

    Hadad, Alejandro J.; Evin, Diego A.; Drozdowicz, Bartolomé; Chiotti, Omar

    2007-11-01

    This paper proposes a scheme for the analysis of time-stamped series data from multiple monitoring devices of intensive care units, using Temporal Abstraction concepts. This scheme is oriented towards obtaining a description of the patient state evolution in an unsupervised way. The case study is based on a dataset clinically classified with Pulmonary Edema. For this dataset a trends-based Temporal Abstraction mechanism is proposed, by means of a Behaviours Base of time-stamped series, which is then used in a classification step. Combining this approach with the introduction of expert knowledge, using Fuzzy Logic, and multivariate analysis by means of Self-Organizing Maps, a states characterization model is obtained. This model can feasibly be extended to different patient groups and states. The proposed scheme allows intermediate descriptions of the states through which the patient is passing to be obtained, which could be used to anticipate alert situations.

  15. Formability analysis of aluminum alloy sheets at elevated temperatures with numerical simulation based on the M-K method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bagheriasl, Reza; Ghavam, Kamyar; Worswick, Michael

    2011-05-04

    The effect of temperature on formability of aluminum alloy sheet is studied by developing Forming Limit Diagrams, FLD, for aluminum alloy 3000-series using the Marciniak and Kuczynski technique by numerical simulation. The numerical model is implemented in LS-DYNA and incorporates Barlat's YLD2000 anisotropic yield function and the temperature-dependent Bergstrom hardening law. Three different temperatures, room temperature, 250 deg. C and 300 deg. C, are studied. For each temperature case, various loading conditions are applied to the M-K defect model. The effect of the material anisotropy is considered by varying the defect angle. A simplified failure criterion is used to predict the onset of necking. Minor and major strains are obtained from the simulations and plotted for each temperature level. It is demonstrated that temperature improves the forming limit of aluminum 3000-series alloy sheet.

  16. A framework for assessing cumulative effects in watersheds: an introduction to Canadian case studies.

    PubMed

    Dubé, Monique G; Duinker, Peter; Greig, Lorne; Carver, Martin; Servos, Mark; McMaster, Mark; Noble, Bram; Schreier, Hans; Jackson, Lee; Munkittrick, Kelly R

    2013-07-01

    From 2008 to 2013, a series of studies supported by the Canadian Water Network were conducted in Canadian watersheds in an effort to improve methods to assess cumulative effects. These studies fit under a common framework for watershed cumulative effects assessment (CEA). This article presents an introduction to the Special Series on Watershed CEA in IEAM including the framework and its impetus, a brief introduction to each of the articles in the series, challenges, and a path forward. The framework includes a regional water monitoring program that produces 3 core outputs: an accumulated state assessment, stressor-response relationships, and development of predictive cumulative effects scenario models. The framework considers core values, indicators, thresholds, and use of consistent terminology. It emphasizes that CEA requires 2 components, accumulated state quantification and predictive scenario forecasting. It recognizes both of these components must be supported by a regional, multiscale monitoring program. Copyright © 2013 SETAC.

  17. Impact of the 13-Valent Pneumococcal Conjugate Vaccine on Clinical and Hypoxemic Childhood Pneumonia over Three Years in Central Malawi: An Observational Study

    PubMed Central

    McCollum, Eric D.; Nambiar, Bejoy; Deula, Rashid; Zadutsa, Beatiwel; Bondo, Austin; King, Carina; Beard, James; Liyaya, Harry; Mankhambo, Limangeni; Lazzerini, Marzia; Makwenda, Charles; Masache, Gibson; Bar-Zeev, Naor; Kazembe, Peter N.; Mwansambo, Charles; Lufesi, Norman; Costello, Anthony; Armstrong, Ben

    2017-01-01

    Background The pneumococcal conjugate vaccine's (PCV) impact on childhood pneumonia during programmatic conditions in Africa is poorly understood. Following PCV13 introduction in Malawi in November 2011, we evaluated the case burden and rates of childhood pneumonia. Methods and Findings Between January 1, 2012 and June 30, 2014 we conducted active pneumonia surveillance in children <5 years at seven hospitals, 18 health centres, and with 38 community health workers in two districts, central Malawi. Eligible children had clinical pneumonia per Malawi guidelines, defined as fast breathing only, chest indrawing +/- fast breathing, or ≥1 clinical danger sign. Since pulse oximetry was not in the Malawi guidelines, oxygenation <90% defined hypoxemic pneumonia, a distinct category from clinical pneumonia. We quantified the pneumonia case burden and rates in two ways. We compared the period immediately following vaccine introduction (early) to the period with >75% three-dose PCV13 coverage (post). We also used multivariable time-series regression, adjusting for autocorrelation and exploring seasonal variation and alternative model specifications in sensitivity analyses. The early versus post analysis showed an increase in cases and rates of total, fast breathing, and indrawing pneumonia and a decrease in danger sign and hypoxemic pneumonia, and pneumonia mortality. At 76% three-dose PCV13 coverage, versus 0%, the time-series model showed a non-significant increase in total cases (+47%, 95% CI: -13%, +149%, p = 0.154); fast breathing cases increased 135% (+39%, +297%, p = 0.001); however, hypoxemia fell 47% (-5%, -70%, p = 0.031) and hospital deaths decreased 36% (-1%, -58%, p = 0.047) in children <5 years. We observed a shift towards disease without danger signs, as the proportion of cases with danger signs decreased by 65% (-46%, -77%, p<0.0001). These results were generally robust to plausible alternative model specifications. Conclusions Thirty months after PCV13 introduction in Malawi, the health system burden and rates of the severest forms of childhood pneumonia, including hypoxemia and death, have markedly decreased. PMID:28052071

  18. Inferring the relative resilience of alternative states

    USGS Publications Warehouse

    Angeler, David G.; Allen, Craig R.; Rojo, Carmen; Alvarez-Cobelas, Miguel; Rodrigo, Maria A.; Sanchez-Carrillo, Salvador

    2013-01-01

    Ecological systems may occur in alternative states that differ in ecological structures, functions and processes. Resilience is the measure of disturbance an ecological system can absorb before changing states. However, how the intrinsic structures and processes of systems that characterize their states affects their resilience remains unclear. We analyzed time series of phytoplankton communities at three sites in a floodplain in central Spain to assess the dominant frequencies or “temporal scales” in community dynamics and compared the patterns between a wet and a dry alternative state. The identified frequencies and cross-scale structures are expected to arise from positive feedbacks that are thought to reinforce processes in alternative states of ecological systems and regulate emergent phenomena such as resilience. Our analyses show a higher species richness and diversity but lower evenness in the dry state. Time series modeling revealed a decrease in the importance of short-term variability in the communities, suggesting that community dynamics slowed down in the dry relative to the wet state. The number of temporal scales at which community dynamics manifested, and the explanatory power of time series models, was lower in the dry state. The higher diversity, reduced number of temporal scales and the lower explanatory power of time series models suggest that species dynamics tended to be more stochastic in the dry state. From a resilience perspective our results highlight a paradox: increasing species richness may not necessarily enhance resilience. The loss of cross-scale structure (i.e. the lower number of temporal scales) in community dynamics across sites suggests that resilience erodes during drought. Phytoplankton communities in the dry state are therefore likely less resilient than in the wet state. Our case study demonstrates the potential of time series modeling to assess attributes that mediate resilience. The approach is useful for assessing resilience of alternative states across ecological and other complex systems.

  19. Interchangeability of counts of cases and hours of cases for quantifying a hospital's change in workload among four-week periods of 1 year.

    PubMed

    Dexter, Franklin; Epstein, Richard H; Ledolter, Johannes; Wanderer, Jonathan P

    2018-05-16

    Recent studies have made longitudinal assessments of case counts using State (e.g., United States) and Provincial (e.g., Canada) databases. Such databases rarely include either operating room (OR) or anesthesia times and, even when duration data are available, there are major statistical limitations to their use. We evaluated how to forecast short-term changes in OR caseload and workload (hours) and how to decide whether changes are outliers (e.g., significant, abrupt decline in anesthetics). Observational cohort study. Large teaching hospital. 35 years of annual anesthesia caseload data. Annual data were used without regard to where or when in the year each case was performed, thereby matching public use files. Changes in caseload or hours among four-week periods were examined within individual year-long periods using 159 consecutive four-week periods from the same hospital. Series of 12 four-week periods of the hours of cases performed on workdays lacked trend or correlation among periods for 49 of 50 series and followed normal distributions for 50 of 50 series. These criteria also were satisfied for 50 of 50 series based on counts of cases. The Pearson r = 0.999 between hours of anesthetics and cases. For purposes of time series analysis of total workload at a hospital within 1-year, hours of cases and counts of cases are interchangeable. Simple control chart methods of detecting sudden changes in workload or caseload, based simply on the sample mean and standard deviation from the preceding year, are appropriate. Copyright © 2018 Elsevier Inc. All rights reserved.
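    The sketch below illustrates the simple control-chart rule suggested above: a four-week period is flagged when its workload falls outside limits built from the mean and standard deviation of the preceding year's thirteen four-week periods. The hour totals and the three-sigma limits are illustrative assumptions.

```python
# Minimal sketch: control-chart detection of an abrupt change in four-week OR workload.
import numpy as np

prior_year = np.array([1510, 1480, 1525, 1490, 1505, 1470, 1500,
                       1515, 1495, 1485, 1520, 1500, 1492], dtype=float)  # 13 four-week periods (hours)
new_period = 1310.0                                                       # latest four-week total

mean, sd = prior_year.mean(), prior_year.std(ddof=1)
lower, upper = mean - 3 * sd, mean + 3 * sd          # three-sigma limits (one possible choice)
print(f"control limits: [{lower:.0f}, {upper:.0f}] hours")
if not (lower <= new_period <= upper):
    print("outlier period: significant, abrupt change in workload")
```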

  20. Wernicke's Aphasia Reflects a Combination of Acoustic-Phonological and Semantic Control Deficits: A Case-Series Comparison of Wernicke's Aphasia, Semantic Dementia and Semantic Aphasia

    ERIC Educational Resources Information Center

    Robson, Holly; Sage, Karen; Lambon Ralph, Matthew A.

    2012-01-01

    Wernicke's aphasia (WA) is the classical neurological model of comprehension impairment and, as a result, the posterior temporal lobe is assumed to be critical to semantic cognition. This conclusion is potentially confused by (a) the existence of patient groups with semantic impairment following damage to other brain regions (semantic dementia and…

  1. HRD--Leadership Training for Women on the Lower Rungs of the Organizational Ladder: A Qualitative Study

    ERIC Educational Resources Information Center

    Dugan, Dixie

    2010-01-01

    The purpose of this case study, HRD--Leadership Training for Women on the Lower Rungs of the Organizational Ladder: A Qualitative Study, was to determine the responses of a group of women to a series of classes on leadership development and to perceive what they might have gained from this experience. These classes were modeled after those offered…

  2. Promoting Student-Centered Learning Using iPads in a Grade 1 Classroom: Using the Digital Didactic Framework to Deconstruct Instruction

    ERIC Educational Resources Information Center

    Woloshyn, Vera Ella; Bajovic, Mira; Worden, Melissa Maney

    2017-01-01

    In this qualitative case study, we provide a series of vignettes illustrating a Grade 1 teacher's experiences integrating iPad technology into her instruction over a school year. We use the digital didactic model to deconstruct these vignettes and draw upon the teacher's reflections to gain further insights about her instructional experiences…

  3. Transvestism as a Symptom: A Case Series.

    PubMed

    Anupama, M; Gangadhar, K H; Shetty, Vandana B; Dip, P Bhadja

    2016-01-01

    Transvestism, commonly termed cross-dressing, means dressing in the clothing of the opposite sex. We describe a series of three cases with transvestism as one of their primary complaints. The discussion sheds light on the various ways in which transvestism as a symptom can present in Psychiatry. In the first two cases, there was lower intelligence. In the first and third cases, there were other paraphilias along with transvestism. The second case had co-morbid obsessive-compulsive disorder (OCD) and had a good response to a selective serotonin reuptake inhibitor (SSRI).

  4. An Investigation of the Impact of Aerodynamic Model Fidelity on Close-In Combat Effectiveness Prediction in Piloted Simulation

    NASA Technical Reports Server (NTRS)

    Persing, T. Ray; Bellish, Christine A.; Brandon, Jay; Kenney, P. Sean; Carzoo, Susan; Buttrill, Catherine; Guenther, Arlene

    2005-01-01

    Several aircraft airframe modeling approaches are currently being used in the DoD community for acquisition, threat evaluation, training, and other purposes. To date there has been no clear empirical study of the impact of airframe simulation fidelity on piloted real-time aircraft simulation study results, or when use of a particular level of fidelity is indicated. This paper documents a series of piloted simulation studies using three different levels of airframe model fidelity. This study was conducted using the NASA Langley Differential Maneuvering Simulator. Evaluations were conducted with three pilots for scenarios requiring extensive maneuvering of the airplanes during air combat. In many cases, a low-fidelity modified point-mass model may be sufficient to evaluate the combat effectiveness of the aircraft. However, in cases where high angle-of-attack flying qualities and aerodynamic performance are a factor or when precision tracking ability of the aircraft must be represented, use of high-fidelity models is indicated.

  5. Case Series: Chikungunya and Dengue at a Forward Operating Location

    DTIC Science & Technology

    2015-05-01

    Journal article (dates covered: November 2014 - January 2015). This report presents a case series of chikungunya and dengue at a forward operating location and discusses the significance of this disease in the Americas and the diagnostic challenges when other arboviruses such as dengue are present. Subject terms: chikungunya, dengue, mosquitoes.

  6. Enabling the use of climate model data in the Dutch climate effect community

    NASA Astrophysics Data System (ADS)

    Som de Cerff, Wim; Plieger, Maarten

    2010-05-01

    Within the climate effect community the usage of climate model data is emerging. Where mostly climate time series and weather generators were used, there is a shift to incorporate climate model data into climate effect models. The use of climate model data within the climate effect models is difficult, due to missing metadata, resolution and projection issues, data formats and availability of the parameters of interest. Often the climate effect modelers are not aware of available climate model data or are not aware of how they can use it. Together with seven other partners (CERFACS, CNR-IPSL, SMHI, INHGA, CMCC, WUR, MF-CNRM), KNMI is involved in the FP7 IS-ENES (http://www.enes.org) project work package 10/JRA5 'Bridging Climate Research Data and the Needs of the Impact Community'. The aims of this work package are to enhance the use of Climate Research Data and to enhance the interaction with climate effect/impact communities. Phase one is to define use cases together with the Dutch climate effect community, which describe the intended use of climate model data in climate effect models. We defined four use cases: 1) FEWS hydrological Framework (Deltares) 2) METAPHOR, a plants and species dispersion model (Wageningen University) 3) Natuurplanner, an Ecological model suite (Wageningen University) 4) Land use models (Free University/JRC). Also the other partners in JRA5 have defined use cases, which are representative for the climate effect and impact communities in their country. The goal is to find commonalities between all defined use cases. The common functionality will be implemented as e-tools and incorporated in the IS-ENES data portal. Common issues relate to, e.g., the need for high resolution: downscaling from GCM to local scale (also involves interpolation); parameter selection; finding extremes; averaging methods. At the conference we will describe the FEWS case in more detail: Delft FEWS is an open shell system (in development since 1995) for performing hydrological predictions and the handling of time series data. The most important capabilities of FEWS are importing of meteorological and hydrological data and organizing the workflows of the different models which can be used within FEWS, like the Netherlands Hydrological Instrumentarium (NHI). Besides predictions, the system is currently being used for hydrological climate effects studies. Currently regionally downscaled data are used, but using model data will be the next step. This coupling of climate model data to FEWS will open a wider range of climate impact and effect research, but it is a difficult task to accomplish. Issues to be dealt with are: regridding, downscaling, format conversion, extraction of required data and addition of descriptive metadata, including quality and uncertainty parameters. Finding an appropriate solution involves several iterations: first, the use case was defined, then we just provided a single data file containing some data of interest via FTP, next this data was offered through OGC services. Currently we are working on providing larger datasets and improving on the parameters and metadata. We will present the results (e-tools/data) and experiences gained in implementing the described use cases. Note that we are currently using experimental data, as the official climate model runs are not available yet.

  7. Modeled Forecasts of Dengue Fever in San Juan, Puerto Rico Using NASA Satellite Enhanced Weather Forecasts

    NASA Astrophysics Data System (ADS)

    Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.

    2015-12-01

    Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question, because the error associated with weather and climate forecasts is not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the past weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data was produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system enhanced using additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast being evaluated using county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower than observed temperatures. Second, the DF model was often overly influenced by the previous weeks' DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer range forecasts so that public health workers can better prepare for dengue epidemics.

  8. Forecasting and prediction of scorpion sting cases in Biskra province, Algeria, using a seasonal autoregressive integrated moving average model

    PubMed Central

    2016-01-01

    OBJECTIVES The aims of this study were to highlight some epidemiological aspects of scorpion envenomations, to analyse and interpret the available data for Biskra province, Algeria, and to develop a forecasting model for scorpion sting cases in Biskra province, which records the highest number of scorpion stings in Algeria. METHODS In addition to analysing the epidemiological profile of scorpion stings that occurred throughout the year 2013, we used the Box-Jenkins approach to fit a seasonal autoregressive integrated moving average (SARIMA) model to the monthly recorded scorpion sting cases in Biskra from 2000 to 2012. RESULTS The epidemiological analysis revealed that scorpion stings were reported continuously throughout the year, with peaks in the summer months. The most affected age group was 15 to 49 years old, with a male predominance. The most prone human body areas were the upper and lower limbs. The majority of cases (95.9%) were classified as mild envenomations. The time series analysis showed that a (5,1,0)×(0,1,1)12 SARIMA model offered the best fit to the scorpion sting surveillance data. This model was used to predict scorpion sting cases for the year 2013, and the fitted data showed considerable agreement with the actual data. CONCLUSIONS SARIMA models are useful for monitoring scorpion sting cases, and provide an estimate of the variability to be expected in future scorpion sting cases. This knowledge is helpful in predicting whether an unusual situation is developing or not, and could therefore assist decision-makers in strengthening the province’s prevention and control measures and in initiating rapid response measures. PMID:27866407
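    The sketch below shows how a SARIMA model of the form reported above, (5,1,0)×(0,1,1) with a 12-month season, can be fitted and used to forecast the next year with Python's statsmodels; the monthly sting counts are synthetic placeholders for the Biskra surveillance data.

```python
# Minimal sketch: fit a (5,1,0)x(0,1,1)12 SARIMA model and forecast twelve months ahead.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
months = pd.date_range("2000-01-01", periods=156, freq="MS")             # 2000-2012, monthly
seasonal = 200 + 150 * np.sin(2 * np.pi * (months.month - 3) / 12)       # summer peak
cases = pd.Series((seasonal + rng.normal(0.0, 20.0, months.size)).clip(0), index=months)

model = SARIMAX(cases, order=(5, 1, 0), seasonal_order=(0, 1, 1, 12))
fit = model.fit(disp=False)
forecast_next_year = fit.forecast(steps=12)   # predicted monthly scorpion-sting cases
print(forecast_next_year.round(1))
```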

  9. Multistep-Ahead Air Passengers Traffic Prediction with Hybrid ARIMA-SVMs Models

    PubMed Central

    Ming, Wei; Xiong, Tao

    2014-01-01

    Hybrid ARIMA-SVMs prediction models have been established recently; they take advantage of the unique strengths of ARIMA and SVMs models in linear and nonlinear modeling, respectively. Building upon such hybrid ARIMA-SVMs models, this study goes further to extend them to the case of multistep-ahead prediction of air passenger traffic with the two most commonly used multistep-ahead prediction strategies, namely the iterated strategy and the direct strategy. Additionally, the effectiveness of data preprocessing approaches, such as deseasonalization and detrending, is investigated and demonstrated along with the two strategies. Real data sets including four selected airlines' monthly series were collected to justify the effectiveness of the proposed approach. Empirical results demonstrate that the direct strategy performs better than the iterated one for long-term prediction, while the iterated one performs better for short-term prediction. Furthermore, both deseasonalization and detrending can significantly improve the prediction accuracy for both strategies, indicating the necessity of data preprocessing. As such, this study serves as a reference for planners in the air transportation industry on how to tackle multistep-ahead prediction tasks in the implementation of either prediction strategy. PMID:24723814
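    One common way to build such a hybrid is to let ARIMA capture the linear structure and a support-vector regression on lagged ARIMA residuals capture the remaining nonlinearity; the sketch below shows a one-step-ahead version on a synthetic monthly series, with the deseasonalization and detrending steps omitted. The data, model orders and kernel settings are illustrative assumptions.

```python
# Minimal sketch of a hybrid ARIMA + SVR forecast: ARIMA forecast plus an SVR prediction
# of the next ARIMA residual from its recent lags.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.svm import SVR

rng = np.random.default_rng(7)
t = np.arange(180)
passengers = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0.0, 3.0, t.size)

arima_fit = ARIMA(passengers, order=(2, 1, 1)).fit()
resid = arima_fit.resid

p = 3                                                    # number of residual lags fed to the SVR
X = np.column_stack([resid[i:len(resid) - p + i] for i in range(p)])
y = resid[p:]
svr = SVR(kernel="rbf", C=10.0).fit(X, y)

next_resid = svr.predict(resid[-p:].reshape(1, -1))[0]   # nonlinear correction
hybrid_forecast = arima_fit.forecast(1)[0] + next_resid  # linear forecast + correction
```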

  10. Queuing Time Prediction Using WiFi Positioning Data in an Indoor Scenario.

    PubMed

    Shu, Hua; Song, Ci; Pei, Tao; Xu, Lianming; Ou, Yang; Zhang, Libin; Li, Tao

    2016-11-22

    Queuing is common in urban public places. Automatically monitoring and predicting queuing time can not only help individuals to reduce their wait time and alleviate anxiety but also help managers to allocate resources more efficiently and enhance their ability to address emergencies. This paper proposes a novel method to estimate and predict queuing time in indoor environments based on WiFi positioning data. First, we use a series of parameters to identify the trajectories that can be used as representatives of queuing time. Next, we divide the day into equal time slices and estimate individuals' average queuing time during specific time slices. Finally, we build a nonstandard autoregressive (NAR) model trained using the previous day's WiFi estimation results and actual queuing time to predict the queuing time in the upcoming time slice. A case study comparing two other time series analysis models shows that the NAR model has better precision. Random topological errors caused by the drift phenomenon of WiFi positioning technology (locations determined by a WiFi positioning system may drift accidentally) and systematic topological errors caused by the positioning system are the main factors that affect the estimation precision. Therefore, we optimize the deployment strategy during the positioning system deployment phase and propose a drift ratio parameter pertaining to the trajectory screening phase to alleviate the impact of topological errors and improve estimates. The WiFi positioning data from an eight-day case study conducted at the T3-C entrance of Beijing Capital International Airport show that the mean absolute estimation error is 147 s, which is approximately 26.92% of the actual queuing time. For predictions using the NAR model, the proportion is approximately 27.49%. The theoretical predictions and the empirical case study indicate that the NAR model is an effective method to estimate and predict queuing time in indoor public areas.

  12. An Ad-Hoc Adaptive Pilot Model for Pitch Axis Gross Acquisition Tasks

    NASA Technical Reports Server (NTRS)

    Hanson, Curtis E.

    2012-01-01

    An ad-hoc algorithm is presented for real-time adaptation of the well-known crossover pilot model and applied to pitch axis gross acquisition tasks in a generic fighter aircraft. Off-line tuning of the crossover model to human pilot data gathered in a fixed-base, high-fidelity simulation is first accomplished for a series of changes in aircraft dynamics to provide expected values for model parameters. It is shown that in most cases, for this application, the traditional crossover model can be reduced to a gain and a time delay. The ad-hoc adaptive pilot gain algorithm is shown to have desirable convergence properties for most types of changes in aircraft dynamics.

  13. How are you feeling?: A personalized methodology for predicting mental states from temporally observable physical and behavioral information.

    PubMed

    Tuarob, Suppawong; Tucker, Conrad S; Kumara, Soundar; Giles, C Lee; Pincus, Aaron L; Conroy, David E; Ram, Nilam

    2017-04-01

    It is believed that anomalous mental states such as stress and anxiety not only cause suffering for the individuals, but also lead to tragedies in some extreme cases. The ability to predict the mental state of an individual at both current and future time periods could prove critical to healthcare practitioners. Currently, the practical way to predict an individual's mental state is through mental examinations that involve psychological experts performing the evaluations. However, such methods can be time- and resource-consuming, limiting their applicability to a wide population. Furthermore, some individuals may also be unaware of their mental states or may feel uncomfortable expressing themselves during the evaluations. Hence, their anomalous mental states could remain undetected for a prolonged period of time. The objective of this work is to demonstrate the ability of advanced machine learning based approaches to generate mathematical models that predict current and future mental states of an individual. The problem of mental state prediction is transformed into a time series forecasting problem, where an individual is represented as a multivariate time series stream of monitored physical and behavioral attributes. A personalized mathematical model is then automatically generated to capture the dependencies among these attributes, and is used for prediction of mental states for each individual. In particular, we first illustrate the drawbacks of traditional multivariate time series forecasting methodologies such as vector autoregression. Then, we show that such issues can be mitigated by using machine learning regression techniques which are modified for capturing temporal dependencies in time series data. A case study using data from 150 human participants illustrates that the proposed machine learning based forecasting methods are more suitable for high-dimensional psychological data than the traditional vector autoregressive model in terms of both magnitude of error and directional accuracy. These results not only present a successful usage of machine learning techniques in psychological studies, but also serve as a building block for multiple medical applications that could rely on an automated system to gauge individuals' mental states. Copyright © 2017 Elsevier Inc. All rights reserved.
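    The sketch below illustrates the general recipe described above: the monitored streams are turned into lagged features and a per-person machine-learning regressor forecasts the next mental-state score. The attribute names, the three-day lag depth and the random-forest choice are assumptions for illustration, not the study's variables or model.

```python
# Minimal sketch: lagged-feature regression for forecasting a mental-state score.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(8)
n = 300
sleep = rng.normal(7.0, 1.0, n)                    # hours of sleep (observed attribute)
activity = rng.normal(60.0, 15.0, n)               # minutes of activity (observed attribute)
stress = 5 - 0.4 * (sleep - 7) - 0.02 * (activity - 60) + rng.normal(0.0, 0.3, n)  # target score

lags = 3
X, y = [], []
for t in range(lags - 1, n - 1):
    X.append(np.concatenate([sleep[t - lags + 1:t + 1],
                             activity[t - lags + 1:t + 1],
                             stress[t - lags + 1:t + 1]]))
    y.append(stress[t + 1])                        # predict tomorrow's score from the last 3 days
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(np.array(X), np.array(y))
next_day = model.predict(
    np.concatenate([sleep[-lags:], activity[-lags:], stress[-lags:]]).reshape(1, -1))
```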

  14. Using surface displacement derived from GRACE to constrain the water loading signal in cGPS measurements in the Amazon Basin

    NASA Astrophysics Data System (ADS)

    Jose, L.; Bennett, R. A.; Harig, C.

    2017-12-01

    Currently, cGPS data is well suited to track vertical changes in the Earth's surface. However, there are annual, semi-annual, and interannual signals within cGPS time series that are not well constrained. We hypothesize that these signals are primarily due to water loading. If this is the case, the conventional method of modeling cGPS data as an annual or semiannual sinusoid falls short, as such models cannot accurately capture all variations in surface displacement, especially those due to extreme hydrologic events. We believe that we can better correct the cGPS time series with another method we are developing wherein we use a time series of surface displacement derived from the GRACE geopotential field instead of a sinusoidal model to correct the data. Currently, our analysis is constrained to the Amazon Basin, where the signal due to water loading is large enough to appear in both the GRACE and cGPS measurements. The vertical signal from cGPS stations across the Amazon Basin show an apparent spatial correlation, which further supports our idea that these signals are due to a regional water loading signal. In our preliminary research, we used tsview for Matlab to find that the WRMS of the corrected cGPS time series can be reduced as much as 30% from the model corrected data to the GRACE corrected data. The Amazon, like many places around the world, has experienced extreme drought, in 2005, 2010, and recently in 2015. In addition to making the cGPS vertical signal more robust, the method we are developing has the potential to help us understand the effects of these weather events and track trends in water loading.

  15. Modeling a High Explosive Cylinder Experiment

    NASA Astrophysics Data System (ADS)

    Zocher, Marvin A.

    2017-06-01

    Cylindrical assemblies constructed from high explosives encased in an inert confining material are often used in experiments aimed at calibrating and validating continuum level models for the so-called equation of state (constitutive model for the spherical part of the Cauchy tensor). Such is the case in the work to be discussed here. In particular, work will be described involving the modeling of a series of experiments involving PBX-9501 encased in a copper cylinder. The objective of the work is to test and perhaps refine a set of phenomenological parameters for the Wescott-Stewart-Davis reactive burn model. The focus of this talk will be on modeling the experiments, which turned out to be non-trivial. The modeling is conducted using ALE methodology.

  16. Predictors of re-entry into the child protection system in Singapore: a cumulative ecological-transactional risk model.

    PubMed

    Li, Dongdong; Chu, Chi Meng; Ng, Wei Chern; Leong, Wai

    2014-11-01

    This study examines the risk factors of re-entry for 1,750 child protection cases in Singapore using a cumulative ecological-transactional risk model. Using administrative data, the present study found that the overall percentage of Child Protection Service (CPS) re-entry in Singapore is 10.5% based on 1,750 cases, with a range from 3.9% (within 1 year) to 16.5% (within 8 years after case closure). One quarter of the re-entry cases were observed to occur within 9 months from case closure. Seventeen risk factors, as identified from the extant literature, were tested for their utility to predict CPS re-entry in this study using a series of Cox regression analyses. A final list of seven risk factors (i.e., children's age at entry, case type, case closure result, duration of case, household income, family size, and mother's employment status) was used to create a cumulative risk score. The results supported the cumulative risk model in that higher risk score is related to higher risk of CPS re-entry. Understanding the prevalence of CPS re-entry and the risk factors associated with re-entry is the key to informing practice and policy in a culturally relevant way. The results from this study could then be used to facilitate critical case management decisions in order to enhance positive outcomes of families and children in Singapore's care system. Copyright © 2014 Elsevier Ltd. All rights reserved.
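    The sketch below shows one way such a cumulative risk score can be related to time to re-entry with a Cox model, here using the lifelines package on simulated data; the 0-7 score, the censoring horizon and the column names are hypothetical stand-ins for the study's administrative variables.

```python
# Minimal sketch: Cox regression of time to CPS re-entry on a cumulative risk score.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n = 500
risk_score = rng.integers(0, 8, n)                            # count of risk factors present (0-7)
time = rng.exponential(60.0, n) * np.exp(-0.25 * risk_score)  # higher score -> earlier re-entry (months)
reentry = (time < 96).astype(int)                             # re-entry observed within 8 years?
time = np.minimum(time, 96.0)                                 # censor at 8 years after case closure

df = pd.DataFrame({"months_to_reentry": time, "reentry": reentry, "risk_score": risk_score})
cph = CoxPHFitter().fit(df, duration_col="months_to_reentry", event_col="reentry")
cph.print_summary()    # hazard ratio per one-point increase in the cumulative risk score
```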

  17. Modelling the influence of climate on malaria occurrence in Chimoio Municipality, Mozambique.

    PubMed

    Ferrão, João Luís; Mendes, Jorge M; Painho, Marco

    2017-05-25

    Mozambique was recently ranked fifth in the African continent for the number of cases of malaria. In Chimoio municipality cases of malaria are increasing annually, contrary to the decreasing trend in Africa. As malaria transmission is influenced to a large extent by climatic conditions, modelling this relationship can provide useful insights for designing precision health measures for malaria control. There is a scarcity of information on the association between climatic variability and malaria transmission risk in Mozambique in general, and in Chimoio in particular. Therefore, the aim of this study is to model the association between climatic variables and malaria cases on a weekly basis, to help policy makers find adequate measures for malaria control and eradication. Time series analysis was conducted using data on weekly climatic variables and weekly malaria cases (counts) in Chimoio municipality, from 2006 to 2014. All data were analysed using SPSS-20, R 3.3.2 and BioEstat 5.0. Cross-correlation analysis and linear processes, namely ARIMA models and regression modelling, were used to develop the final model. Between 2006 and 2014, 490,561 cases of malaria were recorded in Chimoio. Both malaria and climatic data exhibit weekly and yearly systematic fluctuations. Cross-correlation analysis showed that mean temperature and precipitation present significantly lagged correlations with malaria cases. An ARIMA (2,1,0)(2,1,1)52 model and a regression model for the Box-Cox-transformed number of malaria cases, with lags 1, 2 and 3 of weekly malaria cases, lags 6 and 7 of weekly mean temperature, and lag 12 of precipitation, were fitted. Although both produced similar prediction interval widths, the latter was able to anticipate malaria outbreaks more accurately. The Chimoio climate seems ideal for malaria occurrence. Malaria occurrence peaks during January to March in Chimoio. As the lag effect between climatic events and malaria occurrence is important for the prediction of malaria cases, it can be used for designing public precision health measures. The model can be used for planning specific measures for Chimoio municipality. Prospective and multidisciplinary research involving researchers from different fields is welcomed to improve understanding of the effect of climatic and other factors on malaria cases.

  18. Assimilation of satellite altimetry data in hydrological models for improved inland surface water information: Case studies from the "Sentinel-3 Hydrologic Altimetry Processor prototypE" project (SHAPE)

    NASA Astrophysics Data System (ADS)

    Gustafsson, David; Pimentel, Rafael; Fabry, Pierre; Bercher, Nicolas; Roca, Mónica; Garcia-Mondejar, Albert; Fernandes, Joana; Lázaro, Clara; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

    This communication is about the Sentinel-3 Hydrologic Altimetry Processor prototypE (SHAPE) project, with a focus on the components dealing with assimilation of satellite altimetry data into hydrological models. The SHAPE research and development project started in September 2015, within the Scientific Exploitation of Operational Missions (SEOM) programme of the European Space Agency. The objectives of the project are to further develop and assess recent improvements in altimetry data, processing algorithms and methods for assimilation in hydrological models, with the overarching goal to support improved scientific use of altimetry data and improved inland water information. The objective is also to take scientific steps towards a future Inland Water dedicated processor on the Sentinel-3 ground segment. The study focuses on three main variables of interest in hydrology: river stage, river discharge and lake level. The improved altimetry data from the project are used to estimate river stage, river discharge and lake level information in a data assimilation framework using the hydrological dynamic and semi-distributed model HYPE (Hydrological Predictions for the Environment). This model has been developed by SMHI and includes a data assimilation module based on the Ensemble Kalman filter method. The method will be developed and assessed for a number of case studies with available in situ reference data and satellite altimetry data based mainly on the CryoSat-2 mission, on which the new processor will be run; results will be presented from case studies on the Amazon and Danube rivers and Lake Vänern (Sweden). The production of alti-hydro products (water level time series) is improved thanks to the use of water masks. This eases the geo-selection of the CryoSat-2 altimetric measurements, since they are acquired from a geodetic orbit and are thus spread along the river course in space and time. The specific processing of data from this geodetic orbit space-time pattern will be discussed, as well as the subsequent possible strategies for data assimilation into models (and eventually highlight a generalized approach toward multi-mission data processing). Notably, in the case of data assimilation along the course of rivers, the river slope might be estimated and compensated for, in order to produce local water level "pseudo time series" at arbitrary locations, and specifically at the model's inlets.
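
    The assimilation described above relies on HYPE's Ensemble Kalman filter module to merge modelled water levels with altimetry observations. As a heavily simplified, purely illustrative sketch of the analysis step (scalar state, perturbed observations), the following snippet shows how an ensemble of modelled lake levels could be updated toward a single altimetry measurement; the ensemble, observation and error values are invented, and HYPE itself is not reproduced.

```python
# Minimal sketch of an Ensemble Kalman filter analysis step for a single altimetry
# observation of water level. The ensemble here is synthetic; the actual HYPE
# assimilation module is more elaborate (multi-state, localisation, etc.).
import numpy as np

rng = np.random.default_rng(2)
n_ens = 50
prior = rng.normal(loc=3.2, scale=0.4, size=n_ens)    # modelled lake level ensemble [m]

obs = 3.55                                             # altimetry water level [m]
obs_err = 0.15                                         # assumed observation error std [m]

# Kalman gain from the ensemble variance and the observation error variance.
var_prior = prior.var(ddof=1)
gain = var_prior / (var_prior + obs_err**2)

# Perturbed-observation update: each member is nudged toward its own noisy copy of the observation.
perturbed_obs = obs + rng.normal(0, obs_err, n_ens)
posterior = prior + gain * (perturbed_obs - prior)

print(prior.mean(), posterior.mean(), posterior.std(ddof=1))
```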

  19. Inferential Precision in Single-Case Time-Series Data Streams: How Well Does the EM Procedure Perform When Missing Observations Occur in Autocorrelated Data?

    PubMed Central

    Smith, Justin D.; Borckardt, Jeffrey J.; Nash, Michael R.

    2013-01-01

    The case-based time-series design is a viable methodology for treatment outcome research. However, the literature has not fully addressed the problem of missing observations with such autocorrelated data streams. Mainly, to what extent do missing observations compromise inference when observations are not independent? Do the available missing data replacement procedures preserve inferential integrity? Does the extent of autocorrelation matter? We use Monte Carlo simulation modeling of a single-subject intervention study to address these questions. We find power sensitivity to be within acceptable limits across four proportions of missing observations (10%, 20%, 30%, and 40%) when missing data are replaced using the Expectation-Maximization Algorithm, more commonly known as the EM Procedure (Dempster, Laird, & Rubin, 1977). This applies to data streams with lag-1 autocorrelation estimates under 0.80. As autocorrelation estimates approach 0.80, the replacement procedure yields an unacceptable power profile. The implications of these findings and directions for future research are discussed. PMID:22697454
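
    The question studied above is how well EM-based replacement of missing points works in lag-1 autocorrelated single-case streams. As a rough illustration (not the article's implementation), the snippet below fits a Gaussian AR(1) state-space model to a stream with 20% missing observations and uses the Kalman smoother, which plays the role of the E-step in an EM scheme for this model, to fill the gaps.

```python
# Rough illustration (not the article's implementation): missing points in a lag-1
# autocorrelated single-case stream are filled using a Gaussian AR(1) state-space model.
# The Kalman smoother plays the role of the E-step of an EM scheme for this model.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
n, phi = 60, 0.6
y = np.zeros(n)
for t in range(1, n):                      # simulate an AR(1) stream (lag-1 autocorr ~0.6)
    y[t] = phi * y[t - 1] + rng.normal()

y_obs = y.copy()
missing = rng.choice(n, size=int(0.2 * n), replace=False)   # 20% missing observations
y_obs[missing] = np.nan

res = SARIMAX(y_obs, order=(1, 0, 0)).fit(disp=False)       # ML fit tolerates NaNs
imputed = res.smoothed_state[0]                              # Kalman-smoothed series values

print(np.corrcoef(y[missing], imputed[missing])[0, 1])       # agreement at the missing points
```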

  20. Experimental study of transport of a dimer on a vertically oscillating plate

    PubMed Central

    Wang, Jiao; Liu, Caishan; Ma, Daolin

    2014-01-01

    It has recently been shown that a dimer, composed of two identical spheres rigidly connected by a rod, under harmonic vertical vibration can exhibit a self-ordered transport behaviour. In this case, the mass centre of the dimer will perform a circular orbit in the horizontal plane, or a straight line if confined between parallel walls. In order to validate the numerical discoveries, we experimentally investigate the temporal evolution of the dimer's motion in both two- and three-dimensional situations. A stereoscopic vision method with a pair of high-speed cameras is adopted to perform omnidirectional measurements. All the cases studied in our experiments are also simulated using an existing numerical model. The combined investigations detail the dimer's dynamics and clearly show that its transport behaviours originate from a series of combinations of different contact states. This series is critical to our understanding of the transport properties in the dimer's motion and related self-ordered phenomena in granular systems. PMID:25383029

  1. Improvement of nonsuicidal self-injury following treatment with antipsychotics possessing strong D1 antagonistic activity: evidence from a report of three cases.

    PubMed

    Wollweber, Bastian; Keck, Martin E; Schmidt, Ulrike

    2015-08-01

    There is no drug treatment for nonsuicidal self-injury (NSSI), a highly prevalent and burdensome symptom of several psychiatric diseases like posttraumatic stress disorder (PTSD), personality disorders, and major depression (MD). Here, we present a retrospective series of three patients demonstrating a persistent remission in MD-associated NSSI in response to treatment with antipsychotics possessing marked D1 receptor antagonistic activity. To the best of the authors' knowledge, the case series presented is only the second clinical paper suggesting a role for D1 antagonists in NSSI drug therapy. Together with previously published data from rodent models, the findings suggest a role for D1 antagonists in NSSI drug therapy and hence for the D1 receptor in NSSI pathogenesis. This conclusion is limited by the facts that the patients presented here received polypharmacy and that the D1 receptor antagonistic antipsychotics suggested here as effective 'anti-auto-aggressants' do not address D1 receptors only but multiple neurotransmitter receptors/systems.

  2. Dynamics of electricity market correlations

    NASA Astrophysics Data System (ADS)

    Alvarez-Ramirez, J.; Escarela-Perez, R.; Espinosa-Perez, G.; Urrea, R.

    2009-06-01

    Electricity market participants rely on demand and price forecasts to decide their bidding strategies, allocate assets, negotiate bilateral contracts, hedge risks, and plan facility investments. However, forecasting is hampered by the non-linear and stochastic nature of price time series. Diverse modeling strategies, from neural networks to traditional transfer functions, have been explored. These approaches are based on the assumption that price series contain correlations that can be exploited for model-based prediction purposes. While many works have been devoted to demand and price modeling, only a limited number of reports on the nature and dynamics of electricity market correlations are available. This paper uses detrended fluctuation analysis to study correlations in the demand and price time series and takes the Australian market as a case study. The results show the existence of correlations in both demand and prices over three orders of magnitude in time, ranging from hours to months. However, the Hurst exponent is not constant over time, and its time evolution was computed over a subsample moving window of 250 observations. The computations, also made for two Canadian markets, show that the correlations present important fluctuations over a seasonal one-year cycle. Interestingly, non-linearities (measured in terms of a multifractality index) and reduced price predictability are found for the June-July periods, while the converse behavior is displayed during the December-January period. In terms of forecasting models, our results suggest that non-linear recursive models should be considered for accurate day-ahead price estimation. On the other hand, linear models seem to suffice for demand forecasting purposes.
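
    Detrended fluctuation analysis, used above to characterize correlations in the demand and price series, can be summarized in a few lines: integrate the demeaned series, detrend it in windows of varying size, and read the scaling exponent off the log-log slope of the fluctuation function. The sketch below is a minimal DFA-1 implementation on synthetic data; the Australian and Canadian market series are not reproduced, and the window sizes are arbitrary.

```python
# Minimal DFA-1 implementation: integrate the demeaned series, detrend it in windows of
# increasing size, and estimate the scaling exponent from the log-log slope of the
# fluctuation function. Data and window sizes below are arbitrary stand-ins.
import numpy as np

def dfa_exponent(x, scales):
    """Return the DFA-1 scaling exponent of series x over the given window sizes."""
    profile = np.cumsum(x - np.mean(x))                 # integrated (profile) series
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        f2 = []
        for i in range(n_seg):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)                # local linear detrending
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(4)
returns = rng.normal(size=24 * 365)                      # white-noise stand-in for price changes
scales = np.array([16, 32, 64, 128, 256, 512])
print(dfa_exponent(returns, scales))                     # ~0.5 if uncorrelated; >0.5 = persistence
```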

  3. The Propeller and Cooling-Air-Flow Characteristics of a Twin-Engine Airplane Model Equipped with NACA D_s-Type Cowlings and with Propellers of NACA 16-Series Airfoil Sections

    DTIC Science & Technology

    1944-09-01

    with the cowling flaps neutral, did not in any case exceed η = ±0.03. Drag and Cowling-Air Flow with Propeller Removed: The effects, on the lift...cowling flaps. Effect of internal flow on drag: For convenience in studying the drag characteristics of the two cowling arrangements, values of the...operation and take-off. Influence of Cooling Requirements on Airplane Performance: In the case of many conventional radial air-cooled engine

  4. Ground vibration tests of a high fidelity truss for verification of on orbit damage location techniques

    NASA Technical Reports Server (NTRS)

    Kashangaki, Thomas A. L.

    1992-01-01

    This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on orbit modal testing are also described.

  5. A new algorithm for automatic Outlier Detection in GPS Time Series

    NASA Astrophysics Data System (ADS)

    Cannavo', Flavio; Mattia, Mario; Rossi, Massimo; Palano, Mimmo; Bruno, Valentina

    2010-05-01

    Nowadays continuous GPS time series are considered a crucial product of GPS permanent networks, useful in many geo-science fields, such as active tectonics, seismology, crustal deformation and volcano monitoring (Altamimi et al. 2002, Elósegui et al. 2006, Aloisi et al. 2009). Although the GPS data elaboration software has increased in reliability, the time series are still affected by different kinds of noise, from intrinsic noise (e.g. tropospheric delay) to un-modeled noise (e.g. cycle slips, satellite faults, parameter changes). Typically, GPS time series present characteristic noise that is a linear combination of white noise and correlated colored noise, and this characteristic is fractal in the sense that it is evident at every considered time scale or sampling rate. The un-modeled noise sources result in spikes, outliers and steps. These kinds of errors can appreciably influence the estimation of velocities of the monitored sites. Outlier detection in generic time series is a widely treated problem in the literature (Wei, 2005), while it is not fully developed for the specific kind of GPS series. We propose a robust automatic procedure for cleaning GPS time series of outliers and, especially for long daily series, of steps due to strong seismic or volcanic events or merely instrumentation changes such as antenna and receiver upgrades. The procedure is basically divided into two steps: a first step for colored noise reduction and a second step for outlier detection through adaptive series segmentation. Both algorithms present novel ideas and are nearly unsupervised. In particular, we propose an algorithm to estimate an autoregressive model for colored noise in GPS time series in order to subtract the effect of non-Gaussian noise from the series. This step is useful for the subsequent step (i.e. adaptive segmentation), which requires the hypothesis of Gaussian noise. The proposed algorithms are tested in a benchmark case study and the results confirm that they are effective and reasonable. Bibliography - Aloisi M., A. Bonaccorso, F. Cannavò, S. Gambino, M. Mattia, G. Puglisi, E. Boschi, A new dyke intrusion style for the Mount Etna May 2008 eruption modelled through continuous tilt and GPS data, Terra Nova, Volume 21 Issue 4 , Pages 316 - 321, doi: 10.1111/j.1365-3121.2009.00889.x (August 2009) - Altamimi Z., Sillard P., Boucher C., ITRF2000: A new release of the International Terrestrial Reference frame for earth science applications, J Geophys Res-Solid Earth, 107 (B10): art. no.-2214, (Oct 2002) - Elósegui, P., J. L. Davis, D. Oberlander, R. Baena, and G. Ekström , Accuracy of high-rate GPS for seismology, Geophys. Res. Lett., 33, L11308, doi:10.1029/2006GL026065 (2006) - Wei W. S., Time Series Analysis: Univariate and Multivariate Methods, Addison Wesley (2 edition), ISBN-10: 0321322169 (July, 2005)
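
    The cleaning procedure described above has two steps: reduce the colored noise with an autoregressive model, then detect outliers on the (approximately Gaussian) residuals by adaptive segmentation. The sketch below illustrates a much simplified version of that pipeline on a synthetic daily coordinate series, replacing the adaptive segmentation with a plain robust median/MAD threshold; it is not the authors' algorithm.

```python
# Greatly simplified version of the two-step idea: (1) fit an autoregressive model to
# reduce the colored noise, (2) flag outliers on the residuals. The paper's adaptive
# segmentation step is replaced here by a plain robust median/MAD threshold, and the
# daily coordinate series is synthetic.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(5)
n = 1000
colored = np.zeros(n)
for t in range(1, n):                                   # AR(1) colored noise
    colored[t] = 0.9 * colored[t - 1] + rng.normal(0, 1)
series = colored + rng.normal(0, 0.5, n)                # add white noise
true_outliers = rng.choice(n, 10, replace=False)
series[true_outliers] += rng.choice([-1.0, 1.0], 10) * 8   # inject spikes

res = AutoReg(series, lags=3).fit()                     # step 1: AR model of the colored noise
resid = res.resid                                       # approximately whitened residuals

mad = np.median(np.abs(resid - np.median(resid)))       # step 2: robust outlier threshold
flagged = np.where(np.abs(resid - np.median(resid)) > 5 * 1.4826 * mad)[0] + 3  # +3: residuals start after the AR lags

print(sorted(flagged), sorted(true_outliers))
```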

  6. Complex time series analysis of PM10 and PM2.5 for a coastal site using artificial neural network modelling and k-means clustering

    NASA Astrophysics Data System (ADS)

    Elangasinghe, M. A.; Singhal, N.; Dirks, K. N.; Salmond, J. A.; Samarasinghe, S.

    2014-09-01

    This paper uses artificial neural networks (ANN), combined with k-means clustering, to understand the complex time series of PM10 and PM2.5 concentrations at a coastal location of New Zealand based on data from a single site. Out of the available meteorological parameters from the network (wind speed, wind direction, solar radiation, temperature, relative humidity), key factors governing the pattern of the time series concentrations were identified through input sensitivity analysis performed on the trained neural network model. The transport pathways of particulate matter under these key meteorological parameters were further analysed through bivariate concentration polar plots and k-means clustering techniques. The analysis shows that external sources such as marine aerosols and local sources such as traffic and biomass burning contribute equally to the particulate matter concentrations at the study site. These results are in agreement with the results of receptor modelling by the Auckland Council based on Positive Matrix Factorization (PMF). Our findings also show that contrasting concentration-wind speed relationships exist between marine aerosols and local traffic sources, resulting in very noisy, seemingly random PM10 concentrations. The inclusion of cluster rankings as an input parameter to the ANN model showed a statistically significant (p < 0.005) improvement in the performance of the ANN time series model and also improved its ability to pick up high concentrations. For the presented case study, the correlation coefficient between observed and predicted concentrations improved from 0.77 to 0.79 for PM2.5 and from 0.63 to 0.69 for PM10, while the root mean squared error (RMSE) decreased from 5.00 to 4.74 for PM2.5 and from 6.77 to 6.34 for PM10. The techniques presented here enable the user to obtain an understanding of potential sources and their transport characteristics prior to the implementation of costly chemical analysis techniques or advanced air dispersion models.
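
    The core modelling idea above is to cluster meteorological conditions with k-means and feed the cluster membership to the neural network as an additional input. The snippet below sketches that combination with scikit-learn on synthetic data; the variables, network size and scores are illustrative stand-ins, not the Auckland data or the tuned ANN of the study.

```python
# Minimal sketch of adding k-means cluster membership of meteorological conditions as an
# extra input to a neural-network PM model. Data are synthetic and the network is far
# smaller than a tuned ANN; column names are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(6)
n = 2000
wind_speed = rng.gamma(2, 2, n)
wind_dir = rng.uniform(0, 360, n)
temperature = rng.normal(15, 5, n)
met = np.column_stack([wind_speed, np.sin(np.radians(wind_dir)),
                       np.cos(np.radians(wind_dir)), temperature])
pm10 = 20 + 50 / (1 + wind_speed) + rng.normal(0, 3, n)       # toy concentration signal

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(met)
X = np.column_stack([met, clusters])                           # cluster label as extra feature

X_tr, X_te, y_tr, y_te = train_test_split(X, pm10, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
print(r2_score(y_te, model.predict(X_te)))
```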

  7. Parametric vs. non-parametric daily weather generator: validation and comparison

    NASA Astrophysics Data System (ADS)

    Dubrovsky, Martin

    2016-04-01

    As the climate models (GCMs and RCMs) fail to satisfactorily reproduce the real-world surface weather regime, various statistical methods are applied to downscale GCM/RCM outputs into site-specific weather series. Stochastic weather generators are among the most favoured downscaling methods, capable of producing realistic (observed-like) meteorological inputs for agrological, hydrological and other impact models used in assessing the sensitivity of various ecosystems to climate change/variability. To name their advantages, the generators may (i) produce arbitrarily long multi-variate synthetic weather series representing both present and changed climates (in the latter case, the generators are commonly modified by GCM/RCM-based climate change scenarios), (ii) be run at various time steps and for multiple weather variables (the generators reproduce the correlations among variables), and (iii) be interpolated (and run also for sites where no weather data are available to calibrate the generator). This contribution will compare two stochastic daily weather generators in terms of their ability to reproduce various features of daily weather series. M&Rfi is a parametric generator: a Markov chain model is used to model precipitation occurrence, precipitation amount is modelled by the Gamma distribution, and a 1st-order autoregressive model is used to generate non-precipitation surface weather variables. The non-parametric GoMeZ generator is based on the nearest-neighbours resampling technique, making no assumption about the distribution of the variables being generated. Various settings of both weather generators will be assumed in the present validation tests. The generators will be validated in terms of (a) extreme temperature and precipitation characteristics (annual and 30-year extremes and maxima of the duration of hot/cold/dry/wet spells); (b) selected validation statistics developed within the frame of the VALUE project. The tests will be based on observational weather series from several European stations available from the ECA&D database.
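
    The parametric M&Rfi generator described above builds daily precipitation from a Markov chain for wet/dry occurrence and a Gamma distribution for wet-day amounts (with an AR(1) process for the remaining variables). The snippet below sketches just the occurrence-and-amount part of such a generator with invented parameter values; it is not the M&Rfi code.

```python
# Minimal sketch of a parametric daily precipitation generator of the type described:
# a first-order two-state Markov chain for wet/dry occurrence and a Gamma distribution
# for wet-day amounts. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(7)

p_wet_given_dry = 0.25                 # occurrence Markov chain transition probabilities
p_wet_given_wet = 0.60
gamma_shape, gamma_scale = 0.8, 8.0    # wet-day precipitation amount distribution [mm]

n_days = 365
wet = np.zeros(n_days, dtype=bool)
for t in range(1, n_days):
    p = p_wet_given_wet if wet[t - 1] else p_wet_given_dry
    wet[t] = rng.random() < p

precip = np.where(wet, rng.gamma(gamma_shape, gamma_scale, n_days), 0.0)
print(precip[:10], precip.mean(), (precip > 0).mean())
```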

  8. Generalized analytical model for benthic water flux forced by surface gravity waves

    USGS Publications Warehouse

    King, J.N.; Mehta, A.J.; Dean, R.G.

    2009-01-01

    A generalized analytical model for benthic water flux forced by linear surface gravity waves over a series of layered hydrogeologic units is developed by adapting a previous solution for a hydrogeologic unit with an infinite thickness (Case I) to a unit with a finite thickness (Case II) and to a dual-unit system (Case III). The model compares favorably with laboratory observations. The amplitude of wave-forced benthic water flux is shown to be directly proportional to the amplitude of the wave, the permeability of the hydrogeologic unit, and the wave number and inversely proportional to the kinematic viscosity of water. A dimensionless amplitude parameter is introduced and shown to reach a maximum where the product of water depth and the wave number is 1.2. Submarine groundwater discharge (SGD) is a benthic water discharge flux to a marine water body. The Case I model estimates an 11.5-cm/d SGD forced by a wave with a 1 s period and 5-cm amplitude in water that is 0.5-m deep. As this wave propagates into a region with a 0.3-m-thick hydrogeologic unit, with a no-flow bottom boundary, the Case II model estimates a 9.7-cm/d wave-forced SGD. As this wave propagates into a region with a 0.2-m-thick hydrogeologic unit over an infinitely thick, more permeable unit, the Case III quasi-confined model estimates a 15.7-cm/d wave-forced SGD. The quasi-confined model has benthic constituent flux implications in coral reef, karst, and clastic regions. Waves may undermine tracer and seepage meter estimates of SGD at some locations. Copyright 2009 by the American Geophysical Union.

  9. Use of the levonorgestrel 52-mg intrauterine system in adolescent and young adult solid organ transplant recipients: a case series.

    PubMed

    Huguelet, P S; Sheehan, C; Spitzer, R F; Scott, S

    2017-04-01

    This case series reports on the safety and efficacy of the levonorgestrel 52-mg intrauterine system in adolescent and young adult solid organ transplant recipients. All patients used the device for contraception, with no documented cases of disseminated pelvic infection or unplanned pregnancy. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. 27 CFR 19.490 - Numbering of packages and cases filled in processing.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... cases filled in processing. 19.490 Section 19.490 Alcohol, Tobacco Products and Firearms ALCOHOL AND... Marks Marking Requirements for Spirits § 19.490 Numbering of packages and cases filled in processing. (a... any series reaches “1,000,000”, the proprietor may begin a new series with “1” but must add an...

  11. Etiologies of Autism in a Case-Series from Tanzania

    ERIC Educational Resources Information Center

    Mankoski, Raymond E.; Collins, Martha; Ndosi, Noah K.; Mgalla, Ella H.; Sarwatt, Veronica V.; Folstein, Susan E.

    2006-01-01

    Most autism has a genetic cause although post-encephalitis cases are reported. In a case-series (N = 20) from Tanzania, 14 met research criteria for autism. Three (M:F = 1:2) had normal development to age 22, 35, and 42 months, with onset of autism upon recovery from severe malaria, attended by prolonged high fever, convulsions, and in one case…

  12. 76 FR 33658 - Airworthiness Directives; Bombardier, Inc. Model CL-600-2B19 (Regional Jet Series 100 & 440...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-09

    ... Airworthiness Directives; Bombardier, Inc. Model CL-600-2B19 (Regional Jet Series 100 & 440); Model CL-600-2C10 (Regional Jet Series 700, 701, & 702); Model CL-600-2D15 (Regional Jet Series 705); and Model CL-600-2D24 (Regional Jet Series 900) Airplanes AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Notice of...

  13. Assessment of climate change impacts on meteorological and hydrological droughts in the Jucar River Basin

    NASA Astrophysics Data System (ADS)

    Marcos-Garcia, Patricia; Pulido-Velazquez, Manuel; Lopez-Nicolas, Antonio

    2016-04-01

    Extreme natural phenomena, and more specifically droughts, constitute a serious environmental, economic and social issue in Southern Mediterranean countries, and are common in the Mediterranean Spanish basins due to the high temporal and spatial rainfall variability. Drought events are characterized by their complexity, being often difficult to identify and quantify both in time and space, and a universally accepted definition does not even exist. This fact, along with future uncertainty about the duration and intensity of the phenomena on account of climate change, makes it necessary to increase knowledge of the impacts of climate change on droughts in order to design management plans and mitigation strategies. The present abstract aims to evaluate the impact of climate change on both meteorological and hydrological droughts, through the use of a generalization of the Standardized Precipitation Index (SPI). We use the Standardized Flow Index (SFI) to assess hydrological drought, using flow time series instead of rainfall time series. In the case of the meteorological droughts, the Standardized Precipitation and Evapotranspiration Index (SPEI) has been applied to assess the variability of temperature impacts. In order to characterize climate change impacts on droughts, we have used projections from the CORDEX project (Coordinated Regional Climate Downscaling Experiment). Future rainfall and temperature time series for the short (2011-2040) and medium term (2041-2070) were obtained, applying a quantile mapping method to correct the bias of these time series. Regarding the hydrological drought, the Témez hydrological model has been applied to simulate the impacts of future temperature and rainfall time series on runoff and river discharges; it is a conceptual, lumped, few-parameter hydrological model. Nevertheless, it is necessary to point out the time lag between the meteorological and the hydrological droughts. The case study is the Jucar river basin (Spain), a highly regulated system with a share of 80% of water use for irrigated agriculture. The results show that climate change would increase the historical drought impacts in the river basin. Acknowledgments The study has been supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) and European FEDER funds.
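
    The drought indices used above (the SPI, and the SFI computed analogously from flows) follow a standard recipe: accumulate the series over a chosen window, fit a Gamma distribution to the accumulated values, and transform the resulting probabilities to a standard normal variate. The sketch below shows that recipe on synthetic monthly precipitation; it omits the handling of zero-precipitation months and the month-by-month fitting that operational SPI implementations use, so it is illustrative only.

```python
# Minimal sketch of a Standardized Precipitation Index (SPI) computation as commonly
# defined: aggregate precipitation over a window, fit a Gamma distribution to the
# aggregated values, and map their cumulative probabilities to a standard normal.
# The SFI is obtained analogously from flow series. Data here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
monthly_precip = rng.gamma(2.0, 30.0, 40 * 12)           # synthetic monthly totals [mm]

window = 3                                                # SPI-3: 3-month accumulation
acc = np.convolve(monthly_precip, np.ones(window), mode="valid")

shape, loc, scale = stats.gamma.fit(acc, floc=0)          # fit Gamma to accumulated totals
cdf = stats.gamma.cdf(acc, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)                                 # equiprobability transform to N(0,1)

print(spi[:12].round(2))                                  # values below about -1 flag drought
```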

  14. EFFECTS OF FOREFOOT RUNNING ON CHRONIC EXERTIONAL COMPARTMENT SYNDROME: A CASE SERIES

    PubMed Central

    Gregory, Robert; Alitz, Curtis; Gerber, J. Parry

    2011-01-01

    Introduction: Chronic exertional compartment syndrome (CECS) is a condition that occurs almost exclusively with running whereby exercise increases intramuscular pressure compromising circulation, prohibiting muscular function, and causing pain in the lower leg. Currently, a lack of evidence exists for the effective conservative management of CECS. Altering running mechanics by adopting forefoot running as opposed to heel striking may assist in the treatment of CECS, specifically with anterior compartment symptoms. Case Description: The purpose of this case series is to describe the outcomes for subjects with CECS through a systematic conservative treatment model focused on forefoot running. Subject one was a 21 y/o female with a 4 year history of CECS and subject two was a 21 y/o male, 7 months status-post two-compartment right leg fasciotomy with a return of symptoms and a new onset of symptoms on the contralateral side. Outcome: Both subjects modified their running technique over a period of six weeks. Kinematic and kinetic analysis revealed increased step rate while step length, impulse, and peak vertical ground reaction forces decreased. In addition, leg intracompartmental pressures decreased from pre-training to post-training. Within 6 weeks of intervention subjects increased their running distance and speed absent of symptoms of CECS. Follow-up questionnaires were completed by the subjects at 7 months following intervention; subject one reported running distances up to 12.87 km pain-free and subject two reported running 6.44 km pain-free consistently 3 times a week. Discussion: This case series describes a potentially beneficial conservative management approach to CECS in the form of forefoot running instruction. Further research in this area is warranted to further explore the benefits of adopting a forefoot running technique for CECS as well as other musculoskeletal overuse complaints. PMID:22163093

  15. COLLABORATE©: a universal competency-based paradigm for professional case management, Part III: key considerations for making the paradigm shift.

    PubMed

    Treiger, Teresa M; Fink-Samnick, Ellen

    2014-01-01

    The purpose of the third of this 3-article series is to provide context and justification for a new paradigm of case management built upon a value-driven foundation that * improves the patient's experience of health care delivery, * provides consistency in approach applicable across health care populations, and * optimizes the potential for return on investment. Applicable to all health care sectors where case management is practiced. In moving forward, the one fact that rings true is that there will be constant change in our industry. As the health care terrain shifts and new influences continually surface, there will be consequences for case management practice. These impacts require nimble clinical professionals in possession of recognized and firmly established competencies. They must be agile enough to frame (and reframe) their professional practice to facilitate the best possible outcomes for their patients. Case managers can choose to be Gumby or Pokey. This is exactly why the time has come for a competency-based case management model, one sufficiently fluid to fit into any setting of care. The practice of case management transcends the vast array of representative professional disciplines and educational levels. A majority of current models are driven by business priorities rather than the competencies critical to successful practice and quality patient outcomes. This results in a fragmented professional case management identity. While there is inherent value in what each discipline brings to the table, this advanced model unifies behind case management's unique, strengths-based identity instead of continuing to align within traditional divisions (e.g., discipline, work setting, population served). This model fosters case management's expanding career advancement opportunities, including a reflective clinical ladder.

  16. Fragrance contact allergy: a 4-year retrospective study.

    PubMed

    Cuesta, Laura; Silvestre, Juan Francisco; Toledo, Fernando; Lucas, Ana; Pérez-Crespo, María; Ballester, Irene

    2010-08-01

    Fragrance chemicals are the second most frequent cause of contact allergy. The mandatory labelling of 26 fragrance chemicals when present in cosmetics has facilitated the management of patients allergic to fragrances. The study aimed to define the characteristics of the population allergic to perfumes detected in our hospital district, to determine the usefulness of markers of fragrance allergy in the baseline GEIDAC series, and to describe the contribution made by the fragrance series to the data obtained with the baseline series. We performed a 4-year retrospective study of patients tested with the Spanish baseline series and/or fragrance series. There are four fragrance markers in the baseline series: fragrance mix I (FM I), Myroxylon pereirae, fragrance mix II (FM II), and hydroxyisohexyl 3-cyclohexene carboxaldehyde. A total of 1253 patients were patch tested, 117 (9.3%) of whom were positive to a fragrance marker. FM I and M. pereirae detected 92.5% of the cases of fragrance contact allergy. FM II and hydroxyisohexyl 3-cyclohexene carboxaldehyde detected 6 additional cases and provided further information in 8, enabling improved management. A fragrance series was tested in a selected group of 86 patients and positive results were obtained in 45.3%. Geraniol was the allergen most frequently found in the group of patients tested with the fragrance series. Classic markers detect the majority of cases of fragrance contact allergy. We recommend incorporating FM II in the Spanish baseline series, as in the European baseline series, and using a specific fragrance series to study patients allergic to a fragrance marker.

  17. Time Prediction Models for Echinococcosis Based on Gray System Theory and Epidemic Dynamics.

    PubMed

    Zhang, Liping; Wang, Li; Zheng, Yanling; Wang, Kai; Zhang, Xueliang; Zheng, Yujian

    2017-03-04

    Echinococcosis, which can seriously harm human health and animal husbandry production, has become endemic in the Xinjiang Uygur Autonomous Region of China. In order to explore an effective human Echinococcosis forecasting model for Xinjiang, three grey models, namely the traditional grey GM(1,1) model, the Grey-Periodic Extensional Combinatorial Model (PECGM(1,1)), and the Modified Grey Model using Fourier Series (FGM(1,1)), in addition to a multiplicative seasonal ARIMA(1,0,1)(1,1,0)₄ model, are applied in this study for short-term predictions. The accuracy of the different grey models is also investigated. The simulation results show that the FGM(1,1) model performs better, not only in model fitting but also in forecasting. Furthermore, considering stability and modeling precision in the long run, a dynamic epidemic prediction model based on the transmission mechanism of Echinococcosis is also established for long-term predictions. Results demonstrate that the dynamic epidemic prediction model is capable of identifying the future tendency: the number of human Echinococcosis cases will increase steadily over the next 25 years, reaching a peak of about 1250 cases, before a slow decline sets in until the epidemic finally ends.
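
    All three grey models above share the basic GM(1,1) construction: an accumulated generating operation, a least-squares estimate of the development coefficient and grey input, and a forecast from the whitening equation. The snippet below sketches a plain GM(1,1) on invented annual counts; the PECGM(1,1) and FGM(1,1) variants used in the study add periodic and Fourier corrections that are not reproduced here.

```python
# Minimal sketch of a plain GM(1,1) grey forecasting model (accumulated generating
# operation + least-squares estimation of the development coefficient). The annual
# case counts below are synthetic, not the Xinjiang surveillance data.
import numpy as np

def gm11_forecast(x0, n_ahead):
    """Fit GM(1,1) to the positive series x0 and forecast n_ahead further values."""
    x1 = np.cumsum(x0)                                   # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                        # background (mean) sequence
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]     # development coeff. and grey input
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a    # whitening-equation solution
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0])          # restore by first differences
    x0_hat[0] = x0[0]
    return x0_hat

cases = np.array([310.0, 335.0, 362.0, 398.0, 431.0, 470.0])   # illustrative annual counts
print(gm11_forecast(cases, n_ahead=3).round(1))
```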

  18. Using web search query data to monitor dengue epidemics: a new model for neglected tropical disease surveillance.

    PubMed

    Chan, Emily H; Sahai, Vikram; Conrad, Corrie; Brownstein, John S

    2011-05-01

    A variety of obstacles including bureaucracy and lack of resources have interfered with timely detection and reporting of dengue cases in many endemic countries. Surveillance efforts have turned to modern data sources, such as Internet search queries, which have been shown to be effective for monitoring influenza-like illnesses. However, few have evaluated the utility of web search query data for other diseases, especially those of high morbidity and mortality or where a vaccine may not exist. In this study, we aimed to assess whether web search queries are a viable data source for the early detection and monitoring of dengue epidemics. Bolivia, Brazil, India, Indonesia and Singapore were chosen for analysis based on available data and adequate search volume. For each country, a univariate linear model was then built by fitting a time series of the fraction of Google search query volume for specific dengue-related queries from that country against a time series of official dengue case counts for a time-frame within 2003-2010. The specific combination of queries used was chosen to maximize model fit. Spurious spikes in the data were also removed prior to model fitting. The final models, fit using a training subset of the data, were cross-validated against both the overall dataset and a holdout subset of the data. All models were found to fit the data quite well, with validation correlations ranging from 0.82 to 0.99. Web search query data were found to be capable of tracking dengue activity in Bolivia, Brazil, India, Indonesia and Singapore. Whereas traditional dengue data from official sources are often not available until after some substantial delay, web search query data are available in near real-time. These data represent a valuable complement to traditional dengue surveillance.
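
    The core model above is deliberately simple: a univariate linear fit of official case counts against the search-query fraction, trained on part of the record and validated on the rest. The sketch below reproduces that workflow on synthetic weekly series; the query fractions, counts and resulting correlation are invented stand-ins, not the study's data.

```python
# Minimal sketch of the modelling idea: a univariate linear model relating official case
# counts to the fraction of search volume for dengue-related queries, validated on a
# holdout period. All series here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
weeks = 8 * 52
season = 1 + 0.8 * np.sin(2 * np.pi * np.arange(weeks) / 52)
query_fraction = 1e-4 * season * rng.lognormal(0, 0.15, weeks)     # search volume fraction
official_cases = 5e6 * query_fraction + rng.normal(0, 40, weeks)   # correlated case counts

split = int(0.75 * weeks)                                           # training / holdout split
fit = stats.linregress(query_fraction[:split], official_cases[:split])
predicted = fit.intercept + fit.slope * query_fraction[split:]

r = np.corrcoef(predicted, official_cases[split:])[0, 1]
print(round(r, 3))                                                  # validation correlation
```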

  19. Discovering time-lagged rules from microarray data using gene profile classifiers

    PubMed Central

    2011-01-01

    Background Gene regulatory networks have an essential role in every process of life. In this regard, the amount of genome-wide time series data is becoming increasingly available, providing the opportunity to discover the time-delayed gene regulatory networks that govern the majority of these molecular processes. Results This paper aims at reconstructing gene regulatory networks from multiple genome-wide microarray time series datasets. In this sense, a new model-free algorithm called GRNCOP2 (Gene Regulatory Network inference by Combinatorial OPtimization 2), which is a significant evolution of the GRNCOP algorithm, was developed using combinatorial optimization of gene profile classifiers. The method is capable of inferring potential time-delay relationships with any span of time between genes from various time series datasets given as input. The proposed algorithm was applied to time series data composed of twenty yeast genes that are highly relevant for the cell-cycle study, and the results were compared against several related approaches. The outcomes have shown that GRNCOP2 outperforms the contrasted methods in terms of the proposed metrics, and that the results are consistent with previous biological knowledge. Additionally, a genome-wide study on multiple publicly available time series data was performed. In this case, the experimentation has exhibited the soundness and scalability of the new method which inferred highly-related statistically-significant gene associations. Conclusions A novel method for inferring time-delayed gene regulatory networks from genome-wide time series datasets is proposed in this paper. The method was carefully validated with several publicly available data sets. The results have demonstrated that the algorithm constitutes a usable model-free approach capable of predicting meaningful relationships between genes, revealing the time-trends of gene regulation. PMID:21524308

  20. Hypoplasia or Absence of Posterior Leaflet: A Rare Congenital Anomaly of The Mitral Valve in Adulthood - Case Series.

    PubMed

    Parato, Vito Maurizio; Masia, Stefano Lucio

    2018-01-01

    We present a case series of two adult patients with almost complete absence of the posterior mitral valve leaflet who are asymptomatic or mildly symptomatic, with two different degrees of mitral regurgitation.

  1. Use of cantilever mechanics for impacted teeth: case series.

    PubMed

    Paduano, Sergio; Spagnuolo, Gianrico; Franzese, Gerardo; Pellegrino, Gioacchino; Valletta, Rosa; Cioffi, Iacopo

    2013-01-01

    This paper describes the orthodontic treatment and the biomechanics of cantilevers used for impacted permanent teeth in young patients, adolescents, and adults. In this case series, multibracket straightwire fixed appliances, together with cantilever mechanics, were used to treat the impaired occlusion.

  2. Evidence of Nanoflare Heating in Coronal Loops Observed with Hinode-XRT and SDO-AIA

    NASA Technical Reports Server (NTRS)

    Lopez-Fuentes, M. C.; Klimchuk, James

    2013-01-01

    We study a series of coronal loop lightcurves from X-ray and EUV observations. In search for signatures of nanoflare heating, we analyze the statistical properties of the observed lightcurves and compare them with synthetic cases obtained with a 2D cellular-automaton model based on nanoflare heating driven by photospheric motions. Our analysis shows that the observed and the model lightcurves have similar statistical properties. The asymmetries observed in the distribution of the intensity fluctuations indicate the possible presence of widespread cooling processes in sub-resolution magnetic strands.

  3. Utilization of VAS satellite data in the initialization of an oceanic cyclogenesis simulation

    NASA Technical Reports Server (NTRS)

    Douglas, Sharon G.; Warner, Thomas T.

    1987-01-01

    A series of experiments was performed to test various methods of incorporating Visible Infrared Spin Scan Radiometer Atmospheric Sounder (VAS)-sounding data into the initial conditions of the Penn State University/National Center for Atmospheric Research mesoscale model. The VAS data for this ocean-cyclogenesis case consist of 110 irregularly distributed temperature and humidity soundings located over the North Pacific Ocean and apply at approximately 1200 GMT November 10, 1981. Various methods of utilizing VAS data in the initial conditions of a mesoscale model were evaluated.

  4. Utilization of VAS satellite data in the initialization of an oceanic-cyclogenesis simulation

    NASA Technical Reports Server (NTRS)

    Douglas, Sharon G.; Warner, Thomas T.

    1986-01-01

    A series of experiments was performed to test various methods of incorporating Visible Infrared Spin Scan Radiometer Atmospheric Sounder (VAS)-sounding data into the initial conditions of the Penn State University/National Center for Atmospheric Research mesoscale model. The VAS data for this ocean-cyclogenesis case consist of 110 irregularly distributed temperature and humidity soundings located over the North Pacific Ocean and apply at approximately 1200 GMT 10 November 1981. Various methods of utilizing VAS data in the initial conditions of a mesoscale model were evaluated.

  5. Defined contribution defined: health insurance for the next century.

    PubMed

    Marhula, D C; Shannon, E G

    2001-01-01

    The consumerism movement will dramatically affect the current payer model and present a new series of challenges for managed care organizations. Employers will fuel the changes, as they create health benefit programs that are modeled after retirement programs. In these cases, employers will shift a major portion of financial responsibility to employees, who will be asked to make buying decisions often previously determined by managed care organizations. New business entities known as health navigators will be formed to aid consumers. However, many structural and policy obstacles may slow or transform the consumerism movement.

  6. A path model for Whittaker vectors

    NASA Astrophysics Data System (ADS)

    Di Francesco, Philippe; Kedem, Rinat; Turmunkh, Bolor

    2017-06-01

    In this paper we construct weighted path models to compute Whittaker vectors in the completion of Verma modules, as well as Whittaker functions of fundamental type, for all finite-dimensional simple Lie algebras, affine Lie algebras, and the quantum algebra U_q(sl_{r+1}). This leads to series expressions for the Whittaker functions. We show how this construction leads directly to the quantum Toda equations satisfied by these functions, and to the q-difference equations in the quantum case. We investigate the critical limit of affine Whittaker functions computed in this way.

  7. Reconstructing land use history from Landsat time-series. Case study of a swidden agriculture system in Brazil

    NASA Astrophysics Data System (ADS)

    Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert

    2016-05-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models, which may contain trend and phenology terms, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). The number and frequency of cultivation cycles are of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmer interviews, resulting in a Normalized Root Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns. We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses. We conclude that the method holds great promise for land use history mapping in the tropics and beyond.
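
    The segmentation step above relies on the BFAST framework (an R implementation) to partition each field's NDMI time-series and on a classifier to label the breakpoints. As a greatly simplified, purely illustrative stand-in for that idea, the snippet below scans a synthetic NDMI-like series for abrupt drops by comparing the means of adjacent one-year windows and counts them as cultivation cycles; it is not BFAST and uses invented data.

```python
# Greatly simplified stand-in for the segmentation step described above (the study uses
# the BFAST framework in R). A synthetic NDMI-like series with abrupt drops at cultivation
# events is scanned for breakpoints by comparing the means of adjacent one-year windows;
# detected drops are counted as cultivation cycles.
import numpy as np

rng = np.random.default_rng(10)
n_years, per_year = 30, 23                                # ~16-day Landsat revisit
n = n_years * per_year
ndmi = 0.6 + 0.05 * np.sin(2 * np.pi * np.arange(n) / per_year) + rng.normal(0, 0.03, n)
clearings = [5 * per_year, 14 * per_year, 22 * per_year]  # three cultivation events
for c in clearings:
    ndmi[c:c + per_year] -= 0.35                          # temporary canopy-moisture drop

def count_drops(series, window, threshold):
    """Count abrupt negative shifts between adjacent windows of the series."""
    drops, last = 0, -window
    for t in range(window, len(series) - window):
        delta = series[t:t + window].mean() - series[t - window:t].mean()
        if delta < -threshold and t - last > window:      # enforce a minimum spacing
            drops, last = drops + 1, t
    return drops

print(count_drops(ndmi, window=per_year, threshold=0.2))   # expected: 3 cultivation cycles
```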

  8. Reconstructing Land Use History from Landsat Time-Series. Case study of Swidden Agriculture Intensification in Brazil

    NASA Astrophysics Data System (ADS)

    Dutrieux, L.; Jakovac, C. C.; Siti, L. H.; Kooistra, L.

    2015-12-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The BFAST framework is used for defining the time-series regression models, which may contain trend and phenology terms, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in regimes. To further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil. The number and frequency of cultivation cycles are of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation. We validated the number of cultivation cycles predicted against in-situ information collected from farmer interviews, resulting in a Normalized RMSE of 0.25. Overall the method performed well, producing maps with coherent patterns. We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses. We conclude that the method holds great promise for land use history mapping in the tropics and beyond. Spatial and temporal patterns were further analysed from an ecological perspective in a follow-up study. Results show that changes in land use patterns, such as land use intensification and reduced agricultural expansion, reflect the socio-economic transformations that occurred in the region.

  9. Missing metastases as a model to challenge current therapeutic algorithms in colorectal liver metastases.

    PubMed

    Lucidi, Valerio; Hendlisz, Alain; Van Laethem, Jean-Luc; Donckier, Vincent

    2016-04-21

    In oncosurgical approach to colorectal liver metastases, surgery remains considered as the only potentially curative option, while chemotherapy alone represents a strictly palliative treatment. However, missing metastases, defined as metastases disappearing after chemotherapy, represent a unique model to evaluate the curative potential of chemotherapy and to challenge current therapeutic algorithms. We reviewed recent series on missing colorectal liver metastases to evaluate incidence of this phenomenon, predictive factors and rates of cure defined by complete pathologic response in resected missing metastases and sustained clinical response when they were left unresected. According to the progresses in the efficacy of chemotherapeutic regimen, the incidence of missing liver metastases regularly increases these last years. Main predictive factors are small tumor size, low marker level, duration of chemotherapy, and use of intra-arterial chemotherapy. Initial series showed low rates of complete pathologic response in resected missing metastases and high recurrence rates when unresected. However, recent reports describe complete pathologic responses and sustained clinical responses reaching 50%, suggesting that chemotherapy could be curative in some cases. Accordingly, in case of missing colorectal liver metastases, the classical recommendation to resect initial tumor sites might have become partially obsolete. Furthermore, the curative effect of chemotherapy in selected cases could lead to a change of paradigm in patients with unresectable liver-only metastases, using intensive first-line chemotherapy to intentionally induce missing metastases, followed by adjuvant surgery on remnant chemoresistant tumors and close surveillance of initial sites that have been left unresected.

  10. Computer models of social processes: the case of migration.

    PubMed

    Beshers, J M

    1967-06-01

    The demographic model is a program for representing births, deaths, migration, and social mobility as social processes in a non-stationary stochastic process (Markovian). Transition probabilities for each age group are stored and then retrieved at the next appearance of that age cohort. In this way new transition probabilities can be calculated as a function of the old transition probabilities and of two successive distribution vectors. Transition probabilities can be calculated to represent effects of the whole age-by-state distribution at any given time period, too. Such effects as saturation or queuing may be represented by a market mechanism; for example, migration between metropolitan areas can be represented as depending upon job supplies and labor markets. Within metropolitan areas, migration can be represented as invasion and succession processes with tipping points (acceleration curves), and the market device has been extended to represent this phenomenon. Thus, the demographic model makes possible the representation of alternative classes of models of demographic processes. With each class of model one can deduce implied time series (varying parameters within the class) and the output of the several classes can be compared to each other and to outside criteria, such as empirical time series.
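
    As a very rough, purely illustrative sketch of the bookkeeping described above, the snippet below stores per-age-group transition probabilities between two regions and updates them each period as a function of the previous probabilities and the current population distribution (a crude stand-in for the job-supply "market mechanism"). All numbers, the two-region setup, and the feedback rule are invented.

```python
# Minimal sketch of cohort bookkeeping with state-dependent transition probabilities:
# per-age transition matrices between two regions are updated each period using a crude
# crowding feedback, then applied to move each cohort. All values are illustrative.
import numpy as np

n_ages, n_regions, n_periods = 5, 2, 10
pop = np.full((n_ages, n_regions), 1000.0)                         # population by age and region
P = np.tile(np.array([[0.9, 0.1], [0.2, 0.8]]), (n_ages, 1, 1))    # per-age transition matrices

for _ in range(n_periods):
    # Saturation-style feedback: migration into a region is damped as it fills up.
    crowding = pop.sum(axis=0) / pop.sum()
    for a in range(n_ages):
        adj = P[a] * (1.0 - 0.5 * crowding)            # scale the pull of each destination
        np.fill_diagonal(adj, 0.0)
        np.fill_diagonal(adj, 1.0 - adj.sum(axis=1))   # renormalise rows to keep probabilities
        P[a] = adj
        pop[a] = pop[a] @ P[a]                         # move this cohort between regions
    pop = np.roll(pop, 1, axis=0)                       # ageing: cohorts advance one age group
    pop[0] = 800.0                                      # new entrants into the youngest group

print(pop.round(0))
```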

  11. Vitamin D toxicity of dietary origin in cats fed a natural complementary kitten food

    PubMed Central

    Crossley, Victoria J; Bovens, Catherine PV; Pineda, Carmen; Hibbert, Angie; Finch, Natalie C

    2017-01-01

    Case series summary This case series describes two young sibling cats and an additional unrelated cat, from two separate households, that developed hypercalcaemia associated with hypervitaminosis D. Excessive vitamin D concentrations were identified in a natural complementary tinned kitten food that was fed to all three cats as part of their diet. In one of the cases, there was clinical evidence of soft tissue mineralisation. The hypercalcaemia and soft tissue mineralisation resolved following withdrawal of the affected food and medical management of the hypercalcaemia. Relevance and novel information This case series demonstrates the importance of obtaining a thorough dietary history in patients presenting with hypercalcaemia and the measurement of vitamin D metabolites when investigating such cases. Complementary foods may have the potential to induce nutritional toxicity even when fed with complete, nutritionally balanced diets. PMID:29270305

  12. Reconstruction of extended Petri nets from time series data and its application to signal transduction and to gene regulatory networks

    PubMed Central

    2011-01-01

    Background Network inference methods reconstruct mathematical models of molecular or genetic networks directly from experimental data sets. We have previously reported a mathematical method which is exclusively data-driven, does not involve any heuristic decisions within the reconstruction process, and delivers all possible alternative minimal networks in terms of simple place/transition Petri nets that are consistent with a given discrete time series data set. Results We fundamentally extended the previously published algorithm to consider catalysis and inhibition of the reactions that occur in the underlying network. The results of the reconstruction algorithm are encoded in the form of an extended Petri net involving control arcs. This allows the consideration of processes involving mass flow and/or regulatory interactions. As a non-trivial test case, the phosphate regulatory network of enterobacteria was reconstructed using in silico-generated time-series data sets on wild-type and in silico mutants. Conclusions The new exact algorithm reconstructs extended Petri nets from time series data sets by finding all alternative minimal networks that are consistent with the data. It suggested alternative molecular mechanisms for certain reactions in the network. The algorithm is useful to combine data from wild-type and mutant cells and may potentially integrate physiological, biochemical, pharmacological, and genetic data in the form of a single model. PMID:21762503

  13. Neonatal Medical Exposures and Characteristics of Low Birth Weight Hepatoblastoma Cases: A Report From the Children's Oncology Group

    PubMed Central

    Turcotte, Lucie M.; Georgieff, Michael K.; Ross, Julie A.; Feusner, James H.; Tomlinson, Gail E.; Malogolowkin, Marcio H.; Krailo, Mark D.; Miller, Nicole; Fonstad, Rachel; Spector, Logan G.

    2015-01-01

    Background Hepatoblastoma is a malignancy of young children. Low birth weight is associated with significantly increased risk of hepatoblastoma and neonatal medical exposures are hypothesized as contributors. This study represents the largest case–control study of hepatoblastoma to date and aimed to define the role of neonatal exposures in hepatoblastoma risk among low birth weight children. Procedure Incident hepatoblastoma cases who were born <2,500 g (N = 60), diagnosed between 2000 and 2008, were identified through the Children's Oncology Group. Controls were recruited through state birth registries (N = 51). Neonatal medical exposures were abstracted from medical records. Subjects from the Vermont Oxford Network were used for further comparisons, as were existing reports on neonatal medical exposures. Results Case–control comparisons were hindered by poor matching within birth weight strata. Cases were smaller and received more aggressive neonatal treatment compared to controls, and reflected high correlation levels between birth weight and treatments. Similar difficulty was encountered when comparing cases to Vermont Oxford Network subjects; cases were smaller and required more aggressive neonatal therapy. Furthermore, it appears hepatoblastoma cases were exposed to a greater number of diagnostic X-rays than in case series previously reported in the neonatal literature. Conclusions This study presents the largest case series of hepatoblastoma in <2,500 g birth weight infants with accompanying neonatal medical exposure data. Findings confirm that birth weight is highly correlated with exposure intensity, and neonatal exposures are themselves highly correlated, which hampers the identification of a causal exposure among hepatoblastoma cases. Experimental models or genetic susceptibility testing may be more revealing of etiology. PMID:25044669

  14. The many faces of intestinal tract gastric heterotopia; a series of four cases highlighting clinical and pathological heterogeneity.

    PubMed

    Nasir, Aqsa; Amateau, Stuart K; Khan, Sabina; Simpson, Ross W; Snover, Dale C; Amin, Khalid

    2018-04-01

    Gastric heterotopia of the intestinal tract can have a diverse clinicopathologic presentation, resulting in a diagnostic dilemma. We present a series of four cases, two male and two female patients with age range of 31-82 years, found in the duodenum, jejunum, and transverse colon. The most common and rather unusual clinical presentation was iron deficiency anemia, seen in three cases, while one patient presented with abdominal pain. Endoscopically, two cases were visualized as pedunculated polyps and two as sessile/plaque-like lesions. Polypectomy was performed in three patients, and one patient underwent biopsy followed by resection. Two cases showed oxyntic-type epithelium, and two cases exhibited pyloric-type gastric epithelium. Three patients were relieved of their presenting symptoms after therapeutic procedures with no evidence of recurrence noted on follow-up. Follow-up was not available on one patient. This case series highlights a diverse clinicopathologic spectrum of gastric heterotopia. Accurate diagnosis is essential for proper management. Copyright © 2018. Published by Elsevier Inc.

  15. On a General Class of Trigonometric Functions and Fourier Series

    ERIC Educational Resources Information Center

    Pavao, H. Germano; Capelas de Oliveira, E.

    2008-01-01

    We discuss a general class of trigonometric functions whose corresponding Fourier series can be used to calculate several interesting numerical series. Particular cases are presented. (Contains 4 notes.)
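
    For instance, one classical particular case of this idea (not necessarily among those treated in the article) is the Fourier series of f(x) = x² on (-π, π), whose evaluation at x = π yields the Basel series:

```latex
% Fourier series of f(x) = x^2 on (-\pi, \pi) and the numerical series it produces.
\[
  x^{2} \;=\; \frac{\pi^{2}}{3} + 4\sum_{n=1}^{\infty} \frac{(-1)^{n}}{n^{2}}\cos nx ,
  \qquad -\pi \le x \le \pi ,
\]
\[
  \text{so that at } x=\pi:\qquad
  \pi^{2} \;=\; \frac{\pi^{2}}{3} + 4\sum_{n=1}^{\infty}\frac{1}{n^{2}}
  \;\;\Longrightarrow\;\;
  \sum_{n=1}^{\infty}\frac{1}{n^{2}} \;=\; \frac{\pi^{2}}{6}.
\]
```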

  16. A recurrent 16p12.1 microdeletion suggests a two-hit model for severe developmental delay

    PubMed Central

    Girirajan, Santhosh; Rosenfeld, Jill A.; Cooper, Gregory M.; Antonacci, Francesca; Siswara, Priscillia; Itsara, Andy; Vives, Laura; Walsh, Tom; McCarthy, Shane E.; Baker, Carl; Mefford, Heather C.; Kidd, Jeffrey M.; Browning, Sharon R.; Browning, Brian L.; Dickel, Diane E.; Levy, Deborah L.; Ballif, Blake C.; Platky, Kathryn; Farber, Darren M.; Gowans, Gordon C.; Wetherbee, Jessica J.; Asamoah, Alexander; Weaver, David D.; Mark, Paul R.; Dickerson, Jennifer; Garg, Bhuwan P.; Ellingwood, Sara A.; Smith, Rosemarie; Banks, Valerie C.; Smith, Wendy; McDonald, Marie T.; Hoo, Joe J.; French, Beatrice N.; Hudson, Cindy; Johnson, John P.; Ozmore, Jillian R.; Moeschler, John B.; Surti, Urvashi; Escobar, Luis F.; El-Kechen, Dima; Gorski, Jerome L.; Kussman, Jennifer; Salbert, Bonnie; Lacassie, Yves; Biser, Alisha; McDonald-McGinn, Donna M.; Zackai, Elaine H.; Deardorff, Matthew A.; Shaikh, Tamim H.; Haan, Eric; Friend, Kathryn L.; Fichera, Marco; Romano, Corrado; Gécz, Jozef; deLisi, Lynn E.; Sebat, Jonathan; King, Mary-Claire; Shaffer, Lisa G.; Eichler, Evan E.

    2010-01-01

    We report the identification of a recurrent 520-kbp 16p12.1 microdeletion significantly associated with childhood developmental delay. The microdeletion was detected in 20/11,873 cases vs. 2/8,540 controls (p=0.0009, OR=7.2) and replicated in a second series of 22/9,254 cases vs. 6/6,299 controls (p=0.028, OR=2.5). Most deletions were inherited, with carrier parents likely to manifest neuropsychiatric phenotypes (p=0.037, OR=6). Probands were more likely to carry an additional large CNV when compared to matched controls (10/42 cases, p=5.7×10⁻⁵, OR=6.65). Clinical features of cases with two mutations were distinct from and/or more severe than clinical features of patients carrying only the co-occurring mutation. Our data suggest a two-hit model in which the 16p12.1 microdeletion both predisposes to neuropsychiatric phenotypes as a single event and exacerbates neurodevelopmental phenotypes in association with other large deletions or duplications. Analysis of other microdeletions with variable expressivity suggests that this two-hit model may be more generally applicable to neuropsychiatric disease. PMID:20154674
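
    The discovery-series comparison above (20/11,873 cases vs. 2/8,540 controls) can be cross-checked with a standard 2x2 contingency-table test. The minimal Python sketch below only illustrates that arithmetic and is not the authors' analysis pipeline; the exact test they used may differ, so only the odds ratio is expected to match closely.

        from scipy.stats import fisher_exact

        # 2x2 table: rows = deletion present/absent, columns = cases/controls
        table = [[20, 2],                       # 16p12.1 deletion carriers
                 [11873 - 20, 8540 - 2]]        # non-carriers

        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"OR = {odds_ratio:.1f}, p = {p_value:.2g}")   # OR is approximately 7.2, as reported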

  17. BestPractices Corporate Energy Management Case Study: Alcoa Teams with DOE to Reduce Energy Consumption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2004-05-01

    This is the first in a series of DOE Industrial Technologies Program case studies on corporate energy management. The case study highlights Alcoa Aluminum's successful results and activities through its corporate energy management approach and collaboration with DOE. Case studies in this series will be used to encourage other energy-intensive industrial plants to adopt a corporate strategy, and to promote the concept of replicating results within a company or industry.

  18. A theoretically consistent stochastic cascade for temporal disaggregation of intermittent rainfall

    NASA Astrophysics Data System (ADS)

    Lombardo, F.; Volpi, E.; Koutsoyiannis, D.; Serinaldi, F.

    2017-06-01

    Generating fine-scale time series of intermittent rainfall that are fully consistent with any given coarse-scale totals is a key and open issue in many hydrological problems. We propose a stationary disaggregation method that simulates rainfall time series with given dependence structure, wet/dry probability, and marginal distribution at a target finer (lower-level) time scale, preserving full consistency with variables at a parent coarser (higher-level) time scale. We account for the intermittent character of rainfall at fine time scales by merging a discrete stochastic representation of intermittency and a continuous one of rainfall depths. This approach yields a unique and parsimonious mathematical framework providing general analytical formulations of mean, variance, and autocorrelation function (ACF) for a mixed-type stochastic process in terms of mean, variance, and ACFs of both continuous and discrete components, respectively. To achieve full consistency between variables at finer and coarser time scales in terms of marginal distribution and coarse-scale totals, the generated lower-level series are adjusted according to a procedure that does not affect the stochastic structure implied by the original model. To assess model performance, we study the rainfall process as intermittent with both independent and dependent occurrences, where dependence is quantified by the probability that two consecutive time intervals are dry. In either case, we provide analytical formulations of the main statistics of our mixed-type disaggregation model and show their clear accordance with Monte Carlo simulations. An application to real-world rainfall time series is shown as a proof of concept.
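
    To make the disaggregation idea concrete, the sketch below generates intermittent fine-scale values from a simple mixed discrete/continuous model (Bernoulli wet/dry occurrences with exponential wet depths, both assumed here rather than taken from the paper) and then applies a naive proportional adjustment so the fine-scale values sum exactly to each coarse-scale total. The authors' actual adjusting procedure is designed to preserve the stochastic structure of their model and is more elaborate than this rescaling.

        import numpy as np

        rng = np.random.default_rng(42)

        def disaggregate(coarse_totals, k=4, p_wet=0.4, mean_depth=2.0):
            """Split each coarse total into k intermittent fine-scale values."""
            fine = []
            for total in coarse_totals:
                # mixed-type draw: discrete occurrence indicator times continuous depth
                wet = rng.random(k) < p_wet
                depths = rng.exponential(mean_depth, k) * wet
                if depths.sum() == 0:
                    if total > 0:
                        # force a single wet interval when the coarse total is positive
                        depths[rng.integers(k)] = total
                else:
                    # naive proportional adjustment to preserve the coarse total
                    depths *= total / depths.sum()
                fine.extend(depths)
            return np.array(fine)

        daily = np.array([10.0, 0.0, 3.5])              # coarse-scale totals (e.g. mm/day)
        six_hourly = disaggregate(daily)                # four fine-scale values per total
        print(six_hourly.reshape(-1, 4).sum(axis=1))    # recovers [10. 0. 3.5]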

  19. Aerodynamic data banks for Clark-Y, NACA 4-digit and NACA 16-series airfoil families

    NASA Technical Reports Server (NTRS)

    Korkan, K. D.; Camba, J., III; Morris, P. M.

    1986-01-01

    With the renewed interest in propellers as a means of obtaining thrust and fuel efficiency, in addition to the increased utilization of the computer, significant progress was made in the development of theoretical models to predict the performance of propeller systems. Inherent in the majority of the theoretical performance models to date is the need for airfoil data banks which provide lift, drag, and moment coefficient values as a function of Mach number, angle-of-attack, maximum thickness to chord ratio, and Reynolds number. Realizing the need for such data, a study was initiated to provide airfoil data banks for three commonly used airfoil families in propeller design and analysis. The families chosen consisted of the Clark-Y, NACA 16 series, and NACA 4 digit series airfoils. The various components of each computer code, the source of the data used to create the airfoil data bank, the limitations of each data bank, program listings, and a sample case with its associated input-output are described. Each airfoil data bank computer code was written to be used on the Amdahl Computer system, which is IBM compatible and uses Fortran.
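
    The data-bank concept described above is essentially a tabulated coefficient lookup with interpolation. The short Python sketch below illustrates that idea with a placeholder lift-coefficient table interpolated over angle of attack, Mach number, and thickness-to-chord ratio; the grid and values are invented for the example and are not data from the NASA report, which also covers drag and moment coefficients and Reynolds number dependence.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        alpha = np.array([0.0, 4.0, 8.0])       # angle of attack, degrees
        mach = np.array([0.3, 0.5, 0.7])        # Mach number
        t_c = np.array([0.06, 0.12])            # maximum thickness / chord

        # placeholder C_L table with shape (len(alpha), len(mach), len(t_c))
        cl_table = np.random.default_rng(0).uniform(0.0, 1.2, (3, 3, 2))

        cl_lookup = RegularGridInterpolator((alpha, mach, t_c), cl_table)
        print(cl_lookup([[5.0, 0.45, 0.10]]))   # interpolated C_L at an off-grid point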

  20. Renormalization of a tensorial field theory on the homogeneous space SU(2)/U(1)

    NASA Astrophysics Data System (ADS)

    Lahoche, Vincent; Oriti, Daniele

    2017-01-01

    We study the renormalization of a general field theory on the homogeneous space (SU(2)/U(1))^{×d} with tensorial interaction and gauge invariance under the diagonal action of SU(2). We derive the power counting for arbitrary d. For the case d = 4, we prove perturbative renormalizability to all orders via multi-scale analysis, study both the renormalized and effective perturbation series, and establish the asymptotic freedom of the model. We also outline a general power counting for the homogeneous space (SO(D)/SO(D-1))^{×d}, of direct interest for quantum gravity models in arbitrary dimension, and point out the obstructions to the direct generalization of our results to these cases.
