Sample records for random utility models

  1. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE PAGES

    Ye, Xin; Garikapati, Venu M.; You, Daehyun; ...

    2017-11-08

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.
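    The closed-form likelihood this abstract refers to follows directly from the Gumbel assumption: i.i.d. standard Gumbel errors turn utility maximization into the familiar logit (softmax) choice probability. A minimal sketch of that relationship (illustrative only, not the authors' semi-nonparametric test):

```python
import math
import random

def logit_probs(utilities):
    """Closed-form MNL choice probabilities implied by i.i.d. standard
    Gumbel error terms: P(i) = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities)  # shift for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    s = sum(exps)
    return [e / s for e in exps]

def simulate_choice_shares(utilities, n_draws=200_000, seed=0):
    """Monte Carlo check: add standard Gumbel noise to each systematic
    utility and pick the argmax; shares should approach logit_probs."""
    rng = random.Random(seed)
    counts = [0] * len(utilities)
    for _ in range(n_draws):
        # Inverse-CDF draw of a standard Gumbel: -log(-log(U))
        noisy = [v - math.log(-math.log(rng.random())) for v in utilities]
        counts[noisy.index(max(noisy))] += 1
    return [c / n_draws for c in counts]
```

    The simulated shares converge to the closed-form probabilities as the number of draws grows; a non-Gumbel error distribution would break this equivalence, which is what a validity test of the kind described above probes.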

  2. A practical method to test the validity of the standard Gumbel distribution in logit-based multinomial choice models of travel behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Xin; Garikapati, Venu M.; You, Daehyun

    Most multinomial choice models (e.g., the multinomial logit model) adopted in practice assume an extreme-value Gumbel distribution for the random components (error terms) of utility functions. This distributional assumption offers a closed-form likelihood expression when the utility maximization principle is applied to model choice behaviors. As a result, model coefficients can be easily estimated using the standard maximum likelihood estimation method. However, maximum likelihood estimators are consistent and efficient only if distributional assumptions on the random error terms are valid. It is therefore critical to test the validity of underlying distributional assumptions on the error terms that form the basis of parameter estimation and policy evaluation. In this paper, a practical yet statistically rigorous method is proposed to test the validity of the distributional assumption on the random components of utility functions in both the multinomial logit (MNL) model and multiple discrete-continuous extreme value (MDCEV) model. Based on a semi-nonparametric approach, a closed-form likelihood function that nests the MNL or MDCEV model being tested is derived. The proposed method allows traditional likelihood ratio tests to be used to test violations of the standard Gumbel distribution assumption. Simulation experiments are conducted to demonstrate that the proposed test yields acceptable Type-I and Type-II error probabilities at commonly available sample sizes. The test is then applied to three real-world discrete and discrete-continuous choice models. For all three models, the proposed test rejects the validity of the standard Gumbel distribution in most utility functions, calling for the development of robust choice models that overcome adverse effects of violations of distributional assumptions on the error terms in random utility functions.

  3. Designing management strategies for carbon dioxide storage and utilization under uncertainty using inexact modelling

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2017-06-01

    Effective application of carbon capture, utilization and storage (CCUS) systems could help to alleviate the influence of climate change by reducing carbon dioxide (CO2) emissions. The research objective of this study is to develop an equilibrium chance-constrained programming model with bi-random variables (ECCP model) for supporting the CCUS management system under random circumstances. The major advantage of the ECCP model is that it tackles random variables as bi-random variables with a normal distribution, where the mean values themselves follow a normal distribution. This could avoid irrational assumptions and oversimplifications in the process of parameter design and enrich the theory of stochastic optimization. The ECCP model is solved by an equilibrium chance-constrained programming algorithm, which makes it convenient for decision makers to rank the solution set using the natural order of real numbers. The ECCP model is applied to a CCUS management problem, and the solutions could be useful in helping managers to design and generate rational CO2-allocation patterns under complexities and uncertainties.
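    The bi-random construction described here, a normal variable whose mean is itself normally distributed, can be sketched in a few lines; marginally such a draw is N(mu0, sqrt(tau^2 + sigma^2)). The parameter names are illustrative, not from the paper:

```python
import random

def draw_bi_random(mu0, tau, sigma, rng):
    """One draw of a bi-random variable: first sample the mean from
    N(mu0, tau), then sample the value from a normal centered on that
    mean. Marginally this is N(mu0, sqrt(tau**2 + sigma**2))."""
    mean = rng.gauss(mu0, tau)
    return rng.gauss(mean, sigma)
```

    The extra layer of randomness inflates the marginal variance, which is why treating an uncertain mean as a fixed number (the oversimplification the abstract warns against) understates the true spread.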

  4. Stochastic and Statistical Analysis of Utility Revenues and Weather Data Analysis for Consumer Demand Estimation in Smart Grids

    PubMed Central

    Ali, S. M.; Mehmood, C. A.; Khan, B.; Jawad, M.; Farid, U.; Jadoon, J. K.; Ali, M.; Tareen, N. K.; Usman, S.; Majid, M.; Anwar, S. M.

    2016-01-01

    In the smart grid paradigm, consumer demands are random and time-dependent, following stochastic probabilities. The stochastically varying consumer demands have put policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on the consumers' deterministic and stochastic demand models. Sudden drifts in weather parameters affect the living standards of the consumers, which in turn influence the power demands. Considering the above, we analyzed stochastically and statistically the effect of random consumer demands on the fixed and variable revenues of electrical utilities. Our work presents a Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of the utility revenues with time-dependent random consumer demands. Moreover, the Gaussian probabilities outcome of the utility revenues is based on the varying consumer demand data-patterns. Furthermore, Standard Monte Carlo (SMC) simulations are performed that validate the accuracy of the aforesaid probabilistic demand-revenue model. We critically analyzed the effect of weather data parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provided a relationship between the dependent variable (demand) and independent variables (weather data) for utility load management, generation control, and network expansion. PMID:27314229
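    The Standard Monte Carlo validation described above can be illustrated with a toy revenue model under a Gaussian demand assumption; the tariff structure and all numbers here are hypothetical, not taken from the paper:

```python
import random

def simulate_revenue(mean_demand_kw, sd_demand_kw, fixed_charge,
                     rate_per_kwh, hours=1.0, n_draws=100_000, seed=0):
    """Standard Monte Carlo estimate of expected utility revenue when
    consumer demand is Gaussian. Revenue = fixed charge + rate * energy;
    negative demand draws are truncated at zero."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        demand = max(0.0, rng.gauss(mean_demand_kw, sd_demand_kw))
        total += fixed_charge + rate_per_kwh * demand * hours
    return total / n_draws
```

    When mean demand is well above zero, the truncation rarely binds and the estimate converges to fixed_charge + rate * mean demand; the fluctuation of the estimate around that value is exactly the variable-revenue uncertainty the abstract analyzes.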

  5. Stochastic and Statistical Analysis of Utility Revenues and Weather Data Analysis for Consumer Demand Estimation in Smart Grids.

    PubMed

    Ali, S M; Mehmood, C A; Khan, B; Jawad, M; Farid, U; Jadoon, J K; Ali, M; Tareen, N K; Usman, S; Majid, M; Anwar, S M

    2016-01-01

    In the smart grid paradigm, consumer demands are random and time-dependent, following stochastic probabilities. The stochastically varying consumer demands have put policy makers and supplying agencies in a demanding position for optimal generation management. The utility revenue functions are highly dependent on the consumers' deterministic and stochastic demand models. Sudden drifts in weather parameters affect the living standards of the consumers, which in turn influence the power demands. Considering the above, we analyzed stochastically and statistically the effect of random consumer demands on the fixed and variable revenues of electrical utilities. Our work presents a Multi-Variate Gaussian Distribution Function (MVGDF) probabilistic model of the utility revenues with time-dependent random consumer demands. Moreover, the Gaussian probabilities outcome of the utility revenues is based on the varying consumer demand data-patterns. Furthermore, Standard Monte Carlo (SMC) simulations are performed that validate the accuracy of the aforesaid probabilistic demand-revenue model. We critically analyzed the effect of weather data parameters on consumer demands using correlation and multi-linear regression schemes. The statistical analysis of consumer demands provided a relationship between the dependent variable (demand) and independent variables (weather data) for utility load management, generation control, and network expansion.

  6. Collective states in social systems with interacting learning agents

    NASA Astrophysics Data System (ADS)

    Semeshenko, Viktoriya; Gordon, Mirta B.; Nadal, Jean-Pierre

    2008-08-01

    We study the implications of social interactions and individual learning features on consumer demand in a simple market model. We consider a social system of interacting heterogeneous agents with learning abilities. Given a fixed price, agents repeatedly decide whether or not to buy a unit of a good, so as to maximize their expected utilities. This model is close to Random Field Ising Models, where the random field corresponds to the idiosyncratic willingness to pay. We show that the equilibrium reached depends on the nature of the information agents use to estimate their expected utilities, and may differ from the system's Nash equilibria.

  7. Using partial site aggregation to reduce bias in random utility travel cost models

    NASA Astrophysics Data System (ADS)

    Lupi, Frank; Feather, Peter M.

    1998-12-01

    We propose a "partial aggregation" strategy for defining the recreation sites that enter choice sets in random utility models. Under the proposal, the most popular sites and sites that will be the subject of policy analysis enter choice sets as individual sites while remaining sites are aggregated into groups of similar sites. The scheme balances the desire to include all potential substitute sites in the choice sets with practical data and modeling constraints. Unlike fully aggregate models, our analysis and empirical applications suggest that the partial aggregation approach reasonably approximates the results of a disaggregate model. The partial aggregation approach offers all of the data and computational advantages of models with aggregate sites but does not suffer from the same degree of bias as fully aggregate models.

  8. Genetic Parameter Estimates for Metabolizing Two Common Pharmaceuticals in Swine.

    PubMed

    Howard, Jeremy T; Ashwell, Melissa S; Baynes, Ronald E; Brooks, James D; Yeatts, James L; Maltecca, Christian

    2018-01-01

    The regulation of drugs used to treat livestock has received increased attention, and it is currently unknown how much of the phenotypic variation in drug metabolism is due to the genetics of an animal. Therefore, the objective of the study was to determine the amount of phenotypic variation in fenbendazole and flunixin meglumine drug metabolism due to genetics. The population consisted of crossbred female and castrated male nursery pigs (n = 198) that were sired by boars represented by four breeds. The animals were spread across nine batches. Drugs were administered intravenously and blood collected a minimum of 10 times over a 48 h period. Genetic parameters for the parent drug and metabolite concentration within each drug were estimated based on pharmacokinetic (PK) parameters or concentrations across time utilizing a random regression model. The PK parameters were estimated using a non-compartmental analysis. The PK model included fixed effects of sex and breed of sire along with random sire and batch effects. The random regression model utilized Legendre polynomials and included a fixed population concentration curve, sex, and breed of sire effects along with a random sire deviation from the population curve and a batch effect. The sire effect included the intercept for all models except for the fenbendazole metabolite (i.e., intercept and slope). The mean heritability across PK parameters for the fenbendazole and flunixin meglumine parent drug (metabolite) was 0.15 (0.18) and 0.31 (0.40), respectively. For the parent drug (metabolite), the mean heritability across time was 0.27 (0.60) and 0.14 (0.44) for fenbendazole and flunixin meglumine, respectively. The errors surrounding the heritability estimates for the random regression model were smaller compared to estimates obtained from PK parameters. Across both the PK and the concentration-across-time models, a moderate heritability was estimated, and the model that utilized the plasma drug concentration across time resulted in estimates with a smaller standard error compared to models that utilized PK parameters. In sum, the current study found that a low to moderate proportion of the phenotypic variation in metabolizing fenbendazole and flunixin meglumine was explained by genetics.

  9. Genetic Parameter Estimates for Metabolizing Two Common Pharmaceuticals in Swine

    PubMed Central

    Howard, Jeremy T.; Ashwell, Melissa S.; Baynes, Ronald E.; Brooks, James D.; Yeatts, James L.; Maltecca, Christian

    2018-01-01

    The regulation of drugs used to treat livestock has received increased attention, and it is currently unknown how much of the phenotypic variation in drug metabolism is due to the genetics of an animal. Therefore, the objective of the study was to determine the amount of phenotypic variation in fenbendazole and flunixin meglumine drug metabolism due to genetics. The population consisted of crossbred female and castrated male nursery pigs (n = 198) that were sired by boars represented by four breeds. The animals were spread across nine batches. Drugs were administered intravenously and blood collected a minimum of 10 times over a 48 h period. Genetic parameters for the parent drug and metabolite concentration within each drug were estimated based on pharmacokinetic (PK) parameters or concentrations across time utilizing a random regression model. The PK parameters were estimated using a non-compartmental analysis. The PK model included fixed effects of sex and breed of sire along with random sire and batch effects. The random regression model utilized Legendre polynomials and included a fixed population concentration curve, sex, and breed of sire effects along with a random sire deviation from the population curve and a batch effect. The sire effect included the intercept for all models except for the fenbendazole metabolite (i.e., intercept and slope). The mean heritability across PK parameters for the fenbendazole and flunixin meglumine parent drug (metabolite) was 0.15 (0.18) and 0.31 (0.40), respectively. For the parent drug (metabolite), the mean heritability across time was 0.27 (0.60) and 0.14 (0.44) for fenbendazole and flunixin meglumine, respectively. The errors surrounding the heritability estimates for the random regression model were smaller compared to estimates obtained from PK parameters. Across both the PK and the concentration-across-time models, a moderate heritability was estimated, and the model that utilized the plasma drug concentration across time resulted in estimates with a smaller standard error compared to models that utilized PK parameters. In sum, the current study found that a low to moderate proportion of the phenotypic variation in metabolizing fenbendazole and flunixin meglumine was explained by genetics. PMID:29487615

  10. A random utility model of delay discounting and its application to people with externalizing psychopathology.

    PubMed

    Dai, Junyi; Gunn, Rachel L; Gerst, Kyle R; Busemeyer, Jerome R; Finn, Peter R

    2016-10-01

    Previous studies have demonstrated that working memory capacity plays a central role in delay discounting in people with externalizing psychopathology. These studies used a hyperbolic discounting model, and its single parameter, a measure of delay discounting, was estimated using the standard method of searching for indifference points between intertemporal options. However, there are several problems with this approach. First, the deterministic perspective on delay discounting underlying the indifference point method might be inappropriate. Second, the estimation procedure using the R² measure often leads to poor model fit. Third, when parameters are estimated using indifference points only, much of the information collected in a delay discounting decision task is wasted. To overcome these problems, this article proposes a random utility model of delay discounting. The proposed model has 2 parameters, 1 for delay discounting and 1 for choice variability. It was fit to choice data obtained from a recently published data set using both maximum-likelihood and Bayesian parameter estimation. As in previous studies, the delay discounting parameter was significantly associated with both externalizing problems and working memory capacity. Furthermore, choice variability was also found to be significantly associated with both variables. This finding suggests that randomness in decisions may be a mechanism by which externalizing problems and low working memory capacity are associated with poor decision making. The random utility model thus has the advantage of disclosing the role of choice variability, which had been masked by the traditional deterministic model.
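    The two-parameter idea can be sketched directly: one hyperbolic discounting rate and one choice-variability parameter, with the probability of each observed choice given by a logistic function of the discounted-value difference, so every trial contributes to the likelihood rather than only the indifference points. These functional forms are a common choice in this literature, not necessarily the authors' exact specification:

```python
import math

def hyperbolic_value(amount, delay, k):
    """Hyperbolic discounting: subjective value = amount / (1 + k * delay)."""
    return amount / (1.0 + k * delay)

def p_choose_delayed(immediate, delayed, delay, k, beta):
    """Random utility choice rule: probability of taking the delayed
    option is a logistic function of the discounted-value difference;
    beta governs choice variability (smaller beta = noisier choices)."""
    diff = hyperbolic_value(delayed, delay, k) - immediate
    return 1.0 / (1.0 + math.exp(-beta * diff))
```

    Summing log(p) over all observed trials gives the likelihood that maximum-likelihood or Bayesian estimation would maximize; as beta grows the rule approaches the deterministic indifference-point model.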

  11. Stemflow estimation in a redwood forest using model-based stratified random sampling

    Treesearch

    Jack Lewis

    2003-01-01

    Model-based stratified sampling is illustrated by a case study of stemflow volume in a redwood forest. The approach is actually a model-assisted sampling design in which auxiliary information (tree diameter) is utilized in the design of stratum boundaries to optimize the efficiency of a regression or ratio estimator. The auxiliary information is utilized in both the...

  12. Diabetes Care Management Teams Did Not Reduce Utilization When Compared With Traditional Care: A Randomized Cluster Trial.

    PubMed

    Kearns, Patrick

    2017-10-01

    PURPOSE: Health services research evaluates redesign models for primary care, and care management is one alternative; such evaluation includes resource utilization as a criterion. This study compares the impact of care-manager teams on resource utilization, both for entire panels of patients and for the subset of patients with diabetes. DESIGN: Randomized, prospective, cohort study comparing change in utilization rates between groups, pre- and post-intervention. METHODOLOGY: Ten primary care physician panels in a safety-net setting. Ten physicians were randomized to either a care-management approach (Group 1) or a traditional approach (Group 2). Care managers focused on diabetes and the cardiovascular cluster of diseases. The analysis compared rates of hospitalization, 30-day readmission, emergency room visits, and urgent care visits, comparing baseline rates to annual rates after a yearlong run-in for entire panels and for the subset of patients with diabetes. RESULTS: Resource utilization showed no statistically significant change between baseline and Year 3 (P=.79). Emergency room visits and hospital readmissions increased for both groups (P=.90), while hospital admissions and urgent care visits decreased (P=.73). Similarly, utilization was not significantly different for patients with diabetes (P=.69). CONCLUSIONS: A care-management team approach failed to improve resource utilization rates for entire panels and the subset of diabetic patients compared to traditional care. This reinforces the need for further evidentiary support for the care-management model's hypothesis in the safety net.

  13. An Evaluation of a Behaviorally Based Social Skills Group for Individuals Diagnosed with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Jeremy A.; Milne, Christine; Taubman, Mitchell; Oppenheim-Leaf, Misty; Torres, Norma; Townley-Cochran, Donna; Leaf, Ronald; McEachin, John; Yoder, Paul

    2017-01-01

    In this study we evaluated a social skills group which employed a progressive applied behavior analysis model for individuals diagnosed with autism spectrum disorder. A randomized control trial was utilized; eight participants were randomly assigned to a treatment group and seven participants were randomly assigned to a waitlist control group. The…

  14. A random forest algorithm for nowcasting of intense precipitation events

    NASA Astrophysics Data System (ADS)

    Das, Saurabh; Chakraborty, Rohit; Maitra, Animesh

    2017-09-01

    Automatic nowcasting of convective initiation and thunderstorms has potential applications in several sectors, including aviation planning and disaster management. In this paper, a random forest-based machine learning algorithm is tested for nowcasting of convective rain with a ground-based radiometer. Brightness temperatures measured at 14 frequencies (7 frequencies in the 22-31 GHz band and 7 frequencies in the 51-58 GHz band) are utilized as the inputs of the model. The lower frequency band is associated with water vapor absorption, whereas the upper frequency band relates to oxygen absorption; together they therefore provide information on the temperature and humidity of the atmosphere. The synthetic minority over-sampling technique is used to balance the data set, and 10-fold cross validation is used to assess the performance of the model. Results indicate that the random forest algorithm with a fixed alarm generation time of 30 min and 60 min performs quite well (probability of detection of all types of weather conditions ∼90%) with low false alarms. It is, however, also observed that reducing the alarm generation time improves the threat score significantly and also decreases false alarms. The proposed model is found to be very sensitive to boundary layer instability, as indicated by the variable importance measure. The study shows the suitability of a random forest algorithm for nowcasting applications utilizing a large number of input parameters from diverse sources, and the approach can be utilized in other forecasting problems.
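    As a rough illustration of the random-forest vote underlying such nowcasting models, here is a toy ensemble of depth-1 trees (stumps), each fit on a bootstrap sample with a randomly chosen feature and split point; a real application would use a library implementation such as scikit-learn's RandomForestClassifier:

```python
import random

def train_stump(data, rng):
    """Fit one decision stump on a bootstrap sample.
    data: list of (features, label) pairs with 0/1 labels."""
    sample = [rng.choice(data) for _ in data]   # bootstrap resample
    f = rng.randrange(len(sample[0][0]))        # random feature index
    thresh = rng.choice(sample)[0][f]           # random split point
    above = [y for x, y in sample if x[f] > thresh]
    below = [y for x, y in sample if x[f] <= thresh]
    # Each side predicts its strict-majority label (default 0).
    vote_above = 1 if above and sum(above) * 2 > len(above) else 0
    vote_below = 1 if below and sum(below) * 2 > len(below) else 0
    return f, thresh, vote_above, vote_below

def train_forest(data, n_trees=101, seed=0):
    rng = random.Random(seed)
    return [train_stump(data, rng) for _ in range(n_trees)]

def forest_predict(stumps, x):
    """Majority vote over all stumps in the forest."""
    votes = sum(va if x[f] > t else vb for f, t, va, vb in stumps)
    return 1 if votes * 2 > len(stumps) else 0
```

    Each stump alone is a weak, noisy rule, but the bootstrap-plus-random-split construction decorrelates their errors, so the majority vote is far more reliable, the same principle the nowcasting model exploits with full-depth trees on 14 brightness-temperature inputs.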

  15. Rescaling quality of life values from discrete choice experiments for use as QALYs: a cautionary tale

    PubMed Central

    Flynn, Terry N; Louviere, Jordan J; Marley, Anthony AJ; Coast, Joanna; Peters, Tim J

    2008-01-01

    Background Researchers are increasingly investigating the potential for ordinal tasks such as ranking and discrete choice experiments to estimate QALY health state values. However, the assumptions of random utility theory, which underpin the statistical models used to provide these estimates, have received insufficient attention. In particular, the assumptions made about the decisions between living states and the death state are not satisfied, at least for some people. Estimated values are likely to be incorrectly anchored with respect to death (zero) in such circumstances. Methods Data from the Investigating Choice Experiments for the preferences of older people CAPability instrument (ICECAP) valuation exercise were analysed. The values (previously anchored to the worst possible state) were rescaled using an ordinal model proposed previously to estimate QALY-like values. Bootstrapping was conducted to vary artificially the proportion of people who conformed to the conventional random utility model underpinning the analyses. Results Only 26% of respondents conformed unequivocally to the assumptions of conventional random utility theory. At least 14% of respondents unequivocally violated the assumptions. Varying the relative proportions of conforming respondents in sensitivity analyses led to large changes in the estimated QALY values, particularly for lower-valued states. As a result these values could be either positive (considered to be better than death) or negative (considered to be worse than death). Conclusion Use of a statistical model such as conditional (multinomial) regression to anchor quality of life values from ordinal data to death is inappropriate in the presence of respondents who do not conform to the assumptions of conventional random utility theory. 
This is clearest when estimating values for that group of respondents observed in valuation samples who refuse to consider any living state to be worse than death: in such circumstances the model cannot be estimated. Only a valuation task requiring respondents to make choices in which both length and quality of life vary can produce estimates that properly reflect the preferences of all respondents. PMID:18945358

  16. Box-Cox Mixed Logit Model for Travel Behavior Analysis

    NASA Astrophysics Data System (ADS)

    Orro, Alfonso; Novales, Margarita; Benitez, Francisco G.

    2010-09-01

    To represent the behavior of travelers when they are deciding how to get to their destination, discrete choice models based on random utility theory have become one of the most widely used tools. The field in which these models were developed was halfway between econometrics and transport engineering, although the latter now constitutes one of their principal areas of application. In the transport field, they have mainly been applied to mode choice, but also to the selection of destination, route, and other important decisions such as vehicle ownership. In usual practice, the most frequently employed discrete choice models implement a fixed-coefficient utility function that is linear in the parameters. The principal aim of this paper is to demonstrate the viability of specifying utility functions with random coefficients that are nonlinear in the parameters, in applications of discrete choice models to transport. Nonlinear specifications in the parameters were present in discrete choice theory at its outset, although they have seldom been used in practice until recently. The specification of random coefficients, however, began with the probit and hedonic models in the 1970s and, after a period of apparently little practical interest, has burgeoned into a field of intense activity in recent years with the new generation of mixed logit models. In this communication, we present a Box-Cox mixed logit model, original to the authors, which includes the estimation of the Box-Cox exponents in addition to the parameters of the random coefficients distribution. The probability of choosing an alternative is an integral that is calculated by simulation, and the model is estimated by maximizing the simulated log-likelihood of a sample of observed individual choices between alternatives. The differences between the predictions yielded by models that are inconsistent with real behavior have been studied with simulation experiments.
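    The Box-Cox transform at the heart of such nonlinear-in-parameters utilities is simple to state: lambda = 1 recovers a linear specification (up to a constant) and lambda -> 0 gives the logarithm. A minimal sketch:

```python
import math

def box_cox(x, lam):
    """Box-Cox transformation of a positive attribute:
    (x**lam - 1) / lam, with the log limit at lam = 0."""
    if lam == 0.0:
        return math.log(x)
    return (x ** lam - 1.0) / lam
```

    In a Box-Cox utility specification an attribute such as travel time enters as coefficient * box_cox(time, lam), and lam is estimated jointly with the coefficients, which is what makes the model nonlinear in the parameters.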

  17. Deleterious Thermal Effects Due To Randomized Flow Paths in Pebble Bed, and Particle Bed Style Reactors

    NASA Technical Reports Server (NTRS)

    Moran, Robert P.

    2013-01-01

    A review of the literature associated with Pebble Bed and Particle Bed reactor core research has revealed a systemic problem inherent to reactor core concepts which utilize randomized rather than structured coolant channel flow paths. For both the Pebble Bed and Particle Bed Reactor designs, case studies reveal that, for indeterminate reasons, regions within the core would suffer from excessive heating, leading to thermal runaway and localized fuel melting. A thermal Computational Fluid Dynamics model was utilized to verify that, in both the Pebble Bed and Particle Bed Reactor concepts, randomized coolant channel pathways combined with localized high-temperature regions work together to resist the flow of coolant, diverting it away from where it is needed the most toward cooler, less resistive pathways where it is needed the least. In other words, given the choice via randomized coolant pathways, the reactor coolant will take the path of least resistance, and hot zones offer the highest resistance. Having identified the relationship between randomized coolant channel pathways and localized fuel melting, it is now safe to assume that other reactor concepts that utilize randomized coolant pathways, such as the foam core reactor, are also susceptible to this phenomenon.

  18. A novel method for predicting the power outputs of wave energy converters

    NASA Astrophysics Data System (ADS)

    Wang, Yingguang

    2018-03-01

    This paper focuses on realistically predicting the power outputs of wave energy converters operating in shallow water nonlinear waves. A heaving two-body point absorber is utilized as a specific calculation example, and the generated power of the point absorber has been predicted by using a novel method (a nonlinear simulation method) that incorporates a second order random wave model into a nonlinear dynamic filter. It is demonstrated that the second order random wave model in this article can be utilized to generate irregular waves with realistic crest-trough asymmetries, and consequently, more accurate generated power can be predicted by subsequently solving the nonlinear dynamic filter equation with the nonlinearly simulated second order waves as inputs. The research findings demonstrate that the novel nonlinear simulation method in this article can be utilized as a robust tool for ocean engineers in their design, analysis and optimization of wave energy converters.

  19. Modal identification of structures from the responses and random decrement signatures

    NASA Technical Reports Server (NTRS)

    Brahim, S. R.; Goglia, G. L.

    1977-01-01

    The theory and application of a method which utilizes the free response of a structure to determine its vibration parameters is described. The time-domain free response is digitized and used in a digital computer program to determine the number of modes excited, the natural frequencies, the damping factors, and the modal vectors. The technique is applied to a complex generalized payload model previously tested using the sine-sweep method and analyzed by NASTRAN. Ten modes of the payload model are identified. In case a free decay response is not readily available, an algorithm is developed to obtain the free responses of a structure from its random responses, due to some unknown or known random input or inputs, using the random decrement technique without changing the time correlation between signals. The algorithm is tested using random responses from a generalized payload model and from the space shuttle model.
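    The random decrement idea, estimating the free decay by averaging response segments that all start from the same triggering condition so that the zero-mean random forcing cancels, can be sketched as follows (a level-upcrossing trigger is assumed here; practical implementations vary):

```python
def random_decrement(signal, trigger, segment_len):
    """Random decrement signature: average all segments that begin where
    the response crosses the trigger level from below. The random part
    averages toward zero and a (scaled) free-decay estimate remains."""
    segments = []
    for i in range(1, len(signal) - segment_len + 1):
        if signal[i - 1] < trigger <= signal[i]:  # upcrossing of trigger
            segments.append(signal[i:i + segment_len])
    if not segments:
        return []
    n = len(segments)
    return [sum(seg[j] for seg in segments) / n for j in range(segment_len)]
```

    Because every segment starts at the same level, the averaged signature preserves the time correlation of the underlying free response, which is the property the abstract emphasizes.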

  20. Estimating overall exposure effects for the clustered and censored outcome using random effect Tobit regression models.

    PubMed

    Wang, Wei; Griswold, Michael E

    2016-11-30

    The random effect Tobit model is a regression model that accommodates both left- and/or right-censoring and within-cluster dependence of the outcome variable. Regression coefficients of random effect Tobit models have conditional interpretations on a constructed latent dependent variable and do not provide inference of overall exposure effects on the original outcome scale. Marginalized random effects model (MREM) permits likelihood-based estimation of marginal mean parameters for the clustered data. For random effect Tobit models, we extend the MREM to marginalize over both the random effects and the normal space and boundary components of the censored response to estimate overall exposure effects at population level. We also extend the 'Average Predicted Value' method to estimate the model-predicted marginal means for each person under different exposure status in a designated reference group by integrating over the random effects and then use the calculated difference to assess the overall exposure effect. The maximum likelihood estimation is proposed utilizing a quasi-Newton optimization algorithm with Gauss-Hermite quadrature to approximate the integration of the random effects. We use these methods to carefully analyze two real datasets. Copyright © 2016 John Wiley & Sons, Ltd.
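    The "normal space and boundary components" can be seen in the marginal mean of a left-censored-at-zero outcome. Ignoring the random effects for clarity, a sketch of that calculation (not the authors' full MREM machinery):

```python
import math

def normal_pdf(z):
    """Standard normal density phi(z)."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def normal_cdf(z):
    """Standard normal CDF Phi(z) via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def tobit_marginal_mean(mu, sigma):
    """Marginal mean of a Tobit outcome censored from below at zero:
    E[max(0, mu + sigma*Z)] = mu * Phi(mu/sigma) + sigma * phi(mu/sigma),
    i.e. the uncensored ('normal space') part plus the boundary part."""
    z = mu / sigma
    return mu * normal_cdf(z) + sigma * normal_pdf(z)
```

    In the full model, mu would contain covariates and a random effect, and the random effect would be integrated out (e.g. by Gauss-Hermite quadrature, as in the paper) before forming this marginal mean.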

  1. A UTILITY THEORY OF OLD AGE.

    ERIC Educational Resources Information Center

    HAMLIN, ROY M.

    Herzberg's job satisfaction model serves as the basis for an analysis of old age. The pattern varies among individuals, but the capacity for organized behavior rather than random stress reduction supplies each individual with a task. The hypothesis is that if the older individual realizes utility in his years beyond 70, he will retain competence…

  2. Predicting Health Care Utilization After Behavioral Health Referral Using Natural Language Processing and Machine Learning.

    PubMed

    Roysden, Nathaniel; Wright, Adam

    2015-01-01

    Mental health problems are an independent predictor of increased healthcare utilization. We created random forest classifiers for predicting two outcomes following a patient's first behavioral health encounter: decreased utilization by any amount (AUROC 0.74) and ultra-high absolute utilization (AUROC 0.88). These models may be used for clinical decision support by referring providers, to automatically detect patients who may benefit from referral, for cost management, or for risk/protection factor analysis.
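    The AUROC figures quoted above can be computed without any curve plotting via the Mann-Whitney identity: AUROC is the probability that a randomly chosen positive case scores above a randomly chosen negative one (ties count half). The risk scores below are hypothetical, purely for illustration.

    ```python
    def auroc(scores_pos, scores_neg):
        # Mann-Whitney U / AUROC identity: fraction of (positive, negative)
        # pairs in which the positive case gets the higher score.
        wins = 0.0
        for p in scores_pos:
            for n in scores_neg:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))

    # Hypothetical classifier scores for high- and low-utilization patients.
    high_util = [0.9, 0.8, 0.75, 0.6]
    low_util = [0.7, 0.4, 0.3, 0.2]
    auc = auroc(high_util, low_util)   # 15 of 16 pairs ordered correctly
    ```

    A perfectly separating classifier gives 1.0 and a random one about 0.5, which is the scale on which the reported 0.74 and 0.88 should be read.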

  3. Rigorously testing multialternative decision field theory against random utility models.

    PubMed

    Berkowitsch, Nicolas A J; Scheibehenne, Benjamin; Rieskamp, Jörg

    2014-06-01

    Cognitive models of decision making aim to explain the process underlying observed choices. Here, we test a sequential sampling model of decision making, multialternative decision field theory (MDFT; Roe, Busemeyer, & Townsend, 2001), on empirical grounds and compare it against 2 established random utility models of choice: the probit and the logit model. Using a within-subject experimental design, participants in 2 studies repeatedly chose among sets of options (consumer products) described on several attributes. The results of Study 1 showed that all models predicted participants' choices equally well. In Study 2, in which the choice sets were explicitly designed to distinguish the models, MDFT had an advantage in predicting the observed choices. Study 2 further revealed the occurrence of multiple context effects within single participants, indicating an interdependent evaluation of choice options and correlations between different context effects. In sum, the results indicate that sequential sampling models can provide relevant insights into the cognitive process underlying preferential choices and thus can lead to better choice predictions. PsycINFO Database Record (c) 2014 APA, all rights reserved.
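    The logit competitor above has a closed form worth recalling: with i.i.d. standard Gumbel errors, utility maximization yields choice probabilities that are a softmax of the deterministic utilities (McFadden's multinomial logit). A minimal sketch with made-up utilities:

    ```python
    import math

    def logit_choice_probs(utilities):
        # With i.i.d. standard Gumbel errors, the probability of choosing
        # option i is exp(V_i) / sum_j exp(V_j).
        m = max(utilities)                      # stabilize the exponentials
        expu = [math.exp(u - m) for u in utilities]
        s = sum(expu)
        return [e / s for e in expu]

    # Hypothetical deterministic utilities for three consumer products.
    probs = logit_choice_probs([1.0, 2.0, 0.5])

    # The logit's IIA property: the odds of option 0 vs option 2 depend
    # only on their utility difference, exp(1.0 - 0.5).
    iia_ratio = probs[0] / probs[2]
    expected_ratio = math.exp(1.0 - 0.5)
    ```

    The IIA property shown in the last two lines is exactly what context effects like those in Study 2 violate, which is why they discriminate between MDFT and random utility models.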

  4. The Dependent Poisson Race Model and Modeling Dependence in Conjoint Choice Experiments

    ERIC Educational Resources Information Center

    Ruan, Shiling; MacEachern, Steven N.; Otter, Thomas; Dean, Angela M.

    2008-01-01

    Conjoint choice experiments are used widely in marketing to study consumer preferences amongst alternative products. We develop a class of choice models, belonging to the class of Poisson race models, that describe a "random utility" which lends itself to a process-based description of choice. The models incorporate a dependence structure which…

  5. User-driven health care: answering multidimensional information needs in individual patients utilizing post-EBM approaches: an operational model.

    PubMed

    Biswas, Rakesh; Maniam, Jayanthy; Lee, Edwin Wen Huo; Gopal, Premalatha; Umakanth, Shashikiran; Dahiya, Sumit; Ahmed, Sayeed

    2008-10-01

    The hypothesis in the conceptual model was that a user-driven innovation in presently available information and communication technology infrastructure would be able to meet patient and health professional users' information needs and help them attain better health outcomes. An operational model was created to plan a trial on a sample diabetic population utilizing a randomized controlled trial design, assigning one randomly selected group of diabetics to receive an electronic information intervention and analysing whether it would improve their health outcomes in comparison with a matched diabetic population who would receive only regular medical intervention. Diabetes was chosen for this particular trial, as it is a major chronic illness in Malaysia as elsewhere in the world. This is in essence a position paper on how the study should be organized, intended to stimulate wider discussion prior to beginning the study.

  6. Analytic methods for questions pertaining to a randomized pretest, posttest, follow-up design.

    PubMed

    Rausch, Joseph R; Maxwell, Scott E; Kelley, Ken

    2003-09-01

    This article delineates 5 questions regarding group differences that are likely to be of interest to researchers within the framework of a randomized pretest, posttest, follow-up (PPF) design. These 5 questions are examined from a methodological perspective by comparing and discussing analysis of variance (ANOVA) and analysis of covariance (ANCOVA) methods, and by briefly discussing hierarchical linear modeling (HLM) for these questions. The article demonstrates that the pretest should be utilized as a covariate in the model rather than as a level of the time factor or as part of the dependent variable within the analysis of group differences. It is also demonstrated that the way the posttest and the follow-up are utilized in the analysis of group differences depends on the specific question asked by the researcher.

  7. Randomness determines practical security of BB84 quantum key distribution.

    PubMed

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-10

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that utilize imperfect devices, but no general security analysis model covering all practical attacking schemes has been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack, and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attack models, and the results can be applied to guarantee the security of practical quantum key distribution systems.

  8. Randomness determines practical security of BB84 quantum key distribution

    PubMed Central

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-01-01

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that utilize imperfect devices, but no general security analysis model covering all practical attacking schemes has been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack, and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attack models, and the results can be applied to guarantee the security of practical quantum key distribution systems. PMID:26552359

  9. Randomness determines practical security of BB84 quantum key distribution

    NASA Astrophysics Data System (ADS)

    Li, Hong-Wei; Yin, Zhen-Qiang; Wang, Shuang; Qian, Yong-Jun; Chen, Wei; Guo, Guang-Can; Han, Zheng-Fu

    2015-11-01

    Unconditional security of the BB84 quantum key distribution protocol has been proved by exploiting the fundamental laws of quantum mechanics, but a practical quantum key distribution system may be hacked by exploiting imperfect state preparation and measurement. Until now, different attacking schemes have been proposed that utilize imperfect devices, but no general security analysis model covering all practical attacking schemes has been proposed. Here, we demonstrate that the general practical attacking schemes can be divided into the Trojan horse attack, the strong randomness attack, and the weak randomness attack. We prove the security of the BB84 protocol under these randomness attack models, and the results can be applied to guarantee the security of practical quantum key distribution systems.

  10. Encoding Sequential Information in Semantic Space Models: Comparing Holographic Reduced Representation and Random Permutation

    PubMed Central

    Recchia, Gabriel; Sahlgren, Magnus; Kanerva, Pentti; Jones, Michael N.

    2015-01-01

    Circular convolution and random permutation have each been proposed as neurally plausible binding operators capable of encoding sequential information in semantic memory. We perform several controlled comparisons of circular convolution and random permutation as means of encoding paired associates as well as encoding sequential information. Random permutations outperformed convolution with respect to the number of paired associates that can be reliably stored in a single memory trace. Performance was equal on semantic tasks when using a small corpus, but random permutations were ultimately capable of achieving superior performance due to their higher scalability to large corpora. Finally, “noisy” permutations in which units are mapped to other units arbitrarily (no one-to-one mapping) perform nearly as well as true permutations. These findings increase the neurological plausibility of random permutations and highlight their utility in vector space models of semantics. PMID:25954306
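    The two binding operators compared above can be sketched directly. Circular convolution (as in holographic reduced representations) is computed via the FFT and approximately inverted with the involution of one operand; random permutation binds by permuting one operand before superposition and is inverted exactly. Dimensions and vectors below are arbitrary illustrations.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    d = 1024
    a = rng.standard_normal(d) / np.sqrt(d)   # approximately unit-norm
    b = rng.standard_normal(d) / np.sqrt(d)

    # --- Circular convolution binding (HRR) ---
    def cconv(x, y):
        return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

    def involution(x):
        # x'[k] = x[-k mod d]: the approximate inverse under convolution.
        return np.concatenate(([x[0]], x[:0:-1]))

    trace_hrr = cconv(a, b)
    b_hat = cconv(involution(a), trace_hrr)   # noisy reconstruction of b

    # --- Random permutation binding ---
    perm = rng.permutation(d)
    inv = np.argsort(perm)
    trace_rp = a + b[perm]
    b_rp = (trace_rp - a)[inv]                # exact reconstruction of b

    def cos(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
    ```

    Decoding the convolution trace recovers b only up to noise (cosine similarity well below 1), while the permutation decode is exact; scaled up to many stored pairs, this gap is the capacity difference the abstract reports.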

  11. Cost-utility of cognitive behavioral therapy for low back pain from the commercial payer perspective.

    PubMed

    Norton, Giulia; McDonough, Christine M; Cabral, Howard; Shwartz, Michael; Burgess, James F

    2015-05-15

    Markov cost-utility model. To evaluate the cost-utility of cognitive behavioral therapy (CBT) for the treatment of persistent nonspecific low back pain (LBP) from the perspective of US commercial payers. CBT is widely deemed clinically effective for LBP treatment, and the evidence is suggestive of cost-effectiveness. We constructed and validated a Markov intention-to-treat model to estimate the cost-utility of CBT, with 1-year and 10-year time horizons. We applied likelihood of improvement and utilities from a randomized controlled trial assessing CBT to treat LBP. The trial randomized subjects to treatment, but subjects freely sought health care services. We derived the cost of equivalent rates and types of services from US commercial claims for LBP for a similar population. For the 10-year estimates, we derived recurrence rates from the literature. The base case included medical and pharmaceutical services and assumed gradual loss of skill in applying CBT techniques. Sensitivity analyses assessed the distribution of service utilization, utility values, and rate of LBP recurrence. We compared health plan designs. Results are based on 5000 iterations of each model and expressed as an incremental cost per quality-adjusted life-year. The incremental cost-utility of CBT was $7197 per quality-adjusted life-year in the first year and $5855 per quality-adjusted life-year over 10 years. The results are robust across numerous sensitivity analyses: no change of parameter estimate resulted in a difference of more than 7% from the base case for either time horizon. Including chiropractic and/or acupuncture care did not substantively affect cost-effectiveness. The model with medical but no pharmaceutical costs was more cost-effective ($5238 for 1 yr and $3849 for 10 yr). CBT is a cost-effective approach to manage chronic LBP among commercial health plan members, and cost-effectiveness is demonstrated for multiple plan designs. Level of Evidence: 2.

  12. Estimating Independent Locally Shifted Random Utility Models for Ranking Data

    ERIC Educational Resources Information Center

    Lam, Kar Yin; Koning, Alex J.; Franses, Philip Hans

    2011-01-01

    We consider the estimation of probabilistic ranking models in the context of conjoint experiments. By using approximate rather than exact ranking probabilities, we avoided the computation of high-dimensional integrals. We extended the approximation technique proposed by Henery (1981) in the context of the Thurstone-Mosteller-Daniels model to any…

  13. The Impact of Sample Size and Other Factors When Estimating Multilevel Logistic Models

    ERIC Educational Resources Information Center

    Schoeneberger, Jason A.

    2016-01-01

    The design of research studies utilizing binary multilevel models must necessarily incorporate knowledge of multiple factors, including estimation method, variance component size, or number of predictors, in addition to sample sizes. This Monte Carlo study examined the performance of random effect binary outcome multilevel models under varying…

  14. Model-based sensor-less wavefront aberration correction in optical coherence tomography.

    PubMed

    Verstraete, Hans R G W; Wahls, Sander; Kalkman, Jeroen; Verhaegen, Michel

    2015-12-15

    Several sensor-less wavefront aberration correction methods that correct nonlinear wavefront aberrations by maximizing the optical coherence tomography (OCT) signal are tested on an OCT setup. A conventional coordinate search method is compared to two model-based optimization methods. The first model-based method takes advantage of the well-known NEWUOA optimization algorithm and utilizes a quadratic model. The second model-based method (DONE) is new and utilizes a random multidimensional Fourier-basis expansion. The model-based algorithms achieve lower wavefront errors with up to ten times fewer measurements. Furthermore, the newly proposed DONE method significantly outperforms the NEWUOA method. The DONE algorithm is tested on OCT images and shows significantly improved image quality.

  15. A utility-based design for randomized comparative trials with ordinal outcomes and prognostic subgroups.

    PubMed

    Murray, Thomas A; Yuan, Ying; Thall, Peter F; Elizondo, Joan H; Hofstetter, Wayne L

    2018-01-22

    A design is proposed for randomized comparative trials with ordinal outcomes and prognostic subgroups. The design accounts for patient heterogeneity by allowing possibly different comparative conclusions within subgroups. The comparative testing criterion is based on utilities for the levels of the ordinal outcome and a Bayesian probability model. Designs based on two alternative models that include treatment-subgroup interactions are considered, the proportional odds model and a non-proportional odds model with a hierarchical prior that shrinks toward the proportional odds model. A third design that assumes homogeneity and ignores possible treatment-subgroup interactions also is considered. The three approaches are applied to construct group sequential designs for a trial of nutritional prehabilitation versus standard of care for esophageal cancer patients undergoing chemoradiation and surgery, including both untreated patients and salvage patients whose disease has recurred following previous therapy. A simulation study is presented that compares the three designs, including evaluation of within-subgroup type I and II error probabilities under a variety of scenarios including different combinations of treatment-subgroup interactions. © 2018, The International Biometric Society.

  16. Zipf's law in city size from a resource utilization model.

    PubMed

    Ghosh, Asim; Chatterjee, Arnab; Chakrabarti, Anindya S; Chakrabarti, Bikas K

    2014-10-01

    We study a resource utilization scenario characterized by intrinsic fitness. To describe the growth and organization of different cities, we consider a model for resource utilization in which many restaurants compete, as in a game, to attract customers through an iterative learning process. Results for the case of restaurants with uniform fitness are reported. When fitness is uniformly distributed, it gives rise to a Zipf law for the number of customers. We perform an exact calculation of the utilization fraction for the case when choices are made independently of fitness. A variant of the model is also introduced in which the fitness can be treated as an ability to stay in business. When a restaurant loses customers, its fitness is replaced by a random fitness. The steady-state fitness distribution is characterized by a power law, while the distribution of the number of customers still follows the Zipf law, implying the robustness of the model. Our model serves as a paradigm for the emergence of Zipf's law in city size distribution.
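    One result above, the utilization fraction when choices are made independently of fitness, is easy to check numerically: with N customers each picking one of N restaurants uniformly at random, the fraction of restaurants visited at least once approaches 1 − 1/e for large N. A toy sketch of just that bookkeeping (not the paper's iterative learning dynamics):

    ```python
    import math
    import random

    random.seed(7)
    N = 10000                      # restaurants and customers, illustrative

    # Each customer picks a restaurant uniformly at random; a restaurant
    # is "utilized" if it is visited by at least one customer.
    visited = set(random.randrange(N) for _ in range(N))
    utilization = len(visited) / N

    # Expected fraction: 1 - (1 - 1/N)^N -> 1 - 1/e ~ 0.632.
    limit = 1 - math.exp(-1)
    ```

    The Zipf law for customer counts only emerges once the fitness-dependent iterative learning is added; this sketch covers only the fitness-independent baseline the exact calculation refers to.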

  17. Zipf's law in city size from a resource utilization model

    NASA Astrophysics Data System (ADS)

    Ghosh, Asim; Chatterjee, Arnab; Chakrabarti, Anindya S.; Chakrabarti, Bikas K.

    2014-10-01

    We study a resource utilization scenario characterized by intrinsic fitness. To describe the growth and organization of different cities, we consider a model for resource utilization in which many restaurants compete, as in a game, to attract customers through an iterative learning process. Results for the case of restaurants with uniform fitness are reported. When fitness is uniformly distributed, it gives rise to a Zipf law for the number of customers. We perform an exact calculation of the utilization fraction for the case when choices are made independently of fitness. A variant of the model is also introduced in which the fitness can be treated as an ability to stay in business. When a restaurant loses customers, its fitness is replaced by a random fitness. The steady-state fitness distribution is characterized by a power law, while the distribution of the number of customers still follows the Zipf law, implying the robustness of the model. Our model serves as a paradigm for the emergence of Zipf's law in city size distribution.

  18. Excellent Teachers' Thinking Model: Implications for Effective Teaching

    ERIC Educational Resources Information Center

    Hamzah, Sahandri G.; Mohamad, Hapidah; Ghorbani, Mohammad R.

    2008-01-01

    This study aimed to suggest an Excellent Teacher Thinking Model that has the potential to be utilized in the development of excellent teachers. Interaction survey method using survey questions, observation, document review and interview was conducted in this study. One hundred and five excellent teachers were selected randomly as research…

  19. New machine learning tools for predictive vegetation mapping after climate change: Bagging and Random Forest perform better than Regression Tree Analysis

    Treesearch

    L.R. Iverson; A.M. Prasad; A. Liaw

    2004-01-01

    More and better machine learning tools are becoming available for landscape ecologists to aid in understanding species-environment relationships and to map probable species occurrence now and potentially into the future. To that end, we evaluated three statistical models: Regression Tree Analysis (RTA), Bagging Trees (BT) and Random Forest (RF) for their utility in...

  20. Probabilistic analysis for fatigue strength degradation of materials

    NASA Technical Reports Server (NTRS)

    Royce, Lola

    1989-01-01

    This report presents the results of the first year of a research program conducted for NASA-LeRC by the University of Texas at San Antonio. The research included development of methodology that provides a probabilistic treatment of lifetime prediction of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Linear elastic fracture mechanics is utilized in the latter model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs, RANDOM2, RANDOM3, and RANDOM4. These programs determine the random lifetime of an engine component, in mechanical load cycles, to reach a critical fatigue strength or crack size. The material considered was a cast nickel base superalloy, one typical of those used in the Space Shuttle Main Engine.

  1. Fast Geostatistical Inversion using Randomized Matrix Decompositions and Sketchings for Heterogeneous Aquifer Characterization

    NASA Astrophysics Data System (ADS)

    O'Malley, D.; Le, E. B.; Vesselinov, V. V.

    2015-12-01

    We present a fast, scalable, and highly implementable stochastic inverse method for characterization of aquifer heterogeneity. The method utilizes recent advances in randomized matrix algebra and exploits the structure of the Quasi-Linear Geostatistical Approach (QLGA) without requiring a structured grid, unlike Fast Fourier Transform (FFT) methods. The QLGA framework is a more stable version of Gauss-Newton iterates for a large number of unknown model parameters and provides unbiased estimates. The methods are matrix-free and do not require derivatives or adjoints, and are thus ideal for complex models and black-box implementation. We also incorporate randomized least-squares solvers and data-reduction methods, which speed up computation and simulate missing data points. The new inverse methodology is coded in Julia and implemented in the MADS computational framework (http://mads.lanl.gov). Julia is an advanced high-level scientific programming language that allows for efficient memory management and utilization of high-performance computational resources. Inversion results based on a series of synthetic problems with steady-state and transient calibration data are presented.
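    The randomized matrix algebra alluded to above can be sketched with the standard randomized range-finder recipe (Halko, Martinsson & Tropp): sample the column space of the matrix with a Gaussian sketch, orthogonalize, then do a small dense SVD in that subspace. The low-rank test matrix below is a made-up stand-in, not anything from the QLGA application.

    ```python
    import numpy as np

    def randomized_svd(A, rank, oversample=10, seed=0):
        # Range finder: a Gaussian sketch captures the column space of a
        # (numerically) low-rank matrix with high probability.
        rng = np.random.default_rng(seed)
        Y = A @ rng.standard_normal((A.shape[1], rank + oversample))
        Q, _ = np.linalg.qr(Y)
        # Project onto the small subspace and run a dense SVD there.
        U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
        return (Q @ U_small)[:, :rank], s[:rank], Vt[:rank]

    # Hypothetical rank-8 matrix standing in for a large dense operator.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((300, 8)) @ rng.standard_normal((8, 200))

    U, s, Vt = randomized_svd(A, rank=8)
    err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
    ```

    The method never needs A explicitly, only products A @ G, which is what makes such sketches attractive for matrix-free, black-box inverse problems.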

  2. Power-law exponent of the Bouchaud-Mézard model on regular random networks

    NASA Astrophysics Data System (ADS)

    Ichinomiya, Takashi

    2013-07-01

    We study the Bouchaud-Mézard model on a regular random network. By assuming adiabaticity and independence, and utilizing the generalized central limit theorem and the Tauberian theorem, we derive an equation that determines the exponent of the probability distribution function of the wealth as x→∞. The analysis shows that the exponent can be smaller than 2, while a mean-field analysis always gives an exponent larger than 2. The results of our analysis are shown to be in good agreement with those of the numerical simulations.
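    The model itself is simple to simulate: each node's wealth relaxes toward its neighbours' while receiving multiplicative Gaussian noise. Below is a hedged Euler-type sketch on a ring, used here only as a convenient degree-2 regular graph rather than the paper's ensemble of regular random networks, with coupling J and noise σ chosen arbitrarily; it illustrates the dynamics, not the exponent derivation.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N, J, sigma, dt, steps = 2000, 0.1, 0.3, 0.01, 5000
    w = np.ones(N)                      # initial wealth

    for _ in range(steps):
        # Exchange with the two ring neighbours plus multiplicative noise
        # (an explicit Euler-Maruyama step; the floor keeps wealth positive).
        neigh = np.roll(w, 1) + np.roll(w, -1)
        dw = (J * (neigh - 2 * w) * dt
              + sigma * w * np.sqrt(dt) * rng.standard_normal(N))
        w = np.maximum(w + dw, 1e-12)

    w_norm = w / w.mean()               # normalized wealth distribution
    ```

    The tail exponent of `w_norm` is what the paper's adiabatic/independence analysis characterizes; extracting it reliably would require longer runs and averaging over network realizations.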

  3. The Remle Project: A Study Utilizing the iPad with Families of Individuals Diagnosed with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Vasquez-Terry, Teresa LaDoan

    2013-01-01

    The purpose of The REMLE Project was to develop a best practices model for using the iPad as an assistive technology device with families of individuals with Autism Spectrum Disorder. Implementation of a double-blind, randomized control trial during a six-week intervention utilizing the iPad was measured for effectiveness in empowerment, social…

  4. Expected Power-Utility Maximization Under Incomplete Information and with Cox-Process Observations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fujimoto, Kazufumi, E-mail: m_fuji@kvj.biglobe.ne.jp; Nagai, Hideo, E-mail: nagai@sigmath.es.osaka-u.ac.jp; Runggaldier, Wolfgang J., E-mail: runggal@math.unipd.it

    2013-02-15

    We consider the problem of maximization of expected terminal power utility (risk-sensitive criterion). The underlying market model is a regime-switching diffusion model where the regime is determined by an unobservable factor process forming a finite-state Markov process. The main novelty is due to the fact that prices are observed and the portfolio is rebalanced only at random times corresponding to a Cox process whose intensity is driven by the unobserved Markovian factor process as well. This leads to more realistic modeling for many practical situations, as in markets with liquidity restrictions; on the other hand, it considerably complicates the problem, to the point that traditional methodologies cannot be directly applied. The approach presented here is specific to the power utility. For log-utilities a different approach is presented in Fujimoto et al. (Preprint, 2012).

  5. A method for reducing the order of nonlinear dynamic systems

    NASA Astrophysics Data System (ADS)

    Masri, S. F.; Miller, R. K.; Sassi, H.; Caughey, T. K.

    1984-06-01

    An approximate method is presented for reducing the order of discrete multidegree-of-freedom dynamic systems that possess arbitrary nonlinear characteristics; it combines conventional condensation techniques for linear systems with nonparametric identification of the reduced-order model's generalized nonlinear restoring forces. The utility of the proposed method is demonstrated by considering a redundant three-dimensional finite-element model, half of whose elements incorporate hysteretic properties. A nonlinear reduced-order model, of one-third the order of the original model, is developed on the basis of wideband stationary random excitation, and the validity of the reduced-order model is subsequently demonstrated by its ability to predict with adequate accuracy the transient response of the original nonlinear model under a different nonstationary random excitation.

  6. Breeding value accuracy estimates for growth traits using random regression and multi-trait models in Nelore cattle.

    PubMed

    Boligon, A A; Baldi, F; Mercadante, M E Z; Lobo, R B; Pereira, R J; Albuquerque, L G

    2011-06-28

    We quantified the potential increase in accuracy of expected breeding value for weights of Nelore cattle, from birth to mature age, using multi-trait and random regression models on Legendre polynomials and B-spline functions. A total of 87,712 weight records from 8144 females were used, recorded every three months from birth to mature age from the Nelore Brazil Program. For random regression analyses, all female weight records from birth to eight years of age (data set I) were considered. From this general data set, a subset was created (data set II), which included only nine weight records: at birth, weaning, 365 and 550 days of age, and 2, 3, 4, 5, and 6 years of age. Data set II was analyzed using random regression and multi-trait models. The model of analysis included the contemporary group as fixed effects and age of dam as a linear and quadratic covariable. In the random regression analyses, average growth trends were modeled using a cubic regression on orthogonal polynomials of age. Residual variances were modeled by a step function with five classes. Legendre polynomials of fourth and sixth order were utilized to model the direct genetic and animal permanent environmental effects, respectively, while third-order Legendre polynomials were considered for maternal genetic and maternal permanent environmental effects. Quadratic polynomials were applied to model all random effects in random regression models on B-spline functions. Direct genetic and animal permanent environmental effects were modeled using three segments or five coefficients, and genetic maternal and maternal permanent environmental effects were modeled with one segment or three coefficients in the random regression models on B-spline functions. For both data sets (I and II), animals ranked differently according to expected breeding value obtained by random regression or multi-trait models. 
With random regression models, the highest gains in accuracy were obtained at ages with a low number of weight records. The results indicate that random regression models provide more accurate expected breeding values than the traditional multi-trait models. Thus, higher genetic responses are expected for beef cattle growth traits by replacing a multi-trait model with random regression models for genetic evaluation. B-spline functions could be applied as an alternative to Legendre polynomials to model covariance functions for weights from birth to mature age.
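The Legendre-polynomial covariates used in such random regression models are straightforward to construct: standardize age to [−1, 1] over the observed range, then evaluate the first k polynomials at each record's age. The ages and polynomial order below are illustrative, not the study's actual design.

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_design(age, k, age_min, age_max):
    # Map age linearly onto [-1, 1], then evaluate Legendre polynomials
    # P_0 .. P_{k-1} -- the usual covariate matrix whose random
    # coefficients per animal define the random regression model.
    t = -1.0 + 2.0 * (np.asarray(age, float) - age_min) / (age_max - age_min)
    return np.column_stack([legendre.legval(t, np.eye(k)[j]) for j in range(k)])

# Hypothetical ages in days, from birth to roughly 8 years.
ages = [0, 240, 550, 1095, 2920]
Phi = legendre_design(ages, k=4, age_min=0, age_max=2920)
```

Each animal then gets its own random coefficient vector multiplying the columns of `Phi`, so genetic (co)variances at any pair of ages follow from the coefficient covariance matrix and the basis values.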

  7. Predictive Utility of Personality Disorder in Depression: Comparison of Outcomes and Taxonomic Approach.

    PubMed

    Newton-Howes, Giles; Mulder, Roger; Ellis, Pete M; Boden, Joseph M; Joyce, Peter

    2017-09-19

    There is debate around the best model for diagnosing personality disorder, both in terms of its relationship to the empirical data and clinical utility. Four randomized controlled trials examining various treatments for depression were analyzed at an individual patient level. Three different approaches to the diagnosis of personality disorder were analyzed in these patients. A total of 578 depressed patients were included in the analysis. Personality disorder, however measured, was of little predictive utility in the short term but added significantly to predictive modelling of medium-term outcomes, accounting for more than twice as much of the variance in social functioning outcome as depression psychopathology. Personality disorder assessment is of predictive utility with longer timeframes and when considering social outcomes as opposed to symptom counts. This utility is sufficiently great that there appears to be value in assessing personality; however, no particular approach outperforms any other.

  8. The episodic random utility model unifies time trade-off and discrete choice approaches in health state valuation

    PubMed Central

    Craig, Benjamin M; Busschbach, Jan JV

    2009-01-01

    Background To present an episodic random utility model that unifies time trade-off and discrete choice approaches in health state valuation. Methods First, we introduce two alternative random utility models (RUMs) for health preferences: the episodic RUM and the more common instant RUM. For the interpretation of time trade-off (TTO) responses, we show that the episodic model implies a coefficient estimator, and the instant model implies a mean slope estimator. Secondly, we demonstrate these estimators and the differences between the estimates for 42 health states using TTO responses from the seminal Measurement and Valuation in Health (MVH) study conducted in the United Kingdom. Mean slopes are estimated with and without Dolan's transformation of worse-than-death (WTD) responses. Finally, we demonstrate an exploded probit estimator, an extension of the coefficient estimator for discrete choice data that accommodates both TTO and rank responses. Results By construction, mean slopes are less than or equal to coefficients, because slopes are fractions and, therefore, magnify downward errors in WTD responses. The Dolan transformation of WTD responses causes mean slopes to increase in similarity to coefficient estimates, yet they are not equivalent (i.e., absolute mean difference = 0.179). Unlike mean slopes, coefficient estimates demonstrate strong concordance with rank-based predictions (Lin's rho = 0.91). Combining TTO and rank responses under the exploded probit model improves the identification of health state values, decreasing the average width of confidence intervals from 0.057 to 0.041 compared to TTO only results. Conclusion The episodic RUM expands upon the theoretical framework underlying health state valuation and contributes to health econometrics by motivating the selection of coefficient and exploded probit estimators for the analysis of TTO and rank responses. 
In future MVH surveys, sample size requirements may be reduced through the incorporation of multiple responses under a single estimator. PMID:19144115
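The coefficient and mean-slope estimators contrasted above reduce, for TTO-style data, to a no-intercept regression slope versus an average of per-response slopes. The toy numbers below are fabricated solely to show the mechanism the abstract describes: one downward error at a small denominator drags the mean slope below the coefficient.

```python
def coefficient_estimator(x, y):
    # Slope of a regression through the origin: sum(x*y) / sum(x^2).
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def mean_slope_estimator(x, y):
    # Average of the per-response slopes y_i / x_i.
    return sum(b / a for a, b in zip(x, y)) / len(x)

# Hypothetical responses: x = time offered, y = time traded off; the
# last response has a small denominator and a downward (WTD-like) error.
x = [10.0, 10.0, 10.0, 1.0]
y = [7.0, 6.5, 7.5, -0.4]

coef = coefficient_estimator(x, y)     # 209.6 / 301 ~ 0.696
slope = mean_slope_estimator(x, y)     # 1.7 / 4 = 0.425
```

The coefficient weights each response by x², so the small-x error barely moves it, while the mean slope gives that error full weight; this is the "slopes magnify downward errors" point in the abstract.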

  9. The Trail Making test: a study of its ability to predict falls in the acute neurological in-patient population.

    PubMed

    Mateen, Bilal Akhter; Bussas, Matthias; Doogan, Catherine; Waller, Denise; Saverino, Alessia; Király, Franz J; Playford, E Diane

    2018-05-01

    To determine whether tests of cognitive function and patient-reported outcome measures of motor function can be used to create a machine learning-based predictive tool for falls. Prospective cohort study. Tertiary neurological and neurosurgical center. In all, 337 in-patients receiving neurosurgical, neurological, or neurorehabilitation-based care. Binary (Y/N) outcome for falling during the in-patient episode, the Trail Making Test (a measure of attention and executive function), and the Walk-12 (a patient-reported measure of physical function). The principal outcome was a fall during the in-patient stay (n = 54). The Trail Making Test was identified as the best predictor of falls. Moreover, the addition of other variables did not improve the prediction (Wilcoxon signed-rank P < 0.001). Classical linear statistical modeling methods were then compared with more recent machine learning-based strategies, for example, random forests, neural networks, and support vector machines. The random forest was the best modeling strategy when utilizing just the Trail Making Test data (Wilcoxon signed-rank P < 0.001), with 68% (± 7.7) sensitivity and 90% (± 2.3) specificity. This study identifies a simple yet powerful machine learning (random forest) based predictive model for an in-patient neurological population, utilizing a single neuropsychological test of cognitive function, the Trail Making Test.
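
    The sensitivity and specificity figures reported above come from a confusion matrix; a minimal sketch of that calculation follows, using synthetic labels and predictions rather than the study's data.

```python
# Synthetic fall labels (1 = fell) and model predictions, not the study data.
actual    = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
predicted = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0]

tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))
fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))
tn = sum(a == 0 and p == 0 for a, p in zip(actual, predicted))
fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))

sensitivity = tp / (tp + fn)   # fraction of actual fallers flagged
specificity = tn / (tn + fp)   # fraction of non-fallers correctly cleared
print(sensitivity, specificity)  # → 0.6 0.9
```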

  10. A Fast Variational Approach for Learning Markov Random Field Language Models

    DTIC Science & Technology

    2015-01-01

    the same distribution as n-gram models, but utilize a non-linear neural network parameterization. NLMs have been shown to produce competitive...to either resort to local optimization methods, such as those used in neural language models, or work with heavily constrained distributions. In...embeddings learned through neural language models. Central to the language modelling problem is the challenge... Proceedings of the 32nd International

  11. Evaluating gambles using dynamics

    NASA Astrophysics Data System (ADS)

    Peters, O.; Gell-Mann, M.

    2016-02-01

    Gambles are random variables that model possible changes in wealth. Classic decision theory transforms money into utility through a utility function and defines the value of a gamble as the expectation value of utility changes. Utility functions aim to capture individual psychological characteristics, but their generality limits predictive power. Expectation value maximizers are defined as rational in economics, but expectation values are only meaningful in the presence of ensembles or in systems with ergodic properties, whereas decision-makers have no access to ensembles, and the variables representing wealth in the usual growth models do not have the relevant ergodic properties. Simultaneously addressing the shortcomings of utility and those of expectations, we propose to evaluate gambles by averaging wealth growth over time. No utility function is needed, but a dynamic must be specified to compute time averages. Linear and logarithmic "utility functions" appear as transformations that generate ergodic observables for purely additive and purely multiplicative dynamics, respectively. We highlight inconsistencies throughout the development of decision theory, whose correction clarifies that our perspective is legitimate. These inconsistencies invalidate a commonly cited argument for bounded utility functions.
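
    The distinction between expectation values and time averages can be sketched with a standard illustrative gamble (not the authors' specific figures): a 50/50 chance of multiplying wealth by 1.5 or by 0.6. Its expected growth factor exceeds 1, yet its time-average growth rate under multiplicative dynamics is negative.

```python
import math

# A 50/50 multiplicative gamble: wealth is multiplied by 1.5 or 0.6.
outcomes = [1.5, 0.6]
probs = [0.5, 0.5]

# Ensemble (expectation-value) growth factor per round: 1.05 > 1.
expected_factor = sum(p * x for p, x in zip(probs, outcomes))

# Time-average growth rate under multiplicative dynamics: the expected log
# growth factor (the "logarithmic utility" transformation), which is < 0.
time_avg_rate = sum(p * math.log(x) for p, x in zip(probs, outcomes))

print(expected_factor, time_avg_rate)
```

    An expectation-value maximizer accepts this gamble; a time-average maximizer declines it, since repeated play almost surely shrinks wealth.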

  12. Absolute nuclear material assay

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2012-05-15

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  13. Absolute nuclear material assay

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2010-07-13

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  14. Geometric Modeling of Inclusions as Ellipsoids

    NASA Technical Reports Server (NTRS)

    Bonacuse, Peter J.

    2008-01-01

    Nonmetallic inclusions in gas turbine disk alloys can have a significant detrimental impact on fatigue life. Because large inclusions that lead to anomalously low lives occur infrequently, probabilistic approaches can be utilized to avoid the excessively conservative assumption of lifing to a large inclusion in a high stress location. A prerequisite to modeling the impact of inclusions on the fatigue life distribution is a characterization of the inclusion occurrence rate and size distribution. To help facilitate this process, a geometric simulation of the inclusions was devised. To make the simulation problem tractable, the irregularly sized and shaped inclusions were modeled as arbitrarily oriented ellipsoids with three independently dimensioned axes. Random orientation of the ellipsoid is accomplished through a series of three orthogonal rotations of axes. In this report, a set of mathematical models for the following parameters are described: the intercepted area of a randomly sectioned ellipsoid, the dimensions and orientation of the intercepted ellipse, the area of a randomly oriented sectioned ellipse, the depth and width of a randomly oriented sectioned ellipse, and the projected area of a randomly oriented ellipsoid. These parameters are necessary to determine an inclusion's potential to develop a propagating fatigue crack. Without these mathematical models, computationally expensive search algorithms would be required to compute these parameters.
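
    The random-orientation step described above (three successive rotations about orthogonal axes) can be sketched as follows; the angle distribution and composition order are illustrative assumptions, not the report's exact formulation.

```python
import math
import random

# Elementary rotation matrices about the x, y, and z axes.
def rot_x(a): return [[1, 0, 0], [0, math.cos(a), -math.sin(a)], [0, math.sin(a), math.cos(a)]]
def rot_y(a): return [[math.cos(a), 0, math.sin(a)], [0, 1, 0], [-math.sin(a), 0, math.cos(a)]]
def rot_z(a): return [[math.cos(a), -math.sin(a), 0], [math.sin(a), math.cos(a), 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

# Compose three random orthogonal rotations into one orientation matrix R.
random.seed(0)
R = matmul(rot_z(random.uniform(0, 2 * math.pi)),
           matmul(rot_y(random.uniform(0, 2 * math.pi)),
                  rot_x(random.uniform(0, 2 * math.pi))))

# A proper rotation matrix has determinant +1 (no scaling or reflection).
det = (R[0][0] * (R[1][1] * R[2][2] - R[1][2] * R[2][1])
     - R[0][1] * (R[1][0] * R[2][2] - R[1][2] * R[2][0])
     + R[0][2] * (R[1][0] * R[2][1] - R[1][1] * R[2][0]))
print(round(det, 6))  # → 1.0
```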

  15. Utility-based designs for randomized comparative trials with categorical outcomes

    PubMed Central

    Murray, Thomas A.; Thall, Peter F.; Yuan, Ying

    2016-01-01

    A general utility-based testing methodology for design and conduct of randomized comparative clinical trials with categorical outcomes is presented. Numerical utilities of all elementary events are elicited to quantify their desirabilities. These numerical values are used to map the categorical outcome probability vector of each treatment to a mean utility, which is used as a one-dimensional criterion for constructing comparative tests. Bayesian tests are presented, including fixed sample and group sequential procedures, assuming Dirichlet-multinomial models for the priors and likelihoods. Guidelines are provided for establishing priors, eliciting utilities, and specifying hypotheses. Efficient posterior computation is discussed, and algorithms are provided for jointly calibrating test cutoffs and sample size to control overall type I error and achieve specified power. Asymptotic approximations for the power curve are used to initialize the algorithms. The methodology is applied to re-design a completed trial that compared two chemotherapy regimens for chronic lymphocytic leukemia, in which an ordinal efficacy outcome was dichotomized and toxicity was ignored to construct the trial’s design. The Bayesian tests also are illustrated by several types of categorical outcomes arising in common clinical settings. Freely available computer software for implementation is provided. PMID:27189672
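
    The core mapping described above, from a categorical outcome probability vector to a one-dimensional mean utility under a Dirichlet-multinomial model, can be sketched as follows. The elicited utilities, prior, and outcome counts are all hypothetical.

```python
# Hypothetical elicited utilities for four elementary outcomes (0-100 scale).
utilities = {"response_no_tox": 100, "response_tox": 60,
             "no_response_no_tox": 40, "no_response_tox": 0}
outcomes = list(utilities)

prior = {k: 0.25 for k in outcomes}   # symmetric Dirichlet prior (illustrative)
counts_A = {"response_no_tox": 30, "response_tox": 10,
            "no_response_no_tox": 15, "no_response_tox": 5}
counts_B = {"response_no_tox": 20, "response_tox": 8,
            "no_response_no_tox": 25, "no_response_tox": 7}

def mean_utility(counts):
    # Posterior mean of the outcome probability vector under a
    # Dirichlet-multinomial model, mapped to a single mean utility.
    alpha = {k: prior[k] + counts[k] for k in outcomes}
    total = sum(alpha.values())
    return sum(utilities[k] * alpha[k] / total for k in outcomes)

print(round(mean_utility(counts_A), 2), round(mean_utility(counts_B), 2))
```

    The one-dimensional mean utilities then serve as the comparison criterion between the two treatment arms.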

  16. Measuring CAMD technique performance. 2. How "druglike" are drugs? Implications of Random test set selection exemplified using druglikeness classification models.

    PubMed

    Good, Andrew C; Hermsmeier, Mark A

    2007-01-01

    Research into the advancement of computer-aided molecular design (CAMD) has a tendency to focus on the discipline of algorithm development. Such efforts are often wrought to the detriment of the data set selection and analysis used in said algorithm validation. Here we highlight the potential problems this can cause in the context of druglikeness classification. More rigorous efforts are applied to the selection of decoy (nondruglike) molecules from the ACD. Comparisons are made between model performance using the standard technique of random test set creation with test sets derived from explicit ontological separation by drug class. The dangers of viewing druglike space as sufficiently coherent to permit simple classification are highlighted. In addition the issues inherent in applying unfiltered data and random test set selection to (Q)SAR models utilizing large and supposedly heterogeneous databases are discussed.

  17. Aggregate modeling of fast-acting demand response and control under real-time pricing

    DOE PAGES

    Chassin, David P.; Rondeau, Daniel

    2016-08-24

    This paper develops and assesses the performance of a short-term demand response (DR) model for utility load control with applications to resource planning and control design. Long term response models tend to underestimate short-term demand response when induced by prices. This has two important consequences. First, planning studies tend to undervalue DR and often overlook its benefits in utility demand management program development. Second, when DR is not overlooked, the open-loop DR control gain estimate may be too low. This can result in overuse of load resources, control instability and excessive price volatility. Our objective is therefore to develop a more accurate and better performing short-term demand response model. We construct the model from first principles about the nature of thermostatic load control and show that the resulting formulation corresponds exactly to the Random Utility Model employed in economics to study consumer choice. The model is tested against empirical data collected from field demonstration projects and is shown to perform better than alternative models commonly used to forecast demand in normal operating conditions. Finally, the results suggest that (1) existing utility tariffs appear to be inadequate to incentivize demand response, particularly in the presence of high renewables, and (2) existing load control systems run the risk of becoming unstable if utilities close the loop on real-time prices.
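
    Under a random utility model of a binary curtail/consume choice, the aggregate curtailment share takes a logistic form in price. A minimal sketch follows, where the price sensitivity eta and the indifference price p0 are hypothetical parameters, not values estimated in the paper.

```python
import math

def curtailment_share(price, eta=0.8, p0=50.0):
    """Fraction of loads curtailing at a given real-time price.

    Binomial-logit form implied by a random utility model; eta is the
    (hypothetical) price sensitivity, p0 the (hypothetical) indifference price.
    """
    return 1.0 / (1.0 + math.exp(-eta * (price - p0)))

for p in (40, 50, 60):
    print(p, round(curtailment_share(p), 3))
```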

  1. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2012-06-05

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  2. GAMETES: a fast, direct algorithm for generating pure, strict, epistatic models with random architectures.

    PubMed

    Urbanowicz, Ryan J; Kiralis, Jeff; Sinnott-Armstrong, Nicholas A; Heberling, Tamra; Fisher, Jonathan M; Moore, Jason H

    2012-10-01

    Geneticists who look beyond single locus disease associations require additional strategies for the detection of complex multi-locus effects. Epistasis, a multi-locus masking effect, presents a particular challenge, and has been the target of bioinformatic development. Thorough evaluation of new algorithms calls for simulation studies in which known disease models are sought. To date, the best methods for generating simulated multi-locus epistatic models rely on genetic algorithms. However, such methods are computationally expensive, difficult to adapt to multiple objectives, and unlikely to yield models with a precise form of epistasis which we refer to as pure and strict. Purely and strictly epistatic models constitute the worst-case in terms of detecting disease associations, since such associations may only be observed if all n-loci are included in the disease model. This makes them an attractive gold standard for simulation studies considering complex multi-locus effects. We introduce GAMETES, a user-friendly software package and algorithm which generates complex biallelic single nucleotide polymorphism (SNP) disease models for simulation studies. GAMETES rapidly and precisely generates random, pure, strict n-locus models with specified genetic constraints. These constraints include heritability, minor allele frequencies of the SNPs, and population prevalence. GAMETES also includes a simple dataset simulation strategy which may be utilized to rapidly generate an archive of simulated datasets for given genetic models. We highlight the utility and limitations of GAMETES with an example simulation study using MDR, an algorithm designed to detect epistasis. GAMETES is a fast, flexible, and precise tool for generating complex n-locus models with random architectures. 
While GAMETES has a limited ability to generate models with higher heritabilities, it is proficient at generating the lower heritability models typically used in simulation studies evaluating new algorithms. In addition, the GAMETES modeling strategy may be flexibly combined with any dataset simulation strategy. Beyond dataset simulation, GAMETES could be employed to pursue theoretical characterization of genetic models and epistasis.
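
    The pure, strict form of epistasis described above can be illustrated with an XOR-style two-locus penetrance table in which neither locus shows a marginal effect. This is not the GAMETES algorithm itself; the allele frequencies and penetrance values are illustrative.

```python
# Two biallelic SNPs, each with minor allele frequency 0.5, under
# Hardy-Weinberg equilibrium; genotype order: aa, Aa, AA.
p = 0.5
geno_freq = [p * p, 2 * p * (1 - p), (1 - p) * (1 - p)]   # [0.25, 0.5, 0.25]

# XOR-style penetrance table f(g1, g2): elevated risk only for "mixed"
# genotype combinations (illustrative values).
pen = [[0.0, 0.1, 0.0],
       [0.1, 0.0, 0.1],
       [0.0, 0.1, 0.0]]

# Marginal penetrance of locus 1 for each of its genotypes. In a pure,
# strict model this is constant: no single-locus association is visible.
marg1 = [sum(geno_freq[g2] * pen[g1][g2] for g2 in range(3)) for g1 in range(3)]
print(marg1)
```

    Because the marginal penetrances are identical, an association can only be detected when both loci enter the model jointly, which is what makes such models a worst-case gold standard.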

  3. MSC/NASTRAN Stress Analysis of Complete Models Subjected to Random and Quasi-Static Loads

    NASA Technical Reports Server (NTRS)

    Hampton, Roy W.

    2000-01-01

    Space payloads, such as those which fly on the Space Shuttle in Spacelab, are designed to withstand dynamic loads which consist of combined acoustic random loads and quasi-static acceleration loads. Methods for computing the payload stresses due to these loads are well known and appear in texts and NASA documents, but typically involve approximations such as the Miles' equation, as well as possible adjustments based on "modal participation factors." Alternatively, an existing capability in MSC/NASTRAN may be used to output exact root mean square [rms] stresses due to the random loads for any specified elements in the Finite Element Model. However, it is time consuming to use this methodology to obtain the rms stresses for the complete structural model and then combine them with the quasi-static loading induced stresses. Special processing was developed as described here to perform the stress analysis of all elements in the model using existing MSC/NASTRAN and MSC/PATRAN and UNIX utilities. Fail-safe and buckling analyses applications are also described.
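
    Miles' equation, mentioned above as the usual approximation, estimates the RMS response of a single-degree-of-freedom system to broadband random excitation. A sketch with illustrative inputs:

```python
import math

def miles_grms(f_n, Q, asd):
    """Miles' equation: RMS acceleration response of a SDOF system.

    f_n: natural frequency [Hz]; Q: amplification factor;
    asd: input acceleration spectral density at f_n [g^2/Hz].
    """
    return math.sqrt((math.pi / 2.0) * f_n * Q * asd)

# Illustrative inputs, not values from any specific payload analysis.
g_rms = miles_grms(f_n=100.0, Q=10.0, asd=0.04)
print(round(g_rms, 2))
```

    A common design practice is then to size for a 3-sigma load, i.e. three times this RMS value.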

  4. Nurses wanted Is the job too harsh or is the wage too low?

    PubMed

    Di Tommaso, M L; Strøm, S; Saether, E M

    2009-05-01

    When entering the job market, nurses choose among different kinds of jobs. Each of these jobs is characterized by wage, sector (primary care or hospital), and shift (daytime work or shift work). This paper estimates a multi-sector-job-type random utility model of labor supply on data for Norwegian registered nurses (RNs) in 2000. The empirical model implies that labor supply is rather inelastic; a 10% increase in the wage rates for all nurses is estimated to yield a 3.3% increase in overall labor supply. This modest overall response masks much stronger inter-job-type responses. Our approach differs from previous studies in two ways. First, to our knowledge, it is the first time that a model of labor supply for nurses has been estimated taking explicitly into account the choices that RNs have regarding work place and type of job. Second, it differs from previous studies with respect to the measurement of the compensation for different types of work. So far, the focus has been on wage differentials, but there are more attributes of a job than the wage. Based on the estimated random utility model, we therefore calculate the expected value of compensation that makes a utility-maximizing agent indifferent between types of jobs, here between shift work and daytime work. It turns out that Norwegian nurses working shifts may be willing to accept shift work, relative to daytime work, at a lower wage than the current one.
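
    The indifference calculation above can be sketched in closed form: with utility b_w·ln(wage) + b_shift·shift, the daytime wage that leaves a worker indifferent to a given shift wage is w_shift·exp(b_shift/b_w). The coefficients below are hypothetical, not the paper's estimates.

```python
import math

# Hypothetical utility coefficients: b_shift > 0 means this worker
# derives positive utility from shift work itself.
b_w, b_shift = 2.0, 0.3
w_shift = 30.0   # current shift-work wage (illustrative)

# Daytime wage solving b_w*ln(w_d) = b_w*ln(w_shift) + b_shift.
w_daytime_equiv = w_shift * math.exp(b_shift / b_w)
print(round(w_daytime_equiv, 2))
```

    Since the equivalent daytime wage exceeds the shift wage, this worker would accept shift work at a discount, mirroring the paper's finding.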

  5. Development of a transformation model to derive general population-based utility: Mapping the pruritus-visual analog scale (VAS) to the EQ-5D utility.

    PubMed

    Park, Sun-Young; Park, Eun-Ja; Suh, Hae Sun; Ha, Dongmun; Lee, Eui-Kyung

    2017-08-01

    Although nonpreference-based disease-specific measures are widely used in clinical studies, they cannot generate utilities for economic evaluation. A solution to this problem is to estimate utilities from disease-specific instruments using the mapping function. This study aimed to develop a transformation model for mapping the pruritus-visual analog scale (VAS) to the EuroQol 5-Dimension 3-Level (EQ-5D-3L) utility index in pruritus. A cross-sectional survey was conducted with a sample (n = 268) drawn from the general population of South Korea. Data were randomly divided into 2 groups, one for estimating and the other for validating mapping models. To select the best model, we developed and compared 3 separate models using demographic information and the pruritus-VAS as independent variables. The predictive performance was assessed using the mean absolute deviation and root mean square error in a separate dataset. Among the 3 models, model 2 using age, age squared, sex, and the pruritus-VAS as independent variables had the best performance based on the goodness of fit and model simplicity, with a log likelihood of 187.13. The 3 models had similar precision errors based on mean absolute deviation and root mean square error in the validation dataset. No statistically significant difference was observed between the mean observed and predicted values in all models. In conclusion, model 2 was chosen as the preferred mapping model. Outcomes measured as the pruritus-VAS can be transformed into the EQ-5D-3L utility index using this mapping model, which makes an economic evaluation possible when only pruritus-VAS data are available. © 2017 John Wiley & Sons, Ltd.
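
    The functional form of mapping model 2 is a linear index in age, age squared, sex, and the VAS score. A sketch follows; every coefficient is a hypothetical placeholder, not a fitted value from the paper.

```python
def predicted_utility(age, female, vas,
                      b0=0.95, b_age=-0.002, b_age2=1e-5, b_sex=-0.01, b_vas=-0.03):
    """Map pruritus-VAS to an EQ-5D-3L-style utility (hypothetical coefficients)."""
    return b0 + b_age * age + b_age2 * age ** 2 + b_sex * female + b_vas * vas

# Higher itch severity (VAS) should map to lower predicted utility.
print(round(predicted_utility(40, 1, 2.0), 3), round(predicted_utility(40, 1, 8.0), 3))
```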

  6. Healthcare utilization in adults with opioid dependence receiving extended release naltrexone compared to treatment as usual.

    PubMed

    Soares, William E; Wilson, Donna; Rathlev, Niels; Lee, Joshua D; Gordon, Michael; Nunes, Edward V; O'Brien, Charles P; Friedmann, Peter D

    2018-02-01

    Opioid use disorders have reached epidemic proportions, with overdose now the leading cause of accidental death in the United States. Extended release naltrexone (XR-NTX) has emerged as a medication treatment that reduces opioid use and craving. However, the effect of XR-NTX therapy on acute healthcare utilization, including emergency department visits and inpatient hospitalizations, remains uncertain. The objective of the current study is to evaluate hospital-based healthcare resource utilization in adults involved in the criminal justice system with a history of opioid use disorder randomized to XR-NTX therapy compared with treatment as usual (TAU) during a 6-month treatment phase and 12 months of post-treatment follow-up. This retrospective exploratory analysis uses data collected in a published randomized trial. Comparisons of the number of emergency department visits and hospital admissions (for drug detox, psychiatric care, and other medical reasons) were performed using chi-square tests for any admission and negative binomial models for number of admissions. Of the 308 participants randomized, 96% had utilization data (76% completed the 6-month treatment phase, 67% completed follow-up). No significant differences were seen in overall healthcare utilization (IRR=0.88, 95% CI 0.63-1.23, p=0.45) or substance use-related drug detox hospitalizations (IRR=0.83, 95% CI 0.32-2.16, p=0.71). Despite having more participants report chronic medical problems at baseline (43% vs. 32%, p=0.05), those receiving XR-NTX generally experienced equivalent or lower rates of healthcare utilization compared to TAU. The XR-NTX group had significantly lower medical/surgical-related hospital admissions (IRR=0.55, 95% CI 0.30-1.00, p=0.05) during the course of the entire study. XR-NTX did not significantly increase rates of healthcare utilization compared to TAU. Provider concerns regarding healthcare utilization should not preclude the consideration of XR-NTX as therapy for opioid use disorders. 
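
    An incidence rate ratio (IRR) of the kind reported above is the ratio of per-person-time event rates between arms. The counts below are hypothetical, chosen only to illustrate the arithmetic.

```python
# Hypothetical admission counts and person-time, not the trial's data.
events_xr, time_xr = 22, 400.0     # admissions and person-months, XR-NTX arm
events_tau, time_tau = 40, 400.0   # admissions and person-months, TAU arm

# IRR: rate in the treatment arm divided by rate in the comparison arm.
irr = (events_xr / time_xr) / (events_tau / time_tau)
print(round(irr, 2))  # an IRR below 1 indicates a lower admission rate
```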
Copyright © 2018 Elsevier Inc. All rights reserved.

  7. Impact of Contextual Factors on Prostate Cancer Risk and Outcomes

    DTIC Science & Technology

    2013-07-01

    framework with random effects (“frailty models”) while the case-control analyses (Aim 4) will use multilevel unconditional logistic regression models...contextual-level SES on prostate cancer risk within racial/ethnic groups. The survival analyses (Aims 1-3) will utilize a proportional hazards regression

  8. Television and Its News: A Discrepancy Examination.

    ERIC Educational Resources Information Center

    Melton, Gary Warren

    This exploratory endeavor utilized a functional discrepancy model of mass communication research to examine the audience experience with television generally and its news in particular. Specifically, gratifications sought from television in general and gratifications perceived as being obtained from television news are analyzed for a random sample…

  9. Practical Effects of Classwide Mathematics Intervention

    ERIC Educational Resources Information Center

    VanDerHeyden, Amanda M.; Codding, Robin S.

    2015-01-01

    The current article presents additional analyses of a classwide mathematics intervention, from a previously reported randomized controlled trial, to offer new information about the treatment and to demonstrate the utility of different types of effect sizes. Multilevel modeling was used to examine treatment effects by race, sex, socioeconomic…

  10. Fitting Nonlinear Ordinary Differential Equation Models with Random Effects and Unknown Initial Conditions Using the Stochastic Approximation Expectation-Maximization (SAEM) Algorithm.

    PubMed

    Chow, Sy-Miin; Lu, Zhaohua; Sherwood, Andrew; Zhu, Hongtu

    2016-03-01

    The past decade has evidenced the increased prevalence of irregularly spaced longitudinal data in social sciences. Clearly lacking, however, are modeling tools that allow researchers to fit dynamic models to irregularly spaced data, particularly data that show nonlinearity and heterogeneity in dynamical structures. We consider the issue of fitting multivariate nonlinear differential equation models with random effects and unknown initial conditions to irregularly spaced data. A stochastic approximation expectation-maximization algorithm is proposed and its performance is evaluated using a benchmark nonlinear dynamical systems model, namely, the Van der Pol oscillator equations. The empirical utility of the proposed technique is illustrated using a set of 24-h ambulatory cardiovascular data from 168 men and women. Pertinent methodological challenges and unresolved issues are discussed.
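
    The benchmark named above, the Van der Pol oscillator (dx/dt = y, dy/dt = mu·(1 − x²)·y − x), can be simulated with a simple Euler scheme. The step size, mu, and initial conditions are illustrative; the paper's SAEM estimation is not reproduced here.

```python
def van_der_pol(x0=2.0, y0=0.0, mu=1.0, dt=0.001, steps=20000):
    """Euler integration of the Van der Pol oscillator (illustrative parameters)."""
    x, y = x0, y0
    for _ in range(steps):
        # Simultaneous update: the y step uses the pre-update x.
        x, y = x + dt * y, y + dt * (mu * (1 - x * x) * y - x)
    return x, y

x, y = van_der_pol()
print(round(x, 3), round(y, 3))  # trajectory remains on the bounded limit cycle
```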

  12. Universal dispersion model for characterization of optical thin films over wide spectral range: Application to magnesium fluoride

    NASA Astrophysics Data System (ADS)

    Franta, Daniel; Nečas, David; Giglia, Angelo; Franta, Pavel; Ohlídal, Ivan

    2017-11-01

    Optical characterization of magnesium fluoride thin films is performed in a wide spectral range from the far infrared to the extreme ultraviolet (0.01-45 eV) utilizing the universal dispersion model. Two film defects, i.e., random roughness of the upper boundaries and a defect transition layer at the lower boundary, are taken into account. An extension of the universal dispersion model is introduced, expressing the excitonic contributions as linear combinations of Gaussian and truncated Lorentzian terms. The spectral dependencies of the optical constants are presented in graphical form and as the complete set of dispersion parameters, which allows tabulated optical constants to be generated with the required range and step using a simple utility in the newAD2 software package.

  13. A mixed-effects regression model for longitudinal multivariate ordinal data.

    PubMed

    Liu, Li C; Hedeker, Donald

    2006-03-01

    A mixed-effects item response theory model that allows for three-level multivariate ordinal outcomes and accommodates multiple random subject effects is proposed for analysis of multivariate ordinal outcomes in longitudinal studies. This model allows for the estimation of different item factor loadings (item discrimination parameters) for the multiple outcomes. The covariates in the model do not have to follow the proportional odds assumption and can be at any level. Assuming either a probit or logistic response function, maximum marginal likelihood estimation is proposed utilizing multidimensional Gauss-Hermite quadrature for integration of the random effects. An iterative Fisher scoring solution, which provides standard errors for all model parameters, is used. An analysis of a longitudinal substance use data set, where four items of substance use behavior (cigarette use, alcohol use, marijuana use, and getting drunk or high) are repeatedly measured over time, is used to illustrate application of the proposed model.

  14. A comparison of observation-level random effect and Beta-Binomial models for modelling overdispersion in Binomial data in ecology & evolution.

    PubMed

    Harrison, Xavier A

    2015-01-01

    Overdispersion is a common feature of models of biological data, but researchers often fail to model the excess variation driving the overdispersion, resulting in biased parameter estimates and standard errors. Quantifying and modeling overdispersion when it is present is therefore critical for robust biological inference. One means to account for overdispersion is to add an observation-level random effect (OLRE) to a model, where each data point receives a unique level of a random effect that can absorb the extra-parametric variation in the data. Although some studies have investigated the utility of OLRE to model overdispersion in Poisson count data, studies doing so for Binomial proportion data are scarce. Here I use a simulation approach to investigate the ability of both OLRE models and Beta-Binomial models to recover unbiased parameter estimates in mixed effects models of Binomial data under various degrees of overdispersion. In addition, as ecologists often fit random intercept terms to models when the random effect sample size is low (<5 levels), I investigate the performance of both model types under a range of random effect sample sizes when overdispersion is present. Simulation results revealed that the efficacy of OLRE depends on the process that generated the overdispersion; OLRE failed to cope with overdispersion generated from a Beta-Binomial mixture model, leading to biased slope and intercept estimates, but performed well for overdispersion generated by adding random noise to the linear predictor. Comparison of parameter estimates from an OLRE model with those from its corresponding Beta-Binomial model readily identified when OLRE were performing poorly due to disagreement between effect sizes, and this strategy should be employed whenever OLRE are used for Binomial data to assess their reliability. Beta-Binomial models performed well across all contexts, but showed a tendency to underestimate effect sizes when modelling non-Beta-Binomial data. 
Finally, both OLRE and Beta-Binomial models performed poorly when models contained <5 levels of the random intercept term, especially for estimating variance components, and this effect appeared independent of total sample size. These results suggest that OLRE are a useful tool for modelling overdispersion in Binomial data, but that they do not perform well in all circumstances and researchers should take care to verify the robustness of parameter estimates of OLRE models.
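    The Beta-Binomial route to overdispersion described above is easy to reproduce numerically. A minimal sketch (parameter values are invented for illustration; this is not the paper's simulation design):

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials, n_obs, p_mean = 20, 5000, 0.3

# Plain Binomial data: variance should match n*p*(1-p)
y_binom = rng.binomial(n_trials, p_mean, size=n_obs)

# Beta-Binomial mixture: each observation gets its own p drawn from a
# Beta distribution with mean p_mean and correlation parameter rho (assumed)
rho = 0.2
a = p_mean * (1 - rho) / rho
b = (1 - p_mean) * (1 - rho) / rho
y_bb = rng.binomial(n_trials, rng.beta(a, b, size=n_obs))

nominal_var = n_trials * p_mean * (1 - p_mean)
print(y_binom.var() / nominal_var)  # close to 1: no overdispersion
print(y_bb.var() / nominal_var)     # well above 1: overdispersed
```

    With these values the Beta-Binomial variance is inflated by roughly 1 + (n - 1)ρ relative to the nominal Binomial variance, which is the excess variation an OLRE or Beta-Binomial model must absorb.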

  15. Understanding agent-based models of financial markets: A bottom-up approach based on order parameters and phase diagrams

    NASA Astrophysics Data System (ADS)

    Lye, Ribin; Tan, James Peng Lung; Cheong, Siew Ann

    2012-11-01

    We describe a bottom-up framework, based on the identification of appropriate order parameters and determination of phase diagrams, for understanding progressively refined agent-based models and simulations of financial markets. We illustrate this framework by starting with a deterministic toy model, whereby N independent traders buy and sell M stocks through an order book that acts as a clearing house. The price of a stock increases whenever it is bought and decreases whenever it is sold. Price changes are updated by the order book before the next transaction takes place. In this deterministic model, all traders base their buy decisions on a call utility function, and all their sell decisions on a put utility function. We then make the agent-based model more realistic, by either having a fraction fb of traders buy a random stock on offer, or a fraction fs of traders sell a random stock in their portfolio. Based on our simulations, we find that it is possible to identify useful order parameters from the steady-state price distributions of all three models. Using these order parameters as a guide, we find three phases: (i) the dead market; (ii) the boom market; and (iii) the jammed market in the phase diagram of the deterministic model. Comparing the phase diagrams of the stochastic models against that of the deterministic model, we realize that the primary effect of stochasticity is to eliminate the dead market phase.
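    The order-book price dynamics described above can be sketched in a few lines. The sell rule and all parameters below are invented stand-ins (the paper's traders use call/put utility functions, which are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, steps, tick = 50, 5, 1000, 0.01
prices = np.ones(M)   # all stocks start at unit price
fb = 0.1              # fraction of traders that buy a random stock

for _ in range(steps):
    for _ in range(N):
        if rng.random() < fb:
            s = rng.integers(M)           # buy a random stock on offer
            prices[s] += tick             # price rises when a stock is bought
        else:
            s = int(np.argmax(prices))    # stand-in sell rule: sell the dearest
            prices[s] = max(prices[s] - tick, 0.0)  # price falls when sold
        # the order book updates prices before the next transaction

# with selling dominating, all prices decay toward zero
print(prices)
```

    In this toy parameterization selling dominates and prices collapse to zero, loosely analogous to the "dead market" phase identified in the deterministic model.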

  16. Kernel-Based Approximate Dynamic Programming Using Bellman Residual Elimination

    DTIC Science & Technology

    2010-02-01

    framework is the ability to utilize stochastic system models, thereby allowing the system to make sound decisions even if there is randomness in the system … approximate policy when a system model is unavailable. We present theoretical analysis of all BRE algorithms proving convergence to the optimal policy in … policies based on MDPs is that there may be parameters of the system model that are poorly known and/or vary with time as the system operates. System

  17. Mapping of the DLQI scores to EQ-5D utility values using ordinal logistic regression.

    PubMed

    Ali, Faraz Mahmood; Kay, Richard; Finlay, Andrew Y; Piguet, Vincent; Kupfer, Joerg; Dalgard, Florence; Salek, M Sam

    2017-11-01

    The Dermatology Life Quality Index (DLQI) and the European Quality of Life-5 Dimension (EQ-5D) are separate measures that may be used to gather health-related quality of life (HRQoL) information from patients. The EQ-5D is a generic measure from which health utility estimates can be derived, whereas the DLQI is a specialty-specific measure to assess HRQoL. To reduce the burden of multiple measures being administered and to enable a more disease-specific calculation of health utility estimates, we explored an established mathematical technique known as ordinal logistic regression (OLR) to develop an appropriate model to map DLQI data to EQ-5D-based health utility estimates. Retrospective data from 4010 patients were randomly divided five times into two groups for the derivation and testing of the mapping model. Split-half cross-validation was utilized, resulting in a total of ten ordinal logistic regression models for each of the five EQ-5D dimensions against age, sex, and all ten items of the DLQI. Using Monte Carlo simulation, predicted health utility estimates were derived and compared against those observed. This method was repeated for both OLR and a previously tested mapping methodology based on linear regression. The model was shown to be highly predictive, and its repeated fitting demonstrated a stable model using OLR as well as linear regression. The mean differences between OLR-predicted health utility estimates and observed health utility estimates ranged from 0.0024 to 0.0239 across the ten modeling exercises, with an average overall difference of 0.0120 (a 1.6% underestimate, not of clinical importance). The modeling framework developed in this study will enable researchers to calculate EQ-5D health utility estimates from a specialty-specific study population, reducing patient and economic burden.

  18. Mathematical models of cell factories: moving towards the core of industrial biotechnology.

    PubMed

    Cvijovic, Marija; Bordel, Sergio; Nielsen, Jens

    2011-09-01

    Industrial biotechnology involves the utilization of cell factories for the production of fuels and chemicals. Traditionally, the development of highly productive microbial strains has relied on random mutagenesis and screening. The development of predictive mathematical models provides a new paradigm for the rational design of cell factories. Instead of selecting among a set of strains resulting from random mutagenesis, mathematical models allow the researchers to predict in silico the outcomes of different genetic manipulations and engineer new strains by performing gene deletions or additions leading to a higher productivity of the desired chemicals. In this review we aim to summarize the main modelling approaches of biological processes and illustrate the particular applications that they have found in the field of industrial microbiology. © 2010 The Authors. Journal compilation © 2010 Society for Applied Microbiology and Blackwell Publishing Ltd.

  19. Radiation breakage of DNA: a model based on random-walk chromatin structure

    NASA Technical Reports Server (NTRS)

    Ponomarev, A. L.; Sachs, R. K.

    2001-01-01

    Monte Carlo computer software, called DNAbreak, has recently been developed to analyze observed non-random clustering of DNA double strand breaks in chromatin after exposure to densely ionizing radiation. The software models coarse-grained configurations of chromatin and radiation tracks, small-scale details being suppressed in order to obtain statistical results for larger scales, up to the size of a whole chromosome. We here give an analytic counterpart of the numerical model, useful for benchmarks, for elucidating the numerical results, for analyzing the assumptions of a more general but less mechanistic "randomly-located-clusters" formalism, and, potentially, for speeding up the calculations. The equations characterize multi-track DNA fragment-size distributions in terms of one-track action; an important step in extrapolating high-dose laboratory results to the much lower doses of main interest in environmental or occupational risk estimation. The approach can utilize the experimental information on DNA fragment-size distributions to draw inferences about large-scale chromatin geometry during cell-cycle interphase.

  20. Parsimonious Continuous Time Random Walk Models and Kurtosis for Diffusion in Magnetic Resonance of Biological Tissue

    NASA Astrophysics Data System (ADS)

    Ingo, Carson; Sui, Yi; Chen, Yufen; Parrish, Todd; Webb, Andrew; Ronen, Itamar

    2015-03-01

    In this paper, we provide a context for the modeling approaches that have been developed to describe non-Gaussian diffusion behavior, which is ubiquitous in diffusion weighted magnetic resonance imaging of water in biological tissue. Subsequently, we focus on the formalism of the continuous time random walk theory to extract properties of subdiffusion and superdiffusion through novel simplifications of the Mittag-Leffler function. For the case of time-fractional subdiffusion, we compute the kurtosis for the Mittag-Leffler function, which provides both a connection and physical context to the much-used approach of diffusional kurtosis imaging. We provide Monte Carlo simulations to illustrate the concepts of anomalous diffusion as stochastic processes of the random walk. Finally, we demonstrate the clinical utility of the Mittag-Leffler function as a model to describe tissue microstructure through estimations of subdiffusion and kurtosis with diffusion MRI measurements in the brain of a chronic ischemic stroke patient.

  1. A unified framework for mesh refinement in random and physical space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jing; Stinis, Panos

    In recent work we have shown how an accurate reduced model can be utilized to perform mesh refinement in random space. That work relied on the explicit knowledge of an accurate reduced model which is used to monitor the transfer of activity from the large to the small scales of the solution. Since this is not always available, we present in the current work a framework which shares the merits and basic idea of the previous approach but does not require an explicit knowledge of a reduced model. Moreover, the current framework can be applied for refinement in both random and physical space. In this manuscript we focus on the application to random space mesh refinement. We study examples of increasing difficulty (from ordinary to partial differential equations) which demonstrate the efficiency and versatility of our approach. We also provide some results from the application of the new framework to physical space mesh refinement.

  2. Utility of the sore throat pain model in a multiple-dose assessment of the acute analgesic flurbiprofen: a randomized controlled study.

    PubMed

    Schachtel, Bernard; Aspley, Sue; Shephard, Adrian; Shea, Timothy; Smith, Gary; Schachtel, Emily

    2014-07-03

    The sore throat pain model has been conducted by different clinical investigators to demonstrate the efficacy of acute analgesic drugs in single-dose randomized clinical trials. The model used here was designed to study the multiple-dose safety and efficacy of lozenges containing flurbiprofen at 8.75 mg. Adults (n=198) with moderate or severe acute sore throat and findings of pharyngitis on a Tonsillo-Pharyngitis Assessment (TPA) were randomly assigned to use either flurbiprofen 8.75 mg lozenges (n=101) or matching placebo lozenges (n=97) under double-blind conditions. Patients sucked one lozenge every three to six hours as needed, up to five lozenges per day, and rated symptoms on 100-mm scales: the Sore Throat Pain Intensity Scale (STPIS), the Difficulty Swallowing Scale (DSS), and the Swollen Throat Scale (SwoTS). Reductions in pain (lasting for three hours) and in difficulty swallowing and throat swelling (for four hours) were observed after a single dose of the flurbiprofen 8.75 mg lozenge (P<0.05 compared with placebo). After using multiple doses over 24 hours, flurbiprofen-treated patients experienced a 59% greater reduction in throat pain, 45% less difficulty swallowing, and 44% less throat swelling than placebo-treated patients (all P<0.01). There were no serious adverse events. Utilizing the sore throat pain model with multiple doses over 24 hours, flurbiprofen 8.75 mg lozenges were shown to be an effective, well-tolerated treatment for sore throat pain. Other pharmacologic actions (reduced difficulty swallowing and reduced throat swelling) and overall patient satisfaction from the flurbiprofen lozenges were also demonstrated in this multiple-dose implementation of the sore throat pain model. This trial was registered with ClinicalTrials.gov, registration number: NCT01048866, registration date: January 13, 2010.

  3. A value-based medicine analysis of ranibizumab for the treatment of subfoveal neovascular macular degeneration.

    PubMed

    Brown, Melissa M; Brown, Gary C; Brown, Heidi C; Peet, Jonathan

    2008-06-01

    To assess the conferred value and average cost-utility (cost-effectiveness) for intravitreal ranibizumab used to treat occult/minimally classic subfoveal choroidal neovascularization associated with age-related macular degeneration (AMD). Value-based medicine cost-utility analysis. MARINA (Minimally Classic/Occult Trial of the Anti-Vascular Endothelial Growth Factor Antibody Ranibizumab in the Treatment of Neovascular AMD) Study patients utilizing published primary data. Reference case, third-party insurer perspective, cost-utility analysis using 2006 United States dollars. Conferred value in the forms of (1) quality-adjusted life-years (QALYs) and (2) percent improvement in health-related quality of life. Cost-utility is expressed in terms of dollars expended per QALY gained. All outcomes are discounted at a 3% annual rate, as recommended by the Panel on Cost-effectiveness in Health and Medicine. Data are presented for the second-eye model, first-eye model, and combined model. Twenty-two intravitreal injections of 0.5 mg of ranibizumab administered over a 2-year period confer 1.039 QALYs, or a 15.8% improvement in quality of life for the 12-year period of the second-eye model reference case of occult/minimally classic age-related subfoveal choroidal neovascularization. The reference case treatment cost is $52652, and the cost-utility for the second-eye model is $50691/QALY. The quality-of-life gain from the first-eye model is 6.4% and the cost-utility is $123887/QALY, whereas the combined model, which most closely simulates clinical practice, yields a quality-of-life gain of 10.4% and a cost-utility of $74169/QALY. By conventional standards and the most commonly used second-eye and combined models, intravitreal ranibizumab administered for occult/minimally classic subfoveal choroidal neovascularization is a cost-effective therapy. 
Ranibizumab treatment confers considerably greater value than other neovascular macular degeneration pharmaceutical therapies that have been studied in randomized clinical trials.
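    The headline cost-utility figure follows directly from the reported cost and QALY gain; a quick arithmetic check using the numbers in the abstract:

```python
treatment_cost = 52652.0  # 2-year cost of 22 injections, 2006 USD
qalys_gained = 1.039      # discounted QALYs, second-eye model reference case

cost_utility = treatment_cost / qalys_gained
print(round(cost_utility))  # ≈ 50676, matching the reported $50691/QALY
                            # to the rounding of the published inputs
```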

  4. Vehicle track segmentation using higher order random fields

    DOE PAGES

    Quach, Tu -Thach

    2017-01-09

    Here, we present an approach to segment vehicle tracks in coherent change detection images, a product of combining two synthetic aperture radar images taken at different times. The approach uses multiscale higher order random field models to capture track statistics, such as curvatures and their parallel nature, that are not currently utilized in existing methods. These statistics are encoded as 3-by-3 patterns at different scales. The model can complete disconnected tracks often caused by sensor noise and various environmental effects. Coupling the model with a simple classifier, our approach is effective at segmenting salient tracks. We improve the F-measure on a standard vehicle track data set to 0.963, up from 0.897 obtained by the current state-of-the-art method.

  6. Crash Frequency Analysis Using Hurdle Models with Random Effects Considering Short-Term Panel Data

    PubMed Central

    Chen, Feng; Ma, Xiaoxiang; Chen, Suren; Yang, Lin

    2016-01-01

    Random effect panel data hurdle models are established to research the daily crash frequency on a mountainous section of highway I-70 in Colorado. Road Weather Information System (RWIS) real-time traffic, weather, and road surface condition data are merged into the models, which also incorporate road characteristics. The random effect hurdle negative binomial (REHNB) model is developed to study the daily crash frequency along with three other competing models. The proposed model accounts for the serial correlation of observations, the unbalanced panel-data structure, and dominating zeroes. Based on several statistical tests, the REHNB model is identified as the most appropriate one among the four candidate models for a typical mountainous highway. The results show that: (1) the presence of over-dispersion in the short-term crash frequency data is due to both excess zeros and unobserved heterogeneity in the crash data; and (2) the REHNB model is suitable for this type of data. Moreover, time-varying variables including weather conditions, road surface conditions, and traffic conditions are found to play important roles in crash frequency. Besides the methodological advancements, the proposed technique bears great potential for engineering applications to develop short-term crash frequency models by utilizing detailed field monitoring data from systems such as RWIS, which are becoming more accessible around the world. PMID:27792209

  7. Random regression models using Legendre orthogonal polynomials to evaluate the milk production of Alpine goats.

    PubMed

    Silva, F G; Torres, R A; Brito, L F; Euclydes, R F; Melo, A L P; Souza, N O; Ribeiro, J I; Rodrigues, M T

    2013-12-11

    The objective of this study was to identify the best random regression model using Legendre orthogonal polynomials to evaluate Alpine goats genetically and to estimate the parameters for test-day milk yield. We analyzed 20,710 test-day milk yield records of 667 goats from the Goat Sector of the Universidade Federal de Viçosa. The evaluated models combined distinct fitting orders for the fixed curve (2-5), the random genetic curve (1-7), and the permanent environmental curve (1-7), and a number of classes for residual variance (2, 4, 5, and 6). WOMBAT software was used for all genetic analyses. The best random regression model on Legendre orthogonal polynomials for genetic evaluation of test-day milk yield of Alpine goats considered a fixed curve of order 4, a curve of additive genetic effects of order 2, a curve of permanent environmental effects of order 7, and a minimum of 5 classes of residual variance, because it was the most parsimonious model among those that were equivalent to the complete model by the likelihood ratio test. Phenotypic variance and heritability were higher at the end of the lactation period, indicating that the length of lactation has more genetic components in relation to the production peak and persistence. It is very important that the evaluation utilizes the best combination of fixed, additive genetic, and permanent environmental regressions, and number of classes of heterogeneous residual variance for genetic evaluation using random regression models, thereby enhancing the precision and accuracy of the estimates of parameters and prediction of genetic values.
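    The Legendre covariates underlying such random regression models are straightforward to construct: days in milk are standardized to [-1, 1] and the normalized polynomials are evaluated there. A minimal sketch (the standardization range and normalization convention are assumptions):

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_covariates(dim, dim_min, dim_max, order):
    """Evaluate normalized Legendre polynomials 0..order-1 at test days."""
    x = -1.0 + 2.0 * (dim - dim_min) / (dim_max - dim_min)  # map to [-1, 1]
    cols = []
    for k in range(order):
        coef = np.zeros(order)
        coef[k] = 1.0
        # the factor sqrt((2k+1)/2) makes the basis orthonormal on [-1, 1]
        cols.append(np.sqrt((2 * k + 1) / 2.0) * legendre.legval(x, coef))
    return np.column_stack(cols)

# covariates for four test days between day 5 and day 305, order-4 curve
Z = legendre_covariates(np.array([5, 100, 200, 305]), 5, 305, order=4)
print(Z.shape)  # (4, 4): one row per test day, one column per polynomial
```

    These columns serve as the regressors on which the fixed, additive genetic, and permanent environmental curves are fitted.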

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    ALAM,TODD M.

    Monte Carlo simulations of phosphate tetrahedron connectivity distributions in alkali and alkaline earth phosphate glasses are reported. By utilizing a discrete bond model, the distribution of next-nearest neighbor connectivities between phosphate polyhedra for random, alternating and clustering bonding scenarios was evaluated as a function of the relative bond energy difference. The simulated distributions are compared to experimentally observed connectivities reported for solid-state two-dimensional exchange and double-quantum NMR experiments of phosphate glasses. These Monte Carlo simulations demonstrate that the polyhedron connectivity is best described by a random distribution in lithium phosphate and calcium phosphate glasses.
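    A toy version of a discrete bond model shows how a relative bond-energy difference pushes connectivity between the random, alternating, and clustering scenarios. The Metropolis acceptance rule and unit temperature below are assumptions for illustration, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

def unlike_bond_fraction(delta_e, n=100, attempts=40000):
    """Metropolis sampling of a 50/50 binary chain (kT = 1).
    delta_e > 0 penalizes like-like bonds (alternating tendency);
    delta_e < 0 favors them (clustering); delta_e = 0 is purely random."""
    chain = rng.permutation(np.array([0] * (n // 2) + [1] * (n // 2)))
    energy = lambda c: delta_e * np.sum(c[:-1] == c[1:])
    e = energy(chain)
    for _ in range(attempts):
        i, j = rng.integers(n, size=2)
        chain[i], chain[j] = chain[j], chain[i]      # propose a swap
        e_new = energy(chain)
        if e_new > e and rng.random() > np.exp(e - e_new):
            chain[i], chain[j] = chain[j], chain[i]  # reject: swap back
        else:
            e = e_new
    return float(np.mean(chain[:-1] != chain[1:]))

f_random = unlike_bond_fraction(0.0)        # ~0.5 unlike bonds
f_alternating = unlike_bond_fraction(2.0)   # unlike fraction near 1
f_clustering = unlike_bond_fraction(-2.0)   # unlike fraction near 0
print(f_random, f_alternating, f_clustering)
```

    The fraction of unlike next-nearest-neighbor bonds plays the role of the connectivity statistic compared against the 2D exchange and double-quantum NMR data.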

  9. Using a shared parameter mixture model to estimate change during treatment when termination is related to recovery speed.

    PubMed

    Gottfredson, Nisha C; Bauer, Daniel J; Baldwin, Scott A; Okiishi, John C

    2014-10-01

    This study demonstrates how to use a shared parameter mixture model (SPMM) in longitudinal psychotherapy studies to accommodate missingness that is due to a correlation between rate of improvement and termination of therapy. Traditional growth models assume that such a relationship does not exist (i.e., assume that data are missing at random) and produce biased results if this assumption is incorrect. We used longitudinal data from 4,676 patients enrolled in a naturalistic study of psychotherapy to compare results from a latent growth model and an SPMM. In this data set, estimates of the rate of improvement during therapy differed by 6.50%-6.66% across the two models, indicating that participants with steeper trajectories left psychotherapy earliest, thereby potentially biasing inference for the slope in the latent growth model. We conclude that reported estimates of change during therapy may be underestimated in naturalistic studies of therapy in which participants and their therapists determine the end of treatment. Because non-randomly missing data can also occur in randomized controlled trials or in observational studies of development, the utility of the SPMM extends beyond naturalistic psychotherapy data. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  10. Random crystal field effects on the integer and half-integer mixed-spin system

    NASA Astrophysics Data System (ADS)

    Yigit, Ali; Albayrak, Erhan

    2018-05-01

    In this work, we have focused on the random crystal field effects on the phase diagrams of the mixed spin-1 and spin-5/2 Ising system obtained by utilizing the exact recursion relations (ERR) on the Bethe lattice (BL). The distribution function P(Di) = pδ[Di - D(1 + α)] + (1 - p)δ[Di - D(1 - α)] is used to randomize the crystal field. The phase diagrams are found to exhibit second- and first-order phase transitions depending on the values of α, D and p. It is also observed that the model displays a tricritical point, an isolated point, a critical end point and three compensation temperatures for suitable values of the system parameters.
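    Sampling the quoted crystal-field distribution is a one-liner: each site receives D(1 + α) with probability p and D(1 - α) otherwise (the numeric values of D, α, and p below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)
D, alpha, p, n_sites = 1.0, 0.5, 0.3, 100000

# P(D_i) = p*delta[D_i - D(1+alpha)] + (1-p)*delta[D_i - D(1-alpha)]
D_i = np.where(rng.random(n_sites) < p, D * (1 + alpha), D * (1 - alpha))

print(D_i.mean())  # ≈ D*(1 + alpha*(2p - 1)) = 0.8 for these values
```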

  11. Health utility of patients with Crohn's disease and ulcerative colitis: a systematic review and meta-analysis.

    PubMed

    Malinowski, Krzysztof Piotr; Kawalec, Paweł

    2016-08-01

    The aim of this systematic review was to collect and summarize the current data on the utilities of patients with Crohn's disease (CD) and ulcerative colitis (UC). A meta-analysis of the obtained utilities was performed using a random-effects model and meta-regression by the disease type and severity. A bootstrap analysis was also performed, because it does not require assumptions about the distribution of the data. The highest utility among patients with CD and UC was observed when the diseases were in remission. The meta-regression analysis showed that both disease severity and the instrument/method/questionnaire used to obtain utilities were significant predictors of utility. Utility was the lowest for severe disease and the highest for disease in remission; the association was more notable in patients with CD than in those with UC. Expert commentary: The issue of patients' utility is important for healthcare decision makers, but it has not been fully investigated and requires further study.

  12. Valuing SF-6D Health States Using a Discrete Choice Experiment.

    PubMed

    Norman, Richard; Viney, Rosalie; Brazier, John; Burgess, Leonie; Cronin, Paula; King, Madeleine; Ratcliffe, Julie; Street, Deborah

    2014-08-01

    SF-6D utility weights are conventionally produced using a standard gamble (SG). SG-derived weights consistently demonstrate a floor effect not observed with other elicitation techniques. Recent advances in discrete choice methods have allowed estimation of utility weights. The objective was to produce Australian utility weights for the SF-6D and to explore the application of discrete choice experiment (DCE) methods in this context. We hypothesized that weights derived using this method would reflect the largely monotonic construction of the SF-6D. We designed an online DCE and administered it to an Australia-representative online panel (n = 1017). A range of specifications investigating nonlinear preferences with respect to additional life expectancy were estimated using a random-effects probit model. The preferred model was then used to estimate a preference index such that full health and death were valued at 1 and 0, respectively, to provide an algorithm for Australian cost-utility analyses. Physical functioning, pain, mental health, and vitality were the largest drivers of utility weights. Combining levels to remove illogical orderings did not lead to a poorer model fit. Relative to international SG-derived weights, the range of utility weights was larger, with 5% of health states valued below zero. DCEs can be used to investigate preferences for health profiles and to estimate utility weights for multi-attribute utility instruments. Australian cost-utility analyses can now use domestic SF-6D weights. The comparability of DCE results to those using other elicitation methods for estimating utility weights for quality-adjusted life-year calculations should be further investigated. © The Author(s) 2013.

  13. Detection of mastitis in dairy cattle by use of mixture models for repeated somatic cell scores: a Bayesian approach via Gibbs sampling.

    PubMed

    Odegård, J; Jensen, J; Madsen, P; Gianola, D; Klemetsdal, G; Heringstad, B

    2003-11-01

    The distribution of somatic cell scores could be regarded as a mixture of at least two components depending on a cow's udder health status. A heteroscedastic two-component Bayesian normal mixture model with random effects was developed and implemented via Gibbs sampling. The model was evaluated using datasets consisting of simulated somatic cell score records. Somatic cell score was simulated as a mixture representing two alternative udder health statuses ("healthy" or "diseased"). Animals were assigned randomly to the two components according to the probability of group membership (Pm). Random effects (additive genetic and permanent environment), when included, had identical distributions across mixture components. Posterior probabilities of putative mastitis were estimated for all observations, and model adequacy was evaluated using measures of sensitivity, specificity, and posterior probability of misclassification. Fitting different residual variances in the two mixture components caused some bias in estimation of parameters. When the components were difficult to disentangle, so were their residual variances, causing bias in estimation of Pm and of location parameters of the two underlying distributions. When all variance components were identical across mixture components, the mixture model analyses returned parameter estimates essentially without bias and with a high degree of precision. Including random effects in the model increased the probability of correct classification substantially. No sizable differences in probability of correct classification were found between models in which a single cow effect (ignoring relationships) was fitted and models where this effect was split into genetic and permanent environmental components, utilizing relationship information. When genetic and permanent environmental effects were fitted, the between-replicate variance of estimates of posterior means was smaller because the model accounted for random genetic drift.
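    The classification step at the heart of such a mixture model can be sketched as a posterior membership probability under two fitted normal components. The parameter values below are invented, and the full model's random effects and Gibbs sampler are omitted:

```python
from scipy.stats import norm

# Assumed fitted values for the two somatic cell score (SCS) components
p_m = 0.2              # prior probability of the "diseased" group
mu_h, sd_h = 2.0, 0.8  # healthy component
mu_d, sd_d = 4.5, 1.2  # diseased component

def posterior_diseased(scs):
    """Posterior probability that an observation is from the diseased component."""
    lik_d = p_m * norm.pdf(scs, mu_d, sd_d)
    lik_h = (1 - p_m) * norm.pdf(scs, mu_h, sd_h)
    return lik_d / (lik_d + lik_h)

for scs in (2.0, 3.5, 5.0):
    print(scs, round(posterior_diseased(scs), 3))
```

    High somatic cell scores map to high posterior probabilities of putative mastitis, which is exactly the quantity evaluated against sensitivity and specificity in the study.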

  14. Preference uncertainty, preference learning, and paired comparison experiments

    Treesearch

    David C. Kingsley; Thomas C. Brown

    2010-01-01

    Results from paired comparison experiments suggest that as respondents progress through a sequence of binary choices they become more consistent, apparently fine-tuning their preferences. Consistency may be indicated by the variance of the estimated valuation distribution measured by the error term in the random utility model. A significant reduction in the variance is...

  15. Quincke random walkers

    NASA Astrophysics Data System (ADS)

    Pradillo, Gerardo; Heintz, Aneesh; Vlahovska, Petia

    2017-11-01

    The spontaneous rotation of a sphere in an applied uniform DC electric field (Quincke effect) has been utilized to engineer self-propelled particles: if the sphere is initially resting on a surface, it rolls. The Quincke rollers have been widely used as a model system to study collective behavior in ``active'' suspensions. If the applied field is DC, an isolated Quincke roller follows a straight-line trajectory. In this talk, we discuss the design of a Quincke roller that executes a random-walk-like behavior. We utilize an AC field: upon reversal of the field direction, a fluctuation in the axis of rotation (which is degenerate in the plane perpendicular to the field and parallel to the surface) introduces randomness in the direction of motion. The MSD of an isolated Quincke walker depends on frequency, amplitude, and waveform of the electric field. Experiment and theory are compared. We also investigate the collective behavior of Quincke walkers, the transport of inert particles in a bath of Quincke walkers, and the spontaneous motion of a drop containing Quincke active particles. Supported by NSF Grant CBET 1437545.

  16. On the design of Henon and logistic map-based random number generator

    NASA Astrophysics Data System (ADS)

    Magfirawaty; Suryadi, M. T.; Ramli, Kalamullah

    2017-10-01

    The key sequence is one of the main elements in a cryptosystem. The True Random Number Generator (TRNG) method is one of the approaches to generating the key sequence. The randomness sources of TRNGs fall into three main groups: electrical-noise based, jitter based, and chaos based. The chaos-based approach utilizes a non-linear dynamic system (continuous time or discrete time) as an entropy source. In this study, a new design of TRNG based on a discrete-time chaotic system is proposed, which is then simulated in LabVIEW. The principle of the design consists of combining 2D and 1D chaotic systems. A mathematical model is implemented for numerical simulations. We used a comparator process as a harvester method to obtain the series of random bits. Without any post-processing, the proposed design generated random bit sequences with high entropy values and passed all NIST SP 800-22 statistical tests.
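    The bit-harvesting principle (iterate chaotic maps and threshold the state) can be sketched as follows. The combining rule and thresholds are assumptions; only the Henon and logistic parameter choices are the classic chaotic values:

```python
def chaotic_bits(n_bits, x=0.1, y=0.3, z=0.6):
    """Harvest bits from a Henon (2D) plus logistic (1D) map combination."""
    a, b, r = 1.4, 0.3, 3.99  # classic Henon / chaotic logistic parameters
    bits = []
    for _ in range(n_bits):
        x, y = 1.0 - a * x * x + y, b * x   # Henon map step
        z = r * z * (1.0 - z)               # logistic map step
        # comparator harvester (assumed combining rule): threshold each
        # state at the midpoint of its range, then XOR the two streams
        bits.append((1 if z > 0.5 else 0) ^ (1 if x > 0.0 else 0))
    return bits

stream = chaotic_bits(10000)
print(sum(stream) / len(stream))  # roughly balanced zeros and ones
```

    A real design would still need to be validated against a statistical suite such as NIST SP 800-22, as the paper does.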

  17. DISPERSION POLYMERIZATION OF STYRENE IN SUPERCRITICAL CARBON DIOXIDE UTILIZING RANDOM COPOLYMERS INCLUDING FLUORINATED ACRYLATE FOR PREPARING MICRON-SIZE POLYSTYRENE PARTICLES. (R826115)

    EPA Science Inventory

    The dispersion polymerization of styrene in supercritical CO2 utilizing CO2-philic random copolymers was investigated. The resulting high yield of polystyrene particles in the micron-size range was formed using various random copolymers as stabilizers. The p...

  18. Multi-state time-varying reliability evaluation of smart grid with flexible demand resources utilizing Lz transform

    NASA Astrophysics Data System (ADS)

    Jia, Heping; Jin, Wende; Ding, Yi; Song, Yonghua; Yu, Dezhao

    2017-01-01

    With the expanding proportion of renewable energy generation and the development of smart grid technologies, flexible demand resources (FDRs) have been utilized as an approach to accommodating renewable energies. However, multiple uncertainties of FDRs may influence reliable and secure operation of the smart grid. Multi-state reliability models for a single FDR and for aggregated FDRs have been proposed in this paper, accounting for the responsive abilities of FDRs and random failures of both FDR devices and the information system. The proposed reliability evaluation technique is based on the Lz transform method, which can formulate time-varying reliability indices. A modified IEEE-RTS has been utilized as an illustration of the proposed technique.

  19. Likelihood-Based Random-Effect Meta-Analysis of Binary Events.

    PubMed

    Amatya, Anup; Bhaumik, Dulal K; Normand, Sharon-Lise; Greenhouse, Joel; Kaizar, Eloise; Neelon, Brian; Gibbons, Robert D

    2015-01-01

    Meta-analysis has been used extensively for evaluation of efficacy and safety of medical interventions. Its advantages and utilities are well known. However, recent studies have raised questions about the accuracy of the commonly used moment-based meta-analytic methods in general and for rare binary outcomes in particular. The issue is further complicated for studies with heterogeneous effect sizes. Likelihood-based mixed-effects modeling provides an alternative to moment-based methods such as inverse-variance weighted fixed- and random-effects estimators. In this article, we compare and contrast different mixed-effect modeling strategies in the context of meta-analysis. Their performance in estimation and testing of overall effect and heterogeneity is evaluated when combining results from studies with a binary outcome. Models that allow heterogeneity in both baseline rate and treatment effect across studies have low type I and type II error rates, and their estimates are the least biased among the models considered.

  20. Statistical Evaluation of Utilization of the ISS

    NASA Technical Reports Server (NTRS)

    Andrews, Ross; Andrews, Alida

    2006-01-01

    PayLoad Utilization Modeler (PLUM) is a statistical-modeling computer program used to evaluate the effectiveness of utilization of the International Space Station (ISS) in terms of the number of research facilities that can be operated within a specified interval of time. PLUM is designed to balance the requirements of research facilities aboard the ISS against the resources available on the ISS. PLUM comprises three parts: an interface for the entry of data on constraints and on required and available resources, a database that stores these data as well as the program output, and a modeler. The modeler comprises two subparts: one that generates tens of thousands of random combinations of research facilities and another that calculates the usage of resources for each of those combinations. The results of these calculations are used to generate graphical and tabular reports to determine which facilities are most likely to be operable on the ISS, to identify which ISS resources are inadequate to satisfy the demands upon them, and to generate other data useful in allocation of and planning of resources.
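
    The modeler's two subparts (random facility combinations plus resource-usage accounting) can be sketched in a few lines. The facility names, resource categories, and numbers below are illustrative assumptions, not actual PLUM or ISS data:

```python
import random

# Hypothetical ISS-style resource budgets and per-facility demands
# (illustrative numbers only, not PLUM data).
AVAILABLE = {"power_kW": 30.0, "crew_hours": 35.0}
FACILITIES = {
    "F1": {"power_kW": 6.0, "crew_hours": 5.0},
    "F2": {"power_kW": 9.0, "crew_hours": 12.0},
    "F3": {"power_kW": 4.0, "crew_hours": 8.0},
    "F4": {"power_kW": 12.0, "crew_hours": 6.0},
    "F5": {"power_kW": 7.0, "crew_hours": 10.0},
}

def feasible(combo):
    """Check a facility combination against every available resource."""
    return all(
        sum(FACILITIES[f][res] for f in combo) <= cap
        for res, cap in AVAILABLE.items()
    )

def operability_rates(n_trials=10_000, seed=1):
    """Sample random facility combinations and record, per facility, the share
    of sampled combinations containing it that fit within the resources."""
    rng = random.Random(seed)
    names = list(FACILITIES)
    contains, fits = {f: 0 for f in names}, {f: 0 for f in names}
    for _ in range(n_trials):
        combo = [f for f in names if rng.random() < 0.5]  # random subset
        ok = feasible(combo)
        for f in combo:
            contains[f] += 1
            fits[f] += ok
    return {f: fits[f] / contains[f] for f in names if contains[f]}
```

    Sorting the resulting rates identifies the facilities most likely to be operable, mirroring the kind of report generation the abstract attributes to PLUM.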

  1. Automated Firearms Identification System (AFIDS), phase 1

    NASA Technical Reports Server (NTRS)

    Blackwell, R. J.; Framan, E. P.

    1974-01-01

    Items critical to the future development of an automated firearms identification system (AFIDS) have been examined, with the following specific results: (1) Types of objective data that can be utilized to help establish a more factual basis for determining identity and nonidentity between pairs of fired bullets have been identified. (2) A simulation study has indicated that randomly produced lines, similar in nature to the individual striations on a fired bullet, can be modeled, and that random sequences, when compared to each other, have predictable relationships. (3) A schematic diagram of the general concept for AFIDS has been developed, and individual elements of this system have been briefly tested for feasibility. Future implementation of such a proposed system will depend on factors such as speed, utility, projected total cost, and user requirements for growth. The success of the proposed system, when operational, would depend heavily on existing firearms examiners.

  2. Fluoxetine and imipramine: are there differences in cost-utility for depression in primary care?

    PubMed

    Serrano-Blanco, Antoni; Suárez, David; Pinto-Meza, Alejandra; Peñarrubia, Maria T; Haro, Josep Maria

    2009-02-01

    Depressive disorders generate severe personal burden and high economic costs. Cost-utility analyses of the different therapeutical options are crucial to policy-makers and clinicians. Previous cost-utility studies, comparing selective serotonin reuptake inhibitors and tricyclic antidepressants, have used modelling techniques or have not included indirect costs in the economic analyses. To determine the cost-utility of fluoxetine compared with imipramine for treating depressive disorders in primary care. A 6-month randomized prospective naturalistic study comparing fluoxetine with imipramine was conducted in three primary care centres in Spain. One hundred and three patients requiring antidepressant treatment for a DSM-IV depressive disorder were included in the study. Patients were randomized either to fluoxetine (53 patients) or to imipramine (50 patients) treatment. Patients were treated with antidepressants according to their general practitioner's usual clinical practice. Outcome measures were the quality of life tariff of the European Quality of Life Questionnaire: EuroQoL-5D (five domains), direct costs, indirect costs and total costs. Subjects were evaluated at the beginning of treatment and after 1, 3 and 6 months. Incremental cost-utility ratios (ICUR) were obtained. To address uncertainty in the ICUR's sampling distribution, non-parametric bootstrapping was carried out. Taking into account adjusted total costs and incremental quality of life gained, imipramine dominated fluoxetine with 81.5% of the bootstrap replications in the dominance quadrant. Imipramine seems to be a better cost-utility antidepressant option for treating depressive disorders in primary care.
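
    The non-parametric bootstrap step the abstract describes can be sketched as follows, with simulated (cost, QALY) pairs rather than the trial's data; the dominance count corresponds to the share of bootstrap replications falling in the dominance quadrant of the cost-effectiveness plane:

```python
import random

def bootstrap_dominance(arm_a, arm_b, n_boot=2000, seed=7):
    """Non-parametric bootstrap over patients: resample (cost, QALY) pairs
    with replacement within each arm, and count the share of replications
    in which arm A dominates arm B (lower mean cost AND higher mean QALYs),
    i.e. the share falling in the dominance quadrant."""
    rng = random.Random(seed)
    def boot_means(arm):
        sample = [rng.choice(arm) for _ in arm]
        n = len(sample)
        return (sum(c for c, _ in sample) / n,
                sum(q for _, q in sample) / n)
    dominant = 0
    for _ in range(n_boot):
        cost_a, qaly_a = boot_means(arm_a)
        cost_b, qaly_b = boot_means(arm_b)
        dominant += (cost_a < cost_b) and (qaly_a > qaly_b)
    return dominant / n_boot
```

    The study reported 81.5% of replications in the dominance quadrant for imipramine; with toy data constructed so one arm is clearly cheaper and more effective, the share approaches 1.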

  3. Reinforcing integrated psychiatric service attendance in an opioid-agonist program: a randomized and controlled trial.

    PubMed

    Kidorf, Michael; Brooner, Robert K; Gandotra, Neeraj; Antoine, Denis; King, Van L; Peirce, Jessica; Ghazarian, Sharon

    2013-11-01

    The benefits of integrating substance abuse and psychiatric care may be limited by poor service utilization. This randomized clinical trial evaluated the efficacy of using contingency management to improve utilization of psychiatric services co-located and integrated within a community-based methadone maintenance treatment program. Opioid-dependent outpatients (n=125) with any current psychiatric disorder were randomly assigned to: (1) reinforced on-site integrated care (ROIC), with vouchers (worth $25.00) contingent on full adherence to each week of scheduled psychiatric services; or (2) standard on-site integrated care (SOIC). All participants received access to the same schedule of psychiatrist and mental health counseling sessions for 12 weeks. ROIC participants attended more overall psychiatric sessions at month 1 (M=7.53 vs. 3.97, p<.001), month 2 (M=6.31 vs. 2.81, p<.001), and month 3 (M=5.71 vs. 2.44, p<.001). Both conditions evidenced reductions in psychiatric distress (p<.001) and similar rates of drug-positive urine samples. No differences in study retention were observed. These findings suggest that contingency management can improve utilization of psychiatric services scheduled within an on-site and integrated treatment model. Delivering evidence-based mental health counseling, or modifying the contingency plan to include illicit drug use, may be required to facilitate greater changes in psychiatric and substance abuse outcomes.

  4. Strategic mating with common preferences.

    PubMed

    Alpern, Steve; Reyniers, Diane

    2005-12-21

    We present a two-sided search model in which individuals from two groups (males and females, employers and workers) would like to form a long-term relationship with a highly ranked individual of the other group, but are limited to individuals whom they randomly encounter and to those who also accept them. This article extends the research program, begun in Alpern and Reyniers [1999. J. Theor. Biol. 198, 71-88], of providing a game theoretic analysis for the Kalick-Hamilton [1986. J. Personality Soc. Psychol. 51, 673-682] mating model in which a cohort of males and females of various 'fitness' or 'attractiveness' levels are randomly paired in successive periods and mate if they accept each other. Their model compared two acceptance rules chosen to represent homotypic (similarity) preferences and common (or 'type') preferences. Our earlier paper modeled the first kind by assuming that if a level x male mates with a level y female, both get utility -|x-y|, whereas this paper models the second kind by giving the male utility y and the female utility x. Our model can also be seen as a continuous generalization of the discrete fitness-level game of Johnstone [1997. Behav. Ecol. Sociobiol. 40, 51-59]. We establish the existence of equilibrium strategy pairs, give examples of multiple equilibria, and state conditions guaranteeing uniqueness. In all equilibria individuals become less choosy over time, with high fitness individuals pairing off with each other first, leaving the rest to pair off later. This route to assortative mating was suggested by Parker [1983. Mate Choice, Cambridge University Press, Cambridge, pp. 141-164]. If the initial fitness distributions have atoms, then mixed strategy equilibria may also occur. If these distributions are unknown, there are equilibria in which only individuals in the same fitness band are mated, as in the steady-state model of MacNamara and Collins [1990. J. Appl. Prob. 28, 815-827] for the job search problem.
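
    A minimal cohort simulation conveys the qualitative equilibrium behaviour described above (high-fitness individuals pairing off first, producing assortative mating). It uses a hand-set, linearly declining acceptance threshold in place of the paper's equilibrium strategies, so it is an illustration of the mechanism, not the model itself:

```python
import random

def mating_cohort(n=200, periods=10, seed=3):
    """Two-sided search with common ('type') preferences: everyone ranks
    partners by fitness alone. Each period, unmatched males and females are
    paired at random and mate iff both exceed a shared acceptance threshold
    that declines linearly to 0 in the last period (a hand-set choosiness
    schedule standing in for the paper's equilibrium strategies)."""
    rng = random.Random(seed)
    males = [rng.random() for _ in range(n)]    # fitness levels in [0, 1]
    females = [rng.random() for _ in range(n)]
    pairs = []
    for t in range(periods):
        threshold = 1.0 - (t + 1) / periods     # choosiness declines over time
        rng.shuffle(males)
        rng.shuffle(females)
        still_m, still_f = [], []
        for m, f in zip(males, females):
            if m >= threshold and f >= threshold:   # mutual acceptance
                pairs.append((m, f, t))
            else:
                still_m.append(m)
                still_f.append(f)
        males, females = still_m, still_f
    return pairs, males

def fitness_correlation(pairs):
    """Pearson correlation between partners' fitness (assortativeness)."""
    xs = [m for m, _, _ in pairs]
    ys = [f for _, f, _ in pairs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

    Because the threshold reaches 0 in the final period, the whole cohort is eventually matched, and the period-by-period structure (high-fitness pairs early, low-fitness pairs late) yields a clearly positive fitness correlation between partners.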

  5. Buses of Cuernavaca—an agent-based model for universal random matrix behavior minimizing mutual information

    NASA Astrophysics Data System (ADS)

    Warchoł, Piotr

    2018-06-01

    The public transportation system of Cuernavaca, Mexico, exhibits random matrix theory statistics. In particular, the fluctuation of times between the arrivals of buses at a given bus stop follows the Wigner surmise for the Gaussian unitary ensemble. To model this, we propose an agent-based approach in which each bus driver tries to optimize his arrival time to the next stop with respect to an estimated arrival time of his predecessor. We choose a particular form of the associated utility function and recover the appropriate distribution in numerical experiments for a certain value of the only parameter of the model. We then investigate whether this value of the parameter is otherwise distinguished within an information theoretic approach and give numerical evidence that indeed it is associated with a minimum of averaged pairwise mutual information.
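
    The Wigner surmise for the Gaussian unitary ensemble mentioned above has the closed form p(s) = (32/π²) s² exp(−4s²/π) for the normalized spacing s; a quick numerical check confirms it is a proper density with unit mean spacing:

```python
import math

def wigner_gue(s):
    """Wigner surmise for the GUE: p(s) = (32/pi^2) s^2 exp(-4 s^2 / pi),
    the approximate distribution of normalized spacings (here, between
    bus arrivals)."""
    return (32.0 / math.pi ** 2) * s * s * math.exp(-4.0 * s * s / math.pi)

def trapezoid(f, a, b, n=100_000):
    """Simple trapezoidal quadrature on [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h
```

    Truncating the integrals at s = 6 is harmless here because the Gaussian tail of the surmise is already negligible there.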

  6. Using a Shared Parameter Mixture Model to Estimate Change during Treatment when Termination is Related to Recovery Speed

    PubMed Central

    Gottfredson, Nisha C.; Bauer, Daniel J.; Baldwin, Scott A.; Okiishi, John C.

    2014-01-01

    Objective This study demonstrates how to use a shared parameter mixture model (SPMM) in longitudinal psychotherapy studies to accommodate missing data that are due to a correlation between rate of improvement and termination of therapy. Traditional growth models assume that such a relationship does not exist (i.e., assume that data are missing at random) and will produce biased results if this assumption is incorrect. Method We use longitudinal data from 4,676 patients enrolled in a naturalistic study of psychotherapy to compare results from a latent growth model and an SPMM. Results In this dataset, estimates of the rate of improvement during therapy differ by 6.50 – 6.66% across the two models, indicating that participants with steeper trajectories left psychotherapy earliest, thereby potentially biasing inference for the slope in the latent growth model. Conclusion We conclude that reported estimates of change during therapy may be underestimated in naturalistic studies of therapy in which participants and their therapists determine the end of treatment. Because non-randomly missing data can also occur in randomized controlled trials or in observational studies of development, the utility of the SPMM extends beyond naturalistic psychotherapy data. PMID:24274626

  7. Reduced mortality: the unexpected impact of a telephone-based care management intervention for older adults in managed care.

    PubMed

    Alkema, Gretchen E; Wilber, Kathleen H; Shannon, George R; Allen, Douglas

    2007-08-01

    This analysis evaluated mortality over 24 months for Medicare managed care members who participated in the Care Advocate Program (CA Program) designed to link those with high health care utilization to home- and community-based services. Secondary data from the CA Program, part of the California HealthCare Foundation's Elders in Managed Care Initiative. Randomized controlled trial in which participants (N=781) were randomly assigned to intent-to-treat (ITT) and control groups. The ITT group received telephonic social care management and 12 months of follow-up. Various multivariate analyses were used to evaluate mortality risk throughout multiple study periods controlling for sociodemographic characteristics, health status, and health care utilization. Older adults (65+) enrolled in a Medicare managed care plan who had high health care utilization in the previous year. The ITT group had significantly lower odds of mortality throughout the study (OR=0.55; p=.005) and during the care management intervention (OR=0.45; p=.006), whereas differential risk in the postintervention period was not statistically significant. Other significant predictors of mortality were age, gender, three chronic conditions (cancer, heart disease, and kidney disease), and emergency room utilization. Findings suggest that the care advocate model of social care management affected mortality while the program was in progress, but not after completion of the intervention phase. Key model elements accounted for the findings, which include individualized targeting, assessment, and monitoring; consumer choice, control, and participant self-management; and bridging medical and social service delivery systems through direct linkages and communication.

  8. A Methodological Study of a Computer-Managed Instructional Program in High School Physics.

    ERIC Educational Resources Information Center

    Denton, Jon James

    The purpose of this study was to develop and evaluate an instructional model which utilized the computer to produce individually prescribed instructional guides in physics at the secondary school level. The sample consisted of three classes. Of these, two were randomly selected to serve as the treatment groups, e.g., individualized instruction and…

  9. Vocational Interests in China: An Evaluation of the Personal Globe Inventory-Short

    ERIC Educational Resources Information Center

    Zhang, Yu; Kube, Erin; Wang, Yuzhong; Tracey, Terence J. G.

    2013-01-01

    A diverse Chinese sample of 2567 high school and college students was utilized to examine the structural validity of the PGI-S (Tracey, 2010) with respect to the fit to the circumplex structure, the theoretical model underlying the RIASEC types and the eight PGI (Tracey, 2002) interest types. The randomization test of hypothesized order relations…

  10. Underestimation of Variance of Predicted Health Utilities Derived from Multiattribute Utility Instruments.

    PubMed

    Chan, Kelvin K W; Xie, Feng; Willan, Andrew R; Pullenayegum, Eleanor M

    2017-04-01

    Parameter uncertainty in value sets of multiattribute utility-based instruments (MAUIs) has received little attention previously. Ignoring this uncertainty produces false precision and leads to underestimation of the uncertainty of the results of cost-effectiveness analyses. The aim of this study is to examine the use of multiple imputation as a method to account for this uncertainty in MAUI scoring algorithms. We fitted a Bayesian model with random effects for respondents and health states to the data from the original US EQ-5D-3L valuation study, thereby estimating the uncertainty in the EQ-5D-3L scoring algorithm. We applied these results to EQ-5D-3L data from the Commonwealth Fund (CWF) Survey for Sick Adults (n = 3958), comparing the standard error of the estimated mean utility in the CWF population using the predictive distribution from the Bayesian mixed-effect model (i.e., incorporating parameter uncertainty in the value set) with the standard error based on multiple imputation and with the standard error from the conventional approach of applying the MAUI value set directly (i.e., ignoring uncertainty in the value set). The mean utility in the CWF population based on the predictive distribution of the Bayesian model was 0.827 with a standard error (SE) of 0.011. When utilities were derived using the conventional approach, the estimated mean utility was 0.827 with an SE of 0.003, which is only 25% of the SE based on the full predictive distribution of the mixed-effect model. Using multiple imputation with 20 imputed sets, the mean utility was 0.828 with an SE of 0.011, which is similar to the SE based on the full predictive distribution. Ignoring uncertainty of the predicted health utilities derived from MAUIs could lead to substantial underestimation of the variance of mean utilities. Multiple imputation corrects for this underestimation so that the results of cost-effectiveness analyses using MAUIs can report the correct degree of uncertainty.
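
    The mechanics of the comparison can be illustrated with a toy scoring model (a single uncertain decrement weight, not the EQ-5D-3L algorithm): treating the value-set weight as fixed collapses the standard error, while drawing it repeatedly and pooling with Rubin's rules recovers the extra uncertainty.

```python
import random, statistics

def mean_utility_se(d, w, se_w=0.0, m=20, seed=11):
    """Estimate the population mean utility and its standard error when each
    respondent's utility is 1 - w * d_i (d_i = a health-state decrement).
    With se_w == 0 the value-set weight w is treated as known (the
    conventional approach); with se_w > 0, m imputations draw w from
    Normal(w, se_w) and Rubin's rules combine within- and between-imputation
    variance. Illustrative toy scoring model, not the EQ-5D-3L value set."""
    rng = random.Random(seed)
    n = len(d)
    def one_pass(w_draw):
        utils = [1.0 - w_draw * di for di in d]
        mu = statistics.fmean(utils)
        var_mu = statistics.variance(utils) / n    # within-imputation variance
        return mu, var_mu
    draws = [w] if se_w == 0.0 else [rng.gauss(w, se_w) for _ in range(m)]
    passes = [one_pass(wd) for wd in draws]
    mus = [p[0] for p in passes]
    within = statistics.fmean(p[1] for p in passes)
    between = statistics.variance(mus) if len(mus) > 1 else 0.0
    total = within + (1 + 1 / len(draws)) * between   # Rubin's rules
    return statistics.fmean(mus), total ** 0.5
```

    With any nontrivial uncertainty in the weight, the pooled SE exceeds the conventional SE, which is the qualitative pattern the abstract reports (0.011 vs. 0.003).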

  11. Effects of Interactive Voice Response Self-Monitoring on Natural Resolution of Drinking Problems: Utilization and Behavioral Economic Factors

    PubMed Central

    Tucker, Jalie A.; Roth, David L.; Huang, Jin; Scott Crawford, M.; Simpson, Cathy A.

    2012-01-01

    Objective: Most problem drinkers do not seek help, and many recover on their own. A randomized controlled trial evaluated whether supportive interactive voice response (IVR) self-monitoring facilitated such “natural” resolutions. Based on behavioral economics, effects on drinking outcomes were hypothesized to vary with drinkers’ baseline “time horizons,” reflecting preferences among commodities of different value available over different delays and with their IVR utilization. Method: Recently resolved untreated problem drinkers were randomized to a 24-week IVR self-monitoring program (n = 87) or an assessment-only control condition (n = 98). Baseline interviews assessed outcome predictors including behavioral economic measures of reward preferences (delay discounting, pre-resolution monetary allocation to alcohol vs. savings). Six-month outcomes were categorized as resolved abstinent, resolved nonabstinent, unresolved, or missing. Complier average causal effect (CACE) models examined IVR self-monitoring effects. Results: IVR self-monitoring compliers (≥70% scheduled calls completed) were older and had greater pre-resolution drinking control and lower discounting than noncompliers (<70%). A CACE model interaction showed that observed compliers in the IVR group with shorter time horizons (expressed by greater pre-resolution spending on alcohol than savings) were more likely to attain moderation than abstinent resolutions compared with predicted compliers in the control group with shorter time horizons and with all noncompliers. Intention-to-treat analytical models revealed no IVR-related effects. More balanced spending on savings versus alcohol predicted moderation in both approaches. Conclusions: IVR interventions should consider factors affecting IVR utilization and drinking outcomes, including person-specific behavioral economic variables. CACE models provide tools to evaluate interventions involving extended participation. PMID:22630807

  12. Emergency Department Frequent Utilization for Non-Emergent Presentments: Results from a Regional Urban Trauma Center Study.

    PubMed

    Behr, Joshua G; Diaz, Rafael

    2016-01-01

    First, to test a model of the drivers of frequent emergency department utilization conceptualized as falling within predisposing, enabling, and need dimensions. Second, to extend the model to include social networks and service quality as predictors of frequent utilization. Third, to illustrate how the thresholds that define frequent utilization in terms of the number of emergency department encounters vary with the predictors in the model. Primary data were collected over an eight-week period within an urban level-1 trauma hospital's emergency department: a representative randomized sample of 1,443 adult patients triaged at ESI levels 4-5. Physicians and research staff interviewed patients as they received services. Relationships with the outcome variable, utilization, were tested using logistic regression to establish odds ratios. 70.6 percent of patients have two or more, 48.3 percent have three or more, 25.3 percent have four or more, and 14.9 percent have five or more emergency department visits within 12 months. Factors associated with frequent utilization include gender, race, poor mental health, mental health drugs, prescription drug abuse, social networks, employment, perceptions of service quality, seriousness of condition, persistence of condition, and previous hospital admittance. Interventions targeting associated factors will change global emergency department encounters, although the mutability varies. Policy interventions to address predisposing factors such as substance abuse or access to mental health treatment, as well as interventions that speak to enabling factors such as promoting the resiliency of social networks, may result in decreased frequency of emergency department utilization.
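
    For a single binary predictor, the odds ratios reported by such logistic regressions reduce to the cross-product ratio of a 2×2 table; exp(beta) for that predictor equals the unadjusted odds ratio. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    Equals exp(beta) for a simple logistic regression with one binary
    covariate."""
    return (a * d) / (b * c)

def log_or_ci(a, b, c, d, z=1.96):
    """Woolf (log-based) 95% confidence interval for the odds ratio."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - z * se), math.exp(log_or + z * se)
```

    For example, with hypothetical counts of 120/80 frequent/non-frequent users among patients with poor mental health and 200/400 among the rest, the odds ratio is 3.0 with a confidence interval excluding 1, mirroring how the abstract's "factors associated with frequent utilization" would be read off.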

  13. Circularly-symmetric complex normal ratio distribution for scalar transmissibility functions. Part I: Fundamentals

    NASA Astrophysics Data System (ADS)

    Yan, Wang-Ji; Ren, Wei-Xin

    2016-12-01

    Recent advances in signal processing and structural dynamics have spurred the adoption of transmissibility functions in academia and industry alike. Due to the inherent randomness of measurement and the variability of environmental conditions, uncertainty affects their applications. This study is focused on statistical inference for raw scalar transmissibility functions modeled as complex ratio random variables. The goal is pursued through two companion papers. This paper (Part I) is dedicated to a formal mathematical proof. New theorems on the multivariate circularly-symmetric complex normal ratio distribution are proved on the basis of the principle of probabilistic transformation of continuous random vectors. The closed-form distributional formulas for multivariate ratios of correlated circularly-symmetric complex normal random variables are analytically derived. Afterwards, several properties are deduced as corollaries and lemmas to the new theorems. Monte Carlo simulation (MCS) is utilized to verify the accuracy of some representative cases. This work lays the mathematical groundwork to find probabilistic models for raw scalar transmissibility functions, which are expounded in detail in Part II of this study.

  14. The cubic ternary complex receptor-occupancy model. III. resurrecting efficacy.

    PubMed

    Weiss, J M; Morgan, P H; Lutz, M W; Kenakin, T P

    1996-08-21

    Early work in pharmacology characterized the interaction of receptors and ligands in terms of two parameters, affinity and efficacy, an approach we term the bipartite view. A precise formulation of efficacy only exists for very simple pharmacological models. Here we extend the notion of efficacy to models that incorporate receptor activation and G-protein coupling. Using the cubic ternary complex model, we show that efficacy is not purely a property of the ligand-receptor interaction; it also depends upon the distributional details of the receptor species in the native receptor ensemble. This suggests a distinction between what we call potential efficacy (a vector) and realized efficacy (a scalar). To each receptor species in the native receptor ensemble we assign a part-worth utility; taken together these utilities comprise the potential efficacy vector. Realized efficacy is the expectation of these part-worth utilities with respect to the frequency distribution of receptor species in the native receptor ensemble. In the parlance of statistical decision theory, the binding of a ligand to a receptor ensemble is a random prospect and realized efficacy is the utility of this prospect. We explore the implications that our definition of efficacy has for understanding agonism and in assessing the legitimacy of the bipartite view in pharmacology.
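
    The definition of realized efficacy above is a plain expectation, which makes it easy to state precisely. The species labels and numbers in the example are hypothetical, not fitted values of the cubic ternary complex model:

```python
def realized_efficacy(part_worth, freq):
    """Realized efficacy as described in the abstract: the expectation of the
    part-worth utilities of the receptor species with respect to their
    frequency distribution in the native receptor ensemble."""
    assert abs(sum(freq.values()) - 1.0) < 1e-9, "frequencies must sum to 1"
    return sum(part_worth[s] * p for s, p in freq.items())
```

    In decision-theoretic terms, the frequency distribution is the "random prospect" and the returned scalar is its utility; changing the ensemble composition changes realized efficacy even when the per-species part-worth utilities (potential efficacy) are unchanged.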

  15. A practical approach to automate randomized design of experiments for ligand-binding assays.

    PubMed

    Tsoi, Jennifer; Patel, Vimal; Shih, Judy

    2014-03-01

    Design of experiments (DOE) is utilized in optimizing ligand-binding assays by modeling factor effects. To reduce the analyst's workload and the errors inherent in executing a DOE by hand, we propose the integration of automated liquid handlers to perform the randomized designs. A randomized design created in statistical software was imported into a custom macro that converts the design into a liquid-handler worklist to automate reagent delivery. An optimized assay was transferred to a contract research organization, resulting in a successful validation. We developed a practical solution for assay optimization by integrating DOE and automation to increase assay robustness and enable successful method transfer. The flexibility of this process allows it to be applied to a variety of assay designs.
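
    The design-to-worklist conversion can be sketched with standard-library tools. The factor names, 96-well layout, and CSV columns below are assumptions for illustration, not a specific instrument's worklist format:

```python
import csv, io, itertools, random

def randomized_worklist(factors, seed=42):
    """Build a full-factorial design, randomize the run order, and emit a
    CSV 'worklist' assigning each run to a well (assumes at most 96 runs).
    Illustrative: column names and plate layout are assumptions, not a
    particular liquid handler's format."""
    runs = [dict(zip(factors, combo))
            for combo in itertools.product(*factors.values())]
    random.Random(seed).shuffle(runs)           # randomized run order
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["well"] + list(factors))
    writer.writeheader()
    for i, run in enumerate(runs):
        row = {"well": f"{'ABCDEFGH'[i // 12]}{i % 12 + 1}"}  # walk a 96-well plate
        row.update(run)
        writer.writerow(row)
    return runs, buf.getvalue()
```

    A statistics package's exported design table could be read in the same way and shuffled identically, which is the essence of the macro the abstract describes.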

  16. Variation in utilization of multivessel percutaneous coronary intervention: influence of hospital volume.

    PubMed

    Patel, Nilay; Pant, Sadip; Panaich, Sidakpal S; Patel, Nileshkumar J; Arora, Shilpkumar; Gidwani, Umesh; Mohamad, Tamam; Schreiber, Theodore; Badheka, Apurva O; Grines, Cindy

    2015-12-01

    The purpose of this study was to investigate the contemporary trends in the utilization of multivessel percutaneous coronary interventions (MVPCIs) in the USA. We queried the Healthcare Cost and Utilization Project's Nationwide Inpatient Sample between 2006 and 2011 using the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) procedure codes 00.40 (single stent), 00.46, 00.47, and 00.48 (single vessel and multiple stents) and 00.41, 00.42 and 00.43 (MVPCI). We built a hierarchical three-level model adjusted for multiple confounding factors. A total of 543 434 (weighted: 2 683 206) procedures were identified. Independent predictors of increased MVPCI utilization (odds ratio, 95% confidence interval, P-value) were found to be age (1.05, 1.04-1.07, P<0.001) and a comorbidity burden, measured by Deyo's modification of the Charlson comorbidity index, of at least 2 (1.13, 1.09-1.16, P<0.001). Female sex (0.88, 0.87-0.90, P<0.001), myocardial infarction (0.86, 0.83-0.89, P<0.001), weekend admissions (0.94, 0.91-0.96, P<0.001), and urgent admissions (0.88, 0.83-0.93, P<0.001) predicted decreased utilization. The highest quartile of hospital volume (1.34, 1.16-1.54, P<0.001) predicted higher utilization. Between-hospital variation of 7.7% (interclass correlation coefficient) was observed, which was minimally affected by patient or hospital mix. A randomly selected patient was ∼1.6 (median odds ratio) times more likely to receive an MVPCI from a given hospital compared with another identical patient being treated at a different random hospital. The utilization rate of MVPCI varied considerably among hospitals. Higher annual hospital volume was associated with a higher utilization rate of MVPCI.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elliott, C.J.; McVey, B.; Quimby, D.C.

    The level of field errors in an FEL is an important determinant of its performance. We have computed 3D performance of a large laser subsystem subjected to field errors of various types. These calculations have been guided by simple models such as SWOOP. The technique of choice is utilization of the FELEX free electron laser code that now possesses extensive engineering capabilities. Modeling includes the ability to establish tolerances of various types: fast and slow scale field bowing, field error level, beam position monitor error level, gap errors, defocusing errors, energy slew, displacement and pointing errors. Many effects of these errors on relative gain and relative power extraction are displayed and are the essential elements of determining an error budget. The random errors also depend on the particular random number seed used in the calculation. The simultaneous display of the performance versus error level of cases with multiple seeds illustrates the variations attributable to stochasticity of this model. All these errors are evaluated numerically for comprehensive engineering of the system. In particular, gap errors are found to place requirements beyond mechanical tolerances of ±25 µm, and amelioration of these may occur by a procedure utilizing direct measurement of the magnetic fields at assembly time. 4 refs., 12 figs.

  18. Investigation of hit-and-run crash occurrence and severity using real-time loop detector data and hierarchical Bayesian binary logit model with random effects.

    PubMed

    Xie, Meiquan; Cheng, Wen; Gill, Gurdiljot Singh; Zhou, Jiao; Jia, Xudong; Choi, Simon

    2018-02-17

    Most of the extensive research dedicated to identifying the influential factors of hit-and-run (HR) crashes has utilized typical maximum likelihood estimation binary logit models, and none has employed real-time traffic data. To fill this gap, this study focused on investigating factors contributing to HR crashes, as well as the severity levels of HR. This study analyzed 4 years of crash and real-time loop detector data by employing hierarchical Bayesian models with random effects within a sequential logit structure. In addition to evaluating the impact of random effects on model fitness and complexity, the prediction capability of the models was examined. Stepwise incremental sensitivity and specificity were calculated, and receiver operating characteristic (ROC) curves were utilized to graphically illustrate the predictive performance of the models. Among the real-time flow variables, the average occupancy and speed from the upstream detector were observed to be positively correlated with HR crash possibility. The average upstream speed and the speed difference between upstream and downstream detectors were correlated with the occurrence of severe HR crashes. In addition to real-time factors, other variables found influential for HR and severe HR crashes were length of segment, adverse weather conditions, dark lighting conditions with malfunctioning street lights, driving under the influence of alcohol, width of inner shoulder, and nighttime. The findings point to the traffic conditions under which HR and severe HR crashes tend to occur: relatively congested upstream traffic with high upstream speed and significant speed deviations on long segments. They suggest that traffic enforcement should be directed toward mitigating risky driving under these conditions. Moreover, enforcement agencies may employ alcohol checkpoints to counter driving under the influence (DUI) at night. With regard to engineering improvements, wider inner shoulders may be constructed to potentially reduce HR cases, and street lights should be installed and maintained in working condition to make roads less prone to such crashes.

  19. A random utility based estimation framework for the household activity pattern problem.

    DOT National Transportation Integrated Search

    2016-06-01

    This paper develops a random utility based estimation framework for the Household Activity : Pattern Problem (HAPP). Based on the realization that output of complex activity-travel decisions : form a continuous pattern in space-time dimension, the es...

  20. An Enhanced MEMS Error Modeling Approach Based on Nu-Support Vector Regression

    PubMed Central

    Bhatt, Deepak; Aggarwal, Priyanka; Bhattacharya, Prabir; Devabhaktuni, Vijay

    2012-01-01

    Micro Electro Mechanical System (MEMS)-based inertial sensors have made possible the development of civilian land vehicle navigation systems by offering a low-cost solution. However, accurate modeling of MEMS sensor errors is one of the most challenging tasks in the design of low-cost navigation systems. These sensors exhibit significant errors such as bias, drift, and noise, which are negligible in higher-grade units. Conventional techniques based on the Gauss-Markov (GM) model and neural networks have previously been used to model these errors. However, the Gauss-Markov model performs poorly for MEMS units due to the presence of high inherent sensor errors. On the other hand, modeling the random drift with a Neural Network (NN) is time consuming, which hinders real-time implementation. We overcome these drawbacks by developing an enhanced Support Vector Machine (SVM) based error model. Unlike NNs, SVMs do not suffer from local minima or over-fitting and deliver a reliable global solution. Experimental results showed that the proposed SVM approach reduced the noise standard deviation by 10–35% for gyroscopes and 61–76% for accelerometers. Further, positional error drift under static conditions improved by 41% and 80% in comparison to the NN and GM approaches, respectively. PMID:23012552

  1. A new modelling approach for zooplankton behaviour

    NASA Astrophysics Data System (ADS)

    Keiyu, A. Y.; Yamazaki, H.; Strickler, J. R.

    We have developed a new simulation technique to model zooplankton behaviour. The approach utilizes neither the conventional artificial intelligence nor neural network methods. We have designed an adaptive behaviour network, which is similar to BEER [(1990) Intelligence as an adaptive behaviour: an experiment in computational neuroethology, Academic Press], based on observational studies of zooplankton behaviour. The proposed method is compared with non- "intelligent" models—random walk and correlated walk models—as well as observed behaviour in a laboratory tank. Although the network is simple, the model exhibits rich behavioural patterns similar to live copepods.
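
    The two non-"intelligent" baseline models mentioned above can be sketched as follows. This is a generic illustration under our own assumptions (unit step length, Gaussian turning angles for the correlated walk), not the authors' implementation.

    ```python
    import math
    import random

    # Sketch of the two baseline movement models: an uncorrelated random walk
    # (new heading each step) and a correlated random walk (small turns around
    # the previous heading, giving directional persistence). Parameters are
    # illustrative.

    def random_walk(steps, step_len=1.0, rng=random):
        x = y = 0.0
        for _ in range(steps):
            theta = rng.uniform(0, 2 * math.pi)   # heading uncorrelated across steps
            x += step_len * math.cos(theta)
            y += step_len * math.sin(theta)
        return x, y

    def correlated_walk(steps, step_len=1.0, turn_sd=0.3, rng=random):
        x = y = 0.0
        theta = rng.uniform(0, 2 * math.pi)
        for _ in range(steps):
            theta += rng.gauss(0.0, turn_sd)      # small turn: persistence of heading
            x += step_len * math.cos(theta)
            y += step_len * math.sin(theta)
        return x, y

    random.seed(42)
    print("random walk endpoint:", random_walk(100))
    print("correlated walk endpoint:", correlated_walk(100))
    ```

    Over many steps the correlated walk typically travels much farther from its origin than the uncorrelated walk, which is one axis along which observed copepod tracks can be compared with these null models.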

  2. Promoting exercise behavior among Chinese youth with hearing loss: a randomized controlled trial based on the transtheoretical model.

    PubMed

    Si, Qi; Yu, Kehong; Cardinal, Bradley J; Lee, Hyo; Yan, Zi; Loprinzi, Paul D; Li, Fuzhong; Liu, Haiqun

    2011-12-01

    The transtheoretical model proposes that behavior change is experienced as a series of stages. Interventions tailored to these stages are most likely to be effective in progressing people through the model's hypothesized behavior change continuum. In this study, a stage-tailored, 12-week, exercise behavior intervention based on the transtheoretical model was conducted among a sample of 150 Chinese youth with hearing loss. Participants were randomized into an intervention or control group with all the core transtheoretical model constructs assessed pre- and post-intervention. Participants in the intervention group showed greater advances in their stage of exercise behavior change, decisional balance, and processes of change use compared to those in the control group. The intervention, however, was insufficient for increasing participants' self-efficacy for exercise behavior. The findings partially support the utility of the theory-based intervention for improving the exercise behavior of Chinese youth with hearing loss, while simultaneously helping to identify areas in need of improvement for future applications.

  3. Economics of Utility Scale Photovoltaics at Purdue University

    NASA Astrophysics Data System (ADS)

    Arnett, William

    The research for this case study shows that utility scale solar photovoltaics has become a competitive energy investment option, even when a campus operates a power plant at low electricity rates. To evaluate this, an economic model called SEEMS (Solar Economic Evaluation Modelling Spreadsheets) was developed to assess a number of financing scenarios under Real Time Pricing for universities. The three main financing structures considered are 1) land leasing, 2) university direct purchase, and 3) third party purchase. Unlike other commercially available models, SEEMS specifically accounts for real time pricing, where the local utility provides electricity at an hourly rate that changes with the expected demand. In addition, SEEMS includes a stochastic simulation that predicts the likelihood of success for a given solar installation strategy. The research showed that there are several options for utility scale solar that are financially attractive. The most practical financing structure is a third party partnership because of the opportunity to take advantage of tax incentives. Other options could become more attractive if non-financial benefits are considered. The case study for this research, Purdue University, has a unique opportunity to integrate utility-scale solar electricity into its strategic planning. Currently, Purdue is updating its master plan, which will define how land is developed. Purdue is also developing a sustainability plan that will define long term environmental goals. In addition, the university is developing over 500 acres of land west of campus as part of its Aerospace Innovation District. This research helps make the case for including utility-scale solar electricity as part of the university's strategic planning.

  4. Transoral Incisionless Fundoplication (TIF 2.0): A Meta-Analysis of Three Randomized, Controlled Clinical Trials.

    PubMed

    Gerson, Lauren; Stouch, Bruce; Lobonţiu, Adrian

    2018-01-01

    The TIF procedure has emerged as an endoscopic treatment for patients with refractory gastro-esophageal reflux disease (GERD). Previous systematic reviews of the TIF procedure conflated findings from studies with modalities that do not reflect the current 2.0 procedure technique or refined data-backed patient selection criteria. A meta-analysis was conducted using data only from randomized studies that assessed the TIF 2.0 procedure compared to a control. The purpose of the meta-analysis was to determine the efficacy and long-term outcomes associated with performance of the TIF 2.0 procedure in patients with chronic long-term refractory GERD on optimized PPI therapy, including esophageal pH, PPI utilization and quality of life. Methods: Three prospective research questions were predicated on the outcomes of the TIF procedure compared to patients who received PPI therapy or sham, concomitant treatment for GERD, and patient-reported quality of life. Event rates were calculated using a random-effects model. Since the time of follow-up post-TIF procedure was variable, the analysis incorporated the time of follow-up for each individual patient at the 3-year time point. Results: Results from this meta-analysis, including data from 233 patients, demonstrated that TIF subjects at 3 years had improved esophageal pH, a decrease in PPI utilization, and improved quality of life. Conclusions: In a meta-analysis of randomized, controlled trials (RCTs), the TIF procedure in patients with GERD refractory to PPIs produced significant changes, compared with sham or PPI therapy, in esophageal pH, PPI utilization, and quality of life.

  5. Intervention Tailoring for Chinese American Women: Comparing the Effects of Two Videos on Knowledge, Attitudes and Intentions to Obtain a Mammogram

    ERIC Educational Resources Information Center

    Wang, Judy Huei-yu; Schwartz, Marc D.; Luta, George; Maxwell, Annette E.; Mandelblatt, Jeanne S.

    2012-01-01

    This study utilized data from an ongoing randomized controlled trial to compare a culturally tailored video promoting positive attitudes toward mammography among Chinese immigrant women to a linguistically appropriate generic video and print media. Intervention development was guided by the Health Belief Model. Five hundred and ninety-two…

  6. The Influence of Neighborhood Characteristics and Parenting Practices on Academic Problems and Aggression Outcomes among Moderately to Highly Aggressive Children

    ERIC Educational Resources Information Center

    Barry, Tammy D.; Lochman, John E.; Fite, Paula J.; Wells, Karen C.; Colder, Craig R.

    2012-01-01

    The current study utilized a longitudinal design to examine the effects of neighborhood and parenting on 120 at-risk children's academic and aggressive outcomes, concurrently and at two later timepoints during the transition to middle school. Random effects regression models were estimated to examine whether neighborhood characteristics and harsh…

  7. Dynamic Quantum Allocation and Swap-Time Variability in Time-Sharing Operating Systems.

    ERIC Educational Resources Information Center

    Bhat, U. Narayan; Nance, Richard E.

    The effects of dynamic quantum allocation and swap-time variability on central processing unit (CPU) behavior are investigated using a model that allows both quantum length and swap-time to be state-dependent random variables. Effective CPU utilization is defined to be the proportion of a CPU busy period that is devoted to program processing, i.e.…

  8. Calibration of semi-stochastic procedure for simulating high-frequency ground motions

    USGS Publications Warehouse

    Seyhan, Emel; Stewart, Jonathan P.; Graves, Robert

    2013-01-01

    Broadband ground motion simulation procedures typically utilize physics-based modeling at low frequencies, coupled with semi-stochastic procedures at high frequencies. The high-frequency procedure considered here combines deterministic Fourier amplitude spectra (dependent on source, path, and site models) with random phase. Previous work showed that high-frequency intensity measures from this simulation methodology attenuate faster with distance and have lower intra-event dispersion than in empirical equations. We address these issues by increasing crustal damping (Q) to reduce distance attenuation bias and by introducing random site-to-site variations to Fourier amplitudes using a lognormal standard deviation ranging from 0.45 for Mw < 7 to zero for Mw 8. Ground motions simulated with the updated parameterization exhibit significantly reduced distance attenuation bias and revised dispersion terms are more compatible with those from empirical models but remain lower at large distances (e.g., > 100 km).

  9. Oropharyngeal dysphagia: surveying practice patterns of the speech-language pathologist.

    PubMed

    Martino, Rosemary; Pron, Gaylene; Diamant, Nicholas E

    2004-01-01

    The present study was designed to obtain a comprehensive view of the dysphagia assessment practice patterns of speech-language pathologists and their opinion on the importance of these practices using survey methods and taking into consideration clinician, patient, and practice-setting variables. A self-administered mail questionnaire was developed following established methodology to maximize response rates. Eight dysphagia experts independently rated the new survey for content validity. Test-retest reliability was assessed with a random sample of 23 participants. The survey was sent to 50 speech-language pathologists randomly selected from the Canadian professional association database of members who practice in dysphagia. Surveys were mailed according to the Dillman Total Design Method and included an incentive offer. High survey (64%) and item response (95%) rates were achieved and clinicians were reliable reporters of their practice behaviors (ICC>0.60). Of all the clinical assessment items, 36% were reported with high (>80%) utilization and 24% with low (<20%) utilization, the former pertaining to tongue motion and vocal quality after food/fluid intake and the latter to testing of oral sensation without food. One-third (33%) of instrumental assessment items were highly utilized and included assessment of bolus movement and laryngeal response to bolus misdirection. Overall, clinician experience and teaching institutions influenced greater utilization. Opinions of importance were similar to utilization behaviors (r = 0.947, p = 0.01). Of all patients referred for dysphagia assessment, full clinical assessments were administered to 71% of patients but instrumental assessments to only 36%. A hierarchical model of practice behavior is proposed to explain this pattern of progressively decreasing item utilization.

  10. Power spectral density of Markov texture fields

    NASA Technical Reports Server (NTRS)

    Shanmugan, K. S.; Holtzman, J. C.

    1984-01-01

    Texture is an important image characteristic. A variety of spatial domain techniques have been proposed for extracting and utilizing textural features to segment and classify images. For the most part, these spatial domain techniques are ad hoc in nature. A Markov random field model for image texture is discussed. A frequency domain description of image texture is derived in terms of the power spectral density. This model is used to design optimum frequency domain filters for enhancing, restoring and segmenting images based on their textural properties.
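
    The closed-form spectral description of a Markov texture can be illustrated in one dimension. A first-order Markov (AR(1)) sequence x[n] = ρ·x[n-1] + e[n] with Var(e) = σ² has the classical power spectral density S(f) = σ² / (1 - 2ρ·cos(2πf) + ρ²); this is a standard 1-D analogue, not the paper's 2-D derivation.

    ```python
    import math

    # Illustrative 1-D analogue of a Markov texture field: the AR(1) process
    # x[n] = rho*x[n-1] + e[n], Var(e) = sigma2, has power spectral density
    #   S(f) = sigma2 / (1 - 2*rho*cos(2*pi*f) + rho**2),  f in cycles/sample.

    def ar1_psd(f, rho=0.8, sigma2=1.0):
        return sigma2 / (1.0 - 2.0 * rho * math.cos(2.0 * math.pi * f) + rho ** 2)

    # For positive correlation, low frequencies dominate: a smooth, coarse texture.
    for f in (0.0, 0.1, 0.25, 0.5):
        print(f"S({f}) = {ar1_psd(f):.3f}")
    ```

    The concentration of power at low frequencies for large ρ is what a frequency-domain texture filter exploits when separating coarse from fine textures.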

  11. Impact of Primary Care Intensive Management on High-Risk Veterans' Costs and Utilization: A Randomized Quality Improvement Trial.

    PubMed

    Yoon, Jean; Chang, Evelyn; Rubenstein, Lisa V; Park, Angel; Zulman, Donna M; Stockdale, Susan; Ong, Michael K; Atkins, David; Schectman, Gordon; Asch, Steven M

    2018-06-05

    Primary care models that offer comprehensive, accessible care to all patients may provide insufficient resources to meet the needs of patients with complex conditions who have the greatest risk for hospitalization. To assess whether augmenting usual primary care with team-based intensive management lowers utilization and costs for high-risk patients. Randomized quality improvement trial. (ClinicalTrials.gov: NCT03100526). 5 U.S. Department of Veterans Affairs (VA) medical centers. Primary care patients at high risk for hospitalization who had a recent acute care episode. Locally tailored intensive management programs providing care coordination, goals assessment, health coaching, medication reconciliation, and home visits through an interdisciplinary team, including a physician or nurse practitioner, a nurse, and psychosocial experts. Utilization and costs (including intensive management program expenses) 12 months before and after randomization. 2210 patients were randomly assigned, 1105 to intensive management and 1105 to usual care. Patients had a mean age of 63 years and an average of 7 chronic conditions; 90% were men. Of the patients assigned to intensive management, 487 (44%) received intensive outpatient care (that is, ≥3 encounters in person or by telephone) and 204 (18%) received limited intervention. From the pre- to postrandomization periods, mean inpatient costs decreased more for the intensive management than the usual care group (-$2164 [95% CI, -$7916 to $3587]). Outpatient costs increased more for the intensive management than the usual care group ($2636 [CI, $524 to $4748]), driven by greater use of primary care, home care, telephone care, and telehealth. Mean total costs were similar in the 2 groups before and after randomization. Sites took up to several months to contact eligible patients, limiting the time between treatment and outcome assessment. Only VA costs were assessed. 
High-risk patients with access to an intensive management program received more outpatient care with no increase in total costs. Veterans Health Administration Primary Care Services.

  12. An evaluation of behavior inferences from Bayesian state-space models: A case study with the Pacific walrus

    USGS Publications Warehouse

    Beatty, William; Jay, Chadwick V.; Fischbach, Anthony S.

    2016-01-01

    State-space models offer researchers an objective approach to modeling complex animal location data sets, and state-space model behavior classifications are often assumed to have a link to animal behavior. In this study, we evaluated the behavioral classification accuracy of a Bayesian state-space model in Pacific walruses using Argos satellite tags with sensors to detect animal behavior in real time. We fit a two-state discrete-time continuous-space Bayesian state-space model to data from 306 Pacific walruses tagged in the Chukchi Sea. We matched predicted locations and behaviors from the state-space model (resident, transient behavior) to true animal behavior (foraging, swimming, hauled out) and evaluated classification accuracy with kappa statistics (κ) and root mean square error (RMSE). In addition, we compared biased random bridge utilization distributions generated with resident behavior locations to true foraging behavior locations to evaluate differences in space use patterns. Results indicated that the two-state model fairly classified true animal behavior (0.06 ≤ κ ≤ 0.26, 0.49 ≤ RMSE ≤ 0.59). Kernel overlap metrics indicated utilization distributions generated with resident behavior locations were generally smaller than utilization distributions generated with true foraging behavior locations. Consequently, we encourage researchers to carefully examine parameters and priors associated with behaviors in state-space models, and reconcile these parameters with the study species and its expected behaviors.
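
    The kappa statistic (κ) used above to score agreement between the state-space model's behavior classes and the sensor-derived behaviors can be computed from a confusion matrix. The confusion counts below are made up for illustration; only the formula is standard.

    ```python
    # Cohen's kappa: chance-corrected agreement between predicted and true
    # classes. confusion[i][j] = count of cases with true class i, predicted j.
    # The example counts are hypothetical, not the walrus study's data.

    def cohens_kappa(confusion):
        n = sum(sum(row) for row in confusion)
        k = len(confusion)
        observed = sum(confusion[i][i] for i in range(k)) / n
        row = [sum(confusion[i]) for i in range(k)]                      # true-class totals
        col = [sum(confusion[i][j] for i in range(k)) for j in range(k)] # predicted totals
        expected = sum(row[i] * col[i] for i in range(k)) / (n * n)      # chance agreement
        return (observed - expected) / (1.0 - expected)

    confusion = [[40, 10],   # true resident:  predicted resident / transient
                 [20, 30]]   # true transient: predicted resident / transient
    print(f"kappa = {cohens_kappa(confusion):.3f}")
    ```

    Values of κ near zero, as in the range reported above (0.06–0.26), indicate agreement only slightly better than chance.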

  13. Utility of the sore throat pain model in a multiple-dose assessment of the acute analgesic flurbiprofen: a randomized controlled study

    PubMed Central

    2014-01-01

    Background The sore throat pain model has been conducted by different clinical investigators to demonstrate the efficacy of acute analgesic drugs in single-dose randomized clinical trials. The model used here was designed to study the multiple-dose safety and efficacy of lozenges containing flurbiprofen at 8.75 mg. Methods Adults (n = 198) with moderate or severe acute sore throat and findings of pharyngitis on a Tonsillo-Pharyngitis Assessment (TPA) were randomly assigned to use either flurbiprofen 8.75 mg lozenges (n = 101) or matching placebo lozenges (n = 97) under double-blind conditions. Patients sucked one lozenge every three to six hours as needed, up to five lozenges per day, and rated symptoms on 100-mm scales: the Sore Throat Pain Intensity Scale (STPIS), the Difficulty Swallowing Scale (DSS), and the Swollen Throat Scale (SwoTS). Results Reductions in pain (lasting for three hours) and in difficulty swallowing and throat swelling (for four hours) were observed after a single dose of the flurbiprofen 8.75 mg lozenge (P <0.05 compared with placebo). After using multiple doses over 24 hours, flurbiprofen-treated patients experienced a 59% greater reduction in throat pain, 45% less difficulty swallowing, and 44% less throat swelling than placebo-treated patients (all P <0.01). There were no serious adverse events. Conclusions Utilizing the sore throat pain model with multiple doses over 24 hours, flurbiprofen 8.75 mg lozenges were shown to be an effective, well-tolerated treatment for sore throat pain. Other pharmacologic actions (reduced difficulty swallowing and reduced throat swelling) and overall patient satisfaction from the flurbiprofen lozenges were also demonstrated in this multiple-dose implementation of the sore throat pain model. Trial registration This trial was registered with ClinicalTrials.gov, registration number: NCT01048866, registration date: January 13, 2010. PMID:24988909

  14. LiDAR based prediction of forest biomass using hierarchical models with spatially varying coefficients

    USGS Publications Warehouse

    Babcock, Chad; Finley, Andrew O.; Bradford, John B.; Kolka, Randall K.; Birdsey, Richard A.; Ryan, Michael G.

    2015-01-01

    Many studies and production inventory systems have shown the utility of coupling covariates derived from Light Detection and Ranging (LiDAR) data with forest variables measured on georeferenced inventory plots through regression models. The objective of this study was to propose and assess the use of a Bayesian hierarchical modeling framework that accommodates both residual spatial dependence and non-stationarity of model covariates through the introduction of spatial random effects. We explored this objective using four forest inventory datasets that are part of the North American Carbon Program, each comprising point-referenced measures of above-ground forest biomass and discrete LiDAR. For each dataset, we considered at least five regression model specifications of varying complexity. Models were assessed based on goodness of fit criteria and predictive performance using a 10-fold cross-validation procedure. Results showed that the addition of spatial random effects to the regression model intercept improved fit and predictive performance in the presence of substantial residual spatial dependence. Additionally, in some cases, allowing either some or all regression slope parameters to vary spatially, via the addition of spatial random effects, further improved model fit and predictive performance. In other instances, models showed improved fit but decreased predictive performance—indicating over-fitting and underscoring the need for cross-validation to assess predictive ability. The proposed Bayesian modeling framework provided access to pixel-level posterior predictive distributions that were useful for uncertainty mapping, diagnosing spatial extrapolation issues, revealing missing model covariates, and discovering locally significant parameters.

  15. An Evaluation of a Behaviorally Based Social Skills Group for Individuals Diagnosed with Autism Spectrum Disorder.

    PubMed

    Leaf, Justin B; Leaf, Jeremy A; Milne, Christine; Taubman, Mitchell; Oppenheim-Leaf, Misty; Torres, Norma; Townley-Cochran, Donna; Leaf, Ronald; McEachin, John; Yoder, Paul

    2017-02-01

    In this study, we evaluated a social skills group which employed a progressive applied behavior analysis model for individuals diagnosed with autism spectrum disorder. A randomized control trial was utilized; eight participants were randomly assigned to a treatment group and seven participants were randomly assigned to a waitlist control group. The social skills group consisted of 32 two-hour sessions. Teachers implemented a variety of behaviorally based procedures. A blind evaluator measured participants' behavior immediately prior to intervention, immediately following intervention, and during 16- and 32-week maintenance probes. Results of the study demonstrated that participants made significant improvements in their social behavior (p < .001) following intervention, and the results were maintained up to 32 weeks after intervention had concluded.

  16. Emergency Department Frequent Utilization for Non-Emergent Presentments: Results from a Regional Urban Trauma Center Study

    PubMed Central

    2016-01-01

    Objectives First, to test a model of the drivers of frequent emergency department utilization conceptualized as falling within predisposing, enabling, and need dimensions. Second, to extend the model to include social networks and service quality as predictors of frequent utilization. Third, to illustrate the variation in thresholds that define frequent utilization in terms of the number of emergency department encounters by the predictors within the model. Data Source Primary data collection over an eight week period within a level-1 trauma urban hospital’s emergency department. Study Design Representative randomized sample of 1,443 adult patients triaged ESI levels 4–5. Physicians and research staff interviewed patients as they received services. Relationships with the outcome variable, utilization, were tested using logistic regression to establish odds-ratios. Principal Findings 70.6 percent of patients have two or more, 48.3 percent have three or more, 25.3 percent have four or more, and 14.9 percent have five or more emergency department visits within 12 months. Factors associated with frequent utilization include gender, race, poor mental health, mental health drugs, prescription drug abuse, social networks, employment, perceptions of service quality, seriousness of condition, persistence of condition, and previous hospital admittance. Conclusions Interventions targeting associated factors will change global emergency department encounters, although the mutability varies. Policy interventions to address predisposing factors such as substance abuse or access to mental health treatment as well as interventions that speak to enabling factors such as promoting the resiliency of social networks may result in decreased frequency of emergency department utilization. PMID:26784515

  17. Stochastic damage evolution in textile laminates

    NASA Technical Reports Server (NTRS)

    Dzenis, Yuris A.; Bogdanovich, Alexander E.; Pastore, Christopher M.

    1993-01-01

    A probabilistic model utilizing random material characteristics to predict damage evolution in textile laminates is presented. The model is based on a division of each ply into two sublaminas consisting of cells. The probability of cell failure is calculated using stochastic function theory and a maximal-strain failure criterion. Three modes of failure, i.e., fiber breakage, matrix failure in the transverse direction, and matrix or interface shear cracking, are taken into account. Computed failure probabilities are utilized in reducing cell stiffness based on the mesovolume concept. A numerical algorithm is developed to predict the damage evolution and deformation history of textile laminates. The effect of scatter in fiber orientation on cell properties is discussed. The influence of the weave on damage accumulation is illustrated with an example of a Kevlar/epoxy laminate.

  18. Examining the Efficacy of a Family Peer Advocate Model for Black and Hispanic Caregivers of Children with Autism Spectrum Disorder.

    PubMed

    Jamison, J M; Fourie, E; Siper, P M; Trelles, M P; George-Jones, Julia; Buxbaum Grice, A; Krata, J; Holl, E; Shaoul, J; Hernandez, B; Mitchell, L; McKay, M M; Buxbaum, J D; Kolevzon, Alexander

    2017-05-01

    Autism spectrum disorder (ASD) affects individuals across all racial and ethnic groups, yet rates of diagnosis are disproportionately higher for Black and Hispanic children. Caregivers of children with ASD experience significant stressors, which have been associated with parental strain, inadequate utilization of mental health services and lower quality of life. The family peer advocate (FPA) model has been utilized across service delivery systems to provide family-to-family support, facilitate engagement, and increase access to care. This study used a randomized controlled design to examine the efficacy of FPAs in a racially and ethnically diverse sample. Results demonstrate significantly increased knowledge of ASD and reduced levels of stress for caregivers who received the FPA intervention as compared to treatment as usual.

  19. Information on center characteristics as costs' determinants in multicenter clinical trials: is modeling center effect worth the effort?

    PubMed

    Petrinco, Michele; Pagano, Eva; Desideri, Alessandro; Bigi, Riccardo; Ghidina, Marco; Ferrando, Alberto; Cortigiani, Lauro; Merletti, Franco; Gregori, Dario

    2009-01-01

    Several methodological problems arise when health outcomes and resource utilization are collected at different sites. To avoid misleading conclusions in multi-center economic evaluations, the center effect needs to be taken into adequate consideration. The aim of this article is to compare several models that use different amounts of information about the enrolling centers. To model the association of total medical costs with the levels of two sets of covariates, one at the patient and one at the center level, we considered four statistical models based on the Gamma model in the class of Generalized Linear Models with a log link. Models were applied to data from the Cost of Strategies after Myocardial Infarction study, an international randomized trial on the costs of uncomplicated acute myocardial infarction (AMI). The simple center-effect adjustment based on a single random effect results in a more conservative estimation of the parameters compared with approaches that use richer information on center characteristics. This study shows, with reference to a real multicenter trial, that center information cannot be neglected and should be collected and included in the analysis, preferably in combination with one or more random effects, thereby also accounting for heterogeneity among centers due to unobserved center characteristics.

  20. Dendritic growth model of multilevel marketing

    NASA Astrophysics Data System (ADS)

    Pang, James Christopher S.; Monterola, Christopher P.

    2017-02-01

    Biologically inspired dendritic network growth is utilized to model the evolving connections of a multilevel marketing (MLM) enterprise. Starting from agents at random spatial locations, a network is formed by minimizing a distance cost function controlled by a parameter, termed the balancing factor bf, that weighs the wiring and the path length costs of connection. The paradigm is compared to actual MLM membership data and is shown to be successful in statistically capturing the membership distribution, better than the previously reported agent based preferential attachment or analytic branching process models. Moreover, it recovers the known empirical statistics of previously studied MLMs, specifically: (i) a membership distribution characterized by the existence of peak levels indicating limited growth, and (ii) an income distribution obeying the 80-20 Pareto principle. Income distributions ranging from uniform to Pareto to a "winner-take-all" kind are also modeled by varying bf. Finally, the robustness of our dendritic growth paradigm to random agent removals is explored and its implications for MLM income distributions are discussed.
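
    The growth rule described above can be sketched as a greedy attachment: each arriving agent connects to the existing node minimizing a cost that blends wiring length (direct distance) against path length back to the root, weighted by bf. The exact cost form, function names, and parameters below are our assumptions for illustration, not the paper's specification.

    ```python
    import math
    import random

    # Toy sketch of bf-weighted dendritic growth: a new agent at point p attaches
    # to the existing node i minimizing
    #     cost = bf * wiring + (1 - bf) * (wiring + path_len[i])
    # where wiring is the direct distance to node i and path_len[i] is node i's
    # path distance to the root. All names/values here are illustrative.

    def grow_network(n_agents, bf, rng):
        pos = [(0.0, 0.0)]       # root at the origin
        parent = [None]
        path_len = [0.0]         # path distance from each node back to the root
        for _ in range(n_agents):
            p = (rng.uniform(-1, 1), rng.uniform(-1, 1))
            best, best_cost = None, float("inf")
            for i, q in enumerate(pos):
                wiring = math.dist(p, q)
                cost = bf * wiring + (1.0 - bf) * (wiring + path_len[i])
                if cost < best_cost:
                    best, best_cost = i, cost
            pos.append(p)
            parent.append(best)
            path_len.append(path_len[best] + math.dist(p, pos[best]))
        return parent

    rng = random.Random(7)
    parents = grow_network(50, bf=0.5, rng=rng)
    print("direct downlines of the root:", parents.count(0))
    ```

    Sweeping bf between 0 and 1 shifts the tree between a hub-like structure (root captures most attachments) and long chains, which is the mechanism the study uses to span income distributions from "winner-take-all" toward uniform.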

  1. Double-Pulse Two-Micron IPDA Lidar Simulation for Airborne Carbon Dioxide Measurements

    NASA Technical Reports Server (NTRS)

    Refaat, Tamer F.; Singh, Upendra N.; Yu, Jirong; Petros, Mulugeta

    2015-01-01

    An advanced double-pulsed 2-micron integrated path differential absorption lidar has been developed at NASA Langley Research Center for measuring atmospheric carbon dioxide. The instrument utilizes a state-of-the-art 2-micron laser transmitter with tunable on-line wavelength and advanced receiver. Instrument modeling and airborne simulations are presented in this paper. Focusing on random errors, results demonstrate instrument capabilities of performing precise carbon dioxide differential optical depth measurement with less than 3% random error for single-shot operation from up to 11 km altitude. This study is useful for defining CO2 measurement weighting, instrument setting, validation and sensitivity trade-offs.

  2. Analyzing crash frequency in freeway tunnels: A correlated random parameters approach.

    PubMed

    Hou, Qinzhong; Tarko, Andrew P; Meng, Xianghai

    2018-02-01

    The majority of past road safety studies focused on open road segments while only a few focused on tunnels. Moreover, the past tunnel studies produced some inconsistent results about the safety effects of the traffic patterns, the tunnel design, and the pavement conditions. The effects of these conditions therefore remain unknown, especially for freeway tunnels in China. The study presented in this paper investigated the safety effects of these various factors utilizing a four-year period (2009-2012) of data as well as three models: 1) a random effects negative binomial model (RENB), 2) an uncorrelated random parameters negative binomial model (URPNB), and 3) a correlated random parameters negative binomial model (CRPNB). Of these three, the results showed that the CRPNB model provided better goodness-of-fit and offered more insights into the factors that contribute to tunnel safety. The CRPNB was not only able to allocate the part of the otherwise unobserved heterogeneity to the individual model parameters but also was able to estimate the cross-correlations between these parameters. Furthermore, the study results showed that traffic volume, tunnel length, proportion of heavy trucks, curvature, and pavement rutting were associated with higher frequencies of traffic crashes, while the distance to the tunnel wall, distance to the adjacent tunnel, distress ratio, International Roughness Index (IRI), and friction coefficient were associated with lower crash frequencies. In addition, the effects of the heterogeneity of the proportion of heavy trucks, the curvature, the rutting depth, and the friction coefficient were identified and their inter-correlations were analyzed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. More efficient optimization of long-term water supply portfolios

    NASA Astrophysics Data System (ADS)

    Kirsch, Brian R.; Characklis, Gregory W.; Dillard, Karen E. M.; Kelley, C. T.

    2009-03-01

    The use of temporary transfers, such as options and leases, has grown as utilities attempt to meet increases in demand while reducing dependence on the expansion of costly infrastructure capacity (e.g., reservoirs). Earlier work has been done to construct optimal portfolios comprising firm capacity and transfers, using decision rules that determine the timing and volume of transfers. However, such work has only focused on the short-term (e.g., 1-year scenarios), which limits the utility of these planning efforts. Developing multiyear portfolios can lead to the exploration of a wider range of alternatives but also increases the computational burden. This work utilizes a coupled hydrologic-economic model to simulate the long-term performance of a city's water supply portfolio. This stochastic model is linked with an optimization search algorithm that is designed to handle the high-frequency, low-amplitude noise inherent in many simulations, particularly those involving expected values. This noise is detrimental to the accuracy and precision of the optimized solution and has traditionally been controlled by investing greater computational effort in the simulation. However, the increased computational effort can be substantial. This work describes the integration of a variance reduction technique (control variate method) within the simulation/optimization as a means of more efficiently identifying minimum cost portfolios. Random variation in model output (i.e., noise) is moderated using knowledge of random variations in stochastic input variables (e.g., reservoir inflows, demand), thereby reducing the computing time by 50% or more. Using these efficiency gains, water supply portfolios are evaluated over a 10-year period in order to assess their ability to reduce costs and adapt to demand growth, while still meeting reliability goals. As a part of the evaluation, several multiyear option contract structures are explored and compared.
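
    The control variate correction described above can be sketched in a few lines. The cost function, inflow distribution, and all parameter values below are hypothetical stand-ins, not the paper's coupled hydrologic-economic model; the sketch only illustrates how knowledge of a stochastic input's mean can reduce the variance of a simulated expected value:

```python
import random
import statistics

random.seed(7)

MEAN_INFLOW = 100.0  # known mean of the stochastic input used as the control variate

def simulate_cost(inflow):
    """Hypothetical portfolio cost: falls with inflow, plus unrelated noise."""
    return 500.0 - 2.0 * inflow + random.gauss(0.0, 10.0)

def naive_estimate(n):
    costs = [simulate_cost(random.gauss(MEAN_INFLOW, 15.0)) for _ in range(n)]
    return statistics.mean(costs)

def control_variate_estimate(n):
    inflows = [random.gauss(MEAN_INFLOW, 15.0) for _ in range(n)]
    costs = [simulate_cost(q) for q in inflows]
    # Estimate the optimal coefficient b = Cov(cost, inflow) / Var(inflow)
    mc, mq = statistics.mean(costs), statistics.mean(inflows)
    cov = sum((c - mc) * (q - mq) for c, q in zip(costs, inflows)) / (n - 1)
    b = cov / statistics.variance(inflows)
    # Correct each draw by the known deviation of its input from the input mean
    adjusted = [c - b * (q - MEAN_INFLOW) for c, q in zip(costs, inflows)]
    return statistics.mean(adjusted)

# Compare sampling variability of the two estimators over repeated runs
naive = [naive_estimate(200) for _ in range(50)]
cv = [control_variate_estimate(200) for _ in range(50)]
print(statistics.stdev(naive), statistics.stdev(cv))
```

    Because most of the cost noise here is driven by the inflow, the corrected estimator's standard deviation is several times smaller at the same sample size, which is the source of the computing-time savings reported above.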

  4. A model for incomplete longitudinal multivariate ordinal data.

    PubMed

    Liu, Li C

    2008-12-30

    In studies where multiple outcome items are repeatedly measured over time, missing data often occur. A longitudinal item response theory model is proposed for analysis of multivariate ordinal outcomes that are repeatedly measured. Under the MAR assumption, this model accommodates missing data at any level (missing item at any time point and/or missing time point). It allows for multiple random subject effects and the estimation of item discrimination parameters for the multiple outcome items. The covariates in the model can be at any level. Assuming either a probit or logistic response function, maximum marginal likelihood estimation is described utilizing multidimensional Gauss-Hermite quadrature for integration of the random effects. An iterative Fisher-scoring solution, which provides standard errors for all model parameters, is used. A data set from a longitudinal prevention study is used to motivate the application of the proposed model. In this study, multiple ordinal items of health behavior are repeatedly measured over time. Because of a planned missing design, subjects answered only two-thirds of all items at any given time point. Copyright 2008 John Wiley & Sons, Ltd.

  5. Texture classification using autoregressive filtering

    NASA Technical Reports Server (NTRS)

    Lawton, W. M.; Lee, M.

    1984-01-01

    A general theory of image texture models is proposed and its applicability to the problem of scene segmentation using texture classification is discussed. An algorithm, based on half-plane autoregressive filtering, which optimally utilizes second order statistics to discriminate between texture classes represented by arbitrary wide sense stationary random fields is described. Empirical results of applying this algorithm to natural and synthesized scenes are presented and future research is outlined.
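
    As a rough illustration of texture discrimination by autoregressive modeling (a 1-D stand-in for the paper's half-plane 2-D filtering), one can fit an AR model per texture class and assign a new patch to the class whose model yields the smaller prediction error. All processes and coefficients below are invented for the demonstration:

```python
import random

random.seed(0)

def gen_texture(a1, a2, n=400):
    """Generate a 1-D autoregressive 'texture' x[t] = a1*x[t-1] + a2*x[t-2] + noise."""
    x = [0.0, 0.0]
    for _ in range(n):
        x.append(a1 * x[-1] + a2 * x[-2] + random.gauss(0.0, 1.0))
    return x[2:]

def fit_ar2(x):
    """Least-squares AR(2) fit via the 2x2 normal equations (no linear-algebra libs)."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for t in range(2, len(x)):
        s11 += x[t-1] * x[t-1]; s12 += x[t-1] * x[t-2]; s22 += x[t-2] * x[t-2]
        b1 += x[t] * x[t-1]; b2 += x[t] * x[t-2]
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det)

def prediction_error(x, coefs):
    """Mean squared one-step prediction error of an AR(2) model on a signal."""
    a1, a2 = coefs
    errs = [(x[t] - a1 * x[t-1] - a2 * x[t-2]) ** 2 for t in range(2, len(x))]
    return sum(errs) / len(errs)

# Train one AR model per texture class, classify a new patch by lower residual error
model_a = fit_ar2(gen_texture(0.9, -0.2))
model_b = fit_ar2(gen_texture(-0.5, 0.3))
test_patch = gen_texture(0.9, -0.2)  # drawn from class A
label = "A" if prediction_error(test_patch, model_a) < prediction_error(test_patch, model_b) else "B"
print(label)
```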

  6. The Utility of Conflict Resolution and Study Skills Interventions with Middle School Students at Risk for Antisocial Behavior: A Methodological Illustration

    ERIC Educational Resources Information Center

    Kalberg, Jemma Robertson; Lane, Kathleen; Lambert, Warren

    2012-01-01

    This article provides a methodological illustration of how to conduct randomized controlled trials (RCTs) for secondary levels of prevention within the context of three-tiered models of support. First, the authors demonstrate one method of using school-wide data to identify middle school students (N = 45) who were struggling in academic and…

  7. Valuing the Recreational Benefits from the Creation of Nature Reserves in Irish Forests

    Treesearch

    Riccardo Scarpa; Susan M. Chilton; W. George Hutchinson; Joseph Buongiorno

    2000-01-01

    Data from a large-scale contingent valuation study are used to investigate the effects of forest attributes on willingness to pay for forest recreation in Ireland. In particular, the presence of a nature reserve in the forest is found to significantly increase the visitors' willingness to pay. A random utility model is used to estimate the welfare change associated...
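
    A random utility model of the kind used here can be sketched as a multinomial logit over site attributes. The coefficients and site data below are hypothetical, not estimates from the Irish study; the sketch shows how choice probabilities and a willingness-to-pay measure fall out of the utility parameters:

```python
import math

# Hypothetical utility coefficients for a forest-recreation site choice
BETA_COST = -0.08     # marginal utility of travel cost
BETA_RESERVE = 0.40   # utility of a nature reserve being present (0/1 attribute)

def utilities(sites):
    return [BETA_COST * s["cost"] + BETA_RESERVE * s["reserve"] for s in sites]

def choice_probabilities(sites):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
    v = utilities(sites)
    vmax = max(v)  # subtract the max for numerical stability
    expv = [math.exp(x - vmax) for x in v]
    total = sum(expv)
    return [e / total for e in expv]

sites = [
    {"cost": 10.0, "reserve": 1},
    {"cost": 5.0, "reserve": 0},
    {"cost": 12.0, "reserve": 0},
]
probs = choice_probabilities(sites)

# Implied willingness to pay for a nature reserve: the cost increase that
# exactly offsets the reserve's utility gain
wtp_reserve = -BETA_RESERVE / BETA_COST
print(probs, wtp_reserve)
```

    The ratio of an attribute coefficient to the (negative) cost coefficient is the standard welfare measure behind statements such as "the nature reserve significantly increases willingness to pay."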

  8. Meta-analysis of diagnostic accuracy studies accounting for disease prevalence: alternative parameterizations and model selection.

    PubMed

    Chu, Haitao; Nie, Lei; Cole, Stephen R; Poole, Charles

    2009-08-15

    In a meta-analysis of diagnostic accuracy studies, the sensitivities and specificities of a diagnostic test may depend on the disease prevalence since the severity and definition of disease may differ from study to study due to the design and the population considered. In this paper, we extend the bivariate nonlinear random effects model on sensitivities and specificities to jointly model the disease prevalence, sensitivities and specificities using trivariate nonlinear random-effects models. Furthermore, as an alternative parameterization, we also propose jointly modeling the test prevalence and the predictive values, which reflect the clinical utility of a diagnostic test. These models allow investigators to study the complex relationship among the disease prevalence, sensitivities and specificities; or among test prevalence and the predictive values, which can reveal hidden information about test performance. We illustrate the proposed two approaches by reanalyzing the data from a meta-analysis of radiological evaluation of lymph node metastases in patients with cervical cancer and a simulation study. The latter illustrates the importance of carefully choosing an appropriate normality assumption for the disease prevalence, sensitivities and specificities, or the test prevalence and the predictive values. In practice, it is recommended to use model selection techniques to identify a best-fitting model for making statistical inference. In summary, the proposed trivariate random effects models are novel and can be very useful in practice for meta-analysis of diagnostic accuracy studies. Copyright 2009 John Wiley & Sons, Ltd.

  9. Irregularity, volatility, risk, and financial market time series

    PubMed Central

    Pincus, Steve; Kalman, Rudolf E.

    2004-01-01

    The need to assess subtle, potentially exploitable changes in serial structure is paramount in the analysis of financial data. Herein, we demonstrate the utility of approximate entropy (ApEn), a model-independent measure of sequential irregularity, toward this goal, by several distinct applications. We consider both empirical data and models, including composite indices (Standard and Poor's 500 and Hang Seng), individual stock prices, the random-walk hypothesis, and the Black–Scholes and fractional Brownian motion models. Notably, ApEn appears to be a potentially useful marker of system stability, with rapid increases possibly foreshadowing significant changes in a financial variable. PMID:15358860
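
    Approximate entropy has a compact definition that can be implemented directly. The sketch below follows the standard formulation (Chebyshev distance, self-matches counted, tolerance r defaulting to 0.2 times the series standard deviation); the two test series are invented, one strictly alternating and one irregular:

```python
import math
import random

def approximate_entropy(series, m=2, r=None):
    """ApEn(m, r): regularity statistic; lower values mean a more predictable series."""
    n = len(series)
    if r is None:
        mean = sum(series) / n
        r = 0.2 * math.sqrt(sum((x - mean) ** 2 for x in series) / n)

    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for a in templates:
            # Chebyshev distance; self-matches are counted, so matches >= 1
            matches = sum(1 for b in templates
                          if max(abs(u - v) for u, v in zip(a, b)) <= r)
            total += math.log(matches / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)

random.seed(1)
regular = [1.0 if i % 2 == 0 else 2.0 for i in range(60)]  # strictly alternating
noisy = [random.random() for _ in range(60)]               # irregular
print(approximate_entropy(regular), approximate_entropy(noisy))
```

    The alternating series scores near zero while the irregular one scores well above it, which is the contrast the authors exploit when tracking serial structure in financial data.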

  10. Aggregated N-of-1 randomized controlled trials: modern data analytics applied to a clinically valid method of intervention effectiveness.

    PubMed

    Cushing, Christopher C; Walters, Ryan W; Hoffman, Lesa

    2014-03-01

    Aggregated N-of-1 randomized controlled trials (RCTs) combined with multilevel modeling represent a methodological advancement that may help bridge science and practice in pediatric psychology. The purpose of this article is to offer a primer for pediatric psychologists interested in conducting aggregated N-of-1 RCTs. An overview of N-of-1 RCT methodology is provided and 2 simulated data sets are analyzed to demonstrate the clinical and research potential of the methodology. The simulated data example demonstrates the utility of aggregated N-of-1 RCTs for understanding the clinical impact of an intervention for a given individual and the modeling of covariates to explain why an intervention worked for one patient and not another. Aggregated N-of-1 RCTs hold potential for improving the science and practice of pediatric psychology.

  11. Physically based reflectance model utilizing polarization measurement.

    PubMed

    Nakano, Takayuki; Tamagawa, Yasuhisa

    2005-05-20

    A surface bidirectional reflectance distribution function (BRDF) depends on both the optical properties of the material and the microstructure of the surface and appears as a combination of these factors. We propose a method for modeling the BRDF based on a separate optical-property (refractive-index) estimation by polarization measurement. Because the BRDF and the refractive index for precisely the same place can be determined, errors caused by individual difference or spatial dependence can be eliminated. Our BRDF model treats the surface as an aggregation of microfacets, and the diffractive effect is negligible because of randomness. An example model of a painted aluminum plate is presented.

  12. Clipped Random Wave Morphologies and the Analysis of the SAXS of an Ionomer Formed by Copolymerization of Tetrafluoroethylene and CF[subscript 2]=CFO(CF[subscript 2])[subscript 4]SO[subscript 3]H

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aieta, Niccolo V.; Stanis, Ronald J.; Horan, James L.

    Using SAXS data, the microstructure of the ionomer formed by copolymerization of tetrafluoroethylene and CF{sub 2}=CFO(CF{sub 2}){sub 4}SO{sub 3}H films has been approached by two methods: a numerical method (the unified fit approach) utilizing a simple model of spherical scattering objects to determine the radius of gyration of different scattering features of the ionomer films and a graphical method, the clipped random wave approach (CRW), using the scattering data and a porosity parameter to generate a random wave which is clipped to produce a real-space image of the microstructure. We studied films with EW of 733, 825, 900, and 1082 in both the as-cast and annealed 'dry' and boiled 'wet' states. The results of the two data analysis techniques are in good size agreement with each other. In addition, the CRW model shows striking similarities to the structure proposed in recent dissipative particle dynamics models. To our knowledge, this is the first time the CRW technique has been applied to a PFSA-type ionomer.
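
    The clipped random wave construction can be sketched as follows: superpose many cosine waves with random directions and phases, then threshold the resulting field at the quantile corresponding to the desired pore volume fraction. Grid size, wavenumber, and porosity below are illustrative choices, not values fitted to the SAXS data:

```python
import math
import random

random.seed(42)

def clipped_random_wave(nx=64, ny=64, n_waves=64, porosity=0.4, k=0.35):
    """Sum many cosine waves with random directions and phases, then clip the
    field at the level that yields the requested pore volume fraction."""
    waves = []
    for _ in range(n_waves):
        theta = random.uniform(0.0, 2.0 * math.pi)  # random propagation direction
        phase = random.uniform(0.0, 2.0 * math.pi)
        waves.append((k * math.cos(theta), k * math.sin(theta), phase))

    field = [[sum(math.cos(kx * x + ky * y + p) for kx, ky, p in waves)
              for x in range(nx)] for y in range(ny)]

    flat = sorted(v for row in field for v in row)
    cut = flat[int(porosity * len(flat))]  # clip level taken from the quantile
    # Binary morphology: 1 = pore phase (field below the clip level), 0 = matrix
    return [[1 if v < cut else 0 for v in row] for row in field]

morph = clipped_random_wave()
pore_fraction = sum(map(sum, morph)) / (64 * 64)
print(pore_fraction)
```

    In the actual CRW method the wavenumber spectrum is chosen so that the two-point correlation function of the clipped field reproduces the measured scattering curve; here a single wavenumber is used purely to show the clipping step.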

  13. Sustainability of transport structures - some aspects of the nonlinear reliability assessment

    NASA Astrophysics Data System (ADS)

    Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír

    2017-09-01

    Efficient techniques for both nonlinear numerical analysis of concrete structures and advanced stochastic simulation methods have been combined in order to offer an advanced tool for realistic assessment of the behaviour, failure and safety of transport structures. The utilized approach is based on randomization of the non-linear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability of the investigated structure and its sustainability. Results can serve as a rational basis for the performance and sustainability assessment based on advanced nonlinear computer analysis of the structures of transport infrastructure such as bridges or tunnels. In the stochastic simulation the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. An inverse analysis approach using artificial neural networks and virtual stochastic simulations is applied to determine the fracture mechanical parameters of the structural material and its numerical model. Structural response, reliability and sustainability have been investigated on different types of transport structures made from various materials using the above mentioned methodology and tools.

  14. A Practical Guide to Conducting a Systematic Review and Meta-analysis of Health State Utility Values.

    PubMed

    Petrou, Stavros; Kwon, Joseph; Madan, Jason

    2018-05-10

    Economic analysts are increasingly likely to rely on systematic reviews and meta-analyses of health state utility values to inform the parameter inputs of decision-analytic modelling-based economic evaluations. Beyond the context of economic evaluation, evidence from systematic reviews and meta-analyses of health state utility values can be used to inform broader health policy decisions. This paper provides practical guidance on how to conduct a systematic review and meta-analysis of health state utility values. The paper outlines a number of stages in conducting a systematic review, including identifying the appropriate evidence, study selection, data extraction and presentation, and quality and relevance assessment. The paper outlines three broad approaches that can be used to synthesise multiple estimates of health utilities for a given health state or condition, namely fixed-effect meta-analysis, random-effects meta-analysis and mixed-effects meta-regression. Each approach is illustrated by a synthesis of utility values for a hypothetical decision problem, and software code is provided. The paper highlights a number of methodological issues pertinent to the conduct of meta-analysis or meta-regression. These include the importance of limiting synthesis to 'comparable' utility estimates, for example those derived using common utility measurement approaches and sources of valuation; the effects of reliance on limited or poorly reported published data from primary utility assessment studies; the use of aggregate outcomes within analyses; approaches to generating measures of uncertainty; handling of median utility values; challenges surrounding the disentanglement of utility estimates collected serially within the context of prospective observational studies or prospective randomised trials; challenges surrounding the disentanglement of intervention effects; and approaches to measuring model validity. 
Areas of methodological debate and avenues for future research are highlighted.
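
    For the random-effects synthesis step outlined above, a minimal DerSimonian-Laird sketch is shown below. The utility values and standard errors are invented for illustration, and this method-of-moments estimator is only one of the pooling approaches the guidance discusses:

```python
import math

def random_effects_pool(estimates, std_errors):
    """DerSimonian-Laird random-effects pooling of study-level estimates."""
    w = [1.0 / se ** 2 for se in std_errors]                 # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, estimates)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, estimates))  # heterogeneity Q
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                            # between-study variance
    w_star = [1.0 / (se ** 2 + tau2) for se in std_errors]   # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, estimates)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return pooled, se_pooled, tau2

# Hypothetical utility values for one health state from five primary studies
utilities = [0.72, 0.68, 0.81, 0.65, 0.75]
ses = [0.03, 0.05, 0.04, 0.06, 0.03]
pooled, se, tau2 = random_effects_pool(utilities, ses)
print(pooled, se, tau2)
```

    When tau2 is estimated as zero the random-effects weights collapse to the fixed-effect ones, which is why the paper frames the fixed-effect model as a special case of the random-effects model.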

  15. Individual and contextual factors influencing dental health care utilization by preschool children: a multilevel analysis

    PubMed

    Piovesan, Chaiana; Ardenghi, Thiago Machado; Mendes, Fausto Medeiros; Agostini, Bernardo Antonio; Michel-Crosato, Edgard

    2017-03-30

    The effect of contextual factors on dental care utilization was evaluated after adjustment for individual characteristics of Brazilian preschool children. This cross-sectional study assessed 639 preschool children aged 1 to 5 years from Santa Maria, a town in Rio Grande do Sul State, located in southern Brazil. Participants were randomly selected from children attending the National Children's Vaccination Day and 15 health centers were selected for this research. Visual examinations followed the ICDAS criteria. Parents answered a questionnaire about demographic and socioeconomic characteristics. Contextual influences on children's dental care utilization were obtained from two community-related variables: presence of dentists and presence of workers' associations in the neighborhood. Unadjusted and adjusted multilevel logistic regression models were used to describe the association between outcome and predictor variables. A prevalence of 21.6% was found for regular use of dental services. The unadjusted assessment of the associations of dental health care utilization with individual and contextual factors included children's ages, family income, parents' schooling, mothers' participation in their children's school activities, dental caries, and presence of workers' associations in the neighborhood as the main outcome covariates. Individual variables remained associated with the outcome after adding contextual variables in the model. In conclusion, individual and contextual variables were associated with dental health care utilization by preschool children.

  16. Empirical likelihood inference in randomized clinical trials.

    PubMed

    Zhang, Biao

    2017-01-01

    In individually randomized controlled trials, in addition to the primary outcome, information is often available on a number of covariates prior to randomization. This information is frequently utilized to undertake adjustment for baseline characteristics in order to increase precision of the estimation of average treatment effects; such adjustment is usually performed via covariate adjustment in outcome regression models. Although the use of covariate adjustment is widely seen as desirable for making treatment effect estimates more precise and the corresponding hypothesis tests more powerful, there are considerable concerns that objective inference in randomized clinical trials can potentially be compromised. In this paper, we study an empirical likelihood approach to covariate adjustment and propose two unbiased estimating functions that automatically decouple evaluation of average treatment effects from regression modeling of covariate-outcome relationships. The resulting empirical likelihood estimator of the average treatment effect is as efficient as the existing efficient adjusted estimators when separate treatment-specific working regression models are correctly specified, yet is at least as efficient as the existing efficient adjusted estimators for any given treatment-specific working regression models whether or not they coincide with the true treatment-specific covariate-outcome relationships. We present a simulation study to compare the finite sample performance of various methods along with some results on analysis of a data set from an HIV clinical trial. The simulation results indicate that the proposed empirical likelihood approach is more efficient and powerful than its competitors when the working covariate-outcome relationships by treatment status are misspecified.

  17. A cluster-randomized evaluation of an intervention to increase skilled birth attendant utilization in mid- and far-western Nepal.

    PubMed

    Choulagai, Bishnu P; Onta, Sharad; Subedi, Narayan; Bhatta, Dharma N; Shrestha, Binjwala; Petzold, Max; Krettek, Alexandra

    2017-10-01

    Skilled birth attendant (SBA) utilization is low in remote and rural areas of Nepal. We designed and implemented an evaluation to assess the effectiveness of a five-component intervention that addressed previously identified barriers to SBA services in mid- and far-western Nepal. We randomly and equally allocated 36 village development committees with low SBA utilization between 1-year intervention and control groups. The eligible participants for the survey were women who had delivered a baby within the 12 months preceding the survey. Implementation was administered by trained health volunteers, youth groups, mothers' groups and health facility management committee members. Post-intervention, we used difference-in-differences and mixed-effects regression models to assess and analyse any increase in the utilization of skilled birth care and antenatal care (ANC) services. All analyses were done by intention to treat. Our trial registration number was ISRCTN78892490 (http://www.isrctn.com/ISRCTN78892490). Interviewees included 1746 and 2098 eligible women in the intervention and control groups, respectively. The 1-year intervention was effective in increasing the use of skilled birth care services (OR = 1.57; CI 1.19-2.08); however, the intervention had no effect on the utilization of ANC services. Expanding the intervention with modifications, e.g. mobilizing more active and stable community groups, ensuring adequate human resources and improving quality of services as well as longer or repeated interventions, will help achieve a greater effect in increasing the utilization of SBA. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.

  18. Assessing the Potential of Land Use Modification to Mitigate Ambient NO₂ and Its Consequences for Respiratory Health.

    PubMed

    Rao, Meenakshi; George, Linda A; Shandas, Vivek; Rosenstiel, Todd N

    2017-07-10

    Understanding how local land use and land cover (LULC) shapes intra-urban concentrations of atmospheric pollutants, and thus human health, is a key component in designing healthier cities. Here, NO₂ is modeled based on spatially dense summer and winter NO₂ observations in Portland-Hillsboro-Vancouver (USA), and the spatial variation of NO₂ with LULC is investigated using random forest, an ensemble data learning technique. The NO₂ random forest model, together with BenMAP, is further used to develop a better understanding of the relationship among LULC, ambient NO₂ and respiratory health. The impact of land use modifications on ambient NO₂, and consequently on respiratory health, is also investigated using a sensitivity analysis. We find that NO₂ associated with roadways and tree-canopied areas may be affecting annual incidence rates of asthma exacerbation in 4-12 year olds by +3000 per 100,000 and -1400 per 100,000, respectively. Our model shows that increasing local tree canopy by 5% may reduce local incidence rates of asthma exacerbation by 6%, indicating that targeted local tree-planting efforts may have a substantial impact on reducing city-wide incidence of respiratory distress. Our findings demonstrate the utility of random forest modeling in evaluating LULC modifications for enhanced respiratory health.
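
    As a miniature stand-in for random forest regression (bagged one-split trees on a single covariate, with no feature subsampling), the sketch below shows the ensemble idea behind the NO₂ model. The 'road density' covariate and NO₂ values are simulated, not the Portland observations:

```python
import random
import statistics

random.seed(3)

def fit_stump(xs, ys):
    """Best single-split regression stump on one feature (1-D for brevity)."""
    best = None
    for s in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= s]
        right = [y for x, y in zip(xs, ys) if x > s]
        if not left or not right:
            continue
        ml, mr = statistics.mean(left), statistics.mean(right)
        sse = sum((y - ml) ** 2 for y in left) + sum((y - mr) ** 2 for y in right)
        if best is None or sse < best[0]:
            best = (sse, s, ml, mr)
    _, s, ml, mr = best
    return lambda x: ml if x <= s else mr

def fit_forest(xs, ys, n_trees=50):
    """Bagging: each stump is fit on a bootstrap resample of the training data."""
    trees = []
    n = len(xs)
    for _ in range(n_trees):
        idx = [random.randrange(n) for _ in range(n)]
        trees.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    # The ensemble prediction is the average over the bootstrapped trees
    return lambda x: statistics.mean(t(x) for t in trees)

# Toy surrogate: NO2 jumping with a 'road density' covariate, plus noise
road = [i / 20.0 for i in range(40)]
no2 = [5.0 + 8.0 * (r > 1.0) + random.gauss(0.0, 0.5) for r in road]
forest = fit_forest(road, no2)
print(forest(0.2), forest(1.8))
```

    A real random forest additionally grows deep trees and samples a random feature subset at each split; libraries such as scikit-learn handle both, but the bootstrap-and-average core is as above.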

  19. Evaluating the Effectiveness of an Antimicrobial Stewardship Program on Reducing the Incidence Rate of Healthcare-Associated Clostridium difficile Infection: A Non-Randomized, Stepped Wedge, Single-Site, Observational Study.

    PubMed

    DiDiodato, Giulio; McArthur, Leslie

    2016-01-01

    The incidence rate of healthcare-associated Clostridium difficile infection (HA-CDI) is estimated at 1 in 100 patients. Antibiotic exposure is the most consistently reported risk factor for HA-CDI. Strategies to reduce the risk of HA-CDI have focused on reducing antibiotic utilization. Prospective audit and feedback is a commonly used antimicrobial stewardship intervention (ASi). The impact of this ASi on risk of HA-CDI is equivocal. This study examines the effectiveness of a prospective audit and feedback ASi on reducing the risk of HA-CDI. Single-site study in a 339-bed community hospital in Barrie, Ontario, Canada. The primary outcome is HA-CDI incidence rate. The daily prospective audit and feedback ASi is the exposure variable. The ASi was implemented across 6 wards in a non-randomized, stepped wedge design. Criteria for the ASi: any intravenous antibiotic use for ≥ 48 hrs, any oral fluoroquinolone or oral second generation cephalosporin use for ≥ 48 hrs, or any antimicrobial use for ≥ 5 days. HA-CDI cases and model covariates were aggregated by ward, year and month starting September 2008 and ending February 2016. Multi-level mixed effect negative binomial regression analysis was used to model the primary outcome, with intercept and slope coefficients for ward-level random effects estimated. Other covariates tested for inclusion in the final model were derived from previously published risk factors. Deviance residuals were used to assess the model's goodness-of-fit. The dataset included 486 observation periods, of which 350 were control periods and 136 were intervention periods. After accounting for all other model covariates, the estimated overall ASi incidence rate ratio (IRR) was 0.48 (95% CI 0.30, 0.79). The ASi effect was independent of antimicrobial utilization. The ASi did not seem to reduce the risk of Clostridium difficile infection on the surgery wards (IRR 0.87, 95% CI 0.45, 1.69) compared to the medicine wards (IRR 0.42, 95% CI 0.28, 0.63). The ward-level burden of Clostridium difficile, as measured by the ward's previous month's total CDI cases (CDI Lag) and the ward's current month's community-associated CDI cases (CA-CDI), was significantly associated with an increased risk of HA-CDI, with an estimated CDI Lag IRR of 1.21 (95% CI 1.15, 1.28) and an estimated CA-CDI IRR of 1.10 (95% CI 1.01, 1.20). The ward-level random intercept and slope coefficients were not significant. The final model demonstrated good fit. In this study, a daily prospective audit and feedback ASi resulted in a significant reduction in the risk of HA-CDI on the medicine wards; however, this effect was independent of an overall reduction in antibiotic utilization. In addition, the ward-level burden of Clostridium difficile was shown to significantly increase the risk of HA-CDI, reinforcing the importance of the environment as a source of HA-CDI.

  20. Being there is important, but getting there matters too: the role of path in the valuation process.

    PubMed

    Goldberg, Julie H

    2006-01-01

    Traditional decision-analytic models presume that utilities are invariant to context. The influence of 2 types of context on patients' utility assessments was examined here: the path by which one reaches a health state, and personal experience with a health state. Three groups of patients were interviewed: men older than age 49 years with prostate cancer but no diabetes (CaP), diabetes but no prostate cancer (DM), and neither disease (ND). The utility of erectile dysfunction (ED) was assessed using a standard gamble (SG). Each subject completed 2 SGs: 1) a no-context version that gave no explanation for the cause of ED and 2) a contextualized version in which prostate cancer treatment, the failure to manage diabetes, or the natural course of aging was said to be the cause. Patients with disease assigned higher utilities to ED in a matching context than in discrepant contexts. Regression models found that the valuation process was also sensitive to the match between disease path in the utility assessment and patients' personal experiences. These findings lend insight into why acontextual utility assessments typically used in decision analyses have not been able to predict patient behavior as well as expected. The valuation process appears to change systematically when context is specified, suggesting that unspecified contexts rather than random error may lead to fluctuations in the values assigned to identical health states.

  1. Analysis of a utility-interactive wind-photovoltaic hybrid system with battery storage using neural network

    NASA Astrophysics Data System (ADS)

    Giraud, Francois

    1999-10-01

    This dissertation investigates the application of neural network theory to the analysis of a 4-kW Utility-interactive Wind-Photovoltaic System (WPS) with battery storage. The hybrid system comprises a 2.5-kW photovoltaic generator and a 1.5-kW wind turbine. The wind power generator produces power at variable speed and variable frequency (VSVF). The wind energy is converted into dc power by a controlled, three-phase, full-wave, bridge rectifier. The PV power is maximized by a Maximum Power Point Tracker (MPPT), a dc-to-dc chopper, switching at a frequency of 45 kHz. The whole dc power of both subsystems is stored in the battery bank or conditioned by a single-phase self-commutated inverter to be sold to the utility at a predetermined amount. First, the PV is modeled using an Artificial Neural Network (ANN). To reduce model uncertainty, the open-circuit voltage VOC and the short-circuit current ISC of the PV are chosen as model input variables of the ANN. These input variables have the advantage of incorporating the effects of the quantifiable and non-quantifiable environmental variants affecting the PV power. Then, a simplified way to accurately predict the dynamic responses of the grid-linked WPS to gusty winds using a Recurrent Neural Network (RNN) is investigated. The RNN is a single-output feedforward backpropagation network with external feedback, which allows past responses to be fed back to the network input. In the third step, a Radial Basis Functions (RBF) Network is used to analyze the effects of clouds on the Utility-Interactive WPS. Using the irradiance as input signal, the network models the effects of random cloud movement on the output current, the output voltage, the output power of the PV system, as well as the electrical output variables of the grid-linked inverter. Fourthly, using RNN, the combined effects of a random cloud and wind gusts on the system are analyzed. 
    For short time intervals, the wind speed and the solar radiation are considered as the sole sources of power, whose variations influence the system variables. Since both subsystems have different dynamics, their respective responses are expected to impact the whole system behavior differently. The dispatchability of the battery-supported system as well as its stability and reliability during gusts and/or cloud passage is also discussed. In the fifth step, the goal is to determine to what extent the overall power quality of the grid would be affected by a proliferation of utility-interactive hybrid systems and whether recourse to bulk or individual filtering and voltage control is necessary. The final stage of the research includes a steady-state analysis of two-year operation (May 96--Apr 98) of the system, with a discussion on system reliability, on any loss of supply probability, and on the effects of the randomness in the wind and solar radiation upon the system design optimization.

  2. MIXREG: a computer program for mixed-effects regression analysis with autocorrelated errors.

    PubMed

    Hedeker, D; Gibbons, R D

    1996-05-01

    MIXREG is a program that provides estimates for a mixed-effects regression model (MRM) for normally-distributed response data including autocorrelated errors. This model can be used for analysis of unbalanced longitudinal data, where individuals may be measured at a different number of timepoints, or even at different timepoints. Autocorrelated errors of a general form or following an AR(1), MA(1), or ARMA(1,1) form are allowable. This model can also be used for analysis of clustered data, where the mixed-effects model assumes data within clusters are dependent. The degree of dependency is estimated jointly with estimates of the usual model parameters, thus adjusting for clustering. MIXREG uses maximum marginal likelihood estimation, utilizing both the EM algorithm and a Fisher-scoring solution. For the scoring solution, the covariance matrix of the random effects is expressed in its Gaussian decomposition, and the diagonal matrix reparameterized using the exponential transformation. Estimation of the individual random effects is accomplished using an empirical Bayes approach. Examples illustrating usage and features of MIXREG are provided.
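
    The within-cluster dependence that MIXREG models can be illustrated by the marginal covariance implied by a random intercept combined with AR(1) errors. The function below is an illustrative sketch of that structure, with invented parameter values, not MIXREG's actual internals:

```python
def marginal_covariance(n_time, var_intercept, sigma2, rho):
    """Marginal covariance of a random-intercept model with AR(1) residuals:
    V[i][j] = var_intercept + sigma2 * rho**|i-j|."""
    V = [[0.0] * n_time for _ in range(n_time)]
    for i in range(n_time):
        for j in range(n_time):
            # The shared random intercept contributes var_intercept to every cell;
            # the AR(1) residual adds sigma2 * rho^|i-j|, decaying with lag
            V[i][j] = var_intercept + sigma2 * rho ** abs(i - j)
    return V

V = marginal_covariance(n_time=4, var_intercept=1.0, sigma2=2.0, rho=0.5)
for row in V:
    print(row)
```

    With rho = 0 this reduces to the compound-symmetry structure of a plain random-intercept model, which is why allowing autocorrelated errors matters for closely spaced longitudinal measurements.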

  3. The effect of social support features and gamification on a Web-based intervention for rheumatoid arthritis patients: randomized controlled trial.

    PubMed

    Allam, Ahmed; Kostova, Zlatina; Nakamoto, Kent; Schulz, Peter Johannes

    2015-01-09

Rheumatoid arthritis (RA) is a chronic systemic disease that affects people during the most productive period of their lives. Web-based health interventions have been effective in many studies; however, there is little evidence and few studies showing the effectiveness of online social support, and especially gamification, on patients' behavioral and health outcomes. The aim of this study was to examine the effects of a Web-based intervention that included online social support features and gamification on the physical activity, health care utilization, medication overuse, empowerment, and RA knowledge of RA patients. The effect of gamification on website use was also investigated. We conducted a 5-arm parallel randomized controlled trial for RA patients in Ticino (the Italian-speaking part of Switzerland). A total of 157 patients were recruited through brochures left with physicians and were randomly allocated to 1 of 4 experimental conditions, with different types of access to online social support and gamification features, or to a control group that had no access to the website. Data were collected at 3 time points through questionnaires at baseline, at posttest 2 months later, and at follow-up after another 2 months. Primary outcomes were physical activity, health care utilization, and medication overuse; secondary outcomes included empowerment and RA knowledge. All outcomes were self-reported. Intention-to-treat analysis was followed, and multilevel linear mixed models were used to study the change in outcomes over time. The best-fit multilevel models (growth curve models) that described the change in the primary outcomes over the course of the intervention included time and empowerment as time-variant predictors. The growth curve analyses of the experimental conditions were compared to the control group. Physical activity increased over time for patients having access to social support sections plus gaming (unstandardized beta coefficient [B]=3.39, P=.02). 
Health care utilization showed a significant decrease for patients accessing social support features (B=-0.41, P=.01) and patients accessing both social support features and gaming (B=-0.33, P=.03). Patients who had access to either social support sections or the gaming experience of the website gained more empowerment (B=2.59, P=.03; B=2.29, P=.05; respectively). Patients who were offered a gamified experience used the website more often than the ones without gaming (t91=-2.41, P=.02; U=812, P=.02). The Web-based intervention had a positive impact (more desirable outcomes) on intervention groups compared to the control group. Social support sections on the website decreased health care utilization and medication overuse and increased empowerment. Gamification alone or with social support increased physical activity and empowerment and decreased health care utilization. This study provides evidence demonstrating the potential positive effect of gamification and online social support on health and behavioral outcomes. International Standard Randomized Controlled Trial Number (ISRCTN): 57366516; http://www.controlled-trials.com/ISRCTN57366516 (Archived by WebCite at http://www.webcitation.org/6PBvvAvvV).

  4. The Effect of Social Support Features and Gamification on a Web-Based Intervention for Rheumatoid Arthritis Patients: Randomized Controlled Trial

    PubMed Central

    Kostova, Zlatina; Nakamoto, Kent; Schulz, Peter Johannes

    2015-01-01

Background Rheumatoid arthritis (RA) is a chronic systemic disease that affects people during the most productive period of their lives. Web-based health interventions have been effective in many studies; however, there is little evidence and few studies showing the effectiveness of online social support and especially gamification on patients’ behavioral and health outcomes. Objective The aim of this study was to look into the effects of a Web-based intervention that included online social support features and gamification on physical activity, health care utilization, medication overuse, empowerment, and RA knowledge of RA patients. The effect of gamification on website use was also investigated. Methods We conducted a 5-arm parallel randomized controlled trial for RA patients in Ticino (Italian-speaking part of Switzerland). A total of 157 patients were recruited through brochures left with physicians and were randomly allocated to 1 of 4 experimental conditions with different types of access to online social support and gamification features and a control group that had no access to the website. Data were collected at 3 time points through questionnaires at baseline, posttest 2 months later, and at follow-up after another 2 months. Primary outcomes were physical activity, health care utilization, and medication overuse; secondary outcomes included empowerment and RA knowledge. All outcomes were self-reported. Intention-to-treat analysis was followed and multilevel linear mixed models were used to study the change of outcomes over time. Results The best-fit multilevel models (growth curve models) that described the change in the primary outcomes over the course of the intervention included time and empowerment as time-variant predictors. The growth curve analyses of experimental conditions were compared to the control group. 
Physical activity increased over time for patients having access to social support sections plus gaming (unstandardized beta coefficient [B]=3.39, P=.02). Health care utilization showed a significant decrease for patients accessing social support features (B=–0.41, P=.01) and patients accessing both social support features and gaming (B=–0.33, P=.03). Patients who had access to either social support sections or the gaming experience of the website gained more empowerment (B=2.59, P=.03; B=2.29, P=.05; respectively). Patients who were offered a gamified experience used the website more often than the ones without gaming (t91=–2.41, P=.02; U=812, P=.02). Conclusions The Web-based intervention had a positive impact (more desirable outcomes) on intervention groups compared to the control group. Social support sections on the website decreased health care utilization and medication overuse and increased empowerment. Gamification alone or with social support increased physical activity and empowerment and decreased health care utilization. This study provides evidence demonstrating the potential positive effect of gamification and online social support on health and behavioral outcomes. Trial Registration International Standard Randomized Controlled Trial Number (ISRCTN): 57366516; http://www.controlled-trials.com/ISRCTN57366516 (Archived by WebCite at http://www.webcitation.org/6PBvvAvvV). PMID:25574939

  5. TiO2-based memristors and ReRAM: materials, mechanisms and models (a review)

    NASA Astrophysics Data System (ADS)

    Gale, Ella

    2014-10-01

    The memristor is the fundamental nonlinear circuit element, with uses in computing and computer memory. Resistive Random Access Memory (ReRAM) is a resistive switching memory proposed as a non-volatile memory. In this review we shall summarize the state of the art for these closely-related fields, concentrating on titanium dioxide, the well-utilized and archetypal material for both. We shall cover material properties, switching mechanisms and models to demonstrate what ReRAM and memristor scientists can learn from each other and examine the outlook for these technologies.

  6. A new computational approach to simulate pattern formation in Paenibacillus dendritiformis bacterial colonies

    NASA Astrophysics Data System (ADS)

    Tucker, Laura Jane

Under the harsh conditions of limited nutrients and a hard growth surface, Paenibacillus dendritiformis colonies on agar plates form two classes of patterns (morphotypes). The first class, called the dendritic morphotype, has radially directed branches. The second class, called the chiral morphotype, exhibits uniform handedness. The dendritic morphotype has been modeled successfully using a continuum model on a regular lattice; however, no suitable computational approach was known for solving a continuum chiral model. This work details a new computational approach to solving the chiral continuum model of pattern formation in P. dendritiformis. The approach utilizes a random computational lattice and new methods for calculating certain derivative terms found in the model.

  7. Family Access to a Dentist Study (FADS): A Multi-Center Randomized Controlled Trial

    PubMed Central

    Nelson, Suchitra; Riedy, Christine; Albert, Jeffrey M; Lee, Wonik; Slusar, Mary Beth; Curtan, Shelley; Ferretti, Gerald; Cunha-Cruz, Joana; Milgrom, Peter

    2015-01-01

Introduction Many low-income parent/caregivers do not understand the importance of cavity-free primary (baby) teeth and the chronic nature of dental caries (tooth decay). As a consequence, dental preventive and treatment utilization is low even when children are screened in schools and referred for care. This study aims to test a referral letter and Dental Information Guide (DIG) designed using the Common-Sense Model of Self-Regulation (CSM) framework to improve caregivers’ illness perception of dental caries and increase utilization of care by children with restorative dental needs. Methods A multi-site randomized controlled trial with caregivers of Kindergarten to 4th grade children in urban Ohio and rural Washington State will compare five arms: (1) CSM referral letter alone; (2) CSM referral letter + DIG; (3) reduced CSM referral letter alone; (4) reduced CSM referral letter + DIG; (5) standard (control) referral. At baseline, children will be screened at school to determine restorative dental needs. If in need of treatment, caregivers will be randomized to study arms and an intervention packet will be sent home. The primary outcome will be dental care based on a change in oral health status by clinical examination 7 months post-screening (ICDAS sealant codes 1 and 2; restoration codes 3–8; extraction). Enrollment commenced in summer 2015, with results in summer 2016. Conclusion This study uses the CSM framework to develop and test behavioral interventions to increase dental utilization among low-income caregivers. If effective, this simple intervention has broad applicability in clinical and community-based settings. PMID:26500170

  8. Family Access to a Dentist Study (FADS): A multi-center randomized controlled trial.

    PubMed

    Nelson, Suchitra; Riedy, Christine; Albert, Jeffrey M; Lee, Wonik; Slusar, Mary Beth; Curtan, Shelley; Ferretti, Gerald; Cunha-Cruz, Joana; Milgrom, Peter

    2015-11-01

Many low-income parent/caregivers do not understand the importance of cavity-free primary (baby) teeth and the chronic nature of dental caries (tooth decay). As a consequence, dental preventive and treatment utilization is low even when children are screened in schools and referred for care. This study aims to test a referral letter and Dental Information Guide (DIG) designed using the Common-Sense Model of Self-Regulation (CSM) framework to improve caregivers' illness perception of dental caries and increase utilization of care by children with restorative dental needs. A multi-site randomized controlled trial with caregivers of Kindergarten to 4th grade children in urban Ohio and rural Washington State will compare five arms: (1) CSM referral letter alone; (2) CSM referral letter+DIG; (3) reduced CSM referral letter alone; (4) reduced CSM referral letter+DIG; and (5) standard (control) referral. At baseline, children will be screened at school to determine restorative dental needs. If in need of treatment, caregivers will be randomized to study arms and an intervention packet will be sent home. The primary outcome will be dental care based on a change in oral health status by clinical examination 7 months post-screening (ICDAS sealant codes 1 and 2; restoration codes 3-8; extraction). Enrollment commenced in summer 2015, with results in summer 2016. This study uses the CSM framework to develop and test behavioral interventions to increase dental utilization among low-income caregivers. If effective, this simple intervention has broad applicability in clinical and community-based settings. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. The Self-Adapting Focused Review System. Probability sampling of medical records to monitor utilization and quality of care.

    PubMed

    Ash, A; Schwartz, M; Payne, S M; Restuccia, J D

    1990-11-01

Medical record review is increasing in importance as the need to identify and monitor utilization and quality-of-care problems grows. To conserve resources, reviews are usually performed on a subset of cases. If judgment is used to identify subgroups for review, this raises the following questions: How should subgroups be determined, particularly since the locus of problems can change over time? What standard of comparison should be used in interpreting the rates of problems found in subgroups? How can population problem rates be estimated from observed subgroup rates? How can one avoid the bias that arises because reviewers know that selected cases are suspected of having problems? How can changes in problem rates over time be interpreted when evaluating intervention programs? Simple random sampling, an alternative to subgroup review, overcomes the problems implied by these questions but is inefficient. The Self-Adapting Focused Review System (SAFRS), introduced and described here, provides an adaptive approach to record selection based upon model-weighted probability sampling. It retains the desirable inferential properties of random sampling while allowing reviews to be concentrated on the cases currently thought most likely to be problematic. Model development and evaluation are illustrated using hospital data to predict inappropriate admissions.
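The core idea here, model-weighted probability sampling that keeps estimates unbiased while focusing review on suspect records, can be illustrated with an inverse-probability-weighted (Hansen-Hurwitz style) estimate; the risk scores and problem flags below are made-up toy data, not SAFRS itself.

```python
import random

def pps_sample_estimate(risks, problems, n_draws=5000, seed=7):
    """Draw records with probability proportional to a model risk score
    (with replacement, for simplicity) and form an inverse-probability-
    weighted estimate of the overall problem rate.  Weighting each
    sampled record by 1/(N * p_i) keeps the estimate unbiased even
    though sampling is concentrated on high-risk records."""
    rng = random.Random(seed)
    total = sum(risks)
    p = [r / total for r in risks]           # selection probabilities
    N = len(risks)
    est = 0.0
    for _ in range(n_draws):
        i = rng.choices(range(N), weights=p)[0]
        est += problems[i] / (p[i] * N)      # inverse-probability weight
    return est / n_draws

# toy population: 1 marks a problem case (illustrative data only)
risks    = [0.9, 0.8, 0.1, 0.1, 0.1]
problems = [1,   1,   0,   0,   1]
rate = pps_sample_estimate(risks, problems)  # true problem rate is 0.6
```

High-risk records are reviewed far more often, yet the weighted estimate still targets the population rate, which is the inferential property the abstract highlights.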

  10. Long-term strength and damage accumulation in laminates

    NASA Astrophysics Data System (ADS)

    Dzenis, Yuris A.; Joshi, Shiv P.

    1993-04-01

A modified version of the probabilistic model developed by the authors for damage evolution analysis of laminates subjected to random loading is utilized to predict the long-term strength of laminates. The model assumes that each ply in a laminate consists of a large number of mesovolumes. Probabilistic variation functions for mesovolume stiffnesses and strengths are used in the analysis. Stochastic strains are calculated using lamination theory and random function theory. Deterioration of ply stiffnesses is calculated from the probabilities of mesovolume failures using the theory of excursions of a random process beyond limits. Long-term strength and damage accumulation in a Kevlar/epoxy laminate under tension and complex in-plane loading are investigated. Effects of the mean level and stochastic deviation of the loading on damage evolution and the time-to-failure of the laminate are discussed. Cumulative damage at final failure is greater at low loading levels than at high loading levels, and the effect of deviation in the loading is more pronounced at lower mean loading levels.

  11. Characterizing Time to Diagnostic Resolution After an Abnormal Cancer Screening Exam in Older Adult Participants in the Ohio Patient Navigation Research Program.

    PubMed

    DeSalvo, Jennifer M; Young, Gregory S; Krok-Schoen, Jessica L; Paskett, Electra D

    2017-06-01

This study aims to test the effectiveness of a patient navigation (PN) intervention to reduce time to diagnostic resolution among older adults age ≥65 years versus those <65 years with abnormal breast, cervical, or colorectal cancer screening exams participating in the Ohio Patient Navigation Research Program (OPNRP). The OPNRP utilized a nested cohort group-randomized trial design to randomize 862 participants (n = 67 for ≥65 years; n = 795 for <65 years) to PN or usual care conditions. A shared frailty Cox model tested the effect of PN on time to resolution. Older adult participants randomized to PN achieved a 6-month resolution rate that was 127% higher than those randomized to usual care (p = .001). This effect was not significantly different from participants <65 years. PN significantly reduced time to diagnostic resolution among older adults beginning 6 months after an abnormal cancer screening exam. Health care systems should include this population in PN programs to reduce cancer disparities.

  12. Pólya number and first return of bursty random walk: Rigorous solutions

    NASA Astrophysics Data System (ADS)

    Wan, J.; Xu, X. P.

    2012-03-01

The recurrence properties of random walks can be characterized by the Pólya number, i.e., the probability that the walker returns to the origin at least once. In this paper, we investigate the Pólya number and first return of a bursty random walk on a line, in which the walk has different step sizes and moving probabilities. Using the concept of the Catalan number, we obtain, for the first time, exact results for the first return probability, the average first return time, and the Pólya number. We show that the Pólya number displays two different functional behaviors when the walk deviates from the recurrent point. By utilizing the Lagrange inversion formula, we interpret our findings by expressing the Pólya number through the closed-form solutions of an inverse function. We also calculate the Pólya number using another approach, which corroborates our results and conclusions. Finally, we consider the recurrence properties and the Pólya number of two variations of the bursty random walk model.
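For the ordinary nearest-neighbour walk (the non-bursty special case), the Catalan-number route to the first-return probabilities can be sketched as below; the closed form 1 - |p - q| for the return probability is a standard textbook result, used here only as a check.

```python
def polya_number(p, n_terms=20000):
    """Sum the first-return probabilities f_{2n} = 2*C_{n-1}*(p*q)^n of a
    nearest-neighbour walk (step up with probability p, down with
    q = 1 - p), where C_n is the n-th Catalan number.  The sum is the
    Polya number: the probability of ever returning to the origin."""
    q = 1.0 - p
    total = 0.0
    f = 2.0 * p * q                  # n = 1 term: 2 * C_0 * (p*q)^1
    for n in range(1, n_terms):
        total += f
        # Catalan ratio: C_n / C_{n-1} = 2*(2n - 1)/(n + 1)
        f *= 2.0 * (2 * n - 1) / (n + 1) * p * q
    return total

# biased walk: the return probability equals 1 - |p - q|, here 0.6
ret = polya_number(0.7)
```

For p = 1/2 the series sums to 1 (the walk is recurrent), but only at rate 1/sqrt(n), so a truncated sum approaches 1 slowly; for a biased walk the terms decay geometrically.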

  13. Design of a randomized, controlled, comparative-effectiveness trial testing a Family Model of Diabetes Self-Management Education (DSME) vs. Standard DSME for Marshallese in the United States.

    PubMed

    Kim Yeary, Karen Hye-Cheon; Long, Christopher R; Bursac, Zoran; McElfish, Pearl Anna

    2017-06-01

Type 2 diabetes (T2D) is a significant public health problem, with U.S. Pacific Islander communities-such as the Marshallese-bearing a disproportionate burden. Using a community-based participatory research (CBPR) approach that engages the strong family-based social infrastructure characteristic of Marshallese communities is a promising way to manage T2D. Led by a collaborative community-academic partnership, the Family Model of Diabetes Self-Management Education (DSME) aimed to change diabetes management behaviors to improve glycemic control in Marshallese adults with T2D by engaging the entire family. To test the Family Model of DSME, a randomized, controlled, comparative effectiveness trial with 240 primary participants was implemented. Half of the primary participants were randomly assigned to the Standard DSME and half were randomly assigned to the Family Model DSME. Both arms received ten hours of content comprising 6-8 sessions delivered over a 6-8 week period. The Family Model DSME was a cultural adaptation of DSME, whereby the intervention focused on engaging family support for the primary participant with T2D. The Standard DSME was delivered to the primary participant in a community-based group format. Primary participants and participating family members were assessed at baseline and immediately post-intervention, and will also be assessed at 6 and 12 months. The Family Model of DSME aimed to improve glycemic control in Marshallese adults with T2D. The utilization of a CBPR approach that involves local stakeholders and the engagement of the family-based social infrastructure of Marshallese communities increase the potential for the intervention's success and sustainability.

  14. Integration Profile and Safety of an Adenovirus Hybrid-Vector Utilizing Hyperactive Sleeping Beauty Transposase for Somatic Integration

    PubMed Central

    Zhang, Wenli; Muck-Hausl, Martin; Wang, Jichang; Sun, Chuanbo; Gebbing, Maren; Miskey, Csaba; Ivics, Zoltan; Izsvak, Zsuzsanna; Ehrhardt, Anja

    2013-01-01

We recently developed adenovirus/transposase hybrid-vectors utilizing the previously described hyperactive Sleeping Beauty (SB) transposase HSB5 for somatic integration, and we showed stabilized transgene expression in mice and in a canine model of hemophilia B. However, the safety profile of these hybrid-vectors with respect to vector dose and genotoxicity remained to be investigated. Herein, we evaluated this hybrid-vector system in C57Bl/6 mice with escalating vector dose settings. We found that in all mice which received the hyperactive SB transposase, transgene expression levels were stabilized in a dose-dependent manner, and that the highest vector dose was accompanied by fatalities in mice. To analyze potential genotoxic side-effects due to somatic integration into host chromosomes, we performed a genome-wide integration site analysis using linker-mediated PCR (LM-PCR) and linear amplification-mediated PCR (LAM-PCR). Analysis of genomic DNA samples obtained from HSB5-treated female and male mice revealed a total of 1327 unique transposition events. Overall the chromosomal distribution pattern was close-to-random, and we observed a random integration profile with respect to integration into gene and non-gene areas. Notably, when using the LM-PCR protocol, 27 extra-chromosomal integration events were identified, most likely caused by transposon excision and subsequent transposition into the delivered adenoviral vector genome. In total, this study provides a careful evaluation of the safety profile of adenovirus/Sleeping Beauty transposase hybrid-vectors. The obtained information will be useful when designing future preclinical studies utilizing hybrid-vectors in small and large animal models. PMID:24124483

  15. Effects of a recovery management intervention on Chinese heroin users' community recovery through the mediation effect of enhanced service utilization

    PubMed Central

    Wu, F.; Fu, L.M.; Hser, Y.H.

    2015-01-01

    Background This study investigates whether a recovery management intervention (RMI) can improve the utilization of community drug treatment and wraparound services among heroin users in China and subsequently lead to positive recovery outcomes. Methods Secondary analysis was conducted drawing data from a randomized controlled trial; 100 heroin users with no severe mental health problems were recruited in two Shanghai districts (Hongkou and Yangpu) upon their release from compulsory rehabilitation facilities. A latent variable modeling approach was utilized to test whether the RMI influences heroin users' perceived motivation and readiness for treatment, enhances treatment and wraparound service participation, and, in turn, predicts better recovery outcomes. Results Enrollment in drug treatment and other social service utilization increased significantly as a result of RMI rather than an individual drug user's motivation and readiness for treatment. Increased service utilization thus led to more positive individual recovery outcomes. In addition to this mediation effect through service utilization, the RMI also improved participants' community recovery directly. Conclusions Findings suggest that better drug treatment enrollment, community service utilization and recovery outcomes can be potentially achieved among heroin users in China with carefully designed case management interventions. PMID:24990956

  16. DeepDeath: Learning to predict the underlying cause of death with Big Data.

    PubMed

    Hassanzadeh, Hamid Reza; Ying Sha; Wang, May D

    2017-07-01

Multiple cause-of-death data provide a valuable source of information that can be used to enhance health standards by predicting health-related trajectories in societies with large populations. These data are often available in large quantities across U.S. states and require Big Data techniques to uncover complex hidden patterns. We design two different classes of models suitable for large-scale analysis of mortality data: a Hadoop-based ensemble of random forests trained over N-grams, and DeepDeath, a deep classifier based on the recurrent neural network (RNN). We apply both classes to the mortality data provided by the National Center for Health Statistics and show that while both perform significantly better than a random classifier, the deep model, which utilizes long short-term memory networks (LSTMs), surpasses the N-gram-based models and is capable of learning the temporal aspect of the data without the need to build ad-hoc, expert-driven features.
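The N-gram representation that the random-forest ensemble is trained over can be sketched as a bag-of-n-grams over each record's ordered sequence of condition codes; the codes below are illustrative stand-ins, not actual NCHS records.

```python
from collections import Counter

def ngram_features(codes, n=2):
    """Bag-of-n-grams over an ordered sequence of cause-of-death codes:
    the kind of fixed-length feature vector a random-forest ensemble can
    consume, in contrast to the LSTM, which reads the raw sequence."""
    grams = zip(*(codes[i:] for i in range(n)))   # sliding n-gram window
    return Counter(" ".join(g) for g in grams)

# one hypothetical record: ischemic heart disease codes followed by COPD
feats = ngram_features(["I25", "I50", "J44", "I50", "J44"], n=2)
```

Note that the bigram "I50 J44" appears twice, so ordering information survives in the counts, but long-range temporal structure does not, which is the gap the abstract credits the LSTM with closing.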

  17. Analysing the primacy of distance in the utilization of health services in the Ahafo-Ano South district, Ghana.

    PubMed

    Buor, Daniel

    2003-01-01

Although distance has been identified as a key factor in the utilization of health services in rural areas of developing countries, it has been analysed without recourse to the related factors of travel time and transport cost. Also, the influence of distance on utilization by vulnerable groups has not been an object of survey by researchers. This paper addresses the impact of distance on utilization, and how distance compares with the related factors of travel time and transport cost in the utilization of health services in the Ahafo-Ano South (rural) district in Ghana. The study, a cross-sectional survey, also identifies the position of distance among other important factors of utilization. A sample of 400, drawn through a systematic random sampling technique, was used for the survey. Data were analysed using the regression model and some graphic techniques. The main instruments used in data collection were formal (face-to-face) interviews and a questionnaire. The survey finds that distance is the most important factor influencing the utilization of health services in the Ahafo-Ano South district. Other key factors are income, service cost and education. The effect of travel time on utilization mirrors that of distance. Recommendations to reduce distances covered, improve formal education and reduce poverty are made.

  18. A Cerebellar-model Associative Memory as a Generalized Random-access Memory

    NASA Technical Reports Server (NTRS)

    Kanerva, Pentti

    1989-01-01

    A versatile neural-net model is explained in terms familiar to computer scientists and engineers. It is called the sparse distributed memory, and it is a random-access memory for very long words (for patterns with thousands of bits). Its potential utility is the result of several factors: (1) a large pattern representing an object or a scene or a moment can encode a large amount of information about what it represents; (2) this information can serve as an address to the memory, and it can also serve as data; (3) the memory is noise tolerant--the information need not be exact; (4) the memory can be made arbitrarily large and hence an arbitrary amount of information can be stored in it; and (5) the architecture is inherently parallel, allowing large memories to be fast. Such memories can become important components of future computers.
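The read/write mechanics of such a sparse distributed memory can be sketched in a few lines; all sizes here (32-bit words, 500 hard locations, Hamming radius 12) are toy values for illustration, far smaller than the thousand-bit patterns the paper has in mind.

```python
import random

def make_sdm(n_loc=500, dim=32, radius=12, seed=1):
    """A toy sparse distributed memory: n_loc random hard addresses with
    one up/down counter per (location, bit).  A probe address activates
    every hard location within the given Hamming radius."""
    rng = random.Random(seed)
    hard = [[rng.randint(0, 1) for _ in range(dim)] for _ in range(n_loc)]
    counters = [[0] * dim for _ in range(n_loc)]
    return hard, counters, radius

def _active(hard, addr, radius):
    return [i for i, h in enumerate(hard)
            if sum(a != b for a, b in zip(h, addr)) <= radius]

def write(mem, addr, data):
    hard, counters, radius = mem
    for i in _active(hard, addr, radius):
        for j, bit in enumerate(data):
            counters[i][j] += 1 if bit else -1   # distribute the word

def read(mem, addr):
    hard, counters, radius = mem
    act = _active(hard, addr, radius)
    sums = [sum(counters[i][j] for i in act) for j in range(len(addr))]
    return [1 if s > 0 else 0 for s in sums]     # bitwise majority rule

mem = make_sdm()
pattern = [1, 0] * 16
write(mem, pattern, pattern)     # autoassociative store: address = data
recalled = read(mem, pattern)
```

Reading back the same address activates the same hard locations, so with a single stored pattern the majority rule recovers it exactly; the noise tolerance the abstract describes comes from the large overlap between activation sets of nearby addresses.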

  19. In Vivo Physiological Experiments in the Random Positioning Machine: A Study on the Rat Intestinal Transit

    NASA Astrophysics Data System (ADS)

    Peana, A. T.; Marzocco, S.; Bianco, G.; Autore, G.; Pinto, A.; Pippia, P.

    2008-06-01

The aim of this work is to evaluate rat intestinal transit as well as the expression of enzymes involved in this process and in gastrointestinal homeostasis, such as cyclooxygenase (COX-1 and COX-2), the inducible isoform of nitric oxide synthase (iNOS), ICAM-1, and the heat shock proteins HSP70 and HSP90. Modeled microgravity conditions were produced using a three-dimensional clinostat, the Random Positioning Machine (RPM). Our results indicate that modeled microgravity significantly reduces rat intestinal transit. Western blot analysis of small intestine tissues from RPM rats reveals a significant increase in iNOS expression, a significant reduction in COX-2 levels (while COX-1 expression remains unaltered), and a significant increase in ICAM-1 and HSP70 expression. A significant increase in HSP90 expression in the stomach also indicates a strong effect of simulated low g on gastrointestinal homeostasis.

  20. Evaluating diagnosis-based case-mix measures: how well do they apply to the VA population?

    PubMed

    Rosen, A K; Loveland, S; Anderson, J J; Rothendler, J A; Hankin, C S; Rakovski, C C; Moskowitz, M A; Berlowitz, D R

    2001-07-01

Diagnosis-based case-mix measures are increasingly used for provider profiling, resource allocation, and capitation rate setting. Measures developed in one setting may not adequately capture the disease burden in other settings. To examine the feasibility of adapting two such measures, Adjusted Clinical Groups (ACGs) and Diagnostic Cost Groups (DCGs), to the Department of Veterans Affairs (VA) population, a 60% random sample of veterans who used health care services during FY 1997 was obtained from VA inpatient and outpatient administrative databases. A split-sample technique was used to obtain a 40% sample (n = 1,046,803) for development and a 20% sample (n = 524,461) for validation. Concurrent ACG and DCG risk adjustment models, using 1997 diagnoses and demographics to predict FY 1997 utilization (ambulatory provider encounters, and service days, the sum of a patient's inpatient and outpatient visit days), were fitted and cross-validated. Patients were classified into groupings that indicated a population with multiple psychiatric and medical diseases. Model R-squares explained between 6% and 32% of the variation in service utilization. Although reparameterized models did better at predicting utilization than models with external weights, none of the models was adequate in characterizing the entire population. For predicting service days, DCGs were superior to ACGs in most categories, whereas ACGs did better at discriminating among veterans with the lowest utilization. Although "off-the-shelf" case-mix measures perform moderately well when applied to another setting, modifications may be required to accurately characterize a population's disease burden with respect to the resource needs of all patients.

  1. Size-dependent piezoelectric energy-harvesting analysis of micro/nano bridges subjected to random ambient excitations

    NASA Astrophysics Data System (ADS)

    Radgolchin, Moeen; Moeenfard, Hamid

    2018-02-01

    The construction of self-powered micro-electro-mechanical units by converting the mechanical energy of the systems into electrical power has attracted much attention in recent years. While power harvesting from deterministic external excitations is state of the art, it has been much more difficult to derive mathematical models for scavenging electrical energy from ambient random vibrations, due to the stochastic nature of the excitations. The current research concerns analytical modeling of micro-bridge energy harvesters based on random vibration theory. Since classical elasticity fails to accurately predict the mechanical behavior of micro-structures, strain gradient theory is employed as a powerful tool to increase the accuracy of the random vibration modeling of the micro-harvester. Equations of motion of the system in the time domain are derived using the Lagrange approach. These are then utilized to determine the frequency and impulse responses of the structure. Assuming the energy harvester to be subjected to a combination of broadband and limited-band random support motion and transverse loading, closed-form expressions for mean, mean square, correlation and spectral density of the output power are derived. The suggested formulation is further exploited to investigate the effect of the different design parameters, including the geometric properties of the structure as well as the properties of the electrical circuit on the resulting power. Furthermore, the effect of length scale parameters on the harvested energy is investigated in detail. It is observed that the predictions of classical and even simple size-dependent theories (such as couple stress) appreciably differ from the findings of strain gradient theory on the basis of random vibration. 
This study presents a first-time modeling of micro-scale harvesters under stochastic excitations using a size-dependent approach and can be considered as a reliable foundation for future research in the field of micro/nano harvesters subjected to non-deterministic loads.
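
The random-vibration machinery invoked above rests on a standard identity: the mean-square response of a linear structure is the integral of the input spectral density weighted by the squared frequency-response magnitude. A minimal single-degree-of-freedom sketch of that identity (all parameter values are illustrative, not taken from the paper):

```python
import numpy as np

# Mean-square response of a single-degree-of-freedom oscillator to
# white-noise forcing: E[x^2] = integral of S0*|H(w)|^2 dw, whose
# classical closed form is pi*S0/(k*c).  Parameters are illustrative.
m, c, k = 1.0, 1.0, 100.0      # mass, damping, stiffness
S0 = 0.5                       # two-sided white-noise force PSD

omega = np.linspace(-200.0, 200.0, 400_001)
dw = omega[1] - omega[0]
H2 = 1.0 / ((k - m * omega**2) ** 2 + (c * omega) ** 2)  # |H(w)|^2
ms_numeric = np.sum(S0 * H2) * dw     # numerical spectral integral
ms_closed = np.pi * S0 / (k * c)      # closed-form mean-square response
```

The numerical integral reproduces the closed form, which is the same route by which closed-form power statistics follow from the harvester's frequency response and the excitation spectrum.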

  2. GeoGebra Assist Discovery Learning Model for Problem Solving Ability and Attitude toward Mathematics

    NASA Astrophysics Data System (ADS)

    Murni, V.; Sariyasa, S.; Ardana, I. M.

    2017-09-01

    This study aims to describe the effect of GeoGebra utilization in the discovery learning model on mathematical problem solving ability and students’ attitude toward mathematics. This research was quasi-experimental, and a post-test-only control group design was used. The population in this study was 181 students. The sampling technique used was cluster random sampling, so the sample in this study was 120 students divided into 4 classes, 2 experimental classes and 2 control classes. Data were analyzed by using one-way MANOVA. The results of the data analysis showed that the utilization of GeoGebra in discovery learning leads to better problem solving ability and attitudes toward mathematics. This is because presenting problems with GeoGebra can assist students in identifying and solving problems and can attract students’ interest, since GeoGebra provides an immediate response to students. The utilization of GeoGebra in discovery learning can therefore be applied to a wider range of subject matter than that considered in this study.

  3. Characterization of nitride hole lateral transport in a charge trap flash memory by using a random telegraph signal method

    NASA Astrophysics Data System (ADS)

    Liu, Yu-Heng; Jiang, Cheng-Min; Lin, Hsiao-Yi; Wang, Tahui; Tsai, Wen-Jer; Lu, Tao-Cheng; Chen, Kuang-Chao; Lu, Chih-Yuan

    2017-07-01

    We use a random telegraph signal method to investigate nitride trapped hole lateral transport in a charge trap flash memory. The concept of this method is to utilize an interface oxide trap and its associated random telegraph signal as an internal probe to detect a local channel potential change resulting from nitride charge lateral movement. We apply different voltages to the drain of a memory cell and vary a bake temperature in retention to study the electric field and temperature dependence of hole lateral movement in a nitride. Thermal energy absorption by trapped holes in lateral transport is characterized. Mechanisms of hole lateral transport in retention are investigated. From the measured and modeled results, we find that thermally assisted trap-to-band tunneling is a major trapped hole emission mechanism in nitride hole lateral transport.

  4. A Spatio-Temporally Explicit Random Encounter Model for Large-Scale Population Surveys

    PubMed Central

    Jousimo, Jussi; Ovaskainen, Otso

    2016-01-01

    Random encounter models can be used to estimate population abundance from indirect data collected by non-invasive sampling methods, such as track counts or camera-trap data. The classical Formozov–Malyshev–Pereleshin (FMP) estimator converts track counts into an estimate of mean population density, assuming that data on the daily movement distances of the animals are available. We utilize generalized linear models with spatio-temporal error structures to extend the FMP estimator into a flexible Bayesian modelling approach that estimates not only total population size, but also spatio-temporal variation in population density. We also introduce a weighting scheme to estimate density on habitats that are not covered by survey transects, assuming that movement data on a subset of individuals is available. We test the performance of spatio-temporal and temporal approaches by a simulation study mimicking the Finnish winter track count survey. The results illustrate how the spatio-temporal modelling approach is able to borrow information from observations made on neighboring locations and times when estimating population density, and that spatio-temporal and temporal smoothing models can provide improved estimates of total population size compared to the FMP method. PMID:27611683
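
The classical FMP conversion from track counts to density mentioned above has a compact closed form; a sketch under the usual assumptions (the function name and survey figures are hypothetical):

```python
import math

def fmp_density(track_count, transect_length_km, daily_movement_km):
    """Classical Formozov-Malyshev-Pereleshin (FMP) estimator: converts
    the number of track crossings seen along survey transects into a
    mean population density (animals per km^2), given the animals' mean
    daily movement distance."""
    return (math.pi / 2.0) * track_count / (transect_length_km * daily_movement_km)

# Hypothetical survey: 30 crossings on 50 km of transect, animals
# moving 8 km per day on average.
density = fmp_density(30, 50.0, 8.0)
```

The Bayesian extension described in the abstract replaces this single pooled ratio with spatio-temporally varying counts and movement distances inside a generalized linear model.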

  5. Effectiveness and cost-effectiveness of a health coaching intervention to improve the lifestyle of patients with knee osteoarthritis: cluster randomized clinical trial.

    PubMed

    Carmona-Terés, Victoria; Lumillo-Gutiérrez, Iris; Jodar-Fernández, Lina; Rodriguez-Blanco, Teresa; Moix-Queraltó, Joanna; Pujol-Ribera, Enriqueta; Mas, Xavier; Batlle-Gualda, Enrique; Gobbo-Montoya, Milena; Berenguera, Anna

    2015-02-25

    The prevalence of osteoarthritis and knee osteoarthritis in the Spanish population is estimated at 17% and 10.2%, respectively. The clinical guidelines concur that the first-line treatment for knee osteoarthritis should be non-pharmacological and include weight loss, physical activity and self-management of pain. Health Coaching has been defined as an intervention that facilitates the achievement of health improvement goals, the reduction of unhealthy lifestyles, the improvement of self-management for chronic conditions and quality of life enhancement. The aim of this study is to analyze the effectiveness, cost-effectiveness and cost-utility of a health coaching intervention on quality of life, pain, overweight and physical activity in patients with knee osteoarthritis from 18 primary care centres of Barcelona. The study follows the Medical Research Council methodology on developing complex interventions. Phase 1: intervention modelling and operationalization through a qualitative, socioconstructivist study using theoretical sampling, with 10 in-depth interviews of patients with knee osteoarthritis and 4 discussion groups of 8-12 primary care professionals, evaluated using a sociological discourse analysis. Phase 2: effectiveness, cost-effectiveness and cost-utility study with a community-based randomized clinical trial of 360 patients with knee osteoarthritis (180 in each group). Randomization unit: primary care centre. Intervention group: will receive standard care plus a 20-hour health coaching programme and follow-up sessions. Control group: will receive standard care. Main outcome: quality of life as measured by the WOMAC index. Data analyses: will include standardized response mean and multilevel analysis of repeated measures. Economic analysis: based on cost-effectiveness and cost-utility measures. Phase 3: evaluation of the intervention programme with a qualitative study, with methodology as in Phase 1.
If the analyses show the cost-effectiveness and cost-utility of the intervention, the results can be incorporated into the clinical guidelines for the management of knee osteoarthritis in primary care. Trial registration: ISRCTN57405925. Registered 20 June 2014.

  6. A Predictive Analysis of the Department of Defense Distribution System Utilizing Random Forests

    DTIC Science & Technology

    2016-06-01

    …resources capable of meeting both customer and individual resource constraints and goals while also maximizing the global benefit to the supply… …and probability rules to determine the optimal red wine distribution network for an Italian-based wine producer. The decision support model for… …combinations of factors that will result in delivery of the highest quality wines. The model’s first stage inputs basic logistics information to look…

  7. Comparison of classification methods for voxel-based prediction of acute ischemic stroke outcome following intra-arterial intervention

    NASA Astrophysics Data System (ADS)

    Winder, Anthony J.; Siemonsen, Susanne; Flottmann, Fabian; Fiehler, Jens; Forkert, Nils D.

    2017-03-01

    Voxel-based tissue outcome prediction in acute ischemic stroke patients is highly relevant for both clinical routine and research. Previous research has shown that features extracted from baseline multi-parametric MRI datasets have a high predictive value and can be used for the training of classifiers, which can generate tissue outcome predictions for both intravenous and conservative treatments. However, with the recent advent and popularization of intra-arterial thrombectomy treatment, novel research specifically addressing the utility of predictive classifiers for thrombectomy intervention is necessary for a holistic understanding of current stroke treatment options. The aim of this work was to develop three clinically viable tissue outcome prediction models using approximate nearest-neighbor, generalized linear model, and random decision forest approaches and to evaluate the accuracy of predicting tissue outcome after intra-arterial treatment. Therefore, the three machine learning models were trained, evaluated, and compared using datasets of 42 acute ischemic stroke patients treated with intra-arterial thrombectomy. Classifier training utilized eight voxel-based features extracted from baseline MRI datasets and five global features. Evaluation of classifier-based predictions was performed via comparison to the known tissue outcome, which was determined in follow-up imaging, using the Dice coefficient and leave-one-patient-out cross validation. The random decision forest prediction model led to the best tissue outcome predictions with a mean Dice coefficient of 0.37. The approximate nearest-neighbor and generalized linear model performed equally suboptimally with average Dice coefficients of 0.28 and 0.27, respectively, suggesting that both non-linearity and machine learning are desirable properties of a classifier well-suited to the intra-arterial tissue outcome prediction problem.
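
The Dice coefficient used for the voxel-wise evaluation measures volumetric overlap between the predicted and observed lesion masks; a minimal sketch (toy masks, not patient data):

```python
import numpy as np

def dice_coefficient(pred, truth):
    """Dice overlap between two binary masks: 2|A intersect B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 1-D "masks" standing in for voxel-wise lesion predictions:
d = dice_coefficient([1, 1, 0, 0], [1, 0, 0, 0])
```

A score of 1 means perfect overlap and 0 means none; the mean Dice values around 0.3 reported above reflect how difficult voxel-level outcome prediction remains.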

  8. Complete hazard ranking to analyze right-censored data: An ALS survival study.

    PubMed

    Huang, Zhengnan; Zhang, Hongjiu; Boss, Jonathan; Goutman, Stephen A; Mukherjee, Bhramar; Dinov, Ivo D; Guan, Yuanfang

    2017-12-01

    Survival analysis represents an important outcome measure in clinical research and clinical trials; further, survival ranking may offer additional advantages in clinical trials. In this study, we developed GuanRank, a non-parametric ranking-based technique to transform patients' survival data into a linear space of hazard ranks. The transformation enables the utilization of machine learning base-learners including Gaussian process regression, Lasso, and random forest on survival data. The method was submitted to the DREAM Amyotrophic Lateral Sclerosis (ALS) Stratification Challenge. Ranked first place in the challenge, the model gave more accurate ranking predictions on the PRO-ACT ALS dataset in comparison to the Cox proportional hazards model. By utilizing right-censored data in its training process, the method demonstrated its state-of-the-art predictive power in ALS survival ranking. Its feature selection identified multiple important factors, some of which conflict with previous studies.
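
The core idea, regressing on hazard ranks rather than raw survival times, can be illustrated with a deliberately simplified rank transform (this toy version ignores censoring, which the published GuanRank method handles explicitly):

```python
import numpy as np

def naive_hazard_rank(times):
    """Toy stand-in for GuanRank's rank transform: shorter observed
    survival gets a higher hazard rank, scaled to [0, 1].  The published
    method additionally accounts for right-censored records, which this
    sketch deliberately ignores."""
    times = np.asarray(times, dtype=float)
    order = times.argsort()[::-1]          # longest survivor first
    ranks = np.empty_like(times)
    ranks[order] = np.arange(len(times))   # 0 = lowest hazard
    return ranks / (len(times) - 1)

# Three hypothetical patients surviving 10, 5 and 1 time units:
hazard = naive_hazard_rank([10.0, 5.0, 1.0])
```

Once survival records are mapped into such a linear rank space, ordinary regressors (Gaussian processes, Lasso, random forests) can be trained on them directly.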

  9. Regulatory environment and its impact on the market value of investor-owned electric utilities

    NASA Astrophysics Data System (ADS)

    Vishwanathan, Raman

    While other regulated industries have one by one been exposed to competitive reform, electric power has, for over eighty years, remained a great monopoly. For all those years, the vertically integrated suppliers of electricity in the United States have been assigned exclusive territorial (consumer) franchises and have been closely regulated. This environment is in the process of change, because the electric power industry is currently undergoing some dramatic adjustments. Since 1992, a number of states have initiated regulatory reform and are moving to allow retail customers to choose their energy supplier. There has also been a considerable federal government role in encouraging competition in the generation and transmission of electricity. The objective of this research is to investigate the reaction of investors to the prevailing regulatory environment in the electric utility industry by analyzing the market-to-book value for investor-owned electric utilities in the United States as a gauge of investor concern or support for change. In this study, the variable of interest is the market valuation of utilities, as it captures investor responses to changes in the regulatory environment. Initially, a classic regression model is analyzed on the full sample of the 96 investor-owned utilities for the years 1992 through 1996, providing a total of 480 observations (96 firms over 5 years). Fixed- and random-effects models are then analyzed for the same full-sample model specified in the previous analysis. The analysis is also carried forward to examine the impact of the size of the utility and its degree of reliance on nuclear power generation on market values. In the period of this study, 1992-1996, the financial security markets downgraded utilities that were still operating in a regulated environment or had a substantial percentage of their power generation from nuclear power plants.
It was also found that the financial market was sensitive to the size of the electric utility. The negative impact of the regulatory environment declined with the increase in the size of the utility, indicating favorable treatment for larger utilities by financial markets. Similarly, for the electric utility industry as a whole, financial markets reacted negatively to nuclear power generation.

  10. Multilevel mixed effects parametric survival models using adaptive Gauss-Hermite quadrature with application to recurrent events and individual participant data meta-analysis.

    PubMed

    Crowther, Michael J; Look, Maxime P; Riley, Richard D

    2014-09-28

    Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
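
Adaptive and nonadaptive Gauss-Hermite quadrature both rest on the same basic device for integrating a normally distributed random effect out of the likelihood; a minimal sketch of that device (the integrand and variance here are illustrative, not the survival likelihood itself):

```python
import numpy as np

def gh_expectation(f, sigma, n_nodes=15):
    """Approximate E[f(b)] for b ~ N(0, sigma^2) with Gauss-Hermite
    quadrature, the device used to integrate normally distributed
    random effects out of a mixed-model likelihood."""
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # change of variables b = sqrt(2)*sigma*x maps the N(0, sigma^2)
    # expectation onto the exp(-x^2) weight of the Hermite rule
    return (weights * f(np.sqrt(2.0) * sigma * nodes)).sum() / np.sqrt(np.pi)

# Sanity check: E[b^2] for b ~ N(0, 2^2) is the variance, 4.
approx_var = gh_expectation(lambda b: b ** 2, sigma=2.0)
```

In the mixed-model setting, `f` is the conditional likelihood of a cluster given its frailty; the adaptive variant recentres and rescales the nodes around each cluster's mode to reduce the number of nodes needed.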

  11. Trial-Based Cost-Utility Analysis of Icotinib versus Gefitinib as Second-Line Therapy for Advanced Non-Small Cell Lung Cancer in China.

    PubMed

    Zhang, Chunxiang; Zhang, Hongmei; Shi, Jinning; Wang, Dong; Zhang, Xiuwei; Yang, Jian; Zhai, Qizhi; Ma, Aixia

    2016-01-01

    Our objective is to compare the cost-utility of icotinib and gefitinib for the second-line treatment of advanced non-small cell lung cancer (NSCLC) from the perspective of the Chinese healthcare system. Modeling techniques were applied to assess data from randomized clinical trials together with the direct medical costs borne by the Chinese healthcare system. Five-year quality-adjusted life years (QALYs) and incremental cost-utility ratios (ICURs) were calculated. One-way and probabilistic sensitivity analyses (PSA) were performed. Our model suggested that the median progression-free survival (PFS) was 4.2 months in the icotinib group and 3.5 months in the gefitinib group, while they were 4.6 months and 3.4 months, respectively, in the trials. The 5-year QALYs were 0.279 in the icotinib group and 0.269 in the gefitinib group, and the corresponding medical costs were $10,662.82 and $13,127.57. The ICUR per QALY of icotinib versus gefitinib was negative in this study. The parameter to which the ICUR was most sensitive was the utility of PFS, with the ICUR ranging from -$1,259,991.25 to -$182,296.61; across this range the icotinib treatment consistently represented a dominant cost-utility strategy. The icotinib strategy, as a second-line therapy for advanced NSCLC patients in China, is preferred over gefitinib because of its dominant cost-utility. In addition, icotinib shows a good curative effect and safety, resulting in a strong demand in the Chinese market.
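
The incremental cost-utility ratio reported above is simply the cost difference per QALY difference; a one-line sketch using the figures quoted in the abstract:

```python
def icur(cost_a, cost_b, qaly_a, qaly_b):
    """Incremental cost-utility ratio: extra cost per extra QALY when
    choosing strategy A over strategy B."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Figures quoted in the abstract: icotinib vs. gefitinib over 5 years.
ratio = icur(10662.82, 13127.57, 0.279, 0.269)
# ratio is negative: icotinib is cheaper AND yields more QALYs.
```

A negative ICUR arising from lower cost together with higher QALYs is what the abstract calls a dominant cost-utility strategy.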

  12. Non-Gaussian Multi-resolution Modeling of Magnetosphere-Ionosphere Coupling Processes

    NASA Astrophysics Data System (ADS)

    Fan, M.; Paul, D.; Lee, T. C. M.; Matsuo, T.

    2016-12-01

    The most dynamic coupling between the magnetosphere and ionosphere occurs in the Earth's polar atmosphere. Our objective is to model scale-dependent stochastic characteristics of high-latitude ionospheric electric fields that originate from solar wind magnetosphere-ionosphere interactions. The Earth's high-latitude ionospheric electric field exhibits considerable variability, with increasing non-Gaussian characteristics at decreasing spatio-temporal scales. Accurately representing the underlying stochastic physical process through random field modeling is crucial not only for scientific understanding of the energy, momentum and mass exchanges between the Earth's magnetosphere and ionosphere, but also for modern technological systems including telecommunication, navigation, positioning and satellite tracking. While considerable effort has been made to characterize the large-scale variability of the electric field in the context of Gaussian processes, no attempt has been made so far to model the small-scale non-Gaussian stochastic process observed in the high-latitude ionosphere. We construct a novel random field model using spherical needlets as building blocks. The double localization of spherical needlets in both the spatial and frequency domains enables the model to capture the non-Gaussian and multi-resolutional characteristics of the small-scale variability. The estimation procedure is computationally feasible due to the utilization of an adaptive Gibbs sampler. We apply the proposed methodology to the computational simulation output from the Lyon-Fedder-Mobarry (LFM) global magnetohydrodynamics (MHD) magnetosphere model. Our non-Gaussian multi-resolution model characterizes significantly more energy associated with the small-scale ionospheric electric field variability in comparison to Gaussian models.
By accurately representing unaccounted-for additional energy and momentum sources to the Earth's upper atmosphere, our novel random field modeling approach will provide a viable remedy to the current numerical models' systematic biases resulting from the underestimation of high-latitude energy and momentum sources.

  13. Assessing the Potential of Land Use Modification to Mitigate Ambient NO2 and Its Consequences for Respiratory Health

    PubMed Central

    Rao, Meenakshi; George, Linda A.; Shandas, Vivek; Rosenstiel, Todd N.

    2017-01-01

    Understanding how local land use and land cover (LULC) shapes intra-urban concentrations of atmospheric pollutants—and thus human health—is a key component in designing healthier cities. Here, NO2 is modeled based on spatially dense summer and winter NO2 observations in Portland-Hillsboro-Vancouver (USA), and the spatial variation of NO2 with LULC is investigated using random forest, an ensemble data learning technique. The NO2 random forest model, together with BenMAP, is further used to develop a better understanding of the relationship among LULC, ambient NO2 and respiratory health. The impact of land use modifications on ambient NO2, and consequently on respiratory health, is also investigated using a sensitivity analysis. We find that NO2 associated with roadways and tree-canopied areas may be affecting annual incidence rates of asthma exacerbation in 4–12 year olds by +3000 per 100,000 and −1400 per 100,000, respectively. Our model shows that increasing local tree canopy by 5% may reduce local incidence rates of asthma exacerbation by 6%, indicating that targeted local tree-planting efforts may have a substantial impact on reducing city-wide incidence of respiratory distress. Our findings demonstrate the utility of random forest modeling in evaluating LULC modifications for enhanced respiratory health. PMID:28698523

  14. Preclinical neuroprotective actions of xenon and possible implications for human therapeutics: a narrative review.

    PubMed

    Maze, Mervyn

    2016-02-01

    The purpose of this report is to facilitate an understanding of the possible application of xenon for neuroprotection in critical care settings. This narrative review appraises the literature assessing the efficacy and safety of xenon in preclinical models of acute ongoing neurologic injury. Databases of the published literature (MEDLINE® and EMBASE™) were appraised for peer-reviewed manuscripts addressing the use of xenon in both preclinical models and disease states of acute ongoing neurologic injury. For randomized clinical trials not yet reported, the investigators' declarations in the National Institutes of Health clinical trials website were considered. While not a primary focus of this review, to date, xenon cannot be distinguished as superior for surgical anesthesia over existing alternatives in adults. Nevertheless, studies in a variety of preclinical disease models from multiple laboratories have consistently shown xenon's neuroprotective properties. These properties are enhanced in settings where xenon is combined with hypothermia. Small randomized clinical trials are underway to explore xenon's efficacy and safety in clinical settings of acute neurologic injury where hypothermia is the current standard of care. According to the evidence to date, the neuroprotective efficacy of xenon in preclinical models and its safety in clinical anesthesia set the stage for the launch of randomized clinical trials to determine whether these encouraging neuroprotective findings can be translated into clinical utility.

  15. Model of random center vortex lines in continuous 2+1-dimensional spacetime

    NASA Astrophysics Data System (ADS)

    Altarawneh, Derar; Engelhardt, Michael; Höllwieser, Roman

    2016-12-01

    A picture of confinement in QCD based on a condensate of thick vortices with fluxes in the center of the gauge group (center vortices) is studied. Previous concrete model realizations of this picture utilized a hypercubic space-time scaffolding, which, together with many advantages, also has some disadvantages, e.g., in the treatment of vortex topological charge. In the present work, we explore a center vortex model which does not rely on such a scaffolding. Vortices are represented by closed random lines in continuous 2+1-dimensional space-time. These random lines are modeled as being piecewise linear, and an ensemble is generated by Monte Carlo methods. The physical space in which the vortex lines are defined is a torus with periodic boundary conditions. Besides moving, growing, and shrinking of the vortex configurations, also reconnections are allowed. Our ensemble therefore contains not a fixed but a variable number of closed vortex lines. This is expected to be important for realizing the deconfining phase transition. We study both vortex percolation and the potential V(R) between a quark and antiquark as a function of distance R at different vortex densities, vortex segment lengths, reconnection conditions, and at different temperatures. We find three deconfinement phase transitions, as a function of density, as a function of vortex segment length, and as a function of temperature.

  16. Random and non-random mating populations: Evolutionary dynamics in meiotic drive.

    PubMed

    Sarkar, Bijan

    2016-01-01

    Game-theoretic tools are utilized to analyze a one-locus continuous selection model of sex-specific meiotic drive by considering nonequivalence of the viabilities of reciprocal heterozygotes that might be noticed at an imprinted locus. The model draws attention to the role of viability selections of different types to examine the stable nature of polymorphic equilibrium. A bridge between population genetics and evolutionary game theory has been built by applying the concept of the Fundamental Theorem of Natural Selection. In addition to pointing out the influences of male and female segregation ratios on selection, the configuration structure reveals some noted results, e.g., Hardy-Weinberg frequencies hold in replicator dynamics, faster evolution occurs at maximized variance in fitness, mixed Evolutionarily Stable Strategies (ESS) exist in asymmetric games, and evolution tends to approach not only a 1:1 sex ratio but also a 1:1 allele ratio at a particular gene locus. Through construction of replicator dynamics in the group selection framework, our selection model provides a redefined basis for game theory to incorporate non-random mating, where a mating parameter associated with population structure is dependent on the social structure. The model also exposes the fact that the number of polymorphic equilibria will depend on the algebraic expression of population structure. Copyright © 2015 Elsevier Inc. All rights reserved.
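
The replicator dynamics underpinning this kind of analysis can be sketched with a single Euler step (the payoff matrix and state below are illustrative toys, not the paper's meiotic-drive model):

```python
import numpy as np

def replicator_step(x, payoff, dt=0.01):
    """One Euler step of the replicator dynamics
    dx_i/dt = x_i * ((A x)_i - x.A.x) for strategy frequencies x."""
    fitness = payoff @ x            # per-strategy fitness (A x)_i
    mean_fitness = x @ fitness      # population mean fitness x.A.x
    return x + dt * x * (fitness - mean_fitness)

# Anti-coordination toy game: the rarer strategy earns the higher payoff.
A = np.array([[0.0, 2.0],
              [2.0, 0.0]])
x_next = replicator_step(np.array([0.2, 0.8]), A)
```

Frequencies remain on the simplex (they still sum to 1), and the rarer strategy's share grows toward the mixed ESS at 1:1, mirroring the 1:1 ratios highlighted in the abstract.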

  17. Evaluation of Automated Model Calibration Techniques for Residential Building Energy Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, J.; Polly, B.; Collis, J.

    2013-09-01

    This simulation study adapts and applies the general framework described in BESTEST-EX (Judkoff et al. 2010) for self-testing residential building energy model calibration methods. BEopt/DOE-2.2 is used to evaluate four mathematical calibration methods in the context of monthly, daily, and hourly synthetic utility data for a 1960s-era existing home in a cooling-dominated climate. The home's model inputs are assigned probability distributions representing uncertainty ranges, random selections are made from the uncertainty ranges to define 'explicit' input values, and synthetic utility billing data are generated using the explicit input values. The four calibration methods evaluated in this study are: an ASHRAE 1051-RP-based approach (Reddy and Maor 2006), a simplified simulated annealing optimization approach, a regression metamodeling optimization approach, and a simple output ratio calibration approach. The calibration methods are evaluated for monthly, daily, and hourly cases; various retrofit measures are applied to the calibrated models and the methods are evaluated based on the accuracy of predicted savings, computational cost, repeatability, automation, and ease of implementation.

  19. Immunology and Homeopathy. 3. Experimental Studies on Animal Models

    PubMed Central

    Bellavite, Paolo; Ortolani, Riccardo; Conforti, Anita

    2006-01-01

    A search of the literature and the experiments carried out by the authors of this review show that there are a number of animal models where the effect of homeopathic dilutions or the principles of homeopathic medicine have been tested. The results relate to immunostimulation by ultralow doses of antigens, immunological models of the ‘simile’, the regulation of acute or chronic inflammatory processes, and the use of homeopathic medicines in farming. The models utilized by different research groups are extremely heterogeneous and differ in the test medicines, the dilutions and the outcomes considered. Some experimental lines, particularly those utilizing mouse models of immunomodulation and the anti-inflammatory effects of homeopathic complex formulations, give support to a real effect of homeopathic high dilutions in animals, but often these data are of a preliminary nature and have not been independently replicated. The evidence emerging from animal models supports the traditional ‘simile’ rule, according to which ultralow doses of compounds that are pathogenic in high doses may paradoxically have a protective or curative effect. Despite a few encouraging observational studies, the effectiveness of homeopathic prevention or therapy of infections in veterinary medicine is not sufficiently supported by randomized and controlled trials. PMID:16786046

  20. Guidance for the utility of linear models in meta-analysis of genetic association studies of binary phenotypes.

    PubMed

    Cook, James P; Mahajan, Anubha; Morris, Andrew P

    2017-02-01

    Linear mixed models are increasingly used for the analysis of genome-wide association studies (GWAS) of binary phenotypes because they can efficiently and robustly account for population stratification and relatedness through inclusion of random effects for a genetic relationship matrix. However, the utility of linear (mixed) models in the context of meta-analysis of GWAS of binary phenotypes has not been previously explored. In this investigation, we present simulations to compare the performance of linear and logistic regression models under alternative weighting schemes in a fixed-effects meta-analysis framework, considering designs that incorporate variable case-control imbalance, confounding factors and population stratification. Our results demonstrate that linear models can be used for meta-analysis of GWAS of binary phenotypes, without loss of power, even in the presence of extreme case-control imbalance, provided that one of the following schemes is used: (i) effective sample size weighting of Z-scores or (ii) inverse-variance weighting of allelic effect sizes after conversion onto the log-odds scale. Our conclusions thus provide essential recommendations for the development of robust protocols for meta-analysis of binary phenotypes with linear models.
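
Scheme (i), effective-sample-size weighting of Z-scores, can be sketched as follows (this uses the widely adopted METAL-style convention for the effective sample size of a case-control study; the study-level numbers are hypothetical):

```python
import math

def effective_n(n_cases, n_controls):
    """Effective sample size of a case-control study:
    N_eff = 4 / (1/N_cases + 1/N_controls)."""
    return 4.0 / (1.0 / n_cases + 1.0 / n_controls)

def meta_z(z_scores, eff_ns):
    """Fixed-effects meta-analysis of per-study Z-scores with weights
    w_i = sqrt(N_eff,i):  Z_meta = sum(w_i z_i) / sqrt(sum(w_i^2))."""
    w = [math.sqrt(n) for n in eff_ns]
    num = sum(wi * zi for wi, zi in zip(w, z_scores))
    return num / math.sqrt(sum(wi * wi for wi in w))

# Two hypothetical balanced studies, each with Z = 1:
z_combined = meta_z([1.0, 1.0], [effective_n(50, 50), effective_n(50, 50)])
```

Weighting by effective rather than total sample size is what protects the combined Z-score from being distorted by studies with extreme case-control imbalance.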

  1. Economic evaluations and their use in infection prevention and control: a narrative review.

    PubMed

    Rennert-May, Elissa; Conly, John; Leal, Jenine; Smith, Stephanie; Manns, Braden

    2018-01-01

    The objective of this review is to provide a comprehensive overview of the different types of economic evaluations that can be utilized by Infection Prevention and Control practitioners with a particular focus on the use of the quality adjusted life year, and its associated challenges. We also highlight existing economic evaluations published within Infection Prevention and Control, research gaps and future directions. Narrative Review. To date the majority of economic evaluations within Infection Prevention and Control are considered partial economic evaluations. Acknowledging the challenges, which include variable utilities within infection prevention and control, a lack of randomized controlled trials, and difficulty in modelling infectious diseases in general, future economic evaluation studies should strive to be consistent with published guidelines for economic evaluations. This includes the use of quality adjusted life years. Further research is required to estimate utility scores of relevance within Infection Prevention and Control.

  2. Matched-filter algorithm for subpixel spectral detection in hyperspectral image data

    NASA Astrophysics Data System (ADS)

    Borough, Howard C.

    1991-11-01

    Hyperspectral imagery, spatial imagery with associated wavelength data for every pixel, offers a significant potential for improved detection and identification of certain classes of targets. The ability to make spectral identifications of objects which only partially fill a single pixel (due to range or small size) is of considerable interest. Multiband imagery such as Landsat's 5 and 7 band imagery has demonstrated significant utility in the past. Hyperspectral imaging systems with hundreds of spectral bands offer improved performance. To explore the application of different subpixel spectral detection algorithms, a synthesized set of hyperspectral image data (hypercubes) was generated utilizing NASA earth resources and other spectral data. The data were modified using LOWTRAN 7 to model the illumination, atmospheric contributions, attenuations and viewing geometry to represent a nadir view from 10,000 ft. altitude. The base hypercube (HC) represented 16 by 21 spatial pixels with 101 wavelength samples from 0.5 to 2.5 micrometers for each pixel. Insertions were made into the base data to provide random location, random pixel percentage, and random material. Fifteen different hypercubes were generated for blind testing of candidate algorithms. An algorithm utilizing a matched filter in the spectral dimension proved surprisingly good, yielding 100% detections for pixels filled greater than 40% with a standard camouflage paint, and a 50% probability of detection for pixels filled 20% with the paint, with no false alarms. The false alarm rate as a function of the number of spectral bands in the range from 101 to 12 bands was measured and found to increase from zero to 50%, illustrating the value of a large number of spectral bands. This test was on imagery without system noise; the next step is to incorporate typical system noise sources.
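
    A spectral matched filter of the kind described can be sketched as follows. This is the generic formulation (background-whitened projection onto the mean-removed target signature), not necessarily the authors' exact algorithm, and all names are illustrative:

```python
import numpy as np

def matched_filter_scores(cube, target, background):
    """Spectral matched filter: whiten by the background covariance and
    project each pixel spectrum onto the (mean-removed) target signature.
    Normalized so a pure target pixel scores ~1 and mean background ~0.
    cube: (rows, cols, bands); target: (bands,); background: (N, bands)."""
    mu = background.mean(axis=0)
    C = np.cov(background, rowvar=False) + 1e-6 * np.eye(background.shape[1])
    q = np.linalg.solve(C, target - mu)   # C^{-1} (t - mu)
    q /= (target - mu) @ q                # unit response on the pure target
    return (cube - mu) @ q
```

    A subpixel detection then reduces to thresholding the score image; partially filled pixels score between 0 and 1 in proportion to their fill fraction.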

  3. Effects of Enhancing School-Based Body Mass Index Screening Reports with Parent Education on Report Utility and Parental Intent To Modify Obesity Risk Factors.

    PubMed

    Bailey-Davis, Lisa; Peyer, Karissa L; Fang, Yinan; Kim, Jae-Kwang; Welk, Greg J

    2017-04-01

    School-based body mass index screenings (SBMIS) have been controversial. We aimed to determine if parents would indicate improved utility with SBMIS when the report included parent education and whether parental intent to modify obesity risk factors would vary with report type or child weight. A cluster-controlled trial was conducted with 31 elementary schools randomized to distribute a standard SBMIS report or the standard report plus education (SBMIS+). A random subsample of parents completed a mailed survey (731 SBMIS, 738 SBMIS+). Using a two-stage cluster sampling design, logistic regression models with a school-level random effect were used to assess differences between conditions and by weight category. Parents in the SBMIS+ condition vs. the standard condition were more likely to indicate that the report provided useful information (not significant) and an intent to help their child get enough sleep (p < 0.001). Parents of children who were overweight or obese were less likely than parents of children who were not to indicate that the report provided useful information about their child's weight status (p < 0.001) or access to resources (p < 0.05). However, these parents were more likely to plan a visit to a healthcare provider (p < 0.001) and to intend to limit sugar-sweetened beverages (p < 0.05). Parental education can enhance the utility of the SBMIS report and parental intention to modify at least one obesity risk factor. SBMIS reports prompted parents of children with overweight and obesity to seek clinical care and limit sugar-sweetened drinks.

  4. Genetic evaluation and selection response for growth in meat-type quail through random regression models using B-spline functions and Legendre polynomials.

    PubMed

    Mota, L F M; Martins, P G M A; Littiere, T O; Abreu, L R A; Silva, M A; Bonafé, C M

    2018-04-01

    The objective was to estimate (co)variance functions using random regression models (RRM) with Legendre polynomials, B-spline functions and multi-trait models, aimed at evaluating genetic parameters of growth traits in meat-type quail. A database containing the complete pedigree information of 7000 meat-type quail was utilized. The models included the fixed effects of contemporary group and generation. Direct additive genetic and permanent environmental effects, considered as random, were modeled using B-spline functions considering quadratic and cubic polynomials for each individual segment, and Legendre polynomials for age. Residual variances were grouped in four age classes. Direct additive genetic and permanent environmental effects were modeled using 2 to 4 B-spline segments, or by Legendre polynomials with orders of fit ranging from 2 to 4. The model with quadratic B-spline adjustment, using four segments for direct additive genetic and permanent environmental effects, was the most appropriate and parsimonious to describe the covariance structure of the data. The RRM using Legendre polynomials presented an underestimation of the residual variance. Lower heritability estimates were observed for multi-trait models in comparison with RRM for the evaluated ages. In general, the genetic correlations between measures of BW from hatching to 35 days of age decreased as the range between the evaluated ages increased. The genetic trend for BW was positive and significant along the selection generations. The genetic response to selection for BW in the evaluated ages presented greater values for RRM compared with multi-trait models. In summary, RRM using B-spline functions with four residual variance classes and segments provided the best fit for genetic evaluation of growth traits in meat-type quail, and RRM should be considered in genetic evaluation of breeding programs.
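
    The Legendre covariates used in such random regression models can be sketched as below. This is the standard construction (map age onto [-1, 1], Bonnet recurrence, usual normalization), with illustrative names, not code from the study:

```python
import numpy as np

def legendre_basis(age, age_min, age_max, order):
    """Covariates for a random regression model: map age onto [-1, 1] and
    evaluate the first `order` normalized Legendre polynomials via the
    Bonnet recurrence (n+1) P_{n+1} = (2n+1) x P_n - n P_{n-1}."""
    x = 2.0 * (np.asarray(age, dtype=float) - age_min) / (age_max - age_min) - 1.0
    P = [np.ones_like(x), x]
    for n in range(1, order - 1):
        P.append(((2 * n + 1) * x * P[n] - n * P[n - 1]) / (n + 1))
    # standard normalization sqrt((2n+1)/2)
    return np.column_stack([np.sqrt((2 * n + 1) / 2.0) * P[n] for n in range(order)])
```

    The resulting matrix multiplies the animal-specific regression coefficients whose (co)variances are the quantities being estimated.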

  5. The effectiveness of patient navigation to improve healthcare utilization outcomes: A meta-analysis of randomized controlled trials.

    PubMed

    Ali-Faisal, Sobia F; Colella, Tracey J F; Medina-Jaudes, Naomi; Benz Scott, Lisa

    2017-03-01

    To determine the effects of patient navigation (PN) on healthcare utilization outcomes using meta-analysis, and to assess the quality of the evidence. Medical and social science databases were searched for randomized controlled trials published in English between 1989 and May 2015. The review process was guided by PRISMA. Included studies were assessed for quality using the Downs and Black tool. Data were extracted to assess the effect of navigation on: health screening rates, diagnostic resolution, cancer care follow-up treatment adherence, and attendance of care events. Random-effects models were used to compute risk ratios, and I² statistics determined the impact of heterogeneity. Of 3985 articles screened, 25 articles met inclusion criteria. Compared to usual care, patients who received PN were significantly more likely to access health screening (OR 2.48, 95% CI, 1.93-3.18, P<0.00001) and attend a recommended care event (OR 2.55, 95% CI, 1.27-5.10, P<0.01). PN was favoured to increase adherence to cancer care follow-up treatment and obtain diagnoses. Most studies involved trained lay navigators (n=12) compared to health professionals (n=9). PN is effective at increasing screening rates and completion of care events, and is an effective intervention for use in healthcare.
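
    The random-effects pooling underlying such a meta-analysis can be sketched with the DerSimonian-Laird estimator. This is the conventional method and may differ in detail from the software the authors used; the function name is illustrative:

```python
import math

def dersimonian_laird(effects, se):
    """Random-effects pooling of study effect sizes (e.g. log odds ratios)
    with the DerSimonian-Laird estimate of between-study variance tau^2,
    plus the I^2 heterogeneity statistic (in percent)."""
    w = [1.0 / s ** 2 for s in se]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    Q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (Q - df) / C) if C > 0 else 0.0
    w_re = [1.0 / (s ** 2 + tau2) for s in se]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (Q - df) / Q) * 100.0 if Q > 0 else 0.0
    return pooled, se_pooled, i2
```

    With tau² = 0 the estimator reduces to fixed-effects inverse-variance weighting; heterogeneity widens the pooled confidence interval.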

  6. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    NASA Technical Reports Server (NTRS)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
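
    The zero-inflated Beta density at the core of the model can be sketched as follows. Only the observation-level density is shown; the Bayesian mixed-model machinery is omitted, and the names are illustrative:

```python
import math

def zib_logpdf(y, pi, a, b):
    """Log-density of a zero-inflated Beta observation on [0, 1):
    the value is exactly zero with probability pi, and otherwise
    follows a Beta(a, b) distribution."""
    if y == 0.0:
        return math.log(pi)
    # log Beta function via log-gamma
    log_B = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return (math.log(1.0 - pi)
            + (a - 1.0) * math.log(y)
            + (b - 1.0) * math.log(1.0 - y)
            - log_B)
```

    In the paper's setting y would be the scaled log10(Pc), with pi, a and b tied to event-level random effects.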

  7. Early Workplace Communication and Problem Solving to Prevent Back Disability: Results of a Randomized Controlled Trial Among High-Risk Workers and Their Supervisors.

    PubMed

    Linton, Steven J; Boersma, Katja; Traczyk, Michal; Shaw, William; Nicholas, Michael

    2016-06-01

    Purpose There is a clear need for interventions that successfully prevent the development of disability due to back pain. We hypothesized that an intervention aimed at both the worker and the workplace could be effective. Hence, we tested the effects of a new early intervention, based on the misdirected problem solving model, aimed at both workers at risk of long-term impairments and their workplace. Methods Supervisors of volunteers with back pain, no red flags, and a high score on a screen (Örebro Musculoskeletal Screening Questionnaire) were randomized to either an evidence based treatment as usual (TAU) or to a worker and workplace package (WWP). The WWP intervention included communication and problem solving skills for the patient and their immediate supervisor. The key outcome variables of work absence due to pain, health-care utilization, perceived health, and pain intensity were collected before, after and at a 6 month follow up. Results The WWP showed significantly larger improvements relative to the TAU for work absence due to pain, perceived health, and health-care utilization. Both groups improved on pain ratings but there was no significant difference between the groups. The WWP not only had significantly fewer participants utilizing health care and work absence due to pain, but the number of health care visits and days absent were also significantly lower than the TAU. Conclusions The WWP with problem solving and communication skills resulted in fewer days off work, fewer health care visits and better perceived health. This supports the misdirected problem solving model and indicates that screening combined with an active intervention to enhance skills is quite successful and likely cost-effective. Future research should replicate and extend these findings with health-economic analyses.

  8. Toward negative Poisson's ratio composites: Investigation of the auxetic behavior of fibrous networks

    NASA Astrophysics Data System (ADS)

    Tatlier, Mehmet Seha

    Random fibrous networks can be found among natural and synthetic materials. Some of these random fibrous networks possess a negative Poisson's ratio and are commonly called auxetic materials. The governing mechanisms behind this counterintuitive property in random networks are yet to be understood, and this kind of auxetic material remains widely under-explored. Moreover, most synthetic auxetic materials suffer from low strength. This shortcoming can be rectified by developing high-strength auxetic composites. The process of embedding auxetic random fibrous networks in a polymer matrix is an attractive alternate route to the manufacture of auxetic composites; however, before such an approach can be developed, a methodology for designing fibrous networks with the desired negative Poisson's ratios must first be established. This requires an understanding of the factors which bring about negative Poisson's ratios in these materials. In this study, a numerical model is presented in order to investigate the auxetic behavior in compressed random fiber networks. Finite element analyses of three-dimensional stochastic fiber networks were performed to gain insight into the effects of parameters such as network anisotropy, network density, and degree of network compression on the out-of-plane Poisson's ratio and Young's modulus. The simulation results suggest that compression is the critical parameter that gives rise to negative Poisson's ratio, while anisotropy significantly promotes the auxetic behavior. This model can be utilized to design fibrous auxetic materials and to evaluate the feasibility of developing auxetic composites by using auxetic fibrous networks as the reinforcing layer.

  9. Is prostate cancer screening cost-effective? A microsimulation model of prostate-specific antigen-based screening for British Columbia, Canada

    PubMed Central

    Pataky, Reka; Gulati, Roman; Etzioni, Ruth; Black, Peter; Chi, Kim N.; Coldman, Andrew J.; Pickles, Tom; Tyldesley, Scott; Peacock, Stuart

    2015-01-01

    Prostate-specific antigen (PSA) screening for prostate cancer may reduce mortality, but it incurs considerable risk of overdiagnosis and potential harm to quality of life. Our objective was to evaluate the cost-effectiveness of PSA screening, with and without adjustment for quality of life, for the British Columbia (BC) population. We adapted an existing natural history model using BC incidence, treatment, cost and mortality patterns. The modeled mortality benefit of screening derives from a stage-shift mechanism, assuming mortality reduction consistent with the European Study of Randomized Screening for Prostate Cancer. The model projected outcomes for 40 year-old men under 14 combinations of screening ages and frequencies. Cost and utility estimates were explored with deterministic sensitivity analysis. The incremental cost-effectiveness of regular screening ranged from $36,300/LYG, for screening every four years from ages 55-69, to $588,300/LYG, for screening every two years from ages 40-74. The marginal benefits of increasing screening frequency to two years or starting screening at age 40 were small and came at significant cost. After utility adjustment, all screening strategies resulted in a loss of QALYs; however, this result was very sensitive to utility estimates. Plausible outcomes under a range of screening strategies inform discussion of prostate cancer screening policy in BC and similar jurisdictions. Screening may be cost-effective but the sensitivity of results to utility values suggests individual preferences for quality versus quantity of life should be a key consideration. PMID:24443367

  10. Enhanced risk prediction model for emergency department use and hospitalizations in patients in a primary care medical home.

    PubMed

    Takahashi, Paul Y; Heien, Herbert C; Sangaralingham, Lindsey R; Shah, Nilay D; Naessens, James M

    2016-07-01

    With the advent of healthcare payment reform, identifying high-risk populations has become more important to providers. Existing risk-prediction models often focus on chronic conditions. This study sought to better understand other factors to improve identification of the highest-risk population. A retrospective cohort study of a paneled primary care population utilized 2010 data to calibrate a risk prediction model of hospital and emergency department (ED) use in 2011. Data were randomly split into development and validation data sets. We compared the enhanced model containing the additional risk predictors with the Minnesota medical tiering model. The study was conducted in the primary care practice of an integrated delivery system at an academic medical center in Rochester, Minnesota. The study focus was primary care medical home patients in 2010 and 2011 (n = 84,752), with the primary outcome of subsequent hospitalization or ED visit. A total of 42,384 individuals were used to derive the enhanced risk-prediction model and 42,368 individuals to validate it. Predictors included Adjusted Clinical Groups-based Minnesota medical tiering, patient demographics, insurance status, and prior-year healthcare utilization. Additional variables included specific mental and medical conditions, use of high-risk medications, and body mass index. The area under the curve in the enhanced model was 0.705 (95% CI, 0.698-0.712) compared with 0.662 (95% CI, 0.656-0.669) in the Minnesota medical tiering-only model. New high-risk patients in the enhanced model were more likely to have lack of health insurance, presence of Medicaid, diagnosed depression, and prior ED utilization. An enhanced model including additional healthcare-related factors improved the prediction of risk of hospitalization or ED visit.
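
    The discrimination metric reported here, the area under the ROC curve, can be computed from model scores with a short sketch based on the Mann-Whitney identity (illustrative, not the study's code):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity: the
    probability that a randomly chosen positive case outscores a
    randomly chosen negative case (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

    An AUC of 0.705 versus 0.662 thus means the enhanced model ranks a random future utilizer above a random non-utilizer about 4 percentage points more often.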

  11. Identifying relevant hyperspectral bands using Boruta: a temporal analysis of water hyacinth biocontrol

    NASA Astrophysics Data System (ADS)

    Agjee, Na'eem Hoosen; Ismail, Riyad; Mutanga, Onisimo

    2016-10-01

    Water hyacinth plants (Eichhornia crassipes) are threatening freshwater ecosystems throughout Africa. The Neochetina spp. weevils are seen as an effective solution that can combat the proliferation of the invasive alien plant. We aimed to determine if multitemporal hyperspectral data could be utilized to detect the efficacy of the biocontrol agent. The random forest (RF) algorithm was used to classify variable infestation levels for 6 weeks using: (1) all the hyperspectral bands, (2) bands selected by the recursive feature elimination (RFE) algorithm, and (3) bands selected by the Boruta algorithm. Results showed that the RF model using all the bands successfully produced low classification errors (12.50% to 32.29%) for all 6 weeks. However, the RF model using Boruta-selected bands produced lower classification errors (8.33% to 15.62%) than the RF model using all the bands or bands selected by the RFE algorithm (11.25% to 21.25%) for all 6 weeks, highlighting the utility of Boruta as an all-relevant band-selection algorithm. The relevant bands selected by Boruta included: 352, 754, 770, 771, 775, 781, 782, 783, 786, and 789 nm. It was concluded that RF coupled with the Boruta band-selection algorithm can be utilized to undertake multitemporal monitoring of variable infestation levels on water hyacinth plants.

  12. An Active RBSE Framework to Generate Optimal Stimulus Sequences in a BCI for Spelling

    NASA Astrophysics Data System (ADS)

    Moghadamfalahi, Mohammad; Akcakaya, Murat; Nezamfar, Hooman; Sourati, Jamshid; Erdogmus, Deniz

    2017-10-01

    A class of brain computer interfaces (BCIs) employs noninvasive recordings of electroencephalography (EEG) signals to enable users with severe speech and motor impairments to interact with their environment and social network. For example, EEG-based BCIs for typing popularly utilize event related potentials (ERPs) for inference. Presentation paradigms in current ERP-based letter-by-letter typing BCIs typically query the user with an arbitrary subset of characters. However, typing accuracy and typing speed can potentially be enhanced with more informed subset selection and flash assignment. In this manuscript, we introduce the active recursive Bayesian state estimation (active-RBSE) framework for inference and sequence optimization. Prior to presentation in each iteration, rather than showing a subset of randomly selected characters, the developed framework optimally selects a subset based on a query function. Selected queries are made adaptively specialized for users during each intent detection. Through a simulation-based study, we assess the effect of active-RBSE on the performance of a language-model assisted typing BCI in terms of typing speed and accuracy. To provide a baseline for comparison, we also utilize standard presentation paradigms, namely the row-and-column matrix presentation paradigm and random rapid serial visual presentation paradigms. The results show that utilization of active-RBSE can enhance the online performance of the system, both in terms of typing accuracy and speed.

  13. Association Between Cognitive Decline in Older Adults and Utilization of Primary Care Physician Services in Pennsylvania

    PubMed Central

    Fowler, Nicole R.; Morrow, Lisa A.; Tu, Li-Chuan; Landsittel, Douglas P.; Snitz, Beth E.; Rodriquez, Eric G.; Saxton, Judith A.

    2012-01-01

    OBJECTIVE To assess the relationship between cognitive decline of older patients (≥65 years) and utilization of primary care physician (PCP) services over 24 months. DESIGN Retrospective analysis of prospectively collected data from a cluster randomized trial that took place from 2006 to 2010 and investigated the relationship between formal neuropsychological evaluation and patient outcomes in primary care. SETTING Twenty-four PCPs in 11 practices in southwestern Pennsylvania. Most practices were suburban and included more than 5 PCPs. PARTICIPANTS A sample of 423 primary care patients 65 years or older. MEASUREMENTS The association between the number of PCP visits and a decline in cognitive status, as determined by multivariable analyses that controlled for patient-level, physician-level, and practice-level factors (e.g., patient age, comorbidities, and symptoms of depression; practice location and size; PCP age and sex) and used a linear mixed model with a random intercept to adjust for clustering. RESULTS Over a two-year follow-up, 199 patients (47.0%) experienced a decline in cognitive status. Patients with a cognitive decline had a mean of 0.69 more PCP visits than did patients without a cognitive decline (P<0.05). CONCLUSIONS Early signs of cognitive decline may be an indicator of greater utilization of primary care. Given the demographic trends, more PCPs are likely to be needed to meet the increasing needs of the older population. PMID:22798988

  14. Adaptive adjustment of the randomization ratio using historical control data

    PubMed Central

    Hobbs, Brian P.; Carlin, Bradley P.; Sargent, Daniel J.

    2013-01-01

    Background Prospective trial design often occurs in the presence of “acceptable” [1] historical control data. Typically, these data are only utilized for treatment comparison in a posteriori retrospective analysis to estimate population-averaged effects in a random-effects meta-analysis. Purpose We propose and investigate an adaptive trial design in the context of an actual randomized controlled colorectal cancer trial. This trial, originally reported by Goldberg et al. [2], succeeded a similar trial reported by Saltz et al. [3], and used a control therapy identical to that tested (and found beneficial) in the Saltz trial. Methods The proposed trial implements an adaptive randomization procedure for allocating patients aimed at balancing total information (concurrent and historical) among the study arms. This is accomplished by assigning more patients to receive the novel therapy in the absence of strong evidence for heterogeneity among the concurrent and historical controls. Allocation probabilities adapt as a function of the effective historical sample size (EHSS) characterizing relative informativeness, defined in the context of a piecewise exponential model for evaluating time to disease progression. Commensurate priors [4] are utilized to assess historical and concurrent heterogeneity at interim analyses and to borrow strength from the historical data in the final analysis. The adaptive trial’s frequentist properties are simulated using the actual patient-level historical control data from the Saltz trial and the actual enrollment dates for patients enrolled into the Goldberg trial. Results Assessing concurrent and historical heterogeneity at interim analyses and balancing total information with the adaptive randomization procedure lead to trials that on average assign more new patients to the novel treatment when the historical controls are unbiased or slightly biased compared to the concurrent controls. 
Large magnitudes of bias lead to approximately equal allocation of patients among the treatment arms. Using the proposed commensurate prior model to borrow strength from the historical data, after balancing total information with the adaptive randomization procedure, provides admissible estimators of the novel treatment effect with desirable bias-variance trade-offs. Limitations Adaptive randomization methods in general are sensitive to population drift and more suitable for trials that initiate with gradual enrollment. Balancing information among study arms in time-to-event analyses is difficult in the presence of informative right-censoring. Conclusions The proposed design could prove important in trials that follow recent evaluations of a control therapy. Efficient use of the historical controls is especially important in contexts where reliance on pre-existing information is unavoidable because the control therapy is exceptionally hazardous, expensive, or the disease is rare. PMID:23690095

  15. Adaptive adjustment of the randomization ratio using historical control data.

    PubMed

    Hobbs, Brian P; Carlin, Bradley P; Sargent, Daniel J

    2013-01-01

    Prospective trial design often occurs in the presence of 'acceptable' historical control data. Typically, these data are only utilized for treatment comparison in a posteriori retrospective analysis to estimate population-averaged effects in a random-effects meta-analysis. We propose and investigate an adaptive trial design in the context of an actual randomized controlled colorectal cancer trial. This trial, originally reported by Goldberg et al., succeeded a similar trial reported by Saltz et al., and used a control therapy identical to that tested (and found beneficial) in the Saltz trial. The proposed trial implements an adaptive randomization procedure for allocating patients aimed at balancing total information (concurrent and historical) among the study arms. This is accomplished by assigning more patients to receive the novel therapy in the absence of strong evidence for heterogeneity among the concurrent and historical controls. Allocation probabilities adapt as a function of the effective historical sample size (EHSS), characterizing relative informativeness defined in the context of a piecewise exponential model for evaluating time to disease progression. Commensurate priors are utilized to assess historical and concurrent heterogeneity at interim analyses and to borrow strength from the historical data in the final analysis. The adaptive trial's frequentist properties are simulated using the actual patient-level historical control data from the Saltz trial and the actual enrollment dates for patients enrolled into the Goldberg trial. Assessing concurrent and historical heterogeneity at interim analyses and balancing total information with the adaptive randomization procedure lead to trials that on average assign more new patients to the novel treatment when the historical controls are unbiased or slightly biased compared to the concurrent controls. Large magnitudes of bias lead to approximately equal allocation of patients among the treatment arms. 
Using the proposed commensurate prior model to borrow strength from the historical data, after balancing total information with the adaptive randomization procedure, provides admissible estimators of the novel treatment effect with desirable bias-variance trade-offs. Adaptive randomization methods in general are sensitive to population drift and more suitable for trials that initiate with gradual enrollment. Balancing information among study arms in time-to-event analyses is difficult in the presence of informative right-censoring. The proposed design could prove important in trials that follow recent evaluations of a control therapy. Efficient use of the historical controls is especially important in contexts where reliance on preexisting information is unavoidable because the control therapy is exceptionally hazardous, expensive, or the disease is rare.

  16. Hardware architecture for projective model calculation and false match refining using random sample consensus algorithm

    NASA Astrophysics Data System (ADS)

    Azimi, Ehsan; Behrad, Alireza; Ghaznavi-Ghoushchi, Mohammad Bagher; Shanbehzadeh, Jamshid

    2016-11-01

    The projective model is an important mapping function for the calculation of global transformation between two images. However, its hardware implementation is challenging because of a large number of coefficients with different required precisions for fixed point representation. A VLSI hardware architecture is proposed for the calculation of a global projective model between input and reference images and refining false matches using random sample consensus (RANSAC) algorithm. To make the hardware implementation feasible, it is proved that the calculation of the projective model can be divided into four submodels comprising two translations, an affine model and a simpler projective mapping. This approach makes the hardware implementation feasible and considerably reduces the required number of bits for fixed point representation of model coefficients and intermediate variables. The proposed hardware architecture for the calculation of a global projective model using the RANSAC algorithm was implemented using Verilog hardware description language and the functionality of the design was validated through several experiments. The proposed architecture was synthesized by using an application-specific integrated circuit digital design flow utilizing 180-nm CMOS technology as well as a Virtex-6 field programmable gate array. Experimental results confirm the efficiency of the proposed hardware architecture in comparison with software implementation.
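
    The RANSAC loop that the hardware implements can be sketched in software. It is shown here for a 2-D affine submodel rather than the full four-submodel projective decomposition, and all names are illustrative:

```python
import numpy as np

def ransac_affine(src, dst, iters=500, tol=1.0, seed=0):
    """RANSAC for a global 2-D affine map between matched point sets:
    repeatedly fit the model to a random minimal sample of 3 matches,
    count inliers, then refit on the best inlier set. Returns the 3x2
    coefficient matrix M ([x, y, 1] @ M ~ dst) and the inlier mask."""
    rng = np.random.default_rng(seed)
    n = len(src)
    A = np.hstack([src, np.ones((n, 1))])       # (n, 3): [x, y, 1]
    best = np.zeros(n, dtype=bool)
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)
        M, *_ = np.linalg.lstsq(A[idx], dst[idx], rcond=None)
        inliers = np.linalg.norm(A @ M - dst, axis=1) < tol
        if inliers.sum() > best.sum():
            best = inliers
    M, *_ = np.linalg.lstsq(A[best], dst[best], rcond=None)
    return M, best
```

    The hardware challenge the paper addresses is exactly the fixed-point precision of the model coefficients computed inside this loop.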

  17. Modeling and Bayesian parameter estimation for shape memory alloy bending actuators

    NASA Astrophysics Data System (ADS)

    Crews, John H.; Smith, Ralph C.

    2012-04-01

    In this paper, we employ a homogenized energy model (HEM) for shape memory alloy (SMA) bending actuators. Additionally, we utilize a Bayesian method for quantifying parameter uncertainty. The system consists of an SMA wire attached to a flexible beam. As the actuator is heated, the beam bends, providing endoscopic motion. The model parameters are fit to experimental data using an ordinary least-squares approach. The uncertainty in the fit model parameters is then quantified using Markov Chain Monte Carlo (MCMC) methods. The MCMC algorithm provides bounds on the parameters, which will ultimately be used in robust control algorithms. One purpose of the paper is to test the feasibility of the Random Walk Metropolis algorithm, the MCMC method used here.
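
    The Random Walk Metropolis algorithm whose feasibility the paper tests can be sketched generically. This is the standard textbook formulation with illustrative names, not the authors' implementation:

```python
import math
import random

def random_walk_metropolis(log_post, x0, steps=20000, scale=1.0, seed=1):
    """Random Walk Metropolis: propose x' = x + N(0, scale^2) and accept
    with probability min(1, p(x') / p(x)); on rejection, keep x."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:   # Metropolis ratio in log space
            x, lp = prop, lp_prop
        chain.append(x)
    return chain
```

    Quantiles of the resulting chain give the parameter bounds fed into the robust control design.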

  18. Emergent central pattern generator behavior in gap-junction-coupled Hodgkin-Huxley style neuron model.

    PubMed

    Horn, Kyle G; Memelli, Heraldo; Solomon, Irene C

    2012-01-01

    Most models of central pattern generators (CPGs) involve two distinct nuclei mutually inhibiting one another via synapses. Here, we present a single-nucleus model of biologically realistic Hodgkin-Huxley neurons with random gap junction coupling. Despite no explicit division of neurons into two groups, we observe a spontaneous division of neurons into two distinct firing groups. In addition, we also demonstrate this phenomenon in a simplified version of the model, highlighting the importance of afterhyperpolarization currents (I_AHP) to CPGs utilizing gap junction coupling. The properties of these CPGs also appear sensitive to gap junction conductance, probability of gap junction coupling between cells, topology of gap junction coupling, and, to a lesser extent, input current into our simulated nucleus.

  19. Geographic Gossip: Efficient Averaging for Sensor Networks

    NASA Astrophysics Data System (ADS)

    Dimakis, Alexandros D. G.; Sarwate, Anand D.; Wainwright, Martin J.

    Gossip algorithms for distributed computation are attractive due to their simplicity, distributed nature, and robustness in noisy and uncertain environments. However, using standard gossip algorithms can lead to a significant waste of energy by repeatedly recirculating redundant information. For realistic sensor network model topologies like grids and random geometric graphs, the inefficiency of gossip schemes is related to the slow mixing times of random walks on the communication graph. We propose and analyze an alternative gossiping scheme that exploits geographic information. By utilizing geographic routing combined with a simple resampling method, we demonstrate substantial gains over previously proposed gossip protocols. For regular graphs such as the ring or grid, our algorithm improves standard gossip by factors of $n$ and $\sqrt{n}$ respectively. For the more challenging case of random geometric graphs, our algorithm computes the true average to accuracy $\epsilon$ using $O(\frac{n^{1.5}}{\sqrt{\log n}} \log \epsilon^{-1})$ radio transmissions, which yields a $\sqrt{\frac{n}{\log n}}$ factor improvement over standard gossip algorithms. We illustrate these theoretical results with experimental comparisons between our algorithm and standard methods as applied to various classes of random fields.
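
    The baseline that geographic gossip improves upon can be sketched as standard pairwise randomized gossip on a ring (the topology, sizes, and round count below are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def gossip_average(values, edges, n_rounds=40000, seed=1):
    """Standard pairwise gossip: each round a random edge averages its endpoints."""
    rng = np.random.default_rng(seed)
    x = np.asarray(values, dtype=float).copy()
    for _ in range(n_rounds):
        i, j = edges[rng.integers(len(edges))]
        x[i] = x[j] = 0.5 * (x[i] + x[j])  # local averaging preserves the global sum
    return x

n = 16
ring = [(i, (i + 1) % n) for i in range(n)]  # ring: a slow-mixing case from the text
vals = np.arange(n, dtype=float)
out = gossip_average(vals, ring)             # every entry approaches the true mean
```

    The many rounds needed on the ring illustrate the slow-mixing inefficiency that geographic routing is designed to avoid.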

  20. Health utility of patients with advanced gastrointestinal stromal tumors (GIST) after failure of imatinib and sunitinib: findings from GRID, a randomized, double-blind, placebo-controlled phase III study of regorafenib versus placebo.

    PubMed

    Poole, Chris D; Connolly, Mark P; Chang, Jane; Currie, Craig J

    2015-07-01

    In this analysis we report health utility, derived from the EuroQol-5D (EQ-5D), for progression-free (PF) and progressive disease health states in patients with advanced gastrointestinal stromal tumors (GIST) refractory to imatinib and sunitinib therapy. Data were analyzed from a phase III trial conducted at 57 hospitals in 17 countries (trial registration number, NCT01271712). Patients with advanced GIST were randomized (2:1) to receive blinded treatment using oral regorafenib 160 mg daily or placebo, plus best supportive care (BSC) in both groups, for the first 3 weeks of each 4-week cycle. EQ-5D-3L was administered on day 1 of each cycle, before contact with their physician and before any study-related procedures. The effect of disease progression on EQ-5D utility was tested with paired-samples comparison and general linear mixed modeling (GLMM). One hundred and eighty-five patients [93 % of the intention-to-treat (ITT) population] completed 803 EQ-5D questionnaires: 77.7 % in the progression-free (PF) state, 6.5 % at progression, 13.9 % following first progression, and 1.9 % after second progression. Mean baseline utility was 0.767 (SD 0.221) with no significant between-group differences for active treatment and BSC. Mean utility in the first post-progression health state was 0.647 (SD 0.343), a decrease of 0.120 from the PF state (paired samples t test, p = 0.001), indicating significantly impaired health-related quality of life after confirmed disease progression. GLMM showed no effect of study treatment or cycle number on utility. We demonstrate a significant and clinically meaningful difference in health state utility values between PF and progression. Utility values remained stable over successive regorafenib cycles after controlling for disease status and treatment type.

  1. Utilization of services in a randomized trial testing phone- and web-based interventions for smoking cessation.

    PubMed

    Zbikowski, Susan M; Jack, Lisa M; McClure, Jennifer B; Deprey, Mona; Javitz, Harold S; McAfee, Timothy A; Catz, Sheryl L; Richards, Julie; Bush, Terry; Swan, Gary E

    2011-05-01

    Phone counseling has become standard for behavioral smoking cessation treatment. Newer options include Web and integrated phone-Web treatment. No prior research, to our knowledge, has systematically compared the effectiveness of these three treatment modalities in a randomized trial. Understanding how utilization varies by mode, the impact of utilization on outcomes, and predictors of utilization across each mode could lead to improved treatments. One thousand two hundred and two participants were randomized to phone, Web, or combined phone-Web cessation treatment. Services varied by modality and were tracked using automated systems. All participants received 12 weeks of varenicline, printed guides, an orientation call, and access to a phone supportline. Self-report data were collected at baseline and 6-month follow-up. Overall, participants utilized phone services more often than the Web-based services. Among treatment groups with Web access, a significant proportion logged in only once (37% phone-Web, 41% Web), and those in the phone-Web group logged in less often than those in the Web group (mean = 2.4 vs. 3.7, p = .0001). Use of the phone also was correlated with increased use of the Web. In multivariate analyses, greater use of the phone- or Web-based services was associated with higher cessation rates. Finally, older age and the belief that certain treatments could improve success were consistent predictors of greater utilization across groups. Other predictors varied by treatment group. Opportunities for enhancing treatment utilization exist, particularly for Web-based programs. Increasing utilization more broadly could result in better overall treatment effectiveness for all intervention modalities.

  2. Random access to mobile networks with advanced error correction

    NASA Technical Reports Server (NTRS)

    Dippold, Michael

    1990-01-01

    A random access scheme for unreliable data channels is investigated in conjunction with an adaptive Hybrid-II Automatic Repeat Request (ARQ) scheme using Rate Compatible Punctured Codes (RCPC) Forward Error Correction (FEC). A simple scheme with fixed frame length and equal slot sizes is chosen, and reservation is implicit by the first packet transmitted randomly in a free slot, similar to Reservation Aloha. This allows the further transmission of redundancy if the last decoding attempt failed. Results show that high channel utilization and superior throughput can be achieved with this scheme, which also has quite low implementation complexity. For the example of an interleaved Rayleigh channel with soft-decision decoding, utilization and mean delay are calculated. A utilization of 40 percent may be achieved for a frame with the number of slots equal to half the station number under high traffic load. The effects of feedback channel errors and some countermeasures are discussed.
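
    For context, the utilization of plain slotted ALOHA (without the reservation or FEC mechanisms of the paper) can be simulated in a few lines; all parameters below are illustrative:

```python
import random

def slotted_aloha_utilization(n_stations=50, p=None, n_slots=50000, seed=2):
    """Fraction of slots carrying exactly one transmission (a success)."""
    rng = random.Random(seed)
    p = 1.0 / n_stations if p is None else p  # p = 1/N maximizes throughput
    successes = 0
    for _ in range(n_slots):
        transmitters = sum(rng.random() < p for _ in range(n_stations))
        successes += transmitters == 1        # two or more transmitters collide
    return successes / n_slots

util = slotted_aloha_utilization()  # theory: N*p*(1-p)**(N-1), about 1/e at p=1/N
```

    The simulated ~37% ceiling of plain slotted ALOHA is the kind of baseline that implicit reservation schemes improve upon.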

  3. Maintaining homeostasis by decision-making.

    PubMed

    Korn, Christoph W; Bach, Dominik R

    2015-05-01

    Living organisms need to maintain energetic homeostasis. For many species, this implies taking actions with delayed consequences. For example, humans may have to decide between foraging for high-calorie but hard-to-get, and low-calorie but easy-to-get food, under threat of starvation. Homeostatic principles prescribe decisions that maximize the probability of sustaining appropriate energy levels across the entire foraging trajectory. Here, predictions from biological principles contrast with predictions from economic decision-making models based on maximizing the utility of the endpoint outcome of a choice. To empirically arbitrate between the predictions of biological and economic models for individual human decision-making, we devised a virtual foraging task in which players chose repeatedly between two foraging environments, lost energy by the passage of time, and gained energy probabilistically according to the statistics of the environment they chose. Reaching zero energy was framed as starvation. We used the mathematics of random walks to derive endpoint outcome distributions of the choices. This also furnished equivalent lotteries, presented in a purely economic, casino-like frame, in which starvation corresponded to winning nothing. Bayesian model comparison showed that--in both the foraging and the casino frames--participants' choices depended jointly on the probability of starvation and the expected endpoint value of the outcome, but could not be explained by economic models based on combinations of statistical moments or on rank-dependent utility. This implies that under precisely defined constraints biological principles are better suited to explain human decision-making than economic models based on endpoint utility maximization.
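
    The random-walk derivation of endpoint outcomes can be illustrated with a small Monte Carlo version of such a foraging task (the energy levels, gains, and horizon are made-up numbers, not the study's design):

```python
import random

def forage(energy=10, gain=2, loss=1, p_gain=0.5, n_steps=20, n_trials=20000, seed=3):
    """Monte Carlo endpoint distribution of an energy random walk; zero is absorbing."""
    rng = random.Random(seed)
    starved, endpoints = 0, []
    for _ in range(n_trials):
        e = energy
        for _ in range(n_steps):
            e -= loss                  # the passage of time costs energy
            if rng.random() < p_gain:
                e += gain              # probabilistic foraging gain
            if e <= 0:                 # reaching zero energy is starvation
                starved += 1
                break
        endpoints.append(max(e, 0))
    return starved / n_trials, endpoints

p_starve, ends = forage()  # starvation probability and endpoint outcome sample
```

    The pair (p_starve, distribution of ends) corresponds to the two quantities the study found jointly drive choices: probability of starvation and expected endpoint value.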

  4. Maintaining Homeostasis by Decision-Making

    PubMed Central

    Korn, Christoph W.; Bach, Dominik R.

    2015-01-01

    Living organisms need to maintain energetic homeostasis. For many species, this implies taking actions with delayed consequences. For example, humans may have to decide between foraging for high-calorie but hard-to-get, and low-calorie but easy-to-get food, under threat of starvation. Homeostatic principles prescribe decisions that maximize the probability of sustaining appropriate energy levels across the entire foraging trajectory. Here, predictions from biological principles contrast with predictions from economic decision-making models based on maximizing the utility of the endpoint outcome of a choice. To empirically arbitrate between the predictions of biological and economic models for individual human decision-making, we devised a virtual foraging task in which players chose repeatedly between two foraging environments, lost energy by the passage of time, and gained energy probabilistically according to the statistics of the environment they chose. Reaching zero energy was framed as starvation. We used the mathematics of random walks to derive endpoint outcome distributions of the choices. This also furnished equivalent lotteries, presented in a purely economic, casino-like frame, in which starvation corresponded to winning nothing. Bayesian model comparison showed that—in both the foraging and the casino frames—participants’ choices depended jointly on the probability of starvation and the expected endpoint value of the outcome, but could not be explained by economic models based on combinations of statistical moments or on rank-dependent utility. This implies that under precisely defined constraints biological principles are better suited to explain human decision-making than economic models based on endpoint utility maximization. PMID:26024504

  5. Decision-Making in Agent-Based Models of Migration: State of the Art and Challenges.

    PubMed

    Klabunde, Anna; Willekens, Frans

    We review agent-based models (ABM) of human migration with respect to their decision-making rules. The most prominent behavioural theories used as decision rules are the random utility theory, as implemented in the discrete choice model, and the theory of planned behaviour. We identify the critical choices that must be made in developing an ABM, namely the modelling of decision processes and social networks. We also discuss two challenges that hamper the widespread use of ABM in the study of migration and, more broadly, demography and the social sciences: (a) the choice and the operationalisation of a behavioural theory (decision-making and social interaction) and (b) the selection of empirical evidence to validate the model. We offer advice on how these challenges might be overcome.

  6. Object recognition in images via a factor graph model

    NASA Astrophysics Data System (ADS)

    He, Yong; Wang, Long; Wu, Zhaolin; Zhang, Haisu

    2018-04-01

    Object recognition in images suffers from a huge search space and uncertain object profiles. Recently, Bag-of-Words methods have been utilized to solve these problems, especially the 2-dimensional CRF (Conditional Random Field) model. In this paper we propose a method based on a general and flexible factor graph model, which can capture the long-range correlations in Bag-of-Words by constructing a network learning framework, in contrast to the lattice in CRF. Furthermore, we explore a parameter learning algorithm based on the gradient descent and Loopy Sum-Product algorithms for the factor graph model. Experimental results on the Graz-02 dataset show that the recognition performance of our method in precision and recall is better than a state-of-the-art method and the original CRF model, demonstrating the effectiveness of the proposed method.

  7. Wealth-related Inequality in Utilization of Antihypertensive Medicines in Iran: an Ecological Study on Population Level Data.

    PubMed

    Hashemi-Meshkini, Amir; Kebriaeezadeh, Abbas; Jamshidi, Hamidreza; Akbari-Sari, Ali; Rezaei-Darzi, Ehsan; Mehdipour, Parinaz; Nikfar, Shekoufeh; Farzadfar, Farshad

    2016-02-01

    We aim to evaluate the trend of the national and sub-national (provincial) utilization pattern of antihypertensive medicines in the Iranian population in the past decade and to evaluate whether there is any wealth-related inequality in utilization of these medicines among different provinces. Either a fixed-effects or a random-effects linear panel data model was used to check the effect of wealth index on utilization of all antihypertensive medicines and of each class, adjusting for other covariates including years of schooling, urbanization, mean age, and food type of provinces. Principal component analysis was applied to construct summary measures for covariates using available national datasets. The effect of wealth category on the utilization of all antihypertensive medicines among Iranian provinces was positive and significant (0.84; 95% CI: 0.09, 1.59). In subgroup analyses, the effects of wealth category on utilization were positive and significant in the BB and CCB classes (0.36; 95% CI: 0.12, 0.60 and 0.27; 95% CI: 0.07, 0.40, respectively). However, in the ACEI and diuretic classes, the effects of wealth category were positive but not significant. In the ARB class, the effect of wealth on utilization was negative and not significant (-0.04; 95% CI: -0.27, 0.18). According to this study, a wealth-related inequality in utilization of total antihypertensive medicines could be observed between Iranian provinces.

  8. Trial-Based Cost-Utility Analysis of Icotinib versus Gefitinib as Second-Line Therapy for Advanced Non-Small Cell Lung Cancer in China

    PubMed Central

    Zhang, Chunxiang; Zhang, Hongmei; Shi, Jinning; Wang, Dong; Zhang, Xiuwei; Yang, Jian; Zhai, Qizhi; Ma, Aixia

    2016-01-01

    Background Our objective is to compare the cost-utility of icotinib and gefitinib for the second-line treatment of advanced non-small cell lung cancer (NSCLC) from the perspective of the Chinese healthcare system. Methods Model technology was applied to assess the data of randomized clinical trials and the direct medical costs from the perspective of the Chinese healthcare system. Five-year quality-adjusted life years (QALYs) and incremental cost-utility ratios (ICURs) were calculated. One-way and probabilistic sensitivity analyses (PSA) were performed. Results Our model suggested that the median progression-free survival (PFS) was 4.2 months in the icotinib group and 3.5 months in the gefitinib group, versus 4.6 months and 3.4 months, respectively, in the trials. The 5-year QALYs were 0.279 in the icotinib group and 0.269 in the gefitinib group, and the corresponding medical costs were $10,662.82 and $13,127.57. The ICUR/QALY of icotinib versus gefitinib was negative in this study. The parameter to which the ICUR was most sensitive was the utility of PFS, ranging from $-1,259,991.25 to $-182,296.61; accordingly, the icotinib treatment consistently represented a dominant cost-utility strategy. Conclusions The icotinib strategy, as a second-line therapy for advanced NSCLC patients in China, is preferred relative to gefitinib because of its dominant cost-utility. In addition, icotinib shows good curative effect and safety, resulting in a strong demand in the Chinese market. PMID:27015267

  9. Influence of organizational characteristics and context on research utilization.

    PubMed

    Cummings, Greta G; Estabrooks, Carole A; Midodzi, William K; Wallin, Lars; Hayduk, Leslie

    2007-01-01

    Despite three decades of empirical investigation into research utilization and a renewed emphasis on evidence-based medicine and evidence-based practice in the past decade, understanding of factors influencing research uptake in nursing remains limited. There is, however, increased awareness that organizational influences are important. To develop and test a theoretical model of organizational influences that predict research utilization by nurses and to assess the influence of varying degrees of context, based on the Promoting Action on Research Implementation in Health Services (PARIHS) framework, on research utilization and other variables. The study sample was drawn from a census of registered nurses working in acute care hospitals in Alberta, Canada, accessed through their professional licensing body (n = 6,526 nurses; 52.8% response rate). Three variables that measured PARIHS dimensions of context (culture, leadership, and evaluation) were used to sort cases into one of four mutually exclusive data sets that reflected less positive to more positive context. Then, a theoretical model of hospital- and unit-level influences on research utilization was developed and tested, using structural equation modeling, and 300 cases were randomly selected from each of the four data sets. Model test results were as follows: low context, χ² = 124.5, df = 80, p < .001; partially low, χ² = 144.2, df = 80, p < .001; partially high, χ² = 157.3, df = 80, p < .001; and high, χ² = 146.0, df = 80, p < .001. Hospital characteristics that positively influenced research utilization by nurses were staff development, opportunity for nurse-to-nurse collaboration, and staffing and support services. Increased emotional exhaustion led to less reported research utilization and higher rates of patient and nurse adverse events. 
Nurses working in contexts with more positive culture, leadership, and evaluation also reported significantly more research utilization, staff development, and lower rates of patient and staff adverse events than did nurses working in less positive contexts (i.e., those that lacked positive culture, leadership, or evaluation). The findings highlight the combined importance of culture, leadership, and evaluation to increase research utilization and improve patient safety. The findings may serve to strengthen the PARIHS framework and to suggest that, although it is not fully developed, the framework is an appropriate guide to implement research into practice.

  10. Computer modelling of grain microstructure in three dimensions

    NASA Astrophysics Data System (ADS)

    Narayan, K. Lakshmi

    We present a program that generates two-dimensional micrographs of a three-dimensional grain microstructure. The code utilizes a novel scanning, pixel-mapping technique to secure statistical distributions of surface areas, grain sizes, aspect ratios, perimeters, numbers of nearest neighbors, and volumes of the randomly nucleated particles. The program can be used for comparing the existing theories of grain growth and for interpretation of the two-dimensional microstructure of three-dimensional samples. Special features have been included to minimize the computation time and resource requirements.

  11. A multiple-objective optimal exploration strategy

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1988-01-01

    Exploration for natural resources is accomplished through partial sampling of extensive domains. Such imperfect knowledge is subject to sampling error. Complex systems of equations resulting from modelling based on the theory of correlated random fields are reduced to simple analytical expressions providing global indices of estimation variance. The indices are utilized by multiple-objective decision criteria to find the best sampling strategies. The approach is not limited by the geometric nature of the sampling, covers a wide range of spatial continuity, and leads to a step-by-step procedure. © 1988.

  12. Random walk and graph cut based active contour model for three-dimension interactive pituitary adenoma segmentation from MR images

    NASA Astrophysics Data System (ADS)

    Sun, Min; Chen, Xinjian; Zhang, Zhiqiang; Ma, Chiyuan

    2017-02-01

    Accurate volume measurements of pituitary adenomas are important to the diagnosis and treatment of this kind of sellar tumor. The pituitary adenomas have different pathological representations and various shapes. Particularly, in the case of infiltrating to surrounding soft tissues, they present similar intensities and indistinct boundaries in T1-weighted (T1W) magnetic resonance (MR) images. The extraction of pituitary adenomas from MR images thus remains a challenging task. In this paper, we propose an interactive method to segment the pituitary adenoma from brain MR data, by combining a graph cuts based active contour model (GCACM) and the random walk algorithm. By using the GCACM method, the segmentation task is formulated as an energy minimization problem by a hybrid active contour model (ACM), and then the problem is solved by the graph cuts method. The region-based term in the hybrid ACM considers the local image intensities as described by Gaussian distributions with different means and variances, expressed as maximum a posteriori probability (MAP). Random walk is utilized as an initialization tool to provide an initial surface for GCACM. The proposed method is evaluated on the three-dimensional (3-D) T1W MR data of 23 patients and compared with the standard graph cuts method, the random walk method, the hybrid ACM method, a GCACM method which considers global mean intensity in region forces, and a competitive region-growing-based GrowCut method implemented in 3D Slicer. Based on the experimental results, the proposed method is superior to those methods.

  13. Close woods utilization commonly practiced in Delaware

    Treesearch

    Wayne G. Banks

    1956-01-01

    Woods-utilization studies were recently made on randomly selected logging operations scattered throughout Delaware. As would be expected, most of the operations studied were in pine stands in the southern part of the state. They showed that very close utilization of the trees cut was the general rule.

  14. Factors influencing service utilization and mood symptom severity in children with mood disorders: effects of multifamily psychoeducation groups (MFPGs).

    PubMed

    Mendenhall, Amy N; Fristad, Mary A; Early, Theresa J

    2009-06-01

    This study investigated the impact of psychoeducation on service utilization and mood symptom severity in children with mood disorders. Parents' knowledge of mood disorders, beliefs about treatment, and perceptions of children's need for treatment were hypothesized to mediate the relationship between psychoeducation and service utilization and between psychoeducation and mood symptom severity. Linear mixed effects modeling and joint significance test for mediation were used in secondary data analyses of the multifamily psychoeducation group (MFPG) study, a randomized controlled trial of 165 children ages 8 to 12 years with mood disorders. A majority of those sampled were male (73%) and White, non-Hispanic (90%), and the median range of family income was $40,000-$59,000. Participation in MFPG significantly improved quality of services utilized, mediated by parents' beliefs about treatment. Participation in MFPG also significantly improved severity of child's mood symptoms, mediated by quality of services utilized. MFPG appears to be a psychoeducational intervention that helps parents to become better consumers of the mental health system who access higher quality services. Children's symptom severity decreases as a result. Copyright 2009 APA

  15. A stochastic method for stand-alone photovoltaic system sizing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cabral, Claudia Valeria Tavora; Filho, Delly Oliveira; Martins, Jose Helvecio

    Photovoltaic systems utilize solar energy to generate electrical energy to meet load demands. Optimal sizing of these systems includes the characterization of solar radiation. Solar radiation at the Earth's surface has random characteristics and has been the focus of various academic studies. The objective of this study was to stochastically analyze parameters involved in the sizing of photovoltaic generators and to develop a methodology for sizing of stand-alone photovoltaic systems. Energy storage for isolated systems and solar radiation were analyzed stochastically due to their random behavior. For the development of the proposed methodology, stochastic analyses were studied, including the Markov chain and the beta probability density function. The obtained results were compared with those from the deterministic Sandia method for stand-alone sizing, relative to which the stochastic model presented more reliable values. Both models present advantages and disadvantages; however, the stochastic one is more complex and provides more reliable and realistic results. (author)
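
    A minimal sketch of the Markov chain plus beta-density idea: a two-state clear/cloudy chain whose daily clearness index is drawn from a state-specific beta distribution. The transition probabilities and beta parameters below are invented for illustration, not taken from the study:

```python
import random

# Hypothetical two-state weather chain and clearness-index distributions.
P = {"clear": {"clear": 0.8, "cloudy": 0.2},
     "cloudy": {"clear": 0.4, "cloudy": 0.6}}
BETA = {"clear": (8, 2), "cloudy": (2, 5)}   # beta(a, b) parameters per state

def simulate_clearness(n_days=2000, state="clear", seed=4):
    """Daily clearness indices from a Markov chain with beta-distributed draws."""
    rng = random.Random(seed)
    series = []
    for _ in range(n_days):
        a, b = BETA[state]
        series.append(rng.betavariate(a, b))                 # today's clearness index
        state = "clear" if rng.random() < P[state]["clear"] else "cloudy"
    return series

kt = simulate_clearness()  # synthetic daily solar input for sizing simulations
```

    Feeding such synthetic radiation series into a storage balance is one way a stochastic sizing method can capture year-to-year variability that a deterministic method averages away.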

  16. Evidentiary Pluralism as a Strategy for Research and Evidence-Based Practice in Rehabilitation Psychology

    PubMed Central

    Tucker, Jalie A.; Reed, Geoffrey M.

    2008-01-01

    This paper examines the utility of evidentiary pluralism, a research strategy that selects methods in service of content questions, in the context of rehabilitation psychology. Hierarchical views that favor randomized controlled clinical trials (RCTs) over other evidence are discussed, and RCTs are considered as they intersect with issues in the field. RCTs are vital for establishing treatment efficacy, but whether they are uniformly the best evidence to inform practice is critically evaluated. We argue that because treatment is only one of several variables that influence functioning, disability, and participation over time, an expanded set of conceptual and data analytic approaches should be selected in an informed way to support an expanded research agenda that investigates therapeutic and extra-therapeutic influences on rehabilitation processes and outcomes. The benefits of evidentiary pluralism are considered, including helping close the gap between the narrower clinical rehabilitation model and a public health disability model. KEY WORDS: evidence-based practice, evidentiary pluralism, rehabilitation psychology, randomized controlled trials PMID:19649150

  17. PTBS segmentation scheme for synthetic aperture radar

    NASA Astrophysics Data System (ADS)

    Friedland, Noah S.; Rothwell, Brian J.

    1995-07-01

    The Image Understanding Group at Martin Marietta Technologies in Denver, Colorado has developed a model-based synthetic aperture radar (SAR) automatic target recognition (ATR) system using an integrated resource architecture (IRA). IRA, an adaptive Markov random field (MRF) environment, utilizes information from image, model, and neighborhood resources to create a discrete, 2D feature-based world description (FBWD). The IRA FBWD features are peak, target, background and shadow (PTBS). These features have been shown to be very useful for target discrimination. The FBWD is used to accrue evidence over a model hypothesis set. This paper presents the PTBS segmentation process utilizing two IRA resources. The image resource (IR) provides generic (the physics of image formation) and specific (the given image input) information. The neighborhood resource (NR) provides domain knowledge of localized FBWD site behaviors. A simulated annealing optimization algorithm is used to construct a 'most likely' PTBS state. Results on simulated imagery illustrate the power of this technique to correctly segment PTBS features, even when vehicle signatures are immersed in heavy background clutter. These segmentations also suppress sidelobe effects and delineate shadows.

  18. Randomly iterated search and statistical competency as powerful inversion tools for deformation source modeling: Application to volcano interferometric synthetic aperture radar data

    NASA Astrophysics Data System (ADS)

    Shirzaei, M.; Walter, T. R.

    2009-10-01

    Modern geodetic techniques provide valuable and near real-time observations of volcanic activity. Characterizing the source of deformation based on these observations has become of major importance in related monitoring efforts. We investigate two random search approaches, simulated annealing (SA) and genetic algorithm (GA), and utilize them in an iterated manner. The iterated approach helps to prevent GA in general and SA in particular from getting trapped in local minima, and it also increases redundancy for exploring the search space. We apply a statistical competency test for estimating the confidence interval of the inversion source parameters, considering their internal interaction through the model, the effect of the model deficiency, and the observational error. Here, we present and test this new randomly iterated search and statistical competency (RISC) optimization method together with GA and SA for the modeling of data associated with volcanic deformations. Following synthetic and sensitivity tests, we apply the improved inversion techniques to two episodes of activity in the Campi Flegrei volcanic region in Italy, observed by the interferometric synthetic aperture radar technique. Inversion of these data allows derivation of deformation source parameters and their associated quality so that we can compare the two inversion methods. The RISC approach was found to be an efficient method in terms of computation time and search results and may be applied to other optimization problems in volcanic and tectonic environments.
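
    The iterated-restart idea behind RISC can be sketched for simulated annealing on a toy multimodal misfit function (the cost function, cooling schedule, and restart count are illustrative assumptions, not the authors' settings):

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.995, n_iter=2000, rng=None):
    """One simulated-annealing run with Gaussian moves and geometric cooling."""
    rng = rng or random.Random()
    x, c, t = x0, cost(x0), t0
    best_x, best_c = x, c
    for _ in range(n_iter):
        cand = x + rng.gauss(0, step)
        cc = cost(cand)
        if cc < c or rng.random() < math.exp((c - cc) / t):  # Metropolis criterion
            x, c = cand, cc
            if c < best_c:
                best_x, best_c = x, c
        t *= cooling
    return best_x, best_c

def iterated_sa(cost, n_restarts=10, seed=5):
    """Iterated search: independent restarts reduce the risk of local minima."""
    rng = random.Random(seed)
    runs = [simulated_annealing(cost, rng.uniform(-10, 10), rng=rng)
            for _ in range(n_restarts)]
    return min(runs, key=lambda r: r[1])       # keep the best misfit over restarts

# Toy multimodal misfit with its global minimum near x = 2.1.
cost = lambda x: (x - 2) ** 2 + 3 * math.sin(3 * x) ** 2
x_best, c_best = iterated_sa(cost)
```

    Repeating the search from random starting models and pooling the results is the redundancy the abstract describes; confidence intervals would then come from the spread of acceptable solutions, which this sketch omits.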

  19. Impact of Predicting Health Care Utilization Via Web Search Behavior: A Data-Driven Analysis.

    PubMed

    Agarwal, Vibhu; Zhang, Liangliang; Zhu, Josh; Fang, Shiyuan; Cheng, Tim; Hong, Chloe; Shah, Nigam H

    2016-09-21

    By recent estimates, the steady rise in health care costs has deprived more than 45 million Americans of health care services and has encouraged health care providers to better understand the key drivers of health care utilization from a population health management perspective. Prior studies suggest the feasibility of mining population-level patterns of health care resource utilization from observational analysis of Internet search logs; however, the utility of the endeavor to the various stakeholders in a health ecosystem remains unclear. The aim was to carry out a closed-loop evaluation of the utility of health care use predictions using the conversion rates of advertisements that were displayed to the predicted future utilizers as a surrogate. The statistical models to predict the probability of user's future visit to a medical facility were built using effective predictors of health care resource utilization, extracted from a deidentified dataset of geotagged mobile Internet search logs representing searches made by users of the Baidu search engine between March 2015 and May 2015. We inferred presence within the geofence of a medical facility from location and duration information from users' search logs and putatively assigned medical facility visit labels to qualifying search logs. We constructed a matrix of general, semantic, and location-based features from search logs of users that had 42 or more search days preceding a medical facility visit as well as from search logs of users that had no medical visits and trained statistical learners for predicting future medical visits. We then carried out a closed-loop evaluation of the utility of health care use predictions using the show conversion rates of advertisements displayed to the predicted future utilizers. 
In the context of behaviorally targeted advertising, wherein health care providers are interested in minimizing their cost per conversion, the association between show conversion rate and predicted utilization score served as a surrogate measure of the model's utility. We obtained the highest area under the curve (0.796) in medical visit prediction with our random forests model and daywise features. Ablating feature categories one at a time showed that model performance worsened the most when location features were dropped. An online evaluation in which advertisements were served to users who had a high predicted probability of a future medical visit showed a 3.96% increase in the show conversion rate. Results from our experiments done in a research setting suggest that it is possible to accurately predict future patient visits from geotagged mobile search logs. Results from the offline and online experiments on the utility of health utilization predictions suggest that such prediction can have utility for health care providers.
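    Purely as a hedged illustration of the kind of model the record describes (synthetic data, invented features, scikit-learn assumed available; this is not the authors' code), a random-forest visit predictor scored by AUC might look like:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# synthetic stand-ins for the general / semantic / location feature matrix
X = rng.normal(size=(n, 6))
# future medical visits driven partly by the first two features, plus noise
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=n) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.3f}")
```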

  20. Autoregressive Modeling of Drift and Random Error to Characterize a Continuous Intravascular Glucose Monitoring Sensor.

    PubMed

    Zhou, Tony; Dickson, Jennifer L; Geoffrey Chase, J

    2018-01-01

    Continuous glucose monitoring (CGM) devices have been effective in managing diabetes and offer potential benefits for use in the intensive care unit (ICU). Use of CGM devices in the ICU has been limited, however, primarily because their point accuracy errors are higher than those of the traditional intermittent blood glucose (BG) measures currently used. General models of CGM errors, including drift and random errors, are lacking, but would enable better design of protocols to utilize these devices. This article presents an autoregressive (AR) based modeling method that separately characterizes the drift and random noise of the GlySure CGM sensor (GlySure Limited, Oxfordshire, UK). Clinical sensor data (n = 33) and reference measurements were used to generate 2 AR models describing sensor drift and noise. These models were used to generate 100 Monte Carlo simulations based on reference blood glucose measurements. These were then compared to the original CGM clinical data using mean absolute relative difference (MARD) and a Trend Compass. The point accuracy MARD was very similar between simulated and clinical data (9.6% vs 9.9%). A Trend Compass was used to assess trend accuracy, and found that simulated and clinical sensor profiles were similar (simulated trend index 11.4° vs clinical trend index 10.9°). The model and method accurately represent cohort sensor behavior across patients, providing a general modeling approach for any such sensor by separately characterizing each type of error that can arise in the data. Overall, it enables better protocol design based on accurate expected CGM sensor behavior, as well as analysis of what level of each type of sensor error would be necessary to obtain desired glycemic control safety and performance with a given protocol.
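    A minimal sketch of the error decomposition described above, with made-up coefficients (not the fitted GlySure values): an AR(1) drift term plus white measurement noise is added to a reference glucose trace, and point accuracy is scored by MARD:

```python
import numpy as np

rng = np.random.default_rng(1)
# reference blood glucose trace (mmol/L), one reading every 5 min for 24 h
ref = 6.0 + 1.5 * np.sin(np.linspace(0.0, 4.0 * np.pi, 288))

phi, drift_sd, noise_sd = 0.98, 0.02, 0.15   # assumed AR(1) parameters
drift = np.zeros_like(ref)
for t in range(1, len(ref)):
    drift[t] = phi * drift[t - 1] + rng.normal(scale=drift_sd)  # slow drift

cgm = ref + drift + rng.normal(scale=noise_sd, size=ref.shape)  # sensor trace
mard = 100.0 * np.mean(np.abs(cgm - ref) / ref)                 # point accuracy
print(f"MARD = {mard:.1f}%")
```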

  1. Perfect Information vs Random Investigation: Safety Guidelines for a Consumer in the Jungle of Product Differentiation.

    PubMed

    Biondo, Alessio Emanuele; Giarlotta, Alfio; Pluchino, Alessandro; Rapisarda, Andrea

    2016-01-01

    We present a graph-theoretic model of consumer choice, where final decisions are shown to be influenced by information and knowledge, in the form of individual awareness, discriminating ability, and perception of market structure. Building upon the distance-based Hotelling's differentiation idea, we describe the behavioral experience of several prototypes of consumers, who walk a hypothetical cognitive path in an attempt to maximize their satisfaction. Our simulations show that even consumers endowed with a small amount of information and knowledge may reach a very high level of utility. On the other hand, complete ignorance negatively affects the whole consumption process. In addition, rather unexpectedly, a random walk on the graph turns out to be a winning strategy below a minimal threshold of information and knowledge.
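    A toy sketch of the random-investigation idea (the graph, utilities, and walk length are invented, not the paper's model): a consumer walks randomly over a product graph and keeps the best utility encountered.

```python
import random

random.seed(42)
n = 50
utility = [random.random() for _ in range(n)]          # satisfaction per product
# ring of products plus one random shortcut each: a toy market structure
adj = {i: [(i - 1) % n, (i + 1) % n, random.randrange(n)] for i in range(n)}

node = 0
best = utility[node]
for _ in range(300):                                   # random investigation
    node = random.choice(adj[node])
    best = max(best, utility[node])
print(f"best utility found: {best:.2f} of max {max(utility):.2f}")
```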

  2. Perfect Information vs Random Investigation: Safety Guidelines for a Consumer in the Jungle of Product Differentiation

    PubMed Central

    Biondo, Alessio Emanuele; Giarlotta, Alfio; Pluchino, Alessandro; Rapisarda, Andrea

    2016-01-01

    We present a graph-theoretic model of consumer choice, where final decisions are shown to be influenced by information and knowledge, in the form of individual awareness, discriminating ability, and perception of market structure. Building upon the distance-based Hotelling’s differentiation idea, we describe the behavioral experience of several prototypes of consumers, who walk a hypothetical cognitive path in an attempt to maximize their satisfaction. Our simulations show that even consumers endowed with a small amount of information and knowledge may reach a very high level of utility. On the other hand, complete ignorance negatively affects the whole consumption process. In addition, rather unexpectedly, a random walk on the graph turns out to be a winning strategy below a minimal threshold of information and knowledge. PMID:26784700

  3. Wigner-Eisenbud-Smith photoionization time delay due to autoionization resonances

    NASA Astrophysics Data System (ADS)

    Deshmukh, P. C.; Kumar, A.; Varma, H. R.; Banerjee, S.; Manson, Steven T.; Dolmatov, V. K.; Kheifets, A. S.

    2018-03-01

    An empirical ansatz for the complex photoionization amplitude and the Wigner-Eisenbud-Smith time delay in the vicinity of a Fano autoionization resonance is proposed to evaluate and interpret the time delay in the resonant region. The utility of this expression is evaluated by comparison with accurate numerical calculations employing the ab initio relativistic random phase approximation and relativistic multichannel quantum defect theory. The good qualitative (and semiquantitative) agreement between the results of the proposed model and those produced by the ab initio theories demonstrates the usability of the model. In addition, the phenomenology of the time delay in the vicinity of multichannel autoionizing resonances is detailed.

  4. Initiation, adherence, and retention in a randomized controlled trial of directly administered antiretroviral therapy.

    PubMed

    Maru, Duncan Smith-Rohrberg; Bruce, R Douglas; Walton, Mary; Mezger, Jo Anne; Springer, Sandra A; Shield, David; Altice, Frederick L

    2008-03-01

    Directly administered antiretroviral therapy (DAART) can improve health outcomes among HIV-infected drug users. An understanding of the utilization of DAART-initiation, adherence, and retention-is critical to successful program design. Here, we use the Behavioral Model to assess the enabling, predisposing, and need factors impacting adherence in our randomized, controlled trial of DAART versus self-administered therapy (SAT) among 141 HIV-infected drug users. Of 88 participants randomized to DAART, 74 (84%) initiated treatment, and 51 (69%) of those who initiated were retained in the program throughout the entire six-month period. Mean adherence to directly observed visits was 73%, and the mean overall composite adherence score was 77%. These results were seen despite the finding that 75% of participants indicated that they would prefer to take their own medications. Major causes of DAART discontinuation included hospitalization, incarceration, and entry into drug-treatment programs. The presence of depression and the lack of willingness to travel greater than four blocks to receive DAART predicted time-to-discontinuation.

  5. Initiation, Adherence, and Retention in a Randomized Controlled Trial of Directly Administered Antiretroviral Therapy

    PubMed Central

    Maru, Duncan Smith-Rohrberg; Bruce, R. Douglas; Walton, Mary; Mezger, Jo Anne; Springer, Sandra A.; Shield, David

    2009-01-01

    Directly administered antiretroviral therapy (DAART) can improve health outcomes among HIV-infected drug users. An understanding of the utilization of DAART—initiation, adherence, and retention—is critical to successful program design. Here, we use the Behavioral Model to assess the enabling, predisposing, and need factors impacting adherence in our randomized, controlled trial of DAART versus self-administered therapy (SAT) among 141 HIV-infected drug users. Of 88 participants randomized to DAART, 74 (84%) initiated treatment, and 51 (69%) of those who initiated were retained in the program throughout the entire six-month period. Mean adherence to directly observed visits was 73%, and the mean overall composite adherence score was 77%. These results were seen despite the finding that 75% of participants indicated that they would prefer to take their own medications. Major causes of DAART discontinuation included hospitalization, incarceration, and entry into drug-treatment programs. The presence of depression and the lack of willingness to travel greater than four blocks to receive DAART predicted time-to-discontinuation. PMID:18085432

  6. A Randomized, Controlled Pilot Study of a Single-Session Psychoeducation Treatment for Urban, Culturally Diverse, Trauma-Exposed Adults.

    PubMed

    Ghafoori, Bita; Fisher, Dennis; Korosteleva, Olga; Hong, Madelyn

    2016-06-01

    This randomized pilot study aimed to determine whether a single session of psychoeducation improved mental health outcomes, attitudes toward treatment, and service engagement among urban, impoverished, culturally diverse, trauma-exposed adults. Sixty-seven individuals were randomly assigned to a single-session psychoeducation treatment or a delayed treatment comparison control group. The control group was found to be superior to the treatment group at posttest with respect to symptoms of posttraumatic stress disorder, anxiety, and occupational and family disability. At follow-up, all participants had completed the psychoeducation treatment, and a mixed-effects model indicated significant improvements over time in symptoms of posttraumatic stress disorder, anxiety, depression, somatization, and attitudes toward treatment. Ninety-eight percent of the participants reported the psychoeducation was helpful at follow-up. Participants also reported a 19.1% increase in mental health service utilization at follow-up compared with baseline. Implications for treatment and future research are discussed.

  7. Clustering Single-Cell Expression Data Using Random Forest Graphs.

    PubMed

    Pouyan, Maziyar Baran; Nourani, Mehrdad

    2017-07-01

    Complex tissues such as brain and bone marrow are made up of multiple cell types. As the study of biological tissue structure progresses, the role of cell-type-specific research becomes increasingly important. Novel measurement technologies such as single-cell cytometry give researchers access to valuable biological data. Applying machine-learning techniques to these high-throughput datasets provides deep insights into the cellular landscape of the tissue those cells are part of. In this paper, we propose the use of random-forest-based single-cell profiling, a new machine-learning-based technique, to profile different cell types of intricate tissues using single-cell cytometry data. Our technique utilizes random forests to capture cell marker dependences and model the cellular populations using the cell network concept. This cellular network helps us discover which cell types are present in the tissue. Our experimental results on public-domain datasets indicate promising performance and accuracy of our technique in extracting cell populations of complex tissues.
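    One way such a random-forest cell similarity can be sketched (synthetic markers, stand-in training labels, scikit-learn assumed; the authors' actual pipeline may differ): the leaves of a trained forest define a proximity between cells, which a downstream clustering step can consume.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
# two synthetic cell populations with shifted marker means
a = rng.normal(0.0, 1.0, size=(100, 8))
b = rng.normal(2.0, 1.0, size=(100, 8))
X = np.vstack([a, b])
y = np.array([0] * 100 + [1] * 100)        # stand-in labels for training

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
leaves = forest.apply(X)                   # (n_cells, n_trees) leaf indices
# proximity: fraction of trees in which two cells land in the same leaf
prox = (leaves[:, None, :] == leaves[None, :, :]).mean(axis=2)
print(prox.shape)
```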

  8. Utilization of Services in a Randomized Trial Testing Phone- and Web-Based Interventions for Smoking Cessation

    PubMed Central

    Jack, Lisa M.; McClure, Jennifer B.; Deprey, Mona; Javitz, Harold S.; McAfee, Timothy A.; Catz, Sheryl L.; Richards, Julie; Bush, Terry; Swan, Gary E.

    2011-01-01

    Introduction: Phone counseling has become standard for behavioral smoking cessation treatment. Newer options include Web and integrated phone–Web treatment. No prior research, to our knowledge, has systematically compared the effectiveness of these three treatment modalities in a randomized trial. Understanding how utilization varies by mode, the impact of utilization on outcomes, and predictors of utilization across each mode could lead to improved treatments. Methods: One thousand two hundred and two participants were randomized to phone, Web, or combined phone–Web cessation treatment. Services varied by modality and were tracked using automated systems. All participants received 12 weeks of varenicline, printed guides, an orientation call, and access to a phone supportline. Self-report data were collected at baseline and 6-month follow-up. Results: Overall, participants utilized phone services more often than the Web-based services. Among treatment groups with Web access, a significant proportion logged in only once (37% phone–Web, 41% Web), and those in the phone–Web group logged in less often than those in the Web group (mean = 2.4 vs. 3.7, p = .0001). Use of the phone also was correlated with increased use of the Web. In multivariate analyses, greater use of the phone- or Web-based services was associated with higher cessation rates. Finally, older age and the belief that certain treatments could improve success were consistent predictors of greater utilization across groups. Other predictors varied by treatment group. Conclusions: Opportunities for enhancing treatment utilization exist, particularly for Web-based programs. Increasing utilization more broadly could result in better overall treatment effectiveness for all intervention modalities. PMID:21330267

  9. Cost utility analysis of caudal epidural injections in the treatment of lumbar disc herniation, axial or discogenic low back pain, central spinal stenosis, and post lumbar surgery syndrome.

    PubMed

    Manchikanti, Laxmaiah; Falco, Frank J E; Pampati, Vidyasagar; Cash, Kimberly A; Benyamin, Ramsin M; Hirsch, Joshua A

    2013-01-01

    In this era of escalating health care costs and the questionable effectiveness of multiple interventions, cost effectiveness or cost utility analysis has become the cornerstone of evidence-based medicine, and it influences coverage decisions. Even though multiple cost effectiveness analysis studies have been performed over the years, extensive literature is lacking for interventional techniques. Cost utility analysis studies of epidural injections for managing chronic low back pain have demonstrated highly variable results, including a lack of cost utility in randomized trials and contrasting results in observational studies. There have been no cost utility analysis studies of epidural injections based on large randomized trials performed in interventional pain management settings. To assess the cost utility of caudal epidural injections in managing chronic low back pain secondary to lumbar disc herniation, axial or discogenic low back pain, lumbar central spinal stenosis, and lumbar post surgery syndrome. This analysis is based on 4 previously published randomized trials. A private, specialty referral interventional pain management center in the United States. Four randomized trials were conducted assessing the clinical effectiveness of caudal epidural injections with or without steroids for lumbar disc herniation, lumbar discogenic or axial low back pain, lumbar central spinal stenosis, and post surgery syndrome. A cost utility analysis was performed with direct payment data for a total of 480 patients over a period of 2 years from these 4 trials. Outcomes included various measures, with significant improvement defined as at least a 50% improvement in pain reduction and disability status. 
The results of the 4 randomized controlled trials of low back pain, with 480 patients, a 2-year follow-up, and actual reimbursement data, showed a cost per one year of quality-adjusted life year (QALY) of $2,206 for disc herniation, $2,136 for axial or discogenic pain without disc herniation, $2,155 for central spinal stenosis, and $2,191 for post surgery syndrome. All patients showed significant clinical improvement, and the cost utility analysis was positive, with an average cost per one year of QALY of $2,172.50 for all patients and $1,966.03 for patients judged to be successful. These results show a better cost utility, or lower cost, of managing chronic, intractable low back pain with caudal epidural injections, at a cost per QALY that is similar to or lower than that of medical therapy only, physical therapy, manipulation, and surgery in most cases. The limitations of this cost utility analysis include that it is a single-center evaluation, even though 480 patients were included in the analysis. Further, only the costs of interventional procedures and physician visits were included; the benefits of returning to work were not assessed. This cost utility analysis of caudal epidural injections in the treatment of disc herniation, axial or discogenic low back pain, central spinal stenosis, and post surgery syndrome in the lumbar spine shows the clinical effectiveness and cost utility of these injections at less than $2,200 per one year of QALY.
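    The cost-utility arithmetic behind such figures reduces to total direct cost divided by QALYs gained; the numbers below are illustrative only, not the trial's payment data.

```python
# Worked example of cost-per-QALY arithmetic (illustrative numbers only)
cost_two_years = 4400.0        # hypothetical direct cost per patient, USD
qalys_gained = 2.0             # hypothetical QALYs over the 2-year follow-up

cost_per_qaly = cost_two_years / qalys_gained
print(cost_per_qaly)  # → 2200.0
```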

  10. Improving Adherence to Smoking Cessation Treatment: Smoking Outcomes in a Web-based Randomized Trial.

    PubMed

    Graham, Amanda L; Papandonatos, George D; Cha, Sarah; Erar, Bahar; Amato, Michael S

    2018-03-15

    Partial adherence in Internet smoking cessation interventions presents treatment and evaluation challenges. Increasing adherence may improve outcomes. To present smoking outcomes from an Internet randomized trial of two strategies to encourage adherence to tobacco dependence treatment components: (i) a social network (SN) strategy to integrate smokers into an online community and (ii) free nicotine replacement therapy (NRT). In addition to intent-to-treat analyses, we used novel statistical methods to distinguish the impact of treatment assignment from treatment utilization. A total of 5,290 current smokers on a cessation website (WEB) were randomized to WEB, WEB + SN, WEB + NRT, or WEB + SN + NRT. The main outcome was 30-day point prevalence abstinence at 3 and 9 months post-randomization. Adherence measures included self-reported medication use (meds) and website metrics of skills training (sk) and community use (comm). Inverse Probability of Retention Weighting and Inverse Probability of Treatment Weighting jointly addressed dropout and treatment selection. Propensity weights were used to calculate Average Treatment effects on the Treated. Treatment assignment analyses showed no effects on abstinence for either adherence strategy. Abstinence rates were 25.7%-32.2% among participants who used all three treatment components (sk + comm + meds). Treatment utilization analyses revealed that among such participants, sk + comm + meds yielded large percentage-point increases in 3-month abstinence rates over sk alone across arms: WEB = 20.6 (95% CI = 10.8, 30.4), WEB + SN = 19.2 (95% CI = 11.1, 27.3), WEB + NRT = 13.1 (95% CI = 4.1, 22.0), and WEB + SN + NRT = 20.0 (95% CI = 12.2, 27.7). Novel propensity weighting approaches can serve as a model for establishing efficacy of Internet interventions and yield important insights about mechanisms. NCT01544153.
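    The propensity-weighting idea can be sketched on simulated data (the confounder, selection model, and true effect are all invented): treated units get weight 1 and controls are reweighted by their propensity odds, recovering the average effect on the treated.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
x = rng.binomial(1, 0.5, size=n)              # confounder (e.g. motivation)
p_treat = np.where(x == 1, 0.7, 0.3)          # treatment uptake depends on x
t = rng.binomial(1, p_treat)                  # used all components vs not
y = rng.binomial(1, 0.1 + 0.1 * x + 0.2 * t)  # abstinence; true effect 0.20

# propensity estimated within strata of x (stand-in for a fitted model)
e = np.array([t[x == v].mean() for v in (0, 1)])[x]
w = e / (1.0 - e)                             # ATT weights for controls
att = y[t == 1].mean() - np.average(y[t == 0], weights=w[t == 0])
print(f"estimated ATT = {att:.3f}")
```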

  11. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches

    NASA Astrophysics Data System (ADS)

    Brokamp, Cole; Jandarov, Roman; Rao, M. B.; LeMasters, Grace; Ryan, Patrick

    2017-02-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods presents an opportunity to improve PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. The LURF models utilized a more diverse and greater number of predictors than the LUR models, and the LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR counterparts. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross-validated fractional predictive error below 30%. Furthermore, LUR models showed a differential exposure assessment bias and a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment.
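    The LUR-versus-LURF comparison can be mimicked on synthetic data (invented predictors and exposure surface, scikit-learn assumed; this is not the CCAAPS analysis): when the surface is non-linear, cross-validated performance favors the forest.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.uniform(size=(300, 5))              # stand-in land-use predictors
# a deliberately non-linear exposure surface plus measurement noise
y = np.sin(6.0 * X[:, 0]) + X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=300)

scores = {}
for name, model in [("LUR", LinearRegression()),
                    ("LURF", RandomForestRegressor(n_estimators=200,
                                                   random_state=0))]:
    scores[name] = cross_val_score(model, X, y, cv=5).mean()  # mean R^2
print({k: round(v, 2) for k, v in scores.items()})
```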

  12. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches.

    PubMed

    Brokamp, Cole; Jandarov, Roman; Rao, M B; LeMasters, Grace; Ryan, Patrick

    2017-02-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods presents an opportunity to improve PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. The LURF models utilized a more diverse and greater number of predictors than the LUR models, and the LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR counterparts. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross-validated fractional predictive error below 30%. Furthermore, LUR models showed a differential exposure assessment bias and a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment.

  13. Exposure assessment models for elemental components of particulate matter in an urban environment: A comparison of regression and random forest approaches

    PubMed Central

    Brokamp, Cole; Jandarov, Roman; Rao, M.B.; LeMasters, Grace; Ryan, Patrick

    2017-01-01

    Exposure assessment for elemental components of particulate matter (PM) using land use modeling is a complex problem due to the high spatial and temporal variations in pollutant concentrations at the local scale. Land use regression (LUR) models may fail to capture complex interactions and non-linear relationships between pollutant concentrations and land use variables. The increasing availability of big spatial data and machine learning methods presents an opportunity to improve PM exposure assessment models. In this manuscript, our objective was to develop a novel land use random forest (LURF) model and compare its accuracy and precision to a LUR model for elemental components of PM in the city of Cincinnati, Ohio. PM smaller than 2.5 μm (PM2.5) and eleven elemental components were measured at 24 sampling stations from the Cincinnati Childhood Allergy and Air Pollution Study (CCAAPS). Over 50 different predictors associated with transportation, physical features, community socioeconomic characteristics, greenspace, land cover, and emission point sources were used to construct LUR and LURF models. Cross validation was used to quantify and compare model performance. LURF and LUR models were created for aluminum (Al), copper (Cu), iron (Fe), potassium (K), manganese (Mn), nickel (Ni), lead (Pb), sulfur (S), silicon (Si), vanadium (V), zinc (Zn), and total PM2.5 in the CCAAPS study area. The LURF models utilized a more diverse and greater number of predictors than the LUR models, and the LURF models for Al, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all showed a decrease in fractional predictive error of at least 5% compared to their LUR counterparts. LURF models for Al, Cu, Fe, K, Mn, Pb, Si, Zn, TRAP, and PM2.5 all had a cross-validated fractional predictive error below 30%. Furthermore, LUR models showed a differential exposure assessment bias and a higher prediction error variance. Random forest and other machine learning methods may provide more accurate exposure assessment. PMID:28959135

  14. Structure-based Markov random field model for representing evolutionary constraints on functional sites.

    PubMed

    Jeong, Chan-Seok; Kim, Dongsup

    2016-02-24

    Elucidating the cooperative mechanism of interconnected residues is an important component of understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements in coevolution analysis; thus far, however, the performance of these models has mainly been assessed with a focus on protein structure. In this study, we built an MRF model whose graphical topology is determined by residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weights of the MRF model. This structure-based MRF method was evaluated on three data sets, which annotate catalytic sites, allosteric sites, and comprehensively determined functional sites, respectively. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can represent positional coevolution information more accurately than the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built from only a few aligned sequences in linear time. The results show that adopting a structure-based architecture is an acceptable approximation for coevolution modeling with efficient computational complexity.

  15. Ecological statistics of Gestalt laws for the perceptual organization of contours.

    PubMed

    Elder, James H; Goldberg, Richard M

    2002-01-01

    Although numerous studies have measured the strength of visual grouping cues for controlled psychophysical stimuli, little is known about the statistical utility of these various cues for natural images. In this study, we conducted experiments in which human participants traced perceived contours in natural images. These contours were automatically mapped to sequences of discrete tangent elements detected in the image. By examining relational properties between pairs of successive tangents on these traced curves, and between randomly selected pairs of tangents, we were able to estimate the likelihood distributions required to construct an optimal Bayesian model for contour grouping. We employed this novel methodology to investigate the inferential power of three classical Gestalt cues for contour grouping: proximity, good continuation, and luminance similarity. The study yielded a number of important results: (1) these cues, when appropriately defined, are approximately uncorrelated, suggesting a simple factorial model for statistical inference; (2) moderate image-to-image variation of the statistics indicates the utility of general probabilistic models for perceptual organization; (3) these cues differ greatly in their inferential power, proximity being by far the most powerful; and (4) statistical modeling of the proximity cue indicates a scale-invariant power law in close agreement with prior psychophysics.
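    Result (1) suggests a simple factorial combination rule: with independent cues, the posterior odds that two tangents group equal the prior odds times the product of per-cue likelihood ratios. The values below are invented for illustration, not the paper's fitted distributions.

```python
import math

prior_odds = 0.01 / 0.99   # assumed prior odds that a random tangent pair groups

# per-cue likelihood ratios P(cue value | same contour) / P(cue value | random);
# proximity is given the largest ratio, echoing result (3)
lr = {"proximity": 40.0, "good_continuation": 3.0, "luminance_similarity": 1.5}

posterior_odds = prior_odds * math.prod(lr.values())
p_group = posterior_odds / (1.0 + posterior_odds)
print(f"P(same contour | cues) = {p_group:.3f}")
```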

  16. Quality of life, activity impairment, and healthcare resource utilization associated with atrial fibrillation in the US National Health and Wellness Survey.

    PubMed

    Goren, Amir; Liu, Xianchen; Gupta, Shaloo; Simon, Teresa A; Phatak, Hemant

    2013-01-01

    This study builds upon current studies of atrial fibrillation (AF) and health outcomes by examining more comprehensively the humanistic burden of illness (quality of life, activity impairment, and healthcare resource utilization) among adult patients with AF, using a large, nationally representative sample and matched controls. Data were analyzed from the Internet-based 2009 US National Health and Wellness Survey. Outcomes were Mental and Physical Component Summary (MCS and PCS) and health utility scores from the SF-12, activity impairment, hospitalizations, and healthcare provider and emergency room (ER) visits. Patients with self-reported diagnosis of AF were matched randomly on age and gender with an equal number of respondents without AF. Generalized linear models examined outcomes as a function of AF vs. non-AF status, controlling for CHADS2 score, comorbidity counts, demographics, and clinical variables. Exploratory structural equation modeling assessed the above in an integrated model of humanistic burden. Mean age of AF patients (1,296 from a total sample of 75,000) was 64.9 years and 65.1% were male. Adjusting for covariates, compared with non-AF patients, AF patients had lower MCS, PCS, and utility scores, greater activity impairment (rate ratio = 1.26), more traditional provider visits (rate ratio = 1.43), and increased odds of ER visits (OR = 2.53) and hospitalizations (OR = 2.71). Exploratory structural equation modeling analyses revealed that persons with AF experienced a significantly higher overall humanistic burden. This study highlights and clarifies the substantial burden of AF and its implications for preparing efficacious AF management plans to address the imminent rise in prevalence.

  17. Reducing youth internalizing symptoms: Effects of a family-based preventive intervention on parental guilt induction and youth cognitive style

    PubMed Central

    McKEE, LAURA G.; PARENT, JUSTIN; FOREHAND, REX; RAKOW, AARON; WATSON, KELLY H.; DUNBAR, JENNIFER P.; REISING, MICHELLE M.; HARDCASTLE, EMILY; COMPAS, BRUCE E.

    2014-01-01

    This study utilized structural equation modeling to examine the associations among parental guilt induction (a form of psychological control), youth cognitive style, and youth internalizing symptoms, with parents and youth participating in a randomized controlled trial of a family-based group cognitive–behavioral preventive intervention targeting families with a history of caregiver depression. The authors present separate models utilizing parent report and youth report of internalizing symptoms. Findings suggest that families in the active condition (family-based group cognitive–behavioral intervention), relative to the comparison condition, showed a significant decline in parent use of guilt induction at the conclusion of the intervention (6 months postbaseline). Furthermore, reductions in parental guilt induction at 6 months were associated with significantly lower levels of youth negative cognitive style at 12 months. Finally, reductions in parental use of guilt induction were associated with lower youth internalizing symptoms 1 year following the conclusion of the intervention (18 months postbaseline). PMID:24438999

  18. Simulation of flight maneuver-load distributions by utilizing stationary, non-Gaussian random load histories

    NASA Technical Reports Server (NTRS)

    Leybold, H. A.

    1971-01-01

    Random numbers were generated with the aid of a digital computer and transformed such that the probability density function of a discrete random load history composed of these random numbers had one of the following non-Gaussian distributions: Poisson, binomial, log-normal, Weibull, and exponential. The resulting random load histories were analyzed to determine their peak statistics and were compared with cumulative peak maneuver-load distributions for fighter and transport aircraft in flight.
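
    The transformation the record describes can be sketched with inverse-CDF (quantile) sampling: uniform pseudo-random numbers are pushed through a target distribution's quantile function. The parameters below are illustrative, not those of the original study.

```python
import numpy as np

rng = np.random.default_rng(42)
u = rng.random(10_000)  # uniform(0, 1) pseudo-random numbers

# Inverse-CDF transform: pushing uniforms through a distribution's
# quantile function yields draws from that distribution. Two of the
# record's target distributions, with illustrative parameters:
exponential = -np.log(1.0 - u)                      # Exponential(rate = 1)
k, lam = 1.5, 2.0                                   # Weibull shape / scale
weibull = lam * (-np.log(1.0 - u)) ** (1.0 / k)

# Peak statistics of the resulting load history: interior local maxima.
w = weibull
peaks = w[1:-1][(w[1:-1] > w[:-2]) & (w[1:-1] > w[2:])]
```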

  19. Pigeons' Choices between Fixed-Interval and Random-Interval Schedules: Utility of Variability?

    ERIC Educational Resources Information Center

    Andrzejewski, Matthew E.; Cardinal, Claudia D.; Field, Douglas P.; Flannery, Barbara A.; Johnson, Michael; Bailey, Kathleen; Hineline, Philip N.

    2005-01-01

    Pigeons' choosing between fixed-interval and random-interval schedules of reinforcement was investigated in three experiments using a discrete-trial procedure. In all three experiments, the random-interval schedule was generated by sampling a probability distribution at an interval (and in multiples of the interval) equal to that of the…

  20. Hybrid spread spectrum radio system

    DOEpatents

    Smith, Stephen F.; Dress, William B.

    2010-02-02

    Systems and methods are described for hybrid spread spectrum radio systems. A method includes modulating a signal by utilizing a subset of bits from a pseudo-random code generator to control an amplification circuit that provides a gain to the signal. Another method includes: modulating a signal by utilizing a subset of bits from a pseudo-random code generator to control a fast hopping frequency synthesizer; and fast frequency hopping the signal with the fast hopping frequency synthesizer, wherein multiple frequency hops occur within a single data-bit time.
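
    A minimal software sketch of the PRN-controlled fast-hopping idea follows; the hop table, LFSR polynomial, and hops-per-bit count are assumptions for illustration, not values from the patent.

```python
FREQS_MHZ = [902.0 + 0.5 * i for i in range(8)]  # hypothetical 8-entry hop table
HOPS_PER_BIT = 4                                 # fast hopping: several hops per data bit

def lfsr_bits(state: int, n: int):
    """16-bit Fibonacci LFSR (taps 16, 14, 13, 11); yields n pseudo-random bits."""
    for _ in range(n):
        yield state & 1
        fb = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = ((state >> 1) | (fb << 15)) & 0xFFFF

def hop_sequence(data_bits, seed=0xACE1):
    """Use 3-bit subsets of the PRN stream to pick the synthesizer frequency
    for each hop; every data bit spans HOPS_PER_BIT frequency hops."""
    prn = lfsr_bits(seed, 3 * HOPS_PER_BIT * len(data_bits))
    hops = []
    for bit in data_bits:
        for _ in range(HOPS_PER_BIT):
            idx = (next(prn) << 2) | (next(prn) << 1) | next(prn)
            hops.append((bit, FREQS_MHZ[idx]))
    return hops
```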

  1. Randomized controlled trial of a cognitive-behavioral intervention for HIV-positive persons: an investigation of treatment effects on psychosocial adjustment.

    PubMed

    Carrico, Adam W; Chesney, Margaret A; Johnson, Mallory O; Morin, Stephen F; Neilands, Torsten B; Remien, Robert H; Rotheram-Borus, Mary Jane; Lennie Wong, F

    2009-06-01

    Questions remain regarding the clinical utility of psychological interventions for HIV-positive persons because randomized controlled trials have utilized stringent inclusion criteria and focused extensively on gay men. The present randomized controlled trial examined the efficacy of a 15-session, individually delivered cognitive-behavioral intervention (n = 467) compared to a wait-list control (n = 469) in a diverse sample of HIV-positive persons who reported HIV transmission risk behavior. Five intervention sessions that dealt with executing effective coping responses were delivered between baseline and 5 months post-randomization. Additional assessments were completed through 25 months post-randomization. Despite previously documented reductions in HIV transmission risk, no intervention-related changes in psychosocial adjustment were observed across the 25-month investigation period. In addition, there were no intervention effects on psychosocial adjustment among individuals who presented with mild to moderate depressive symptoms. More intensive mental health interventions may be necessary to improve psychosocial adjustment among HIV-positive individuals.

  2. Frequency, predictors, and consequences of crossing over to revascularization within 12 months of randomization to optimal medical therapy in the Clinical Outcomes Utilizing Revascularization and Aggressive Drug Evaluation (COURAGE) trial.

    PubMed

    Spertus, John A; Maron, David J; Cohen, David J; Kolm, Paul; Hartigan, Pam; Weintraub, William S; Berman, Daniel S; Teo, Koon K; Shaw, Leslee J; Sedlis, Steven P; Knudtson, Merril; Aslan, Mihaela; Dada, Marcin; Boden, William E; Mancini, G B John

    2013-07-01

    In the Clinical Outcomes Utilizing Revascularization and Aggressive Drug Evaluation (COURAGE) trial, some patients with stable ischemic heart disease randomized to optimal medical therapy (OMT) crossed over to early revascularization. The predictors and outcomes of patients who crossed over from OMT to revascularization are unknown. We compared characteristics of OMT patients who did and did not undergo revascularization within 12 months and created a Cox regression model to identify predictors of early revascularization. Patients' health status was measured with the Seattle Angina Questionnaire. To quantify the potential consequences of initiating OMT without percutaneous coronary intervention, we compared the outcomes of crossover patients with a matched cohort randomized to immediate percutaneous coronary intervention. Among 1148 patients randomized to OMT, 185 (16.1%) underwent early revascularization. Patient characteristics independently associated with early revascularization were worse baseline Seattle Angina Questionnaire scores and healthcare system. Among 156 OMT patients undergoing early revascularization matched to 156 patients randomized to percutaneous coronary intervention, rates of mortality (hazard ratio=0.51 [0.13-2.1]) and nonfatal myocardial infarction (hazard ratio=1.9 [0.75-4.6]) were similar, as were 1-year Seattle Angina Questionnaire scores. OMT patients, however, experienced worse health status over the initial year of treatment and more unstable angina admissions (hazard ratio=2.8 [1.1-7.5]). Among COURAGE patients assigned to OMT alone, patients' angina, dissatisfaction with their current treatment, and, to a lesser extent, their health system were associated with early revascularization. 
Because early crossover was not associated with an increase in irreversible ischemic events or impaired 12-month health status, these findings support an initial trial of OMT in stable ischemic heart disease with close follow-up of the most symptomatic patients. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00007657.

  3. Effects of a recovery management intervention on Chinese heroin users' community recovery through the mediation effect of enhanced service utilization.

    PubMed

    Wu, F; Fu, L M; Hser, Y H

    2015-09-01

    This study investigates whether a recovery management intervention (RMI) can improve the utilization of community drug treatment and wraparound services among heroin users in China and subsequently lead to positive recovery outcomes. Secondary analysis was conducted drawing data from a randomized controlled trial; 100 heroin users with no severe mental health problems were recruited in two Shanghai districts (Hongkou and Yangpu) upon their release from compulsory rehabilitation facilities. A latent variable modeling approach was utilized to test whether the RMI influences heroin users' perceived motivation and readiness for treatment, enhances treatment and wraparound service participation, and, in turn, predicts better recovery outcomes. Enrollment in drug treatment and other social service utilization increased significantly as a result of RMI rather than an individual drug user's motivation and readiness for treatment. Increased service utilization thus led to more positive individual recovery outcomes. In addition to this mediation effect through service utilization, the RMI also improved participants' community recovery directly. Findings suggest that better drug treatment enrollment, community service utilization and recovery outcomes can be potentially achieved among heroin users in China with carefully designed case management interventions. © The Author 2014. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. Dynamical behaviors of inter-out-of-equilibrium state intervals in Korean futures exchange markets

    NASA Astrophysics Data System (ADS)

    Lim, Gyuchang; Kim, SooYong; Kim, Kyungsik; Lee, Dong-In; Scalas, Enrico

    2008-05-01

    A recently discovered feature of financial markets, the two-phase phenomenon, is utilized to categorize a financial time series into two phases, namely equilibrium and out-of-equilibrium states. For out-of-equilibrium states, we analyze the time intervals at which the state is revisited. The power-law distribution of inter-out-of-equilibrium state intervals is shown and we present an analogy with discrete-time heat bath dynamics, similar to random Ising systems. In the mean-field approximation, this model reduces to a one-dimensional multiplicative process. By varying global and local model parameters, the relationship between volatilities in financial markets and the interaction strengths between agents in the Ising model is investigated and discussed.

  5. Probabilistic choice models in health-state valuation research: background, theories, assumptions and applications.

    PubMed

    Arons, Alexander M M; Krabbe, Paul F M

    2013-02-01

    Interest is rising in measuring subjective health outcomes, such as treatment outcomes that are not directly quantifiable (functional disability, symptoms, complaints, side effects and health-related quality of life). Health economists in particular have applied probabilistic choice models in the area of health evaluation. They increasingly use discrete choice models based on random utility theory to derive values for healthcare goods or services. Recent attempts have been made to use discrete choice models as an alternative method to derive values for health states. In this article, various probabilistic choice models are described according to their underlying theory. A historical overview traces their development and applications in diverse fields. The discussion highlights some theoretical and technical aspects of the choice models and their similarity and dissimilarity. The objective of the article is to elucidate the position of each model and their applications for health-state valuation.
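
    The link between random utility theory and discrete choice models that this record sketches can be illustrated numerically: with i.i.d. Gumbel errors, utility maximization yields the closed-form multinomial logit probabilities. The systematic utilities below are arbitrary values for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
V = np.array([0.4, 0.0, -0.3])        # arbitrary systematic utilities of 3 options

# Closed-form multinomial logit probabilities implied by i.i.d. Gumbel errors
p_logit = np.exp(V) / np.exp(V).sum()

# Simulation check: add Gumbel noise to each utility and choose the argmax
n = 200_000
eps = rng.gumbel(size=(n, V.size))
choices = (V + eps).argmax(axis=1)
p_sim = np.bincount(choices, minlength=V.size) / n
```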

  6. Genetically Engineered Pig Models for Human Diseases

    PubMed Central

    Prather, Randall S.; Lorson, Monique; Ross, Jason W.; Whyte, Jeffrey J.; Walters, Eric

    2015-01-01

    Although pigs are used widely as models of human disease, their utility as models has been enhanced by genetic engineering. Initially, transgenes were added randomly to the genome, but with the application of homologous recombination, zinc finger nucleases, and transcription activator-like effector nuclease (TALEN) technologies, almost any genetic change that can be envisioned can now be completed. To date these genetic modifications have resulted in animals that have the potential to provide new insights into human diseases for which a good animal model did not exist previously. These new animal models should provide the preclinical data for treatments that are developed for diseases such as Alzheimer's disease, cystic fibrosis, retinitis pigmentosa, spinal muscular atrophy, diabetes, and organ failure. These new models will help to uncover aspects and treatments of these diseases that were otherwise unattainable. The focus of this review is to describe genetically engineered pigs that have resulted in models of human diseases. PMID:25387017

  7. TREATMENT SWITCHING: STATISTICAL AND DECISION-MAKING CHALLENGES AND APPROACHES.

    PubMed

    Latimer, Nicholas R; Henshall, Chris; Siebert, Uwe; Bell, Helen

    2016-01-01

    Treatment switching refers to the situation in a randomized controlled trial where patients switch from their randomly assigned treatment onto an alternative. Often, switching is from the control group onto the experimental treatment. In this instance, a standard intention-to-treat analysis does not identify the true comparative effectiveness of the treatments under investigation. We aim to describe statistical methods for adjusting for treatment switching in a comprehensible way for nonstatisticians, and to summarize views on these methods expressed by stakeholders at the 2014 Adelaide International Workshop on Treatment Switching in Clinical Trials. We describe three statistical methods used to adjust for treatment switching: marginal structural models, two-stage adjustment, and rank preserving structural failure time models. We draw upon discussion heard at the Adelaide International Workshop to explore the views of stakeholders on the acceptability of these methods. Stakeholders noted that adjustment methods are based on assumptions, the validity of which may often be questionable. There was disagreement on the acceptability of adjustment methods, but consensus that when these are used, they should be justified rigorously. The utility of adjustment methods depends upon the decision being made and the processes used by the decision-maker. Treatment switching makes estimating the true comparative effect of a new treatment challenging. However, many decision-makers have reservations with adjustment methods. These, and how they affect the utility of adjustment methods, require further exploration. Further technical work is required to develop adjustment methods to meet real world needs, to enhance their acceptability to decision-makers.

  8. Concentration change of DA, DOPAC, Glu and GABA in brain tissues in schizophrenia developmental model rats induced by MK-801.

    PubMed

    Liu, Yong; Tang, Yamei; Pu, Weidan; Zhang, Xianghui; Zhao, Jingping

    2011-08-01

    To explore the related neurobiochemical mechanisms by comparing concentration changes of dopamine (DA), dihydroxyphenylacetic acid (DOPAC), glutamate (Glu), and γ-aminobutyric acid (GABA) in the brain tissues of schizophrenia (SZ) developmental model rats and chronic medication model rats. A total of 60 neonatal male Sprague-Dawley (SD) rats were randomly assigned to 3 groups on postnatal day 6: an SZ developmental rat model group (subcutaneous injection of MK-801 on postnatal days 7-10, 0.1 mg/kg, Bid), a chronic medication model group (intraperitoneal injection on postnatal days 47-60, 0.2 mg/kg, Qd), and a normal control group (injection of 0.9% normal saline during the corresponding periods). DA, DOPAC, Glu, and GABA in tissue homogenates from the medial prefrontal cortex (mPFC) and hippocampus were measured by high-performance liquid chromatography with CoulArray electrochemical detection. The utilization rates of DA and Glu were calculated. Compared with the normal control group, the concentrations of DA and DOPAC in the mPFC and the hippocampus in the SZ developmental model group decreased significantly (P<0.05), and the GABA concentration and Glu utilization rate in the mPFC also decreased (P<0.05). Compared with the chronic medication model group, the DA concentration in the mPFC of the SZ developmental group decreased (P<0.05), and the DOPAC concentration and DA utilization rate in the hippocampus also decreased (P<0.01 and P<0.05, respectively). The activities of the DA, Glu, and GABA systems decrease in the mPFC, and DA system function is reduced in the hippocampus, of SZ developmental rats.

  9. Demand side management in recycling and electricity retail pricing

    NASA Astrophysics Data System (ADS)

    Kazan, Osman

    This dissertation addresses several problems from the recycling industry and the electricity retail market. The first paper addresses a real-life scheduling problem faced by a national industrial recycling company. Based on the company's practices, a scheduling problem is defined, modeled, and analyzed, and a solution is approximated efficiently. The recommended application is tested on real-life data and on randomly generated data, and the scheduling improvements and financial benefits are presented. The second problem comes from the electricity retail market. Hourly electricity usage follows well-known daily patterns that change in shape and magnitude with the seasons and the days of the week. Generation costs are several times higher during the peak hours of the day, yet most consumers purchase electricity at flat rates. This work explores analytic pricing tools to reduce peak-load electricity demand for retailers. For that purpose, a nonlinear model that determines optimal hourly prices is established based on two major components: unit generation costs and consumers' utility. Both are analyzed and estimated empirically in the third paper. A pricing model is introduced to maximize the electric retailer's profit, and a closed-form expression for the optimal price vector is obtained. Possible scenarios are evaluated for the consumers' utility distribution; for the general case, we provide a numerical solution methodology to obtain the optimal pricing scheme. The recommended models are tested under various scenarios that consider consumer segmentation and multiple pricing policies, and they reduce the peak load significantly in most cases. Several utility companies offer hourly pricing to their customers, determining prices from historical data on unit electricity cost over time. In this dissertation we develop a nonlinear model that determines optimal hourly prices with parameter estimation. 
The last paper includes a regression analysis of the unit generation cost function obtained from Independent Service Operators. A consumer experiment is established to replicate the peak load behavior. As a result, consumers' utility function is estimated and optimal retail electricity prices are computed.

  10. Utility Estimation for Pediatric Vesicoureteral Reflux: Methodological Considerations Using an Online Survey Platform.

    PubMed

    Tejwani, Rohit; Wang, Hsin-Hsiao S; Lloyd, Jessica C; Kokorowski, Paul J; Nelson, Caleb P; Routh, Jonathan C

    2017-03-01

    The advent of online task distribution has opened a new avenue for efficiently gathering community perspectives needed for utility estimation. Methodological consensus for estimating pediatric utilities is lacking, with disagreement over whom to sample, what perspective to use (patient vs parent) and whether instrument induced anchoring bias is significant. We evaluated what methodological factors potentially impact utility estimates for vesicoureteral reflux. Cross-sectional surveys using a time trade-off instrument were conducted via the Amazon Mechanical Turk® (https://www.mturk.com) online interface. Respondents were randomized to answer questions from child, parent or dyad perspectives on the utility of a vesicoureteral reflux health state and 1 of 3 "warm-up" scenarios (paralysis, common cold, none) before a vesicoureteral reflux scenario. Utility estimates and potential predictors were fitted to a generalized linear model to determine what factors most impacted utilities. A total of 1,627 responses were obtained. Mean respondent age was 34.9 years. Of the respondents 48% were female, 38% were married and 44% had children. Utility values were uninfluenced by child/personal vesicoureteral reflux/urinary tract infection history, income or race. Utilities were affected by perspective and were higher in the child group (34% lower in parent vs child, p <0.001, and 13% lower in dyad vs child, p <0.001). Vesicoureteral reflux utility was not significantly affected by the presence or type of time trade-off warm-up scenario (p = 0.17). Time trade-off perspective affects utilities when estimated via an online interface. However, utilities are unaffected by the presence, type or absence of warm-up scenarios. These findings could have significant methodological implications for future utility elicitations regarding other pediatric conditions. Copyright © 2017 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
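
    For readers unfamiliar with the time trade-off instrument used in this record, the score itself is simple arithmetic; the sketch below shows the generic computation, not the study's exact elicitation wording.

```python
def tto_utility(years_in_state: float, years_traded: float) -> float:
    """Time trade-off score: a respondent indifferent between living
    `years_in_state` years in the health state and
    (`years_in_state` - `years_traded`) years in full health
    values the state at (t - y) / t."""
    if not 0.0 <= years_traded <= years_in_state:
        raise ValueError("years_traded must lie in [0, years_in_state]")
    return (years_in_state - years_traded) / years_in_state
```

    For example, trading 2 of 10 years for full health implies a utility of 0.8 for the health state.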

  11. Heterogeneity and nonlinearity in consumers’ preferences: An application to the olive oil shopping behavior in Chile

    PubMed Central

    Romo-Muñoz, Rodrigo Alejandro; Cabas-Monje, Juan Hernán; Garrido-Henrríquez, Héctor Manuel

    2017-01-01

    In relatively unknown products, consumers use prices as a quality reference. Under such circumstances, the utility function can be non-negative for a specific price range and generate an inverted U-shaped function. The extra virgin olive oil market in Chile is a good example. Although domestic production and consumption have increased significantly in the last few years, consumer knowledge of this product is still limited. The objective of this study was to analyze Chilean consumer preferences and willingness to pay for extra virgin olive oil attributes. Consumers were segmented taking into account purchasing frequency. A Random Parameter Logit model was estimated for preference heterogeneity. Results indicate that the utility function is nonlinear allowing us to differentiate between two regimes. In the first regime, olive oil behaves as a conspicuous good, that is, higher utility is assigned to higher prices and consumers prefer foreign products in smaller containers. Under the second regime, Chilean olive oil in larger containers is preferred. PMID:28892516

  12. Heterogeneity and nonlinearity in consumers' preferences: An application to the olive oil shopping behavior in Chile.

    PubMed

    Romo-Muñoz, Rodrigo Alejandro; Cabas-Monje, Juan Hernán; Garrido-Henrríquez, Héctor Manuel; Gil, José María

    2017-01-01

    In relatively unknown products, consumers use prices as a quality reference. Under such circumstances, the utility function can be non-negative for a specific price range and generate an inverted U-shaped function. The extra virgin olive oil market in Chile is a good example. Although domestic production and consumption have increased significantly in the last few years, consumer knowledge of this product is still limited. The objective of this study was to analyze Chilean consumer preferences and willingness to pay for extra virgin olive oil attributes. Consumers were segmented taking into account purchasing frequency. A Random Parameter Logit model was estimated for preference heterogeneity. Results indicate that the utility function is nonlinear allowing us to differentiate between two regimes. In the first regime, olive oil behaves as a conspicuous good, that is, higher utility is assigned to higher prices and consumers prefer foreign products in smaller containers. Under the second regime, Chilean olive oil in larger containers is preferred.

  13. A spatial dynamic model to assess piospheric land degradation processes of SW Iberian rangelands

    NASA Astrophysics Data System (ADS)

    Herguido Sevillano, Estela; Ibáñez, Javier; Francisco Lavado Contador, Joaquín; Pulido-Fernández, Manuel; Schnabel, Susanne

    2015-04-01

    Iberian open wooded rangelands (known as dehesas or montados) are valuable agro-silvo-pastoral systems traditionally considered highly sustainable. Nevertheless, in recent decades these systems have undergone changes in land use and management practices that compromise their sustainability. Some of these changes, such as the growing construction of watering points and the high spatial fragmentation and restriction of livestock movement associated with fencing, combine with livestock pressure to produce an impact gradient on soil and vegetation. Soil compaction related to livestock pressure is higher around watering points, where bare soil halos and patches of sparse vegetation develop more frequently. Using the freeware Dinamica EGO as the environmental modeling platform, we developed a theoretical spatial dynamic model that represents some of the land degradation processes associated with livestock grazing in fenced dehesa enclosures. Spatial resolution is high: every cell in the model is a square unit area of 1 m2. We paid particular attention to the relationships between soil degradation by compaction (porosity), livestock pressure, rainfall, pasture growth, shrub cover, and bare soil generation. The model relates pasture growth to soil compaction, measured by the pore space in the top 10 cm soil layer. Annual precipitation is randomly generated from a normal distribution. When annual precipitation and pore space increase, so does pasture growth. There is also a feedback between pasture growth and pore space, since pasture roots increase soil porosity. The cell utility for livestock is defined as an exponential function of a cell's distance to watering points and the amount of pasture present in it. The closer the cell to a pond and the greater the amount of pasture, the higher the cell's utility. 
The latter is modulated by a normal random variable to capture accidental effects; this variable has zero mean and a standard deviation linearly related to the distance to the pond. Livestock utilization of a cell is a function of its relative utility, the stocking rate, and the time that animals spend in the enclosure. Since livestock trampling promotes soil compaction, livestock utilization has a negative effect on pore space. The probability of transition from herbaceous to shrub cover is also modulated by pore space, and thus by livestock utilization, since shrub establishment requires a minimum porosity for seeds to germinate successfully. In addition, it is influenced by the proportion of cells occupied by shrubs within a radius where seed dispersal or exclusion by competition may occur. The model also includes a probability of transition from shrubs back to herbaceous cover through shrub mortality, and tracks shrub age, which influences seed production and shrub cover. Pasture consumption by livestock and the pasture remaining at the end of summer are also modeled, making it possible to obtain maps of bare soil at that time. Likewise, the model generates maps of vegetation state (shrubs or herbaceous) and pasture growth. The values of the 31 parameters were obtained from field measurements and from publications; parameters lacking quantitative information were calibrated by comparing model performance with the dynamics of real enclosures analyzed in orthophotographs between 1984 and 2009. Stocking rates were inferred from farmers' interviews performed in 2009 about present and past land use and management practices. The model is intended to analyze livestock management strategies in dehesas. In particular, soil conservation practices related to livestock pressure can be simulated in search of optimized schemes. 
Moreover, the model makes it possible to run simulations for future climate scenarios, studying the effects of climate change on the livestock carrying capacity of these systems. Thanks to the Spanish Ministerio de Economía y Competitividad for financially supporting this study through the AMID (CGL2011-23361) project.
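
    The distance- and pasture-dependent cell utility described in this record can be sketched as follows; the grid size, decay constant, noise slope, and pasture values are assumed for illustration, not the study's calibrated parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
size = 100                         # 100 x 100 m enclosure of 1 m^2 cells
pond = (0, 0)                      # watering point at one corner (assumed)
alpha, noise_slope = 30.0, 0.002   # hypothetical decay and noise parameters

ys, xs = np.mgrid[0:size, 0:size]
dist = np.hypot(ys - pond[0], xs - pond[1])
pasture = rng.uniform(0.5, 1.5, size=(size, size))   # stand-in pasture biomass

# Utility falls off exponentially with distance to the pond and rises with
# pasture, modulated by zero-mean noise whose sd grows linearly with distance.
utility = np.exp(-dist / alpha) * pasture
utility += rng.normal(0.0, noise_slope * dist)
utility = np.clip(utility, 0.0, None)

# Livestock use of each cell proportional to its relative utility.
use_share = utility / utility.sum()
```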

  14. Stochastic modelling of a single ion channel: an alternating renewal approach with application to limited time resolution.

    PubMed

    Milne, R K; Yeo, G F; Edeson, R O; Madsen, B W

    1988-04-22

    Stochastic models of ion channels have been based largely on Markov theory where individual states and transition rates must be specified, and sojourn-time densities for each state are constrained to be exponential. This study presents an approach based on random-sum methods and alternating-renewal theory, allowing individual states to be grouped into classes provided the successive sojourn times in a given class are independent and identically distributed. Under these conditions Markov models form a special case. The utility of the approach is illustrated by considering the effects of limited time resolution (modelled by using a discrete detection limit, xi) on the properties of observable events, with emphasis on the observed open-time (xi-open-time). The cumulants and Laplace transform for a xi-open-time are derived for a range of Markov and non-Markov models; several useful approximations to the xi-open-time density function are presented. Numerical studies show that the effects of limited time resolution can be extreme, and also highlight the relative importance of the various model parameters. The theory could form a basis for future inferential studies in which parameter estimation takes account of limited time resolution in single channel records. Appendixes include relevant results concerning random sums and a discussion of the role of exponential distributions in Markov models.
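
    The effect of a detection limit xi on observed open times can be illustrated with a small simulation of the two-state Markov special case; the rates and xi below are arbitrary assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
xi = 0.2                            # detection limit (arbitrary time units)
n = 50_000
opens = rng.exponential(1.0, n)     # true open sojourn times
closes = rng.exponential(0.5, n)    # true closed sojourn times

# Limited time resolution: closures shorter than xi go undetected, so the
# flanking open periods merge into a single observed "xi-open-time".
observed, cur = [], opens[0]
for o, c in zip(opens[1:], closes):
    if c < xi:
        cur += c + o    # missed closure: extend the current open time
    else:
        observed.append(cur)
        cur = o
observed.append(cur)
observed = np.array(observed)
```

    With these rates roughly a third of the closures fall below xi, so the observed open-time distribution has fewer, longer events than the true one.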

  15. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. 
This process is repeated until a threshold in the objective function is met or insufficient changes are produced in successive iterations.
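    The circle-interpolation trick can be sketched in a few lines of numpy. This is a toy illustration, not the authors' code: the forward model, field sizes, and function names are all hypothetical, but the structure follows the abstract: evaluate the forward model at n equally spaced angles, interpolate the per-location solutions around the circle, then search the interpolated solutions for the angle minimising the misfit.

    ```python
    import numpy as np

    def mix(field_a, field_b, theta):
        """Unit-circle mixture: weights (cos t, sin t) preserve the covariance structure."""
        return np.cos(theta) * field_a + np.sin(theta) * field_b

    def cheap_search(field_a, field_b, forward, observed, n=8, n_fine=360):
        """Run the forward model only at n coarse angles, interpolate each
        observation location's solution around the circle, and return the
        angle whose interpolated solution minimises the squared misfit."""
        coarse = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
        sols = np.array([forward(mix(field_a, field_b, t)) for t in coarse])  # (n, n_obs)
        fine = np.linspace(0.0, 2.0 * np.pi, n_fine, endpoint=False)
        # periodic interpolation of the solutions (not the objective values)
        interp = np.array([np.interp(fine, coarse, sols[:, j], period=2.0 * np.pi)
                           for j in range(sols.shape[1])]).T                  # (n_fine, n_obs)
        objective = ((interp - observed) ** 2).sum(axis=1)
        return fine[np.argmin(objective)]

    rng = np.random.default_rng(0)
    a, b = rng.normal(size=50), rng.normal(size=50)
    forward = lambda f: f[:5]            # toy "forward model": observe 5 cells
    target = forward(mix(a, b, 1.0))     # pretend theta = 1.0 explains the data
    best = cheap_search(a, b, forward, target)
    ```

    With only n = 8 forward-model runs, the interpolated search recovers a mixing angle close to the one that generated the data.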

  16. Logging utilization in Idaho: Current and past trends

    Treesearch

    Eric A. Simmons; Todd A. Morgan; Erik C. Berg; Stanley J. Zarnoch; Steven W. Hayes; Mike T. Thompson

    2014-01-01

    A study of commercial timber-harvesting activities in Idaho was conducted during 2008 and 2011 to characterize current tree utilization, logging operations, and changes from previous Idaho logging utilization studies. A two-stage simple random sampling design was used to select sites and felled trees for measurement within active logging sites. Thirty-three logging...

  17. Modelling the Cost Effectiveness of Disease-Modifying Treatments for Multiple Sclerosis

    PubMed Central

    Thompson, Joel P.; Abdolahi, Amir; Noyes, Katia

    2013-01-01

    Several cost-effectiveness models of disease-modifying treatments (DMTs) for multiple sclerosis (MS) have been developed for different populations and different countries. Vast differences in the approaches and discrepancies in the results give rise to heated discussions and limit the use of these models. Our main objective is to discuss the methodological challenges in modelling the cost effectiveness of treatments for MS. We conducted a review of published models to describe the approaches taken to date, to identify the key parameters that influence the cost effectiveness of DMTs, and to point out major areas of weakness and uncertainty. Thirty-six published models and analyses were identified. The greatest source of uncertainty is the absence of head-to-head randomized clinical trials. Modellers have used various techniques to compensate, including utilizing extension trials. The use of large observational cohorts in recent studies aids in identifying population-based, ‘real-world’ treatment effects. Major drivers of results include the time horizon modelled and DMT acquisition costs. Model endpoints must target either policy makers (using cost-utility analysis) or clinicians (conducting cost-effectiveness analyses). Lastly, the cost effectiveness of DMTs outside North America and Europe is currently unknown, with the lack of country-specific data as the major limiting factor. We suggest that limited data should not preclude analyses, as models may be built and updated in the future as data become available. Disclosure of modelling methods and assumptions could improve the transferability and applicability of models designed to reflect different healthcare systems. PMID:23640103

  18. Dynamic Resource Management for Parallel Tasks in an Oversubscribed Energy-Constrained Heterogeneous Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Koenig, Gregory A; Machovec, Dylan

    2016-01-01

    Abstract: The worth of completing parallel tasks is modeled using utility functions, which monotonically decrease with time and represent the importance and urgency of a task. These functions define the utility earned by a task at the time of its completion. The performance of such a system is measured as the total utility earned by all completed tasks over some interval of time (e.g., 24 hours). To maximize system performance when scheduling dynamically arriving parallel tasks onto a high performance computing (HPC) system that is oversubscribed and energy-constrained, we have designed, analyzed, and compared different heuristic techniques. Four utility-aware heuristics (i.e., Max Utility, Max Utility-per-Time, Max Utility-per-Resource, and Max Utility-per-Energy), three FCFS-based heuristics (Conservative Backfilling, EASY Backfilling, and FCFS with Multiple Queues), and a Random heuristic were examined in this study. A technique that is often used with the FCFS-based heuristics is the concept of a permanent reservation. We compare the performance of permanent reservations with temporary place-holders to demonstrate the advantages that place-holders can provide. We also present a novel energy filtering technique that constrains the maximum energy-per-resource used by each task. We conducted a simulation study to evaluate the performance of these heuristics and techniques in an energy-constrained oversubscribed HPC environment. With place-holders, energy filtering, and dropping tasks with low potential utility, our utility-aware heuristics are able to significantly outperform the existing FCFS-based techniques.
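    The four utility-aware rankings reduce to choosing a different key function over the queued tasks. The sketch below is illustrative only: the Task fields and numbers are made up, not the paper's task model.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Task:
        utility: float     # utility earned if the task completes now
        exec_time: float   # estimated run time (hours)
        nodes: int         # resources requested
        energy: float      # estimated energy use (kWh)

    # The four utility-aware heuristics from the abstract, as key functions.
    heuristics = {
        "Max Utility":              lambda t: t.utility,
        "Max Utility-per-Time":     lambda t: t.utility / t.exec_time,
        "Max Utility-per-Resource": lambda t: t.utility / t.nodes,
        "Max Utility-per-Energy":   lambda t: t.utility / t.energy,
    }

    def pick(tasks, heuristic):
        """Return the queued task the scheduler would start next under one heuristic."""
        return max(tasks, key=heuristics[heuristic])

    queue = [Task(10.0, 1.0, 4, 2.0), Task(9.0, 0.5, 1, 3.0), Task(12.0, 4.0, 8, 1.0)]
    ```

    Note how the same queue yields different choices: the raw-utility rule favors the big task, while the per-time and per-resource rules favor the small, fast one.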

  19. An analytics approach to designing patient centered medical homes.

    PubMed

    Ajorlou, Saeede; Shams, Issac; Yang, Kai

    2015-03-01

    Recently the patient centered medical home (PCMH) model has become a popular team-based approach focused on delivering more streamlined care to patients. In current practices of medical homes, a clinically based prediction framework is recommended because it can help match the portfolio capacity of PCMH teams with the actual load generated by a set of patients. Without such a balance of clinical supply and demand, issues such as underutilization and overutilization of physicians, long waiting times for receiving the appropriate treatment, and discontinuity of care will eliminate many advantages of the medical home strategy. In this paper, by using the hierarchical generalized linear model with multivariate responses, we develop a clinical workload prediction model for care portfolio demands in a Bayesian framework. The model allows for heterogeneous variances and unstructured covariance matrices for nested random effects that arise through complex hierarchical care systems. We show that using a multivariate approach substantially enhances the precision of workload predictions at both primary and non-primary care levels. We also demonstrate that care demands depend not only on patient demographics but also on other utilization factors, such as length of stay. Our analyses of recent data from the Veterans Health Administration further indicate that risk adjustment for patient health conditions can considerably improve the predictive power of the model.

  20. [Prediction model of health workforce and beds in county hospitals of Hunan by multiple linear regression].

    PubMed

    Ling, Ru; Liu, Jiawang

    2011-12-01

    To construct prediction models for health workforce and hospital beds in county hospitals of Hunan by multiple linear regression. We surveyed 16 counties in Hunan with stratified random sampling according to uniform questionnaires, and multiple linear regression analysis was done with 20 indicators selected by literature review. Independent variables in the multiple linear regression model on medical personnel in county hospitals included the counties' urban residents' income, crude death rate, medical beds, business occupancy, professional equipment value, the number of devices valued above 10 000 yuan, fixed assets, long-term debt, medical income, medical expenses, outpatient and emergency visits, hospital visits, actual available bed days, and utilization rate of hospital beds. Independent variables in the multiple linear regression model on county hospital beds included the population aged 65 and above in the counties, disposable income of urban residents, medical personnel of medical institutions in the county area, business occupancy, the total value of professional equipment, fixed assets, long-term debt, medical income, medical expenses, outpatient and emergency visits, hospital visits, actual available bed days, utilization rate of hospital beds, and length of hospitalization. The prediction models show good explanatory power and fit, and may be used for short- and mid-term forecasting.
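    The estimator behind such a model is ordinary least squares with an intercept column. A minimal numpy sketch with simulated data (the predictors, coefficients, and county count are placeholders, not the study's values):

    ```python
    import numpy as np

    # Simulate data: beds = b0 + b . predictors + noise, for 16 "counties".
    rng = np.random.default_rng(1)
    n = 16                                          # e.g. 16 surveyed counties
    X = rng.normal(size=(n, 3))                     # 3 standardised predictors
    true_beta = np.array([50.0, 12.0, -4.0, 7.0])   # intercept + 3 slopes (made up)
    y = true_beta[0] + X @ true_beta[1:] + rng.normal(scale=0.5, size=n)

    # Ordinary least squares via lstsq on the design matrix with intercept.
    X1 = np.column_stack([np.ones(n), X])
    beta_hat, *_ = np.linalg.lstsq(X1, y, rcond=None)

    def predict(x_new):
        """Short- or mid-term forecast for a new county's predictor values."""
        return beta_hat[0] + x_new @ beta_hat[1:]
    ```

    With real survey data, X would hold the 14-plus quotas listed in the abstract and y the observed staffing or bed counts.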

  1. Under What Circumstances Does External Knowledge about the Correlation Structure Improve Power in Cluster Randomized Designs?

    ERIC Educational Resources Information Center

    Rhoads, Christopher

    2014-01-01

    Recent publications have drawn attention to the idea of utilizing prior information about the correlation structure to improve statistical power in cluster randomized experiments. Because power in cluster randomized designs is a function of many different parameters, it has been difficult for applied researchers to discern a simple rule explaining…

  2. The lead time tradeoff: the case of health states better than dead.

    PubMed

    Pinto-Prades, José Luis; Rodríguez-Míguez, Eva

    2015-04-01

    Lead time tradeoff (L-TTO) is a variant of the time tradeoff (TTO). L-TTO introduces a lead period in full health before illness onset, avoiding the need to use 2 different procedures for states better and worse than dead. To estimate utilities, additive separability is assumed. We tested to what extent violations of this assumption can bias utilities estimated with L-TTO. A sample of 500 members of the Spanish general population evaluated 24 health states, using face-to-face interviews. A total of 188 subjects were interviewed with L-TTO and the rest with TTO. Both samples evaluated the same set of 24 health states, divided into 4 groups with 6 health states per set. Each subject evaluated 1 of the sets. A random effects regression model was fitted to our data. Only health states better than dead were included in the regression since it is in this subset where additive separability can be tested clearly. Utilities were higher in L-TTO relative to TTO (on average L-TTO adds about 0.2 points to the utility of health states), suggesting that additive separability is violated. The difference between methods increased with the severity of the health state. Thus, L-TTO adds about 0.14 points to the average utility of the less severe states, 0.23 to the intermediate states, and 0.28 points to the more severe states. L-TTO produced higher utilities than TTO. Health problems are perceived as less severe if a lead period in full health is added upfront, implying that there are interactions between disjointed time periods. The advantages of this method have to be compared with the cost of modeling the interaction between periods. © The Author(s) 2014.
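    Under additive separability the L-TTO utility is a one-line formula: if a respondent is indifferent between (lead years in full health + duration years in state h) and x years in full health, then lead + duration · u(h) = x. The 10 + 10 design below is illustrative, not necessarily the study's:

    ```python
    def ltto_utility(x, lead=10.0, duration=10.0):
        """Lead-time TTO score under additive separability:
        lead + duration * u = x  =>  u = (x - lead) / duration."""
        return (x - lead) / duration

    # States worse than dead simply yield x < lead and hence u < 0,
    # so no second elicitation procedure is needed.
    ```

    This is exactly the assumption the study tests: if the lead period interacts with the disease period, the linear formula overstates the utility, which is consistent with the roughly 0.2-point upward shift the authors observe.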

  3. Economic evaluation in short bowel syndrome (SBS): an algorithm to estimate utility scores for a patient-reported SBS-specific quality of life scale (SBS-QoL™).

    PubMed

    Lloyd, Andrew; Kerr, Cicely; Breheny, Katie; Brazier, John; Ortiz, Aurora; Borg, Emma

    2014-03-01

    Condition-specific preference-based measures can offer utility data where they would not otherwise be available or where generic measures may lack sensitivity, although they lack comparability across conditions. This study aimed to develop an algorithm for estimating utilities from the short bowel syndrome health-related quality of life scale (SBS-QoL™). SBS-QoL™ items were selected based on factor and item performance analysis of a European SBS-QoL™ dataset and consultation with 3 SBS clinical experts. Six-dimension health states were developed using 8 SBS-QoL™ items (2 dimensions combined 2 SBS-QoL™ items). SBS health states were valued by a UK general population sample (N = 250) using the lead-time time trade-off method. Preference weights or 'utility decrements' for each severity level of each dimension were estimated by regression models and used to develop the scoring algorithm. Mean utilities for the SBS health states ranged from -0.46 (worst health state, very much affected on all dimensions) to 0.92 (best health state, not at all affected on all dimensions). The random effects model with maximum likelihood estimation regression had the best predictive ability and lowest root mean squared error and mean absolute error, and was used to develop the scoring algorithm. The preference-weighted scoring algorithm for the SBS-QoL™ developed is able to estimate a wide range of utility values from patient-level SBS-QoL™ data. This allows estimation of SBS HRQL impact for the purpose of economic evaluation of SBS treatment benefits.
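    A preference-weighted scoring algorithm of this kind typically subtracts a regression-estimated decrement per dimension level from the best state's utility. The dimension names and weights below are placeholders (the real weights are in the paper); they are chosen only so that the endpoints reproduce the reported range of 0.92 to -0.46:

    ```python
    # Hypothetical utility decrements per dimension level (level 1 = not at all
    # affected ... level 4 = very much affected). Illustrative numbers only.
    DECREMENTS = {
        "diet":     [0.00, 0.06, 0.14, 0.23],
        "fatigue":  [0.00, 0.05, 0.12, 0.23],
        "sleep":    [0.00, 0.04, 0.10, 0.23],
        "social":   [0.00, 0.06, 0.13, 0.23],
        "mood":     [0.00, 0.05, 0.11, 0.23],
        "mobility": [0.00, 0.04, 0.09, 0.23],
    }

    def utility(levels, best=0.92):
        """Start from the best state's utility and subtract the decrement
        for the chosen level of each of the six dimensions."""
        return best - sum(DECREMENTS[d][lvl - 1] for d, lvl in levels.items())

    best_state  = {d: 1 for d in DECREMENTS}   # not at all affected
    worst_state = {d: 4 for d in DECREMENTS}   # very much affected
    ```

    Any patient-level SBS-QoL™ response maps to a six-level profile and then, through the decrements, to a single utility suitable for QALY calculations.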

  4. A cost-utility analysis of the use of preoperative computed tomographic angiography in abdomen-based perforator flap breast reconstruction.

    PubMed

    Offodile, Anaeze C; Chatterjee, Abhishek; Vallejo, Sergio; Fisher, Carla S; Tchou, Julia C; Guo, Lifei

    2015-04-01

    Computed tomographic angiography is a diagnostic tool increasingly used for preoperative vascular mapping in abdomen-based perforator flap breast reconstruction. This study compared the use of computed tomographic angiography and the conventional practice of Doppler ultrasonography only in postmastectomy reconstruction using a cost-utility model. Following a comprehensive literature review, a decision analytic model was created using the three most clinically relevant health outcomes in free autologous breast reconstruction with computed tomographic angiography versus Doppler ultrasonography only. Cost and utility estimates for each health outcome were used to derive the quality-adjusted life-years and incremental cost-utility ratio. One-way sensitivity analysis was performed to scrutinize the robustness of the authors' results. Six studies and 782 patients were identified. Cost-utility analysis revealed a baseline cost savings of $3179, a gain in quality-adjusted life-years of 0.25. This yielded an incremental cost-utility ratio of -$12,716, implying a dominant choice favoring preoperative computed tomographic angiography. Sensitivity analysis revealed that computed tomographic angiography was costlier when the operative time difference between the two techniques was less than 21.3 minutes. However, the clinical advantage of computed tomographic angiography over Doppler ultrasonography only showed that computed tomographic angiography would still remain the cost-effective option even if it offered no additional operating time advantage. The authors' results show that computed tomographic angiography is a cost-effective technology for identifying lower abdominal perforators for autologous breast reconstruction. Although the perfect study would be a randomized controlled trial of the two approaches with true cost accrual, the authors' results represent the best available evidence.
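    The headline numbers follow directly from the incremental cost-utility ratio. A minimal sketch using the figures reported in the abstract:

    ```python
    def icur(delta_cost, delta_qaly):
        """Incremental cost-utility ratio: incremental cost per QALY gained.
        A negative ratio with a positive QALY gain marks the new strategy
        as dominant (cheaper and more effective)."""
        return delta_cost / delta_qaly

    # Abstract's figures: CTA saves $3,179 and gains 0.25 QALYs versus
    # Doppler ultrasonography only.
    ratio = icur(delta_cost=-3179.0, delta_qaly=0.25)
    dominant = ratio < 0   # negative cost difference with a QALY gain
    ```

    The computed ratio reproduces the abstract's -$12,716 per QALY.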

  5. The Randomized CRM: An Approach to Overcoming the Long-Memory Property of the CRM

    PubMed Central

    Koopmeiners, Joseph S.; Wey, Andrew

    2017-01-01

    The primary object of a phase I clinical trial is to determine the maximum tolerated dose (MTD). Typically, the MTD is identified using a dose-escalation study, where initial subjects are treated at the lowest dose level and subsequent subjects are treated at progressively higher dose levels until the MTD is identified. The continual reassessment method (CRM) is a popular model-based dose-escalation design, which utilizes a formal model for the relationship between dose and toxicity to guide dose-finding. Recently, it was shown that the CRM has a tendency to get “stuck” on a dose-level, with little escalation or de-escalation in the late stages of the trial, due to the long-memory property of the CRM. We propose the randomized CRM (rCRM), which introduces random escalation and de-escalation into the standard CRM dose-finding algorithm, as well as a hybrid approach that incorporates escalation and de-escalation only when certain criteria are met. Our simulation results show that both the rCRM and the hybrid approach reduce the trial-to-trial variability in the number of cohorts treated at the MTD but that the hybrid approach has a more favorable trade-off with respect to the average number treated at the MTD. PMID:28340333
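    The randomization step can be sketched on top of a standard CRM dose-selection rule. This is our reading of the idea, not the authors' exact algorithm; the jitter probability and dose grid are hypothetical:

    ```python
    import random

    def crm_choice(tox_estimates, target=0.25):
        """Standard CRM step: pick the dose whose estimated toxicity
        probability is closest to the target rate."""
        return min(range(len(tox_estimates)),
                   key=lambda d: abs(tox_estimates[d] - target))

    def rcrm_choice(tox_estimates, target=0.25, jitter=0.2, rng=random):
        """Randomized-CRM sketch: with probability `jitter`, escalate or
        de-escalate one level from the CRM choice, so the design keeps
        exploring instead of getting stuck late in the trial."""
        d = crm_choice(tox_estimates, target)
        if rng.random() < jitter:
            d += rng.choice([-1, 1])
        return max(0, min(d, len(tox_estimates) - 1))   # stay on the dose grid

    est = [0.05, 0.12, 0.24, 0.41, 0.55]   # posterior toxicity estimates (toy)
    ```

    With jitter set to zero the rule collapses back to the deterministic CRM; the hybrid approach in the paper applies the random move only when certain criteria are met.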

  6. The Randomized CRM: An Approach to Overcoming the Long-Memory Property of the CRM.

    PubMed

    Koopmeiners, Joseph S; Wey, Andrew

    2017-01-01

    The primary object of a Phase I clinical trial is to determine the maximum tolerated dose (MTD). Typically, the MTD is identified using a dose-escalation study, where initial subjects are treated at the lowest dose level and subsequent subjects are treated at progressively higher dose levels until the MTD is identified. The continual reassessment method (CRM) is a popular model-based dose-escalation design, which utilizes a formal model for the relationship between dose and toxicity to guide dose finding. Recently, it was shown that the CRM has a tendency to get "stuck" on a dose level, with little escalation or de-escalation in the late stages of the trial, due to the long-memory property of the CRM. We propose the randomized CRM (rCRM), which introduces random escalation and de-escalation into the standard CRM dose-finding algorithm, as well as a hybrid approach that incorporates escalation and de-escalation only when certain criteria are met. Our simulation results show that both the rCRM and the hybrid approach reduce the trial-to-trial variability in the number of cohorts treated at the MTD but that the hybrid approach has a more favorable tradeoff with respect to the average number treated at the MTD.

  7. Unravelling changing interspecific interactions across environmental gradients using Markov random fields.

    PubMed

    Clark, Nicholas J; Wells, Konstans; Lindberg, Oscar

    2018-05-16

    Inferring interactions between co-occurring species is key to identify processes governing community assembly. Incorporating interspecific interactions in predictive models is common in ecology, yet most methods do not adequately account for indirect interactions (where an interaction between two species is masked by their shared interactions with a third) and assume interactions do not vary along environmental gradients. Markov random fields (MRF) overcome these limitations by estimating interspecific interactions, while controlling for indirect interactions, from multispecies occurrence data. We illustrate the utility of MRFs for ecologists interested in interspecific interactions, and demonstrate how covariates can be included (a set of models known as Conditional Random Fields, CRF) to infer how interactions vary along environmental gradients. We apply CRFs to two data sets of presence-absence data. The first illustrates how blood parasite (Haemoproteus, Plasmodium, and nematode microfilaria spp.) co-infection probabilities covary with relative abundance of their avian hosts. The second shows that co-occurrences between mosquito larvae and predatory insects vary along water temperature gradients. Other applications are discussed, including the potential to identify replacement or shifting impacts of highly connected species along climate or land-use gradients. We provide tools for building CRFs and plotting/interpreting results as an R package. © 2018 by the Ecological Society of America.

  8. Impact of a social-emotional and character development program on school-level indicators of academic achievement, absenteeism, and disciplinary outcomes: A matched-pair, cluster randomized, controlled trial.

    PubMed

    Snyder, Frank; Flay, Brian; Vuchinich, Samuel; Acock, Alan; Washburn, Isaac; Beets, Michael; Li, Kin-Kit

    2010-01-01

    This paper reports the effects of a comprehensive elementary school-based social-emotional and character education program on school-level achievement, absenteeism, and disciplinary outcomes utilizing a matched-pair, cluster randomized, controlled design. The Positive Action Hawai'i trial included 20 racially/ethnically diverse schools (mean enrollment = 544) and was conducted from the 2002-03 through the 2005-06 academic years. Using school-level archival data, analyses comparing change from baseline (2002) to one-year post trial (2007) revealed that intervention schools scored 9.8% better on the TerraNova (2nd ed.) test for reading and 8.8% better on math; 20.7% better in Hawai'i Content and Performance Standards scores for reading and 51.4% better in math; and that intervention schools reported 15.2% lower absenteeism and fewer suspensions (72.6%) and retentions (72.7%). Overall, effect sizes were moderate to large (range 0.5-1.1) for all of the examined outcomes. Sensitivity analyses using permutation models and random-intercept growth curve models substantiated results. The results provide evidence that a comprehensive school-based program, specifically developed to target student behavior and character, can positively influence school-level achievement, attendance, and disciplinary outcomes concurrently.

  9. Institutional delivery and postnatal care services utilizations in Abuna Gindeberet District, West Shewa, Oromiya Region, Central Ethiopia: A Community-based cross sectional study.

    PubMed

    Darega, Birhanu; Dida, Nagasa; Tafese, Fikru; Ololo, Shimeles

    2016-07-07

    Delivery at health institutions under the care of trained health-care providers and utilization of postnatal care services play vital roles in promoting child survival and reducing the risk of maternal mortality. More than 80 % of maternal deaths can be prevented if pregnant women have access to essential maternity care such as antenatal care, institutional delivery and postnatal care services. Thus, this study aimed to assess institutional delivery and postnatal care service utilization in Abuna Gindeberet District, West Shewa, Oromiya Regional State, Ethiopia. A community-based cross-sectional study design was employed among 703 randomly identified mothers of Abuna Gindeberet district in March, 2013. Data were collected through interviewer-administered questionnaires and analyzed using SPSS version 16.0. Descriptive, bivariate and multivariate analyses were used to determine prevalence and to identify factors associated with institutional delivery and postnatal care, considering a p-value of less than 0.05 as significant. The results were presented in narrative form, tables and graphs. One hundred one (14.4 %) of the mothers gave birth to their last baby in a health institution. Of the 556 (79.1 %) respondents who had heard about postnatal care services, only 223 (31.7 %) utilized postnatal care services for their recent childbirth. Of the postnatal care users, 204 (91.5 %) received the services from health extension workers. Decision-making styles, household distance from health institutions, the household being a model family and ANC service utilization were found to be statistically significant for both institutional delivery and postnatal care service utilization. The educational status of husbands was statistically significant only for postnatal care service utilization. Both institutional delivery and postnatal care service utilization from health institutions were low.
Decision-making styles, household distance from health institutions, the household being a model family and ANC service utilization were the common factors affecting institutional delivery and postnatal care service utilization from health institutions. Therefore, giving attention to the identified factors could improve and sustain institutional delivery and postnatal care service utilization from health institutions.

  10. Prediction of treatment outcomes to exercise in patients with nonremitted major depressive disorder.

    PubMed

    Rethorst, Chad D; South, Charles C; Rush, A John; Greer, Tracy L; Trivedi, Madhukar H

    2017-12-01

    Only one-third of patients with major depressive disorder (MDD) achieve remission with initial treatment. Consequently, current clinical practice relies on a "trial-and-error" approach to identify an effective treatment for each patient. The purpose of this report was to determine whether we could identify a set of clinical and biological parameters with potential clinical utility for prescription of exercise for treatment of MDD in a secondary analysis of the Treatment with Exercise Augmentation in Depression (TREAD) trial. Participants with nonremitted MDD were randomized to one of two exercise doses for 12 weeks. Participants were categorized as "remitters" (≤12 on the IDS-C), nonresponders (<30% drop in IDS-C), or neither. The least absolute shrinkage and selection operator (LASSO) and random forests were used to evaluate 30 variables as predictors of both remission and nonresponse. Predictors were used to model treatment outcomes using logistic regression. Of the 122 participants, 36 were categorized as remitters (29.5%), 56 as nonresponders (45.9%), and 30 as neither (24.6%). Predictors of remission were higher levels of brain-derived neurotrophic factor (BDNF) and IL-1B, greater depressive symptom severity, and higher postexercise positive affect. Predictors of treatment nonresponse were low cardiorespiratory fitness, lower levels of IL-6 and BDNF, and lower postexercise positive affect. Models including these predictors resulted in predictive values greater than 70% (true predicted remitters/all predicted remitters) with specificities greater than 25% (true predicted remitters/all remitters). Results indicate feasibility in identifying patients who will either remit or not respond to exercise as a treatment for MDD utilizing a clinical decision model that incorporates multiple patient characteristics. © 2017 Wiley Periodicals, Inc.

  11. Bayesian randomized clinical trials: From fixed to adaptive design.

    PubMed

    Yin, Guosheng; Lam, Chi Kin; Shi, Haolun

    2017-08-01

    Randomized controlled studies are the gold standard for phase III clinical trials. Using α-spending functions to control the overall type I error rate, group sequential methods are well established and have been dominating phase III studies. Bayesian randomized design, on the other hand, can be viewed as a complement instead of competitive approach to the frequentist methods. For the fixed Bayesian design, the hypothesis testing can be cast in the posterior probability or Bayes factor framework, which has a direct link to the frequentist type I error rate. Bayesian group sequential design relies upon Bayesian decision-theoretic approaches based on backward induction, which is often computationally intensive. Compared with the frequentist approaches, Bayesian methods have several advantages. The posterior predictive probability serves as a useful and convenient tool for trial monitoring, and can be updated at any time as the data accrue during the trial. The Bayesian decision-theoretic framework possesses a direct link to the decision making in the practical setting, and can be modeled more realistically to reflect the actual cost-benefit analysis during the drug development process. Other merits include the possibility of hierarchical modeling and the use of informative priors, which would lead to a more comprehensive utilization of information from both historical and longitudinal data. From fixed to adaptive design, we focus on Bayesian randomized controlled clinical trials and make extensive comparisons with frequentist counterparts through numerical studies. Copyright © 2017 Elsevier Inc. All rights reserved.
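    The posterior probability used for trial monitoring is easy to sketch for a binary endpoint. A Monte Carlo version under independent Beta(1, 1) priors (the trial sizes and counts below are invented for illustration):

    ```python
    import numpy as np

    def posterior_prob_superior(x_t, n_t, x_c, n_c, draws=200_000, seed=0):
        """Estimate Pr(p_treatment > p_control | data) under independent
        Beta(1, 1) priors -- the posterior probability that can be updated
        at any interim look as the data accrue."""
        rng = np.random.default_rng(seed)
        p_t = rng.beta(1 + x_t, 1 + n_t - x_t, size=draws)
        p_c = rng.beta(1 + x_c, 1 + n_c - x_c, size=draws)
        return float((p_t > p_c).mean())

    # e.g. 30/50 responders on treatment vs 20/50 on control at an interim look
    prob = posterior_prob_superior(30, 50, 20, 50)
    ```

    A monitoring rule might stop for efficacy when this probability crosses a prespecified threshold; calibrating that threshold against the frequentist type I error rate is exactly the link between the fixed Bayesian design and the group sequential tradition discussed in the abstract.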

  12. Variability in utilization of drug eluting stents in United States: Insights from nationwide inpatient sample.

    PubMed

    Panaich, Sidakpal S; Badheka, Apurva O; Arora, Shilpkumar; Patel, Nileshkumar J; Thakkar, Badal; Patel, Nilay; Singh, Vikas; Chothani, Ankit; Deshmukh, Abhishek; Agnihotri, Kanishk; Jhamnani, Sunny; Lahewala, Sopan; Manvar, Sohilkumar; Panchal, Vinaykumar; Patel, Achint; Patel, Neil; Bhatt, Parth; Savani, Chirag; Patel, Jay; Savani, Ghanshyambhai T; Solanki, Shantanu; Patel, Samir; Kaki, Amir; Mohamad, Tamam; Elder, Mahir; Kondur, Ashok; Cleman, Michael; Forrest, John K; Schreiber, Theodore; Grines, Cindy

    2016-01-01

    We studied the trends and predictors of drug eluting stent (DES) utilization from 2006 to 2011 to further explore the inter-hospital variability in their utilization. We queried the Healthcare Cost and Utilization Project's Nationwide Inpatient Sample (NIS) between 2006 and 2011 using ICD-9-CM procedure code 36.06 (bare metal stent) or 36.07 (drug eluting stent) for Percutaneous Coronary Intervention (PCI). Annual hospital volume was calculated using unique identification numbers and divided into quartiles for analysis. We built a hierarchical two-level model adjusted for multiple confounding factors, with hospital ID incorporated as random effects in the model. About 665,804 procedures (weighted n = 3,277,884) were analyzed. Safety concerns arising in 2006 reduced DES utilization from 90% of all PCIs performed in 2006 to a nadir of 69% in 2008, followed by an increase (76% of all stents in 2009) and a plateau (75% in 2011). Significant between-hospital variation was noted in DES utilization irrespective of patient or hospital characteristics. Independent patient-level predictors of DES were (OR, 95% CI, P-value) age (0.99, 0.98-0.99, <0.001), female sex (1.12, 1.09-1.15, <0.001), acute myocardial infarction (0.75, 0.71-0.79, <0.001), shock (0.53, 0.49-0.58, <0.001), Charlson Co-morbidity index (0.81, 0.77-0.86, <0.001), private insurance/HMO (1.27, 1.20-1.34, <0.001), and elective admission (1.16, 1.05-1.29, <0.001). Highest-quartile hospital volume (1.64, 1.25-2.16, <0.001) was associated with higher DES placement. There is significant between-hospital variation in DES utilization, and a higher annual hospital volume is associated with a higher utilization rate of DES. © 2015 Wiley Periodicals, Inc.

  13. On Utilizing Optimal and Information Theoretic Syntactic Modeling for Peptide Classification

    NASA Astrophysics Data System (ADS)

    Aygün, Eser; Oommen, B. John; Cataltepe, Zehra

    Syntactic methods in pattern recognition have been used extensively in bioinformatics, and in particular, in the analysis of gene and protein expressions, and in the recognition and classification of bio-sequences. These methods are almost universally distance-based. This paper concerns the use of an Optimal and Information Theoretic (OIT) probabilistic model [11] to achieve peptide classification using the information residing in their syntactic representations. The latter has traditionally been achieved using the edit distances required in the respective peptide comparisons. We advocate that one can model the differences between compared strings as a mutation model consisting of random Substitutions, Insertions and Deletions (SID) obeying the OIT model. Thus, in this paper, we show that the probability measure obtained from the OIT model can be perceived as a sequence similarity metric, using which a Support Vector Machine (SVM)-based peptide classifier, referred to as OIT_SVM, can be devised.
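    The edit-distance baseline that the OIT model is compared against is the classic dynamic program over the three SID operations. A self-contained sketch:

    ```python
    def edit_distance(s, t):
        """Levenshtein distance: minimum number of Substitutions, Insertions
        and Deletions (the SID operations) turning s into t."""
        m, n = len(s), len(t)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            dp[i][0] = i                       # delete all of s[:i]
        for j in range(n + 1):
            dp[0][j] = j                       # insert all of t[:j]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if s[i - 1] == t[j - 1] else 1
                dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                               dp[i][j - 1] + 1,         # insertion
                               dp[i - 1][j - 1] + cost)  # substitution / match
        return dp[m][n]
    ```

    Where the edit distance counts the cheapest SID sequence, the OIT model instead assigns the comparison a probability under a mutation model, and that probability is what the paper feeds to the SVM as a similarity metric.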

  14. Interpreting Black-Box Classifiers Using Instance-Level Visual Explanations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagnini, Paolo; Krause, Josua W.; Dasgupta, Aritra

    2017-05-14

    To realize the full potential of machine learning in diverse real-world domains, it is necessary for model predictions to be readily interpretable and actionable for the human in the loop. Analysts, who are the users but not the developers of machine learning models, often do not trust a model because of the lack of transparency in associating predictions with the underlying data space. To address this problem, we propose Rivelo, a visual analytic interface that enables analysts to understand the causes behind predictions of binary classifiers by interactively exploring a set of instance-level explanations. These explanations are model-agnostic, treating a model as a black box, and they help analysts in interactively probing the high-dimensional binary data space for detecting features relevant to predictions. We demonstrate the utility of the interface with a case study analyzing a random forest model on the sentiment of Yelp reviews about doctors.

  15. Homeless Veterans' Use of Peer Mentors and Effects on Costs and Utilization in VA Clinics.

    PubMed

    Yoon, Jean; Lo, Jeanie; Gehlert, Elizabeth; Johnson, Erin E; O'Toole, Thomas P

    2017-06-01

    The study compared health care utilization and costs among homeless veterans randomly assigned to peer mentors or usual care and described contacts with peer mentors. Homeless patients at four Department of Veterans Affairs clinics were randomly assigned to a peer mentor (N=195) or to usual care (N=180). Administrative data on utilization and costs over a six-month follow-up were combined with peer mentors' reports of patient contacts. Most patients (87%) in the peer mentor group had at least one peer contact. Patients in this group spent the largest proportions of time discussing housing and health issues with peer mentors and had more outpatient encounters than those in usual care, although differences were not significant. No other between-group differences were found in utilization or costs. Although significant impacts of peer mentors on health care patterns or costs were not detected, some patients had frequent contact with peer mentors.

  16. Detection of mitotic nuclei in breast histopathology images using localized ACM and Random Kitchen Sink based classifier.

    PubMed

    Beevi, K Sabeena; Nair, Madhu S; Bindu, G R

    2016-08-01

    The exact measure of mitotic nuclei is a crucial parameter in breast cancer grading and prognosis. This can be achieved by improving mitotic detection accuracy through careful design of segmentation and classification techniques. In this paper, segmentation of nuclei from breast histopathology images is carried out by a Localized Active Contour Model (LACM) utilizing bio-inspired optimization techniques in the detection stage, in order to handle diffused intensities present along object boundaries. Further, applying Random Kitchen Sink (RKS), an optimal machine learning algorithm capable of classifying strongly non-linear data, shows improved classification performance. The proposed method has been tested on the Mitosis detection in breast cancer histological images (MITOS) dataset provided for the MITOS-ATYPIA CONTEST 2014. The proposed framework achieved 95% recall, 98% precision and 96% F-score.
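
    The Random Kitchen Sink idea (random Fourier features in the sense of Rahimi and Recht) can be sketched in a few lines: map inputs through randomized cosine features so that a simple linear classifier can fit non-linear structure. The dimensions and bandwidth below are illustrative assumptions, not the paper's settings.

```python
import math
import random

# Hedged sketch of Random Kitchen Sinks / random Fourier features:
# z_i(x) = sqrt(2/D) * cos(w_i . x + b_i), with w_i Gaussian and
# b_i uniform on [0, 2*pi). Parameters (D, gamma) are illustrative.

def random_fourier_features(dim, n_features, gamma=1.0, seed=0):
    rng = random.Random(seed)
    w = [[rng.gauss(0.0, math.sqrt(2.0 * gamma)) for _ in range(dim)]
         for _ in range(n_features)]
    b = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_features)]

    def transform(x):
        scale = math.sqrt(2.0 / n_features)
        return [scale * math.cos(sum(wi * xi for wi, xi in zip(wrow, x)) + bi)
                for wrow, bi in zip(w, b)]

    return transform

phi = random_fourier_features(dim=2, n_features=64)
z = phi([0.5, -1.0])    # 64-dimensional randomized feature vector
```

    A linear classifier trained on `z` then approximates a kernel machine at a fraction of the cost, which is what makes RKS attractive for strongly non-linear data.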

  17. Skills training versus health education to prevent STDs/HIV in heterosexual women: a randomized controlled trial utilizing biological outcomes.

    PubMed

    Baker, Sharon A; Beadnell, Blair; Stoner, Susan; Morrison, Diane M; Gordon, Judith; Collier, Cheza; Knox, Kay; Wickizer, Lauren; Stielstra, Sorrel

    2003-02-01

    We compared the effectiveness of two different 16-session group interventions for reducing new STD infection among heterosexual women. Two hundred twenty-nine at-risk heterosexual women were randomly assigned to skills training (ST) based on the relapse prevention model or health education (HE). Participants were monitored during the year following intervention for STD acquisition, self-reports of sexual behavior, and risk reduction skills. Participants in the ST intervention were significantly less likely to be diagnosed with an STD in the year following intervention and demonstrated superior risk reduction skills at 12-month follow-up. Both conditions showed statistically significant reductions in self-reports of risky sexual behavior following intervention and at 12-month follow-up. In this sample, the ST intervention was superior to HE for reducing STD acquisition.

  18. Health Care Utilization and Expenditures Associated With Remote Monitoring in Patients With Implantable Cardiac Devices.

    PubMed

    Ladapo, Joseph A; Turakhia, Mintu P; Ryan, Michael P; Mollenkopf, Sarah A; Reynolds, Matthew R

    2016-05-01

    Several randomized trials and decision analysis models have found that remote monitoring may reduce health care utilization and expenditures in patients with cardiac implantable electronic devices (CIEDs), compared with in-office monitoring. However, little is known about the generalizability of these findings to unselected populations in clinical practice. To compare health care utilization and expenditures associated with remote monitoring and in-office monitoring in patients with CIEDs, we used Truven Health MarketScan Commercial Claims and Medicare Supplemental Databases. We selected patients newly implanted with an implantable cardioverter defibrillator (ICD), cardiac resynchronization therapy defibrillator (CRT-D), or permanent pacemaker (PPM) in 2009, who had continuous health plan enrollment 2 years after implantation. Generalized linear models and propensity score matching were used to adjust for confounders and estimate differences in health care utilization and expenditures in patients with remote or in-office monitoring. We identified 1,127; 427; and 1,295 pairs of patients with a similar propensity for receiving an ICD, CRT-D, or PPM, respectively. Remotely monitored patients with ICDs experienced fewer emergency department visits resulting in discharge (p = 0.050). Remote monitoring was associated with lower health care expenditures in office visits among patients with PPMs (p = 0.025) and CRT-Ds (p = 0.006) and lower total inpatient and outpatient expenditures in patients with ICDs (p <0.0001). In conclusion, remote monitoring of patients with CIEDs may be associated with reductions in health care utilization and expenditures compared with exclusive in-office care. Copyright © 2016 Elsevier Inc. All rights reserved.
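
    The propensity-score pairing step can be sketched as follows. This is an illustrative sketch of generic 1:1 nearest-neighbor matching without replacement, not the paper's exact procedure, and the patient IDs and scores are invented.

```python
# Hedged sketch of 1:1 nearest-neighbor propensity-score matching without
# replacement: each treated (remotely monitored) patient is paired with the
# unmatched control (in-office) patient whose estimated propensity score
# is closest. Greedy matching in order of treated scores.

def match_pairs(treated, controls):
    """treated/controls: lists of (id, propensity score). Returns pairs."""
    available = dict(controls)
    pairs = []
    for tid, ps in sorted(treated, key=lambda t: t[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ps))
        pairs.append((tid, cid))
        del available[cid]            # match without replacement
    return pairs

pairs = match_pairs(
    treated=[("T1", 0.30), ("T2", 0.72)],
    controls=[("C1", 0.28), ("C2", 0.70), ("C3", 0.95)],
)
# -> [("T1", "C1"), ("T2", "C2")]
```

    Real analyses typically also enforce a caliper (a maximum allowed score distance) so that poorly matched pairs are discarded rather than kept.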

  19. Identifying Specific Combinations of Multimorbidity that Contribute to Health Care Resource Utilization: An Analytic Approach.

    PubMed

    Schiltz, Nicholas K; Warner, David F; Sun, Jiayang; Bakaki, Paul M; Dor, Avi; Given, Charles W; Stange, Kurt C; Koroukian, Siran M

    2017-03-01

    Multimorbidity affects the majority of elderly adults and is associated with higher health costs and utilization, but how specific patterns of morbidity influence resource use is less understood. The objective was to identify specific combinations of chronic conditions, functional limitations, and geriatric syndromes associated with direct medical costs and inpatient utilization. Retrospective cohort study using the Health and Retirement Study (2008-2010) linked to Medicare claims. Analysis used machine-learning techniques: classification and regression trees and random forest. A population-based sample of 5771 Medicare-enrolled adults aged 65 and older in the United States. Main covariates: self-reported chronic conditions (measured as none, mild, or severe), geriatric syndromes, and functional limitations. Secondary covariates: demographic, social, economic, behavioral, and health status measures. Outcome measures were Medicare expenditures in the top quartile and inpatient utilization. Median annual expenditures were $4354, and 41% were hospitalized within 2 years. The tree model shows some notable combinations: 64% of those with self-rated poor health plus activities of daily living and instrumental activities of daily living disabilities had expenditures in the top quartile. Inpatient utilization was highest (70%) in those aged 77-83 with mild to severe heart disease plus mild to severe diabetes. Functional limitations were more important than many chronic diseases in explaining resource use. The multimorbid population is heterogeneous and there is considerable variation in how specific combinations of morbidity influence resource use. Modeling the conjoint effects of chronic conditions, functional limitations, and geriatric syndromes can advance understanding of groups at greatest risk and inform targeted tailored interventions aimed at cost containment.
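
    The kind of rule a classification tree surfaces can be illustrated as a subgroup rate. The records, feature names, and numbers below are invented for illustration; only the shape of the computation mirrors the reported "poor self-rated health plus ADL/IADL disability" node.

```python
# Hypothetical illustration (invented records) of the statistic a tree
# node reports: the share of high-cost members inside a subgroup defined
# by a specific combination of conditions.

def high_cost_rate(records, condition):
    subgroup = [r for r in records if condition(r)]
    return sum(r["top_quartile_cost"] for r in subgroup) / len(subgroup)

records = [
    {"poor_health": True,  "adl_disability": True,  "top_quartile_cost": 1},
    {"poor_health": True,  "adl_disability": True,  "top_quartile_cost": 1},
    {"poor_health": True,  "adl_disability": True,  "top_quartile_cost": 0},
    {"poor_health": True,  "adl_disability": False, "top_quartile_cost": 0},
    {"poor_health": False, "adl_disability": False, "top_quartile_cost": 0},
]
rate = high_cost_rate(records,
                      lambda r: r["poor_health"] and r["adl_disability"])
# 2 of the 3 members of this node are in the top cost quartile.
```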

  20. Quantitative Analysis of Complex Glioma Cell Migration on Electrospun Polycaprolactone Using Time-Lapse Microscopy

    PubMed Central

    Johnson, Jed; Nowicki, M. Oskar; Lee, Carol H.; Chiocca, E. Antonio; Viapiano, Mariano S.; Lawler, Sean E.

    2009-01-01

    Malignant gliomas are the most common tumors originating within the central nervous system and account for over 15,000 deaths annually in the United States. The median survival for glioblastoma, the most common and aggressive of these tumors, is only 14 months. Therapeutic strategies targeting glioma cells migrating away from the tumor core are currently hampered by the difficulty of reproducing migration in the neural parenchyma in vitro. We utilized a tissue engineering approach to develop a physiologically relevant model of glioma cell migration. This revealed that glioma cells display dramatic differences in migration when challenged by random versus aligned electrospun poly-ɛ-caprolactone nanofibers. Cells on aligned fibers migrated at an effective velocity of 4.2 ± 0.39 μm/h compared to 0.8 ± 0.08 μm/h on random fibers, closely matching in vivo models and prior observations of glioma spread in white versus gray matter. Cells on random fibers exhibited extension along multiple fiber axes that prevented net motion; aligned fibers promoted a fusiform morphology better suited to infiltration. Time-lapse microscopy revealed that the motion of individual cells was complex and was influenced by cell cycle and local topography. Glioma stem cell–containing neurospheres seeded on random fibers did not show cell detachment and retained their original shape; on aligned fibers, cells detached and migrated in the fiber direction over a distance sixfold greater than the perpendicular direction. This chemically and physically flexible model allows time-lapse analysis of glioma cell migration while recapitulating in vivo cell morphology, potentially allowing identification of physiological mediators and pharmacological inhibitors of invasion. PMID:19199562

  1. The Evaluation of Bivariate Mixed Models in Meta-analyses of Diagnostic Accuracy Studies with SAS, Stata and R.

    PubMed

    Vogelgesang, Felicitas; Schlattmann, Peter; Dewey, Marc

    2018-05-01

    Meta-analyses require a thoroughly planned procedure to obtain unbiased overall estimates. From a statistical point of view not only model selection but also model implementation in the software affects the results. The present simulation study investigates the accuracy of different implementations of general and generalized bivariate mixed models in SAS (using proc mixed, proc glimmix and proc nlmixed), Stata (using gllamm, xtmelogit and midas) and R (using reitsma from package mada and glmer from package lme4). Both models incorporate the relationship between sensitivity and specificity - the two outcomes of interest in meta-analyses of diagnostic accuracy studies - utilizing random effects. Model performance is compared in nine meta-analytic scenarios reflecting the combination of three sizes for meta-analyses (89, 30 and 10 studies) with three pairs of sensitivity/specificity values (97%/87%; 85%/75%; 90%/93%). The evaluation of accuracy in terms of bias, standard error and mean squared error reveals that all implementations of the generalized bivariate model calculate sensitivity and specificity estimates with deviations less than two percentage points. proc mixed, which together with reitsma implements the general bivariate mixed model proposed by Reitsma, shows convergence problems. The random effect parameters are in general underestimated. This study shows that flexibility and simplicity of model specification together with convergence robustness should influence implementation recommendations, as the accuracy in terms of bias was acceptable in all implementations using the generalized approach. Schattauer GmbH.

  2. Design of robust reliable control for T-S fuzzy Markovian jumping delayed neutral type neural networks with probabilistic actuator faults and leakage delays: An event-triggered communication scheme.

    PubMed

    Syed Ali, M; Vadivel, R; Saravanakumar, R

    2018-06-01

    This study examines the problem of robust reliable control for Takagi-Sugeno (T-S) fuzzy Markovian jumping delayed neural networks with probabilistic actuator faults and leakage terms under an event-triggered communication scheme. First, the randomly occurring actuator faults and their failure rates are governed by two sets of unrelated random variables satisfying certain probabilistic failure rates for every actuator, and a new distribution-based event-triggered fault model is proposed that accounts for the effect of transmission delay. Second, the Takagi-Sugeno (T-S) fuzzy model is adopted for the neural networks and the randomness of actuator failures is modeled in a Markov jump model framework. Third, to guarantee that the considered closed-loop system is exponentially mean-square stable with a prescribed reliable control performance, a Markov jump event-triggered scheme is designed, which is the main purpose of this study. Fourth, by constructing an appropriate Lyapunov-Krasovskii functional and employing the Newton-Leibniz formulation and integral inequalities, several delay-dependent criteria for the solvability of the addressed problem are derived. The obtained stability criteria are stated in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Finally, numerical examples are given to illustrate the effectiveness and reduced conservatism of the proposed results over existing ones; one example is supported by a real-life application to a benchmark problem. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  3. Machine Learning Algorithms for prediction of regions of high Reynolds Averaged Navier Stokes Uncertainty

    NASA Astrophysics Data System (ADS)

    Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    In spite of their deficiencies, RANS models represent the workhorse for industrial investigations into turbulent flows. In this context, it is essential to provide diagnostic measures to assess the quality of RANS predictions. To this end, the primary step is to identify feature importances amongst massive sets of potentially descriptive and discriminative flow features. This aids the physical interpretability of the resultant discrepancy model and its extensibility to similar problems. Recent investigations have utilized approaches such as Random Forests, Support Vector Machines and the Least Absolute Shrinkage and Selection Operator for feature selection. With examples, we exhibit how such methods may not be suitable for turbulent flow datasets. The underlying rationale, such as the correlation bias and the required conditions for the success of penalized algorithms, are discussed with illustrative examples. Finally, we provide alternate approaches using convex combinations of regularized regression approaches and randomized sub-sampling in combination with feature selection algorithms, to infer model structure from data. This research was supported by the Defense Advanced Research Projects Agency under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).

  4. A Study to Design a Functional Patient Health Education Program for Implementation at the United States Army Medical Department Activity, Fort Benning, Georgia

    DTIC Science & Technology

    1980-08-01

    Inventory of Innovations Required in Outpatient Care Delivery Mechanisms. 6APC Model #14: 2-3. 1k. 7 Elizabeth A. Lee, "Health Education," Hospitals...within Appendix K. A complete inventory of clinical settings was then assembled, and utilizing a random numbers table, gross numbers of patient...are aimed at most of the major seg- I. ments of the patient beneficiary population, while addressing a well balanced inventory of subject matter. In

  5. Multiple scattering in planetary regoliths using first-order incoherent interactions

    NASA Astrophysics Data System (ADS)

    Muinonen, Karri; Markkanen, Johannes; Väisänen, Timo; Penttilä, Antti

    2017-10-01

    We consider scattering of light by a planetary regolith modeled using discrete random media of spherical particles. The size of the random medium can range from microscopic sizes of a few wavelengths to macroscopic sizes approaching infinity. The size of the particles is assumed to be of the order of the wavelength. We extend the numerical Monte Carlo method of radiative transfer and coherent backscattering (RT-CB) to the case of dense packing of particles. We adopt the ensemble-averaged first-order incoherent extinction, scattering, and absorption characteristics of a volume element of particles as input for the RT-CB. The volume element must be larger than the wavelength but smaller than the mean free path length of incoherent extinction. In the radiative transfer part, at each absorption and scattering process, we account for absorption with the help of the single-scattering albedo and peel off the Stokes parameters of radiation emerging from the medium in predefined scattering angles. We then generate a new scattering direction using the joint probability density for the local polar and azimuthal scattering angles. In the coherent backscattering part, we utilize amplitude scattering matrices along the radiative-transfer path and the reciprocal path, and utilize the reciprocity of electromagnetic waves to verify the computation. We illustrate the incoherent volume-element scattering characteristics and compare the dense-medium RT-CB to asymptotically exact results computed using the Superposition T-matrix method (STMM). We show that the dense-medium RT-CB compares favorably to the STMM results for the current cases of sparse and dense discrete random media studied. The novel method can be applied in modeling light scattering by the surfaces of asteroids and other airless solar system objects, including UV-Vis-NIR spectroscopy, photometry, polarimetry, and radar scattering problems.
    Acknowledgments: Research supported by the European Research Council with Advanced Grant No. 320773 SAEMPL, Scattering and Absorption of ElectroMagnetic waves in ParticuLate media. Computational resources provided by CSC - IT Centre for Science Ltd, Finland.
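
    Two standard ingredients of such Monte Carlo radiative transfer can be sketched in a few lines. This is an illustrative, scalar sketch (not the authors' code, and with invented parameter values): free-path lengths are sampled from the exponential extinction law, and the ray weight is attenuated by the single-scattering albedo at each scattering event.

```python
import math
import random

# Hedged sketch of Monte Carlo radiative transfer in a homogeneous slab:
# sample free paths l = -ln(U)/kappa from the extinction law, multiply the
# ray weight by the single-scattering albedo at every scattering event,
# and return the weight carried out of the slab (0.0 if terminated).

def trace_weight(extinction, albedo, slab_depth, rng):
    depth, weight = 0.0, 1.0
    while True:
        depth += -math.log(rng.random()) / extinction   # sampled free path
        if depth >= slab_depth:
            return weight                               # ray escapes the slab
        weight *= albedo                                # absorption at the event
        if weight < 1e-6:
            return 0.0                                  # drop negligible rays

rng = random.Random(42)
weights = [trace_weight(extinction=2.0, albedo=0.9, slab_depth=1.0, rng=rng)
           for _ in range(10_000)]
mean_exit_weight = sum(weights) / len(weights)
```

    A full RT-CB code would additionally sample new scattering directions from the phase function and carry the Stokes parameters, as described in the abstract.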

  6. Utility and Cost-Effectiveness of Motivational Messaging to Increase Survey Response in Physicians: A Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Chan, Randolph C. H.; Mak, Winnie W. S.; Pang, Ingrid H. Y.; Wong, Samuel Y. S.; Tang, Wai Kwong; Lau, Joseph T. F.; Woo, Jean; Lee, Diana T. F.; Cheung, Fanny M.

    2018-01-01

    The present study examined whether, when, and how motivational messaging can boost the response rate of postal surveys for physicians based on Higgin's regulatory focus theory, accounting for its cost-effectiveness. A three-arm, blinded, randomized controlled design was used. A total of 3,270 doctors were randomly selected from the registration…

  7. Raw and Central Moments of Binomial Random Variables via Stirling Numbers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2013-01-01

    We consider here the problem of calculating the moments of binomial random variables. It is shown how formulae for both the raw and the central moments of such random variables may be obtained in a recursive manner utilizing Stirling numbers of the first kind. Suggestions are also provided as to how students might be encouraged to explore this…
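
    The article's recursion is phrased in terms of Stirling numbers of the first kind; a closely related and easy-to-check closed form for the raw moments uses Stirling numbers of the second kind, E[X^m] = sum over j of S(m, j) * n_(j) * p^j for X ~ Binomial(n, p), where n_(j) is the falling factorial. The sketch below verifies that identity numerically.

```python
# Raw moments of a Binomial(n, p) random variable via Stirling numbers of
# the SECOND kind (a standard identity related to, but not identical with,
# the article's first-kind recursion):
#   E[X^m] = sum_{j=0}^{m} S(m, j) * n_(j) * p^j,
# where n_(j) = n(n-1)...(n-j+1) is the falling factorial.

def stirling2(m, j):
    """Stirling number of the second kind, by the usual recursion."""
    if m == j:
        return 1
    if j == 0 or j > m:
        return 0
    return j * stirling2(m - 1, j) + stirling2(m - 1, j - 1)

def falling(n, j):
    out = 1
    for k in range(j):
        out *= n - k
    return out

def binomial_raw_moment(m, n, p):
    return sum(stirling2(m, j) * falling(n, j) * p**j for j in range(m + 1))

mean = binomial_raw_moment(1, 10, 0.5)     # n*p = 5.0
second = binomial_raw_moment(2, 10, 0.5)   # Var + mean^2 = 2.5 + 25 = 27.5
```

    Central moments then follow by binomial expansion of E[(X - E[X])^m] in terms of the raw moments.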

  8. The Effect of Utilizing Organizational Culture Improvement Model of Patient Education on Coronary Artery Bypass Graft Patients' Anxiety and Satisfaction: Theory Testing.

    PubMed

    Farahani, Mansoureh Ashghali; Ghaffari, Fatemeh; Norouzinezhad, Faezeh; Orak, Roohangiz Jamshidi

    2016-11-01

    Due to the increasing prevalence of arteriosclerosis and the mortality caused by this disease, Coronary Artery Bypass Graft (CABG) has become one of the most common surgical procedures. Utilization of patient education is approved as an effective solution for increasing patient survival and outcomes of treatment. However, failure to consider different aspects of patient education has turned this goal into an unattainable one. The objective of this research was to determine the effect of utilizing the organizational culture improvement model of patient education on CABG patients' anxiety and satisfaction. The present study is a randomized controlled trial. This study was conducted on eighty CABG patients. The patients were selected from the CCU and Post-CCU wards of a hospital affiliated with Iran University of Medical Sciences in Tehran, Iran, during 2015. Eshpel Burger's Anxiety Inventory and Patients' Satisfaction Questionnaire were used to collect the required information. Levels of anxiety and satisfaction of patients before intervention and at the time of release were measured. The intervention took place after preparing a programmed package based on the organizational culture improvement model for the following dimensions: effective communication, participatory decision-making, goal setting, planning, implementation and recording, supervision and control, and improvement of motivation. After recording the data, it was analyzed in the chi-square test, t-independent and Mann-Whitney U tests. The significance level of tests was assumed to be 0.05. SPSS version 18 was also utilized for data analysis. Research results revealed that variations in the mean scores of situational and personality anxiety of the control and experiment group were descending following the intervention, but the decrease was higher in the experiment group (p≤0.0001). 
In addition, the variations of the mean scores of patients' satisfaction with education were higher in the experiment group than the control group (p≤0.0001). Utilization of the organizational culture improvement model of patient education reduces stress in CABG patients and increases their satisfaction with the provided education considering the factors involved in patient education, which were incorporated in the designed model.

  9. Rural-urban migration including formal and informal workers in the urban sector: an agent-based numerical simulation study

    NASA Astrophysics Data System (ADS)

    Branco, Nilton; Oliveira, Tharnier; Silveira, Jaylson

    2012-02-01

    The goal of this work is to study rural-urban migration in the early stages of industrialization. We use an agent-based model and take into account the existence of informal and formal workers in the urban sector and possible migration movements, dependent on the agents' social and private utilities. Our agents are placed on the vertices of a square lattice, such that each vertex has only one agent. Rural, urban informal and urban formal workers are represented by different states of a three-state Ising model. At every step, a fraction a of the agents may change sectors or migrate. The total utility of a given agent is then calculated and compared to a random utility, in order to check if this agent turns into an actual migrant or changes sector. The dynamics is carried out until an equilibrium state is reached, and equilibrium variables are then calculated and compared to available data. We find that a generalized Harris-Todaro condition is satisfied [1] in these equilibrium regimes, i.e., the ratio of expected wages between any pair of sectors reaches a constant value. [1] J. J. Silveira, A. L. Espíndola and T. J. Penna, Physica A, 364, 445 (2006).
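
    One update step of such a model can be sketched as follows. All numerical values (wages, the social-interaction weight, the range of the random utility) are invented for illustration and are not the paper's parameterization: a sampled agent compares the total utility gain of a candidate sector against a random reservation utility and moves only if the gain exceeds it.

```python
import random

# Hedged sketch of a single agent decision in a three-sector, Ising-like
# migration model. Total utility = private utility (wage) + a social term
# rewarding agreement with lattice neighbors. Parameter values invented.

WAGES = {"rural": 1.0, "urban_informal": 1.2, "urban_formal": 2.0}

def total_utility(sector, neighbors):
    social = sum(1 for s in neighbors if s == sector)
    return WAGES[sector] + 0.1 * social

def maybe_migrate(current, candidate, neighbors, rng):
    threshold = rng.uniform(0.0, WAGES["urban_formal"])  # random utility draw
    gain = (total_utility(candidate, neighbors)
            - total_utility(current, neighbors))
    return candidate if gain > threshold else current

rng = random.Random(1)
new_sector = maybe_migrate(
    "rural", "urban_informal",
    neighbors=["rural", "urban_informal", "urban_informal", "rural"],
    rng=rng)
```

    Iterating this rule over the lattice until sector shares stabilize is what lets the equilibrium expected-wage ratios (the generalized Harris-Todaro condition) be measured.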

  10. Inversion of left-right asymmetry alters performance of Xenopus tadpoles in nonlateralized cognitive tasks.

    PubMed

    Blackiston, Douglas J; Levin, Michael

    2013-08-01

    Left-right behavioural biases are well documented across the animal kingdom, and handedness has long been associated with cognitive performance. However, the relationship between body laterality and cognitive ability is poorly understood. The embryonic pathways dictating normal left-right patterning have been molecularly dissected in model vertebrates, and numerous genetic and pharmacological treatments now facilitate experimental randomization or reversal of the left-right axis in these animals. Several recent studies showed a link between brain asymmetry and strongly lateralized behaviours such as eye use preference. However, links between laterality of the body and performance on cognitive tasks utilizing nonlateralized cues remain unknown. Xenopus tadpoles are an established model for the study of early left-right patterning, and protocols were recently developed to quantitatively evaluate learning and memory in these animals. Using an automated testing and training platform, we tested wild-type, left-right-randomized and left-right-reversed tadpoles for their ability to learn colour cues in an automated assay. Our results indicate that animals with either randomization or reversal of somatic left-right patterning learned more slowly than wild-type siblings, although all groups were able to reach the same performance optimum given enough training sessions. These results are the first analysis of the link between body laterality and learning of nonlateralized cues, and they position the Xenopus tadpole as an attractive and tractable model for future studies of the links between asymmetry of the body, lateralization of the brain and behaviour.

  11. Intra-Accumbens Injection of a Dopamine Aptamer Abates MK-801-Induced Cognitive Dysfunction in a Model of Schizophrenia

    PubMed Central

    Holahan, Matthew R.; Madularu, Dan; McConnell, Erin M.; Walsh, Ryan; DeRosa, Maria C.

    2011-01-01

    Systemic administration of the noncompetitive NMDA-receptor antagonist, MK-801, has been proposed to model cognitive deficits similar to those seen in patients with schizophrenia. The present work investigated the ability of a dopamine-binding DNA aptamer to regulate these MK-801-induced cognitive deficits when injected into the nucleus accumbens. Rats were trained to bar press for chocolate pellet rewards then randomly assigned to receive an intra-accumbens injection of a DNA aptamer (200 nM; n = 7), tris buffer (n = 6) or a randomized DNA oligonucleotide (n = 7). Animals were then treated systemically with MK-801 (0.1 mg/kg) and tested for their ability to extinguish their bar pressing response. Two control groups were also included that did not receive MK-801. Data revealed that injection of Tris buffer or the random oligonucleotide sequence into the nucleus accumbens prior to treatment with MK-801 did not reduce the MK-801-induced extinction deficit. Animals continued to press at a high rate over the entire course of the extinction session. Injection of the dopamine aptamer reversed this MK-801-induced elevation in lever pressing to levels as seen in rats not treated with MK-801. Tests for activity showed that the aptamer did not impair locomotor activity. Results demonstrate the in vivo utility of DNA aptamers as tools to investigate neurobiological processes in preclinical animal models of mental health disease. PMID:21779401

  12. Satisfaction of active duty soldiers with family dental care.

    PubMed

    Chisick, M C

    1997-02-01

    In the fall of 1992, a random, worldwide sample of 6,442 married and single parent soldiers completed a self-administered survey on satisfaction with 22 attributes of family dental care. Simple descriptive statistics for each attribute were derived, as was a composite overall satisfaction score using factor analysis. Composite scores were regressed on demographics, annual dental utilization, and access barriers to identify those factors having an impact on a soldier's overall satisfaction with family dental care. Separate regression models were constructed for single parents, childless couples, and couples with children. Results show below-average satisfaction with nearly all attributes of family dental care, with access attributes having the lowest average satisfaction scores. Factors influencing satisfaction with family dental care varied by family type with one exception: dependent dental utilization within the past year contributed positively to satisfaction across all family types.

  13. Improving the performance of the mass transfer-based reference evapotranspiration estimation approaches through a coupled wavelet-random forest methodology

    NASA Astrophysics Data System (ADS)

    Shiri, Jalal

    2018-06-01

    Among different reference evapotranspiration (ETo) modeling approaches, mass transfer-based methods have been less studied. These approaches utilize temperature and wind speed records. On the other hand, the empirical equations proposed in this context generally produce weak simulations, except when a local calibration is used for improving their performance. This might be a crucial drawback for those equations in case of local data scarcity for the calibration procedure. So, application of heuristic methods can be considered as a substitute for improving the performance accuracy of the mass transfer-based approaches. However, given that wind speed records usually have higher variation magnitudes than the other meteorological parameters, coupling a wavelet transform with heuristic models becomes necessary. In the present paper, a coupled wavelet-random forest (WRF) methodology was proposed for the first time to improve the performance accuracy of the mass transfer-based ETo estimation approaches using cross-validation data management scenarios at both local and cross-station scales. The obtained results revealed that the new coupled WRF model (with minimum scatter index values of 0.150 and 0.192 for local and external applications, respectively) improved the performance accuracy of the single RF models as well as the empirical equations to a great extent.
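
    The wavelet part of such a coupling can be sketched with a one-level Haar transform. The details below are assumptions, not the paper's implementation: the transform splits a noisy wind-speed series into a smooth approximation and a high-frequency detail series, each of which would then be modeled separately (e.g., by a random forest) before recombining the sub-predictions.

```python
import math

# Hedged sketch of the wavelet preprocessing step: a one-level Haar DWT
# decomposes a series into approximation and detail coefficients, and the
# inverse transform reconstructs the original series exactly.

def haar_level1(series):
    """One-level Haar DWT; series length must be even."""
    s = 1 / math.sqrt(2)
    approx = [s * (series[i] + series[i + 1]) for i in range(0, len(series), 2)]
    detail = [s * (series[i] - series[i + 1]) for i in range(0, len(series), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [s * (a + d), s * (a - d)]
    return out

wind = [2.1, 2.3, 5.0, 1.2, 3.3, 3.1, 0.8, 4.4]   # invented wind speeds
approx, detail = haar_level1(wind)
reconstructed = haar_inverse(approx, detail)       # round-trips to the input
```

    Fitting one model per sub-series lets the smooth approximation carry the seasonal signal while the detail series absorbs the high-frequency wind variability.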

  14. Does the universal health insurance program affect urban-rural differences in health service utilization among the elderly? Evidence from a longitudinal study in Taiwan.

    PubMed

    Liao, Pei-An; Chang, Hung-Hao; Yang, Fang-An

    2012-01-01

    To assess the impact of the introduction of Taiwan's National Health Insurance (NHI) on urban-rural inequality in health service utilization among the elderly. A longitudinal data set of 1,504 individuals aged 65 and older was constructed from the Survey of Health and Living Status of the Elderly. A difference-in-differences model was employed and estimated by the random-effect probit method. The introduction of universal NHI in Taiwan heterogeneously affected outpatient and inpatient health service utilization among the elderly in urban and rural areas. The introduction of NHI reduced the disparity of outpatient (inpatient) utilization between the previously uninsured and insured older urban residents by 12.9 (22.0) percentage points. However, there was no significant reduction in the utilization disparity between the previously uninsured and insured elderly among rural residents. Our study on Taiwan's experience should provide a valuable lesson to countries that are in an initial stage of proposing a universal health insurance system. Although NHI is designed to ensure the equitable right to access health care, it may result in differential impacts on health service utilization among the elderly across areas. The rural elderly tend to confront more challenges in accessing health care associated with spatial distance, transportation, social isolation, poverty, and a lack of health care providers, especially medical specialists. © 2011 National Rural Health Association.
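
    The difference-in-differences logic can be sketched with a simple means comparison. The numbers are invented, and the paper estimates the model with a random-effect probit rather than raw means: the DiD effect is the pre-to-post change among the previously uninsured (treated) minus the change among the always-insured (control).

```python
# Minimal difference-in-differences sketch with invented utilization
# indicators (1 = used services). The effect is the treated group's change
# minus the control group's change, which nets out the common time trend.

def mean(xs):
    return sum(xs) / len(xs)

def did(treat_pre, treat_post, ctrl_pre, ctrl_post):
    return ((mean(treat_post) - mean(treat_pre))
            - (mean(ctrl_post) - mean(ctrl_pre)))

effect = did(treat_pre=[0, 0, 1, 0], treat_post=[1, 1, 1, 0],
             ctrl_pre=[1, 0, 1, 1], ctrl_post=[1, 1, 1, 1])
# (0.75 - 0.25) - (1.00 - 0.75) = 0.25
```

    With a binary outcome like outpatient use, the regression version of this contrast (with random effects for repeated observations) is exactly what the random-effect probit in the study delivers.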

  15. An Australian discrete choice experiment to value eq-5d health states.

    PubMed

    Viney, Rosalie; Norman, Richard; Brazier, John; Cronin, Paula; King, Madeleine T; Ratcliffe, Julie; Street, Deborah

    2014-06-01

    Conventionally, generic quality-of-life health states, defined within multi-attribute utility instruments, have been valued using a Standard Gamble or a Time Trade-Off. Both are grounded in expected utility theory but impose strong assumptions about the form of the utility function. Preference elicitation tasks for both are complicated, limiting the number of health states that each respondent can value and, therefore, that can be valued overall. The usual approach has been to value a set of the possible health states and impute values for the remainder. Discrete Choice Experiments (DCEs) offer an attractive alternative, allowing investigation of more flexible specifications of the utility function and greater coverage of the response surface. We designed a DCE to obtain values for EQ-5D health states and implemented it in an Australia-representative online panel (n = 1,031). A range of specifications investigating non-linear preferences with respect to time and interactions between EQ-5D levels were estimated using a random-effects probit model. The results provide empirical support for a flexible utility function, including at least some two-factor interactions. We then constructed a preference index such that full health and death were valued at 1 and 0, respectively, to provide a DCE-based algorithm for Australian cost-utility analyses. Copyright © 2013 John Wiley & Sons, Ltd.
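
    The final anchoring step described above can be sketched directly; the latent utility values below are invented. Latent DCE utilities are rescaled so that full health maps to 1 and death maps to 0, yielding an index usable in cost-utility analysis.

```python
# Hedged sketch of anchoring a latent-utility scale (invented values):
# index(state) = (U(state) - U(death)) / (U(full health) - U(death)),
# so full health -> 1.0 and death -> 0.0 by construction.

def preference_index(u_state, u_full_health, u_death):
    return (u_state - u_death) / (u_full_health - u_death)

u_full, u_death = 2.4, -0.6            # hypothetical latent utilities
index = preference_index(0.9, u_full, u_death)   # -> 0.5
```

    States judged worse than death get negative index values under this transformation, which the EQ-5D valuation literature explicitly allows for.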

  16. Unbiased split variable selection for random survival forests using maximally selected rank statistics.

    PubMed

    Wright, Marvin N; Dankowski, Theresa; Ziegler, Andreas

    2017-04-15

    The most popular approach for analyzing survival data is the Cox regression model. The Cox model may, however, be misspecified, and its proportionality assumption may not always be fulfilled. An alternative approach for survival prediction is random forests for survival outcomes. The standard split criterion for random survival forests is the log-rank test statistic, which favors splitting variables with many possible split points. Conditional inference forests avoid this split variable selection bias. However, linear rank statistics are utilized by default in conditional inference forests to select the optimal splitting variable, which cannot detect non-linear effects in the independent variables. An alternative is to use maximally selected rank statistics for the split point selection. As in conditional inference forests, splitting variables are compared on the p-value scale. However, instead of the conditional Monte-Carlo approach used in conditional inference forests, p-value approximations are employed. We describe several p-value approximations and the implementation of the proposed random forest approach. A simulation study demonstrates that unbiased split variable selection is possible. However, there is a trade-off between unbiased split variable selection and runtime. In benchmark studies of prediction performance on simulated and real datasets, the new method performs better than random survival forests if informative dichotomous variables are combined with uninformative variables with more categories and better than conditional inference forests if non-linear covariate effects are included. In a runtime comparison, the method proves to be computationally faster than both alternatives, if a simple p-value approximation is used. Copyright © 2017 John Wiley & Sons, Ltd.
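
    The split criterion can be sketched directly: rank the outcome, then for each candidate cut point of a covariate compute a standardized two-sample rank statistic and keep the maximum. A toy version (plain ranks rather than censored survival times, and without the paper's p-value approximations):

```python
import numpy as np

# Maximally selected rank statistic (toy): for every candidate cut point,
# standardize the rank-sum of the left group and take the maximum over cuts.

def max_rank_statistic(x, y):
    order = np.argsort(y)
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = np.arange(1, len(y) + 1)
    n = len(y)
    best = 0.0
    for cut in np.unique(x)[:-1]:
        left = x <= cut
        n1 = left.sum()
        if n1 == 0 or n1 == n:
            continue
        s = ranks[left].sum()
        mean = n1 * (n + 1) / 2.0                 # rank-sum mean under H0
        var = n1 * (n - n1) * (n + 1) / 12.0      # rank-sum variance under H0
        best = max(best, abs(s - mean) / np.sqrt(var))
    return best

rng = np.random.default_rng(0)
x_inf = rng.normal(size=200)
y = (x_inf > 0) + 0.1 * rng.normal(size=200)      # informative covariate
x_noise = rng.normal(size=200)                    # uninformative covariate
stat_inf = max_rank_statistic(x_inf, y)
stat_noise = max_rank_statistic(x_noise, y)
```

    Comparing candidate variables on the resulting (approximate) p-value scale, rather than on the raw maximal statistic, is what removes the bias toward variables with many split points.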

  17. On the virtues of automated quantitative structure-activity relationship: the new kid on the block.

    PubMed

    de Oliveira, Marcelo T; Katekawa, Edson

    2018-02-01

    Quantitative structure-activity relationship (QSAR) has proved to be an invaluable tool in medicinal chemistry. Data availability at unprecedented levels through various databases has contributed to a resurgence of interest in QSAR. In this context, rapid generation of quality predictive models is highly desirable for hit identification and lead optimization. We showcase the application of an automated QSAR approach, which randomly selects multiple training/test sets and utilizes machine-learning algorithms to generate predictive models. Results demonstrate that AutoQSAR produces models of improved or similar quality to those generated by practitioners in the field, but in a fraction of the time. Despite the concept's potential benefit to the community, the AutoQSAR opportunity has been largely undervalued.

  18. Two-dimensional signal processing with application to image restoration

    NASA Technical Reports Server (NTRS)

    Assefi, T.

    1974-01-01

    A recursive technique for modeling and estimating a two-dimensional signal contaminated by noise is presented. A two-dimensional signal is assumed to be an undistorted picture, where the noise introduces the distortion. Both the signal and the noise are assumed to be wide-sense stationary processes with known statistics. Thus, to estimate the two-dimensional signal is to enhance the picture. The picture representing the two-dimensional signal is converted to one dimension by scanning the image horizontally one line at a time. The scanner output becomes a nonstationary random process due to the periodic nature of the scanner operation. Procedures to obtain a dynamical model corresponding to the autocorrelation function of the scanner output are derived. Utilizing the model, a discrete Kalman estimator is designed to enhance the image.
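
    The estimation step can be illustrated with a scalar discrete Kalman filter applied to one scan line, modeled here as a first-order autoregressive signal in additive noise (the AR coefficient and noise variances are illustrative assumptions, not the paper's derived model):

```python
import numpy as np

# Scalar discrete Kalman filter for x[k] = a*x[k-1] + w[k],
# observed as z[k] = x[k] + v[k]. Parameters a, q, r are assumptions.

def kalman_1d(z, a=0.95, q=0.1, r=1.0):
    xhat, p = 0.0, 1.0
    out = []
    for zk in z:
        xhat, p = a * xhat, a * a * p + q      # predict
        k = p / (p + r)                        # Kalman gain
        xhat = xhat + k * (zk - xhat)          # update
        p = (1.0 - k) * p
        out.append(xhat)
    return np.array(out)

rng = np.random.default_rng(1)
x = np.zeros(500)
for k in range(1, 500):
    x[k] = 0.95 * x[k - 1] + rng.normal(scale=np.sqrt(0.1))
z = x + rng.normal(scale=1.0, size=500)        # noisy scan line
est = kalman_1d(z)
mse_raw = np.mean((z - x) ** 2)
mse_kf = np.mean((est - x) ** 2)
```

    Because the filter is recursive, each pixel estimate uses only the previous estimate and the current observation, which is what makes the scan-line formulation attractive for image enhancement.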

  19. MIXOR: a computer program for mixed-effects ordinal regression analysis.

    PubMed

    Hedeker, D; Gibbons, R D

    1996-03-01

    MIXOR provides maximum marginal likelihood estimates for mixed-effects ordinal probit, logistic, and complementary log-log regression models. These models can be used for analysis of dichotomous and ordinal outcomes from either a clustered or longitudinal design. For clustered data, the mixed-effects model assumes that data within clusters are dependent. The degree of dependency is jointly estimated with the usual model parameters, thus adjusting for dependence resulting from clustering of the data. Similarly, for longitudinal data, the mixed-effects approach can allow for individual-varying intercepts and slopes across time, and can estimate the degree to which these time-related effects vary in the population of individuals. MIXOR uses marginal maximum likelihood estimation, utilizing a Fisher-scoring solution. For the scoring solution, the Cholesky factor of the random-effects variance-covariance matrix is estimated, along with the effects of model covariates. Examples illustrating usage and features of MIXOR are provided.
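
    The cumulative (proportional-odds) logistic model underlying MIXOR maps a linear predictor and ordered thresholds to category probabilities; a minimal sketch without the random effects (threshold and predictor values are invented):

```python
import math

# Cumulative logistic model: P(Y <= c) = logistic(t_c - eta) for ordered
# thresholds t_1 < ... < t_{C-1}; category probabilities are the
# successive differences of the cumulative probabilities.

def logistic(u):
    return 1.0 / (1.0 + math.exp(-u))

def ordinal_probs(eta, thresholds):
    cum = [logistic(t - eta) for t in thresholds] + [1.0]
    probs, prev = [], 0.0
    for c in cum:
        probs.append(c - prev)
        prev = c
    return probs

p = ordinal_probs(eta=0.5, thresholds=[-1.0, 0.0, 1.5])  # 4 categories
```

    MIXOR's mixed-effects extension adds cluster- or subject-level random terms to eta and integrates them out during marginal maximum likelihood estimation.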

  20. Quantitative structure-property relationship (QSPR) modeling of drug-loaded polymeric micelles via genetic function approximation.

    PubMed

    Wu, Wensheng; Zhang, Canyang; Lin, Wenjing; Chen, Quan; Guo, Xindong; Qian, Yu; Zhang, Lijuan

    2015-01-01

    Self-assembled nano-micelles of amphiphilic polymers represent a novel anticancer drug delivery system. However, their full clinical utilization remains challenging because the quantitative structure-property relationship (QSPR) between the polymer structure and the efficacy of micelles as a drug carrier is poorly understood. Here, we developed a series of QSPR models to account for the drug loading capacity of polymeric micelles using the genetic function approximation (GFA) algorithm. These models were further evaluated by internal and external validation and a Y-randomization test in terms of stability and generalization, yielding an optimization model that is applicable to an expanded materials regime. As confirmed by experimental data, the relationship between microstructure and drug loading capacity can be well-simulated, suggesting that our models are readily applicable to the quantitative evaluation of the drug-loading capacity of polymeric micelles. Our work may offer a pathway to the design of formulation experiments.

  1. Quantitative Structure-Property Relationship (QSPR) Modeling of Drug-Loaded Polymeric Micelles via Genetic Function Approximation

    PubMed Central

    Lin, Wenjing; Chen, Quan; Guo, Xindong; Qian, Yu; Zhang, Lijuan

    2015-01-01

    Self-assembled nano-micelles of amphiphilic polymers represent a novel anticancer drug delivery system. However, their full clinical utilization remains challenging because the quantitative structure-property relationship (QSPR) between the polymer structure and the efficacy of micelles as a drug carrier is poorly understood. Here, we developed a series of QSPR models to account for the drug loading capacity of polymeric micelles using the genetic function approximation (GFA) algorithm. These models were further evaluated by internal and external validation and a Y-randomization test in terms of stability and generalization, yielding an optimization model that is applicable to an expanded materials regime. As confirmed by experimental data, the relationship between microstructure and drug loading capacity can be well-simulated, suggesting that our models are readily applicable to the quantitative evaluation of the drug-loading capacity of polymeric micelles. Our work may offer a pathway to the design of formulation experiments. PMID:25780923

  2. The Effect of Learning Based on Technology Model and Assessment Technique toward Thermodynamic Learning Achievement

    NASA Astrophysics Data System (ADS)

    Makahinda, T.

    2018-02-01

    The purpose of this research is to determine the effect of a technology-based learning model and assessment technique on thermodynamics achievement, controlling for student intelligence. This is an experimental study; the sample was drawn by cluster random sampling, with a total of 80 student respondents. The results show that, after controlling for student intelligence, the thermodynamics achievement of students taught with the environmental-utilization learning model is higher than that of students taught with animated simulations. There is also an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, after controlling for student intelligence. Based on these findings, thermodynamics lectures should use the environmental learning model together with the project assessment technique.

  3. Electroencephalography (EEG) forward modeling via H(div) finite element sources with focal interpolation.

    PubMed

    Pursiainen, S; Vorwerk, J; Wolters, C H

    2016-12-21

    The goal of this study is to develop focal, accurate and robust finite element method (FEM) based approaches which can predict the electric potential on the surface of the computational domain given its structure and internal primary source current distribution. While conducting an EEG evaluation, the placement of source currents to the geometrically complex grey matter compartment is a challenging but necessary task to avoid forward errors attributable to tissue conductivity jumps. Here, this task is approached via a mathematically rigorous formulation, in which the current field is modeled via divergence conforming H(div) basis functions. Both linear and quadratic functions are used while the potential field is discretized via the standard linear Lagrangian (nodal) basis. The resulting model includes dipolar sources which are interpolated into a random set of positions and orientations utilizing two alternative approaches: the position based optimization (PBO) and the mean position/orientation (MPO) method. These results demonstrate that the present dipolar approach can reach or even surpass, at least in some respects, the accuracy of two classical reference methods, the partial integration (PI) and St. Venant (SV) approach which utilize monopolar loads instead of dipolar currents.

  4. Impact of Predicting Health Care Utilization Via Web Search Behavior: A Data-Driven Analysis

    PubMed Central

    Zhang, Liangliang; Zhu, Josh; Fang, Shiyuan; Cheng, Tim; Hong, Chloe; Shah, Nigam H

    2016-01-01

    Background By recent estimates, the steady rise in health care costs has deprived more than 45 million Americans of health care services and has encouraged health care providers to better understand the key drivers of health care utilization from a population health management perspective. Prior studies suggest the feasibility of mining population-level patterns of health care resource utilization from observational analysis of Internet search logs; however, the utility of the endeavor to the various stakeholders in a health ecosystem remains unclear. Objective The aim was to carry out a closed-loop evaluation of the utility of health care use predictions using the conversion rates of advertisements that were displayed to the predicted future utilizers as a surrogate. The statistical models to predict the probability of user’s future visit to a medical facility were built using effective predictors of health care resource utilization, extracted from a deidentified dataset of geotagged mobile Internet search logs representing searches made by users of the Baidu search engine between March 2015 and May 2015. Methods We inferred presence within the geofence of a medical facility from location and duration information from users’ search logs and putatively assigned medical facility visit labels to qualifying search logs. We constructed a matrix of general, semantic, and location-based features from search logs of users that had 42 or more search days preceding a medical facility visit as well as from search logs of users that had no medical visits and trained statistical learners for predicting future medical visits. We then carried out a closed-loop evaluation of the utility of health care use predictions using the show conversion rates of advertisements displayed to the predicted future utilizers. 
In the context of behaviorally targeted advertising, wherein health care providers are interested in minimizing their cost per conversion, the association between show conversion rate and predicted utilization score served as a surrogate measure of the model’s utility. Results We obtained the highest area under the curve (0.796) in medical visit prediction with our random forests model and daywise features. Ablating feature categories one at a time showed that the model performance worsened the most when location features were dropped. An online evaluation in which advertisements were served to users who had a high predicted probability of a future medical visit showed a 3.96% increase in the show conversion rate. Conclusions Results from our experiments done in a research setting suggest that it is possible to accurately predict future patient visits from geotagged mobile search logs. Results from the offline and online experiments on the utility of health utilization predictions suggest that such prediction can have utility for health care providers. PMID:27655225

  5. Estimating safety effects of pavement management factors utilizing Bayesian random effect models.

    PubMed

    Jiang, Ximiao; Huang, Baoshan; Zaretzki, Russell L; Richards, Stephen; Yan, Xuedong

    2013-01-01

    Previous studies of pavement management factors that relate to the occurrence of traffic-related crashes are rare. Traditional research has mostly employed summary statistics of bidirectional pavement quality measurements in extended longitudinal road segments over a long time period, which may cause a loss of important information and result in biased parameter estimates. The research presented in this article focuses on crash risk of roadways with overall fair to good pavement quality. Real-time and location-specific data were employed to estimate the effects of pavement management factors on the occurrence of crashes. This research is based on the crash data and corresponding pavement quality data for the Tennessee state route highways from 2004 to 2009. The potential temporal and spatial correlations among observations caused by unobserved factors were considered. Overall 6 models were built accounting for no correlation, temporal correlation only, and both the temporal and spatial correlations. These models included Poisson, negative binomial (NB), one random effect Poisson and negative binomial (OREP, ORENB), and two random effect Poisson and negative binomial (TREP, TRENB) models. The Bayesian method was employed to construct these models. The inference is based on the posterior distribution from the Markov chain Monte Carlo (MCMC) simulation. These models were compared using the deviance information criterion. Analysis of the posterior distribution of parameter coefficients indicates that the pavement management factors indexed by Present Serviceability Index (PSI) and Pavement Distress Index (PDI) had significant impacts on the occurrence of crashes, whereas the variable rutting depth was not significant. Among other factors, lane width, median width, type of terrain, and posted speed limit were significant in affecting crash frequency. 
The findings of this study indicate that a reduction in pavement roughness would reduce the likelihood of traffic-related crashes. Hence, maintaining a low level of pavement roughness is strongly suggested. In addition, the results suggested that the temporal correlation among observations was significant and that the ORENB model outperformed all other models.
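
    The Bayesian estimation the authors describe can be illustrated, in miniature, by a random-walk Metropolis (MCMC) sampler for the log-rate of Poisson counts; the counts, prior scale, and step size below are invented, and the paper's covariates and temporal/spatial random effects are omitted:

```python
import numpy as np

# Random-walk Metropolis for the log-rate beta of i.i.d. Poisson counts
# with a vague N(0, 100) prior. A toy stand-in for the paper's MCMC.

def log_post(beta, y):
    lam = np.exp(beta)
    return np.sum(y * beta - lam) - beta ** 2 / (2 * 100.0)

def metropolis(y, n_iter=5000, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    beta, draws = 0.0, []
    for _ in range(n_iter):
        prop = beta + step * rng.normal()
        # accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < log_post(prop, y) - log_post(beta, y):
            beta = prop
        draws.append(beta)
    return np.array(draws)

y = np.array([3, 5, 2, 4, 6, 3, 4, 5])    # illustrative crash counts
draws = metropolis(y)
post_mean_rate = np.exp(draws[2000:]).mean()  # discard burn-in
```

    Inference in the paper rests on exactly this kind of posterior sample, with model comparison via the deviance information criterion computed from the draws.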

  6. A Meta-Analysis of Transcutaneous Electrical Nerve Stimulation for Chronic Low Back Pain.

    PubMed

    Jauregui, Julio J; Cherian, Jeffrey J; Gwam, Chukwuweike U; Chughtai, Morad; Mistry, Jaydev B; Elmallah, Randa K; Harwin, Steven F; Bhave, Anil; Mont, Michael A

    2016-04-01

    Transcutaneous electrical nerve stimulation (TENS) may provide a safe alternative to the current side-effect-heavy narcotics and anti-inflammatories utilized in chronic low back pain. Therefore, we performed a meta-analysis to evaluate the efficacy of TENS for the treatment of chronic low back pain. We included randomized controlled trials (RCTs), cohort studies, and randomized crossover studies on TENS for the management of low back pain. We utilized a visual analogue scale (VAS) for pain as our primary outcome. Effectiveness of treatment was quantified using improvement in outcome scores for each study. Of the studies that met the criteria, 13 allowed for calculation of weighted mean differences in pain reduction. We used a random-effects model to evaluate changes in pain produced by the intervention. Included were nine level I and four level II studies, encompassing 267 patients (39% male) who had a mean follow-up of seven weeks (range, 2 to 24 weeks). The mean duration of treatment was six weeks (range, 2 to 24 weeks). The standardized mean difference in pain from pre- to post-treatment for TENS was 0.844, which demonstrated a significant effect of TENS on pain reduction. When subdividing by treatment duration, patients treated for < 5 weeks had significant reductions in pain, while those treated for > 5 weeks did not. Treatment of chronic low back pain with TENS demonstrated significant pain reduction. The application of TENS may lead to less pain medication usage and should be incorporated into the treatment armamentarium for chronic low back pain.
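
    A random-effects meta-analysis of this kind typically pools study-level effect sizes with DerSimonian-Laird weights; a sketch with invented standardized mean differences and variances (the paper's 13 studies are not reproduced here):

```python
import numpy as np

# DerSimonian-Laird random-effects pooling: estimate the between-study
# variance tau^2 from Cochran's Q, then reweight each study by
# 1 / (within-study variance + tau^2).

def dersimonian_laird(effects, variances):
    e, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v
    fixed = np.sum(w * e) / np.sum(w)              # fixed-effect estimate
    q = np.sum(w * (e - fixed) ** 2)               # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(e) - 1)) / c)        # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * e) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

pooled, se, tau2 = dersimonian_laird(
    effects=[0.9, 0.7, 1.1, 0.5, 0.8],
    variances=[0.04, 0.09, 0.05, 0.10, 0.06])
```

    Subgroup results (such as the < 5 week versus > 5 week split) come from running the same pooling step on each subset of studies.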

  7. A panel multinomial logit analysis of elderly living arrangements: evidence from Aging In Manitoba longitudinal data, Canada.

    PubMed

    Sarma, Sisira; Simpson, Wayne

    2007-12-01

    Utilizing a unique longitudinal survey linked with home care use data, this paper analyzes the determinants of elderly living arrangements in Manitoba, Canada using a random effects multinomial logit model that accounts for unobserved individual heterogeneity. Because current home ownership is potentially endogenous in a living arrangements choice model, we use prior home ownership as an instrument. We also use prior home care use as an instrument for home care and use a random coefficient framework to account for unobserved health status. After controlling for relevant socio-demographic factors and accounting for unobserved individual heterogeneity, we find that home care and home ownership reduce the probability of living in a nursing home. Consistent with previous studies, we find that age is a strong predictor of nursing home entry. We also find that married people, those who have lived longer in the same community, and those who are healthy are more likely to live independently and less likely to be institutionalized or to cohabit with individuals other than their spouse.
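
    The core of any multinomial logit is the softmax mapping from alternative-specific utilities to choice probabilities; a minimal sketch with invented utility values, and without the paper's random-effects terms:

```python
import math

# Multinomial logit choice probabilities: P(j) = exp(V_j) / sum_k exp(V_k)
# for alternatives j (e.g., independent living, cohabiting, nursing home).

def mnl_probs(utilities):
    m = max(utilities)                       # subtract max for stability
    exps = [math.exp(v - m) for v in utilities]
    s = sum(exps)
    return [e / s for e in exps]

p = mnl_probs([1.2, 0.4, -0.5])  # three living-arrangement alternatives
```

    The random-effects extension used in the paper adds individual-specific terms to each utility, which induces correlation across a person's repeated choices over the panel.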

  8. Directed self assembly of block copolymers using chemical patterns with sidewall guiding lines, backfilled with random copolymer brushes.

    PubMed

    Pandav, Gunja; Durand, William J; Ellison, Christopher J; Willson, C Grant; Ganesan, Venkat

    2015-12-21

    Recently, alignment of block copolymer domains has been achieved using a topographically patterned substrate with a sidewall preferential to one of the blocks. This strategy has been suggested as an option to overcome the patterning resolution challenges facing chemoepitaxy strategies, which utilize chemical stripes with a width of about half the period of the block copolymer to orient the equilibrium morphologies. In this work, single chain in mean field simulation methodology was used to study the self assembly of symmetric block copolymers on topographically patterned substrates with sidewall interactions. Random copolymer brushes grafted to the background region (space between patterns) were modeled explicitly. The effects of changes in pattern width, film thicknesses and strength of sidewall interaction on the resulting morphologies were examined, and the conditions which led to the perpendicular morphologies required for lithographic applications were identified. A number of density multiplication schemes were studied in order to gauge the efficiency with which the sidewall pattern can guide the self assembly of block copolymers. The results indicate that such a patterning technique can potentially utilize pattern widths of the order of one to two times the period of the block copolymer and still be able to guide ordering of the block copolymer domains up to 8X density multiplication.

  9. Markov chain decision model for urinary incontinence procedures.

    PubMed

    Kumar, Sameer; Ghildayal, Nidhi; Ghildayal, Neha

    2017-03-13

    Purpose Urinary incontinence (UI) is a common chronic health condition, a problem specifically among elderly women that negatively impacts quality of life. However, UI is usually viewed as a likely result of old age, and as such is generally not evaluated or even managed appropriately. Many treatments are available to manage incontinence, such as bladder training, as well as surgical procedures such as Burch colposuspension and the Sling, which have high success rates. The purpose of this paper is to analyze which of these popular surgical procedures for UI is more effective. Design/methodology/approach This research employs randomized, prospective studies to obtain robust cost and utility data, used in a Markov chain decision model to examine which of these surgical interventions is more effective in treating women with stress UI based on two measures: number of quality-adjusted life years (QALY) and cost per QALY. TreeAge Pro Healthcare software was employed in the Markov decision analysis. Findings Results showed the Sling procedure is a more effective surgical intervention than the Burch. However, if a utility greater than a certain threshold value, at which both procedures are equally effective, is assigned to persistent incontinence, the Burch procedure is more effective than the Sling procedure. Originality/value This paper demonstrates the efficacy of a Markov chain decision modeling approach to the comparative effectiveness analysis of available treatments for patients with UI, an important public health issue that is widely prevalent among elderly women in developed and developing countries. This research also improves upon other analyses using a Markov chain decision modeling process to analyze various strategies for treating UI.
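
    A Markov chain decision model of this kind advances a cohort through health states cycle by cycle and accumulates quality-adjusted life years. A toy three-state sketch (all transition probabilities and utilities are invented, not the paper's data):

```python
# Markov cohort model: states are (continent, incontinent, dead); each
# cycle multiplies the state distribution by the transition matrix and
# adds the utility-weighted occupancy to the QALY total.

def markov_qalys(trans, utils, start, cycles):
    dist = list(start)
    qalys = 0.0
    for _ in range(cycles):
        qalys += sum(d * u for d, u in zip(dist, utils))
        dist = [sum(dist[i] * trans[i][j] for i in range(len(dist)))
                for j in range(len(dist))]
    return qalys

trans = [[0.90, 0.08, 0.02],   # from continent
         [0.10, 0.85, 0.05],   # from incontinent
         [0.00, 0.00, 1.00]]   # dead is absorbing
utils = [1.0, 0.7, 0.0]        # utility per cycle in each state
q = markov_qalys(trans, utils, start=[1.0, 0.0, 0.0], cycles=20)
```

    Running the same cohort through a per-strategy transition matrix and cost vector yields the paired QALY and cost-per-QALY comparison described in the paper.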

  10. Multicenter Comparison of Machine Learning Methods and Conventional Regression for Predicting Clinical Deterioration on the Wards.

    PubMed

    Churpek, Matthew M; Yuen, Trevor C; Winslow, Christopher; Meltzer, David O; Kattan, Michael W; Edelson, Dana P

    2016-02-01

    Machine learning methods are flexible prediction algorithms that may be more accurate than conventional regression. We compared the accuracy of different techniques for detecting clinical deterioration on the wards in a large, multicenter database. Observational cohort study of five hospitals, from November 2008 until January 2013, including hospitalized ward patients, with no interventions. Demographic variables, laboratory values, and vital signs were utilized in a discrete-time survival analysis framework to predict the combined outcome of cardiac arrest, intensive care unit transfer, or death. Two logistic regression models (one using linear predictor terms and a second utilizing restricted cubic splines) were compared to several different machine learning methods. The models were derived in the first 60% of the data by date and then validated in the next 40%. For model derivation, each event time window was matched to a non-event window. All models were compared to each other and to the Modified Early Warning Score (MEWS), a commonly cited early warning score, using the area under the receiver operating characteristic curve (AUC). A total of 269,999 patients were admitted, and 424 cardiac arrests, 13,188 intensive care unit transfers, and 2,840 deaths occurred in the study. In the validation dataset, the random forest model was the most accurate model (AUC, 0.80 [95% CI, 0.80-0.80]). The logistic regression model with spline predictors was more accurate than the model utilizing linear predictors (AUC, 0.77 vs 0.74; p < 0.01), and all models were more accurate than the MEWS (AUC, 0.70 [95% CI, 0.70-0.70]). In this multicenter study, we found that several machine learning methods more accurately predicted clinical deterioration than logistic regression. Use of detection algorithms derived from these techniques may result in improved identification of critically ill patients on the wards.
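
    The AUC used to compare these models equals the probability that a randomly chosen event case is scored above a randomly chosen non-event case (the Mann-Whitney statistic); a minimal sketch with invented scores:

```python
# Rank-based AUC: count the fraction of (event, non-event) pairs in which
# the event case receives the higher score, with ties counting half.

def auc(pos_scores, neg_scores):
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

a = auc(pos_scores=[0.9, 0.8, 0.6], neg_scores=[0.7, 0.4, 0.3, 0.2])
```

    An AUC of 0.5 corresponds to chance-level discrimination, which is why every model in the study is benchmarked against the MEWS on this scale.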

  11. A novel prediction approach for antimalarial activities of Trimethoprim, Pyrimethamine, and Cycloguanil analogues using extremely randomized trees.

    PubMed

    Nattee, Cholwich; Khamsemanan, Nirattaya; Lawtrakul, Luckhana; Toochinda, Pisanu; Hannongbua, Supa

    2017-01-01

    Malaria is still one of the most serious diseases in tropical regions. This is due in part to the high resistance against available drugs for the inhibition of parasites, Plasmodium, the cause of the disease. New potent compounds with high clinical utility are urgently needed. In this work, we created a novel model using a regression tree to study structure-activity relationships and predict the inhibition constant (Ki) of three different antimalarial analogues (Trimethoprim, Pyrimethamine, and Cycloguanil) based on their molecular descriptors. To the best of our knowledge, this work is the first attempt to study the structure-activity relationships of all three analogues combined. The most relevant descriptors and appropriate parameters of the regression tree are harvested using extremely randomized trees. These descriptors are water accessible surface area, log of the aqueous solubility, total hydrophobic van der Waals surface area, and molecular refractivity. Out of all possible combinations of these selected parameters and descriptors, the tree with the strongest coefficient of determination is selected to be our prediction model. Predicted Ki values from the proposed model show a strong coefficient of determination, R^2 = 0.996, with experimental Ki values. From the structure of the regression tree, compounds with a high accessible surface area of all hydrophobic atoms (ASA_H) and low aqueous solubility (Log S) generally possess low Ki values. Our prediction model can also be utilized as a screening test for new antimalarial drug compounds, which may reduce the time and expense of new drug development. New compounds with high predicted Ki should be excluded from further drug development. It is also our inference that a threshold of ASA_H greater than 575.80 and Log S less than or equal to -4.36 is a sufficient condition for a new compound to possess a low Ki. Copyright © 2016 Elsevier Inc. All rights reserved.
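
    The screening threshold stated in the abstract can be encoded directly as a predicate (descriptor names and cutoffs are taken from the abstract; the fitted regression tree itself is not reproduced):

```python
# Screening rule from the abstract: ASA_H > 575.80 together with
# Log S <= -4.36 is reported as sufficient for a low predicted Ki.

def likely_low_ki(asa_h, log_s):
    """True if the compound satisfies the abstract's low-Ki condition."""
    return asa_h > 575.80 and log_s <= -4.36

# Hypothetical (ASA_H, Log S) descriptor pairs for three candidates.
hits = [d for d in [(600.0, -5.0), (500.0, -5.0), (600.0, -3.0)]
        if likely_low_ki(*d)]
```

    Such a predicate is a cheap pre-filter; borderline compounds would still be scored with the full tree model.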

  12. Incomparable hardness and modulus of biomimetic porous polyurethane films prepared by directional melt crystallization of a solvent

    NASA Astrophysics Data System (ADS)

    An, Suyeong; Kim, Byoungsoo; Lee, Jonghwi

    2017-07-01

    Porous materials with surprisingly diverse structures have been utilized in nature for many functional purposes. However, the structures and applications of porous man-made polymer materials have been limited by the use of processing techniques involving foaming agents. Herein, we demonstrate for the first time the outstanding hardness and modulus properties of an elastomer that originate from the novel processing approach applied. Polyurethane films of 100-μm thickness with biomimetic ordered porous structures were prepared using directional melt crystallization of a solvent and exhibited hardness and modulus values that were 6.8 and 4.3 times higher than those of the random pore structure, respectively. These values surpass the theoretical prediction of the typical model for porous materials, which works reasonably well for random pores but not for directional pores. Both the ordered and random pore structures exhibited similar porosities and pore sizes, which decreased with increasing solution concentration. This unexpectedly significant improvement of the hardness and modulus could open up new application areas for porous polymeric materials using this relatively novel processing technique.

  13. Developing small-area predictions for smoking and obesity prevalence in the United States for use in Environmental Public Health Tracking.

    PubMed

    Ortega Hinojosa, Alberto M; Davies, Molly M; Jarjour, Sarah; Burnett, Richard T; Mann, Jennifer K; Hughes, Edward; Balmes, John R; Turner, Michelle C; Jerrett, Michael

    2014-10-01

    Globally and in the United States, smoking and obesity are leading causes of death and disability. Reliable estimates of prevalence for these risk factors are often missing variables in public health surveillance programs. This may limit the capacity of public health surveillance to target interventions or to assess associations between other environmental risk factors (e.g., air pollution) and health because smoking and obesity are often important confounders. To generate prevalence estimates of smoking and obesity rates over small areas for the United States (i.e., at the ZIP code and census tract levels). We predicted smoking and obesity prevalence using a combined approach first using a lasso-based variable selection procedure followed by a two-level random effects regression with a Poisson link clustered on state and county. We used data from the Behavioral Risk Factor Surveillance System (BRFSS) from 1991 to 2010 to estimate the model. We used 10-fold cross-validated mean squared errors and the variance of the residuals to test our model. To downscale the estimates we combined the prediction equations with 1990 and 2000 U.S. Census data for each of the four five-year time periods in this time range at the ZIP code and census tract levels. Several sensitivity analyses were conducted using models that included only basic terms, that accounted for spatial autocorrelation, and used Generalized Linear Models that did not include random effects. The two-level random effects model produced improved estimates compared to the fixed effects-only models. Estimates were particularly improved for the two-thirds of the conterminous U.S. where BRFSS data were available to estimate the county level random effects. We downscaled the smoking and obesity rate predictions to derive ZIP code and census tract estimates. To our knowledge these smoking and obesity predictions are the first to be developed for the entire conterminous U.S. for census tracts and ZIP codes. 
Our estimates could have significant utility for public health surveillance. Copyright © 2014. Published by Elsevier Inc.

  14. Divalproex Sodium for the Treatment of PTSD and Conduct Disordered Youth: A Pilot Randomized Controlled Clinical Trial

    ERIC Educational Resources Information Center

    Steiner, Hans; Saxena, Kirti S.; Carrion, Victor; Khanzode, Leena A.; Silverman, Melissa; Chang, Kiki

    2007-01-01

    We examined the efficacy of divalproex sodium (DVP) for the treatment of PTSD in conduct disorder, utilizing a previous study in which 71 youth were enrolled in a randomized controlled clinical trial. Twelve had PTSD. Subjects (all males, mean age 16, SD 1.0) were randomized into high and low dose conditions. Clinical Global Impression (CGI)…

  15. Numerical modeling for dilute and dense sprays

    NASA Technical Reports Server (NTRS)

    Chen, C. P.; Kim, Y. M.; Shang, H. M.; Ziebarth, J. P.; Wang, T. S.

    1992-01-01

    We have successfully implemented a numerical model for spray-combustion calculations. In this model, the governing gas-phase equations in Eulerian coordinates are solved by a time-marching multiple pressure correction procedure based on the operator-splitting technique. The droplet-phase equations in Lagrangian coordinates are solved by a stochastic discrete particle technique. To simplify the calculation procedure for the circulating droplets, the effective conductivity model is utilized. The k-epsilon models are utilized to characterize the time and length scales of the gas phase, in conjunction with turbulence modulation by droplets and droplet dispersion by turbulence. This method entails random sampling of instantaneous gas flow properties, and the stochastic process requires a large number of computational parcels to produce satisfactory dispersion distributions even for rather dilute sprays. Two major improvements in spray combustion modeling were made. First, we developed a probability density function approach in multidimensional space to represent a specific computational particle. Second, we incorporated the Taylor Analogy Breakup (TAB) model for handling dense spray effects. This breakup model is based on the reasonable assumption that atomization and drop breakup are indistinguishable processes within a dense spray near the nozzle exit. Accordingly, atomization is prescribed by injecting drops which have a characteristic size equal to the nozzle exit diameter. Example problems include nearly homogeneous and inhomogeneous turbulent particle dispersion, and non-evaporating, evaporating, and burning dense sprays. Comparison with experimental data is discussed in detail.
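    The stochastic discrete particle technique mentioned above can be sketched in one dimension: each computational parcel samples an instantaneous gas velocity as the mean velocity plus a Gaussian turbulent fluctuation whose RMS follows from the k-epsilon turbulence kinetic energy k (u' = sqrt(2k/3)). All numerical values below are assumed for illustration, not taken from the paper.

```python
import random
random.seed(42)

# Illustrative 1-D sketch of the stochastic discrete particle technique.
# Assumed values, not from the paper:
U_MEAN = 10.0      # mean gas velocity, m/s
K_TURB = 1.5       # turbulence kinetic energy, m^2/s^2
TAU_P = 0.01       # droplet response time, s
DT = 0.001         # time step, s
N_PARCELS = 2000

u_rms = (2.0 * K_TURB / 3.0) ** 0.5   # isotropic fluctuation RMS, u' = sqrt(2k/3)

def track_parcel(steps=200):
    """Advance one parcel with a linear drag relaxation toward the sampled gas velocity."""
    v, x = 0.0, 0.0
    for _ in range(steps):
        u_gas = U_MEAN + random.gauss(0.0, u_rms)  # sampled instantaneous gas velocity
        v += (u_gas - v) / TAU_P * DT              # simple linear drag law
        x += v * DT
    return x

positions = [track_parcel() for _ in range(N_PARCELS)]
mean_x = sum(positions) / N_PARCELS
```

    Averaging over many parcels recovers the mean transport, while the spread of `positions` represents turbulent dispersion; this is why the method needs a large number of parcels even for dilute sprays.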

  16. A novel super-resolution camera model

    NASA Astrophysics Data System (ADS)

    Shao, Xiaopeng; Wang, Yi; Xu, Jie; Wang, Lin; Liu, Fei; Luo, Qiuhua; Chen, Xiaodong; Bi, Xiangli

    2015-05-01

    Aiming to realize super-resolution (SR) reconstruction of single images and video, a super-resolution camera model is proposed to address the comparatively low resolution of images obtained by traditional cameras. To achieve this, a driving device such as a piezoelectric ceramic is placed in the camera. By controlling the driving device, a set of continuous low-resolution (LR) images can be obtained and stored in real time, reflecting the randomness of the displacements and the real-time performance of the storage. The low-resolution image sequences contain different redundant information and some particular prior information, making it possible to restore a super-resolution image faithfully and effectively. A sampling method is used to derive the reconstruction principle of super resolution, giving the theoretically possible degree of resolution improvement. A learning-based super-resolution algorithm is used to reconstruct single images, and a variational Bayesian algorithm is simulated to reconstruct the low-resolution images with random displacements; this models the unknown high-resolution image, motion parameters, and unknown model parameters in one hierarchical Bayesian framework. Utilizing a sub-pixel registration method, a super-resolution image of the scene can be reconstructed. Reconstruction results from 16 images show that this camera model can double the image resolution, obtaining higher-resolution images with currently available hardware.

  17. Modeling and parameters identification of 2-keto-L-gulonic acid fed-batch fermentation.

    PubMed

    Wang, Tao; Sun, Jibin; Yuan, Jingqi

    2015-04-01

    This article presents a modeling approach for industrial 2-keto-L-gulonic acid (2-KGA) fed-batch fermentation by the mixed culture of Ketogulonicigenium vulgare (K. vulgare) and Bacillus megaterium (B. megaterium). A macrokinetic model of K. vulgare is constructed based on the simplified metabolic pathways. The reaction rates obtained from the macrokinetic model are then coupled into a bioreactor model, establishing the relationship between substrate feeding rates and the main state variables, e.g., the concentrations of biomass, substrate, and product. A differential evolution algorithm using the Lozi map as the random number generator is utilized to identify the model parameters from industrial data of 2-KGA fed-batch fermentation. Validation results demonstrate that the model simulations of substrate and product concentrations agree well with the measurements. Furthermore, the model simulations of biomass concentrations principally reflect the growth kinetics of the two microbes in the mixed culture.
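    The identification idea can be sketched on a toy problem: a Lozi-map chaotic stream stands in for the random number generator inside a basic differential evolution (DE) loop. The toy model, rescaling, and all parameter values below are assumptions for illustration, not the authors' 2-KGA model.

```python
import math

def lozi_stream(a=1.7, b=0.5, x=0.1, y=0.1):
    """Yield pseudo-random numbers in [0, 1) from the chaotic Lozi map."""
    while True:
        x, y = 1.0 - a * abs(x) + y, b * x
        yield min(max((x + 2.0) / 4.0, 0.0), 0.999999)  # crude rescale of the attractor

# Toy identification target (assumed, far simpler than the fermentation model):
# recover the rate constant k of y(t) = 1 - exp(-k*t) from noiseless samples.
TRUE_K = 0.35
DATA = [(t, 1.0 - math.exp(-TRUE_K * t)) for t in range(10)]

def sse(k):
    return sum((1.0 - math.exp(-k * t) - obs) ** 2 for t, obs in DATA)

rng = lozi_stream()
pop = [next(rng) for _ in range(20)]   # initial candidate k values
F = 0.8                                # DE mutation weight
for _ in range(200):
    for i in range(len(pop)):
        r1, r2, r3 = (int(next(rng) * len(pop)) for _ in range(3))
        trial = pop[r1] + F * (pop[r2] - pop[r3])   # DE/rand/1 mutation
        if trial > 0.0 and sse(trial) < sse(pop[i]):  # greedy selection
            pop[i] = trial

best_k = min(pop, key=sse)
```

    Replacing the chaotic stream with any uniform generator leaves the DE loop unchanged; the Lozi map is simply used as a deterministic source of the random draws DE needs.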

  18. Asymptotic Linear Spectral Statistics for Spiked Hermitian Random Matrices

    NASA Astrophysics Data System (ADS)

    Passemier, Damien; McKay, Matthew R.; Chen, Yang

    2015-07-01

    Using the Coulomb Fluid method, this paper derives central limit theorems (CLTs) for linear spectral statistics of three "spiked" Hermitian random matrix ensembles. These include Johnstone's spiked model (i.e., central Wishart with spiked correlation), non-central Wishart with rank-one non-centrality, and a related class of non-central matrices. For a generic linear statistic, we derive simple and explicit CLT expressions as the matrix dimensions grow large. For all three ensembles under consideration, we find that the primary effect of the spike is to introduce a correction term to the asymptotic mean of the linear spectral statistic, which we characterize with simple formulas. The utility of our proposed framework is demonstrated through application to three different linear statistics problems: the classical likelihood ratio test for a population covariance, the capacity analysis of multi-antenna wireless communication systems with a line-of-sight transmission path, and a classical multiple sample significance testing problem.

  19. Randomized, Double-Blind, Placebo-Controlled Study on Decolonization Procedures for Methicillin-Resistant Staphylococcus aureus (MRSA) among HIV-Infected Adults

    PubMed Central

    Weintrob, Amy; Bebu, Ionut; Agan, Brian; Diem, Alona; Johnson, Erica; Lalani, Tahaniyat; Wang, Xun; Bavaro, Mary; Ellis, Michael; Mende, Katrin; Crum-Cianflone, Nancy

    2015-01-01

    Background HIV-infected persons have increased risk of MRSA colonization and skin and soft-tissue infections (SSTI). However, no large clinical trial has examined the utility of decolonization procedures in reducing MRSA colonization or infection among community-dwelling HIV-infected persons. Methods 550 HIV-infected adults at four geographically diverse US military HIV clinics were prospectively screened for MRSA colonization at five body locations every 6 months during a 2-year period. Those colonized were randomized in a double-blind fashion to nasal mupirocin (Bactroban) twice daily and hexachlorophene (pHisoHex) soaps daily for 7 days compared to placebos similar in appearance but without specific antibacterial activity. The primary endpoint was MRSA colonization at 6-months post-randomization; secondary endpoints were time to MRSA clearance, subsequent MRSA infections/SSTI, and predictors for MRSA clearance at the 6-month time point. Results Forty-nine (9%) HIV-infected persons were MRSA colonized and randomized. Among those with 6-month colonization data (80% of those randomized), 67% were negative for MRSA colonization in both groups (p = 1.0). Analyses accounting for missing 6-month data showed that no significant differences could have been achieved. In the multivariate adjusted models, randomization group was not associated with 6-month MRSA clearance. The median time to MRSA clearance was similar in the treatment vs. placebo groups (1.4 vs. 1.8 months, p = 0.35). There was no difference on subsequent development of MRSA infections/SSTI (p = 0.89). In a multivariable model, treatment group, demographics, and HIV-specific factors were not predictive of MRSA clearance at the 6-month time point. Conclusion A one-week decolonization procedure had no effect on MRSA colonization at the 6-month time point or subsequent infection rates among community-dwelling HIV-infected persons. 
More aggressive or novel interventions may be needed to reduce the burden of MRSA in this population. Trial Registration ClinicalTrials.gov NCT00631566 PMID:26018036

  20. Random trinomial tree models and vanilla options

    NASA Astrophysics Data System (ADS)

    Ganikhodjaev, Nasir; Bayram, Kamola

    2013-09-01

    In this paper we introduce and study the random trinomial model. The usual trinomial model is prescribed by a triple of numbers (u, d, m); we call the triple (u, d, m) the environment of the trinomial model. A triple (Un, Dn, Mn), where {Un}, {Dn} and {Mn} are sequences of independent, identically distributed random variables with 0 < Dn < 1 < Un and Mn = 1 for all n, is called a random environment, and a trinomial tree model with a random environment is called a random trinomial model. The random trinomial model is considered to produce more accurate results than the random binomial model or the usual trinomial model.
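    The setup above can be illustrated by Monte Carlo pricing of a vanilla European call in a random trinomial environment. The distributions of Un and Dn, the equal-weight move probabilities, and the zero interest rate are assumptions for illustration; the paper does not specify them.

```python
import random
random.seed(7)

# Illustrative sketch (assumed parameters, not from the paper): at step n the
# factors Un, Dn are drawn i.i.d. with 0 < Dn < 1 < Un and Mn = 1, and the
# price moves up, down, or stays with equal probability.
S0, K, STEPS, PATHS = 100.0, 100.0, 50, 20000

def one_path():
    s = S0
    for _ in range(STEPS):
        u = random.uniform(1.01, 1.05)   # Un > 1
        d = random.uniform(0.95, 0.99)   # 0 < Dn < 1
        s *= random.choice((u, d, 1.0))  # up, down, or stay (Mn = 1)
    return s

# Expected call payoff under the assumed path measure (zero discount rate).
price = sum(max(one_path() - K, 0.0) for _ in range(PATHS)) / PATHS
```

    With these assumed distributions the log-price is nearly driftless, so the at-the-money call value is governed by the accumulated volatility of the random environment.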

  1. Blastocyst utilization rates after continuous culture in two commercial single-step media: a prospective randomized study with sibling oocytes.

    PubMed

    Sfontouris, Ioannis A; Kolibianakis, Efstratios M; Lainas, George T; Venetis, Christos A; Petsas, George K; Tarlatzis, Basil C; Lainas, Tryfon G

    2017-10-01

    The aim of this study is to determine whether blastocyst utilization rates are different after continuous culture in two different commercial single-step media. This is a paired randomized controlled trial with sibling oocytes conducted in infertility patients, aged ≤40 years with ≥10 oocytes retrieved assigned to blastocyst culture and transfer. Retrieved oocytes were randomly allocated to continuous culture in either Sage one-step medium (Origio) or Continuous Single Culture (CSC) medium (Irvine Scientific) without medium renewal up to day 5 post oocyte retrieval. Main outcome measure was the proportion of embryos suitable for clinical use (utilization rate). A total of 502 oocytes from 33 women were randomly allocated to continuous culture in either Sage one-step medium (n = 250) or CSC medium (n = 252). Fertilization was performed by either in vitro fertilization or intracytoplasmic sperm injection, and embryo transfers were performed on day 5. Two patients had all blastocysts frozen due to the occurrence of severe ovarian hyperstimulation syndrome. Fertilization and cleavage rates, as well as embryo quality on day 3, were similar in the two media. Blastocyst utilization rates (%, 95% CI) [55.4% (46.4-64.1) vs 54.7% (44.9-64.6), p = 0.717], blastocyst formation rates [53.6% (44.6-62.5) vs 51.9 (42.2-61.6), p = 0.755], and proportion of good quality blastocysts [36.8% (28.1-45.4) vs 36.1% (27.2-45.0), p = 0.850] were similar in Sage one-step and CSC media, respectively. Continuous culture of embryos in Sage one-step and CSC media is associated with similar blastocyst development and utilization rates. Both single-step media appear to provide adequate support during in vitro preimplantation embryo development. Whether these observations are also valid for other continuous single medium protocols remains to be determined. NCT02302638.

  2. Regression discontinuity was a valid design for dichotomous outcomes in three randomized trials.

    PubMed

    van Leeuwen, Nikki; Lingsma, Hester F; Mooijaart, Simon P; Nieboer, Daan; Trompet, Stella; Steyerberg, Ewout W

    2018-06-01

    Regression discontinuity (RD) is a quasi-experimental design that may provide valid estimates of treatment effects in the case of continuous outcomes. We aimed to evaluate validity and precision in the RD design for dichotomous outcomes. We performed validation studies in three large randomized controlled trials (RCTs) (Corticosteroid Randomization After Significant Head injury [CRASH], the Global Utilization of Streptokinase and Tissue Plasminogen Activator for Occluded Coronary Arteries [GUSTO], and PROspective Study of Pravastatin in elderly individuals at risk of vascular disease [PROSPER]). To mimic the RD design, we selected patients above and below a cutoff (e.g., age 75 years) randomized to treatment and control, respectively. Adjusted logistic regression models using restricted cubic splines (RCS) and polynomials and local logistic regression models estimated the odds ratio (OR) for treatment, with 95% confidence intervals (CIs) to indicate precision. In CRASH, treatment increased mortality with OR 1.22 [95% CI 1.06-1.40] in the RCT. The RD estimates were 1.42 (0.94-2.16) and 1.13 (0.90-1.40) with RCS adjustment and local regression, respectively. In GUSTO, treatment reduced mortality (OR 0.83 [0.72-0.95]), with more extreme estimates in the RD analysis (OR 0.57 [0.35-0.92] and 0.67 [0.51-0.86]). In PROSPER, similar RCT and RD estimates were found, again with less precision in RD designs. We conclude that the RD design provides similar but substantially less precise treatment effect estimates compared with an RCT, with local regression being the preferred method of analysis. Copyright © 2018 Elsevier Inc. All rights reserved.
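    The RD-mimicking idea can be sketched crudely: assign treatment by an age cutoff, then estimate the treatment odds ratio from a 2x2 table using only patients within a narrow bandwidth around the cutoff. This stands in for, and is far simpler than, the paper's spline-adjusted and local logistic regression models; the cutoff, bandwidth, true OR, and the flat age effect within the window are all assumptions.

```python
import random
random.seed(1)

# Assumed simulation setup (not the CRASH/GUSTO/PROSPER data).
CUTOFF, BANDWIDTH, TRUE_OR = 75.0, 3.0, 1.5
BASE_P = 0.10   # baseline event probability; assumed flat in age for simplicity

def simulate_patient():
    age = random.uniform(60, 90)
    treated = age >= CUTOFF                      # RD-style assignment rule
    odds = BASE_P / (1 - BASE_P) * (TRUE_OR if treated else 1.0)
    died = random.random() < odds / (1 + odds)
    return age, treated, died

counts = {(t, d): 0 for t in (0, 1) for d in (0, 1)}
for _ in range(200000):
    age, treated, died = simulate_patient()
    if abs(age - CUTOFF) <= BANDWIDTH:           # keep only the local window
        counts[(int(treated), int(died))] += 1

# Cross-product odds ratio from the local 2x2 table.
or_hat = (counts[(1, 1)] * counts[(0, 0)]) / (counts[(1, 0)] * counts[(0, 1)])
```

    Narrowing the bandwidth reduces confounding by the running variable but discards data, which is the precision loss relative to an RCT that the validation studies quantify.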

  3. Consumer Behavior in the Choice of Mode of Transport: A Case Study in the Toledo-Madrid Corridor

    PubMed Central

    Muro-Rodríguez, Ana I.; Perez-Jiménez, Israel R.; Gutiérrez-Broncano, Santiago

    2017-01-01

    Within the context of the consumption of goods or services, the decisions made by individuals involve choosing among a set of discrete alternatives, such as the choice of mode of transport. The methodology for analyzing this consumer behavior is discrete choice modeling based on random utility theory. These models define preferences through a utility function that is maximized. Also known as disaggregated demand models, they derive from the decisions of a set of individuals, formalized through probabilistic models. The objective of this study is to determine consumer behavior in the choice of a service, namely transport services in a short-distance corridor such as Toledo-Madrid. The Toledo-Madrid corridor is characterized by its short distance, with the high-speed train (HST) available among the choice options to reach the airport, along with the bus and the car, and where HST and aircraft services can be offered as complementary modes. By applying disaggregated transport models to revealed and stated preference survey data, one can determine the most important variables involved in the choice and individuals' willingness to pay. This willingness to pay may condition the use of certain transport policies to promote efficient transportation. PMID:28676776
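    The random utility framework described above can be made concrete with a multinomial logit sketch. The mode attributes and taste coefficients below are invented for illustration, not estimates from the Toledo-Madrid study; the random utility foundation (Gumbel errors implying logit probabilities) is standard.

```python
import math
import random
random.seed(3)

# Assumed coefficients and mode attributes (hypothetical, not from the study).
BETA_TIME, BETA_COST = -0.04, -0.02
modes = {
    "hst": (35, 25.0),        # (minutes, euros)
    "bus": (80, 10.0),
    "car": (60, 18.0),
}

def logit_probs(alts):
    """Multinomial logit choice probabilities from systematic utilities."""
    v = {m: BETA_TIME * t + BETA_COST * c for m, (t, c) in alts.items()}
    denom = sum(math.exp(x) for x in v.values())
    return {m: math.exp(x) / denom for m, x in v.items()}

probs = logit_probs(modes)

# Simulating utility maximization with Gumbel errors reproduces the same
# shares, which is the random-utility derivation of the logit model.
def gumbel():
    return -math.log(-math.log(random.random()))

wins = {m: 0 for m in modes}
N = 50000
for _ in range(N):
    best = max(modes, key=lambda m: BETA_TIME * modes[m][0]
               + BETA_COST * modes[m][1] + gumbel())
    wins[best] += 1
```

    The ratio of the time and cost coefficients gives the implied value of travel time, which is how such models feed willingness-to-pay estimates into transport policy.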

  4. Consumer Behavior in the Choice of Mode of Transport: A Case Study in the Toledo-Madrid Corridor.

    PubMed

    Muro-Rodríguez, Ana I; Perez-Jiménez, Israel R; Gutiérrez-Broncano, Santiago

    2017-01-01

    Within the context of the consumption of goods or services, the decisions made by individuals involve choosing among a set of discrete alternatives, such as the choice of mode of transport. The methodology for analyzing this consumer behavior is discrete choice modeling based on random utility theory. These models define preferences through a utility function that is maximized. Also known as disaggregated demand models, they derive from the decisions of a set of individuals, formalized through probabilistic models. The objective of this study is to determine consumer behavior in the choice of a service, namely transport services in a short-distance corridor such as Toledo-Madrid. The Toledo-Madrid corridor is characterized by its short distance, with the high-speed train (HST) available among the choice options to reach the airport, along with the bus and the car, and where HST and aircraft services can be offered as complementary modes. By applying disaggregated transport models to revealed and stated preference survey data, one can determine the most important variables involved in the choice and individuals' willingness to pay. This willingness to pay may condition the use of certain transport policies to promote efficient transportation.

  5. Variations in the use of emergency PCI for the treatment of re-infarction following intravenous fibrinolytic therapy: impact on outcomes in HERO-2.

    PubMed

    Edmond, J J; French, J K; Aylward, P E G; Wong, C K; Stewart, R A H; Williams, B F; De Pasquale, C G; O'connell, R L; Van den Berg, K; Van de Werf, F J; Simes, R J; White, H D

    2007-06-01

    Patients who suffer re-infarction during initial hospitalization for ST-elevation myocardial infarction (STEMI) have decreased survival compared to patients without re-infarction, so treatment of re-infarction may influence survival. To determine whether the utilization of reperfusion therapies within 12 h of re-infarction varied and was associated with 30-day mortality, we studied the 552 patients with re-infarction among 17,073 patients with STEMI enrolled in HERO-2 in five regions (Russia, Eastern Europe, Western Countries, Asia, and Latin America). Patients presenting within 6 h of symptom onset were randomized to receive either bivalirudin or unfractionated heparin intravenously just prior to streptokinase. Re-infarction occurred in 2.8 and 3.6% of bivalirudin- and heparin-treated patients, respectively (P = 0.004), but treatment assignment did not influence mortality after re-infarction. Patients with re-infarction had a higher 30-day mortality than those without re-infarction (24 vs. 10%; P < 0.001 by Cox model). Within 12 h of re-infarction, fibrinolytic therapy was administered to 12.0% of patients and 8.2% underwent percutaneous coronary intervention (PCI); these two treatments were more frequently utilized in patients from Western countries (n = 112) than in patients from other countries (n = 440) (34.8 and 16.1% compared to 6.1 and 6.1%, respectively, P < 0.001). Mortality was 15% in patients receiving reperfusion therapy for re-infarction and 27% for those with conservative management, hazard ratio (HR) 0.53 (95% CI 0.32-0.88), P = 0.01. In multiple Cox regression analysis, which included adjustment for clinical variables and randomized treatment assignment, 30-day mortality after re-infarction varied by region (highest Latin America 29%, lowest Western countries 15%; P = 0.01). Other independent prognostic factors included age, time from randomization to re-infarction, and Killip class at randomization. 
The HR for PCI treatment of re-infarction was 0.18 [(95% CI 0.04-0.76), P = 0.02] in analyses which excluded deaths within 12 h. Treatment of re-infarction with reperfusion therapies was markedly under-utilized, especially in non-western countries. PCI for re-infarction, in particular, was associated with a lower 30-day mortality, which may reflect both patient selection and effects of treatment.

  6. Developing and testing a decision model for predicting influenza vaccination compliance.

    PubMed Central

    Carter, W B; Beach, L R; Inui, T S; Kirscht, J P; Prodzinski, J C

    1986-01-01

    Influenza vaccination has long been recommended for elderly high-risk patients, yet national surveys indicate that vaccination compliance rates are remarkably low (20 percent). We conducted a study to model prospectively the flu shot decisions and subsequent behavior of an elderly and/or chronically diseased (at high risk for complications of influenza) ambulatory care population at the Seattle VA Medical Center. Prior to the 1980-81 flu shot season, a random (stratified by disease) sample of 63 patients, drawn from the total population of high-risk patients in the general medicine clinic, was interviewed to identify patient-defined concerns regarding flu shots. Six potential consequences of influenza and nine of vaccination were emphasized by patients and provided the content for a weighted hierarchical utility model questionnaire. The utility model provides an operational framework for (1) obtaining subjective value and relative importance judgments from patients; (2) combining these judgments to obtain a prediction of behavioral intention and behavior for each patient; and, if the model is valid (predictive of behavior), (3) identifying those factors which are most salient to patient's decisions and subsequent behavior. Prior to the 1981-82 flu season, the decision model questionnaire was administered to 350 other high-risk patients from the same general medicine clinic population. The decision model correctly predicted behavioral intention for 87 percent and vaccination behavior for 82 percent of this population and, more importantly, differentiated shot "takers" and "nontakers" along several attitudinal dimensions that suggest specific content areas for clinical compliance intervention strategies. PMID:3949541
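    The weighted hierarchical utility model described above can be sketched as a weighted additive utility calculation. The consequences, importance weights, and subjective values below are invented placeholders, not the patient-derived items from the study: the utility of each option is the importance-weighted sum of subjective values, and the higher-utility option predicts behavioral intention.

```python
# Hypothetical consequences with (importance weight, value if vaccinated,
# value if not vaccinated); none of these numbers come from the study.
consequences = {
    "avoid influenza":  (0.50,  0.9, -0.8),
    "arm soreness":     (0.15, -0.4,  0.0),
    "fear of reaction": (0.35, -0.3,  0.0),
}

def expected_utility(option_index):
    """Weighted additive utility: sum of weight * subjective value per consequence."""
    return sum(w * values[option_index]
               for w, *values in consequences.values())

u_shot, u_no_shot = expected_utility(0), expected_utility(1)
predicted_intention = "take shot" if u_shot > u_no_shot else "decline"
```

    Because each term is inspected separately, the same calculation identifies which consequences dominate a patient's decision, which is what made the model useful for targeting compliance interventions.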

  7. Comparing efficacy of reduced-toxicity allogeneic hematopoietic cell transplantation with conventional chemo-(immuno) therapy in patients with relapsed or refractory CLL: a Markov decision analysis.

    PubMed

    Kharfan-Dabaja, M A; Pidala, J; Kumar, A; Terasawa, T; Djulbegovic, B

    2012-09-01

    Despite therapeutic advances, relapsed/refractory CLL, particularly after fludarabine-based regimens, remains a major challenge for which optimal therapy is undefined. No randomized comparative data exist to suggest the superiority of reduced-toxicity allogeneic hematopoietic cell transplantation (RT-allo-HCT) over conventional chemo-(immuno) therapy (CCIT). By using estimates from a systematic review and by meta-analysis of available published evidence, we constructed a Markov decision model to examine these competing modalities. Cohort analysis demonstrated superior outcome for RT-allo-HCT, with a 10-month overall life expectancy (and 6-month quality-adjusted life expectancy (QALE)) advantage over CCIT. Although the model was sensitive to changes in base-case assumptions and transition probabilities, RT-allo-HCT provided superior overall life expectancy through a range of values supported by the meta-analysis. QALE was superior for RT-allo-HCT compared with CCIT. This conclusion was sensitive to change in the anticipated state utility associated with the post-allogeneic HCT state; however, RT-allo-HCT remained the optimal strategy for values supported by existing literature. This analysis provides a quantitative comparison of outcomes between RT-allo-HCT and CCIT for relapsed/refractory CLL in the absence of randomized comparative trials. Confirmation of these findings requires a prospective randomized trial, which compares the most effective RT-allo-HCT and CCIT regimens for relapsed/refractory CLL.
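    The Markov decision model can be sketched as a cohort simulation: each strategy is a monthly transition matrix over health states, and life expectancy and QALE accumulate as the cohort distribution evolves. The states, transition probabilities, and state utilities below are invented placeholders, not the meta-analysis estimates used in the paper.

```python
STATES = ("remission", "relapse", "dead")
UTILITY = {"remission": 0.8, "relapse": 0.5}   # assumed state utilities

# Assumed monthly transition probabilities (each row sums to 1).
TRANSITIONS = {
    "RT-allo-HCT": {
        "remission": {"remission": 0.97, "relapse": 0.02, "dead": 0.01},
        "relapse":   {"remission": 0.05, "relapse": 0.90, "dead": 0.05},
        "dead":      {"dead": 1.0},
    },
    "CCIT": {
        "remission": {"remission": 0.94, "relapse": 0.05, "dead": 0.01},
        "relapse":   {"remission": 0.02, "relapse": 0.92, "dead": 0.06},
        "dead":      {"dead": 1.0},
    },
}

def run_cohort(strategy, months=240):
    """Track the cohort distribution; return (life months, quality-adjusted life months)."""
    dist = {"remission": 1.0, "relapse": 0.0, "dead": 0.0}
    life_months = qale_months = 0.0
    for _ in range(months):
        life_months += dist["remission"] + dist["relapse"]
        qale_months += sum(dist[s] * UTILITY[s] for s in UTILITY)
        nxt = {s: 0.0 for s in STATES}
        for s, p in dist.items():
            for t, q in TRANSITIONS[strategy][s].items():
                nxt[t] += p * q
        dist = nxt
    return life_months, qale_months

le_hct, qale_hct = run_cohort("RT-allo-HCT")
le_ccit, qale_ccit = run_cohort("CCIT")
```

    Varying the relapse-state utility or the post-HCT transition probabilities and re-running the cohort is exactly the kind of sensitivity analysis the paper reports.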

  8. Rational analyses of information foraging on the web.

    PubMed

    Pirolli, Peter

    2005-05-06

    This article describes rational analyses and cognitive models of Web users developed within information foraging theory. This is done by following the rational analysis methodology of (a) characterizing the problems posed by the environment, (b) developing rational analyses of behavioral solutions to those problems, and (c) developing cognitive models that approach the realization of those solutions. Navigation choice is modeled as a random utility model that uses spreading activation mechanisms that link proximal cues (information scent) that occur in Web browsers to internal user goals. Web-site leaving is modeled as an ongoing assessment by the Web user of the expected benefits of continuing at a Web site as opposed to going elsewhere. These cost-benefit assessments are also based on spreading activation models of information scent. Evaluations include a computational model of Web user behavior called Scent-Based Navigation and Information Foraging in the ACT Architecture, and the Law of Surfing, which characterizes the empirical distribution of the length of paths of visitors at a Web site. 2005 Lawrence Erlbaum Associates, Inc.

  9. Drag Reduction of an Airfoil Using Deep Learning

    NASA Astrophysics Data System (ADS)

    Jiang, Chiyu; Sun, Anzhu; Marcus, Philip

    2017-11-01

    We reduced the drag of a 2D airfoil, starting with a NACA-0012 airfoil and using deep learning methods. We created a database consisting of simulations of 2D external flow over randomly generated shapes, then developed a machine learning framework to infer the external flow field for a given input shape. Past work that applied machine learning to computational fluid dynamics focused on estimating specific flow parameters; this work is novel in inferring entire flow fields. We further showed that learned flow patterns are transferable to cases that share certain similarities. This study illustrates the prospects of deeper integration of data-based modeling into current CFD simulation frameworks for faster flow inference and more accurate flow modeling.

  10. Three-Dimensional Models of Topological Insulators: Engineering of Dirac Cones and Robustness of the Spin Texture

    NASA Astrophysics Data System (ADS)

    Soriano, David; Ortmann, Frank; Roche, Stephan

    2012-12-01

    We design three-dimensional models of topological insulator thin films, showing a tunability of the odd number of Dirac cones driven by the atomic-scale geometry at the boundaries. A single Dirac cone at the Γ-point can be obtained as well as full suppression of quantum tunneling between Dirac states at geometrically differentiated surfaces. The spin texture of surface states changes from a spin-momentum-locking symmetry to a surface spin randomization upon the introduction of bulk disorder. These findings illustrate the richness of the Dirac physics emerging in thin films of topological insulators and may prove useful for engineering Dirac cones and for quantifying bulk disorder in materials with ultraclean surfaces.

  11. Microseismic response characteristics modeling and locating of underground water supply pipe leak

    NASA Astrophysics Data System (ADS)

    Wang, J.; Liu, J.

    2015-12-01

    In traditional methods of pipeline leak location, geophones must be located on the pipe wall. If the exact location of the pipeline is unknown, leaks cannot be identified accurately. To solve this problem, taking into account the characteristics of pipeline leaks, we propose a continuous random seismic source model and construct geological models to investigate the proposed method for locating underground pipeline leaks. Based on two-dimensional (2D) viscoacoustic equations and the staggered-grid finite-difference (FD) algorithm, the microseismic wave field generated by a leaking pipe is modeled. Cross-correlation analysis and the simulated annealing (SA) algorithm were utilized to obtain the time differences and the leak location. We also analyze and discuss the effect of the number of recorded traces, the survey layout, and the offset and interval of the traces on the accuracy of the estimated location. The preliminary results of the simulation and the field experiment indicate that (1) a continuous random source can realistically represent the leak microseismic wave field in a simulation using 2D viscoacoustic equations and a staggered-grid FD algorithm; (2) the cross-correlation method is effective for calculating the time difference of the direct wave relative to the reference trace, although outside the refraction blind zone the accuracy of the time difference is reduced by the effects of the refracted wave; and (3) the acquisition of time differences based on microseismic theory and the SA algorithm has great potential for locating leaks in underground pipelines from an array located on the ground surface. Keywords: viscoacoustic finite-difference simulation; continuous random source; simulated annealing algorithm; pipeline leak location
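    The cross-correlation step can be sketched with synthetic traces: the lag that maximizes the cross-correlation of a trace with the reference trace estimates the arrival-time difference of the leak signal. The wavelet shape and delays below are assumed for illustration, not simulated viscoacoustic wave fields.

```python
import math

def make_trace(delay_samples, n=400):
    """Synthetic 'leak' wavelet: a decaying sinusoid starting at delay_samples."""
    trace = []
    for i in range(n):
        t = i - delay_samples
        trace.append(math.exp(-0.05 * t) * math.sin(0.3 * t) if t >= 0 else 0.0)
    return trace

def xcorr_lag(trace, reference, max_lag=100):
    """Return the lag (in samples) maximizing the cross-correlation."""
    def corr(lag):
        return sum(trace[i] * reference[i - lag]
                   for i in range(len(trace)) if 0 <= i - lag < len(reference))
    return max(range(-max_lag, max_lag + 1), key=corr)

reference = make_trace(50)
trace = make_trace(80)          # same wavelet, arriving 30 samples later
lag = xcorr_lag(trace, reference)
```

    In the proposed method, such lags from multiple surface geophones become the data that the simulated annealing search fits to candidate leak positions.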

  12. Residential green power demand in the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dagher, Leila; Bird, Lori; Heeter, Jenny

    This paper investigates the demand determinants of green power in the U.S. residential sector. The data employed were collected by the National Renewable Energy Laboratory and consist of a cross-section of seven utilities observed over 13 years. A series of tests was performed, resulting in the estimation of a demand equation using a one-way cross-section random effects model. As expected, we find that demand is highly price inelastic. More interesting, though, is that the elasticity with respect to the number of customers is 0.52, leading to the conclusion that new subscribers tend to purchase less green power on average than existing customers. Another compelling finding is that obtaining accreditation has a 28.5% positive impact on consumption. Knowing that gaining green accreditation is important to the success of programs, utilities may want to seek certification and highlight it in their advertising campaigns.

  13. Randomized Control Trials on the Dynamic Geometry Approach

    ERIC Educational Resources Information Center

    Jiang, Zhonghong; White, Alexander; Rosenwasser, Alana

    2011-01-01

    The project reported here is conducting repeated randomized control trials of an approach to high school geometry that utilizes Dynamic Geometry (DG) software to supplement ordinary instructional practices. It compares effects of that intervention with standard instruction that does not make use of computer drawing/exploration tools. The basic…

  14. Challenges and Innovations in a Community-Based Participatory Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Goodkind, Jessica R.; Amer, Suha; Christian, Charlisa; Hess, Julia Meredith; Bybee, Deborah; Isakson, Brian L.; Baca, Brandon; Ndayisenga, Martin; Greene, R. Neil; Shantzek, Cece

    2017-01-01

    Randomized controlled trials (RCTs) are a long-standing and important design for conducting rigorous tests of the effectiveness of health interventions. However, many questions have been raised about the external validity of RCTs, their utility in explicating mechanisms of intervention and participants' intervention experiences, and their…

  15. Impact of school-based vegetable garden and physical activity coordinated health interventions on weight status and weight-related behaviors of ethnically diverse, low-income students: Study design and baseline data of the Texas, Grow! Eat! Go! (TGEG) cluster-randomized controlled trial.

    PubMed

    Evans, A; Ranjit, N; Hoelscher, D; Jovanovic, C; Lopez, M; McIntosh, A; Ory, M; Whittlesey, L; McKyer, L; Kirk, A; Smith, C; Walton, C; Heredia, N I; Warren, J

    2016-09-13

    Coordinated, multi-component school-based interventions can improve health behaviors in children, as well as parents, and impact the weight status of students. By leveraging a unique collaboration between Texas AgriLife Extension (a federal, state and county funded educational outreach organization) and the University of Texas School of Public Health, the Texas Grow! Eat! Go! Study (TGEG) modeled the effectiveness of utilizing existing programs and volunteer infrastructure to disseminate an enhanced Coordinated School Health program. The five-year TGEG study was developed to assess the independent and combined impact of gardening, nutrition and physical activity intervention(s) on the prevalence of healthy eating, physical activity and weight status among low-income elementary students. The purpose of this paper is to report on study design, baseline characteristics, intervention approaches, data collection and baseline data. The study design for the TGEG study consisted of a factorial group randomized controlled trial (RCT) in which 28 schools were randomly assigned to one of 4 treatment groups: (1) Coordinated Approach to Child Health (CATCH) only (Comparison), (2) CATCH plus school garden intervention [Learn, Grow, Eat & Go! (LGEG)], (3) CATCH plus physical activity intervention [Walk Across Texas (WAT)], and (4) CATCH plus LGEG plus WAT (Combined). The outcome variables include student's weight status, vegetable and sugar sweetened beverage consumption, physical activity, and sedentary behavior. Parents were assessed for home environmental variables including availability of certain foods, social support of student health behaviors, parent engagement and behavior modeling. Descriptive data are presented for students (n = 1369) and parents (n = 1206) at baseline. 
The sample consisted primarily of Hispanic and African American (53 % and 18 %, respectively) and low-income (i.e., 78 % eligible for Free and Reduced Price School Meals program and 43 % food insecure) students. On average, students did not meet national guidelines for vegetable consumption or physical activity. At baseline, no statistical differences for demographic or key outcome variables among the 4 treatment groups were observed. The TGEG study targets a population of students and parents at high risk of obesity and related chronic conditions, utilizing a novel and collaborative approach to program formulation and delivery, and a rigorous, randomized study design.

  16. A randomized controlled trial to assess the efficacy of an interactive mobile messaging intervention for underserved smokers: Project ACTION.

    PubMed

    Vidrine, Damon J; Fletcher, Faith E; Danysh, Heather E; Marani, Salma; Vidrine, Jennifer Irvin; Cantor, Scott B; Prokhorov, Alexander V

    2012-08-25

    Despite a significant decrease in smoking prevalence over the past ten years, cigarette smoking still represents the leading cause of preventable morbidity and mortality in the United States. Moreover, smoking prevalence is significantly higher among those with low levels of education and those living at, or below, the poverty level. These groups tend to be confronted with significant barriers to utilizing more traditional smoking cessation intervention approaches. The purpose of the study, Project ACTION (Adult smoking Cessation Treatment through Innovative Outreach to Neighborhoods), is to utilize a mobile clinic model, a network of community sites (i.e., community centers and churches) and an interactive mobile messaging system to reach and deliver smoking cessation treatment to underserved, low-income communities. We are using a group-randomized design, with the community site as the sampling unit, to compare the efficacy of three smoking cessation interventions: 1) Standard Care--brief advice to quit smoking, nicotine replacement therapy (NRT), and self-help materials; 2) Enhanced Care--standard care components plus a cell phone-delivered text/graphical messaging component; and 3) Intensive Care--enhanced care components plus a series of 11 cell phone-delivered proactive counseling sessions. An economic evaluation will also be performed to evaluate the relative cost effectiveness of the three treatment approaches. We will recruit 756 participants (252 participants in each of the 3 intervention groups). At the time of randomization, participants complete a baseline assessment, consisting of smoking history, socio-demographic, and psychosocial variables. Monthly cell phone assessments are conducted for 6 months post-enrollment, and a final 12-month follow-up is conducted at the original neighborhood site of enrollment. We will perform mixed-model logistic regression to compare the efficacy of the three smoking cessation intervention treatment groups. 
It is hypothesized that the intensive care approach will most successfully address the needs of the target population and result in the highest smoking cessation rates. In addition to increasing cessation rates, the intervention offers several features (including neighborhood outreach and use of mHealth technology) that are likely to reduce treatment barriers while enhancing participant engagement and retention to treatment. This randomized controlled trial is registered with clinicaltrials.gov registration number NCT00948129.

  17. Compatibility of the Space Station Freedom life sciences research centrifuge with microgravity requirements

    NASA Technical Reports Server (NTRS)

    Hasha, Martin D.

    1990-01-01

    NASA is developing a Life Sciences Centrifuge Facility for Space Station Freedom. It includes a 2.5-meter artificial gravity Bioresearch Centrifuge (BC), which is perhaps the most critical single element in the life sciences space research program. It rotates continuously at precise selectable rates, and utilizes advanced reliable technologies to reduce vibrations. Three disturbance types are analyzed using a current Space Station Freedom dynamic model in the 0.0 to 5.0 Hz range: sinusoidal, random, and transient. Results show that with proper selection of proven design techniques, BC vibrations are compatible with requirements.

  18. A random forest approach for predicting the presence of Echinococcus multilocularis intermediate host Ochotona spp. presence in relation to landscape characteristics in western China

    PubMed Central

    Marston, Christopher G.; Danson, F. Mark; Armitage, Richard P.; Giraudoux, Patrick; Pleydell, David R.J.; Wang, Qian; Qui, Jiamin; Craig, Philip S.

    2014-01-01

    Understanding distribution patterns of hosts implicated in the transmission of zoonotic disease remains a key goal of parasitology. Here, random forests are employed to model spatial patterns of the presence of the plateau pika (Ochotona spp.) small mammal intermediate host for the parasitic tapeworm Echinococcus multilocularis which is responsible for a significant burden of human zoonoses in western China. Landsat ETM+ satellite imagery and digital elevation model data were utilized to generate quantified measures of environmental characteristics across a study area in Sichuan Province, China. Land cover maps were generated identifying the distribution of specific land cover types, with landscape metrics employed to describe the spatial organisation of land cover patches. Random forests were used to model spatial patterns of Ochotona spp. presence, enabling the relative importance of the environmental characteristics in relation to Ochotona spp. presence to be ranked. An index of habitat aggregation was identified as the most important variable in influencing Ochotona spp. presence, with area of degraded grassland the most important land cover class variable. 71% of the variance in Ochotona spp. presence was explained, with a 90.98% accuracy rate as determined by ‘out-of-bag’ error assessment. Identification of the environmental characteristics influencing Ochotona spp. presence enables us to better understand distribution patterns of hosts implicated in the transmission of Em. The predictive mapping of this Em host enables the identification of human populations at increased risk of infection, enabling preventative strategies to be adopted. PMID:25386042
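    The 'out-of-bag' (OOB) error assessment mentioned above exploits the fact that each tree in a random forest is fitted to a bootstrap sample, leaving roughly a third of the observations unused by that tree; those held-out rows act as a built-in test set. A minimal sketch of how the OOB set arises (the function name and sample size are illustrative, not from the study):

```python
import random

def bootstrap_oob(n, seed=0):
    """Draw a bootstrap sample of row indices 0..n-1 (with replacement)
    and return (in_bag, out_of_bag); OOB rows are those never drawn."""
    rng = random.Random(seed)
    in_bag = [rng.randrange(n) for _ in range(n)]
    oob = [i for i in range(n) if i not in set(in_bag)]
    return in_bag, oob

in_bag, oob = bootstrap_oob(100)
# Every row is either drawn into the bag or held out, never both:
assert set(in_bag).isdisjoint(oob)
assert set(in_bag) | set(oob) == set(range(100))
```

    On average about 1/e (roughly 37%) of rows land out of bag, which is why aggregating OOB predictions tracks cross-validation error without a separate hold-out set.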

  19. Self-efficacy is associated with increased food security in novel food pantry program.

    PubMed

    Martin, Katie S; Colantonio, Angela G; Picho, Katherine; Boyle, Katie E

    2016-12-01

    We examined the effect of a novel food pantry intervention (Freshplace) that includes client choice and motivational interviewing on self-efficacy and food security in food pantry clients. The study was designed as a randomized controlled trial. Participants were recruited over one year from traditional food pantries in Hartford, CT. Participants were randomized to Freshplace or traditional food pantries (controls), and data collection occurred at baseline with quarterly follow-ups for 18 months. Food security was measured using the USDA 18-item Food Security Module. A newly developed scale was utilized to measure self-efficacy. Scale reliability was measured using a Cronbach alpha test; validity was measured by correlating with a related variable. Analyses included chi-square tests for bivariate analyses and hierarchical linear modeling for longitudinal analyses. A total of 227 adults were randomized to the Freshplace intervention (n = 112) or control group (n = 115). The overall group was 60% female and 73% Black, with a mean age of 51. The new self-efficacy scale showed good reliability and validity. Self-efficacy was significantly inversely associated with very low food security (p < .05). Being in the Freshplace intervention (p = .01) and higher self-efficacy (p = .04) were independently associated with decreased very low food security. The traditional food pantry model fails to recognize the influence of self-efficacy on a person's food security. A food pantry model with client choice, motivational interviewing and targeted referral services can increase the self-efficacy of clients. Prioritizing the self-efficacy of clients over the efficiency of pantry operations is required to increase food security among disadvantaged populations.

  20. The futility of utility: how market dynamics marginalize Adam Smith

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2000-10-01

    Economic theorizing is based on the postulated, nonempiric notion of utility. Economists assume that prices, dynamics, and market equilibria can be derived from utility, and that the results represent mathematically the stabilizing action of Adam Smith's invisible hand. In deterministic excess demand dynamics I show the following. A utility function generally does not exist mathematically, due to nonintegrable dynamics, when production/investment are accounted for, resolving Mirowski's thesis. Price as a function of demand does not exist mathematically either. All equilibria are unstable. I then explain how deterministic chaos can be distinguished from random noise at short times. In the generalization to liquid markets and finance theory described by stochastic excess demand dynamics, I also show the following. Market price distributions cannot be rescaled to describe price movements as ‘equilibrium’ fluctuations about a systematic drift in price. Utility maximization does not describe equilibrium. Maximization of the Gibbs entropy of the observed price distribution of an asset would describe equilibrium, if equilibrium could be achieved, but equilibrium does not describe real, liquid markets (stocks, bonds, foreign exchange). There are three inconsistent definitions of equilibrium used in economics and finance, only one of which is correct. Prices in unregulated free markets are unstable against both noise and rising or falling expectations: Adam Smith's stabilizing invisible hand does not exist, either in mathematical models of liquid market data, or in real market data.
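    The Gibbs entropy invoked above is, for a discrete distribution over observed prices, H = -Σ p_i ln p_i, and it is maximized by the uniform distribution; this is one way to see why entropy maximization characterizes a hypothetical equilibrium rather than the skewed distributions of real markets. A small illustration (the distributions are made up, not market data):

```python
import math

def gibbs_entropy(p):
    """Gibbs/Shannon entropy H = -sum p_i ln p_i of a discrete distribution."""
    assert abs(sum(p) - 1.0) < 1e-9  # must be a probability distribution
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25] * 4                  # maximum-entropy "equilibrium" case
peaked = [0.85, 0.05, 0.05, 0.05]     # concentrated, far-from-uniform case
assert gibbs_entropy(uniform) > gibbs_entropy(peaked)
assert abs(gibbs_entropy(uniform) - math.log(4)) < 1e-12  # H_max = ln(N)
```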

  1. Increasing the realism of projected tree species ranges by incorporating migration potential: an eastern US case study

    NASA Astrophysics Data System (ADS)

    Rogers, B. M.; Jantz, P.; Goetz, S. J.

    2015-12-01

    Models of vegetation distributions are used for a wide variety of purposes, from global assessments of biome shifts and biogeochemical feedbacks to local management planning. Dynamic vegetation models, mostly mechanistic in origin, are valuable for regional to global studies but remain limited for more local-scale applications, especially those that require species-specific responses to climate change. Species distribution models (SDMs) are broadly used for such applications, but these too have several outstanding limitations, one of the most prominent being a lack of dispersal and migration. Several hybrid models have recently been developed, but these generally require detailed parameterization of species-level attributes that may not be known. Here we present an approach to couple migration potential with SDM output for a large number of species in order to more realistically project future range shifts. We focus on 40 tree species in the eastern US of potential management concern, either because of their canopy dominance, ecosystem functions, or potential for utilizing future climates. Future climates were taken from a CMIP5 model ensemble average using RCP 4.5 and 8.5 scenarios. We used Random Forests to characterize current and future environmental suitability, and modeled migration as a negative exponential kernel that is affected by forest fragmentation and the density of current seed sources. We present results in a vulnerability framework relevant for a number of ongoing management activities in the region. We find an overarching pattern of northward and eastward range shifts, with high-elevation and northern species being the most adversely impacted. Because of limitations to migration, many newly suitable areas could not be utilized without active intervention. Only a few areas exhibited consistently favorable conditions that could be utilized by the relevant species, including the central Appalachian foothills and the Florida panhandle. 
We suggest that a continued effort to include migration potential into vegetation models can lead to more realistic results and management-relevant products.
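    The migration component described above down-weights colonization with distance from current seed sources, with forest fragmentation further reducing effective dispersal. A minimal sketch of a negative exponential kernel (the scale parameter and the way fragmentation enters are assumptions for illustration, not the authors' calibrated formulation):

```python
import math

def dispersal_weight(distance_km, alpha_km=1.0, habitat_frac=1.0):
    """Negative exponential kernel exp(-d/alpha), damped by the fraction of
    intervening habitat that remains unfragmented (both parameters hypothetical)."""
    return math.exp(-distance_km / alpha_km) * habitat_frac

# Colonization weight decays monotonically with distance...
assert dispersal_weight(0.0) == 1.0
assert dispersal_weight(1.0) < dispersal_weight(0.5)
# ...and fragmentation (lower habitat fraction) reduces it further.
assert dispersal_weight(1.0, habitat_frac=0.5) < dispersal_weight(1.0)
```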

  2. Physical Models of Cognition

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1994-01-01

    This paper presents and discusses physical models for simulating some aspects of neural intelligence, and, in particular, the process of cognition. The main departure from the classical approach here is in utilization of a terminal version of classical dynamics introduced by the author earlier. Based upon violations of the Lipschitz condition at equilibrium points, terminal dynamics attains two new fundamental properties: it is spontaneous and nondeterministic. Special attention is focused on terminal neurodynamics as a particular architecture of terminal dynamics which is suitable for modeling of information flows. Terminal neurodynamics possesses a well-organized probabilistic structure which can be analytically predicted, prescribed, and controlled, and which therefore presents a powerful tool for modeling real-life uncertainties. Two basic phenomena associated with random behavior of neurodynamic solutions are exploited. The first one is a stochastic attractor: a stable stationary stochastic process to which random solutions of a closed system converge. As a model of the cognition process, a stochastic attractor can be viewed as a universal tool for generalization and formation of classes of patterns. The concept of stochastic attractor is applied to model a collective brain paradigm explaining coordination between simple units of intelligence which perform a collective task without direct exchange of information. The second fundamental phenomenon discussed is terminal chaos, which occurs in open systems. Applications of terminal chaos to information fusion as well as to explanation and modeling of coordination among neurons in biological systems are discussed. It should be emphasized that all the models of terminal neurodynamics are implementable in analog devices, which means that all the cognition processes discussed in the paper are reducible to the laws of Newtonian mechanics.

  3. The Influence of Consumer Goals and Marketing Activities on Product Bundling

    NASA Astrophysics Data System (ADS)

    Haijun, Wang

    Upon entering a store, consumers are faced with the questions of whether to buy, what to buy, and how much to buy, and they include products from different categories in their decision process. Product categories can be related in different ways; product bundling is a process that involves the choice of at least two non-substitutable items. This research focuses on consumers' explicit product bundling activity at the point of sale. We take the retailer's perspective and therefore leave out consumers' brand choice decisions, concentrating on purchase incidence and quantity. Building on current models in the existing research, we integrate behavioural choice analysis and predictive choice modelling through the underlying behavioural models, known as random utility maximization (RUM) models. The methodological contribution of this research lies in combining a nested logit choice model with a latent variable factor model. We close by pointing out several limitations for both theory and practice.
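    Under the random utility maximization (RUM) framework mentioned above, each alternative i carries utility V_i + e_i; when the error terms e_i are i.i.d. Gumbel, the choice probabilities take the closed multinomial-logit form P_i = exp(V_i) / sum_j exp(V_j), which is also the building block of the nested logit. A minimal sketch (the utility values are made up):

```python
import math

def mnl_probabilities(V):
    """Multinomial logit choice probabilities under i.i.d. Gumbel errors:
    P_i = exp(V_i) / sum_j exp(V_j), computed with a max-shift for stability."""
    m = max(V)
    expV = [math.exp(v - m) for v in V]
    Z = sum(expV)
    return [e / Z for e in expV]

# Three hypothetical bundles with systematic utilities 1.0, 0.5, 0.0:
probs = mnl_probabilities([1.0, 0.5, 0.0])
assert abs(sum(probs) - 1.0) < 1e-12   # probabilities sum to one
assert probs[0] > probs[1] > probs[2]  # higher utility, higher choice share
```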

  4. Bayesian Approach for Flexible Modeling of Semicompeting Risks Data

    PubMed Central

    Han, Baoguang; Yu, Menggang; Dignam, James J.; Rathouz, Paul J.

    2016-01-01

    Summary Semicompeting risks data arise when two types of events, non-terminal and terminal, are observed. When the terminal event occurs first, it censors the non-terminal event, but not vice versa. To account for possible dependent censoring of the non-terminal event by the terminal event and to improve prediction of the terminal event using the non-terminal event information, it is crucial to model their association properly. Motivated by a breast cancer clinical trial data analysis, we extend the well-known illness-death models to allow flexible random effects to capture heterogeneous association structures in the data. Our extension also represents a generalization of the popular shared frailty models that usually assume that the non-terminal event does not affect the hazards of the terminal event beyond a frailty term. We propose a unified Bayesian modeling approach that can utilize existing software packages for both model fitting and individual specific event prediction. The approach is demonstrated via both simulation studies and a breast cancer data set analysis. PMID:25274445

  5. Constraining Thermal Histories by Monte Carlo Simulation of Mg-Fe Isotopic Profiles in Olivine

    NASA Astrophysics Data System (ADS)

    Sio, C. K. I.; Dauphas, N.

    2016-12-01

    In thermochronology, random time-temperature (t-T) paths are generated and used as inputs to model fission track data. This random search method is used to identify a range of acceptable thermal histories that can describe the data. We have extended this modeling approach to magmatic systems. This approach utilizes both the chemical and stable isotope profiles measured in crystals as model constraints. Specifically, the isotopic profiles are used to determine the relative contribution of crystal growth vs. diffusion in generating chemical profiles, and to detect changes in melt composition. With this information, tighter constraints can be placed on the thermal evolution of magmatic bodies. We use an olivine phenocryst from the Kilauea Iki lava lake, HI, to demonstrate proof of concept. We treat this sample as one with little geologic context, then compare our modeling results to the known thermal history experienced by that sample. To complete forward modeling, we use MELTS to estimate the boundary condition, initial and quench temperatures. We also assume a simple relationship between crystal growth and cooling rate. Another important parameter is the isotopic effect for diffusion (i.e., the relative diffusivity of the light vs. heavy isotope of an element). The isotopic effects for Mg and Fe diffusion in olivine have been estimated based on natural samples; experiments to better constrain these parameters are underway. We find that 40% of the random t-T paths can be used to fit the Mg-Fe chemical profiles. However, only a few can be used to simultaneously fit the Mg-Fe isotopic profiles. These few t-T paths are close to the independently determined t-T history of the sample. This modeling approach can be further extended to other igneous and metamorphic systems where data exist for diffusion rates, crystal growth rates, and isotopic effects for diffusion.

  6. A machine learning method to estimate PM2.5 concentrations across China with remote sensing, meteorological and land use information.

    PubMed

    Chen, Gongbo; Li, Shanshan; Knibbs, Luke D; Hamm, N A S; Cao, Wei; Li, Tiantian; Guo, Jianping; Ren, Hongyan; Abramson, Michael J; Guo, Yuming

    2018-09-15

    Machine learning algorithms have very high predictive ability. However, no study has used machine learning to estimate historical concentrations of PM2.5 (particulate matter with aerodynamic diameter ≤ 2.5 μm) at a daily time scale in China at the national level. The objective was to estimate daily concentrations of PM2.5 across China during 2005-2016. Daily ground-level PM2.5 data were obtained from 1479 stations across China during 2014-2016. Data on aerosol optical depth (AOD), meteorological conditions and other predictors were downloaded. A random forests model (a non-parametric machine learning algorithm) and two traditional regression models were developed to estimate ground-level PM2.5 concentrations. The best-fit model was then utilized to estimate the daily concentrations of PM2.5 across China with a resolution of 0.1° (≈10 km) during 2005-2016. The daily random forests model showed much higher predictive accuracy than the two traditional regression models, explaining the majority of spatial variability in daily PM2.5 [10-fold cross-validation (CV) R2 = 83%, root mean squared prediction error (RMSE) = 28.1 μg/m3]. At the monthly and annual time scales, the explained variability of average PM2.5 increased up to 86% (RMSE = 10.7 μg/m3 and 6.9 μg/m3, respectively). Taking advantage of a novel application of the modeling framework and the most recent ground-level PM2.5 observations, the machine learning method showed higher predictive ability than previous studies. The random forests approach can be used to estimate historical exposure to PM2.5 in China with high accuracy. Copyright © 2018 Elsevier B.V. All rights reserved.
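    The cross-validation statistics reported above can be reproduced from paired observed and predicted values as R2 = 1 - SS_res/SS_tot and RMSE = sqrt(SS_res/n). A self-contained sketch (the numbers are hypothetical, not the study's data):

```python
import math

def r2_rmse(obs, pred):
    """Return (R^2, RMSE) for paired observed/predicted values."""
    n = len(obs)
    mean_obs = sum(obs) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))  # residual sum of squares
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)         # total sum of squares
    return 1.0 - ss_res / ss_tot, math.sqrt(ss_res / n)

obs = [20.0, 35.0, 50.0, 80.0]   # hypothetical observed PM2.5 (ug/m3)
pred = [22.0, 33.0, 52.0, 78.0]  # hypothetical model predictions
r2, rmse = r2_rmse(obs, pred)
assert 0.0 < r2 <= 1.0
assert rmse == 2.0  # every prediction here is off by exactly 2 ug/m3
```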

  7. A randomized trial to determine the impact on compliance of a psychophysical peripheral cue based on the Elaboration Likelihood Model.

    PubMed

    Horton, Rachael Jane; Minniti, Antoinette; Mireylees, Stewart; McEntegart, Damian

    2008-11-01

    Non-compliance in clinical studies is a significant issue, but causes remain unclear. Utilizing the Elaboration Likelihood Model of persuasion, this study assessed the psychophysical peripheral cue 'Interactive Voice Response System (IVRS) call frequency' on compliance. 71 participants were randomized to once daily (OD), twice daily (BID) or three times daily (TID) call schedules over two weeks. Participants completed 30-item cognitive function tests at each call. Compliance was defined as proportion of expected calls within a narrow window (+/- 30 min around scheduled time), and within a relaxed window (-30 min to +4 h). Data were analyzed by ANOVA and pairwise comparisons adjusted by the Bonferroni correction. There was a relationship between call frequency and compliance. Bonferroni adjusted pairwise comparisons showed significantly higher compliance (p=0.03) for the BID (51.0%) than TID (30.3%) for the narrow window; for the extended window, compliance was higher (p=0.04) with OD (59.5%), than TID (38.4%). The IVRS psychophysical peripheral cue call frequency supported the ELM as a route to persuasion. The results also support OD strategy for optimal compliance. Models suggest specific indicators to enhance compliance with medication dosing and electronic patient diaries to improve health outcomes and data integrity respectively.
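    The Bonferroni correction applied to the pairwise comparisons above controls the family-wise error rate by testing each of m comparisons at level α/m. A minimal sketch (the p-values are made up, not the study's):

```python
def bonferroni(p_values, alpha=0.05):
    """Return (adjusted alpha, significance flags) for m pairwise comparisons:
    each raw p-value is compared against alpha / m."""
    m = len(p_values)
    adjusted_alpha = alpha / m
    return adjusted_alpha, [p <= adjusted_alpha for p in p_values]

# Three hypothetical pairwise comparisons (OD vs BID, OD vs TID, BID vs TID):
adj, sig = bonferroni([0.010, 0.030, 0.200])
assert abs(adj - 0.05 / 3) < 1e-15
assert sig == [True, False, False]  # only p = 0.010 survives alpha/3
```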

  8. SEQUOIA: significance enhanced network querying through context-sensitive random walk and minimization of network conductance.

    PubMed

    Jeong, Hyundoo; Yoon, Byung-Jun

    2017-03-14

    Network querying algorithms provide computational means to identify conserved network modules in large-scale biological networks that are similar to known functional modules, such as pathways or molecular complexes. Two main challenges for network querying algorithms are the high computational complexity of detecting potential isomorphism between the query and the target graphs and ensuring the biological significance of the query results. In this paper, we propose SEQUOIA, a novel network querying algorithm that effectively addresses these issues by utilizing a context-sensitive random walk (CSRW) model for network comparison and minimizing the network conductance of potential matches in the target network. The CSRW model, inspired by the pair hidden Markov model (pair-HMM) that has been widely used for sequence comparison and alignment, can accurately assess the node-to-node correspondence between different graphs by accounting for node insertions and deletions. The proposed algorithm identifies high-scoring network regions based on the CSRW scores, which are subsequently extended by maximally reducing the network conductance of the identified subnetworks. Performance assessment based on real PPI networks and known molecular complexes shows that SEQUOIA outperforms existing methods and clearly enhances the biological significance of the query results. The source code and datasets can be downloaded from http://www.ece.tamu.edu/~bjyoon/SEQUOIA.
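    The CSRW itself is pair-HMM-based, but the general idea of scoring nodes by random-walk visitation can be illustrated with a plain random walk with restart, whose stationary scores concentrate probability on well-connected nodes. This is a much-simplified stand-in for illustration, not SEQUOIA's actual model:

```python
def random_walk_with_restart(adj, restart=0.3, iters=200):
    """Power-iterate p <- (1 - c) * W p + c * r on the column-stochastic
    transition built from an adjacency dict; returns scores summing to 1."""
    nodes = sorted(adj)
    n = len(nodes)
    p = {v: 1.0 / n for v in nodes}  # uniform start
    r = {v: 1.0 / n for v in nodes}  # uniform restart vector
    for _ in range(iters):
        nxt = {v: restart * r[v] for v in nodes}
        for u in nodes:
            deg = len(adj[u])
            for v in adj[u]:  # spread u's mass evenly over its neighbors
                nxt[v] += (1 - restart) * p[u] / deg
        p = nxt
    return p

# Tiny undirected triangle plus a pendant node:
adj = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"], "d": ["c"]}
scores = random_walk_with_restart(adj)
assert abs(sum(scores.values()) - 1.0) < 1e-9
assert scores["c"] > scores["d"]  # the hub accumulates more probability mass
```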

  9. Random Forests for Global and Regional Crop Yield Predictions.

    PubMed

    Jeong, Jig Han; Resop, Jonathan P; Mueller, Nathaniel D; Fleisher, David H; Yun, Kyungdahm; Butler, Ethan E; Timlin, Dennis J; Shim, Kyo-Moon; Gerber, James S; Reddy, Vangimalla R; Kim, Soo-Hyung

    2016-01-01

    Accurate predictions of crop yield are critical for developing effective agricultural and food policies at the regional and global scales. We evaluated a machine-learning method, Random Forests (RF), for its ability to predict crop yield responses to climate and biophysical variables at global and regional scales in wheat, maize, and potato, in comparison with multiple linear regressions (MLR) serving as a benchmark. We used crop yield data from various sources and regions for model training and testing: 1) gridded global wheat grain yield, 2) maize grain yield from US counties over thirty years, and 3) potato tuber and maize silage yield from the northeastern seaboard region. RF was found highly capable of predicting crop yields and outperformed MLR benchmarks in all performance statistics that were compared. For example, the root mean square errors (RMSE) ranged between 6 and 14% of the average observed yield with RF models in all test cases, whereas these values ranged from 14% to 49% for MLR models. Our results show that RF is an effective and versatile machine-learning method for crop yield predictions at regional and global scales owing to its high accuracy and precision, ease of use, and utility in data analysis. RF may, however, lose accuracy when predicting the extreme ends of, or responses beyond the boundaries of, the training data.

  10. Quantitative prediction of oral cancer risk in patients with oral leukoplakia.

    PubMed

    Liu, Yao; Li, Yicheng; Fu, Yue; Liu, Tong; Liu, Xiaoyong; Zhang, Xinyan; Fu, Jie; Guan, Xiaobing; Chen, Tong; Chen, Xiaoxin; Sun, Zheng

    2017-07-11

    Exfoliative cytology has been widely used for early diagnosis of oral squamous cell carcinoma. We have developed an oral cancer risk index using the DNA index value to quantitatively assess cancer risk in patients with oral leukoplakia, but with limited success. In order to improve the performance of the risk index, we collected exfoliative cytology, histopathology, and clinical follow-up data from two independent cohorts of normal, leukoplakia and cancer subjects (training set and validation set). Peaks were defined on the basis of first derivatives with positive values, and modern machine learning techniques were utilized to build statistical prediction models on the reconstructed data. Random forest was found to be the best model, with high sensitivity (100%) and specificity (99.2%). Using the Peaks-Random Forest model, we constructed an index (OCRI2) as a quantitative measurement of cancer risk. Among 11 leukoplakia patients with an OCRI2 over 0.5, 4 (36.4%) developed cancer during follow-up (23 ± 20 months), whereas 3 (5.3%) of 57 leukoplakia patients with an OCRI2 less than 0.5 developed cancer (32 ± 31 months). OCRI2 is better than other methods in predicting oral squamous cell carcinoma during follow-up. In conclusion, we have developed an exfoliative cytology-based method for quantitative prediction of cancer risk in patients with oral leukoplakia.

  11. Modeling cooperating micro-organisms in antibiotic environment.

    PubMed

    Book, Gilad; Ingham, Colin; Ariel, Gil

    2017-01-01

    Recent experiments with the bacteria Paenibacillus vortex reveal a remarkable strategy enabling it to cope with antibiotics by cooperating with a different bacterium-Escherichia coli. While P. vortex is a highly effective swarmer, it is sensitive to the antibiotic ampicillin. On the other hand, E. coli can degrade ampicillin but is non-motile when grown on high agar percentages. The two bacterial species form a shared colony in which E. coli is transported by P. vortex and E. coli detoxifies the ampicillin. The paper presents a simplified model, consisting of coupled reaction-diffusion equations, describing the development of ring patterns in the shared colony. Our results demonstrate some of the possible cooperative movement strategies bacteria utilize in order to survive harsh conditions. In addition, we explore the behavior of mixed colonies under new conditions such as antibiotic gradients, synchronization between colonies and possible dynamics of a 3-species system including P. vortex, E. coli and a carbon producing algae that provides nutrients under illuminated, nutrient poor conditions. The derived model was able to simulate an asymmetric relationship between two or three micro-organisms where cooperation is required for survival. Computationally, in order to avoid numerical artifacts due to symmetries within the discretizing grid, the model was solved using a second order Vectorizable Random Lattices method, which is developed as a finite volume scheme on a random grid.

  12. Safety performance of traffic phases and phase transitions in three phase traffic theory.

    PubMed

    Xu, Chengcheng; Liu, Pan; Wang, Wei; Li, Zhibin

    2015-12-01

    Crash risk prediction models were developed to link safety to various phases and phase transitions defined by the three phase traffic theory. Results of the Bayesian conditional logit analysis showed that different traffic states differed distinctly with respect to safety performance. The random-parameter logit approach was utilized to account for the heterogeneity caused by unobserved factors. The Bayesian inference approach based on the Markov Chain Monte Carlo (MCMC) method was used for the estimation of the random-parameter logit model. The proposed approach increased the prediction performance of the crash risk models as compared with the conventional logit model. The three phase traffic theory can help us better understand the mechanism of crash occurrences in various traffic states. The contributing factors to crash likelihood can be well explained by the mechanism of phase transitions. We further discovered that the free flow state can be divided into two sub-phases on the basis of safety performance, including a true free flow state in which the interactions between vehicles are minor, and a platooned traffic state in which bunched vehicles travel in successions. The results of this study suggest that a safety perspective can be added to the three phase traffic theory. The results also suggest that the heterogeneity between different traffic states should be considered when estimating the risks of crash occurrences on freeways. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Modeling cooperating micro-organisms in antibiotic environment

    PubMed Central

    Book, Gilad; Ingham, Colin; Ariel, Gil

    2017-01-01

    Recent experiments with the bacteria Paenibacillus vortex reveal a remarkable strategy enabling it to cope with antibiotics by cooperating with a different bacterium—Escherichia coli. While P. vortex is a highly effective swarmer, it is sensitive to the antibiotic ampicillin. On the other hand, E. coli can degrade ampicillin but is non-motile when grown on high agar percentages. The two bacterial species form a shared colony in which E. coli is transported by P. vortex and E. coli detoxifies the ampicillin. The paper presents a simplified model, consisting of coupled reaction-diffusion equations, describing the development of ring patterns in the shared colony. Our results demonstrate some of the possible cooperative movement strategies bacteria utilize in order to survive harsh conditions. In addition, we explore the behavior of mixed colonies under new conditions such as antibiotic gradients, synchronization between colonies and possible dynamics of a 3-species system including P. vortex, E. coli and a carbon producing algae that provides nutrients under illuminated, nutrient poor conditions. The derived model was able to simulate an asymmetric relationship between two or three micro-organisms where cooperation is required for survival. Computationally, in order to avoid numerical artifacts due to symmetries within the discretizing grid, the model was solved using a second order Vectorizable Random Lattices method, which is developed as a finite volume scheme on a random grid. PMID:29284016

  14. Genetic Parameters for Milk Yield and Lactation Persistency Using Random Regression Models in Girolando Cattle

    PubMed Central

    Canaza-Cayo, Ali William; Lopes, Paulo Sávio; da Silva, Marcos Vinicius Gualberto Barbosa; de Almeida Torres, Robledo; Martins, Marta Fonseca; Arbex, Wagner Antonio; Cobuci, Jaime Araujo

    2015-01-01

A total of 32,817 test-day milk yield (TDMY) records of the first lactation of 4,056 Girolando cows, daughters of 276 sires, collected from 118 herds between 2000 and 2011, were utilized to estimate the genetic parameters for TDMY via random regression models (RRM) using Legendre polynomial functions whose orders varied from 3 to 5. In addition, nine measures of persistency in milk yield (PSi) and the genetic trend of 305-day milk yield (305MY) were evaluated. The fit quality criteria indicated that the RRM employing Legendre polynomials of orders 3 and 5 to fit the additive genetic and permanent environment effects, respectively, was the best model. The heritability and genetic correlation for TDMY throughout the lactation, obtained with the best model, varied from 0.18 to 0.23 and from −0.03 to 1.00, respectively. The heritability and genetic correlation for persistency and 305MY varied from 0.10 to 0.33 and from −0.98 to 1.00, respectively. The use of PS7 would be the most suitable option for the evaluation of Girolando cattle. The estimated breeding values for 305MY of sires and cows showed significant and positive genetic trends. Thus, the use of selection indices would be indicated in the genetic evaluation of Girolando cattle for both traits. PMID:26323397
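The random-regression machinery above rests on evaluating Legendre polynomials of standardized days in milk (DIM). A minimal sketch of the design-matrix construction, assuming a 5-305 day lactation window and treating "order" as the number of polynomial coefficients:

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_design(dim, order, dim_min=5, dim_max=305):
    """Design matrix whose j-th column is the j-th Legendre polynomial of standardized DIM."""
    t = 2.0 * (np.asarray(dim, float) - dim_min) / (dim_max - dim_min) - 1.0  # map DIM to [-1, 1]
    # np.eye(order)[j] is the coefficient vector selecting the j-th basis polynomial
    return np.column_stack([legendre.legval(t, np.eye(order)[j]) for j in range(order)])

dims = [5, 155, 305]                       # three hypothetical test days
Z_add = legendre_design(dims, order=3)     # order 3: additive genetic effect
Z_pe = legendre_design(dims, order=5)      # order 5: permanent environment effect
```

In a fitted RRM, each animal's random-effect coefficients multiply these columns to give its deviation curve across the lactation.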

  15. A Random Forest Based Risk Model for Reliable and Accurate Prediction of Receipt of Transfusion in Patients Undergoing Percutaneous Coronary Intervention

    PubMed Central

    Gurm, Hitinder S.; Kooiman, Judith; LaLonde, Thomas; Grines, Cindy; Share, David; Seth, Milan

    2014-01-01

Background Transfusion is a common complication of percutaneous coronary intervention (PCI) and is associated with adverse short- and long-term outcomes. There is no risk model for identifying patients most likely to receive transfusion after PCI. The objective of our study was to develop and validate a tool for predicting receipt of blood transfusion in patients undergoing contemporary PCI. Methods Random forest models were developed utilizing 45 pre-procedural clinical and laboratory variables to estimate the receipt of transfusion in patients undergoing PCI. The most influential variables were selected for inclusion in an abbreviated model. Model performance estimating transfusion was evaluated in an independent validation dataset using area under the ROC curve (AUC), with net reclassification improvement (NRI) used to compare full and reduced model prediction after grouping into low-, intermediate-, and high-risk categories. The impact of procedural anticoagulation on observed versus predicted transfusion rates was assessed for the different risk categories. Results Our study cohort comprised 103,294 PCI procedures performed at 46 hospitals in Michigan between July 2009 and December 2012, of which 72,328 (70%) were randomly selected for training the models and 30,966 (30%) for validation. The models demonstrated excellent calibration and discrimination (AUC: full model = 0.888 (95% CI 0.877–0.899), reduced model AUC = 0.880 (95% CI, 0.868–0.892), p for difference 0.003, NRI = 2.77%, p = 0.007). Procedural anticoagulation and radial access significantly influenced transfusion rates in the intermediate- and high-risk patients, but no clinically relevant impact was noted in low-risk patients, who made up 70% of the total cohort. Conclusions The risk of transfusion among patients undergoing PCI can be reliably calculated using a novel, easy-to-use computational tool (https://bmc2.org/calculators/transfusion). This risk prediction algorithm may prove useful for both bedside clinical decision making and risk adjustment for assessment of quality. PMID:24816645
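The full-versus-reduced model comparison above relies on categorical net reclassification improvement. A from-scratch sketch, with hypothetical risk-category cut-offs and toy predictions (not the study's data or categories):

```python
def categorize(p, cuts=(0.05, 0.20)):
    """Map a predicted risk to a category: 0 = low, 1 = intermediate, 2 = high (cut-offs assumed)."""
    return sum(p >= c for c in cuts)

def nri(p_old, p_new, events):
    """Categorical net reclassification improvement of p_new over p_old."""
    up = [categorize(n) > categorize(o) for o, n in zip(p_old, p_new)]
    down = [categorize(n) < categorize(o) for o, n in zip(p_old, p_new)]
    ev = [i for i, e in enumerate(events) if e]        # indices of events
    ne = [i for i, e in enumerate(events) if not e]    # indices of non-events
    nri_ev = (sum(up[i] for i in ev) - sum(down[i] for i in ev)) / len(ev)
    nri_ne = (sum(down[i] for i in ne) - sum(up[i] for i in ne)) / len(ne)
    return nri_ev + nri_ne

# Toy predictions: the "full" model moves one true event up a category.
p_reduced = [0.02, 0.10, 0.30, 0.04]
p_full = [0.02, 0.25, 0.30, 0.01]
events = [0, 1, 1, 0]
improvement = nri(p_reduced, p_full, events)
```

A positive NRI means the new model moves events up and non-events down in net terms, which is what the abstract's 2.77% figure summarizes at scale.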

  16. Rural-urban difference in the use of annual physical examination among seniors in Shandong, China: a cross-sectional study.

    PubMed

    Ge, Dandan; Chu, Jie; Zhou, Chengchao; Qian, Yangyang; Zhang, Li; Sun, Long

    2017-05-23

Regular physical examination contributes to early detection and timely treatment, which is helpful in promoting healthy behaviors and preventing diseases. The objective of this study is to compare annual physical examination (APE) use between rural and urban elderly in China. A total of 3,922 participants (60+) were randomly selected from three urban districts and three rural counties in Shandong Province, China, and were interviewed using a standardized questionnaire. We performed unadjusted and adjusted logistic regression models to examine the difference in the utilization of APE between rural and urban elderly. Two adjusted logistic regression models were employed to identify the factors associated with APE use in rural and urban seniors, respectively. The utilization rates of APE in rural and urban elderly are 37.4% and 76.2%, respectively. Factors including education level, exercise, watching TV, and number of non-communicable chronic conditions are associated with APE use in both rural and urban elderly. Hospitalization, self-reported economic status, and health insurance are found to be significant (p < 0.05) predictors of APE use in rural elderly. Elderly covered by Urban Resident Basic Medical Insurance (URBMI) (p < 0.05, OR = 1.874) are more likely to use APE in urban areas. There is a big difference in APE utilization between rural and urban elderly. Interventions targeting identified at-risk subgroups, especially rural elderly, are essential to reduce this gap. Improving health literacy might also help increase the utilization rate of APE among the elderly.
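Odds ratios such as the reported OR = 1.874 come from logistic regression; for a single binary exposure the unadjusted version reduces to a 2×2 table. A sketch with invented counts (not the survey's data):

```python
import math

def odds_ratio(a, b, c, d):
    """a: exposed users, b: exposed non-users, c: unexposed users, d: unexposed non-users."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """Odds ratio with a 95% Wald confidence interval computed on the log scale."""
    or_ = odds_ratio(a, b, c, d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    return or_, math.exp(math.log(or_) - 1.96 * se), math.exp(math.log(or_) + 1.96 * se)

# Invented counts: URBMI-covered vs. not, APE users vs. non-users.
or_urbmi, lo, hi = or_ci95(120, 80, 90, 110)
```

The adjusted ORs in the abstract differ from this unadjusted version only in that the exposure coefficient is estimated jointly with the other covariates in a logistic model.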

  17. Study of Dynamic Characteristics of Aeroelastic Systems Utilizing Randomdec Signatures

    NASA Technical Reports Server (NTRS)

    Chang, C. S.

    1975-01-01

The feasibility of utilizing the random decrement method in conjunction with a signature analysis procedure to determine the dynamic characteristics of an aeroelastic system, for the purpose of on-line prediction of the potential onset of flutter, was examined. Digital computer programs were developed to simulate sampled response signals of a two-mode aeroelastic system. Simulated response data were used to test the random decrement method. A special curve-fit approach was developed for analyzing the resulting signatures. A number of numerical 'experiments' were conducted on the combined processes. The method is capable of determining frequency and damping values accurately from randomdec signatures of carefully selected lengths.
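The random decrement idea is simple enough to sketch: average all signal segments that start at the same trigger-level crossing, so the random forcing averages out and a free-decay-like signature remains. The AR(2) test signal below is an illustrative stand-in for a noise-driven aeroelastic mode, not the report's two-mode simulation:

```python
import numpy as np

def randomdec(x, trigger, seg_len):
    """Average all segments of x that begin at an upward crossing of the trigger level."""
    starts = [i for i in range(1, len(x) - seg_len)
              if x[i - 1] < trigger <= x[i]]          # upward level crossings
    if not starts:
        raise ValueError("no trigger crossings found")
    return np.mean([x[i:i + seg_len] for i in starts], axis=0)

# Illustrative stand-in for a sampled aeroelastic response: a lightly damped,
# resonant AR(2) process driven by white noise (period ~63 samples).
rng = np.random.default_rng(0)
x = np.zeros(8000)
for k in range(2, len(x)):
    x[k] = 1.95 * x[k - 1] - 0.96 * x[k - 2] + rng.normal()
sig = randomdec(x, trigger=float(np.std(x)), seg_len=300)  # the randomdec signature
```

Fitting a decaying sinusoid to `sig` then recovers frequency and damping, which is the role of the report's special curve-fit procedure.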

  18. Free kick instead of cross-validation in maximum-likelihood refinement of macromolecular crystal structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Pražnikar, Jure, University of Primorska; Turk, Dušan, E-mail: dusan.turk@ijs.si

    2014-12-01

The maximum-likelihood free-kick target, which calculates model error estimates from the work set and a randomly displaced model, proved superior in the accuracy and consistency of refinement of crystal structures compared with the maximum-likelihood cross-validation target, which calculates error estimates from the test set and the unperturbed model. The refinement of a molecular model is a computational procedure by which the atomic model is fitted to the diffraction data. The commonly used target in the refinement of macromolecular structures is the maximum-likelihood (ML) function, which relies on the assessment of model errors. The current ML functions rely on cross-validation. They utilize phase-error estimates that are calculated from a small fraction of diffraction data, called the test set, that are not used to fit the model. An approach has been developed that uses the work set to calculate the phase-error estimates in the ML refinement by simulating the model errors via the random displacement of atomic coordinates. It is called ML free-kick refinement as it uses the ML formulation of the target function and is based on the idea of freeing the model from the model bias imposed by the chemical energy restraints used in refinement. This approach for the calculation of error estimates is superior to the cross-validation approach: it reduces the phase error and increases the accuracy of molecular models, is more robust, provides clearer maps, and may use a smaller portion of data for the test set for the calculation of Rfree, or may leave it out completely.

  19. Time evolution of strategic and non-strategic 2-party competitions

    NASA Astrophysics Data System (ADS)

    Shanahan, Linda Lee

The study of the nature of conflict and competition and its many manifestations---military, social, environmental, biological---has enjoyed a long history and garnered the attention of researchers in many disciplines. It will no doubt continue to do so. That the topic is of interest to some in the physics community reflects the critical role physicists have shouldered in furthering knowledge of behavior observed in nature. The techniques, in the case of this research, are rooted in statistical physics and the science of probability. Our tools include the use of cellular automata and random number generators in an agent-based modeling approach. In this work, we first examine a type of "conflict" model where two parties vie for the same resources with no apparent strategy or intelligence, their interactions devolving to random encounters. Analytical results for the time evolution of the model are presented with multiple examples. What at first encounter seems a trivial formulation is found to be a model with rich possibilities for adaptation to far more interesting and potentially relevant scenarios. An example of one such possibility---random events punctuated by correlated non-random ones---is included. We then turn our attention to a different conflict scenario, one in which one party acts with no strategy and in a random manner while the other receives intelligence, makes decisions, and acts with a specific purpose. We develop a set of parameters and examine several examples for insight into the model behavior in different regions of the parameter space, finding both intuitive and non-intuitive results. Of particular interest is the role of the so-called "intelligence" in determining the outcome of a conflict. We consider two applications for which specific conditions are imposed on the parameters.
First, can an invader beginning in a single cell or site and utilizing a search-and-deploy strategy gain territory in an environment defined by constant exposure to random attacks? What magnitude of defense is sufficient to eliminate or contain such growth, and what role does the quantity and quality of available information play? Second, we extend the single-intruder idea to a scenario in which one intruder or a small group of intruders invades or attacks a space that may have significant restrictions (such as walls or other inaccessible spaces). The importance of information and strategy emerges in keeping with intuitive expectations. Additional derivations are provided in the appendix, along with the MATLAB codes for the models. References are relegated to the end of the thesis.

  20. Cost-utility analysis of eprosartan compared to enalapril in primary prevention and nitrendipine in secondary prevention in Europe--the HEALTH model.

    PubMed

    Schwander, Björn; Gradl, Birgit; Zöllner, York; Lindgren, Peter; Diener, Hans-Christoph; Lüders, Stephan; Schrader, Joachim; Villar, Fernando Antoñanzas; Greiner, Wolfgang; Jönsson, Bengt

    2009-09-01

To investigate the cost-utility of eprosartan versus enalapril (primary prevention) and versus nitrendipine (secondary prevention) on the basis of head-to-head evidence from randomized controlled trials. The HEALTH model (Health Economic Assessment of Life with Teveten for Hypertension) is an object-oriented probabilistic Monte Carlo simulation model. It combines a Framingham-based risk calculation with a systolic blood pressure approach to estimate the relative risk reduction of cardiovascular and cerebrovascular events based on recent meta-analyses. In secondary prevention, an additional risk reduction is modeled for eprosartan according to the results of the MOSES study ("Morbidity and Mortality after Stroke--Eprosartan Compared to Nitrendipine for Secondary Prevention"). Costs and utilities were derived from published estimates considering European country-specific health-care payer perspectives. Comparing eprosartan to enalapril in a primary prevention setting, the mean costs per quality-adjusted life year (QALY) gained were highest in Germany (Euro 24,036), followed by Belgium (Euro 17,863), the UK (Euro 16,364), Norway (Euro 13,834), Sweden (Euro 11,691) and Spain (Euro 7918). In a secondary prevention setting (eprosartan vs. nitrendipine), the highest costs per QALY gained were observed in Germany (Euro 9136), followed by the UK (Euro 6008), Norway (Euro 1695), Sweden (Euro 907), Spain (Euro -2054) and Belgium (Euro -5767). Considering a Euro 30,000 willingness-to-pay threshold per QALY gained, eprosartan is cost-effective compared to enalapril in primary prevention (patients ≥50 years old with a systolic blood pressure ≥160 mm Hg) and cost-effective compared to nitrendipine in secondary prevention (all investigated patients).
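The cost-per-QALY figures above are incremental cost-effectiveness ratios judged against a willingness-to-pay threshold. A minimal sketch of that bookkeeping; the helper names are ours, and the one-QALY increment is purely illustrative (the model's actual cost and QALY increments are not reported in the abstract):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

def cost_effective(delta_cost, delta_qaly, threshold=30000.0):
    """Decision rule at a willingness-to-pay threshold (Euro 30,000/QALY in the abstract)."""
    return delta_cost <= threshold * delta_qaly   # negative cost with a QALY gain is dominant

# Illustrative: the German primary-prevention figure, assuming exactly one QALY gained.
germany_primary = icer(24036.0, 1.0)
```

Note how a negative cost per QALY, as reported for Spain and Belgium in secondary prevention, means the intervention both saves money and gains QALYs, so it is trivially cost-effective.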

  1. Patient and Societal Value Functions for the Testing Morbidities Index

    PubMed Central

    Swan, John Shannon; Kong, Chung Yin; Lee, Janie M.; Akinyemi, Omosalewa; Halpern, Elkan F.; Lee, Pablo; Vavinskiy, Sergey; Williams, Olubunmi; Zoltick, Emilie S.; Donelan, Karen

    2013-01-01

    Background We developed preference-based and summated scale scoring for the Testing Morbidities Index (TMI) classification, which addresses short-term effects on quality of life from diagnostic testing before, during and after a testing procedure. Methods The two TMI value functions utilize multiattribute value techniques; one is patient-based and the other has a societal perspective. 206 breast biopsy patients and 466 (societal) subjects informed the models. Due to a lack of standard short-term methods for this application, we utilized the visual analog scale (VAS). Waiting trade-off (WTO) tolls provided an additional option for linear transformation of the TMI. We randomized participants to one of three surveys: the first derived weights for generic testing morbidity attributes and levels of severity with the VAS; a second developed VAS values and WTO tolls for linear transformation of the TMI to a death-healthy scale; the third addressed initial validation in a specific test (breast biopsy). 188 patients and 425 community subjects participated in initial validation, comparing direct VAS and WTO values to the TMI. Alternative TMI scoring as a non-preference summated scale was included, given evidence of construct and content validity. Results The patient model can use an additive function, while the societal model is multiplicative. Direct VAS and the VAS-scaled TMI were correlated across modeling groups (r=0.45 to 0.62) and agreement was comparable to the value function validation of the Health Utilities Index 2. Mean Absolute Difference (MAD) calculations showed a range of 0.07–0.10 in patients and 0.11–0.17 in subjects. MAD for direct WTO tolls compared to the WTO-scaled TMI varied closely around one quality-adjusted life day. Conclusions The TMI shows initial promise in measuring short-term testing-related health states. PMID:23689044

  2. Statistical-learning strategies generate only modestly performing predictive models for urinary symptoms following external beam radiotherapy of the prostate: A comparison of conventional and machine-learning methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yahya, Noorazrul, E-mail: noorazrul.yahya@research.uwa.edu.au; Ebert, Martin A.; Bulsara, Max

Purpose: Given the paucity of available data concerning radiotherapy-induced urinary toxicity, it is important to ensure derivation of the most robust models with superior predictive performance. This work explores multiple statistical-learning strategies for prediction of urinary symptoms following external beam radiotherapy of the prostate. Methods: The performance of logistic regression, elastic-net, support-vector machine, random forest, neural network, and multivariate adaptive regression splines (MARS) to predict urinary symptoms was analyzed using data from 754 participants accrued by TROG03.04-RADAR. Predictive features included dose-surface data, comorbidities, and medication-intake. Four symptoms were analyzed: dysuria, haematuria, incontinence, and frequency, each with three definitions (grade ≥ 1, grade ≥ 2 and longitudinal) with event rates between 2.3% and 76.1%. Repeated cross-validations producing matched models were implemented. A synthetic minority oversampling technique was utilized in endpoints with rare events. Parameter optimization was performed on the training data. Area under the receiver operating characteristic curve (AUROC) was used to compare performance using sample size to detect differences of ≥0.05 at the 95% confidence level. Results: Logistic regression, elastic-net, random forest, MARS, and support-vector machine were the highest-performing statistical-learning strategies in 3, 3, 3, 2, and 1 endpoints, respectively. Logistic regression, MARS, elastic-net, random forest, neural network, and support-vector machine were the best, or were not significantly worse than the best, in 7, 7, 5, 5, 3, and 1 endpoints. The best-performing statistical model was for dysuria grade ≥ 1 with AUROC ± standard deviation of 0.649 ± 0.074 using MARS. For longitudinal frequency and dysuria grade ≥ 1, all strategies produced AUROC > 0.6, while all haematuria endpoints and longitudinal incontinence models produced AUROC < 0.6. Conclusions: Logistic regression and MARS were most likely to be the best-performing strategies for the prediction of urinary symptoms, with elastic-net and random forest producing competitive results. The predictive power of the models was modest and endpoint-dependent. New features, including spatial dose maps, may be necessary to achieve better models.
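AUROC, the comparison metric used throughout, is simply the probability that a randomly chosen event outranks a randomly chosen non-event. A from-scratch rank-based version:

```python
def auroc(scores, labels):
    """Rank-based AUROC: P(score of a random event > score of a random non-event)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)  # ties count half
    return wins / (len(pos) * len(neg))
```

For example, `auroc([0.1, 0.4, 0.35, 0.8], [0, 0, 1, 1])` evaluates to 0.75: three of the four event/non-event pairs are ranked correctly. An AUROC near 0.6, as for the haematuria endpoints, is only modestly better than the chance value of 0.5.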

  3. Providing intensive addiction/housing case management to homeless veterans enrolled in addictions treatment: A randomized controlled trial.

    PubMed

    Malte, Carol A; Cox, Koriann; Saxon, Andrew J

    2017-05-01

This study sought to determine whether homeless veterans entering Veterans Affairs (VA) substance use treatment randomized to intensive addiction/housing case management (AHCM) had improved housing, substance use, mental health, and functional outcomes and lower acute health care utilization, compared to a housing support group (HSG) control. Homeless veterans (n = 181) entering outpatient VA substance use treatment were randomized to AHCM or HSG and received treatment for 12 months. AHCM provided individualized housing, substance use and mental health case management, life skills training, and community outreach. The control condition was a weekly drop-in housing support group. Adjusted longitudinal analyses compared groups on baseline-to-month-12 change in percentage of days housed and functional status, substance use, and mental health outcomes (36-Item Short-Form Health Survey; Addiction Severity Index [ASI]). Negative binomial regression models compared groups on health care utilization. Both conditions significantly increased percentage of days housed, with no differences detected between conditions. In total, 74 (81.3%) AHCM and 64 (71.1%) HSG participants entered long-term housing (odds ratio = 1.9, 95% confidence interval [0.9, 4.0], p = .088). HSG participants experienced a greater decrease in emergency department visits than AHCM (p = .037), whereas AHCM participants remained in substance use treatment 52.7 days longer (p = .005) and had greater study treatment participation (p < .001) than HSG. ASI alcohol composite scores improved more for HSG than AHCM (p = .006), and both conditions improved on ASI drug and psychiatric scores and alcohol/drug abstinence. AHCM did not demonstrate overarching benefits beyond standard VA housing and substance use care. For those veterans not entering or losing long-term housing, different approaches to outreach and ongoing intervention are required. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Fixation using alternative implants for the treatment of hip fractures (FAITH): design and rationale for a multi-centre randomized trial comparing sliding hip screws and cancellous screws on revision surgery rates and quality of life in the treatment of femoral neck fractures.

    PubMed

    2014-06-26

Hip fractures are a common type of fragility fracture that afflict 293,000 Americans (over 5,000 per week) and 35,000 Canadians (over 670 per week) annually. Despite the large population impact, the optimal fixation technique for low-energy femoral neck fractures remains controversial. The primary objective of the FAITH study is to assess the impact of cancellous screw fixation versus sliding hip screws on rates of revision surgery at 24 months in individuals with femoral neck fractures. The secondary objective is to determine the impact on health-related quality of life, functional outcomes, health state utilities, fracture healing, mortality and fracture-related adverse events. FAITH is a multi-centre, multi-national randomized controlled trial utilizing minimization to determine patient allocation. Surgeons in North America, Europe, Australia, and Asia will recruit a total of at least 1,000 patients with low-energy femoral neck fractures. Using central randomization, patients will be allocated to receive surgical treatment with cancellous screws or a sliding hip screw. Patient outcomes will be assessed at one week (baseline), 10 weeks, 6, 12, 18, and 24 months post initial fixation. We will independently adjudicate revision surgery and complications within 24 months of the initial fixation. Outcome analysis will be performed using a Cox proportional hazards model and likelihood ratio test. This study represents a major international effort to definitively resolve the treatment of low-energy femoral neck fractures. This trial will not only change current orthopaedic practice, but will also set a benchmark for the conduct of future orthopaedic trials. The FAITH trial is registered at ClinicalTrials.gov (Identifier NCT00761813).

  5. Cognitive behavioral therapy for insomnia in stable heart failure: Protocol for a randomized controlled trial.

    PubMed

    Redeker, Nancy S; Knies, Andrea K; Hollenbeak, Christopher; Klar Yaggi, H; Cline, John; Andrews, Laura; Jacoby, Daniel; Sullivan, Anna; O'Connell, Meghan; Iennaco, Joanne; Finoia, Lisa; Jeon, Sangchoon

    2017-04-01

    Chronic insomnia is associated with disabling symptoms and decrements in functional performance. It may contribute to the development of heart failure (HF) and incident mortality. In our previous work, cognitive-behavioral therapy for insomnia (CBT-I), compared to HF self-management education, provided as an attention control condition, was feasible, acceptable, and had large effects on insomnia and fatigue among HF patients. The purpose of this randomized controlled trial (RCT) is to evaluate the sustained effects of group CBT-I compared with HF self-management education (attention control) on insomnia severity, sleep characteristics, daytime symptoms, symptom clusters, functional performance, and health care utilization among patients with stable HF. We will estimate the cost-effectiveness of CBT-I and explore the effects of CBT-I on event-free survival (EFS). Two hundred participants will be randomized in clusters to a single center parallel group (CBT-I vs. attention control) RCT. Wrist actigraphy and self-report will elicit insomnia, sleep characteristics, symptoms, and functional performance. We will use the psychomotor vigilance test to evaluate sleep loss effects and the Six Minute Walk Test to evaluate effects on daytime function. Medical record review and interviews will elicit health care utilization and EFS. Statistical methods will include general linear mixed models and latent transition analysis. Stochastic cost-effectiveness analysis with a competing risk approach will be employed to conduct the cost-effectiveness analysis. The results will be generalizable to HF patients with chronic comorbid insomnia and pave the way for future research focused on the dissemination and translation of CBT-I into HF settings. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Effects of preventive online mindfulness interventions on stress and mindfulness: A meta-analysis of randomized controlled trials.

    PubMed

    Jayewardene, Wasantha P; Lohrmann, David K; Erbe, Ryan G; Torabi, Mohammad R

    2017-03-01

Empirical evidence suggests that mind-body interventions can be effectively delivered online. This study aimed to examine whether preventive online mindfulness interventions (POMI) for non-clinical populations improve short- and long-term outcomes for perceived stress (primary) and mindfulness (secondary). A systematic search of four electronic databases, manuscript reference lists, and journal content lists was conducted in 2016, using 21 search terms. Eight randomized controlled trials (RCTs) evaluating effects of POMI in non-clinical populations with adequately reported perceived-stress and mindfulness measures pre- and post-intervention were included. Random-effects models were utilized for all effect-size estimations, with meta-regression performed for mean age and %females. Participants were volunteers (adults; predominantly female) from academic, workplace, or community settings. Most interventions utilized simplified Mindfulness-Based Stress Reduction protocols over 2-12 week periods. Post-intervention, a significant medium effect was found for perceived stress (g = 0.432) with moderate heterogeneity, and a significant but small effect for mindfulness (g = 0.275) with low heterogeneity; effects were highest for middle-aged individuals. At follow-up, a significant large effect was found for perceived stress (g = 0.699) with low heterogeneity and a significant medium effect (g = 0.466) for mindfulness with high heterogeneity. No publication bias was found for perceived stress; publication bias found for mindfulness outcomes led to underestimation of effects, not overestimation. The number of eligible RCTs was low, with inadequate data reporting in some studies. POMI had substantial stress-reduction effects and some mindfulness-improvement effects. POMI can be a more convenient and cost-effective strategy, compared to traditional face-to-face interventions, especially in the context of busy, hard-to-reach, but digitally accessible populations.
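The pooling behind "random-effects models" in meta-analyses of this kind is typically the DerSimonian-Laird estimator: estimate the between-study variance τ² from Cochran's Q, then re-weight each study. A from-scratch sketch with illustrative effect sizes (not the meta-analysis's data):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling: moment estimate of tau^2 from Cochran's Q, then re-weight."""
    w = [1 / v for v in variances]
    ybar = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)      # fixed-effect mean
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, effects))    # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)                   # between-study variance
    ws = [1 / (v + tau2) for v in variances]                        # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(ws, effects)) / sum(ws)
    return pooled, tau2, math.sqrt(1 / sum(ws))

# Illustrative per-trial Hedges' g values and variances (invented for the sketch).
g = [0.43, 0.28, 0.70, 0.47]
v = [0.04, 0.03, 0.09, 0.06]
pooled, tau2, se = dersimonian_laird(g, v)
```

When τ² is large relative to the within-study variances (high heterogeneity, as reported for mindfulness at follow-up), the weights flatten and the pooled estimate moves toward a simple average of the study effects.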

  7. Tests of Hypotheses Arising In the Correlated Random Coefficient Model*

    PubMed Central

    Heckman, James J.; Schmierer, Daniel

    2010-01-01

    This paper examines the correlated random coefficient model. It extends the analysis of Swamy (1971), who pioneered the uncorrelated random coefficient model in economics. We develop the properties of the correlated random coefficient model and derive a new representation of the variance of the instrumental variable estimator for that model. We develop tests of the validity of the correlated random coefficient model against the null hypothesis of the uncorrelated random coefficient model. PMID:21170148

  8. Building the Evidence Base for Decision-making in Cancer Genomic Medicine Using Comparative Effectiveness Research

    PubMed Central

    Goddard, Katrina A.B.; Knaus, William A.; Whitlock, Evelyn; Lyman, Gary H.; Feigelson, Heather Spencer; Schully, Sheri D.; Ramsey, Scott; Tunis, Sean; Freedman, Andrew N.; Khoury, Muin J.; Veenstra, David L.

    2013-01-01

    Background The clinical utility is uncertain for many cancer genomic applications. Comparative effectiveness research (CER) can provide evidence to clarify this uncertainty. Objectives To identify approaches to help stakeholders make evidence-based decisions, and to describe potential challenges and opportunities using CER to produce evidence-based guidance. Methods We identified general CER approaches for genomic applications through literature review, the authors’ experiences, and lessons learned from a recent, seven-site CER initiative in cancer genomic medicine. Case studies illustrate the use of CER approaches. Results Evidence generation and synthesis approaches include comparative observational and randomized trials, patient reported outcomes, decision modeling, and economic analysis. We identified significant challenges to conducting CER in cancer genomics: the rapid pace of innovation, the lack of regulation, the limited evidence for clinical utility, and the beliefs that genomic tests could have personal utility without having clinical utility. Opportunities to capitalize on CER methods in cancer genomics include improvements in the conduct of evidence synthesis, stakeholder engagement, increasing the number of comparative studies, and developing approaches to inform clinical guidelines and research prioritization. Conclusions CER offers a variety of methodological approaches to address stakeholders’ needs. Innovative approaches are needed to ensure an effective translation of genomic discoveries. PMID:22516979

  9. An international survey of the health economics of IVF and ICSI.

    PubMed

Collins, John A.

    2002-01-01

The health economics of IVF and ICSI involve assessments of utilization, cost, cost-effectiveness and ability to pay. In 48 countries, utilization averaged 289 IVF/ICSI cycles per million of population per annum, ranging from two in Kazakhstan to 1657 in Israel. Higher national utilization of IVF/ICSI was associated with higher quality of health services, as indicated by lower infant mortality rates. IVF and ICSI are scientifically demanding and personnel-intensive, and are therefore expensive procedures. The average cost per IVF/ICSI cycle in 2002 would be US$9547 in the USA, and US$3518 in 25 other countries. Price elasticity estimates suggest that a 10% decrease in IVF/ICSI cost would generate a 30% increase in utilization. The average cost-effectiveness ratios in 2002 would be US$58,394 per live birth in the USA, and US$22,048 in other countries. In three randomized controlled trials, incremental costs per additional live birth with IVF compared with conventional therapy were US$-26,586, US$79,472 and US$47,749. The national costs of IVF/ICSI treatment would be US$1.00 per capita in one current model, but the costs to individual couples range from 10% of annual household expenditures in European countries to 25% in Canada and the USA.
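The elasticity claim above (a 10% price cut yielding a 30% utilization increase) implies a price elasticity of roughly -3. A sketch of the arithmetic, contrasting the first-order approximation with a constant-elasticity demand curve (both helpers are ours, for illustration):

```python
def pct_quantity_change_linear(elasticity, pct_price_change):
    """First-order approximation: %dQ ~ elasticity x %dP."""
    return elasticity * pct_price_change

def pct_quantity_change_isoelastic(elasticity, pct_price_change):
    """Exact change under constant-elasticity demand, Q proportional to P**elasticity."""
    return (1 + pct_price_change) ** elasticity - 1
```

With elasticity -3, a 10% price cut gives +30% in the linear approximation and about +37% under constant-elasticity demand, so the abstract's figure matches the first-order reading.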

  10. Probabilistic image modeling with an extended chain graph for human activity recognition and image segmentation.

    PubMed

    Zhang, Lei; Zeng, Zhi; Ji, Qiang

    2011-09-01

A chain graph (CG) is a hybrid probabilistic graphical model (PGM) capable of modeling heterogeneous relationships among random variables. So far, however, its application in image and video analysis has been very limited due to the lack of principled learning and inference methods for a CG of general topology. To overcome this limitation, we introduce methods to extend the conventional chain-like CG model to a CG model with more general topology, along with the associated methods for learning and inference in such a general CG model. Specifically, we propose techniques to systematically construct a generally structured CG, to parameterize this model, to derive its joint probability distribution, to perform joint parameter learning, and to perform probabilistic inference in this model. To demonstrate the utility of such an extended CG, we apply it to two challenging image and video analysis problems: human activity recognition and image segmentation. The experimental results show improved performance of the extended CG model over the conventional directed or undirected PGMs. This study demonstrates the promise of the extended CG for effective modeling and inference of complex real-world problems.

  11. Diagnostic cost groups (DCGs) and concurrent utilization among patients with substance abuse disorders.

    PubMed

    Rosen, Amy K; Loveland, Susan A; Anderson, Jennifer J; Hankin, Cheryl S; Breckenridge, James N; Berlowitz, Dan R

    2002-08-01

    To assess the performance of Diagnostic Cost Groups (DCGs) in explaining variation in concurrent utilization for a defined subgroup, patients with substance abuse (SA) disorders, within the Department of Veterans Affairs (VA). A 60 percent random sample of veterans who used health care services during Fiscal Year (FY) 1997 was obtained from VA administrative databases. Patients with SA disorders (13.3 percent) were identified from primary and secondary ICD-9-CM diagnosis codes. Concurrent risk adjustment models were fitted and tested using the DCG/HCC model. Three outcome measures were defined: (1) "service days" (the sum of a patient's inpatient and outpatient visit days), (2) mental health/substance abuse (MH/SA) service days, and (3) ambulatory provider encounters. To improve model performance, we ran three DCG/HCC models with additional indicators for patients with SA disorders. To create a single file of veterans who used health care services in FY 1997, we merged records from all VA inpatient and outpatient files. Adding indicators for patients with mild/moderate SA disorders did not appreciably improve the R-squares for any of the outcome measures. When indicators were added for patients with severe SA who were in the most costly category, the explanatory ability of the models was modestly improved for all three outcomes. Modifying the DCG/HCC model with additional markers for SA modestly improved homogeneity and model prediction. Because considerable variation still remained after modeling, we conclude that health care systems should evaluate "off-the-shelf" risk adjustment systems before applying them to their own populations.

  13. Estimating the Rate of Occurrence of Renal Stones in Astronauts

    NASA Technical Reports Server (NTRS)

    Myers, J.; Goodenow, D.; Gokoglu, S.; Kassemi, M.

    2016-01-01

    Changes in urine chemistry during and post flight potentially increase the risk of renal stones in astronauts. Although much is known about the effects of space flight on urine chemistry, no in-flight incidence of renal stones in US astronauts has been recorded, and the question "How much does this risk change with space flight?" remains difficult to answer quantitatively. In this discussion, we tackle this question utilizing a combination of deterministic and probabilistic modeling that implements the physics behind free stone growth and agglomeration, speciation of urine chemistry, and published observations of population renal stone incidences to estimate changes in the rate of renal stone presentation. The modeling process utilizes a Population Balance Equation based model developed in the companion IWS abstract by Kassemi et al. (2016) to evaluate the maximum growth and agglomeration potential from a specified set of urine chemistry values. Changes in renal stone occurrence rates are obtained from this model in a probabilistic simulation that interrogates the range of possible urine chemistries using Monte Carlo techniques. Subsequently, each randomly sampled urine chemistry undergoes speciation analysis using the well-established Joint Expert Speciation System (JESS) code to calculate critical values, such as ionic strength and relative supersaturation. The Kassemi model utilizes this information to predict the mean and maximum stone size. We close the assessment loop with a transfer function that estimates the rate of stone formation by combining the relative supersaturation with both the mean and maximum free stone growth sizes. The transfer function is established by a simulation analysis that combines population stone formation rates and Poisson regression; training it requires using the output of the aforementioned assessment steps with inputs from known non-stone-former and known stone-former urine chemistries. Established in a Monte Carlo system, the entire renal stone analysis model produces a probability distribution of the stone formation rate and an expected uncertainty in the estimate. The utility of this analysis will be demonstrated by showing the change in renal stone occurrence predicted by this method using urine chemistry distributions published in Whitson et al. 2009. A comparison of the model predictions to previous assessments of renal stone risk will be used to illustrate initial validation of the model.
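
    Neither the JESS speciation code nor the Kassemi growth model can be reproduced here; the sketch below only mimics the shape of the Monte Carlo pipeline described above, with an invented supersaturation proxy and an invented log-linear (Poisson-style) transfer function standing in for the fitted ones.

```python
import math
import random

random.seed(1)

def sample_urine_chemistry():
    # Hypothetical lognormal draws standing in for urine constituents
    return {"ca": random.lognormvariate(0.0, 0.3),    # promoter (e.g., calcium)
            "ox": random.lognormvariate(0.0, 0.4),    # promoter (e.g., oxalate)
            "cit": random.lognormvariate(0.0, 0.3)}   # inhibitor (e.g., citrate)

def relative_supersaturation(chem):
    # Stand-in for JESS speciation: promoters over an inhibitor
    return chem["ca"] * chem["ox"] / chem["cit"]

def stone_rate(rs):
    # Stand-in for the Poisson-regression transfer function (invented coefficients)
    return math.exp(-4.0 + 1.2 * math.log(rs))        # events per person-year

rates = sorted(stone_rate(relative_supersaturation(sample_urine_chemistry()))
               for _ in range(10_000))
mean = sum(rates) / len(rates)
print(f"mean rate {mean:.4f}/person-year, 95th percentile {rates[9500]:.4f}")
```

The real pipeline replaces each stand-in with a physics- or regression-based component, but the output has the same form: a distribution over stone-formation rates rather than a single point estimate.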

  14. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    PubMed

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the two front-runners and vote for their preferred candidate among these two. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on seven real-world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all seven elections and in the Internet experiment.
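
    The paper's hierarchy of Thurstonian models is far richer than anything shown here, but the three decision rules can be sketched in a toy simulation: each voter draws independent normal utilities around candidate means (all parameters invented), and "approve everything above the midpoint of one's best and worst candidate" is used as a simple stand-in for the sophisticated approval rule.

```python
import random

random.seed(0)
MU = [1.0, 0.8, 0.2]          # mean utilities for 3 candidates (hypothetical)
FRONT_RUNNERS = (0, 1)        # assumed front-runners for the strategic heuristic

def ballots(rule, n_voters=1000):
    out = []
    for _ in range(n_voters):
        u = [random.gauss(m, 1.0) for m in MU]  # Thurstonian random utilities
        if rule == "approval":                  # approve above best/worst midpoint
            cut = (max(u) + min(u)) / 2
            out.append({i for i, ui in enumerate(u) if ui > cut})
        elif rule == "plurality":               # sincere: single favorite
            out.append({max(range(3), key=u.__getitem__)})
        else:                                   # strategic: best of 2 front-runners
            out.append({max(FRONT_RUNNERS, key=u.__getitem__)})
    return out

for rule in ("approval", "plurality", "strategic"):
    tallies = [0, 0, 0]
    for b in ballots(rule):
        for c in b:
            tallies[c] += 1
    print(rule, tallies)
```

With these invented means, the candidate orderings produced by the three rules tend to agree, loosely echoing the paper's finding that different social choice rules yielded identical sincere social orders.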

  15. Orion MPCV Service Module Avionics Ring Pallet Testing, Correlation, and Analysis

    NASA Technical Reports Server (NTRS)

    Staab, Lucas; Akers, James; Suarez, Vicente; Jones, Trevor

    2012-01-01

    The NASA Orion Multi-Purpose Crew Vehicle (MPCV) is being designed to replace the Space Shuttle as the main manned spacecraft for the agency. Based on the predicted environments in the Service Module avionics ring, an isolation system was deemed necessary to protect the avionics packages carried by the spacecraft. Impact, sinusoidal, and random vibration testing was conducted on a prototype Orion Service Module avionics pallet in March 2010 at the NASA Glenn Research Center Structural Dynamics Laboratory (SDL). The pallet design utilized wire rope isolators to reduce the vibration levels seen by the avionics packages. The current pallet design utilizes the same wire rope isolators (M6-120-10) that were tested in March 2010. In an effort to save cost and schedule, the Finite Element Models of the prototype pallet tested in March 2010 were correlated. Frequency Response Function (FRF) comparisons, mode shapes, and frequencies were all part of the correlation process. The non-linear behavior and the modeling of the wire rope isolators proved to be the most difficult parts of the correlation process. The correlated models of the wire rope isolators were taken from the prototype design and integrated into the current design for future frequency response analysis and component environment specification.

  16. Generalized and synthetic regression estimators for randomized branch sampling

    Treesearch

    David L. R. Affleck; Timothy G. Gregoire

    2015-01-01

    In felled-tree studies, ratio and regression estimators are commonly used to convert more readily measured branch characteristics to dry crown mass estimates. In some cases, data from multiple trees are pooled to form these estimates. This research evaluates the utility of both tactics in the estimation of crown biomass following randomized branch sampling (...

  17. Packed bed reactor for photochemical .sup.196 Hg isotope separation

    DOEpatents

    Grossman, Mark W.; Speer, Richard

    1992-01-01

    Straight tubes and randomly oriented pieces of tubing have been employed in a photochemical mercury enrichment reactor and have been found to improve the enrichment factor (E) and utilization (U) compared to a non-packed reactor. One preferred embodiment of this system uses a moving bed (via gravity) for random packing.

  18. Randomized Trial of Contingent Prizes versus Vouchers in Cocaine-Using Methadone Patients

    ERIC Educational Resources Information Center

    Petry, Nancy M.; Alessi, Sheila M.; Hanson, Tressa; Sierra, Sean

    2007-01-01

    Contingency management (CM) interventions frequently utilize vouchers as reinforcers, but a prize-based system is also efficacious. This study compared these approaches. Seventy-four cocaine-dependent methadone outpatients were randomly assigned to standard treatment (ST), ST plus a maximum of $585 in contingent vouchers, or ST plus an expected…

  19. A Randomized Trial of Probation Case Management for Drug-Involved Women Offenders

    ERIC Educational Resources Information Center

    Guydish, Joseph; Chan, Monica; Bostrom, Alan; Jessup, Martha A.; Davis, Thomas B.; Marsh, Cheryl

    2011-01-01

    This article reports findings from a clinical trial of a probation case management (PCM) intervention for drug-involved women offenders. Participants were randomly assigned to PCM (n = 92) or standard probation (n = 91) and followed for 12 months using measures of substance abuse, psychiatric symptoms, social support, and service utilization.…

  20. Randomized Control Trial of a CBT Trauma Recovery Program in Palestinian Schools

    ERIC Educational Resources Information Center

    Barron, Ian G.; Abdallah, Ghassan; Smith, Patrick

    2013-01-01

    The current study aimed to assess the Teaching Recovery Techniques (TRT) trauma recovery program within the context of ongoing violence. Utilizing a randomized controlled trial, 11-14-year-old students in Nablus, Palestine, were allocated by class to intervention or wait-list control conditions. Standardized measures assessed trauma exposure,…

  1. Expected antenna utilization and overload

    NASA Technical Reports Server (NTRS)

    Posner, Edward C.

    1991-01-01

    The trade-offs between the number of antennas at a Deep Space Network (DSN) Deep-Space Communications Complex and the fraction of continuous coverage provided to a set of hypothetical spacecraft are examined, assuming random placement of the spacecraft passes during the day. The trade-offs are fairly robust with respect to the randomness assumption. A sample result is that a three-antenna complex provides an average of 82.6 percent utilization of facilities and coverage of nine spacecraft that each have 8-hour passes, whereas perfect phasing of the passes would yield 100 percent utilization and coverage. One key point is that sometimes fewer than three spacecraft are visible, so an antenna is idle, while at other times there aren't enough antennas, and some spacecraft do without service. This point of view may be useful in helping to size the network or to develop a normalization for a figure of merit of DSN coverage.
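
    The quoted figure can be approximated with a small Monte Carlo, assuming each spacecraft's 8-hour pass starts at an independent uniform random time of day and wraps past midnight. This is only one reading of the randomness assumption, so the result lands near, but not necessarily on, the quoted 82.6 percent.

```python
import random

random.seed(42)

def mean_utilization(n_antennas=3, n_craft=9, pass_hours=8.0, day=24.0,
                     trials=500, steps=288):
    """Average fraction of antenna time in use over random pass phasings."""
    used = 0.0
    for _ in range(trials):
        starts = [random.uniform(0.0, day) for _ in range(n_craft)]
        for k in range(steps):
            t = day * k / steps
            visible = sum((t - s) % day < pass_hours for s in starts)
            used += min(visible, n_antennas)   # excess craft go unserved
    return used / (n_antennas * steps * trials)

print(round(mean_utilization(), 3))
```

Under these assumptions each craft is visible with probability 1/3 at any instant, so the expected value is E[min(Binomial(9, 1/3), 3)]/3 ≈ 0.818, close to the paper's 82.6 percent.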

  2. [Adjustment of the Andersen's model to the Mexican context: access to prenatal care].

    PubMed

    Tamez-González, Silvia; Valle-Arcos, Rosa Irene; Eibenschutz-Hartman, Catalina; Méndez-Ramírez, Ignacio

    2006-01-01

    The aim of this work was to propose an adjustment to Andersen's model that better reflects the social inequality of the population of Mexico City and allows evaluation of the effect of socioeconomic factors on access to prenatal care in a sample stratified by degree of marginalization. The data come from a study of 663 women randomly selected from a sampling frame of 21,421 homes in Mexico City. The study collected information on factors that affect utilization of health services: predisposing factors (age and socioeconomic level), enabling factors (education, social support, entitlement, out-of-pocket payment, and opinion of health services), and need factors. The sample was divided according to exclusion variables into three strata. The data were analyzed using path analysis. The results indicate that socioeconomic level acts as a predisposing variable for utilization of prenatal care services in all three strata, while education and social support were the most important enabling variables for utilization in the same three groups. In the low stratum, the most important enabling variables were education and entitlement; in the high stratum, the principal enabling variables were out-of-pocket payment and social support. The medium stratum showed atypical behavior that was difficult to explain and interpret. The need variable played no mediating role in any of the three models, indicating an absence of equity across the strata. However, the stronger correlations in the high stratum may indicate less inequitable conditions relative to the other strata.

  3. Determination of Energy and Nutrient Utilization of Enzyme-treated Rump Round Meat and Lotus Root Designed for Senior People with Young and Aged Hens as an Animal Model

    PubMed Central

    Kim, Jong Woong; Kil, Dong Yong

    2016-01-01

    This study aimed to examine the nutrient utilization of rump round meat and lotus root using young (32 wk) and aged hens (108 wk) as an animal model. Rump round meat and lotus root were prepared with or without enzymatic treatment. For each age group of laying hens, a total of 24 Hy-Line Brown laying hens were randomly allotted to one of two dietary treatments with six replicates. For rump round meat, the true total tract retention rates (TTTR) of dry matter (DM) and nitrogen (N) were unaffected by either enzymatic treatment or hen age. However, aged hens had greater (p<0.01) TTTR of energy and crude fat than young hens. Enzymatic treatment did not influence the TTTR of energy or crude fat. In addition, we did not observe any significant interaction between the TTTR of DM, energy, N, or crude fat in rump round meat and hen age or enzymatic treatment. The TTTR of DM remained unchanged between controls and enzyme-treated lotus root for young hens. However, enzyme-treated lotus root exhibited greater (p<0.05) TTTR of DM than control lotus root for aged hens, resulting in a significant interaction (p<0.05). The TTTR of energy and N in lotus roots were greater (p<0.01) for aged hens than for young hens. In conclusion, enzymatic treatment exerted beneficial effects on energy and nutrient utilization in aged hens, suggesting the aged hen model is practical for simulation of metabolism of elderly individuals. PMID:27499671

  4. CoMSIA and Docking Study of Rhenium Based Estrogen Receptor Ligand Analogs

    PubMed Central

    Wolohan, Peter; Reichert, David E.

    2007-01-01

    OPLS all-atom force field parameters were developed in order to model a diverse set of novel rhenium based estrogen receptor ligands whose relative binding affinities (RBA) to the estrogen receptor alpha isoform (ERα) with respect to 17β-Estradiol were available. The binding properties of these novel rhenium based organometallic complexes were studied with a combination of Comparative Molecular Similarity Indices Analysis (CoMSIA) and docking. A total of 29 estrogen receptor ligands consisting of 11 rhenium complexes and 18 organic ligands were docked inside the ligand-binding domain (LBD) of ERα utilizing the program Gold. The top-ranked pose was used to construct CoMSIA models from a training set of 22 of the estrogen receptor ligands, which were selected at random. In addition, scoring functions from the docking runs and the polar volume (PV) were also studied to investigate their ability to predict RBA to ERα. A partial least-squares analysis consisting of the CoMSIA steric, electrostatic, and hydrophobic indices together with the polar volume proved sufficiently predictive, having a correlation coefficient, r2, of 0.94 and a cross-validated correlation coefficient, q2, of 0.68 obtained using the leave-one-out method. Analysis of the scoring functions from Gold showed particularly poor correlation with RBA to ERα, which did not improve when the rhenium complexes were removed to leave only the organic ligands. The combined CoMSIA and polar volume model correctly ranked the ligands in order of increasing RBA to ERα, illustrating the utility of this method as a prescreening tool in the development of novel rhenium based estrogen receptor ligands. PMID:17280694

  5. Results of the Medicare Health Support disease-management pilot program.

    PubMed

    McCall, Nancy; Cromwell, Jerry

    2011-11-03

    In the Medicare Modernization Act of 2003, Congress required the Centers for Medicare and Medicaid Services to test the commercial disease-management model in the Medicare fee-for-service program. The Medicare Health Support Pilot Program was a large, randomized study of eight commercial programs for disease management that used nurse-based call centers. We randomly assigned patients with heart failure, diabetes, or both to the intervention or to usual care (control) and compared them with the use of a difference-in-differences method to evaluate the effects of the commercial programs on the quality of clinical care, acute care utilization, and Medicare expenditures for Medicare fee-for-service beneficiaries. The study included 242,417 patients (163,107 in the intervention group and 79,310 in the control group). The eight commercial disease-management programs did not reduce hospital admissions or emergency room visits, as compared with usual care. We observed only 14 significant improvements in process-of-care measures out of 40 comparisons. These modest improvements came at substantial cost to the Medicare program in fees paid to the disease-management companies ($400 million), with no demonstrable savings in Medicare expenditures. In this large study, commercial disease-management programs using nurse-based call centers achieved only modest improvements in quality-of-care measures, with no demonstrable reduction in the utilization of acute care or the costs of care.
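
    The difference-in-differences contrast used in this evaluation is simple to state: the pre/post change in an outcome for the intervention group minus the change for controls, which nets out the secular trend. A minimal sketch with invented numbers (not the study's data):

```python
def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DiD estimate: treated group's change minus the control group's change."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Invented means of annual hospital admissions per beneficiary
effect = diff_in_diff(treat_pre=1.20, treat_post=1.10,
                      ctrl_pre=1.22, ctrl_post=1.15)
print(round(effect, 3))  # -0.03: a small reduction net of the background trend
```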

  6. Cost-Effectiveness Analysis of High-Efficiency Hemodiafiltration Versus Low-Flux Hemodialysis Based on the Canadian Arm of the CONTRAST Study.

    PubMed

    Lévesque, Renee; Marcelli, Daniele; Cardinal, Héloïse; Caron, Marie-Line; Grooteman, Muriel P C; Bots, Michiel L; Blankestijn, Peter J; Nubé, Menso J; Grassmann, Aileen; Canaud, Bernard; Gandjour, Afschin

    2015-12-01

    The aim of this study was to assess the cost effectiveness of high-efficiency on-line hemodiafiltration (OL-HDF) compared with low-flux hemodialysis (LF-HD) for patients with end-stage renal disease (ESRD) based on the Canadian (Centre Hospitalier de l'Université de Montréal) arm of a parallel-group randomized controlled trial (RCT), the CONvective TRAnsport STudy. An economic evaluation was conducted for the period of the RCT (74 months). In addition, a Markov state transition model was constructed to simulate costs and health benefits over lifetime. The primary outcome was costs per quality-adjusted life-year (QALY) gained. The analysis had the perspective of the Quebec public healthcare system. A total of 130 patients were randomly allocated to OL-HDF (n = 67) and LF-HD (n = 63). The cost-utility ratio of OL-HDF versus LF-HD was Can$53,270 per QALY gained over lifetime. This ratio was fairly robust in the sensitivity analysis. The cost-utility ratio was lower than that of LF-HD compared with no treatment (immediate death), which was Can$93,008 per QALY gained. High-efficiency OL-HDF can be considered a cost-effective treatment for ESRD in a Canadian setting. Further research is needed to assess cost effectiveness in other settings and healthcare systems.

  7. The cost-effectiveness of temozolomide in the adjuvant treatment of newly diagnosed glioblastoma in the United States

    PubMed Central

    Messali, Andrew; Hay, Joel W.; Villacorta, Reginald

    2013-01-01

    Background The objective of this work was to determine the cost-effectiveness of temozolomide compared with that of radiotherapy alone in the adjuvant treatment of newly diagnosed glioblastoma. Temozolomide is the only chemotherapeutic agent to have demonstrated a significant survival benefit in a randomized clinical trial. Our analysis builds on earlier work by incorporating caregiver time costs and generic temozolomide availability. It is also the first analysis applicable to the US context. Methods A systematic literature review was conducted to collect relevant data. Transition probabilities were calculated from randomized controlled trial data comparing temozolomide plus radiotherapy with radiotherapy alone. Direct costs were calculated from charges reported by the Mayo Clinic. Utilities were obtained from a previous cost-utility analysis. Using these data, a Markov model with a 1-month cycle length and 5-year time horizon was constructed. Results The addition of brand Temodar and generic temozolomide to the standard radiotherapy regimen was associated with base-case incremental cost-effectiveness ratios of $102,364 and $8875, respectively, per quality-adjusted life-year. The model was most sensitive to the progression-free survival associated with the use of only radiotherapy. Conclusions Both the brand and generic base-case estimates are cost-effective under a willingness-to-pay threshold of $150,000 per quality-adjusted life-year. All 1-way sensitivity analyses produced incremental cost-effectiveness ratios below this threshold. We conclude that both the brand Temodar and generic temozolomide are cost-effective treatments for newly diagnosed glioblastoma within the US context. However, assuming that the generic product produces equivalent quality of life and survival benefits, it would be significantly more cost-effective than the brand option. PMID:23935155
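
    The published model's transition probabilities, costs, and utilities are not reproduced here; the sketch below shows only the mechanics of a monthly-cycle Markov cost-utility model and the incremental cost-effectiveness ratio (ICER) computation, with all inputs invented and discounting omitted.

```python
def markov_arm(p_prog, p_die_stable, p_die_prog, monthly_cost, cycles=60):
    """Three-state monthly Markov model (stable -> progressed -> dead).
    Returns (total cost, QALYs); all rates, costs, and utilities are invented."""
    utils = {"stable": 0.80, "prog": 0.55, "dead": 0.0}
    state = {"stable": 1.0, "prog": 0.0, "dead": 0.0}
    cost = qaly = 0.0
    for _ in range(cycles):
        cost += (state["stable"] + state["prog"]) * monthly_cost
        qaly += sum(state[s] * utils[s] for s in state) / 12.0  # monthly QALYs
        moved = state["stable"] * p_prog
        died = state["stable"] * p_die_stable + state["prog"] * p_die_prog
        state = {"stable": state["stable"] * (1 - p_prog - p_die_stable),
                 "prog": state["prog"] * (1 - p_die_prog) + moved,
                 "dead": state["dead"] + died}
    return cost, qaly

# Invented arms: treatment slows progression but costs more per month
c1, q1 = markov_arm(0.06, 0.03, 0.06, monthly_cost=9000)
c0, q0 = markov_arm(0.10, 0.03, 0.06, monthly_cost=4000)
icer = (c1 - c0) / (q1 - q0)
print(f"dCost ${c1 - c0:,.0f}, dQALY {q1 - q0:.3f}, ICER ${icer:,.0f}/QALY")
```

The real analysis additionally applies half-cycle corrections, discounting, and probabilistic sensitivity analysis; the comparison against a willingness-to-pay threshold then proceeds directly from the ICER.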

  8. Dispersion Analysis Using Particle Tracking Simulations Through Heterogeneity Based on Outcrop Lidar Imagery

    NASA Astrophysics Data System (ADS)

    Klise, K. A.; Weissmann, G. S.; McKenna, S. A.; Tidwell, V. C.; Frechette, J. D.; Wawrzyniec, T. F.

    2007-12-01

    Solute plumes are believed to disperse in a non-Fickian manner due to small-scale heterogeneity and variable velocities that create preferential pathways. In order to accurately predict dispersion in naturally complex geologic media, the connection between heterogeneity and dispersion must be better understood. Since aquifer properties cannot be measured at every location, it is common to simulate small-scale heterogeneity with random field generators based on a two-point covariance (e.g., through use of sequential simulation algorithms). While these random fields can produce preferential flow pathways, it is unknown how well the results simulate solute dispersion through natural heterogeneous media. To evaluate the influence that complex heterogeneity has on dispersion, we utilize high-resolution terrestrial lidar to identify and model lithofacies from outcrop for application in particle tracking solute transport simulations using RWHet. The lidar scan data are used to produce a lab (meter) scale two-dimensional model that captures 2-8 mm scale natural heterogeneity. Numerical simulations utilize various methods to populate the outcrop structure captured by the lidar-based image with reasonable hydraulic conductivity values. The particle tracking simulations result in residence time distributions used to evaluate the nature of dispersion through complex media. Particle tracking simulations through conductivity fields produced from the lidar images are then compared to particle tracking simulations through hydraulic conductivity fields produced from sequential simulation algorithms. Based on this comparison, the study aims to quantify the difference in dispersion when using realistic and simplified representations of aquifer heterogeneity. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
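
    RWHet and the lidar-derived fields cannot be reproduced here, but the underlying random-walk particle-tracking idea can be shown in a one-dimensional toy: a layered velocity field stands in for facies-based hydraulic conductivity (all values invented), and each particle takes an advective step plus a Gaussian dispersion step until it exits, yielding a residence-time distribution with a late-time tail.

```python
import random

random.seed(7)

LAYERS = [0.5, 2.0, 0.8, 3.0, 1.0]    # m/day per 1-m layer (invented "facies")

def velocity(x):
    return LAYERS[min(int(x), len(LAYERS) - 1)]

def travel_time(length=5.0, dt=0.01, dispersivity=0.05):
    """Random-walk particle tracking: advective step plus Gaussian dispersion."""
    x = t = 0.0
    while x < length:
        x += velocity(x) * dt + random.gauss(0.0, (2 * dispersivity * dt) ** 0.5)
        x = max(x, 0.0)               # reflect particles at the inlet
        t += dt
    return t

times = sorted(travel_time() for _ in range(300))
median = times[len(times) // 2]
print(f"median residence time {median:.2f} d, p95/median {times[285] / median:.2f}")
```

A heavy late-time tail (p95 well above the median) is the kind of non-Fickian signature the study probes; in the paper this is computed in two dimensions over lidar-derived and geostatistically simulated conductivity fields.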

  9. Prevention of nosocomial infections in critically ill patients with lactoferrin (PREVAIL study): study protocol for a randomized controlled trial.

    PubMed

    Muscedere, John; Maslove, David; Boyd, John Gordon; O'Callaghan, Nicole; Lamontagne, Francois; Reynolds, Steven; Albert, Martin; Hall, Rick; McGolrick, Danielle; Jiang, Xuran; Day, Andrew G

    2016-09-29

    Nosocomial infections remain an important source of morbidity, mortality, and increased health care costs in hospitalized patients. This is particularly problematic in intensive care units (ICUs) because of increased patient vulnerability due to the underlying severity of illness and increased susceptibility from utilization of invasive therapeutic and monitoring devices. Lactoferrin (LF) and the products of its breakdown have multiple biological effects, which make its utilization of interest for the prevention of nosocomial infections in the critically ill. This is a phase II randomized, multicenter, double-blinded trial to determine the effect of LF on antibiotic-free days in mechanically ventilated, critically ill, adult patients in the ICU. Eligible, consenting patients will be randomized to receive either LF or placebo. The treating clinician will remain blinded to allocation during the study; blinding will be maintained by using opaque syringes and containers. The primary outcome will be antibiotic-free days, defined as the number of days alive and free of antibiotics 28 days after randomization. Secondary outcomes will include: antibiotic utilization, adjudicated diagnosis of nosocomial infection (longer than 72 h of admission to ICU), hospital and ICU length of stay, change in organ function after randomization, hospital and 90-day mortality, incidence of tracheal colonization, changes in gastrointestinal permeability, and immune function. Outcomes to inform the conduct of a larger definitive trial will also be evaluated, including feasibility as determined by recruitment rates and protocol adherence. The results from this study are expected to provide insight into a potential novel therapeutic use for LF in critically ill adult patients. Further, analysis of study outcomes will inform a future, large-scale phase III randomized controlled trial powered on clinically important outcomes related to the use of LF. 
The trial was registered at www.ClinicalTrials.gov on 18 November 2013 (NCT01996579).

  10. Emergency department-initiated palliative care for advanced cancer patients: protocol for a pilot randomized controlled trial.

    PubMed

    Kandarian, Brandon; Morrison, R Sean; Richardson, Lynne D; Ortiz, Joanna; Grudzen, Corita R

    2014-06-25

    For patients with advanced cancer, visits to the emergency department (ED) are common. Such patients present to the ED with a specific profile of palliative care needs, including burdensome symptoms such as pain, dyspnea, or vomiting that cannot be controlled in other settings and a lack of well-defined goals of care. The goals of this study are: i) to test the feasibility of recruiting, enrolling, and randomizing patients with serious illness in the ED; and ii) to evaluate the impact of ED-initiated palliative care on health care utilization, quality of life, and survival. This is a protocol for a single-center, parallel, two-arm randomized controlled trial in ED patients with metastatic solid tumors comparing ED-initiated palliative care referral to a control group receiving usual care. We plan to enroll 125 to 150 advanced cancer patients in the ED at Mount Sinai Hospital in New York, USA, who meet the following criteria: i) pass a brief cognitive screen; ii) speak fluent English or Spanish; and iii) have never been seen by palliative care. We will use balanced block randomization in groups of 50 to assign patients to the intervention or control group after completion of a baseline questionnaire. All research staff performing assessment or analysis will be blinded to patient assignment. We will measure the impact of the palliative care intervention on the following outcomes: i) timing and rate of palliative care consultation; ii) quality of life and depression at 12 weeks, measured using the FACT-G and PHQ-9; iii) health care utilization; and iv) length of survival. The primary analysis will be based on intention-to-treat. This pilot randomized controlled trial will test the feasibility of recruiting, enrolling, and randomizing patients with advanced cancer in the ED, and provide a preliminary estimate of the impact of palliative care referral on health care utilization, quality of life, and survival. ClinicalTrials.gov identifier: NCT01358110 (Entered 5/19/2011).

  11. Computer modeling describes gravity-related adaptation in cell cultures.

    PubMed

    Alexandrov, Ludmil B; Alexandrova, Stoyana; Usheva, Anny

    2009-12-16

    Questions about the changes of biological systems in response to hostile environmental factors are important but not easy to answer. Often, the traditional description with differential equations is difficult due to the overwhelming complexity of living systems. Another way to describe complex systems is by simulating them with phenomenological models such as the well-known evolutionary agent-based model (EABM). Here we developed an EABM to simulate cell colonies as a multi-agent system that adapts to hyper-gravity in starvation conditions. In the model, the cell's heritable characteristics are generated and transferred randomly to offspring cells. After a qualitative validation of the model at normal gravity, we simulate cellular growth in hyper-gravity conditions. The obtained data are consistent with previously confirmed theoretical and experimental findings for bacterial behavior in environmental changes, including the experimental data from the microgravity Atlantis and the Hypergravity 3000 experiments. Our results demonstrate that it is possible to utilize an EABM with a realistic qualitative description to examine the effects of hypergravity and starvation on complex cellular entities.
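
    The authors' EABM is not specified in the abstract; the sketch below is a generic evolutionary agent-based loop in the same spirit, with a single heritable "gravity tolerance" trait, mutation on inheritance, survival and division probabilities tied to an environmental stress level, and a nutrient cap standing in for starvation. Every rule and number here is invented.

```python
import random

random.seed(3)

def mutate(tol):
    # Offspring inherit tolerance with a small random mutation, clipped to [0, 1]
    return min(1.0, max(0.0, tol + random.gauss(0.0, 0.05)))

def simulate(gravity_stress, generations=30, capacity=400):
    cells = [random.random() for _ in range(100)]   # heritable tolerance values
    for _ in range(generations):
        nxt = []
        for tol in cells:
            p = max(0.0, min(1.0, tol - gravity_stress + 0.5))
            if random.random() < p:            # survive this generation
                nxt.append(tol)
                if random.random() < p:        # divide; child inherits + mutates
                    nxt.append(mutate(tol))
        cells = random.sample(nxt, min(len(nxt), capacity))  # nutrient cap
        if not cells:
            break
    return cells

for label, g in (("normal-g", 0.2), ("hyper-g", 0.6)):
    pop = simulate(g)
    mean_tol = sum(pop) / len(pop) if pop else float("nan")
    print(f"{label}: {len(pop)} cells, mean tolerance {mean_tol:.2f}")
```

Selection under higher stress should shift the trait distribution across generations, which is the qualitative behavior an EABM lets one examine without differential equations.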

  12. Carbon nanotube thin film strain sensors: comparison between experimental tests and numerical simulations

    NASA Astrophysics Data System (ADS)

    Lee, Bo Mi; Loh, Kenneth J.

    2017-04-01

    Carbon nanotubes can be randomly deposited in polymer thin film matrices to form nanocomposite strain sensors. However, a computational framework that enables the direct design of these nanocomposite thin films is still lacking. The objective of this study is to derive an experimentally validated, two-dimensional numerical model of carbon nanotube-based thin film strain sensors. This study consisted of two parts. First, multi-walled carbon nanotube (MWCNT)-Pluronic strain sensors were fabricated using vacuum filtration, and their physical, electrical, and electromechanical properties were evaluated. Second, scanning electron microscope images of the films were used to identify topological features of the percolated MWCNT network, and the information obtained was then utilized to develop the numerical model. Validation of the numerical model was achieved by ensuring that the area ratios (of MWCNTs relative to the polymer matrix) were equivalent for the experimental and modeled cases. Strain sensing behavior of the percolation-based model was simulated and then compared to experimental test results.

  13. Cost Utility Analysis of the Cervical Artificial Disc vs Fusion for the Treatment of 2-Level Symptomatic Degenerative Disc Disease: 5-Year Follow-up.

    PubMed

    Ament, Jared D; Yang, Zhuo; Nunley, Pierce; Stone, Marcus B; Lee, Darrin; Kim, Kee D

    2016-07-01

    The cervical total disc replacement (cTDR) was developed to treat cervical degenerative disc disease while preserving motion. Cost-effectiveness of this intervention was established by looking at 2-year follow-up, and this update reevaluates our analysis over 5 years. Data were derived from a randomized trial of 330 patients. Data from the 12-Item Short Form Health Survey were transformed into utilities by using the SF-6D algorithm. Costs were calculated by extracting diagnosis-related group codes and then applying 2014 Medicare reimbursement rates. A Markov model evaluated quality-adjusted life years (QALYs) for both treatment groups. Univariate and multivariate sensitivity analyses were conducted to test the stability of the model. The model adopted both societal and health system perspectives and applied a 3% annual discount rate. The cTDR cost $1687 more than anterior cervical discectomy and fusion (ACDF) over 5 years. In contrast, cTDR had $34 377 less productivity loss compared with ACDF. There was a significant difference in the return-to-work rate (81.6% compared with 65.4% for cTDR and ACDF, respectively; P = .029). From a societal perspective, the incremental cost-effectiveness ratio (ICER) for cTDR was -$165 103 per QALY. From a health system perspective, the ICER for cTDR was $8518 per QALY. In the sensitivity analysis, the ICER for cTDR remained below the US willingness-to-pay threshold of $50 000 per QALY in all scenarios (-$225 816 per QALY to $22 071 per QALY). This study is the first to report the comparative cost-effectiveness of cTDR vs ACDF for 2-level degenerative disc disease at 5 years. The authors conclude that, because of the negative ICER, cTDR is the dominant modality.
ACDF, anterior cervical discectomy and fusion; AWP, average wholesale price; CE, cost-effectiveness; CEA, cost-effectiveness analysis; CPT, Current Procedural Terminology; cTDR, cervical total disc replacement; CUA, cost-utility analysis; DDD, degenerative disc disease; DRG, diagnosis-related group; FDA, US Food and Drug Administration; ICER, incremental cost-effectiveness ratio; IDE, Investigational Device Exemption; NDI, neck disability index; QALY, quality-adjusted life years; RCT, randomized controlled trial; RTW, return-to-work; SF-12, 12-Item Short Form Health Survey; VAS, visual analog scale; WTP, willingness-to-pay.
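    The ICER arithmetic behind abstracts like this one is simple: incremental cost divided by incremental QALYs, with a negative ratio (lower cost, more QALYs) indicating a dominant option. A minimal sketch; the incremental QALY value below is illustrative, since the abstract does not report the underlying QALY totals.

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra
    quality-adjusted life year of one option over another."""
    return delta_cost / delta_qaly

# Illustrative inputs only (delta_qaly is assumed, not from the abstract):
example = icer(delta_cost=1687.0, delta_qaly=0.20)  # dollars per QALY
```

    An ICER below the willingness-to-pay threshold (here, $50 000 per QALY) favors adopting the costlier option; a negative ICER with a positive QALY gain means the option both saves money and improves outcomes, which is the "dominant modality" conclusion above.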

  14. Mapping health assessment questionnaire disability index (HAQ-DI) score, pain visual analog scale (VAS), and disease activity score in 28 joints (DAS28) onto the EuroQol-5D (EQ-5D) utility score with the KORean Observational study Network for Arthritis (KORONA) registry data.

    PubMed

    Kim, Hye-Lin; Kim, Dam; Jang, Eun Jin; Lee, Min-Young; Song, Hyun Jin; Park, Sun-Young; Cho, Soo-Kyung; Sung, Yoon-Kyoung; Choi, Chan-Bum; Won, Soyoung; Bang, So-Young; Cha, Hoon-Suk; Choe, Jung-Yoon; Chung, Won Tae; Hong, Seung-Jae; Jun, Jae-Bum; Kim, Jinseok; Kim, Seong-Kyu; Kim, Tae-Hwan; Kim, Tae-Jong; Koh, Eunmi; Lee, Hwajeong; Lee, Hye-Soon; Lee, Jisoo; Lee, Shin-Seok; Lee, Sung Won; Park, Sung-Hoon; Shim, Seung-Cheol; Yoo, Dae-Hyun; Yoon, Bo Young; Bae, Sang-Cheol; Lee, Eui-Kyung

    2016-04-01

    The aim of this study was to estimate a mapping model for EuroQol-5D (EQ-5D) utility values using the health assessment questionnaire disability index (HAQ-DI), pain visual analog scale (VAS), and disease activity score in 28 joints (DAS28) in a large, nationwide cohort of rheumatoid arthritis (RA) patients in Korea. The KORean Observational study Network for Arthritis (KORONA) registry data on 3557 patients with RA were used. Data were randomly divided into a modeling set (80% of the data) and a validation set (20% of the data). The ordinary least squares (OLS), Tobit, and two-part model methods were employed to construct a model to map to the EQ-5D index. Using combinations of HAQ-DI, pain VAS, and DAS28, four model versions were examined. To evaluate the predictive accuracy of the models, the root-mean-square error (RMSE) and mean absolute error (MAE) were calculated using the validation dataset. A model that included HAQ-DI, pain VAS, and DAS28 produced the highest adjusted R² as well as the lowest Akaike information criterion, RMSE, and MAE, regardless of the statistical method used in the modeling set. The mapping equation of the OLS method is EQ-5D = 0.95 - 0.21 × HAQ-DI - 0.24 × (pain VAS/100) - 0.01 × DAS28 (adjusted R² = 57.6%, RMSE = 0.1654, MAE = 0.1222). In the validation set as well, this model's RMSE and MAE were the smallest. The model with HAQ-DI, pain VAS, and DAS28 showed the best performance, and this mapping model enables the estimation of an EQ-5D value for RA patients in whom utility values have not been measured.
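    The reported OLS mapping equation can be applied directly. A minimal sketch; the input values below are illustrative, and real use should respect the instruments' score ranges (HAQ-DI 0-3, pain VAS 0-100, DAS28 roughly 0-10).

```python
def eq5d_from_ols(haq_di, pain_vas, das28):
    """Published KORONA OLS mapping:
    EQ-5D = 0.95 - 0.21*HAQ-DI - 0.24*(pain VAS/100) - 0.01*DAS28."""
    return 0.95 - 0.21 * haq_di - 0.24 * pain_vas / 100.0 - 0.01 * das28

# Illustrative patient: moderate disability, mid-scale pain, moderate activity
u = eq5d_from_ols(haq_di=1.0, pain_vas=50.0, das28=4.0)  # → 0.58
```

    The predicted utility can then feed QALY calculations for RA patients whose EQ-5D was never directly measured, which is the stated purpose of the mapping model.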

  15. Texture analysis of common renal masses in multiple MR sequences for prediction of pathology

    NASA Astrophysics Data System (ADS)

    Hoang, Uyen N.; Malayeri, Ashkan A.; Lay, Nathan S.; Summers, Ronald M.; Yao, Jianhua

    2017-03-01

    This pilot study performs texture analysis on multiple magnetic resonance (MR) images of common renal masses for differentiation of renal cell carcinoma (RCC). Bounding boxes are drawn around each mass on one axial slice in the T1 delayed sequence to use for feature extraction and classification. All sequences (T1 delayed, venous, arterial, pre-contrast phases, T2, and T2 fat saturated sequences) are co-registered and texture features are extracted from each sequence simultaneously. Random forest is used to construct models to classify lesions on 96 normal regions, 87 clear cell RCCs, 8 papillary RCCs, and 21 renal oncocytomas; ground truths are verified through pathology reports. The highest performance is seen in the random forest model when data from all sequences are used in conjunction, achieving an overall classification accuracy of 83.7%. When using data from a single sequence, the overall accuracies achieved for the T1 delayed, venous, arterial, and pre-contrast phases and the T2 and T2 fat saturated sequences were 79.1%, 70.5%, 56.2%, 61.0%, 60.0%, and 44.8%, respectively. These results demonstrate the promise of utilizing intensity information from multiple MR sequences for accurate classification of renal masses.
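    The multi-sequence classification step can be sketched with scikit-learn: texture features from each co-registered sequence are concatenated into one feature vector per lesion and fed to a random forest. This is a hypothetical stand-in using synthetic Gaussian features, not the study's data, feature set, or class sizes.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_per_class, n_seq, n_feat = 60, 6, 4   # 6 MR sequences, 4 texture features each
# Three synthetic lesion classes, each a Gaussian cloud in the joint feature space
X = np.vstack([rng.normal(loc=c, size=(n_per_class, n_seq * n_feat))
               for c in range(3)])
y = np.repeat(np.arange(3), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

    Dropping columns from X (i.e., using a single sequence's features) and retraining mimics the paper's single-sequence comparison, where accuracy fell for every individual sequence.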

  16. Cost-Effectiveness of Integrating Tobacco Cessation Into Post-Traumatic Stress Disorder Treatment.

    PubMed

    Barnett, Paul G; Jeffers, Abra; Smith, Mark W; Chow, Bruce K; McFall, Miles; Saxon, Andrew J

    2016-03-01

    We examined the cost-effectiveness of smoking cessation integrated with treatment for post-traumatic stress disorder (PTSD). Smoking veterans receiving care for PTSD (N = 943) were randomized to care integrated with smoking cessation versus referral to a smoking cessation clinic. Smoking cessation services, health care cost and utilization, quality of life, and biochemically verified abstinence from cigarettes were assessed over 18 months of follow-up. Clinical outcomes were combined with literature on changes in smoking status and the effect of smoking on health care cost, mortality, and quality of life in a Markov model of cost-effectiveness over a lifetime horizon. We discounted cost and outcomes at 3% per year and report costs in 2010 US dollars. The mean cost of smoking cessation services was $1286 in those randomized to integrated care and $551 in those receiving standard care (P < .001). There were no significant differences in the cost of mental health services or other care. After 12 months, prolonged biochemically verified abstinence was observed in 8.9% of those randomized to integrated care and 4.5% of those randomized to standard care (P = .004). The model projected that integrated care added $836 in lifetime cost and generated 0.0259 quality-adjusted life years (QALYs), an incremental cost-effectiveness ratio of $32 257 per QALY. It was 86.0% likely to be cost-effective compared to a threshold of $100 000/QALY. Smoking cessation integrated with treatment for PTSD was cost-effective, within a broad confidence region, but less cost-effective than most other smoking cessation programs reported in the literature. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
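    The lifetime projection rests on a standard Markov cohort calculation: each annual cycle, the surviving fraction of the cohort accrues discounted utility and cost. A minimal two-state (alive/dead) sketch; the mortality, utility, and cost inputs below are purely illustrative, not the study's model parameters.

```python
def markov_cohort(p_die, utility, annual_cost, years=50, discount=0.03):
    """Two-state Markov cohort: track the alive fraction each cycle and
    accumulate discounted QALYs and costs over a lifetime horizon."""
    alive, qalys, cost = 1.0, 0.0, 0.0
    for t in range(years):
        d = (1.0 + discount) ** -t   # 3% annual discount factor
        qalys += alive * utility * d
        cost += alive * annual_cost * d
        alive *= 1.0 - p_die         # annual transition to the dead state
    return qalys, cost

# Illustrative: a quitter with lower mortality and higher utility vs a smoker
q_quit, c_quit = markov_cohort(p_die=0.02, utility=0.85, annual_cost=3000.0)
q_smoke, c_smoke = markov_cohort(p_die=0.03, utility=0.82, annual_cost=3500.0)
```

    Feeding the two arms' quit rates through such a model, then dividing incremental lifetime cost by incremental QALYs, is what produces figures like the $32 257 per QALY reported above.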

  17. Cost-Effectiveness of Integrating Tobacco Cessation Into Post-Traumatic Stress Disorder Treatment

    PubMed Central

    Jeffers, Abra; Smith, Mark W.; Chow, Bruce K.; McFall, Miles; Saxon, Andrew J.

    2016-01-01

    Abstract Introduction: We examined the cost-effectiveness of smoking cessation integrated with treatment for post-traumatic stress disorder (PTSD). Methods: Smoking veterans receiving care for PTSD (N = 943) were randomized to care integrated with smoking cessation versus referral to a smoking cessation clinic. Smoking cessation services, health care cost and utilization, quality of life, and biochemically verified abstinence from cigarettes were assessed over 18 months of follow-up. Clinical outcomes were combined with literature on changes in smoking status and the effect of smoking on health care cost, mortality, and quality of life in a Markov model of cost-effectiveness over a lifetime horizon. We discounted cost and outcomes at 3% per year and report costs in 2010 US dollars. Results: The mean cost of smoking cessation services was $1286 in those randomized to integrated care and $551 in those receiving standard care (P < .001). There were no significant differences in the cost of mental health services or other care. After 12 months, prolonged biochemically verified abstinence was observed in 8.9% of those randomized to integrated care and 4.5% of those randomized to standard care (P = .004). The model projected that integrated care added $836 in lifetime cost and generated 0.0259 quality-adjusted life years (QALYs), an incremental cost-effectiveness ratio of $32 257 per QALY. It was 86.0% likely to be cost-effective compared to a threshold of $100 000/QALY. Conclusions: Smoking cessation integrated with treatment for PTSD was cost-effective, within a broad confidence region, but less cost-effective than most other smoking cessation programs reported in the literature. PMID:25943761

  18. Divergence instability of pipes conveying fluid with uncertain flow velocity

    NASA Astrophysics Data System (ADS)

    Rahmati, Mehdi; Mirdamadi, Hamid Reza; Goli, Sareh

    2018-02-01

    This article investigates the probabilistic stability of pipes conveying fluid with stochastic flow velocity in the time domain. The study focuses on the effects of randomness in the flow velocity on the stability of pipes conveying fluid, whereas most research efforts have focused only on the influences of deterministic parameters on system stability. The Euler-Bernoulli beam and plug flow theories are employed to model the pipe structure and internal flow, respectively. In addition, the flow velocity is considered a stationary random process with Gaussian distribution. The stochastic averaging method and Routh's stability criterion are then used to investigate the stability conditions of the system. Consequently, the effects of boundary conditions, viscoelastic damping, mass ratio, and elastic foundation on the stability regions are discussed. Results show that the critical mean flow velocity decreases with increasing power spectral density (PSD) of the random velocity. Moreover, as the PSD increases from zero, the effects of boundary condition type and the presence of an elastic foundation are diminished, while the influences of viscoelastic damping and mass ratio increase. Finally, to make the study more applicable, regression analysis is utilized to develop design equations and facilitate further analyses for design purposes.

  19. Quantum dynamics of nuclear spins and spin relaxation in organic semiconductors

    NASA Astrophysics Data System (ADS)

    Mkhitaryan, V. V.; Dobrovitski, V. V.

    2017-06-01

    We investigate the role of the nuclear-spin quantum dynamics in hyperfine-induced spin relaxation of hopping carriers in organic semiconductors. The fast-hopping regime, when the carrier spin does not rotate much between subsequent hops, is typical for organic semiconductors possessing long spin coherence times. We consider this regime and focus on a carrier random-walk diffusion in one dimension, where the effect of the nuclear-spin dynamics is expected to be the strongest. Exact numerical simulations of spin systems with up to 25 nuclear spins are performed using the Suzuki-Trotter decomposition of the evolution operator. Larger nuclear-spin systems are modeled utilizing the spin-coherent state P-representation approach developed earlier. We find that the nuclear-spin dynamics strongly influences the carrier spin relaxation at long times. If the random walk is restricted to a small area, it leads to the quenching of carrier spin polarization at a nonzero value at long times. If the random walk is unrestricted, the carrier spin polarization acquires a long-time tail, decaying as 1/√t. Based on the numerical results, we devise a simple formula describing the effect quantitatively.
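    The Suzuki-Trotter decomposition mentioned above approximates the evolution operator by short alternating steps of non-commuting parts of the Hamiltonian. A single-spin toy sketch (the couplings are illustrative and the Hamiltonian is a stand-in, not the paper's many-spin hyperfine problem) comparing a first-order Trotter product against the exact 2×2 evolution:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli x
sz = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli z

def expi(theta, pauli):
    """exp(-i*theta*P) for a Pauli matrix P, using P @ P = I."""
    return np.cos(theta) * I2 - 1j * np.sin(theta) * pauli

def exact_u(a, b, t):
    """Exact exp(-i*t*(a*sx + b*sz)); sx and sz anticommute, so the
    normalized field direction squares to the identity."""
    h = np.hypot(a, b)
    n = (a * sx + b * sz) / h
    return np.cos(h * t) * I2 - 1j * np.sin(h * t) * n

def trotter_u(a, b, t, n_steps):
    """First-order Suzuki-Trotter: alternate small sx and sz rotations."""
    dt = t / n_steps
    step = expi(a * dt, sx) @ expi(b * dt, sz)
    return np.linalg.matrix_power(step, n_steps)

# Error shrinks as the number of Trotter steps grows (illustrative couplings)
err = np.linalg.norm(trotter_u(0.7, 1.1, 2.0, 400) - exact_u(0.7, 1.1, 2.0))
```

    For many spins the same idea applies term by term to the hyperfine Hamiltonian, which is what makes exact simulations of ~25 nuclear spins tractable.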

  20. Random matrix theory of singular values of rectangular complex matrices I: Exact formula of one-body distribution function in fixed-trace ensemble

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adachi, Satoshi; Toda, Mikito; Kubotani, Hiroto

    The fixed-trace ensemble of random complex matrices is the fundamental model that excellently describes the entanglement in the quantum states realized in a coupled system by its strongly chaotic dynamical evolution [see H. Kubotani, S. Adachi, M. Toda, Phys. Rev. Lett. 100 (2008) 240501]. The fixed-trace ensemble fully takes into account the conservation of probability for quantum states. The present paper derives for the first time the exact analytical formula of the one-body distribution function of singular values of random complex matrices in the fixed-trace ensemble. The distribution function of singular values (i.e. Schmidt eigenvalues) of a quantum state is important since it describes characteristics of the entanglement in the state. The derivation of the exact analytical formula utilizes two recent achievements in mathematics from the 1990s. The first is the Kaneko theory that extends the famous Selberg integral by inserting a hypergeometric-type weight factor into the integrand to obtain an analytical formula for the extended integral. The second is the Petkovsek-Wilf-Zeilberger theory that calculates definite hypergeometric sums in closed form.

  1. Utilizing PowerPoint Presentation to Promote Fall Prevention among Older Adults

    ERIC Educational Resources Information Center

    McCrary-Quarles, Audrey R.

    2008-01-01

    This study evaluated a PowerPoint home safety (PPHS) presentation in enhancing awareness, knowledge and behavior change among senior center attendees in southern Illinois. Twelve centers were utilized as data collection sites in a pretest-posttest control group design. Through stratified randomization, centers were placed into categories (high,…

  2. 45 CFR 1356.71 - Federal review of the eligibility of children in foster care and the eligibility of foster care...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... by ACF statistical staff from the Adoption and Foster Care Analysis and Reporting System (AFCARS... primary review utilizing probability sampling methodologies. Usually, the chosen methodology will be simple random sampling, but other probability samples may be utilized, when necessary and appropriate. (3...

  3. A Comparative Study of Random Patterns for Digital Image Correlation

    NASA Astrophysics Data System (ADS)

    Stoilov, G.; Kavardzhikov, V.; Pashkouleva, D.

    2012-06-01

    Digital Image Correlation (DIC) is a computer-based image analysis technique utilizing random patterns, which finds applications in the experimental mechanics of solids and structures. In this paper a comparative study of three simulated random patterns is presented. One of them is generated according to a new algorithm introduced by the authors. A criterion for the quantitative evaluation of random patterns, based on calculation of their autocorrelation functions, is introduced. The patterns' deformations are simulated numerically and realized experimentally, and the displacements are measured using the DIC method. Tensile tests are performed after printing the generated random patterns on the surfaces of standard iron sheet specimens. It is found that the newly designed random pattern retains relatively good quality up to 20% deformation.
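    The autocorrelation-based evaluation can be sketched with NumPy: compute a pattern's normalized autocorrelation via the FFT and compare how sharply it peaks (a narrower peak generally indicates a finer, more DIC-friendly speckle). The two patterns and the half-height width measure below are illustrative, not the authors' criterion.

```python
import numpy as np

def autocorrelation(pattern):
    """Normalized circular autocorrelation via the Wiener-Khinchin theorem:
    inverse FFT of the power spectrum, peak shifted to the center."""
    p = pattern - pattern.mean()
    spec = np.abs(np.fft.fft2(p)) ** 2
    ac = np.real(np.fft.ifft2(spec))
    return np.fft.fftshift(ac) / ac.flat[0]   # ac.flat[0] is the zero-lag peak

def peak_width(ac):
    """Number of pixels where correlation exceeds half the peak value."""
    return int((ac > 0.5).sum())

rng = np.random.default_rng(0)
fine = (rng.random((64, 64)) > 0.5).astype(float)              # fine speckle
coarse = np.kron(rng.random((8, 8)) > 0.5, np.ones((8, 8)))    # coarse blocks

w_fine = peak_width(autocorrelation(fine))      # sharp peak: ~1 pixel
w_coarse = peak_width(autocorrelation(coarse))  # broad peak: many pixels
```

    A fine speckle's autocorrelation collapses to essentially the zero-lag pixel, while a coarse pattern correlates over the whole block size, so the width measure separates the two cleanly.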

  4. Evolutionary Perspective on Collective Decision Making

    NASA Astrophysics Data System (ADS)

    Farrell, Dene; Sayama, Hiroki; Dionne, Shelley D.; Yammarino, Francis J.; Wilson, David Sloan

    Team decision making dynamics are investigated from a novel perspective by shifting agency from decision makers to representations of potential solutions. We provide a new way to navigate social dynamics of collective decision making by interpreting decision makers as constituents of an evolutionary environment of an ecology of evolving solutions. We demonstrate distinct patterns of evolution with respect to three forms of variation: (1) Results with random variations in utility functions of individuals indicate that groups demonstrating minimal internal variation produce higher true utility values of group solutions and display better convergence; (2) analysis of variations in behavioral patterns within a group shows that a proper balance between selective and creative evolutionary forces is crucial to producing adaptive solutions; and (3) biased variations of the utility functions diminish the range of variation for potential solution utility, leaving only the differential of convergence performance static. We generally find that group cohesion (low random variation within a group) and composition (appropriate variation of behavioral patterns within a group) are necessary for a successful navigation of the solution space, but performance in both cases is susceptible to group level biases.

  5. Cost-utility of a cardiovascular prevention program in highly educated adults: intermediate results of a randomized controlled trial.

    PubMed

    Jacobs, Nele; Evers, Silvia; Ament, Andre; Claes, Neree

    2010-01-01

    Little is known about the costs and the effects of cardiovascular prevention programs targeted at medical and behavioral risk factors. The aim was to evaluate the cost-utility of a cardiovascular prevention program in a general sample of highly educated adults after 1 year of intervention. The participants were randomly assigned to intervention (n = 208) and usual care conditions (n = 106). The intervention consisted of medical interventions and optional behavior-change interventions (e.g., a tailored Web site). Cost data were registered from a healthcare perspective, and questionnaires were used to determine effectiveness (e.g., quality-adjusted life-years [QALYs]). A cost-utility analysis and sensitivity analyses using bootstrapping were performed on the intermediate results. When adjusting for baseline utility differences, the incremental cost was 433 euros and the incremental effectiveness was 0.016 QALYs. The incremental cost-effectiveness ratio was 26,910 euros per QALY. The intervention was cost-effective compared with usual care in this sample of highly educated adults after 1 year of intervention. Increased participation would make this intervention highly cost-effective.

  6. The impact of a health education program targeting patients with high visit rates in a managed care organization.

    PubMed

    Dally, Diana L; Dahar, Wendy; Scott, Ann; Roblin, Douglas; Khoury, Allan T

    2002-01-01

    To determine if a mailed health promotion program reduced outpatient visits while improving health status. Randomized controlled trial. A midsized, group practice model, managed care organization in Ohio. Members invited (N = 3214) were high utilizers, 18 to 64 years old, with hypertension, diabetes, or arthritis (or all). A total of 886 members agreed to participate, and 593 members returned the initial questionnaires. The 593 members were randomized to the following groups: 99 into arthritis treatment and 100 into arthritis control, 94 into blood pressure treatment and 92 into blood pressure control, and 104 into diabetes treatment and 104 into diabetes control. Outpatient utilization, health status, and self-efficacy were followed over 30 months. Health risk appraisal questionnaires were mailed to treatment and control groups before randomization and at 1 year. The treatment group received three additional condition-specific (arthritis, diabetes, or hypertension) questionnaires and a health information handbook. The treatment group also received written health education materials and an individualized feedback letter after each returned questionnaire. The control group received condition-specific written health education materials and reimbursement for exercise equipment or fitness club membership after returning the 1-year end-of-study questionnaire. Changes in visit rates were disease specific. Parameter estimates were calculated from a Poisson regression model. Comparing intervention with controls: the arthritis group decreased visits by 4.84 per 30 months (p < 0.00), the diabetes group had no significant change, and the hypertension group increased visits by 2.89 per 30 months (p < 0.05). Overall health status improved significantly (-6.5 vs. 2.3, p < 0.01) for the arthritis group but showed no significant change for the other two groups, and coronary artery disease and cancer risk scores did not change significantly for any group. Overall self-efficacy for intervention-group completers improved by -8.6 points (p < 0.03) for the arthritis group; the other groups showed no significant change. This study demonstrated that, in a population aged 18 to 64 years with chronic conditions, mailed health promotion programs might benefit only people with certain conditions.

  7. Using whole disease modeling to inform resource allocation decisions: economic evaluation of a clinical guideline for colorectal cancer using a single model.

    PubMed

    Tappenden, Paul; Chilcott, Jim; Brennan, Alan; Squires, Hazel; Glynne-Jones, Rob; Tappenden, Janine

    2013-06-01

    To assess the feasibility and value of simulating whole disease and treatment pathways within a single model to provide a common economic basis for informing resource allocation decisions. A patient-level simulation model was developed with the intention of being capable of evaluating multiple topics within National Institute for Health and Clinical Excellence's colorectal cancer clinical guideline. The model simulates disease and treatment pathways from preclinical disease through to detection, diagnosis, adjuvant/neoadjuvant treatments, follow-up, curative/palliative treatments for metastases, supportive care, and eventual death. The model parameters were informed by meta-analyses, randomized trials, observational studies, health utility studies, audit data, costing sources, and expert opinion. Unobservable natural history parameters were calibrated against external data using Bayesian Markov chain Monte Carlo methods. Economic analysis was undertaken using conventional cost-utility decision rules within each guideline topic and constrained maximization rules across multiple topics. Under usual processes for guideline development, piecewise economic modeling would have been used to evaluate between one and three topics. The Whole Disease Model was capable of evaluating 11 of 15 guideline topics, ranging from alternative diagnostic technologies through to treatments for metastatic disease. The constrained maximization analysis identified a configuration of colorectal services that is expected to maximize quality-adjusted life-year gains without exceeding current expenditure levels. This study indicates that Whole Disease Model development is feasible and can allow for the economic analysis of most interventions across a disease service within a consistent conceptual and mathematical infrastructure. This disease-level modeling approach may be of particular value in providing an economic basis to support other clinical guidelines. 
Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  8. Do repeated assessments of performance status improve predictions for risk of death among patients with cancer? A population-based cohort study.

    PubMed

    Su, Jiandong; Barbera, Lisa; Sutradhar, Rinku

    2015-06-01

    Prior work has utilized longitudinal information on performance status to demonstrate its association with risk of death among cancer patients; however, no study has assessed whether such longitudinal information improves the predictions for risk of death. To examine whether the use of repeated performance status assessments improves predictions for risk of death compared to using only the performance status assessment at the time of cancer diagnosis. This was a population-based longitudinal study of adult outpatients who had a cancer diagnosis and had at least one assessment of performance status. To account for each patient's changing performance status over time, we implemented a Cox model with a time-varying covariate for performance status. This model was compared to a Cox model using only a time-fixed (baseline) covariate for performance status. The regression coefficients of each model were derived from a randomly selected 60% of patients, and then the predictive ability of each model was assessed via concordance probabilities when applied to the remaining 40% of patients. Our study consisted of 15,487 cancer patients with over 53,000 performance status assessments. The utilization of repeated performance status assessments improved predictions for risk of death compared to using only the performance status assessment taken at diagnosis. When studying the hazard of death among patients with cancer, researchers should, where available, incorporate changing information on performance status scores instead of simply baseline information. © The Author(s) 2015.
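    The concordance probability used to compare the two models can be computed directly: over all comparable patient pairs, count how often the model assigns the higher risk score to the patient who dies earlier. A minimal pure-Python sketch with illustrative risk scores and no censoring handling (real survival data would need it):

```python
def concordance(times, risks):
    """Concordance probability (C-index): fraction of comparable pairs in
    which the earlier death received the higher model risk score; ties in
    risk count as half-concordant."""
    num = den = 0.0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            if times[i] == times[j]:
                continue  # not a comparable pair
            den += 1
            early, late = (i, j) if times[i] < times[j] else (j, i)
            if risks[early] > risks[late]:
                num += 1
            elif risks[early] == risks[late]:
                num += 0.5
    return num / den

# Illustrative scores: 'updated' tracks the survival order better than 'baseline'
times    = [2, 5, 1, 8, 3, 9]
baseline = [0.4, 0.3, 0.6, 0.2, 0.5, 0.4]
updated  = [0.7, 0.4, 0.9, 0.1, 0.6, 0.2]
c_base, c_upd = concordance(times, baseline), concordance(times, updated)
```

    A C-index of 0.5 is chance-level ordering and 1.0 is perfect; the study's finding is that the time-varying model achieves a higher concordance than the baseline-only model on held-out patients.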

  9. Using Predictive Analytics to Predict Power Outages from Severe Weather

    NASA Astrophysics Data System (ADS)

    Wanik, D. W.; Anagnostou, E. N.; Hartman, B.; Frediani, M. E.; Astitha, M.

    2015-12-01

    The distribution of reliable power is essential to businesses, public services, and our daily lives. With the growing abundance of data being collected and created by industry (i.e. outage data), government agencies (i.e. land cover), and academia (i.e. weather forecasts), we can begin to tackle problems that previously seemed too complex to solve. In this session, we will present newly developed tools to aid decision-support challenges at electric distribution utilities that must mitigate, prepare for, respond to and recover from severe weather. We will show a performance evaluation of outage predictive models built for Eversource Energy (formerly Connecticut Light & Power) for storms of all types (i.e. blizzards, thunderstorms and hurricanes) and magnitudes (from 20 to >15,000 outages). High resolution weather simulations (simulated with the Weather and Research Forecast Model) were joined with utility outage data to calibrate four types of models: a decision tree (DT), random forest (RF), boosted gradient tree (BT) and an ensemble (ENS) decision tree regression that combined predictions from DT, RF and BT. The study shows that the ENS model forced with weather, infrastructure and land cover data was superior to the other models we evaluated, especially in terms of predicting the spatial distribution of outages. This research has the potential to be used for other critical infrastructure systems (such as telecommunications, drinking water and gas distribution networks), and can be readily expanded to the entire New England region to facilitate better planning and coordination among decision-makers when severe weather strikes.
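    The ENS idea described above (combining decision tree, random forest, and boosted tree regressions) can be sketched with scikit-learn by averaging the three models' predictions. The predictors and outage response below are synthetic stand-ins, not Eversource's data, and simple averaging is one of several ways such an ensemble could be built.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

rng = np.random.default_rng(0)
# Synthetic predictors standing in for weather/infrastructure/land cover inputs
X = rng.uniform(size=(300, 5))
# Synthetic outage response: nonlinear in the first predictor, plus noise
y = 50 * X[:, 0] ** 2 + 10 * X[:, 1] + rng.normal(scale=2.0, size=300)

X_tr, y_tr, X_te, y_te = X[:200], y[:200], X[200:], y[200:]
models = [DecisionTreeRegressor(random_state=0),
          RandomForestRegressor(n_estimators=100, random_state=0),
          GradientBoostingRegressor(random_state=0)]
for m in models:
    m.fit(X_tr, y_tr)

# ENS prediction: plain average of the three base models' predictions
ens_pred = np.mean([m.predict(X_te) for m in models], axis=0)
rmse = float(np.sqrt(np.mean((ens_pred - y_te) ** 2)))
```

    Averaging tends to cancel the individual models' errors, which is consistent with the paper's finding that the ensemble outperformed each base model, particularly for the spatial distribution of outages.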

  10. Magnetically coupled flextensional transducer for wideband vibration energy harvesting: Design, modeling and experiments

    NASA Astrophysics Data System (ADS)

    Zou, Hong-Xiang; Zhang, Wen-Ming; Li, Wen-Bo; Wei, Ke-Xiang; Hu, Kai-Ming; Peng, Zhi-Ke; Meng, Guang

    2018-03-01

    The combination of nonlinear bistable and flextensional mechanisms has the advantages of wide operating frequency and high equivalent piezoelectric constant. In this paper, three magnetically coupled flextensional vibration energy harvesters (MF-VEHs) are designed from three magnetically coupled vibration systems which utilize a magnetic repulsion, two symmetrical magnetic attractions and multi-magnetic repulsions, respectively. The coupled dynamic models are developed to describe the electromechanical transitions. Simulations under harmonic excitation and random excitation are carried out to investigate the performance of the MF-VEHs with different parameters. Experimental validations of the MF-VEHs are performed under different excitation levels. The experimental results verify that the developed mathematical models can be used to accurately characterize the MF-VEHs for various magnetic coupling modes. A comparison of three MF-VEHs is provided and the results illustrate that a reasonable arrangement of multiple magnets can reduce the threshold excitation intensity and increase the harvested energy.

  11. Do People Use the Shortest Path? An Empirical Test of Wardrop’s First Principle

    PubMed Central

    Zhu, Shanjiang; Levinson, David

    2015-01-01

    Most recent route choice models, following either the random utility maximization or rule-based paradigm, require explicit enumeration of feasible routes. The quality of model estimation and prediction is sensitive to the appropriateness of the consideration set. However, few empirical studies of revealed route characteristics have been reported in the literature. This study evaluates the widely applied shortest path assumption by examining routes followed by residents of the Minneapolis-St. Paul metropolitan area. Accurate Global Positioning System (GPS) and Geographic Information System (GIS) data were employed to reveal the routes people used over an eight- to thirteen-week period. Most people did not choose the shortest path. Using three weeks of that data, we find that current route choice set generation algorithms do not reveal the majority of paths that individuals took. Findings from this study may guide future efforts in building better route choice models. PMID:26267756
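    The comparison behind Wardrop's first principle, whether an observed route matches the network shortest path, can be sketched with a small Dijkstra implementation. The toy graph and the "observed" route below are illustrative, not the study's network or GPS traces.

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra over {node: [(neighbor, cost), ...]}; returns (cost, path)."""
    pq, seen = [(0.0, src, [src])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float('inf'), []

def route_cost(graph, path):
    """Total cost of a given route along existing edges."""
    lookup = {(u, v): w for u, edges in graph.items() for v, w in edges}
    return sum(lookup[(u, v)] for u, v in zip(path, path[1:]))

# Toy network: A->B->D costs 4, A->C->D costs 6
graph = {'A': [('B', 2.0), ('C', 5.0)],
         'B': [('D', 2.0)],
         'C': [('D', 1.0)],
         'D': []}
best_cost, best_path = shortest_path(graph, 'A', 'D')

observed = ['A', 'C', 'D']   # a longer route a driver might actually take
detour = route_cost(graph, observed) - best_cost
```

    In the study's terms, a positive detour for most observed GPS routes is exactly the evidence against the shortest path assumption.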

  12. Hypersonic Wind Tunnel Calibration Using the Modern Design of Experiments

    NASA Technical Reports Server (NTRS)

    Rhode, Matthew N.; DeLoach, Richard

    2005-01-01

    A calibration of a hypersonic wind tunnel has been conducted using formal experiment design techniques and response surface modeling. Data from a compact, highly efficient experiment were used to create a regression model of the pitot pressure as a function of the facility operating conditions as well as the longitudinal location within the test section. The new calibration utilized far fewer design points than prior experiments, but covered a wider range of the facility's operating envelope while revealing interactions between factors not captured in previous calibrations. A series of points chosen randomly within the design space was used to verify the accuracy of the response model. The development of the experiment design is discussed along with tactics used in the execution of the experiment to defend against systematic variation in the results. Trends in the data are illustrated, and comparisons are made to earlier findings.

  13. Emergence of Alpha and Gamma Like Rhythms in a Large Scale Simulation of Interacting Neurons

    NASA Astrophysics Data System (ADS)

    Gaebler, Philipp; Miller, Bruce

    2007-10-01

    In the normal brain, at first glance the electrical activity appears very random. However, certain frequencies emerge during specific stages of sleep or during quiet waking states. This raises the question of whether current mathematical and computational models of interacting neurons can display similar behavior. A recent model developed by Eugene Izhikevich appears to succeed. However, early dynamical simulations used to detect these patterns were possibly compromised by an over-simplified initial condition and evolution algorithm. Utilizing the same model, but a more robust algorithm, here we present our initial results, showing that these patterns persist under a wide range of initial conditions. We employ spectral analysis of the firing patterns of a system of interacting excitatory and inhibitory neurons to demonstrate a bimodal spectrum centered on two frequencies in the range characteristic of alpha and gamma rhythms in the human brain.
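The kind of spectral analysis described can be sketched on a synthetic signal rather than the Izhikevich simulation: a naive DFT of a firing-rate trace recovers a bimodal spectrum. The 10 Hz and 40 Hz components below are illustrative stand-ins for alpha- and gamma-band peaks, not values from the paper.

```python
import cmath, math

def dft_mag(x):
    """Magnitude spectrum via a naive discrete Fourier transform (fine for tiny n)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(n)]

# Synthetic "firing rate" signal: 1 s sampled at 100 Hz, with alpha-like (10 Hz)
# and gamma-like (40 Hz) components.
fs, n = 100, 100
sig = [math.sin(2 * math.pi * 10 * t / fs) + 0.5 * math.sin(2 * math.pi * 40 * t / fs)
       for t in range(n)]

mag = dft_mag(sig)
# Two largest bins below the Nyquist frequency give the bimodal spectrum.
peaks = sorted(range(1, n // 2), key=lambda k: mag[k], reverse=True)[:2]
print(sorted(peaks))  # [10, 40]
```
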

  14. A machine learning model with human cognitive biases capable of learning from small and biased datasets.

    PubMed

    Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro

    2018-05-09

    Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We implemented a human cognitive model into machine learning algorithms and compared their performance with the currently most popular methods, naïve Bayes, support vector machine, neural networks, logistic regression and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with other representative machine learning methods.
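One of the baselines the authors compare against, naive Bayes for spam classification, can be sketched from scratch on a hypothetical toy corpus (bag-of-words with Laplace smoothing); the tokens and labels below are invented for illustration.

```python
import math
from collections import Counter

def train_nb(docs):
    """docs: list of (tokens, label). Returns Laplace-smoothed log-probabilities."""
    vocab = {w for tokens, _ in docs for w in tokens}
    counts = {"spam": Counter(), "ham": Counter()}
    ndocs = Counter()
    for tokens, label in docs:
        counts[label].update(tokens)
        ndocs[label] += 1
    model = {}
    for label in counts:
        total = sum(counts[label].values())
        model[label] = {
            "prior": math.log(ndocs[label] / len(docs)),
            "logp": {w: math.log((counts[label][w] + 1) / (total + len(vocab)))
                     for w in vocab},
        }
    return model

def classify(model, tokens):
    # Words outside the training vocabulary are simply ignored (contribution 0.0).
    def score(label):
        m = model[label]
        return m["prior"] + sum(m["logp"].get(w, 0.0) for w in tokens)
    return max(model, key=score)

# Hypothetical toy corpus.
train = [
    (["win", "cash", "now"], "spam"),
    (["free", "cash", "prize"], "spam"),
    (["meeting", "at", "noon"], "ham"),
    (["lunch", "at", "noon"], "ham"),
]
model = train_nb(train)
print(classify(model, ["free", "cash"]))      # spam
print(classify(model, ["meeting", "lunch"]))  # ham
```
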

  15. Ability of crime, demographic and business data to forecast areas of increased violence.

    PubMed

    Bowen, Daniel A; Mercer Kollar, Laura M; Wu, Daniel T; Fraser, David A; Flood, Charles E; Moore, Jasmine C; Mays, Elizabeth W; Sumner, Steven A

    2018-05-24

    Identifying geographic areas and time periods of increased violence is of considerable importance in prevention planning. This study compared the performance of multiple data sources to prospectively forecast areas of increased interpersonal violence. We used 2011-2014 data from a large metropolitan county on interpersonal violence (homicide, assault, rape and robbery) and forecasted violence at the level of census block-groups and over a one-month moving time window. Inputs to a Random Forest model included historical crime records from the police department, demographic data from the US Census Bureau, and administrative data on licensed businesses. Among 279 block groups, a model utilizing all data sources was found to prospectively improve the identification of the top 5% most violent block-group months (positive predictive value = 52.1%; negative predictive value = 97.5%; sensitivity = 43.4%; specificity = 98.2%). Predictive modelling with simple inputs can help communities more efficiently focus violence prevention resources geographically.
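The four reported metrics follow directly from a binary confusion matrix over flagged block-group months; a sketch with hypothetical counts (not the study's data):

```python
def forecast_metrics(tp, fp, fn, tn):
    """PPV, NPV, sensitivity, and specificity from a binary confusion matrix."""
    return {
        "ppv": tp / (tp + fp),                  # of flagged areas, how many were violent
        "npv": tn / (tn + fn),                  # of unflagged areas, how many were not
        "sensitivity": tp / (tp + fn),          # violent areas correctly flagged
        "specificity": tn / (tn + fp),          # non-violent areas correctly unflagged
    }

# Hypothetical counts for "top 5% most violent" block-group months.
m = forecast_metrics(tp=25, fp=23, fn=33, tn=1200)
print({k: round(v, 3) for k, v in m.items()})
```
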

  16. Numerical comparison of grid pattern diffraction effects through measurement and modeling with OptiScan software

    NASA Astrophysics Data System (ADS)

    Murray, Ian B.; Densmore, Victor; Bora, Vaibhav; Pieratt, Matthew W.; Hibbard, Douglas L.; Milster, Tom D.

    2011-06-01

    Coatings of various metalized patterns are used for heating and electromagnetic interference (EMI) shielding applications. Previous work has focused on macro differences between different types of grids, and has shown good correlation between measurements and analyses of grid diffraction. To advance this work, we have utilized the University of Arizona's OptiScan software, which has been optimized for this application by applying Babinet's principle. When operating on an appropriate computer system, this algorithm produces results hundreds of times faster than standard Fourier-based methods, and allows realistic cases to be modeled for the first time. By using previously published derivations by Exotic Electro-Optics, we compare diffraction performance of repeating and randomized grid patterns with equivalent sheet resistance using numerical performance metrics. Grid patterns of each type are printed on optical substrates and measured energy is compared against modeled energy.

  17. Friendship Dissolution Within Social Networks Modeled Through Multilevel Event History Analysis

    PubMed Central

    Dean, Danielle O.; Bauer, Daniel J.; Prinstein, Mitchell J.

    2018-01-01

    A social network perspective can bring important insight into the processes that shape human behavior. Longitudinal social network data, measuring relations between individuals over time, have become increasingly common, as have the methods available to analyze such data. A friendship duration model utilizing discrete-time multilevel survival analysis with a multiple membership random effect structure is developed and applied here to study the processes leading to undirected friendship dissolution within a larger social network. While the modeling framework is introduced in terms of understanding friendship dissolution, it can be used to understand microlevel dynamics of a social network more generally. These models can be fit with standard generalized linear mixed-model software, after transforming the data to a pair-period data set. An empirical example highlights how the model can be applied to understand the processes leading to friendship dissolution between high school students, and a simulation study is used to test the use of the modeling framework under representative conditions that would be found in social network data. Advantages of the modeling framework are highlighted, and potential limitations and future directions are discussed. PMID:28463022
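The transformation to a pair-period data set mentioned in the abstract can be sketched as follows, assuming hypothetical friendship spells with a dissolution flag: each dyad contributes one row per observed period, with an event indicator in the period the tie ends.

```python
def to_pair_period(pairs):
    """Expand friendship spells into a discrete-time pair-period data set.

    pairs: list of (pair_id, start, end, dissolved), where end is the last
    observed period and dissolved flags whether the tie ended then
    (dissolved=False means the spell is right-censored).
    """
    rows = []
    for pair_id, start, end, dissolved in pairs:
        for t in range(start, end + 1):
            event = 1 if (dissolved and t == end) else 0
            rows.append({"pair": pair_id, "period": t, "event": event})
    return rows

# Hypothetical spells: pair A dissolves in period 3; pair B is censored after period 2.
data = to_pair_period([("A", 1, 3, True), ("B", 1, 2, False)])
print(len(data), sum(r["event"] for r in data))  # 5 rows, 1 dissolution event
```

A discrete-time survival model is then just a binary regression of `event` on pair- and period-level covariates over these rows.
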

  18. Implementation of a Smeared Crack Band Model in a Micromechanics Framework

    NASA Technical Reports Server (NTRS)

    Pineda, Evan J.; Bednarcyk, Brett A.; Waas, Anthony M.; Arnold, Steven M.

    2012-01-01

    The smeared crack band theory is implemented within the generalized method of cells and high-fidelity generalized method of cells micromechanics models to capture progressive failure within the constituents of a composite material while retaining objectivity with respect to the size of the discretization elements used in the model. A repeating unit cell containing 13 randomly arranged fibers is modeled and subjected to a combination of transverse tension/compression and transverse shear loading. The implementation is verified against experimental data (where available), and an equivalent finite element model utilizing the same implementation of the crack band theory. To evaluate the performance of the crack band theory within a repeating unit cell that is more amenable to a multiscale implementation, a single fiber is modeled with generalized method of cells and high-fidelity generalized method of cells using a relatively coarse subcell mesh which is subjected to the same loading scenarios as the multiple fiber repeating unit cell. The generalized method of cells and high-fidelity generalized method of cells models are validated against a very refined finite element model.

  19. Modeling Randomness in Judging Rating Scales with a Random-Effects Rating Scale Model

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Wilson, Mark; Shih, Ching-Lin

    2006-01-01

    This study presents the random-effects rating scale model (RE-RSM) which takes into account randomness in the thresholds over persons by treating them as random-effects and adding a random variable for each threshold in the rating scale model (RSM) (Andrich, 1978). The RE-RSM turns out to be a special case of the multidimensional random…

  20. Fuzzy association rule mining and classification for the prediction of malaria in South Korea.

    PubMed

    Buczak, Anna L; Baugher, Benjamin; Guven, Erhan; Ramac-Thomas, Liane C; Elbert, Yevgeniy; Babin, Steven M; Lewis, Sheri H

    2015-06-18

    Malaria is the world's most prevalent vector-borne disease. Accurate prediction of malaria outbreaks may lead to public health interventions that mitigate disease morbidity and mortality. We describe an application of a method for creating prediction models utilizing Fuzzy Association Rule Mining (FARM) to extract relationships between epidemiological, meteorological, climatic, and socio-economic data from Korea. These relationships are in the form of rules, from which the best set of rules is automatically chosen and forms a classifier. Two classifiers have been built and their results fused to become a malaria prediction model. Future malaria cases are predicted as Low, Medium or High, where these classes are defined as a total of 0-2, 3-16, and 17 or more cases, respectively, for a region in South Korea during a two-week period. Based on user recommendations, High is considered an outbreak. Model accuracy is described by Positive Predictive Value (PPV), Sensitivity, and F-score for each class, computed on test data not previously used to develop the model. For predictions made 7-8 weeks in advance, model PPV and Sensitivity are 0.842 and 0.681, respectively, for the High class. The F0.5 and F3 scores (which combine PPV and Sensitivity) are 0.804 and 0.694, respectively, for the High class. The overall FARM results (as measured by F-scores) are significantly better than those obtained by Decision Tree, Random Forest, Support Vector Machine, and Holt-Winters methods for the High class. For the Medium class, Random Forest and FARM obtain comparable results, with FARM being better at F0.5, and Random Forest obtaining a higher F3. A previously described method for creating disease prediction models has been modified and extended to build models for predicting malaria. In addition, some new input variables were used, including indicators of intervention measures. The South Korea malaria prediction models predict Low, Medium or High cases 7-8 weeks in the future. This paper demonstrates that our data-driven approach can be used for the prediction of different diseases.
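The F0.5 and F3 scores quoted in the abstract combine PPV (precision) and sensitivity (recall) with different weightings; plugging the reported High-class values into the standard F-beta formula reproduces them:

```python
def f_beta(ppv, sensitivity, beta):
    """F-beta score: beta < 1 weights PPV more heavily, beta > 1 weights sensitivity."""
    b2 = beta ** 2
    return (1 + b2) * ppv * sensitivity / (b2 * ppv + sensitivity)

# Reported High-class performance for predictions 7-8 weeks in advance.
ppv, sens = 0.842, 0.681
print(round(f_beta(ppv, sens, 0.5), 3))  # 0.804
print(round(f_beta(ppv, sens, 3), 3))    # 0.694
```
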

  1. Quality of community basic medical service utilization in urban and suburban areas in Shanghai from 2009 to 2014.

    PubMed

    Guo, Lijun; Bao, Yong; Ma, Jun; Li, Shujun; Cai, Yuyang; Sun, Wei; Liu, Qiaohong

    2018-01-01

    Urban areas usually display better health care services than rural areas, but data about suburban areas in China are lacking. Hence, this cross-sectional study compared the utilization of community basic medical services in Shanghai urban and suburban areas between 2009 and 2014. These data were used to improve the efficiency of community health service utilization and to provide a reference for solving the main health problems of the residents in urban and suburban areas of Shanghai. Using a two-stage random sampling method, questionnaires were completed by 73 community health service centers that were randomly selected from six districts that were also randomly selected from 17 counties in Shanghai. Descriptive statistics, principal component analysis, and forecast analysis were used to complete a gap analysis of basic health services utilization quality between urban and suburban areas. During the 6-year study period, there was an increasing trend toward greater efficiency of basic medical service provision, benefits of basic medical service provision, effectiveness of common chronic disease management, overall satisfaction of community residents, and two-way referral effects. In addition to the implementation effect of hypertension management and two-way referral, the remaining indicators showed a superior effect in urban areas compared with the suburbs (P<0.001). In addition, among the seven principal components, four principal component scores were better in urban areas than in suburban areas (P < 0.001, P = 0.004, P = 0.036, and P = 0.022). The urban comprehensive score also exceeded that of the suburbs (P<0.001). In summary, over the 6-year period, there was a rapidly increasing trend in basic medical service utilization. Comprehensive satisfaction clearly improved as well. 
There is a need for the health administrative department to address this imbalance between urban and suburban institutions and to provide the required support to underdeveloped areas to improve resident satisfaction.

  2. Quality of community basic medical service utilization in urban and suburban areas in Shanghai from 2009 to 2014

    PubMed Central

    Ma, Jun; Li, Shujun; Cai, Yuyang; Sun, Wei; Liu, Qiaohong

    2018-01-01

    Urban areas usually display better health care services than rural areas, but data about suburban areas in China are lacking. Hence, this cross-sectional study compared the utilization of community basic medical services in Shanghai urban and suburban areas between 2009 and 2014. These data were used to improve the efficiency of community health service utilization and to provide a reference for solving the main health problems of the residents in urban and suburban areas of Shanghai. Using a two-stage random sampling method, questionnaires were completed by 73 community health service centers that were randomly selected from six districts that were also randomly selected from 17 counties in Shanghai. Descriptive statistics, principal component analysis, and forecast analysis were used to complete a gap analysis of basic health services utilization quality between urban and suburban areas. During the 6-year study period, there was an increasing trend toward greater efficiency of basic medical service provision, benefits of basic medical service provision, effectiveness of common chronic disease management, overall satisfaction of community residents, and two-way referral effects. In addition to the implementation effect of hypertension management and two-way referral, the remaining indicators showed a superior effect in urban areas compared with the suburbs (P<0.001). In addition, among the seven principal components, four principal component scores were better in urban areas than in suburban areas (P < 0.001, P = 0.004, P = 0.036, and P = 0.022). The urban comprehensive score also exceeded that of the suburbs (P<0.001). In summary, over the 6-year period, there was a rapidly increasing trend in basic medical service utilization. Comprehensive satisfaction clearly improved as well. Nevertheless, there was an imbalance in health service utilization between urban and suburban areas. 
There is a need for the health administrative department to address this imbalance between urban and suburban institutions and to provide the required support to underdeveloped areas to improve resident satisfaction. PMID:29791470

  3. Dental utilization by active duty Army personnel.

    PubMed

    Chisick, M C

    1993-10-01

    In spring 1989, a random, Army-wide sample of 15,364 enlisted and 4,529 officer personnel was surveyed on dental utilization. Results show no difference in annual dental utilization between officer and enlisted personnel when age is controlled. Because annual dental utilization increases with age and enlisted ranks contain a disproportionately large number of younger personnel, a difference in annual dental utilization between enlisted and officer personnel emerges when age is not controlled. Check-ups are the most common reason for dental visits. Nearly all soldiers seek care exclusively in military dental clinics. Non-use is highest among 18- to 19-year-olds (12.2%).

  4. More Than A Meal? A Randomized Control Trial Comparing the Effects of Home-Delivered Meals Programs on Participants' Feelings of Loneliness.

    PubMed

    Thomas, Kali S; Akobundu, Ucheoma; Dosa, David

    2016-11-01

    Nutrition service providers are seeking alternative delivery models to control costs and meet the growing need for home-delivered meals. The objective of this study was to evaluate the extent to which the home-delivered meals program, and the type of delivery model, reduces homebound older adults' feelings of loneliness. This project utilizes data from a three-arm, fixed randomized control study conducted with 626 seniors on waiting lists at eight Meals on Wheels programs across the United States. Seniors were randomly assigned to either (i) receive daily meal delivery; (ii) receive once-weekly meal delivery; or (iii) remain on the waiting list. Participants were surveyed at baseline and again at 15 weeks. Analysis of covariance was used to test for differences in loneliness between groups over time, and logistic regression was used to assess differences in self-rated improvement in loneliness. Participants receiving meals had lower adjusted loneliness scores at follow-up compared with the control group. Individuals who received daily-delivered meals were more likely to self-report that home-delivered meals improved their loneliness than the group receiving once-weekly delivered meals. This article includes important implications for organizations that provide home-delivered meals in terms of cost, delivery modality, and potential recipient benefits. Published by Oxford University Press on behalf of the Gerontological Society of America 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.

  5. Tobacco use and mass media utilization in sub-Saharan Africa.

    PubMed

    Achia, Thomas N O

    2015-01-01

    Media utilization has been identified as an important determinant of tobacco use. We examined the association between self-reported tobacco use and frequency of mass media utilization by women and men in nine low-to middle-income sub-Saharan African countries. Data for the study came from Demographic and Health Surveys conducted in Burkina Faso, Ethiopia, Liberia, Lesotho, Malawi, Swaziland, Uganda, Zambia and Zimbabwe over the period 2006-2011. Each survey population was a cross-sectional sample of women aged 15-49 years and men aged 15-59 years, with information on tobacco use and media access being obtained by face-to-face interviews. An index of media utilization was constructed based on responses to questions on the frequency of reading newspapers, frequency of watching television and frequency of listening to the radio. Demographic and socioeconomic variables were considered as potentially confounding covariates. Logistic regression models with country and cluster specific random effects were estimated for the pooled data. The risk of cigarette smoking increased with greater utilization of mass media. The use of smokeless tobacco and tobacco use in general declined with greater utilization of mass media. The risk of tobacco use was 5% lower in women with high media utilization compared to those with low media utilization [Adjusted Odds Ratio (AOR) = 0.95, 95% confidence interval (CI): 0.82-1.00]. Men with a high media utilization were 21% less likely to use tobacco compared to those with low media utilization [AOR = 0.79, 95% CI = 0.73-0.85]. In the male sample, tobacco use also declined with the increased frequency of reading newspapers (or magazines), listening to radio and watching television. Mass media campaigns, conducted in the context of comprehensive tobacco control programmes, can reduce the prevalence of tobacco smoking in sub-Saharan Africa. 
The reach, intensity, duration and type of messages are important aspects of the campaigns but need to also address all forms of tobacco use.

  6. Robin Hood Effects on Motivation in Math: Family Interest Moderates the Effects of Relevance Interventions

    ERIC Educational Resources Information Center

    Häfner, Isabelle; Flunger, Barbara; Dicke, Anna-Lena; Gaspard, Hanna; Brisson, Brigitte M.; Nagengast, Benjamin; Trautwein, Ulrich

    2017-01-01

    Using a cluster randomized field trial, the present study tested whether 2 relevance interventions affected students' value beliefs, self-concept, and effort in math differently depending on family background (socioeconomic status, family interest (FI), and parental utility value). Eighty-two classrooms were randomly assigned to either 1 of 2…

  7. Randomized, Controlled Trial of a Comprehensive Program for Young Students with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Young, Helen E.; Falco, Ruth A.; Hanita, Makoto

    2016-01-01

    This randomized, controlled trial, comparing the Comprehensive Autism Program (CAP) and business as usual programs, studied outcomes for 3-5 year old students with autism spectrum disorder (ASD). Participants included 84 teachers and 302 students with ASD and their parents. CAP utilized specialized curricula and training components to implement…

  8. Lesson Study to Scale up Research-Based Knowledge: A Randomized, Controlled Trial of Fractions Learning

    ERIC Educational Resources Information Center

    Lewis, Catherine; Perry, Rebecca

    2017-01-01

    An understanding of fractions eludes many U.S. students, and research-based knowledge about fraction, such as the utility of linear representations, has not broadly influenced instruction. This randomized trial of lesson study supported by mathematical resources assigned 39 educator teams across the United States to locally managed lesson study…

  9. A Randomized Controlled Study Evaluating a Brief, Bystander Bullying Intervention with Junior High School Students

    ERIC Educational Resources Information Center

    Midgett, Aida; Doumas, Diana; Trull, Rhiannon; Johnston, April D.

    2017-01-01

    A randomized controlled study evaluated a brief, bystander bullying intervention for junior high school students. Students in both groups reported an increase in knowledge and confidence to act as defenders and to utilize strategies to intervene on behalf of victims of bullying. Findings suggest possible carry-over effects from the intervention…

  10. Information hiding based on double random-phase encoding and public-key cryptography.

    PubMed

    Sheng, Yuan; Xin, Zhou; Alam, Mohammed S; Xi, Lu; Xiao-Feng, Li

    2009-03-02

    A novel information hiding method based on double random-phase encoding (DRPE) and Rivest-Shamir-Adleman (RSA) public-key cryptosystem is proposed. In the proposed technique, the inherent diffusion property of DRPE is cleverly utilized to make up the diffusion insufficiency of RSA public-key cryptography, while the RSA cryptosystem is utilized for simultaneous transmission of the cipher text and the two phase-masks, which is not possible under the DRPE technique. This technique combines the complementary advantages of the DRPE and RSA encryption techniques and brings security and convenience for efficient information transmission. Extensive numerical simulation results are presented to verify the performance of the proposed technique.
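A 1-D numerical sketch of the DRPE step itself (the RSA transmission of the phase keys is omitted), assuming unit-modulus random phase masks and a naive DFT in place of the 2-D optical transforms:

```python
import cmath, math, random

def dft(x, inverse=False):
    """Naive discrete Fourier transform (adequate for this tiny demo)."""
    n = len(x)
    s = 1 if inverse else -1
    out = [sum(x[t] * cmath.exp(s * 2j * math.pi * k * t / n) for t in range(n))
           for k in range(n)]
    return [v / n for v in out] if inverse else out

random.seed(0)
n = 8
image = [random.random() for _ in range(n)]  # toy 1-D stand-in for the input image
m1 = [cmath.exp(2j * math.pi * random.random()) for _ in range(n)]  # input-plane mask
m2 = [cmath.exp(2j * math.pi * random.random()) for _ in range(n)]  # Fourier-plane mask

# Encrypt: mask the input, transform, mask in the Fourier domain, transform back.
cipher = dft([v * p for v, p in zip(dft([i * p for i, p in zip(image, m1)]), m2)],
             inverse=True)

# Decrypt by undoing each step with the conjugate masks (the keys that, in the
# paper's scheme, would be transmitted under RSA).
stage = [v * p.conjugate() for v, p in zip(dft(cipher), m2)]
recovered = [abs(v * p.conjugate()) for v, p in zip(dft(stage, inverse=True), m1)]
print(all(abs(r - i) < 1e-9 for r, i in zip(recovered, image)))  # True
```
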

  11. Speeding up image quality improvement in random phase-free holograms using ringing artifact characteristics.

    PubMed

    Nagahama, Yuki; Shimobaba, Tomoyoshi; Kakue, Takashi; Masuda, Nobuyuki; Ito, Tomoyoshi

    2017-05-01

    A holographic projector utilizes holography techniques. However, there are several barriers to realizing holographic projections. One is deterioration of hologram image quality caused by speckle noise and ringing artifacts. The combination of the random phase-free method and the Gerchberg-Saxton (GS) algorithm has improved the image quality of holograms. However, the GS algorithm requires significant computation time. We propose faster methods for image quality improvement of random phase-free holograms using the characteristics of ringing artifacts.

  12. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
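The SIMRAND idea, Monte Carlo simulation over alternative network paths with expert-assessed distributions and a cardinal utility function, can be sketched as follows; the two paths, the triangular distributions standing in for expert CDFs, and the square-root (risk-averse) utility are all hypothetical:

```python
import random

random.seed(42)

# Two hypothetical paths through an alternative network; each task outcome is a
# draw from a triangular distribution given as (low, mode, high).
paths = {
    "path_A": [(2.0, 4.0, 9.0), (1.0, 2.0, 3.0)],
    "path_B": [(1.0, 5.0, 6.0), (3.0, 4.0, 5.0)],
}

def utility(x):
    """Concave (risk-averse) cardinal utility over the measure of preference."""
    return x ** 0.5

def expected_utility(tasks, trials=20000):
    total = 0.0
    for _ in range(trials):
        outcome = sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        total += utility(outcome)
    return total / trials

scores = {name: expected_utility(tasks) for name, tasks in paths.items()}
best = max(scores, key=scores.get)
print(best)  # path_B: its outcome distribution has the higher expected utility
```
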

  13. Dense modifiable interconnections utilizing photorefractive volume holograms

    NASA Astrophysics Data System (ADS)

    Psaltis, Demetri; Qiao, Yong

    1990-11-01

    This report describes an experimental two-layer optical neural network built at Caltech. The system uses photorefractive volume holograms to implement dense, modifiable synaptic interconnections and liquid crystal light valves (LCLVs) to perform nonlinear thresholding operations. Kanerva's Sparse, Distributed Memory was implemented using this network and its ability to recognize handwritten characters of the alphabet (A-Z) has been demonstrated experimentally. According to Kanerva's model, the first layer has fixed, random weights of interconnections and the second layer is trained by the sum-of-outer-products rule. After training, the recognition rates of the network on the training set (104 patterns) and test set (520 patterns) are 100 and 50 percent, respectively.
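A minimal sketch of Kanerva's Sparse Distributed Memory as described: a fixed random first layer of hard locations activated by Hamming distance, and a second layer trained by the sum-of-outer-products rule. The dimensions are tiny and, as a demo convenience, the probe address is seeded as one hard location, so single-pattern recall is exact.

```python
import random

random.seed(1)
n_addr, n_loc, n_data, radius = 16, 50, 8, 6

# Hypothetical storage address and data word.
addr = [random.randint(0, 1) for _ in range(n_addr)]
data = [1, 0, 1, 1, 0, 0, 1, 0]

# Layer 1: fixed random hard locations; the probe address is included as one
# location so that at least one activates in this tiny demo.
locations = [list(addr)] + [[random.randint(0, 1) for _ in range(n_addr)]
                            for _ in range(n_loc - 1)]
counters = [[0] * n_data for _ in range(n_loc)]

def active(a):
    """Indices of hard locations within the Hamming-distance activation radius."""
    return [i for i, loc in enumerate(locations)
            if sum(x != y for x, y in zip(a, loc)) <= radius]

def write(a, d):
    """Layer 2 training, sum-of-outer-products: increment/decrement counters."""
    for i in active(a):
        for j, bit in enumerate(d):
            counters[i][j] += 1 if bit else -1

def read(a):
    """Recall: sum counters over active locations and threshold each bit."""
    sums = [sum(counters[i][j] for i in active(a)) for j in range(n_data)]
    return [1 if s > 0 else 0 for s in sums]

write(addr, data)
print(read(addr) == data)  # True: a single stored pattern is recalled exactly
```
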

  14. Choosing the best index for the average score intraclass correlation coefficient.

    PubMed

    Shieh, Gwowen

    2016-09-01

    The intraclass correlation coefficient ICC(2) index from a one-way random effects model is widely used to describe the reliability of mean ratings in behavioral, educational, and psychological research. Despite its apparent utility, the essential property of ICC(2) as a point estimator of the average score intraclass correlation coefficient is seldom mentioned. This article considers several potential measures and compares their performance with ICC(2). Analytical derivations and numerical examinations are presented to assess the bias and mean square error of the alternative estimators. The results suggest that more advantageous indices can be recommended over ICC(2) for their theoretical implication and computational ease.
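For reference, the ICC(2) index discussed here is computed from one-way random-effects ANOVA mean squares as (MSB - MSW)/MSB; a small sketch with hypothetical ratings (three targets, three raters each):

```python
def icc2_average_score(groups):
    """ICC(2) for average ratings from a one-way random-effects ANOVA:
    (MSB - MSW) / MSB, with k ratings per target (rows of `groups`)."""
    n, k = len(groups), len(groups[0])
    grand = sum(sum(g) for g in groups) / (n * k)
    means = [sum(g) / k for g in groups]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)       # between targets
    msw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g) / (n * (k - 1))
    return (msb - msw) / msb

# Hypothetical ratings: rows are targets, columns are raters.
ratings = [[1, 2, 3], [2, 3, 4], [7, 8, 9]]
print(round(icc2_average_score(ratings), 4))  # 0.9677
```

Here MSB = 31 and MSW = 1, so ICC(2) = 30/31, i.e. the mean ratings are highly reliable because targets differ far more than raters do within a target.
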

  15. Measuring Care-Related Quality of Life of Caregivers for Use in Economic Evaluations: CarerQol Tariffs for Australia, Germany, Sweden, UK, and US.

    PubMed

    Hoefman, Renske J; van Exel, Job; Brouwer, Werner B F

    2017-04-01

    Informal care is often not included in economic evaluations in healthcare, while the impact of caregiving can be relevant for cost-effectiveness recommendations from a societal perspective. The impact of informal care can be measured and valued with the CarerQol instrument, which measures the impact of informal care on seven important burden dimensions (CarerQol-7D) and values this in terms of general quality of life (CarerQol-VAS). The CarerQol can be included at the effect side of multi-criteria analyses of patient interventions or in cost-effectiveness or utility analysis of interventions targeted at caregivers. At present, utility scores based on relative utility weights for the CarerQol-7D are only available for the Netherlands. This study calculates CarerQol-7D tariffs for Australia, Germany, Sweden, UK, and US. Data were collected among the general population in Australia, Germany, Sweden, UK, and US by an Internet survey. Utility weights were collected with a discrete choice experiment with two unlabeled alternatives described in terms of the seven CarerQol-7D dimensions. An efficient experimental design with priors obtained from the Netherlands was used to create the choice sets. Data was analyzed with a panel mixed multinomial logit model with random parameters. In all five countries, the CarerQol-7D dimensions were significantly associated with the utility of informal care situations. Physical health problems were most strongly associated with the utility for informal care situations. The tariff was constructed by adding up the relative utility weights per category of all CarerQol-7D dimensions for each country separately. The CarerQol tariffs for Australia, Germany, Sweden, UK, and US facilitate the inclusion of informal care in economic evaluations.
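The tariff construction described, adding up the relative utility weights for the chosen category of each CarerQol-7D dimension, can be sketched with hypothetical weights (the published country tariffs differ; here the physical-health dimension is given the largest weight, echoing the abstract's finding):

```python
# Hypothetical relative utility weights (NOT the published tariffs) for the seven
# CarerQol-7D dimensions; larger weights indicate a better caregiving situation.
weights = {
    "fulfilment":       {"no": 0.0, "some": 6.0, "a lot": 10.0},
    "relational":       {"a lot": 0.0, "some": 8.0, "no": 13.0},
    "mental_health":    {"a lot": 0.0, "some": 9.0, "no": 16.0},
    "daily_activities": {"a lot": 0.0, "some": 7.0, "no": 12.0},
    "financial":        {"a lot": 0.0, "some": 5.0, "no": 9.0},
    "support":          {"no": 0.0, "some": 6.0, "a lot": 10.0},
    "physical_health":  {"a lot": 0.0, "some": 11.0, "no": 20.0},
}

def tariff(state):
    """Add up the relative utility weight for the chosen category of each dimension."""
    return sum(weights[dim][level] for dim, level in state.items())

best_state = {"fulfilment": "a lot", "relational": "no", "mental_health": "no",
              "daily_activities": "no", "financial": "no", "support": "a lot",
              "physical_health": "no"}
print(tariff(best_state))  # 90.0, the top of this hypothetical scale
```
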

  16. Use of the preconditioned conjugate gradient algorithm as a generic solver for mixed-model equations in animal breeding applications.

    PubMed

    Tsuruta, S; Misztal, I; Strandén, I

    2001-05-01

    The utility of the preconditioned conjugate gradient (PCG) algorithm with a diagonal preconditioner for solving mixed-model equations in animal breeding applications was evaluated on 16 test problems. The problems included single- and multiple-trait analyses, with data on beef, dairy, and swine ranging from small examples to national data sets. Multiple-trait models considered both low and high genetic correlations. Convergence was based on relative differences between left- and right-hand sides. The ordering of equations was fixed effects followed by random effects, with no special ordering within random effects. The PCG program implemented in double precision converged for all models; implemented in single precision, it did not converge for seven large models. The PCG and successive overrelaxation algorithms were subsequently compared on 13 of the test problems. The PCG algorithm was easy to implement with iteration on data for general models, whereas successive overrelaxation required specific programming for each set of models. On average, the PCG algorithm converged in three times fewer rounds of iteration than successive overrelaxation. With straightforward implementations, programs using PCG may be two or more times faster than those using successive overrelaxation, although they would use more memory than comparable successive overrelaxation implementations. Extensive optimization of either algorithm can influence these rankings. PCG with iteration on data, a diagonal preconditioner, and double precision may be the algorithm of choice for solving mixed-model equations when sufficient memory is available and ease of implementation is essential.
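A minimal sketch of the algorithm being evaluated: conjugate gradient with a diagonal (Jacobi) preconditioner, applied here to a small stand-in symmetric positive definite system rather than real mixed-model equations. Recomputing the true residual each round costs one extra matrix-vector product, which is acceptable for a sketch.

```python
import numpy as np

def pcg(A, b, tol=1e-10, max_iter=1000):
    """Preconditioned conjugate gradient with a diagonal (Jacobi) preconditioner.

    Solves A x = b for symmetric positive definite A. As in the paper's
    mixed-model setting, convergence is judged by the relative difference
    between left- and right-hand sides."""
    M_inv = 1.0 / np.diag(A)              # diagonal preconditioner
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for k in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(A @ x - b) / np.linalg.norm(b) < tol:
            return x, k + 1               # solution and rounds of iteration
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Toy SPD system standing in for mixed-model equations (not real breeding data).
rng = np.random.default_rng(0)
G = rng.standard_normal((50, 50))
A = G @ G.T + 50 * np.eye(50)             # well-conditioned SPD matrix
b = rng.standard_normal(50)
x, rounds = pcg(A, b)
print(rounds, np.allclose(A @ x, b, atol=1e-6))
```

The "iteration on data" aspect of the paper means the matrix-vector product `A @ p` is formed by reading records rather than storing A explicitly; the recurrence itself is unchanged.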

  17. Genetic Analysis of Milk Yield in First-Lactation Holstein Friesian in Ethiopia: A Lactation Average vs Random Regression Test-Day Model Analysis

    PubMed Central

    Meseret, S.; Tamir, B.; Gebreyohannes, G.; Lidauer, M.; Negussie, E.

    2015-01-01

    The development of effective genetic evaluations and selection of sires requires accurate estimates of genetic parameters for all economically important traits in the breeding goal. The main objective of this study was to assess the relative performance of the traditional lactation average model (LAM) against the random regression test-day model (RRM) in the estimation of genetic parameters and prediction of breeding values for Holstein Friesian herds in Ethiopia. The data consisted of 6,500 test-day (TD) records from 800 first-lactation Holstein Friesian cows that calved between 1997 and 2013. Covariance components were estimated using the average information restricted maximum likelihood method under a single-trait animal model. The estimate of heritability for first-lactation milk yield was 0.30 from LAM, whilst estimates from RRM ranged from 0.17 to 0.29 for the different stages of lactation. Genetic correlations between different TDs in first-lactation Holstein Friesian ranged from 0.37 to 0.99. The genetic correlation between milk yields at different TDs was less than unity, which indicates that the assumptions of LAM may not be optimal for accurate evaluation of the genetic merit of animals. A close look at estimated breeding values from both models showed that RRM had a higher standard deviation than LAM, indicating that the TD model makes more efficient use of TD information. Correlations of breeding values between models ranged from 0.90 to 0.96 for different groups of sires and cows, and marked re-rankings were observed among top sires and cows in moving from the traditional LAM to RRM evaluations. PMID:26194217

  18. Progressive Elaboration and Cross-Validation of a Latent Class Typology of Adolescent Alcohol Involvement in a National Sample

    PubMed Central

    Donovan, John E.; Chung, Tammy

    2015-01-01

    Objective: Most studies of adolescent drinking focus on single alcohol use behaviors (e.g., high-volume drinking, drunkenness) and ignore the patterning of adolescents’ involvement across multiple alcohol behaviors. The present latent class analyses (LCAs) examined a procedure for empirically determining multiple cut points on the alcohol use behaviors in order to establish a typology of adolescent alcohol involvement. Method: LCA was carried out on six alcohol use behavior indicators collected from 6,504 7th through 12th graders who participated in Wave I of the National Longitudinal Study of Adolescent Health (AddHealth). To move beyond dichotomous indicators, a “progressive elaboration” strategy was used, starting with six dichotomous indicators and then evaluating a series of models testing additional cut points on the ordinal indicators at progressively higher points for one indicator at a time. Analyses were performed on one random half-sample, and confirmatory LCAs were performed on the second random half-sample and in the Wave II data. Results: The final model consisted of four latent classes (never or non–current drinkers, low-intake drinkers, non–problem drinkers, and problem drinkers). Confirmatory LCAs in the second random half-sample from Wave I and in Wave II support this four-class solution. The means on the four latent classes were also generally ordered on an array of measures reflecting psychosocial risk for problem behavior. Conclusions: These analyses suggest that there may be four different classes or types of alcohol involvement among adolescents, and, more importantly, they illustrate the utility of the progressive elaboration strategy for moving beyond dichotomous indicators in latent class models. PMID:25978828

  19. Electronic communications and home blood pressure monitoring (e-BP) study: design, delivery, and evaluation framework.

    PubMed

    Green, Beverly B; Ralston, James D; Fishman, Paul A; Catz, Sheryl L; Cook, Andrea; Carlson, Jim; Tyll, Lynda; Carrell, David; Thompson, Robert S

    2008-05-01

    Randomized controlled trials have provided unequivocal evidence that treatment of hypertension decreases mortality and major disability from cardiovascular disease; however, blood pressure remains inadequately treated in most affected individuals. This large gap persists despite the facts that more than 90% of adults with hypertension have health insurance and that hypertension is the leading cause of visits to the doctor. New approaches are needed to improve hypertension care. The Electronic Communications and Home Blood Pressure Monitoring (e-BP) study is a three-arm randomized controlled trial designed to determine whether care based on the Chronic Care Model and delivered over the Internet improves hypertension care. The primary study outcomes are systolic and diastolic blood pressure and blood pressure control; secondary outcomes are medication adherence, patient self-efficacy, satisfaction and quality of life, and healthcare utilization and costs. Hypertensive patients receiving care at Group Health medical centers are eligible if they have uncontrolled blood pressure on two screening visits, access to the Web, and an e-mail address. Study participants are randomly assigned to three intervention groups: (a) usual care; (b) receipt of a home blood pressure monitor, with training in its use and in the Group Health secure patient website (with secure e-mail access to their healthcare provider, access to a shared medical record, prescription refill, and other services); or (c) all of the above plus pharmacist care management (collaborative care management among the patient, the pharmacist, and the patient's physician via the secure patient website and the electronic medical record). We will determine whether a new model of patient-centered care that leverages Web communications, self-monitoring, and collaborative care management improves hypertension control.
If this model proves successful and cost-effective, similar interventions could be used to improve the care of large numbers of patients with uncontrolled hypertension.

  20. Interrogating the topological robustness of gene regulatory circuits by randomization

    PubMed Central

    Levine, Herbert; Onuchic, Jose N.

    2017-01-01

    One of the most important roles of cells is to perform their cellular tasks properly for survival. Cells usually achieve robust functionality, for example, cell-fate decision-making and signal transduction, through multiple layers of regulation involving many genes. Despite the combinatorial complexity of gene regulation, its quantitative behavior has typically been studied on the basis of experimentally verified core gene regulatory circuitry, composed of a small set of important elements. It is still unclear how such a core circuit operates in the presence of many other regulatory molecules and in a crowded and noisy cellular environment. Here we report a new computational method, named random circuit perturbation (RACIPE), for interrogating the robust dynamical behavior of a gene regulatory circuit even without accurate measurements of circuit kinetic parameters. RACIPE generates an ensemble of random kinetic models corresponding to a fixed circuit topology and utilizes statistical tools to identify generic properties of the circuit. By applying RACIPE to simple toggle-switch-like motifs, we observed that the stable states of all models converge to experimentally observed gene state clusters even when the parameters are strongly perturbed. RACIPE was further applied to a proposed 22-gene network of the Epithelial-to-Mesenchymal Transition (EMT), from which we identified four experimentally observed gene states, including the states that are associated with two different types of hybrid Epithelial/Mesenchymal phenotypes. Our results suggest that the dynamics of a gene circuit are determined mainly by its topology, not by detailed circuit parameters. Our work provides a theoretical foundation for circuit-based systems biology modeling. We anticipate RACIPE to be a powerful tool for predicting and decoding circuit design principles in an unbiased manner, and for quantitatively evaluating the robustness and heterogeneity of gene expression. PMID:28362798
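The RACIPE idea, an ensemble of random kinetic models for one fixed topology, can be sketched on a two-gene toggle switch. The parameter ranges, the forward-Euler integration, and the state-classification rule below are illustrative assumptions, not the published method's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def hill_inh(u, u0, n):
    """Inhibitory Hill function: regulation of one gene by the other."""
    return 1.0 / (1.0 + (u / u0) ** n)

def simulate(params, init, steps=5000, dt=0.02):
    """Euler-integrate the mutual-inhibition toggle switch to a steady state."""
    gx, gy, kx, ky, x0h, y0h, n = params
    x, y = init
    for _ in range(steps):
        x += dt * (gx * hill_inh(y, y0h, n) - kx * x)
        y += dt * (gy * hill_inh(x, x0h, n) - ky * y)
    return x, y

states = []
for _ in range(100):                                    # ensemble of random models
    params = (rng.uniform(1, 100), rng.uniform(1, 100),  # production rates
              rng.uniform(0.1, 1), rng.uniform(0.1, 1),  # degradation rates
              rng.uniform(1, 50), rng.uniform(1, 50),    # Hill thresholds
              rng.integers(1, 6))                        # Hill coefficient
    init = (rng.uniform(0, 200), rng.uniform(0, 200))
    states.append(simulate(params, init))

# Classify each steady state by which gene dominates; despite strongly
# perturbed parameters, states cluster into the expected x-high/y-high groups.
labels = ["x_high" if x > y else "y_high" for x, y in states]
print({lab: labels.count(lab) for lab in set(labels)})
```

The full method additionally detects multistability per model (many initial conditions) and applies statistics such as clustering to the pooled stable states; this sketch keeps only one initial condition per model.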

  1. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing, and shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, with or without prestressing, fibre concrete, (ultra-)high-performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness, and ductility are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability, and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties, obtained from material tests, is accounted for in the random distributions. Furthermore, degradation of reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented methodology is illustrated by results from two probabilistic studies of different types of concrete structures related to practical applications and made from various materials (with parameters obtained from real material tests).
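The randomization approach can be illustrated with a Monte Carlo estimate of failure probability and the corresponding reliability index. The limit-state function below is an analytical stand-in for the nonlinear finite element model, and the distributions are invented for illustration, not taken from material tests.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(42)
N = 100_000

# Random material/load properties (illustrative distributions only).
f_t  = rng.lognormal(mean=np.log(4.0), sigma=0.15, size=N)  # tensile strength [MPa]
load = rng.normal(loc=2.8, scale=0.4, size=N)               # load effect [MPa]

# Stand-in limit-state function g = resistance - load; a real study would
# evaluate a nonlinear FE model for each random sample instead.
g = f_t - load
p_f = np.mean(g < 0)                       # estimated failure probability

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def inv_phi(p, lo=-10.0, hi=10.0):
    """Inverse standard normal CDF by bisection (phi is increasing)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

beta = -inv_phi(p_f)                       # reliability index
print(f"p_f = {p_f:.4f}, reliability index beta = {beta:.2f}")
```

This shows the two safety measures the abstract mentions side by side: the failure probability from the sample, and the reliability index obtained from it via the inverse standard normal.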

  2. Travel time to maternity care and its effect on utilization in rural Ghana: a multilevel analysis.

    PubMed

    Masters, Samuel H; Burstein, Roy; Amofah, George; Abaogye, Patrick; Kumar, Santosh; Hanlon, Michael

    2013-09-01

    Rates of neonatal and maternal mortality are high in Ghana. In-facility delivery and other maternal services could reduce this burden, yet utilization rates of key maternal services are relatively low, especially in rural areas. We tested a theoretical implication that travel time negatively affects the use of in-facility delivery and other maternal services. Empirically, we used geospatial techniques to estimate travel times between populations and health facilities. To account for uncertainty in Ghana Demographic and Health Survey cluster locations, we adopted a novel approach of treating the location selection as an imputation problem. We estimated a multilevel random-intercept logistic regression model. For rural households, we found that travel time had a significant effect on the likelihood of in-facility delivery and antenatal care visits, holding constant education, wealth, maternal age, facility capacity, female autonomy, and the season of birth. In contrast, a facility's capacity to provide sophisticated maternity care had no detectable effect on utilization. As the Ghanaian health network expands, our results suggest that increasing the availability of basic obstetric services and improving transport infrastructure may be important interventions. Copyright © 2013 Elsevier Ltd. All rights reserved.
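A toy simulation of the random-intercept structure used here: households are nested in clusters that share a cluster-level intercept, and an assumed negative travel-time coefficient lowers the probability of in-facility delivery. The coefficients and variance below are illustrative, not the study's estimates.

```python
import numpy as np

rng = np.random.default_rng(7)

n_clusters, n_per = 100, 30
u = rng.normal(0.0, 0.8, size=n_clusters)               # cluster random intercepts
travel_h = rng.uniform(0, 4, size=(n_clusters, n_per))  # travel time in hours

beta0, beta_t = 1.0, -0.6                               # assumed coefficients
logit = beta0 + beta_t * travel_h + u[:, None]          # random-intercept logit
p = 1.0 / (1.0 + np.exp(-logit))
delivered = rng.binomial(1, p)                          # in-facility delivery (0/1)

near = delivered[travel_h < 1].mean()
far  = delivered[travel_h > 3].mean()
print(f"in-facility delivery rate: <1 h travel {near:.2f}, >3 h travel {far:.2f}")
```

Fitting the model in the study runs this logic in reverse: given the observed outcomes, estimate the fixed coefficients and the variance of the cluster intercepts.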

  3. Progressive Failure of a Unidirectional Fiber-Reinforced Composite Using the Method of Cells: Discretization Objective Computational Results

    NASA Technical Reports Server (NTRS)

    Pineda, Evan J.; Bednarcyk, Brett A.; Waas, Anthony M.; Arnold, Steven M.

    2012-01-01

    The smeared crack band theory is implemented within the generalized method of cells and high-fidelity generalized method of cells micromechanics models to capture progressive failure within the constituents of a composite material while retaining objectivity with respect to the size of the discretization elements used in the model. A repeating unit cell containing 13 randomly arranged fibers is modeled and subjected to a combination of transverse tension/compression and transverse shear loading. The implementation is verified against experimental data (where available) and against an equivalent finite element model utilizing the same implementation of the crack band theory. To evaluate the performance of the crack band theory within a repeating unit cell that is more amenable to a multiscale implementation, a single fiber is modeled with the generalized method of cells and high-fidelity generalized method of cells using a relatively coarse subcell mesh, which is subjected to the same loading scenarios as the multiple-fiber repeating unit cell. The generalized method of cells and high-fidelity generalized method of cells models are validated against a very refined finite element model.

  4. A Representation for Gaining Insight into Clinical Decision Models

    PubMed Central

    Jimison, Holly B.

    1988-01-01

    For many medical domains, uncertainty and patient preferences are important components of decision making. Decision theory is useful as a representation for such medical models in computer decision aids, but the methodology has typically had poor performance in the areas of explanation and user interface. The additional representation of probabilities and utilities as random variables serves to provide a framework for graphical and textual insight into complicated decision models. The approach allows for efficient customization of a generic model that describes the general patient population of interest to a patient-specific model. Monte Carlo simulation is used to calculate the expected value of information and the sensitivity for each model variable, thus providing a metric for deciding what to emphasize in the graphics and text summary. The computer-generated explanation includes variables that are sensitive with respect to the decision or that deviate significantly from what is typically observed. These techniques serve to keep the assessment and explanation of the patient's decision model concise, allowing the user to focus on the most important aspects for that patient.
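The core idea, representing probabilities and utilities as random variables and using Monte Carlo simulation to gauge how sensitive the decision is to each input, can be sketched for a two-action decision. All distributions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 50_000

# Treat probabilities and utilities as random variables (illustrative priors).
p_disease  = rng.beta(20, 80, N)          # disease probability, ~0.20 with uncertainty
u_treat_d  = rng.normal(0.80, 0.05, N)    # utility: treat, diseased
u_treat_nd = rng.normal(0.95, 0.02, N)    # utility: treat, not diseased
u_none_d   = rng.normal(0.50, 0.10, N)    # utility: no treatment, diseased
u_none_nd  = rng.normal(1.00, 0.01, N)    # utility: no treatment, not diseased

# Expected utility of each action, per simulated draw of the model inputs.
eu_treat = p_disease * u_treat_d + (1 - p_disease) * u_treat_nd
eu_none  = p_disease * u_none_d  + (1 - p_disease) * u_none_nd

# The fraction of draws in which the preferred action flips is a simple
# sensitivity summary: inputs that drive flips deserve emphasis in explanation.
p_treat_best = np.mean(eu_treat > eu_none)
print(f"treatment preferred in {p_treat_best:.1%} of simulations")
```

A per-variable version of this (varying one input's distribution while fixing the others) yields the variable-level sensitivity metric the abstract describes for choosing what to show the user.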

  5. Application of hierarchical dissociated neural network in closed-loop hybrid system integrating biological and mechanical intelligence.

    PubMed

    Li, Yongcheng; Sun, Rong; Zhang, Bin; Wang, Yuechao; Li, Hongyi

    2015-01-01

    Neural networks are considered the origin of intelligence in organisms. In this paper, a new design of an intelligent system merging biological intelligence with artificial intelligence was created: a neural controller bidirectionally connected to an actual mobile robot to implement a novel vehicle. Two types of experimental preparations were utilized as the neural controller: 'random' and '4Q' (cultured neurons artificially divided into four interconnected parts) neural networks. Compared to the random cultures, the '4Q' cultures presented distinctly different activities, and the robot controlled by the '4Q' network presented better capabilities in search tasks. Our results showed that neural cultures can be successfully employed to control an artificial agent; the robot performed progressively better with repeated stimulation because of short-term plasticity. This work provides a new framework for investigating the bidirectional biological-artificial interface and for developing new strategies for future intelligent systems using these simplified model systems.

  6. A Randomized Study of Incentivizing HIV Testing for Parolees in Community Aftercare.

    PubMed

    Saxena, Preeta; Hall, Elizabeth A; Prendergast, Michael

    2016-04-01

    HIV risk behaviors are high in criminal justice populations, and more efforts are necessary to address them among criminal justice-involved substance abusers. This study examines the role of incentives in promoting HIV testing among parolees. Participants were randomly assigned to either an incentive (n = 104) or education group (control; n = 98), where the incentive group received a voucher for testing for HIV. Bivariate comparisons showed that a larger proportion of those in the incentive group received HIV testing (59% versus 47%), but this was not statistically significant (p = .09). However, in a multivariate logistic regression model controlling for covariates likely to influence HIV-testing behavior, those in the incentive group had increased odds of HIV testing in comparison to those in the education group (OR = 1.99, p < .05, CI [1.05, 3.78]). As the first of its kind, this study provides a foundation for further research on the utility of incentives in promoting HIV testing and other healthy behaviors in criminal justice populations.
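From the reported group sizes and testing proportions, the unadjusted odds ratio can be recomputed directly; the paper's OR of 1.99 comes from a covariate-adjusted model, so it differs from this crude value.

```python
# Reported group sizes and HIV-testing proportions from the study.
n_inc, p_inc = 104, 0.59     # incentive group
n_ctl, p_ctl = 98, 0.47      # education (control) group

odds_inc = p_inc / (1 - p_inc)       # odds of testing, incentive group
odds_ctl = p_ctl / (1 - p_ctl)       # odds of testing, control group
or_unadj = odds_inc / odds_ctl       # unadjusted (crude) odds ratio
print(f"unadjusted OR = {or_unadj:.2f}")   # 1.62
```

The gap between the crude OR (about 1.62) and the adjusted OR (1.99) reflects the covariate adjustment in the multivariate logistic model.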

  7. Future directions in physical activity intervention research: expanding our focus to sedentary behaviors, technology, and dissemination.

    PubMed

    Lewis, Beth A; Napolitano, Melissa A; Buman, Matthew P; Williams, David M; Nigg, Claudio R

    2017-02-01

    Despite the increased health risks of a sedentary lifestyle, only 49 % of American adults participate in physical activity (PA) at the recommended levels. In an effort to move the PA field forward, we briefly review three emerging areas of PA intervention research. First, new intervention research has focused on not only increasing PA but also on decreasing sedentary behavior. Researchers should utilize randomized controlled trials, common terminology, investigate which behaviors should replace sedentary behaviors, evaluate long-term outcomes, and focus across the lifespan. Second, technology has contributed to an increase in sedentary behavior but has also led to innovative PA interventions. PA technology research should focus on large randomized trials with evidence-based components, explore social networking and innovative apps, improve PA monitoring, consider the lifespan, and be grounded in theory. Finally, in an effort to maximize public health impact, dissemination efforts should address the RE-AIM model, health disparities, and intervention costs.

  8. Non-intrusive head movement analysis of videotaped seizures of epileptic origin.

    PubMed

    Mandal, Bappaditya; Eng, How-Lung; Lu, Haiping; Chan, Derrick W S; Ng, Yen-Ling

    2012-01-01

    In this work we propose a non-intrusive video analytic system for analyzing patients' body-part movements in an Epilepsy Monitoring Unit. The system utilizes skin color modeling, head/face pose template matching, and face detection to analyze and quantify head movements. Epileptic patients' heads are analyzed holistically to distinguish seizure movements from normal random movements. The patient is not required to wear any special clothing, markers, or sensors, so the system is totally non-intrusive. The user initializes the person-specific skin color and selects a few face/head poses in the initial frames. The system then tracks the head/face and extracts spatio-temporal features. Support vector machines are then used on these features to classify seizure-like movements versus normal random movements. Experiments were performed on numerous long-hour video sequences captured in an Epilepsy Monitoring Unit at a local hospital. The results demonstrate the feasibility of the proposed system for pediatric epilepsy monitoring and seizure detection.

  9. Future directions in physical activity intervention research: expanding our focus to sedentary behaviors, technology, and dissemination

    PubMed Central

    Napolitano, Melissa A.; Buman, Matthew P.; Williams, David M.; Nigg, Claudio R.

    2016-01-01

    Despite the increased health risks of a sedentary lifestyle, only 49 % of American adults participate in physical activity (PA) at the recommended levels. In an effort to move the PA field forward, we briefly review three emerging areas of PA intervention research. First, new intervention research has focused on not only increasing PA but also on decreasing sedentary behavior. Researchers should utilize randomized controlled trials, common terminology, investigate which behaviors should replace sedentary behaviors, evaluate long-term outcomes, and focus across the lifespan. Second, technology has contributed to an increase in sedentary behavior but has also led to innovative PA interventions. PA technology research should focus on large randomized trials with evidence-based components, explore social networking and innovative apps, improve PA monitoring, consider the lifespan, and be grounded in theory. Finally, in an effort to maximize public health impact, dissemination efforts should address the RE-AIM model, health disparities, and intervention costs. PMID:27722907

  10. Measurement of fault latency in a digital avionic miniprocessor

    NASA Technical Reports Server (NTRS)

    Mcgough, J. G.; Swern, F. L.

    1981-01-01

    The results of fault injection experiments utilizing a gate-level emulation of the central processor unit of the Bendix BDX-930 digital computer are presented. The failure detection coverage of comparison-monitoring and of a typical avionics CPU self-test program was determined. The specific tasks and experiments included: (1) inject randomly selected gate-level and pin-level faults and emulate six software programs, using comparison-monitoring to detect the faults; (2) based upon the derived empirical data, develop and validate a model of fault latency that will forecast a software program's detecting ability; (3) given a typical avionics self-test program, inject randomly selected faults at both the gate level and pin level and determine the proportion of faults detected; (4) determine why faults were undetected; (5) recommend how the emulation can be extended to multiprocessor systems such as SIFT; and (6) determine the proportion of faults detected by a uniprocessor BIT (built-in test) irrespective of self-test.

  11. Major Differences: Variations in Undergraduate and Graduate Student Mental Health and Treatment Utilization across Academic Disciplines

    ERIC Educational Resources Information Center

    Lipson, Sarah Ketchen; Zhou, Sasha; Wagner, Blake, III; Beck, Katie; Eisenberg, Daniel

    2016-01-01

    This article explores variations in mental health and service utilization across academic disciplines using a random sample of undergraduate and graduate students (N = 64,519) at 81 colleges and universities. We report prevalence of depression, anxiety, suicidality, and self-injury, and rates of help-seeking across disciplines, including results…

  12. Effect of Feedback and Training on Utility Usage among Adolescent Delinquents.

    ERIC Educational Resources Information Center

    Sexton, Richard E.; And Others

    The usefulness of providing specific information and a progress/feedback mechanism to control utility usage in community-based, halfway houses for dependent-neglected and for delinquent adolescents was explored. The investigation was carried out in a random sample of 12 houses of an Arizona facility, divided into equivalent groups of three houses.…

  13. Assessment of Social Media Utilization and Study Habit of Students of Tertiary Institutions in Katsina State

    ERIC Educational Resources Information Center

    Olutola, Adekunle Thomas; Olatoye, Olufunke Omotoke; Olatoye, Rafiu Ademola

    2016-01-01

    This study investigated assessment of social media utilization and study habits of students of tertiary institutions in Katsina State. The descriptive survey design was adopted for this study. Three hundred and eighty-one (381) students' of tertiary institutions in Katsina State were randomly selected for the study. Researchers'-designed…

  14. The Effects of Accelerated Math Utilization on Grade Equivalency Score at a Selected Elementary School

    ERIC Educational Resources Information Center

    Kariuki, Patrick; Gentry, Christi

    2010-01-01

    The purpose of this study was to examine the effects of Accelerated Math utilization on students' grade equivalency scores. Twelve students for both experimental and control groups were randomly selected from 37 students enrolled in math in grades four through six. The experimental group consisted of the students who actively participated in…

  15. A metabolic core model elucidates how enhanced utilization of glucose and glutamine, with enhanced glutamine-dependent lactate production, promotes cancer cell growth: The WarburQ effect

    PubMed Central

    Damiani, Chiara; Colombo, Riccardo; Gaglio, Daniela; Mastroianni, Fabrizia; Westerhoff, Hans Victor; Vanoni, Marco; Alberghina, Lilia

    2017-01-01

    Cancer cells share several metabolic traits, including aerobic production of lactate from glucose (the Warburg effect), extensive glutamine utilization, and impaired mitochondrial electron flow. It is still unclear how these metabolic rearrangements, which may involve different molecular events in different cells, contribute to a selective advantage for cancer cell proliferation. To ascertain which metabolic pathways are used to convert glucose and glutamine to balanced energy and biomass production, we performed systematic constraint-based simulations of a model of human central metabolism. Sampling of the feasible flux space allowed us to obtain a large number of randomly mutated cells simulated at different glutamine and glucose uptake rates. We observed that, in the limited subset of proliferating cells, most displayed fermentation of glucose to lactate in the presence of oxygen. At high utilization rates of glutamine, oxidative utilization of glucose was decreased, while the production of lactate from glutamine was enhanced. This emergent phenotype was observed only when the available carbon exceeded the amount that could be fully oxidized by the available oxygen. Under the latter conditions, standard Flux Balance Analysis indicated that this metabolic pattern is optimal for maximizing biomass and ATP production; that it requires the activity of a branched TCA cycle, in which glutamine-dependent reductive carboxylation contributes to the production of lipids and proteins; and that it is sustained by a variety of redox-controlled metabolic reactions. In a K-ras transformed cell line, we experimentally assessed glutamine-induced metabolic changes. We validated computational results through an extension of Flux Balance Analysis that allows prediction of metabolite variations. Taken together, these findings offer new understanding of the logic of the metabolic reprogramming that underlies cancer cell growth. PMID:28957320
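A constraint-based simulation of this kind can be sketched as a small flux balance analysis: a stylized network in which oxidative capacity is limited by oxygen uptake, so excess glucose is fermented to lactate even when oxygen is present. The stoichiometry and bounds are toy values, not the paper's model of human central metabolism.

```python
import numpy as np
from scipy.optimize import linprog

# Reactions: v = [glc_uptake, o2_uptake, oxidation, fermentation, lac_export, atp_drain]
# Stylized stoichiometry (illustrative only):
#   oxidation:    G + 6 O -> 30 ATP
#   fermentation: G       ->  2 ATP + LAC
S = np.array([
    # glc  o2  oxid  ferm  lacX  atpD
    [  1,   0,  -1,   -1,    0,    0],   # G balance
    [  0,   1,  -6,    0,    0,    0],   # O balance
    [  0,   0,  30,    2,    0,   -1],   # ATP balance
    [  0,   0,   0,    1,   -1,    0],   # LAC balance
], dtype=float)
bounds = [(0, 10), (0, 20), (0, None), (0, None), (0, None), (0, None)]
c = np.zeros(6); c[5] = -1.0             # maximize ATP drain (biomass proxy)

res = linprog(c, A_eq=S, b_eq=np.zeros(4), bounds=bounds)
glc, o2, oxid, ferm, lac, atp = res.x
print(f"ATP = {atp:.1f}; lactate secreted = {lac:.2f} despite available O2")
```

With glucose uptake capped at 10 and oxygen at 20, only 10/3 units of glucose can be fully oxidized; the optimum ferments the remaining 20/3 units to lactate, reproducing the "carbon exceeds oxidative capacity" condition described above.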

  16. Relationship of self-reported asthma severity and urgent health care utilization to psychological sequelae of the September 11, 2001 terrorist attacks on the World Trade Center among New York City area residents.

    PubMed

    Fagan, Joanne; Galea, Sandro; Ahern, Jennifer; Bonner, Sebastian; Vlahov, David

    2003-01-01

    Posttraumatic psychological stress may be associated with increases in somatic illness, including asthma, but the impact of the psychological sequelae of the September 11, 2001 terrorist attacks on physical illness has not been well documented. The authors assessed the relationship between the psychological sequelae of the attacks and asthma symptom severity and the utilization of urgent health care services for asthma since September 11. The authors performed a random digit dial telephone survey of adults in the New York City (NYC) metropolitan area 6 to 9 months after September 11, 2001. Two thousand seven hundred fifty-five demographically representative adults, including 364 asthmatics, were recruited. The authors assessed self-reported asthma symptom severity, emergency room (ER) visits, and unscheduled physician office visits for asthma since September 11. After adjustment for asthma measures before September 11, demographics, and event exposure, in multivariate models posttraumatic stress disorder (PTSD) was a significant predictor of self-reported moderate-to-severe asthma symptoms (OR = 3.4; CI = 1.2-9.4), of seeking care for asthma at an ER since September 11 (OR = 6.6; CI = 1.6-28.0), and of unscheduled physician visits for asthma since September 11 (OR = 3.6; CI = 1.1-11.5). The number of PTSD symptoms was also significantly related to moderate-to-severe asthma symptoms and unscheduled physician visits since September 11. Neither a panic attack on September 11 nor depression since September 11 was an independent predictor of asthma severity or utilization in multivariate models. PTSD related to the September 11 terrorist attacks contributed to symptom severity and the utilization of urgent health care services among asthmatics in the NYC metropolitan area.

  17. Intracesarean insertion of the Copper T380A versus 6 weeks postcesarean: a randomized clinical trial.

    PubMed

    Lester, Felicia; Kakaire, Othman; Byamugisha, Josaphat; Averbach, Sarah; Fortin, Jennifer; Maurer, Rie; Goldberg, Alisa

    2015-03-01

    To compare rates of Copper T380A intrauterine device (IUD) utilization and satisfaction with immediate versus delayed IUD insertion after cesarean delivery in Kampala, Uganda. This study was a randomized clinical trial of women undergoing cesarean section in Kampala, Uganda, who desired an IUD. Participants were randomly assigned to IUD insertion at the time of cesarean delivery or 6 weeks afterward. The primary outcome was IUD utilization at 6 months after delivery. Among 68 women who underwent randomization, an IUD was inserted in 100% (34/34) of the women in the immediate insertion group and in 53% (18/34) in the delayed group. IUD use at 6 months was higher in the immediate insertion group (93% vs. 50% after delayed insertion; p < .0001). Infection and expulsion were rare and did not differ between groups. When we pooled both groups and compared IUD users to nonusers, 91% (39/43) of IUD users were satisfied or very satisfied with their contraceptive method, compared to 44% (11/25) of nonusers (p < .0001). Women who chose not to be in the study or who had the IUD removed often did so because of perceived husband or community disapproval. Six-month utilization of an IUD after immediate insertion was significantly higher than after delayed insertion, without increased complications. Contraceptive satisfaction was significantly higher among IUD users than nonusers. Community and husband attitudes influence IUD utilization and continuation in Kampala, Uganda. This work is important because it shows the safety and efficacy of providing IUDs during cesarean section in a setting where access to any healthcare, including contraception, can be extremely limited outside of childbearing, and where the consequences of an unintended, closely spaced pregnancy after a cesarean section can be life threatening. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. A Randomized Phase 2 Trial of 177Lu Radiolabeled Anti-PSMA Monoclonal Antibody J591 in Patients with High-Risk Castrate, Biochemically Relapsed Prostate Cancer

    DTIC Science & Technology

    2014-09-01

    ... antibody (J591) against the external portion of PSMA that binds to viable PSMA-expressing cells and is internalized. Studies utilizing J591 ... metastatic CRPC to bone (Rad223, Xofigo®) leading to excitement within the field. A more tumor-targeted approach utilizing J591 is of increased ... has generated renewed scientific and clinical interest. In addition, recent studies utilizing J591-based immuno-PET imaging providing additional ...

  19. Economic Evaluation of Lipid-Lowering Therapy in the Secondary Prevention Setting in the Philippines.

    PubMed

    Tumanan-Mendoza, Bernadette A; Mendoza, Victor L

    2013-05-01

    To determine the cost-effectiveness of lipid-lowering therapy in the secondary prevention of cardiovascular events in the Philippines. A cost-utility analysis was performed by using Markov modeling in the secondary prevention setting. The models incorporated efficacy of lipid-lowering therapy demonstrated in randomized controlled trials and mortality rates obtained from local life tables. Average and incremental cost-effectiveness ratios were obtained for simvastatin, atorvastatin, pravastatin, and gemfibrozil. The costs of the following were included: medications, laboratory examinations, consultation and related expenses, and production losses. The costs were expressed in current or nominal prices as of the first quarter of 2010 (Philippine peso). Utility was expressed in quality-adjusted life-years gained. Sensitivity analyses were performed by using variations in the cost centers, discount rates, starting age, and differences in utility weights for stroke. In the analysis using the lower-priced generic counterparts, therapy using 40 mg simvastatin daily was the most cost-effective option compared with the other therapies, while pravastatin 40 mg daily was the most cost-effective alternative if the higher-priced innovator drugs were used. In all sensitivity analyses, gemfibrozil was strongly dominated by the statins. In secondary prevention, simvastatin or pravastatin were the most cost-effective options compared with atorvastatin and gemfibrozil in the Philippines. Gemfibrozil was strongly dominated by the statins. Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
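
The Markov modelling described above can be sketched as a small cohort simulation. The three states, transition probabilities, utility weights, horizon, and discount rate below are illustrative assumptions for exposition only, not the study's inputs.

```python
import numpy as np

# Toy three-state Markov cohort model (Well, Post-event, Dead).
# All numbers are hypothetical, not taken from the Philippine study.
P = np.array([          # annual transition probability matrix
    [0.90, 0.07, 0.03],
    [0.00, 0.85, 0.15],
    [0.00, 0.00, 1.00],
])
utility = np.array([0.85, 0.65, 0.0])   # QALY weight for each state
discount = 0.035                        # annual discount rate

cohort = np.array([1.0, 0.0, 0.0])      # everyone starts in "Well"
qalys = 0.0
for year in range(20):
    qalys += (cohort @ utility) / (1 + discount) ** year
    cohort = cohort @ P                 # advance the cohort one cycle
```

Running the same loop with per-state costs in place of utilities gives the cost side of the analysis; comparing two therapies then reduces to comparing their discounted totals, which is how average and incremental cost-effectiveness ratios are formed.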

  20. A review of the clinical utility of duloxetine in the treatment of diabetic peripheral neuropathic pain

    PubMed Central

    King, Jordan B; Schauerhamer, Marisa B; Bellows, Brandon K

    2015-01-01

    Diabetes mellitus is a worldwide epidemic with many long-term complications, with neuropathy being the most common. In particular, diabetic peripheral neuropathic pain (DPNP) can be one of the most distressing complications associated with diabetes, leading to decreases in physical and mental quality of life. Despite the availability of many effective medications, DPNP remains a challenge to treat, and the optimal sequencing of pharmacotherapy remains unknown. Currently, there are only three medications approved by the US Food and Drug Administration specifically for the management of DPNP. Duloxetine (DUL), a selective serotonin-norepinephrine reuptake inhibitor, is one of these. With the goal of optimizing pharmacotherapy use in the DPNP population, a review of the current literature was conducted and the clinical utility of DUL described. Along with early clinical trials, recently published observational studies and pharmacoeconomic models may be useful in guiding decision making by clinicians and managed care organizations. In real-world practice settings, DUL is associated with decreased or similar opioid utilization, increased medication adherence, and similar health care costs compared with the current standard of care. DUL has consistently been found to be a cost-effective option over short time horizons. Currently, the long-term cost-effectiveness of DUL is unknown. Evidence derived from randomized clinical trials, real-world observations, and economic models supports the use of DUL as a first-line treatment option from the perspective of the patient, clinician, and managed care payer. PMID:26309404

  1. A review of the clinical utility of duloxetine in the treatment of diabetic peripheral neuropathic pain.

    PubMed

    King, Jordan B; Schauerhamer, Marisa B; Bellows, Brandon K

    2015-01-01

    Diabetes mellitus is a worldwide epidemic with many long-term complications, with neuropathy being the most common. In particular, diabetic peripheral neuropathic pain (DPNP) can be one of the most distressing complications associated with diabetes, leading to decreases in physical and mental quality of life. Despite the availability of many effective medications, DPNP remains a challenge to treat, and the optimal sequencing of pharmacotherapy remains unknown. Currently, there are only three medications approved by the US Food and Drug Administration specifically for the management of DPNP. Duloxetine (DUL), a selective serotonin-norepinephrine reuptake inhibitor, is one of these. With the goal of optimizing pharmacotherapy use in the DPNP population, a review of the current literature was conducted and the clinical utility of DUL described. Along with early clinical trials, recently published observational studies and pharmacoeconomic models may be useful in guiding decision making by clinicians and managed care organizations. In real-world practice settings, DUL is associated with decreased or similar opioid utilization, increased medication adherence, and similar health care costs compared with the current standard of care. DUL has consistently been found to be a cost-effective option over short time horizons. Currently, the long-term cost-effectiveness of DUL is unknown. Evidence derived from randomized clinical trials, real-world observations, and economic models supports the use of DUL as a first-line treatment option from the perspective of the patient, clinician, and managed care payer.

  2. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling

    PubMed Central

    Zhou, Fuqun; Zhang, Aining

    2016-01-01

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data. PMID:27792152

  3. Using fixed-parameter and random-parameter ordered regression models to identify significant factors that affect the severity of drivers' injuries in vehicle-train collisions.

    PubMed

    Dabbour, Essam; Easa, Said; Haider, Murtaza

    2017-10-01

    This study attempts to identify significant factors that affect the severity of drivers' injuries when vehicles collide with trains at railroad grade crossings by analyzing the individual-specific heterogeneity related to those factors over a period of 15 years. Both fixed-parameter and random-parameter ordered regression models were used to analyze records of all vehicle-train collisions that occurred in the United States from January 1, 2001 to December 31, 2015. For fixed-parameter ordered models, both probit and negative log-log link functions were used. The latter function accounts for the fact that lower injury severity levels are more probable than higher ones. Separate models were developed for heavy and light-duty vehicles. Higher train and vehicle speeds, female drivers, and young drivers (below the age of 21 years) were found to be consistently associated with higher severity of drivers' injuries for both heavy and light-duty vehicles. Furthermore, favorable weather, light-duty trucks (including pickup trucks, panel trucks, mini-vans, vans, and sports-utility vehicles), and senior drivers (above the age of 65 years) were found to be consistently associated with higher severity of drivers' injuries for light-duty vehicles only. All other factors (e.g. air temperature, the type of warning devices, darkness conditions, and highway pavement type) were found to be temporally unstable, which may explain the conflicting findings of previous studies related to those factors. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. How to derive biological information from the value of the normalization constant in allometric equations.

    PubMed

    Kaitaniemi, Pekka

    2008-04-09

    Allometric equations are widely used in many branches of biological science. The potential information content of the normalization constant b in allometric equations of the form Y = bX^a has, however, remained largely neglected. To demonstrate the potential for utilizing this information, I generated a large number of artificial datasets that resembled those that are frequently encountered in biological studies, i.e., relatively small samples including measurement error or uncontrolled variation. The value of X was allowed to vary randomly within the limits describing different data ranges, and a was set to a fixed theoretical value. The constant b was set to a range of values describing the effect of a continuous environmental variable. In addition, a normally distributed random error was added to the values of both X and Y. Two different approaches were then used to model the data. The traditional approach estimated both a and b using a regression model, whereas an alternative approach set the exponent a at its theoretical value and only estimated the value of b. Both approaches produced virtually the same model fit with less than 0.3% difference in the coefficient of determination. Only the alternative approach was able to precisely reproduce the effect of the environmental variable, which was largely lost among noise variation when using the traditional approach. The results show how the value of b can be used as a source of valuable biological information if an appropriate regression model is selected.
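
The two estimation approaches compared above can be sketched in a few lines; the exponent, constant, sample size, and noise level are invented for illustration and do not reproduce the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated allometric data Y = b * X^a with multiplicative noise.
a_theory, b_true = 0.75, 2.0
X = rng.uniform(1.0, 10.0, size=50)
Y = b_true * X**a_theory * np.exp(rng.normal(0.0, 0.1, size=50))

# Traditional approach: estimate both a and b by log-log regression.
slope, intercept = np.polyfit(np.log(X), np.log(Y), 1)
a_est, b_est_free = slope, np.exp(intercept)

# Alternative approach: fix a at its theoretical value, estimate only b.
b_est_fixed = np.exp(np.mean(np.log(Y) - a_theory * np.log(X)))
```

Because the second approach spends no degrees of freedom on the exponent, variation in b_est_fixed across datasets reflects the environmental signal rather than sampling noise in an estimated slope, which is the paper's argument for fixing a.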

  5. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    PubMed

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA, and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to efficiently use these time-series MODIS datasets for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.

  6. A conceptual model for site-level ecology of the giant gartersnake (Thamnophis gigas) in the Sacramento Valley, California

    USGS Publications Warehouse

    Halstead, Brian J.; Wylie, Glenn D.; Casazza, Michael L.; Hansen, Eric C.; Scherer, Rick D.; Patterson, Laura C.

    2015-08-14

    Bayesian networks further provide a clear visual display of the model that facilitates understanding among various stakeholders (Marcot and others, 2001; Uusitalo, 2007). Empirical data and expert judgment can be combined, as continuous or categorical variables, to update knowledge about the system (Marcot and others, 2001; Uusitalo, 2007). Importantly, Bayesian network models allow inference from causes to consequences, but also from consequences to causes, so that data can inform the states of nodes (values of different random variables) in either direction (Marcot and others, 2001; Uusitalo, 2007). Because they can incorporate both decision nodes that represent management actions and utility nodes that quantify the costs and benefits of outcomes, Bayesian networks are ideally suited to risk analysis and adaptive management (Nyberg and others, 2006; Howes and others, 2010). Thus, Bayesian network models are useful in situations where empirical data are not available, such as questions concerning the responses of giant gartersnakes to management.

  7. Security of statistical data bases: invasion of privacy through attribute correlational modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palley, M.A.

    This study develops, defines, and applies a statistical technique for the compromise of confidential information in a statistical data base. Attribute Correlational Modeling (ACM) recognizes that the information contained in a statistical data base represents real world statistical phenomena. As such, ACM assumes correlational behavior among the database attributes. ACM proceeds to compromise confidential information through creation of a regression model, where the confidential attribute is treated as the dependent variable. The typical statistical data base may preclude the direct application of regression. In this scenario, the research introduces the notion of a synthetic data base, created through legitimate queries of the actual data base, and through proportional random variation of responses to these queries. The synthetic data base is constructed to resemble the actual data base as closely as possible in a statistical sense. ACM then applies regression analysis to the synthetic data base, and utilizes the derived model to estimate confidential information in the actual database.

  8. A quasi-Monte-Carlo comparison of parametric and semiparametric regression methods for heavy-tailed and non-normal data: an application to healthcare costs.

    PubMed

    Jones, Andrew M; Lomas, James; Moore, Peter T; Rice, Nigel

    2016-10-01

    We conduct a quasi-Monte-Carlo comparison of the recent developments in parametric and semiparametric regression methods for healthcare costs, both against each other and against standard practice. The population of English National Health Service hospital in-patient episodes for the financial year 2007-2008 (summed for each patient) is randomly divided into two equally sized subpopulations to form an estimation set and a validation set. Evaluating out-of-sample using the validation set, a conditional density approximation estimator shows considerable promise in forecasting conditional means, performing best for accuracy of forecasting and among the best four for bias and goodness of fit. The best performing model for bias is linear regression with square-root-transformed dependent variables, whereas a generalized linear model with square-root link function and Poisson distribution performs best in terms of goodness of fit. Commonly used models utilizing a log-link are shown to perform badly relative to other models considered in our comparison.
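
One of the comparators above, linear regression with a square-root-transformed dependent variable, can be sketched with NumPy alone. The cost-generating process and covariate here are synthetic assumptions, not the NHS episode data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
age = rng.uniform(20, 90, n)
# Heavy-tailed synthetic costs whose mean rises with age (100 + 10*age).
cost = rng.gamma(shape=2.0, scale=50 + 5 * age)

# OLS of sqrt(cost) on age.
X = np.column_stack([np.ones(n), age])
beta, *_ = np.linalg.lstsq(X, np.sqrt(cost), rcond=None)
fit = X @ beta
resid_var = np.var(np.sqrt(cost) - fit)

# Back-transform: E[cost|x] = E[sqrt(cost)|x]^2 + Var[sqrt(cost)|x],
# approximating the conditional variance by the residual variance.
pred = fit**2 + resid_var
```

The additive variance correction matters: squaring the fitted values alone systematically under-predicts, whereas with the correction (and OLS orthogonality) the predictions match the sample mean cost exactly, consistent with this model's strong bias performance in the comparison.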

  9. Community organizing goes to college: A practice-based model of community organizing to implement environmental strategies to reduce high-risk drinking on college campuses

    PubMed Central

    Wagoner, Kimberly G.; Rhodes, Scott D.; Lentz, Ashley W.; Wolfson, Mark

    2013-01-01

    Community organizing is a successful method to leverage resources and build community capacity to identify and intervene upon health issues. However, published accounts documenting the systematic facilitation of the process are limited. This qualitative analysis explored community organizing using data collected as part of the Study to Prevent Alcohol Related Consequences (SPARC), a randomized community trial of 10 North Carolina colleges focused on reducing consequences of high-risk drinking among college students. We sought to develop and confirm use of a community-organizing model, based in practice, illustrating an authentic process of organizing campus and community stakeholders for public health change. Using the grounded theory approach, we analyzed and interpreted data from three waves of individual interviews with full-time community organizers on five SPARC intervention campuses. A five-phase community-organizing model was developed and its use was confirmed. This model may serve as a practical guide for public health interventions utilizing community-organizing approaches. PMID:20530638

  10. Understanding Spatially Complex Segmental and Branch Anatomy Using 3D Printing: Liver, Lung, Prostate, Coronary Arteries, and Circle of Willis.

    PubMed

    Javan, Ramin; Herrin, Douglas; Tangestanipoor, Ardalan

    2016-09-01

    Three-dimensional (3D) manufacturing is shaping personalized medicine, in which radiologists can play a significant role, be it as consultants to surgeons for surgical planning or by creating powerful visual aids for communicating with patients, physicians, and trainees. This report illustrates the steps in development of custom 3D models that enhance the understanding of complex anatomy. We graphically designed 3D meshes or modified imported data from cross-sectional imaging to develop physical models targeted specifically for teaching complex segmental and branch anatomy. The 3D printing itself is easily accessible through online commercial services, and the models are made of polyamide or gypsum. Anatomic models of the liver, lungs, prostate, coronary arteries, and the Circle of Willis were created. These models have advantages that include customizable detail, relative low cost, full control of design focusing on subsegments, color-coding potential, and the utilization of cross-sectional imaging combined with graphic design. Radiologists have an opportunity to serve as leaders in medical education and clinical care with 3D printed models that provide beneficial interaction with patients, clinicians, and trainees across all specialties by proactively taking on the educator's role. Complex models can be developed to show normal anatomy or common pathology for medical educational purposes. There is a need for randomized trials, which radiologists can design, to demonstrate the utility and effectiveness of 3D printed models for teaching simple and complex anatomy, simulating interventions, measuring patient satisfaction, and improving clinical care. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  11. Brain MR image segmentation based on an improved active contour model

    PubMed Central

    Meng, Xiangrui; Gu, Wenya; Zhang, Jianwei

    2017-01-01

    It is often a difficult task to accurately segment brain magnetic resonance (MR) images with intensity inhomogeneity and noise. This paper introduces a novel level set method for simultaneous brain MR image segmentation and intensity inhomogeneity correction. To reduce the effect of noise, novel anisotropic spatial information, which can preserve more details of edges and corners, is proposed by incorporating the inner relationships among the neighbor pixels. Then the proposed energy function uses the multivariate Student's t-distribution to fit the distribution of the intensities of each tissue. Furthermore, the proposed model utilizes hidden Markov random fields to model the spatial correlation between neighboring pixels/voxels. The means of the multivariate Student's t-distribution can be adaptively estimated by multiplying a bias field to reduce the effect of intensity inhomogeneity. In the end, we reconstructed the energy function to be convex and calculated it by using the Split Bregman method, which allows random initialization of our framework and thereby fully automated application. Our method can obtain the final result in less than 1 second for a 2D image of size 256 × 256 and less than 300 seconds for a 3D image of size 256 × 256 × 171. The proposed method was compared to other state-of-the-art segmentation methods using both synthetic and clinical brain MR images and improved the accuracy of the results by more than 3%. PMID:28854235

  12. Box–Cox Transformation and Random Regression Models for Fecal egg Count Data

    PubMed Central

    da Silva, Marcos Vinícius Gualberto Barbosa; Van Tassell, Curtis P.; Sonstegard, Tad S.; Cobuci, Jaime Araujo; Gasbarre, Louis C.

    2012-01-01

    Accurate genetic evaluation of livestock is based on appropriate modeling of phenotypic measurements. In ruminants, fecal egg count (FEC) is commonly used to measure resistance to nematodes. FEC values are not normally distributed and logarithmic transformations have been used in an effort to achieve normality before analysis. However, the transformed data are often still not normally distributed, especially when data are extremely skewed. A series of repeated FEC measurements may provide information about the population dynamics of a group or individual. A total of 6375 FEC measures were obtained for 410 animals between 1992 and 2003 from the Beltsville Agricultural Research Center Angus herd. Original data were transformed using an extension of the Box–Cox transformation to approach normality and to estimate (co)variance components. We also proposed using random regression models (RRM) for genetic and non-genetic studies of FEC. Phenotypes were analyzed using RRM and restricted maximum likelihood. Within the different orders of Legendre polynomials used, those with more parameters (order 4) fitted the FEC data best. Results indicated that transformation of FEC data using the Box–Cox transformation family was effective in reducing skewness and kurtosis and dramatically increased estimates of heritability, and that measurements of FEC obtained between weeks 12 and 26 of the 26-week experimental challenge period are genetically correlated. PMID:22303406

  13. Box-Cox Transformation and Random Regression Models for Fecal egg Count Data.

    PubMed

    da Silva, Marcos Vinícius Gualberto Barbosa; Van Tassell, Curtis P; Sonstegard, Tad S; Cobuci, Jaime Araujo; Gasbarre, Louis C

    2011-01-01

    Accurate genetic evaluation of livestock is based on appropriate modeling of phenotypic measurements. In ruminants, fecal egg count (FEC) is commonly used to measure resistance to nematodes. FEC values are not normally distributed and logarithmic transformations have been used in an effort to achieve normality before analysis. However, the transformed data are often still not normally distributed, especially when data are extremely skewed. A series of repeated FEC measurements may provide information about the population dynamics of a group or individual. A total of 6375 FEC measures were obtained for 410 animals between 1992 and 2003 from the Beltsville Agricultural Research Center Angus herd. Original data were transformed using an extension of the Box-Cox transformation to approach normality and to estimate (co)variance components. We also proposed using random regression models (RRM) for genetic and non-genetic studies of FEC. Phenotypes were analyzed using RRM and restricted maximum likelihood. Within the different orders of Legendre polynomials used, those with more parameters (order 4) fitted the FEC data best. Results indicated that transformation of FEC data using the Box-Cox transformation family was effective in reducing skewness and kurtosis and dramatically increased estimates of heritability, and that measurements of FEC obtained between weeks 12 and 26 of the 26-week experimental challenge period are genetically correlated.
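
The normalization step can be reproduced with SciPy's one-parameter Box-Cox transform. Note the study uses an extended Box-Cox family; the skewed synthetic counts below merely imitate FEC-like data and are not the Angus-herd records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Skewed, strictly positive values imitating fecal egg counts.
fec = rng.lognormal(mean=2.0, sigma=1.0, size=500)

# Box-Cox picks the power lambda that best normalizes the data.
transformed, lam = stats.boxcox(fec)

skew_before = stats.skew(fec)
skew_after = stats.skew(transformed)
```

Real FEC data contain zeros, which scipy's boxcox (strictly positive input only) cannot handle; that limitation is one motivation for extended transformation families like the one used in the paper.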

  14. Random effects coefficient of determination for mixed and meta-analysis models

    PubMed Central

    Demidenko, Eugene; Sargent, James; Onega, Tracy

    2011-01-01

    The key feature of a mixed model is the presence of random effects. We have developed a coefficient, called the random effects coefficient of determination, R_r^2, that estimates the proportion of the conditional variance of the dependent variable explained by random effects. This coefficient takes values from 0 to 1 and indicates how strong the random effects are. The difference from the earlier suggested fixed effects coefficient of determination is emphasized. If R_r^2 is close to 0, there is weak support for random effects in the model because the reduction of the variance of the dependent variable due to random effects is small; consequently, random effects may be ignored and the model simplifies to standard linear regression. A value of R_r^2 away from 0 indicates evidence of variance reduction in support of the mixed model. If the random effects coefficient of determination is close to 1, the variance of the random effects is very large and the random effects turn into free fixed effects; the model can then be estimated using the dummy variable approach. We derive explicit formulas for R_r^2 in three special cases: the random intercept model, the growth curve model, and the meta-analysis model. Theoretical results are illustrated with three mixed model examples: (1) travel time to the nearest cancer center for women with breast cancer in the U.S., (2) cumulative time watching alcohol-related scenes in movies among young U.S. teens, as a risk factor for early drinking onset, and (3) the classic example of the meta-analysis model for combination of 13 studies on tuberculosis vaccine. PMID:23750070
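
For intuition, the proportion of conditional variance explained by random intercepts can be estimated from grouped data by a simple method-of-moments ratio. This is a didactic sketch under invented variances, not the paper's exact estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n_groups, n_per = 200, 10
b = rng.normal(0.0, 2.0, n_groups)                        # random intercepts, variance 4
y = b[:, None] + rng.normal(0.0, 1.0, (n_groups, n_per))  # residual variance 1

sigma2_hat = y.var(axis=1, ddof=1).mean()      # within-group residual variance
var_means = y.mean(axis=1).var(ddof=1)         # between-group variance of group means
sigma_b2_hat = var_means - sigma2_hat / n_per  # moment estimate of intercept variance

# Share of conditional variance due to random effects (true value 4/5 = 0.8).
r2_r = sigma_b2_hat / (sigma_b2_hat + sigma2_hat)
```

A ratio near 0 would suggest the random intercepts can be dropped (ordinary regression suffices); a ratio near 1 suggests treating the groups as fixed effects via dummy variables, mirroring the interpretation in the abstract.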

  15. Competency-Based Training and Worker Turnover in Community Supports for People with IDD: Results from a Group Randomized Controlled Study

    ERIC Educational Resources Information Center

    Bogenschutz, Matthew; Nord, Derek; Hewitt, Amy

    2015-01-01

    Turnover among direct support professionals (DSPs) in community support settings for individuals with intellectual and developmental disabilities (IDD) has been regarded as a challenge since tracking of this workforce began in the 1980s. This study utilized a group randomized controlled design to test the effects of a competency-based training…

  16. Experiences Recruiting Indian Worksites for an Integrated Health Protection and Health Promotion Randomized Control Trial in Maharashtra, India

    ERIC Educational Resources Information Center

    Shulman Cordeira, L.; Pednekar, M. S.; Nagler, E. M.; Gautam, J.; Wallace, L.; Stoddard, A. M.; Gupta, P. C.; Sorensen, G. C.

    2015-01-01

    This article provides an overview of the recruitment strategies utilized in the Mumbai Worksites Tobacco Control Study, a cluster randomized trial testing the effectiveness of an integrated tobacco control and occupational safety and health program in Indian manufacturing worksites. From June 2012 to June 2013, 20 companies were recruited.…

  17. Impact of a Social-Emotional and Character Development Program on School-Level Indicators of Academic Achievement, Absenteeism, and Disciplinary Outcomes: A Matched-Pair, Cluster-Randomized, Controlled Trial

    ERIC Educational Resources Information Center

    Snyder, Frank; Flay, Brian; Vuchinich, Samuel; Acock, Alan; Washburn, Isaac; Beets, Michael; Li, Kin-Kit

    2010-01-01

    This article reports the effects of a comprehensive elementary school-based social-emotional and character education program on school-level achievement, absenteeism, and disciplinary outcomes utilizing a matched-pair, cluster-randomized, controlled design. The "Positive Action" Hawai'i trial included 20 racially/ethnically diverse…

  18. Statics and Dynamics of Selfish Interactions in Distributed Service Systems

    PubMed Central

    Altarelli, Fabrizio; Braunstein, Alfredo; Dall’Asta, Luca

    2015-01-01

    We study a class of games which models the competition among agents to access some service provided by distributed service units and which exhibits congestion and frustration phenomena when service units have limited capacity. We propose a technique, based on the cavity method of statistical physics, to characterize the full spectrum of Nash equilibria of the game. The analysis reveals a large variety of equilibria, with very different statistical properties. Natural selfish dynamics, such as best-response, usually tend to large-utility equilibria, even though those of smaller utility are exponentially more numerous. Interestingly, the latter actually can be reached by selecting the initial conditions of the best-response dynamics close to the saturation limit of the service unit capacities. We also study a more realistic stochastic variant of the game by means of a simple and effective approximation of the average over the random parameters, showing that the properties of the average-case Nash equilibria are qualitatively similar to the deterministic ones. PMID:26177449

  19. Effectiveness of Reablement: A Systematic Review.

    PubMed

    Tessier, Annie; Beaulieu, Marie-Dominique; Mcginn, Carrie Anna; Latulippe, Renée

    2016-05-01

    The ageing of the population and the increasing need for long-term care services are global issues. Some countries have adapted homecare programs by introducing an intervention called reablement, which is aimed at optimizing independence. The effectiveness of reablement, as well as its different service models, was examined. A systematic literature review was conducted using MEDLINE, CINAHL, PsycINFO and EBM Reviews to search from 2001 to 2014. Core characteristics and facilitators of reablement implementation were identified from international experiences. Ten studies comprising a total of 14,742 participants (including four randomized trials, most of excellent or good quality) showed a positive impact of reablement, especially on health-related quality of life and service utilization. The implementation of reablement was studied in three regions, and all observed a reduction in healthcare service utilization. Considering its effectiveness and positive impact observed in several countries, the implementation of reablement is a promising avenue to be pursued by policy makers. Copyright © 2016 Longwoods Publishing.

  20. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

    Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
