Sample records for interval linear programming

  1. FSILP: fuzzy-stochastic-interval linear programming for supporting municipal solid waste management.

    PubMed

    Li, Pu; Chen, Bing

    2011-04-01

    Although many studies on municipal solid waste (MSW) management have been conducted under the coexisting uncertainties of fuzzy, stochastic, and interval information, the conventional linear programming solutions that integrate the fuzzy method with the other two have been inefficient. In this study, a fuzzy-stochastic-interval linear programming (FSILP) method is developed by integrating Nguyen's method with conventional linear programming for supporting municipal solid waste management. Nguyen's method was used to convert the fuzzy and fuzzy-stochastic linear programming problems into conventional linear programs by measuring the attainment values of fuzzy numbers and/or fuzzy random variables, as well as the superiority and inferiority between triangular fuzzy numbers/triangular fuzzy-stochastic variables. The developed method can effectively tackle uncertainties described in terms of probability density functions, fuzzy membership functions, and discrete intervals. Moreover, the method improves upon the conventional interval fuzzy programming and two-stage stochastic programming approaches, achieving comparable capabilities with fewer constraints and significantly reduced computation time. The developed model was applied to a case study of a municipal solid waste management system in a city. The results indicated that reasonable solutions had been generated. The solution can help quantify the relationship between the change of system cost and the uncertainties, which could support further analysis of tradeoffs between the waste management cost and the system failure risk. Copyright © 2010 Elsevier Ltd. All rights reserved.

  2. Life cycle cost optimization of biofuel supply chains under uncertainties based on interval linear programming.

    PubMed

    Ren, Jingzheng; Dong, Liang; Sun, Lu; Goodsite, Michael Evan; Tan, Shiyu; Dong, Lichun

    2015-01-01

    The aim of this work was to develop a model for optimizing the life cycle cost of a biofuel supply chain under uncertainties. Multiple agriculture zones, multiple transportation modes for the transport of grain and biofuel, multiple biofuel plants, and multiple market centers were considered in this model, and the prices of the resources, the yield of grain, and the market demands were regarded as interval numbers instead of constants. An interval linear programming model was developed, and a method for solving it was presented. An illustrative case was studied using the proposed model, and the results showed that the proposed model is feasible for designing biofuel supply chains under uncertainties. Copyright © 2015 Elsevier Ltd. All rights reserved.
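
    The abstract does not detail the solution method, but interval LPs of this kind are commonly handled by solving a best-case and a worst-case deterministic sub-problem. A minimal sketch on a hypothetical one-variable problem follows; all numbers and names are illustrative, not from the paper.

```python
# Two-sub-model sketch of interval linear programming (ILP):
# maximize c*x  subject to  a*x <= b, x >= 0,
# where c, a, b are intervals (lo, hi). The optimum of this
# one-variable LP sits at x = b/a, so the best-case objective
# uses c_hi, b_hi, a_lo and the worst case uses c_lo, b_lo, a_hi.

def solve_interval_lp(c, a, b):
    """c, a, b are (lo, hi) tuples with a_lo > 0; returns the
    interval (f_lo, f_hi) of optimal objective values."""
    c_lo, c_hi = c
    a_lo, a_hi = a
    b_lo, b_hi = b
    x_best = b_hi / a_lo          # most permissive constraint
    x_worst = b_lo / a_hi         # most restrictive constraint
    f_hi = c_hi * x_best          # best-case optimum
    f_lo = c_lo * x_worst         # worst-case optimum
    return f_lo, f_hi

# Illustrative data: c in [2, 3], a in [1, 2], b in [4, 6]
f_lo, f_hi = solve_interval_lp((2, 3), (1, 2), (4, 6))
print(f_lo, f_hi)   # -> 4.0 18.0
```

    For multi-variable problems the same idea applies, but each sub-model is a full deterministic LP solved with an ordinary solver.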

  3. A generalized interval fuzzy mixed integer programming model for a multimodal transportation problem under uncertainty

    NASA Astrophysics Data System (ADS)

    Tian, Wenli; Cao, Chengxuan

    2017-03-01

    A generalized interval fuzzy mixed integer programming model is proposed for the multimodal freight transportation problem under uncertainty, in which the optimal mode of transport and the optimal amount of each type of freight transported through each path need to be decided. For practical purposes, three mathematical methods, i.e. the interval ranking method, fuzzy linear programming method and linear weighted summation method, are applied to obtain equivalents of constraints and parameters, and then a fuzzy expected value model is presented. A heuristic algorithm based on a greedy criterion and the linear relaxation algorithm are designed to solve the model.
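
    The interval ranking step mentioned above can be illustrated with one common ranking rule: compare interval midpoints and break ties by preferring the narrower (less uncertain) interval. This is a generic sketch, not necessarily the exact ranking method used in the paper.

```python
# Minimal interval ranking of the kind used to turn interval
# coefficients into a deterministic ordering. Midpoint first,
# then width as a tie-breaker (narrower = less uncertain).

def midpoint(iv):
    lo, hi = iv
    return (lo + hi) / 2

def width(iv):
    lo, hi = iv
    return hi - lo

def rank_intervals(intervals):
    """Sort intervals ascending by midpoint, then by width."""
    return sorted(intervals, key=lambda iv: (midpoint(iv), width(iv)))

# Illustrative interval costs for three transport options:
costs = [(8, 12), (5, 9), (6, 14)]
print(rank_intervals(costs))   # -> [(5, 9), (8, 12), (6, 14)]
```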

  4. A novel approach based on preference-based index for interval bilevel linear programming problem.

    PubMed

    Ren, Aihong; Wang, Yuping; Xue, Xingsi

    2017-01-01

    This paper proposes a new methodology for solving the interval bilevel linear programming problem in which all coefficients of both objective functions and constraints are considered as interval numbers. In order to keep as much uncertainty of the original constraint region as possible, the original problem is first converted into an interval bilevel programming problem with interval coefficients in both objective functions only through normal variation of interval number and chance-constrained programming. With the consideration of different preferences of different decision makers, the concept of the preference level that the interval objective function is preferred to a target interval is defined based on the preference-based index. Then a preference-based deterministic bilevel programming problem is constructed in terms of the preference level and the order relation [Formula: see text]. Furthermore, the concept of a preference δ -optimal solution is given. Subsequently, the constructed deterministic nonlinear bilevel problem is solved with the help of estimation of distribution algorithm. Finally, several numerical examples are provided to demonstrate the effectiveness of the proposed approach.

  5. A new neural network model for solving random interval linear programming problems.

    PubMed

    Arjmandzadeh, Ziba; Safi, Mohammadreza; Nazemi, Alireza

    2017-05-01

    This paper presents a neural network model for solving random interval linear programming problems. The original problem involving random interval variable coefficients is first transformed into an equivalent convex second order cone programming problem. A neural network model is then constructed for solving the obtained convex second order cone problem. Employing Lyapunov function approach, it is also shown that the proposed neural network model is stable in the sense of Lyapunov and it is globally convergent to an exact satisfactory solution of the original problem. Several illustrative examples are solved in support of this technique. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Waste management under multiple complexities: Inexact piecewise-linearization-based fuzzy flexible programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Wei; Huang, Guo H., E-mail: huang@iseis.org (Institute for Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan, S4S 0A2)

    2012-06-15

    Highlights: • Inexact piecewise-linearization-based fuzzy flexible programming is proposed. • It is the first application to waste management under multiple complexities. • It tackles nonlinear economies-of-scale effects in interval-parameter constraints. • It estimates costs more accurately than the linear-regression-based model. • Uncertainties are decreased and more satisfactory interval solutions are obtained. - Abstract: To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from the objective function to the constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, IPFP2 may underestimate the net system costs while IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness for providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities.

  7. The microcomputer scientific software series 2: general linear model--regression.

    Treesearch

    Harold M. Rauscher

    1983-01-01

    The general linear model regression (GLMR) program provides the microcomputer user with a sophisticated regression analysis capability. The output provides a regression ANOVA table, estimators of the regression model coefficients, their confidence intervals, confidence intervals around the predicted Y-values, residuals for plotting, a check for multicollinearity, a...

  8. A Limitation of the Applicability of Interval Shift Analysis to Program Evaluation

    ERIC Educational Resources Information Center

    Hardy, Roy

    1975-01-01

    Interval Shift Analysis (ISA) is an adaptation of the linear programming model used to determine maximum benefits or minimal losses in quantifiable economics problems. ISA is applied to pre- and posttest score distributions for 43 classes of second graders. (RC)

  9. Waste management under multiple complexities: inexact piecewise-linearization-based fuzzy flexible programming.

    PubMed

    Sun, Wei; Huang, Guo H; Lv, Ying; Li, Gongchen

    2012-06-01

    To tackle nonlinear economies-of-scale (EOS) effects in interval-parameter constraints for a representative waste management problem, an inexact piecewise-linearization-based fuzzy flexible programming (IPFP) model is developed. In IPFP, interval parameters for waste amounts and transportation/operation costs can be quantified; aspiration levels for net system costs, as well as tolerance intervals for both capacities of waste treatment facilities and waste generation rates, can be reflected; and the nonlinear EOS effects transformed from the objective function to the constraints can be approximated. An interactive algorithm is proposed for solving the IPFP model, which in nature is an interval-parameter mixed-integer quadratically constrained programming model. To demonstrate IPFP's advantages, two alternative models are developed to compare their performances. One is a conventional linear-regression-based inexact fuzzy programming model (IPFP2) and the other is an IPFP model with all right-hand sides of fuzzy constraints being the corresponding interval numbers (IPFP3). The comparison results between IPFP and IPFP2 indicate that the optimized waste amounts would have similar patterns in both models. However, when dealing with EOS effects in constraints, IPFP2 may underestimate the net system costs while IPFP can estimate the costs more accurately. The comparison results between IPFP and IPFP3 indicate that their solutions would be significantly different. The decreased system uncertainties in IPFP's solutions demonstrate its effectiveness for providing more satisfactory interval solutions than IPFP3. Following its first application to waste management, the IPFP can be potentially applied to other environmental problems under multiple complexities. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Capacity planning for waste management systems: an interval fuzzy robust dynamic programming approach.

    PubMed

    Nie, Xianghui; Huang, Guo H; Li, Yongping

    2009-11-01

    This study integrates the concepts of interval numbers and fuzzy sets into optimization analysis by dynamic programming as a means of accounting for system uncertainty. The developed interval fuzzy robust dynamic programming (IFRDP) model improves upon previous interval dynamic programming methods. It allows highly uncertain information to be effectively communicated into the optimization process through introducing the concept of fuzzy boundary interval and providing an interval-parameter fuzzy robust programming method for an embedded linear programming problem. Consequently, robustness of the optimization process and solution can be enhanced. The modeling approach is applied to a hypothetical problem for the planning of waste-flow allocation and treatment/disposal facility expansion within a municipal solid waste (MSW) management system. Interval solutions for capacity expansion of waste management facilities and relevant waste-flow allocation are generated and interpreted to provide useful decision alternatives. The results indicate that robust and useful solutions can be obtained, and the proposed IFRDP approach is applicable to practical problems that are associated with highly complex and uncertain information.

  11. MAGDM linear-programming models with distinct uncertain preference structures.

    PubMed

    Xu, Zeshui S; Chen, Jian

    2008-10-01

    Group decision making with preference information on alternatives is an interesting and important research topic which has been receiving more and more attention in recent years. The purpose of this paper is to investigate multiple-attribute group decision-making (MAGDM) problems with distinct uncertain preference structures. We develop some linear-programming models for dealing with the MAGDM problems, where the information about attribute weights is incomplete, and the decision makers have their preferences on alternatives. The provided preference information can be represented in the following three distinct uncertain preference structures: 1) interval utility values; 2) interval fuzzy preference relations; and 3) interval multiplicative preference relations. We first establish some linear-programming models based on decision matrix and each of the distinct uncertain preference structures and, then, develop some linear-programming models to integrate all three structures of subjective uncertain preference information provided by the decision makers and the objective information depicted in the decision matrix. Furthermore, we propose a simple and straightforward approach in ranking and selecting the given alternatives. It is worth pointing out that the developed models can also be used to deal with the situations where the three distinct uncertain preference structures are reduced to the traditional ones, i.e., utility values, fuzzy preference relations, and multiplicative preference relations. Finally, we use a practical example to illustrate in detail the calculation process of the developed approach.

  12. Discrete Methods and their Applications

    DTIC Science & Technology

    1993-02-03

    problem of finding all near-optimal solutions to a linear program. In paper [18], we give a brief and elementary proof of a result of Hoffman [1952] about ... relies only on linear programming duality; second, we obtain geometric and algebraic representations of the bounds that are determined explicitly in ... same. We have studied the problem of finding the minimum n such that a given unit interval graph is an n-graph. A linear time algorithm to compute

  13. A program for identification of linear systems

    NASA Technical Reports Server (NTRS)

    Buell, J.; Kalaba, R.; Ruspini, E.; Yakush, A.

    1971-01-01

    A program has been written for the identification of parameters in certain linear systems. These systems appear in biomedical problems, particularly in compartmental models of pharmacokinetics. The method presented here assumes that some of the state variables are regularly modified by jump conditions. This simulates administration of drugs following some prescribed drug regime. Parameters are identified by a least-square fit of the linear differential system to a set of experimental observations. The method is especially suited when the interval of observation of the system is very long.

  14. A generalized fuzzy linear programming approach for environmental management problem under uncertainty.

    PubMed

    Fan, Yurui; Huang, Guohe; Veawab, Amornvadee

    2012-01-01

    In this study, a generalized fuzzy linear programming (GFLP) method was developed to deal with uncertainties expressed as fuzzy sets that exist in the constraints and objective function. A stepwise interactive algorithm (SIA) was advanced to solve GFLP model and generate solutions expressed as fuzzy sets. To demonstrate its application, the developed GFLP method was applied to a regional sulfur dioxide (SO2) control planning model to identify effective SO2 mitigation polices with a minimized system performance cost under uncertainty. The results were obtained to represent the amount of SO2 allocated to different control measures from different sources. Compared with the conventional interval-parameter linear programming (ILP) approach, the solutions obtained through GFLP were expressed as fuzzy sets, which can provide intervals for the decision variables and objective function, as well as related possibilities. Therefore, the decision makers can make a tradeoff between model stability and the plausibility based on solutions obtained through GFLP and then identify desired policies for SO2-emission control under uncertainty.

  15. SSP: an interval integer linear programming for de novo transcriptome assembly and isoform discovery of RNA-seq reads.

    PubMed

    Safikhani, Zhaleh; Sadeghi, Mehdi; Pezeshk, Hamid; Eslahchi, Changiz

    2013-01-01

    Recent advances in the sequencing technologies have provided a handful of RNA-seq datasets for transcriptome analysis. However, reconstruction of full-length isoforms and estimation of the expression level of transcripts with a low cost are challenging tasks. We propose a novel de novo method named SSP that incorporates interval integer linear programming to resolve alternatively spliced isoforms and reconstruct the whole transcriptome from short reads. Experimental results show that SSP is fast and precise in determining different alternatively spliced isoforms along with the estimation of reconstructed transcript abundances. The SSP software package is available at http://www.bioinf.cs.ipm.ir/software/ssp. © 2013.

  16. An improved risk-explicit interval linear programming model for pollution load allocation for watershed management.

    PubMed

    Xia, Bisheng; Qian, Xin; Yao, Hong

    2017-11-01

    Although the risk-explicit interval linear programming (REILP) model has solved the problem of interval solutions, it has an equity problem that can lead to unbalanced allocation between different decision variables. Therefore, an improved REILP model is proposed. This model adds an equity objective function and three constraint conditions to overcome the equity problem. In this case, pollution reduction is in proportion to pollutant load, which supports balanced development between different regional economies. The model is used to solve a problem of pollution load allocation in a small transboundary watershed. Compared with the result of the original REILP model, our model achieves equity between the upstream and downstream pollutant loads; it also avoids assigning the greatest pollution reduction to the sources nearest to the control section. The model provides a better solution to the problem of pollution load allocation than previous versions.

  17. A risk explicit interval linear programming model for uncertainty-based environmental economic optimization in the Lake Fuxian watershed, China.

    PubMed

    Zhang, Xiaoling; Huang, Kai; Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve an integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers' preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and the corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of "low risk and high return efficiency" in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management.

  18. A Risk Explicit Interval Linear Programming Model for Uncertainty-Based Environmental Economic Optimization in the Lake Fuxian Watershed, China

    PubMed Central

    Zou, Rui; Liu, Yong; Yu, Yajuan

    2013-01-01

    The conflict between water environment protection and economic development has brought severe water pollution and restricted sustainable development in the watershed. A risk explicit interval linear programming (REILP) method was used to solve an integrated watershed environmental-economic optimization problem. Interval linear programming (ILP) and REILP models for uncertainty-based environmental economic optimization at the watershed scale were developed for the management of the Lake Fuxian watershed, China. Scenario analysis was introduced into the model solution process to ensure the practicality and operability of the optimization schemes. Decision makers' preferences for risk levels can be expressed by inputting different discrete aspiration level values into the REILP model in three periods under two scenarios. By balancing the optimal system returns and the corresponding system risks, decision makers can develop an efficient industrial restructuring scheme based directly on the window of “low risk and high return efficiency” in the trade-off curve. The representative schemes at the turning points of the two scenarios were interpreted and compared to identify a preferable planning alternative, which has relatively low risks and nearly maximum benefits. This study provides new insights and proposes a tool, REILP, for decision makers to develop an effective environmental economic optimization scheme in integrated watershed management. PMID:24191144

  19. TI-59 Programs for Multiple Regression.

    DTIC Science & Technology

    1980-05-01

    general linear hypothesis model of full rank [Graybill, 1961] can be written as Y = Xβ + ε, ε ~ N(0, σ²I), with Y n×1, X n×k, β k×1, and ε n×1, where Y is the vector of n ... a "reduced model" solution, and confidence intervals for linear functions of the coefficients can be obtained using (X'X)⁻¹ and s², based on the t ... For the general linear hypothesis model Y = Xβ + ε, the program calculates

  20. Waste management with recourse: an inexact dynamic programming model containing fuzzy boundary intervals in objectives and constraints.

    PubMed

    Tan, Q; Huang, G H; Cai, Y P

    2010-09-01

    The existing inexact optimization methods based on interval-parameter linear programming can hardly address problems where coefficients in objective functions are subject to dual uncertainties. In this study, a superiority-inferiority-based inexact fuzzy two-stage mixed-integer linear programming (SI-IFTMILP) model was developed for supporting municipal solid waste management under uncertainty. The developed SI-IFTMILP approach is capable of tackling dual uncertainties presented as fuzzy boundary intervals (FuBIs) in not only constraints, but also objective functions. Uncertainties expressed as a combination of intervals and random variables could also be explicitly reflected. An algorithm with high computational efficiency was provided to solve SI-IFTMILP. SI-IFTMILP was then applied to a long-term waste management case to demonstrate its applicability. Useful interval solutions were obtained. SI-IFTMILP could help generate dynamic facility-expansion and waste-allocation plans, as well as provide corrective actions when anticipated waste management plans are violated. It could also greatly reduce system-violation risk and enhance system robustness through examining two sets of penalties resulting from variations in fuzziness and randomness. Moreover, four possible alternative models were formulated to solve the same problem; solutions from them were then compared with those from SI-IFTMILP. The results indicate that SI-IFTMILP could provide more reliable solutions than the alternatives. 2010 Elsevier Ltd. All rights reserved.

  1. Comparison of linear, skewed-linear, and proportional hazard models for the analysis of lambing interval in Ripollesa ewes.

    PubMed

    Casellas, J; Bach, R

    2012-06-01

    Lambing interval is a relevant reproductive indicator for sheep populations under continuous mating systems, although there is a shortage of selection programs accounting for this trait in the sheep industry. Both the historical assumption of small genetic background and its unorthodox distribution pattern have limited its implementation as a breeding objective. In this manuscript, statistical performances of 3 alternative parametrizations [i.e., symmetric Gaussian mixed linear (GML) model, skew-Gaussian mixed linear (SGML) model, and piecewise Weibull proportional hazard (PWPH) model] have been compared to elucidate the preferred methodology to handle lambing interval data. More specifically, flock-by-flock analyses were performed on 31,986 lambing interval records (257.3 ± 0.2 d) from 6 purebred Ripollesa flocks. Model performances were compared in terms of deviance information criterion (DIC) and Bayes factor (BF). For all flocks, PWPH models were clearly preferred; they generated a reduction of 1,900 or more DIC units and provided BF estimates larger than 100 (i.e., PWPH models against linear models). These differences were reduced when comparing PWPH models with different number of change points for the baseline hazard function. In 4 flocks, only 2 change points were required to minimize the DIC, whereas 4 and 6 change points were needed for the 2 remaining flocks. These differences demonstrated a remarkable degree of heterogeneity across sheep flocks that must be properly accounted for in genetic evaluation models to avoid statistical biases and suboptimal genetic trends. Within this context, all 6 Ripollesa flocks revealed substantial genetic background for lambing interval with heritabilities ranging between 0.13 and 0.19. This study provides the first evidence of the suitability of PWPH models for lambing interval analysis, clearly discarding previous parametrizations focused on mixed linear models.

  2. Users manual for flight control design programs

    NASA Technical Reports Server (NTRS)

    Nalbandian, J. Y.

    1975-01-01

    Computer programs for the design of analog and digital flight control systems are documented. The program DIGADAPT uses linear-quadratic-gaussian synthesis algorithms in the design of command response controllers and state estimators, and it applies covariance propagation analysis to the selection of sampling intervals for digital systems. Program SCHED executes correlation and regression analyses for the development of gain and trim schedules to be used in open-loop explicit-adaptive control laws. A linear-time-varying simulation of aircraft motions is provided by the program TVHIS, which includes guidance and control logic, as well as models for control actuator dynamics. The programs are coded in FORTRAN and are compiled and executed on both IBM and CDC computers.

  3. Robust Control Design via Linear Programming

    NASA Technical Reports Server (NTRS)

    Keel, L. H.; Bhattacharyya, S. P.

    1998-01-01

    This paper deals with the problem of synthesizing or designing a feedback controller of fixed dynamic order. The closed loop specifications considered here are given in terms of a target performance vector representing a desired set of closed loop transfer functions connecting various signals. In general these point targets are unattainable with a fixed order controller. By enlarging the target from a fixed point set to an interval set the solvability conditions with a fixed order controller are relaxed and a solution is more easily enabled. Results from the parametric robust control literature can be used to design the interval target family so that the performance deterioration is acceptable, even when plant uncertainty is present. It is shown that it is possible to devise a computationally simple linear programming approach that attempts to meet the desired closed loop specifications.

  4. A Multiobjective Interval Programming Model for Wind-Hydrothermal Power System Dispatching Using 2-Step Optimization Algorithm

    PubMed Central

    Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop various reasonable plans to schedule power generation efficiently. However, future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is nonlinear in nature; achieving an accurate solution to such a complex problem is therefore a very difficult task. This paper presents an interval programming model with a 2-step optimization algorithm to solve the multiobjective dispatching problem. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem to search for the feasible and preliminary solutions to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision. PMID:24895663

  5. A multiobjective interval programming model for wind-hydrothermal power system dispatching using 2-step optimization algorithm.

    PubMed

    Ren, Kun; Jihong, Qu

    2014-01-01

    Wind-hydrothermal power system dispatching has received intensive attention in recent years because it can help develop various reasonable plans to schedule power generation efficiently. However, future data such as wind power output and power load cannot be accurately predicted, and the complex multiobjective scheduling model is nonlinear in nature; achieving an accurate solution to such a complex problem is therefore a very difficult task. This paper presents an interval programming model with a 2-step optimization algorithm to solve the multiobjective dispatching problem. Initially, we represented the future data as interval numbers and simplified the objective function to a linear programming problem to search for the feasible and preliminary solutions to construct the Pareto set. Then the simulated annealing method was used to search for the optimal solution of the initial model. Thorough experimental results suggest that the proposed method performed reasonably well in terms of both operating efficiency and precision.

  6. PMICALC: an R code-based software for estimating post-mortem interval (PMI) compatible with Windows, Mac and Linux operating systems.

    PubMed

    Muñoz-Barús, José I; Rodríguez-Calvo, María Sol; Suárez-Peñaranda, José M; Vieira, Duarte N; Cadarso-Suárez, Carmen; Febrero-Bande, Manuel

    2010-01-30

    In legal medicine the correct determination of the time of death is of utmost importance. Recent advances in estimating post-mortem interval (PMI) have made use of vitreous humour chemistry in conjunction with Linear Regression, but the results are questionable. In this paper we present PMICALC, an R code-based freeware package which estimates PMI in cadavers of recent death by measuring the concentrations of potassium ([K+]), hypoxanthine ([Hx]) and urea ([U]) in the vitreous humor using two different regression models: Additive Models (AM) and Support Vector Machine (SVM), which offer more flexibility than the previously used Linear Regression. The results from both models are better than those published to date and can give numerical expression of PMI with confidence intervals and graphic support within 20 min. The program also takes into account the cause of death. 2009 Elsevier Ireland Ltd. All rights reserved.

  7. Static Analysis of Numerical Algorithms

    DTIC Science & Technology

    2016-04-01

    represented by a collection of intervals (one for each variable) or a convex polyhedron (each dimension of the affine space representing a program variable... Another common abstract domain uses a set of linear constraints (i.e., an enclosing polyhedron) to over-approximate the joint values of several

  8. Chosen interval methods for solving linear interval systems with special type of matrix

    NASA Astrophysics Data System (ADS)

    Szyszka, Barbara

    2013-10-01

    The paper is devoted to chosen direct interval methods for solving linear interval systems with a special type of matrix. This kind of matrix, a band matrix with a parameter, is obtained from a finite-difference problem. Such linear systems occur while solving the one-dimensional wave equation (a partial differential equation of hyperbolic type) by the central difference interval method of the second order. Interval methods are constructed so that the errors of the method are enclosed in the obtained results; the presented linear interval systems therefore contain elements that determine the errors of the difference method. The chosen direct algorithms have been applied for solving the linear systems because they introduce no method errors of their own. All calculations were performed in floating-point interval arithmetic.
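
    The idea of enclosing the solution of a band system in interval arithmetic can be illustrated with a small self-contained sketch. The tridiagonal system, the parameter range [3.9, 4.1] on the main diagonal, and the naive (non-outward-rounded) float arithmetic are illustrative assumptions, not the paper's formulation:

```python
class Interval:
    """Closed interval [lo, hi] with elementary interval arithmetic
    (no outward rounding; a rigorous implementation would round outward)."""
    def __init__(self, lo, hi=None):
        self.lo = lo
        self.hi = lo if hi is None else hi
    def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o): return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(p), max(p))
    def __truediv__(self, o):
        assert not (o.lo <= 0.0 <= o.hi), "divisor must not contain zero"
        return self * Interval(1.0 / o.hi, 1.0 / o.lo)
    def contains(self, v): return self.lo <= v <= self.hi

def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm run in interval arithmetic: a/b/c are the sub-, main-
    and super-diagonals. The result encloses every point solution of every
    system in the interval family (possibly with overestimation)."""
    n = len(b)
    b, d = list(b), list(d)
    for i in range(1, n):
        m = a[i - 1] / b[i - 1]
        b[i] = b[i] - m * c[i - 1]
        d[i] = d[i] - m * d[i - 1]
    x = [None] * n
    x[n - 1] = d[n - 1] / b[n - 1]
    for i in range(n - 2, -1, -1):
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

# Band matrix with a parameter: main diagonal known only to lie in [3.9, 4.1].
a = [Interval(-1.0)] * 2
c = [Interval(-1.0)] * 2
b = [Interval(3.9, 4.1)] * 3
d = [Interval(2.0)] * 3
x = solve_tridiagonal(a, b, c, d)
```

    For the point system with diagonal 4 the exact solution is (5/7, 6/7, 5/7), and each component lies inside the computed enclosure.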

  9. A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications

    PubMed Central

    Austin, Peter C.

    2017-01-01

    Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log–log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack. We illustrate the application of these methods using three statistical programming languages (R, SAS and Stata). PMID:29307954
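
    The interval-splitting step behind the piecewise exponential model (partitioning follow-up into mutually exclusive intervals, each with its own exposure) can be sketched as follows; the function name and tuple layout are illustrative, not from the tutorial:

```python
def to_person_periods(time, event, cuts):
    """Split one subject's follow-up into mutually exclusive intervals.
    Returns (interval_index, exposure, event_in_interval) triples: the data
    layout for fitting the piecewise exponential model as Poisson regression
    with log(exposure) as an offset."""
    rows, start = [], 0.0
    for j, end in enumerate(cuts):
        if time <= start:
            break
        exposure = min(time, end) - start
        died_here = int(bool(event) and time <= end)
        rows.append((j, exposure, died_here))
        start = end
    return rows

# Subject followed for 3.5 time units, event observed; cut points at 1, 2, 5.
rows = to_person_periods(3.5, True, cuts=[1.0, 2.0, 5.0])
```

    Fitting the model then amounts to Poisson regression of `event_in_interval` on interval indicators (plus covariates and random effects) with `log(exposure)` as an offset.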

  10. A Tutorial on Multilevel Survival Analysis: Methods, Models and Applications.

    PubMed

    Austin, Peter C

    2017-08-01

    Data that have a multilevel structure occur frequently across a range of disciplines, including epidemiology, health services research, public health, education and sociology. We describe three families of regression models for the analysis of multilevel survival data. First, Cox proportional hazards models with mixed effects incorporate cluster-specific random effects that modify the baseline hazard function. Second, piecewise exponential survival models partition the duration of follow-up into mutually exclusive intervals and fit a model that assumes that the hazard function is constant within each interval. This is equivalent to a Poisson regression model that incorporates the duration of exposure within each interval. By incorporating cluster-specific random effects, generalised linear mixed models can be used to analyse these data. Third, after partitioning the duration of follow-up into mutually exclusive intervals, one can use discrete time survival models that use a complementary log-log generalised linear model to model the occurrence of the outcome of interest within each interval. Random effects can be incorporated to account for within-cluster homogeneity in outcomes. We illustrate the application of these methods using data consisting of patients hospitalised with a heart attack. We illustrate the application of these methods using three statistical programming languages (R, SAS and Stata).

  11. The Linear Programming to evaluate the performance of Oral Health in Primary Care.

    PubMed

    Colussi, Claudia Flemming; Calvo, Maria Cristina Marino; Freitas, Sergio Fernando Torres de

    2013-01-01

    To show the use of Linear Programming to evaluate the performance of Oral Health in Primary Care. This study used data from 19 municipalities of the state of Santa Catarina that participated in the state evaluation in 2009 and have more than 50,000 inhabitants. A total of 40 indicators were evaluated, calculated using Microsoft Excel 2007, and converted to the interval [0, 1] in ascending order (one indicating the best situation and zero indicating the worst). Applying the Linear Programming technique, municipalities were assessed and compared with each other according to a performance curve named the "estimated quality frontier". Municipalities included in the frontier were classified as excellent. Indicators were gathered into synthetic indicators. The majority of municipalities not included in the quality frontier (values different from 1.0) had values lower than 0.5, indicating poor performance. The model applied to the municipalities of Santa Catarina assessed municipal management and local priorities rather than goals imposed by pre-defined parameters. In the final analysis three municipalities were included in the "perceived quality frontier". The Linear Programming technique allowed the identification of gaps that must be addressed by city managers to enhance the actions taken. It also made it possible to observe each municipality's performance and compare results among similar municipalities.

  12. Monte Carlo simulation of parameter confidence intervals for non-linear regression analysis of biological data using Microsoft Excel.

    PubMed

    Lambert, Ronald J W; Mytilinaios, Ioannis; Maitland, Luke; Brown, Angus M

    2012-08-01

    This study describes a method to obtain parameter confidence intervals from the fitting of non-linear functions to experimental data, using the SOLVER and Analysis ToolPaK Add-In of the Microsoft Excel spreadsheet. Previously we have shown that Excel can fit complex multiple functions to biological data, obtaining values equivalent to those returned by more specialized statistical or mathematical software. However, a disadvantage of using the Excel method was the inability to return confidence intervals for the computed parameters or the correlations between them. Using a simple Monte-Carlo procedure within the Excel spreadsheet (without recourse to programming), SOLVER can provide parameter estimates (up to 200 at a time) for multiple 'virtual' data sets, from which the required confidence intervals and correlation coefficients can be obtained. The general utility of the method is exemplified by applying it to the analysis of the growth of Listeria monocytogenes, the growth inhibition of Pseudomonas aeruginosa by chlorhexidine and the further analysis of the electrophysiological data from the compound action potential of the rodent optic nerve. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
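
    The same Monte-Carlo procedure can be reproduced outside Excel. The sketch below uses a log-linear exponential fit as a stand-in for SOLVER's nonlinear fit and resamples residuals to build the "virtual" data sets; the growth model, noise level, and replicate count are illustrative assumptions that mirror the spirit, not the detail, of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_exponential(x, y):
    """Fit y = A * exp(k * x) via a log-linear least-squares surrogate."""
    k, logA = np.polyfit(x, np.log(y), 1)
    return np.exp(logA), k

# Synthetic 'experimental' growth data (true A = 2, k = 0.3).
x = np.linspace(0.0, 5.0, 30)
y = 2.0 * np.exp(0.3 * x) * np.exp(rng.normal(0.0, 0.05, x.size))

A_hat, k_hat = fit_exponential(x, y)
resid = np.log(y) - (np.log(A_hat) + k_hat * x)

# Monte Carlo: refit 200 'virtual' data sets built from resampled residuals.
sims = []
for _ in range(200):
    y_virt = np.exp(np.log(A_hat) + k_hat * x
                    + rng.choice(resid, resid.size, replace=True))
    sims.append(fit_exponential(x, y_virt))
sims = np.array(sims)
ci_A = np.percentile(sims[:, 0], [2.5, 97.5])   # 95% CI for A
ci_k = np.percentile(sims[:, 1], [2.5, 97.5])   # 95% CI for k
```

    The percentile spread of the refitted parameters plays the role of the confidence intervals that the plain Excel fit could not return.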

  13. Tail mean and related robust solution concepts

    NASA Astrophysics Data System (ADS)

    Ogryczak, Włodzimierz

    2014-01-01

    Robust optimisation might be viewed as a multicriteria optimisation problem where the objectives correspond to the scenarios, although their probabilities are unknown or imprecise. The simplest robust solution concept represents a conservative approach focused on optimisation of the worst-case scenario results. A softer concept allows one to optimise the tail mean, thus combining performances under multiple worst scenarios. We show that, when considering robust models that allow the probabilities to vary only within given intervals, the tail mean represents the robust solution only for upper-bounded probabilities. For arbitrary intervals of probabilities the corresponding robust solution may be expressed by the optimisation of an appropriately combined mean and tail mean criterion, thus remaining easily implementable with auxiliary linear inequalities. Moreover, we use the tail mean concept to develop linear-programming-implementable robust solution concepts related to risk-averse optimisation criteria.
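
    The tail mean and the combined mean/tail-mean criterion can be computed directly for a discrete scenario set. A minimal sketch; the cost orientation, equal scenario weights, and the names `beta`/`lam` are assumptions, and the paper derives the combination weights from the probability intervals rather than fixing them:

```python
import numpy as np

def tail_mean(outcomes, beta):
    """Mean of the worst beta-fraction of scenario outcomes (costs: larger = worse)."""
    k = max(1, int(np.ceil(beta * len(outcomes))))
    return np.sort(outcomes)[-k:].mean()

def combined_criterion(outcomes, beta, lam):
    """Convex combination of the overall mean and the tail mean, the form the
    robust solution takes under interval-bounded scenario probabilities."""
    return (1.0 - lam) * np.mean(outcomes) + lam * tail_mean(outcomes, beta)

costs = np.array([3.0, 7.0, 5.0, 9.0, 1.0])   # equally weighted scenario costs
tm = tail_mean(costs, 0.4)                     # mean of the two worst scenarios
crit = combined_criterion(costs, 0.4, 0.5)
```

    In an optimisation model the sort is replaced by the standard auxiliary linear inequalities, which keeps the whole criterion LP-implementable.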

  14. Inexact nonlinear improved fuzzy chance-constrained programming model for irrigation water management under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Chenglong; Zhang, Fan; Guo, Shanshan; Liu, Xiao; Guo, Ping

    2018-01-01

    An inexact nonlinear mλ-measure fuzzy chance-constrained programming (INMFCCP) model is developed for irrigation water allocation under uncertainty. Techniques of inexact quadratic programming (IQP), mλ-measure, and fuzzy chance-constrained programming (FCCP) are integrated into a general optimization framework. The INMFCCP model can deal with not only nonlinearities in the objective function, but also uncertainties presented as discrete intervals in the objective function, variables and left-hand side constraints and fuzziness in the right-hand side constraints. Moreover, this model improves upon the conventional fuzzy chance-constrained programming by introducing a linear combination of possibility measure and necessity measure with varying preference parameters. To demonstrate its applicability, the model is then applied to a case study in the middle reaches of Heihe River Basin, northwest China. An interval regression analysis method is used to obtain interval crop water production functions in the whole growth period under uncertainty. Therefore, more flexible solutions can be generated for optimal irrigation water allocation. The variation of results can be examined by giving different confidence levels and preference parameters. Besides, it can reflect interrelationships among system benefits, preference parameters, confidence levels and the corresponding risk levels. Comparison between interval crop water production functions and deterministic ones based on the developed INMFCCP model indicates that the former is capable of reflecting more complexities and uncertainties in practical application. These results can provide more reliable scientific basis for supporting irrigation water management in arid areas.

  15. A Unified Approach to Optimization

    DTIC Science & Technology

    2014-10-02

    employee scheduling, ad placement, Latin squares, disjunctions of linear systems, temporal modeling with interval variables, and traveling salesman problems... integrating technologies. A key to integrated modeling is to formulate a problem with high-level metaconstraints, which are inspired by the "global... problem substructure to the solver. This contrasts with the atomistic modeling style of mixed integer programming (MIP) and satisfiability (SAT) solvers

  16. A primer for biomedical scientists on how to execute model II linear regression analysis.

    PubMed

    Ludbrook, John

    2012-04-01

    1. There are two very different ways of executing linear regression analysis. One is Model I, when the x-values are fixed by the experimenter. The other is Model II, in which the x-values are free to vary and are subject to error. 2. I have received numerous complaints from biomedical scientists that they have great difficulty in executing Model II linear regression analysis. This may explain the results of a Google Scholar search, which showed that the authors of articles in journals of physiology, pharmacology and biochemistry rarely use Model II regression analysis. 3. I repeat my previous arguments in favour of using least products linear regression analysis for Model II regressions. I review three methods for executing ordinary least products (OLP) and weighted least products (WLP) regression analysis: (i) scientific calculator and/or computer spreadsheet; (ii) specific-purpose computer programs; and (iii) general-purpose computer programs. 4. Using a scientific calculator and/or computer spreadsheet, it is easy to obtain correct values for the OLP slope and intercept, but the corresponding 95% confidence intervals (CI) are inaccurate. 5. Among specific-purpose computer programs, the freeware program smatr gives the correct OLP regression coefficients and obtains 95% CI by bootstrapping. In addition, smatr can be used to compare the slopes of OLP lines. 6. Among general-purpose computer programs, I recommend the commercial programs systat and Statistica for those who regularly undertake linear regression analysis, and I give step-by-step instructions in the Supplementary Information on how to use loss functions. © 2011 The Author. Clinical and Experimental Pharmacology and Physiology. © 2011 Blackwell Publishing Asia Pty Ltd.
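
    The core OLP computation is compact enough to sketch: the least-products (geometric-mean) slope is sign(r) · SD(y)/SD(x), and a percentile bootstrap supplies the 95% CI that a plain spreadsheet calculation gets wrong. The synthetic data and bootstrap settings below are illustrative assumptions:

```python
import numpy as np

def least_products(x, y):
    """Ordinary least products (geometric-mean, Model II) regression:
    slope = sign(r) * SD(y) / SD(x); the line passes through the means."""
    r = np.corrcoef(x, y)[0, 1]
    slope = np.sign(r) * np.std(y, ddof=1) / np.std(x, ddof=1)
    return slope, np.mean(y) - slope * np.mean(x)

rng = np.random.default_rng(1)
x = rng.normal(10.0, 2.0, 100)               # x free to vary (Model II setting)
y = 1.5 * x + 3.0 + rng.normal(0.0, 1.0, 100)

slope, intercept = least_products(x, y)

# Percentile bootstrap for the 95% CI of the OLP slope (smatr-style).
boots = []
for _ in range(1000):
    idx = rng.integers(0, x.size, x.size)
    boots.append(least_products(x[idx], y[idx])[0])
lo, hi = np.percentile(boots, [2.5, 97.5])
```

    Note the OLP slope exceeds the generating slope of 1.5 here because OLP attributes part of the scatter to error in x; that is the intended Model II behaviour, not a bug.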

  17. A Sequential Linear Quadratic Approach for Constrained Nonlinear Optimal Control with Adaptive Time Discretization and Application to Higher Elevation Mars Landing Problem

    NASA Astrophysics Data System (ADS)

    Sandhu, Amit

    A sequential quadratic programming method is proposed for solving nonlinear optimal control problems subject to general path constraints, including mixed state-control and state-only constraints. The proposed algorithm builds on the approach proposed in [1], with the objective of eliminating the need for a large number of time intervals to arrive at an optimal solution. This is done by introducing an adaptive time discretization that allows the formation of a desirable control profile without using many intervals. The use of fewer time intervals reduces the computation time considerably. This algorithm is then used in this thesis to solve a trajectory planning problem for higher-elevation Mars landing.

  18. Evaluation of confidence intervals for a steady-state leaky aquifer model

    USGS Publications Warehouse

    Christensen, S.; Cooley, R.L.

    1999-01-01

    The fact that dependent variables of groundwater models are generally nonlinear functions of model parameters is shown to be a potentially significant factor in calculating accurate confidence intervals for both model parameters and functions of the parameters, such as the values of dependent variables calculated by the model. The Lagrangian method of Vecchia and Cooley [Vecchia, A.V. and Cooley, R.L., Water Resources Research, 1987, 23(7), 1237-1250] was used to calculate nonlinear Scheffe-type confidence intervals for the parameters and the simulated heads of a steady-state groundwater flow model covering 450 km2 of a leaky aquifer. The nonlinear confidence intervals are compared to corresponding linear intervals. As suggested by the significant nonlinearity of the regression model, linear confidence intervals are often not accurate. The commonly made assumption that widths of linear confidence intervals always underestimate the actual (nonlinear) widths was not correct. Results show that nonlinear effects can cause the nonlinear intervals to be asymmetric and either larger or smaller than the linear approximations. Prior information on transmissivities helps reduce the size of the confidence intervals, with the most notable effects occurring for the parameters on which there is prior information and for head values in parameter zones for which there is prior information on the parameters.

  19. Inexact fuzzy-stochastic mixed-integer programming approach for long-term planning of waste management--Part A: methodology.

    PubMed

    Guo, P; Huang, G H

    2009-01-01

    In this study, an inexact fuzzy chance-constrained two-stage mixed-integer linear programming (IFCTIP) approach is proposed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing inexact two-stage programming and mixed-integer linear programming techniques by incorporating uncertainties expressed as multiple uncertainties of intervals and dual probability distributions within a general optimization framework. The developed method can provide an effective linkage between the predefined environmental policies and the associated economic implications. Four special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it provides a linkage to predefined policies that have to be respected when a modeling effort is undertaken; secondly, it is useful for tackling uncertainties presented as intervals, probabilities, fuzzy sets and their incorporation; thirdly, it facilitates dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period, multi-level, and multi-option context; fourthly, the penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised solid waste-generation rates are violated. In a companion paper, the developed method is applied to a real case for the long-term planning of waste management in the City of Regina, Canada.

  20. Interval linear programming model for long-term planning of vehicle recycling in the Republic of Serbia under uncertainty.

    PubMed

    Simic, Vladimir; Dimitrijevic, Branka

    2015-02-01

    An interval linear programming approach is used to formulate and comprehensively test a model for optimal long-term planning of vehicle recycling in the Republic of Serbia. The proposed model is applied to a numerical case study: a 4-year planning horizon (2013-2016) is considered, three legislative cases and three scrap metal price trends are analysed, availability of final destinations for sorted waste flows is explored. Potential and applicability of the developed model are fully illustrated. Detailed insights on profitability and eco-efficiency of the projected contemporary equipped vehicle recycling factory are presented. The influences of the ordinance on the management of end-of-life vehicles in the Republic of Serbia on the vehicle hulks procuring, sorting generated material fractions, sorted waste allocation and sorted metals allocation decisions are thoroughly examined. The validity of the waste management strategy for the period 2010-2019 is tested. The formulated model can create optimal plans for procuring vehicle hulks, sorting generated material fractions, allocating sorted waste flows and allocating sorted metals. Obtained results are valuable for supporting the construction and/or modernisation process of a vehicle recycling system in the Republic of Serbia. © The Author(s) 2015.
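
    The mechanics behind an interval linear program (solving optimistic and pessimistic sub-models built from the favourable and unfavourable interval bounds) can be shown on a toy two-variable profit problem. The tiny vertex-enumeration solver and all numbers are illustrative assumptions, not the vehicle-recycling model:

```python
import itertools
import numpy as np

def solve_lp_2d(c, A, b):
    """Maximize c @ x s.t. A @ x <= b, x >= 0 (two variables) by
    enumerating vertices of the feasible polygon."""
    A_full = np.vstack([A, -np.eye(2)])              # x >= 0 written as -x <= 0
    b_full = np.concatenate([b, np.zeros(2)])
    best, best_x = -np.inf, None
    for i, j in itertools.combinations(range(len(b_full)), 2):
        M = A_full[[i, j]]
        if abs(np.linalg.det(M)) < 1e-12:
            continue                                  # parallel boundaries
        x = np.linalg.solve(M, b_full[[i, j]])
        if np.all(A_full @ x <= b_full + 1e-9):       # feasible vertex
            if c @ x > best:
                best, best_x = c @ x, x
    return best, best_x

# Interval data: unit profits in [3, 4] and [2, 3]; capacities in [8, 10] and [5, 6].
A = np.array([[1.0, 2.0], [1.0, 1.0]])
f_best, _ = solve_lp_2d(np.array([4.0, 3.0]), A, np.array([10.0, 6.0]))   # optimistic
f_worst, _ = solve_lp_2d(np.array([3.0, 2.0]), A, np.array([8.0, 5.0]))   # pessimistic
```

    The pair [f_worst, f_best] is the interval-valued optimum: any realisation of the uncertain data yields a profit inside it.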

  1. Frequency and Determinants of a Short-Interval Follow-up Recommendation After an Abnormal Screening Mammogram.

    PubMed

    Pelletier, Eric; Daigle, Jean-Marc; Defay, Fannie; Major, Diane; Guertin, Marie-Hélène; Brisson, Jacques

    2016-11-01

    After imaging assessment of an abnormal screening mammogram, a follow-up examination 6 months later is recommended to some women. Our aim was to identify which characteristics of lesions, women, and physicians are associated with such a short-interval follow-up recommendation in the Quebec Breast Cancer Screening Program. Between 1998 and 2008, 1,839,396 screening mammograms were performed and a total of 114,781 abnormal screens were assessed by imaging only. Multivariate analysis was done with multilevel Poisson regression models with robust variance and generalized linear mixed models. A short-interval follow-up was recommended in 26.7% of assessments with imaging only, representing 2.3% of all screens. The case-mix adjusted proportion of short-interval follow-up recommendations varied substantially across physicians (range: 4%-64%). Radiologists with high recall rates (≥15%) had a high proportion of short-interval follow-up recommendations (risk ratio: 1.82; 95% confidence interval: 1.35-2.45) compared to radiologists with low recall rates (<5%). The adjusted proportion of short-interval follow-up was high (22.8%) even when a previous mammogram was usually available. Short-interval follow-up recommendation at assessment is frequent in this Canadian screening program, even when a previous mammogram is available. Characteristics related to radiologists appear to be key determinants of short-interval follow-up recommendation, rather than characteristics of lesions or patient mix. Given that it can cause anxiety to women and adds pressure on the health system, it appears important to record and report short-interval follow-up and to identify ways to reduce its frequency. Short-interval follow-up recommendations should be considered when assessing the burden of mammography screening. Copyright © 2016 Canadian Association of Radiologists. Published by Elsevier Inc. All rights reserved.

  2. Interval-parameter semi-infinite fuzzy-stochastic mixed-integer programming approach for environmental management under multiple uncertainties.

    PubMed

    Guo, P; Huang, G H

    2010-03-01

    In this study, an interval-parameter semi-infinite fuzzy-chance-constrained mixed-integer linear programming (ISIFCIP) approach is developed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing interval-parameter semi-infinite programming (ISIP) and fuzzy-chance-constrained programming (FCCP) by incorporating uncertainties expressed as dual uncertainties of functional intervals and multiple uncertainties of distributions with fuzzy-interval admissible probability of violating constraint within a general optimization framework. The binary-variable solutions represent the decisions of waste-management-facility expansion, and the continuous ones are related to decisions of waste-flow allocation. The interval solutions can help decision-makers to obtain multiple decision alternatives, as well as provide bases for further analyses of tradeoffs between waste-management cost and system-failure risk. In the application to the City of Regina, Canada, two scenarios are considered. In Scenario 1, the City's waste-management practices would be based on the existing policy over the next 25 years. The total diversion rate for the residential waste would be approximately 14%. Scenario 2 is associated with a policy for waste minimization and diversion, where 35% diversion of residential waste should be achieved within 15 years, and 50% diversion over 25 years. In this scenario, not only landfill would be expanded, but also CF and MRF would be expanded. Through the scenario analyses, useful decision support for the City's solid-waste managers and decision-makers has been generated. Three special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it is useful for tackling multiple uncertainties expressed as intervals, functional intervals, probability distributions, fuzzy sets, and their combinations; secondly, it has capability in addressing the temporal variations of the functional intervals; thirdly, it can facilitate dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period and multi-option context. Copyright 2009 Elsevier Ltd. All rights reserved.

  3. An inexact chance-constrained programming model for water quality management in Binhai New Area of Tianjin, China.

    PubMed

    Xie, Y L; Li, Y P; Huang, G H; Li, Y F; Chen, L R

    2011-04-15

    In this study, an inexact-chance-constrained water quality management (ICC-WQM) model is developed for planning regional environmental management under uncertainty. This method is based on an integration of interval linear programming (ILP) and chance-constrained programming (CCP) techniques. ICC-WQM allows uncertainties presented as both probability distributions and interval values to be incorporated within a general optimization framework. Complexities in environmental management systems can be systematically reflected, thus applicability of the modeling process can be highly enhanced. The developed method is applied to planning chemical-industry development in Binhai New Area of Tianjin, China. Interval solutions associated with different risk levels of constraint violation have been obtained. They can be used for generating decision alternatives and thus help decision makers identify desired policies under various system-reliability constraints of water environmental capacity of pollutant. Tradeoffs between system benefits and constraint-violation risks can also be tackled. They are helpful for supporting (a) decision of wastewater discharge and government investment, (b) formulation of local policies regarding water consumption, economic development and industry structure, and (c) analysis of interactions among economic benefits, system reliability and pollutant discharges. Copyright © 2011 Elsevier B.V. All rights reserved.
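
    The CCP ingredient of such models (turning a probabilistic constraint into a deterministic one at a chosen reliability level) has a closed form for a normally distributed right-hand side. A minimal standard-library sketch; the function name and the N(100, 10) capacity are illustrative assumptions:

```python
from statistics import NormalDist

def deterministic_rhs(mu, sigma, p):
    """Deterministic equivalent of the chance constraint P(a @ x <= b) >= p
    when b ~ Normal(mu, sigma): a @ x <= mu + sigma * Phi^{-1}(1 - p)."""
    return mu + sigma * NormalDist().inv_cdf(1.0 - p)

# Hypothetical pollutant-load capacity b ~ N(100, 10).
rhs_95 = deterministic_rhs(100.0, 10.0, 0.95)   # 95% reliability: tighter RHS
rhs_50 = deterministic_rhs(100.0, 10.0, 0.50)   # 50% reliability: just the mean
```

    Raising the reliability level p tightens the right-hand side, which is exactly the system-reliability versus benefit trade-off that the interval solutions at different risk levels trace out.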

  4. Interresponse Time Structures in Variable-Ratio and Variable-Interval Schedules

    ERIC Educational Resources Information Center

    Bowers, Matthew T.; Hill, Jade; Palya, William L.

    2008-01-01

    The interresponse-time structures of pigeon key pecking were examined under variable-ratio, variable-interval, and variable-interval plus linear feedback schedules. Whereas the variable-ratio and variable-interval plus linear feedback schedules generally resulted in a distinct group of short interresponse times and a broad distribution of longer…

  5. Linear quadratic tracking problems in Hilbert space - Application to optimal active noise suppression

    NASA Technical Reports Server (NTRS)

    Banks, H. T.; Silcox, R. J.; Keeling, S. L.; Wang, C.

    1989-01-01

    A unified treatment of the linear quadratic tracking (LQT) problem, in which a control system's dynamics are modeled by a linear evolution equation with a nonhomogeneous component that is linearly dependent on the control function u, is presented; the treatment proceeds from the theoretical formulation to a numerical approximation framework. Attention is given to two categories of LQT problems in an infinite time interval: the finite energy and the finite average energy. The behavior of the optimal solution for finite time-interval problems as the length of the interval tends to infinity is discussed. Also presented are the formulations and properties of LQT problems in a finite time interval.

  6. Optimal Wind Power Uncertainty Intervals for Electricity Market Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ying; Zhou, Zhi; Botterud, Audun

    It is important to select an appropriate uncertainty level of the wind power forecast for power system scheduling and electricity market operation. Traditional methods hedge against a predefined level of wind power uncertainty, such as a specific confidence interval or uncertainty set, which leaves open the question of how best to select the appropriate uncertainty levels. To bridge this gap, this paper proposes a model to optimize the forecast uncertainty intervals of wind power for power system scheduling problems, with the aim of achieving the best trade-off between economics and reliability. We then reformulate and linearize the model into a mixed-integer linear programming (MILP) problem without strong assumptions on the shape of the probability distribution. In order to investigate the impacts on cost, reliability, and prices in an electricity market, we apply the proposed model to a two-settlement electricity market based on a six-bus test system and on a power system representing the U.S. state of Illinois. The results show that the proposed method can not only help to balance the economics and reliability of power system scheduling, but also help to stabilize energy prices in electricity market operation.

  7. A computer program for uncertainty analysis integrating regression and Bayesian methods

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary

    2014-01-01

    This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
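
    The MCMC idea behind the third kind of interval (Bayesian credible intervals from posterior samples) can be shown with a deliberately simple random-walk Metropolis sampler, a stand-in for the DREAM algorithm, on a conjugate problem where the exact posterior is known. All settings below (data, proposal scale, burn-in) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# Data whose posterior for the mean (flat prior, known sigma = 1) is
# analytically N(ybar, 1/sqrt(n)); a convenient correctness check.
y = rng.normal(5.0, 1.0, 50)

def log_post(theta):
    return -0.5 * np.sum((y - theta) ** 2)     # log-posterior up to a constant

# Random-walk Metropolis: a much-simplified stand-in for DREAM.
chain, theta, lp = [], float(y.mean()), log_post(y.mean())
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:    # Metropolis acceptance rule
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[2000:])                 # discard burn-in

cred = np.percentile(chain, [2.5, 97.5])       # 95% Bayesian credible interval
```

    Unlike the linear and nonlinear confidence intervals, nothing here requires linearity or a smooth objective surface; the cost is the much larger number of model evaluations the abstract mentions.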

  8. Resolution of an uncertain closed-loop logistics model: an application to fuzzy linear programs with risk analysis.

    PubMed

    Wang, Hsiao-Fan; Hsu, Hsin-Wei

    2010-11-01

    With the urgency of global warming, green supply chain management, logistics in particular, has drawn the attention of researchers. Although there are closed-loop green logistics models in the literature, most of them do not consider the uncertain environment in general terms. In this study, a generalized model is proposed where the uncertainty is expressed by fuzzy numbers. An interval programming model is proposed by the defined means and mean square imprecision index obtained from the integrated information of all the level cuts of fuzzy numbers. The resolution for interval programming is based on the decision maker (DM)'s preference. The resulting solution provides useful information on the expected solutions under a confidence level containing a degree of risk. The results suggest that the more optimistic the DM is, the better is the resulting solution. However, a higher risk of violation of the resource constraints is also present. By defining this probable risk, a solution procedure was developed with numerical illustrations. This provides a DM trade-off mechanism between logistic cost and the risk. Copyright 2010 Elsevier Ltd. All rights reserved.

  9. A Comparative Theoretical and Computational Study on Robust Counterpart Optimization: I. Robust Linear Optimization and Robust Mixed Integer Linear Optimization

    PubMed Central

    Li, Zukui; Ding, Ran; Floudas, Christodoulos A.

    2011-01-01

Robust counterpart optimization techniques for linear optimization and mixed integer linear optimization problems are studied in this paper. Different uncertainty sets, including those studied in the literature (i.e., the interval set; the combined interval and ellipsoidal set; the combined interval and polyhedral set) and new ones (i.e., the adjustable box; pure ellipsoidal; pure polyhedral; and the combined interval, ellipsoidal, and polyhedral set), are studied in this work and their geometric relationships are discussed. For uncertainty in the left-hand side, right-hand side, and objective function of the optimization problems, robust counterpart optimization formulations induced by these different uncertainty sets are derived. Numerical studies are performed to compare the solutions of the robust counterpart optimization models, and applications in refinery production planning and in a batch process scheduling problem are presented. PMID:21935263
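For the simplest of the uncertainty sets discussed here, the box (interval) set, the robust counterpart has a closed form: with nonnegative variables, each uncertain left-hand-side coefficient is replaced by its worst-case value. A hedged sketch using `scipy.optimize.linprog` (the two-variable problem and its numbers are invented for illustration, not taken from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Nominal problem: max 3x1 + 2x2  s.t.  a1 x1 + a2 x2 <= 10,  x >= 0,
# with interval uncertainty a1 in [0.8, 1.2], a2 in [1.5, 2.5].
c = np.array([-3.0, -2.0])          # linprog minimizes, so negate the objective
a_bar = np.array([1.0, 2.0])        # nominal coefficients
a_hat = np.array([0.2, 0.5])        # interval half-widths

# Box-set robust counterpart for x >= 0: use worst-case coefficients a_bar + a_hat.
res_nom = linprog(c, A_ub=[a_bar], b_ub=[10.0], bounds=[(0, None)] * 2)
res_rob = linprog(c, A_ub=[a_bar + a_hat], b_ub=[10.0], bounds=[(0, None)] * 2)
```

The robust optimum (25 here) is necessarily no better than the nominal one (30); the gap is the price of immunizing the constraint against every coefficient realization in the box.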

  10. User's manual for SEDCALC, a computer program for computation of suspended-sediment discharge

    USGS Publications Warehouse

    Koltun, G.F.; Gray, John R.; McElhone, T.J.

    1994-01-01

    Sediment-Record Calculations (SEDCALC), a menu-driven set of interactive computer programs, was developed to facilitate computation of suspended-sediment records. The programs comprising SEDCALC were developed independently in several District offices of the U.S. Geological Survey (USGS) to minimize the intensive labor associated with various aspects of sediment-record computations. SEDCALC operates on suspended-sediment-concentration data stored in American Standard Code for Information Interchange (ASCII) files in a predefined card-image format. Program options within SEDCALC can be used to assist in creating and editing the card-image files, as well as to reformat card-image files to and from formats used by the USGS Water-Quality System. SEDCALC provides options for creating card-image files containing time series of equal-interval suspended-sediment concentrations from 1. digitized suspended-sediment-concentration traces, 2. linear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals, and 3. nonlinear interpolation between log-transformed instantaneous suspended-sediment-concentration data stored at unequal time intervals. Suspended-sediment discharge can be computed from the streamflow and suspended-sediment-concentration data or by application of transport relations derived by regressing log-transformed instantaneous streamflows on log-transformed instantaneous suspended-sediment concentrations or discharges. The computed suspended-sediment discharge data are stored in card-image files that can be either directly imported to the USGS Automated Data Processing System or used to generate plots by means of other SEDCALC options.

  11. An interval chance-constrained fuzzy modeling approach for supporting land-use planning and eco-environment planning at a watershed level.

    PubMed

    Ou, Guoliang; Tan, Shukui; Zhou, Min; Lu, Shasha; Tao, Yinghui; Zhang, Zuo; Zhang, Lu; Yan, Danping; Guan, Xingliang; Wu, Gang

    2017-12-15

An interval chance-constrained fuzzy land-use allocation (ICCF-LUA) model is proposed in this study to support the solution of land resource management problems associated with various environmental and ecological constraints at a watershed level. The ICCF-LUA model is based on the ICCF (interval chance-constrained fuzzy) model, which couples an interval mathematical model, a chance-constrained programming model and a fuzzy linear programming model and can be used to deal with uncertainties expressed as intervals, probabilities and fuzzy sets. Therefore, the ICCF-LUA model can reflect the tradeoff between decision makers and land stakeholders, as well as the tradeoff between economic benefits and eco-environmental demands. The ICCF-LUA model has been applied to the land-use allocation of the Wujiang watershed, Guizhou Province, China. The results indicate that under highly land-suitable conditions, the optimized areas of cultivated land, forest land, grass land, construction land, water land, unused land and landfill in the Wujiang watershed will be [5015, 5648] hm², [7841, 7965] hm², [1980, 2056] hm², [914, 1423] hm², [70, 90] hm², [50, 70] hm² and [3.2, 4.3] hm², and the corresponding system economic benefit will be between 6831 and 7219 billion yuan. Consequently, the ICCF-LUA model can effectively support optimized land-use allocation under various complicated conditions involving uncertainties, risks, economic objectives and eco-environmental constraints. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Bioinactivation: Software for modelling dynamic microbial inactivation.

    PubMed

    Garre, Alberto; Fernández, Pablo S; Lindqvist, Roland; Egea, Jose A

    2017-03-01

    This contribution presents the bioinactivation software, which implements functions for the modelling of isothermal and non-isothermal microbial inactivation. This software offers features such as user-friendliness, modelling of dynamic conditions, possibility to choose the fitting algorithm and generation of prediction intervals. The software is offered in two different formats: Bioinactivation core and Bioinactivation SE. Bioinactivation core is a package for the R programming language, which includes features for the generation of predictions and for the fitting of models to inactivation experiments using non-linear regression or a Markov Chain Monte Carlo algorithm (MCMC). The calculations are based on inactivation models common in academia and industry (Bigelow, Peleg, Mafart and Geeraerd). Bioinactivation SE supplies a user-friendly interface to selected functions of Bioinactivation core, namely the model fitting of non-isothermal experiments and the generation of prediction intervals. The capabilities of bioinactivation are presented in this paper through a case study, modelling the non-isothermal inactivation of Bacillus sporothermodurans. This study has provided a full characterization of the response of the bacteria to dynamic temperature conditions, including confidence intervals for the model parameters and a prediction interval of the survivor curve. We conclude that the MCMC algorithm produces a better characterization of the biological uncertainty and variability than non-linear regression. The bioinactivation software can be relevant to the food and pharmaceutical industry, as well as to regulatory agencies, as part of a (quantitative) microbial risk assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
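Bioinactivation itself is an R package, but the Bigelow model it fits is simple enough to sketch independently. Below, hypothetical isothermal survivor data are fitted by non-linear regression with `scipy.optimize.curve_fit`, and an approximate 95% confidence interval for the D-value is read off the parameter covariance matrix (all numbers are invented, and this is not the package's own algorithm):

```python
import numpy as np
from scipy.optimize import curve_fit

def bigelow_log_count(t, logN0, D):
    """Bigelow first-order model: log10 N(t) = log10 N0 - t / D."""
    return logN0 - t / D

# Synthetic isothermal survivor curve: N0 = 1e6 CFU/ml, D = 2.5 min, small noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 11)
log_counts = 6.0 - t / 2.5 + rng.normal(0, 0.05, t.size)

popt, pcov = curve_fit(bigelow_log_count, t, log_counts, p0=(5.0, 1.0))
se = np.sqrt(np.diag(pcov))                              # standard errors
ci_D = (popt[1] - 1.96 * se[1], popt[1] + 1.96 * se[1])  # approx. 95% CI for D
```

The record's point about MCMC versus non-linear regression applies here too: the covariance-based interval above is a linearized approximation, whereas an MCMC posterior would characterize the uncertainty without that assumption.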

  13. MOJAVE. XV. VLBA 15 GHz Total Intensity and Polarization Maps of 437 Parsec-scale AGN Jets from 1996 to 2017

    NASA Astrophysics Data System (ADS)

    Lister, M. L.; Aller, M. F.; Aller, H. D.; Hodge, M. A.; Homan, D. C.; Kovalev, Y. Y.; Pushkarev, A. B.; Savolainen, T.

    2018-01-01

    We present 5321 mas-resolution total intensity and linear polarization maps of 437 active galactic nuclei (AGNs) obtained with the VLBA at 15 GHz as part of the MOJAVE survey, and also from the NRAO data archive. The former is a long-term program to study the structure and evolution of powerful parsec-scale outflows associated with AGNs. The targeted AGNs are drawn from several flux-limited radio and γ-ray samples, and all have correlated VLBA flux densities greater than ∼50 mJy at 15 GHz. Approximately 80% of these AGNs are associated with γ-ray sources detected by the Fermi LAT instrument. The vast majority were observed with the VLBA on 5–15 occasions between 1996 January 19 and 2016 December 26, at intervals ranging from a month to several years, with the most typical sampling interval being six months. A detailed analysis of the linear and circular polarization evolutions of these AGN jets is presented in the other papers in this series.

  14. A green vehicle routing problem with customer satisfaction criteria

    NASA Astrophysics Data System (ADS)

    Afshar-Bakeshloo, M.; Mehrabi, A.; Safari, H.; Maleki, M.; Jolai, F.

    2016-12-01

This paper develops an MILP model, named the Satisfactory-Green Vehicle Routing Problem (S-GVRP). It consists of routing a heterogeneous fleet of vehicles in order to serve a set of customers within predefined time windows. In this model, in addition to the traditional objective of the VRP, both pollution and customers' satisfaction are taken into account. Meanwhile, the introduced model provides an effective dashboard for decision-makers that determines appropriate routes, the best mixed fleet, and the speed and idle time of vehicles. Additionally, some new factors evaluate the greenness of each decision based on three criteria. The model applies piecewise linear functions (PLFs) to linearize a nonlinear fuzzy interval so that customers' satisfaction can be incorporated alongside the other linear objectives. We have presented a mixed integer linear programming formulation for the S-GVRP. This model enriches managerial insights by providing trade-offs between customers' satisfaction, total costs and emission levels. Finally, we have provided a numerical study showing the applicability of the model.
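The PLF device mentioned above can be illustrated in a few lines: a triangular satisfaction level over a time window is exactly representable by linear interpolation between breakpoints, which is what keeps the overall model linear. The breakpoints below are hypothetical, not taken from the paper:

```python
import numpy as np

# Hypothetical triangular satisfaction: full satisfaction at the preferred
# service time, dropping linearly to zero at the time-window edges.
times = np.array([0.0, 30.0, 60.0])   # earliest, preferred, latest (min)
sats = np.array([0.0, 1.0, 0.0])      # satisfaction in [0, 1]

def satisfaction(t):
    """Piecewise linear (PLF) satisfaction; in an MILP this becomes
    SOS2/lambda variables over the same breakpoints."""
    return np.interp(t, times, sats)
```

Inside an MILP the same breakpoints would be encoded with convex-combination (lambda) variables rather than evaluated directly, but the function value is identical.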

  15. Linear Polarization Properties of Parsec-Scale AGN Jets

    NASA Astrophysics Data System (ADS)

    Pushkarev, Alexander; Kovalev, Yuri; Lister, Matthew; Savolainen, Tuomas; Aller, Margo; Aller, Hugh; Hodge, Mary

    2017-12-01

We used 15 GHz multi-epoch Very Long Baseline Array (VLBA) polarization-sensitive observations of 484 sources within the time interval 1996–2016 from the MOJAVE program, and also from the NRAO data archive. We have analyzed the linear polarization characteristics of the compact core features and regions downstream, and their changes along and across the parsec-scale active galactic nuclei (AGN) jets. We detected a significant increase of fractional polarization with distance from the radio core along the jet as well as towards the jet edges. Compared to quasars, BL Lacs have a higher degree of polarization and exhibit more stable electric vector position angles (EVPAs) in their core features and a better alignment of the EVPAs with the local jet direction. The latter is accompanied by a higher degree of linear polarization, suggesting that compact bright jet features might be strong transverse shocks, which enhance magnetic field regularity by compression.
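The fractional linear polarization and EVPA quantities analyzed in this record follow directly from the Stokes parameters; a minimal sketch (the I, Q, U values are made up):

```python
import numpy as np

def linear_polarization(I, Q, U):
    """Fractional linear polarization and EVPA from Stokes parameters."""
    p = np.hypot(Q, U) / I                     # degree of linear polarization
    evpa = 0.5 * np.degrees(np.arctan2(U, Q))  # electric vector position angle (deg)
    return p, evpa

p, evpa = linear_polarization(I=1.0, Q=0.03, U=0.04)
```

The factor 0.5 reflects the 180-degree ambiguity of polarization position angles; comparing EVPAs across epochs, as done in the survey, additionally requires unwrapping modulo 180 degrees.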

  16. Do Generous Unemployment Benefit Programs Reduce Suicide Rates? A State Fixed-Effect Analysis Covering 1968–2008

    PubMed Central

    Cylus, Jonathan; Glymour, M. Maria; Avendano, Mauricio

    2014-01-01

    The recent economic recession has led to increases in suicide, but whether US state unemployment insurance programs ameliorate this association has not been examined. Exploiting US state variations in the generosity of benefit programs between 1968 and 2008, we tested the hypothesis that more generous unemployment benefit programs reduce the impact of economic downturns on suicide. Using state linear fixed-effect models, we found a negative additive interaction between unemployment rates and benefits among the US working-age (20–64 years) population (β = −0.57, 95% confidence interval: −0.86, −0.27; P < 0.001). The finding of a negative additive interaction was robust across multiple model specifications. Our results suggest that the impact of unemployment rates on suicide is offset by the presence of generous state unemployment benefit programs, though estimated effects are small in magnitude. PMID:24939978

  17. An algorithm for the numerical solution of linear differential games

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polovinkin, E S; Ivanov, G E; Balashov, M V

    2001-10-31

A numerical algorithm for the construction of stable Krasovskii bridges, Pontryagin alternating sets, and also of piecewise program strategies solving two-person linear differential (pursuit or evasion) games on a fixed time interval is developed on the basis of a general theory. The aim of the first player (the pursuer) is to hit a prescribed target (terminal) set by the phase vector of the control system at the prescribed time. The aim of the second player (the evader) is the opposite. A description of numerical algorithms used in the solution of differential games of the type under consideration is presented, and estimates of the errors resulting from the approximation of the game sets by polyhedra are given.

  18. Precision Interval Estimation of the Response Surface by Means of an Integrated Algorithm of Neural Network and Linear Regression

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.

    1999-01-01

The integration of Radial Basis Function Networks and Back Propagation Neural Networks with Multiple Linear Regression has been accomplished to map nonlinear response surfaces over a wide range of independent variables in the process of the Modern Design of Experiments. The integrated method is capable of estimating precision intervals, including confidence and prediction intervals. The power of the method has been demonstrated by applying it to a set of wind tunnel test data to construct a response surface and estimate its precision intervals.
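The confidence and prediction intervals referred to above are standard for the linear-regression part of the method; a self-contained sketch on synthetic data (the neural-network and response-surface components are not reproduced here):

```python
import numpy as np
from scipy import stats

def prediction_interval(x, y, x_new, level=0.95):
    """OLS line fit with t-based confidence and prediction intervals at x_new."""
    n = len(x)
    X = np.column_stack([np.ones(n), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (n - 2)                       # residual variance
    v = np.array([1.0, x_new])
    h = v @ np.linalg.inv(X.T @ X) @ v                 # leverage at x_new
    t = stats.t.ppf(0.5 + level / 2, df=n - 2)
    y_hat = beta[0] + beta[1] * x_new
    conf = t * np.sqrt(s2 * h)                         # confidence half-width
    pred = t * np.sqrt(s2 * (1 + h))                   # prediction half-width
    return y_hat, conf, pred

# Synthetic data: y = 2 + 3x plus noise.
rng = np.random.default_rng(2)
x = np.linspace(0, 10, 50)
y = 2 + 3 * x + rng.normal(0, 0.5, x.size)
y_hat, conf, pred = prediction_interval(x, y, x_new=5.0)
```

The prediction interval is always wider than the confidence interval, since it covers a new observation rather than the mean response.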

  19. The linear sizes tolerances and fits system modernization

    NASA Astrophysics Data System (ADS)

    Glukhov, V. I.; Grinevich, V. A.; Shalay, V. V.

    2018-04-01

The study addresses the urgent topic of ensuring the quality of technical products during tolerancing of component parts. The aim of the paper is to develop alternatives for improving the system of linear size tolerances and dimensional fits in the international standard ISO 286-1. The tasks of the work are, first, to additionally classify as linear sizes the linear coordinating sizes that determine the location of part features and, second, to justify the basic deviation of the tolerance interval for a feature's linear size. Geometrical modeling of real part features, together with analytical and experimental methods, is used in the research. It is shown that the linear coordinates are the dimensional basis of the features' linear sizes. To standardize the accuracy of linear coordinating sizes in all accuracy classes, it is sufficient to select in the standardized tolerance system only one tolerance interval with symmetrical deviations: Js for internal dimensional features (holes) and js for external features (shafts). The basic deviation of this coordinating tolerance is the average zero deviation, which coincides with the nominal value of the coordinating size. The other intervals of the tolerance system are retained for normalizing the accuracy of the features' linear sizes, with one fundamental change: the basic deviation of every tolerance interval becomes the limit deviation corresponding to the maximum-material limit of the feature, EI (the lower deviation) for internal features (holes) and es (the upper deviation) for external features (shafts). It is the maximum-material sizes that take part in the mating of shafts and holes and that determine the type of fit.

  20. Geomorphic domains and linear features on Landsat images, Circle Quadrangle, Alaska

    USGS Publications Warehouse

    Simpson, S.L.

    1984-01-01

A remote sensing study using Landsat images was undertaken as part of the Alaska Mineral Resource Assessment Program (AMRAP). Geomorphic domains A and B, identified on enhanced Landsat images, divide the Circle quadrangle south of the Tintina fault zone into two regional areas having major differences in surface characteristics. Domain A is a roughly rectangular, northeast-trending area of relatively low relief and simple, widely spaced drainages, except where igneous rocks are exposed. In contrast, domain B, which bounds two sides of domain A, is more intricately dissected, showing abrupt changes in slope and relatively high relief. The northwestern part of geomorphic domain A includes a previously mapped tectonostratigraphic terrane. The southeastern boundary of domain A occurs entirely within the adjoining tectonostratigraphic terrane. The sharp geomorphic contrast along the southeastern boundary of domain A and the existence of known faults along this boundary suggest that the southeastern part of domain A may be a subdivision of the adjoining terrane. Detailed field studies would be necessary to determine the characteristics of the subdivision. Domain B appears to be divisible into large areas of different geomorphic terrains by east-northeast-trending curvilinear lines drawn on Landsat images. Segments of two of these lines correlate with parts of boundaries of mapped tectonostratigraphic terranes. On Landsat images, prominent north-trending lineaments together with the curvilinear lines form a large-scale regional pattern that is transected by mapped north-northeast-trending high-angle faults. The lineaments indicate possible lithologic variations and/or structural boundaries. A statistical strike-frequency analysis of the linear feature data for the Circle quadrangle shows that northeast-trending linear features predominate throughout, and that most northwest-trending linear features are found south of the Tintina fault zone. A major trend interval of N.64-72E. in the linear feature data corresponds to the strike of foliations in metamorphic rocks and to magnetic anomalies reflecting compositional variations, suggesting that most linear features in the southern part of the quadrangle probably are related to lithologic variations brought about by folding and foliation of metamorphic rocks. A second important trend interval, N.14-35E., may be related to thrusting south of the Tintina fault zone, as high concentrations of linear features within this interval are found in areas of mapped thrusts. Low concentrations of linear features are found in areas of most igneous intrusives. High concentrations of linear features do not correspond to areas of mineralization in any consistent or significant way that would allow concentration patterns to be easily used as an aid in locating areas of mineralization. The results of this remote sensing study indicate that there are several possibly important areas where further detailed studies are warranted.

  1. Confidence Intervals for Assessing Heterogeneity in Generalized Linear Mixed Models

    ERIC Educational Resources Information Center

    Wagler, Amy E.

    2014-01-01

    Generalized linear mixed models are frequently applied to data with clustered categorical outcomes. The effect of clustering on the response is often difficult to practically assess partly because it is reported on a scale on which comparisons with regression parameters are difficult to make. This article proposes confidence intervals for…

  2. Hemostatic efficacy of local chitosan linear polymer granule in an experimental sheep model with severe bleeding of arteria and vena femoralis.

    PubMed

    Ersoy, Gürkan; Rodoplu, Ülkümen; Yılmaz, Osman; Gökmen, Necati; Doğan, Alper; Dikme, Özgür; Aydınoğlu, Aslı; Orhon, Okyanus

    2016-05-01

    The aim of the present study was to evaluate the hemostatic effect of chitosan linear polymer in a sheep model with femoral bleeding. Following induction of anesthesia and intubation of sheep, groin injury was induced to initiate hemorrhage. Animals were randomly assigned to study and control groups. In the control group, absorbent pads were packed on the wound, and pressure was supplied by a weight placed over the dressing. In the study group, chitosan linear polymer was poured onto the bleeding site; absorbent pads and pressure were applied in the same manner. At 5-min intervals, bleeding was evaluated. Primary endpoint was time to hemostasis. Bleeding had stopped by the 1st interval in 5 members of the study group, and by the 2nd interval in 1 member. One sheep was excluded. The bleeding stopped after the 1st interval in 1 member of the control group and after the 2nd interval in 4 members. Bleeding stopped in 2 cases following ligation of the bleeding vessel. Hemostasis was achieved earlier in the study group, compared to the control group, and the difference was statistically significant. Hemostasis was achieved earlier following application of chitosan linear polymer.

  3. An enhanced export coefficient based optimization model for supporting agricultural nonpoint source pollution mitigation under uncertainty.

    PubMed

    Rong, Qiangqiang; Cai, Yanpeng; Chen, Bing; Yue, Wencong; Yin, Xin'an; Tan, Qian

    2017-02-15

In this research, an export coefficient based dual inexact two-stage stochastic credibility constrained programming (ECDITSCCP) model was developed through integrating an improved export coefficient model (ECM), interval linear programming (ILP), fuzzy credibility constrained programming (FCCP) and a fuzzy expected value equation within a general two-stage programming (TSP) framework. The proposed ECDITSCCP model can effectively address multiple uncertainties expressed as random variables, fuzzy numbers, and pure and dual intervals. Also, the model can provide a direct linkage between pre-regulated management policies and the associated economic implications. Moreover, solutions under multiple credibility levels can be obtained, providing potential decision alternatives for decision makers. The proposed model was then applied to identify optimal land use structures for agricultural NPS pollution mitigation in a representative upstream subcatchment of the Miyun Reservoir watershed in north China. Optimal solutions of the model were successfully obtained, indicating the desired land use patterns and nutrient discharge schemes that achieve maximum agricultural system benefits under a limited discharge permit. Also, the numerous results under multiple credibility levels could provide policy makers with several options, helping them strike an appropriate balance between system benefits and pollution mitigation. The developed ECDITSCCP model can effectively address the uncertain information in agricultural systems and shows great applicability to land use adjustment for agricultural NPS pollution mitigation. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Using complexity metrics with R-R intervals and BPM heart rate measures.

    PubMed

    Wallot, Sebastian; Fusaroli, Riccardo; Tylén, Kristian; Jegindø, Else-Marie

    2013-01-01

Lately, growing attention in the health sciences has been paid to the dynamics of heart rate as an indicator of impending failures and for prognoses. Likewise, in the social and cognitive sciences, heart rate is increasingly employed as a measure of arousal and emotional engagement and as a marker of interpersonal coordination. However, there is no consensus about which measurements and analytical tools are most appropriate for mapping the temporal dynamics of heart rate, and quite different metrics are reported in the literature. As complexity metrics of heart rate variability depend critically on the variability of the data, different choices regarding the kind of measures can have a substantial impact on the results. In this article we compare linear and non-linear statistics on two prominent types of heart beat data, beat-to-beat intervals (R-R intervals) and beats-per-min (BPM). As a proof-of-concept, we employ a simple rest-exercise-rest task and show that non-linear statistics, fractal (DFA) and recurrence (RQA) analyses, reveal information about heart beat activity above and beyond the simple level of heart rate. Non-linear statistics unveil sustained post-exercise effects on heart rate dynamics, but their power to do so critically depends on the type of data that is employed: while R-R intervals are very amenable to non-linear analyses, the success of non-linear methods for BPM data critically depends on their construction. Generally, "oversampled" BPM time series can be recommended, as they retain most of the information about non-linear aspects of heart beat dynamics.
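The "oversampled" BPM series recommended in this record can be constructed by interpolating the instantaneous rate onto a uniform time grid; a minimal sketch (the 4 Hz sampling rate is a common but here assumed choice):

```python
import numpy as np

def oversampled_bpm(rr_ms, fs=4.0):
    """Resample instantaneous heart rate (BPM) from R-R intervals (ms)
    onto a uniform time grid at fs Hz -- an 'oversampled' BPM series."""
    rr_s = np.asarray(rr_ms, dtype=float) / 1000.0
    beat_times = np.cumsum(rr_s)           # time of each beat (s)
    bpm = 60.0 / rr_s                      # instantaneous rate at each beat
    grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    return grid, np.interp(grid, beat_times, bpm)

# Constant 800 ms R-R intervals (75 BPM) as a sanity check.
grid, bpm = oversampled_bpm([800] * 10)
```

The resulting evenly sampled series is then suitable for DFA or RQA, which assume a uniform sampling interval.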

  5. Using complexity metrics with R-R intervals and BPM heart rate measures

    PubMed Central

    Wallot, Sebastian; Fusaroli, Riccardo; Tylén, Kristian; Jegindø, Else-Marie

    2013-01-01

Lately, growing attention in the health sciences has been paid to the dynamics of heart rate as an indicator of impending failures and for prognoses. Likewise, in the social and cognitive sciences, heart rate is increasingly employed as a measure of arousal and emotional engagement and as a marker of interpersonal coordination. However, there is no consensus about which measurements and analytical tools are most appropriate for mapping the temporal dynamics of heart rate, and quite different metrics are reported in the literature. As complexity metrics of heart rate variability depend critically on the variability of the data, different choices regarding the kind of measures can have a substantial impact on the results. In this article we compare linear and non-linear statistics on two prominent types of heart beat data, beat-to-beat intervals (R-R intervals) and beats-per-min (BPM). As a proof-of-concept, we employ a simple rest-exercise-rest task and show that non-linear statistics, fractal (DFA) and recurrence (RQA) analyses, reveal information about heart beat activity above and beyond the simple level of heart rate. Non-linear statistics unveil sustained post-exercise effects on heart rate dynamics, but their power to do so critically depends on the type of data that is employed: while R-R intervals are very amenable to non-linear analyses, the success of non-linear methods for BPM data critically depends on their construction. Generally, “oversampled” BPM time series can be recommended, as they retain most of the information about non-linear aspects of heart beat dynamics. PMID:23964244

  6. Microgrid Optimal Scheduling With Chance-Constrained Islanding Capability

    DOE PAGES

    Liu, Guodong; Starke, Michael R.; Xiao, B.; ...

    2017-01-13

To facilitate the integration of variable renewable generation and improve the resilience of electricity supply in a microgrid, this paper proposes an optimal scheduling strategy for microgrid operation considering constraints of islanding capability. A new concept, the probability of successful islanding (PSI), indicating the probability that a microgrid maintains enough spinning reserve (both up and down) to meet local demand and accommodate local renewable generation after instantaneously islanding from the main grid, is developed. The PSI is formulated as a mixed-integer linear program using a multi-interval approximation that takes into account the probability distributions of the forecast errors of wind, PV and load. With the goal of minimizing the total operating cost while preserving a user-specified PSI, a chance-constrained optimization problem is formulated for the optimal scheduling of microgrids and solved by mixed-integer linear programming (MILP). Numerical simulations on a microgrid consisting of a wind turbine, a PV panel, a fuel cell, a micro-turbine, a diesel generator and a battery demonstrate the effectiveness of the proposed scheduling strategy. Lastly, we verify the relationship between PSI and various factors.
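Under a Gaussian assumption on the forecast errors, chance constraints of the PSI kind have a standard deterministic equivalent: the required spinning reserve is a quantile of the net imbalance. A small sketch (the error standard deviations and the 0.95 target are invented, and this is a simplification of the paper's multi-interval MILP approximation):

```python
import numpy as np
from scipy.stats import norm

# Forecast-error standard deviations (MW) for wind, PV and load; assumed
# independent and zero-mean Gaussian for this sketch.
sigma = np.array([5.0, 3.0, 4.0])
sigma_net = np.sqrt(np.sum(sigma**2))      # std of the net imbalance

alpha = 0.95                               # target probability of successful islanding
reserve_req = norm.ppf(alpha) * sigma_net  # deterministic-equivalent reserve (MW)
```

The constraint "reserve >= norm.ppf(alpha) * sigma_net" is linear in the scheduling variables, which is why it can be embedded directly in an MILP.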

  7. An interval-based possibilistic programming method for waste management with cost minimization and environmental-impact abatement under uncertainty.

    PubMed

    Li, Y P; Huang, G H

    2010-09-15

    Considerable public concerns have been raised in the past decades since a large amount of pollutant emissions from municipal solid waste (MSW) disposal of processes pose risks on surrounding environment and human health. Moreover, in MSW management, various uncertainties exist in the related costs, impact factors and objectives, which can affect the optimization processes and the decision schemes generated. In this study, an interval-based possibilistic programming (IBPP) method is developed for planning the MSW management with minimized system cost and environmental impact under uncertainty. The developed method can deal with uncertainties expressed as interval values and fuzzy sets in the left- and right-hand sides of constraints and objective function. An interactive algorithm is provided for solving the IBPP problem, which does not lead to more complicated intermediate submodels and has a relatively low computational requirement. The developed model is applied to a case study of planning a MSW management system, where mixed integer linear programming (MILP) technique is introduced into the IBPP framework to facilitate dynamic analysis for decisions of timing, sizing and siting in terms of capacity expansion for waste-management facilities. Three cases based on different waste-management policies are examined. The results obtained indicate that inclusion of environmental impacts in the optimization model can change the traditional waste-allocation pattern merely based on the economic-oriented planning approach. The results obtained can help identify desired alternatives for managing MSW, which has advantages in providing compromised schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty. Copyright 2010 Elsevier B.V. All rights reserved.
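The interval part of methods like the IBPP above is commonly handled by solving two deterministic submodels, one under the most favorable and one under the least favorable interval bounds, so the optimal cost itself comes out as an interval. A toy sketch with `scipy.optimize.linprog` (the two-activity cost problem is invented, not the paper's case study):

```python
import numpy as np
from scipy.optimize import linprog

# Interval costs for a tiny waste-allocation-style LP:
# min c.x  s.t.  x1 + x2 >= 100 (tonnes of waste to treat), x >= 0,
# with per-tonne costs c in [c_lo, c_hi].
c_lo = np.array([2.0, 3.0])
c_hi = np.array([4.0, 3.5])
A_ub, b_ub = [[-1.0, -1.0]], [-100.0]      # x1 + x2 >= 100 in <= form

f_lo = linprog(c_lo, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2).fun
f_hi = linprog(c_hi, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2).fun
# The optimal system cost is then reported as the interval [f_lo, f_hi].
```

Note that the two submodels may select different allocations (here all flow goes to facility 1 in the cheap case and facility 2 in the expensive one), which is exactly the kind of tradeoff information interval solutions convey to planners.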

  8. EIVAN - AN INTERACTIVE ORBITAL TRAJECTORY PLANNING TOOL

    NASA Technical Reports Server (NTRS)

    Brody, A. R.

    1994-01-01

The Interactive Orbital Trajectory planning Tool, EIVAN, is a forward-looking interactive orbit trajectory plotting tool for use with Proximity Operations (operations occurring within a one-kilometer sphere of the space station) and other maneuvers. The result of vehicle burns on-orbit is very difficult to anticipate because of non-linearities in the equations of motion governing orbiting bodies. EIVAN was developed to plot the resulting trajectories, to provide a better comprehension of orbital mechanics effects, and to help the user develop heuristics for on-orbit mission planning. EIVAN comprises a worksheet and a chart from Microsoft Excel on a Macintosh computer. The orbital path for a user-specified time interval is plotted given operator burn inputs. Fuel use is also calculated. After the thrust parameters (magnitude, direction, and time) are input, EIVAN plots the resulting trajectory. Up to five burns may be inserted at any time in the mission. Twenty data points are plotted for each burn, and the time interval can be varied to accommodate any desired time frame or degree of resolution. Since the number of data points for each burn is constant, the mission duration can be increased or decreased by increasing or decreasing the time interval. EIVAN runs under Microsoft Excel on a Macintosh running the Macintosh OS. A working knowledge of Excel is helpful, but not imperative, for interacting with EIVAN. The program was developed in 1989.
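EIVAN's own Excel implementation is not described in detail here, but proximity-operations trajectories of the kind it plots are commonly approximated by the linearized Clohessy-Wiltshire (Hill) equations. A sketch of the standard in-plane closed-form solution (the mean-motion value is illustrative, and this is not EIVAN's method):

```python
import numpy as np

def cw_propagate(x0, y0, vx0, vy0, n, t):
    """In-plane Clohessy-Wiltshire (Hill) state at time t for a chaser near a
    circular target orbit; x radial (outward), y along-track, n = mean motion."""
    s, c = np.sin(n * t), np.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = (6 * (s - n * t)) * x0 + y0 - (2 / n) * (1 - c) * vx0 \
        + ((4 * s - 3 * n * t) / n) * vy0
    return x, y

# A chaser offset purely along-track with zero relative velocity stays put
# in this linearized model -- one of the counterintuitive effects such
# plotting tools help users internalize.
n = 0.0011  # mean motion for a roughly 95-minute orbit (rad/s)
x, y = cw_propagate(0.0, 100.0, 0.0, 0.0, n, t=600.0)
```

A radial offset or any relative velocity, by contrast, produces the drifting, looping relative motion that makes burn outcomes hard to anticipate by intuition alone.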

  9. A Linear Programming Approach to the Development of Contrail Reduction Strategies Satisfying Operationally Feasible Constraints

    NASA Technical Reports Server (NTRS)

    Wei, Peng; Sridhar, Banavar; Chen, Neil Yi-Nan; Sun, Dengfeng

    2012-01-01

    A class of strategies has been proposed to reduce contrail formation in United States airspace. The cruising altitude of aircraft on a 3D grid built from weather data is adjusted to avoid persistent contrail potential areas while accounting for fuel efficiency. In this paper, the authors introduce a contrail avoidance strategy on the 3D grid that considers additional operationally feasible constraints from an air traffic controller's perspective. First, shifting too many aircraft to the same cruising level would make the miles-in-trail at that level smaller than the safety separation threshold, and the resulting high density of aircraft at one cruising level may exceed the traffic controller's workload capacity; the new model therefore restricts the total number of aircraft at each level. Second, the aircraft count cannot vary too drastically between successive intervals, since the workload of managing climbing/descending aircraft is much larger than that of managing cruising aircraft. The contrail reduction is formulated as an integer-programming problem, which is shown to have the property of total unimodularity. Solving the corresponding relaxed linear program with the simplex method therefore provides an optimal and integral solution to the problem. Simulation results are provided to illustrate the methodology.
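    The total-unimodularity argument can be illustrated on a toy level-assignment problem (the three-aircraft setup, costs, and capacities below are invented for illustration, not the paper's formulation): because the constraint matrix has network-flow structure, the LP relaxation already has an integral optimal vertex, so a simplex-type solver returns a valid assignment without branch-and-bound.

```python
from scipy.optimize import linprog

# 3 aircraft, 2 flight levels; x[i][j] = 1 if aircraft i flies level j.
# Costs reflect extra fuel for leaving the preferred (contrail-free) level.
cost = [0, 5,   3, 0,   0, 4]          # flattened x00,x01,x10,x11,x20,x21
A_eq = [[1, 1, 0, 0, 0, 0],            # each aircraft assigned exactly once
        [0, 0, 1, 1, 0, 0],
        [0, 0, 0, 0, 1, 1]]
b_eq = [1, 1, 1]
A_ub = [[1, 0, 1, 0, 1, 0],            # at most 2 aircraft per level
        [0, 1, 0, 1, 0, 1]]
b_ub = [2, 2]
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 6)
# Network-flow structure => totally unimodular constraint matrix,
# so the relaxed LP optimum is already integral.
assignment = [round(v) for v in res.x]
```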

  10. Correlated observations of three triggered lightning flashes

    NASA Technical Reports Server (NTRS)

    Idone, V. P.; Orville, R. E.; Hubert, P.; Barret, L.; Eybert-Berard, A.

    1984-01-01

    Three triggered lightning flashes, initiated during the Thunderstorm Research International Program (1981) at Langmuir Laboratory, New Mexico, are examined on the basis of three-dimensional return stroke propagation speeds and peak currents. Nonlinear relationships result between return stroke propagation speed and stroke peak current for 56 strokes, and between return stroke propagation speed and dart leader propagation speed for 32 strokes. Calculated linear correlation coefficients include dart leader propagation speed and ensuing return stroke peak current (32 strokes; r = 0.84); and stroke peak current and interstroke interval (69 strokes; r = 0.57). Earlier natural lightning data do not concur with the weak positive correlation between dart leader propagation speed and interstroke interval. Therefore, application of triggered lightning results to natural lightning phenomena must be made with certain caveats. Mean values are included for the three-dimensional return stroke propagation speed and for the three-dimensional dart leader propagation speed.

  11. MILP model for integrated balancing and sequencing mixed-model two-sided assembly line with variable launching interval and assignment restrictions

    NASA Astrophysics Data System (ADS)

    Azmi, N. I. L. Mohd; Ahmad, R.; Zainuddin, Z. M.

    2017-09-01

    This research explores the Mixed-Model Two-Sided Assembly Line (MMTSAL). There are two interrelated problems in MMTSAL: line balancing and model sequencing. In previous studies, many researchers considered these problems separately, and the few who studied them simultaneously did so only for one-sided lines. In this study, the two problems are solved simultaneously to obtain a more efficient solution. A Mixed Integer Linear Programming (MILP) model with the objectives of minimizing total utility work and idle time is formulated, considering a variable launching interval and assignment restriction constraints. The model is validated on small-size test cases. Numerical experiments were conducted using the General Algebraic Modelling System (GAMS) with the CPLEX solver. Experimental results indicate that integrating model sequencing and line balancing helps to minimise the proposed objective functions.

  12. Developing a predictive tropospheric ozone model for Tabriz

    NASA Astrophysics Data System (ADS)

    Khatibi, Rahman; Naghipour, Leila; Ghorbani, Mohammad A.; Smith, Michael S.; Karimi, Vahid; Farhoudi, Reza; Delafrouz, Hadi; Arvanaghi, Hadi

    2013-04-01

    Predictive ozone models are becoming indispensable tools by providing a capability for pollution alerts to serve people who are vulnerable to the risks. We have developed a tropospheric ozone prediction capability for Tabriz, Iran, using five modeling strategies: three regression-type methods (Multiple Linear Regression (MLR), Artificial Neural Networks (ANNs), and Gene Expression Programming (GEP)) and two auto-regression-type models (Nonlinear Local Prediction (NLP), implementing chaos theory, and Auto-Regressive Integrated Moving Average (ARIMA)). The regression-type strategies explain the data in terms of temperature, solar radiation, dew point temperature, and wind speed, whereas the auto-regression-type models regress present ozone values on their past values. The ozone time series are available at various time intervals, including hourly intervals, from August 2010 to March 2011. The results for the MLR, ANN and GEP models are not overly good, but those produced by NLP and ARIMA are promising for establishing a forecasting capability.
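    A sketch of the regression-type strategy on synthetic stand-in data (the coefficient values, predictor ranges, and noise level below are invented; the real study used measured Tabriz meteorological data):

```python
import numpy as np

# Synthetic hourly ozone regressed on temperature, solar radiation,
# dew point temperature and wind speed (invented "true" coefficients).
rng = np.random.default_rng(0)
n = 200
temp  = rng.uniform(10, 35, n)       # air temperature [C]
solar = rng.uniform(0, 900, n)       # solar radiation [W/m^2]
dew   = rng.uniform(-5, 15, n)       # dew point temperature [C]
wind  = rng.uniform(0, 8, n)         # wind speed [m/s]
ozone = 5.0 + 1.2 * temp + 0.03 * solar - 0.8 * dew + 2.0 * wind \
        + rng.normal(0, 3, n)        # linear signal plus noise

X = np.column_stack([np.ones(n), temp, solar, dew, wind])
coef, *_ = np.linalg.lstsq(X, ozone, rcond=None)   # MLR fit
pred = X @ coef
```

    The same design matrix could instead hold lagged ozone values, which is the auto-regression-type alternative the paper found more promising.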

  13. Piecewise linear approximation for hereditary control problems

    NASA Technical Reports Server (NTRS)

    Propst, Georg

    1987-01-01

    Finite dimensional approximations are presented for linear retarded functional differential equations by use of discontinuous piecewise linear functions. The approximation scheme is applied to optimal control problems in which a quadratic cost integral is minimized subject to the controlled retarded system. It is shown that the approximate optimal feedback operators converge to the true ones both when the cost integral ranges over a finite time interval and when it ranges over an infinite one. The argument in the latter case relies on the fact that the piecewise linear approximations to stable systems are stable in a uniform sense. This feature is established using a vector-component stability criterion in the state space R(n) x L(2) and the favorable eigenvalue behavior of the piecewise linear approximations.

  14. Programming with Intervals

    NASA Astrophysics Data System (ADS)

    Matsakis, Nicholas D.; Gross, Thomas R.

    Intervals are a new, higher-level primitive for parallel programming with which programmers directly construct the program schedule. Programs using intervals can be statically analyzed to ensure that they do not deadlock or contain data races. In this paper, we demonstrate the flexibility of intervals by showing how to use them to emulate common parallel control-flow constructs like barriers and signals, as well as higher-level patterns such as bounded-buffer producer-consumer. We have implemented intervals as a publicly available library for Java and Scala.

  15. Fault detection for discrete-time LPV systems using interval observers

    NASA Astrophysics Data System (ADS)

    Zhang, Zhi-Hui; Yang, Guang-Hong

    2017-10-01

    This paper is concerned with the fault detection (FD) problem for discrete-time linear parameter-varying systems subject to bounded disturbances. A parameter-dependent FD interval observer is designed based on parameter-dependent Lyapunov and slack matrices. The design method is presented by translating the parameter-dependent linear matrix inequalities (LMIs) into finite ones. In contrast to the existing results based on parameter-independent and diagonal Lyapunov matrices, the derived disturbance attenuation, fault sensitivity and nonnegative conditions lead to less conservative LMI characterisations. Furthermore, without the need to design the residual evaluation functions and thresholds, the residual intervals generated by the interval observers are used directly for FD decision. Finally, simulation results are presented for showing the effectiveness and superiority of the proposed method.
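    The residual-interval decision rule (declare a fault as soon as zero leaves the residual interval) can be sketched on a scalar toy system; the paper's LMI-based LPV observer design itself is not reproduced here, and the system numbers below are invented:

```python
# Scalar toy sketch of interval-observer fault detection: a fault is
# declared when the measured output leaves the interval [x_lo, x_hi]
# consistent with the bounded disturbance |w| <= w_bar.
a, b, w_bar = 0.8, 1.0, 0.1        # a >= 0 keeps the interval dynamics monotone

def simulate(n_steps, fault_at):
    x = x_lo = x_hi = 0.0
    alarms = []
    for k in range(n_steps):
        u = 1.0
        w = 0.05 * (-1) ** k                 # bounded disturbance
        f = 1.0 if k >= fault_at else 0.0    # additive actuator fault
        x = a * x + b * u + w + f            # true (faulty) plant
        x_hi = a * x_hi + b * u + w_bar      # upper observer bound
        x_lo = a * x_lo + b * u - w_bar      # lower observer bound
        y = x                                # full-state measurement
        # FD decision: alarm iff 0 is outside the residual interval,
        # i.e. y is outside [x_lo, x_hi].
        alarms.append(not (x_lo <= y <= x_hi))
    return alarms

alarms = simulate(20, fault_at=10)
```

    No evaluation function or threshold is tuned: the bounds themselves carry the robustness, which is the point the abstract makes.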

  16. Interval-valued intuitionistic fuzzy matrix games based on Archimedean t-conorm and t-norm

    NASA Astrophysics Data System (ADS)

    Xia, Meimei

    2018-04-01

    Fuzzy game theory has been applied in many decision-making problems. The matrix game with interval-valued intuitionistic fuzzy numbers (IVIFNs) is investigated based on Archimedean t-conorm and t-norm. The existing matrix games with IVIFNs are all based on the Algebraic t-conorm and t-norm, which are special cases of the Archimedean ones. In this paper, intuitionistic fuzzy aggregation operators based on Archimedean t-conorm and t-norm are employed to aggregate the payoffs of players. To derive the solution of the matrix game with IVIFNs, several mathematical programming models are developed based on Archimedean t-conorm and t-norm. The proposed models can be transformed into a pair of primal-dual linear programming models, from which the solution of the matrix game with IVIFNs is obtained. It is proved that the theorems valid in the existing matrix games with IVIFNs remain true when the general aggregation operator is used in the proposed matrix game. The proposed method is an extension of the existing ones and can provide more choices for players. An example is given to illustrate the validity and the applicability of the proposed method.
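    The reduction of a matrix game to a primal-dual pair of linear programs, which the paper generalizes to IVIFN payoffs, looks as follows in the classical crisp case (the payoff numbers are invented for illustration):

```python
import numpy as np
from scipy.optimize import linprog

A = np.array([[2.0, -1.0],
              [-1.0, 1.0]])          # row player's payoff matrix
m, n = A.shape
# Variables: p_1..p_m (row mixed strategy) and v (game value); maximise v.
c = np.zeros(m + 1)
c[-1] = -1.0                         # linprog minimises, so minimise -v
# v - p^T A[:, j] <= 0 for every pure strategy j of the column player
A_ub = np.hstack([-A.T, np.ones((n, 1))])
b_ub = np.zeros(n)
A_eq = [[1.0] * m + [0.0]]           # probabilities sum to one
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, 1)] * m + [(None, None)])
p, value = res.x[:m], res.x[-1]
```

    The column player's optimal strategy is recovered from the dual of the same LP, which is the primal-dual pairing the abstract refers to.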

  17. Menu-Driven Solver Of Linear-Programming Problems

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.; Ferencz, D.

    1992-01-01

    Program assists inexperienced user in formulating linear-programming problems. A Linear Program Solver (ALPS) computer program is full-featured LP analysis program. Solves plain linear-programming problems as well as more-complicated mixed-integer and pure-integer programs. Also contains efficient technique for solution of purely binary linear-programming problems. Written entirely in IBM's APL2/PC software, Version 1.01. Packed program contains licensed material, property of IBM (copyright 1988, all rights reserved).

  18. Efficient parallel architecture for highly coupled real-time linear system applications

    NASA Technical Reports Server (NTRS)

    Carroll, Chester C.; Homaifar, Abdollah; Barua, Soumavo

    1988-01-01

    A systematic procedure is developed for exploiting the parallel constructs of computation in a highly coupled, linear system application. An overall top-down design approach is adopted. Differential equations governing the application under consideration are partitioned into subtasks on the basis of a data flow analysis. The interconnected task units constitute a task graph which has to be computed in every update interval. Multiprocessing concepts utilizing parallel integration algorithms are then applied for efficient task graph execution. A simple scheduling routine is developed to handle task allocation while in the multiprocessor mode. Results of simulation and scheduling are compared on the basis of standard performance indices. Processor timing diagrams are developed on the basis of program output accruing to an optimal set of processors. Basic architectural attributes for implementing the system are discussed together with suggestions for processing element design. Emphasis is placed on flexible architectures capable of accommodating widely varying application specifics.

  19. Zero entropy continuous interval maps and MMLS-MMA property

    NASA Astrophysics Data System (ADS)

    Jiang, Yunping

    2018-06-01

    We prove that the flow generated by any continuous interval map with zero topological entropy is minimally mean-attractable and minimally mean-L-stable. One of the consequences is that any oscillating sequence is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy. In particular, the Möbius function is linearly disjoint from all flows generated by all continuous interval maps with zero topological entropy (Sarnak’s conjecture for continuous interval maps). Another consequence is a non-trivial example of a flow having discrete spectrum. We also define a log-uniform oscillating sequence and show a result in ergodic theory for comparison. This material is based upon work supported by the National Science Foundation. It is also partially supported by a collaboration grant from the Simons Foundation (grant number 523341) and PSC-CUNY awards and a grant from NSFC (grant number 11571122).

  20. Semilinear programming: applications and implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohan, S.

    Semilinear programming is a method of solving optimization problems with linear constraints where the non-negativity restrictions on the variables are dropped and the objective function coefficients can take on different values depending on whether the variable is positive or negative. The simplex method for linear programming is modified in this thesis to solve general semilinear and piecewise linear programs efficiently without having to transform them into equivalent standard linear programs. Several models in widely different areas of optimization, such as production smoothing, facility location, goal programming and L/sub 1/ estimation, are presented first to demonstrate the compact formulations that arise when such problems are posed as semilinear programs. A code, SLP, is constructed using the semilinear programming techniques. Problems in aggregate planning and L/sub 1/ estimation are solved using SLP and, as equivalent linear programs, using a linear programming simplex code. Comparisons of CPU times and numbers of iterations indicate SLP to be far superior. The semilinear programming techniques are extended to piecewise linear programming in the implementation of the code PLP. Piecewise linear models in aggregate planning are solved using PLP and, as equivalent standard linear programs, using a simple upper-bounded linear programming code, SUBLP.
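    For contrast with SLP's direct approach, here is the standard transformation it is designed to avoid: L1 estimation recast as an equivalent, but larger, linear program by splitting each residual into non-negative parts (the data points are invented; the outlier shows why the L1 objective is attractive):

```python
import numpy as np
from scipy.optimize import linprog

# L1 (least-absolute-deviation) line fit via the classic LP expansion.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 5.0, 7.0, 30.0])     # one gross outlier
n, p = len(x), 2
X = np.column_stack([np.ones(n), x])
# Variables: beta (p, free), u (n, >= 0), w (n, >= 0); residual r = u - w.
c = np.concatenate([np.zeros(p), np.ones(2 * n)])   # minimise sum(u + w)
A_eq = np.hstack([X, np.eye(n), -np.eye(n)])        # X beta + u - w = y
res = linprog(c, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * p + [(0, None)] * (2 * n))
beta = res.x[:p]          # robust fit: intercept ~1, slope ~2
```

    The semilinear formulation would instead keep a single free residual variable per observation with sign-dependent costs, roughly halving the variable count; that compactness is the thesis's selling point.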

  1. A combined linear optimisation methodology for water resources allocation in Alfeios River Basin (Greece) under uncertain and vague system conditions

    NASA Astrophysics Data System (ADS)

    Bekri, Eleni; Yannopoulos, Panayotis; Disse, Markus

    2013-04-01

    In the present study, a combined linear programming methodology, based on Li et al. (2010) and Bekri et al. (2012), is employed for optimizing water allocation under uncertain system conditions in the Alfeios River Basin, in Greece. The Alfeios River is a water resources system of great natural, ecological, social and economic importance for Western Greece, since it has the longest and highest flow rate watercourse in the Peloponnisos region. Moreover, the river basin was exposed in the last decades to a plethora of environmental stresses (e.g. hydrogeological alterations, intensively irrigated agriculture, surface and groundwater overexploitation and infrastructure developments), resulting in the degradation of its quantitative and qualitative characteristics. As in most Mediterranean countries, water resource management in Alfeios River Basin has been focused up to now on an essentially supply-driven approach. It is still characterized by a lack of effective operational strategies. Authority responsibility relationships are fragmented, and law enforcement and policy implementation are weak. The present regulated water allocation puzzle entails a mixture of hydropower generation, irrigation, drinking water supply and recreational activities. Under these conditions its water resources management is characterised by high uncertainty and by vague and imprecise data. The considered methodology has been developed in order to deal with uncertainties expressed as either probability distributions, or/and fuzzy boundary intervals, derived by associated α-cut levels. In this framework a set of deterministic submodels is studied through linear programming. The ad hoc water resources management and alternative management patterns in an Alfeios subbasin are analyzed and evaluated under various scenarios, using the above mentioned methodology, aiming to promote a sustainable and equitable water management. Li, Y.P., Huang, G.H. 
and Nie, S.L. (2010), Planning water resources management systems using a fuzzy-boundary interval-stochastic programming method, Advances in Water Resources, 33: 1105-1117. doi:10.1016/j.advwatres.2010.06.015. Bekri, E.S., Disse, M. and Yannopoulos, P.C. (2012), Methodological framework for correction of quick river discharge measurements using quality characteristics, Session of Environmental Hydraulics - Hydrodynamics, 2nd Common Conference of Hellenic Hydrotechnical Association and Greek Committee for Water Resources Management, 546-557 (in Greek).

  2. Event-triggered fault detection for a class of discrete-time linear systems using interval observers.

    PubMed

    Zhang, Zhi-Hui; Yang, Guang-Hong

    2017-05-01

    This paper provides a novel event-triggered fault detection (FD) scheme for discrete-time linear systems. First, an event-triggered interval observer is proposed to generate the upper and lower residuals by taking into account the influence of the disturbances and the event error. Second, the robustness of the residual interval against the disturbances and the fault sensitivity are improved by introducing l1 and H∞ performances. Third, dilated linear matrix inequalities are used to decouple the Lyapunov matrices from the system matrices. The nonnegative conditions for the estimation error variables are presented with the aid of the slack matrix variables. This technique allows considering a more general Lyapunov function. Furthermore, the FD decision scheme is proposed by monitoring whether the zero value belongs to the residual interval. It is shown that the information communication burden is reduced by designing the event-triggering mechanism, while the FD performance can still be guaranteed. Finally, simulation results demonstrate the effectiveness of the proposed method. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  3. Do generous unemployment benefit programs reduce suicide rates? A state fixed-effect analysis covering 1968-2008.

    PubMed

    Cylus, Jonathan; Glymour, M Maria; Avendano, Mauricio

    2014-07-01

    The recent economic recession has led to increases in suicide, but whether US state unemployment insurance programs ameliorate this association has not been examined. Exploiting US state variations in the generosity of benefit programs between 1968 and 2008, we tested the hypothesis that more generous unemployment benefit programs reduce the impact of economic downturns on suicide. Using state linear fixed-effect models, we found a negative additive interaction between unemployment rates and benefits among the US working-age (20-64 years) population (β = -0.57, 95% confidence interval: -0.86, -0.27; P < 0.001). The finding of a negative additive interaction was robust across multiple model specifications. Our results suggest that the impact of unemployment rates on suicide is offset by the presence of generous state unemployment benefit programs, though estimated effects are small in magnitude. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  4. The Effects of Linear and Modified Linear Programed Materials on the Achievement of Slow Learners in Tenth Grade BSCS Special Materials Biology.

    ERIC Educational Resources Information Center

    Moody, John Charles

    Assessed were the effects of linear and modified linear programed materials on the achievement of slow learners in tenth grade Biological Sciences Curriculum Study (BSCS) Special Materials biology. Two hundred and six students were randomly placed into four programed materials formats: linear programed materials, modified linear program with…

  5. Hierarchical tone mapping for high dynamic range image visualization

    NASA Astrophysics Data System (ADS)

    Qiu, Guoping; Duan, Jiang

    2005-07-01

    In this paper, we present a computationally efficient, easy-to-use tone mapping technique for the visualization of high dynamic range (HDR) images on low dynamic range (LDR) reproduction devices. The new method, termed the hierarchical nonlinear linear (HNL) tone-mapping operator, maps the pixels in two hierarchical steps. The first step allocates appropriate numbers of LDR display levels to different HDR intensity intervals according to the pixel densities of the intervals. The second step linearly maps the HDR intensity intervals to their allocated LDR display levels. In the developed HNL scheme, the assignment of LDR display levels to HDR intensity intervals is controlled by a very simple and flexible formula with a single adjustable parameter. We also show that the new operator can be used for the effective enhancement of ordinary images.
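    A toy sketch of the two HNL steps, with the interval count, level budget, and the density/uniform blending parameter all assumed rather than taken from the paper (the paper's single adjustable parameter is only presumed to play a similar role):

```python
import numpy as np

def hnl_tone_map(hdr, n_intervals=8, out_levels=256, alpha=0.75):
    """Toy two-step HNL-style mapping (illustrative, not the paper's formula)."""
    lo, hi = hdr.min(), hdr.max()
    edges = np.linspace(lo, hi, n_intervals + 1)
    counts, _ = np.histogram(hdr, bins=edges)
    density = counts / counts.sum()
    # Step 1: allocate display levels to each HDR interval by pixel density,
    # blended with a uniform share so sparse intervals still get levels.
    share = alpha * density + (1 - alpha) / n_intervals
    levels = np.maximum(1, np.round(share * (out_levels - n_intervals)))
    starts = np.concatenate([[0], np.cumsum(levels)[:-1]])
    # Step 2: map each interval linearly onto its allocated level range.
    idx = np.clip(np.digitize(hdr, edges) - 1, 0, n_intervals - 1)
    frac = (hdr - edges[idx]) / (edges[idx + 1] - edges[idx])
    return starts[idx] + frac * levels[idx]

hdr = np.random.default_rng(1).lognormal(mean=0.0, sigma=2.0, size=10_000)
ldr = hnl_tone_map(hdr)
```

    Densely populated intervals receive more display levels, so frequent intensities keep their contrast, while the per-interval mapping stays linear and cheap.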

  6. Improvements In Ball-Screw Linear Actuators

    NASA Technical Reports Server (NTRS)

    Iskenderian, Theodore; Joffe, Benjamin; Summers, Robert

    1996-01-01

    Report describes modifications of design of type of ball-screw linear actuator driven by dc motor, with linear-displacement feedback via linear variable-differential transformer (LVDT). Actuators used to position spacecraft engines to direct thrust. Modifications directed toward ensuring reliable and predictable operation during planned 12-year cruise and interval of hard use at end of cruise.

  7. Hurst Estimation of Scale Invariant Processes with Stationary Increments and Piecewise Linear Drift

    NASA Astrophysics Data System (ADS)

    Modarresi, N.; Rezakhah, S.

    The characteristic feature of discrete scale invariant (DSI) processes is the invariance of their finite-dimensional distributions under dilation by a certain scaling factor. A DSI process with piecewise linear drift and stationary increments inside prescribed scale intervals is introduced and studied. To identify the structure of the process, we first determine the scale intervals and their linear drifts, and eliminate the drifts. Then, a new method for estimating the Hurst parameter of such DSI processes is presented and applied to a period of the Dow Jones indices. The method is based on a fixed number of equally spaced samples inside successive scale intervals. We also present an efficient method for estimating the Hurst parameter of self-similar processes with stationary increments, and compare its performance with the celebrated FA, DFA and DMA methods on simulated fractional Brownian motion (fBm) data.

  8. A symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming.

    PubMed

    Liu, Jing; Duan, Yongrui; Sun, Min

    2017-01-01

    This paper introduces a symmetric version of the generalized alternating direction method of multipliers for two-block separable convex programming with linear equality constraints, which inherits the superiorities of the classical alternating direction method of multipliers (ADMM), and which extends the feasible set of the relaxation factor α of the generalized ADMM to the infinite interval [Formula: see text]. Under the conditions that the objective function is convex and the solution set is nonempty, we establish the convergence results of the proposed method, including the global convergence, the worst-case [Formula: see text] convergence rate in both the ergodic and the non-ergodic senses, where k denotes the iteration counter. Numerical experiments to decode a sparse signal arising in compressed sensing are included to illustrate the efficiency of the new method.

  9. Fuzzy risk explicit interval linear programming model for end-of-life vehicle recycling planning in the EU.

    PubMed

    Simic, Vladimir

    2015-01-01

    End-of-life vehicles (ELVs) are vehicles that have reached the end of their useful lives and are no longer registered or licensed for use. The ELV recycling problem has become very serious in the last decade, and more and more effort is being made to reduce the impact of ELVs on the environment. This paper proposes a fuzzy risk explicit interval linear programming model for ELV recycling planning in the EU. It has the advantage of reflecting uncertainties presented as intervals in ELV recycling systems and fuzziness in decision makers' preferences. The formulated model is applied to a numerical study in which different decision maker types and several ELV types under two EU ELV Directive legislative cases are examined, in order to assess the influence of the decision maker type, the α-cut level, the EU ELV Directive and the ELV type on decisions about procuring vehicle hulks, storing unprocessed hulks, sorting generated material fractions, allocating sorted waste flows and allocating sorted metals. The decision maker type can influence the quantity of vehicle hulks kept in storage, but neither the EU ELV Directive nor the decision maker type influences which vehicle hulk type is stored. Vehicle hulk type, the EU ELV Directive and decision maker type do not influence the creation of metal allocation plans, since each isolated metal has its regular destination. The valid EU ELV Directive eco-efficiency quotas can be reached even when advanced thermal treatment plants are excluded from the ELV recycling process. The introduction of the stringent eco-efficiency quotas will significantly reduce the quantities of land-filled waste fractions regardless of the type of decision makers who manage the vehicle recycling system. In order to reach these stringent quotas, significant quantities of sorted waste need to be processed in advanced thermal treatment plants.
The proposed model can serve as support for European vehicle recycling managers in creating more successful ELV recycling plans. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. An Instructional Note on Linear Programming--A Pedagogically Sound Approach.

    ERIC Educational Resources Information Center

    Mitchell, Richard

    1998-01-01

    Discusses the place of linear programming in college curricula and the advantages of using linear-programming software. Lists important characteristics of computer software used in linear programming for more effective teaching and learning. (ASK)

  11. Interval Timing Accuracy and Scalar Timing in C57BL/6 Mice

    PubMed Central

    Buhusi, Catalin V.; Aziz, Dyana; Winslow, David; Carter, Rickey E.; Swearingen, Joshua E.; Buhusi, Mona C.

    2010-01-01

    In many species, interval timing behavior is accurate—appropriate estimated durations—and scalar—errors vary linearly with estimated durations. While accuracy has been previously examined, scalar timing has not been yet clearly demonstrated in house mice (Mus musculus), raising concerns about mouse models of human disease. We estimated timing accuracy and precision in C57BL/6 mice, the most used background strain for genetic models of human disease, in a peak-interval procedure with multiple intervals. Both when timing two intervals (Experiment 1) or three intervals (Experiment 2), C57BL/6 mice demonstrated varying degrees of timing accuracy. Importantly, both at individual and group level, their precision varied linearly with the subjective estimated duration. Further evidence for scalar timing was obtained using an intraclass correlation statistic. This is the first report of consistent, reliable scalar timing in a sizable sample of house mice, thus validating the PI procedure as a valuable technique, the intraclass correlation statistic as a powerful test of the scalar property, and the C57BL/6 strain as a suitable background for behavioral investigations of genetically engineered mice modeling disorders of interval timing. PMID:19824777

  12. Carrots and sticks: impact of an incentive/disincentive employee flexible credit benefit plan on health status and medical costs.

    PubMed

    Stein, A D; Karel, T; Zuidema, R

    1999-01-01

    Employee wellness programs aim to assist in controlling employer costs by improving the health status and fitness of employees, potentially increasing productivity, decreasing absenteeism, and reducing medical claims. Most such programs offer no disincentive for nonparticipation. We evaluated an incentive/disincentive program initiated by a large teaching hospital in western Michigan. The HealthPlus Health Quotient program is an incentive/disincentive approach to health promotion. The employer's contribution to the cafeteria plan benefit package is adjusted based on results of an annual appraisal of serum cholesterol, blood pressure, tobacco use, body fat, physical fitness, motor vehicle safety, nutrition, and alcohol consumption. The adjustment (health quotient [HQ]) can range from -$25 to +$25 per pay period. We examined whether appraised health improved between 1993 and 1996 and whether the HQ predicted medical claims. Mean HQ increased slightly (+$0.47 per pay period in 1993 to +$0.89 per pay period in 1996). Individuals with HQs of less than -$10 per pay period incurred approximately twice the medical claims of the other groups (test for linear trend, p = .003). After adjustment, medical claims of employees in the worst category (HQ < -$10 per pay period) were $1078 (95% confidence interval $429-$1728) greater than those for the neutral (HQ between -$2 and +$2 per pay period) category. A decrease in HQ of at least $6 per pay period from 1993 to 1995 was associated with $956 (95% confidence interval $264-$1647) greater costs in 1996 than was a stable HQ. The HealthPlus Health Quotient program is starting to yield benefits. Most employees are impacted minimally, but savings are accruing to the employer from reductions in medical claims paid and in days lost to illness and disability.

  13. Electronic circuit delivers pulse of high interval stability

    NASA Technical Reports Server (NTRS)

    Fisher, B.

    1966-01-01

    Circuit generates a pulse of high interval stability with a complexity level considerably below systems of comparable stability. This circuit is being used as a linear frequency discriminator in the signal conditioner of the Apollo command module.

  14. Reanalysis of cancer mortality in Japanese A-bomb survivors exposed to low doses of radiation: bootstrap and simulation methods

    PubMed Central

    2009-01-01

    Background The International Commission on Radiological Protection (ICRP) recommended annual occupational dose limit is 20 mSv. Cancer mortality in Japanese A-bomb survivors exposed to less than 20 mSv external radiation in 1945 was analysed previously, using a latency model with non-linear dose response. Questions were raised regarding statistical inference with this model. Methods Cancers with over 100 deaths in the 0-20 mSv subcohort of the 1950-1990 Life Span Study are analysed with Poisson regression models incorporating latency, allowing linear and non-linear dose response. Bootstrap percentile and bias-corrected accelerated (BCa) methods and simulation of the likelihood ratio test yield confidence intervals for excess relative risk (ERR) and tests against the linear model. Results The linear model shows significant, large, positive values of ERR for liver and urinary cancers at latencies of 37-43 years. Dose response below 20 mSv is strongly non-linear at the optimal latencies for the stomach (11.89 years), liver (36.9), lung (13.6), leukaemia (23.66), and pancreas (11.86), and across broad latency ranges. Confidence intervals for ERR are comparable between the bootstrap and likelihood ratio test methods, and BCa 95% confidence intervals are strictly positive across latency ranges for all five cancers. Similar risk estimates for 10 mSv (lagged dose) are obtained from the 0-20 mSv and 5-500 mSv data for the stomach, liver, lung and leukaemia; dose response for the latter three cancers is significantly non-linear in the 5-500 mSv range. Conclusion Liver and urinary cancer mortality risk is significantly raised using a latency model with linear dose response. A non-linear model is strongly superior for the stomach, liver, lung, pancreas and leukaemia. Bootstrap and likelihood-based confidence intervals are broadly comparable, and ERR is strictly positive by bootstrap methods for all five cancers. Except for the pancreas, similar estimates of latency and risk from 10 mSv are obtained from the 0-20 mSv and 5-500 mSv subcohorts. Large and significant cancer risks for Japanese survivors exposed to less than 20 mSv external radiation from the atomic bombs in 1945 cast doubt on the ICRP recommended annual occupational dose limit. PMID:20003238
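
    The percentile bootstrap used above is straightforward to sketch. The following is a minimal illustration on synthetic data (the `percentile_ci` helper and the toy ratios are hypothetical, not values from the Life Span Study):

```python
import random

def percentile_ci(data, stat, n_boot=2000, alpha=0.05, seed=1):
    """Percentile bootstrap: resample with replacement, read off quantiles."""
    rng = random.Random(seed)
    boots = sorted(stat(rng.choices(data, k=len(data)))
                   for _ in range(n_boot))
    return boots[int(n_boot * alpha / 2)], boots[int(n_boot * (1 - alpha / 2)) - 1]

def mean(xs):
    return sum(xs) / len(xs)

# Toy risk-ratio-style observations (illustrative only)
ratios = [0.8, 1.1, 1.4, 0.9, 1.6, 1.2, 1.0, 1.3, 1.5, 0.7]
lo, hi = percentile_ci(ratios, mean)
```

    The BCa variant additionally corrects these quantiles for bias and acceleration; statistical libraries provide that correction ready-made.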

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Adair, Kristin Lynn; Durfee, Justin David.

    When developing linear programming models, issues such as budget limitations, customer requirements, or licensing may preclude the use of commercial linear programming solvers. In such cases, one option is to use an open-source linear programming solver. A survey of linear programming tools was conducted to identify potential open-source solvers. From this survey, four open-source solvers were tested using a collection of linear programming test problems and the results were compared to IBM ILOG CPLEX Optimizer (CPLEX) [1], an industry standard. The solvers considered were: COIN-OR Linear Programming (CLP) [2], [3], GNU Linear Programming Kit (GLPK) [4], lp_solve [5] and Modular In-core Nonlinear Optimization System (MINOS) [6]. As no open-source solver outperforms CPLEX, this study demonstrates the power of commercial linear programming software. CLP was found to be the top-performing open-source solver considered in terms of capability and speed. GLPK also performed well but cannot match the speed of CLP or CPLEX. lp_solve and MINOS were considerably slower and encountered issues when solving several test problems.
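
    All of the solvers surveyed consume problems of the same mathematical form. As a toy illustration only, a two-variable instance can be solved by brute-force enumeration of the feasible polygon's vertices (not the simplex or interior-point methods the solvers above actually use):

```python
def solve_lp_2d(cons, obj):
    """Maximize obj . (x, y) over {a*x + b*y <= c} by vertex enumeration.
    Each constraint is a triple (a, b, c); works only for 2 variables."""
    best = None
    for i in range(len(cons)):
        for j in range(i + 1, len(cons)):
            (a1, b1, c1), (a2, b2, c2) = cons[i], cons[j]
            det = a1 * b2 - a2 * b1
            if abs(det) < 1e-12:
                continue                       # parallel constraint lines
            x = (c1 * b2 - c2 * b1) / det      # Cramer's rule intersection
            y = (a1 * c2 - a2 * c1) / det
            if all(a * x + b * y <= c + 1e-9 for a, b, c in cons):
                val = obj[0] * x + obj[1] * y
                if best is None or val > best[0]:
                    best = (val, x, y)
    return best

# maximize 3x + 2y  s.t.  x + y <= 4,  x + 3y <= 6,  x >= 0,  y >= 0
cons = [(1, 1, 4), (1, 3, 6), (-1, 0, 0), (0, -1, 0)]
best = solve_lp_2d(cons, (3, 2))
```

    The optimum of a feasible bounded LP always lies at a vertex, which is why this enumeration (and, at scale, the simplex method's vertex-to-vertex walk) works.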

  16. Predicting birth weight with conditionally linear transformation models.

    PubMed

    Möst, Lisa; Schmid, Matthias; Faschingbauer, Florian; Hothorn, Torsten

    2016-12-01

    Low and high birth weight (BW) are important risk factors for neonatal morbidity and mortality. Gynecologists must therefore accurately predict BW before delivery. Most prediction formulas for BW are based on prenatal ultrasound measurements carried out within one week prior to birth. Although successfully used in clinical practice, these formulas focus on point predictions of BW but do not systematically quantify uncertainty of the predictions, i.e. they result in estimates of the conditional mean of BW but do not deliver prediction intervals. To overcome this problem, we introduce conditionally linear transformation models (CLTMs) to predict BW. Instead of focusing only on the conditional mean, CLTMs model the whole conditional distribution function of BW given prenatal ultrasound parameters. Consequently, the CLTM approach delivers both point predictions of BW and fetus-specific prediction intervals. Prediction intervals constitute an easy-to-interpret measure of prediction accuracy and allow identification of fetuses subject to high prediction uncertainty. Using a data set of 8712 deliveries at the Perinatal Centre at the University Clinic Erlangen (Germany), we analyzed variants of CLTMs and compared them to standard linear regression estimation techniques used in the past and to quantile regression approaches. The best-performing CLTM variant was competitive with quantile regression and linear regression approaches in terms of conditional coverage and average length of the prediction intervals. We propose that CLTMs be used because they are able to account for possible heteroscedasticity, kurtosis, and skewness of the distribution of BWs. © The Author(s) 2014.
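
    A crude stand-in for such prediction intervals, omitting the conditional modelling that is the point of CLTMs, is to attach empirical residual quantiles to a point prediction. A hedged sketch with hypothetical numbers:

```python
def prediction_interval(y_hat, residuals, alpha=0.05):
    """Point prediction plus empirical quantiles of held-out residuals."""
    r = sorted(residuals)
    n = len(r)
    return y_hat + r[int(n * alpha / 2)], y_hat + r[int(n * (1 - alpha / 2)) - 1]

residuals = list(range(-250, 251, 10))   # 51 hypothetical residuals, in grams
lo_g, hi_g = prediction_interval(3400, residuals)
```

    Unlike this marginal construction, a CLTM lets the interval width vary with the ultrasound covariates, which is what captures fetus-specific heteroscedasticity.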

  17. Trends in utilization of FDA expedited drug development and approval programs, 1987-2014: cohort study.

    PubMed

    Kesselheim, Aaron S; Wang, Bo; Franklin, Jessica M; Darrow, Jonathan J

    2015-09-23

    To evaluate the use of special expedited development and review pathways at the US Food and Drug Administration over the past two decades. Cohort study. Novel therapeutics approved by the FDA between 1987 and 2014. Publicly available sources provided each drug's year of approval, its innovativeness (first in class versus not first in class), World Health Organization Anatomic Therapeutic Classification, and which (if any) of the FDA's four primary expedited development and review programs or designations were associated with each drug: orphan drug, fast track, accelerated approval, and priority review. Logistic regression models evaluated trends in the proportion of drugs associated with each of the four expedited development and review programs. To evaluate the number of programs associated with each approved drug over time, Poisson models were employed, with the number of programs as the dependent variable and a linear term for year of approval. The difference in trends was compared between drugs that were first in class and those that were not. The FDA approved 774 drugs during the study period, with one third representing first in class agents. Priority review (43%) was the most prevalent of the four programs, with accelerated approval (9%) the least common. There was a significant increase of 2.6% per year in the number of expedited review and approval programs granted to each newly approved agent (incidence rate ratio 1.026, 95% confidence interval 1.017 to 1.035, P<0.001), and a 2.4% increase in the proportion of drugs associated with at least one such program (odds ratio 1.024, 95% confidence interval 1.006 to 1.043, P=0.009). Driving this trend was an increase in the proportion of approved, non-first in class drugs associated with at least one program (P=0.03 for interaction). In the past two decades, drugs newly approved by the FDA have been associated with an increasing number of expedited development or review programs. Though expedited programs should be strictly limited to drugs providing noticeable clinical advances, this trend is being driven by drugs that are not first in class and thus potentially less innovative. © Kesselheim et al 2015.

  18. Health effects of unemployment benefit program generosity.

    PubMed

    Cylus, Jonathan; Glymour, M Maria; Avendano, Mauricio

    2015-02-01

    We assessed the impact of unemployment benefit programs on the health of the unemployed. We linked US state law data on maximum allowable unemployment benefit levels between 1985 and 2008 to individual self-rated health for heads of households in the Panel Study of Income Dynamics and implemented state and year fixed-effect models. Unemployment was associated with increased risk of reporting poor health among men in both linear probability (b=0.0794; 95% confidence interval [CI]=0.0623, 0.0965) and logistic models (odds ratio=2.777; 95% CI=2.294, 3.362), but this effect is lower when the generosity of state unemployment benefits is high (b for interaction between unemployment and benefits=-0.124; 95% CI=-0.197, -0.0523). A 63% increase in benefits completely offsets the impact of unemployment on self-reported health. Results suggest that unemployment benefits may significantly alleviate the adverse health effects of unemployment among men.

  19. ALPS: A Linear Program Solver

    NASA Technical Reports Server (NTRS)

    Ferencz, Donald C.; Viterna, Larry A.

    1991-01-01

    ALPS is a computer program which can be used to solve general linear program (optimization) problems. ALPS was designed for those who have minimal linear programming (LP) knowledge and features a menu-driven scheme to guide the user through the process of creating and solving LP formulations. Once created, the problems can be edited and stored in standard DOS ASCII files to provide portability to various word processors or even other linear programming packages. Unlike many math-oriented LP solvers, ALPS contains an LP parser that reads through the LP formulation and reports several types of errors to the user. ALPS provides a large amount of solution data which is often useful in problem solving. In addition to pure linear programs, ALPS can solve integer, mixed integer, and binary problems. Pure linear programs are solved with the revised simplex method. Integer or mixed integer programs are solved initially with the revised simplex, and then completed using the branch-and-bound technique. Binary programs are solved with the method of implicit enumeration. This manual describes how to use ALPS to create, edit, and solve linear programming problems. Instructions for installing ALPS on a PC compatible computer are included in the appendices along with a general introduction to linear programming. A programmer's guide is also included for assistance in modifying and maintaining the program.
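
    The implicit enumeration ALPS applies to binary programs can be sketched in a few lines: depth-first search over 0/1 assignments, pruning any branch whose optimistic bound cannot beat the incumbent. This is a minimal illustration, not ALPS source code:

```python
def implicit_enum(c, a, b):
    """Maximize c.x over binary x subject to a.x <= b by implicit enumeration."""
    n = len(c)
    best = [float("-inf"), None]      # incumbent value and solution
    def dfs(i, val, used, x):
        # Optimistic bound: take every remaining positive profit for free.
        if val + sum(cj for cj in c[i:] if cj > 0) <= best[0]:
            return                                  # branch cannot win: prune
        if i == n:
            best[0], best[1] = val, x[:]
            return
        if used + a[i] <= b:                        # branch with x_i = 1
            x.append(1); dfs(i + 1, val + c[i], used + a[i], x); x.pop()
        x.append(0); dfs(i + 1, val, used, x); x.pop()   # branch with x_i = 0
    dfs(0, 0, 0, [])
    return best[0], best[1]

value, solution = implicit_enum([10, 6, 4], [5, 4, 3], 9)
```

    Branch-and-bound for general integer programs works the same way, except the bound at each node comes from an LP relaxation solved by the simplex method.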

  20. Effects of continuous vs interval exercise training on oxygen uptake efficiency slope in patients with coronary artery disease.

    PubMed

    Prado, D M L; Rocco, E A; Silva, A G; Rocco, D F; Pacheco, M T; Silva, P F; Furlan, V

    2016-02-01

    The oxygen uptake efficiency slope (OUES) is a submaximal index incorporating cardiovascular, peripheral, and pulmonary factors that determine the ventilatory response to exercise. The purpose of this study was to evaluate the effects of continuous exercise training and interval exercise training on the OUES in patients with coronary artery disease. Thirty-five patients (59.3±1.8 years old; 28 men, 7 women) with coronary artery disease were randomly divided into two groups: continuous exercise training (n=18) and interval exercise training (n=17). All patients performed graded exercise tests with respiratory gas analysis before and 3 months after the exercise-training program to determine ventilatory anaerobic threshold (VAT), respiratory compensation point, and peak oxygen consumption (peak VO2). The OUES was assessed based on data from the second minute of exercise until exhaustion by calculating the slope of the linear relation between oxygen uptake and the logarithm of total ventilation. After the interventions, both groups showed increased aerobic fitness (P<0.05). In addition, both the continuous exercise and interval exercise training groups demonstrated an increase in OUES (P<0.05). Significant associations were observed in both groups: 1) continuous exercise training (OUES and peak VO2 r=0.57; OUES and VO2 VAT r=0.57); 2) interval exercise training (OUES and peak VO2 r=0.80; OUES and VO2 VAT r=0.67). Continuous and interval exercise training resulted in a similar increase in OUES among patients with coronary artery disease. These findings suggest that improvements in OUES among CAD patients after aerobic exercise training may be dependent on peripheral and central mechanisms.
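
    The OUES itself is just the slope of an ordinary least-squares fit of oxygen uptake against the logarithm of total ventilation. A minimal sketch on synthetic breath data (the values are illustrative, not patient measurements):

```python
import math

def oues(ve, vo2):
    """Slope of VO2 regressed on log10(VE) by ordinary least squares."""
    xs = [math.log10(v) for v in ve]
    n = len(xs)
    mx, my = sum(xs) / n, sum(vo2) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, vo2))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic data generated with a known slope of 1.8 (illustrative only)
ve = [20, 30, 45, 60, 80, 100]                  # ventilation, L/min
vo2 = [1.8 * math.log10(v) + 0.2 for v in ve]   # oxygen uptake, L/min
slope = oues(ve, vo2)
```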

  1. ScoreRel CI: An Excel Program for Computing Confidence Intervals for Commonly Used Score Reliability Coefficients

    ERIC Educational Resources Information Center

    Barnette, J. Jackson

    2005-01-01

    An Excel program developed to assist researchers in the determination and presentation of confidence intervals around commonly used score reliability coefficients is described. The software includes programs to determine confidence intervals for Cronbach's alpha, Pearson r-based coefficients such as those used in test-retest and alternate forms…
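
    The reliability coefficient at the center of such a program is easy to reproduce. A minimal sketch of Cronbach's alpha (the toy response matrices are hypothetical, and the confidence-interval step the Excel program adds is omitted here):

```python
def var(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(scores):
    """scores[person][item]; alpha = k/(k-1) * (1 - sum(item vars)/var(totals))."""
    k = len(scores[0])
    items = [[row[j] for row in scores] for j in range(k)]
    totals = [sum(row) for row in scores]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

perfect = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]   # fully consistent
noisy = [[1, 2, 1], [2, 1, 2], [3, 3, 2], [4, 4, 4]]     # mildly inconsistent
```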

  2. Differences between measured and linearly interpolated synoptic variables over a 12-h period during AVE 4

    NASA Technical Reports Server (NTRS)

    Dupuis, L. R.; Scoggins, J. R.

    1979-01-01

    Results of the analyses revealed that nonlinear changes, or differences, formed centers or systems that were mesosynoptic in nature. These systems correlated well in space with upper-level short waves, frontal zones, and radar-observed convection, and were very systematic in time and space. Many of the centers of differences were well established in the vertical, extending up to the tropopause. Statistical analysis showed that, on average, nonlinear changes were larger in convective areas than in nonconvective regions. Errors often exceeding 100 percent were made by assuming variables to change linearly through a 12-h period in areas of thunderstorms, indicating that these nonlinear changes are important in the development of severe weather. Linear changes, however, accounted for more and more of an observed change as the time interval (within the 12-h interpolation period) increased, implying that the accuracy of linear interpolation increased over larger time intervals.
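
    The interpolation being tested is simply the straight line between the two bounding synoptic analyses. A minimal sketch, with a hypothetical nonlinear "true" evolution standing in for a mid-period disturbance:

```python
def lerp(v0, v12, t):
    """Linearly interpolate between analyses observed at t = 0 h and t = 12 h."""
    return v0 + (v12 - v0) * t / 12.0

def truth(t):
    """Hypothetical 'true' variable: linear trend plus a mid-period disturbance."""
    return 10.0 + 0.5 * t + 4.0 * (1.0 - abs(t - 6.0) / 6.0)

# Interpolation error at several verification times (hours)
errors = {t: abs(lerp(truth(0), truth(12), t) - truth(t)) for t in (0, 3, 6, 9, 12)}
```

    The error vanishes at the endpoints by construction and peaks mid-period, which mirrors the paper's finding that linear interpolation fares worst where transient convective systems develop between analyses.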

  3. User's manual for LINEAR, a FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Duke, Eugene L.; Patterson, Brian P.; Antoniewicz, Robert F.

    1987-01-01

    This report documents a FORTRAN program that provides a powerful and flexible tool for the linearization of aircraft models. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.
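
    The core numerical step LINEAR performs, extracting state and control matrices from a nonlinear model, can be sketched with central finite differences. This is an illustrative reimplementation on toy pendulum dynamics, not the FORTRAN program's code:

```python
import math

def jacobians(f, x, u, h=1e-6):
    """A = df/dx and B = df/du about (x, u), by central finite differences."""
    def col(g, vec, j):
        up, dn = vec[:], vec[:]
        up[j] += h
        dn[j] -= h
        return [(p - q) / (2 * h) for p, q in zip(g(up), g(dn))]
    n, m = len(x), len(u)
    acols = [col(lambda xv: f(xv, u), x, j) for j in range(n)]
    bcols = [col(lambda uv: f(x, uv), u, j) for j in range(m)]
    A = [[acols[j][i] for j in range(n)] for i in range(n)]
    B = [[bcols[j][i] for j in range(m)] for i in range(n)]
    return A, B

def pendulum(x, u):
    """Toy nonlinear model: x = (angle, rate), u = (torque,)."""
    return [x[1], -math.sin(x[0]) + u[0]]

A, B = jacobians(pendulum, [0.0, 0.0], [0.0])   # linearize about the origin
```

    About the origin this recovers the expected small-angle model, A = [[0, 1], [-1, 0]] and B = [[0], [1]], to finite-difference accuracy.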

  4. Dosimeter-Type NOx Sensing Properties of KMnO4 and Its Electrical Conductivity during Temperature Programmed Desorption

    PubMed Central

    Groß, Andrea; Kremling, Michael; Marr, Isabella; Kubinski, David J.; Visser, Jacobus H.; Tuller, Harry L.; Moos, Ralf

    2013-01-01

    An impedimetric NOx dosimeter based on the NOx sorption material KMnO4 is proposed. In addition to its application as a low level NOx dosimeter, KMnO4 shows potential as a precious metal free lean NOx trap material (LNT) for NOx storage catalysts (NSC) enabling electrical in-situ diagnostics. With this dosimeter, low levels of NO and NO2 exposure can be detected electrically as instantaneous values at 380 °C by progressive NOx accumulation in the KMnO4 based sensitive layer. The linear NOx sensing characteristics are recovered periodically by heating to 650 °C or switching to rich atmospheres. Further insight into the NOx sorption-dependent conductivity of the KMnO4-based material is obtained by the novel eTPD method that combines electrical characterization with classical temperature programmed desorption (TPD). The NOx loading amount increases proportionally to the NOx exposure time at sorption temperature. The cumulated NOx exposure, as well as the corresponding NOx loading state, can be detected linearly by electrical means in two modes: (1) time-continuously during the sorption interval including NOx concentration information from the signal derivative or (2) during the short-term thermal NOx release. PMID:23549366

  5. [Influence of age on systolic and diastolic time intervals in normal individuals].

    PubMed

    Soares-Costa, J T; Soares-Costa, T J; Santos, A J; Monteiro, A J

    1991-12-01

    To evaluate the influence of age (I) on the left ventricular (VE) systolic time intervals, the S2O interval, the pulse transmission time (TTP) and the relative amplitude of the a wave (Aa%) of the apexcardiogram (ACG) in normal individuals, 202 subjects considered normal on clinical and electrocardiographic examination were studied. Their age (I) was 38 +/- 13 years (mean +/- 1 SD); 125 were male and 77 female. The electrocardiogram (ECG), phonocardiogram, ACG and carotid arterial pulse tracing (PC) were recorded simultaneously. The following intervals were determined: electromechanical interval (IEM), from the onset of the QRS complex of the ECG to the ascending branch of the great wave of the ACG (A point); mechanical systole (SM), from the A point of the ACG to the first high-frequency vibration of the aortic component of the second heart sound (S2); ejection period (FE), from the beginning of the anacrotic branch of the PC to the nadir of its dicrotic notch (ID); isovolumic contraction time (FIS), obtained by subtracting the FE duration from the SM duration; S2O interval, from S2 to the O point (nadir) of the ACG; Aa%, the percentage ratio of the a wave amplitude to the total amplitude of the ACG; and pulse transmission time, from S2 to ID. Statistically significant correlations (p less than 0.05) between I (expressed in years) and the above variables were investigated. 
    The findings were: a) the IEM and FIS intervals were not significantly correlated with I; b) FE had a linear, positive and significant correlation with I (r = 0.222); c) the correlations between FE and heart rate (FC) did not differ significantly between the age groups considered (14-34, 35-49, 50-69 years); d) the S2O interval had a linear, negative and significant correlation with FC (r = -0.196), and a linear, positive and significant correlation with I (r = 0.392); e) the multiple regression equation relating S2O, I and FC was S2O = 70 - 0.36 x FC + 0.55 x I; f) Aa% had a linear, positive and significant correlation with I (r = 0.252); g) TTP had a linear, negative and significant correlation with I (r = -0.793). In conclusion: a) FE increases with I, probably related to the increase in afterload that accompanies aging; b) the S2O interval increases with I, reflecting the lengthening of relaxation time associated with senescence; c) Aa% increases with I, expressing the reduction in VE compliance associated with aging; d) TTP decreases with I, related to the increase in pulse wave velocity that accompanies senescence and is attributed to increased aortic stiffness.

  6. ’Exact’ Two-Sided Confidence Intervals on Nonnegative Linear Combinations of Variances.

    DTIC Science & Technology

    1980-07-01

    The scanned report documentation page is heavily garbled by OCR. Recoverable details: the report, by Franklin A. Graybill (Colorado State University) and Chih-Ming Wang (SPSS Inc.), was prepared for the Office of Naval Research and dated 1 July 1980; it concerns 'exact' two-sided confidence intervals on nonnegative linear combinations of variances and discusses the Modified Large Sample (MLS) confidence interval.

  7. The application of color display techniques for the analysis of Nimbus infrared radiation data

    NASA Technical Reports Server (NTRS)

    Allison, L. J.; Cherrix, G. T.; Ausfresser, H.

    1972-01-01

    A color enhancement system designed for the Applications Technology Satellite (ATS) spin scan experiment has been adapted for the analysis of Nimbus infrared radiation measurements. For a given scene recorded on magnetic tape by the Nimbus scanning radiometers, a virtually unlimited number of color images can be produced at the ATS Operations Control Center from a color selector paper tape input. Linear image interpolation has produced radiation analyses in which each brightness-color interval has a smooth boundary without any mosaic effects. An annotated latitude-longitude gridding program makes it possible to precisely locate geophysical parameters, which permits accurate interpretation of pertinent meteorological, geological, hydrological, and oceanographic features.

  8. A contracting-interval program for the Danilewski method. Ph.D. Thesis - Va. Univ.

    NASA Technical Reports Server (NTRS)

    Harris, J. D.

    1971-01-01

    The concept of contracting-interval programs is applied to finding the eigenvalues of a matrix. The development is a three-step process in which (1) a program is developed for the reduction of a matrix to Hessenberg form, (2) a program is developed for the reduction of a Hessenberg matrix to colleague form, and (3) the characteristic polynomial with interval coefficients is readily obtained from the interval of colleague matrices. This interval polynomial is then factored into quadratic factors so that the eigenvalues may be obtained. To develop a contracting-interval program for factoring this polynomial with interval coefficients it is necessary to have an iteration method which converges even in the presence of controlled rounding errors. A theorem is stated giving sufficient conditions for the convergence of Newton's method when both the function and its Jacobian cannot be evaluated exactly but errors can be made proportional to the square of the norm of the difference between the previous two iterates. This theorem is applied to prove the convergence of the generalization of the Newton-Bairstow method that is used to obtain quadratic factors of the characteristic polynomial.
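
    The final step described, Newton iteration for quadratic factors of the characteristic polynomial, can be sketched (without the interval arithmetic and rounding-error control that the contracting-interval program adds) as two-dimensional Newton iteration on the remainder of division by x^2 + ux + v:

```python
def quad_remainder(p, u, v):
    """Remainder (r1, r0) of p(x) divided by x^2 + u*x + v (p high-to-low)."""
    r = list(p)
    for i in range(len(p) - 2):        # synthetic division
        r[i + 1] -= r[i] * u
        r[i + 2] -= r[i] * v
    return r[-2], r[-1]

def quad_factor(p, u, v, h=1e-7, iters=50):
    """Drive the remainder to zero with 2-D Newton; Jacobian by differences."""
    for _ in range(iters):
        f1, f2 = quad_remainder(p, u, v)
        if abs(f1) + abs(f2) < 1e-12:
            break
        j11 = (quad_remainder(p, u + h, v)[0] - quad_remainder(p, u - h, v)[0]) / (2 * h)
        j12 = (quad_remainder(p, u, v + h)[0] - quad_remainder(p, u, v - h)[0]) / (2 * h)
        j21 = (quad_remainder(p, u + h, v)[1] - quad_remainder(p, u - h, v)[1]) / (2 * h)
        j22 = (quad_remainder(p, u, v + h)[1] - quad_remainder(p, u, v - h)[1]) / (2 * h)
        det = j11 * j22 - j12 * j21
        u -= (f1 * j22 - f2 * j12) / det       # Cramer's rule for the Newton step
        v -= (j11 * f2 - j21 * f1) / det
    return u, v

# x^4 - 1 = (x^2 - 1)(x^2 + 1); start near the factor x^2 + 0x - 1
u, v = quad_factor([1, 0, 0, 0, -1], 0.1, -0.9)
```

    The Newton-Bairstow method of the thesis computes the same Jacobian analytically from a second synthetic division rather than by finite differences.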

  9. A new approach to correct the QT interval for changes in heart rate using a nonparametric regression model in beagle dogs.

    PubMed

    Watanabe, Hiroyuki; Miyazaki, Hiroyasu

    2006-01-01

    Over- and/or under-correction of QT intervals for changes in heart rate may lead to misleading conclusions and/or masking the potential of a drug to prolong the QT interval. This study examines a nonparametric regression model (Loess Smoother) to adjust the QT interval for differences in heart rate, with an improved fitness over a wide range of heart rates. 240 sets of (QT, RR) observations collected from each of 8 conscious and non-treated beagle dogs were used as the materials for investigation. The fitness of the nonparametric regression model to the QT-RR relationship was compared with four models (individual linear regression, common linear regression, and Bazett's and Fridericia's correlation models) with reference to Akaike's Information Criterion (AIC). Residuals were visually assessed. The bias-corrected AIC of the nonparametric regression model was the best of the models examined in this study. Although the parametric models did not fit, the nonparametric regression model improved the fitting at both fast and slow heart rates. The nonparametric regression model is the more flexible method compared with the parametric method. The mathematical fit for linear regression models was unsatisfactory at both fast and slow heart rates, while the nonparametric regression model showed significant improvement at all heart rates in beagle dogs.

  10. 76 FR 48935 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Proposed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-09

    ... the $1 Strike Price Interval Program August 4, 2011. Pursuant to Section 19(b)(1) of the Securities... Proposed Rule Change CBOE proposes to amend its rules in order to simplify the $1 Strike Price Interval... Policy .01 to Rule 5.5 in order to simplify the $1 Strike Price Interval Program (``Program''). In 2003...

  11. On the linear programming bound for linear Lee codes.

    PubMed

    Astola, Helena; Tabus, Ioan

    2016-01-01

    Based on an invariance-type property of the Lee-compositions of a linear Lee code, additional equality constraints can be introduced to the linear programming problem of linear Lee codes. In this paper, we formulate this property in terms of an action of the multiplicative group of the field [Formula: see text] on the set of Lee-compositions. We show some useful properties of certain sums of Lee-numbers, which are the eigenvalues of the Lee association scheme, appearing in the linear programming problem of linear Lee codes. Using the additional equality constraints, we formulate the linear programming problem of linear Lee codes in a very compact form, leading to fast execution, which allows the bounds to be computed efficiently for large parameter values of the linear codes.
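
    The Lee metric underlying these bounds is simple to state: the weight of a symbol is its circular distance to zero in Z_q. A minimal sketch:

```python
def lee_weight(x, q):
    """Lee weight of a symbol: circular distance to 0 in Z_q."""
    x %= q
    return min(x, q - x)

def lee_distance(a, b, q):
    """Lee distance between two words over Z_q: sum of symbol-wise Lee weights."""
    return sum(lee_weight(ai - bi, q) for ai, bi in zip(a, b))
```

    For q = 2 and q = 3 the Lee and Hamming metrics coincide; the linear programming bound constrains the distance distribution of a code with respect to this metric.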

  12. Are calving interval, abortions, incidence of stillbirths and pre-weaning losses in Nguni cows associated with linear type traits?

    PubMed

    Zindove, T J; Chimonyo, M

    2015-09-01

    The association of six linear type traits with calving interval, abortions, incidence of stillbirths and pre-weaning losses in Nguni cows in semi-arid and sub-humid communal areas was investigated. It was hypothesised that the odds of a cow having a calving interval greater than 1 year, aborting, experiencing stillbirths or losing a calf from calving to weaning decreased with increases in body depth, rump height, flank circumference, chest circumference, navel height and body length. Navel height was measured as the distance from the ground to the lowest point of the cow's belly (navel). Data were collected from a total of 200 Nguni cows from two sites experiencing sub-humid and semi-arid environments (100 each) between May and June 2013. Cows in sub-humid regions were 2.57 times more likely to have a calving interval of 1 year than cows in semi-arid areas. As body depth increased, the number of calves lost by a cow before weaning decreased linearly (p < 0.05) in all parities except parity 4. Cows in semi-arid regions were 2.13 times more likely to lose a calf from calving to weaning. For each unit increase in body depth, the odds of a cow aborting decreased by 1.12 and the odds of a cow having a stillbirth decreased by 1.15. Rump height, flank circumference, chest circumference, navel height and body length were not associated with calving interval, abortions, incidence of stillbirths and pre-weaning losses. It was, therefore, concluded that body depth influences calving interval, incidence of stillbirths and abortions in Nguni cows. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Effect of Body Weight on Echocardiographic Measurements in 19,866 Pure-Bred Cats with or without Heart Disease.

    PubMed

    Häggström, J; Andersson, Å O; Falk, T; Nilsfors, L; Olsson, U; Kresken, J G; Höglund, K; Rishniw, M; Tidholm, A; Ljungvall, I

    2016-09-01

    Echocardiography is a cost-efficient method to screen cats for presence of heart disease. Current reference intervals for feline cardiac dimensions do not account for body weight (BW). To study the effect of BW on heart rate (HR), aortic (Ao), left atrial (LA) and ventricular (LV) linear dimensions in cats, and to calculate 95% prediction intervals for these variables in normal adult pure-bred cats. 19 866 pure-bred cats. Clinical data from heart screens conducted between 1999 and 2014 were included. Associations between BW, HR, and cardiac dimensions were assessed using univariate linear models and allometric scaling, including all cats, and only those considered normal, respectively. Prediction intervals were created using 95% confidence intervals obtained from regression curves. Associations between BW and echocardiographic dimensions were best described by allometric scaling, and all dimensions increased with increasing BW (all P<0.001). Strongest associations were found between BW and Ao, LV end diastolic, LA dimensions, and thickness of LV free wall. Weak linear associations were found between BW and HR and left atrial to aortic ratio (LA:Ao), for which HR decreased with increasing BW (P<0.001), and LA:Ao increased with increasing BW (P<0.001). Marginal differences were found for prediction formulas and prediction intervals when the dataset included all cats versus only those considered normal. BW had a clinically relevant effect on echocardiographic dimensions in cats, and BW based 95% prediction intervals may help in screening cats for heart disease. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
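
    Allometric scaling of the kind fitted here reduces to linear regression on logarithms. A minimal sketch on synthetic data (the coefficients 9 and 0.33 are illustrative, not the paper's estimates):

```python
import math

def allometric_fit(w, y):
    """Fit y = a * w**b by least squares on log(y) = log(a) + b*log(w)."""
    xs = [math.log(v) for v in w]
    ys = [math.log(v) for v in y]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (t - my) for x, t in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return math.exp(my - b * mx), b

# Illustrative only: cardiac dimension = 9 * BW**0.33 (not the paper's fit)
bw = [2.5, 3.0, 4.0, 5.0, 6.5, 8.0]          # body weight, kg
dim = [9.0 * x ** 0.33 for x in bw]          # dimension, mm
a, b = allometric_fit(bw, dim)
```

    Prediction intervals around such a curve are then obtained from the regression confidence bounds on the log scale, back-transformed to the original units.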

  14. A Hybrid Interval-Robust Optimization Model for Water Quality Management.

    PubMed

    Xu, Jieyu; Li, Yongping; Huang, Guohe

    2013-05-01

    In water quality management problems, uncertainties may exist in many system components and pollution-related processes (i.e., random nature of hydrodynamic conditions, variability in physicochemical processes, dynamic interactions between pollutant loading and receiving water bodies, and indeterminacy of available water and treated wastewater). These complexities lead to difficulties in formulating and solving the resulting nonlinear optimization problems. In this study, a hybrid interval-robust optimization (HIRO) method was developed through coupling stochastic robust optimization and interval linear programming. HIRO can effectively reflect the complex system features under uncertainty, where implications of water quality/quantity restrictions for achieving regional economic development objectives are studied. By delimiting the uncertain decision space through dimensional enlargement of the original chemical oxygen demand (COD) discharge constraints, HIRO enhances the robustness of the optimization processes and resulting solutions. This method was applied to planning of industry development in association with river-water pollution concern in New Binhai District of Tianjin, China. Results demonstrated that the proposed optimization model can effectively communicate uncertainties into the optimization process and generate a spectrum of potential inexact solutions supporting local decision makers in managing benefit-effective water quality management schemes. HIRO is helpful for analysis of policy scenarios related to different levels of economic penalties, while also providing insight into the tradeoff between system benefits and environmental requirements.

  15. EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.

    ERIC Educational Resources Information Center

    Jarvis, John J.; And Others

    Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…

  16. VENVAL : a plywood mill cost accounting program

    Treesearch

    Henry Spelter

    1991-01-01

    This report documents a package of computer programs called VENVAL. These programs prepare plywood mill data for a linear programming (LP) model that, in turn, calculates the optimum mix of products to make, given a set of technologies and market prices. (The software to solve a linear program is not provided and must be obtained separately.) Linear programming finds...

  17. Ranking Forestry Investments With Parametric Linear Programming

    Treesearch

    Paul A. Murphy

    1976-01-01

    Parametric linear programming is introduced as a technique for ranking forestry investments under multiple constraints; it combines the advantages of simple ranking and linear programming as capital budgeting tools.

  18. Expanding children's food experiences: the impact of a school-based kitchen garden program.

    PubMed

    Gibbs, Lisa; Staiger, Petra K; Johnson, Britt; Block, Karen; Macfarlane, Susie; Gold, Lisa; Kulas, Jenny; Townsend, Mardie; Long, Caroline; Ukoumunne, Obioha

    2013-03-01

    Evaluate achievement of the Stephanie Alexander Kitchen Garden Program in increasing child appreciation of diverse, healthy foods. Comparative 2-year study. Six program and 6 comparison primary schools in rural and metropolitan Victoria, Australia, matched for socioeconomic status and size. A total of 764 children in grades 3 to 6 (8-12 years of age) and 562 parents recruited. Retention rates at follow-up included 85% children and 75% parents. Each week of the school year, children spent 45 to 60 minutes in a garden class and 90 minutes in a kitchen class. Program impact on children's willingness to try new foods, capacity to describe foods, and healthy eating. Qualitative data analyzed using inductive thematic analysis. Quantitative data analyzed using random-effects linear regressions adjusted for school clustering. Child and parent qualitative and quantitative measures (if never tried before, odds ratio 2.0; confidence interval, 1.06-3.58) showed increases in children's reported willingness to try new foods. No differences in articulation of food descriptions (program vs comparison groups). Qualitative evidence showed that the program extended its influence to healthy eating, but this was not reflected in the quantitative evidence. Findings indicate program success in achieving its primary objective, meriting further program research. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
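
    The reported effect sizes are odds ratios with confidence intervals. A minimal sketch of the standard Wald construction on hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table plus a Wald (log-scale) confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Hypothetical counts: willing/unwilling to try a new food, program vs comparison
or_, lo, hi = odds_ratio_ci(40, 20, 25, 35)
```

    An interval excluding 1, as in the study's 2.0 (1.06-3.58), indicates a statistically significant association; clustered designs like this one additionally inflate the standard error for within-school correlation.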

  19. Comparative effect of interval and continuous training programs on serum uric acid in management of hypertension: a randomized controlled trial.

    PubMed

    Lamina, Sikiru

    2011-03-01

    The purpose of the study was to investigate the effect of interval and continuous training programs on blood pressure and serum uric acid (SUA) levels in subjects with hypertension. Three hundred and fifty-seven male patients with mild to moderate essential hypertension (systolic blood pressure [SBP] 140-179 mm Hg; diastolic blood pressure [DBP] 90-109 mm Hg) were age-matched and grouped into interval, continuous, and control groups. The interval (work:rest ratio of 1:1) and continuous groups were involved in an 8-week interval and continuous training program of 45-60 minutes, at intensities of 60-79% of heart rate maximum, whereas the control group remained sedentary during this period. SBP, DBP, maximum oxygen uptake (VO2max) and SUA concentration were assessed. One-way analysis of variance and Scheffe and Pearson correlation tests were used in data analysis. Findings of the study revealed significant effect of exercise training program on VO2max, SBP, DBP, and SUA. However, there was no significant difference between the interval and continuous groups. Changes in VO2max negatively correlated with changes in SUA (r = -0.220) at p < 0.05. It was concluded that both moderate-intensity interval and continuous training programs are effective and neither seems superior to the other in the nonpharmacological management of hypertension and may prevent cardiovascular events through the downregulation of SUA in hypertension. Findings of the study support the recommendations of moderate-intensity interval and continuous training programs as adjuncts for nonpharmacological management of essential hypertension.

  20. Investigating Integer Restrictions in Linear Programming

    ERIC Educational Resources Information Center

    Edwards, Thomas G.; Chelst, Kenneth R.; Principato, Angela M.; Wilhelm, Thad L.

    2015-01-01

    Linear programming (LP) is an application of graphing linear systems that appears in many Algebra 2 textbooks. Although not explicitly mentioned in the Common Core State Standards for Mathematics, linear programming blends seamlessly into modeling with mathematics, the fourth Standard for Mathematical Practice (CCSSI 2010, p. 7). In solving a…

  1. User's manual for interactive LINEAR: A FORTRAN program to derive linear aircraft models

    NASA Technical Reports Server (NTRS)

    Antoniewicz, Robert F.; Duke, Eugene L.; Patterson, Brian P.

    1988-01-01

    An interactive FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models is documented in this report. The program LINEAR numerically determines a linear system model using nonlinear equations of motion and a user-supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary atmosphere and flat, nonrotating earth assumptions. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model.

  2. Automated Interval velocity picking for Atlantic Multi-Channel Seismic Data

    NASA Astrophysics Data System (ADS)

    Singh, Vishwajit

    2016-04-01

    This paper describes the challenges in developing and testing a fully automated routine for measuring interval velocities from multi-channel seismic data. Various approaches are employed to build an interactive algorithm that picks interval velocities for 1000-5000 normal moveout (NMO) corrected gathers, replacing the interpreter's effort of manually picking coherent reflections. The detailed steps and pitfalls of picking interval velocities from seismic reflection time measurements are described for each approach. The key ingredients these approaches use at the velocity analysis stage are a semblance grid and a starting model of interval velocity. Basin-Hopping optimization is employed to drive the misfit function toward local minima, and a SLiding-Overlapping Window (SLOW) algorithm is designed to mitigate the non-linearity and ill-posedness of the root-mean-square velocity inversion. Synthetic data case studies demonstrate the performance of the velocity picker, which generates models that closely fit the semblance peaks. A similar linear relationship between average depth and reflection time for the synthetic and estimated models supports using the picked interval velocities as the starting model for full waveform inversion, yielding a more accurate velocity structure of the subsurface. The remaining challenges are (1) building an accurate starting model of the subsurface velocity structure and (2) reducing the computational cost of the algorithm by pre-calculating the semblance grid to make auto-picking more feasible.
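    As a rough illustration of the optimization step described above, the following sketch uses SciPy's basinhopping to fit a layered interval-velocity model to a handful of "picked" RMS velocities via Dix-type averaging. The times, velocities, and misfit function are made-up toy data, not values from the paper.

```python
import numpy as np
from scipy.optimize import basinhopping

# Hypothetical toy data: two-way times (s) of picks and "observed" RMS
# velocities (km/s). These numbers are illustrative only.
t = np.array([0.5, 1.0, 1.5, 2.0])
v_obs = np.array([1.8, 2.0, 2.2, 2.35])

def rms_from_intervals(v_int):
    """Convert one interval velocity per layer to RMS velocities (Dix-type)."""
    dt = np.diff(np.concatenate(([0.0], t)))
    cum = np.cumsum(v_int**2 * dt)
    return np.sqrt(cum / t)

def misfit(v_int):
    """Squared error between modeled and picked RMS velocities."""
    return np.sum((rms_from_intervals(v_int) - v_obs) ** 2)

x0 = np.full(4, 2.0)  # starting interval-velocity model (km/s)
result = basinhopping(misfit, x0, niter=50, seed=0)
print(result.x)       # interval velocities that best fit the picks
```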

  3. Exact Scheffé-type confidence intervals for output from groundwater flow models: 1. Use of hydrogeologic information

    USGS Publications Warehouse

    Cooley, Richard L.

    1993-01-01

    A new method is developed to efficiently compute exact Scheffé-type confidence intervals for output (or other function of parameters) g(β) derived from a groundwater flow model. The method is general in that parameter uncertainty can be specified by any statistical distribution having a log probability density function (log pdf) that can be expanded in a Taylor series. However, for this study parameter uncertainty is specified by a statistical multivariate beta distribution that incorporates hydrogeologic information in the form of the investigator's best estimates of parameters and a grouping of random variables representing possible parameter values so that each group is defined by maximum and minimum bounds and an ordering according to increasing value. The new method forms the confidence intervals from maximum and minimum limits of g(β) on a contour of a linear combination of (1) the quadratic form for the parameters used by Cooley and Vecchia (1987) and (2) the log pdf for the multivariate beta distribution. Three example problems are used to compare characteristics of the confidence intervals for hydraulic head obtained using different weights for the linear combination. Different weights generally produced similar confidence intervals, whereas the method of Cooley and Vecchia (1987) often produced much larger confidence intervals.

  4. Effect of regrowth interval and a microbial inoculant on the fermentation profile and dry matter recovery of guinea grass silages.

    PubMed

    Santos, E M; Pereira, O G; Garcia, R; Ferreira, C L L F; Oliveira, J S; Silva, T C

    2014-07-01

    The objectives of this study were to characterize and quantify the microbial populations in guinea grass (Panicum maximum Jacq. cultivar Mombasa) harvested at different regrowth intervals (35, 45, 55, and 65 d). The chemical composition and fermentation profile of silages (after 60 d) with or without the addition of a microbial inoculant were also analyzed. Before ensiling, samples of the plants were used for the isolation and identification of lactic acid bacteria (LAB) in the epiphytic microbiota. A 4 × 2 factorial arrangement of treatments (4 regrowth intervals × with/without inoculant) was used in a completely randomized design with 3 replications. Based on the morphological and biochemical characteristics and the carbohydrate fermentation profile, Lactobacillus plantarum was found to be the predominant species of LAB in guinea grass forage. Linear increases were detected in the dry matter (DM) content and concentrations of neutral detergent fiber, acid detergent fiber, acid detergent insoluble nitrogen, and DM recovery as well as linear reductions in the concentrations of crude protein and NH3-N with regrowth interval. Additionally, linear reductions for gas and effluent losses in silages were detected with increasing regrowth interval. These results demonstrate that guinea grass plants harvested after 55 d of regrowth contain a LAB population sufficiently large to ensure good fermentation and increase the DM recovery. The use of microbial inoculant further enhanced the fermentation of guinea grass at all stages of regrowth by improving the DM recovery. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  5. Branch and bound algorithm for accurate estimation of analytical isotropic bidirectional reflectance distribution function models.

    PubMed

    Yu, Chanki; Lee, Sang Wook

    2016-05-20

    We present a reliable and accurate global optimization framework for estimating parameters of isotropic analytical bidirectional reflectance distribution function (BRDF) models. This approach is based on a branch and bound strategy with linear programming and interval analysis. Conventional local optimization is often very inefficient for BRDF estimation since its fitting quality is highly dependent on initial guesses due to the nonlinearity of analytical BRDF models. The algorithm presented in this paper employs L1-norm error minimization to estimate BRDF parameters in a globally optimal way and interval arithmetic to derive our feasibility problem and lower bounding function. Our method is developed for the Cook-Torrance model but with several normal distribution functions such as the Beckmann, Berry, and GGX functions. Experiments have been carried out to validate the presented method using 100 isotropic materials from the MERL BRDF database, and our experimental results demonstrate that the L1-norm minimization provides a more accurate and reliable solution than the L2-norm minimization.
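    The L1-norm minimization underlying the algorithm can be posed as a linear program in the standard way (minimize the sum of slack variables bounding each absolute residual), which is also what makes LP-based bounding possible. Below is a minimal, self-contained sketch of that reformulation on a synthetic linear-fitting problem; the data and model are illustrative, not BRDF-specific.

```python
import numpy as np
from scipy.optimize import linprog

# L1-norm (least absolute deviations) fit posed as a linear program:
# minimize sum(t) subject to -t <= A @ x - b <= t.
rng = np.random.default_rng(0)
A = rng.normal(size=(30, 2))
x_true = np.array([1.5, -0.7])
b = A @ x_true
b[0] += 5.0  # one gross outlier; L1 fitting is robust to it

m, n = A.shape
# Decision variables z = [x (n), t (m)]; objective = sum of t.
c = np.concatenate([np.zeros(n), np.ones(m)])
#  A@x - b <= t  ->  [ A, -I] z <=  b
# -(A@x - b) <= t ->  [-A, -I] z <= -b
G = np.block([[A, -np.eye(m)], [-A, -np.eye(m)]])
h = np.concatenate([b, -b])
res = linprog(c, A_ub=G, b_ub=h, bounds=[(None, None)] * n + [(0, None)] * m)
x_hat = res.x[:n]
print(x_hat)  # close to x_true despite the outlier
```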

  6. Development and validation of a general purpose linearization program for rigid aircraft models

    NASA Technical Reports Server (NTRS)

    Duke, E. L.; Antoniewicz, R. F.

    1985-01-01

    A FORTRAN program that provides the user with a powerful and flexible tool for the linearization of aircraft models is discussed. The program LINEAR numerically determines a linear systems model using nonlinear equations of motion and a user-supplied, nonlinear aerodynamic model. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to allow easy selection and definition of the state, control, and observation variables to be used in a particular model. Also, included in the report is a comparison of linear and nonlinear models for a high performance aircraft.

  7. Linearized Programming of Memristors for Artificial Neuro-Sensor Signal Processing

    PubMed Central

    Yang, Changju; Kim, Hyongsuk

    2016-01-01

    A linearized programming method of memristor-based neural weights is proposed. The memristor is known as an ideal element for implementing a neural synapse due to its embedded functions of analog memory and analog multiplication. Its resistance variation with a voltage input is generally a nonlinear function of time, so linearizing the memristance variation over time is very important for the ease of memristor programming. In this paper, a method utilizing an anti-serial architecture for linear programming is proposed. The anti-serial architecture is composed of two memristors with opposite polarities; it linearizes the variation of memristance through the complementary actions of the two memristors. For programming a memristor, an additional memristor with opposite polarity is employed. The linearization effect of weight programming in an anti-serial architecture is investigated, and a memristor bridge synapse built with two sets of anti-serial memristor architecture is taken as an application example of the proposed method. Simulations are performed with memristors of both the linear drift model and a nonlinear model. PMID:27548186

  8. Linearized Programming of Memristors for Artificial Neuro-Sensor Signal Processing.

    PubMed

    Yang, Changju; Kim, Hyongsuk

    2016-08-19

    A linearized programming method of memristor-based neural weights is proposed. The memristor is known as an ideal element for implementing a neural synapse due to its embedded functions of analog memory and analog multiplication. Its resistance variation with a voltage input is generally a nonlinear function of time, so linearizing the memristance variation over time is very important for the ease of memristor programming. In this paper, a method utilizing an anti-serial architecture for linear programming is proposed. The anti-serial architecture is composed of two memristors with opposite polarities; it linearizes the variation of memristance through the complementary actions of the two memristors. For programming a memristor, an additional memristor with opposite polarity is employed. The linearization effect of weight programming in an anti-serial architecture is investigated, and a memristor bridge synapse built with two sets of anti-serial memristor architecture is taken as an application example of the proposed method. Simulations are performed with memristors of both the linear drift model and a nonlinear model.
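    The linearization effect of the anti-serial pair can be illustrated with a toy simulation. The drift law dR/dt = ∓k·i used below is a generic linear-drift stand-in (constants and equations are ours, not the paper's device model): a single memristor under constant voltage drifts nonlinearly, while the anti-serial pair keeps the series resistance, and hence the current, constant, so each memristance changes linearly in time.

```python
import numpy as np

# Toy linear-drift memristor: dR/dt = -k*i for one polarity, +k*i for the other.
k, V, R0, dt, steps = 2e8, 1.0, 1e4, 1e-4, 2000

# Single memristor under constant V: i = V/R, so dR/dt = -k*V/R -> nonlinear R(t).
R_single = np.empty(steps); R_single[0] = R0
for n in range(steps - 1):
    R_single[n + 1] = R_single[n] - k * V / R_single[n] * dt

# Anti-serial pair: R1 decreases and R2 increases by the same amount, so the
# series resistance (and the current) stays constant and R1(t) is linear.
R1 = np.empty(steps); R2 = np.empty(steps)
R1[0] = R2[0] = R0
for n in range(steps - 1):
    i = V / (R1[n] + R2[n])
    R1[n + 1] = R1[n] - k * i * dt
    R2[n + 1] = R2[n] + k * i * dt

# Second differences measure curvature: ~0 for the pair, clearly nonzero alone.
curv_single = np.abs(np.diff(R_single, 2)).max()
curv_pair = np.abs(np.diff(R1, 2)).max()
print(curv_pair, curv_single)
```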

  9. On Latent Change Model Choice in Longitudinal Studies

    ERIC Educational Resources Information Center

    Raykov, Tenko; Zajacova, Anna

    2012-01-01

    An interval estimation procedure for proportion of explained observed variance in latent curve analysis is discussed, which can be used as an aid in the process of choosing between linear and nonlinear models. The method allows obtaining confidence intervals for the R[squared] indexes associated with repeatedly followed measures in longitudinal…

  10. Surgery for left ventricular aneurysm: early and late survival after simple linear repair and endoventricular patch plasty.

    PubMed

    Lundblad, Runar; Abdelnoor, Michel; Svennevig, Jan Ludvig

    2004-09-01

    Simple linear resection and endoventricular patch plasty are alternative techniques to repair postinfarction left ventricular aneurysm. The aim of the study was to compare these 2 methods with regard to early mortality and long-term survival. We retrospectively reviewed 159 patients undergoing operations between 1989 and 2003. The epidemiologic design was of an exposed (simple linear repair, n = 74) versus nonexposed (endoventricular patch plasty, n = 85) cohort with 2 endpoints: early mortality and long-term survival. The crude effect of aneurysm repair technique versus endpoint was estimated by odds ratio, rate ratio, or relative risk and their 95% confidence intervals. Stratification analysis by using the Mantel-Haenszel method was done to quantify confounders and pinpoint effect modifiers. Adjustment for multiconfounders was performed by using logistic regression and Cox regression analysis. Survival curves were analyzed with the Breslow test and the log-rank test. Early mortality was 8.2% for all patients, 13.5% after linear repair and 3.5% after endoventricular patch plasty. When adjusted for multiconfounders, the risk of early mortality was significantly higher after simple linear repair than after endoventricular patch plasty (odds ratio, 4.4; 95% confidence interval, 1.1-17.8). Mean follow-up was 5.8 +/- 3.8 years (range, 0-14.0 years). Overall 5-year cumulative survival was 78%, 70.1% after linear repair and 91.4% after endoventricular patch plasty. The risk of total mortality was significantly higher after linear repair than after endoventricular patch plasty when controlled for multiconfounders (relative risk, 4.5; 95% confidence interval, 2.0-9.7). Linear repair dominated early in the series and patch plasty dominated later, giving a possible learning-curve bias in favor of patch plasty that could not be adjusted for in the regression analysis. Postinfarction left ventricular aneurysm can be repaired with satisfactory early and late results. 
Surgical risk was lower and long-term survival was higher after endoventricular patch plasty than simple linear repair. Differences in outcome should be interpreted with care because of the retrospective study design and the chronology of the 2 repair methods.

  11. Optimization Research of Generation Investment Based on Linear Programming Model

    NASA Astrophysics Data System (ADS)

    Wu, Juan; Ge, Xueqian

    Linear programming is an important branch of operations research and a mathematical method to support scientific management. GAMS is an advanced simulation and optimization modeling language that combines complex mathematical programming formulations, such as linear programming (LP), nonlinear programming (NLP), and mixed-integer programming (MIP), with system simulation. In this paper, based on a linear programming model, the optimized investment decision-making of generation is simulated and analyzed. Finally, the optimal installed capacity of power plants and the final total cost are obtained, which provides a rational decision-making basis for optimized investments.
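    A minimal example of the kind of capacity-investment LP described above, written with SciPy rather than GAMS; the plant types, costs, and constraints are invented purely for illustration.

```python
from scipy.optimize import linprog

# Decision variables: installed MW of coal, gas, wind (illustrative numbers).
# Minimize annualized investment + operating-cost proxy subject to meeting
# peak demand and a renewable-share floor.
inv_cost = [60.0, 40.0, 90.0]     # $/kW-yr annualized investment
op_cost = [25.0, 35.0, 5.0]       # operating-cost proxy per unit capacity
c = [i + o for i, o in zip(inv_cost, op_cost)]

demand = 1000.0                   # MW peak demand to cover
A_ub = [[-1, -1, -1],             # coal + gas + wind >= demand
        [0.3, 0.3, -0.7]]         # wind >= 30% of total capacity
b_ub = [-demand, 0.0]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, res.fun)             # optimal mix and total cost
```

With these toy costs the solver fills the demand with the cheapest plant (gas) up to the 30% wind floor, so the optimum is 700 MW gas and 300 MW wind.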

  12. Modeling Relationships Between Flight Crew Demographics and Perceptions of Interval Management

    NASA Technical Reports Server (NTRS)

    Remy, Benjamin; Wilson, Sara R.

    2016-01-01

    The Interval Management Alternative Clearances (IMAC) human-in-the-loop simulation experiment was conducted to assess interval management system performance and participants' acceptability and workload while performing three interval management clearance types. Twenty-four subject pilots and eight subject controllers flew ten high-density arrival scenarios into Denver International Airport during two weeks of data collection. This analysis examined the possible relationships between subject pilot demographics and reported perceptions of interval management in IMAC. Multiple linear regression models were created with a new software tool to predict subject pilot questionnaire item responses from demographic information. General patterns were noted across models that may indicate flight crew demographics influence perceptions of interval management.

  13. State transformations and Hamiltonian structures for optimal control in discrete systems

    NASA Astrophysics Data System (ADS)

    Sieniutycz, S.

    2006-04-01

    Preserving usual definition of Hamiltonian H as the scalar product of rates and generalized momenta we investigate two basic classes of discrete optimal control processes governed by the difference rather than differential equations for the state transformation. The first class, linear in the time interval θ, secures the constancy of optimal H and satisfies a discrete Hamilton-Jacobi equation. The second class, nonlinear in θ, does not assure the constancy of optimal H and satisfies only a relationship that may be regarded as an equation of Hamilton-Jacobi type. The basic question asked is if and when Hamilton's canonical structures emerge in optimal discrete systems. For a constrained discrete control, general optimization algorithms are derived that constitute powerful theoretical and computational tools when evaluating extremum properties of constrained physical systems. The mathematical basis is Bellman's method of dynamic programming (DP) and its extension in the form of the so-called Carathéodory-Boltyanski (CB) stage optimality criterion which allows a variation of the terminal state that is otherwise fixed in Bellman's method. For systems with unconstrained intervals of the holdup time θ two powerful optimization algorithms are obtained: an unconventional discrete algorithm with a constant H and its counterpart for models nonlinear in θ. We also present the time-interval-constrained extension of the second algorithm. The results are general; namely, one arrives at: discrete canonical equations of Hamilton, maximum principles, and (at the continuous limit of processes with free intervals of time) the classical Hamilton-Jacobi theory, along with basic results of variational calculus. A vast spectrum of applications and an example are briefly discussed with particular attention paid to models nonlinear in the time interval θ.
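    Bellman's backward recursion mentioned above can be sketched for a simple discrete-time control problem. The dynamics x_{k+1} = x_k + u_k, the quadratic stage cost, and the integer state grid below are our own toy choices, not the paper's formulation.

```python
# Backward dynamic programming for a discrete optimal control problem:
# dynamics x_{k+1} = x_k + u_k, stage cost x^2 + u^2, terminal cost x_N^2.
N = 4
states = range(-5, 6)
controls = (-1, 0, 1)

def stage(x, u):
    return x * x + u * u

def solve_dp(x0):
    V = {x: x * x for x in states}            # terminal cost-to-go
    policy = []
    for _ in range(N):                         # sweep backward over stages
        Vn, pi = {}, {}
        for x in states:
            best = min(((stage(x, u) + V[x + u], u) for u in controls
                        if x + u in V), key=lambda p: p[0])
            Vn[x], pi[x] = best
        V = Vn
        policy.insert(0, pi)
    return V[x0], policy

cost, policy = solve_dp(3)
print(cost)  # minimal total cost starting from x0 = 3
```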

  14. Dried blood spot testing for seven steroids using liquid chromatography-tandem mass spectrometry with reference interval determination in the Korean population.

    PubMed

    Kim, Borahm; Lee, Mi Na; Park, Hyung Doo; Kim, Jong Won; Chang, Yun Sil; Park, Won Soon; Lee, Soo Youn

    2015-11-01

    Conventional screening for congenital adrenal hyperplasia (CAH) using immunoassays generates a large number of false-positive results. A more specific liquid chromatography-tandem mass spectrometry (LC-MS/MS) method has been introduced to minimize unnecessary follow-ups. However, because of limited data on its use in the Korean population, LC-MS/MS has not yet been incorporated into newborn screening programs in this region. The present study aims to develop and validate an LC-MS/MS method for the simultaneous determination of seven steroids in dried blood spots (DBS) for CAH screening, and to define age-specific reference intervals in the Korean population. We developed and validated an LC-MS/MS method to determine the reference intervals of cortisol, 17-hydroxyprogesterone, 11-deoxycortisol, 21-deoxycortisol, androstenedione, corticosterone, and 11-deoxycorticosterone simultaneously in 453 DBS samples. The samples were from Korean subjects stratified by age group (78 full-term neonates, 76 premature neonates, 89 children, and 100 adults). The accuracy, precision, matrix effects, and extraction recovery were satisfactory for all the steroids at three concentrations; values of intra- and inter-day precision coefficients of variance, bias, and recovery were 0.7-7.7%, -1.5-9.8%, and 49.3-97.5%, respectively. The linearity range was 1-100 ng/mL for cortisol and 0.5-50 ng/mL for other steroids (R²>0.99). The reference intervals were in agreement with the previous reports. This LC-MS/MS method and the reference intervals validated in the Korean population can be successfully applied to analyze seven steroids in DBS for the diagnosis of CAH.

  15. Portfolio optimization by using linear programing models based on genetic algorithm

    NASA Astrophysics Data System (ADS)

    Sukono; Hidayat, Y.; Lesmana, E.; Putra, A. S.; Napitupulu, H.; Supian, S.

    2018-01-01

    In this paper, we discuss investment portfolio optimization using a linear programming model based on genetic algorithms. It is assumed that portfolio risk is measured by the absolute standard deviation and that each investor has a risk tolerance for the investment portfolio. The investment portfolio optimization problem is formulated as a linear programming model, and the optimum solution is then determined using a genetic algorithm. As a numerical illustration, we analyze some of the stocks traded on the capital market in Indonesia. The analysis shows that portfolio optimization performed with the genetic algorithm approach produces a more efficient portfolio than that obtained with a linear programming algorithm approach. Therefore, genetic algorithms can be considered an alternative for determining the optimal investment portfolio, particularly with linear programming models.
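    A compact genetic-algorithm sketch of this kind of approach: weights are kept nonnegative and normalized to sum to one, fitness is expected return minus a penalty on mean absolute deviation, and evolution proceeds by truncation selection, blend crossover, and Gaussian mutation. The returns matrix, penalty parameter, and GA settings are synthetic stand-ins, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic daily returns for 4 assets (illustrative only).
R = rng.normal(loc=[0.01, 0.015, 0.008, 0.012], scale=0.02, size=(250, 4))
lam = 2.0                                  # risk-tolerance penalty

def fitness(w):
    port = R @ w
    return port.mean() - lam * np.abs(port - port.mean()).mean()

def normalize(pop):
    """Project each row onto nonnegative weights summing to one."""
    pop = np.clip(pop, 0, None)
    return pop / pop.sum(axis=1, keepdims=True)

pop = normalize(rng.random((60, 4)))       # initial population of portfolios
for _ in range(100):
    fit = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(fit)[-30:]]   # truncation selection (top half)
    cross = rng.integers(0, 30, size=(60, 2))
    a = rng.random((60, 1))
    children = a * parents[cross[:, 0]] + (1 - a) * parents[cross[:, 1]]
    children += rng.normal(scale=0.02, size=children.shape)  # mutation
    pop = normalize(children)
best = pop[np.argmax([fitness(w) for w in pop])]
print(best, fitness(best))
```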

  16. Systems of fuzzy equations in structural mechanics

    NASA Astrophysics Data System (ADS)

    Skalna, Iwona; Rama Rao, M. V.; Pownuk, Andrzej

    2008-08-01

    Systems of linear and nonlinear equations with fuzzy parameters are relevant to many practical problems arising in structural mechanics, electrical engineering, finance, economics and physics. In this paper three methods for solving such equations are discussed: a method for the outer interval solution of systems of linear equations depending linearly on interval parameters, the fuzzy finite element method proposed by Rama Rao, and a sensitivity analysis method. The performance and advantages of the presented methods are described with illustrative examples. An extended version of the present paper can be downloaded from the UTEP web page [I. Skalna, M.V. Rama Rao, A. Pownuk, Systems of fuzzy equations in structural mechanics, The University of Texas at El Paso, Department of Mathematical Sciences Research Reports Series, Texas Research Report No. 2007-01, 2007].

  17. An efficient method for generalized linear multiplicative programming problem with multiplicative constraints.

    PubMed

    Zhao, Yingfeng; Liu, Sanyang

    2016-01-01

    We present a practical branch and bound algorithm for globally solving the generalized linear multiplicative programming problem with multiplicative constraints. To solve the problem, a relaxation programming problem that is equivalent to a linear program is constructed by utilizing a new two-phase relaxation technique. In the algorithm, lower and upper bounds are simultaneously obtained by solving a sequence of linear relaxation programming problems. Global convergence has been proved, and results on some sample examples and a small random experiment show that the proposed algorithm is feasible and efficient.
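    The branch-and-bound idea can be sketched on a tiny linear multiplicative instance: bound each linear factor over a box, take the minimum product of the bound endpoints as a lower bound, and split any box that might still beat the incumbent. The objective f(x, y) = (x + y)(x - y) on [0, 2] × [0, 2] is our own toy example, and the bounding step uses direct interval products rather than the paper's two-phase LP relaxation.

```python
# Branch and bound for a linear multiplicative toy problem:
# minimize f(x, y) = (x + y)(x - y) over the box [0, 2] x [0, 2].

def factor_range(lo, hi, a, b):
    """Range of the linear factor a*x + b*y over the box [lo, hi]."""
    vals = [a * x + b * y for x in (lo[0], hi[0]) for y in (lo[1], hi[1])]
    return min(vals), max(vals)

def lower_bound(lo, hi):
    p = factor_range(lo, hi, 1, 1)    # range of x + y
    q = factor_range(lo, hi, 1, -1)   # range of x - y
    return min(a * b for a in p for b in q)

def f(x, y):
    return (x + y) * (x - y)

def branch_and_bound(tol=1e-4):
    boxes = [((0.0, 0.0), (2.0, 2.0))]
    best = f(1.0, 1.0)                # incumbent from an arbitrary point
    while boxes:
        lo, hi = boxes.pop()
        if lower_bound(lo, hi) >= best - tol:
            continue                   # prune: box cannot improve incumbent
        mid = ((lo[0] + hi[0]) / 2, (lo[1] + hi[1]) / 2)
        best = min(best, f(*mid))
        d = 0 if hi[0] - lo[0] >= hi[1] - lo[1] else 1  # split widest side
        for half in ((lo, tuple(mid[i] if i == d else hi[i] for i in range(2))),
                     (tuple(mid[i] if i == d else lo[i] for i in range(2)), hi)):
            boxes.append(half)
    return best

print(branch_and_bound())  # approaches the global minimum -4 at (0, 2)
```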

  18. Very Low-Cost Nutritious Diet Plans Designed by Linear Programming.

    ERIC Educational Resources Information Center

    Foytik, Jerry

    1981-01-01

    Provides procedural details of Linear Programing, developed by the U.S. Department of Agriculture to devise a dietary guide for consumers that minimizes food costs without sacrificing nutritional quality. Compares Linear Programming with the Thrifty Food Plan, which has been a basis for allocating coupons under the Food Stamp Program. (CS)

  19. Fuzzy bi-objective linear programming for portfolio selection problem with magnitude ranking function

    NASA Astrophysics Data System (ADS)

    Kusumawati, Rosita; Subekti, Retno

    2017-04-01

    The fuzzy bi-objective linear programming (FBOLP) model is a bi-objective linear programming model over fuzzy numbers, in which the coefficients of the equations are fuzzy numbers. This model is proposed to solve the portfolio selection problem of generating an asset portfolio with the lowest risk and the highest expected return. The FBOLP model, with normal fuzzy numbers for the risk and expected return of stocks, is transformed into a linear programming (LP) model using a magnitude ranking function.
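    The transformation step can be sketched as follows: a ranking function maps each triangular fuzzy coefficient to a crisp value, and the resulting crisp LP is solved. Here the graded-mean value (a + 4b + c)/6 stands in for the magnitude ranking function (the paper's function may differ), and the fuzzy returns, risks, and risk budget are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def graded_mean(tri):
    """Crisp value of a triangular fuzzy number (a, b, c) -- a stand-in
    defuzzifier, not necessarily the paper's magnitude ranking function."""
    a, b, c = tri
    return (a + 4 * b + c) / 6.0

# Hypothetical triangular fuzzy expected returns and risks for 3 assets.
fuzzy_returns = [(0.08, 0.10, 0.13), (0.05, 0.06, 0.08), (0.10, 0.14, 0.16)]
fuzzy_risks = [(0.15, 0.20, 0.24), (0.05, 0.08, 0.10), (0.25, 0.30, 0.36)]
r = [graded_mean(t) for t in fuzzy_returns]
s = [graded_mean(t) for t in fuzzy_risks]

# Crisp LP: maximize return subject to a risk budget and full investment.
res = linprog(c=[-ri for ri in r],
              A_ub=[s], b_ub=[0.15],           # portfolio risk <= 0.15
              A_eq=[[1, 1, 1]], b_eq=[1],       # weights sum to 1
              bounds=[(0, 1)] * 3)
weights = res.x
print(weights, -res.fun)
```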

  20. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    USGS Publications Warehouse

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. 
Parameters are estimated using nonlinear regression: a weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and predictions intervals, which quantify the uncertainty of model simulated values when the model is not linear. CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system. 
The programs con

  1. Power Pattern Sensitivity to Calibration Errors and Mutual Coupling in Linear Arrays through Circular Interval Arithmetics

    PubMed Central

    Anselmi, Nicola; Salucci, Marco; Rocca, Paolo; Massa, Andrea

    2016-01-01

    The sensitivity to both calibration errors and mutual coupling effects of the power pattern radiated by a linear array is addressed. Starting from the knowledge of the nominal excitations of the array elements and the maximum uncertainty on their amplitudes, the bounds of the pattern deviations from the ideal one are analytically derived by exploiting the Circular Interval Analysis (CIA). A set of representative numerical results is reported and discussed to assess the effectiveness and the reliability of the proposed approach also in comparison with state-of-the-art methods and full-wave simulations. PMID:27258274
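    The circular-interval bound can be sketched for a uniform linear array: if each excitation amplitude is known only to within ±δn, every term of the array factor lies in a disc, so the total array factor lies in a disc of radius Σδn around its nominal value, which immediately bounds the power pattern. The uniform nominal excitations and half-wavelength spacing below are our own illustrative choices.

```python
import numpy as np

N, delta = 8, 0.05
a = np.ones(N)                          # nominal excitations
dn = np.full(N, delta)                  # per-element amplitude uncertainty
theta = np.linspace(0, np.pi, 181)
k_d = np.pi                             # k*d for half-wavelength spacing

# Array factor AF(theta) = sum_n a_n * exp(j*k*d*cos(theta)*n).
phase = np.exp(1j * k_d * np.cos(theta)[:, None] * np.arange(N))
af_nom = np.abs(phase @ a)              # nominal |AF(theta)|
radius = dn.sum()                       # disc radius from circular intervals
p_hi = (af_nom + radius) ** 2           # upper bound on power pattern
p_lo = np.maximum(af_nom - radius, 0) ** 2  # lower bound (clipped at 0)
print(p_hi.max(), p_lo.max())
```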

  2. Estimating the intensity of a cyclic Poisson process in the presence of additive and multiplicative linear trend

    NASA Astrophysics Data System (ADS)

    Wayan Mangku, I.

    2017-10-01

    In this paper we survey some results on estimation of the intensity function of a cyclic Poisson process in the presence of additive and multiplicative linear trend. We do not assume any parametric form for the cyclic component of the intensity function, except that it is periodic. Moreover, we consider the case in which only a single realization of the Poisson process is observed in a bounded interval. The considered estimators are weakly and strongly consistent when the size of the observation interval expands indefinitely. Asymptotic approximations to the bias and variance of these estimators are presented.

  3. Viscoelastic-coupling model for the earthquake cycle driven from below

    USGS Publications Warehouse

    Savage, J.C.

    2000-01-01

    In a linear system the earthquake cycle can be represented as the sum of a solution which reproduces the earthquake cycle itself (viscoelastic-coupling model) and a solution that provides the driving force. We consider two cases, one in which the earthquake cycle is driven by stresses transmitted along the schizosphere and a second in which the cycle is driven from below by stresses transmitted along the upper mantle (i.e., the schizosphere and upper mantle, respectively, act as stress guides in the lithosphere). In both cases the driving stress is attributed to steady motion of the stress guide, and the upper crust is assumed to be elastic. The surface deformation that accumulates during the interseismic interval depends solely upon the earthquake-cycle solution (viscoelastic-coupling model) not upon the driving source solution. Thus geodetic observations of interseismic deformation are insensitive to the source of the driving forces in a linear system. In particular, the suggestion of Bourne et al. [1998] that the deformation that accumulates across a transform fault system in the interseismic interval is a replica of the deformation that accumulates in the upper mantle during the same interval does not appear to be correct for linear systems.

  4. Health Effects of Unemployment Benefit Program Generosity

    PubMed Central

    Glymour, M. Maria; Avendano, Mauricio

    2015-01-01

    Objectives. We assessed the impact of unemployment benefit programs on the health of the unemployed. Methods. We linked US state law data on maximum allowable unemployment benefit levels between 1985 and 2008 to individual self-rated health for heads of households in the Panel Study of Income Dynamics and implemented state and year fixed-effect models. Results. Unemployment was associated with increased risk of reporting poor health among men in both linear probability (b = 0.0794; 95% confidence interval [CI] = 0.0623, 0.0965) and logistic models (odds ratio = 2.777; 95% CI = 2.294, 3.362), but this effect is lower when the generosity of state unemployment benefits is high (b for interaction between unemployment and benefits = −0.124; 95% CI = −0.197, −0.0523). A 63% increase in benefits completely offsets the impact of unemployment on self-reported health. Conclusions. Results suggest that unemployment benefits may significantly alleviate the adverse health effects of unemployment among men. PMID:25521897
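The 63% offset figure follows directly from the two reported coefficients: the benefit increase that nulls the unemployment effect is the ratio of the main effect to the interaction term. A back-of-the-envelope check (the exact benefit scale, e.g. log units, is not stated in the abstract and is an assumption here):

```python
# Coefficients from the linear probability model in the abstract.
b_unemployment = 0.0794   # effect of unemployment on reporting poor health
b_interaction = -0.124    # unemployment x benefit-generosity interaction

# Benefit increase at which the interaction cancels the main effect.
offset = -b_unemployment / b_interaction
# offset ~ 0.64, consistent with the ~63% increase quoted.
```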

  5. Computer simulation of two-dimensional unsteady flows in estuaries and embayments by the method of characteristics : basic theory and the formulation of the numerical method

    USGS Publications Warehouse

    Lai, Chintu

    1977-01-01

    Two-dimensional unsteady flows of homogeneous density in estuaries and embayments can be described by hyperbolic, quasi-linear partial differential equations involving three dependent and three independent variables. A linear combination of these equations leads to a parametric equation of characteristic form, which consists of two parts: total differentiation along the bicharacteristics and partial differentiation in space. For its numerical solution, the specified-time-interval scheme has been used. The unknown, partial space-derivative terms can be eliminated first by suitable combinations of difference equations, converted from the corresponding differential forms and written along four selected bicharacteristics and a streamline. Other unknowns are thus made solvable from the known variables on the current time plane. The computation is carried to the second-order accuracy by using trapezoidal rule of integration. Means to handle complex boundary conditions are developed for practical application. Computer programs have been written and a mathematical model has been constructed for flow simulation. The favorable computer outputs suggest further exploration and development of model worthwhile. (Woodard-USGS)

  6. Minimax confidence intervals in geomagnetism

    NASA Technical Reports Server (NTRS)

    Stark, Philip B.

    1992-01-01

    The present paper uses theory of Donoho (1989) to find lower bounds on the lengths of optimally short fixed-length confidence intervals (minimax confidence intervals) for Gauss coefficients of the field of degree 1-12 using the heat flow constraint. The bounds on optimal minimax intervals are about 40 percent shorter than Backus' intervals: no procedure for producing fixed-length confidence intervals, linear or nonlinear, can give intervals shorter than about 60 percent the length of Backus' in this problem. While both methods rigorously account for the fact that core field models are infinite-dimensional, the application of the techniques to the geomagnetic problem involves approximations and counterfactual assumptions about the data errors, and so these results are likely to be extremely optimistic estimates of the actual uncertainty in Gauss coefficients.

  7. IBM system/360 assembly language interval arithmetic software

    NASA Technical Reports Server (NTRS)

    Phillips, E. J.

    1972-01-01

Computer software designed to perform interval arithmetic is described. An interval is defined as the set of all real numbers between two given numbers, including or excluding one or both endpoints. Interval arithmetic consists of the various elementary arithmetic operations defined on the set of all intervals, such as interval addition, subtraction, union, etc. One of the main applications of interval arithmetic is in the area of error analysis of computer calculations. For example, it has been used successfully to compute bounds on rounding errors in the solution of linear algebraic systems, as well as error bounds in numerical solutions of ordinary differential equations, integral equations, and boundary value problems. The described software enables users to implement algorithms of the type described in the references efficiently on the IBM 360 system.
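The elementary operations the report describes are easy to state in modern code. A minimal sketch (not the IBM/360 assembly implementation; `union` here returns the interval hull, the smallest interval containing both operands):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """Closed interval [lo, hi] with the elementary operations."""
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Subtraction swaps the other interval's endpoints.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # The product bounds come from the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def union(self, other):
        # Interval hull of the two operands.
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

a, b = Interval(1.0, 2.0), Interval(-1.0, 3.0)
# a + b = [0, 5]; a - b = [-2, 3]; a * b = [-2, 6]
```

For error analysis, each real quantity is replaced by an interval guaranteed to contain it, and these rules propagate the guarantee through a computation.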

  8. Estimating Standardized Linear Contrasts of Means with Desired Precision

    ERIC Educational Resources Information Center

    Bonett, Douglas G.

    2009-01-01

    L. Wilkinson and the Task Force on Statistical Inference (1999) recommended reporting confidence intervals for measures of effect sizes. If the sample size is too small, the confidence interval may be too wide to provide meaningful information. Recently, K. Kelley and J. R. Rausch (2006) used an iterative approach to computer-generate tables of…

  9. Effect of action potential duration on Tpeak-Tend interval, T-wave area and T-wave amplitude as indices of dispersion of repolarization: Theoretical and simulation study in the rabbit heart.

    PubMed

    Arteyeva, Natalia V; Azarov, Jan E

The aim of the study was to differentiate the effects of dispersion of repolarization (DOR) and action potential duration (APD) on T-wave parameters considered as indices of DOR, namely, the Tpeak-Tend interval, T-wave amplitude, and T-wave area. The T-wave was simulated over a wide physiological range of DOR and APD using a realistic rabbit model based on experimental data, and a simplified mathematical formulation of T-wave formation was derived. Both the simulations and the mathematical formulation showed that the Tpeak-Tend interval and T-wave area are linearly proportional to DOR irrespective of the APD range, while T-wave amplitude is non-linearly proportional to DOR and inversely proportional to the minimal repolarization time, or minimal APD value. The Tpeak-Tend interval and T-wave area are the most accurate DOR indices independent of APD. T-wave amplitude can be considered an index of DOR when the level of APD is taken into account. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. A Comparison of Linear and Systems Thinking Approaches for Program Evaluation Illustrated Using the Indiana Interdisciplinary GK-12

    ERIC Educational Resources Information Center

    Dyehouse, Melissa; Bennett, Deborah; Harbor, Jon; Childress, Amy; Dark, Melissa

    2009-01-01

    Logic models are based on linear relationships between program resources, activities, and outcomes, and have been used widely to support both program development and evaluation. While useful in describing some programs, the linear nature of the logic model makes it difficult to capture the complex relationships within larger, multifaceted…

  11. Maternal Night-Fasting Interval during Pregnancy Is Directly Associated with Neonatal Head Circumference and Adiposity in Girls but Not Boys.

    PubMed

    Loy, See Ling; Wee, Poh Hui; Colega, Marjorelee T; Cheung, Yin Bun; Aris, Izzuddin M; Chan, Jerry Kok Yen; Godfrey, Keith M; Gluckman, Peter D; Tan, Kok Hian; Shek, Lynette Pei-Chi; Chong, Yap-Seng; Natarajan, Padmapriya; Müller-Riemenschneider, Falk; Lek, Ngee; Rajadurai, Victor Samuel; Tint, Mya-Thway; Lee, Yung Seng; Chong, Mary Foong-Fong; Yap, Fabian

    2017-07-01

    Background: Synchrony between daily feeding-fasting signals and circadian rhythms has been shown to improve metabolic health in animals and adult humans, but the potential programming effect on fetal growth is unknown. Objective: We examined the associations of the maternal night-fasting interval during pregnancy with offspring birth size and adiposity. Methods: This was a cross-sectional study of mother-offspring dyads within the Growing Up in Singapore Towards healthy Outcomes (GUSTO) cohort. For 384 mothers aged 30.8 ± 4.8 y (mean ± SD), the night-fasting interval at 26-28 wk of gestation was determined from a 3-d food diary based on the average of the fasting duration at night (1900-0659). Offspring birth weight, length, and head circumference were measured and converted to weight-for-gestational age (GA), length-for-GA, and head circumference-for-GA z scores, respectively, by using local customized percentile charts. The percentage of neonatal total body fat (TBF) was derived by using a validated prediction equation. Multivariable general linear models, stratified by child sex, were performed. Results: The mean ± SD maternal night-fasting interval was 9.9 ± 1.3 h. In infant girls, each 1-h increase in the maternal night-fasting interval was associated with a 0.22-SD (95% CI: 0.05-, 0.40-SD; P = 0.013) increase in birth head circumference-for-GA and a 0.84% (95% CI: 0.19%, 1.49%; P = 0.012) increase in TBF at birth, after adjustment for confounders. In infant boys, no associations were observed between the maternal night-fasting interval and birth size or TBF. Conclusions: An increased maternal night-fasting interval in the late second trimester of pregnancy is associated with increased birth head circumference and TBF in girls but not boys. 
Our findings are in accordance with previous observations that suggest that there are sex-specific responses in fetal brain growth and adiposity, and raise the possibility of the maternal night-fasting interval as an underlying influence. This trial was registered at clinicaltrials.gov as NCT01174875. © 2017 American Society for Nutrition.

  12. Masticatory motion after surgical or nonsurgical treatment for unilateral fractures of the mandibular condylar process.

    PubMed

    Throckmorton, Gaylord S; Ellis, Edward; Hayasaki, Haruaki

    2004-02-01

    We sought to compare mandibular motion during mastication in patients treated in either an open or a closed fashion for unilateral fractures of the mandibular condylar process. Eighty-one male patients with unilateral condylar process fractures were treated either with (n = 37) or without (n = 44) surgical reduction and rigid fixation of their condylar process fractures. At 6 weeks, 6 months, 1 year, and 2 years after treatment, the subjects' chewing cycles were recorded using a magnetic sensor array (Sirognathograph; Siemens Corp, Bensheim, Germany) while chewing Gummi-Bears (HARIBO, Bonn, Germany) unilaterally on the same side as the fracture and on the opposite side. The chewing cycles were analyzed using a custom computer program, and the duration, excursive ranges, and 3-dimensional cycle shape were compared between the 2 treatment groups at each time interval using multilevel linear modeling statistics. The 2 treatment groups did not differ significantly for any measure of cycle duration or any excursive range (except lateral excursions at 1 year post-treatment) at any of the time intervals. However, the 3-dimensional cycle shapes of the 2 groups did differ significantly at all time intervals. Surgical correction of unilateral condylar process fractures has relatively little effect on the more standard measures (duration and excursive ranges) of masticatory function. However, surgical correction better normalizes opening incisor pathways during mastication on the side opposite the fracture.

  13. Linear Programming across the Curriculum

    ERIC Educational Resources Information Center

    Yoder, S. Elizabeth; Kurz, M. Elizabeth

    2015-01-01

    Linear programming (LP) is taught in different departments across college campuses with engineering and management curricula. Modeling an LP problem is taught in every linear programming class. As faculty teaching in Engineering and Management departments, the depth to which teachers should expect students to master this particular type of…

  14. Fundamental solution of the problem of linear programming and method of its determination

    NASA Technical Reports Server (NTRS)

    Petrunin, S. V.

    1978-01-01

    The idea of a fundamental solution to a problem in linear programming is introduced. A method of determining the fundamental solution and of applying this method to the solution of a problem in linear programming is proposed. Numerical examples are cited.

  15. A Sawmill Manager Adapts To Change With Linear Programming

    Treesearch

    George F. Dutrow; James E. Granskog

    1973-01-01

Linear programming provides guidelines for increasing sawmill capacity and flexibility and for determining stumpage-purchasing strategy. The operator of a medium-sized sawmill implemented improvements suggested by linear programming analysis; results indicate a 45 percent increase in revenue and a 36 percent hike in volume processed.

  16. Multi-cluster processor operating only select number of clusters during each phase based on program statistic monitored at predetermined intervals

    DOEpatents

Balasubramonian, Rajeev [Sandy, UT]; Dwarkadas, Sandhya [Rochester, NY]; Albonesi, David [Ithaca, NY]

    2009-02-10

    In a processor having multiple clusters which operate in parallel, the number of clusters in use can be varied dynamically. At the start of each program phase, the configuration option for an interval is run to determine the optimal configuration, which is used until the next phase change is detected. The optimum instruction interval is determined by starting with a minimum interval and doubling it until a low stability factor is reached.
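The interval-doubling search described above can be sketched as follows. This is an illustration of the scheme only: `stability` is a hypothetical callable returning an instability measure for a given interval length, and all thresholds and bounds are made-up values.

```python
def choose_instruction_interval(stability, min_interval=1000,
                                max_interval=1_000_000, threshold=0.05):
    """Start from the minimum instruction interval and double it until
    the monitored program statistic is stable enough (a sketch of the
    interval-selection scheme in the abstract, not the patented logic)."""
    interval = min_interval
    while interval < max_interval and stability(interval) > threshold:
        interval *= 2
    return interval

# Toy statistic: longer sampling intervals smooth out measurement noise.
# 100/1000 = 0.10 > 0.05, so double once; 100/2000 = 0.05, so stop.
interval = choose_instruction_interval(lambda n: 100.0 / n)
```

The point of doubling rather than scanning linearly is that a suitable interval is found in logarithmically many probes at each detected phase change.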

  17. On differences of linear positive operators

    NASA Astrophysics Data System (ADS)

    Aral, Ali; Inoan, Daniela; Raşa, Ioan

    2018-04-01

In this paper we consider two different general linear positive operators defined on an unbounded interval and obtain quantitative estimates for the differences of these operators. Our estimates involve an appropriate K-functional and a weighted modulus of smoothness. Similar estimates are obtained for the Chebyshev functional of these operators as well. All considerations are based on a rearrangement of the remainder in Taylor's formula. The obtained results are applied to some well-known linear positive operators.

  18. Application of Statistical Learning Theory to Plankton Image Analysis

    DTIC Science & Technology

    2006-06-01

linear distance interval from 1 to 40 pixels and two directions formula (horizontal & vertical, and diagonals), EF2 is EF with 7 exponential distance...and four directions formula (horizontal, vertical and two diagonals). It is clear that exponential distance interval works better than the linear ...PSI - PS by Vincent, linear and pseudo opening and closing spectra, each has 40 elements, total feature length of 160. PS2 - PS modified from Meijster

  19. [Classification and characteristics of interval cancers in the Principality of Asturias's Breast Cancer Screening Program].

    PubMed

    Prieto García, M A; Delgado Sevillano, R; Baldó Sierra, C; González Díaz, E; López Secades, A; Llavona Amor, J A; Vidal Marín, B

    2013-09-01

    To review and classify the interval cancers found in the Principality of Asturias's Breast Cancer Screening Program (PDPCM). A secondary objective was to determine the histological characteristics, size, and stage of the interval cancers at the time of diagnosis. We included the interval cancers in the PDPCM in the period 2003-2007. Interval cancers were classified according to the breast cancer screening program protocol, with double reading without consensus, without blinding, with arbitration. Mammograms were interpreted by 10 radiologists in the PDPCM. A total of 33.7% of the interval cancers could not be classified; of the interval cancers that could be classified, 40.67% were labeled true interval cancers, 31.4% were labeled false negatives on screening, 23.7% had minimal signs, and 4.23% were considered occult. A total of 70% of the interval cancers were diagnosed in the year of the period between screening examinations and 71.7% were diagnosed after subsequent screening. A total of 76.9% were invasive ductal carcinomas, 61.1% were stage II when detected, and 78.7% were larger than 10mm when detected. The rate of interval cancers and the rate of false negatives in the PDPCM are higher than those recommended in the European guidelines. Interval cancers are diagnosed later than the tumors detected at screening. Studying interval cancers provides significant training for the radiologists in the PDPCM. Copyright © 2011 SERAM. Published by Elsevier Espana. All rights reserved.

  20. 77 FR 68177 - Self-Regulatory Organizations; NYSE MKT LLC; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-15

    ... Commission has recently approved certain products to trade at $0.50 and $1 strike price intervals on CBOE... Series (``STOS'') Program that normally trade in $1 Strike Price Intervals shall be $0.50 or greater; and for classes in the STOS Program that do not normally trade in $1 Strike Price Intervals, the strike...

  1. Linear and volumetric dimensional changes of injection-molded PMMA denture base resins.

    PubMed

    El Bahra, Shadi; Ludwig, Klaus; Samran, Abdulaziz; Freitag-Wolf, Sandra; Kern, Matthias

    2013-11-01

    The aim of this study was to evaluate the linear and volumetric dimensional changes of six denture base resins processed by their corresponding injection-molding systems at 3 time intervals of water storage. Two heat-curing (SR Ivocap Hi Impact and Lucitone 199) and four auto-curing (IvoBase Hybrid, IvoBase Hi Impact, PalaXpress, and Futura Gen) acrylic resins were used with their specific injection-molding technique to fabricate 6 specimens of each material. Linear and volumetric dimensional changes were determined by means of a digital caliper and an electronic hydrostatic balance, respectively, after water storage of 1, 30, or 90 days. Means and standard deviations of linear and volumetric dimensional changes were calculated in percentage (%). Statistical analysis was done using Student's and Welch's t tests with Bonferroni-Holm correction for multiple comparisons (α=0.05). Statistically significant differences in linear dimensional changes between resins were demonstrated at all three time intervals of water immersion (p≤0.05), with exception of the following comparisons which showed no significant difference: IvoBase Hi Impact/SR Ivocap Hi Impact and PalaXpress/Lucitone 199 after 1 day, Futura Gen/PalaXpress and PalaXpress/Lucitone 199 after 30 days, and IvoBase Hybrid/IvoBase Hi Impact after 90 days. Also, statistically significant differences in volumetric dimensional changes between resins were found at all three time intervals of water immersion (p≤0.05), with exception of the comparison between PalaXpress and Futura Gen. Denture base resins (IvoBase Hybrid and IvoBase Hi Impact) processed by the new injection-molding system (IvoBase), revealed superior dimensional precision. Copyright © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  2. Timetabling an Academic Department with Linear Programming.

    ERIC Educational Resources Information Center

    Bezeau, Lawrence M.

    This paper describes an approach to faculty timetabling and course scheduling that uses computerized linear programming. After reviewing the literature on linear programming, the paper discusses the process whereby a timetable was created for a department at the University of New Brunswick. Faculty were surveyed with respect to course offerings…

  3. A Comparison of Traditional Worksheet and Linear Programming Methods for Teaching Manure Application Planning.

    ERIC Educational Resources Information Center

    Schmitt, M. A.; And Others

    1994-01-01

    Compares traditional manure application planning techniques calculated to meet agronomic nutrient needs on a field-by-field basis with plans developed using computer-assisted linear programming optimization methods. Linear programming provided the most economical and environmentally sound manure application strategy. (Contains 15 references.) (MDH)

  4. A new variable interval schedule with constant hazard rate and finite time range.

    PubMed

    Bugallo, Mehdi; Machado, Armando; Vasconcelos, Marco

    2018-05-27

    We propose a new variable interval (VI) schedule that achieves constant probability of reinforcement in time while using a bounded range of intervals. By sampling each trial duration from a uniform distribution ranging from 0 to 2 T seconds, and then applying a reinforcement rule that depends linearly on trial duration, the schedule alternates reinforced and unreinforced trials, each less than 2 T seconds, while preserving a constant hazard function. © 2018 Society for the Experimental Analysis of Behavior.
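A simulation sketch of the schedule follows. The linear reinforcement rule shown (probability of reinforcement proportional to the sampled trial duration) is our reading of the abstract, not the authors' exact rule; with it, the expected fraction of reinforced trials is E[d/(2T)] = 1/2.

```python
import random

def vi_trial(T, rng):
    """One trial of the sketched VI schedule: duration d sampled
    uniformly on (0, 2T), reinforced with probability d / (2T)."""
    d = 2.0 * T * rng.random()
    reinforced = rng.random() < d / (2.0 * T)
    return d, reinforced

rng = random.Random(0)
trials = [vi_trial(10.0, rng) for _ in range(10_000)]
frac = sum(r for _, r in trials) / len(trials)
# Roughly half the trials end reinforced, and every trial duration is
# bounded by 2T = 20 s, matching the schedule's finite time range.
```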

  5. Minding the Gap: Children's Difficulty Conceptualizing Spatial Intervals as Linear Measurement Units

    ERIC Educational Resources Information Center

    Solomon, Tracy L.; Vasilyeva, Marina; Huttenlocher, Janellen; Levine, Susan C.

    2015-01-01

    Understanding measurement units is critical to mathematics and science learning, but it is a topic that American students find difficult. In 3 studies, we investigated the challenges underlying this difficulty in kindergarten and second grade by comparing performance on different versions of a linear measurement task. Children measured crayons…

  6. Applications of Goal Programming to Education.

    ERIC Educational Resources Information Center

    Van Dusseldorp, Ralph A.; And Others

    This paper discusses goal programming, a computer-based operations research technique that is basically a modification and extension of linear programming. The authors first discuss the similarities and differences between goal programming and linear programming, then describe the limitations of goal programming and its possible applications for…

  7. Deduced Inference in the Analysis of Experimental Data

    ERIC Educational Resources Information Center

    Bird, Kevin D.

    2011-01-01

Any set of confidence interval inferences on J - 1 linearly independent contrasts on J means, such as the two comparisons μ1 - μ2 and μ2 - μ3 on 3 means, provides a basis for the deduction of interval inferences on all other contrasts, such as the redundant comparison μ1 -…

  8. Ada Linear-Algebra Program

    NASA Technical Reports Server (NTRS)

    Klumpp, A. R.; Lawson, C. L.

    1988-01-01

Routines are provided for common scalar, vector, matrix, and quaternion operations. The computer program extends the Ada programming language to include linear-algebra capabilities similar to those of the HAL/S programming language. It is designed for such avionics applications as software for the Space Station.

  9. A linear programming manual

    NASA Technical Reports Server (NTRS)

    Tuey, R. C.

    1972-01-01

    Computer solutions of linear programming problems are outlined. Information covers vector spaces, convex sets, and matrix algebra elements for solving simultaneous linear equations. Dual problems, reduced cost analysis, ranges, and error analysis are illustrated.

  10. Care processes associated with quicker door-in-door-out times for patients with ST-elevation-myocardial infarction requiring transfer: results from a statewide regionalization program.

    PubMed

    Glickman, Seth W; Lytle, Barbara L; Ou, Fang-Shu; Mears, Greg; O'Brien, Sean; Cairns, Charles B; Garvey, J Lee; Bohle, David J; Peterson, Eric D; Jollis, James G; Granger, Christopher B

    2011-07-01

    The ability to rapidly identify patients with ST-segment elevation-myocardial infarction (STEMI) at hospitals without percutaneous coronary intervention (PCI) and transfer them to hospitals with PCI capability is critical to STEMI regionalization efforts. Our objective was to assess the association of prehospital, emergency department (ED), and hospital processes of care implemented as part of a statewide STEMI regionalization program with door-in-door-out times at non-PCI hospitals. Door-in-door-out times for 436 STEMI patients at 55 non-PCI hospitals were determined before (July 2005 to September 2005) and after (January 2007 to March 2007) a year-long implementation of standardized protocols as part of a statewide regionalization program (Reperfusion of Acute Myocardial Infarction in North Carolina Emergency Departments, RACE). The association of 8 system care processes (encompassing emergency medical services [EMS], ED, and hospital settings) with door-in-door-out times was determined using multivariable linear regression. Median door-in-door-out times improved significantly with the intervention (before: 97.0 minutes, interquartile range, 56.0 to 160.0 minutes; after: 58.0 minutes, interquartile range, 35.0 to 90.0 minutes; P<0.0001). Hospital, ED, and EMS care processes were each independently associated with shorter door-in-door-out times (-17.7 [95% confidence interval, -27.5 to -7.9]; -10.1 [95% confidence interval, -19.0 to -1.1], and -7.3 [95% confidence interval, -13.0 to -1.5] minutes for each additional hospital, ED, and EMS process, respectively). Combined, adoption of EMS processes was associated with the shortest median treatment times (44 versus 138 minutes for hospitals that adopted all EMS processes versus none). Prehospital, ED, and hospital processes of care were independently associated with shorter door-in-door-out times for STEMI patients requiring transfer. 
Adoption of several EMS processes was associated with the largest reduction in treatment times. These findings highlight the need for an integrated, system-based approach to improving STEMI care.

  11. BRIDGING GAPS BETWEEN ZOO AND WILDLIFE MEDICINE: ESTABLISHING REFERENCE INTERVALS FOR FREE-RANGING AFRICAN LIONS (PANTHERA LEO).

    PubMed

    Broughton, Heather M; Govender, Danny; Shikwambana, Purvance; Chappell, Patrick; Jolles, Anna

    2017-06-01

    The International Species Information System has set forth an extensive database of reference intervals for zoologic species, allowing veterinarians and game park officials to distinguish normal health parameters from underlying disease processes in captive wildlife. However, several recent studies comparing reference values from captive and free-ranging animals have found significant variation between populations, necessitating the development of separate reference intervals in free-ranging wildlife to aid in the interpretation of health data. Thus, this study characterizes reference intervals for six biochemical analytes, eleven hematologic or immune parameters, and three hormones using samples from 219 free-ranging African lions ( Panthera leo ) captured in Kruger National Park, South Africa. Using the original sample population, exclusion criteria based on physical examination were applied to yield a final reference population of 52 clinically normal lions. Reference intervals were then generated via 90% confidence intervals on log-transformed data using parametric bootstrapping techniques. In addition to the generation of reference intervals, linear mixed-effect models and generalized linear mixed-effect models were used to model associations of each focal parameter with the following independent variables: age, sex, and body condition score. Age and sex were statistically significant drivers for changes in hepatic enzymes, renal values, hematologic parameters, and leptin, a hormone related to body fat stores. Body condition was positively correlated with changes in monocyte counts. Given the large variation in reference values taken from captive versus free-ranging lions, it is our hope that this study will serve as a baseline for future clinical evaluations and biomedical research targeting free-ranging African lions.
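The interval construction described above can be sketched numerically. This is our reading of the method (resample from a normal fitted to the log-transformed data, take the central 90% quantiles, back-transform); the sample sizes, seed, and toy analyte are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)

def reference_interval(values, n_boot=2000, coverage=0.90):
    """Parametric-bootstrap reference interval on log-transformed data
    (a sketch of the approach in the abstract, not the authors' code)."""
    logs = np.log(values)
    mu, sigma = logs.mean(), logs.std(ddof=1)
    # Draw bootstrap replicates from the fitted log-normal model.
    boot = rng.normal(mu, sigma, size=(n_boot, logs.size))
    lo = np.exp(np.quantile(boot, (1 - coverage) / 2))
    hi = np.exp(np.quantile(boot, (1 + coverage) / 2))
    return lo, hi

# Toy analyte from 52 "clinically normal" animals: log-normal, median 50.
sample = np.exp(rng.normal(np.log(50.0), 0.2, size=52))
lo, hi = reference_interval(sample)
# The interval brackets the population median.
```

Log-transforming first keeps the interval strictly positive and handles the right-skew typical of biochemical analytes.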

  12. Object matching using a locally affine invariant and linear programming techniques.

    PubMed

    Li, Hongsheng; Huang, Xiaolei; He, Lei

    2013-02-01

In this paper, we introduce a new matching method based on a novel locally affine-invariant geometric constraint and linear programming techniques. To model and solve the matching problem in a linear programming formulation, all geometric constraints must be exactly or approximately reformulated into a linear form; this is a major difficulty for this kind of matching algorithm. We propose a novel locally affine-invariant constraint which can be exactly linearized and requires far fewer auxiliary variables than other linear programming-based methods. The key idea behind it is that each point in the template point set can be exactly represented by an affine combination of its neighboring points, whose weights can be solved easily by least squares. Errors in reconstructing each matched point using such weights are used to penalize the disagreement of geometric relationships between the template points and the matched points. The resulting overall objective function can be solved efficiently by linear programming techniques. Our experimental results on both rigid and nonrigid object matching show the effectiveness of the proposed algorithm.
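The least-squares step behind the key idea can be sketched directly: find weights that reproduce a point from its neighbors while summing to one (the affine constraint). The heavily weighted extra equation used to enforce the sum constraint is our own simplification, not the paper's formulation.

```python
import numpy as np

def affine_weights(point, neighbors):
    """Weights w with sum(w) = 1 such that point ~= neighbors.T @ w,
    solved by least squares (a sketch of the locally affine
    representation the paper builds on)."""
    N = np.asarray(neighbors, dtype=float)   # k neighbors x d dims
    p = np.asarray(point, dtype=float)       # d dims
    # Append sum(w) = 1 as a heavily weighted equation and solve
    # the stacked system in the least-squares sense.
    A = np.vstack([N.T, 1e6 * np.ones(N.shape[0])])
    b = np.concatenate([p, [1e6]])
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

# A point inside the triangle of its three neighbors is reproduced
# exactly: (0.25, 0.25) = 0.5*(0,0) + 0.25*(1,0) + 0.25*(0,1).
neighbors = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
w = affine_weights([0.25, 0.25], neighbors)
```

In the matching formulation, the reconstruction error under these fixed weights is what penalizes geometric disagreement between template and matched points, and that penalty is linear in the unknowns.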

  13. Bursting as a source of non-linear determinism in the firing patterns of nigral dopamine neurons

    PubMed Central

    Jeong, Jaeseung; Shi, Wei-Xing; Hoffman, Ralph; Oh, Jihoon; Gore, John C.; Bunney, Benjamin S.; Peterson, Bradley S.

    2012-01-01

    Nigral dopamine (DA) neurons in vivo exhibit complex firing patterns consisting of tonic single-spikes and phasic bursts that encode information for certain types of reward-related learning and behavior. Non-linear dynamical analysis has previously demonstrated the presence of a non-linear deterministic structure in complex firing patterns of DA neurons, yet the origin of this non-linear determinism remains unknown. In this study, we hypothesized that bursting activity is the primary source of non-linear determinism in the firing patterns of DA neurons. To test this hypothesis, we investigated the dimension complexity of inter-spike interval data recorded in vivo from bursting and non-bursting DA neurons in the chloral hydrate-anesthetized rat substantia nigra. We found that bursting DA neurons exhibited non-linear determinism in their firing patterns, whereas non-bursting DA neurons showed truly stochastic firing patterns. Determinism was also detected in the isolated burst and inter-burst interval data extracted from firing patterns of bursting neurons. Moreover, less bursting DA neurons in halothane-anesthetized rats exhibited higher dimensional spiking dynamics than do more bursting DA neurons in chloral hydrate-anesthetized rats. These results strongly indicate that bursting activity is the main source of low-dimensional, non-linear determinism in the firing patterns of DA neurons. This finding furthermore suggests that bursts are the likely carriers of meaningful information in the firing activities of DA neurons. PMID:22831464

  14. Assessment of Ethylene Vinyl-Acetate Copolymer (EVA) Samples Bombarded by Gamma Radiation via Linearity Analyses

    NASA Astrophysics Data System (ADS)

    de Oliveira, L. N.; do Nascimento, E. O.; Schimidt, F.; Antonio, P. L.; Caldas, L. V. E.

    2018-03-01

    Materials with the potential to become dosimeters are of interest in radiation physics. In this research, the materials were analyzed and compared in relation to their linearity ranges. Samples of ethylene vinyl-acetate copolymer (EVA) were irradiated with doses from 10 Gy to 10 kGy using a 60Co Gamma-Cell 220 system and evaluated with the FTIR technique. The linearity analyses were applied through two methodologies, searching for linear regions in their response. The results show that both analyses indicate linear regions in defined dose intervals. EVA radiation detectors can be useful for radiation dosimetry at intermediate and high doses.

  15. How people make friends in social networking sites—A microscopic perspective

    NASA Astrophysics Data System (ADS)

    Hu, Haibo; Wang, Xiaofan

    2012-02-01

    We study the detailed growth of a social networking site with full temporal information by examining the creation process of each friendship relation that can collectively lead to the macroscopic properties of the network. We first study the reciprocal behavior of users, and find that link requests are quickly responded to and that the distribution of reciprocation intervals decays in an exponential form. The degrees of inviters/accepters are slightly negatively correlated with reciprocation time. In addition, the temporal feature of the online community shows that the distributions of intervals of user behaviors, such as sending or accepting link requests, follow a power law with a universal exponent, and peaks emerge at intervals of whole days. We finally study the preferential selection and linking phenomena of the social networking site and find that, for the former, a linear preference holds for preferential sending and reception, and for the latter, a linear preference also holds for preferential acceptance, creation, and attachment. Based on the linearly preferential linking, we put forward an analyzable network model which can reproduce the degree distribution of the network. The research framework presented in the paper could provide insight into how the micro-motives of users lead to the global structure of online social networks.
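
    The linearly preferential linking reported here is the ingredient behind Barabási-Albert-style growth models. A minimal sketch (our own toy model, not the authors' analyzable network model) in which each new user attaches to one existing user with probability proportional to degree:

```python
import random

def grow_network(n, seed=1):
    """Grow a network by linearly preferential attachment: each new
    node attaches to one existing node chosen with probability
    proportional to its current degree."""
    random.seed(seed)
    degree = {0: 1, 1: 1}          # start from a single edge 0-1
    targets = [0, 1]               # each node appears once per unit degree
    for new in range(2, n):
        old = random.choice(targets)   # linear preference in degree
        degree[old] += 1
        degree[new] = 1
        targets.extend([old, new])
    return degree

deg = grow_network(2000)
print(max(deg.values()))   # a few hubs acquire large degree
```

    Repeating uniform choice over the `targets` multiset is what makes the selection probability exactly proportional to degree.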

  16. A computer program for the geometrically nonlinear static and dynamic analysis of arbitrarily loaded shells of revolution, theory and users manual

    NASA Technical Reports Server (NTRS)

    Ball, R. E.

    1972-01-01

    A digital computer program known as SATANS (static and transient analysis, nonlinear, shells) for the geometrically nonlinear static and dynamic response of arbitrarily loaded shells of revolution is presented. Instructions for the preparation of the input data cards and other information necessary for the operation of the program are described in detail and two sample problems are included. The governing partial differential equations are based upon Sanders' nonlinear thin shell theory for the conditions of small strains and moderately small rotations. The governing equations are reduced to uncoupled sets of four linear, second order, partial differential equations in the meridional and time coordinates by expanding the dependent variables in a Fourier sine or cosine series in the circumferential coordinate and treating the nonlinear modal coupling terms as pseudo loads. The derivatives with respect to the meridional coordinate are approximated by central finite differences, and the displacement accelerations are approximated by the implicit Houbolt backward difference scheme with a constant time interval. The boundaries of the shell may be closed, free, fixed, or elastically restrained. The program is coded in the FORTRAN 4 language and is dimensioned to allow a maximum of 10 arbitrary Fourier harmonics and a maximum product of the total number of meridional stations and the total number of Fourier harmonics of 200. The program requires 155,000 bytes of core storage.
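
    The Houbolt scheme mentioned above approximates the acceleration at step n+1 by the backward difference (2x_{n+1} - 5x_n + 4x_{n-1} - x_{n-2})/dt^2. A minimal sketch for a single undamped linear oscillator (not the shell equations; the seeding and parameters are illustrative):

```python
import math

def houbolt_free_vibration(m=1.0, k=1.0, dt=0.01, steps=2000):
    """Integrate m*x'' + k*x = 0 with the implicit Houbolt scheme:
    x''(t_{n+1}) ~= (2x_{n+1} - 5x_n + 4x_{n-1} - x_{n-2}) / dt**2.
    Solving the resulting equation for x_{n+1} makes the step implicit."""
    w = math.sqrt(k / m)
    # Seed the three start-up values from the exact solution cos(w t).
    xs = [math.cos(w * i * dt) for i in range(3)]
    for n in range(2, steps):
        rhs = (m / dt**2) * (5 * xs[n] - 4 * xs[n - 1] + xs[n - 2])
        xs.append(rhs / (2 * m / dt**2 + k))
    return xs

xs = houbolt_free_vibration()
# Early on the numerical solution tracks cos(t); Houbolt's built-in
# numerical damping makes the amplitude decay slowly over long runs.
print(xs[100], math.cos(100 * 0.01))
```

    The scheme is unconditionally stable for this linear problem, which is why SATANS can use a constant time interval.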

  17. PubMed

    Trinker, Horst

    2011-10-28

    We study the distribution of triples of codewords of codes and ordered codes. Schrijver [A. Schrijver, New code upper bounds from the Terwilliger algebra and semidefinite programming, IEEE Trans. Inform. Theory 51 (8) (2005) 2859-2866] used the triple distribution of a code to establish a bound on the number of codewords based on semidefinite programming. In the first part of this work, we generalize this approach for ordered codes. In the second part, we consider linear codes and linear ordered codes and present a MacWilliams-type identity for the triple distribution of their dual code. Based on the non-negativity of this linear transform, we establish a linear programming bound and conclude with a table of parameters for which this bound yields better results than the standard linear programming bound.

  18. Ambient temperature and FIT performance in the Emilia-Romagna colorectal cancer screening programme.

    PubMed

    De Girolamo, Gianfranco; Goldoni, Carlo A; Corradini, Rossella; Giuliani, Orietta; Falcini, Fabio; Sassoli De'Bianchi, Priscilla; Naldoni, Carlo; Zauli Sajani, Stefano

    2016-12-01

    To assess the impact of ambient temperature on faecal immunochemical test (FIT) performance in the colorectal cancer screening programme of Emilia-Romagna (Italy). A population-based retrospective cohort study on data from 2005 to 2011. Positive rate, detection rate, and positive predictive value rate for cancers and adenomas, and incidence rate of interval cancers after negative tests were analysed using Poisson regression models. In addition to ambient temperature, gender, age, screening history, and Local Health Unit were also considered. In 1,521,819 tests analysed, the probability of a positive result decreased linearly with increasing temperature. Point estimates and 95% Confidence Intervals were estimated for six temperature classes (<5, 5-10, 10-15, 15-20, 20-25, and ≥25°C), referred to the 5-10°C class. The positive rate ratio was significantly related to temperature increase: 0.99 (0.97-1.02), 1, 0.98 (0.96-1.00), 0.96 (0.94-0.99), 0.93 (0.91-0.96), 0.92 (0.89-0.95). A linear trend was also evident for the advanced adenoma detection rate ratio: 1.00 (0.96-1.04), 1, 0.98 (0.93-1.02), 0.96 (0.92-1.00), 0.92 (0.88-0.96), 0.94 (0.88-1.01). The effect was less linear, but still important, for cancer detection rates: 0.95 (0.85-1.06), 1, 1.00 (0.90-1.10), 0.94 (0.85-1.05), 0.81 (0.72-0.92), 0.93 (0.80-1.09). No association or linear trend was found for positive predictive values or risk of interval cancer, despite an excess of +16% in the highest temperature class for interval cancer. Ambient temperatures can affect screening performance. Continued monitoring is needed to verify the effect of introducing FIT tubes with a new buffer, which should guarantee a higher stability of haemoglobin. © The Author(s) 2016.

  19. Advanced Interval Type-2 Fuzzy Sliding Mode Control for Robot Manipulator.

    PubMed

    Hwang, Ji-Hwan; Kang, Young-Chang; Park, Jong-Wook; Kim, Dong W

    2017-01-01

    In this paper, an advanced interval type-2 fuzzy sliding mode control (AIT2FSMC) scheme for a robot manipulator is proposed. The proposed AIT2FSMC combines an interval type-2 fuzzy system with sliding mode control. The interval type-2 fuzzy system is designed to resemble a feedback linearization (FL) control law, and a sliding mode controller is designed to compensate for the approximation error between the FL control law and the interval type-2 fuzzy system. The tuning algorithms are derived in the sense of the Lyapunov stability theorem. A two-link rigid robot manipulator with nonlinearities is used as the test system, and simulation results are presented to show that the proposed method controls the unknown system well.

  20. LINEAR - DERIVATION AND DEFINITION OF A LINEAR AIRCRAFT MODEL

    NASA Technical Reports Server (NTRS)

    Duke, E. L.

    1994-01-01

    The Derivation and Definition of a Linear Model program, LINEAR, provides the user with a powerful and flexible tool for the linearization of aircraft aerodynamic models. LINEAR was developed to provide a standard, documented, and verified tool to derive linear models for aircraft stability analysis and control law design. Linear system models define the aircraft system in the neighborhood of an analysis point and are determined by the linearization of the nonlinear equations defining vehicle dynamics and sensors. LINEAR numerically determines a linear system model using nonlinear equations of motion and a user supplied linear or nonlinear aerodynamic model. The nonlinear equations of motion used are six-degree-of-freedom equations with stationary atmosphere and flat, nonrotating earth assumptions. LINEAR is capable of extracting both linearized engine effects, such as net thrust, torque, and gyroscopic effects and including these effects in the linear system model. The point at which this linear model is defined is determined either by completely specifying the state and control variables, or by specifying an analysis point on a trajectory and directing the program to determine the control variables and the remaining state variables. The system model determined by LINEAR consists of matrices for both the state and observation equations. The program has been designed to provide easy selection of state, control, and observation variables to be used in a particular model. Thus, the order of the system model is completely under user control. Further, the program provides the flexibility of allowing alternate formulations of both the state and observation equations. Data describing the aircraft and the test case is input to the program through a terminal or formatted data files. All data can be modified interactively from case to case. 
The aerodynamic model can be defined in two ways: a set of nondimensional stability and control derivatives for the flight point of interest, or a full non-linear aerodynamic model as used in simulations. LINEAR is written in FORTRAN and has been implemented on a DEC VAX computer operating under VMS with a virtual memory requirement of approximately 296K of 8 bit bytes. Both an interactive and batch version are included. LINEAR was developed in 1988.
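
    The numerical linearization LINEAR performs can be illustrated, in spirit, by central-difference Jacobians of a nonlinear state equation about an analysis point. The pendulum model and function names below are hypothetical stand-ins, not LINEAR's aircraft equations:

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Central-difference Jacobians of xdot = f(x, u) about (x0, u0):
    returns A = df/dx and B = df/du, so that near the analysis point
    xdot ~= A (x - x0) + B (u - u0)."""
    n, m = len(x0), len(u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for j in range(n):
        dx = np.zeros(n); dx[j] = eps
        A[:, j] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

# Toy nonlinear "vehicle": a pendulum with a torque input.
def f(x, u):
    theta, omega = x
    return np.array([omega, -9.81 * np.sin(theta) + u[0]])

A, B = linearize(f, np.array([0.0, 0.0]), np.array([0.0]))
print(A)   # expect approximately [[0, 1], [-9.81, 0]]
```

    The pair (A, B) is the state-equation part of the linear system model; an observation Jacobian C is obtained the same way from a sensor model.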

  1. Variable frame rate transmission - A review of methodology and application to narrow-band LPC speech coding

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. R.; Makhoul, J.; Schwartz, R. M.; Huggins, A. W. F.

    1982-04-01

    The variable frame rate (VFR) transmission methodology developed, implemented, and tested in the years 1973-1978 for efficiently transmitting linear predictive coding (LPC) vocoder parameters extracted from the input speech at a fixed frame rate is reviewed. With the VFR method, parameters are transmitted only when their values have changed sufficiently over the interval since their preceding transmission. Two distinct approaches to automatic implementation of the VFR method are discussed. The first bases the transmission decisions on comparisons between the parameter values of the present frame and the last transmitted frame. The second, which is based on a functional perceptual model of speech, compares the parameter values of all the frames that lie in the interval between the present frame and the last transmitted frame against a linear model of parameter variation over that interval. Also considered is the application of VFR transmission to the design of narrow-band LPC speech coders with average bit rates of 2000-2400 bits/s.
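
    The first VFR approach (compare the present frame against the last transmitted frame) can be sketched as follows; the threshold value and frame data are illustrative:

```python
def vfr_transmit(frames, threshold=0.1):
    """First VFR approach: transmit a frame only when some parameter
    has changed by more than `threshold` since the last transmitted
    frame. Returns the indices of transmitted frames."""
    sent = [0]                      # always send the first frame
    last = frames[0]
    for i, frame in enumerate(frames[1:], start=1):
        change = max(abs(a - b) for a, b in zip(frame, last))
        if change > threshold:
            sent.append(i)
            last = frame
    return sent

# Slowly varying parameters -> only a few frames need transmission.
frames = [(0.00, 1.0), (0.02, 1.0), (0.15, 1.0), (0.16, 1.0), (0.40, 1.2)]
print(vfr_transmit(frames))   # → [0, 2, 4]
```

    The receiver interpolates between transmitted frames, which is why the average bit rate drops without a large loss in quality.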

  2. Isoelectronic studies of the 5s/sup 2/ /sup 1/S/sub 0/-5s5p/sup 1,3/P/sub J/ intervals in the Cd sequence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, L.J.

    1986-02-01

    The 5s/sup 2/ /sup 1/S/sub 0/-5s5p/sup 1,3/P/sub J/ energy intervals in the Cd isoelectronic sequence have been investigated through a semiempirical systematization of recent measurements and through the performance of ab initio multiconfiguration Dirac-Fock calculations. Screening-parameter reductions of the spin-orbit and exchange energies both for the observed data and for the theoretically computed values establish the existence of empirical linearities similar to those exploited earlier for the Be, Mg, and Zn sequences. This permits extrapolative isoelectronic predictions of the relative energies of the 5s5p levels, which can be connected to 5s/sup 2/ using intersinglet intervals obtained from empirically corrected ab initio calculations. These linearities have also been examined homologously for the Zn, Cd, and Hg sequences, and common relationships have been found that accurately describe all three of these sequences.

  3. Active fault tolerant control based on interval type-2 fuzzy sliding mode controller and non linear adaptive observer for 3-DOF laboratory helicopter.

    PubMed

    Zeghlache, Samir; Benslimane, Tarak; Bouguerra, Abderrahmen

    2017-11-01

    In this paper, a robust controller for three-degree-of-freedom (3-DOF) helicopter control is proposed in the presence of actuator and sensor faults. For this purpose, the interval type-2 fuzzy logic control approach (IT2FLC) and the sliding mode control (SMC) technique are used to design a controller, named the active fault-tolerant interval type-2 fuzzy sliding mode controller (AFTIT2FSMC), based on a non-linear adaptive observer that estimates and detects the system faults for each subsystem of the 3-DOF helicopter. The proposed control scheme avoids difficult modeling, attenuates the chattering effect of the SMC, and reduces the number of rules of the fuzzy controller. Exponential stability of the closed loop is guaranteed by using the Lyapunov method. The simulation results show that the AFTIT2FSMC can greatly alleviate the chattering effect, providing good tracking performance even in the presence of actuator and sensor faults. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  4. Frequency analysis via the method of moment functionals

    NASA Technical Reports Server (NTRS)

    Pearson, A. E.; Pan, J. Q.

    1990-01-01

    Several variants are presented of a linear-in-parameters least squares formulation for determining the transfer function of a stable linear system at specified frequencies given a finite set of Fourier series coefficients calculated from transient nonstationary input-output data. The basis of the technique is Shinbrot's classical method of moment functionals using complex Fourier based modulating functions to convert a differential equation model on a finite time interval into an algebraic equation which depends linearly on frequency-related parameters.
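
    A minimal illustration of the modulating-function idea: multiplying a differential equation by a function that vanishes at both ends of the interval and integrating by parts removes the need to differentiate the data, leaving an algebraic equation that is linear in the unknown parameter. The sketch below uses a polynomial modulating function on a first-order system rather than the paper's complex Fourier modulating functions; all numbers are illustrative:

```python
import math

a_true, T, N = 2.0, 2.0, 2000
dt = T / N
t = [i * dt for i in range(N + 1)]
x = [0.5 * (1 - math.exp(-a_true * ti)) for ti in t]   # x' = -a x + u, u = 1
u = [1.0] * (N + 1)

phi  = [ti**2 * (T - ti)**2 for ti in t]               # vanishes at 0 and T
dphi = [2*ti*(T - ti)**2 - 2*ti**2*(T - ti) for ti in t]

def trapz(y):
    """Trapezoidal integration on the uniform grid."""
    return dt * (sum(y) - 0.5 * (y[0] + y[-1]))

# Multiply x' + a x = u by phi and integrate by parts: since phi
# vanishes at both endpoints, int(phi x') = -int(phi' x), hence
#   a int(phi x) = int(phi u) + int(phi' x)   -- no derivative of the data.
num = trapz([p * ui for p, ui in zip(phi, u)]) \
    + trapz([dp * xi for dp, xi in zip(dphi, x)])
a_est = num / trapz([p * xi for p, xi in zip(phi, x)])
print(a_est)   # ≈ 2.0
```

    With several modulating functions (e.g. at different frequencies), the same trick yields the linear-in-parameters least squares systems described in the abstract.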

  5. Participation in fitness-related activities of an incentive-based health promotion program and hospital costs: a retrospective longitudinal study.

    PubMed

    Patel, Deepak; Lambert, Estelle V; da Silva, Roseanne; Greyling, Mike; Kolbe-Alexander, Tracy; Noach, Adam; Conradie, Jaco; Nossel, Craig; Borresen, Jill; Gaziano, Thomas

    2011-01-01

    A retrospective, longitudinal study examined changes in participation in fitness-related activities and hospital claims over 5 years amongst members of an incentivized health promotion program offered by a private health insurer. A 3-year retrospective observational analysis measuring gym visits and participation in documented fitness-related activities, probability of hospital admission, and associated costs of admission. A South African private health plan, Discovery Health, and the Vitality health promotion program. 304,054 adult members of the Discovery medical plan, 192,467 of whom were registered for the health promotion program and 111,587 of whom were not. Members were incentivized for fitness-related activities on the basis of the frequency of gym visits. Changes in electronically documented gym visits and registered participation in fitness-related activities over 3 years, and measures of association between changes in participation (years 1-3) and subsequent probability and costs of hospital admission (years 4-5). Hospital admissions and associated costs are based on claims extracted from the health insurer database. The probability of a claim was modeled using linear logistic regression, and the costs of claims were examined using general linear models. Propensity scores were estimated and included age, gender, registration for chronic disease benefits, plan type, and the presence of a claim during the transition period, and these were used as covariates in the final model. There was a significant decrease in the prevalence of inactive members (76% to 68%) over 5 years. Members who remained highly active (years 1-3) had a lower probability (p < .05) of hospital admission in years 4 to 5 (20.7%) compared with those who remained inactive (22.2%). The odds of admission were 13% lower for two additional gym visits per week (odds ratio, .87; 95% confidence interval [CI], .801-.949). 
We observed an increase in fitness-related activities over time amongst members of this incentive-based health promotion program, which was associated with a lower probability of hospital admission and lower hospital costs in the subsequent 2 years. Copyright © 2011 by American Journal of Health Promotion, Inc.
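
    As a quick arithmetic check on the reported effect size: an odds ratio of 0.87 for two additional gym visits per week implies a per-visit log-odds coefficient of ln(0.87)/2, and odds ratios for other increments follow by exponentiation (the +4-visit figure below is our extrapolation, not a result from the study):

```python
import math

or_2 = 0.87                         # reported OR for +2 gym visits/week
beta = math.log(or_2) / 2           # implied per-visit log-odds coefficient
or_4 = math.exp(4 * beta)           # OR implied for +4 visits/week = 0.87**2
print(round(beta, 4), round(or_4, 4))
```

    The same transform applies to the confidence limits, since log-odds are linear in the covariate.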

  6. 75 FR 39593 - Self-Regulatory Organizations; NASDAQ OMX PHLX, Inc.; Order Granting Approval of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-09

    ... stocks on which option series may be listed at $1 strike price intervals. To be eligible for inclusion in..., 2003) (SR-Phlx-2002-55) (approval of pilot program). The Strike Program was then extended several times... option series with $1 strike price intervals for any class selected for the program, except as...

  7. Variance approach for multi-objective linear programming with fuzzy random of objective function coefficients

    NASA Astrophysics Data System (ADS)

    Indarsih, Indrati, Ch. Rini

    2016-02-01

    In this paper, we define the variance of fuzzy random variables through alpha levels. We present a theorem establishing that the variance of a fuzzy random variable is a fuzzy number. We consider a multi-objective linear programming (MOLP) problem with fuzzy random objective function coefficients and solve it by a variance approach. The approach transforms the MOLP problem with fuzzy random objective function coefficients into an MOLP problem with fuzzy objective function coefficients. By weighted methods, we obtain a linear program with fuzzy coefficients, which we solve by the simplex method for fuzzy linear programming.
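
    The weighted-methods step (collapsing several objectives into one before solving a linear program) can be sketched on a tiny crisp, non-fuzzy example. The brute-force vertex enumeration below stands in for the simplex method, and the problem data are invented:

```python
from itertools import combinations

# Tiny bi-objective LP: maximize f1 = 3x + y and f2 = x + 2y
# subject to x + y <= 4, x <= 3, y <= 3, x >= 0, y >= 0.
cons = [(1, 1, 4), (1, 0, 3), (0, 1, 3), (-1, 0, 0), (0, -1, 0)]  # a x + b y <= c

def vertices(cons, tol=1e-9):
    """Enumerate feasible intersection points of constraint pairs."""
    pts = []
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < tol:
            continue                      # parallel constraints
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + tol for a, b, c in cons):
            pts.append((x, y))
    return pts

def weighted_sum(w1, w2):
    """Scalarize the two objectives with weights and pick the best vertex
    (an LP optimum always lies at a vertex of the feasible region)."""
    return max(vertices(cons), key=lambda p: w1 * (3*p[0] + p[1]) + w2 * (p[0] + 2*p[1]))

print(weighted_sum(1.0, 0.0), weighted_sum(0.0, 1.0))
```

    Sweeping the weights traces out (supported) Pareto-optimal solutions; the paper performs this step after defuzzifying the coefficients via the variance approach.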

  8. Genetic programming over context-free languages with linear constraints for the knapsack problem: first results.

    PubMed

    Bruhn, Peter; Geyer-Schulz, Andreas

    2002-01-01

    In this paper, we introduce genetic programming over context-free languages with linear constraints for combinatorial optimization, apply this method to several variants of the multidimensional knapsack problem, and discuss its performance relative to Michalewicz's genetic algorithm with penalty functions. With respect to Michalewicz's approach, we demonstrate that genetic programming over context-free languages with linear constraints improves convergence. A final result is that genetic programming over context-free languages with linear constraints is ideally suited to modeling complementarities between items in a knapsack problem: The more complementarities in the problem, the stronger the performance in comparison to its competitors.

  9. Chronic and acute inspiratory muscle loading augment the effect of a 6-week interval program on tolerance of high-intensity intermittent bouts of running.

    PubMed

    Tong, Tom K; Fu, Frank H; Eston, Roger; Chung, Pak-Kwong; Quach, Binh; Lu, Kui

    2010-11-01

    This study examined the hypothesis that chronic (training) and acute (warm-up) loaded ventilatory activities applied to the inspiratory muscles (IM) in an integrated manner would augment the training volume of an interval running program. This in turn would result in additional improvement in the maximum performance of the Yo-Yo intermittent recovery test in comparison with interval training alone. Eighteen male nonprofessional athletes were allocated to either an inspiratory muscle loading (IML) group or control group. Both groups participated in a 6-week interval running program consisting of 3-4 workouts (1-3 sets of various repetitions of selected distance [100-2,400 m] per workout) per week. For the IML group, 4-week IM training (30 inspiratory efforts at 50% maximal static inspiratory pressure [P0] per set, 2 sets·d-1, 6 d·wk-1) was applied before the interval program. Specific IM warm-up (2 sets of 30 inspiratory efforts at 40% P0) was performed before each workout of the program. For the control group, no IML was applied. In comparison with the control group, the interval training volume as indicated by the repeatability of running bouts at high intensity was approximately 27% greater in the IML group. A greater increase in the maximum performance of the Yo-Yo test (control: 16.9 ± 5.5%; IML: 30.7 ± 4.7% baseline value) was also observed after training. The enhanced exercise performance was partly attributable to the greater reductions in the sensation of breathlessness and whole-body metabolic stress during the Yo-Yo test. These findings show that the combination of chronic and acute IML into a high-intensity interval running program is a beneficial training strategy for enhancing the tolerance to high-intensity intermittent bouts of running.

  10. Energy expenditure estimation during daily military routine with body-fixed sensors.

    PubMed

    Wyss, Thomas; Mäder, Urs

    2011-05-01

    The purpose of this study was to develop and validate an algorithm for estimating energy expenditure during the daily military routine on the basis of data collected using body-fixed sensors. First, 8 volunteers completed isolated physical activities according to an established protocol, and the resulting data were used to develop activity-class-specific multiple linear regressions for physical activity energy expenditure on the basis of hip acceleration, heart rate, and body mass as independent variables. Second, the validity of these linear regressions was tested during the daily military routine using indirect calorimetry (n = 12). Volunteers' mean estimated energy expenditure did not significantly differ from the energy expenditure measured with indirect calorimetry (p = 0.898, 95% confidence interval = -1.97 to 1.75 kJ/min). We conclude that the developed activity-class-specific multiple linear regressions applied to the acceleration and heart rate data allow estimation of energy expenditure in 1-minute intervals during daily military routine, with accuracy equal to indirect calorimetry.
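
    An activity-class-specific multiple linear regression of the kind described (energy expenditure on hip acceleration, heart rate, and body mass) can be sketched with ordinary least squares on synthetic data; the coefficients and units below are assumed for illustration and are not the study's values:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_class_model(X, y):
    """Least-squares fit of y ~= b0 + b1*acc + b2*hr + b3*mass."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Synthetic data for one hypothetical activity class; EE in kJ/min.
n = 500
X = np.column_stack([
    rng.uniform(0.1, 2.0, n),    # mean hip acceleration (g)
    rng.uniform(70, 160, n),     # heart rate (bpm)
    rng.uniform(55, 95, n),      # body mass (kg)
])
true = np.array([1.0, 6.0, 0.15, 0.08])          # assumed coefficients
y = true[0] + X @ true[1:] + rng.normal(0, 0.5, n)

coef = fit_class_model(X, y)
print(np.round(coef, 2))
```

    In the study's scheme, one such regression is fitted per activity class, and a classifier on the sensor data selects which regression to apply in each 1-minute interval.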

  11. Automated Dynamic Demand Response Implementation on a Micro-grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuppannagari, Sanmukh R.; Kannan, Rajgopal; Chelmis, Charalampos

    In this paper, we describe a system for real-time automated Dynamic and Sustainable Demand Response with sparse data consumption prediction implemented on the University of Southern California campus microgrid. Supply side approaches to resolving energy supply-load imbalance do not work at high levels of renewable energy penetration. Dynamic Demand Response (D2R) is a widely used demand-side technique to dynamically adjust electricity consumption during peak load periods. Our D2R system consists of accurate machine learning based energy consumption forecasting models that work with sparse data coupled with fast and sustainable load curtailment optimization algorithms that provide the ability to dynamically adapt to changing supply-load imbalances in near real-time. Our Sustainable DR (SDR) algorithms attempt to distribute customer curtailment evenly across sub-intervals during a DR event and avoid expensive demand peaks during a few sub-intervals. They also ensure that each customer is penalized fairly in order to achieve the targeted curtailment. We develop near linear-time constant-factor approximation algorithms along with Polynomial Time Approximation Schemes (PTAS) for SDR curtailment that minimize the curtailment error, defined as the difference between the target and achieved curtailment values. Our SDR curtailment problem is formulated as an Integer Linear Program that optimally matches customers to curtailment strategies during a DR event while also explicitly accounting for customer strategy switching overhead as a constraint. We demonstrate the results of our D2R system using real data from experiments performed on the USC smartgrid and show that 1) our prediction algorithms can very accurately predict energy consumption even with noisy or missing data and 2) our curtailment algorithms deliver DR with extremely low curtailment errors in the 0.01-0.05 kWh range.
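
    The core matching problem (assign each customer one curtailment strategy so that the achieved curtailment is as close as possible to the target) can be shown with a toy exhaustive search. This ignores the paper's switching-overhead constraint and its approximation algorithms, and the numbers are invented:

```python
from itertools import product

def best_assignment(curtail, target):
    """Exhaustively match each customer to one curtailment strategy,
    minimizing |target - total achieved curtailment| (kWh).
    curtail[i][j] = curtailment if customer i runs strategy j."""
    best = None
    for choice in product(*(range(len(row)) for row in curtail)):
        achieved = sum(row[j] for row, j in zip(curtail, choice))
        err = abs(target - achieved)
        if best is None or err < best[0]:
            best = (err, choice, achieved)
    return best

# Three customers, three strategies each (strategy 0 = "do nothing").
curtail = [[0.0, 0.4, 0.9],
           [0.0, 0.3, 0.7],
           [0.0, 0.5, 1.1]]
err, choice, achieved = best_assignment(curtail, target=1.5)
print(err, choice, achieved)
```

    The exhaustive search is exponential in the number of customers, which is exactly why the paper develops ILP formulations and polynomial-time approximation schemes instead.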

  12. Linear-Algebra Programs

    NASA Technical Reports Server (NTRS)

    Lawson, C. L.; Krogh, F. T.; Gold, S. S.; Kincaid, D. R.; Sullivan, J.; Williams, E.; Hanson, R. J.; Haskell, K.; Dongarra, J.; Moler, C. B.

    1982-01-01

    The Basic Linear Algebra Subprograms (BLAS) library is a collection of 38 FORTRAN-callable routines for performing basic operations of numerical linear algebra. The BLAS library is a portable and efficient source of basic operations for designers of programs involving linear algebraic computations. The library is supplied in portable FORTRAN and Assembler code versions for IBM 370, UNIVAC 1100, and CDC 6000 series computers.
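
    Two of the BLAS Level-1 operations (AXPY and DOT) can be sketched in pure Python to show what the interface computes; the real library supplies optimized FORTRAN and assembler implementations of these routines:

```python
def axpy(alpha, x, y):
    """BLAS *AXPY: y <- alpha*x + y (returned as a new list here,
    whereas the FORTRAN routine updates y in place)."""
    return [alpha * xi + yi for xi, yi in zip(x, y)]

def dot(x, y):
    """BLAS *DOT: inner product of x and y."""
    return sum(xi * yi for xi, yi in zip(x, y))

x = [1.0, 2.0, 3.0]
y = [4.0, 5.0, 6.0]
print(axpy(2.0, x, y))   # → [6.0, 9.0, 12.0]
print(dot(x, y))         # → 32.0
```

    Building higher-level algorithms on these primitives is what lets a program inherit machine-specific performance simply by linking a tuned BLAS.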

  13. Interval stability for complex systems

    NASA Astrophysics Data System (ADS)

    Klinshov, Vladimir V.; Kirillov, Sergey; Kurths, Jürgen; Nekorkin, Vladimir I.

    2018-04-01

    Stability of dynamical systems against strong perturbations is an important problem of nonlinear dynamics relevant to many applications in various areas. Here, we develop a novel concept of interval stability, referring to the behavior of the perturbed system during a finite time interval. Based on this concept, we suggest new measures of stability, namely interval basin stability (IBS) and interval stability threshold (IST). IBS characterizes the likelihood that the perturbed system returns to the stable regime (attractor) in a given time. IST provides the minimal magnitude of the perturbation capable of disrupting the stable regime for a given interval of time. The suggested measures provide important information about the system's susceptibility to external perturbations, which may be useful for practical applications. Moreover, from a theoretical viewpoint the interval stability measures are shown to bridge the gap between linear and asymptotic stability. We also suggest numerical algorithms for quantification of the interval stability characteristics and demonstrate their potential for several dynamical systems of various kinds, such as power grids and neural networks.
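
    An interval-basin-stability-style estimate can be sketched by Monte Carlo: perturb the stable fixed point at random, integrate for the prescribed time, and count the fraction of runs that have returned. The damped-pendulum model, thresholds, and integrator below are our illustrative choices, not the authors' algorithms:

```python
import math, random

def simulate(theta, omega, T=20.0, dt=0.01, damping=0.5):
    """Euler-integrate a damped pendulum from (theta, omega)."""
    for _ in range(int(T / dt)):
        dtheta = omega
        domega = -damping * omega - math.sin(theta)
        theta += dt * dtheta
        omega += dt * domega
    return theta, omega

def interval_basin_stability(n=200, mag=0.5, T=20.0, seed=7):
    """Fraction of random perturbations (components up to `mag`) from
    which the state is back near the fixed point (0, 0) within time T."""
    random.seed(seed)
    returned = 0
    for _ in range(n):
        th = random.uniform(-mag, mag)
        om = random.uniform(-mag, mag)
        thT, omT = simulate(th, om, T=T)
        if math.hypot(thT, omT) < 0.05:
            returned += 1
    return returned / n

ibs = interval_basin_stability()
print(ibs)
```

    Bisecting on the perturbation magnitude `mag` until returns start to fail would give a crude estimate of the companion threshold measure (IST).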

  14. Feasibility, Safety, and Preliminary Effectiveness of a Home-Based Self-Managed High-Intensity Interval Training Program Offered to Long-Term Manual Wheelchair Users.

    PubMed

    Gauthier, Cindy; Brosseau, Rachel; Hicks, Audrey L; Gagnon, Dany H

    2018-01-01

    To investigate and compare the feasibility, safety, and preliminary effectiveness of home-based self-managed manual wheelchair high-intensity interval training (HIIT) and moderate-intensity continuous training (MICT) programs. Eleven manual wheelchair users were randomly assigned to the HIIT group (n = 6) or the MICT group (n = 5). Both six-week programs consisted of three 40-minute propulsion training sessions per week. The HIIT group alternated between 30 s high-intensity intervals and 60 s low-intensity intervals, whereas the MICT group maintained a constant moderate intensity. Cardiorespiratory fitness, upper limb strength, and shoulder pain were measured before and after the programs. Participants completed a questionnaire on the programs that explored general areas of feasibility. The answers to the questionnaire demonstrated that both training programs were feasible in the community. No severe adverse events occurred, although some participants experienced increased shoulder pain during HIIT. Neither program yielded a significant change in cardiorespiratory fitness or upper limb strength. However, both groups reported moderate to significant subjective improvement. Home-based wheelchair HIIT appears feasible and safe, although potential development of shoulder pain remains a concern and should be addressed with a future preventive shoulder exercise program. Some recommendations have been proposed for a larger study aiming to strengthen evidence regarding the feasibility, safety, and effectiveness of HIIT.

  15. Bursting as a source of non-linear determinism in the firing patterns of nigral dopamine neurons.

    PubMed

    Jeong, Jaeseung; Shi, Wei-Xing; Hoffman, Ralph; Oh, Jihoon; Gore, John C; Bunney, Benjamin S; Peterson, Bradley S

    2012-11-01

    Nigral dopamine (DA) neurons in vivo exhibit complex firing patterns consisting of tonic single-spikes and phasic bursts that encode information for certain types of reward-related learning and behavior. Non-linear dynamical analysis has previously demonstrated the presence of a non-linear deterministic structure in complex firing patterns of DA neurons, yet the origin of this non-linear determinism remains unknown. In this study, we hypothesized that bursting activity is the primary source of non-linear determinism in the firing patterns of DA neurons. To test this hypothesis, we investigated the dimension complexity of inter-spike interval data recorded in vivo from bursting and non-bursting DA neurons in the chloral hydrate-anesthetized rat substantia nigra. We found that bursting DA neurons exhibited non-linear determinism in their firing patterns, whereas non-bursting DA neurons showed truly stochastic firing patterns. Determinism was also detected in the isolated burst and inter-burst interval data extracted from firing patterns of bursting neurons. Moreover, less bursting DA neurons in halothane-anesthetized rats exhibited higher dimensional spiking dynamics than did more bursting DA neurons in chloral hydrate-anesthetized rats. These results strongly indicate that bursting activity is the main source of low-dimensional, non-linear determinism in the firing patterns of DA neurons. This finding furthermore suggests that bursts are the likely carriers of meaningful information in the firing activities of DA neurons. © 2012 The Authors. European Journal of Neuroscience © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  16. Reduced-Size Integer Linear Programming Models for String Selection Problems: Application to the Farthest String Problem.

    PubMed

    Zörnig, Peter

    2015-08-01

    We present integer programming models for some variants of the farthest string problem. The number of variables and constraints is substantially less than that of the integer linear programming models known in the literature. Moreover, the solution of the linear programming relaxation contains only a small proportion of noninteger values, which considerably simplifies the rounding process. Numerical tests have shown excellent results, especially when a small set of long sequences is given.
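
    For very small instances, the farthest string objective (maximize the minimum Hamming distance to the given strings) can be checked by brute force, which makes a useful baseline for validating integer programming models; the example data are invented:

```python
from itertools import product

def hamming(a, b):
    """Number of positions where a and b differ."""
    return sum(x != y for x, y in zip(a, b))

def farthest_string(strings, alphabet="01"):
    """Brute force over all strings of the same length: return one
    maximizing the minimum Hamming distance to the given strings.
    Exponential in the string length -- only viable for tiny inputs."""
    L = len(strings[0])
    return max(("".join(c) for c in product(alphabet, repeat=L)),
               key=lambda s: min(hamming(s, t) for t in strings))

strings = ["0000", "0011", "1100"]
s = farthest_string(strings)
print(s, min(hamming(s, t) for t in strings))
```

    The ILP formulations replace this exponential search with binary variables per position and symbol, which is what makes long sequences tractable.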

  17. Caribou distribution during the post-calving period in relation to infrastructure in the Prudhoe Bay oil field, Alaska

    USGS Publications Warehouse

    Cronin, Matthew A.; Amstrup, Steven C.; Durner, George M.; Noel, Lynn E.; McDonald, Trent L.; Ballard, Warren B.

    1998-01-01

    There is concern that caribou (Rangifer tarandus) may avoid roads and facilities (i.e., infrastructure) in the Prudhoe Bay oil field (PBOF) in northern Alaska, and that this avoidance can have negative effects on the animals. We quantified the relationship between caribou distribution and PBOF infrastructure during the post-calving period (mid-June to mid-August) with aerial surveys from 1990 to 1995. We conducted four to eight surveys per year with complete coverage of the PBOF. We identified active oil field infrastructure and used a geographic information system (GIS) to construct ten 1 km wide concentric intervals surrounding the infrastructure. We tested whether caribou distribution is related to distance from infrastructure with a chi-squared habitat utilization-availability analysis and log-linear regression. We considered bulls, calves, and total caribou of all sex/age classes separately. The habitat utilization-availability analysis indicated there was no consistent trend of attraction to or avoidance of infrastructure. Caribou frequently were more abundant than expected in the intervals close to infrastructure, and this trend was more pronounced for bulls and for total caribou of all sex/age classes than for calves. Log-linear regressions (with Poisson error structure) of caribou numbers against distance from infrastructure were also performed, with and without combining data into the 1 km distance intervals. The analysis without intervals revealed no relationship between caribou distribution and distance from oil field infrastructure, or between caribou distribution and Julian date, year, or distance from the Beaufort Sea coast. The log-linear regression with caribou combined into distance intervals showed the density of bulls and total caribou of all sex/age classes declined with distance from infrastructure. 
Our results indicate that during the post-calving period: 1) caribou distribution is largely unrelated to distance from infrastructure; 2) caribou regularly use habitats in the PBOF; 3) caribou often occur close to infrastructure; and 4) caribou do not appear to avoid oil field infrastructure.

  18. A Block-LU Update for Large-Scale Linear Programming

    DTIC Science & Technology

    1990-01-01

    linear programming problems. Results are given from runs on the Cray Y-MP. 1. Introduction. We wish to use the simplex method [Dan63] to solve the...standard linear program: minimize c^T x subject to Ax = b, l ≤ x ≤ u, where A is an m by n matrix and c, x, l, u, and b are of appropriate dimension. The simplex...the identity matrix. The basis is used to solve for the search direction y and the dual variables π in the following linear systems: B_k y = a_q (1.2) and

  19. Cutting Force Predication Based on Integration of Symmetric Fuzzy Number and Finite Element Method

    PubMed Central

    Wang, Zhanli; Hu, Yanjuan; Wang, Yao; Dong, Chao; Pang, Zaixiang

    2014-01-01

    In the turning process, cutting is subject to uncertainty caused by the disturbance of random factors. To determine the uncertain range of the cutting force, symmetric fuzzy numbers are integrated with the finite element method (FEM) for cutting force prediction. The method uses symmetric fuzzy numbers to establish a fuzzy function between the cutting force and three factors, and obtains the uncertain interval of the cutting force by linear programming. At the same time, the change of cutting force over time was simulated directly using a thermal-mechanical coupling FEM; the nonuniform stress field and temperature distribution of the workpiece, tool, and chip under thermal-mechanical coupling were also simulated. The experimental results show that the method is effective for uncertain prediction of cutting force. PMID:24790556

  20. Linear System of Equations, Matrix Inversion, and Linear Programming Using MS Excel

    ERIC Educational Resources Information Center

    El-Gebeily, M.; Yushau, B.

    2008-01-01

    In this note, we demonstrate with illustrations two different ways that MS Excel can be used to solve Linear Systems of Equations, Linear Programming Problems, and Matrix Inversion Problems. The advantage of using MS Excel is its availability and transparency (the user is responsible for most of the details of how a problem is solved). Further, we…

  1. High profile students’ growth of mathematical understanding in solving linier programing problems

    NASA Astrophysics Data System (ADS)

    Utomo; Kusmayadi, TA; Pramudya, I.

    2018-04-01

    Linear programming plays an important role in everyday life. It is taught at the senior high school and college levels and is applied in economics, transportation, the military, and elsewhere; mastering it is therefore a useful life skill. This research describes the growth of mathematical understanding in solving linear programming problems, based on the Pirie-Kieren model of the growth of understanding, using a qualitative approach. The subjects were two grade XI students in Salatiga city with high profiles. The researcher chose the subjects based on the growth of understanding shown in a classroom test; their marks on the prerequisite material were ≥ 75. Both subjects were interviewed to examine their growth of mathematical understanding in solving linear programming problems. The findings show that the subjects often folded back to the primitive knowing level before moving forward to the next level, because their primitive understanding was not comprehensive.

  2. Can Linear Superiorization Be Useful for Linear Optimization Problems?

    PubMed Central

    Censor, Yair

    2017-01-01

    Linear superiorization considers linear programming problems but instead of attempting to solve them with linear optimization methods it employs perturbation resilient feasibility-seeking algorithms and steers them toward reduced (not necessarily minimal) target function values. The two questions that we set out to explore experimentally are (i) Does linear superiorization provide a feasible point whose linear target function value is lower than that obtained by running the same feasibility-seeking algorithm without superiorization under identical conditions? and (ii) How does linear superiorization fare in comparison with the Simplex method for solving linear programming problems? Based on our computational experiments presented here, the answers to these two questions are: “yes” and “very well”, respectively. PMID:29335660
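
    As a rough illustration of the idea (not the authors' code: the 2-D example, the projection operator, and the geometric step-size schedule below are all assumptions), a feasibility-seeking sweep of halfspace projections can be superiorized by interleaving shrinking steps along the negative gradient of the linear target:

```python
def project_halfspace(x, a, b):
    """Orthogonal projection of x onto the halfspace {z : a.z <= b}."""
    viol = sum(ai * xi for ai, xi in zip(a, x)) - b
    if viol <= 0:
        return x
    n2 = sum(ai * ai for ai in a)
    return [xi - viol / n2 * ai for xi, ai in zip(x, a)]

def superiorized_feasibility(x, halfspaces, c=None, sweeps=50, beta=1.0, q=0.9):
    """One POCS sweep per iteration; if a linear target vector c is
    given, take a shrinking step along -c before each sweep
    (superiorization). Projections restore feasibility-seeking."""
    for _ in range(sweeps):
        if c is not None:
            x = [xi - beta * ci for xi, ci in zip(x, c)]
            beta *= q          # perturbations must be summable
        for a, b in halfspaces:
            x = project_halfspace(x, a, b)
    return x
```

    On the toy feasible set {x ≥ 0, y ≥ 0, x + y ≤ 1} with target c = (1, 1), both runs end feasible, but the superiorized run reaches a lower target value, which is the behavior question (i) asks about.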

  3. General purpose computer programs for numerically analyzing linear ac electrical and electronic circuits for steady-state conditions

    NASA Technical Reports Server (NTRS)

    Egebrecht, R. A.; Thorbjornsen, A. R.

    1967-01-01

    Digital computer programs determine steady-state performance characteristics of active and passive linear circuits. The ac analysis program solves the basic circuit parameters. The compiler program solves these circuit parameters and in addition provides a more versatile program by allowing the user to perform mathematical and logical operations.

  4. A sequential solution for anisotropic total variation image denoising with interval constraints

    NASA Astrophysics Data System (ADS)

    Xu, Jingyan; Noo, Frédéric

    2017-09-01

    We show that two problems involving the anisotropic total variation (TV) and interval constraints on the unknown variables admit, under some conditions, a simple sequential solution. Problem 1 is a constrained TV penalized image denoising problem; problem 2 is a constrained fused lasso signal approximator. The sequential solution entails first finding the solution to the unconstrained problem, and then applying a thresholding to satisfy the constraints. If the interval constraints are uniform, this sequential solution solves problem 1. If the interval constraints furthermore contain zero, the sequential solution solves problem 2. Here uniform interval constraints refer to all unknowns being constrained to the same interval. A typical example of application is image denoising in x-ray CT, where the image intensities are non-negative as they physically represent the linear attenuation coefficient in the patient body. Our results are simple yet appear to be previously unreported; we establish them using the Karush-Kuhn-Tucker conditions for constrained convex optimization.

  5. Task Space Angular Velocity Blending for Real-Time Trajectory Generation

    NASA Technical Reports Server (NTRS)

    Volpe, Richard A. (Inventor)

    1997-01-01

    The invention is embodied in a method of controlling a robot manipulator moving toward a target frame F_0 with a global target velocity v_0, including a linear target velocity v_0 and an angular target velocity ω_0, so as to smoothly and continuously divert the robot manipulator to a subsequent frame F_1 by determining a global transition velocity v_1, the global transition velocity including a linear transition velocity v_1 and an angular transition velocity ω_1; defining a blend time interval 2τ_0 within which the global velocity of the robot manipulator is to be changed from the global target velocity v_0 to the global transition velocity v_1; and dividing the blend time interval 2τ_0 into discrete time segments Δt. During each one of the discrete time segments Δt of the blend interval 2τ_0, a blended global velocity v of the manipulator is computed as a blend of the global target velocity v_0 and the global transition velocity v_1, the blended global velocity v including a blended angular velocity ω and a blended linear velocity v, and the manipulator is then rotated by an incremental rotation corresponding to an integration of the blended angular velocity ω over one discrete time segment Δt.
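
    The blending step can be sketched numerically (a minimal sketch assuming a linear blend profile; the patent's exact blend function and frame bookkeeping are not given in this record, and all names here are illustrative):

```python
def blend(v0, v1, tau, t):
    """Blend a velocity vector from v0 to v1 across the window
    [-tau, tau]; a simple linear profile is assumed."""
    if t <= -tau:
        return list(v0)
    if t >= tau:
        return list(v1)
    s = (t + tau) / (2 * tau)   # 0 at start of blend, 1 at end
    return [a + s * (b - a) for a, b in zip(v0, v1)]

def integrate_rotation(omega0, omega1, tau, n):
    """Accumulate the incremental rotation over the blend window by
    summing the blended angular speed over n discrete segments dt
    (1-D angular component, midpoint sampling)."""
    dt = 2 * tau / n
    t = -tau
    angle = 0.0
    for _ in range(n):
        w = blend(omega0, omega1, tau, t + dt / 2)
        angle += w[0] * dt
        t += dt
    return angle
```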

  6. Iteration with Spreadsheets.

    ERIC Educational Resources Information Center

    Smith, Michael

    1990-01-01

    Presents several examples of the iteration method using computer spreadsheets. Examples included are simple iterative sequences and the solution of equations using the Newton-Raphson formula, linear interpolation, and interval bisection. (YP)
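
    The iterative schemes the article builds in spreadsheet cells can equally be sketched in Python (illustrative translations, not the article's worksheets):

```python
def bisect(f, a, b, tol=1e-10):
    """Interval bisection; assumes f(a) and f(b) have opposite signs."""
    fa = f(a)
    while b - a > tol:
        m = (a + b) / 2
        if fa * f(m) <= 0:   # root lies in [a, m]
            b = m
        else:                # root lies in [m, b]
            a, fa = m, f(m)
    return (a + b) / 2

def newton(f, df, x, steps=50):
    """Newton-Raphson iteration x <- x - f(x)/f'(x)."""
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x
```

    Both locate sqrt(2) as the positive root of x² − 2, the kind of exercise the spreadsheet examples target.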

  7. Linear Programming and Its Application to Pattern Recognition Problems

    NASA Technical Reports Server (NTRS)

    Omalley, M. J.

    1973-01-01

    Linear programming and linear-programming-like techniques as applied to pattern recognition problems are discussed. Three relatively recent research articles on such applications are summarized. The main results of each paper are described, indicating the theoretical tools needed to obtain them. A synopsis of the author's comments is presented with regard to the applicability or non-applicability of his methods to particular problems, including computational results wherever given.
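
    The core task those LP formulations address, finding a separating hyperplane for two pattern classes, can be illustrated with a perceptron, used here as a deliberate stand-in for the linear programming methods the survey reviews (both succeed when the classes are linearly separable):

```python
def perceptron(samples, labels, epochs=100):
    """Find weights w and bias b with sign(w.x + b) matching labels
    in {-1, +1}. Converges for linearly separable data."""
    dim = len(samples[0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        done = True
        for x, y in zip(samples, labels):
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                done = False
        if done:      # a full pass with no mistakes: separated
            break
    return w, b
```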

  8. Comparison of head impact location during games and practices in Division III men's lacrosse players.

    PubMed

    O'Day, Kathleen M; Koehling, Elizabeth M; Vollavanh, Lydia R; Bradney, Debbie; May, James M; Breedlove, Katherine M; Breedlove, Evan L; Blair, Price; Nauman, Eric A; Bowman, Thomas G

    2017-03-01

    Head impacts have been studied extensively in football, but little similar research has been conducted in men's lacrosse. It is important to understand the location and magnitude of head impacts during men's lacrosse to recognize the risk of head injury. Descriptive epidemiology study set on collegiate lacrosse fields. Eleven men's lacrosse players (age = 20.9 ± 1.13 years, mass = 83.91 ± 9.04 kg, height = 179.88 ± 5.99 cm) volunteered to participate. We applied X2 sensors behind the right ear of participants for games and practices. Sensors recorded data on linear and rotational accelerations and the location of head impacts. We calculated incidence rates per 1000 exposures with 95% confidence intervals for impact locations and compared the effect of impact location on linear and rotational accelerations with Kruskal-Wallis tests. We verified 167 head impacts (games = 112; practices = 55). During games, the incidence rate was 651.16 (95% confidence interval = 530.57-771.76). The high and low incidence rates for head impact locations during games were: side = 410.7 (95% confidence interval = 292.02-529.41) and top = 26.79 (95% confidence interval = 3.53-57.10). For games and practices combined, the impact locations did not significantly affect linear (χ²(3) = 6.69, P = 0.08) or rotational acceleration (χ²(3) = 6.34, P = 0.10). We suggest further research into the location of head impacts during games and practices. We also suggest player and coach education on head impacts as well as behavior modification in men's lacrosse athletes to reduce the incidence of impacts to the side of the head in an effort to reduce potential injury. Copyright © 2017 Elsevier Ltd. All rights reserved.
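
    The reported rates are consistent with a standard normal approximation for a Poisson count; a sketch follows (the paper's exact CI method is an assumption, and the exposure count of 172 used in the check below is inferred from the reported game rate, not stated in the record):

```python
from math import sqrt

def incidence_rate_ci(events, exposures, per=1000, z=1.96):
    """Incidence rate per `per` exposures with a normal-approximation
    95% CI for a Poisson count (one common choice of interval)."""
    rate = events / exposures * per
    half = z * sqrt(events) / exposures * per
    return rate, rate - half, rate + half
```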

  9. A Real-Time Safety and Quality Reporting System: Assessment of Clinical Data and Staff Participation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rahn, Douglas A.; Kim, Gwe-Ya; Mundt, Arno J.

    Purpose: To report on the use of an incident learning system in a radiation oncology clinic, along with a review of staff participation. Methods and Materials: On September 24, 2010, our department initiated an online real-time voluntary reporting system for safety issues, called the Radiation Oncology Quality Reporting System (ROQRS). We reviewed these reports from the program's inception through January 18, 2013 (2 years, 3 months, 25 days) to assess error reports (defined as both near-misses and incidents of inaccurate treatment). Results: During the study interval, there were 60,168 fractions of external beam radiation therapy and 955 brachytherapy procedures. There were 298 entries in the ROQRS system, among which 108 errors were reported. There were 31 patients with near-misses reported and 27 patients with incidents of inaccurate treatment reported. These incidents of inaccurate treatment occurred in 68 total treatment fractions (0.11% of treatments delivered during the study interval). None of these incidents of inaccurate treatment resulted in deviation from the prescription by 5% or more. A solution to the errors was documented in ROQRS in 65% of the cases. Errors occurred as repeated errors in 22% of the cases. A disproportionate number of the incidents of inaccurate treatment were due to improper patient setup at the linear accelerator (P<.001). Physician participation in ROQRS was nonexistent initially, but improved after an education program. Conclusions: Incident learning systems are a useful and practical means of improving safety and quality in patient care.

  10. Evaluation of electrical impedance ratio measurements in accuracy of electronic apex locators.

    PubMed

    Kim, Pil-Jong; Kim, Hong-Gee; Cho, Byeong-Hoon

    2015-05-01

    The aim of this paper was to evaluate, through a correlation analysis, the ratios of electrical impedance measurements reported in previous studies, in order to make explicit their contribution to the accuracy of electronic apex locators (EALs). The literature regarding electrical property measurements of EALs was screened using Medline and Embase. All data acquired were plotted to identify correlations between impedance and log-scaled frequency. The accuracy of the impedance ratio method used to detect the apical constriction (APC) in most EALs was evaluated using linear ramp function fitting. Changes of impedance ratios at various frequencies were evaluated for a variety of file positions. Among the ten papers selected in the search process, the first-order fits between log-scaled frequency and impedance all had negative slopes. When the model for the ratios was assumed to be a linear ramp function, the ratio values decreased as the file went deeper, and the average ratio values of the left and right horizontal zones were significantly different in 8 out of 9 studies. The APC was located within the interval of linear relation between the left and right horizontal zones of the linear ramp model. Using the ratio method, the APC was thus located within a linear interval. Therefore, using the ratio between electrical impedance measurements at different frequencies is a robust method for detection of the APC.

  11. Statistics of return intervals between long heartbeat intervals and their usability for online prediction of disorders

    NASA Astrophysics Data System (ADS)

    Bogachev, Mikhail I.; Kireenkov, Igor S.; Nifontov, Eugene M.; Bunde, Armin

    2009-06-01

    We study the statistics of return intervals between large heartbeat intervals (above a certain threshold Q) in 24 h records obtained from healthy subjects. We find that both the linear and the nonlinear long-term memory inherent in the heartbeat intervals lead to power-laws in the probability density function P_Q(r) of the return intervals. As a consequence, the probability W_Q(t; Δt) that at least one large heartbeat interval will occur within the next Δt heartbeat intervals, with an increasing elapsed number of intervals t after the last large heartbeat interval, follows a power-law. Based on these results, we suggest a method of obtaining a priori information about the occurrence of the next large heartbeat interval, and thus to predict it. We show explicitly that the proposed method, which exploits long-term memory, is superior to the conventional precursory pattern recognition technique, which focuses solely on short-term memory. We believe that our results can be straightforwardly extended to obtain more reliable predictions in other physiological signals like blood pressure, as well as in other complex records exhibiting multifractal behaviour, e.g. turbulent flow, precipitation, river flows and network traffic.
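
    The basic quantity studied, the return intervals r between successive exceedances of a threshold Q, is straightforward to extract from a series (a minimal sketch, not the authors' analysis code):

```python
def return_intervals(series, q):
    """Gaps between successive indices whose values exceed q
    (the return intervals r between large events)."""
    idx = [i for i, v in enumerate(series) if v > q]
    return [b - a for a, b in zip(idx, idx[1:])]
```

    The empirical distribution of these gaps is what the power-law form of P_Q(r) describes.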

  12. Two Computer Programs for the Statistical Evaluation of a Weighted Linear Composite.

    ERIC Educational Resources Information Center

    Sands, William A.

    1978-01-01

    Two computer programs (one batch, one interactive) are designed to provide statistics for a weighted linear combination of several component variables. Both programs provide mean, variance, standard deviation, and a validity coefficient. (Author/JKS)

  13. d'plus: A program to calculate accuracy and bias measures from detection and discrimination data.

    PubMed

    Macmillan, N A; Creelman, C D

    1997-01-01

    The program d'plus calculates accuracy (sensitivity) and response-bias parameters using Signal Detection Theory, Choice Theory, and 'nonparametric' models. It is appropriate for data from one-interval, two- and three-interval forced-choice, same-different, ABX, and oddity experimental paradigms.
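
    For the one-interval (yes/no) paradigm, the Gaussian-model sensitivity and bias measures can be sketched as follows (a minimal stand-in for the d'plus program, which also covers the other paradigms):

```python
from statistics import NormalDist

def dprime(hit_rate, fa_rate):
    """Yes/no sensitivity: d' = z(H) - z(F), with z the inverse
    standard-normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

def criterion(hit_rate, fa_rate):
    """Response bias: c = -(z(H) + z(F)) / 2; zero means unbiased."""
    z = NormalDist().inv_cdf
    return -(z(hit_rate) + z(fa_rate)) / 2
```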

  14. A ’Multiple Pivoting’ Algorithm for Goal-Interval Programming Formulations.

    DTIC Science & Technology

    1980-03-01

    Research Report CCS 355: A "MULTIPLE PIVOTING" ALGORITHM FOR GOAL-INTERVAL PROGRAMMING FORMULATIONS, by R. Armstrong*, A. Charnes*, W. Cook..., J. Godfrey***, March 1980. *The University of Texas at Austin; **York University, Downsview, Ontario, Canada; ***Washington, DC. This research was partly...areas. However, the main direction of goal programming research has been in formulating models instead of seeking procedures that would provide

  15. New Steering Strategies for the USNO Master Clocks

    DTIC Science & Technology

    1999-12-01

    1992. P. Koppang and R. Leland, "Linear quadratic stochastic control of atomic hydrogen masers," IEEE Trans. Ultrason., Ferroelect., Freq. Contr., vol. 46, pp. 517-522, May 1999. P. Koppang and R. Leland, "Steering of frequency standards by the use of linear quadratic gaussian control theory..." 31st Annual Precise Time and Time Interval (PTTI) Meeting. NEW STEERING STRATEGIES FOR THE USNO MASTER CLOCKS. Paul A. Koppang, Datum, Inc., Beverly, MA

  16. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
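
    The linear congruential recurrence the report tests can be sketched in a few lines (a generic illustration; the multiplier, increment, and modulus below are the classic ANSI C choices, not necessarily the parameters the report selects):

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Linear congruential generator: X_{n+1} = (a*X_n + c) mod m,
    yielding uniform variates in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m
```

    The quality of such a generator depends entirely on the choice of a, c, and m, which is what the RANCYCLE and ARITH helper programs assist with.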

  17. Note: Increasing dynamic range of digital-to-analog converter using a superconducting quantum interference device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakanishi, Masakazu, E-mail: m.nakanishi@aist.go.jp

    Responses of a superconducting quantum interference device (SQUID) are periodically dependent on the magnetic flux coupling to its superconducting ring, and the period is one flux quantum (Φ0 = h/2e, where h and e respectively denote Planck's constant and the elementary charge). Using this periodicity, we had previously proposed a first-generation digital-to-analog converter using a SQUID (SQUID DAC) with linear current output, the interval of which corresponded to Φ0. A modification that increases the dynamic range by interpolating within each interval is reported. Linearity of the interpolation was also based on the quantum periodicity. A SQUID DAC with a dynamic range of about 1.4 × 10^7 was created as a demonstration.
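
    The two-stage idea, coarse steps of one flux quantum plus fine interpolation within a quantum, can be modeled numerically (an illustrative abstraction only; the function and parameter names are assumptions, not the instrument's actual transfer characteristics):

```python
PHI0 = 2.067833848e-15  # magnetic flux quantum h/2e, in webers

def dac_level(code, fine_levels):
    """Split a digital code into a coarse part (whole flux quanta)
    and a fine part interpolated within one quantum, and return the
    corresponding flux level."""
    coarse, fine = divmod(code, fine_levels)
    return coarse * PHI0 + (fine / fine_levels) * PHI0
```

    With n fine levels per quantum, the usable number of distinct levels, and hence the dynamic range, grows by a factor of n over the quantum-step-only design.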

  18. Effect of Adding Interferential Current in an Exercise and Manual Therapy Program for Patients With Unilateral Shoulder Impingement Syndrome: A Randomized Clinical Trial.

    PubMed

    Gomes, Cid André Fidelis de Paula; Dibai-Filho, Almir Vieira; Moreira, William Arruda; Rivas, Shirley Quispe; Silva, Emanuela Dos Santos; Garrido, Ana Claudia Bogik

    The purpose of this study was to measure the additional effect of adding interferential current (IFC) to an exercise and manual therapy program for patients with unilateral shoulder impingement syndrome. Forty-five participants were randomly assigned to group 1 (exercise and manual therapy), group 2 (exercise and manual therapy + IFC), or group 3 (exercise and manual therapy + placebo ultrasound). Individuals participated in 16 treatment sessions, twice a week for 8 weeks. The primary outcome of the study was total score of the Shoulder Pain and Disability Index (SPADI). The secondary outcomes were the pain and disability subscales of SPADI, Numeric Rating Scale, and Pain-Related Self-Statement Scale. Adjusted between-group mean differences (MDs) and 95% confidence intervals (CIs) were calculated using linear mixed models. After 16 treatment sessions, statistically significant but not clinically important differences were identified in favor of the exercise and manual therapy program alone in the SPADI-total (group 1 vs group 2, MD 11.12 points, 95% CI 5.90-16.35; group 1 vs group 3, MD 13.43 points, 95% CI 8.21-18.65). Similar results were identified for secondary outcomes. The addition of IFC does not generate greater clinical effects in an exercise and manual therapy program for individuals with unilateral shoulder impingement syndrome. Copyright © 2018. Published by Elsevier Inc.

  19. Findings From the EASY Minds Cluster Randomized Controlled Trial: Evaluation of a Physical Activity Integration Program for Mathematics in Primary Schools.

    PubMed

    Riley, Nicholas; Lubans, David R; Holmes, Kathryn; Morgan, Philip J

    2016-02-01

    To evaluate the impact of a primary school-based physical activity (PA) integration program delivered by teachers on objectively measured PA and key educational outcomes. Ten classes from 8 Australian public schools were randomly allocated to treatment conditions. Teachers from the intervention group were taught to embed movement-based learning in their students' (n = 142) daily mathematics program in 3 lessons per week for 6 weeks. The control group (n = 98) continued its regular mathematics program. The primary outcome was accelerometer-determined PA across the school day. Linear mixed models were used to analyze treatment effects. Significant intervention effects were found for PA across the school day (adjusted mean difference 103 counts per minute [CPM], 95% confidence interval [CI], 36.5-169.7, P = .008). Intervention effects were also found for PA (168 CPM, 95% CI, 90.1-247.4, P = .008) and moderate-to-vigorous PA (2.6%, 95% CI, 0.9-4.4, P = .009) in mathematics lessons, sedentary time across the school day (-3.5%, 95% CI, -7.0 to -0.13, P = .044) and during mathematics (-8.2%, CI, -13.0 to -2.0, P = .010) and on-task behavior (13.8%, 95% CI, 4.0-23.6, P = .011)-but not for mathematics performance or attitude. Integrating movement across the primary mathematics syllabus is feasible and efficacious.

  20. Simplified large African carnivore density estimators from track indices.

    PubMed

    Winterbach, Christiaan W; Ferreira, Sam M; Funston, Paul J; Somers, Michael J

    2016-01-01

    The range, population size and trend of large carnivores are important parameters to assess their status globally and to plan conservation strategies. One can use linear models to assess population size and trends of large carnivores from track-based surveys on suitable substrates. The conventional approach of a linear model with intercept may not intercept at zero, but may fit the data better than a linear model through the origin. We assess whether a linear regression through the origin is more appropriate than a linear regression with intercept to model large African carnivore densities and track indices. We fitted simple linear regressions with intercept and simple linear regressions through the origin, and used the confidence interval for β in the linear model y = αx + β, the Standard Error of Estimate, Mean Squares Residual and Akaike Information Criterion to evaluate the models. The Lion on Clay and Low Density on Sand models with intercept were not significant (P > 0.05). The other four models with intercept and the six models through the origin were all significant (P < 0.05). The models using linear regression with intercept all included zero in the confidence interval for β, and the null hypothesis that β = 0 could not be rejected. All models showed that the linear model through the origin provided a better fit than the linear model with intercept, as indicated by the Standard Error of Estimate and Mean Square Residuals. The Akaike Information Criterion showed that linear models through the origin were better and that none of the linear models with intercept had substantial support. Our results showed that linear regression through the origin is justified over the more typical linear regression with intercept for all models we tested. A general model can be used to estimate large carnivore densities from track densities across species and study areas. 
The formula observed track density = 3.26 × carnivore density can be used to estimate densities of large African carnivores using track counts on sandy substrates in areas where carnivore densities are 0.27 carnivores/100 km² or higher. To improve the current models, we need independent data to validate the models and data to test for a non-linear relationship between track indices and true density at low densities.
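
    Fitting both competing models has a closed form (a sketch on synthetic data generated from the reported 3.26 slope, not the survey data):

```python
def fit_with_intercept(x, y):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

def fit_through_origin(x, y):
    """Least squares for y = a*x forced through (0, 0)."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
```

    The through-origin form encodes the biological constraint that zero carnivores leave zero tracks, which is the justification the study tests.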

  1. Accommodation of practical constraints by a linear programming jet select. [for Space Shuttle

    NASA Technical Reports Server (NTRS)

    Bergmann, E.; Weiler, P.

    1983-01-01

    An experimental spacecraft control system will be incorporated into the Space Shuttle flight software and exercised during a forthcoming mission to evaluate its performance and handling qualities. The control system incorporates a 'phase space' control law to generate rate change requests and a linear programming jet select to compute jet firings. Posed as a linear programming problem, jet selection must represent the rate change request as a linear combination of jet acceleration vectors where the coefficients are the jet firing times, while minimizing the fuel expended in satisfying that request. This problem is solved in real time using a revised Simplex algorithm. In order to implement the jet selection algorithm in the Shuttle flight control computer, it was modified to accommodate certain practical features of the Shuttle such as limited computer throughput, lengthy firing times, and a large number of control jets. To the authors' knowledge, this is the first such application of linear programming. It was made possible by careful consideration of the jet selection problem in terms of the properties of linear programming and the Simplex algorithm. These modifications to the jet select algorithm may be useful for the design of reaction controlled spacecraft.

  2. Effect of correlation on covariate selection in linear and nonlinear mixed effect models.

    PubMed

    Bonate, Peter L

    2017-01-01

    The effect of correlation among covariates on covariate selection was examined with linear and nonlinear mixed effect models. Demographic covariates were extracted from the National Health and Nutrition Examination Survey III database. Concentration-time profiles were Monte Carlo simulated where only one covariate affected apparent oral clearance (CL/F). A series of univariate covariate population pharmacokinetic models was fit to the data and compared with the reduced model without covariate. The "best" covariate was identified using either the likelihood ratio test statistic or AIC. Weight and body surface area (calculated using the Gehan and George equation, 1970) were highly correlated (r = 0.98). Body surface area was often selected as a better covariate than weight, sometimes as often as 1 in 5 times, when weight was the covariate used in the data-generating mechanism. In a second simulation, parent drug concentration and three metabolites were simulated from a thorough QT study and used as covariates in a series of univariate linear mixed effects models of ddQTc interval prolongation. The covariate with the largest significant LRT statistic was deemed the "best" predictor. When the metabolite was formation-rate limited and only parent concentrations affected ddQTc intervals, the metabolite was chosen as a better predictor as often as 1 in 5 times, depending on the slope of the relationship between parent concentrations and ddQTc intervals. A correlated covariate can be chosen as a better predictor than another covariate in a linear or nonlinear population analysis by sheer correlation. These results explain why for the same drug different covariates may be identified in different analyses. Copyright © 2016 John Wiley & Sons, Ltd.

  3. Timber management planning with timber ram and goal programming

    Treesearch

    Richard C. Field

    1978-01-01

    By using goal programming to enhance the linear programming of Timber RAM, multiple decision criteria were incorporated in the timber management planning of a National Forest in the southeastern United States. Combining linear and goal programming capitalizes on the advantages of the two techniques and produces operationally feasible solutions. This enhancement may...

  4. A comparison of Heuristic method and Llewellyn’s rules for identification of redundant constraints

    NASA Astrophysics Data System (ADS)

    Estiningsih, Y.; Farikhin; Tjahjana, R. H.

    2018-03-01

    Modelling and solving practical optimization problems are important techniques in linear programming. Redundant constraints are considered for their effects on general linear programming problems. Identifying and removing redundant constraints avoids the unnecessary calculations associated with solving the resulting linear programming problems. Many methods have been proposed for identifying redundant constraints. This paper presents a comparison of the Heuristic method and Llewellyn's rules for identification of redundant constraints.
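
For concreteness, here is the classical LP-based redundancy test (a generic sketch of the idea, not the Heuristic method or Llewellyn's rules compared in the paper; the matrix is a made-up example): constraint i of A x <= b is redundant when maximizing its left-hand side subject to the remaining constraints cannot exceed b_i.

```python
import numpy as np
from scipy.optimize import linprog

def is_redundant(A, b, i, bounds=(0, 10)):
    """Constraint i of A x <= b is redundant if max a_i . x subject to
    the remaining constraints (and variable bounds) cannot exceed b_i."""
    mask = np.arange(len(b)) != i
    # linprog minimizes, so negate the objective to maximize a_i . x
    res = linprog(-A[i], A_ub=A[mask], b_ub=b[mask],
                  bounds=[bounds] * A.shape[1], method="highs")
    return res.status == 0 and -res.fun <= b[i] + 1e-9

# x + y <= 4, x <= 3, x + y <= 10  (the last is implied by the first)
A = np.array([[1.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
b = np.array([4.0, 3.0, 10.0])
flags = [is_redundant(A, b, i) for i in range(3)]
```

Here only the third constraint is flagged; removing it leaves the feasible region unchanged.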

  5. River water quality management considering agricultural return flows: application of a nonlinear two-stage stochastic fuzzy programming.

    PubMed

    Tavakoli, Ali; Nikoo, Mohammad Reza; Kerachian, Reza; Soltani, Maryam

    2015-04-01

    In this paper, a new fuzzy methodology is developed to optimize water and waste load allocation (WWLA) in rivers under uncertainty. An interactive two-stage stochastic fuzzy programming (ITSFP) method is utilized to handle parameter uncertainties, which are expressed as fuzzy boundary intervals. An iterative linear programming (ILP) is also used for solving the nonlinear optimization model. To accurately consider the impacts of the water and waste load allocation strategies on the river water quality, a calibrated QUAL2Kw model is linked with the WWLA optimization model. The soil, water, atmosphere, and plant (SWAP) simulation model is utilized to determine the quantity and quality of each agricultural return flow. To control pollution loads of agricultural networks, it is assumed that a part of each agricultural return flow can be diverted to an evaporation pond and also another part of it can be stored in a detention pond. In detention ponds, contaminated water is exposed to solar radiation for disinfecting pathogens. Results of applying the proposed methodology to the Dez River system in the southwestern region of Iran illustrate its effectiveness and applicability for water and waste load allocation in rivers. In the planning phase, this methodology can be used for estimating the capacities of return flow diversion system and evaporation and detention ponds.

  6. Human sinus arrhythmia as an index of vagal cardiac outflow

    NASA Technical Reports Server (NTRS)

    Eckberg, D. L.

    1983-01-01

    The human central vagal mechanisms were investigated by measuring the intervals between heartbeats during controlled breathing (at breathing intervals of 2.5-10 s and nominal tidal volumes of 1000 and 1500 ml) in six young men and women. It was found that as the breathing interval increased, the longest heart periods became longer, the shortest heart periods became shorter, and the peak-valley P-P intervals increased asymptotically. Peak-valley intervals also increased in proportion to tidal volume, although this influence was small. The phase angles between heart period changes and respiration were found to vary as linear functions of breathing interval. Heart period shortening began in inspiration at short breathing intervals and in expiration at long breathing intervals, while heart period lengthening began in early expiration at all breathing intervals studied. It is concluded that a close relationship exists between variations of respiratory depth and interval and the quantity, periodicity, and timing of vagal cardiac outflow in conscious humans. The results indicate that at usual breathing rates, phasic respiration-related changes of vagal motoneuron activity begin in expiration, progress slowly, and are incompletely expressed at fast breathing rates.

  7. Atomic temporal interval relations in branching time: calculation and application

    NASA Astrophysics Data System (ADS)

    Anger, Frank D.; Ladkin, Peter B.; Rodriguez, Rita V.

    1991-03-01

    A practical method of reasoning about intervals in a branching-time model which is dense, unbounded, future-branching, without rejoining branches is presented. The discussion is based on heuristic constraint-propagation techniques using the relation algebra of binary temporal relations among the intervals over the branching-time model. This technique has been applied with success to models of intervals over linear time by Allen and others, and is of cubic-time complexity. To extend it to branching-time models, it is necessary to calculate compositions of the relations; thus, the table of compositions for the 'atomic' relations is computed, enabling the rapid determination of the composition of arbitrary relations, expressed as disjunctions or unions of the atomic relations.

  8. 47 CFR 73.642 - Subscription TV service.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... available to the FCC upon request. (d) The use of the visual vertical blanking interval or an aural... programming may be used only upon specific FCC authorization. Letter requests to use either the video blanking intervals or aural subcarriers during periods of non-subscription programming are to be sent to the FCC in...

  9. 47 CFR 73.642 - Subscription TV service.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... available to the FCC upon request. (d) The use of the visual vertical blanking interval or an aural... programming may be used only upon specific FCC authorization. Letter requests to use either the video blanking intervals or aural subcarriers during periods of non-subscription programming are to be sent to the FCC in...

  10. A Linear Programming Model to Optimize Various Objective Functions of a Foundation Type State Support Program.

    ERIC Educational Resources Information Center

    Matzke, Orville R.

    The purpose of this study was to formulate a linear programming model to simulate a foundation type support program and to apply this model to a state support program for the public elementary and secondary school districts in the State of Iowa. The model was successful in producing optimal solutions to five objective functions proposed for…

  11. Interval cancers in a population-based screening program for colorectal cancer in Catalonia, Spain.

    PubMed

    Garcia, M; Domènech, X; Vidal, C; Torné, E; Milà, N; Binefa, G; Benito, L; Moreno, V

    2015-01-01

    Objective. To analyze interval cancers among participants in a screening program for colorectal cancer (CRC) during four screening rounds. Methods. The study population consisted of participants of a fecal occult blood test-based screening program from February 2000 to September 2010, with a 30-month follow-up (n = 30,480). We used hospital administration data to identify CRC. An interval cancer was defined as an invasive cancer diagnosed within 30 months of a negative screening result and before the next recommended examination. Gender, age, stage, and site distribution of interval cancers were compared with those in the screen-detected group. Results. Within the study period, 97 tumors were screen-detected and 74 tumors were diagnosed after a negative screening. In addition, 17 CRC (18.3%) were found after an inconclusive result and 2 cases were diagnosed within the surveillance interval (2.1%). There was an increase of interval cancers over the four rounds (from 32.4% to 46.0%). When compared with screen-detected cancers, interval cancers were found predominantly in the rectum (OR: 3.66; 95% CI: 1.51-8.88) and at more advanced stages (P = 0.025). Conclusion. Large numbers of cancers are not detected through fecal occult blood test-based screening. The low sensitivity should be emphasized to ensure that individuals with symptoms are not falsely reassured.
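
The odds ratios and confidence intervals quoted above come from standard 2x2-table arithmetic. A generic sketch with hypothetical counts (not the study's data), using the Woolf log-normal interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a/b = cases/controls with the factor, c/d = without it."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 20/10 rectal vs non-rectal among interval cancers,
# 30/40 among screen-detected cancers
or_, lo, hi = odds_ratio_ci(20, 10, 30, 40)
```

An interval whose lower limit exceeds 1 (as here, and as in the reported OR of 3.66, CI 1.51-8.88) indicates an association unlikely to be due to chance alone.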

  12. Postmolar gestational trophoblastic neoplasia: beyond the traditional risk factors.

    PubMed

    Bakhtiyari, Mahmood; Mirzamoradi, Masoumeh; Kimyaiee, Parichehr; Aghaie, Abbas; Mansournia, Mohammd Ali; Ashrafi-Vand, Sepideh; Sarfjoo, Fatemeh Sadat

    2015-09-01

    To investigate the slope of linear regression of postevacuation serum hCG as an independent risk factor for postmolar gestational trophoblastic neoplasia (GTN). Multicenter retrospective cohort study. Academic referral health care centers. All subjects with confirmed hydatidiform mole and at least four measurements of β-hCG titer. None. Type and magnitude of the relationship between the slope of linear regression of β-hCG as a new risk factor and GTN using Bayesian logistic regression with penalized log-likelihood estimation. Among the high-risk and low-risk molar pregnancy cases, 11 (18.6%) and 19 cases (13.3%) had GTN, respectively. No significant relationship was found between the components of a high-risk pregnancy and GTN. The β-hCG return slope was higher in the spontaneous cure group. However, the initial level of this hormone in the first measurement was higher in the GTN group compared with in the spontaneous recovery group. The average time for diagnosing GTN in the high-risk molar pregnancy group was 2 weeks less than that of the low-risk molar pregnancy group. In addition to slope of linear regression of β-hCG (odds ratio [OR], 12.74, confidence interval [CI], 5.42-29.2), abortion history (OR, 2.53; 95% CI, 1.27-5.04) and large uterine height for gestational age (OR, 1.26; CI, 1.04-1.54) had the maximum effects on GTN outcome, respectively. The slope of linear regression of β-hCG was introduced as an independent risk factor, which could be used for clinical decision making based on records of β-hCG titer and subsequent prevention program. Copyright © 2015 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  13. Preterm infant linear growth and adiposity gain: trade-offs for later weight status and intelligence quotient.

    PubMed

    Belfort, Mandy B; Gillman, Matthew W; Buka, Stephen L; Casey, Patrick H; McCormick, Marie C

    2013-12-01

    To examine trade-offs between cognitive outcome and overweight/obesity in preterm-born infants at school age and young adulthood in relation to weight gain and linear growth during infancy. We studied 945 participants in the Infant Health and Development Program, an 8-center study of preterm (≤37 weeks gestational age), low birth weight (≤2500 g) infants from birth to age 18 years. Adjusting for maternal and child factors in logistic regression, we estimated the odds of overweight/obesity (body mass index [BMI] ≥85th percentile at age 8 or ≥25 kg/m(2) at age 18) and in separate models, low IQ (<85) per z-score changes in infant length and BMI from term to 4 months, from 4 to 12 months, and from 12 to 18 months. More rapid linear growth from term to 4 months was associated with lower odds of IQ <85 at age 8 years (OR, 0.82; 95% CI, 0.70-0.96), but higher odds of overweight/obesity (OR, 1.27; 95% CI, 1.05-1.53). More rapid BMI gain in all 3 infant time intervals was also associated with higher odds of overweight/obesity, and BMI gain from 4-12 months was associated with lower odds of IQ <85 at age 8. Results at age 18 were similar. In these preterm, low birth weight infants born in the 1980s, faster linear growth soon after term was associated with better cognition, but also with a greater risk of overweight/obesity at age 8 years and 18 years. BMI gain over the entire 18 months after term was associated with later risk of overweight/obesity, with less evidence of a benefit for IQ. Copyright © 2013 Mosby, Inc. All rights reserved.

  14. A duality approach for solving bounded linear programming problems with fuzzy variables based on ranking functions and its application in bounded transportation problems

    NASA Astrophysics Data System (ADS)

    Ebrahimnejad, Ali

    2015-08-01

    There are several methods, in the literature, for solving fuzzy variable linear programming problems (fuzzy linear programming in which the right-hand-side vectors and decision variables are represented by trapezoidal fuzzy numbers). In this paper, the shortcomings of some existing methods are pointed out and to overcome these shortcomings a new method based on the bounded dual simplex method is proposed to determine the fuzzy optimal solution of that kind of fuzzy variable linear programming problems in which some or all variables are restricted to lie within lower and upper bounds. To illustrate the proposed method, an application example is solved and the obtained results are given. The advantages of the proposed method over existing methods are discussed. Also, one application of this algorithm in solving bounded transportation problems with fuzzy supplies and demands is dealt with. The proposed method is easy to understand and to apply for determining the fuzzy optimal solution of bounded fuzzy variable linear programming problems occurring in real-life situations.

  15. On the Statistical Errors of RADAR Location Sensor Networks with Built-In Wi-Fi Gaussian Linear Fingerprints

    PubMed Central

    Zhou, Mu; Xu, Yu Bin; Ma, Lin; Tian, Shuo

    2012-01-01

    The expected errors of RADAR sensor networks with linear probabilistic location fingerprints inside buildings with varying Wi-Fi Gaussian strength are discussed. As far as we know, the statistical errors of equal and unequal-weighted RADAR networks have been suggested as a better way to evaluate the behavior of different system parameters and the deployment of reference points (RPs). However, up to now, there is still not enough related work on the relations between the statistical errors, system parameters, number and interval of the RPs, let alone calculating the correlated analytical expressions of concern. Therefore, in response to this compelling problem, under a simple linear distribution model, much attention will be paid to the mathematical relations of the linear expected errors, number of neighbors, number and interval of RPs, parameters in logarithmic attenuation model and variations of radio signal strength (RSS) at the test point (TP) with the purpose of constructing more practical and reliable RADAR location sensor networks (RLSNs) and also guaranteeing the accuracy requirements for the location based services in future ubiquitous context-awareness environments. Moreover, the numerical results and some real experimental evaluations of the error theories addressed in this paper will also be presented for our future extended analysis. PMID:22737027

  16. On the statistical errors of RADAR location sensor networks with built-in Wi-Fi Gaussian linear fingerprints.

    PubMed

    Zhou, Mu; Xu, Yu Bin; Ma, Lin; Tian, Shuo

    2012-01-01

    The expected errors of RADAR sensor networks with linear probabilistic location fingerprints inside buildings with varying Wi-Fi Gaussian strength are discussed. As far as we know, the statistical errors of equal and unequal-weighted RADAR networks have been suggested as a better way to evaluate the behavior of different system parameters and the deployment of reference points (RPs). However, up to now, there is still not enough related work on the relations between the statistical errors, system parameters, number and interval of the RPs, let alone calculating the correlated analytical expressions of concern. Therefore, in response to this compelling problem, under a simple linear distribution model, much attention will be paid to the mathematical relations of the linear expected errors, number of neighbors, number and interval of RPs, parameters in logarithmic attenuation model and variations of radio signal strength (RSS) at the test point (TP) with the purpose of constructing more practical and reliable RADAR location sensor networks (RLSNs) and also guaranteeing the accuracy requirements for the location based services in future ubiquitous context-awareness environments. Moreover, the numerical results and some real experimental evaluations of the error theories addressed in this paper will also be presented for our future extended analysis.

  17. Kendall-Theil Robust Line (KTRLine--version 1.0)-A Visual Basic Program for Calculating and Graphing Robust Nonparametric Estimates of Linear-Regression Coefficients Between Two Continuous Variables

    USGS Publications Warehouse

    Granato, Gregory E.

    2006-01-01

    The Kendall-Theil Robust Line software (KTRLine-version 1.0) is a Visual Basic program that may be used with the Microsoft Windows operating system to calculate parameters for robust, nonparametric estimates of linear-regression coefficients between two continuous variables. The KTRLine software was developed by the U.S. Geological Survey, in cooperation with the Federal Highway Administration, for use in stochastic data modeling with local, regional, and national hydrologic data sets to develop planning-level estimates of potential effects of highway runoff on the quality of receiving waters. The Kendall-Theil robust line was selected because this robust nonparametric method is resistant to the effects of outliers and nonnormality in residuals that commonly characterize hydrologic data sets. The slope of the line is calculated as the median of all possible pairwise slopes between points. The intercept is calculated so that the line will run through the median of input data. A single-line model or a multisegment model may be specified. The program was developed to provide regression equations with an error component for stochastic data generation because nonparametric multisegment regression tools are not available with the software that is commonly used to develop regression models. The Kendall-Theil robust line is a median line and, therefore, may underestimate total mass, volume, or loads unless the error component or a bias correction factor is incorporated into the estimate. Regression statistics such as the median error, the median absolute deviation, the prediction error sum of squares, the root mean square error, the confidence interval for the slope, and the bias correction factor for median estimates are calculated by use of nonparametric methods. These statistics, however, may be used to formulate estimates of mass, volume, or total loads. 
The program is used to read a two- or three-column tab-delimited input file with variable names in the first row and data in subsequent rows. The user may choose the columns that contain the independent (X) and dependent (Y) variable. A third column, if present, may contain metadata such as the sample-collection location and date. The program screens the input files and plots the data. The KTRLine software is a graphical tool that facilitates development of regression models by use of graphs of the regression line with data, the regression residuals (with X or Y), and percentile plots of the cumulative frequency of the X variable, Y variable, and the regression residuals. The user may individually transform the independent and dependent variables to reduce heteroscedasticity and to linearize data. The program plots the data and the regression line. The program also prints model specifications and regression statistics to the screen. The user may save and print the regression results. The program can accept data sets that contain up to about 15,000 XY data points, but because the program must sort the array of all pairwise slopes, the program may be perceptibly slow with data sets that contain more than about 1,000 points.
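
The two defining computations (the median of all pairwise slopes, and an intercept routing the line through the medians of the data) fit in a few lines. A minimal sketch of the estimator itself, not of the KTRLine program:

```python
import itertools
import statistics

def kendall_theil_line(x, y):
    """Slope = median of all pairwise slopes; intercept chosen so the
    line passes through (median(x), median(y))."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in itertools.combinations(range(len(x)), 2)
              if x[j] != x[i]]
    slope = statistics.median(slopes)
    intercept = statistics.median(y) - slope * statistics.median(x)
    return slope, intercept

# A single outlier barely moves the robust fit
x = [1, 2, 3, 4, 5]
y = [2.0, 4.1, 5.9, 8.0, 30.0]   # last point is an outlier
slope, intercept = kendall_theil_line(x, y)
```

An ordinary least-squares fit to the same data would be dragged strongly toward the outlier, which is why the abstract recommends this estimator for hydrologic data with nonnormal residuals.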

  18. Orthopaedic Application Of Spatio Temporal Analysis Of Body Form And Function

    NASA Astrophysics Data System (ADS)

    Tauber, C.; Au, J.; Bernstein, S.; Grant, A.; Pugh, J.

    1983-07-01

    Spatial and temporal analysis of walking provides the orthopaedist with objective evidence of functional ability and improvement in a patient. Patients with orthopaedic problems experiencing extreme pain and, consequently, irregularities in joint motions on weightbearing are videorecorded before, during and after a course of rehabilitative treatment and/or surgical correction of their disability. A specially-programmed computer analyzes these tapes for the parameters of walking by locating reflective spots which indicate the centers of the lower limb joints. The following parameters of gait are then generated: dynamic hip, knee and foot angles at various intervals during walking; vertical, horizontal and lateral displacements of each joint at various time intervals; linear and angular velocities of each joint; and the relationships between the joints during various phases of the gait cycle. The systematic sampling and analysis of the videorecordings by computer enable such information to be converted into and presented as computer graphics, as well as organized into tables of gait variables. This format of presentation of the skeletal adjustments involved in normal human motion provides the clinician with a visual format of gait information which objectively illuminates the multifaceted and complex factors involved. This system provides the clinician a method by which to evaluate the success of the regimen in terms of patient comfort and function.

  19. Can linear superiorization be useful for linear optimization problems?

    NASA Astrophysics Data System (ADS)

    Censor, Yair

    2017-04-01

    Linear superiorization (LinSup) considers linear programming problems but instead of attempting to solve them with linear optimization methods it employs perturbation resilient feasibility-seeking algorithms and steers them toward reduced (not necessarily minimal) target function values. The two questions that we set out to explore experimentally are: (i) does LinSup provide a feasible point whose linear target function value is lower than that obtained by running the same feasibility-seeking algorithm without superiorization under identical conditions? (ii) How does LinSup fare in comparison with the Simplex method for solving linear programming problems? Based on our computational experiments presented here, the answers to these two questions are: ‘yes’ and ‘very well’, respectively.
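
A concrete miniature of the LinSup mechanism (toy constraints and a made-up geometric step schedule; the paper's experiments use other feasibility-seeking algorithms): cyclic projections onto the half-spaces do the feasibility-seeking, while shrinking perturbation steps along -c steer the target value down.

```python
import numpy as np

def project_halfspace(x, a, b):
    """Project x onto the half-space {y : a . y <= b}."""
    viol = a @ x - b
    return x - (viol / (a @ a)) * a if viol > 0 else x

def linsup(A, b, c, iters=100):
    """Feasibility-seeking by cyclic projections onto A x <= b, with a
    shrinking (summable) perturbation along -c between sweeps."""
    x = np.zeros(A.shape[1])
    d = -c / np.linalg.norm(c)
    step = 1.0
    for _ in range(iters):
        x = x + step * d            # superiorization: nudge target value down
        step *= 0.9
        for a_i, b_i in zip(A, b):  # feasibility-seeking sweep
            x = project_halfspace(x, a_i, b_i)
    return x

# Feasible set: x >= 0, y >= 0, x + y >= 1, written as A x <= b
A = np.array([[-1.0, 0.0], [0.0, -1.0], [-1.0, -1.0]])
b = np.array([0.0, 0.0, -1.0])
c = np.array([1.0, 1.0])
x = linsup(A, b, c)
```

For this symmetric toy problem the iterates settle on (0.5, 0.5), a feasible point attaining the minimal target value 1, illustrating how superiorization can reach low (here, even optimal) values without a linear optimization method.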

  20. Portfolio optimization using fuzzy linear programming

    NASA Astrophysics Data System (ADS)

    Pandit, Purnima K.

    2013-09-01

    Portfolio Optimization (PO) is a problem in Finance, in which an investor tries to maximize return and minimize risk by carefully choosing different assets. Expected return and risk are the most important parameters with regard to optimal portfolios. In its simple form, PO can be modeled as a quadratic programming problem, which can be put into an equivalent linear form. PO problems with fuzzy parameters can be solved as multi-objective fuzzy linear programming problems. In this paper we give the solution to such problems with an illustrative example.
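
A minimal sketch of the defuzzify-then-solve route (made-up triangular fuzzy returns, centroid defuzzification, and a per-asset cap standing in for risk control; the paper's fuzzy multi-objective formulation is richer than this):

```python
from scipy.optimize import linprog

# Hypothetical triangular fuzzy expected returns (low, mode, high) per asset,
# defuzzified by the centroid (l + m + h) / 3
fuzzy_returns = [(0.02, 0.05, 0.08), (0.01, 0.10, 0.13), (0.03, 0.04, 0.11)]
returns = [sum(t) / 3 for t in fuzzy_returns]

# Maximize expected return: weights sum to 1, each weight capped at 0.6
res = linprog(c=[-r for r in returns],       # negate: linprog minimizes
              A_eq=[[1, 1, 1]], b_eq=[1],
              bounds=[(0, 0.6)] * 3, method="highs")
weights = res.x
```

The cap forces diversification: the LP fills the highest-centroid asset to its 0.6 limit and puts the remaining 0.4 in the next best.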

  1. Users manual for linear Time-Varying Helicopter Simulation (Program TVHIS)

    NASA Technical Reports Server (NTRS)

    Burns, M. R.

    1979-01-01

    A linear time-varying helicopter simulation program (TVHIS) is described. The program is designed as a realistic yet efficient helicopter simulation. It is based on a linear time-varying helicopter model which includes rotor, actuator, and sensor models, as well as a simulation of flight computer logic. The TVHIS can generate a mean trajectory simulation along a nominal trajectory, or propagate covariance of helicopter states, including rigid-body, turbulence, control command, controller states, and rigid-body state estimates.

  2. 77 FR 54635 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Order Granting Approval of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-05

    ... Strike Price Intervals in the Short Term Options Program August 29, 2012. I. Introduction On July 2, 2012...-4 thereunder,\\2\\ a proposed rule change to indicate that the interval between strike prices on short term options series (``STOs'') listed in accordance with its Short Term Option Series Program (``STO...

  3. 77 FR 33543 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-06

    ... Term Option Series Program (``STOS Program'') to permit, during the expiration week of an option class... rule to open for trading Short Term Option Series at $0.50 strike price intervals for option classes... Short Term Option Series at $0.50 strike price intervals for option classes that trade in one dollar...

  4. 76 FR 60107 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Order Granting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... underlying price ($24.50) is $20, the Exchange may list a $22 strike. The proposal also contains certain non... Rule to Simplify the $1 Strike Price Interval Program September 22, 2011. I. Introduction On July 26... amend Interpretation and Policy .01 to Rule 5.5 to simplify the $1 Strike Price Interval Program (the...

  5. Socio-dramatic affective-relational intervention for adolescents with asperger syndrome & high functioning autism: pilot study.

    PubMed

    Lerner, Matthew D; Mikami, Amori Yee; Levine, Karen

    2011-01-01

    This study examined the effectiveness of a novel intervention called 'socio-dramatic affective-relational intervention' (SDARI), intended to improve social skills among adolescents with Asperger syndrome and high functioning autism diagnoses. SDARI adapts dramatic training activities to focus on in vivo practice of areas of social skill deficit among this population. SDARI was administered as a six-week summer program in a community human service agency. Nine SDARI participants and eight age- and diagnosis-group matched adolescents not receiving SDARI were compared on child- and parent-report of social functioning at three week intervals beginning six weeks prior to intervention and ending six weeks post-intervention. Hierarchical Linear Modeling (HLM) was used to estimate growth trends between groups to assess treatment outcomes and post-treatment maintenance. Results indicated significant improvement and post-treatment maintenance among SDARI participants on several measures of child social functioning. Implications for practice and research are discussed.

  6. Program Monitoring with LTL in EAGLE

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Goldberg, Allen; Havelund, Klaus; Sen, Koushik

    2004-01-01

    We briefly present a rule-based framework called EAGLE, shown to be capable of defining and implementing finite trace monitoring logics, including future and past time temporal logic, extended regular expressions, real-time and metric temporal logics (MTL), interval logics, forms of quantified temporal logics, and so on. In this paper we focus on a linear temporal logic (LTL) specialization of EAGLE. For an initial formula of size m, we establish upper bounds of O(m^2 2^m log m) and O(m^4 2^(2m) log^2 m) for the space and time complexity, respectively, of single-step evaluation over an input trace. This bound is close to the known lower bound of O(2^sqrt(m)) for future-time LTL. EAGLE has been successfully used, in both LTL and metric LTL forms, to test a real-time controller of an experimental NASA planetary rover.
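
To make the setting concrete, here is a toy finite-trace LTL evaluator (a naive recursive semantics over a fully recorded trace; EAGLE's rule-based single-step evaluation is a different and far more efficient algorithm):

```python
# Formulas are nested tuples; trace states are sets of atomic propositions.
def holds(f, trace, i=0):
    op = f[0]
    if op == "ap":   return f[1] in trace[i]
    if op == "not":  return not holds(f[1], trace, i)
    if op == "or":   return holds(f[1], trace, i) or holds(f[2], trace, i)
    if op == "next": return i + 1 < len(trace) and holds(f[1], trace, i + 1)
    if op == "always":      # f[1] holds at every remaining position
        return all(holds(f[1], trace, k) for k in range(i, len(trace)))
    if op == "eventually":  # f[1] holds at some remaining position
        return any(holds(f[1], trace, k) for k in range(i, len(trace)))
    raise ValueError(op)

# G(p -> F q): every 'p' state is eventually followed by a 'q' state
spec = ("always", ("or", ("not", ("ap", "p")), ("eventually", ("ap", "q"))))
good = [{"p"}, set(), {"q"}]
bad = [{"p"}, set(), set()]
```

This re-walks the trace at every operator; the point of a monitoring calculus like EAGLE is to evaluate such properties one state at a time with the bounded per-step cost quoted above.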

  7. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
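
The core idea (contract the input box by interval constraint propagation, then spend the sampling budget inside the contracted region) can be sketched on a toy linear constraint; the constraint, box, and sample budget below are made up:

```python
import random

# Constraint: x + y <= 0.5 over the unit box [0,1] x [0,1]
def contract(lo_x, hi_x, lo_y, hi_y, bound=0.5):
    """Interval propagation for x + y <= bound: x <= bound - lo_y, etc."""
    hi_x = min(hi_x, bound - lo_y)
    hi_y = min(hi_y, bound - lo_x)
    return lo_x, hi_x, lo_y, hi_y

random.seed(1)
lo_x, hi_x, lo_y, hi_y = contract(0.0, 1.0, 0.0, 1.0)
box_fraction = (hi_x - lo_x) * (hi_y - lo_y)  # contracted area / full area
n, hits = 20000, 0
for _ in range(n):
    x = random.uniform(lo_x, hi_x)
    y = random.uniform(lo_y, hi_y)
    hits += (x + y <= 0.5)
estimate = box_fraction * hits / n            # true probability is 0.125
```

Because no sample is wasted outside the contracted box, the estimator's variance is lower than naive sampling of the full domain for the same budget, which is the accuracy gain the abstract describes.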

  8. Influence of Inter-Training Intervals on Intermanual Transfer Effects in Upper-Limb Prosthesis Training: A Randomized Pre-Posttest Study.

    PubMed

    Romkema, Sietske; Bongers, Raoul M; van der Sluis, Corry K

    2015-01-01

    Improvement in prosthetic training using intermanual transfer (the transfer of motor skills from the trained, “unaffected” hand to the untrained, “affected” hand) has been shown in previous studies. The aim of this study is to determine the influence of the inter-training interval on the magnitude of the intermanual transfer effects. This was done using a mechanistic, randomized, single-blinded pretest-posttest design. Sixty-four able-bodied, right-handed participants were randomly assigned to the Short and Long Interval Training Groups and the Short and Long Interval Control Groups. The Short and Long Interval Training Groups used a prosthesis simulator in their training program. The Short and Long Interval Control Groups executed a sham training program, that is, a dummy training program in which the same muscles were trained as with the prosthesis simulator. The Short Interval Training Group and the Short Interval Control Groups trained on consecutive days, while the Long Interval Training Group and Long Interval Control Group trained twice a week. To determine the improvement in skills, a test was administered before, immediately after, and at two points in time after the training. Training was performed with the “unaffected” arm; tests were performed with the “affected” arm. The outcome measurements were: the movement time (the time from the beginning of the movement until completion of the task); the duration of maximum hand opening (the opening of the prosthetic hand while grasping an object); and the grip-force control (the error from the required grip-force during a tracking task). Intermanual transfer was found in movement times, but not in hand opening or grip-force control. The length of the inter-training interval did not affect the magnitude of intermanual transfer effects. No difference in the intermanual transfer effect in upper-limb prosthesis training was found for training on a daily basis as compared to training twice a week.
Nederlands Trial Register NTR3888.

  9. Influence of Inter-Training Intervals on Intermanual Transfer Effects in Upper-Limb Prosthesis Training: A Randomized Pre-Posttest Study

    PubMed Central

    Romkema, Sietske; Bongers, Raoul M.; van der Sluis, Corry K.

    2015-01-01

    Improvement in prosthetic training using intermanual transfer (the transfer of motor skills from the trained, “unaffected” hand to the untrained, “affected” hand) has been shown in previous studies. The aim of this study is to determine the influence of the inter-training interval on the magnitude of the intermanual transfer effects. This was done using a mechanistic, randomized, single-blinded pretest-posttest design. Sixty-four able-bodied, right-handed participants were randomly assigned to the Short and Long Interval Training Groups and the Short and Long Interval Control Groups. The Short and Long Interval Training Groups used a prosthesis simulator in their training program. The Short and Long Interval Control Groups executed a sham training program, that is, a dummy training program in which the same muscles were trained as with the prosthesis simulator. The Short Interval Training Group and the Short Interval Control Groups trained on consecutive days, while the Long Interval Training Group and Long Interval Control Group trained twice a week. To determine the improvement in skills, a test was administered before, immediately after, and at two points in time after the training. Training was performed with the “unaffected” arm; tests were performed with the “affected” arm. The outcome measurements were: the movement time (the time from the beginning of the movement until completion of the task); the duration of maximum hand opening (the opening of the prosthetic hand while grasping an object); and the grip-force control (the error from the required grip-force during a tracking task). Intermanual transfer was found in movement times, but not in hand opening or grip-force control. The length of the inter-training interval did not affect the magnitude of intermanual transfer effects. No difference in the intermanual transfer effect in upper-limb prosthesis training was found for training on a daily basis as compared to training twice a week.
Trial Registration Nederlands Trial Register NTR3888 PMID:26075396

  10. Linear Programming for Vocational Education Planning. Interim Report.

    ERIC Educational Resources Information Center

    Young, Robert C.; And Others

    The purpose of the paper is to define for potential users of vocational education management information systems a quantitative analysis technique and its utilization to facilitate more effective planning of vocational education programs. Defining linear programming (LP) as a management technique used to solve complex resource allocation problems…

  11. Reducing Formation-Keeping Maneuver Costs for Formation Flying Satellites in Low-Earth Orbit

    NASA Technical Reports Server (NTRS)

    Hamilton, Nicholas

    2001-01-01

    Several techniques are used to synthesize the formation-keeping control law for a three-satellite formation in low-earth orbit. The objective is to minimize maneuver cost and position tracking error. Initial reductions are found for a one-satellite case by tuning the state-weighting matrix within the linear-quadratic-Gaussian framework. Further savings come from adjusting the maneuver interval. Scenarios examined include cases with and without process noise. These results are then applied to a three-satellite formation. For both the one-satellite and three-satellite cases, increasing the maneuver interval yields a decrease in maneuver cost and an increase in position tracking error. A maneuver interval of 8-10 minutes provides a good trade-off between maneuver cost and position tracking error. An analysis of the closed-loop poles with respect to varying maneuver intervals explains the effectiveness of the chosen maneuver interval.

  12. Probabilistic dual heuristic programming-based adaptive critic

    NASA Astrophysics Data System (ADS)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes uncertainties of the forward model and the inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  13. Infinite time interval backward stochastic differential equations with continuous coefficients.

    PubMed

    Zong, Zhaojun; Hu, Feng

    2016-01-01

    In this paper, we study the existence theorem for [Formula: see text] [Formula: see text] solutions to a class of 1-dimensional infinite time interval backward stochastic differential equations (BSDEs) under the conditions that the coefficients are continuous and have linear growth. We also obtain the existence of a minimal solution. Furthermore, we study the existence and uniqueness theorem for [Formula: see text] [Formula: see text] solutions of infinite time interval BSDEs with non-uniformly Lipschitz coefficients. It should be pointed out that the assumptions of this result are weaker than those of Theorem 3.1 in Zong (Turkish J Math 37:704-718, 2013).

  14. Confirmation of linear system theory prediction: Changes in Herrnstein's k as a function of changes in reinforcer magnitude.

    PubMed

    McDowell, J J; Wood, H M

    1984-03-01

    Eight human subjects pressed a lever on a range of variable-interval schedules for 0.25 cent to 35.0 cent per reinforcement. Herrnstein's hyperbola described seven of the eight subjects' response-rate data well. For all subjects, the y-asymptote of the hyperbola increased with increasing reinforcer magnitude and its reciprocal was a linear function of the reciprocal of reinforcer magnitude. These results confirm predictions made by linear system theory; they contradict formal properties of Herrnstein's account and of six other mathematical accounts of single-alternative responding.
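
    Herrnstein's hyperbola is R = k·r/(r + re), and it is linear in reciprocal form, 1/R = (re/k)(1/r) + 1/k, the same reciprocal-linearity idea the abstract applies to k and reinforcer magnitude. A toy sketch (fabricated data, not the subjects') recovering k and re by ordinary least squares on the reciprocals:

```python
# Herrnstein's hyperbola: R = k * r / (r + re), with k the y-asymptote.
# Toy data generated from k = 100, re = 20 (NOT the subjects' data).
k_true, re_true = 100.0, 20.0
rates = [10.0, 20.0, 40.0, 80.0]                     # reinforcement rates r
resp = [k_true * r / (r + re_true) for r in rates]   # response rates R

# Reciprocal form is linear: 1/R = (re/k) * (1/r) + 1/k, so fit a line.
x = [1.0 / r for r in rates]
y = [1.0 / R for R in resp]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar

k_hat = 1.0 / intercept       # recovered y-asymptote k
re_hat = slope * k_hat        # recovered re
```

    On noiseless data the fit is exact (k_hat = 100, re_hat = 20); with real response rates the reciprocal transform also reweights the errors, which is why nonlinear fits of the hyperbola are often preferred.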

  15. Confirmation of linear system theory prediction: Changes in Herrnstein's k as a function of changes in reinforcer magnitude

    PubMed Central

    McDowell, J. J; Wood, Helena M.

    1984-01-01

    Eight human subjects pressed a lever on a range of variable-interval schedules for 0.25¢ to 35.0¢ per reinforcement. Herrnstein's hyperbola described seven of the eight subjects' response-rate data well. For all subjects, the y-asymptote of the hyperbola increased with increasing reinforcer magnitude and its reciprocal was a linear function of the reciprocal of reinforcer magnitude. These results confirm predictions made by linear system theory; they contradict formal properties of Herrnstein's account and of six other mathematical accounts of single-alternative responding. PMID:16812366

  16. Analysis of an inventory model for both linearly decreasing demand and holding cost

    NASA Astrophysics Data System (ADS)

    Malik, A. K.; Singh, Parth Raj; Tomar, Ajay; Kumar, Satish; Yadav, S. K.

    2016-03-01

    This study proposes the analysis of an inventory model for linearly decreasing demand and holding cost for non-instantaneous deteriorating items. The inventory model focuses on commodities having linearly decreasing demand without shortages. The holding cost doesn't remain uniform with time due to any form of variation in the time value of money. Here we consider that the holding cost decreases with respect to time. The optimal time interval for the total profit and the optimal order quantity are determined. The developed inventory model is pointed up through a numerical example. It also includes the sensitivity analysis.

  17. Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.

    PubMed

    Obuchowski, Nancy A; Bullen, Jennifer

    2017-01-01

    Introduction: Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e., precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods: A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results: Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. Conclusion: Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
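
    The effect described above, fixed bias eroding the coverage of a confidence interval built under the no-bias assumption, can be sketched with a minimal Monte Carlo simulation (toy parameters and a single-measurement model, not the study's simulation design):

```python
import random

random.seed(0)

def ci_coverage(true_value=100.0, fixed_bias=0.0, sigma=5.0, n_trials=20000):
    """Estimate coverage of the 95% CI 'measurement +/- 1.96*sigma',
    which implicitly assumes the biomarker measurement is unbiased."""
    covered = 0
    for _ in range(n_trials):
        y = true_value + fixed_bias + random.gauss(0.0, sigma)
        if abs(y - true_value) <= 1.96 * sigma:
            covered += 1
    return covered / n_trials

nominal = ci_coverage(fixed_bias=0.0)    # close to the nominal 0.95
biased = ci_coverage(fixed_bias=10.0)    # well below nominal
```

    A bias of two standard deviations roughly halves the coverage in this toy model, which is the qualitative point of the record's much richer simulation.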

  18. Algorithms and Complexity Results for Genome Mapping Problems.

    PubMed

    Rajaraman, Ashok; Zanetti, Joao Paulo Pereira; Manuch, Jan; Chauve, Cedric

    2017-01-01

    Genome mapping algorithms aim at computing an ordering of a set of genomic markers based on local ordering information such as adjacencies and intervals of markers. In most genome mapping models, markers are assumed to occur uniquely in the resulting map. We introduce algorithmic questions that consider repeats, i.e., markers that can have several occurrences in the resulting map. We show that, provided with an upper bound on the copy number of repeated markers and with intervals that span full repeat copies, called repeat spanning intervals, the problem of deciding if a set of adjacencies and repeat spanning intervals admits a genome representation is tractable if the target genome can contain linear and/or circular chromosomal fragments. We also show that extracting a maximum cardinality or weight subset of repeat spanning intervals given a set of adjacencies that admits a genome realization is NP-hard but fixed-parameter tractable in the maximum copy number and the number of adjacent repeats, and tractable if intervals contain a single repeated marker.

  19. Guided Discovery, Visualization, and Technology Applied to the New Curriculum for Secondary Mathematics.

    ERIC Educational Resources Information Center

    Smith, Karan B.

    1996-01-01

    Presents activities which highlight major concepts of linear programming. Demonstrates how technology allows students to solve linear programming problems using exploration prior to learning algorithmic methods. (DDR)

  20. A convenient method of obtaining percentile norms and accompanying interval estimates for self-report mood scales (DASS, DASS-21, HADS, PANAS, and sAD).

    PubMed

    Crawford, John R; Garthwaite, Paul H; Lawrie, Caroline J; Henry, Julie D; MacDonald, Marie A; Sutherland, Jane; Sinha, Priyanka

    2009-06-01

    A series of recent papers have reported normative data from the general adult population for commonly used self-report mood scales. The aim was to bring together and supplement these data in order to provide a convenient means of obtaining percentile norms for these mood scales. A computer program was developed that provides point and interval estimates of the percentile rank corresponding to raw scores on the various self-report scales. The program can be used to obtain point and interval estimates of the percentile rank of an individual's raw scores on the DASS, DASS-21, HADS, PANAS, and sAD mood scales, based on normative sample sizes ranging from 758 to 3822. The interval estimates can be obtained using either classical or Bayesian methods as preferred. The computer program (which can be downloaded at www.abdn.ac.uk/~psy086/dept/MoodScore.htm) provides a convenient and reliable means of supplementing existing cut-off scores for self-report mood scales.
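
    The core computation of such a program, a point estimate of a percentile rank plus an interval around it, can be sketched with classical formulas (a hypothetical helper, not Crawford et al.'s implementation: it uses the mid-rank convention for ties and a simple Wald interval on the underlying proportion):

```python
import math

def percentile_rank(raw_score, norm_scores, z=1.96):
    """Point and interval estimate of the percentile rank of raw_score
    within a normative sample (mid-rank for ties, Wald interval)."""
    n = len(norm_scores)
    below = sum(s < raw_score for s in norm_scores)
    ties = sum(s == raw_score for s in norm_scores)
    p = (below + 0.5 * ties) / n          # estimated proportion scoring lower
    se = math.sqrt(p * (1.0 - p) / n)
    lo = max(0.0, p - z * se)
    hi = min(1.0, p + z * se)
    return 100.0 * p, (100.0 * lo, 100.0 * hi)

point, interval = percentile_rank(5, list(range(1, 11)))   # point == 45.0
```

    With normative samples in the hundreds or thousands, as in the record, the interval narrows accordingly; the published program additionally offers Bayesian interval estimates.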

  1. A queuing-theory-based interval-fuzzy robust two-stage programming model for environmental management under uncertainty

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Li, Y. P.; Huang, G. H.

    2012-06-01

    In this study, a queuing-theory-based interval-fuzzy robust two-stage programming (QB-IRTP) model is developed through introducing queuing theory into an interval-fuzzy robust two-stage (IRTP) optimization framework. The developed QB-IRTP model can not only address highly uncertain information for the lower and upper bounds of interval parameters but also be used for analysing a variety of policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. Moreover, it can reflect uncertainties in queuing theory problems. The developed method has been applied to a case of long-term municipal solid waste (MSW) management planning. Interval solutions associated with different waste-generation rates, different waiting costs and different arriving rates have been obtained. They can be used for generating decision alternatives and thus help managers to identify desired MSW management policies under various economic objectives and system reliability constraints.

  2. SCI Identification (SCIDNT) program user's guide. [maximum likelihood method for linear rotorcraft models

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The computer program Linear SCIDNT which evaluates rotorcraft stability and control coefficients from flight or wind tunnel test data is described. It implements the maximum likelihood method to maximize the likelihood function of the parameters based on measured input/output time histories. Linear SCIDNT may be applied to systems modeled by linear constant-coefficient differential equations. This restriction in scope allows the application of several analytical results which simplify the computation and improve its efficiency over the general nonlinear case.

  3. Indirect synthesis of multi-degree of freedom transient systems. [linear programming for a kinematically linear system

    NASA Technical Reports Server (NTRS)

    Pilkey, W. D.; Chen, Y. H.

    1974-01-01

    An indirect synthesis method is used in the efficient optimal design of multi-degree of freedom, multi-design element, nonlinear, transient systems. A limiting performance analysis which requires linear programming for a kinematically linear system is presented. The system is selected using system identification methods such that the designed system responds as closely as possible to the limiting performance. The efficiency is a result of the method avoiding the repetitive systems analyses accompanying other numerical optimization methods.

  4. Preliminary efficacy and feasibility of embedding high intensity interval training into the school day: A pilot randomized controlled trial.

    PubMed

    Costigan, S A; Eather, N; Plotnikoff, R C; Taaffe, D R; Pollock, E; Kennedy, S G; Lubans, D R

    2015-01-01

    Current physical activity and fitness levels among adolescents are low, increasing the risk of chronic disease. Although the efficacy of high intensity interval training (HIIT) for improving metabolic health is now well established, it is not known if this type of activity can be effective to improve adolescent health. The primary aim of this study is to assess the effectiveness and feasibility of embedding HIIT into the school day. A 3-arm pilot randomized controlled trial was conducted in one secondary school in Newcastle, Australia. Participants (n = 65; mean age = 15.8(0.6) years) were randomized into one of three conditions: aerobic exercise program (AEP) (n = 21), resistance and aerobic exercise program (RAP) (n = 22) and control (n = 22). The 8-week intervention consisted of three HIIT sessions per week (8-10 min/session), delivered during physical education (PE) lessons or at lunchtime. Assessments were conducted at baseline and post-intervention to detect changes in cardiorespiratory fitness (multi-stage shuttle-run), muscular fitness (push-up, standing long jump tests), body composition (Body Mass Index (BMI), BMI-z scores, waist circumference) and physical activity motivation (questionnaire), by researchers blinded to treatment allocation. Intervention effects for outcomes were examined using linear mixed models, and Cohen's d effect sizes were reported. Participants in the AEP and RAP groups had moderate intervention effects for waist circumference (p = 0.024), BMI-z (p = 0.037) and BMI (not significant) in comparison to the control group. A small intervention effect was also evident for cardiorespiratory fitness in the RAP group.

  5. A general strategy for performing temperature-programming in high performance liquid chromatography--further improvements in the accuracy of retention time predictions of segmented temperature gradients.

    PubMed

    Wiese, Steffen; Teutenberg, Thorsten; Schmidt, Torsten C

    2012-01-27

    In the present work it is shown that the linear elution strength (LES) model, which was adapted from temperature-programming gas chromatography (GC), can also be employed for systematic method development in high-temperature liquid chromatography (HT-HPLC). The ability to predict isothermal retention times based on temperature-gradient as well as isothermal input data was investigated. For a small temperature interval of ΔT=40°C, both approaches result in very similar predictions. Average relative errors of predicted retention times of 2.7% and 1.9% were observed for simulations based on isothermal and temperature-gradient measurements, respectively. Concurrently, it was investigated whether the accuracy of retention time predictions of segmented temperature gradients can be further improved by temperature-dependent calculation of the parameter S(T) of the LES relationship. It was found that the accuracy of retention time predictions of multi-step temperature gradients can be improved to around 1.5% if S(T) is also calculated as temperature-dependent. The adjusted experimental design, making use of four temperature-gradient measurements, was applied for systematic method development of selected food additives by high-temperature liquid chromatography. Method development was performed within a temperature interval from 40°C to 180°C using water as the mobile phase. Two separation methods were established in which the selected food additives were baseline-separated. In addition, good agreement between simulation and experiment was observed, with an average relative error of predicted retention times for complex segmented temperature gradients of less than 5%. Finally, a schedule of recommendations to assist the practitioner during systematic method development in high-temperature liquid chromatography was established. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Predicting U.S. tuberculosis case counts through 2020.

    PubMed

    Yelk Woodruff, Rachel S; Winston, Carla A; Miramontes, Roque

    2013-01-01

    In 2010, foreign-born persons accounted for 60% of all tuberculosis (TB) cases in the United States. Understanding which national groups make up the highest proportion of TB cases will assist TB control programs in concentrating limited resources where they can provide the greatest impact on preventing transmission of TB disease. The objective of our study was to predict through 2020 the numbers of U.S. TB cases among U.S.-born, foreign-born and foreign-born persons from selected countries of birth. TB case counts reported through the National Tuberculosis Surveillance System from 2000-2010 were log-transformed, and linear regression was performed to calculate predicted annual case counts and 95% prediction intervals for 2011-2020. Data were analyzed in 2011 before 2011 case counts were known. Decreases were predicted between 2010 observed and 2020 predicted counts for total TB cases (11,182 to 8,117 [95% prediction interval 7,262-9,073]) as well as TB cases among foreign-born persons from Mexico (1,541 to 1,420 [1,066-1,892]), the Philippines (740 to 724 [569-922]), India (578 to 553 [455-672]), Vietnam (532 to 429 [367-502]) and China (364 to 328 [249-433]). TB cases among persons who are U.S.-born and foreign-born were predicted to decline 47% (4,393 to 2,338 [2,113-2,586]) and 6% (6,720 to 6,343 [5,382-7,476]), respectively. Assuming rates of declines observed from 2000-2010 continue until 2020, a widening gap between the numbers of U.S.-born and foreign-born TB cases was predicted. TB case count predictions will help TB control programs identify needs for cultural competency, such as languages and interpreters needed for translating materials or engaging in appropriate community outreach.
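
    The prediction method itself, ordinary least squares on log-transformed annual counts extrapolated and back-transformed, can be sketched with fabricated numbers (a toy series declining 4% per year, NOT the surveillance data):

```python
import math

# Hypothetical annual case counts for 2000-2010, declining 4% per year.
years = list(range(2000, 2011))
counts = [16000.0 * 0.96 ** (y - 2000) for y in years]

# Ordinary least squares on log(counts): the slope is the log of the
# annual decline factor, so exp(slope) recovers that factor.
logs = [math.log(c) for c in counts]
n = len(years)
xbar = sum(years) / n
ybar = sum(logs) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(years, logs))
         / sum((x - xbar) ** 2 for x in years))
intercept = ybar - slope * xbar

pred_2020 = math.exp(intercept + slope * 2020)   # back-transformed forecast
```

    A 95% prediction interval, as reported in the study, would additionally use the regression's residual variance and the extrapolation distance; that step is omitted here.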

  7. Problem Based Learning Technique and Its Effect on Acquisition of Linear Programming Skills by Secondary School Students in Kenya

    ERIC Educational Resources Information Center

    Nakhanu, Shikuku Beatrice; Musasia, Amadalo Maurice

    2015-01-01

    The topic Linear Programming is included in the compulsory Kenyan secondary school mathematics curriculum at form four. The topic provides skills for determining best outcomes in a given mathematical model involving some linear relationship. This technique has found application in business, economics as well as various engineering fields. Yet many…

  8. Development of Regional Supply Functions and a Least-Cost Model for Allocating Water Resources in Utah: A Parametric Linear Programming Approach.

    DTIC Science & Technology

    SYSTEMS ANALYSIS, * WATER SUPPLIES, MATHEMATICAL MODELS, OPTIMIZATION, ECONOMICS, LINEAR PROGRAMMING, HYDROLOGY, REGIONS, ALLOCATIONS, RESTRAINT, RIVERS, EVAPORATION, LAKES, UTAH, SALVAGE, MINES(EXCAVATIONS).

  9. BIODEGRADATION PROBABILITY PROGRAM (BIODEG)

    EPA Science Inventory

    The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...

  10. The Use of Linear Programming for Prediction.

    ERIC Educational Resources Information Center

    Schnittjer, Carl J.

    The purpose of the study was to develop a linear programming model to be used for prediction, test the accuracy of the predictions, and compare the accuracy with that produced by curvilinear multiple regression analysis. (Author)

  11. Acceleration of genetic gain in cattle by reduction of generation interval.

    PubMed

    Kasinathan, Poothappillai; Wei, Hong; Xiang, Tianhao; Molina, Jose A; Metzger, John; Broek, Diane; Kasinathan, Sivakanthan; Faber, David C; Allan, Mark F

    2015-03-02

    Genomic selection (GS) approaches, in combination with reproductive technologies, are revolutionizing the design and implementation of breeding programs in livestock species, particularly in cattle. GS leverages genomic readouts to provide estimates of breeding value early in the life of animals. However, the capacity of these approaches for improving genetic gain in breeding programs is limited by generation interval, the average age of an animal when replacement progeny are born. Here, we present a cost-effective approach that combines GS with reproductive technologies to reduce generation interval by rapidly producing high genetic merit calves.

  12. An interval logic for higher-level temporal reasoning

    NASA Technical Reports Server (NTRS)

    Schwartz, R. L.; Melliar-Smith, P. M.; Vogt, F. H.; Plaisted, D. A.

    1983-01-01

    Prior work explored temporal logics, based on classical modal logics, as a framework for specifying and reasoning about concurrent programs, distributed systems, and communications protocols, and reported on efforts using temporal reasoning primitives to express very high level abstract requirements that a program or system is to satisfy. Based on experience with those primitives, this report describes an Interval Logic that is more suitable for expressing such higher level temporal properties. The report provides a formal semantics for the Interval Logic, and several examples of its use. A description of decision procedures for the logic is also included.

  13. Liver function testing on the Abaxis Piccolo Xpress: Use in Ebola virus disease protocols.

    PubMed

    Owen, William E; Caron, Justin E; Genzen, Jonathan R

    2015-06-15

    Laboratories that choose a point-of-care approach for liver function testing in patients undergoing evaluation for Ebola virus disease (EVD) have few options to choose from. The primary objective of this study was to conduct a performance characterization of a Clinical Laboratory Improvement Amendments (CLIA)-waived liver function panel on the Abaxis Piccolo® Xpress chemistry analyzer. The secondary objectives were to evaluate multiple specimen types, characterize whole blood specimen stability, and validate disposable exact transfer pipettes. Our final objective was to assess instrument airflow from a biosafety perspective. An instrument performance characterization, including precision, linearity, accuracy, reference interval verification, and specimen type evaluation, was conducted using Liver Panel Plus reagent discs on the Piccolo® Xpress. All assays demonstrated acceptable linearity (slopes, 0.938-1.061; observed error, 0.8-6.3%). Assay precision was 0.0-3.6% (%CV; within-day studies) and 0.9-5.6% (between-day studies). Method comparison experiments (versus Roche cobas c502/c702 chemistry analyzers) showed excellent correlation for most assays, although a few notable differences were observed (Piccolo versus Roche): alkaline phosphatase, -18.6%; amylase, -29.0%; total bilirubin, +0.3 mg/dl. Pre-programmed reference intervals were verified except for alkaline phosphatase (male and female) and alanine aminotransferase (female), for which greater than 10% of results fell below the programmed ranges. Piccolo instrument results were largely consistent across specimen types tested (lithium-heparin whole blood, lithium-heparin plasma, and serum), although some statistical differences were observed for aspartate aminotransferase, gamma glutamyltransferase, and total protein. Whole blood time course studies demonstrated that some analytes (albumin, amylase, and total protein) showed remarkable stability, while others (such as aspartate aminotransferase) showed a slight trend toward decreased activity over time. Exact volume transfer pipettes provided an effective disposable option for disc loading. Finally, airflow studies suggested that, in the context of EVD protocols, instrument placement in a biosafety level (BSL) 2 cabinet or greater is justified. Given its analytical performance and ease of operation, the Piccolo Xpress was transferred to a BSL-2 cabinet in our BSL-3 suite for use in our hospital's diagnostic protocol for providing liver function testing in patients undergoing evaluation for EVD. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Synthesizing Dynamic Programming Algorithms from Linear Temporal Logic Formulae

    NASA Technical Reports Server (NTRS)

    Rosu, Grigore; Havelund, Klaus

    2001-01-01

    The problem of testing a linear temporal logic (LTL) formula on a finite execution trace of events, generated by an executing program, occurs naturally in runtime analysis of software. We present an algorithm which takes an LTL formula and generates an efficient dynamic programming algorithm. The generated algorithm tests whether the LTL formula is satisfied by a finite trace of events given as input. The generated algorithm runs in linear time, its constant depending on the size of the LTL formula. The memory needed is constant, also depending on the size of the formula.
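
    The underlying dynamic program, one backwards pass over the trace keeping a boolean per position for each subformula, can be sketched as a direct interpreter. This is a simplified illustration with a hypothetical tuple encoding and only a few operators; the paper instead generates a specialized algorithm for each formula:

```python
def ltl_holds(formula, trace):
    """Check a finite-trace LTL formula by dynamic programming.
    formula: nested tuples, e.g. ('G', ('->', 'p', ('F', 'q')));
    trace: list of sets of atomic propositions true at each step."""
    n = len(trace)

    def table(f):
        # Boolean array: table(f)[i] iff the suffix trace[i:] satisfies f.
        if isinstance(f, str):                      # atomic proposition
            return [f in trace[i] for i in range(n)]
        op = f[0]
        if op == 'not':
            return [not v for v in table(f[1])]
        if op == '->':
            a, b = table(f[1]), table(f[2])
            return [(not x) or y for x, y in zip(a, b)]
        if op == 'F':                               # eventually
            a, out = table(f[1]), [False] * n
            for i in range(n - 1, -1, -1):          # fill back to front
                out[i] = a[i] or (i + 1 < n and out[i + 1])
            return out
        if op == 'G':                               # always
            a, out = table(f[1]), [False] * n
            for i in range(n - 1, -1, -1):
                out[i] = a[i] and (i + 1 == n or out[i + 1])
            return out
        raise ValueError('unknown operator: %r' % (op,))

    return table(formula)[0] if n else True

f = ('G', ('->', 'p', ('F', 'q')))                  # every p is followed by q
ltl_holds(f, [{'p'}, {'q'}, {'p', 'q'}])            # -> True
```

    Each subformula contributes one length-n array, so the pass is linear in the trace with a constant depending on formula size, matching the abstract; the generated algorithms described in the record additionally keep memory constant rather than storing whole arrays.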

  15. On the stability and instantaneous velocity of grasped frictionless objects

    NASA Technical Reports Server (NTRS)

    Trinkle, Jeffrey C.

    1992-01-01

    A quantitative test for form closure valid for any number of contact points is formulated as a linear program, the optimal objective value of which provides a measure of how far a grasp is from losing form closure. Another contribution of the study is the formulation of a linear program whose solution yields the same information as the classical approach. The benefit of the formulation is that explicit testing of all possible combinations of contact interactions can be avoided by the algorithm used to solve the linear program.

  16. A novel recurrent neural network with finite-time convergence for linear programming.

    PubMed

    Liu, Qingshan; Cao, Jinde; Chen, Guanrong

    2010-11-01

    In this letter, a novel recurrent neural network based on the gradient method is proposed for solving linear programming problems. Finite-time convergence of the proposed neural network is proved by using the Lyapunov method. Compared with the existing neural networks for linear programming, the proposed neural network is globally convergent to exact optimal solutions in finite time, which is remarkable and rare in the literature of neural networks for optimization. Some numerical examples are given to show the effectiveness and excellent performance of the new recurrent neural network.

  17. Investigating the correlation between paediatric stride interval persistence and gross energy expenditure.

    PubMed

    Fairley, Jillian A; Sejdić, Ervin; Chau, Tom

    2010-02-26

    Stride interval persistence, a term used to describe the correlation structure of stride interval time series, is thought to provide insight into neuromotor control, though its exact clinical meaning has not yet been realized. Since human locomotion is shaped by energy efficient movements, it has been hypothesized that stride interval dynamics and energy expenditure may be inherently tied, both having demonstrated similar sensitivities to age, disease, and pace-constrained walking. This study tested for correlations between stride interval persistence and measures of energy expenditure including mass-specific gross oxygen consumption per minute (VO₂ per minute), mass-specific gross oxygen cost per meter (VO₂ per meter), and heart rate (HR). Metabolic and stride interval data were collected from 30 asymptomatic children who completed one 10-minute walking trial under each of the following conditions: (i) overground walking, (ii) hands-free treadmill walking, and (iii) handrail-supported treadmill walking. Stride interval persistence was not significantly correlated with VO₂ per minute (p > 0.32), VO₂ per meter (p > 0.18), or HR (p > 0.56). No simple linear dependence exists between stride interval persistence and measures of gross energy expenditure in asymptomatic children when walking overground and on a treadmill.

  18. Characterising non-linear dynamics in nocturnal breathing patterns of healthy infants using recurrence quantification analysis.

    PubMed

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2013-05-01

    Breathing dynamics vary between infant sleep states, and are likely to exhibit non-linear behaviour. This study applied the non-linear analytical tool recurrence quantification analysis (RQA) to periods of 400 breath intervals during REM and N-REM sleep, and then across the night using an overlapping moving window. The RQA variables differed between sleep states, with REM radius 150% greater than N-REM radius, and REM laminarity 79% greater than N-REM laminarity. RQA allowed the observation of temporal variations in non-linear breathing dynamics across a night's sleep at 30 s resolution, and provides a basis for quantifying changes in complex breathing dynamics with physiology and pathology. Copyright © 2013 Elsevier Ltd. All rights reserved.
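
    The simplest RQA quantity, the recurrence rate, can be sketched for a one-dimensional series as follows (a toy illustration: full RQA typically embeds the series in a delay space, and measures such as determinism and laminarity are then derived from diagonal and vertical line structures in the recurrence plot):

```python
def recurrence_rate(series, radius):
    """Fraction of sample pairs (i, j) whose values lie within `radius`
    of each other -- i.e., the density of the recurrence plot.
    Self-pairs (i == j) are included here for simplicity."""
    n = len(series)
    hits = sum(abs(series[i] - series[j]) <= radius
               for i in range(n) for j in range(n))
    return hits / (n * n)

recurrence_rate([0, 0, 1, 1], 0.5)   # -> 0.5
```

    Sliding this computation over overlapping windows, as the record describes, yields the recurrence measures as functions of time through the night.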

  19. Associations between Mycobacterium paratuberculosis sero-status, milk quality parameters, and reproduction in dairy cows.

    PubMed

    Pesqueira, María N; Factor, Camino; Mato, Ivan; Sanjuán, María L; Macias, Laura; Eiras, Carmen; Arnaiz, Ignacio; Camino, Fernando; Yus, Eduardo; Diéguez, Francisco J

    2015-01-01

    The objective of this study was to investigate the influence of Mycobacterium avium subsp. paratuberculosis (MAP) sero-status of dairy cows on different milk production variables and reproductive traits. The study was carried out on 40 herds from the region of Galicia (North-West Spain). These herds were randomly selected from a larger group that had taken part in a voluntary paratuberculosis control program since 2005, which involves regular serum sampling of every adult animal to run antibody-ELISA tests. Milk production and reproductive data were obtained from the "Dairy Herd Improvement Program (DHIP) of Galicia". All the gathered data were analysed using a linear regression model. Results indicated that there was no significant effect of MAP sero-status on individual milk production variables. However, a significant difference was observed for the calving-to-first-insemination interval, with an average increase of 14 days in positive animals compared to negative animals. It has to be taken into consideration that the paratuberculosis status was defined only by the serological status. Since MAP-infected animals may or may not have antibodies, truly infected animals can also be included in the sero-negative group, which may bias the results.

  20. A comparison of a behavioral weight loss program to a stress management program: A pilot randomized controlled trial.

    PubMed

    Webber, Kelly H; Casey, Erin M; Mayes, Lindsey; Katsumata, Yuriko; Mellin, Laurel

    2016-01-01

    This study compared a behavioral weight loss program (BWL) with a stress management-based program, Emotional Brain Training (EBT), on weight loss, blood pressure, depression, perceived stress, diet, and physical activity. Subjects with a body mass index (BMI) of >28 and <45 kg/m(2) were recruited in Lexington, Kentucky in January 2014 and randomized to BWL or EBT for a 20-week intervention. Of those recruited, 49 participants were randomized to EBT or BWL. Randomization and allocation to group were performed using SPSS software. Weight, blood pressure, depression, perceived stress, dietary intake, and physical activity were measured at baseline, 10 weeks, and 20 weeks. Linear models for change over time were fit to calculate 95% confidence intervals of intervention effects. BWL produced greater changes in BMI than EBT at both 10 (P = 0.02) and 20 wk (P = 0.03). At 10 wk, both EBT and BWL improved BMI, systolic blood pressure, depression, and perceived stress (P < 0.05). BWL also improved diastolic blood pressure (P = 0.005). At 20 wk, EBT maintained improvements in BMI, systolic blood pressure, depression, and perceived stress, while BWL maintained improvements only in BMI and depression (P < 0.05). BWL produced greater weight loss than EBT; however, EBT produced sustained improvements in stress, depression, and systolic blood pressure. A combination of the two approaches should be explored. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Large-scale linear programs in planning and prediction.

    DOT National Transportation Integrated Search

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...

  2. Computer-aided linear-circuit design.

    NASA Technical Reports Server (NTRS)

    Penfield, P.

    1971-01-01

    Usually computer-aided design (CAD) refers to programs that analyze circuits conceived by the circuit designer. Among the services such programs should perform are direct network synthesis, analysis, optimization of network parameters, formatting, storage of miscellaneous data, and related calculations. The program should be embedded in a general-purpose conversational language such as BASIC, JOSS, or APL. Such a program is MARTHA, a general-purpose linear-circuit analyzer embedded in APL.

  3. Time to adapt exercise training regimens in pulmonary rehabilitation – a review of the literature

    PubMed Central

    Lee, Annemarie L; Holland, Anne E

    2014-01-01

    Exercise intolerance, exertional dyspnea, reduced health-related quality of life, and acute exacerbations are features characteristic of chronic obstructive pulmonary disease (COPD). Patients with a primary diagnosis of COPD often report comorbidities and other secondary manifestations, which diversifies the clinical presentation. Pulmonary rehabilitation that includes whole body exercise training is a critical part of management, and core programs involve endurance and resistance training for the upper and lower limbs. Improvement in maximal and submaximal exercise capacity, dyspnea, fatigue, health-related quality of life, and psychological symptoms are outcomes associated with exercise training in pulmonary rehabilitation, irrespective of the clinical state in which it is commenced. There may be benefits for the health care system as well as the individual patient, with fewer exacerbations and subsequent hospitalization reported with exercise training. The varying clinical profile of COPD may direct the need for modification to traditional training strategies for some patients. Interval training, one-legged cycling (partitioning) and non-linear periodized training appear to be equally or more effective than continuous training. Inspiratory muscle training may have a role as an adjunct to whole body training in selected patients. The benefits of balance training are also emerging. Strategies to ensure that health enhancing behaviors are adopted and maintained are essential. These may include training for an extended duration, alternative environments to undertake the initial program, maintenance programs following initial exercise training, program repetition, and incorporation of approaches to address behavioral change. This may be complemented by methods designed to maximize uptake and completion of a pulmonary rehabilitation program. PMID:25419125

  5. Delay-Dependent Stability Criterion for Bidirectional Associative Memory Neural Networks with Interval Time-Varying Delays

    NASA Astrophysics Data System (ADS)

    Park, Ju H.; Kwon, O. M.

    In this letter, the global asymptotic stability of bidirectional associative memory (BAM) neural networks with delays is investigated. The delay is assumed to be time-varying and to belong to a given interval. A novel stability criterion is presented based on the Lyapunov method. The criterion is expressed in terms of a linear matrix inequality (LMI), which can be solved easily by various optimization algorithms. Two numerical examples illustrate the effectiveness of the new result.
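
    The delay-dependent LMI criterion itself requires a semidefinite-programming solver. As a minimal, hedged illustration of the Lyapunov machinery it rests on, the delay-free case reduces to a linear matrix equation that SciPy solves in closed form (the matrix A below is made up): A is asymptotically stable iff the solution P of AᵀP + PA = -I is symmetric positive definite.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Delay-free Lyapunov stability test for an invented system matrix A.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])

# solve_continuous_lyapunov(a, q) solves a @ x + x @ a.T = q,
# so passing A.T yields A.T @ P + P @ A = -I.
P = solve_continuous_lyapunov(A.T, -np.eye(2))
stable = np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0)
print(stable)
```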

  6. Chirp Scaling Algorithms for SAR Processing

    NASA Technical Reports Server (NTRS)

    Jin, M.; Cheng, T.; Chen, M.

    1993-01-01

    The chirp scaling SAR processing algorithm is both accurate and efficient. Successful implementation requires proper selection of the interval of output samples, which is a function of the chirp interval, signal sampling rate, and signal bandwidth. Analysis indicates that for both airborne and spaceborne SAR applications in the slant range domain, a linear chirp scaling is sufficient. To perform a nonlinear interpolation, such as producing ground-range SAR images, one can use the nonlinear chirp scaling interpolator presented in this paper.

  7. Anderson localisation and optical-event horizons in rogue-soliton generation.

    PubMed

    Saleh, Mohammed F; Conti, Claudio; Biancalana, Fabio

    2017-03-06

    We unveil the relation between the linear Anderson localisation process and nonlinear modulation instability. Anderson localised modes are formed in certain temporal intervals due to the random background noise. Such localised modes seed the formation of solitary waves that will appear during the modulation instability process at those preferred intervals. Afterwards, optical-event horizon effects between dispersive waves and solitons produce an artificial collective acceleration that favours the collision of solitons, which could eventually lead to a rogue-soliton generation.

  8. Risk of Interval Cancer in Fecal Immunochemical Test Screening Significantly Higher During the Summer Months: Results from the National Cancer Screening Program in Korea.

    PubMed

    Cha, Jae Myung; Suh, Mina; Kwak, Min Seob; Sung, Na Young; Choi, Kui Son; Park, Boyoung; Jun, Jae Kwan; Hwang, Sang-Hyun; Lee, Do-Hoon; Kim, Byung Chang; Lee, You Kyoung; Han, Dong Soo

    2018-04-01

    This study aimed to evaluate the impact of seasonal variations in climate on the performance of the fecal immunochemical test (FIT) in screening for colorectal cancer in the National Cancer Screening Program in Korea. Data were extracted from the National Cancer Screening Program databases for participants who underwent FIT between 2009 and 2010. We compared positivity rates, cancer detection rates, interval cancer rates, positive predictive value, sensitivity, and specificity for FIT during the spring, summer, fall, and winter seasons in Korea. In total, 4,788,104 FIT results were analyzed. FIT positivity rate was lowest during the summer months. In the summer, the positive predictive value of FIT was about 1.1 times (adjusted odds ratio (aOR) 1.08, 95% confidence interval (CI) 1.00-1.16) higher in the overall FIT group and about 1.3 times (aOR 1.29, 95% CI 1.10-1.50) higher in the quantitative FIT group, compared to those in the other seasons. Cancer detection rates, however, were similar regardless of season. Interval cancer risk was significantly higher in the summer for both the overall FIT group (aOR 1.16, 95% CI 1.07-1.27) and the quantitative FIT group (aOR 1.31, 95% CI 1.12-1.52). In addition, interval cancers in the rectum and distal colon were more frequently detected in the summer and autumn than in the winter. The positivity rate of FIT was lower in the summer, and the performance of the FIT screening program was influenced by seasonal variations in Korea. These results suggest that more efforts to reduce interval cancer during the summer are needed in population-based screening programs using FIT, particularly in countries with high ambient temperatures.

  9. The potential effectiveness of the nutrition improvement program on infant and young child feeding and nutritional status in the Northwest and Southwest regions of Cameroon, Central Africa.

    PubMed

    Reinsma, Kate; Nkuoh, Godlove; Nshom, Emmanuel

    2016-11-15

    Despite the recent international focus on maternal and child nutrition, little attention is paid to nutrition capacity development. Although infant feeding counselling by health workers increases caregivers' knowledge, and improves breastfeeding, complementary feeding, and children's linear growth, most of the counselling in sub-Saharan Africa is primarily conducted by nurses or volunteers, and little is done to develop capacity for nutrition at the professional, organizational, or systemic levels. The Cameroon Baptist Convention Health Services Nutrition Improvement Program (NIP) has integrated a cadre of nutrition counselors into prevention of mother-to-child transmission of HIV programs, infant welfare clinics, and antenatal clinics to improve infant and young child feeding practices (IYCF). The study objective was to evaluate the effects of NIP's infant feeding counselors on exclusive breastfeeding (EBF), complementary feeding (CF), and children's linear growth. A cross-sectional evaluation design was used. Using systematic random sampling, caregivers were recruited from NIP sites (n = 359) and non-NIP sites (n = 415) from Infant Welfare Clinics (IWCs) in the Northwest (NWR) and Southwest Regions (SWR) of Cameroon between October 2014 and April 2015. Differences in EBF and CF practices and children's linear growth between NIP and non-NIP sites were determined using chi-square and multiple logistic regression. After adjusting for differences in religion, occupation, and number of months planning to breastfeed, children were almost seven times (Odds Ratio [OR]: 6.9; 95% Confidence Interval [CI]: 2.30, 21.09; β = 1.94) more likely to be exclusively breastfed at NIP sites compared to non-NIP sites. 
After adjusting for differences in occupation, religion, number of months planning to breastfeed, rural environment, economic status, attending other Infant Welfare Clinics, and non-biological caregiver, children were five times more likely to be stunted at non-NIP sites compared to NIP sites. Training a cadre of nutrition counselors is one approach towards increasing nutrition human resources to implement interventions that improve maternal and child nutrition. The study design did not allow for conclusive results, but rather suggests that IYCF counseling provided by nutrition counselors was effective in increasing EBF and reducing the risk of stunting in children 6-8 months of age.

  10. Planning Student Flow with Linear Programming: A Tunisian Case Study.

    ERIC Educational Resources Information Center

    Bezeau, Lawrence

    A student flow model in linear programming format, designed to plan the movement of students into secondary and university programs in Tunisia, is described. The purpose of the plan is to determine a sufficient number of graduating students that would flow back into the system as teachers or move into the labor market to meet fixed manpower…

  11. Pulmonary and Critical Care In-Service Training Examination Score as a Predictor of Board Certification Examination Performance.

    PubMed

    Kempainen, Robert R; Hess, Brian J; Addrizzo-Harris, Doreen J; Schaad, Douglas C; Scott, Craig S; Carlin, Brian W; Shaw, Robert C; Duhigg, Lauren; Lipner, Rebecca S

    2016-04-01

    Most trainees in combined pulmonary and critical care medicine fellowship programs complete in-service training examinations (ITEs) that test knowledge in both disciplines. Whether ITE scores predict performance on the American Board of Internal Medicine Pulmonary Disease Certification Examination and Critical Care Medicine Certification Examination is unknown. To determine whether pulmonary and critical care medicine ITE scores predict performance on subspecialty board certification examinations independently of trainee demographics, program director competency ratings, fellowship program characteristics, and prior medical knowledge assessments. First- and second-year fellows who were enrolled in the study between 2008 and 2012 completed a questionnaire encompassing demographics and fellowship training characteristics. These data and ITE scores were matched to fellows' subsequent scores on subspecialty certification examinations, program director ratings, and previous scores on their American Board of Internal Medicine Internal Medicine Certification Examination. Multiple linear regression and logistic regression were used to identify independent predictors of subspecialty certification examination scores and likelihood of passing the examinations, respectively. Of eligible fellows, 82.4% enrolled in the study. The ITE score for second-year fellows was matched to their certification examination scores, which yielded 1,484 physicians for pulmonary disease and 1,331 for critical care medicine. Second-year fellows' ITE scores (β = 0.24, P < 0.001) and Internal Medicine Certification Examination scores (β = 0.49, P < 0.001) were the strongest predictors of Pulmonary Disease Certification Examination scores, and were the only significant predictors of passing the examination (ITE odds ratio, 1.12 [95% confidence interval, 1.07-1.16]; Internal Medicine Certification Examination odds ratio, 1.01 [95% confidence interval, 1.01-1.02]). 
Similar results were obtained for predicting Critical Care Medicine Certification Examination scores and for passing the examination. The predictive value of ITE scores among first-year fellows on the subspecialty certification examinations was comparable to second-year fellows' ITE scores. The Pulmonary and Critical Care Medicine ITE score is an independent, and stronger, predictor of subspecialty certification examination performance than fellow demographics, program director competency ratings, and fellowship characteristics. These findings support the use of the ITE to identify the learning needs of fellows as they work toward subspecialty board certification.

  12. Linear decomposition approach for a class of nonconvex programming problems.

    PubMed

    Shen, Peiping; Wang, Chunfeng

    2017-01-01

    This paper presents a linear decomposition approach for a class of nonconvex programming problems by dividing the input space into polynomially many grids. It shows that under certain assumptions the original problem can be transformed and decomposed into a polynomial number of equivalent linear programming subproblems. By solving a series of linear programming subproblems corresponding to those grid points, we can obtain a near-optimal solution of the original problem. Compared to existing results in the literature, the proposed algorithm does not require the assumptions of quasi-concavity and differentiability of the objective function, and it offers a distinct approach that solves the problem with reduced running time.

  13. The effects of time of disease occurrence, milk yield, and body condition on fertility of dairy cows.

    PubMed

    Loeffler, S H; de Vries, M J; Schukken, Y H

    1999-12-01

    The associations of disease occurrence, milk yield, and body condition score with conception risk after first artificial insemination (AI) were analyzed in an observational study on a convenience sample of 43 farms participating in a herd health program. Data were taken from 9369 lactations of 4382 cows inseminated between 20 and 180 d in milk from 1990 to 1996. Two logistic regression models, one containing data from all lactations and one containing a subset of 1762 lactations with body condition scoring, were used to determine pregnancy risk at first AI. Herd deviation in test-day milk yield, body condition score loss, and milk fat-to-protein ratio changes in early lactation were significant predictors of pregnancy risk, independent of disease, days in milk, farm, and seasonal factors. Three different methods of disease parameterization (incidence rates, binomial classes dependent on the interval in days since last occurrence with respect to AI, and a linear variable weighted for this interval) produced similar results. Metritis, cystic ovarian disease, lameness, and mastitis gave odds ratios for pregnancy risk ranging from 0.35 to 1.15, largely dependent on the interval in days from final disease occurrence to first AI. Displaced abomasum, milk fever, and retained fetal membranes resulted in odds ratios for pregnancy risk of 0.25, 0.85, and 0.55, respectively. These diseases showed little relationship between fertility and the number of days since last occurrence. Results of this study confirm the negative effects of milk yield, body condition score loss, and disease on dairy cow fertility. The effects of some diseases on first-service conception were strongly dependent on the interval since last disease occurrence. This was especially true for clinical mastitis, which had an extremely weak effect on conception when occurring prior to AI but was associated with a > 50% reduction in pregnancy risk when occurring in the 3 wk directly after AI.

  14. Toward the prevention of childhood undernutrition: diet diversity strategies using locally produced food can overcome gaps in nutrient supply.

    PubMed

    Parlesak, Alexandr; Geelhoed, Diederike; Robertson, Aileen

    2014-06-01

    Chronic undernutrition is prevalent in Mozambique, where children suffer from stunting, vitamin A deficiency, anemia, and other nutrition-related disorders. Complete diet formulation products (CDFPs) are increasingly promoted to prevent chronic undernutrition. The objective was to investigate, using linear programming, whether diet diversification with local foods should be prioritized in order to reduce the prevalence of chronic undernutrition. Market prices of local foods were collected in Tete City, Mozambique. Linear programming was applied to calculate the cheapest possible fully nutritious food basket (FNFB) by stepwise addition of micronutrient-dense local foods. Only the top quintile of Mozambican households, using average expenditure data, could afford the FNFB designed by linear programming from a spectrum of local standard foods. Adding beef heart or liver, dried fish, and fresh moringa leaves before applying linear programming decreased the price by a factor of up to 2.6. As a result, the top three quintiles could afford the FNFB optimized using both the diversification strategy and linear programming. When added to the baskets, CDFPs were unable to close the micronutrient gaps without greatly exceeding recommended energy intakes, owing to their high ratio of energy to micronutrient density. Dietary diversification strategies using local, low-cost, nutrient-dense foods can meet all micronutrient recommendations and overcome all micronutrient gaps. The success of linear programming in identifying a low-cost FNFB depends entirely on the investigators' ability to select appropriate micronutrient-dense foods. CDFPs added to food baskets are unable to overcome micronutrient gaps without greatly exceeding recommended energy intake.
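
    The FNFB calculation is a classic diet-problem linear program. As a minimal sketch (the foods, prices, and nutrient values below are invented for illustration, not the study's Mozambican data), SciPy's `linprog` minimises basket cost subject to nutrient recommendations:

```python
import numpy as np
from scipy.optimize import linprog

prices = np.array([0.30, 0.80, 2.50])       # cost per 100 g: maize, beans, dried fish
nutrients = np.array([[350, 340, 300],       # energy (kcal per 100 g)
                      [9.0, 21.0, 62.0],     # protein (g per 100 g)
                      [2.5, 5.1, 6.8]])      # iron (mg per 100 g)
needs = np.array([2000, 50, 15])             # daily recommendations

# linprog minimises prices @ x subject to A_ub @ x <= b_ub;
# flipping signs turns "nutrients @ x >= needs" into that form.
res = linprog(c=prices, A_ub=-nutrients, b_ub=-needs, bounds=[(0, None)] * 3)
print(res.status, round(res.fun, 2))         # status 0 = optimal; fun = basket cost
```

    Diversification enters by appending columns (foods) to `prices` and `nutrients` and re-solving; a cheaper optimum shows the added food pays its way.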

  15. Improved Equivalent Linearization Implementations Using Nonlinear Stiffness Evaluation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2001-01-01

    This report documents two new implementations of equivalent linearization for solving geometrically nonlinear random vibration problems of complicated structures. The implementations are given the acronym ELSTEP, for "Equivalent Linearization using a STiffness Evaluation Procedure." Both implementations of ELSTEP are fundamentally the same in that they use a novel nonlinear stiffness evaluation procedure to numerically compute otherwise inaccessible nonlinear stiffness terms from commercial finite element programs. The commercial finite element program MSC/NASTRAN (NASTRAN) was chosen as the core of ELSTEP. The FORTRAN implementation calculates the nonlinear stiffness terms and performs the equivalent linearization analysis outside of NASTRAN. The Direct Matrix Abstraction Program (DMAP) implementation performs these operations within NASTRAN. Both provide nearly identical results. Within each implementation, two error minimization approaches for the equivalent linearization procedure are available - force and strain energy error minimization. Sample results for a simply supported rectangular plate are included to illustrate the analysis procedure.

  16. Linear combination reading program for capture gamma rays

    USGS Publications Warehouse

    Tanner, Allan B.

    1971-01-01

    This program computes a weighting function, Qj, which gives a scalar output value of unity when applied to the spectrum of a desired element and a minimum value (considering statistics) when applied to spectra of materials not containing the desired element. Intermediate values are obtained for materials containing the desired element, in proportion to the amount of the element they contain. The program is written in the BASIC language in a format specific to the Hewlett-Packard 2000A Time-Sharing System, and is an adaptation of an earlier program for linear combination reading for X-ray fluorescence analysis (Tanner and Brinkerhoff, 1971). Following the program is a sample run from a study of the application of the linear combination technique to capture-gamma-ray analysis for calcium (report in preparation).
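
    The weighting-function idea can be sketched in a few lines. With hypothetical channel spectra (all counts below are invented), a minimum-norm weighting vector Q giving unit response to the desired element and zero response to interfering materials is a pseudo-inverse solution:

```python
import numpy as np

# Hypothetical 5-channel gamma spectra (counts per channel).
s_target = np.array([1.0, 4.0, 9.0, 4.0, 1.0])   # desired element
s_bkg1   = np.array([5.0, 4.0, 3.0, 2.0, 1.0])   # interfering material 1
s_bkg2   = np.array([1.0, 1.0, 1.0, 1.0, 1.0])   # interfering material 2

# Require Q . s_target = 1 and Q . s_bkg = 0; with more channels than
# spectra the minimum-norm Q comes from the Moore-Penrose pseudo-inverse.
S = np.vstack([s_target, s_bkg1, s_bkg2])
Q = np.linalg.pinv(S) @ np.array([1.0, 0.0, 0.0])

print(np.allclose(S @ Q, [1.0, 0.0, 0.0]))       # unit/zero responses hold
```

    Applying Q to a mixture spectrum then returns a value proportional to the amount of the desired element, as the abstract describes; the statistical weighting the original program adds is omitted here.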

  17. The program LOPT for least-squares optimization of energy levels

    NASA Astrophysics Data System (ADS)

    Kramida, A. E.

    2011-02-01

    The article describes a program that solves the least-squares optimization problem for finding the energy levels of a quantum-mechanical system based on a set of measured energy separations or wavelengths of transitions between those energy levels, as well as determining the Ritz wavelengths of transitions and their uncertainties. The energy levels are determined by solving the matrix equation of the problem, and the uncertainties of the Ritz wavenumbers are determined from the covariance matrix of the problem.
    Program summary
    Program title: LOPT
    Catalogue identifier: AEHM_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHM_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 19 254
    No. of bytes in distributed program, including test data, etc.: 427 839
    Distribution format: tar.gz
    Programming language: Perl v.5
    Computer: PC, Mac, Unix workstations
    Operating system: MS Windows (XP, Vista, 7), Mac OS X, Linux, Unix (AIX)
    RAM: 3 Mwords or more
    Word size: 32 or 64
    Classification: 2.2
    Nature of problem: The least-squares energy-level optimization problem, i.e., finding a set of energy level values that best fits the given set of transition intervals.
    Solution method: The solution of the least-squares problem is found by solving the corresponding linear matrix equation, where the matrix is constructed using a new method with variable substitution.
    Restrictions: A practical limitation on the size of the problem N is imposed by the execution time, which scales as N and depends on the computer.
    Unusual features: Properly rounds the resulting data and formats the output in a format suitable for viewing with spreadsheet editing software. Estimates numerical errors resulting from the limited machine precision.
    Running time: 1 s for N=100, or 60 s for N=400 on a typical PC.
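
    The core of such a least-squares level optimization can be sketched with NumPy. The levels, wavenumbers, and uncertainties below are invented for illustration; LOPT itself handles much larger systems and the full uncertainty bookkeeping:

```python
import numpy as np

# Three levels with E0 fixed at 0; four measured transition wavenumbers
# (cm^-1) with uncertainties. Unknowns: E1, E2.
obs = np.array([100.02, 250.01, 149.95, 150.03])   # E1-E0, E2-E0, E2-E1, E2-E1
unc = np.array([0.02, 0.02, 0.05, 0.05])

# Each design-matrix row maps (E1, E2) to one observed separation.
M = np.array([[ 1.0, 0.0],
              [ 0.0, 1.0],
              [-1.0, 1.0],
              [-1.0, 1.0]])

# Weighted least squares: scale each row by 1/uncertainty, then solve;
# the covariance of the fitted levels comes from the normal equations.
Mw = M / unc[:, None]
E, *_ = np.linalg.lstsq(Mw, obs / unc, rcond=None)
cov = np.linalg.inv(Mw.T @ Mw)
print(E.round(3))
```

    Ritz wavenumbers are then differences of fitted levels, with uncertainties read off `cov`.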

  18. Evaluating forest management policies by parametric linear programming

    Treesearch

    Daniel I. Navon; Richard J. McConnen

    1967-01-01

    An analytical and simulation technique, parametric linear programming, explores alternative conditions and devises an optimal management plan for each condition. Its application in solving policy-decision problems in the management of forest lands is illustrated in an example.

  19. Use of nonlinear programming to optimize performance response to energy density in broiler feed formulation.

    PubMed

    Guevara, V R

    2004-02-01

    A nonlinear programming optimization model that maximizes margin over feed cost in broiler feed formulation was developed and is described in this paper. The model identifies the optimal feed mix that maximizes profit margin. The optimum metabolizable energy level and performance were found using Excel Solver nonlinear programming. Data from an energy density study with broilers were fitted to quadratic equations to express weight gain, feed consumption, and the objective function (income over feed cost) in terms of energy density. Nutrient:energy ratio constraints were transformed into equivalent linear constraints. National Research Council nutrient requirements and feeding programs were used for examining changes in variables. The nonlinear programming feed formulation method was used to illustrate the effects of changes in different variables on the optimum energy density, performance, and profitability, and was compared with conventional linear programming. To demonstrate the capabilities of the model, I determined the impact of price variation. Prices for broiler, corn, fish meal, and soybean meal were increased and decreased by 25%. Formulations were identical in all other respects. Energy density, margin, and diet cost changed compared with the conventional linear programming formulation. This study suggests that nonlinear programming can be more useful than conventional linear programming for optimizing the performance response to energy density in broiler feed formulation, because an energy level does not need to be fixed in advance.
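
    The quadratic-response idea can be sketched without Excel Solver. All response data and prices below are invented, not the study's values: fit quadratics for gain and intake versus energy density, then search the margin over the energy range:

```python
import numpy as np

energy = np.array([2800, 2950, 3100, 3250, 3400])   # kcal ME/kg (hypothetical)
gain   = np.array([1.80, 1.95, 2.04, 2.07, 2.05])   # kg gain per bird
feed   = np.array([3.90, 3.85, 3.78, 3.70, 3.62])   # kg feed per bird

g = np.polyfit(energy, gain, 2)   # quadratic response curves
f = np.polyfit(energy, feed, 2)

broiler_price = 1.20                                 # $/kg live weight (made up)
def margin(e):
    # Hypothetical diet cost rising with energy density ($/kg feed).
    feed_cost = 0.10 + 0.00008 * e
    return broiler_price * np.polyval(g, e) - feed_cost * np.polyval(f, e)

# Grid search over the tested energy range for the margin-maximising density.
grid = np.linspace(2800, 3400, 601)
best = grid[np.argmax(margin(grid))]
print(round(best))
```

    A linear-programming formulation would instead fix the energy level and minimise cost; here the energy level is itself a decision variable, which is the point of the paper.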

  20. A study of the use of linear programming techniques to improve the performance in design optimization problems

    NASA Technical Reports Server (NTRS)

    Young, Katherine C.; Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    This project has two objectives. The first is to determine whether linear programming techniques can improve performance when handling design optimization problems with a large number of design variables and constraints relative to the feasible directions algorithm. The second purpose is to determine whether using the Kreisselmeier-Steinhauser (KS) function to replace the constraints with one constraint will reduce the cost of total optimization. Comparisons are made using solutions obtained with linear and non-linear methods. The results indicate that there is no cost saving using the linear method or in using the KS function to replace constraints.
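
    The Kreisselmeier-Steinhauser (KS) function mentioned above aggregates many constraints g_i <= 0 into a single smooth, conservative envelope of their maximum. A minimal sketch, with an arbitrary aggregation parameter rho and made-up constraint values:

```python
import numpy as np

def ks_aggregate(g, rho=50.0):
    """KS envelope: a smooth upper bound on max(g) that lets many
    constraints be replaced by the single constraint KS(g) <= 0."""
    g = np.asarray(g, dtype=float)
    m = g.max()                       # shift for numerical stability
    return m + np.log(np.sum(np.exp(rho * (g - m)))) / rho

g = np.array([-0.5, -0.1, -0.3])      # three satisfied constraints (made up)
print(ks_aggregate(g) >= g.max())     # True: KS bounds the true maximum
```

    As rho grows, KS(g) approaches max(g) from above (the gap is at most ln(n)/rho), so the single aggregated constraint becomes an increasingly tight stand-in for the full set.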

  1. A non-linear programming approach to the computer-aided design of regulators using a linear-quadratic formulation

    NASA Technical Reports Server (NTRS)

    Fleming, P.

    1985-01-01

    A design technique is proposed for linear regulators in which a feedback controller of fixed structure is chosen to minimize an integral quadratic objective function subject to the satisfaction of integral quadratic constraint functions. Application of a non-linear programming algorithm to this mathematically tractable formulation results in an efficient and useful computer-aided design tool. Particular attention is paid to computational efficiency and various recommendations are made. Two design examples illustrate the flexibility of the approach and highlight the special insight afforded to the designer.

  2. Measles Antibodies in Mother-Infant Dyads in Tianjin, China.

    PubMed

    Boulton, Matthew L; Wang, Xiexiu; Wagner, Abram L; Zhang, Ying; Carlson, Bradley F; Gillespie, Brenda W; Ding, Yaxing

    2017-11-27

    Many measles cases in Tianjin, China, occur in infants whose mothers were born after widespread vaccination programs. We assessed age-specific decreases in maternal measles antibodies in infants and examined maternal and infant characteristics in relation to infant antibody titers. Infant and mother dyads were enrolled from a sample of immunization clinics in all Tianjin districts. Participants' antibody titers were measured from dried blood spots. A multivariable log-linear model regressed infant antibody titers onto infant and mother characteristics. Among 551 infants aged ≤8 months, protective levels of measles antibodies were observed in infants whose mothers had measles titers ≥800 IU/mL (mean antibody titer, 542.5 IU/mL) or 400 to <800 IU/mL (mean, 202.2 IU/mL). Compared with infants whose mothers had no history of disease or vaccination, those with a history of disease had 1.60 times higher titers (95% confidence interval, 1.06-2.43). Limited vaccination programs in the 1980s have resulted in many Chinese women with inadequate protection against measles and an accordingly low efficiency of transplacental transmission to a fetus. Current vaccination programs, which target children aged 8 months through adolescence, may be ineffective in controlling transmission of measles to infants. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.

  3. A sequential linear optimization approach for controller design

    NASA Technical Reports Server (NTRS)

    Horta, L. G.; Juang, J.-N.; Junkins, J. L.

    1985-01-01

    A linear optimization approach with a simple real arithmetic algorithm is presented for reliable controller design and vibration suppression of flexible structures. Using first order sensitivity of the system eigenvalues with respect to the design parameters in conjunction with a continuation procedure, the method converts a nonlinear optimization problem into a maximization problem with linear inequality constraints. The method of linear programming is then applied to solve the converted linear optimization problem. The general efficiency of the linear programming approach allows the method to handle structural optimization problems with a large number of inequality constraints on the design vector. The method is demonstrated using a truss beam finite element model for the optimal sizing and placement of active/passive-structural members for damping augmentation. Results using both the sequential linear optimization approach and nonlinear optimization are presented and compared. The insensitivity to initial conditions of the linear optimization approach is also demonstrated.
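One linearized step of the kind described above, maximizing a first-order prediction of damping gain subject to move limits and a linear budget constraint, can be sketched with a standard LP solver. The sensitivity gradient, mass coefficients, and move limits below are made-up numbers, not values from the report:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical first-order eigenvalue sensitivities of damping w.r.t. three
# member sizing variables (assumed for illustration).
g = np.array([0.8, 0.3, 0.5])
a = np.array([[1.0, 2.0, 1.5]])      # mass per unit sizing change (assumed)
b = np.array([1.0])                  # allowed mass increase
bounds = [(-0.2, 0.2)] * 3           # move limits keep the linearization valid

# linprog minimizes, so negate g to maximize the predicted damping gain g @ dx.
res = linprog(-g, A_ub=a, b_ub=b, bounds=bounds)
dx = res.x                           # design step for this continuation cycle
```

In a continuation procedure, the sensitivities would be re-evaluated at the updated design and the step repeated.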

  4. Relationship between age and elite marathon race time in world single age records from 5 to 93 years

    PubMed Central

    2014-01-01

    Background The aims of the study were (i) to investigate the relationship between elite marathon race times and age in 1-year intervals by using the world single age records in marathon running from 5 to 93 years and (ii) to evaluate the sex difference in elite marathon running performance with advancing age. Methods World single age records in marathon running in 1-year intervals for women and men were analysed for changes across age using linear and non-linear regression analyses. Results The relationship between elite marathon race time and age was non-linear (i.e. polynomial regression 4th degree) for women and men. The curve was U-shaped, where performance improved from 5 to ~20 years. From 5 years to ~15 years, boys and girls performed very similarly. Between ~20 and ~35 years, performance was quite linear, but started to decrease at the age of ~35 years in a curvilinear manner with increasing age in both women and men. The sex difference increased non-linearly (i.e. polynomial regression 7th degree) from 5 to ~20 years, remained unchanged at ~20 min from ~20 to ~50 years and increased thereafter. The sex difference was lowest (7.5%, 10.5 min) at the age of 49 years. Conclusion Elite marathon race times improved from 5 to ~20 years, remained linear between ~20 and ~35 years, and started to increase at the age of ~35 years in a curvilinear manner with increasing age in both women and men. The sex difference in elite marathon race time increased non-linearly and was lowest at the age of ~49 years. PMID:25120915
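A 4th-degree polynomial fit of race time against age, as used in the study, can be reproduced on synthetic data; the toy U-shaped curve below only illustrates the regression method, not the actual records:

```python
import numpy as np

# Synthetic U-shaped race-time curve (minutes) over ages 5-93; the quadratic
# form and its minimum at age 27 are invented for illustration.
ages = np.arange(5, 94, 1.0)
times = 0.05 * (ages - 27.0) ** 2 + 130.0

coef = np.polyfit(ages, times, deg=4)      # 4th-degree polynomial regression
fitted = np.polyval(coef, ages)
best_age = ages[np.argmin(fitted)]         # age of fastest fitted time
```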

  5. A New Pattern of Getting Nasty Number in Graphical Method

    NASA Astrophysics Data System (ADS)

    Sumathi, P.; Indhumathi, N.

    2018-04-01

    This paper proposes a new technique for obtaining nasty numbers using the graphical method, and the technique is demonstrated for various linear programming problems. Some characterisations of nasty numbers are also discussed.

  6. Optimal blood glucose control in diabetes mellitus treatment using dynamic programming based on Ackerman’s linear model

    NASA Astrophysics Data System (ADS)

    Pradanti, Paskalia; Hartono

    2018-03-01

    Determination of the insulin injection dose in diabetes mellitus treatment can be considered as an optimal control problem. This article aims to simulate optimal blood glucose control for a patient with diabetes mellitus. The blood glucose regulation of a diabetic patient is represented by Ackerman’s Linear Model. This problem is then solved using the dynamic programming method. The desired blood glucose level is obtained by minimizing the performance index in Lagrange form. The results show that dynamic programming based on Ackerman’s Linear Model solves the problem quite well.
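The dynamic programming solution of a finite-horizon linear-quadratic problem can be sketched as a backward Riccati recursion. The 2-state system below merely stands in for Ackerman's glucose-insulin model; all matrices, the horizon, and the initial deviation are illustrative assumptions:

```python
import numpy as np

# Illustrative discrete-time linear model (assumed, not Ackerman's actual
# parameters): state = (glucose deviation, insulin deviation), input = dose.
A = np.array([[0.98, 0.02], [0.00, 0.95]])
B = np.array([[0.0], [0.1]])
Q = np.diag([1.0, 0.1])              # penalize glucose deviation most
R = np.array([[0.01]])               # penalize injection effort
N = 50                               # horizon length

# Dynamic programming: backward Riccati recursion for the stage gains.
P = Q.copy()
gains = []
for _ in range(N):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ A - A.T @ P @ B @ K
    gains.append(K)
gains.reverse()                      # gains[t] applies at stage t

# Simulate the closed loop: the glucose deviation is driven toward zero.
x = np.array([[5.0], [0.0]])
for K in gains:
    x = A @ x + B @ (-K @ x)
final_dev = float(abs(x[0, 0]))
```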

  7. SPAR reference manual. [for stress analysis

    NASA Technical Reports Server (NTRS)

    Whetstone, W. D.

    1974-01-01

    SPAR is a system of related programs which may be operated either in batch or demand (teletype) mode. Information exchange between programs is automatically accomplished through one or more direct access libraries, known collectively as the data complex. Card input is command-oriented, in free-field form. Capabilities available in the first production release of the system are fully documented, and include linear stress analysis, linear bifurcation buckling analysis, and linear vibrational analysis.

  8. Resource Allocation Modelling in Vocational Rehabilitation: A Prototype Developed with the Michigan and Rhode Island VR Agencies.

    ERIC Educational Resources Information Center

    Leff, H. Stephen; Turner, Ralph R.

    This report focuses on the use of linear programming models to address the issues of how vocational rehabilitation (VR) resources should be allocated in order to maximize program efficiency within given resource constraints. A general introduction to linear programming models is first presented that describes the major types of models available,…

  9. Specification for Teaching Machines and Programmes (Interchangeability of Programmes). Part 1, Linear Machines and Programmes.

    ERIC Educational Resources Information Center

    British Standards Institution, London (England).

    To promote interchangeability of teaching machines and programs, so that the user is not so limited in his choice of programs, the British Standards Institute has offered a standard. Part I of the standard deals with linear teaching machines and programs that make use of the roll or sheet methods of presentation. Requirements cover: spools,…

  10. Psychosocial and nonclinical factors predicting hospital utilization in patients of a chronic disease management program: a prospective observational study.

    PubMed

    Tran, Mark W; Weiland, Tracey J; Phillips, Georgina A

    2015-01-01

    Psychosocial factors such as marital status (odds ratio, 3.52; 95% confidence interval, 1.43-8.69; P = .006) and nonclinical factors such as outpatient nonattendances (odds ratio, 2.52; 95% confidence interval, 1.22-5.23; P = .013) and referrals made (odds ratio, 1.20; 95% confidence interval, 1.06-1.35; P = .003) predict hospital utilization for patients in a chronic disease management program. Along with optimizing patients' clinical condition by prescribed medical guidelines and supporting patient self-management, addressing psychosocial and nonclinical issues is important in attempting to avoid hospital utilization for people with chronic illnesses.

  11. Frequency assignments for HFDF receivers in a search and rescue network

    NASA Astrophysics Data System (ADS)

    Johnson, Krista E.

    1990-03-01

    This thesis applies a multiobjective linear programming approach to the problem of assigning frequencies to high frequency direction finding (HFDF) receivers in a search-and-rescue network in order to maximize the expected number of geolocations of vessels in distress. The problem is formulated as a multiobjective integer linear programming problem. The integrality of the solutions is guaranteed by the total unimodularity of the A-matrix. Two approaches are taken to solve the multiobjective linear programming problem: (1) the multiobjective simplex method as implemented in ADBASE; and (2) an iterative approach, in which the individual objective functions are weighted and combined in a single additive objective function. The resulting single-objective problem is expressed as a network programming problem and solved using SAS NETFLOW. The process is then repeated with different weightings for the objective functions. The solutions obtained from the multiobjective linear programs are evaluated using a FORTRAN program to determine which solution provides the greatest expected number of geolocations. This solution is then compared to the sample mean and standard deviation for the expected number of geolocations resulting from 10,000 random frequency assignments for the network.
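The iterative weighted-sum approach described above can be sketched on a toy two-objective LP; the objective coefficients and the shared capacity constraint are invented for illustration, not taken from the thesis:

```python
from scipy.optimize import linprog

# Two objectives to maximize (negated because linprog minimizes); the
# coefficients and the constraint x + y <= 10 are made-up example data.
c1 = [-3.0, -1.0]          # maximize 3x + y
c2 = [-1.0, -4.0]          # maximize x + 4y
A_ub = [[1.0, 1.0]]        # shared capacity constraint
b_ub = [10.0]
bounds = [(0, None), (0, None)]

# Weighted-sum scalarization: combine the objectives, solve, re-weight, repeat.
solutions = {}
for w in (0.2, 0.5, 0.8):
    c = [w * a + (1 - w) * b for a, b in zip(c1, c2)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    solutions[w] = tuple(round(v, 6) for v in res.x)
```

Sweeping the weight w traces out efficient solutions; here small w favors the second objective (all capacity on y) and large w favors the first (all capacity on x).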

  12. A Microsoft Excel® 2010 Based Tool for Calculating Interobserver Agreement

    PubMed Central

    Reed, Derek D; Azulay, Richard L

    2011-01-01

    This technical report provides detailed information on the rationale for using a common computer spreadsheet program (Microsoft Excel®) to calculate various forms of interobserver agreement for both continuous and discontinuous data sets. In addition, we provide a brief tutorial on how to use an Excel spreadsheet to automatically compute traditional total count, partial agreement-within-intervals, exact agreement, trial-by-trial, interval-by-interval, scored-interval, unscored-interval, total duration, and mean duration-per-interval interobserver agreement algorithms. We conclude with a discussion of how practitioners may integrate this tool into their clinical work. PMID:22649578
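The interval-by-interval and scored-interval algorithms mentioned in this report can also be computed directly in a few lines of Python rather than a spreadsheet; the two observers' interval records below are made-up example data:

```python
# Made-up interval records: 1 = response scored, 0 = not scored.
obs1 = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
obs2 = [1, 0, 0, 1, 0, 1, 1, 0, 1, 1]

# Interval-by-interval agreement: percentage of intervals with matching scores.
agreements = sum(a == b for a, b in zip(obs1, obs2))
interval_by_interval = 100.0 * agreements / len(obs1)

# Scored-interval agreement: restrict to intervals either observer scored.
scored = [(a, b) for a, b in zip(obs1, obs2) if a == 1 or b == 1]
scored_interval = 100.0 * sum(a == b for a, b in scored) / len(scored)
```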

  13. A microsoft excel(®) 2010 based tool for calculating interobserver agreement.

    PubMed

    Reed, Derek D; Azulay, Richard L

    2011-01-01

    This technical report provides detailed information on the rationale for using a common computer spreadsheet program (Microsoft Excel(®)) to calculate various forms of interobserver agreement for both continuous and discontinuous data sets. In addition, we provide a brief tutorial on how to use an Excel spreadsheet to automatically compute traditional total count, partial agreement-within-intervals, exact agreement, trial-by-trial, interval-by-interval, scored-interval, unscored-interval, total duration, and mean duration-per-interval interobserver agreement algorithms. We conclude with a discussion of how practitioners may integrate this tool into their clinical work.

  14. Fractal analyses reveal independent complexity and predictability of gait

    PubMed Central

    Dierick, Frédéric; Nivard, Anne-Laure

    2017-01-01

    Locomotion is a natural task that has been assessed for decades and used as a proxy to highlight impairments of various origins. So far, most studies adopted classical linear analyses of spatio-temporal gait parameters. Here, we use more advanced, yet not less practical, non-linear techniques to analyse gait time series of healthy subjects. We aimed at finding more sensitive indexes related to spatio-temporal gait parameters than those previously used, with the hope to better identify abnormal locomotion. We analysed large-scale stride interval time series and mean step width in 34 participants while altering walking direction (forward vs. backward walking) and with or without galvanic vestibular stimulation. The Hurst exponent α and the Minkowski fractal dimension D were computed and interpreted as indexes expressing predictability and complexity of stride interval time series, respectively. These holistic indexes can easily be interpreted in the framework of optimal movement complexity. We show that α and D accurately capture stride interval changes in function of the experimental condition. Walking forward exhibited maximal complexity (D) and hence, adaptability. In contrast, walking backward and/or stimulation of the vestibular system decreased D. Furthermore, walking backward increased predictability (α) through a more stereotyped pattern of the stride interval and galvanic vestibular stimulation reduced predictability. The present study demonstrates the complementary power of the Hurst exponent and the fractal dimension to improve walking classification. Our developments may have immediate applications in rehabilitation, diagnosis, and classification procedures. PMID:29182659
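A simple Hurst-exponent estimate, here via the aggregated-variance method rather than the authors' exact procedure, can be sketched on synthetic white noise (for which H should be near 0.5); the series and block sizes are illustrative:

```python
import numpy as np

# Aggregated-variance method: block means of a self-similar series have
# standard deviation scaling like m**(H - 1) in the block size m.
rng = np.random.default_rng(0)
series = rng.standard_normal(4096)   # uncorrelated noise: expected H ~ 0.5

sizes = [4, 8, 16, 32, 64, 128]
stds = [np.std(series[: (len(series) // m) * m].reshape(-1, m).mean(axis=1))
        for m in sizes]
slope = np.polyfit(np.log(sizes), np.log(stds), 1)[0]
hurst = slope + 1.0                  # slope ~ -0.5 for white noise
```

A persistent (predictable) stride-interval series would yield H above 0.5, an anti-persistent one below 0.5.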

  15. Synchronic interval Gaussian mixed-integer programming for air quality management.

    PubMed

    Cheng, Guanhui; Huang, Guohe Gordon; Dong, Cong

    2015-12-15

    To reveal the synchronism of interval uncertainties, the tradeoff between system optimality and security, the discreteness of facility-expansion options, the uncertainty of pollutant dispersion processes, and the seasonality of wind features in air quality management (AQM) systems, a synchronic interval Gaussian mixed-integer programming (SIGMIP) approach is proposed in this study. A robust interval Gaussian dispersion model is developed for approaching the pollutant dispersion process under interval uncertainties and seasonal variations. The reflection of synchronic effects of interval uncertainties in the programming objective is enabled through introducing interval functions. The proposition of constraint violation degrees helps quantify the tradeoff between system optimality and constraint violation under interval uncertainties. The overall optimality of system profits of an SIGMIP model is achieved based on the definition of an integrally optimal solution. Integer variables in the SIGMIP model are resolved by the existing cutting-plane method. Combining these efforts leads to an effective algorithm for the SIGMIP model. An application to an AQM problem in a region in Shandong Province, China, reveals that the proposed SIGMIP model can facilitate identifying the desired scheme for AQM. The enhancement of the robustness of optimization exercises may be helpful for increasing the reliability of suggested schemes for AQM under these complexities. The interrelated tradeoffs among control measures, emission sources, flow processes, receptors, influencing factors, and economic and environmental goals are effectively balanced. Interests of many stakeholders are reasonably coordinated. The harmony between economic development and air quality control is enabled. Results also indicate that the constraint violation degree is effective at reflecting the compromise relationship between constraint-violation risks and system optimality under interval uncertainties. This can help decision makers mitigate potential risks, e.g. insufficiency of pollutant treatment capabilities, exceedance of air quality standards, deficiency of pollution control fund, or imbalance of economic or environmental stress, in the process of guiding AQM. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. The Effects of a Local Negative Feedback Function between Choice and Relative Reinforcer Rate

    PubMed Central

    Davison, Michael; Elliffe, Douglas; Marr, M. Jackson

    2010-01-01

    Four pigeons were trained on two-key concurrent variable-interval schedules with no changeover delay. In Phase 1, relative reinforcers on the two alternatives were varied over five conditions from .1 to .9. In Phases 2 and 3, we instituted a molar feedback function between relative choice in an interreinforcer interval and the probability of reinforcers on the two keys ending the next interreinforcer interval. The feedback function was linear, and was negatively sloped so that more extreme choice in an interreinforcer interval made it more likely that a reinforcer would be available on the other key at the end of the next interval. The slope of the feedback function was −1 in Phase 2 and −3 in Phase 3. We varied relative reinforcers in each of these phases by changing the intercept of the feedback function. Little effect of the feedback functions was discernible at the local (interreinforcer interval) level, but choice measured at an extended level across sessions was strongly and significantly decreased by increasing the negative slope of the feedback function. PMID:21451748

  17. Estimation of postmortem interval through albumin in CSF by simple dye binding method.

    PubMed

    Parmar, Ankita K; Menon, Shobhana K

    2015-12-01

    Estimation of the postmortem interval is a very important question in some medicolegal investigations. For precise estimation of the postmortem interval, a method is needed that can give an accurate estimate. Bromocresol green (BCG) is a simple dye binding method and is widely used in routine practice. Application of this method in forensic practice may bring revolutionary changes. In this study, cerebrospinal fluid was aspirated by cisternal puncture from 100 autopsies, and the concentration of albumin was studied with respect to postmortem interval. Albumin present in CSF undergoes changes after death: by 72 h, the concentration of albumin had decreased to 0.012 mM, and this decrease was linear from 2 h to 72 h. An important relationship was found between albumin concentration and postmortem interval, with an error of ± 1-4 h. The study concludes that CSF albumin can be a useful and significant parameter in estimation of the postmortem interval. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
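Inverting the reported linear decrease to estimate the postmortem interval (PMI) from an albumin measurement can be sketched as below. The 72-h value of 0.012 mM echoes the abstract; the assumed 2-h concentration of 0.040 mM is purely a hypothetical calibration point, not the study's figure:

```python
# Linear calibration endpoints: (concentration in mM, time in hours).
C_START, T_START = 0.040, 2.0    # hypothetical 2-h value (assumed)
C_END, T_END = 0.012, 72.0       # 72-h value reported in the abstract

def estimate_pmi(albumin_mm):
    """Invert the linear concentration-vs-time decrease to get PMI in hours."""
    slope = (C_END - C_START) / (T_END - T_START)
    return T_START + (albumin_mm - C_START) / slope

pmi_mid = estimate_pmi((C_START + C_END) / 2)   # midpoint concentration -> 37 h
```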

  18. A binary linear programming formulation of the graph edit distance.

    PubMed

    Justice, Derek; Hero, Alfred

    2006-08-01

    A binary linear programming formulation of the graph edit distance for unweighted, undirected graphs with vertex attributes is derived and applied to a graph recognition problem. A general formulation for editing graphs is used to derive a graph edit distance that is proven to be a metric, provided the cost function for individual edit operations is a metric. Then, a binary linear program is developed for computing this graph edit distance, and polynomial time methods for determining upper and lower bounds on the solution of the binary program are derived by applying solution methods for standard linear programming and the assignment problem. A recognition problem of comparing a sample input graph to a database of known prototype graphs in the context of a chemical information system is presented as an application of the new method. The costs associated with various edit operations are chosen by using a minimum normalized variance criterion applied to pairwise distances between nearest neighbors in the database of prototypes. The new metric is shown to perform quite well in comparison to existing metrics when applied to a database of chemical graphs.
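An assignment-based bound in the spirit of the paper's LP and assignment-problem relaxations can be sketched with a small, invented vertex substitution-cost matrix; solving the assignment problem over vertex mappings gives a polynomial-time bound on the edit cost:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Illustrative cost[i, j] = cost of mapping vertex i of G1 to vertex j of G2
# (made-up numbers; a real system would derive these from vertex attributes).
cost = np.array([
    [4.0, 1.0, 3.0],
    [2.0, 0.0, 5.0],
    [3.0, 2.0, 2.0],
])
rows, cols = linear_sum_assignment(cost)       # optimal vertex assignment
assignment_bound = float(cost[rows, cols].sum())
```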

  19. Analysis of Interval Changes on Mammograms for Computer Aided Diagnosis

    DTIC Science & Technology

    2000-05-01

    The digitizer was calibrated so that the gray values were linearly and inversely proportional to the OD within the range 0-4 OD. Average pixel values were computed in the template and ROI, respectively, following the procedure described earlier for alignment of the breast regions, except that the regions to be compared were weighted inversely proportional to the radial distance r from the nipple.

  20. Energy compensation after sprint- and high-intensity interval training.

    PubMed

    Schubert, Matthew M; Palumbo, Elyse; Seay, Rebekah F; Spain, Katie K; Clarke, Holly E

    2017-01-01

    Many individuals lose less weight than expected in response to exercise interventions when considering the increased energy expenditure of exercise (ExEE). This is due to energy compensation in response to ExEE, which may include increases in energy intake (EI) and decreases in non-exercise physical activity (NEPA). We examined the degree of energy compensation in healthy young men and women in response to interval training. Data were examined from a prior study in which 24 participants (mean age, BMI, & VO2max = 28 yrs, 27.7 kg·m⁻², and 32 mL·kg⁻¹·min⁻¹) completed either 4 weeks of sprint-interval training or high-intensity interval training. Energy compensation was calculated from changes in body composition (air displacement plethysmography) and exercise energy expenditure was calculated from mean heart rate based on the heart rate-VO2 relationship. Differences between high (≥ 100%) and low (< 100%) levels of energy compensation were assessed. Linear regressions were utilized to determine associations between energy compensation and ΔVO2max, ΔEI, ΔNEPA, and Δresting metabolic rate. Very large individual differences in energy compensation were noted. In comparison to individuals with low levels of compensation, individuals with high levels of energy compensation gained fat mass, lost fat-free mass, and had lower change scores for VO2max and NEPA. Linear regression results indicated that lower levels of energy compensation were associated with increases in ΔVO2max (p < 0.001) and ΔNEPA (p < 0.001). Considerable variation exists in response to short-term, low dose interval training. In agreement with prior work, increases in ΔVO2max and ΔNEPA were associated with lower energy compensation. Future studies should focus on identifying if a dose-response relationship for energy compensation exists in response to interval training, and what underlying mechanisms and participant traits contribute to the large variation between individuals.
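The energy-compensation calculation described above, comparing ExEE against the energy deficit implied by body-composition change, can be sketched with illustrative numbers; the 7700 kcal per kg of fat figure is a common approximation, and all inputs are invented, not the study's data:

```python
KCAL_PER_KG_FAT = 7700.0          # common approximation (assumed)

def energy_compensation(exee_kcal, delta_fat_kg):
    """Compensation (%) = 100 * (ExEE - actual deficit) / ExEE, where the
    actual deficit is inferred from the change in fat mass."""
    actual_deficit = -delta_fat_kg * KCAL_PER_KG_FAT   # kcal actually lost
    return 100.0 * (exee_kcal - actual_deficit) / exee_kcal

comp_low = energy_compensation(8000.0, -1.0)   # lost 1 kg fat: ~3.75%
comp_high = energy_compensation(8000.0, +0.2)  # gained fat: >100% compensation
```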

  1. DYGABCD: A program for calculating linear A, B, C, and D matrices from a nonlinear dynamic engine simulation

    NASA Technical Reports Server (NTRS)

    Geyser, L. C.

    1978-01-01

    A digital computer program, DYGABCD, was developed that generates linearized, dynamic models of simulated turbofan and turbojet engines. DYGABCD is based on an earlier computer program, DYNGEN, that is capable of calculating simulated nonlinear steady-state and transient performance of one- and two-spool turbojet engines or two- and three-spool turbofan engines. Most control design techniques require linear system descriptions. For multiple-input/multiple-output systems such as turbine engines, state space matrix descriptions of the system are often desirable. DYGABCD computes the state space matrices commonly referred to as the A, B, C, and D matrices required for a linear system description. The report discusses the analytical approach and provides a users manual, FORTRAN listings, and a sample case.
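The core of such a linearization, computing A and B by finite differences of a nonlinear state equation about an operating point, can be sketched as follows; the toy dynamics f below are illustrative, not an engine model:

```python
import numpy as np

# Toy nonlinear state equation xdot = f(x, u) (assumed for illustration).
def f(x, u):
    return np.array([-x[0] ** 2 + u[0], x[0] - 0.5 * x[1]])

def linearize(f, x0, u0, eps=1e-6):
    """Forward-difference Jacobians of f about (x0, u0): xdot ~ A dx + B du."""
    n, m = len(x0), len(u0)
    f0 = f(x0, u0)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        A[:, i] = (f(x0 + dx, u0) - f0) / eps
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f0) / eps
    return A, B

A, B = linearize(f, np.array([1.0, 0.0]), np.array([0.0]))
```

For a full state-space description, C and D would be obtained the same way from the output equation y = g(x, u).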

  2. Hybrid 3-D rocket trajectory program. Part 1: Formulation and analysis. Part 2: Computer programming and user's instruction. [computerized simulation using three dimensional motion analysis

    NASA Technical Reports Server (NTRS)

    Huang, L. C. P.; Cook, R. A.

    1973-01-01

    Models utilizing various sub-sets of the six degrees of freedom are used in trajectory simulation. A 3-D model with only linear degrees of freedom is especially attractive, since the coefficients for the angular degrees of freedom are the most difficult to determine and the angular equations are the most time consuming for the computer to evaluate. A computer program is developed that uses three separate subsections to predict trajectories. A launch rail subsection is used until the rocket has left its launcher. The program then switches to a special 3-D section which computes motions in two linear and one angular degrees of freedom. When the rocket trims out, the program switches to the standard, three linear degrees of freedom model.

  3. Solving deterministic non-linear programming problem using Hopfield artificial neural network and genetic programming techniques

    NASA Astrophysics Data System (ADS)

    Vasant, P.; Ganesan, T.; Elamvazuthi, I.

    2012-11-01

    Fairly reasonable results were obtained for non-linear engineering problems using optimization techniques such as neural networks, genetic algorithms, and fuzzy logic independently in the past. Increasingly, hybrid techniques are being used to solve non-linear problems to obtain better output. This paper discusses the use of a neuro-genetic hybrid technique to optimize geological structure mapping, which is known as a seismic survey. It involves the minimization of an objective function subject to geophysical and operational constraints. In this work, the optimization was initially performed using genetic programming, followed by a hybrid neuro-genetic programming approach. Comparative studies and analysis were then carried out on the optimized results. The results indicate that the hybrid neuro-genetic technique produced better results compared to the stand-alone genetic programming method.

  4. On the Feasibility of a Generalized Linear Program

    DTIC Science & Technology

    1989-03-01

    The feasibility of a generalized linear program can be determined by applying the same algorithm to a "phase-one" problem, without requiring that the initial basic feasible solution to the latter be non-degenerate.
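The phase-one construction referred to above can be sketched concretely: append artificial variables to Ax = b, minimize their sum, and declare the original program feasible exactly when the optimum is zero. The small system below is invented for illustration:

```python
from scipy.optimize import linprog

# Illustrative system Ax = b, x >= 0, with b >= 0.
A = [[1.0, 1.0, 1.0, 0.0],
     [0.0, 1.0, 0.0, 1.0]]
b = [4.0, 2.0]
n = len(A[0])
m = len(A)

# Phase-one problem: append an identity block of artificial variables s and
# minimize sum(s); s = b is an obvious starting basic feasible solution.
A_eq = [row + [1.0 if i == j else 0.0 for j in range(m)]
        for i, row in enumerate(A)]
c = [0.0] * n + [1.0] * m
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (n + m))
feasible = res.fun < 1e-9            # feasible iff all artificials reach zero
```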

  5. Impact of full field digital mammography on the classification and mammographic characteristics of interval breast cancers.

    PubMed

    Knox, Mark; O'Brien, Angela; Szabó, Endre; Smith, Clare S; Fenlon, Helen M; McNicholas, Michelle M; Flanagan, Fidelma L

    2015-06-01

    Full field digital mammography (FFDM) is increasingly replacing screen film mammography (SFM) in breast screening programs. Interval breast cancers are an issue in all screening programs, and the purpose of our study was to assess the impact of FFDM on the classification of interval breast cancers at independent blind review and to compare the mammographic features of interval cancers at FFDM and SFM. This study included 138 cases of interval breast cancer, 76 following an FFDM screening examination and 62 following screening with SFM. The prior screening mammogram was assessed by each of five consultant breast radiologists who were blinded to the site of subsequent cancer. Subsequent review of the diagnostic mammogram was performed and cases were classified as missed, minimal signs, occult or true interval. Mammographic features of the interval cancer at diagnosis and any abnormality identified on the prior screening mammogram were recorded. The percentages of cancers classified as missed at FFDM and SFM did not differ significantly: 10.5% (8 of 76) at FFDM and 8.1% (5 of 62) at SFM (p=.77). There were significantly fewer interval cancers presenting as microcalcifications (alone or in association with another abnormality) following screening with FFDM, 16% (12 of 76), than following a SFM examination, 32% (20 of 62) (p=.02). Interval breast cancers continue to pose a problem at FFDM. The switch to FFDM has changed the mammographic presentation of interval breast cancer, with fewer interval cancers presenting in association with microcalcifications. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Generating AN Optimum Treatment Plan for External Beam Radiation Therapy.

    NASA Astrophysics Data System (ADS)

    Kabus, Irwin

    1990-01-01

    The application of linear programming to the generation of an optimum external beam radiation treatment plan is investigated. MPSX, an IBM linear programming software package, was used. All data originated from the CAT scan of an actual patient who was treated for a pancreatic malignant tumor before this study began. An examination of several alternatives for representing the cross section of the patient showed that it was sufficient to use a set of strategically placed points in the vital organs and tumor and a grid of points spaced about one half inch apart for the healthy tissue. Optimum treatment plans were generated from objective functions representing various treatment philosophies. The optimum plans were based on allowing for 216 external radiation beams, which accounted for wedges of any size. A beam reduction scheme then reduced the number of beams in the optimum plan to a number small enough for implementation. Regardless of the objective function, the linear programming treatment plan preserved about 95% of the patient's right kidney vs. 59% for the plan the hospital actually administered to the patient. The clinician on the case found most of the linear programming treatment plans to be superior to the hospital plan. An investigation was made, using parametric linear programming, concerning any possible benefits derived from generating treatment plans based on objective functions made up of convex combinations of two objective functions; however, this proved to have only limited value. This study also found, through dual variable analysis, that there was no benefit gained from relaxing some of the constraints on the healthy regions of the anatomy. This conclusion was supported by the clinician. Finally, several schemes were found that, under certain conditions, can further reduce the number of beams in the final linear programming treatment plan.
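A miniature version of such a treatment-planning LP, choosing nonnegative beam weights that minimize total dose at healthy-tissue points subject to prescription doses at tumor points, can be sketched as follows; the dose-deposition coefficients and prescription are invented for illustration:

```python
from scipy.optimize import linprog

# Illustrative dose per unit beam intensity at two tumor points and two
# healthy-tissue points, for two beams (all numbers assumed).
D_tumor = [[1.0, 0.5],
           [0.4, 1.0]]
D_healthy = [[0.3, 0.2],
             [0.1, 0.4]]
prescription = [60.0, 60.0]          # minimum dose required at tumor points

# Objective: total healthy-tissue dose = column sums of D_healthy times w.
c = [sum(col) for col in zip(*D_healthy)]
# Tumor dose >= prescription, written as -D_tumor @ w <= -prescription.
A_ub = [[-d for d in row] for row in D_tumor]
b_ub = [-p for p in prescription]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
weights = res.x                      # optimal beam intensities
```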

  7. Reliable design of a closed loop supply chain network under uncertainty: An interval fuzzy possibilistic chance-constrained model

    NASA Astrophysics Data System (ADS)

    Vahdani, Behnam; Tavakkoli-Moghaddam, Reza; Jolai, Fariborz; Baboli, Arman

    2013-06-01

    This article seeks to offer a systematic approach to establishing a reliable network of facilities in closed loop supply chains (CLSCs) under uncertainties. Facilities that are located in this article concurrently satisfy both traditional objective functions and reliability considerations in CLSC network designs. To attack this problem, a novel mathematical model is developed that integrates the network design decisions in both forward and reverse supply chain networks. The model also utilizes an effective reliability approach to find a robust network design. In order to make the results of this article more realistic, a CLSC for a case study in the iron and steel industry has been explored. The considered CLSC is multi-echelon, multi-facility, multi-product and multi-supplier. Furthermore, multiple facilities exist in the reverse logistics network leading to high complexities. Since the collection centres play an important role in this network, the reliability concept of these facilities is taken into consideration. To solve the proposed model, a novel interactive hybrid solution methodology is developed by combining a number of efficient solution approaches from the recent literature. The proposed solution methodology is a bi-objective interval fuzzy possibilistic chance-constraint mixed integer linear programming (BOIFPCCMILP). Finally, computational experiments are provided to demonstrate the applicability and suitability of the proposed model in a supply chain environment and to help decision makers facilitate their analyses.

  8. Perceived workplace health support is associated with employee productivity.

    PubMed

    Chen, Lu; Hannon, Peggy A; Laing, Sharon S; Kohn, Marlana J; Clark, Kathleen; Pritchard, Scott; Harris, Jeffrey R

    2015-01-01

    To examine the relationship between perceived workplace health support and employee productivity. A quantitative cross-sectional study. Washington State agencies. A total of 3528 employees from six state agencies were included in this analysis. Perceived workplace health support was assessed by two questions that queried respondents on how often they felt supported by the workplace for healthy living and physical activity. The Work Productivity and Activity Impairment Questionnaire was used to measure health-related absenteeism and presenteeism in the past 7 days. Multivariate linear regression was used to estimate the mean differences in productivity by levels of perceived health support. Most participants were between 45 and 64 years of age and were predominantly non-Hispanic white. Presenteeism varied significantly by the level of perceived workplace health support, with those who felt least supported having higher presenteeism than those who felt most supported. The difference in presenteeism by perceived workplace support remained significant in models adjusting for sociodemographic and health characteristics (mean difference: 7.1% for support for healthy living, 95% confidence interval: 3.7%, 10.4%; 4.3% for support for physical activity, 95% confidence interval: 1.7%, 6.8%). Absenteeism was not associated with perceived workplace health support. Higher perceived workplace health support is independently associated with higher work productivity. Employers may see productivity benefit from wellness programs through improved perceptions of workplace health support.

  9. Evaluating the efficiency of environmental monitoring programs

    USGS Publications Warehouse

    Levine, Carrie R.; Yanai, Ruth D.; Lampman, Gregory G.; Burns, Douglas A.; Driscoll, Charles T.; Lawrence, Gregory B.; Lynch, Jason; Schoch, Nina

    2014-01-01

    Statistical uncertainty analyses can be used to improve the efficiency of environmental monitoring, allowing sampling designs to maximize information gained relative to resources required for data collection and analysis. In this paper, we illustrate four methods of data analysis appropriate to four types of environmental monitoring designs. To analyze a long-term record from a single site, we applied a general linear model to weekly stream chemistry data at Biscuit Brook, NY, to simulate the effects of reducing sampling effort and to evaluate statistical confidence in the detection of change over time. To illustrate a detectable difference analysis, we analyzed a one-time survey of mercury concentrations in loon tissues in lakes in the Adirondack Park, NY, demonstrating the effects of sampling intensity on statistical power and the selection of a resampling interval. To illustrate a bootstrapping method, we analyzed the plot-level sampling intensity of forest inventory at the Hubbard Brook Experimental Forest, NH, to quantify the sampling regime needed to achieve a desired confidence interval. Finally, to analyze time-series data from multiple sites, we assessed the number of lakes and the number of samples per year needed to monitor change over time in Adirondack lake chemistry using a repeated-measures mixed-effects model. Evaluations of time series and synoptic long-term monitoring data can help determine whether sampling should be re-allocated in space or time to optimize the use of financial and human resources.
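
    The bootstrapping method described for the forest-inventory case — increasing the number of plots until a desired confidence-interval width is reached — can be sketched in a few lines. The plot values, target precision, and sample sizes below are hypothetical, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical plot-level biomass measurements (Mg/ha); illustrative values only.
plots = rng.normal(loc=200.0, scale=40.0, size=400)

def bootstrap_ci_halfwidth(sample_size, n_boot=2000):
    """Half-width of a 95% bootstrap CI on the mean for a given sampling intensity."""
    means = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(plots, size=sample_size, replace=True)
        means[b] = resample.mean()
    lo, hi = np.percentile(means, [2.5, 97.5])
    return (hi - lo) / 2.0

# Increase the number of plots until the CI half-width meets a target precision.
for n in (25, 50, 100, 200, 400):
    print(n, "plots -> CI half-width", round(bootstrap_ci_halfwidth(n), 1))
```

    The half-width shrinks roughly as the square root of the sampling intensity, which is what makes this a useful screen for re-allocating sampling effort.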

  10. A Flash X-Ray Facility for the Naval Postgraduate School

    DTIC Science & Technology

    1985-06-01

    ionizing radiation, NPS has had active programs with a Van de Graaff generator, a reactor, radioactive sources, X-ray machines and a linear electron ... interaction of radiation with matter and with coherent radiation. Currently the most active program is at the linear electron accelerator which over ... twenty years has produced some 75 theses. The flash X-ray machine was obtained to expand and complement the capabilities of the linear electron

  11. Summer Research Program (1992). Summer Faculty Research Program (SFRP) Reports. Volume 2. Armstrong Laboratory

    DTIC Science & Technology

    1992-12-01

    desirable. In this study, the proposed model consists of a thick-walled, highly deformable elastic tube in which the blood flow is described by linearized ... presented a mechanical model consisting of linearized Navier-Stokes and finite elasticity equations to predict blood pooling under acceleration stress ... linear multielement model of the cardiovascular system which can calculate blood pressures and flows at any point in the cardiovascular system. It

  12. Change in Breast Cancer Screening Intervals Since the 2009 USPSTF Guideline.

    PubMed

    Wernli, Karen J; Arao, Robert F; Hubbard, Rebecca A; Sprague, Brian L; Alford-Teaster, Jennifer; Haas, Jennifer S; Henderson, Louise; Hill, Deidre; Lee, Christoph I; Tosteson, Anna N A; Onega, Tracy

    2017-08-01

    In 2009, the U.S. Preventive Services Task Force (USPSTF) recommended biennial mammography for women aged 50-74 years and shared decision-making for women aged 40-49 years for breast cancer screening. We evaluated changes in mammography screening interval after the 2009 recommendations. We conducted a prospective cohort study of women aged 40-74 years who received 821,052 screening mammograms between 2006 and 2012 using data from the Breast Cancer Surveillance Consortium. We compared changes in screening intervals and stratified intervals based on whether the mammogram at the end of the interval occurred before or after the 2009 recommendation. Differences in mean interval length by woman-level characteristics were compared using linear regression. The mean interval (in months) minimally decreased after the 2009 USPSTF recommendations. Among women aged 40-49 years, the mean interval decreased from 17.2 months to 17.1 months (difference -0.16%, 95% confidence interval [CI] -0.30 to -0.01). Similar small reductions were seen for most age groups. The largest change in interval length in the post-USPSTF period was declines among women with a first-degree family history of breast cancer (difference -0.68%, 95% CI -0.82 to -0.54) or a 5-year breast cancer risk ≥2.5% (difference -0.58%, 95% CI -0.73 to -0.44). The 2009 USPSTF recommendation did not lengthen the average mammography interval among women routinely participating in mammography screening. Future studies should evaluate whether breast cancer screening intervals lengthen toward biennial intervals following new national 2016 breast cancer screening recommendations, particularly among women less than 50 years of age.

  13. EXPERIMENTS IN THE USE OF PROGRAMED MATERIALS IN TEACHING AN INTRODUCTORY COURSE IN THE BIOLOGICAL SCIENCES AT THE COLLEGE LEVEL.

    ERIC Educational Resources Information Center

    KANTASEWI, NIPHON

    The purpose of the study was to compare the effectiveness of (1) lecture presentations, (2) linear programs used in class with and without discussion, and (3) linear programs used outside of class with in-class problems or discussion. The 126 college students enrolled in a bacteriology course were randomly assigned to three groups. In a succeeding…

  14. A Comprehensive Meta-Analysis of Triple P-Positive Parenting Program Using Hierarchical Linear Modeling: Effectiveness and Moderating Variables

    ERIC Educational Resources Information Center

    Nowak, Christoph; Heinrichs, Nina

    2008-01-01

    A meta-analysis encompassing all studies evaluating the impact of the Triple P-Positive Parenting Program on parent and child outcome measures was conducted in an effort to identify variables that moderate the program's effectiveness. Hierarchical linear models (HLM) with three levels of data were employed to analyze effect sizes. The results (N =…

  15. User's manual for interfacing a leading edge, vortex rollup program with two linear panel methods

    NASA Technical Reports Server (NTRS)

    Desilva, B. M. E.; Medan, R. T.

    1979-01-01

    Sufficient instructions are provided for interfacing the Mangler-Smith, leading edge vortex rollup program with a vortex lattice (POTFAN) method and an advanced higher order, singularity linear analysis for computing the vortex effects for simple canard wing combinations.

  16. ELAS: A general-purpose computer program for the equilibrium problems of linear structures. Volume 2: Documentation of the program. [subroutines and flow charts

    NASA Technical Reports Server (NTRS)

    Utku, S.

    1969-01-01

    A general purpose digital computer program for the in-core solution of linear equilibrium problems of structural mechanics is documented. The program requires minimum input for the description of the problem. The solution is obtained by means of the displacement method and the finite element technique. Almost any geometry and structure may be handled because of the availability of linear, triangular, quadrilateral, tetrahedral, hexahedral, conical, triangular torus, and quadrilateral torus elements. The assumption of piecewise linear deflection distribution insures monotonic convergence of the deflections from the stiffer side with decreasing mesh size. The stresses are provided by the best-fit strain tensors in the least squares at the mesh points where the deflections are given. The selection of local coordinate systems whenever necessary is automatic. The core memory is used by means of dynamic memory allocation, an optional mesh-point relabelling scheme and imposition of the boundary conditions during the assembly time.

  17. Applying linear programming to estimate fluxes in ecosystems or food webs: An example from the herpetological assemblage of the freshwater Everglades

    USGS Publications Warehouse

    Diffendorfer, James E.; Richards, Paul M.; Dalrymple, George H.; DeAngelis, Donald L.

    2001-01-01

    We present the application of Linear Programming for estimating biomass fluxes in ecosystem and food web models. We use the herpetological assemblage of the Everglades as an example. We developed food web structures for three common Everglades freshwater habitat types: marsh, prairie, and upland. We obtained a first estimate of the fluxes using field data, literature estimates, and professional judgment. Linear programming was used to obtain a consistent and better estimate of the set of fluxes, while maintaining mass balance and minimizing deviations from point estimates. The results support the view that the Everglades is a spatially heterogeneous system, with changing patterns of energy flux, species composition, and biomasses across the habitat types. We show that a food web/ecosystem perspective, combined with Linear Programming, is a robust method for describing food webs and ecosystems that requires minimal data, produces useful post-solution analyses, and generates hypotheses regarding the structure of energy flow in the system.
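
    The balancing step the authors describe — adjusting fluxes to satisfy mass balance while minimizing deviations from the point estimates — can be illustrated with a toy two-flux web. The fluxes, estimates, and balance constraint below are hypothetical, not the Everglades data:

```python
import numpy as np
from scipy.optimize import linprog

# Toy food web: flux f1 (prey -> consumer) and f2 (consumer -> predator).
# Steady-state mass balance at the consumer compartment: f1 - f2 = 0.
x0 = np.array([10.0, 8.0])       # point estimates (assumed values)
A_bal = np.array([[1.0, -1.0]])  # mass-balance row
b_bal = np.array([0.0])

n = len(x0)
# Decision vector z = [f, d_plus, d_minus]; minimize total deviation d_plus + d_minus.
c = np.concatenate([np.zeros(n), np.ones(n), np.ones(n)])
# f - d_plus + d_minus = x0 links each flux to its point estimate.
A_eq = np.block([
    [A_bal, np.zeros((1, n)), np.zeros((1, n))],
    [np.eye(n), -np.eye(n), np.eye(n)],
])
b_eq = np.concatenate([b_bal, x0])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * (3 * n))
f = res.x[:n]
print(f, res.fun)  # balanced fluxes; total deviation from the point estimates
```

    The LP forces the two fluxes to agree (mass balance) while moving them as little as possible from the field-based estimates, which is the essence of the method.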

  18. A Linear Programming Approach to Routing Control in Networks of Constrained Nonlinear Positive Systems with Concave Flow Rates

    NASA Technical Reports Server (NTRS)

    Arneson, Heather M.; Dousse, Nicholas; Langbort, Cedric

    2014-01-01

    We consider control design for positive compartmental systems in which each compartment's outflow rate is described by a concave function of the amount of material in the compartment. We address the problem of determining the routing of material between compartments to satisfy time-varying state constraints while ensuring that material reaches its intended destination over a finite time horizon. We give sufficient conditions for the existence of a time-varying state-dependent routing strategy which ensures that the closed-loop system satisfies basic network properties of positivity, conservation and interconnection while ensuring that capacity constraints are satisfied, when possible, or adjusted if a solution cannot be found. These conditions are formulated as a linear programming problem. Instances of this linear programming problem can be solved iteratively to generate a solution to the finite horizon routing problem. Results are given for the application of this control design method to an example problem. Key words: linear programming; control of networks; positive systems; controller constraints and structure.

  19. Train repathing in emergencies based on fuzzy linear programming.

    PubMed

    Meng, Xuelei; Cui, Bingmou

    2014-01-01

    Train pathing is a typical problem in which train trips are assigned to sets of rail segments, such as rail tracks and links. This paper focuses on the train pathing problem, determining the paths of train trips in emergencies. We analyze the influencing factors of train pathing, such as transfer cost, running cost, and social adverse-effect cost. With overall consideration of segment and station capacity constraints, we build a fuzzy linear programming model to solve the train pathing problem. We design fuzzy membership functions to describe the fuzzy coefficients. Furthermore, contraction-expansion factors are introduced to contract or expand the value ranges of the fuzzy coefficients, coping with the uncertainty of those ranges. We propose a method based on triangular fuzzy coefficients that transforms the fuzzy linear programming model for train pathing into a determinate linear model. An emergency scenario is constructed from real data of the Beijing-Shanghai Railway. The model was solved, and the computational results demonstrate the validity of the model and the efficiency of the algorithm.
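
    As a rough illustration of turning a fuzzy LP into a determinate one (this is a generic centroid defuzzification, not the authors' contraction-expansion formulation), a triangular fuzzy cost can be collapsed to a crisp value and the resulting crisp LP solved; all numbers are hypothetical:

```python
from scipy.optimize import linprog

# Triangular fuzzy cost coefficients (low, mode, high) for two candidate paths
# (illustrative numbers, not from the paper).
fuzzy_costs = [(4.0, 5.0, 7.0), (6.0, 8.0, 9.0)]

# One simple defuzzification: the centroid of the triangle, (a + b + c) / 3.
crisp = [(a + b + c) / 3.0 for a, b, c in fuzzy_costs]

# Route one unit of traffic: x1 + x2 = 1, x >= 0; minimize the defuzzified cost.
res = linprog(crisp, A_eq=[[1.0, 1.0]], b_eq=[1.0], bounds=[(0, None)] * 2)
print(res.x)  # all flow on the cheaper (defuzzified) path
```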

  20. Computer Program For Linear Algebra

    NASA Technical Reports Server (NTRS)

    Krogh, F. T.; Hanson, R. J.

    1987-01-01

    Collection of routines provided for basic vector operations. The Basic Linear Algebra Subprogram (BLAS) library is a collection of FORTRAN-callable routines for employing standard techniques to perform basic operations of numerical linear algebra.
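
    A minimal example of the Level-1 BLAS vector operations the library standardizes, here called through SciPy's wrappers around the FORTRAN routines rather than from FORTRAN directly:

```python
import numpy as np
from scipy.linalg import blas

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# AXPY: y <- a*x + y, one of the basic Level-1 BLAS vector operations.
z = blas.daxpy(x, y.copy(), a=2.0)
# DOT: the Euclidean inner product of two vectors.
d = blas.ddot(x, y)

print(z)  # [ 6.  9. 12.]
print(d)  # 32.0
```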

  1. The Sun Health Research Institute Brain Donation Program: Description and Experience, 1987–2007

    PubMed Central

    Sue, Lucia I.; Walker, Douglas G.; Roher, Alex E.; Lue, LihFen; Vedders, Linda; Connor, Donald J.; Sabbagh, Marwan N.; Rogers, Joseph

    2008-01-01

    The Brain Donation Program at Sun Health Research Institute has been in continual operation since 1987, with over 1000 brains banked. The population studied primarily resides in the retirement communities of northwest metropolitan Phoenix, Arizona. The Institute is affiliated with Sun Health, a nonprofit community-owned and operated health care provider. Subjects are enrolled prospectively to allow standardized clinical assessments during life. Funding comes primarily from competitive grants. The Program has made short postmortem brain retrieval a priority, with a 2.75-h median postmortem interval for the entire collection. This maximizes the utility of the resource for molecular studies; frozen tissue from approximately 82% of all cases is suitable for RNA studies. Studies performed in-house have shown that, even with very short postmortem intervals, increasing delays in brain retrieval adversely affect RNA integrity and that cerebrospinal fluid pH increases with postmortem interval but does not predict tissue viability. PMID:18347928

  2. An interval programming model for continuous improvement in micro-manufacturing

    NASA Astrophysics Data System (ADS)

    Ouyang, Linhan; Ma, Yizhong; Wang, Jianjun; Tu, Yiliu; Byun, Jai-Hyun

    2018-03-01

    Continuous quality improvement in micro-manufacturing processes relies on optimization strategies that relate an output performance to a set of machining parameters. However, when determining the optimal machining parameters in a micro-manufacturing process, the economics of continuous quality improvement and decision makers' preference information are typically neglected. This article proposes an economic continuous improvement strategy based on an interval programming model. The proposed strategy differs from previous studies in two ways. First, an interval programming model is proposed to measure the quality level, where decision makers' preference information is considered in order to determine the weight of location and dispersion effects. Second, the proposed strategy is a more flexible approach since it considers the trade-off between the quality level and the associated costs, and leaves engineers a larger decision space through adjusting the quality level. The proposed strategy is compared with its conventional counterparts using an Nd:YLF laser beam micro-drilling process.

  3. Mortality reduction in relation to implantable cardioverter defibrillator programming in the Multicenter Automatic Defibrillator Implantation Trial-Reduce Inappropriate Therapy (MADIT-RIT).

    PubMed

    Ruwald, Anne-Christine; Schuger, Claudio; Moss, Arthur J; Kutyifa, Valentina; Olshansky, Brian; Greenberg, Henry; Cannom, David S; Estes, N A Mark; Ruwald, Martin H; Huang, David T; Klein, Helmut; McNitt, Scott; Beck, Christopher A; Goldstein, Robert; Brown, Mary W; Kautzner, Josef; Shoda, Morio; Wilber, David; Zareba, Wojciech; Daubert, James P

    2014-10-01

    The benefit of novel implantable cardioverter defibrillator (ICD) programming in reducing inappropriate ICD therapy and mortality was demonstrated in Multicenter Automatic Defibrillator Implantation Trial-Reduce Inappropriate Therapy (MADIT-RIT). However, the cause of mortality reduction remains incompletely evaluated. We aimed to identify factors associated with mortality, with focus on ICD therapy and programming in the MADIT-RIT population. In MADIT-RIT, 1500 patients with a primary prophylactic indication for ICD or cardiac resynchronization therapy with defibrillator were randomized to 1 of 3 different ICD programming arms: conventional programming (ventricular tachycardia zone ≥170 beats per minute), high-rate programming (ventricular tachycardia zone ≥200 beats per minute), and delayed programming (60-second delay before therapy ≥170 beats per minute). Multivariate Cox models were used to assess the influence of time-dependent appropriate and inappropriate ICD therapy (shock and antitachycardia pacing) and randomized programming arm on all-cause mortality. During an average follow-up of 1.4±0.6 years, 71 of 1500 (5%) patients died: cardiac in 40 patients (56.3%), noncardiac in 23 patients (32.4%), and unknown in 8 patients (11.3%). Appropriate shocks (hazard ratio, 6.32; 95% confidence interval, 3.13-12.75; P<0.001) and inappropriate therapy (hazard ratio, 2.61; 95% confidence interval, 1.28-5.31; P=0.01) were significantly associated with an increased mortality risk. There was no evidence of increased mortality risk in patients who experienced appropriate antitachycardia pacing only (hazard ratio, 1.02; 95% confidence interval, 0.36-2.88; P=0.98). Randomization to conventional programming was identified as an independent predictor of death when compared with patients randomized to high-rate programming (hazard ratio, 2.0; 95% confidence interval, 1.06-3.71; P=0.03). 
In MADIT-RIT, appropriate shocks, inappropriate ICD therapy, and randomization to conventional ICD programming were independently associated with an increased mortality risk. Appropriate antitachycardia pacing was not related to an adverse outcome. clinicaltrials.gov Unique identifier: NCT00947310. © 2014 American Heart Association, Inc.

  4. Long-term prevalence and predictors of urinary incontinence among women in the Diabetes Prevention Program Outcomes Study.

    PubMed

    Phelan, Suzanne; Kanaya, Alka M; Ma, Yong; Vittinghoff, Eric; Barrett-Connor, Elizabeth; Wing, Rena; Kusek, John W; Orchard, Trevor J; Crandall, Jill P; Montez, Maria G; Brown, Jeanette S

    2015-02-01

    To examine the long-term prevalence and predictors of weekly urinary incontinence in the Diabetes Prevention Program Outcomes Study, a follow-up study of the Diabetes Prevention Program randomized clinical trial of overweight adults with impaired glucose tolerance. This analysis included 1778 female participants of the Diabetes Prevention Program Outcomes Study who had been randomly assigned during the Diabetes Prevention Program to intensive lifestyle intervention (n = 582), metformin (n = 589) or placebo (n = 607). The study participants completed semi-annual assessments after the final Diabetes Prevention Program visit and for 6 years until October 2008. At the study entry, the prevalence of weekly urinary incontinence was lower in the intensive lifestyle intervention group compared with the metformin and placebo groups (44.2% vs 51.8%, 48.0% urinary incontinence/week, P = 0.04); during the 6-year follow-up period, these lower rates in intensive lifestyle intervention were maintained (46.7%, 53.1%, 49.9% urinary incontinence/week; P = 0.03). Statistically adjusting for urinary incontinence prevalence at the end of the Diabetes Prevention Program, the treatment arm no longer had a significant impact on urinary incontinence during the Diabetes Prevention Program Outcomes Study. Independent predictors of lower urinary incontinence during the Diabetes Prevention Program Outcomes Study included lower body mass index (odds ratio 0.988, 95% confidence interval 0.982-0.994) and greater physical activity (odds ratio 0.999, 95% confidence interval 0.998-1.000) at the Diabetes Prevention Program Outcomes Study entry, and greater reductions in body mass index (odds ratio 0.75, 95% confidence interval 0.60-0.94) and waist circumference (odds ratio 0.998, 95% confidence interval 0.996-1.0) during the Diabetes Prevention Program Outcomes Study. Diabetes was not significantly related to urinary incontinence. 
Intensive lifestyle intervention has a modest positive and enduring impact on urinary incontinence, and should be considered for the long-term prevention and treatment of urinary incontinence in overweight/obese women with glucose intolerance. © 2014 The Japanese Urological Association.

  5. The influence of sampling interval on the accuracy of trail impact assessment

    USGS Publications Warehouse

    Leung, Y.-F.; Marion, J.L.

    1999-01-01

    Trail impact assessment and monitoring (IA&M) programs have been growing in importance and application in recreation resource management at protected areas. Census-based and sampling-based approaches have been developed in such programs, with systematic point sampling being the most common survey design. This paper examines the influence of sampling interval on the accuracy of estimates for selected trail impact problems. A complete census of four impact types on 70 trails in Great Smoky Mountains National Park was utilized as the base data set for the analyses. The census data were resampled at increasing intervals to create a series of simulated point data sets. Estimates of frequency of occurrence and lineal extent for the four impact types were compared with the census data set. The responses of accuracy loss on lineal extent estimates to increasing sampling intervals varied across different impact types, while the responses on frequency of occurrence estimates were consistent, approximating an inverse asymptotic curve. These findings suggest that systematic point sampling may be an appropriate method for estimating the lineal extent but not the frequency of trail impacts. Sample intervals of less than 100 m appear to yield an excellent level of accuracy for the four impact types evaluated. Multiple regression analysis results suggest that appropriate sampling intervals are more likely to be determined by the type of impact in question rather than the length of trail. The census-based trail survey and the resampling-simulation method developed in this study can be a valuable first step in establishing long-term trail IA&M programs, in which an optimal sampling interval range with acceptable accuracy is determined before investing efforts in data collection.
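
    The resampling-simulation idea — thinning a complete census at increasing point-sampling intervals and comparing the estimates against the census value — can be sketched with synthetic data. The trail length, impact frequency, and intervals below are illustrative assumptions, not the Great Smoky Mountains data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical census: presence/absence of an impact (e.g. exposed roots)
# recorded at 1-m increments along a 5-km trail; illustrative data only.
census = rng.random(5000) < 0.08   # true frequency of occurrence ~8%

true_freq = census.mean()
print("census frequency:", round(true_freq, 3))

# Resample the census at increasing point-sampling intervals and compare
# the estimated frequency of occurrence against the census value.
for interval in (10, 50, 100, 500):
    est = census[::interval].mean()
    print(interval, "m interval ->", round(est, 3))
```

    Accuracy loss as the interval grows mirrors the inverse asymptotic response the authors report for frequency-of-occurrence estimates.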

  6. Program Flow Analyzer. Volume 3

    DTIC Science & Technology

    1984-08-01

    metrics are defined using these basic terms. Of interest is another measure for the size of the program, called the volume: V = N x log2(n). The unit of ... correlated to actual data and most useful for test. The formula describing difficulty may be expressed as: D = (n1/2) x (N2/n2) = 1/L. Difficulty then, is the ... linearly independent program paths through any program graph. A maximal set of these linearly independent paths, called a "basis set," can always be found

  7. VIBRA: An interactive computer program for steady-state vibration response analysis of linear damped structures

    NASA Technical Reports Server (NTRS)

    Bowman, L. M.

    1984-01-01

    An interactive steady-state frequency response computer program with graphics is documented. Single or multiple forces may be applied to the structure using a modal superposition approach to calculate response. The method can be applied to linear, proportionally damped structures in which the damping may be viscous or structural. The theoretical approach and program organization are described. Example problems, user instructions, and a sample interactive session are given to demonstrate the program's capability in solving a variety of problems.

  8. Rational-spline approximation with automatic tension adjustment

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Kerr, P. A.

    1984-01-01

    An algorithm for weighted least-squares approximation with rational splines is presented. A rational spline is a cubic function containing a distinct tension parameter for each interval defined by two consecutive knots. For zero tension, the rational spline is identical to a cubic spline; for very large tension, the rational spline is a linear function. The approximation algorithm incorporates an algorithm which automatically adjusts the tension on each interval to fulfill a user-specified criterion. Finally, an example is presented comparing results of the rational spline with those of the cubic spline.

  9. Effects of Training and Feedback on Accuracy of Predicting Rectosigmoid Neoplastic Lesions and Selection of Surveillance Intervals by Endoscopists Performing Optical Diagnosis of Diminutive Polyps.

    PubMed

    Vleugels, Jasper L A; Dijkgraaf, Marcel G W; Hazewinkel, Yark; Wanders, Linda K; Fockens, Paul; Dekker, Evelien

    2018-05-01

    Real-time differentiation of diminutive polyps (1-5 mm) during endoscopy could replace histopathology analysis. According to guidelines, implementation of optical diagnosis into routine practice would require it to identify rectosigmoid neoplastic lesions with a negative predictive value (NPV) of more than 90%, using histologic findings as a reference, and agreement with histology-based surveillance intervals for more than 90% of cases. We performed a prospective study with 39 endoscopists accredited to perform colonoscopies on participants with positive results from fecal immunochemical tests in the Bowel Cancer Screening Program at 13 centers in the Netherlands. Endoscopists were trained in optical diagnosis using a validated module (Workgroup serrAted polypS and Polyposis). After meeting predefined performance thresholds in the training program, the endoscopists started a 1-year program (continuation phase) in which they performed narrow band imaging analyses during colonoscopies of participants in the screening program and predicted histological findings with confidence levels. The endoscopists were randomly assigned to groups that received feedback or no feedback on the accuracy of their predictions. Primary outcome measures were endoscopists' abilities to identify rectosigmoid neoplastic lesions (using histology as a reference) with NPVs of 90% or more, and selecting surveillance intervals that agreed with those determined by histology for at least 90% of cases. Of 39 endoscopists initially trained, 27 (69%) completed the training program. During the continuation phase, these 27 endoscopists performed 3144 colonoscopies in which 4504 diminutive polyps were removed. The endoscopists identified neoplastic lesions with a pooled NPV of 90.8% (95% confidence interval 88.6-92.6); their proposed surveillance intervals agreed with those determined by histologic analysis for 95.4% of cases (95% confidence interval 94.0-96.6). 
Findings did not differ between the group that did vs did not receive feedback. Sixteen endoscopists (59%) identified rectosigmoid neoplastic lesions with NPVs greater than 90% and selected surveillance intervals in agreement with those determined from histology for more than 90% of patients. In a prospective study following a validated training module, we found that a selected group of endoscopists identified rectosigmoid neoplastic lesions with pooled NPVs greater than 90% and accurately selected surveillance intervals for more than 90% of patients over the course of 1 year. Providing regular interim feedback on the accuracy of neoplastic lesion prediction and surveillance interval selection did not lead to differences in those endpoints. Monitoring is suggested, as individual performance varied. ClinicalTrials.gov no: NCT02516748; Netherlands Trial Register: NTR4635. Copyright © 2018 AGA Institute. Published by Elsevier Inc. All rights reserved.

  10. Generalised Assignment Matrix Methodology in Linear Programming

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2012-01-01

    Discrete Mathematics instructors and students have long been struggling with various labelling and scanning algorithms for solving many important problems. This paper shows how to solve a wide variety of Discrete Mathematics and OR problems using assignment matrices and linear programming, specifically using Excel Solvers although the same…
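
    A small example in the spirit of the paper's assignment-matrix approach, though using SciPy's Hungarian-algorithm solver rather than Excel Solver; the cost matrix is hypothetical. Because the assignment LP's constraint matrix is totally unimodular, an LP relaxation returns the same integral matching this combinatorial solver does:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Cost of assigning each worker (row) to each task (column); illustrative values.
cost = np.array([
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
])

# Find the one-to-one worker-task matching with minimum total cost.
rows, cols = linear_sum_assignment(cost)
print(list(zip(rows, cols)), cost[rows, cols].sum())
```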

  11. A two-stage mixed-integer fuzzy programming with interval-valued membership functions approach for flood-diversion planning.

    PubMed

    Wang, S; Huang, G H

    2013-03-15

    Flood disasters have been extremely severe in recent decades, and they account for about one third of all natural catastrophes throughout the world. In this study, a two-stage mixed-integer fuzzy programming with interval-valued membership functions (TMFP-IMF) approach is developed for flood-diversion planning under uncertainty. TMFP-IMF integrates the fuzzy flexible programming, two-stage stochastic programming, and integer programming within a general framework. A concept of interval-valued fuzzy membership function is introduced to address complexities of system uncertainties. TMFP-IMF can not only deal with uncertainties expressed as fuzzy sets and probability distributions, but also incorporate pre-regulated water-diversion policies directly into its optimization process. TMFP-IMF is applied to a hypothetical case study of flood-diversion planning for demonstrating its applicability. Results indicate that reasonable solutions can be generated for binary and continuous variables. A variety of flood-diversion and capacity-expansion schemes can be obtained under four scenarios, which enable decision makers (DMs) to identify the most desired one based on their perceptions and attitudes towards the objective-function value and constraints. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Best-Fit Conic Approximation of Spacecraft Trajectory

    NASA Technical Reports Server (NTRS)

    Singh, Gurkipal

    2005-01-01

    A computer program calculates a best conic fit of a given spacecraft trajectory. Spacecraft trajectories are often propagated as conics onboard. The conic-section parameters resulting from the best conic fit are uplinked to computers aboard the spacecraft for use in updating predictions of the spacecraft trajectory for operational purposes. In the initial application for which this program was written, there is a requirement to fit a single conic section (necessitated by onboard memory constraints), accurate to within 200 microradians, to a sequence of positions measured over a 4.7-hour interval. The present program supplants a prior one that could not cover the interval with fewer than four successive conic sections. The present program is based on formulating the best-fit conic problem as a parameter-optimization problem and solving the problem numerically, on the ground, by use of a modified steepest-descent algorithm. For the purpose of this algorithm, optimization is defined as minimization of the maximum directional propagation error across the fit interval. In the specific initial application, the program generates a single 4.7-hour conic, the directional propagation of which is accurate to within 34 microradians, easily exceeding the mission constraints by a wide margin.
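
    The minimax criterion (minimize the maximum error across the fit interval) can be illustrated on a much simpler problem than conic fitting: a straight-line fit to synthetic data, using a general-purpose optimizer rather than the program's modified steepest-descent algorithm. All data and parameters are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

# Samples of a trend with a small oscillation; we fit y = a*x + b by minimizing
# the MAXIMUM absolute residual over the interval (a minimax criterion, analogous
# to minimizing the worst directional propagation error across the fit interval).
x = np.linspace(0.0, 4.7, 20)            # hours (mirroring the 4.7-h interval)
y = 2.0 * x + 1.0 + 0.1 * np.sin(5 * x)

def worst_error(params):
    a, b = params
    return np.max(np.abs(a * x + b - y))

res = minimize(worst_error, x0=[0.0, 0.0], method="Nelder-Mead")
print(res.x, worst_error(res.x))  # fitted (a, b) and the worst-case residual
```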

  13. A simplified computer program for the prediction of the linear stability behavior of liquid propellant combustors

    NASA Technical Reports Server (NTRS)

    Mitchell, C. E.; Eckert, K.

    1979-01-01

    A program for predicting the linear stability of liquid propellant rocket engines is presented. The underlying model assumptions and analytical steps necessary for understanding the program and its input and output are also given. The rocket engine is modeled as a right circular cylinder with an injector with a concentrated combustion zone, a nozzle, finite mean flow, and an acoustic admittance, or the sensitive time lag theory. The resulting partial differential equations are combined into two governing integral equations by the use of the Green's function method. These equations are solved using a successive approximation technique for the small amplitude (linear) case. The computational method used as well as the various user options available are discussed. Finally, a flow diagram, sample input and output for a typical application and a complete program listing for program MODULE are presented.

  14. Fixed interval smoothing with discrete measurements.

    NASA Technical Reports Server (NTRS)

    Bierman, G. J.

    1972-01-01

    Smoothing equations for a linear continuous dynamic system with linear discrete measurements, derived from the discrete results of Rauch, Tung, and Striebel (1965) (R-T-S), are used to extend, through recursive updating, the previously published results of Bryson and Frazier (1963) (B-F), yielding a modified Bryson-Frazier (M-B-F) algorithm. A comparison of the M-B-F and R-T-S algorithms leads to the conclusion that the former is to be preferred because it entails less computation and storage and is less prone to instability. The presented M-B-F smoothing algorithm is a practical mechanization and should be of value in smoothing discretely observed dynamic linear systems.
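
    A minimal sketch of fixed-interval smoothing in the R-T-S form, for a scalar random-walk system with discrete measurements. All parameters are invented; the M-B-F mechanization discussed in the record differs in organization but computes the same optimal smoothed estimates.

```python
import numpy as np

# Scalar system: x_{k+1} = x_k + w_k (variance Q), z_k = x_k + v_k (variance R).
rng = np.random.default_rng(0)
Q, R, N = 0.01, 0.25, 50
truth = np.cumsum(rng.normal(0.0, np.sqrt(Q), N))
z = truth + rng.normal(0.0, np.sqrt(R), N)

xf = np.zeros(N); Pf = np.zeros(N)      # filtered mean / variance
x_pred, P_pred = 0.0, 1.0               # prior on the initial state
for k in range(N):                      # forward Kalman filter
    K = P_pred / (P_pred + R)           # Kalman gain
    xf[k] = x_pred + K * (z[k] - x_pred)
    Pf[k] = (1.0 - K) * P_pred
    x_pred, P_pred = xf[k], Pf[k] + Q   # one-step prediction (F = 1)

xs = xf.copy(); Ps = Pf.copy()          # backward R-T-S recursion
for k in range(N - 2, -1, -1):
    C = Pf[k] / (Pf[k] + Q)             # smoother gain: Pf * F / P_pred[k+1]
    xs[k] = xf[k] + C * (xs[k + 1] - xf[k])
    Ps[k] = Pf[k] + C**2 * (Ps[k + 1] - (Pf[k] + Q))
print(xs[0], Ps[0])
```

    By construction the smoothed and filtered estimates coincide at the final time, and the smoothed variance never exceeds the filtered variance.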

  15. The fastclime Package for Linear Programming and Large-Scale Precision Matrix Estimation in R.

    PubMed

    Pang, Haotian; Liu, Han; Vanderbei, Robert

    2014-02-01

    We develop an R package, fastclime, for solving a family of regularized linear programming (LP) problems. Our package efficiently implements the parametric simplex algorithm, which provides a scalable and sophisticated tool for solving large-scale linear programs. As an illustrative example, one use of our LP solver is to implement an important sparse precision matrix estimation method called CLIME (Constrained L1 Minimization Estimator). Compared with existing packages for this problem such as clime and flare, our package has three advantages: (1) it efficiently calculates the full piecewise-linear regularization path; (2) it provides an accurate dual certificate as a stopping criterion; (3) it is completely coded in C and is highly portable. This package is designed to be useful to statisticians and machine learning researchers for solving a wide range of problems.
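
    The per-column CLIME problem the package solves is itself a small LP: minimize ||β||₁ subject to ||Σ̂β − e_j||_∞ ≤ λ. A sketch using a generic LP solver rather than fastclime's parametric simplex; the identity Σ̂ and the λ value are chosen only so the answer is checkable by hand.

```python
import numpy as np
from scipy.optimize import linprog

# CLIME estimates column j of the precision matrix by
#   min ||beta||_1  s.t.  ||Sigma_hat @ beta - e_j||_inf <= lam.
# Split beta = u - v with u, v >= 0 to linearize the objective.
def clime_column(sigma_hat, j, lam):
    p = sigma_hat.shape[0]
    e = np.zeros(p); e[j] = 1.0
    cost = np.ones(2 * p)                    # sum(u) + sum(v) = ||beta||_1
    S = np.hstack([sigma_hat, -sigma_hat])   # Sigma_hat @ (u - v)
    A_ub = np.vstack([S, -S])
    b_ub = np.concatenate([e + lam, lam - e])
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * p))
    return res.x[:p] - res.x[p:]

beta = clime_column(np.eye(3), j=0, lam=0.1)
print(beta)  # ~ [0.9, 0, 0]: shrink toward e_0 just enough to meet the band
```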

  16. Analysis of graphic representation ability in oscillation phenomena

    NASA Astrophysics Data System (ADS)

    Dewi, A. R. C.; Putra, N. M. D.; Susilo

    2018-03-01

    This study investigates students' ability to represent graphs of linear and harmonic functions in understanding oscillation phenomena. The research used mixed methods with a concurrent embedded design. The subjects were 35 students of class X MIA 3 SMA 1 Bae Kudus. Data were collected through essays and interviews probing the ability to read and draw graphs relating to Hooke's law and the characteristics of oscillation. The results showed that most students had difficulty drawing graphs of the linear function and of the harmonic function of deviation against time. Difficulties with the linear function included analyzing which variable data are needed to construct the graph, confusion over the placement of variables on the coordinate axes, difficulty choosing the scale interval on each axis, and inconsistency in how the plotted points were connected. Difficulties with the harmonic function included determining the time interval of the sine function, locating the initial deviation point of the drawing, deriving the deviation equation from the oscillation scenario, and confusing the maximum deviation (amplitude) with the extension of the spring caused by the load. Given the complexity of the characteristic attributes of oscillation graphs, students represented harmonic functions less well than linear functions.

  17. Does the covariance structure matter in longitudinal modelling for the prediction of future CD4 counts?

    PubMed

    Taylor, J M; Law, N

    1998-10-30

    We investigate the importance of the assumed covariance structure for longitudinal modelling of CD4 counts, and examine how individual predictions of future CD4 counts are affected by the covariance structure. We consider four covariance structures: one based on an integrated Ornstein-Uhlenbeck stochastic process, one based on Brownian motion, and two derived from standard linear and quadratic random-effects models. Using data from the Multicenter AIDS Cohort Study and from a simulation study, we show that there is a noticeable deterioration in the coverage rate of confidence intervals if we assume the wrong covariance. There is also a loss in efficiency. The quadratic random-effects model is found to be the best in terms of correctly calibrated prediction intervals, but is substantially less efficient than the others. Incorrectly specifying the covariance structure as linear random effects gives too-narrow prediction intervals with poor coverage rates. Fitting the model based on the integrated Ornstein-Uhlenbeck stochastic process is the preferred choice of the four considered because of its efficiency and robustness properties. We also use the difference between future predicted and observed CD4 counts to assess an appropriate transformation of CD4 counts; a fourth root, a cube root and a square root all appear to be reasonable choices.

  18. Long-term prediction of creep strains of mineral wool slabs under constant compressive stress

    NASA Astrophysics Data System (ADS)

    Gnip, Ivan; Vaitkus, Saulius; Keršulis, Vladislovas; Vėjelis, Sigitas

    2012-02-01

    The results obtained in determining the creep strain of mineral wool slabs under compressive stress, used for insulating flat roofs and facades, cast-in-place floors, curtain and external basement walls, as well as for sound insulation of floors, are presented. The creep strain tests were conducted under a compressive stress of σ c =0.35 σ 10%. Interval forecasting of creep strain was made by extrapolating the creep behaviour and approximated in accordance with EN 1606 by a power equation and reduced to a linear form using logarithms. This was performed for a lead time of 10 years. The extension of the range of the confidence interval due to discount of the prediction data, i.e. a decrease in their informativity was allowed for by an additional coefficient. Analysis of the experimental data obtained from the tests having 65 and 122 days duration showed that the prediction of creep strains for 10 years can be made based on data obtained in experiments with durations shorter than the 122 days as specified by EN 13162. Interval prediction of creep strains (with a confidence probability of 90%) was based on using the mean square deviation of the actual direct observations of creep strains in logarithmic form to have the linear trend in a retrospective area.
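
    The extrapolation scheme, a power law ε = a·t^b linearized by taking logarithms and fitted by least squares, can be sketched as follows. The readings are synthetic, and the EN 1606 confidence-band details (including the discount coefficient mentioned above) are omitted.

```python
import math

# Synthetic creep readings (days, strain %) following a power law eps = a * t**b.
t_obs = [1, 2, 5, 10, 20, 40, 65, 90, 122]
eps_obs = [0.50 * t**0.22 for t in t_obs]

# Linearize: log(eps) = log(a) + b * log(t); ordinary least squares by hand.
X = [math.log(t) for t in t_obs]
Y = [math.log(e) for e in eps_obs]
n = len(X)
xbar, ybar = sum(X) / n, sum(Y) / n
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(X, Y)) / \
    sum((xi - xbar) ** 2 for xi in X)
log_a = ybar - b * xbar

# Extrapolate to a 10-year lead time (~3650 days).
t_pred = 3650.0
eps_pred = math.exp(log_a + b * math.log(t_pred))
print(round(b, 3), round(eps_pred, 3))
```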

  19. Linear Goal Programming as a Military Decision Aid.

    DTIC Science & Technology

    1988-04-01

    ...air warfare, advanced armour warfare, the potential for space warfare, and many other advances have expanded the breadth of weapons employed to the...written by A. Charnes and W. W. Cooper, Management Models and Industrial Applications of Linear Programming, in 1961. Since this time linear

  20. An hourly PM10 diagnosis model for the Bilbao metropolitan area using a linear regression methodology.

    PubMed

    González-Aparicio, I; Hidalgo, J; Baklanov, A; Padró, A; Santa-Coloma, O

    2013-07-01

    There is extensive evidence of negative health impacts linked to a rise in regional background particulate matter (PM10) levels. These levels are often increased over urban areas, becoming one of the main air pollution concerns, as is the case in the Bilbao metropolitan area, Spain. This study describes a data-driven model to diagnose PM10 levels in Bilbao at hourly intervals. The model is built with a training period of 7 years of historical data covering different urban environments (inland, city centre and coastal sites). The explanatory variables are quantitative (log[NO2], temperature, short-wave incoming radiation, wind speed and direction, specific humidity, hour and vehicle intensity) and qualitative (working days/weekends, season (winter/summer), the hour (from 00 to 23 UTC) and precipitation/no precipitation). Three different linear regression models are compared: simple linear regression; linear regression with interaction terms (INT); and linear regression with interaction terms selected by Sawa's Bayesian Information Criterion (INT-BIC). Each type of model is calculated on two different periods: the training dataset (6 years) and the testing dataset (1 year). The results show that the INT-BIC-based model (R(2) = 0.42) is the best. R values were 0.65, 0.63 and 0.60 for the city centre, inland and coastal sites, respectively, a level of confidence similar to state-of-the-art methodology. The related error calculated for longer time intervals diminished significantly (R of 0.75-0.80 for monthly means and R of 0.80-0.98 for seasonal means) with respect to shorter periods.
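
    The model-comparison step, fitting linear models with and without interaction terms and ranking them by an information criterion, can be sketched generically. Sawa's BIC is replaced here by the standard BIC and the data are simulated, so this only mirrors the shape of the procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 1.0 * x2 + 3.0 * x1 * x2 + rng.normal(0, 0.1, n)

def bic(design, y):
    # OLS fit, then BIC = n*log(RSS/n) + k*log(n); smaller is better.
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    rss = np.sum((y - design @ beta) ** 2)
    k = design.shape[1]
    return len(y) * np.log(rss / len(y)) + k * np.log(len(y))

ones = np.ones(n)
bic_plain = bic(np.column_stack([ones, x1, x2]), y)
bic_inter = bic(np.column_stack([ones, x1, x2, x1 * x2]), y)
print(bic_plain, bic_inter)  # the interaction model wins on these data
```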

  1. Stochastic Dynamic Mixed-Integer Programming (SD-MIP)

    DTIC Science & Technology

    2015-05-05

    stochastic linear programming (SLP) problems. By using a combination of ideas from cutting plane theory of deterministic MIP (especially disjunctive...developed to date. b) As part of this project, we have also developed tools for very large scale Stochastic Linear Programming (SLP). There are...several reasons for this. First, SLP models continue to challenge many of the fastest computers to date, and many applications within the DoD (e.g

  2. Notes of the Design of Two Supercavitating Hydrofoils

    DTIC Science & Technology

    1975-07-01

    Foil section characteristics are defined using the Tulin two-term, Levi-Civita, and Larock and Street two-term three-parameter programs and inputs, with a linearized two-dimensional treatment. Nomenclature: A1, A2, angle distribution multipliers in the Levi-Civita program (radians); AR, aspect ratio; CL, lift coefficient; B, constant angle in the Levi-Civita program (radians); linearized angle of attack, superposed (degrees); C, Wu's 1955 program parameter.

  3. Effects of body size and gender on the population pharmacokinetics of artesunate and its active metabolite dihydroartemisinin in pediatric malaria patients.

    PubMed

    Morris, Carrie A; Tan, Beesan; Duparc, Stephan; Borghini-Fuhrer, Isabelle; Jung, Donald; Shin, Chang-Sik; Fleckenstein, Lawrence

    2013-12-01

    Despite the important role of the antimalarial artesunate and its active metabolite dihydroartemisinin (DHA) in malaria treatment efforts, there are limited data on the pharmacokinetics of these agents in pediatric patients. This study evaluated the effects of body size and gender on the pharmacokinetics of artesunate-DHA using data from pediatric and adult malaria patients. Nonlinear mixed-effects modeling was used to obtain a base model consisting of first-order artesunate absorption and one-compartment models for artesunate and for DHA. Various methods of incorporating effects of body size descriptors on clearance and volume parameters were tested. An allometric scaling model for weight and a linear body surface area (BSA) model were deemed optimal. The apparent clearance and volume of distribution of DHA obtained with the allometric scaling model, normalized to a 38-kg patient, were 63.5 liters/h and 65.1 liters, respectively. Estimates for the linear BSA model were similar. The 95% confidence intervals for the estimated gender effects on clearance and volume parameters for artesunate fell outside the predefined no-relevant-clinical-effect interval of 0.75 to 1.25. However, the effect of gender on apparent DHA clearance was almost entirely contained within this interval, suggesting a lack of an influence of gender on this parameter. Overall, the pharmacokinetics of artesunate and DHA following oral artesunate administration can be described for pediatric patients using either an allometric scaling or linear BSA model. Both models predict that, for a given artesunate dose in mg/kg of body weight, younger children are expected to have lower DHA exposure than older children or adults.
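
    The closing prediction, that for a fixed mg/kg dose lighter children get lower DHA exposure, follows directly from allometric scaling of clearance: total dose grows in proportion to weight W while clearance grows only as W^0.75. A sketch using the reported reference clearance (63.5 L/h at 38 kg); the 0.75 exponent and the dose level are conventional illustrative assumptions, not values taken from the study.

```python
# Allometric scaling of clearance: CL(W) = CL_ref * (W / W_ref) ** 0.75.
CL_REF, W_REF = 63.5, 38.0   # apparent DHA clearance (L/h) at 38 kg, per the study

def dha_auc(weight_kg, dose_mg_per_kg=4.0):
    """AUC ~ dose / CL for a weight-based dose (dose level is illustrative)."""
    cl = CL_REF * (weight_kg / W_REF) ** 0.75
    return dose_mg_per_kg * weight_kg / cl   # exposure scales as W ** 0.25

print(dha_auc(10.0), dha_auc(38.0))  # lighter child -> lower exposure
```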

  4. Effect of Short-Term, High-Intensity Exercise on Anaerobic Threshold in Women.

    ERIC Educational Resources Information Center

    Evans, Blanche W.

    This study investigated the effects of a six-week, high-intensity cycling program on anaerobic threshold (AT) in ten women. Subjects trained four days a week using high-intensity interval-type cycle exercises. Workouts included six 4-minute intervals cycling at 85 percent maximal oxygen uptake (VO sub 2 max), separated by 3-minute intervals of…

  5. A Solution Space for a System of Null-State Partial Differential Equations: Part 2

    NASA Astrophysics Data System (ADS)

    Flores, Steven M.; Kleban, Peter

    2015-01-01

    This article is the second of four that completely and rigorously characterize a solution space for a homogeneous system of 2N + 3 linear partial differential equations in 2N variables that arises in conformal field theory (CFT) and multiple Schramm-Löwner evolution (SLE). The system comprises 2N null-state equations and three conformal Ward identities which govern CFT correlation functions of 2N one-leg boundary operators. In the first article (Flores and Kleban, Commun Math Phys, arXiv:1212.2301, 2012), we use methods of analysis and linear algebra to prove that the dimension of this solution space is at most C_N, the Nth Catalan number. The analysis of that article is complete except for the proof of a lemma that it invokes. The purpose of this article is to provide that proof. The lemma states that if every interval among (x_2, x_3), (x_3, x_4), …, (x_{2N-1}, x_{2N}) is a two-leg interval of F (defined in Flores and Kleban, Commun Math Phys, arXiv:1212.2301, 2012), then F vanishes. Proving this lemma by contradiction, we show that the existence of such a nonzero function implies the existence of a non-vanishing CFT two-point function involving primary operators with different conformal weights, an impossibility. This proof (which is rigorous in spite of our occasional reference to CFT) involves two different types of estimates: those that give the asymptotic behavior of F as the length of one interval vanishes, and those that give this behavior as the lengths of two intervals vanish simultaneously. We derive these estimates by using Green functions to rewrite certain null-state PDEs as integral equations, combining other null-state PDEs to obtain Schauder interior estimates, and then repeatedly integrating the integral equations with these estimates until we obtain optimal bounds. Estimates in which two interval lengths vanish simultaneously divide into two cases: two adjacent intervals and two non-adjacent intervals. The analysis of the latter case is similar to that for one vanishing interval length. In contrast, the analysis of the former case is more complicated, involving a Green function that contains the Jacobi heat kernel as its essential ingredient.

  6. The Next Linear Collider Program

    Science.gov Websites


  7. Microwave and Electron Beam Computer Programs

    DTIC Science & Technology

    1988-06-01

    Research (ONR). SCRIBE was adapted by MRC from the Stanford Linear Accelerator Center Beam Trajectory Program, EGUN...achieved with SCRIBE. It is a version of the Stanford Linear Accelerator (SLAC) code EGUN (Ref. 8), extensively modified by MRC for research on

  8. Interior-Point Methods for Linear Programming: A Review

    ERIC Educational Resources Information Center

    Singh, J. N.; Singh, D.

    2002-01-01

    The paper reviews some recent advances in interior-point methods for linear programming and indicates directions in which future progress can be made. Most of the interior-point methods belong to any of three categories: affine-scaling methods, potential reduction methods and central path methods. These methods are discussed together with…
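
    Of the three families reviewed, the primal affine-scaling iteration is the simplest to write down: rescale by the current iterate, take a steepest-descent step in the scaled space, and back off from the boundary. A textbook sketch on a toy LP, with none of the safeguards of a production solver:

```python
import numpy as np

def affine_scaling(A, b, c, x, gamma=0.9, iters=200):
    """Primal affine-scaling for min c@x s.t. A@x = b, x > 0, from interior x."""
    for _ in range(iters):
        D2 = np.diag(x ** 2)                             # scaling by iterate
        w = np.linalg.solve(A @ D2 @ A.T, A @ D2 @ c)    # dual estimate
        r = c - A.T @ w                                  # reduced costs
        dx = -D2 @ r                                     # scaled descent step
        if np.all(dx >= -1e-12):                         # (near-)optimal
            break
        alpha = gamma * np.min(-x[dx < 0] / dx[dx < 0])  # stay interior
        x = x + alpha * dx
    return x

# Toy LP: max x1 + 2*x2 with x1 + x2 <= 1  ->  optimum at x2 = 1, value 2.
A = np.array([[1.0, 1.0, 1.0]])      # slack s appended: x1 + x2 + s = 1
b = np.array([1.0])
c = np.array([-1.0, -2.0, 0.0])      # minimize -x1 - 2*x2
x = affine_scaling(A, b, c, np.array([1 / 3, 1 / 3, 1 / 3]))
print(x, c @ x)  # x -> (0, 1, 0), objective -> -2
```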

  9. AN EVALUATION OF HEURISTICS FOR THRESHOLD-FUNCTION TEST-SYNTHESIS,

    DTIC Science & Technology

    Linear programming offers the most attractive procedure for testing and obtaining optimal threshold gate realizations for functions generated in...The design of the experiments may be of general interest to students of automatic problem solving; the results should be of interest in threshold logic and linear programming. (Author)

  10. Effect of the learning climate of residency programs on faculty's teaching performance as evaluated by residents.

    PubMed

    Lombarts, Kiki M J M H; Heineman, Maas Jan; Scherpbier, Albert J J A; Arah, Onyebuchi A

    2014-01-01

    To understand the teaching performance of individual faculty, the climate in which residents' learning takes place, the learning climate, may be important. There is emerging evidence that specific climates do predict specific outcomes. Until now, the effect of learning climate on the performance of the individual faculty who actually do the teaching was unknown. This study (i) tested the hypothesis that a positive learning climate was associated with better teaching performance of individual faculty as evaluated by residents, and (ii) explored which dimensions of learning climate were associated with faculty's teaching performance. We conducted two cross-sectional questionnaire surveys amongst residents from 45 residency training programs and multiple specialties in 17 hospitals in the Netherlands. Residents evaluated the teaching performance of individual faculty using the robust System for Evaluating Teaching Qualities (SETQ) and evaluated the learning climate of residency programs using the Dutch Residency Educational Climate Test (D-RECT). The validated D-RECT questionnaire consisted of 11 subscales of learning climate. The main outcome measure was faculty's overall teaching (SETQ) score. We used multivariable adjusted linear mixed models to estimate the separate associations of overall learning climate and each of its subscales with faculty's teaching performance. In total, 451 residents completed 3569 SETQ evaluations of 502 faculty. Overall learning climate was positively associated with faculty's teaching performance (regression coefficient 0.54, 95% confidence interval: 0.37 to 0.71; P<0.001). Three out of 11 learning climate subscales were substantially associated with better teaching performance: 'coaching and assessment', 'work is adapted to residents' competence', and 'formal education'.
Individual faculty's teaching performance evaluations are positively affected by better learning climate of residency programs.

  11. Inequalities in oral health: are schoolchildren receiving the Bolsa Família more vulnerable?

    PubMed Central

    de Oliveira, Luísa Jardim Corrêa; Correa, Marcos Britto; Nascimento, Gustavo Giacomelli; Goettems, Marília Leão; Tarquínio, Sandra Beatriz Chaves; Torriani, Dione Dias; Demarco, Flávio Fernando

    2013-01-01

    OBJECTIVE To evaluate the association between being a recipient of the Bolsa Família program and oral health conditions in Brazilian schoolchildren. METHODS A cross-sectional study was conducted with 1,107 schoolchildren aged between eight and 12 years from 20 public and private schools in Pelotas, RS, Southern Brazil. A list of all children receiving the Bolsa Família program was provided by the participating schools. Demographic, socioeconomic and oral hygiene information was assessed using a questionnaire completed by the schoolchildren and their parents. Dental exams were performed to assess the presence of dental plaque and the prevalence of dental caries. Data were analyzed by the Chi-square test, the Chi-square test for linear trend and multivariate Poisson regression (prevalence ratio; 95% confidence interval). RESULTS Coming from a non-nuclear family, having a DMFT ≥ 1 and never having visited a dentist were associated with receiving the Bolsa Família. The final model showed that caries prevalence was twice as high (PR 2.00; 95%CI 1.47;2.69) in schoolchildren benefiting from the Bolsa Família. It was also shown that schoolchildren benefiting from the program presented greater severity of dental caries compared to schoolchildren from private schools (RR 1.53; 95%CI 1.18;2.00). After final adjustments, the prevalence of schoolchildren who had never visited a dentist was six times higher among children who received the government benefit (PR 6.18; 95%CI 3.07;12.45) compared to those from private schools. CONCLUSIONS Schoolchildren benefiting from the Bolsa Família program experienced more caries lesions and have less frequently accessed dental care services, which suggests the need to include oral health in the program. PMID:24626542

  12. ALPS - A LINEAR PROGRAM SOLVER

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.

    1994-01-01

    Linear programming is a widely-used engineering and management tool. Scheduling, resource allocation, and production planning are all well-known applications of linear programs (LP's). Most LP's are too large to be solved by hand, so over the decades many computer codes for solving LP's have been developed. ALPS, A Linear Program Solver, is a full-featured LP analysis program. ALPS can solve plain linear programs as well as more complicated mixed integer and pure integer programs. ALPS also contains an efficient solution technique for pure binary (0-1 integer) programs. One of the many weaknesses of LP solvers is the lack of interaction with the user. ALPS is a menu-driven program with no special commands or keywords to learn. In addition, ALPS contains a full-screen editor to enter and maintain the LP formulation. These formulations can be written to and read from plain ASCII files for portability. For those less experienced in LP formulation, ALPS contains a problem "parser" which checks the formulation for errors. ALPS creates fully formatted, readable reports that can be sent to a printer or output file. ALPS is written entirely in IBM's APL2/PC product, Version 1.01. The APL2 workspace containing all the ALPS code can be run on any APL2/PC system (AT or 386). On a 32-bit system, this configuration can take advantage of all extended memory. The user can also examine and modify the ALPS code. The APL2 workspace has also been "packed" to be run on any DOS system (without APL2) as a stand-alone "EXE" file, but has limited memory capacity on a 640K system. A numeric coprocessor (80X87) is optional but recommended. The standard distribution medium for ALPS is a 5.25 inch 360K MS-DOS format diskette. IBM, IBM PC and IBM APL2 are registered trademarks of International Business Machines Corporation. MS-DOS is a registered trademark of Microsoft Corporation.
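
    For the pure binary (0-1) programs that ALPS treats specially, the problem shape is easy to show. The record does not describe ALPS's solution technique, so the sketch below simply enumerates a tiny invented 0-1 knapsack; exhaustive enumeration is practical only for a handful of variables.

```python
from itertools import product

# 0-1 program: maximize 5a + 4b + 3c  s.t.  2a + 3b + c <= 4, a, b, c in {0, 1}.
values = (5, 4, 3)
weights = (2, 3, 1)
capacity = 4

# Enumerate all 2**3 assignments, keep feasible ones, take the best.
best = max((x for x in product((0, 1), repeat=3)
            if sum(w * xi for w, xi in zip(weights, x)) <= capacity),
           key=lambda x: sum(v * xi for v, xi in zip(values, x)))
print(best, sum(v * xi for v, xi in zip(values, best)))
```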

  13. Use of microcomputer in mapping depth of stratigraphic horizons in National Petroleum Reserve in Alaska

    USGS Publications Warehouse

    Payne, Thomas G.

    1982-01-01

    REGIONAL MAPPER is a menu-driven system in the BASIC language for computing and plotting (1) time, depth, and average velocity to geologic horizons, (2) interval time, thickness, and interval velocity of stratigraphic intervals, and (3) subcropping and onlapping intervals at unconformities. The system consists of three programs: FILER, TRAVERSER, and PLOTTER. A control point is a shot point with velocity analysis or a shot point at or near a well with a velocity check-shot survey. Reflection time to, and code number of, seismic horizons are filed by digitizing tablet from record sections. TRAVERSER starts at a point of geologic control and, in traversing to another, parallels seismic events, records loss of horizons by onlap and truncation, and stores reflection time for geologic horizons at traversed shot points. TRAVERSER is basically a phantoming procedure. Permafrost thickness and velocity variations, buried canyons with low-velocity fill, and error in seismically derived velocity cause velocity anomalies that complicate depth mapping. Two depths to the top of the pebble shale are computed for each control point. One depth, designated Zs, is based on seismically derived velocity. The other (Zw) is based on interval velocity interpolated linearly between wells and multiplied by interval time (isochron) to give interval thickness; Zw is computed for all geologic horizons by downward summation of interval thickness. Unknown true depth (Z) to the pebble shale may be expressed as Z = Zs + es and Z = Zw + ew, where the e terms represent error. Equating the two expressions gives the depth difference D = Zs - Zw = ew - es. A plot of D for the top of the pebble shale is readily contourable, but smoothing is required to produce a reasonably simple surface. Seismically derived velocity used in computing Zs includes the effect of velocity anomalies but is subject to some large, randomly distributed errors, resulting in depth errors (es). Well-derived velocity used in computing Zw does not include the effect of velocity anomalies, but its error (ew) should reflect these anomalies and should be contourable (non-random). The D surface as contoured with smoothing is assumed to represent ew, that is, the depth effect of variations in permafrost thickness and velocity and buried canyon depth. Estimated depth (Zest) to each geologic horizon is the sum of Zw for that horizon and ew as contoured for the pebble shale, which is the first highly continuous seismic horizon below the zone of anomalous velocity. Results of this 'depthing' procedure are compared with those of Tetra Tech, Inc., the subcontractor responsible for geologic and geophysical interpretation and mapping.
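
    The well-based depth Zw, interval velocity multiplied by interval transit time and summed downward, is simple to reproduce. A sketch with invented velocities and isochrons:

```python
# Zw: depth by downward summation of interval thickness, where each
# thickness = interval velocity * one-way interval time (isochron).
velocities = [1800.0, 2400.0, 3000.0, 3600.0]   # m/s per stratigraphic interval
isochrons = [0.20, 0.15, 0.25, 0.10]            # one-way interval times (s)

depths = []
z = 0.0
for v, dt in zip(velocities, isochrons):
    z += v * dt          # interval thickness
    depths.append(z)     # cumulative depth to the base of each interval
print(depths)
```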

  14. HPV vaccination impact on a cervical cancer screening program: methods of the FASTER-Tlalpan Study in Mexico.

    PubMed

    Salmerón, Jorge; Torres-Ibarra, Leticia; Bosch, F Xavier; Cuzick, Jack; Lörincz, Attila; Wheeler, Cosette M; Castle, Philip E; Robles, Claudia; Lazcano-Ponce, Eduardo

    2016-04-01

    To outline the design of a clinical trial to evaluate the impact of HPV vaccination as part of an hrHPV-based primary screening program to extend screening intervals. A total of 18,000 women aged 25-45 years, attending the regular cervical cancer screening program in primary health care services in Tlalpan, Mexico City, will be invited to the study. Eligible participants will be assigned to one of three comparison groups: 1) HPV16/18 vaccine and hrHPV-based screening; 2) HPV6/11/16/18 vaccine and hrHPV-based screening; 3) control group receiving only hrHPV-based screening. Strict surveillance of persistent hrHPV infection and occurrence of precancerous lesions will be conducted to estimate safety profiles at different screening intervals; participants will undergo diagnostic confirmation and treatment as necessary. The FASTER-Tlalpan Study will provide insights into new approaches to cervical cancer prevention programs. It will offer valuable information on the potential benefits of combining HPV vaccination and hrHPV-based screening to safely extend screening intervals.

  15. Consensus for linear multi-agent system with intermittent information transmissions using the time-scale theory

    NASA Astrophysics Data System (ADS)

    Taousser, Fatima; Defoort, Michael; Djemai, Mohamed

    2016-01-01

    This paper investigates the consensus problem for a linear multi-agent system with a fixed communication topology in the presence of intermittent communication, using time-scale theory. Since each agent can only obtain relative local information intermittently, the proposed consensus algorithm is based on a discontinuous local interaction rule. The interaction among agents happens on a disjoint set of continuous-time intervals. The closed-loop multi-agent system can be represented by mixed linear continuous-time and linear discrete-time models due to the intermittent information transmissions. Time-scale theory provides a powerful tool to combine the continuous-time and discrete-time cases and to study the consensus protocol under a unified framework. Using this theory, conditions are derived to achieve exponential consensus under intermittent information transmissions. Simulations are performed to validate the theoretical results.
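
    The intermittent protocol alternates intervals with communication (consensus updates) and intervals without (states held constant). A sketch on a 4-agent ring, using a plain discrete-time update rather than the paper's time-scale formulation; the graph, step size, and schedule are invented.

```python
import numpy as np

# Ring-graph Laplacian for 4 agents.
L = np.array([[ 2, -1,  0, -1],
              [-1,  2, -1,  0],
              [ 0, -1,  2, -1],
              [-1,  0, -1,  2]], dtype=float)
eps = 0.1                        # stable: eps < 2 / lambda_max(L) = 0.5

x = np.array([1.0, -2.0, 3.0, 0.5])
avg0 = x.mean()                  # consensus value (average is preserved)

for _cycle in range(20):         # 20 communication cycles
    for _ in range(10):          # interval WITH communication: update
        x = x - eps * (L @ x)
    # interval WITHOUT communication: states simply hold (no update)

print(x, avg0)                   # all states converge to the initial average
```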

  16. Runtime Analysis of Linear Temporal Logic Specifications

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Havelund, Klaus

    2001-01-01

    This report presents an approach to checking a running program against its Linear Temporal Logic (LTL) specifications. LTL is a widely used logic for expressing properties of programs viewed as sets of executions. Our approach consists of translating LTL formulae to finite-state automata, which are used as observers of the program behavior. The translation algorithm we propose modifies standard LTL-to-Büchi-automata conversion techniques to generate automata that check finite program traces. The algorithm has been implemented in a tool, which has been integrated with the generic JPaX framework for runtime analysis of Java programs.
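
    The finite-trace semantics the observers target can be illustrated with a few canonical LTL patterns. A toy monitor over recorded traces, not the JPaX translation algorithm (which builds automata from arbitrary formulae):

```python
# Finite-trace checks for common LTL patterns over a recorded list of states;
# each checker consumes the trace like a tiny observer automaton.

def always(p, trace):
    """[] p : p holds in every state of the finite trace."""
    return all(p(s) for s in trace)

def eventually(p, trace):
    """<> p : p holds in at least one state."""
    return any(p(s) for s in trace)

def until(p, q, trace):
    """p U q : q eventually holds, and p holds in every earlier state."""
    for s in trace:
        if q(s):
            return True
        if not p(s):
            return False
    return False      # q never held on this finite trace

trace = [{"req": True, "ack": False},
         {"req": True, "ack": False},
         {"req": False, "ack": True}]
print(until(lambda s: s["req"], lambda s: s["ack"], trace))  # True
```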

  17. From Fault-Diagnosis and Performance Recovery of a Controlled System to Chaotic Secure Communication

    NASA Astrophysics Data System (ADS)

    Hsu, Wen-Teng; Tsai, Jason Sheng-Hong; Guo, Fang-Cheng; Guo, Shu-Mei; Shieh, Leang-San

    Chaotic systems are often applied to encryption on secure communication, but they may not provide high-degree security. In order to improve the security of communication, chaotic systems may need to add other secure signals, but this may cause the system to diverge. In this paper, we redesign a communication scheme that could create secure communication with additional secure signals, and the proposed scheme could keep system convergence. First, we introduce the universal state-space adaptive observer-based fault diagnosis/estimator and the high-performance tracker for the sampled-data linear time-varying system with unanticipated decay factors in actuators/system states. Besides, robustness, convergence in the mean, and tracking ability are given in this paper. A residual generation scheme and a mechanism for auto-tuning switched gain is also presented, so that the introduced methodology is applicable for the fault detection and diagnosis (FDD) for actuator and state faults to yield a high tracking performance recovery. The evolutionary programming-based adaptive observer is then applied to the problem of secure communication. Whenever the tracker induces a large control input which might not conform to the input constraint of some physical systems, the proposed modified linear quadratic optimal tracker (LQT) can effectively restrict the control input within the specified constraint interval, under the acceptable tracking performance. The effectiveness of the proposed design methodology is illustrated through tracking control simulation examples.

  18. The Next Linear Collider Program-News

    Science.gov Websites


  19. Department of Defense Precise Time and Time Interval program improvement plan

    NASA Technical Reports Server (NTRS)

    Bowser, J. R.

    1981-01-01

    The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval operations including measurements, the establishment of overall DOD requirements for time and time interval, and the accomplishment of objectives requiring precise time and time interval with minimum cost. An overview of the objectives, the approach to the problem, the schedule, and a status report, including significant findings relative to organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users are presented.

  20. [Cardiovascular recovery during intermittent exercise in highly-adherent participants with hypertension and type 2 diabetes mellitus].

    PubMed

    Cano-Montoya, Johnattan; Álvarez, Cristian; Martínez, Cristian; Salas, Andrés; Sade, Farid; Ramírez-Campillo, Rodrigo

    2016-09-01

    Despite the evidence supporting the metabolic benefits of high-intensity interval training (HIIT), there is little information about the cardiovascular response to this type of exercise in patients with type 2 diabetes (T2D) and hypertension (HTA). To analyze the changes in heart rate at rest, and at the onset and end of each training interval, after twelve weeks of a HIIT program in T2D and HTA patients. Twenty-three participants with T2D and HTA (20 women) took part in a controlled HIIT program. Fourteen participants attended 90% or more of the exercise sessions and were considered adherent. Adherent and non-adherent participants had similar body mass index (BMI) and blood pressure. A “1x2x10” (work: rest time: intervals) HIIT exercise protocol was used both as a test and as the training method during the twelve weeks. The initial and final heart rate (HR) of each of the ten intervals was measured before and after the intervention. After twelve weeks of HIIT intervention, adherent participants had a significant reduction in heart rate at the onset of exercise and during intervals 4, 5, 8 and 10. A reduction in the final heart rate was observed during intervals 8 and 10. In the same participants, the greatest reduction, at the onset or end of exercise, was approximately 10 beats/min. No significant changes in BMI, resting heart rate or blood pressure were observed. A HIIT program reduces the cardiovascular effort at a given work-load and improves cardiovascular recovery after exercise.

  1. Long-term Results of an Obesity Program in an Ethnically Diverse Pediatric Population

    PubMed Central

    Nowicka, Paulina; Shaw, Melissa; Yu, Sunkyung; Dziura, James; Chavent, Georgia; O'Malley, Grace; Serrecchia, John B.; Tamborlane, William V.; Caprio, Sonia

    2011-01-01

    OBJECTIVE: To determine if beneficial effects of a weight-management program could be sustained for up to 24 months in a randomized trial in an ethnically diverse obese population. PATIENTS AND METHODS: There were 209 obese children (BMI > 95th percentile), ages 8 to 16, of mixed ethnic backgrounds, randomly assigned to the intensive lifestyle intervention or the clinic control group. The control group received counseling every 6 months, and the intervention group received a family-based program, which included exercise, nutrition, and behavior modification. Lifestyle intervention sessions occurred twice weekly for the first 6 months, then twice monthly for the second 6 months; for the last 12 months there was no active intervention. There were 174 children who completed the 12 months of the randomized trial. Follow-up data were available for 76 of these children at 24 months. There were no statistical differences in dropout rates among ethnic groups or in any other aspects. RESULTS: Treatment effect was sustained at 24 months in the intervention versus control group for BMI z score (−0.16 [95% confidence interval: −0.23 to −0.09]), BMI (−2.8 kg/m2 [95% confidence interval: −4.0 to −1.6 kg/m2]), percent body fat (−4.2% [95% confidence interval: −6.4% to −2.0%]), total body fat mass (−5.8 kg [95% confidence interval: −9.1 kg to −2.6 kg]), total cholesterol (−13.0 mg/dL [95% confidence interval: −21.7 mg/dL to −4.2 mg/dL]), low-density lipoprotein cholesterol (−10.4 mg/dL [95% confidence interval: −18.3 mg/dL to −2.4 mg/dL]), and homeostasis model assessment of insulin resistance (−2.05 [95% confidence interval: −2.48 to −1.75]). CONCLUSIONS: This study, unprecedented because of the high degree of obesity and ethnically diverse backgrounds of children, reveals that benefits of an intensive lifestyle program can be sustained 12 months after completing the active intervention phase. PMID:21300674

  2. Positive Aspects of Caregiving and Caregiver Burden: A Study of Caregivers of Patients With Dementia.

    PubMed

    Abdollahpour, Ibrahim; Nedjat, Saharnaz; Salimi, Yahya

    2018-01-01

    The positive aspect of caregiving (PAC) is defined as caregiver gains, satisfaction, a meaningful life, and enhanced family relationships. The adjusted association between PAC and caregiver burden is not well established. This study investigated the association between caregiver burden and PAC, adjusting for potential confounders. This was a cross-sectional study that recruited 132 caregivers. A linear regression model with PAC as the outcome was used to estimate the adjusted associations. Caregiver burden was negatively associated with PAC (mean difference in PAC per 1-unit increase in caregiver burden = -0.12, 95% confidence interval: -0.18 to -0.056; P < .001). This association remained after adjustment for caregivers' age and marital status as well as patients' dependency level. The significant negative association of caregiver burden with PAC reinforces the need for interventional and/or educational programs aimed at decreasing the overall imposed burden. This can play an important role in improving caregivers' general health and quality of life.

  3. Population-based absolute risk estimation with survey data

    PubMed Central

    Kovalchik, Stephanie A.; Pfeiffer, Ruth M.

    2013-01-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
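
    As a rough illustration of the piecewise-exponential formulation above, the following sketch computes the absolute risk of one cause in the presence of competing causes, with piecewise-constant cause-specific hazards. The function name and data layout are hypothetical, and the paper's individualized relative risks, survey weights, and variance estimation are omitted:

```python
import math

def absolute_risk(hazards, cause, widths):
    """Absolute risk of `cause` over the full interval, given piecewise-constant
    cause-specific hazards (dict: cause -> list of per-piece hazard rates) and
    the piece widths. Competing causes enter through the all-cause survival."""
    risk, cum_total = 0.0, 0.0
    for j, w in enumerate(widths):
        total = sum(h[j] for h in hazards.values())   # all-cause hazard in piece j
        surv = math.exp(-cum_total)                   # P(survive to start of piece)
        if total > 0:
            # probability of failing within this piece, attributed to `cause`
            risk += surv * (hazards[cause][j] / total) * (1.0 - math.exp(-total * w))
        cum_total += total * w
    return risk
```

With a single cause this reduces to the familiar 1 - exp(-h*t); with competing causes, each cause claims its proportional share of the failures.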

  4. Incidence of silicosis among ceramic workers in central Italy.

    PubMed

    Cavariani, F; Di Pietro, A; Miceli, M; Forastiere, F; Biggeri, A; Scavalli, P; Petti, A; Borgia, P

    1995-01-01

    The incidence of radiological silicosis was studied among 2480 male workers employed in the ceramics industry. The subjects entered the surveillance program during 1974-1987 and were followed through 1991 with annual chest radiographs. The cumulative risk of silicosis (1/1 or greater; p,q,r) reached 48% (95% confidence interval 41.5-54.9) after 30 years of employment. In a multivariate Cox proportional hazards model, the effect of duration of exposure increased linearly up to the category of 25-29 years; an extremely high hazard ratio of 14.6 was found among those with 30 years or more of exposure compared with those employed 10 years or less. Smoking habit also contributed significantly to the model, although its role in the biological process is unclear. In conclusion, exposure to silica dust was associated with a high incidence of silicosis among ceramics workers. The risk estimates are consistent with recent findings on silicosis incidence among South African gold miners.

  5. Three-dimensional elastic-plastic finite-element analysis of fatigue crack propagation

    NASA Technical Reports Server (NTRS)

    Goglia, G. L.; Chermahini, R. G.

    1985-01-01

    Fatigue cracks are a major problem in designing structures subjected to cyclic loading. Cracks frequently occur in structures such as aircraft and spacecraft. The inspection intervals of many aircraft structures are based on crack-propagation lives. Therefore, improved prediction of propagation lives under flight-load conditions (variable-amplitude loading) is needed to provide more realistic design criteria for these structures. The main thrust was to develop a three-dimensional, nonlinear, elastic-plastic finite-element program capable of extending a crack and changing boundary conditions for the model under consideration. The finite-element model is composed of 8-noded (linear-strain) isoparametric elements. In the analysis, the material is assumed to be elastic-perfectly plastic. The cyclic stress-strain curve for the material is shown. Zienkiewicz's initial-stress method, von Mises's yield criterion, and Drucker's normality condition under small-strain assumptions are used to account for plasticity. The three-dimensional analysis is capable of extending the crack and changing boundary conditions under cyclic loading.

  6. Artificial neuron synapse transistor based on silicon nanomembrane on plastic substrate

    NASA Astrophysics Data System (ADS)

    Liu, Minjie; Huang, Gaoshan; Feng, Ping; Guo, Qinglei; Shao, Feng; Tian, Ziao; Li, Gongjin; Wan, Qing; Mei, Yongfeng

    2017-06-01

    Silicon nanomembrane (SiNM) transistors gated by chitosan membrane were fabricated on plastic substrate to mimic synapse behaviors. The device has both a bottom proton gate (BG) and multiple side gates (SG). Electrical transfer properties of BG show hysteresis curves different from those of typical SiO2 gate dielectric. Synaptic behaviors and functions by linear accumulation and release of protons have been mimicked on this device: excitatory post-synaptic current (EPSC) and paired pulse facilitation behavior of biological synapses were mimicked and the paired-pulse facilitation index could be effectively tuned by the spike interval applied on the BG. Synaptic behaviors and functions, including short-term memory and long-term memory, were also experimentally demonstrated in BG mode. Meanwhile, spiking logic operation and logic modulation were realized in SG mode. Project supported by the National Natural Science Foundation of China (No. 51322201), the Specialized Research Fund for the Doctoral Program of Higher Education (No. 20120071110025), and Science and Technology Commission of Shanghai Municipality (No. 14JC1400200).

  7. A game theoretic approach to a finite-time disturbance attenuation problem

    NASA Technical Reports Server (NTRS)

    Rhee, Ihnseok; Speyer, Jason L.

    1991-01-01

    A disturbance attenuation problem over a finite-time interval is considered by a game theoretic approach where the control, restricted to a function of the measurement history, plays against adversaries composed of the process and measurement disturbances, and the initial state. A zero-sum game, formulated as a quadratic cost criterion subject to linear time-varying dynamics and measurements, is solved by a calculus of variation technique. By first maximizing the quadratic cost criterion with respect to the process disturbance and initial state, a full information game between the control and the measurement residual subject to the estimator dynamics results. The resulting solution produces an n-dimensional compensator which expresses the controller as a linear combination of the measurement history. A disturbance attenuation problem is solved based on the results of the game problem. For time-invariant systems it is shown that under certain conditions the time-varying controller becomes time-invariant on the infinite-time interval. The resulting controller satisfies an H(infinity) norm bound.

  8. On correct evaluation techniques of brightness enhancement effect measurement data

    NASA Astrophysics Data System (ADS)

    Kukačka, Leoš; Dupuis, Pascal; Motomura, Hideki; Rozkovec, Jiří; Kolář, Milan; Zissis, Georges; Jinno, Masafumi

    2017-11-01

    This paper aims to establish confidence intervals for the quantification of brightness enhancement effects resulting from the use of pulsing bright light. It is found that the methods used so far may introduce significant bias into the published results, overestimating or underestimating the enhancement effect. The authors propose a linear algebra method called total least squares. On an example dataset, it is shown that this method does not yield biased results. The statistical significance of the results is also computed. It is concluded from the observation set that the currently used linear algebra methods exhibit many patterns of noise sensitivity; changing algorithm details leads to inconsistent results. It is thus recommended to use the method with the lowest noise sensitivity. Moreover, it is shown that this method also permits one to obtain an estimate of the confidence interval. This paper neither aims to publish results about a particular experiment nor to draw any particular conclusion about the existence or nonexistence of the brightness enhancement effect.
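
    For the straight-line case, the total least squares fit the authors recommend has a closed form in the centered second moments; a minimal sketch, assuming a simple two-variable fit (not the authors' full evaluation pipeline):

```python
import math

def tls_line(x, y):
    """Total-least-squares fit of y ~ a*x + b, treating errors in x and y alike.
    Closed-form slope from centered second moments (requires sxy != 0)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return a, my - a * mx
```

Unlike ordinary least squares, which minimizes only vertical residuals, this slope minimizes perpendicular distances, which is why it avoids the attenuation bias the abstract warns about.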

  9. AUTONOMIC CONTROL OF HEART RATE AFTER EXERCISE IN TRAINED WRESTLERS

    PubMed Central

    Báez, San Martín E.; Von Oetinger, A.; Cañas, Jamett R.; Ramírez, Campillo R.

    2013-01-01

    The objective of this study was to establish differences in vagal reactivation, through heart rate recovery and heart rate variability post exercise, in Brazilian jiu-jitsu wrestlers (BJJW). A total of 18 male athletes were evaluated, ten highly trained (HT) and eight moderately trained (MT), who performed a maximum incremental test. At the end of the exercise, the R-R intervals were recorded during the first minute of recovery. We calculated heart rate recovery (HRR60s), and performed linear and non-linear (standard deviation of instantaneous beat-to-beat R-R interval variability, SD1) analysis of heart rate variability (HRV), using the tachogram of the first minute of recovery divided into four segments of 15 s each (0-15 s, 15-30 s, 30-45 s, 45-60 s). Between HT and MT individuals, there were statistically significant differences in HRR60s (p < 0.05) and in the non-linear analysis of HRV from SD1 (30-45 s) (p < 0.05) and SD1 (45-60 s) (p < 0.05). The results of this research suggest that heart rate kinetics during the first minute after exercise are related to training level and can be used as an index for autonomic cardiovascular control in BJJW. PMID:24744476
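
    The SD1 index used above (short-term variability from the Poincaré plot) can be computed directly from the R-R series; a minimal sketch, assuming R-R intervals in milliseconds (the function name is illustrative):

```python
import math
import statistics

def sd1(rr):
    """Poincare SD1: dispersion of successive R-R interval differences,
    scaled by sqrt(2). Input: list of R-R intervals (ms)."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return statistics.pstdev(diffs) / math.sqrt(2)
```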

  10. Autonomic control of heart rate after exercise in trained wrestlers.

    PubMed

    Henríquez, Olguín C; Báez, San Martín E; Von Oetinger, A; Cañas, Jamett R; Ramírez, Campillo R

    2013-06-01

    The objective of this study was to establish differences in vagal reactivation, through heart rate recovery and heart rate variability post exercise, in Brazilian jiu-jitsu wrestlers (BJJW). A total of 18 male athletes were evaluated, ten highly trained (HT) and eight moderately trained (MT), who performed a maximum incremental test. At the end of the exercise, the R-R intervals were recorded during the first minute of recovery. We calculated heart rate recovery (HRR60s), and performed linear and non-linear (standard deviation of instantaneous beat-to-beat R-R interval variability, SD1) analysis of heart rate variability (HRV), using the tachogram of the first minute of recovery divided into four segments of 15 s each (0-15 s, 15-30 s, 30-45 s, 45-60 s). Between HT and MT individuals, there were statistically significant differences in HRR60s (p < 0.05) and in the non-linear analysis of HRV from SD1 (30-45 s) (p < 0.05) and SD1 (45-60 s) (p < 0.05). The results of this research suggest that heart rate kinetics during the first minute after exercise are related to training level and can be used as an index for autonomic cardiovascular control in BJJW.

  11. Robust Neighboring Optimal Guidance for the Advanced Launch System

    NASA Technical Reports Server (NTRS)

    Hull, David G.

    1993-01-01

    In recent years, optimization has become an engineering tool through the availability of numerous successful nonlinear programming codes. Optimal control problems are converted into parameter optimization (nonlinear programming) problems by assuming the control to be piecewise linear, making the unknowns the nodes or junction points of the linear control segments. Once the optimal piecewise linear control (suboptimal) control is known, a guidance law for operating near the suboptimal path is the neighboring optimal piecewise linear control (neighboring suboptimal control). Research conducted under this grant has been directed toward the investigation of neighboring suboptimal control as a guidance scheme for an advanced launch system.

  12. Estimating linear temporal trends from aggregated environmental monitoring data

    USGS Publications Warehouse

    Erickson, Richard A.; Gray, Brian R.; Eager, Eric A.

    2017-01-01

    Trend estimates are often used as part of environmental monitoring programs. These trends inform managers (e.g., are desired species increasing or undesired species decreasing?). Data collected from environmental monitoring programs is often aggregated (i.e., averaged), which confounds sampling and process variation. State-space models allow sampling variation and process variations to be separated. We used simulated time-series to compare linear trend estimations from three state-space models, a simple linear regression model, and an auto-regressive model. We also compared the performance of these five models to estimate trends from a long term monitoring program. We specifically estimated trends for two species of fish and four species of aquatic vegetation from the Upper Mississippi River system. We found that the simple linear regression had the best performance of all the given models because it was best able to recover parameters and had consistent numerical convergence. Conversely, the simple linear regression did the worst job estimating populations in a given year. The state-space models did not estimate trends well, but estimated population sizes best when the models converged. We found that a simple linear regression performed better than more complex autoregression and state-space models when used to analyze aggregated environmental monitoring data.
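
    The simple linear regression that performed best in this comparison reduces, for a trend against time, to an OLS slope over the time index; a minimal sketch (evenly spaced observations assumed):

```python
def linear_trend(y):
    """OLS slope of a series against the time index 0..n-1 (trend per time step)."""
    n = len(y)
    mt, my = (n - 1) / 2, sum(y) / n          # mean of 0..n-1 and of the series
    num = sum((t - mt) * (yt - my) for t, yt in enumerate(y))
    den = sum((t - mt) ** 2 for t in range(n))
    return num / den
```

Note this treats sampling and process variation as one error term, which is exactly the confounding the state-space models in the abstract attempt to separate.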

  13. Modeling of frequency-domain scalar wave equation with the average-derivative optimal scheme based on a multigrid-preconditioned iterative solver

    NASA Astrophysics Data System (ADS)

    Cao, Jian; Chen, Jing-Bo; Dai, Meng-Xue

    2018-01-01

    An efficient finite-difference frequency-domain modeling of seismic wave propagation relies on the discrete schemes and appropriate solving methods. The average-derivative optimal scheme for scalar wave modeling is advantageous in terms of storage savings for the system of linear equations and flexibility for arbitrary directional sampling intervals. However, using a LU-decomposition-based direct solver to solve its resulting system of linear equations is very costly in both memory and computational requirements. To address this issue, we consider establishing a multigrid-preconditioned BiCGSTAB iterative solver suited to the average-derivative optimal scheme. The choice of preconditioning matrix and its corresponding multigrid components is made with the help of Fourier spectral analysis and local mode analysis, respectively, which is important for convergence. Furthermore, we find that for computation with unequal directional sampling intervals, the anisotropic smoothing in the multigrid preconditioner may affect the convergence rate of this iterative solver. Successful numerical applications of this iterative solver to homogeneous and heterogeneous models in 2D and 3D are presented, where a significant reduction of computer memory and an improvement of computational efficiency are demonstrated by comparison with the direct solver. In the numerical experiments, we also show that unequal directional sampling intervals will weaken the advantage of this multigrid-preconditioned iterative solver in computing speed or, even worse, could reduce its accuracy in some cases, which implies the need for a reasonable control of directional sampling intervals in the discretization.

  14. [Polar S810 as an alternative resource to the use of the electrocardiogram in the 4-second exercise test].

    PubMed

    Pimentel, Alan Santos; Alves, Eduardo da Silva; Alvim, Rafael de Oliveira; Nunes, Rogério Tasca; Costa, Carlos Magno Amaral; Lovisi, Júlio Cesar Moraes; Perrout de Lima, Jorge Roberto

    2010-05-01

    The 4-second exercise test (T4s) evaluates the cardiac vagal tone during the initial heart rate (HR) transient at sudden dynamic exercise, through identification of the cardiac vagal index (CVI) obtained from the electrocardiogram (ECG). To evaluate the use of the Polar S810 heart rate monitor (HRM) as an alternative resource to the electrocardiogram in the 4-second exercise test. In this study, 49 male individuals (25 +/- 20 years, 176 +/- 12 cm, 74 +/- 6 kg) underwent the 4-second exercise test. The RR intervals were recorded simultaneously by ECG and HRM. We calculated the mean and the standard deviation of the last RR interval of the pre-exercise period, or of the first RR interval of the exercise period, whichever was longer (RRB), of the shortest RR interval of the exercise period (RRC), and of the CVI obtained by ECG and HRM. We used Student's t-test for dependent samples (p ≤ 0.05) to test the significance of the differences between means. To identify the correlation between the ECG and the HRM, we used linear regression to calculate Pearson's correlation coefficient, together with the strategy proposed by Bland and Altman. Linear regression showed an r² of 0.9999 for RRB, 0.9997 for RRC, and 0.9996 for CVI. The Bland and Altman strategy showed a standard deviation of 0.92 ms for RRB, 0.86 ms for RRC, and 0.002 for CVI. The Polar S810 HRM was more efficient than the ECG in the application of the T4s.
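
    The Bland and Altman strategy referenced above summarizes agreement between two measurement methods by the mean difference (bias) and its limits of agreement; a minimal sketch (the 95% limits assume approximately normal differences):

```python
import statistics

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)              # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```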

  15. Feedback to providers improves evidence-based implantable cardioverter-defibrillator programming and reduces shocks.

    PubMed

    Silver, Marc T; Sterns, Laurence D; Piccini, Jonathan P; Joung, Boyoung; Ching, Chi-Keong; Pickett, Robert A; Rabinovich, Rafael; Liu, Shufeng; Peterson, Brett J; Lexcen, Daniel R

    2015-03-01

    Implantable cardioverter-defibrillator (ICD) shocks are associated with increased anxiety, health care utilization, and potentially mortality. The purpose of the Shock-Less Study was to determine if providing feedback reports to physicians on their adherence to evidence-based shock reduction programming could improve their programming behavior and reduce shocks. Shock-Less enrolled primary prevention (PP) and secondary prevention (SP) ICD patients between 2009 and 2012 at 118 study centers worldwide and followed patients longitudinally after their ICD implant. Center-specific therapy programming reports (TPRs) were delivered to each center 9 to 12 months after their first enrollment. The reports detailed adherence to evidence-based programming targets: number of intervals to detect ventricular fibrillation (VF NID), longest treatment interval (LTI), supraventricular tachycardia (SVT) discriminators (Wavelet, PR Logic), SVT limit, Lead Integrity Alert (LIA), and antitachycardia pacing (ATP). Clinicians programmed ICDs at their discretion. The primary outcome measure was the change in utilization of evidence-based shock reduction programming before (phase I, n = 2694 patients) and after initiation of the TPR (phase II, n = 1438 patients). Patients implanted after feedback reports (phase II) were up to 20% more likely to have their ICDs programmed in line with evidence-based shock reduction programming (e.g., a VF NID of 30/40 in PP patients in 33.5% vs 18.6%, P < .0001). Patients implanted in phase II had a lower risk of all-cause shock (adjusted hazard ratio 0.72, 95% confidence interval 0.58-0.90, P = .003). Providing programming feedback reports improves adherence to evidence-based shock reduction programming and is associated with a lower risk of ICD shocks. Copyright © 2015 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  16. Fault detection and initial state verification by linear programming for a class of Petri nets

    NASA Technical Reports Server (NTRS)

    Rachell, Traxon; Meyer, David G.

    1992-01-01

    The authors present an algorithmic approach to determining when the marking of a LSMG (live safe marked graph) or LSFC (live safe free choice) net is in the set of live safe markings M. Once the marking of a net is determined to be in M, a later determination that the marking is not in M indicates a fault. It is shown how linear programming can be used to determine whether m is an element of M. The worst-case computational complexity of each algorithm is bounded by the number of linear programs that must be solved.

  17. Two algorithms for neural-network design and training with application to channel equalization.

    PubMed

    Sweatman, C Z; Mulgrew, B; Gibson, G J

    1998-01-01

    We describe two algorithms for designing and training neural-network classifiers. The first, the linear programming slab algorithm (LPSA), is motivated by the problem of reconstructing digital signals corrupted by passage through a dispersive channel and by additive noise. It constructs a multilayer perceptron (MLP) to separate two disjoint sets by using linear programming methods to identify network parameters. The second, the perceptron learning slab algorithm (PLSA), avoids the computational costs of linear programming by using an error-correction approach to identify parameters. Both algorithms operate in highly constrained parameter spaces and are able to exploit symmetry in the classification problem. Using these algorithms, we develop a number of procedures for the adaptive equalization of a complex linear 4-quadrature amplitude modulation (QAM) channel, and compare their performance in a simulation study. Results are given for both stationary and time-varying channels, the latter based on the COST 207 GSM propagation model.
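
    The error-correction approach underlying PLSA is, at its core, perceptron learning; a generic sketch for a linearly separable two-class problem (this is the textbook update rule, not the authors' slab construction):

```python
def perceptron_train(samples, labels, epochs=100, lr=1.0):
    """Error-correcting perceptron for a linearly separable two-class problem.
    labels are -1/+1; returns weights and bias of a separating hyperplane."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, y in zip(samples, labels):
            s = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * s <= 0:                       # misclassified (or on boundary)
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
                errors += 1
        if errors == 0:                          # converged: all points separated
            break
    return w, b
```

For separable data the perceptron convergence theorem guarantees the loop terminates; this is what lets PLSA avoid the cost of solving linear programs.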

  18. School-based intervention to enable school children to act as change agents on weight, physical activity and diet of their mothers: a cluster randomized controlled trial.

    PubMed

    Gunawardena, Nalika; Kurotani, Kayo; Indrawansa, Susantha; Nonaka, Daisuke; Mizoue, Tetsuya; Samarasinghe, Diyanath

    2016-04-06

    School health promotion has been shown to improve the lifestyle of students, but it remains unclear whether school-based programs can influence family health. We developed an innovative program that enables school children to act as change agents in promoting healthy lifestyles of their mothers. The objective of this study was to examine the effect of the child-initiated intervention on weight, physical activity and dietary habits of their mothers. A 12-month cluster randomized trial was conducted, with school as the cluster. Participants were mothers of grade 8 students, aged around 13 years, of 20 schools in Homagama, Sri Lanka. Students of the intervention group were trained by facilitators to acquire the ability to assess noncommunicable disease risk factors in their homes and take action to address them, whereas those of the comparison group received no intervention. Body weight, step count and lifestyle of their mothers were assessed at baseline and post-intervention. Multi-level multivariable linear regression and logistic regression were used to assess the effects of intervention on continuous and binary outcomes, respectively. Of 308 study participants, 261 completed the final assessment at 12 months. There was a significantly greater decrease of weight and increase of physical activity in the intervention group. The mean (95% confidence interval) difference comparing the intervention group with the control group was -2.49 (-3.38 to -1.60) kg for weight and -0.99 (-1.40 to -0.58) kg/m2 for body mass index. The intervention group had 3.25 (95% confidence interval 1.87-5.62) times higher odds of engaging in adequate physical activity than the control group, and the former showed a greater number of steps than the latter after intervention. The intervention group showed a greater reduction of household purchases of biscuits and ice cream.
A program to motivate students to act as change agents of family's lifestyle was effective in decreasing weight and increasing physical activity of their mothers. Sri Lanka Clinical Trials Registry SLCTR/2013/011 .

  19. Effects of long memory in the order submission process on the properties of recurrence intervals of large price fluctuations

    NASA Astrophysics Data System (ADS)

    Meng, Hao; Ren, Fei; Gu, Gao-Feng; Xiong, Xiong; Zhang, Yong-Jie; Zhou, Wei-Xing; Zhang, Wei

    2012-05-01

    Understanding the statistical properties of recurrence intervals (also termed return intervals in econophysics literature) of extreme events is crucial to risk assessment and management of complex systems. The probability distributions and correlations of recurrence intervals for many systems have been extensively investigated. However, the impacts of microscopic rules of a complex system on the macroscopic properties of its recurrence intervals are less studied. In this letter, we adopt an order-driven stock model to address this issue for stock returns. We find that the distributions of the scaled recurrence intervals of simulated returns have a power-law scaling with stretched exponential cutoff and the intervals possess multifractal nature, which are consistent with empirical results. We further investigate the effects of long memory in the directions (or signs) and relative prices of the order flow on the characteristic quantities of these properties. It is found that the long memory in the order directions (Hurst index Hs) has a negligible effect on the interval distributions and the multifractal nature. In contrast, the power-law exponent of the interval distribution increases linearly with respect to the Hurst index Hx of the relative prices, and the singularity width of the multifractal nature fluctuates around a constant value when Hx<0.7 and then increases with Hx. No evident effects of Hs and Hx are found on the long memory of the recurrence intervals. Our results indicate that the nontrivial properties of the recurrence intervals of returns are mainly caused by traders' behaviors of persistently placing new orders around the best bid and ask prices.
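
    Recurrence intervals as studied above are simply waiting times between threshold exceedances; a minimal sketch for a return series sampled at unit time steps (the symmetric case with |r| > threshold would be a one-line change):

```python
def recurrence_intervals(series, threshold):
    """Waiting times between consecutive events exceeding `threshold`."""
    times = [t for t, v in enumerate(series) if v > threshold]
    return [b - a for a, b in zip(times, times[1:])]
```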

  20. Linear programming computational experience with onyx

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atrek, E.

    1994-12-31

    ONYX is a linear programming software package based on an efficient variation of the gradient projection method. When fully configured, it is intended for application to industrial size problems. While the computational experience is limited at the time of this abstract, the technique is found to be robust and competitive with existing methodology in terms of both accuracy and speed. An overview of the approach is presented together with a description of program capabilities, followed by a discussion of up-to-date computational experience with the program. Conclusions include advantages of the approach and envisioned future developments.

  1. STAR adaptation of QR algorithm. [program for solving over-determined systems of linear equations

    NASA Technical Reports Server (NTRS)

    Shah, S. N.

    1981-01-01

    The QR algorithm used on a serial computer and executed on the Control Data Corporation 6000 Computer was adapted to execute efficiently on the Control Data STAR-100 computer. How the scalar program was adapted for the STAR-100, and why these adaptations yielded an efficient STAR program, is described. Program listings of the old scalar version and the vectorized SL/1 version are presented in the appendices. Execution times for the two versions, applied to the same system of linear equations, are compared.
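
    The QR approach to an over-determined system factors the coefficient matrix and back-substitutes; a compact scalar sketch (modified Gram-Schmidt, no pivoting), illustrating the method itself rather than the STAR-100 vectorization:

```python
def qr_least_squares(A, b):
    """Solve min ||Ax - b||_2 for an over-determined system (rows >= cols)
    by modified Gram-Schmidt QR factorization and back-substitution."""
    m, n = len(A), len(A[0])
    Q = [[A[i][j] for i in range(m)] for j in range(n)]   # columns of A
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        for k in range(j):                                # orthogonalize against q_k
            R[k][j] = sum(Q[k][i] * Q[j][i] for i in range(m))
            for i in range(m):
                Q[j][i] -= R[k][j] * Q[k][i]
        R[j][j] = sum(v * v for v in Q[j]) ** 0.5
        for i in range(m):
            Q[j][i] /= R[j][j]
    y = [sum(Q[j][i] * b[i] for i in range(m)) for j in range(n)]   # Q^T b
    x = [0.0] * n
    for j in range(n - 1, -1, -1):                        # solve R x = y
        x[j] = (y[j] - sum(R[j][k] * x[k] for k in range(j + 1, n))) / R[j][j]
    return x
```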

  2. Frequency and duration of interval training programs and changes in aerobic power

    NASA Technical Reports Server (NTRS)

    Fox, E. L.; Bartels, R. L.; Obrien, R.; Bason, R.; Mathews, D. K.; Billings, C. E.

    1975-01-01

    The present study was designed to ascertain whether a training frequency of 2 days/wk for a 7- and 13-wk interval training program would produce improvement in maximal aerobic power comparable to that obtained from 7- and 13-wk programs of the same intensity consisting of 4 training days/wk. After training, there was a significant increase in maximal aerobic power that was independent of both training frequency and duration. Maximal heart rate was significantly decreased following training. Submaximal aerobic power did not change with training, but submaximal heart rate decreased significantly with greater decreases the more frequent and the longer the training.

  3. HYSEP: A Computer Program for Streamflow Hydrograph Separation and Analysis

    USGS Publications Warehouse

    Sloto, Ronald A.; Crouse, Michele Y.

    1996-01-01

    HYSEP is a computer program that can be used to separate a streamflow hydrograph into base-flow and surface-runoff components. The base-flow component has traditionally been associated with ground-water discharge and the surface-runoff component with precipitation that enters the stream as overland runoff. HYSEP includes three methods of hydrograph separation that are referred to in the literature as the fixed-interval, sliding-interval, and local-minimum methods. The program also describes the frequency and duration of measured streamflow and computed base flow and surface runoff. Daily mean stream discharge is used as input to the program in either an American Standard Code for Information Interchange (ASCII) or binary format. Output from the program includes tables, graphs, and data files. Graphical output may be plotted on the computer screen or output to a printer, plotter, or metafile.
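    Of the three separation methods, the fixed-interval one is the simplest to sketch: base flow within each interval is set to the lowest discharge observed in that interval. The snippet below illustrates that idea only; in HYSEP the interval width is derived from drainage area, whereas here it is simply a parameter, and the discharge values are invented:

```python
def fixed_interval_baseflow(discharge, width):
    """Assign each day the minimum daily mean discharge of its
    non-overlapping interval of `width` days (fixed-interval method)."""
    base = []
    for start in range(0, len(discharge), width):
        block = discharge[start:start + width]
        base.extend([min(block)] * len(block))
    return base

# Six days of hypothetical discharge, 3-day intervals:
print(fixed_interval_baseflow([5, 3, 8, 2, 6, 9], 3))  # → [3, 3, 3, 2, 2, 2]
```

Surface runoff for each day is then the measured discharge minus this base-flow estimate.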

  4. LDRD final report on massively-parallel linear programming : the parPCx system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parekh, Ojas; Phillips, Cynthia Ann; Boman, Erik Gunnar

    2005-02-01

    This report summarizes the research and development performed from October 2002 to September 2004 at Sandia National Laboratories under the Laboratory-Directed Research and Development (LDRD) project ''Massively-Parallel Linear Programming''. We developed a linear programming (LP) solver designed to use a large number of processors. LP is the optimization of a linear objective function subject to linear constraints. Companies and universities have expended huge efforts over decades to produce fast, stable serial LP solvers. Previous parallel codes run on shared-memory systems and have little or no distribution of the constraint matrix. We have seen no reports of general LP solver runs on large numbers of processors. Our parallel LP code is based on an efficient serial implementation of Mehrotra's interior-point predictor-corrector algorithm (PCx). The computational core of this algorithm is the assembly and solution of a sparse linear system. We have substantially rewritten the PCx code and based it on Trilinos, the parallel linear algebra library developed at Sandia. Our interior-point method can use either direct or iterative solvers for the linear system. To achieve a good parallel data distribution of the constraint matrix, we use a (pre-release) version of a hypergraph partitioner from the Zoltan partitioning library. We describe the design and implementation of our new LP solver called parPCx and give preliminary computational results. We summarize a number of issues related to efficient parallel solution of LPs with interior-point methods, including data distribution, numerical stability, and solving the core linear system using both direct and iterative methods. We describe a number of applications of LP specific to US Department of Energy mission areas and we summarize our efforts to integrate parPCx (and parallel LP solvers in general) into Sandia's massively-parallel integer programming solver PICO (Parallel Integer and Combinatorial Optimizer). We conclude with directions for long-term future algorithmic research and for near-term development that could improve the performance of parPCx.

  5. Cardiometabolic risk markers, adipocyte fatty acid binding protein (aFABP) and the impact of high-intensity interval training (HIIT) in obese adolescents.

    PubMed

    Blüher, Susann; Käpplinger, Jakob; Herget, Sabine; Reichardt, Sandra; Böttcher, Yvonne; Grimm, Andrea; Kratzsch, Jürgen; Petroff, David

    2017-03-01

    The impact of high-intensity interval training (HIIT) as well as the association between adipocyte fatty acid binding protein (aFABP) and cardiometabolic risk factors in overweight adolescents was investigated. Twenty-eight adolescents (13-18 years; BMI ≥ 90th percentile according to German reference values) were offered HIIT twice weekly for 6 months. At baseline and after program completion, anthropometric, clinical and metabolic characteristics were assessed and a fasting blood sample was obtained. Leptin, adiponectin, visfatin and aFABP were measured using commercially available kits. DNA methylation at RALBP1 was assessed using pyrosequencing. Descriptive statistics, Pearson's correlation and linear models were calculated. Mean age at start of the program was 15.5 ± 1.4 years (53.5% females) and 20/28 (71%) provided follow-up data. At baseline, aFABP was correlated with BMI-SDS (0.48 [0.13, 0.72]; p=0.0095), waist-to-height ratio (0.63 [0.33, 0.81], p=0.00036) and body fat content (0.55 [0.21, 0.77]; p=0.0031). Certain markers of metabolic risk were significantly correlated with aFABP (HOMA-IR 0.52 [0.19, 0.75], p=0.0044; γGT 0.48 [0.13, 0.73], p=0.0091; uric acid 0.46 [0.11, 0.71], p=0.013; HDL-C -0.39 [-0.66, -0.01], p=0.043; triglycerides 0.38 [0.01, 0.66], p=0.047). With the exception of triglycerides, these associations vanished after adjusting for BMI-SDS. aFABP did not depend on sex, age or pubertal stage in obese adolescents. After the HIIT program, small but significant reductions were observed in waist-to-height ratio (0.013 [0.0025, 0.024]; p=0.023), skin-fold-based body fat content (2.0% [0.6, 3.5]; p=0.011), and the standard deviation score of systolic blood pressure (0.69 [0.26 to 1.1]; p=0.0036). No changes were observed in adipokines or epigenetic markers following the program. HIIT may have beneficial effects on body composition and cardiometabolic health in overweight adolescents. As in adults, aFABP seems to be associated with markers of metabolic risk in obese adolescents. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Selection of optimal oligonucleotide probes for microarrays using multiple criteria, global alignment and parameter estimation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xingyuan; He, Zhili; Zhou, Jizhong

    2005-10-30

    The oligonucleotide specificity for microarray hybridization can be predicted by its sequence identity to non-targets, continuous stretch to non-targets, and/or binding free energy to non-targets. Most currently available programs only use one or two of these criteria, which may choose 'false' specific oligonucleotides or miss 'true' optimal probes in a considerable proportion. We have developed a software tool, called CommOligo, using new algorithms and all three criteria for selection of optimal oligonucleotide probes. A series of filters, including sequence identity, free energy, continuous stretch, GC content, self-annealing, distance to the 3'-untranslated region (3'-UTR) and melting temperature (Tm), are used to check each possible oligonucleotide. A sequence identity is calculated based on gapped global alignments. A traversal algorithm is used to generate alignments for free energy calculation. The optimal Tm interval is determined based on probe candidates that have passed all other filters. Final probes are picked using a combination of user-configurable piece-wise linear functions and an iterative process. The thresholds for identity, stretch and free energy filters are automatically determined from experimental data by an accessory software tool, CommOligo_PE (CommOligo Parameter Estimator). The program was used to design probes for both whole-genome and highly homologous sequence data. CommOligo and CommOligo_PE are freely available to academic users upon request.

  7. Public Health System-Delivered Mental Health Preventive Care Links to Significant Reduction of Health Care Costs.

    PubMed

    Chen, Jie; Novak, Priscilla; Goldman, Howard

    2018-04-23

    The objective was to estimate the association between health care expenditures and implementation of preventive mental health programs by local health departments (LHDs). Multilevel nationally representative data sets were linked to test the hypothesis that LHDs' provision of preventive mental health programs was associated with cost savings. A generalized linear model with log link and gamma distribution and state-fixed effects was used to estimate the association between LHDs' mental illness prevention services and total health care expenditures per person per year for adults aged 18 years and older. The main outcome measure was the annual total health care expenditure per person. The findings indicated that LHD provision of population-based prevention of mental illness was associated with an $824 reduction (95% confidence interval: -$1,562.94 to -$85.42, P < 0.05) in annual health care costs per person, after controlling for individual, LHD, community, and state characteristics. LHDs can play a critical role in establishing an integrated health care model. Their impact, however, has often been underestimated or neglected. Results showed that a small investment in LHDs may yield substantial cost savings at the societal level. The findings of this research are critical to inform policy decisions for the expansion of the Public Health 3.0 infrastructure.

  8. Optimal Facility Location Tool for Logistics Battle Command (LBC)

    DTIC Science & Technology

    2015-08-01

    Appendix B. VBA Code; Appendix C. Story... "should city planners have located emergency service facilities so that all households (the demand) had equal access to coverage?" The critical... programming language called Visual Basic for Applications (VBA). CPLEX is a commercial solver for linear, integer, and mixed integer linear programming problems

  9. Fitting program for linear regressions according to Mahon (1996)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trappitsch, Reto G.

    2018-01-09

    This program takes the user's input data and fits a linear regression to it using the prescription presented by Mahon (1996). Compared with the commonly used York fit, this method has the correct prescription for measurement-error propagation. This software should facilitate the proper fitting of measurements with a simple interface.
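    Mahon's prescription handles uncertainties in both coordinates and their correlation; a full reimplementation is beyond a short sketch. The simpler weighted least-squares case (uncertainties in y only) below shows the kind of error propagation involved; the data and names are illustrative, not from this software:

```python
def weighted_linear_fit(xs, ys, sigmas):
    """Weighted least squares y = a + b*x with per-point y-uncertainties.
    Returns (a, b, sd_a, sd_b) using the standard propagation formulas:
    weights w_i = 1/sigma_i^2, Delta = S*Sxx - Sx^2."""
    w = [1.0 / s ** 2 for s in sigmas]
    S = sum(w)
    Sx = sum(wi * x for wi, x in zip(w, xs))
    Sy = sum(wi * y for wi, y in zip(w, ys))
    Sxx = sum(wi * x * x for wi, x in zip(w, xs))
    Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    delta = S * Sxx - Sx ** 2
    a = (Sxx * Sy - Sx * Sxy) / delta   # intercept
    b = (S * Sxy - Sx * Sy) / delta     # slope
    return a, b, (Sxx / delta) ** 0.5, (S / delta) ** 0.5

# Exact line y = 1 + 2x with unit uncertainties:
a, b, sd_a, sd_b = weighted_linear_fit([0, 1, 2, 3], [1, 3, 5, 7], [1, 1, 1, 1])
print(a, b)  # → 1.0 2.0
```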

  10. A comprehensive linear programming tool to optimize formulations of ready-to-use therapeutic foods: An application to Ethiopia

    USDA-ARS?s Scientific Manuscript database

    Ready-to-use therapeutic food (RUTF) is the standard of care for children suffering from noncomplicated severe acute malnutrition (SAM). The objective was to develop a comprehensive linear programming (LP) tool to create novel RUTF formulations for Ethiopia. A systematic approach that surveyed inter...

  11. Secret Message Decryption: Group Consulting Projects Using Matrices and Linear Programming

    ERIC Educational Resources Information Center

    Gurski, Katharine F.

    2009-01-01

    We describe two short group projects for finite mathematics students that incorporate matrices and linear programming into fictional consulting requests presented as a letter to the students. The students are required to use mathematics to decrypt secret messages in one project involving matrix multiplication and inversion. The second project…

  12. LCPT: a program for finding linear canonical transformations. [In MACSYMA]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Char, B.W.; McNamara, B.

    This article describes a MACSYMA program to compute symbolically a canonical linear transformation between coordinate systems. The difficulties encountered in implementing this program for a small canonical physics problem are also discussed, along with the implications that may be drawn from such difficulties about widespread MACSYMA usage by the community of computational/theoretical physicists.

  13. A Test of a Linear Programming Model as an Optimal Solution to the Problem of Combining Methods of Reading Instruction

    ERIC Educational Resources Information Center

    Mills, James W.; And Others

    1973-01-01

    The study reported here tested an application of the Linear Programming Model at the Reading Clinic of Drew University. Results, while not conclusive, indicate that this approach yields greater gains in speed scores than a traditional approach for this population. (Author)

  14. Visual, Algebraic and Mixed Strategies in Visually Presented Linear Programming Problems.

    ERIC Educational Resources Information Center

    Shama, Gilli; Dreyfus, Tommy

    1994-01-01

    Identified and classified solution strategies of (n=49) 10th-grade students who were presented with linear programming problems in a predominantly visual setting in the form of a computerized game. Visual strategies were developed more frequently than either algebraic or mixed strategies. Appendix includes questionnaires. (Contains 11 references.)…

  15. Quantile regression models of animal habitat relationships

    USGS Publications Warehouse

    Cade, Brian S.

    2003-01-01

    Typically, not all factors that limit an organism are measured and included in statistical models used to investigate relationships with their environment. If important unmeasured variables interact multiplicatively with the measured variables, the statistical models often will have heterogeneous response distributions with unequal variances. Quantile regression is an approach for estimating the conditional quantiles of a response variable distribution in the linear model, providing a more complete view of possible causal relationships between variables in ecological processes. Chapter 1 introduces quantile regression and discusses the ordering characteristics, interval nature, sampling variation, weighting, and interpretation of estimates for homogeneous and heterogeneous regression models. Chapter 2 evaluates performance of quantile rankscore tests used for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1). A permutation F test maintained better Type I errors than the Chi-square T test for models with smaller n, greater number of parameters p, and more extreme quantiles τ. Both versions of the test required weighting to maintain correct Type I errors when there was heterogeneity under the alternative model. An example application related trout densities to stream channel width:depth. Chapter 3 evaluates a drop-in-dispersion, F-ratio-like permutation test for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1). Chapter 4 simulates from a large (N = 10,000) finite population representing grid areas on a landscape to demonstrate various forms of hidden bias that might occur when the effect of a measured habitat variable on some animal was confounded with the effect of another unmeasured variable (spatially and not spatially structured). 
Depending on whether interactions of the measured habitat and unmeasured variable were negative (interference interactions) or positive (facilitation interactions), either upper (τ > 0.5) or lower (τ < 0.5) quantile regression parameters were less biased than mean rate parameters. Sampling (n = 20 - 300) simulations demonstrated that confidence intervals constructed by inverting rankscore tests provided valid coverage of these biased parameters. Quantile regression was used to estimate effects of physical habitat resources on a bivalve mussel (Macomona liliana) in a New Zealand harbor by modeling the spatial trend surface as a cubic polynomial of location coordinates.
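    The estimates discussed above minimize the asymmetric "check" (pinball) loss. A compact way to see the idea is the constant-model case: the τ-th sample quantile minimizes that loss over the data. A minimal sketch (data are hypothetical, and candidates are restricted to the observed values, which is sufficient for this loss):

```python
def pinball_loss(tau, residuals):
    """Check-function loss: tau*r for r >= 0, (tau - 1)*r for r < 0."""
    return sum(tau * r if r >= 0 else (tau - 1) * r for r in residuals)

def sample_quantile(tau, ys):
    """The tau-th sample quantile minimizes the pinball loss over a
    constant model; a minimizer always lies among the data points."""
    return min(ys, key=lambda q: pinball_loss(tau, [y - q for y in ys]))

# For tau = 0.5 the minimizer is the median, robust to the outlier 100:
print(sample_quantile(0.5, [1, 2, 3, 4, 100]))  # → 3
```

Linear quantile regression replaces the constant q with a linear predictor and minimizes the same loss, which is what makes upper and lower quantile estimates respond differently to the hidden-bias scenarios described above.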

  16. A model for managing sources of groundwater pollution

    USGS Publications Warehouse

    Gorelick, Steven M.

    1982-01-01

    The waste disposal capacity of a groundwater system can be maximized while maintaining water quality at specified locations by using a groundwater pollutant source management model that is based upon linear programming and numerical simulation. The decision variables of the management model are solute waste disposal rates at various facilities distributed over space. A concentration response matrix is used in the management model to describe transient solute transport and is developed using the U.S. Geological Survey solute transport simulation model. The management model was applied to a complex hypothetical groundwater system. Large-scale management models were formulated as dual linear programming problems to reduce numerical difficulties and computation time. Linear programming problems were solved using a numerically stable, available code. Optimal solutions to problems with successively longer management time horizons indicated that disposal schedules at some sites are relatively independent of the number of disposal periods. Optimal waste disposal schedules exhibited pulsing rather than constant disposal rates. Sensitivity analysis using parametric linear programming showed that a sharp reduction in total waste disposal potential occurs if disposal rates at any site are increased beyond their optimal values.

  17. Systematic review of serum steroid reference intervals developed using mass spectrometry.

    PubMed

    Tavita, Nevada; Greaves, Ronda F

    2017-12-01

    The aim of this study was to perform a systematic review of the published literature to determine the available serum/plasma steroid reference intervals generated by mass spectrometry (MS) methods across all age groups in healthy subjects and to suggest recommendations to achieve common MS based reference intervals for serum steroids. MEDLINE, EMBASE and PubMed databases were used to conduct a comprehensive search for English language, MS-based reference interval studies for serum/plasma steroids. Selection of steroids to include was based on those listed in the Royal College of Pathologists of Australasia Quality Assurance Programs, Chemical Pathology, Endocrine Program. This methodology has been registered onto the PROSPERO International prospective register of systematic reviews (ID number: CRD42015029637). After accounting for duplicates, a total of 60 manuscripts were identified through the search strategy. Following critical evaluation, a total of 16 studies were selected. Of the 16 studies, 12 reported reference intervals for testosterone, 11 for 17 hydroxy-progesterone, nine for androstenedione, six for cortisol, three for progesterone, two for dihydrotestosterone and only one for aldosterone and dehydroepiandrosterone sulphate. No studies established MS-based reference intervals for oestradiol. As far as we are aware, this report provides the first comparison of the peer reviewed literature for serum/plasma steroid reference intervals generated by MS-based methods. The reference intervals based on these published studies can be used to inform the process to develop common reference intervals, and agreed reporting units for mass spectrometry based steroid methods. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. Long Detection Programming in Single-Chamber Defibrillators Reduces Unnecessary Therapies and Mortality: The ADVANCE III Trial.

    PubMed

    Gasparini, Maurizio; Lunati, Maurizio G; Proclemer, Alessandro; Arenal, Angel; Kloppe, Axel; Martínez Ferrer, Josè B; Hersi, Ahmad S; Gulaj, Marcin; Wijffels, Maurits C E; Santi, Elisabetta; Manotta, Laura; Varma, Niraj

    2017-11-01

    This study sought to evaluate the effects of programming a long detection in single-chamber (VVI) implantable cardioverter-defibrillators (ICDs) in the multicenter prospective ADVANCE III (Avoid DeliVering TherApies for Non-sustained Arrhythmias in ICD PatiEnts III) trial. Programming strategies may reduce unnecessary ICD shocks and their adverse effects but to date have been described only for dual-chamber ICDs. A total of 545 subjects (85% male; atrial fibrillation 25%, left ventricular ejection fraction 31%, ischemic etiology 68%, secondary prevention indications 32%) receiving a VVI ICD were randomized to long detection (30 of 40 intervals) or standard programming (18 of 24 intervals) based on device type, atrial fibrillation history, and indication. In both arms, antitachycardia pacing (ATP) therapy during charging was programmed for episodes with cycle length 320 to 200 ms and shock only for cycle length <200 ms. Wavelet and stability functions were enabled. Therapies delivered were compared using a negative binomial regression model. A total of 267 patients were randomized to long detection and 278 to the control group. Median follow-up was 12 months. One hundred twelve therapies (shocks and ATP) occurred in the long detection arm versus 257 in the control arm, for a 48% reduction with 30 of 40 intervals (95% confidence interval [CI]: 0.36 to 0.76; p = 0.002). In the long detection arm, overall shocks were reduced by 40% compared to the control arm (48 vs. 24; 95% CI: 0.38 to 0.94; p = 0.026) and appropriate shocks by 51% (34 vs. 74; 95% CI: 0.26 to 0.94; p = 0.033). Syncopal events did not differ between arms, but survival improved in the long detection arm. Among patients implanted with a VVI ICD, programming with the long detection interval significantly reduced appropriate therapies, shocks, and all-cause mortality. (Avoid DeliVering TherApies for Non-sustained Arrhythmias in ICD PatiEnts III [ADVANCEIII]; NCT00617175). Copyright © 2017 The Authors. 
Published by Elsevier Inc. All rights reserved.

  19. CHO cell enlargement oscillates with a temperature-compensated period of 24 min

    NASA Technical Reports Server (NTRS)

    Pogue, R.; Morre, D. M.; Morre, D. J.

    2000-01-01

    The rate of increase in cell area of CHO cells, measured at intervals of 1 min using a light microscope equipped with a video measurement system, oscillated with a minimum period of about 24 min. The pattern of oscillations paralleled those of the 24 min period observed with the oxidation of NADH by an external cell surface or plasma membrane NADH oxidase. The increase in cell area was non-linear. Intervals of rapid increase in area alternated with intervals of rapid decrease in area. The length of the 24 min period was temperature-compensated (approximately the same when measured at 14 degrees C, 24 degrees C or 34 degrees C) while the rate of cell enlargement increased with temperature over this same range of temperatures.

  20. Nutrient density score of typical Indonesian foods and dietary formulation using linear programming.

    PubMed

    Jati, Ignasius Radix A P; Vadivel, Vellingiri; Nöhr, Donatus; Biesalski, Hans Konrad

    2012-12-01

    The present research aimed to analyse the nutrient density (ND), nutrient adequacy score (NAS) and energy density (ED) of Indonesian foods and to formulate a balanced diet using linear programming. Data on typical Indonesian diets were obtained from the Indonesian Socio-Economic Survey 2008. ND was investigated for 122 Indonesian foods. NAS was calculated for single nutrients such as Fe, Zn and vitamin A. Correlation analysis was performed between ND and ED, as well as between monthly expenditure class and food consumption pattern in Indonesia. Linear programming calculations were performed using the software POM-QM for Windows version 3. Republic of Indonesia, 2008. Public households (n 68 800). Vegetables had the highest ND of the food groups, followed by animal-based foods, fruits and staple foods. Based on NAS, the top ten food items for each food group were identified. Most of the staple foods had high ED and contributed towards daily energy fulfillment, followed by animal-based foods, vegetables and fruits. Commodities with high ND tended to have low ED. Linear programming could be used to formulate a balanced diet. In contrast to staple foods, purchases of fruit, vegetables and animal-based foods increased with the rise of monthly expenditure. People should select food items based on ND and NAS to alleviate micronutrient deficiencies in Indonesia. Dietary formulation calculated using linear programming to achieve RDA levels for micronutrients could be recommended for different age groups of the Indonesian population.
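    The authors used the POM-QM package for their linear programming calculations. As a self-contained illustration of the underlying optimization, here is a toy two-food diet LP solved by enumerating the vertices of the feasible region (the foods, costs, and nutrient coefficients are invented for the sketch, not taken from the study):

```python
from itertools import combinations

def solve_diet_lp(cost, constraints):
    """Minimize cost[0]*x + cost[1]*y subject to a*x + b*y >= c for each
    (a, b, c) in constraints, with x, y >= 0. An optimal solution of an
    LP lies at a vertex, so for two variables we can simply intersect
    every pair of constraint boundaries and keep the cheapest feasible
    point (adequate for a toy problem; real solvers use simplex or
    interior-point methods)."""
    cons = constraints + [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]  # x >= 0, y >= 0
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel boundaries never intersect
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y >= c - 1e-9 for a, b, c in cons):
            value = cost[0] * x + cost[1] * y
            if best is None or value < best[0]:
                best = (value, x, y)
    return best

# Invented example: food 1 costs 0.50/serving, food 2 costs 0.20/serving.
cost = (0.5, 0.2)
constraints = [(10.0, 2.0, 20.0),  # protein: 10x + 2y >= 20
               (1.0, 3.0, 9.0)]   # iron:     x + 3y >= 9
print(solve_diet_lp(cost, constraints))  # → (1.25, 1.5, 2.5): cost, servings
```

A realistic formulation adds one constraint per micronutrient RDA and one variable per candidate food, which is exactly the structure a general-purpose LP solver handles.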

  1. Nutrient profiling can help identify foods of good nutritional quality for their price: a validation study with linear programming.

    PubMed

    Maillot, Matthieu; Ferguson, Elaine L; Drewnowski, Adam; Darmon, Nicole

    2008-06-01

    Nutrient profiling systems rank foods based on their nutrient content; such systems may help identify foods with a good nutritional quality for their price. This hypothesis was tested using diet modeling with linear programming. Analyses were undertaken using food intake data from the nationally representative French INCA (enquête Individuelle et Nationale sur les Consommations Alimentaires) survey and its associated food composition and price database. For each food, a nutrient profile score was defined as the ratio between the previously published nutrient density score (NDS) and the limited nutrient score (LIM); a nutritional quality for price indicator was developed and calculated from the relationship between its NDS:LIM and energy cost (in euro/100 kcal). We developed linear programming models to design diets that fulfilled increasing levels of nutritional constraints at a minimal cost. The median NDS:LIM values of foods selected in modeled diets increased as the levels of nutritional constraints increased (P = 0.005). In addition, the proportion of foods with a good nutritional quality for price indicator was higher (P < 0.0001) among foods selected (81%) than among foods not selected (39%) in modeled diets. This agreement between the linear programming and the nutrient profiling approaches indicates that nutrient profiling can help identify foods of good nutritional quality for their price. Linear programming is a useful tool for testing nutrient profiling systems and validating the concept of nutrient profiling.

  2. From diets to foods: using linear programming to formulate a nutritious, minimum-cost porridge mix for children aged 1 to 2 years.

    PubMed

    De Carvalho, Irene Stuart Torrié; Granfeldt, Yvonne; Dejmek, Petr; Håkansson, Andreas

    2015-03-01

    Linear programming has been used extensively as a tool for nutritional recommendations. Extending the methodology to food formulation presents new challenges, since not all combinations of nutritious ingredients will produce an acceptable food. Furthermore, it would help in implementation and in ensuring the feasibility of the suggested recommendations. To extend the previously used linear programming methodology from diet optimization to food formulation using consistency constraints. In addition, to exemplify usability using the case of a porridge mix formulation for emergency situations in rural Mozambique. The linear programming method was extended with a consistency constraint based on previously published empirical studies on swelling of starch in soft porridges. The new method was exemplified using the formulation of a nutritious, minimum-cost porridge mix for children aged 1 to 2 years for use as a complete relief food, based primarily on local ingredients, in rural Mozambique. A nutritious porridge fulfilling the consistency constraints was found; however, the minimum cost was unfeasible with local ingredients only. This illustrates the challenges in formulating nutritious yet economically feasible foods from local ingredients. The high cost was caused by the high cost of mineral-rich foods. A nutritious, low-cost porridge that fulfills the consistency constraints was obtained by including supplements of zinc and calcium salts as ingredients. The optimizations were successful in fulfilling all constraints and provided a feasible porridge, showing that the extended constrained linear programming methodology provides a systematic tool for designing nutritious foods.

  3. Exploring nonlinear feature space dimension reduction and data representation in breast CADx with Laplacian eigenmaps and t-SNE.

    PubMed

    Jamieson, Andrew R; Giger, Maryellen L; Drukker, Karen; Li, Hui; Yuan, Yading; Bhooshan, Neha

    2010-01-01

    In this preliminary study, recently developed unsupervised nonlinear dimension reduction (DR) and data representation techniques were applied to computer-extracted breast lesion feature spaces across three separate imaging modalities: Ultrasound (U.S.) with 1126 cases, dynamic contrast enhanced magnetic resonance imaging with 356 cases, and full-field digital mammography with 245 cases. Two methods for nonlinear DR were explored: Laplacian eigenmaps [M. Belkin and P. Niyogi, "Laplacian eigenmaps for dimensionality reduction and data representation," Neural Comput. 15, 1373-1396 (2003)] and t-distributed stochastic neighbor embedding (t-SNE) [L. van der Maaten and G. Hinton, "Visualizing data using t-SNE," J. Mach. Learn. Res. 9, 2579-2605 (2008)]. These methods attempt to map originally high dimensional feature spaces to more human interpretable lower dimensional spaces while preserving both local and global information. The properties of these methods as applied to breast computer-aided diagnosis (CADx) were evaluated in the context of malignancy classification performance as well as in the visual inspection of the sparseness within the two-dimensional and three-dimensional mappings. Classification performance was estimated by using the reduced dimension mapped feature output as input into both linear and nonlinear classifiers: Markov chain Monte Carlo based Bayesian artificial neural network (MCMC-BANN) and linear discriminant analysis. The new techniques were compared to previously developed breast CADx methodologies, including automatic relevance determination and linear stepwise (LSW) feature selection, as well as a linear DR method based on principal component analysis. Using ROC analysis and 0.632+ bootstrap validation, 95% empirical confidence intervals were computed for each classifier's AUC performance. In the large U.S. data set, sample high-performance results include AUC0.632+ = 0.88 with 95% empirical bootstrap interval [0.787; 0.895] for 13 ARD-selected features and AUC0.632+ = 0.87 with interval [0.817; 0.906] for four LSW-selected features, compared to a 4D t-SNE mapping (from the original 81D feature space) giving AUC0.632+ = 0.90 with interval [0.847; 0.919], all using the MCMC-BANN. Preliminary results appear to indicate capability for the new methods to match or exceed classification performance of current advanced breast lesion CADx algorithms. While not appropriate as a complete replacement of feature selection in CADx problems, DR techniques offer a complementary approach, which can aid elucidation of additional properties associated with the data. Specifically, the new techniques were shown to possess the added benefit of delivering sparse lower dimensional representations for visual interpretation, revealing intricate data structure of the feature space.
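    Laplacian eigenmaps and t-SNE are too involved for a short snippet, but the linear DR baseline the study compares against (principal component analysis) can be sketched with power iteration on the sample covariance matrix. The synthetic data here are illustrative only:

```python
def pca_leading_direction(points, iters=200):
    """Return the leading principal direction of a point cloud: center the
    data, form the sample covariance matrix, and run power iteration to
    extract its dominant eigenvector (the first PCA axis)."""
    d, n = len(points[0]), len(points)
    means = [sum(p[k] for p in points) / n for k in range(d)]
    centered = [[p[k] - means[k] for k in range(d)] for p in points]
    cov = [[sum(row[i] * row[j] for row in centered) / (n - 1)
            for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):                     # power iteration
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Synthetic 2D cloud stretched along the x-axis: the leading direction
# recovered is (close to) the x-axis.
print(pca_leading_direction([(float(x), 0.05 * x) for x in range(-5, 6)]))
```

Projecting the centered data onto the first few such directions gives the linear low-dimensional mapping; t-SNE instead optimizes a nonlinear embedding that preserves local neighborhood structure.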

  4. Chromosome structures: reduction of certain problems with unequal gene content and gene paralogs to integer linear programming.

    PubMed

    Lyubetsky, Vassily; Gershgorin, Roman; Gorbunov, Konstantin

    2017-12-06

    Chromosome structure is a very limited model of the genome, including information about its chromosomes such as their linear or circular organization, the order of genes on them, and the DNA strand encoding a gene. Gene lengths, nucleotide composition, and intergenic regions are ignored. Although highly incomplete, such a structure can be used in many cases, e.g., to reconstruct phylogeny and evolutionary events, to identify gene synteny, regulatory elements and promoters (considering highly conserved elements), etc. Three problems are considered; all assume unequal gene content and the presence of gene paralogs. The distance problem is to determine the minimum number of operations required to transform one chromosome structure into another, together with the corresponding transformation itself, including the identification of paralogs in the two structures. We use the DCJ model, which is one of the most studied combinatorial rearrangement models. Double-, sesqui-, and single-operations as well as deletion and insertion of a chromosome region are considered in the model; the single operations comprise cut and join. In the reconstruction problem, a phylogenetic tree with chromosome structures in the leaves is given. It is necessary to assign structures to the inner nodes of the tree so as to minimize the sum of distances between the terminal structures of each edge, and to identify the mutual paralogs across a fairly large set of structures. A linear algorithm is known for the distance problem without paralogs, while the presence of paralogs makes it NP-hard. If paralogs are allowed but the insertion and deletion operations are missing (and special constraints are imposed), a reduction of the distance problem to integer linear programming is known. The reconstruction problem appears to be NP-hard even in the absence of paralogs. The contig problem is to find the optimal arrangement of each given set of contigs, which also includes the mutual identification of paralogs.
We proved that these problems can be reduced to integer linear programming formulations, which allows each of them to be solved through a very special case of the integer linear programming machinery. The results were tested on synthetic and biological samples. Reducing these three well-known problems to such a special case of integer linear programming constitutes a new method for their solution. Integer linear programming is clearly among the main computational methods; it is generally accepted to be fast on average, and computation systems specifically targeted at it are available. The remaining challenges are to reduce the size of the corresponding integer linear programming formulations and to incorporate a more detailed biological concept into our model of the reconstruction.

  5. Interval Colorectal Cancers following Guaiac Fecal Occult Blood Testing in the Ontario ColonCancerCheck Program.

    PubMed

    Paszat, Lawrence; Sutradhar, Rinku; Tinmouth, Jill; Baxter, Nancy; Rabeneck, Linda

    2016-01-01

    Background. This work examines the occurrence of interval colorectal cancers (CRCs) in the Ontario ColonCancerCheck (CCC) program. We define interval CRC as CRC diagnosed within 2 years following a normal guaiac fecal occult blood test (gFOBT). Methods. Persons aged 50-74 who completed a baseline CCC gFOBT kit in 2008 and 2009, without a prior history of CRC or recent colonoscopy, flexible sigmoidoscopy, or gFOBT, were identified. Rates of CRC following positive and normal results at baseline and subsequent gFOBT screens were computed, and overall survival was compared between those following positive and normal results. Results. Interval CRC was diagnosed within 24 months among 0.16% of persons with normal results at the baseline screen and among 0.18% of those with normal results at the subsequent screen. Interval cancers comprised 38.70% of CRCs following the baseline screen and 50.86% following the subsequent screen. Adjusting for age and sex, the hazard ratio (HR) for death following interval cancer compared to CRC following a positive result was 1.65 (1.32, 2.05) following the first screen and 1.71 (1.00, 2.91) following the second screen. Conclusion. Interval CRCs comprise a significant proportion of CRCs diagnosed within 2 years after gFOBT screening and are associated with a higher risk of death.

  6. The addition of upper cervical manipulative therapy in the treatment of patients with fibromyalgia: a randomized controlled trial.

    PubMed

    Moustafa, Ibrahim M; Diab, Aliaa A

    2015-07-01

    The aim of this study was to investigate the immediate and long-term effects of a one-year multimodal program, with the addition of upper cervical manipulative therapy, on fibromyalgia management outcomes as well as three-dimensional (3D) postural measures. This randomized clinical trial with one-year follow-up was completed at the research laboratory of our university. A total of 120 patients (52 female) with fibromyalgia syndrome (FMS) and definite C1-2 joint dysfunction were randomly assigned to the control group or an experimental group. Both groups received a multimodal program; additionally, the experimental group received upper cervical manipulative therapy. The primary outcome was the Fibromyalgia Impact Questionnaire (FIQ), whereas secondary outcomes included the Pain Catastrophizing Scale (PCS), algometric score, Pittsburgh Sleep Quality Index (PSQI), Beck Anxiety Inventory (BAI), Beck Depression Inventory (BDI), and 3D postural measures. Measures were assessed at three time intervals: baseline, 12 weeks, and 1 year after the 12-week follow-up. The general linear model with repeated measures indicated a significant group × time effect in favor of the experimental group on the measures of 3D postural parameters (P < .0005), FIQ (P < .0005), PCS (P < .0005), algometric score (P < .0005), PSQI (P < .0005), BAI (P < .0005), and BDI (P < .0005). The addition of upper cervical manipulative therapy to a multimodal program is beneficial in treating patients with FMS.

  7. Internal null controllability of a linear Schrödinger-KdV system on a bounded interval

    NASA Astrophysics Data System (ADS)

    Araruna, Fágner D.; Cerpa, Eduardo; Mercado, Alberto; Santos, Maurício C.

    2016-01-01

    The control of a linear dispersive system coupling a Schrödinger and a linear Korteweg-de Vries equation is studied in this paper. The system can be viewed as three coupled real-valued equations by taking real and imaginary parts in the Schrödinger equation. The internal null controllability is proven by using either one complex-valued control on the Schrödinger equation or two real-valued controls, one on each equation. Notice that the single Schrödinger equation is not known to be controllable with a real-valued control. The standard duality method is used to reduce the controllability property to an observability inequality, which is obtained by means of a Carleman estimates approach.
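The duality method mentioned here turns null controllability into an observability inequality for the adjoint system; schematically, with ω the control region and C a constant coming from the Carleman estimate (this is the generic form of such inequalities, not the paper's exact statement):

```latex
\|\varphi(0)\|_{L^2}^{2} \;\le\; C \int_{0}^{T}\!\!\int_{\omega} |\varphi(t,x)|^{2}\,dx\,dt
\qquad \text{for every solution } \varphi \text{ of the adjoint system.}
```

Proving this estimate for the coupled adjoint system is where the Carleman weights enter.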

  8. Technical note: Instantaneous sampling intervals validated from continuous video observation for behavioral recording of feedlot lambs.

    PubMed

    Pullin, A N; Pairis-Garcia, M D; Campbell, B J; Campler, M R; Proudfoot, K L

    2017-11-01

    When considering methodologies for collecting behavioral data, continuous sampling provides the most complete and accurate data set, whereas instantaneous sampling can provide similar results while increasing the efficiency of data collection. However, instantaneous time intervals require validation to ensure accurate estimation of the data. Therefore, the objective of this study was to validate scan sampling intervals for lambs housed in a feedlot environment. Feeding, lying, standing, drinking, locomotion, and oral manipulation were measured on 18 crossbred lambs housed in an indoor feedlot facility for 14 h (0600-2000 h). Data from continuous sampling were compared with data from instantaneous scan sampling intervals of 5, 10, 15, and 20 min using a linear regression analysis. Three criteria determined whether a time interval accurately estimated behaviors: 1) R² ≥ 0.90, 2) slope not statistically different from 1 (P > 0.05), and 3) intercept not statistically different from 0 (P > 0.05). Estimations for lying behavior were accurate up to 20-min intervals, whereas feeding and standing behaviors were accurate only at 5-min intervals (i.e., met all 3 regression criteria). Drinking, locomotion, and oral manipulation demonstrated poor associations (R² < 0.90) for all tested intervals. The results from this study suggest that a 5-min instantaneous sampling interval will accurately estimate lying, feeding, and standing behaviors for lambs housed in a feedlot, whereas continuous sampling is recommended for the remaining behaviors. This methodology will contribute toward the efficiency, accuracy, and transparency of future behavioral data collection in lamb behavior research.
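The three regression criteria can be checked mechanically; a sketch with scipy on simulated data (the observations, noise level, and threshold names are invented stand-ins, not the study's data):

```python
import numpy as np
from scipy import stats

def validate_interval(continuous, scanned, alpha=0.05):
    """Regress scan-sample estimates on continuous observations and check
    the three criteria: R^2 >= 0.90, slope == 1, intercept == 0."""
    res = stats.linregress(continuous, scanned)
    n = len(continuous)
    # Two-sided t-tests of slope against 1 and intercept against 0
    t_slope = (res.slope - 1.0) / res.stderr
    t_int = res.intercept / res.intercept_stderr
    p_slope = 2 * stats.t.sf(abs(t_slope), n - 2)
    p_int = 2 * stats.t.sf(abs(t_int), n - 2)
    r2 = res.rvalue ** 2
    return r2 >= 0.90 and p_slope > alpha and p_int > alpha

# Simulated example: 18 lambs, scan estimates tracking continuous data closely
rng = np.random.default_rng(1)
cont = rng.uniform(0, 600, 18)          # e.g., minutes lying per lamb
scan = cont + rng.normal(0, 10, 18)     # small sampling error
print(validate_interval(cont, scan))    # tight agreement should typically pass
```

An interval passes only when all three criteria hold, mirroring the "met all 3 regression criteria" rule above.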

  9. QT-RR relationships and suitable QT correction formulas for halothane-anesthetized dogs.

    PubMed

    Tabo, Mitsuyasu; Nakamura, Mikiko; Kimura, Kazuya; Ito, Shigeo

    2006-10-01

    Several QT correction (QTc) formulas have been used for assessing the QT liability of drugs. However, they are known to under- and over-correct the QT interval and tend to be specific to species and experimental conditions. The purpose of this study was to determine a suitable formula for halothane-anesthetized dogs, a model highly sensitive to drug-induced QT interval prolongation. Twenty dogs were anesthetized with 1.5% halothane, and the relationship between the QT and RR intervals was obtained by changing the heart rate under atrial pacing conditions. The QT interval was corrected for the RR interval by applying 4 published formulas (Bazett, Fridericia, Van de Water, and Matsunaga); Fridericia's formula (QTcF = QT/RR^0.33) showed the least slope and lowest R² value for the linear regression of QTc intervals against RR intervals, indicating that it most effectively dissociated the QTc interval from changes in heart rate. An optimized formula (QTcX = QT/RR^0.3879), derived by analysis of covariance, represents a correction algorithm superior to Fridericia's formula. For both Fridericia's and the optimized formula, QT-prolonging drugs (d,l-sotalol, astemizole) showed QTc interval prolongation. A non-QT-prolonging drug (d,l-propranolol) failed to prolong the QTc interval. In addition, drug-induced changes in QTcF and QTcX intervals were highly correlated with those of the QT interval paced at a cycle length of 500 msec. These findings suggest that Fridericia's formula and the optimized formula, the latter slightly better, are suitable for correcting the QT interval in halothane-anesthetized dogs and help to evaluate the potential QT prolongation of drugs with high accuracy.

  10. Predicting within-herd prevalence of infection with bovine leukemia virus using bulk-tank milk antibody levels.

    PubMed

    Nekouei, Omid; Stryhn, Henrik; VanLeeuwen, John; Kelton, David; Hanna, Paul; Keefe, Greg

    2015-11-01

    Enzootic bovine leukosis (EBL) is an economically important infection of dairy cattle caused by bovine leukemia virus (BLV). Estimating the prevalence of BLV within dairy herds is a fundamental step towards pursuing efficient control programs. The objectives of this study were: (1) to determine the prevalence of BLV infection at the herd level using a bulk-tank milk (BTM) antibody ELISA in the Maritime region of Canada (3 provinces); and (2) to develop appropriate statistical models for predicting within-herd prevalence of BLV infection using BTM antibody ELISA titers. During 2013, three monthly BTM samples were collected from all dairy farms in the Maritime region of Canada (n=623) and tested for BLV milk antibodies using a commercial indirect ELISA. Based on the mean of the 3 BTM titers, 15 strata of herds (5 per province) were defined. From each stratum, 6 herds were randomly selected for a total of 90 farms. Within every selected herd, an additional BTM sample was taken (round 4), approximately 2 months after the third round. On the same day of BTM sampling, all cows that contributed milk to the fourth BTM sample were individually tested for BLV milk antibodies (n=6111) to estimate the true within-herd prevalence for the 90 herds. The association between true within-herd prevalence of BLV and means of various combinations of the BTM titers was assessed using linear regression models, adjusting for the stratified random sampling design. Herd-level prevalence of BLV in the region was 90.8%. In the individual testing, 30.4% of cows were positive. True within-herd prevalences ranged from 0 to 94%. All linear regression models were able to predict the true within-herd prevalence of BLV reasonably well (R² > 0.69). Predictions from the models were particularly accurate for the low-to-medium ranges of BTM titers.
In general, as a greater number of the four repeated BTM titers were incorporated in the models, narrower confidence intervals around the prediction lines were achieved. The model including all 4 BTM tests as the predictor had the best fit, although the models using 2 and 3 BTM tests provided similar results to 4 repeated tests. Therefore, testing two or three BTM samples with approximately two-month intervals would provide relatively precise estimates for the potential number of infected cows in a herd. The developed models in this study could be applied to control and eradication programs for BLV as cost-effective tools. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Consideration of computer limitations in implementing on-line controls. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Roberts, G. K.

    1976-01-01

    A formal statement of the optimal control problem is formulated which includes the discretization interval as an optimization parameter, and this is extended to include selection of a control algorithm as part of the optimization procedure. The performance of the scalar linear system depends on the discretization interval. Discrete-time versions of the output feedback regulator and an optimal compensator are developed, and these results are used to present an example of a system for which fast partial-state-feedback control minimizes a quadratic cost better than either full-state feedback control or a compensator.

  12. On ``Overestimation-free Computational Version of Interval Analysis''

    NASA Astrophysics Data System (ADS)

    Popova, Evgenija D.

    2013-10-01

    The transformation of interval parameters into trigonometric functions, proposed in Int. J. Comput. Meth. Eng. Sci. Mech., vol. 13, pp. 319-328 (2012), is not motivated in comparison to the infinitely many equivalent algebraic transformations. The conclusions about the efficacy of the methodology used are based on incorrect comparisons between solutions of different problems. We show theoretically, and in the examples considered in the commented article, that changing the number of the parameters in a system of linear algebraic equations may change the initial problem, respectively, its solution set. We also correct various misunderstandings and bugs that appear in the article noted above.

  13. An acoustic survey of beaked whales at Cross Seamount near Hawaii.

    PubMed

    McDonald, Mark A; Hildebrand, John A; Wiggins, Sean M; Johnston, David W; Polovina, Jeffrey J

    2009-02-01

    An acoustic record from Cross Seamount, southwest of Hawaii, revealed sounds characteristic of beaked whale echolocation at the same relative abundance year-round (270 of 356 days), occurring almost entirely at night. The most common sound had a linear frequency upsweep from 35 to 100 kHz (the bandwidth of the recording), an interpulse interval of 0.11 s, and a duration of at least 932 μs. A less common upsweep sound with a shorter interpulse interval and slower sweep rate was also present. Sounds matching Cuvier's beaked whale were not detected, and Blainville's beaked whale sounds were detected on only one occasion.

  14. The Next Linear Collider Program

    Science.gov Websites

    International Study Group (ISG) meetings of the Next Linear Collider program: Eleventh Linear Collider International Study Group at KEK, December 16-19, 2003; Tenth Linear Collider International Study Group at SLAC, June 2003; Ninth Linear Collider International Study Group at KEK, December 10-13.

  15. Realtime Multichannel System for Beat to Beat QT Interval Variability

    NASA Technical Reports Server (NTRS)

    Starc, Vito; Schlegel, Todd T.

    2006-01-01

    The measurement of beat-to-beat QT interval variability (QTV) shows clinical promise for identifying several types of cardiac pathology. However, until now, there has been no device capable of displaying, in real time on a beat-to-beat basis, changes in QTV in all 12 conventional leads in a continuously monitored patient. While several software programs have been designed to analyze QTV, heretofore such programs have all involved only a few channels (at most) and/or have required laborious user interaction or offline calculations and postprocessing, limiting their clinical utility. This paper describes a PC-based ECG software program that, in real time, acquires, analyzes, and displays QTV and also PQ interval variability (PQV) in each of the eight independent channels that constitute the 12-lead conventional ECG. The system also processes certain related signals that are derived from singular value decomposition and that help to reduce the overall effects of noise on the real-time QTV and PQV results.

  16. Robust portfolio selection based on asymmetric measures of variability of stock returns

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Tan, Shaohua

    2009-10-01

    This paper addresses a new uncertainty set, the interval random uncertainty set, for robust optimization. The form of the interval random uncertainty set makes it suitable for capturing the downside and upside deviations of real-world data. These deviation measures capture distributional asymmetry and lead to better optimization results. We also apply interval random chance-constrained programming to robust mean-variance portfolio selection under interval random uncertainty sets in the elements of the mean vector and covariance matrix. Numerical experiments with real market data indicate that our approach results in better portfolio performance.

  17. Linking linear programming and spatial simulation models to predict landscape effects of forest management alternatives

    Treesearch

    Eric J. Gustafson; L. Jay Roberts; Larry A. Leefers

    2006-01-01

    Forest management planners require analytical tools to assess the effects of alternative strategies on the sometimes disparate benefits from forests such as timber production and wildlife habitat. We assessed the spatial patterns of alternative management strategies by linking two models that were developed for different purposes. We used a linear programming model (...

  18. A Partitioning and Bounded Variable Algorithm for Linear Programming

    ERIC Educational Resources Information Center

    Sheskin, Theodore J.

    2006-01-01

    An interesting new partitioning and bounded variable algorithm (PBVA) is proposed for solving linear programming problems. The PBVA is a variant of the simplex algorithm which uses a modified form of the simplex method followed by the dual simplex method for bounded variables. In contrast to the two-phase method and the big M method, the PBVA does…

  19. Airborne Tactical Crossload Planner

    DTIC Science & Technology

    2017-12-01

    set out in the Airborne Standard Operating Procedure (ASOP). SUBJECT TERMS: crossload, airborne, optimization, integer linear programming ...they land to their respective sub-mission locations. In this thesis, we formulate and implement an integer linear program called the Tactical... to meet any desired crossload objectives. We demonstrate TCP with two real-world tactical problems from recent airborne operations: one by the

  20. Radar Resource Management in a Dense Target Environment

    DTIC Science & Technology

    2014-03-01

    problem faced by networked MFRs. While relaxing our assumptions concerning information gain presents numerous challenges worth exploring, future research... linear programming; MFR, multifunction phased array radar; MILP, mixed integer linear programming; NATO, North Atlantic Treaty Organization; PDF, probability... INTRODUCTION: Multifunction phased array radars (MFRs) are capable of performing various tasks in rapid succession. The performance of target search

  1. Linear circuit analysis program for IBM 1620 Monitor 2, 1311/1443 data processing system /CIRCS/

    NASA Technical Reports Server (NTRS)

    Hatfield, J.

    1967-01-01

    CIRCS is a modification of the IBSNAP Circuit Analysis Program for use on smaller systems. It retains the basic dc analysis, transient analysis, and FORTRAN 2 formats. It can be used on the IBM 1620/1311 Monitor I Mod 5 system and solves a linear network containing 15 nodes and 45 branches.
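The kind of linear dc network such a program solves reduces to a nodal-admittance system G v = i; a minimal numpy sketch on an invented three-node resistive circuit (this illustrates nodal analysis generally, not the CIRCS input format):

```python
import numpy as np

# Nodal analysis of a small resistive circuit:
# node 1 driven by a 1 A current source, resistors
# R12 = 2 ohm between nodes 1-2, R13 = 4 ohm between nodes 1-3,
# R2g = 1 ohm and R3g = 3 ohm from nodes 2 and 3 to ground.
g12, g13, g2g, g3g = 1 / 2, 1 / 4, 1 / 1, 1 / 3   # conductances (S)

G = np.array([
    [g12 + g13, -g12,       -g13      ],
    [-g12,       g12 + g2g,  0.0      ],
    [-g13,       0.0,        g13 + g3g],
])                                # nodal admittance matrix
i = np.array([1.0, 0.0, 0.0])     # injected currents (A)

v = np.linalg.solve(G, i)         # node voltages (V)
print(np.round(v, 3))
```

Each diagonal entry is the sum of conductances at a node and each off-diagonal entry the negated conductance between two nodes, so assembling G from a branch list is mechanical, which is what made programs like CIRCS practical on small machines.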

  2. LFSPMC: Linear feature selection program using the probability of misclassification

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr.; Marion, B. P.

    1975-01-01

    The computational procedure and associated computer program for a linear feature selection technique are presented. The technique assumes that: a finite number, m, of classes exists; each class is described by an n-dimensional multivariate normal density function of its measurement vectors; the mean vector and covariance matrix for each density function are known (or can be estimated); and the a priori probability for each class is known. The technique produces a single linear combination of the original measurements which minimizes the one-dimensional probability of misclassification defined by the transformed densities.
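In the two-class case, such a single linear combination specializes to the familiar discriminant direction w = (S1 + S2)^-1 (m1 - m2); a sketch of that special case on synthetic data (the program's actual criterion, minimizing the one-dimensional misclassification probability over m classes, is more general than this):

```python
import numpy as np

def fisher_direction(X1, X2):
    """Single linear combination w'x separating two classes, from
    estimated class means and covariances (two-class sketch)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    S = np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False)
    w = np.linalg.solve(S, m1 - m2)
    return w / np.linalg.norm(w)   # unit-length projection direction

# Two Gaussian classes separated along the first coordinate
rng = np.random.default_rng(2)
A = rng.normal([0, 0, 0], 1.0, size=(200, 3))
B = rng.normal([2, 0, 0], 1.0, size=(200, 3))
w = fisher_direction(A, B)
# The direction should load almost entirely on the separating coordinate
print(np.round(np.abs(w), 2))
```

Projecting onto w reduces the multivariate problem to the one-dimensional densities whose overlap the probability-of-misclassification criterion evaluates.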

  3. Evaluation of Measurement Instrument Criterion Validity in Finite Mixture Settings

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.; Li, Tenglong

    2016-01-01

    A method for evaluating the validity of multicomponent measurement instruments in heterogeneous populations is discussed. The procedure can be used for point and interval estimation of criterion validity of linear composites in populations representing mixtures of an unknown number of latent classes. The approach permits also the evaluation of…

  4. SLFP: a stochastic linear fractional programming approach for sustainable waste management.

    PubMed

    Zhu, H; Huang, G H

    2011-12-01

    A stochastic linear fractional programming (SLFP) approach is developed for supporting sustainable municipal solid waste management under uncertainty. The SLFP method can solve ratio optimization problems associated with random information, where chance-constrained programming is integrated into a linear fractional programming framework. It has advantages in: (1) comparing objectives of two aspects, (2) reflecting system efficiency, (3) dealing with uncertainty expressed as probability distributions, and (4) providing optimal-ratio solutions under different system-reliability conditions. The method is applied to a case study of waste flow allocation within a municipal solid waste (MSW) management system. The obtained solutions are useful for identifying sustainable MSW management schemes with maximized system efficiency under various constraint-violation risks. The results indicate that SLFP can support in-depth analysis of the interrelationships among system efficiency, system cost and system-failure risk. Copyright © 2011 Elsevier Ltd. All rights reserved.
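The ratio objective in a linear fractional program is commonly linearized with the Charnes-Cooper transformation (the SLFP approach couples a framework of this kind with chance constraints); a deterministic sketch with invented data:

```python
import numpy as np
from scipy.optimize import linprog

# Maximize (c.x + a0) / (d.x + b0)  s.t.  A x <= b, x >= 0,
# assuming d.x + b0 > 0 on the feasible set.
c, a0 = np.array([2.0, 1.0]), 0.0
d, b0 = np.array([1.0, 3.0]), 4.0
A = np.array([[1.0, 1.0]])
b = np.array([10.0])

# Charnes-Cooper: with y = t*x and t = 1/(d.x + b0), the problem becomes the LP
#   max c.y + a0*t   s.t.  A y - b*t <= 0,  d.y + b0*t = 1,  y >= 0, t >= 0
res = linprog(
    c=-np.append(c, a0),                         # linprog minimizes
    A_ub=np.hstack([A, -b[:, None]]), b_ub=[0.0],
    A_eq=[np.append(d, b0)], b_eq=[1.0],
)
y, t = res.x[:-1], res.x[-1]
x = y / t                                        # recover original variables
print(x.round(3), round(-res.fun, 3))            # optimal x and ratio value
```

For this toy instance the optimum puts everything into the first variable, since its cost-to-denominator ratio dominates; the chance-constrained SLFP version replaces fixed b with reliability-dependent values.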

  5. Transmission of linear regression patterns between time series: From relationship in time series to complex networks

    NASA Astrophysics Data System (ADS)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period through a sliding window of short periods, the change of the linear regression parameters becomes a process of dynamic transmission over time. We present a simple and efficient computational scheme: a linear regression pattern transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by combining intervals of the linear regression parameters with the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, as well as the distance and the medium of the transmission process, can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, revealing the features of their distribution. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
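The pattern-transmission idea can be sketched directly: estimate a regression slope in each sliding window, discretize it into a pattern label, and count transitions between consecutive labels as weighted directed edges (window size, slope bins, and the two series here are invented):

```python
import numpy as np
from collections import Counter

def pattern_network(x, y, window=30, step=1, bins=(-np.inf, 0.0, 1.0, np.inf)):
    """Slide a window over two series, label each window by the bin of its
    regression slope, and count transitions between consecutive labels."""
    labels = []
    for s in range(0, len(x) - window + 1, step):
        slope = np.polyfit(x[s:s + window], y[s:s + window], 1)[0]
        labels.append(int(np.digitize(slope, bins)))   # pattern = slope bin
    # Directed, weighted edges: (pattern_i -> pattern_{i+1}) with counts
    return Counter(zip(labels[:-1], labels[1:]))

# Two regimes: slope 0.5, then slope 2.0, plus noise
rng = np.random.default_rng(3)
t = np.arange(200.0)
y = np.where(t < 100, 0.5 * t, 50 + 2.0 * (t - 100)) + rng.normal(0, 1, 200)
net = pattern_network(t, y)
print(dict(net))   # edge weights concentrate on self-loops within each regime
```

In a full implementation the node labels would also encode the intercept interval and the significance-test outcome, and out-degree/betweenness would then be computed on the resulting weighted digraph.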

  6. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    PubMed

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period through a sliding window of short periods, the change of the linear regression parameters becomes a process of dynamic transmission over time. We present a simple and efficient computational scheme: a linear regression pattern transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by combining intervals of the linear regression parameters with the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, as well as the distance and the medium of the transmission process, can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, revealing the features of their distribution. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.

  7. A novel predictive pharmacokinetic/pharmacodynamic model of repolarization prolongation derived from the effects of terfenadine, cisapride and E-4031 in the conscious chronic AV node-ablated, His bundle-paced dog.

    PubMed

    Nolan, Emily R; Feng, Meihua Rose; Koup, Jeffrey R; Liu, Jing; Turluck, Daniel; Zhang, Yiqun; Paulissen, Jerome B; Olivier, N Bari; Miller, Teresa; Bailie, Marc B

    2006-01-01

    Terfenadine, cisapride, and E-4031, three drugs that prolong ventricular repolarization, were selected to evaluate the sensitivity of the conscious chronic atrioventricular node-ablated, His bundle-paced dog for defining drug-induced cardiac repolarization prolongation. A novel predictive pharmacokinetic/pharmacodynamic model of repolarization prolongation was generated from these data. Three male beagle dogs underwent radiofrequency AV nodal ablation and placement of a His bundle-pacing lead and programmable pacemaker under anesthesia. Each dog was restrained in a sling for a series of increasing-dose infusions of each drug while maintained at a constant heart rate of 80 beats/min. The RT interval, a surrogate for the QT interval in His bundle-paced dogs, was recorded throughout the experiment. E-4031 induced a statistically significant RT prolongation at the highest three doses. Cisapride resulted in a dose-dependent increase in RT interval, which was statistically significant at the two highest doses. Terfenadine induced a dose-dependent RT interval prolongation with a statistically significant change occurring only at the highest dose. The relationship between drug concentration and RT interval change was described by a sigmoid Emax model with an effect site. The maximum RT change (Emax), the free drug concentration at half of the maximum effect (EC50), and the free drug concentration associated with a 10 ms RT prolongation (EC10ms) were estimated. A linear correlation between EC10ms and HERG IC50 values was identified. The conscious dog with His bundle pacing detects delayed cardiac repolarization related to IKr inhibition and detects repolarization change induced by drugs with activity at multiple ion channels. A clinically relevant sensitivity and a linear correlation with in vitro HERG data make the conscious His bundle-paced dog a valuable tool for detecting the repolarization effect of new chemical entities.

  8. Program for Weibull Analysis of Fatigue Data

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2005-01-01

    A Fortran computer program has been written for performing statistical analyses of fatigue-test data that are assumed to be adequately represented by a two-parameter Weibull distribution. This program calculates the following: (1) maximum-likelihood estimates of the Weibull-distribution parameters; (2) data for contour plots of relative likelihood for the two parameters; (3) data for contour plots of joint confidence regions; (4) data for the profile likelihood of the Weibull-distribution parameters; (5) data for the profile likelihood of any percentile of the distribution; and (6) likelihood-based confidence intervals for parameters and/or percentiles of the distribution. The program can account for tests that are suspended without failure (the statistical term for such suspension of tests is "censoring"). The analytical approach followed in this program is valid for type-I censoring, which is the removal of unfailed units at pre-specified times. Confidence regions and intervals are calculated by use of the likelihood-ratio method.
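The core calculation, maximum-likelihood Weibull fitting under type-I right censoring, can be sketched in Python (a generic numerical MLE on simulated data, not the Fortran program's algorithm):

```python
import numpy as np
from scipy.optimize import minimize

def weibull_mle(times, failed, x0=(1.0, 1.0)):
    """ML fit of a two-parameter Weibull (shape k, scale lam) with
    type-I right censoring: 'failed' flags units that actually failed."""
    times = np.asarray(times, float)
    failed = np.asarray(failed, bool)

    def nll(p):
        k, lam = p
        if k <= 0 or lam <= 0:
            return np.inf
        z = times / lam
        logf = np.log(k / lam) + (k - 1) * np.log(z) - z ** k  # log density
        logS = -z ** k                                         # log survival
        # Failures contribute log f; suspended (censored) units log S
        return -(logf[failed].sum() + logS[~failed].sum())

    return minimize(nll, x0, method="Nelder-Mead").x

# Simulated fatigue lives, tests suspended (censored) at t = 2.0
rng = np.random.default_rng(4)
t = rng.weibull(2.0, 500) * 1.5            # true shape 2.0, scale 1.5
failed = t < 2.0
t = np.minimum(t, 2.0)
k_hat, lam_hat = weibull_mle(t, failed)
print(round(k_hat, 2), round(lam_hat, 2))  # close to the true (2.0, 1.5)
```

Likelihood-ratio confidence intervals, as in the program, would then come from profiling this same negative log-likelihood over one parameter at a time.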

  9. Dynamic analysis for solid waste management systems: an inexact multistage integer programming approach.

    PubMed

    Li, Yongping; Huang, Guohe

    2009-03-01

    In this study, a dynamic analysis approach based on an inexact multistage integer programming (IMIP) model is developed for supporting municipal solid waste (MSW) management under uncertainty. Techniques of interval-parameter programming and multistage stochastic programming are incorporated within an integer-programming framework. The developed IMIP can deal with uncertainties expressed as probability distributions and interval numbers, and can reflect the dynamics in terms of decisions for waste-flow allocation and facility-capacity expansion over a multistage context. Moreover, the IMIP can be used for analyzing various policy scenarios that are associated with different levels of economic consequences. The developed method is applied to a case study of long-term waste-management planning. The results indicate that reasonable solutions have been generated for binary and continuous variables. They can help generate desired decisions of system-capacity expansion and waste-flow allocation with a minimized system cost and maximized system reliability.
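The two-stage structure, a first-stage capacity-expansion decision followed by scenario-dependent waste-flow allocation, can be illustrated with a deliberately tiny example (all costs, capacities, and scenarios here are invented; a real IMIP model also carries interval parameters and an LP/MIP solver):

```python
# Toy two-stage structure: decide a binary capacity expansion now, then
# allocate waste to landfill vs. incinerator in each demand scenario.
scenarios = [(0.6, 90.0), (0.4, 140.0)]     # (probability, waste t/d)
landfill_cap, expand_add, expand_cost = 100.0, 50.0, 300.0
c_land, c_incin = 2.0, 5.0                  # $/t; incinerator uncapacitated

def expected_cost(expand):
    """First-stage binary decision fixed, second-stage allocation by scenario."""
    cap = landfill_cap + expand_add * expand
    cost = expand_cost * expand
    for p, waste in scenarios:
        to_land = min(waste, cap)           # cheaper route used first
        cost += p * (c_land * to_land + c_incin * (waste - to_land))
    return cost

# Enumerate the single binary variable; a real model would use a MIP solver
best = min((expected_cost(e), e) for e in (0, 1))
print(round(best[0], 2), best[1])           # expected cost, expansion decision
```

Here the fixed expansion charge outweighs the savings from diverting high-scenario waste away from the incinerator, so the optimal decision is not to expand; interval-parameter versions would repeat this computation at the lower and upper bounds of the cost coefficients.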

  10. Three-dimensional modeling of flexible pavements : research implementation plan.

    DOT National Transportation Integrated Search

    2006-02-14

    Many of the asphalt pavement analysis programs are based on linear elastic models. A linear viscoelastic model would be superior to a linear elastic model for analyzing the response of asphalt concrete pavements to loads. There is a need to devel...

  11. 30 CFR 710.2 - Objectives.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... REGULATIONS INITIAL REGULATORY PROGRAM § 710.2 Objectives. The objectives of the initial regulatory program... resulting from surface coal mining operations during the interval between enactment of the Act and adoption of a permanent State or Federal regulatory program; and (b) Coordinate the State and Federal...

  12. 30 CFR 710.2 - Objectives.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... REGULATIONS INITIAL REGULATORY PROGRAM § 710.2 Objectives. The objectives of the initial regulatory program... resulting from surface coal mining operations during the interval between enactment of the Act and adoption of a permanent State or Federal regulatory program; and (b) Coordinate the State and Federal...

  13. 30 CFR 710.2 - Objectives.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... REGULATIONS INITIAL REGULATORY PROGRAM § 710.2 Objectives. The objectives of the initial regulatory program... resulting from surface coal mining operations during the interval between enactment of the Act and adoption of a permanent State or Federal regulatory program; and (b) Coordinate the State and Federal...

  14. 30 CFR 710.2 - Objectives.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... REGULATIONS INITIAL REGULATORY PROGRAM § 710.2 Objectives. The objectives of the initial regulatory program... resulting from surface coal mining operations during the interval between enactment of the Act and adoption of a permanent State or Federal regulatory program; and (b) Coordinate the State and Federal...

  15. The Behavioral Economics of Choice and Interval Timing

    PubMed Central

    Jozefowiez, J.; Staddon, J. E. R.; Cerutti, D. T.

    2009-01-01

    We propose a simple behavioral economic model (BEM) describing how reinforcement and interval timing interact. The model assumes a Weber-law-compliant logarithmic representation of time. Associated with each represented time value are the payoffs that have been obtained for each possible response. At a given real time, the response with the highest payoff is emitted. The model accounts for a wide range of data from procedures such as simple bisection, metacognition in animals, economic effects in free-operant psychophysical procedures and paradoxical choice in double-bisection procedures. Although it assumes logarithmic time representation, it can also account for data from the time-left procedure usually cited in support of linear time representation. It encounters some difficulties in complex free-operant choice procedures, such as concurrent mixed fixed-interval schedules as well as some of the data on double bisection, that may involve additional processes. Overall, BEM provides a theoretical framework for understanding how reinforcement and interval timing work together to determine choice between temporally differentiated reinforcers. PMID:19618985
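
    A loose sketch of the model's core decision rule (not the authors' code; the anchors, noise level, and equal-payoff simplification below are all assumptions) shows why a logarithmic time representation puts the bisection point at the geometric mean of the anchors:

```python
import math
import random

random.seed(0)
SHORT, LONG = 2.0, 8.0  # anchor durations in seconds (illustrative)
WEBER = 0.15            # Weber-fraction noise on the log-time axis

def choose(t):
    # Weber-law-compliant logarithmic representation of elapsed time.
    rep = math.log(t) + random.gauss(0.0, WEBER)
    # With equal payoffs for the two responses, "highest payoff" reduces
    # to whichever anchor's log-time is closer to the representation.
    near_long = abs(rep - math.log(LONG)) < abs(rep - math.log(SHORT))
    return "long" if near_long else "short"

# 4 s is the geometric mean of 2 s and 8 s, so choices should split evenly.
trials = [choose(4.0) for _ in range(2000)]
p_long = trials.count("long") / len(trials)
```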

  16. Spike-Interval Triggered Averaging Reveals a Quasi-Periodic Spiking Alternative for Stochastic Resonance in Catfish Electroreceptors

    PubMed Central

    Lankheet, Martin J. M.; Klink, P. Christiaan; Borghuis, Bart G.; Noest, André J.

    2012-01-01

    Catfish detect and identify invisible prey by sensing their ultra-weak electric fields with electroreceptors. Any neuron that deals with small-amplitude input has to overcome sensitivity limitations arising from inherent threshold non-linearities in spike-generation mechanisms. Many sensory cells solve this issue with stochastic resonance, in which a moderate amount of intrinsic noise causes irregular spontaneous spiking activity with a probability that is modulated by the input signal. Here we show that catfish electroreceptors have adopted a fundamentally different strategy. Using a reverse correlation technique in which we take spike interval durations into account, we show that the electroreceptors generate a supra-threshold bias current that results in quasi-periodically produced spikes. In this regime stimuli modulate the interval between successive spikes rather than the instantaneous probability for a spike. This alternative for stochastic resonance combines threshold-free sensitivity for weak stimuli with similar sensitivity for excitations and inhibitions based on single interspike intervals. PMID:22403709

  17. Final Report---Optimization Under Nonconvexity and Uncertainty: Algorithms and Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Linderoth

    2011-11-06

    The goal of this work was to develop new algorithmic techniques for solving large-scale numerical optimization problems, focusing on problem classes that have proven to be among the most challenging for practitioners: those involving uncertainty and those involving nonconvexity. This research advanced the state of the art in solving mixed integer linear programs containing symmetry, mixed integer nonlinear programs, and stochastic optimization problems. The focus of the work done in the continuation was on Mixed Integer Nonlinear Programs (MINLPs) and Mixed Integer Linear Programs (MILPs), especially those containing a great deal of symmetry.

  18. Is applying the same exercise-based inpatient program to normal and reduced left ventricular function patients the best strategy after coronary surgery? A focus on autonomic cardiac response.

    PubMed

    Mendes, Renata Gonçalves; Simões, Rodrigo Polaquini; Costa, Fernando de Souza Melo; Pantoni, Camila Bianca Falasco; Di Thommazo-Luporini, Luciana; Luzzi, Sérgio; Amaral-Neto, Othon; Arena, Ross; Catai, Aparecida Maria; Borghi-Silva, Audrey

    2014-01-01

    To assess whether the same exercise-based inpatient program applied to patients with normal and reduced left ventricular function (LVF) evokes a similar cardiac autonomic response after coronary artery bypass graft (CABG). Forty-four patients post-CABG, subgrouped according to normal LVF [LVFN: n = 23; left ventricular ejection fraction (LVEF) ≥ 55%] and reduced LVF (LVFR: n = 21; LVEF 35-54%), were included. All initiated the exercise protocol on post-operative day 1 (PO1), following a whole progressive program until discharge. Cardiac autonomic response was assessed by the indices of heart rate variability (HRV) at rest and during exercise (extremity range of motion and ambulation). During ambulation, lower values of HRV indices were found in the LVFR group compared with the LVFN group [standard deviation of all RR (STDRR; 6.1 ± 2.7 versus 8.9 ± 4.7 ms), baseline width of the RR histogram (TINN; 30.6 ± 14.8 versus 45.8 ± 24.9 ms), SD2 (14.8 ± 8.0 versus 21.3 ± 9.0 ms), Shannon entropy (3.6 ± 0.5 versus 3.9 ± 0.4) and correlation dimension (0.08 ± 0.2 versus 0.2 ± 0.2)]. Also, when comparing the ambulation to rest change, lower values were observed in the LVFR group for linear (STDRR, TINN, RR TRI, rMSSD) and non-linear (SD2 and correlation dimension) HRV indices (p < 0.05). On PO1, we observed only intra-group differences between rest and exercise (extremity range of motion), for mean intervals between heart beats and heart rate. For patients with LVFN, the same inpatient exercise protocol triggered a more attenuated autonomic response compared with patients with LVFR. These findings have implications as to how exercise should be prescribed according to LVF in the early stages following recovery from CABG. 
Implications for Rehabilitation: An exercise-based inpatient program, performed by post-CABG patients who have normal left ventricular function, triggered a more attenuated cardiac autonomic response compared with patients with reduced left ventricular function. The volume of inpatient exercise should be prescribed according to left ventricular function in the early stages of recovery from CABG.
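
    The linear HRV indices compared above are simple statistics of the RR-interval series. A short sketch with illustrative RR values (not study data) computes two of them:

```python
import numpy as np

# Illustrative RR intervals in ms (successive beat-to-beat times).
rr = np.array([812.0, 798.0, 830.0, 805.0, 790.0, 822.0, 815.0])

stdrr = rr.std(ddof=1)                 # STDRR/SDNN: SD of all RR intervals
diffs = np.diff(rr)
rmssd = np.sqrt(np.mean(diffs ** 2))   # rMSSD: RMS of successive differences
```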

  19. Solution Methods for Stochastic Dynamic Linear Programs.

    DTIC Science & Technology

    1980-12-01

    16, No. 11, pp. 652-675, July 1970. [28] Glassey, C.R., "Dynamic linear programs for production scheduling", OR 19, pp. 45-56, 1971. [29] Glassey, C.R...Huang, C.C., I. Vertinsky, W.T. Ziemba, "Sharp bounds on the value of perfect information", OR 25, pp. 128-139, 1977. [37] Kall, P., "Computational... 1971. [70] Ziemba, W.T., "Computational algorithms for convex stochastic programs with simple recourse", OR 8, pp. 414-431, 1970.

  20. Readiness of Primary Care Practices for Medical Home Certification

    PubMed Central

    Clark, Sarah J.; Sakshaug, Joseph W.; Chen, Lena M.; Hollingsworth, John M.

    2013-01-01

    OBJECTIVES: To assess the prevalence of medical home infrastructure among primary care practices for children and identify practice characteristics associated with medical home infrastructure. METHODS: Cross-sectional analysis of restricted data files from 2007 and 2008 of the National Ambulatory Medical Care Survey. We mapped survey items to the 2011 National Committee on Quality Assurance’s Patient-Centered Medical home standards. Points were awarded for each “passed” element based on National Committee for Quality Assurance scoring, and we then calculated the percentage of the total possible points met for each practice. We used multivariate linear regression to assess associations between practice characteristics and the percentage of medical home infrastructure points attained. RESULTS: On average, pediatric practices attained 38% (95% confidence interval 34%–41%) of medical home infrastructure points, and family/general practices attained 36% (95% confidence interval 33%–38%). Practices scored higher on medical home elements related to direct patient care (eg, providing comprehensive health assessments) and lower in areas highly dependent on health information technology (eg, computerized prescriptions, test ordering, laboratory result viewing, or quality of care measurement and reporting). In multivariate analyses, smaller practice size was significantly associated with lower infrastructure scores. Practice ownership, urban versus rural location, and proportion of visits covered by public insurers were not consistently associated with a practice’s infrastructure score. CONCLUSIONS: Medical home programs need effective approaches to support practice transformation in the small practices that provide the vast majority of the primary care for children in the United States. PMID:23382438

  1. Physiological Characteristics of Incoming Freshmen Field Players in a Men’s Division I Collegiate Soccer Team

    PubMed Central

    Lockie, Robert G.; Davis, DeShaun L.; Birmingham-Babauta, Samantha A.; Beiley, Megan D.; Hurley, Jillian M.; Stage, Alyssa A.; Stokes, John J.; Tomita, Tricia M.; Torne, Ibett A.; Lazar, Adrina

    2016-01-01

    Freshmen college soccer players will have lower training ages than their experienced teammates (sophomores, juniors, seniors). How this is reflected in field test performance is not known. Freshmen (n = 7) and experienced (n = 10) male field soccer players from the same Division I school completed soccer-specific tests to identify potential differences in incoming freshmen. Testing included: vertical jump (VJ), standing broad jump, and triple hop (TH); 30-m sprint, (0–5, 5–10, 0–10, and 0–30 m intervals); 505 change-of-direction test; Yo-Yo Intermittent Recovery Test Level 2 (YYIRT2); and 6 × 30-m sprints to measure repeated-sprint ability. A MANOVA with Bonferroni post hoc was conducted on the performance test data, and effect sizes and z-scores were calculated from the results for magnitude-based inference. There were no significant between-group differences in the performance tests. There were moderate effects for the differences in VJ height, left-leg TH, 0–5, 0–10 and 0–30 m sprint intervals, and YYIRT2 (d = 0.63–1.18), with experienced players being superior. According to z-score data, freshmen had meaningful differences below the squad mean in the 30-m sprint, YYIRT2, and jump tests. Freshmen soccer players may need to develop linear speed, high-intensity running, and jump performance upon entering a collegiate program. PMID:29910282

  2. "Functional" Inspiratory and Core Muscle Training Enhances Running Performance and Economy.

    PubMed

    Tong, Tomas K; McConnell, Alison K; Lin, Hua; Nie, Jinlei; Zhang, Haifeng; Wang, Jiayuan

    2016-10-01

    Tong, TK, McConnell, AK, Lin, H, Nie, J, Zhang, H, and Wang, J. "Functional" inspiratory and core muscle training enhances running performance and economy. J Strength Cond Res 30(10): 2942-2951, 2016-We compared the effects of two 6-week high-intensity interval training interventions. Under the control condition (CON), only interval training was undertaken, whereas under the intervention condition (ICT), interval training sessions were followed immediately by core training, which was combined with simultaneous inspiratory muscle training (IMT)-"functional" IMT. Sixteen recreational runners were allocated to either ICT or CON groups. Before the intervention phase, both groups undertook a 4-week program of "foundation" IMT to control for the known ergogenic effect of IMT (30 inspiratory efforts at 50% maximal static inspiratory pressure [P0] per set, 2 sets per day, 6 days per week). The subsequent 6-week interval running training phase consisted of 3-4 sessions per week. In addition, the ICT group undertook 4 inspiratory-loaded core exercises (10 repetitions per set, 2 sets per day, inspiratory load set at 50% post-IMT P0) immediately after each interval training session. The CON group received neither core training nor functional IMT. After the intervention phase, global inspiratory and core muscle functions increased in both groups (p ≤ 0.05), as evidenced by P0 and a sport-specific endurance plank test (SEPT) performance, respectively. Compared with CON, the ICT group showed larger improvements in SEPT, running economy at the speed of the onset of blood lactate accumulation, and 1-hour running performance (3.04% vs. 1.57%, p ≤ 0.05). The changes in these variables were interindividually correlated (r ≥ 0.57, n = 16, p ≤ 0.05). 
Such findings suggest that the addition of inspiratory-loaded core conditioning into a high-intensity interval training program augments the influence of the interval program on endurance running performance and that this may be underpinned by an improvement in running economy.

  3. Suggestion of an oral hygiene program for orthodontic patients with cleft lip and palate: findings of a pilot study.

    PubMed

    Brasil, Juliana Marcelina Plácido; de Almeida Pernambuco, Renata; da Silva Dalben, Gisele

    2007-11-01

    To evaluate the efficacy of an oral hygiene program for orthodontic patients with cleft lip and palate. Retrospective pilot study. Hospital for Rehabilitation of Craniofacial Anomalies, Bauru, Brazil. One hundred twenty-two patients with complete cleft lip and palate undergoing orthodontic treatment. Orientation on toothbrushing and flossing, plaque disclosure, and scoring according to an especially designed index. Statistical comparison of variation in plaque index between sessions; correlation of intervals between sessions and variation in plaque index. Mean scores were reduced significantly, from 2.17 to 1.75 between first and second, 2.18 to 1.62 between first and third, and 1.93 to 1.62 between second and third sessions. Plaque reduction was inversely proportional to the time interval. The program demonstrated a significant plaque reduction. The highest reduction between the first and second sessions reveals the need to reinforce the initial instructions at all sessions. The greatest reduction observed at shorter intervals highlights the need for regular follow up. More controlled studies on larger samples should be encouraged to evaluate the validity of the index and the efficacy of similar programs worldwide.

  4. Risk management for sulfur dioxide abatement under multiple uncertainties

    NASA Astrophysics Data System (ADS)

    Dai, C.; Sun, W.; Tan, Q.; Liu, Y.; Lu, W. T.; Guo, H. C.

    2016-03-01

    In this study, interval-parameter programming, two-stage stochastic programming (TSP), and conditional value-at-risk (CVaR) were incorporated into a general optimization framework, leading to an interval-parameter CVaR-based two-stage programming (ICTP) method. The ICTP method had several advantages: (i) its objective function simultaneously took expected cost and risk cost into consideration, and also used discrete random variables and discrete intervals to reflect uncertain properties; (ii) it quantitatively evaluated the right tail of distributions of random variables which could better calculate the risk of violated environmental standards; (iii) it was useful for helping decision makers to analyze the trade-offs between cost and risk; and (iv) it was effective to penalize the second-stage costs, as well as to capture the notion of risk in stochastic programming. The developed model was applied to sulfur dioxide abatement in an air quality management system. The results indicated that the ICTP method could be used for generating a series of air quality management schemes under different risk-aversion levels, for identifying desired air quality management strategies for decision makers, and for considering a proper balance between system economy and environmental quality.
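
    The CVaR ingredient of a framework like ICTP can itself be written as a linear program (the Rockafellar-Uryasev formulation). Below is a minimal sketch with ten equiprobable illustrative scenario losses, for which CVaR at level 0.8 is the mean of the worst two losses, 9.5:

```python
import numpy as np
from scipy.optimize import linprog

losses = np.arange(1.0, 11.0)  # equiprobable scenario losses (illustrative)
alpha = 0.8
n = len(losses)

# Variables [eta, u_1..u_n]: minimize eta + sum(u_s) / ((1 - alpha) * n).
c = np.concatenate(([1.0], np.full(n, 1.0 / ((1.0 - alpha) * n))))
# Constraints u_s >= L_s - eta, written as -eta - u_s <= -L_s.
A_ub = np.hstack((-np.ones((n, 1)), -np.eye(n)))
b_ub = -losses
bounds = [(None, None)] + [(0, None)] * n

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
cvar = res.fun  # optimal eta is the VaR; the objective value is the CVaR
```

    Embedding such a block in a two-stage model is what lets the objective trade expected cost against the right tail of the loss distribution.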

  5. 77 FR 68172 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-15

    ... recently approved certain products to trade at $0.50 and $1 strike price intervals on CBOE within exactly... Short Term Option Series (``STOS'') Program that normally trade in $1 Strike Price Intervals shall be...

  6. An inventory-theory-based interval-parameter two-stage stochastic programming model for water resources management

    NASA Astrophysics Data System (ADS)

    Suo, M. Q.; Li, Y. P.; Huang, G. H.

    2011-09-01

    In this study, an inventory-theory-based interval-parameter two-stage stochastic programming (IB-ITSP) model is proposed through integrating inventory theory into an interval-parameter two-stage stochastic optimization framework. This method can not only address system uncertainties with complex presentation but also reflect transferring batch (the transferring quantity at once) and period (the corresponding cycle time) in decision making problems. A case of water allocation problems in water resources management planning is studied to demonstrate the applicability of this method. Under different flow levels, different transferring measures are generated by this method when the promised water cannot be met. Moreover, interval solutions associated with different transferring costs also have been provided. They can be used for generating decision alternatives and thus help water resources managers to identify desired policies. Compared with the ITSP method, the IB-ITSP model can provide a positive measure for solving water shortage problems and afford useful information for decision makers under uncertainty.
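
    The "transferring batch" and "period" language comes from classical inventory theory; the economic-order-quantity result gives the flavor of how the two are linked. All numbers below are assumed for illustration, not taken from the case study:

```python
import math

D = 1200.0  # assumed annual water shortage to be transferred (volume units)
S = 50.0    # assumed fixed cost per transfer event
h = 2.0     # assumed holding (storage) cost per unit per year

Q = math.sqrt(2.0 * D * S / h)  # optimal transferring batch (EOQ)
T = Q / D                       # corresponding transferring period (years)
```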

  7. High-intensity interval training has positive effects on performance in ice hockey players.

    PubMed

    Naimo, M A; de Souza, E O; Wilson, J M; Carpenter, A L; Gilchrist, P; Lowery, R P; Averbuch, B; White, T M; Joy, J

    2015-01-01

    In spite of the well-known benefits that have been shown, few studies have looked at the practical applications of high-intensity interval training (HIIT) on athletic performance. This study investigated the effects of a HIIT program compared to traditional continuous endurance exercise training. 24 hockey players were randomly assigned to either a continuous or high-intensity interval group during a 4-week training program. The interval group (IG) was involved in a periodized HIIT program. The continuous group (CG) performed moderate intensity cycling for 45-60 min at an intensity that was 65% of their calculated heart rate reserve. Body composition, muscle thickness, anaerobic power, and on-ice measures were assessed pre- and post-training. Muscle thickness was significantly greater in IG (p=0.01) when compared to CG. The IG had greater values for both ∆ peak power (p<0.003) and ∆ mean power (p<0.02). Additionally, IG demonstrated a faster ∆ sprint (p<0.02) and a trend (p=0.08) for faster ∆ endurance test time to completion for IG. These results indicate that hockey players may utilize short-term HIIT to elicit positive effects in muscle thickness, power and on-ice performance. © Georg Thieme Verlag KG Stuttgart · New York.

  8. High-Intensity Interval Training for Cognitive and Mental Health in Adolescents.

    PubMed

    Costigan, Sarah A; Eather, Narelle; Plotnikoff, Ronald C; Hillman, Charles H; Lubans, David R

    2016-10-01

    Emerging literature suggests that physical activity and fitness may have a positive effect on cognitive and mental health in adolescents. The purpose of the current study was to evaluate the efficacy of two high-intensity interval training (HIIT) protocols for improving cognitive and mental health outcomes (executive function, psychological well-being, psychological distress, and physical self-concept) in adolescents. Participants (n = 65; mean age = 15.8 ± 0.6 yr) were randomized to three conditions: aerobic exercise program (AEP; n = 21), resistance and aerobic program (RAP; n = 22), and control (n = 22). HIIT sessions (8-10 min per session) were delivered during physical education lessons or at lunchtime three times per week for 8 wk. Assessments were conducted at baseline and immediately postintervention to detect changes in executive function (trail making test), psychological well-being, psychological distress, and physical self-description by researchers blinded to treatment allocation. Intervention effects were examined using linear mixed models. Cohen's d effect sizes and clinical inference were also calculated. While results were not significant, small improvements in executive function (mean change (95% CI) -6.69 (-22.03, 8.64), d = -0.32) and psychological well-being (mean change (95% CI) 2.81 (-2.06, 7.68), d = 0.34) were evident in the AEP group; and moderate improvements in executive function (mean change (95% CI) -10.73 (-26.22, 4.76), d = -0.51), and small improvements in well-being (mean change (95% CI) 2.96 (-1.82, 7.75), d = 0.36) and perceived appearance (mean change (95% CI) 0.32 (-0.25, 0.86), d = 0.35), were observed for the RAP group. Mean feeling state scores improved from preworkout to postworkout in both HIIT conditions, with significant results for the AEP (P = 0.001). This study highlights the potential of embedding HIIT within the school day for improving cognitive and mental health among adolescents.
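
    The effect sizes quoted are Cohen's d. One common pooled-SD form is shown below (the study computed d from change scores; the numbers here are purely illustrative):

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    """Cohen's d for two groups using the pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

d = cohens_d(10.0, 2.0, 20, 8.0, 2.0, 20)  # two equal-SD groups
```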

  9. Effect of Age and Sex on the QTc Interval in Children and Adolescents With Type 1 and 2 Long-QT Syndrome.

    PubMed

    Vink, Arja S; Clur, Sally-Ann B; Geskus, Ronald B; Blank, Andreas C; De Kezel, Charlotte C A; Yoshinaga, Masao; Hofman, Nynke; Wilde, Arthur A M; Blom, Nico A

    2017-04-01

    In congenital long-QT syndrome, age, sex, and genotype have been associated with cardiac events, but their effect on the trend in QTc interval has never been established. We, therefore, aimed to assess the effect of age and sex on the QTc interval in children and adolescents with type 1 (LQT1) and type 2 (LQT2) long-QT syndrome. QTc intervals of 12-lead resting electrocardiograms were determined, and trends over time were analyzed using a linear mixed-effects model. The study included 278 patients with a median follow-up of 4 years (interquartile range, 1-9) and a median number of 6 (interquartile range, 2-10) electrocardiograms per patient. Both LQT1 and LQT2 male patients showed QTc interval shortening after the onset of puberty. In LQT2 male patients, this was preceded by a progressive QTc interval prolongation. In LQT1, after the age of 12 years, male patients had a significantly shorter QTc interval than female patients. In LQT2, during the first years of life and from 14 to 26 years, male patients had a significantly shorter QTc interval than female patients. On the contrary, between 5 and 14 years, LQT2 male patients had significantly longer QTc interval than LQT2 female patients. There is a significant effect of age and sex on the QTc interval in long-QT syndrome, with a unique pattern per genotype. The age of 12 to 14 years is an important transitional period. In the risk stratification and management of long-QT syndrome patients, clinicians should be aware of these age-, sex-, and genotype-related trends in QTc interval and especially the important role of the onset of puberty. © 2017 American Heart Association, Inc.
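
    The abstract does not state which rate correction was applied; Bazett's formula is the conventional definition of QTc and is sketched here only as a reminder of the computation:

```python
import math

def qtc_bazett(qt_ms, rr_ms):
    """Bazett-corrected QT: QTc = QT / sqrt(RR), with RR in seconds."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

# At 60 bpm (RR = 1000 ms) no correction is applied; shorter RR intervals
# lengthen the corrected value.
```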

  10. AQMAN; linear and quadratic programming matrix generator using two-dimensional ground-water flow simulation for aquifer management modeling

    USGS Publications Warehouse

    Lefkoff, L.J.; Gorelick, S.M.

    1987-01-01

    A FORTRAN-77 computer program that helps solve a variety of aquifer management problems involving the control of groundwater hydraulics. It is intended for use with any standard mathematical programming package that uses Mathematical Programming System input format. The computer program creates the input files to be used by the optimization program. These files contain all the hydrologic information and management objectives needed to solve the management problem. Used in conjunction with a mathematical programming code, the computer program identifies the pumping or recharge strategy that achieves a user's management objective while maintaining groundwater hydraulic conditions within desired limits. The objective may be linear or quadratic, and may involve the minimization of pumping and recharge rates or of variable pumping costs. The problem may contain constraints on groundwater heads, gradients, and velocities for a complex, transient hydrologic system. Linear superposition of solutions to the transient, two-dimensional groundwater flow equation is used by the computer program in conjunction with the response matrix optimization method. A unit stress is applied at each decision well and transient responses at all control locations are computed using a modified version of the U.S. Geological Survey two-dimensional aquifer simulation model. The program also computes discounted cost coefficients for the objective function and accounts for transient aquifer conditions. (Author's abstract)
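
    The response-matrix method couples naturally to linear programming: superposed unit-stress responses predict drawdown as R q, and an LP then chooses pumping rates. The matrix, costs, and limits below are invented for illustration (a real R comes from the unit-stress simulations the program performs):

```python
import numpy as np
from scipy.optimize import linprog

# R[i, j]: drawdown at control location i per unit pumping at decision well j.
R = np.array([[0.30, 0.10],
              [0.15, 0.25]])
cost = np.array([1.0, 1.2])           # unit pumping costs (illustrative)
max_drawdown = np.array([2.0, 2.0])   # head-decline limits at control points
demand = 10.0                         # total pumping required

# Minimize cost subject to R q <= max_drawdown and q1 + q2 >= demand.
res = linprog(cost,
              A_ub=np.vstack((R, [[-1.0, -1.0]])),
              b_ub=np.concatenate((max_drawdown, [-demand])),
              bounds=[(0, None)] * 2)
pumping = res.x
```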

  11. STICAP: A linear circuit analysis program with stiff systems capability. Volume 1: Theory manual. [network analysis

    NASA Technical Reports Server (NTRS)

    Cooke, C. H.

    1975-01-01

    STICAP (Stiff Circuit Analysis Program) is a FORTRAN 4 computer program written for the CDC-6400-6600 computer series and SCOPE 3.0 operating system. It provides the circuit analyst a tool for automatically computing the transient responses and frequency responses of large linear time invariant networks, both stiff and nonstiff (algorithms and numerical integration techniques are described). The circuit description and user's program input language is engineer-oriented, making simple the task of using the program. Engineering theories underlying STICAP are examined. A user's manual is included which explains user interaction with the program and gives results of typical circuit design applications. Also, the program structure from a systems programmer's viewpoint is depicted and flow charts and other software documentation are given.
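
    "Stiff" here means the network mixes widely separated time constants, which defeats explicit integrators. A modern equivalent of the transient analysis (generic SciPy usage, not STICAP's own algorithm) hands the linear system x' = Ax to an implicit BDF method:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-state linear network with time constants of 1 s and 1 ms (stiff).
A = np.array([[-1.0, 0.0],
              [0.0, -1000.0]])

sol = solve_ivp(lambda t, x: A @ x, (0.0, 5.0), [1.0, 1.0], method="BDF")
# The fast state decays almost instantly; the slow state follows exp(-t).
```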

  12. Evaluation of fertilization-to-planting and fertilization-to-harvest intervals for safe use of noncomposted bovine manure in Wisconsin vegetable production.

    PubMed

    Ingham, Steven C; Fanslau, Melody A; Engel, Rebecca A; Breuer, Jeffry R; Breuer, Jane E; Wright, Thomas H; Reith-Rozelle, Judith K; Zhu, Jun

    2005-06-01

    Fresh bovine manure was mechanically incorporated into loamy sand and silty clay loam Wisconsin soils in April 2004. At varying fertilization-to-planting intervals, radish, lettuce, and carrot seeds were planted; crops were harvested 90, 100, 110 or 111, and 120 days after manure application. As an indicator of potential contamination with fecal pathogens, levels of Escherichia coli in the manure-fertilized soil and the presence of E. coli on harvested vegetables were monitored. From initial levels of 4.0 to 4.2 log CFU/g, E. coli levels in both manure-fertilized soils decreased by 2.4 to 2.5 log CFU/g during the first 7 weeks. However, E. coli was consistently detected from enriched soil samples through week 17, perhaps as a result of contamination by birds and other wildlife. In the higher-clay silty clay loam soil, the fertilization-to-planting interval affected the prevalence of E. coli on lettuce but not on radishes and carrots. Root crop contamination was consistent across different fertilization-to-harvest intervals in silty clay loam, including the National Organic Program minimum fertilization-to-harvest interval of 120 days. However, lettuce contamination in silty clay loam was significantly (P < 0.10) affected by fertilization-to-harvest interval. Increasing the fertilization-to-planting interval in the lower-clay loamy sand soil decreased the prevalence of E. coli on root crops. The fertilization-to-harvest interval had no clear effect on vegetable contamination in loamy sand. Overall, these results do not provide grounds for reducing the National Organic Program minimum fertilization-to-harvest interval from the current 120-day standard.

  13. User document for computer programs for ring-stiffened shells of revolution

    NASA Technical Reports Server (NTRS)

    Cohen, G. A.

    1973-01-01

    A user manual and related program documentation is presented for six compatible computer programs for structural analysis of axisymmetric shell structures. The programs apply to a common structural model but analyze different modes of structural response. In particular, they are: (1) Linear static response under asymmetric loads; (2) Buckling of linear states under asymmetric loads; (3) Nonlinear static response under axisymmetric loads; (4) Buckling nonlinear states under axisymmetric (5) Imperfection sensitivity of buckling modes under axisymmetric loads; and (6) Vibrations about nonlinear states under axisymmetric loads. These programs treat branched shells of revolution with an arbitrary arrangement of a large number of open branches but with at most one closed branch.

  14. Alternative mathematical programming formulations for FSS synthesis

    NASA Technical Reports Server (NTRS)

    Reilly, C. H.; Mount-Campbell, C. A.; Gonsalvez, D. J. A.; Levis, C. A.

    1986-01-01

    A variety of mathematical programming models and two solution strategies are suggested for the problem of allocating orbital positions to (synthesizing) satellites in the Fixed Satellite Service. Mixed integer programming and almost linear programming formulations are presented in detail for each of two objectives: (1) positioning satellites as closely as possible to specified desired locations, and (2) minimizing the total length of the geostationary arc allocated to the satellites whose positions are to be determined. Computational results for mixed integer and almost linear programming models, with the objective of positioning satellites as closely as possible to their desired locations, are reported for three six-administration test problems and a thirteen-administration test problem.
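
    Objective (1) becomes a pure linear program once the east-to-west ordering of the satellites is fixed: minimize total deviation from desired longitudes subject to pairwise separation. The longitudes and separation below are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

desired = np.array([10.0, 11.0, 15.0])  # desired orbital longitudes, degrees
sep = 2.0                               # required angular separation, degrees
n = len(desired)

# Variables: positions p_0..p_{n-1}, then deviations d_i >= |p_i - desired_i|.
c = np.concatenate((np.zeros(n), np.ones(n)))  # minimize total deviation
A_ub, b_ub = [], []
for i in range(n):
    row = np.zeros(2 * n); row[i] = 1.0; row[n + i] = -1.0
    A_ub.append(row); b_ub.append(desired[i])    #  p_i - d_i <= desired_i
    row = np.zeros(2 * n); row[i] = -1.0; row[n + i] = -1.0
    A_ub.append(row); b_ub.append(-desired[i])   # -p_i - d_i <= -desired_i
for i in range(n - 1):
    row = np.zeros(2 * n); row[i] = 1.0; row[i + 1] = -1.0
    A_ub.append(row); b_ub.append(-sep)          # p_{i+1} - p_i >= sep

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * n + [(0, None)] * n)
```

    The first two desired slots are only 1 degree apart, so at least 1 degree of total displacement is unavoidable; the LP finds exactly that.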

  15. Colon Cancer Risk Assessment - Gauss Program

    Cancer.gov

    An executable file (in GAUSS) that projects absolute colon cancer risk (with confidence intervals) according to NCI’s Colorectal Cancer Risk Assessment Tool (CCRAT) algorithm. GAUSS is not needed to run the program.

  16. A numerical approach to 14C wiggle-match dating of organic deposits: best fits and confidence intervals

    NASA Astrophysics Data System (ADS)

    Blaauw, Maarten; Heuvelink, Gerard B. M.; Mauquoy, Dmitri; van der Plicht, Johannes; van Geel, Bas

    2003-06-01

    14C wiggle-match dating (WMD) of peat deposits uses the non-linear relationship between 14C age and calendar age to match the shape of a sequence of closely spaced peat 14C dates with the 14C calibration curve. A numerical approach to WMD enables the quantitative assessment of various possible wiggle-match solutions and of calendar year confidence intervals for sequences of 14C dates. We assess the assumptions, advantages, and limitations of the method. Several case-studies show that WMD results in more precise chronologies than when individual 14C dates are calibrated. WMD is most successful during periods with major excursions in the 14C calibration curve (e.g., in one case WMD could narrow down confidence intervals from 230 to 36 yr).
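The numerical matching idea can be sketched in a few lines: slide a sequence of 14C dates with known calendar-year spacing along the calibration curve, score each candidate calendar age by chi-square misfit, and read a confidence interval off the chi-square profile. The "calibration curve" below is synthetic (real work would use an IntCal curve), and all numbers are illustrative.

```python
# Sketch of wiggle-match dating: find the calendar age that best aligns a
# sequence of 14C dates with the calibration curve.  The curve, spacing,
# and uncertainties here are invented for illustration only.
import numpy as np

rng = np.random.default_rng(7)
cal_years = np.arange(0, 1000)                       # calendar age axis
curve = 800 + 0.9 * cal_years + 40 * np.sin(cal_years / 40.0)  # fake 14C ages

offsets = np.array([0, 10, 20, 30, 40, 50])          # sample spacing (cal yr)
true_age = 400
sigma = 15.0                                         # 14C measurement error
meas = curve[true_age + offsets] + rng.normal(scale=sigma, size=offsets.size)

candidates = np.arange(0, 940)
chi2 = np.array([np.sum(((meas - curve[a + offsets]) / sigma) ** 2)
                 for a in candidates])
best = candidates[np.argmin(chi2)]
# ~95% interval: ages whose chi-square lies within 3.84 of the minimum
ci = candidates[chi2 <= chi2.min() + 3.84]
```

The interval narrows exactly where the abstract says WMD works best: where the curve has sharp wiggles, so small age shifts produce large misfits.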

  17. A Study on Linear Programming Applications for the Optimization of School Lunch Menus. Summation Report.

    ERIC Educational Resources Information Center

    Findorff, Irene K.

This document summarizes the results of a project at Tulane University that was designed to adapt, test, and evaluate a computerized information and menu planning system utilizing linear programming techniques for use in school lunch food service operations. The objectives of the menu planning were to formulate menu items into a palatable,…

  18. Spline smoothing of histograms by linear programming

    NASA Technical Reports Server (NTRS)

    Bennett, J. O.

    1972-01-01

An algorithm is presented for constructing an approximating function to the frequency distribution from a sample of size n. First, a histogram is made from the data. Next, Euclidean-space approximations to the graph of the histogram, using central B-splines as basis elements, are obtained by linear programming. The approximating function is nonnegative and has area one.
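The scheme can be sketched as a small linear program. For simplicity the sketch below substitutes piecewise-linear hat functions for the central B-splines (an assumption, not the paper's basis) and minimizes the maximum deviation from the histogram, subject to nonnegative coefficients and unit area.

```python
# Histogram smoothing by linear programming: approximate a density-scaled
# histogram with a nonnegative basis expansion of total area one.  Hat
# functions stand in for B-splines; data and knot grid are illustrative.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
sample = rng.normal(size=500)
counts, edges = np.histogram(sample, bins=20, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

knots = np.linspace(edges[0], edges[-1], 8)          # coarse knot grid
h = knots[1] - knots[0]
B = np.maximum(0.0, 1.0 - np.abs(centers[:, None] - knots[None, :]) / h)
areas = np.full(len(knots), h)                       # integral of each hat
areas[[0, -1]] = h / 2.0                             # boundary half-hats

# Variables: coefficients c >= 0 and t = max abs deviation; minimize t
# subject to -t <= B c - counts <= t and areas . c = 1.
n = len(knots)
cobj = np.r_[np.zeros(n), 1.0]
A_ub = np.block([[B, -np.ones((len(centers), 1))],
                 [-B, -np.ones((len(centers), 1))]])
b_ub = np.r_[counts, -counts]
A_eq = np.r_[areas, 0.0][None, :]
res = linprog(cobj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * (n + 1))
coef = res.x[:n]
density = B @ coef                  # smoothed, nonnegative, area-one estimate
```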

  19. FPL-PELPS : a price endogenous linear programming system for economic modeling, supplement to PELPS III, version 1.1.

    Treesearch

    Patricia K. Lebow; Henry Spelter; Peter J. Ince

    2003-01-01

    This report provides documentation and user information for FPL-PELPS, a personal computer price endogenous linear programming system for economic modeling. Originally developed to model the North American pulp and paper industry, FPL-PELPS follows its predecessors in allowing the modeling of any appropriate sector to predict consumption, production and capacity by...

  20. Altered fractal dynamics of gait: reduced stride-interval correlations with aging and Huntington's disease

    NASA Technical Reports Server (NTRS)

    Hausdorff, J. M.; Mitchell, S. L.; Firtion, R.; Peng, C. K.; Cudkowicz, M. E.; Wei, J. Y.; Goldberger, A. L.

    1997-01-01

Fluctuations in the duration of the gait cycle (the stride interval) display fractal dynamics and long-range correlations in healthy young adults. We hypothesized that these stride-interval correlations would be altered by changes in neurological function associated with aging and certain disease states. To test this hypothesis, we compared the stride-interval time series of 1) healthy elderly subjects and young controls and of 2) subjects with Huntington's disease and healthy controls. Using detrended fluctuation analysis we computed alpha, a measure of the degree to which one stride interval is correlated with previous and subsequent intervals over different time scales. The scaling exponent alpha was significantly lower in elderly subjects compared with young subjects (elderly: 0.68 +/- 0.14; young: 0.87 +/- 0.15; P < 0.003). The scaling exponent alpha was also smaller in the subjects with Huntington's disease compared with disease-free controls (Huntington's disease: 0.60 +/- 0.24; controls: 0.88 +/- 0.17; P < 0.005). Moreover, alpha was linearly related to degree of functional impairment in subjects with Huntington's disease (r = 0.78, P < 0.0005). These findings demonstrate that stride-interval fluctuations are more random (i.e., less correlated) in elderly subjects and in subjects with Huntington's disease. Abnormal alterations in the fractal properties of gait dynamics are apparently associated with changes in central nervous system control.
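The detrended fluctuation analysis used here reduces to a short computation: integrate the mean-removed series, linearly detrend it in windows of length n, and take alpha as the slope of log F(n) versus log n. The sketch below applies it to synthetic white noise (not gait data), for which alpha should be near 0.5.

```python
# Minimal detrended fluctuation analysis (DFA).  The input series is
# synthetic white noise standing in for a stride-interval time series.
import numpy as np

def dfa_alpha(x, window_sizes):
    y = np.cumsum(x - np.mean(x))               # integrated series
    F = []
    for n in window_sizes:
        m = len(y) // n
        segments = y[:m * n].reshape(m, n)
        t = np.arange(n)
        fluct = []
        for s in segments:                      # linear detrend per window
            a, b = np.polyfit(t, s, 1)
            fluct.append(np.mean((s - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(fluct)))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(F), 1)
    return slope                                # the scaling exponent alpha

rng = np.random.default_rng(1)
alpha = dfa_alpha(rng.normal(size=4096), [8, 16, 32, 64, 128])
```

For uncorrelated noise alpha ≈ 0.5; long-range correlated series such as healthy young gait give alpha closer to the 0.87 reported above.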

  1. ZIP2DL: An Elastic-Plastic, Large-Rotation Finite-Element Stress Analysis and Crack-Growth Simulation Program

    NASA Technical Reports Server (NTRS)

    Deng, Xiaomin; Newman, James C., Jr.

    1997-01-01

ZIP2DL is a two-dimensional, elastic-plastic finite element program for stress analysis and crack growth simulations, developed for the NASA Langley Research Center. It has many of the salient features of the ZIP2D program. For example, ZIP2DL contains five material models (linearly elastic, elastic-perfectly plastic, power-law hardening, linear hardening, and multi-linear hardening models), and it can simulate mixed-mode crack growth for prescribed crack growth paths under plane stress, plane strain, and mixed states of stress. Further, as an extension of ZIP2D, it also includes a number of new capabilities. The large-deformation kinematics in ZIP2DL allow it to handle elastic problems with large strains and large rotations, and elastic-plastic problems with small strains and large rotations. Loading conditions in terms of surface traction, concentrated load, and nodal displacement can be applied with a default linear time dependence, or they can be programmed according to a user-defined time dependence through a user subroutine. The restart capability of ZIP2DL makes it possible to stop the execution of the program at any time, analyze the results and/or modify execution options, and then resume execution. This report includes three sections: a theoretical manual section, a user manual section, and an example manual section. In the theoretical section, the mathematics behind the various aspects of the program are concisely outlined. In the user manual section, a line-by-line explanation of the input data is given. In the example manual section, three types of examples are presented to demonstrate the accuracy and illustrate the use of this program.

  2. Scarp degraded by linear diffusion: inverse solution for age.

    USGS Publications Warehouse

    Andrews, D.J.; Hanks, T.C.

    1985-01-01

    Under the assumption that landforms unaffected by drainage channels are degraded according to the linear diffusion equation, a procedure is developed to invert a scarp profile to find its 'diffusion age'. The inverse procedure applied to synthetic data yields the following rules of thumb. Evidence of initial scarp shape has been lost when apparent age reaches twice its initial value. A scarp that appears to have been formed by one event may have been formed by two with an interval between them as large as apparent age. The simplicity of scarp profile measurement and this inversion makes profile analysis attractive. -from Authors
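The forward model behind "diffusion age" is simple enough to sketch: a scarp of half-height a degrading under the linear diffusion equation has the classical error-function profile h(x, t) = a · erf(x / (2√(κt))), so the product κt can be recovered from a measured profile by a one-dimensional search. The values below are illustrative, not from the paper.

```python
# Forward model and grid-search inversion for scarp 'diffusion age' (kappa*t).
# Profile geometry and noise-free synthetic data are assumptions.
import numpy as np
from math import erf, sqrt

def scarp_profile(x, a, kt):
    """Elevation of a diffusing scarp: a * erf(x / (2 sqrt(kappa t)))."""
    return np.array([a * erf(xi / (2.0 * sqrt(kt))) for xi in x])

def invert_diffusion_age(x, h, a, kt_grid):
    # pick the kappa*t minimizing squared misfit to the observed profile
    misfits = [np.sum((scarp_profile(x, a, kt) - h) ** 2) for kt in kt_grid]
    return kt_grid[int(np.argmin(misfits))]

x = np.linspace(-30.0, 30.0, 121)             # distance across scarp (m)
h_obs = scarp_profile(x, a=2.0, kt=25.0)      # synthetic 'observed' profile
kt_grid = np.linspace(1.0, 100.0, 397)
kt_est = invert_diffusion_age(x, h_obs, a=2.0, kt_grid=kt_grid)
```

The inversion returns κt (length² units); converting to calendar age requires an independently calibrated diffusivity κ, which is the crux of the method.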

  3. The measurement of linear frequency drift in oscillators

    NASA Astrophysics Data System (ADS)

    Barnes, J. A.

    1985-04-01

A linear drift in frequency is an important element in most stochastic models of oscillator performance. Quartz crystal oscillators often have drifts in excess of a part in ten to the tenth power per day. Even commercial cesium beam devices often show drifts of a few parts in ten to the thirteenth per year. There are many ways to estimate the drift rates from data samples (e.g., regress the phase on a quadratic function of time; regress the frequency on a linear function of time; compute the simple mean of the first difference of frequency; use Kalman filters with a drift term as one element in the state vector; and others). Although most of these estimators are unbiased, they vary in efficiency (i.e., confidence intervals). Further, the estimation of confidence intervals using the standard analysis of variance (typically associated with the specific estimating technique) can give amazingly optimistic results. The source of these problems is not an error in, say, the regression techniques; rather, the problems arise from correlations within the residuals. That is, the oscillator model is often not consistent with constraints on the analysis technique or, in other words, some specific analysis techniques are often inappropriate for the task at hand. The appropriateness of a specific analysis technique is critically dependent on the oscillator model and can often be checked with a simple whiteness test on the residuals.
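Two of the estimators listed can be sketched side by side on simulated data: (a) regress the phase on a quadratic (the drift is twice the quadratic coefficient), and (b) take the simple mean of the first difference of frequency. The white-phase-noise model and all numbers below are illustrative assumptions; the abstract's point is precisely that the relative efficiency of such estimators depends on the true noise type.

```python
# Two unbiased drift estimators applied to simulated oscillator phase data.
# Noise model (white phase noise) and magnitudes are assumptions.
import numpy as np

rng = np.random.default_rng(2)
tau = 1.0                                # sampling interval (s)
D = 1e-13                                # true linear frequency drift per tau
t = np.arange(1000) * tau
phase = 0.5 * D * t**2 + 1e-12 * rng.normal(size=t.size)

# (a) regress phase on a quadratic: drift = 2 * quadratic coefficient
c2, c1, c0 = np.polyfit(t, phase, 2)
drift_quadfit = 2.0 * c2

# (b) mean of the first difference of frequency
freq = np.diff(phase) / tau              # frequency from phase differences
drift_meandiff = np.mean(np.diff(freq)) / tau
```

Estimator (b) telescopes to the endpoints of the frequency record, so under white phase noise its confidence interval is far wider than the quadratic fit's, illustrating the efficiency differences the abstract describes.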

  4. Reference intervals and allometric scaling of two-dimensional echocardiographic measurements in 150 healthy cats.

    PubMed

    Karsten, Schober; Stephanie, Savino; Vedat, Yildiz

    2017-11-10

The objective of the study was to evaluate the effects of body weight (BW), breed, and sex on two-dimensional (2D) echocardiographic measures, reference ranges, and prediction intervals using allometrically-scaled data of left atrial (LA) and left ventricular (LV) size and LV wall thickness in healthy cats. The study was a retrospective, observational, clinical cohort. 150 healthy cats were enrolled and 2D echocardiograms analyzed. LA diameter, LV wall thickness, and LV dimension were quantified using three different imaging views. The effect of BW, breed, sex, age, and interaction (BW*sex) on echocardiographic variables was assessed using univariate and multivariate regression and linear mixed model analysis. Standard (using raw data) and allometrically scaled (Y = a × M^b) reference intervals and prediction intervals were determined. BW had a significant (P < 0.05) independent effect on 2D variables, whereas breed, sex, and age did not. There were clinically relevant differences between reference intervals using the mean ± 2 SD of raw data and the mean and 95% prediction interval of allometrically-scaled variables, most prominent in larger (>6 kg) and smaller (<3 kg) cats. A clinically relevant difference between the thickness of the interventricular septum (IVS) and the dimension of the LV posterior wall (LVPW) was identified. In conclusion, allometric scaling and BW-based 95% prediction intervals should be preferred over conventional 2D echocardiographic reference intervals in cats, particularly in small and large cats. These results are particularly relevant to screening examinations for feline hypertrophic cardiomyopathy.
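The allometric model Y = a × M^b becomes an ordinary linear regression after a log transform, and the BW-based 95% prediction interval follows from standard regression theory. The sketch below uses simulated cats with an assumed exponent and noise level; it is not the study's data or its exact statistical procedure.

```python
# Allometric scaling Y = a * M**b fit in log-log space, with an approximate
# 95% prediction interval for a new body weight.  Data are simulated.
import numpy as np

rng = np.random.default_rng(3)
bw = rng.uniform(2.5, 7.0, size=150)                     # body weight (kg)
lv = 1.2 * bw**0.33 * np.exp(0.05 * rng.normal(size=bw.size))  # e.g. LV size

X, Y = np.log(bw), np.log(lv)
n = len(X)
b, loga = np.polyfit(X, Y, 1)                            # slope = exponent b
resid = Y - (loga + b * X)
s = np.sqrt(np.sum(resid**2) / (n - 2))                  # residual std (log)

def prediction_interval(m_new, z=1.96):
    """Approximate 95% prediction interval for a cat of body weight m_new."""
    x0 = np.log(m_new)
    se = s * np.sqrt(1 + 1/n + (x0 - X.mean())**2 / np.sum((X - X.mean())**2))
    mid = loga + b * x0
    return np.exp(mid - z * se), np.exp(mid + z * se)

lo, hi = prediction_interval(4.0)
```

Because the interval widens away from the mean body weight, it adapts to very small and very large cats, which is where the abstract reports raw-data reference intervals break down.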

  5. Efficacy of a randomized trial examining commercial weight loss programs and exercise on metabolic syndrome in overweight and obese women.

    PubMed

    Baetge, Claire; Earnest, Conrad P; Lockard, Brittanie; Coletta, Adriana M; Galvan, Elfego; Rasmussen, Christopher; Levers, Kyle; Simbo, Sunday Y; Jung, Y Peter; Koozehchian, Majid; Oliver, Jonathan; Dalton, Ryan; Sanchez, Brittany; Byrd, Michael J; Khanna, Deepesh; Jagim, Andrew; Kresta, Julie; Greenwood, Mike; Kreider, Richard B

    2017-02-01

While commercial dietary weight-loss programs typically advise exercise, few provide actual programing. The goal of this study was to compare the Curves Complete 90-day Challenge (CC, n = 29), which incorporates exercise and diet, to programs advocating exercise (Weight Watchers Points Plus (WW, n = 29), Jenny Craig At Home (JC, n = 27), and Nutrisystem Advance Select (NS, n = 28)) or control (n = 20) on metabolic syndrome (MetS) and weight loss. We randomized 133 sedentary, overweight women (age, 47 ± 11 years; body mass, 86 ± 14 kg; body mass index, 35 ± 6 kg/m²) into respective treatment groups for 12 weeks. Data were analyzed using chi-square and general linear models adjusted for age and respective baseline measures. Data are means ± SD or mean change ± 95% confidence intervals (CIs). We observed a significant trend for a reduction in energy intake for all treatment groups and significant weight loss for all groups except control: CC (-4.32 kg; 95% CI, -5.75, -2.88), WW (-4.31 kg; 95% CI, -5.82, -2.96), JC (-5.34 kg; 95% CI, -6.86, -3.90), NS (-5.03 kg; 95% CI, -6.49, -3.56), and control (0.16 kg, 95% CI, -1.56, 1.89). Reduced MetS prevalence was observed at follow-up for CC (35% vs. 14%, adjusted standardized residuals (adjres.) = 3.1), but not WW (31% vs. 28%, adjres. = 0.5), JC (37% vs. 42%, adjres. = -0.7), NS (39% vs. 50%, adjres. = -1.5), or control (45% vs. 55%, adjres. = -1.7). While all groups improved relative fitness (mL·kg⁻¹·min⁻¹) because of weight loss, only the CC group improved absolute fitness (L/min). In conclusion, commercial programs offering concurrent diet and exercise programming appear to offer greater improvements in MetS prevalence and cardiovascular function after 12 weeks of intervention.

  6. A New Stochastic Equivalent Linearization Implementation for Prediction of Geometrically Nonlinear Vibrations

    NASA Technical Reports Server (NTRS)

    Muravyov, Alexander A.; Turner, Travis L.; Robinson, Jay H.; Rizzi, Stephen A.

    1999-01-01

In this paper, the problem of random vibration of geometrically nonlinear MDOF structures is considered. The solutions obtained by application of two different versions of a stochastic linearization method are compared with exact (F-P-K) solutions. The formulation of a relatively new version of the stochastic linearization method (energy-based version) is generalized to the MDOF system case. Also, a new method for determination of nonlinear stiffness coefficients for MDOF structures is demonstrated. This method in combination with the equivalent linearization technique is implemented in a new computer program. Results in terms of root-mean-square (RMS) displacements obtained by using the new program and an existing in-house code are compared for two examples of beam-like structures.

  7. Long-Term Exercise in Older Adults: 4-Year Outcomes of Music-Based Multitask Training

    PubMed Central

    Herrmann, François R.; Fielding, Roger A.; Reid, Kieran F.; Rizzoli, René; Trombetti, Andrea

    2016-01-01

    Prospective controlled evidence supporting the efficacy of long-term exercise to prevent physical decline and reduce falls in old age is lacking. The present study aimed to assess the effects of long-term music-based multitask exercise (i.e., Jaques-Dalcroze eurhythmics) on physical function and fall risk in older adults. A 3-year follow-up extension of a 1-year randomized controlled trial (NCT01107288) was conducted in Geneva (Switzerland), in which 134 community-dwellers aged ≥65 years at increased risk of falls received a 6-month music-based multitask exercise program. Four years following original trial enrolment, 52 subjects (baseline mean ± SD age, 75 ± 8 years) who (i) have maintained exercise program participation through the 4-year follow-up visit (“long-term intervention group”, n = 23) or (ii) have discontinued participation following original trial completion (“control group”, n = 29) were studied. They were reassessed in a blind fashion, using the same procedures as at baseline. At 4 years, linear mixed-effects models showed significant gait (gait speed, P = 0.006) and balance (one-legged stance time, P = 0.015) improvements in the long-term intervention group, compared with the control group. Also, long-term intervention subjects did better on Timed Up & Go, Five-Times-Sit-to-Stand and handgrip strength tests, than controls (P < 0.05, for all comparisons). Furthermore, the exercise program reduced the risk of falling (relative risk, 0.69; 95 % confidence interval, 0.5–0.9; P = 0.008). These findings suggest that long-term maintenance of a music-based multitask exercise program is a promising strategy to prevent age-related physical decline in older adults. They also highlight the efficacy of sustained long-term adherence to exercise for falls prevention. PMID:25148876

  8. Data-Based Interval Throwing Programs for Collegiate Softball Players

    PubMed Central

    Axe, Michael J.; Windley, Thomas C.; Snyder-Mackler, Lynn

    2002-01-01

    Objective: To construct interval throwing programs followed by a simulated game for collegiate softball players at all positions. The programs are intended to be used as functional progressions within a comprehensive rehabilitation program for an injured athlete or to augment off-season conditioning workouts. Design and Setting: We collected data over a single season of National Collegiate Athletic Association softball at the University of Delaware and Goldey Beacom College. We observed 220 half-innings of play and 2785 pitches during data collection. Subjects: The subjects were collegiate-level softball players at all positions of play. Measurements: We recorded the number of pitches for pitchers. For catchers, we recorded the number of sprints to back up a play, time in the squat stance, throws back to the pitcher, and the perceived effort and distance of all other throws. We also collected the perceived effort and distance of all throws for infielders and outfielders. Results: Pitchers threw an average of 89.61 pitches per game; catchers were in the squat stance 14.13 minutes per game; infielders threw the ball between 4.28 times per game and 6.30 times per game; and outfielders threw distances of up to 175 feet. Conclusions: We devised the interval throwing programs from the data collected, field dimensions, the types of injuries found to occur in softball, and a general understanding of tissue healing. We designed programs that allow a safe and efficient progressive return to sport. PMID:12937435

  9. A FORTRAN technique for correlating a circular environmental variable with a linear physiological variable in the sugar maple.

    PubMed

    Pease, J M; Morselli, M F

    1987-01-01

This paper deals with a computer program adapted to a statistical method for analyzing an unlimited quantity of binary recorded data of an independent circular variable (e.g., wind direction) and a linear variable (e.g., maple sap flow volume). Circular variables cannot be statistically analyzed with linear methods unless they have been transformed. The program calculates a critical quantity, the acrophase angle (φ₀). The technique is adapted from original mathematics [1] and is written in Fortran 77 for easier conversion between computer networks. Correlation or regression analysis can be performed following the program; because of the circular nature of the independent variable, the regression becomes periodic regression. The technique was tested on a file of approximately 4050 data pairs.
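The periodic regression involved can be sketched concisely (here in Python rather than Fortran 77): fit y = c0 + c1·cos θ + c2·sin θ by least squares, read the acrophase angle off as atan2(c2, c1), and take the correlation between fitted and observed values as the circular-linear correlation. The simulated data and the specific acrophase definition are assumptions for illustration.

```python
# Periodic regression of a linear variable on a circular one.  The data are
# simulated (not wind direction / sap flow), with a known acrophase of 1 rad.
import numpy as np

rng = np.random.default_rng(4)
theta = rng.uniform(0, 2 * np.pi, size=300)          # circular predictor
phase_true = 1.0                                     # true acrophase (rad)
y = 5.0 + 2.0 * np.cos(theta - phase_true) + 0.3 * rng.normal(size=theta.size)

# Design matrix [1, cos(theta), sin(theta)] linearizes the periodic model.
A = np.column_stack([np.ones_like(theta), np.cos(theta), np.sin(theta)])
(c0, c1, c2), *_ = np.linalg.lstsq(A, y, rcond=None)
acrophase = np.arctan2(c2, c1)                       # estimated phase (rad)
r = np.corrcoef(A @ np.array([c0, c1, c2]), y)[0, 1] # circular-linear corr.
```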

  10. Buffered coscheduling for parallel programming and enhanced fault tolerance

    DOEpatents

    Petrini, Fabrizio [Los Alamos, NM; Feng, Wu-chun [Los Alamos, NM

    2006-01-31

A computer implemented method schedules processor jobs on a network of parallel machine processors or distributed system processors. Control information communications generated by each process performed by each processor during a defined time interval are accumulated in buffers, where adjacent time intervals are separated by strobe intervals for a global exchange of control information. A global exchange of the control information communications at the end of each defined time interval is performed during an intervening strobe interval so that each processor is informed by all of the other processors of the number of incoming jobs to be received by each processor in a subsequent time interval. The buffered coscheduling method of this invention also enhances the fault tolerance of a network of parallel machine processors or distributed system processors.

  11. Program for the solution of multipoint boundary value problems of quasilinear differential equations

    NASA Technical Reports Server (NTRS)

    1973-01-01

Linear equations are solved by a method of superposition of solutions of a sequence of initial value problems. For nonlinear equations and/or boundary conditions, the solution is iterative, and in each iteration a problem like the linear case is solved. A simple Taylor series expansion is used for the linearization of both nonlinear equations and nonlinear boundary conditions. The perturbation method of solution is used in preference to quasilinearization because of programming ease and smaller storage requirements; experiments indicate that the desired convergence properties exist, although no proof of convergence is given.
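The superposition idea for a linear two-point boundary value problem can be sketched directly: integrate one particular initial value problem and one homogeneous one, then combine them so the far boundary condition is satisfied. The example problem (y″ = y + 1 with y(0) = 0, y(1) = 1) and the RK4 integrator are assumptions standing in for the report's machinery.

```python
# Superposition solution of a linear BVP via two initial value problems.
# Example ODE and boundary conditions are illustrative.
import numpy as np

def rk4(f, y0, ts):
    """Classical 4th-order Runge-Kutta over the grid ts."""
    y = np.array(y0, float)
    out = [y.copy()]
    for t0, t1 in zip(ts[:-1], ts[1:]):
        h = t1 - t0
        k1 = f(t0, y)
        k2 = f(t0 + h / 2, y + h / 2 * k1)
        k3 = f(t0 + h / 2, y + h / 2 * k2)
        k4 = f(t1, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        out.append(y.copy())
    return np.array(out)

f_p = lambda t, y: np.array([y[1], y[0] + 1.0])   # full system: y'' = y + 1
f_h = lambda t, y: np.array([y[1], y[0]])         # homogeneous: y'' = y

ts = np.linspace(0.0, 1.0, 101)
yp = rk4(f_p, [0.0, 0.0], ts)     # particular IVP:  y(0) = 0, y'(0) = 0
yh = rk4(f_h, [0.0, 1.0], ts)     # homogeneous IVP: y(0) = 0, y'(0) = 1

c = (1.0 - yp[-1, 0]) / yh[-1, 0] # choose c so that y(1) = 1
y = yp[:, 0] + c * yh[:, 0]       # superposed solution on the grid
```

For a nonlinear problem, as the abstract notes, this whole procedure would be repeated on the Taylor-linearized equations at each iteration.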

  12. Description of a computer program and numerical techniques for developing linear perturbation models from nonlinear systems simulations

    NASA Technical Reports Server (NTRS)

    Dieudonne, J. E.

    1978-01-01

    A numerical technique was developed which generates linear perturbation models from nonlinear aircraft vehicle simulations. The technique is very general and can be applied to simulations of any system that is described by nonlinear differential equations. The computer program used to generate these models is discussed, with emphasis placed on generation of the Jacobian matrices, calculation of the coefficients needed for solving the perturbation model, and generation of the solution of the linear differential equations. An example application of the technique to a nonlinear model of the NASA terminal configured vehicle is included.
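The core of such a technique is numerical generation of the Jacobian matrices about a trim point. A minimal sketch, with a toy two-state nonlinear system standing in for an aircraft simulation, is:

```python
# Extracting a linear perturbation model from a nonlinear simulation:
# build A = df/dx at a trim point by central finite differences.
# The dynamics function below is an illustrative stand-in.
import numpy as np

def f(x):
    """Toy nonlinear system: pendulum-like dynamics with damping."""
    return np.array([x[1], -np.sin(x[0]) - 0.1 * x[1]])

def jacobian(f, x0, eps=1e-6):
    n = len(x0)
    A = np.zeros((n, n))
    for j in range(n):
        dx = np.zeros(n)
        dx[j] = eps
        A[:, j] = (f(x0 + dx) - f(x0 - dx)) / (2 * eps)  # central difference
    return A

x_trim = np.array([0.0, 0.0])
A = jacobian(f, x_trim)   # linear perturbation model: d(dx)/dt = A @ dx
```

The same perturbation loop extended over control inputs yields the B matrix, giving the full linear model used for control design about the trim condition.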

  13. A Meta-Analysis of School-Based Bullying Prevention Programs' Effects on Bystander Intervention Behavior

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Espelage, Dorothy L.; Pigott, Therese D.

    2012-01-01

    This meta-analysis synthesized bullying prevention programs' effectiveness at increasing bystander intervention in bullying situations. Evidence from 12 school-based programs, involving 12,874 students, indicated that overall the programs were successful (Hedges's g = 0.20, 95% confidence interval [CI] = 0.11 to 0.29, p = 0.001), with larger…

  14. Measuring Time on the PET and Other Microcomputers

    ERIC Educational Resources Information Center

    Tesler, Larry

    1978-01-01

    The operation of the microcomputer requires one or more clocks or timers to measure intervals of different magnitudes. Methods are discussed for measuring time intervals on PET in hours, minutes, seconds, microseconds, miscellaneous units, and timing events on external devices. Directions are added for BASIC program applications of timing…

  15. Starting Silicon-Ribbon Growth Automatically

    NASA Technical Reports Server (NTRS)

    Mchugh, J. P.

    1984-01-01

    Semiautomatic system starts growth of silicon sheets more reliably than system with purely manual control. Control signals for starting sheetcrystal growth consist of ramps (during which signal changes linearly from one value to another over preset time interval) and soaks (during which signal remains constant). Ramps and soaks for best temperature and pulling speed determined by experimentation.

  16. Monitoring Human Development Goals: A Straightforward (Bayesian) Methodology for Cross-National Indices

    ERIC Educational Resources Information Center

    Abayomi, Kobi; Pizarro, Gonzalo

    2013-01-01

    We offer a straightforward framework for measurement of progress, across many dimensions, using cross-national social indices, which we classify as linear combinations of multivariate country level data onto a univariate score. We suggest a Bayesian approach which yields probabilistic (confidence type) intervals for the point estimates of country…

  17. Quantization of spacetime based on a spacetime interval operator

    NASA Astrophysics Data System (ADS)

    Chiang, Hsu-Wen; Hu, Yao-Chieh; Chen, Pisin

    2016-04-01

Motivated by both concepts of Adler's recent work on utilizing Clifford algebra as the linear line element ds = ⟨γ_μ⟩ dX^μ and the fermionization of the cylindrical worldsheet Polyakov action, we introduce a new type of spacetime quantization that is fully covariant. The theory is based on the reinterpretation of Adler's linear line element as ds = γ_μ⟨λ γ^μ⟩, where λ is the characteristic length of the theory. We name this new operator the "spacetime interval operator" and argue that it can be regarded as a natural extension to the one-forms in the U(su(2)) noncommutative geometry. By treating the Fourier momentum as the particle momentum, the generalized uncertainty principle of the U(su(2)) noncommutative geometry, as an approximation to the generalized uncertainty principle of our theory, is derived and is shown to have a lowest order correction term of the order p², similar to that of Snyder's. The holographic nature of the theory is demonstrated and the predicted fuzziness of the geodesic is shown to be much smaller than conceivable astrophysical bounds.

  18. Virtual directions in paleomagnetism: A global and rapid approach to evaluate the NRM components.

    NASA Astrophysics Data System (ADS)

    Ramón, Maria J.; Pueyo, Emilio L.; Oliva-Urcia, Belén; Larrasoaña, Juan C.

    2017-02-01

We introduce a method and software to process demagnetization data for a rapid and integrative estimation of characteristic remanent magnetization (ChRM) components. The virtual directions (VIDI) of a paleomagnetic site are “all” possible directions that can be calculated from a given demagnetization routine of “n” steps (with “m” the number of specimens in the site). If the ChRM can be defined for a site, it will be represented in the VIDI set. Directions can be calculated for successive steps using principal component analysis, both anchored to the origin (resultant virtual directions, RVD; m·(n²+n)/2) and not anchored (difference virtual directions, DVD; m·(n²−n)/2). The number of directions per specimen (of order n²) is very large and will enhance all ChRM components, with noisy regions where two components were fitted together (mixing their unblocking intervals). In the same way, resultant and difference virtual circles (RVC, DVC) are calculated. Virtual directions and circles are a global and objective approach to unravel different natural remanent magnetization (NRM) components for a paleomagnetic site without any assumption. To better constrain the stable components, some filters can be applied, such as establishing an upper boundary for the MAD, removing samples with anomalous intensities, or stating a minimum number of demagnetization steps (objective filters), or selecting a given unblocking interval (subjective, but based on expertise). On the other hand, the VPD program also allows the application of standard approaches (classic PCA fitting of directions and circles) and other ancillary methods (stacking routine, linearity spectrum analysis), giving an objective, global and robust idea of the demagnetization structure with minimal assumptions.
Application of the VIDI method to natural cases (outcrops in the Pyrenees and u-channel data from a Roman dam infill in northern Spain) and their comparison to other approaches (classic end-point, demagnetization circle analysis, stacking routine and linearity spectrum analysis) allows validation of this technique. The VIDI is a global approach and it is especially useful for large data sets and rapid estimation of the NRM components.

  19. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

In several biological contexts, parameter inference often relies on computationally intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC, based on using a linear regression to approximate the posterior distribution of the parameters conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone and fully documented. 2. The program will automatically process multiple data sets and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and of testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local-linear regression.
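The regression adjustment underlying this approach (in the style of Beaumont's local linear regression, which ABCreg automates) fits in a few lines: accept the simulations whose summary statistic falls closest to the observed one, regress the parameter on the statistic among accepted draws, and shift the draws to the observed statistic. The toy model below (inferring a normal mean) is an assumption for illustration, not ABCreg itself.

```python
# Rejection ABC followed by a local linear-regression adjustment.
# Toy model: theta ~ Uniform(-3, 3), statistic = theta + Normal(0, 0.5).
import numpy as np

rng = np.random.default_rng(5)
obs_stat = 0.8                                   # observed summary statistic

theta = rng.uniform(-3, 3, size=20000)           # draws from the prior
stats = theta + rng.normal(scale=0.5, size=theta.size)  # simulated statistics

# Rejection step: keep the closest 5% of simulations.
d = np.abs(stats - obs_stat)
keep = d <= np.quantile(d, 0.05)
th, st = theta[keep], stats[keep]

# Regression step: theta ~ statistic among accepted draws, then adjust
# each accepted draw to the observed statistic.
b, a = np.polyfit(st, th, 1)
th_adj = th + b * (obs_stat - st)                # adjusted posterior sample
posterior_mean = th_adj.mean()
```

The adjustment corrects for the tolerance of the rejection step, so the adjusted sample concentrates near the true posterior even with a fairly loose acceptance threshold.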

  20. On the Modeling of the Residual Effects of the Clock Behavior and the Atmosphere Effects in the Analysis of VLBI Data

    NASA Astrophysics Data System (ADS)

    Bo, Zhang; Li, Jin-Ling; Wang, Guan-Gli

    2002-01-01

We checked the dependence of the estimation of parameters on the choice of piecewise interval in the continuous piecewise linear modeling of the residual clock and atmosphere effects by analysis of 27 VLBI experiments involving Shanghai station (Seshan 25m). The following are tentatively shown: (1) Different choices of the piecewise interval lead to differences in the estimation of station coordinates and in the weighted root mean squares (wrms) of the delay residuals, which can be of the order of centimeters or dozens of picoseconds, respectively. The choice of piecewise interval should therefore not be arbitrary. (2) The piecewise interval should not be too long, otherwise the short-term variations in the residual clock and atmospheric effects cannot be properly modeled. At the same time, in order to maintain enough degrees of freedom in parameter estimation, the interval cannot be too short, otherwise the normal equation may become nearly or exactly singular and the noise cannot be constrained as well. Therefore the choice of the interval should be within some reasonable range. (3) Since the conditions of clock and atmosphere differ from experiment to experiment and from station to station, the reasonable range of the piecewise interval should be tested and chosen separately for each experiment as well as for each station by real data analysis. This is arduous work in routine data analysis. (4) Generally speaking, with the default interval for the clock at 60 min, the reasonable range of the piecewise interval for residual atmospheric effect modeling is between 10 min and 40 min, while with the default interval for the atmosphere at 20 min, that for residual clock behavior is between 20 min and 100 min.
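The trade-off in point (2) can be illustrated with a continuous piecewise-linear fit built from hat-function basis elements, with knots spaced every "interval" minutes. The residual series below (a drift plus a slow oscillation plus noise) and both interval choices are illustrative assumptions, not VLBI data.

```python
# Continuous piecewise-linear modeling of a residual series: hat-function
# basis with knots every `interval` minutes, fit by least squares.
import numpy as np

def piecewise_linear_fit(t, y, interval):
    knots = np.arange(t.min(), t.max() + interval, interval)
    B = np.maximum(0.0, 1.0 - np.abs(t[:, None] - knots[None, :]) / interval)
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return B @ coef, len(knots)          # fitted series, parameter count

rng = np.random.default_rng(6)
t = np.arange(0.0, 240.0, 1.0)                      # time (minutes)
y = 0.02 * t + np.sin(t / 30.0) + 0.1 * rng.normal(size=t.size)

fit20, n20 = piecewise_linear_fit(t, y, 20.0)       # 20-min interval
fit60, n60 = piecewise_linear_fit(t, y, 60.0)       # 60-min interval
wrms20 = np.sqrt(np.mean((y - fit20) ** 2))
wrms60 = np.sqrt(np.mean((y - fit60) ** 2))
```

The 60-min interval misses the short-term oscillation (higher wrms), while the 20-min interval tracks it at the cost of more parameters, which is the degrees-of-freedom tension the abstract describes.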
