Sample records for multivariate point processes

  1. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining

    PubMed Central

    Truccolo, Wilson

    2017-01-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics (“order parameters”) inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. PMID:28336305
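
    A minimal sketch (assumptions, not the authors' code) of the discrete-time multivariate nonlinear Hawkes / PP-GLM idea described above: each neuron's conditional intensity is an exponential function of a baseline plus filtered spike history from all neurons, and spikes are drawn per time bin. The bin size, kernel shape and coupling weights below are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n_neurons, n_bins, dt = 3, 5000, 0.001           # 1 ms bins (illustrative)
    baseline = np.log(np.array([5.0, 8.0, 6.0]))      # log background rates (Hz)
    # history kernel: exponentially decaying weights over a 50 ms window
    lags = np.arange(1, 51)
    decay = np.exp(-lags * dt / 0.01)                 # 10 ms time constant
    W = np.array([[1.0, 0.5, 0.0],                    # coupling gains (assumed)
                  [0.0, 1.0, 0.5],
                  [0.5, 0.0, 1.0]])

    spikes = np.zeros((n_neurons, n_bins), dtype=int)
    for t in range(len(lags), n_bins):
        hist = spikes[:, t - len(lags):t][:, ::-1] @ decay   # filtered spike history per neuron
        log_lam = baseline + W @ hist                        # nonlinear Hawkes: exponential link
        p = 1.0 - np.exp(-np.exp(log_lam) * dt)              # P(spike in bin) from the intensity
        spikes[:, t] = rng.random(n_neurons) < p

    print("mean rates (Hz):", spikes.mean(axis=1) / dt)
    ```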

  2. From point process observations to collective neural dynamics: Nonlinear Hawkes process GLMs, low-dimensional dynamics and coarse graining.

    PubMed

    Truccolo, Wilson

    2016-11-01

    This review presents a perspective on capturing collective dynamics in recorded neuronal ensembles based on multivariate point process models, inference of low-dimensional dynamics and coarse graining of spatiotemporal measurements. A general probabilistic framework for continuous time point processes is reviewed, with an emphasis on multivariate nonlinear Hawkes processes with exogenous inputs. A point process generalized linear model (PP-GLM) framework for the estimation of discrete time multivariate nonlinear Hawkes processes is described. The approach is illustrated with the modeling of collective dynamics in neocortical neuronal ensembles recorded in human and non-human primates, and prediction of single-neuron spiking. A complementary approach to capture collective dynamics based on low-dimensional dynamics ("order parameters") inferred via latent state-space models with point process observations is presented. The approach is illustrated by inferring and decoding low-dimensional dynamics in primate motor cortex during naturalistic reach and grasp movements. Finally, we briefly review hypothesis tests based on conditional inference and spatiotemporal coarse graining for assessing collective dynamics in recorded neuronal ensembles. Published by Elsevier Ltd.

  3. A trust region approach with multivariate Padé model for optimal circuit design

    NASA Astrophysics Data System (ADS)

    Abdel-Malek, Hany L.; Ebid, Shaimaa E. K.; Mohamed, Ahmed S. A.

    2017-11-01

    Since the optimization process requires a significant number of consecutive function evaluations, it is recommended to replace the function by an easily evaluated approximation model during the optimization process. The model suggested in this article is based on a multivariate Padé approximation. This model is constructed using data points of ?, where ? is the number of parameters. The model is updated over a sequence of trust regions. This model avoids the slow convergence of linear models of ? and has features of quadratic models that need interpolation data points of ?. The proposed approach is tested by applying it to several benchmark problems. Yield optimization using such a direct method is applied to some practical circuit examples. Minimax solution leads to a suitable initial point to carry out the yield optimization process. The yield is optimized by the proposed derivative-free method for active and passive filter examples.

  4. Multivariable model predictive control design of reactive distillation column for Dimethyl Ether production

    NASA Astrophysics Data System (ADS)

    Wahid, A.; Putra, I. G. E. P.

    2018-03-01

    Dimethyl ether (DME) as an alternative clean energy source has attracted growing attention in recent years. DME production via reactive distillation has potential for capital cost and energy requirement savings. However, the combination of reaction and distillation in a single column makes the reactive distillation process a very complex multivariable system with highly non-linear behavior and strong interaction between process variables. This study investigates a multivariable model predictive control (MPC) scheme based on a two-point temperature control strategy for the DME reactive distillation column to maintain the purities of both product streams. The process model is estimated by a first order plus dead time model. The DME and water purities are maintained by controlling a stage temperature in the rectifying and stripping sections, respectively. The results show that the model predictive controller gives faster responses than a conventional PI controller, as indicated by smaller ISE values. In addition, the MPC controller handles the loop interactions well.

  5. A model-based approach to wildland fire reconstruction using sediment charcoal records

    USGS Publications Warehouse

    Itter, Malcolm S.; Finley, Andrew O.; Hooten, Mevin B.; Higuera, Philip E.; Marlon, Jennifer R.; Kelly, Ryan; McLachlan, Jason S.

    2017-01-01

    Lake sediment charcoal records are used in paleoecological analyses to reconstruct fire history, including the identification of past wildland fires. One challenge of applying sediment charcoal records to infer fire history is the separation of charcoal associated with local fire occurrence and charcoal originating from regional fire activity. Despite a variety of methods to identify local fires from sediment charcoal records, an integrated statistical framework for fire reconstruction is lacking. We develop a Bayesian point process model to estimate the probability of fire associated with charcoal counts from individual-lake sediments and estimate mean fire return intervals. A multivariate extension of the model combines records from multiple lakes to reduce uncertainty in local fire identification and estimate a regional mean fire return interval. The univariate and multivariate models are applied to 13 lakes in the Yukon Flats region of Alaska. Both models resulted in similar mean fire return intervals (100–350 years) with reduced uncertainty under the multivariate model due to improved estimation of regional charcoal deposition. The point process model offers an integrated statistical framework for paleofire reconstruction and extends existing methods to infer regional fire history from multiple lake records with uncertainty following directly from posterior distributions.
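
    A much-simplified sketch of the core quantity estimated above: if local fire occurrence in a record is treated as a homogeneous Poisson point process with a conjugate Gamma prior on the fire rate, the posterior of the mean fire return interval follows in closed form. The record length, fire count and prior below are invented; the paper's hierarchical multi-lake model is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    record_years = 8000          # length of sediment record (assumed)
    n_fires = 32                 # local fire episodes identified (assumed)

    # Gamma(a0, b0) prior on the fire rate (events per year); conjugate to Poisson counts
    a0, b0 = 1.0, 500.0
    a_post, b_post = a0 + n_fires, b0 + record_years

    rate_draws = rng.gamma(a_post, 1.0 / b_post, size=20000)
    fri_draws = 1.0 / rate_draws                      # fire return interval in years

    print("posterior mean FRI: %.0f yr" % fri_draws.mean())
    print("95%% interval: %.0f-%.0f yr" % tuple(np.percentile(fri_draws, [2.5, 97.5])))
    ```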

  6. Advanced multivariable control of a turboexpander plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Altena, D.; Howard, M.; Bullin, K.

    1998-12-31

    This paper describes an application of advanced multivariable control on a natural gas plant and compares its performance to the previous conventional feed-back control. This control algorithm utilizes simple models from existing plant data and/or plant tests to hold the process at the desired operating point in the presence of disturbances and changes in operating conditions. The control software is able to accomplish this due to effective handling of process variable interaction, constraint avoidance and feed-forward of measured disturbances. The economic benefit of improved control lies in operating closer to the process constraints while avoiding significant violations. The South Texas facility where this controller was implemented experienced reduced variability in process conditions which increased liquids recovery because the plant was able to operate much closer to the customer specified impurity constraint. An additional benefit of this implementation of multivariable control is the ability to set performance criteria beyond simple setpoints, including process variable constraints, relative variable merit and optimizing use of manipulated variables. The paper also details the control scheme applied to the complex turboexpander process and some of the safety features included to improve reliability.

  7. Generating functions and stability study of multivariate self-excited epidemic processes

    NASA Astrophysics Data System (ADS)

    Saichev, A. I.; Sornette, D.

    2011-09-01

    We present a stability study of the class of multivariate self-excited Hawkes point processes, that can model natural and social systems, including earthquakes, epileptic seizures and the dynamics of neuron assemblies, bursts of exchanges in social communities, interactions between Internet bloggers, bank network fragility and cascading of failures, national sovereign default contagion, and so on. We present the general theory of multivariate generating functions to derive the number of events over all generations of various types that are triggered by a mother event of a given type. We obtain the stability domains of various systems, as a function of the topological structure of the mutual excitations across different event types. We find that mutual triggering tends to provide a significant extension of the stability (or subcritical) domain compared with the case where event types are decoupled, that is, when an event of a given type can only trigger events of the same type.
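
    The stability criterion discussed above can be checked numerically: a multivariate Hawkes process is subcritical (stable) when the spectral radius of the branching matrix, whose entries give the mean number of events of one type directly triggered by an event of another type, is below one. The matrix entries below are illustrative.

    ```python
    import numpy as np

    # n[i, j] = mean number of type-i events directly triggered by one type-j event
    branching = np.array([[0.4, 0.3, 0.0],
                          [0.1, 0.5, 0.2],
                          [0.0, 0.2, 0.6]])

    spectral_radius = max(abs(np.linalg.eigvals(branching)))
    print("spectral radius:", round(spectral_radius, 3),
          "-> subcritical (stable)" if spectral_radius < 1 else "-> supercritical")
    ```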

  8. Nonparametric Bayesian Segmentation of a Multivariate Inhomogeneous Space-Time Poisson Process.

    PubMed

    Ding, Mingtao; He, Lihan; Dunson, David; Carin, Lawrence

    2012-12-01

    A nonparametric Bayesian model is proposed for segmenting time-evolving multivariate spatial point process data. An inhomogeneous Poisson process is assumed, with a logistic stick-breaking process (LSBP) used to encourage piecewise-constant spatial Poisson intensities. The LSBP explicitly favors spatially contiguous segments, and infers the number of segments based on the observed data. The temporal dynamics of the segmentation and of the Poisson intensities are modeled with exponential correlation in time, implemented in the form of a first-order autoregressive model for uniformly sampled discrete data, and via a Gaussian process with an exponential kernel for general temporal sampling. We consider and compare two different inference techniques: a Markov chain Monte Carlo sampler, which has relatively high computational complexity; and an approximate and efficient variational Bayesian analysis. The model is demonstrated with a simulated example and a real example of space-time crime events in Cincinnati, Ohio, USA.

  9. Applying the multivariate time-rescaling theorem to neural population models

    PubMed Central

    Gerhard, Felipe; Haslinger, Robert; Pipa, Gordon

    2011-01-01

    Statistical models of neural activity are integral to modern neuroscience. Recently, interest has grown in modeling the spiking activity of populations of simultaneously recorded neurons to study the effects of correlations and functional connectivity on neural information processing. However, any statistical model must be validated by an appropriate goodness-of-fit test. Kolmogorov-Smirnov tests based upon the time-rescaling theorem have proven to be useful for evaluating point-process-based statistical models of single-neuron spike trains. Here we discuss the extension of the time-rescaling theorem to the multivariate (neural population) case. We show that even in the presence of strong correlations between spike trains, models which neglect couplings between neurons can be erroneously passed by the univariate time-rescaling test. We present the multivariate version of the time-rescaling theorem, and provide a practical step-by-step procedure for applying it towards testing the sufficiency of neural population models. Using several simple analytically tractable models and also more complex simulated and real data sets, we demonstrate that important features of the population activity can only be detected using the multivariate extension of the test. PMID:21395436
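
    A minimal sketch of the univariate building block behind the test discussed above: spike times are rescaled by the model's integrated conditional intensity, and under a correct model the rescaled inter-spike intervals are unit-rate exponential, which a Kolmogorov-Smirnov test can check. The intensity and spike train below are simulated stand-ins; the paper's multivariate extension is not reproduced.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    dt, T = 0.001, 200.0
    t = np.arange(0, T, dt)
    lam = 5.0 + 3.0 * np.sin(2 * np.pi * t / 10.0)        # "true" intensity (assumed form)
    spikes = rng.random(t.size) < lam * dt                # simulate an inhomogeneous Poisson train

    # Time-rescaling: Lambda(t) = integral of the *model* intensity up to each spike
    Lambda = np.cumsum(lam * dt)
    rescaled = Lambda[spikes]                             # transformed spike times
    intervals = np.diff(rescaled)                         # should be Exp(1) if the model is correct

    ks_stat, p_value = stats.kstest(intervals, "expon")   # unit-rate exponential by default
    print(f"KS statistic {ks_stat:.3f}, p = {p_value:.3f}")
    ```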

  10. Properties of multivariable root loci. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Yagle, A. E.

    1981-01-01

    Various properties of multivariable root loci are analyzed from a frequency domain point of view by using the technique of Newton polygons, and some generalizations of the SISO root locus rules to the multivariable case are pointed out. The behavior of the angles of arrival and departure is related to the Smith-MacMillan form of G(s) and explicit equations for these angles are obtained. After specializing to first order and a restricted class of higher order poles and zeros, some simple equations for these angles that are direct generalizations of the SISO equations are found. The unusual behavior of root loci on the real axis at branch points is studied. The SISO root locus rules for break-in and break-out points are shown to generalize directly to the multivariable case. Some methods for computing both types of points are presented.

  11. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.

    2012-06-15

    In geostatistics, most stochastic algorithms for the simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly based on its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification of an initial estimated multivariate probability using lower order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher order marginal probability constraints as used in multiple point statistics. The theoretical framework is developed and illustrated with an estimation and simulation example.
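
    The IPF step described above can be illustrated on a small discrete table (a sketch with made-up numbers, not the article's implementation): the joint probability is alternately rescaled until both sets of imposed marginals are matched.

    ```python
    import numpy as np

    def ipf(joint, row_marg, col_marg, tol=1e-10, max_iter=500):
        """Fit a 2-D joint probability table to target marginals by iterative proportion fitting."""
        p = joint.copy()
        for _ in range(max_iter):
            p *= (row_marg / p.sum(axis=1))[:, None]   # rescale rows to match row marginals
            p *= (col_marg / p.sum(axis=0))[None, :]   # rescale columns to match column marginals
            if np.allclose(p.sum(axis=1), row_marg, atol=tol):
                break
        return p

    initial = np.full((3, 3), 1.0 / 9.0)               # uninformative starting joint
    rows = np.array([0.5, 0.3, 0.2])                   # e.g. facies proportions along wells (assumed)
    cols = np.array([0.4, 0.4, 0.2])

    fitted = ipf(initial, rows, cols)
    print(fitted.round(3), fitted.sum())
    ```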

  12. Moving Average Models with Bivariate Exponential and Geometric Distributions.

    DTIC Science & Technology

    1985-03-01

    ordinary time series and of point processes. Developments in Statistics, Vol. 1, P.R. Krishnaiah, ed. Academic Press, New York. [9] Esary, J.D. and...valued and discrete-valued time series with ARMA correlation structure. Multivariate Analysis V, P.R. Krishnaiah, ed. North-Holland. 151-166. [28

  13. Multivariate Autoregressive Modeling and Granger Causality Analysis of Multiple Spike Trains

    PubMed Central

    Krumin, Michael; Shoham, Shy

    2010-01-01

    Recent years have seen the emergence of microelectrode arrays and optical methods allowing simultaneous recording of spiking activity from populations of neurons in various parts of the nervous system. The analysis of multiple neural spike train data could benefit significantly from existing methods for multivariate time-series analysis which have proven to be very powerful in the modeling and analysis of continuous neural signals like EEG signals. However, those methods have not generally been well adapted to point processes. Here, we use our recent results on correlation distortions in multivariate Linear-Nonlinear-Poisson spiking neuron models to derive generalized Yule-Walker-type equations for fitting “hidden” Multivariate Autoregressive models. We use this new framework to perform Granger causality analysis in order to extract the directed information flow pattern in networks of simulated spiking neurons. We discuss the relative merits and limitations of the new method. PMID:20454705
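
    A minimal sketch of the Yule-Walker idea for a multivariate autoregressive model on continuous-valued series (the paper's "hidden" Linear-Nonlinear-Poisson correction is not reproduced): the VAR(1) coefficient matrix solves A = C1 C0^-1 from the lag-0 and lag-1 covariances, and near-zero off-diagonal entries indicate the absence of a directed (Granger-type) influence. The simulated coupling below is illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # simulate a VAR(1): x[t] = A x[t-1] + noise, with an x1 -> x2 coupling only
    A_true = np.array([[0.6, 0.0],
                       [0.4, 0.5]])
    T = 20000
    x = np.zeros((T, 2))
    for t in range(1, T):
        x[t] = A_true @ x[t - 1] + rng.standard_normal(2)

    xc = x - x.mean(axis=0)
    C0 = xc[:-1].T @ xc[:-1] / (T - 1)        # lag-0 covariance
    C1 = xc[1:].T @ xc[:-1] / (T - 1)         # lag-1 cross-covariance, E[x_t x_{t-1}^T]
    A_hat = C1 @ np.linalg.inv(C0)            # Yule-Walker estimate of the VAR(1) matrix

    print(np.round(A_hat, 2))                 # off-diagonals ~0.4 vs ~0.0 mirror the coupling
    ```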

  14. Multivariate analysis and extraction of parameters in resistive RAMs using the Quantum Point Contact model

    NASA Astrophysics Data System (ADS)

    Roldán, J. B.; Miranda, E.; González-Cordero, G.; García-Fernández, P.; Romero-Zaliz, R.; González-Rodelas, P.; Aguilera, A. M.; González, M. B.; Jiménez-Molinos, F.

    2018-01-01

    A multivariate analysis of the parameters that characterize the reset process in Resistive Random Access Memory (RRAM) has been performed. The different correlations obtained can help to shed light on the current components that contribute in the Low Resistance State (LRS) of the technology considered. In addition, a screening method for the Quantum Point Contact (QPC) current component is presented. For this purpose, the second derivative of the current has been obtained using a novel numerical method which allows determining the QPC model parameters. Once the procedure is completed, a whole Resistive Switching (RS) series of thousands of curves is studied by means of a genetic algorithm. The extracted QPC parameter distributions are characterized in depth to get information about the filamentary pathways associated with LRS in the low voltage conduction regime.

  15. Testing for the Presence of Correlation Changes in a Multivariate Time Series: A Permutation Based Approach.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva

    2018-01-15

    Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from a severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation-based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs on par with or better than state-of-the-art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
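
    A highly simplified sketch of the permutation logic described above (not the authors' KCP implementation): compute running correlations in sliding windows, measure how much the within-phase variance drops when a single change point is allowed, and compare that drop with drops obtained after permuting the time points. The simulated correlation jump is illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def running_corr(x, y, w):
        return np.array([np.corrcoef(x[i:i + w], y[i:i + w])[0, 1]
                         for i in range(len(x) - w + 1)])

    def var_drop_one_cp(r):
        """Drop in total within-phase variance when one change point is allowed."""
        total = np.var(r) * len(r)
        best = min(np.var(r[:k]) * k + np.var(r[k:]) * (len(r) - k)
                   for k in range(10, len(r) - 10))
        return total - best

    # bivariate series whose correlation jumps halfway through (illustrative data)
    n = 400
    x = rng.standard_normal(n)
    rho = np.r_[np.full(n // 2, 0.1), np.full(n // 2, 0.9)]
    y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal(n)

    r_obs = running_corr(x, y, w=25)
    obs_drop = var_drop_one_cp(r_obs)

    # permute time points (keeping x-y pairs intact) to build the null distribution
    perm_drops = []
    for _ in range(200):
        idx = rng.permutation(n)
        perm_drops.append(var_drop_one_cp(running_corr(x[idx], y[idx], w=25)))

    p_value = (np.sum(np.array(perm_drops) >= obs_drop) + 1) / (len(perm_drops) + 1)
    print(f"variance drop {obs_drop:.2f}, permutation p = {p_value:.3f}")
    ```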

  16. Multivariate Meta-Analysis of Genetic Association Studies: A Simulation Study

    PubMed Central

    Neupane, Binod; Beyene, Joseph

    2015-01-01

    In a meta-analysis with multiple end points of interest that are correlated between or within studies, a multivariate approach to meta-analysis has the potential to produce more precise estimates of effects by exploiting the correlation structure between end points. However, under the random-effects assumption, multivariate estimation is more complex (as it involves estimating more parameters simultaneously) than univariate estimation and can sometimes produce unrealistic parameter estimates. The usefulness of the multivariate approach to meta-analysis of the effects of a genetic variant on two or more correlated traits is not well understood in the area of genetic association studies. In such studies, genetic variants are expected to roughly maintain Hardy-Weinberg equilibrium within studies, and their effects on complex traits are generally very small to modest and could be heterogeneous across studies for genuine reasons. We carried out an extensive simulation to explore the comparative performance of the multivariate approach with the most commonly used univariate inverse-variance weighted approach under the random-effects assumption in various realistic meta-analytic scenarios of genetic association studies of correlated end points. We evaluated the performance with respect to relative mean bias percentage, root mean square error (RMSE) of the estimate, and coverage probability of the corresponding 95% confidence interval of the effect for each end point. Our simulation results suggest that the multivariate approach performs similarly to or better than the univariate method when correlations between end points within or between studies are at least moderate and when between-study variation is similar to or larger than the average within-study variation, for meta-analyses of 10 or more genetic studies. The multivariate approach produces estimates with smaller bias and RMSE, especially for the end point that has randomly or informatively missing summary data in some individual studies, when the missing data in that end point are imputed with null effects and quite large variance. PMID:26196398
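
    For reference, the univariate comparator mentioned above, the inverse-variance weighted random-effects (DerSimonian-Laird) estimator applied to a single end point, is easy to sketch; the per-study effects and variances below are made up, and the multivariate model itself is not implemented here.

    ```python
    import numpy as np

    def dersimonian_laird(y, v):
        """Univariate inverse-variance random-effects meta-analysis (DL estimator)."""
        w = 1.0 / v
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)
        tau2 = max(0.0, (Q - (len(y) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
        w_star = 1.0 / (v + tau2)
        est = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return est, se, tau2

    # per-study log odds ratios and their variances for one end point (made-up numbers)
    y = np.array([0.10, 0.25, -0.05, 0.18, 0.30, 0.12, 0.08, 0.22, 0.15, 0.05])
    v = np.array([0.02, 0.05, 0.04, 0.03, 0.06, 0.02, 0.05, 0.04, 0.03, 0.02])

    est, se, tau2 = dersimonian_laird(y, v)
    print(f"pooled effect {est:.3f} (95% CI {est - 1.96 * se:.3f} to {est + 1.96 * se:.3f}), tau^2 = {tau2:.4f}")
    ```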

  17. Multivariate Meta-Analysis of Genetic Association Studies: A Simulation Study.

    PubMed

    Neupane, Binod; Beyene, Joseph

    2015-01-01

    In a meta-analysis with multiple end points of interest that are correlated between or within studies, a multivariate approach to meta-analysis has the potential to produce more precise estimates of effects by exploiting the correlation structure between end points. However, under the random-effects assumption, multivariate estimation is more complex (as it involves estimating more parameters simultaneously) than univariate estimation and can sometimes produce unrealistic parameter estimates. The usefulness of the multivariate approach to meta-analysis of the effects of a genetic variant on two or more correlated traits is not well understood in the area of genetic association studies. In such studies, genetic variants are expected to roughly maintain Hardy-Weinberg equilibrium within studies, and their effects on complex traits are generally very small to modest and could be heterogeneous across studies for genuine reasons. We carried out an extensive simulation to explore the comparative performance of the multivariate approach with the most commonly used univariate inverse-variance weighted approach under the random-effects assumption in various realistic meta-analytic scenarios of genetic association studies of correlated end points. We evaluated the performance with respect to relative mean bias percentage, root mean square error (RMSE) of the estimate, and coverage probability of the corresponding 95% confidence interval of the effect for each end point. Our simulation results suggest that the multivariate approach performs similarly to or better than the univariate method when correlations between end points within or between studies are at least moderate and when between-study variation is similar to or larger than the average within-study variation, for meta-analyses of 10 or more genetic studies. The multivariate approach produces estimates with smaller bias and RMSE, especially for the end point that has randomly or informatively missing summary data in some individual studies, when the missing data in that end point are imputed with null effects and quite large variance.

  18. Evaluation of in-line Raman data for end-point determination of a coating process: Comparison of Science-Based Calibration, PLS-regression and univariate data analysis.

    PubMed

    Barimani, Shirin; Kleinebudde, Peter

    2017-10-01

    A multivariate analysis method, Science-Based Calibration (SBC), was used for the first time for endpoint determination of a tablet coating process using Raman data. Two types of tablet cores, placebo and caffeine cores, received a coating suspension comprising a polyvinyl alcohol-polyethylene glycol graft-copolymer and titanium dioxide to a maximum coating thickness of 80 µm. Raman spectroscopy was used as in-line PAT tool. The spectra were acquired every minute and correlated to the amount of applied aqueous coating suspension. SBC was compared to another well-known multivariate analysis method, Partial Least Squares-regression (PLS) and a simpler approach, Univariate Data Analysis (UVDA). All developed calibration models had coefficient of determination values (R²) higher than 0.99. The coating endpoints could be predicted with root mean square errors (RMSEP) less than 3.1% of the applied coating suspensions. Compared to PLS and UVDA, SBC proved to be an alternative multivariate calibration method with high predictive power. Copyright © 2017 Elsevier B.V. All rights reserved.
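
    Of the three calibration methods compared above, PLS regression is the most widely available; the sketch below fits PLS on synthetic "spectra" whose response is the amount of coating suspension applied, and reports RMSEP on held-out samples (SBC and UVDA are not implemented, and all data are simulated stand-ins).

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)

    # synthetic in-line spectra: one "coating" band grows with the applied amount
    n_spectra, n_wavenumbers = 120, 300
    applied = np.linspace(0, 100, n_spectra)                    # % of coating suspension applied
    band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 10.0) ** 2)
    X = np.outer(applied, band) + rng.standard_normal((n_spectra, n_wavenumbers)) * 2.0

    X_tr, X_te, y_tr, y_te = train_test_split(X, applied, test_size=0.3, random_state=0)
    pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
    y_hat = pls.predict(X_te).ravel()

    rmsep = np.sqrt(np.mean((y_hat - y_te) ** 2))
    print(f"RMSEP: {rmsep:.2f} % of applied suspension")
    ```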

  19. Multivariate survivorship analysis using two cross-sectional samples.

    PubMed

    Hill, M E

    1999-11-01

    As an alternative to survival analysis with longitudinal data, I introduce a method that can be applied when one observes the same cohort in two cross-sectional samples collected at different points in time. The method allows for the estimation of log-probability survivorship models that estimate the influence of multiple time-invariant factors on survival over a time interval separating two samples. This approach can be used whenever the survival process can be adequately conceptualized as an irreversible single-decrement process (e.g., mortality, the transition to first marriage among a cohort of never-married individuals). Using data from the Integrated Public Use Microdata Series (Ruggles and Sobek 1997), I illustrate the multivariate method through an investigation of the effects of race, parity, and educational attainment on the survival of older women in the United States.

  20. Does Learning to Read Improve Intelligence? A Longitudinal Multivariate Analysis in Identical Twins from Age 7 to 16

    ERIC Educational Resources Information Center

    Ritchie, Stuart J.; Bates, Timothy C.; Plomin, Robert

    2015-01-01

    Evidence from twin studies points to substantial environmental influences on intelligence, but the specifics of this influence are unclear. This study examined one developmental process that potentially causes intelligence differences: learning to read. In 1,890 twin pairs tested at 7, 9, 10, 12, and 16 years, a cross-lagged…

  1. Computational Approaches to Image Understanding.

    DTIC Science & Technology

    1981-10-01

    representing points, edges, surfaces, and volumes to facilitate display. The geometry of perspective and parallel (or orthographic) projection has...of making the image forming process explicit. This in turn leads to a concern with geometry, such as the properties of the gradient, stereographic, and...dual spaces. Combining geometry and smoothness leads naturally to multi-variate vector analysis, and to differential geometry. For the most part, a

  2. Macroscopically constrained Wang-Landau method for systems with multiple order parameters and its application to drawing complex phase diagrams

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Brown, G.; Rikvold, P. A.

    2017-05-01

    A generalized approach to Wang-Landau simulations, macroscopically constrained Wang-Landau, is proposed to simulate the density of states of a system with multiple macroscopic order parameters. The method breaks a multidimensional random-walk process in phase space into many separate, one-dimensional random-walk processes in well-defined subspaces. Each of these random walks is constrained to a different set of values of the macroscopic order parameters. When the multivariable density of states is obtained for one set of values of fieldlike model parameters, the density of states for any other values of these parameters can be obtained by a simple transformation of the total system energy. All thermodynamic quantities of the system can then be rapidly calculated at any point in the phase diagram. We demonstrate how to use the multivariable density of states to draw the phase diagram, as well as order-parameter probability distributions at specific phase points, for a model spin-crossover material: an antiferromagnetic Ising model with ferromagnetic long-range interactions. The fieldlike parameters in this model are an effective magnetic field and the strength of the long-range interaction.

  3. Spatial land-use inventory, modeling, and projection/Denver metropolitan area, with inputs from existing maps, airphotos, and LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Tom, C.; Miller, L. D.; Christenson, J. W.

    1978-01-01

    A landscape model was constructed with 34 land-use, physiographic, socioeconomic, and transportation maps. A simple Markov land-use trend model was constructed from observed rates of change and nonchange from photointerpreted 1963 and 1970 airphotos. Seven multivariate land-use projection models predicting 1970 spatial land-use changes achieved accuracies from 42 to 57 percent. A final modeling strategy was designed, which combines both Markov trend and multivariate spatial projection processes. Landsat-1 image preprocessing included geometric rectification/resampling, spectral-band, and band/insolation ratioing operations. A new, systematic grid-sampled point training-set approach proved to be useful when tested on the four original MSS bands, ten image bands and ratios, and all 48 image and map variables (less land use). Ten-variable accuracy was raised by over 15 percentage points, from 38.4 to 53.9 percent, with the use of the 31 ancillary variables. A land-use classification map was produced with an optimal ten-channel subset of four image bands and six ancillary map variables. Point-by-point verification of 331,776 points against a 1972/1973 U.S. Geological Survey (USGS) land-use map prepared with airphotos and the same classification scheme showed average first-, second-, and third-order accuracies of 76.3, 58.4, and 33.0 percent, respectively.
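
    The Markov land-use trend component mentioned above can be sketched as repeated application of a transition matrix, estimated between two observation dates, to the current class proportions; the classes, matrix entries and starting proportions below are illustrative, not the study's values.

    ```python
    import numpy as np

    classes = ["urban", "agriculture", "forest", "other"]
    # P[i, j] = probability that a cell in class i at time t is in class j one step later
    # (7-year step, illustrative values, rows sum to 1)
    P = np.array([[0.96, 0.01, 0.01, 0.02],
                  [0.06, 0.90, 0.02, 0.02],
                  [0.03, 0.02, 0.93, 0.02],
                  [0.05, 0.03, 0.02, 0.90]])

    state = np.array([0.20, 0.35, 0.35, 0.10])      # 1970 proportions (assumed)
    for step in range(1, 4):                        # project three 7-year steps ahead
        state = state @ P
        print(1970 + 7 * step, dict(zip(classes, state.round(3))))
    ```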

  4. Neural network-based nonlinear model predictive control vs. linear quadratic gaussian control

    USGS Publications Warehouse

    Cho, C.; Vance, R.; Mardi, N.; Qian, Z.; Prisbrey, K.

    1997-01-01

    One problem with the application of neural networks to the multivariable control of mineral and extractive processes is determining whether and how to use them. The objective of this investigation was to compare neural network control to more conventional strategies and to determine if there are any advantages in using neural network control in terms of set-point tracking, rise time, settling time, disturbance rejection and other criteria. The procedure involved developing neural network controllers using both historical plant data and simulation models. Various control patterns were tried, including both inverse and direct neural network plant models. These were compared to state space controllers that are, by nature, linear. For grinding and leaching circuits, a nonlinear neural network-based model predictive control strategy was superior to a state space-based linear quadratic Gaussian controller. The investigation pointed out the importance of incorporating state space into neural networks by making them recurrent, i.e., feeding certain output state variables into input nodes in the neural network. It was concluded that neural network controllers can have better disturbance rejection, set-point tracking, rise time, settling time and lower set-point overshoot, and that they can be more reliable and easier to implement in complex, multivariable plants.

  5. Viewpoints: A New Computer Program for Interactive Exploration of Large Multivariate Space Science and Astrophysics Data.

    NASA Astrophysics Data System (ADS)

    Levit, Creon; Gazis, P.

    2006-06-01

    The graphics processing units (GPUs) built in to all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform (Windows, Linux, Apple OS X) application which leverages some of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application area is the interactive analysis of complex, multivariate space science and astrophysics data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10^6-10^8.

  6. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    PubMed

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, being a twin screw high shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations to the process continuity was also evaluated using Hotelling's T² and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
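
    A minimal sketch of the monitoring statistics named above: fit a PCA model on normal-operating-condition (NOC) data, then compute Hotelling's T² (variation within the model plane) and the Q residual (variation outside it) for new observations, flagging points above empirical NOC limits. The data, limits and imposed disturbance are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(6)

    n_noc, n_vars, n_pc = 400, 10, 3
    latent = rng.standard_normal((n_noc, n_pc))
    loadings = rng.standard_normal((n_pc, n_vars))
    X_noc = latent @ loadings + rng.standard_normal((n_noc, n_vars)) * 0.3   # NOC data

    scaler = StandardScaler().fit(X_noc)
    pca = PCA(n_components=n_pc).fit(scaler.transform(X_noc))

    def t2_and_q(X):
        Z = scaler.transform(X)
        scores = pca.transform(Z)
        t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)   # Hotelling's T^2
        resid = Z - pca.inverse_transform(scores)
        q = np.sum(resid ** 2, axis=1)                               # Q / SPE statistic
        return t2, q

    t2_noc, q_noc = t2_and_q(X_noc)
    t2_lim, q_lim = np.percentile(t2_noc, 99), np.percentile(q_noc, 99)   # empirical 99% limits

    X_new = X_noc[-5:].copy()
    X_new[3:, :2] += 3.0            # impose a disturbance on two variables for the last two samples
    t2_new, q_new = t2_and_q(X_new)
    print("alarms:", (t2_new > t2_lim) | (q_new > q_lim))
    ```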

  7. A Comparison of the Influences of Verbal-Successive and Spatial-Simultaneous Factors on Achieving Readers in Fourth and Fifth Grade: A Multivariate Correlational Study.

    ERIC Educational Resources Information Center

    Solan, Harold A.

    1987-01-01

    This study involving 38 normally achieving fourth and fifth grade children confirmed previous studies indicating that both spatial-simultaneous (in which perceived stimuli are totally available at one point in time) and verbal-successive (information is presented in serial order) cognitive processing are important in normal learning. (DB)

  8. Stratification of Recanalization for Patients with Endovascular Treatment of Intracranial Aneurysms

    PubMed Central

    Ogilvy, Christopher S.; Chua, Michelle H.; Fusco, Matthew R.; Reddy, Arra S.; Thomas, Ajith J.

    2015-01-01

    Background: With increasing utilization of endovascular techniques in the treatment of both ruptured and unruptured intracranial aneurysms, the issue of obliteration efficacy has become increasingly important. Objective: Our goal was to systematically develop a comprehensive model for predicting retreatment with various types of endovascular treatment. Methods: We retrospectively reviewed medical records that were prospectively collected for 305 patients who received endovascular treatment for intracranial aneurysms from 2007 to 2013. Multivariable logistic regression was performed on candidate predictors identified by univariable screening analysis to detect independent predictors of retreatment. A composite risk score was constructed based on the proportional contribution of independent predictors in the multivariable model. Results: Size (>10 mm), aneurysm rupture, stent assistance, and post-treatment degree of aneurysm occlusion were independently associated with retreatment while intraluminal thrombosis and flow diversion demonstrated a trend towards retreatment. The Aneurysm Recanalization Stratification Scale was constructed by assigning the following weights to statistically and clinically significant predictors. Aneurysm-specific factors: Size (>10 mm), 2 points; rupture, 2 points; presence of thrombus, 2 points. Treatment-related factors: Stent assistance, -1 point; flow diversion, -2 points; Raymond Roy 2 occlusion, 1 point; Raymond Roy 3 occlusion, 2 points. This scale demonstrated good discrimination with a C-statistic of 0.799. Conclusion: Surgical decision-making and patient-centered informed consent require comprehensive and accessible information on treatment efficacy. We have constructed the Aneurysm Recanalization Stratification Scale to enhance this decision-making process. This is the first comprehensive model that has been developed to quantitatively predict the risk of retreatment following endovascular therapy. PMID:25621984
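
    The published point values above translate directly into a small scoring function, sketched below (the 0-point value assumed for Raymond Roy class 1 occlusion is an inference, since the abstract lists points only for classes 2 and 3).

    ```python
    def recanalization_score(size_gt_10mm, ruptured, thrombus,
                             stent_assisted, flow_diversion, raymond_roy):
        """Aneurysm Recanalization Stratification Scale, points as listed in the abstract."""
        score = 0
        score += 2 if size_gt_10mm else 0
        score += 2 if ruptured else 0
        score += 2 if thrombus else 0
        score += -1 if stent_assisted else 0
        score += -2 if flow_diversion else 0
        score += {1: 0, 2: 1, 3: 2}[raymond_roy]     # post-treatment occlusion grade (class 1 assumed 0)
        return score

    # example: ruptured >10 mm aneurysm, stent-assisted coiling, Raymond Roy class 2 occlusion
    print(recanalization_score(True, True, False, True, False, 2))   # -> 4
    ```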

  9. Multivariate optical computing using a digital micromirror device for fluorescence and Raman spectroscopy.

    PubMed

    Smith, Zachary J; Strombom, Sven; Wachsmann-Hogiu, Sebastian

    2011-08-29

    A multivariate optical computer has been constructed consisting of a spectrograph, digital micromirror device, and photomultiplier tube that is capable of determining absolute concentrations of individual components of a multivariate spectral model. We present experimental results on ternary mixtures, showing accurate quantification of chemical concentrations based on integrated intensities of fluorescence and Raman spectra measured with a single point detector. We additionally show in simulation that point measurements based on principal component spectra retain the ability to classify cancerous from noncancerous T cells.

  10. Heuristic-driven graph wavelet modeling of complex terrain

    NASA Astrophysics Data System (ADS)

    Cioacǎ, Teodor; Dumitrescu, Bogdan; Stupariu, Mihai-Sorin; Pǎtru-Stupariu, Ileana; Nǎpǎrus, Magdalena; Stoicescu, Ioana; Peringer, Alexander; Buttler, Alexandre; Golay, François

    2015-03-01

    We present a novel method for building a multi-resolution representation of large digital surface models. The surface points coincide with the nodes of a planar graph which can be processed using a critically sampled, invertible lifting scheme. To drive the lazy wavelet node partitioning, we employ an attribute aware cost function based on the generalized quadric error metric. The resulting algorithm can be applied to multivariate data by storing additional attributes at the graph's nodes. We discuss how the cost computation mechanism can be coupled with the lifting scheme and examine the results by evaluating the root mean square error. The algorithm is experimentally tested using two multivariate LiDAR sets representing terrain surface and vegetation structure with different sampling densities.

  11. A Unified Point Process Probabilistic Framework to Assess Heartbeat Dynamics and Autonomic Cardiovascular Control

    PubMed Central

    Chen, Zhe; Purdon, Patrick L.; Brown, Emery N.; Barbieri, Riccardo

    2012-01-01

    In recent years, time-varying inhomogeneous point process models have been introduced for assessment of instantaneous heartbeat dynamics as well as specific cardiovascular control mechanisms and hemodynamics. Assessment of the model’s statistics is established through the Wiener-Volterra theory and a multivariate autoregressive (AR) structure. A variety of instantaneous cardiovascular metrics, such as heart rate (HR), heart rate variability (HRV), respiratory sinus arrhythmia (RSA), and baroreceptor-cardiac reflex (baroreflex) sensitivity (BRS), are derived within a parametric framework and instantaneously updated with adaptive and local maximum likelihood estimation algorithms. Inclusion of second-order non-linearities, with subsequent bispectral quantification in the frequency domain, further allows for definition of instantaneous metrics of non-linearity. We here present a comprehensive review of the devised methods as applied to experimental recordings from healthy subjects during propofol anesthesia. Collective results reveal interesting dynamic trends across the different pharmacological interventions operated within each anesthesia session, confirming the ability of the algorithm to track important changes in cardiorespiratory elicited interactions, and pointing at our mathematical approach as a promising monitoring tool for an accurate, non-invasive assessment in clinical practice. We also discuss the limitations and other alternative modeling strategies of our point process approach. PMID:22375120

  12. Multivariate meta-analysis of prognostic factor studies with multiple cut-points and/or methods of measurement.

    PubMed

    Riley, Richard D; Elia, Eleni G; Malin, Gemma; Hemming, Karla; Price, Malcolm P

    2015-07-30

    A prognostic factor is any measure that is associated with the risk of future health outcomes in those with existing disease. Often, the prognostic ability of a factor is evaluated in multiple studies. However, meta-analysis is difficult because primary studies often use different methods of measurement and/or different cut-points to dichotomise continuous factors into 'high' and 'low' groups; selective reporting is also common. We illustrate how multivariate random effects meta-analysis models can accommodate multiple prognostic effect estimates from the same study, relating to multiple cut-points and/or methods of measurement. The models account for within-study and between-study correlations, which utilises more information and reduces the impact of unreported cut-points and/or measurement methods in some studies. The applicability of the approach is improved with individual participant data and by assuming a functional relationship between prognostic effect and cut-point to reduce the number of unknown parameters. The models provide important inferential results for each cut-point and method of measurement, including the summary prognostic effect, the between-study variance and a 95% prediction interval for the prognostic effect in new populations. Two applications are presented. The first reveals that, in a multivariate meta-analysis using published results, the Apgar score is prognostic of neonatal mortality but effect sizes are smaller at most cut-points than previously thought. In the second, a multivariate meta-analysis of two methods of measurement provides weak evidence that microvessel density is prognostic of mortality in lung cancer, even when individual participant data are available so that a continuous prognostic trend is examined (rather than cut-points). © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  13. Applying Multivariate Adaptive Splines to Identify Genes With Expressions Varying After Diagnosis in Microarray Experiments.

    PubMed

    Duan, Fenghai; Xu, Ye

    2017-01-01

    The aim was to analyze a microarray experiment to identify genes whose expression varies after the diagnosis of breast cancer. A total of 44 928 probe sets in an Affymetrix microarray data set publicly available on Gene Expression Omnibus from 249 patients with breast cancer were analyzed by nonparametric multivariate adaptive splines. The identified genes with turning points were then grouped by K-means clustering, and their network relationships were subsequently analyzed by Ingenuity Pathway Analysis. In total, 1640 probe sets (genes) were reliably identified as having turning points along the age at diagnosis in their expression profiles, of which 927 were expressed lower after the turning points and 713 were expressed higher after the turning points. K-means clustering grouped them into 3 groups with turning points centering at 54, 62.5, and 72, respectively. The pathway analysis showed that the identified genes were actively involved in various cancer-related functions or networks. In this article, we applied the nonparametric multivariate adaptive splines method to publicly available gene expression data and successfully identified genes with expression varying before and after breast cancer diagnosis.

  14. A Novel Health Evaluation Strategy for Multifunctional Self-Validating Sensors

    PubMed Central

    Shen, Zhengguang; Wang, Qi

    2013-01-01

    The performance evaluation of sensors is very important in actual application. In this paper, a theory based on multi-variable information fusion is studied to evaluate the health level of multifunctional sensors. A novel conception of health reliability degree (HRD) is defined to indicate a quantitative health level, which is different from traditional so-called qualitative fault diagnosis. To evaluate the health condition from both local and global perspectives, the HRD of a single sensitive component at multiple time points and the overall multifunctional sensor at a single time point are defined, respectively. The HRD methodology is emphasized by using multi-variable data fusion technology coupled with a grey comprehensive evaluation method. In this method, to acquire the distinct importance of each sensitive unit and the sensitivity of different time points, the information entropy and analytic hierarchy process method are used, respectively. In order to verify the feasibility of the proposed strategy, a health evaluating experimental system for multifunctional self-validating sensors was designed. The five different health level situations have been discussed. Successful results show that the proposed method is feasible, the HRD could be used to quantitatively indicate the health level and it does have a fast response to the performance changes of multifunctional sensors. PMID:23291576

  15. Multivariate spatiotemporal visualizations for mobile devices in Flyover Country

    NASA Astrophysics Data System (ADS)

    Loeffler, S.; Thorn, R.; Myrbo, A.; Roth, R.; Goring, S. J.; Williams, J.

    2017-12-01

    Visualizing and interacting with complex multivariate and spatiotemporal datasets on mobile devices is challenging due to their smaller screens, reduced processing power, and limited data connectivity. Pollen data require visualizing pollen assemblages spatially, temporally, and across multiple taxa to understand plant community dynamics through time. Drawing from cartography, information visualization, and paleoecology, we have created new mobile-first visualization techniques that represent multiple taxa across many sites and enable user interaction. Using pollen datasets from the Neotoma Paleoecology Database as a case study, the visualization techniques allow ecological patterns and trends to be quickly understood on a mobile device compared to traditional pollen diagrams and maps. This flexible visualization system can be used for datasets beyond pollen, with the only requirements being point-based localities and multiple variables changing through time or depth.

  16. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    PubMed

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach for identifying sources of variability within a manufacturing process: NIR measurements of samples of intermediate material taken after each consecutive unit operation (interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. Identification of critical process points using provisional NIR calibration models yields results comparable to the established MSPC modeling procedure. Both approaches lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases when historical process data are not straightforwardly available. In the presented case, changes in lactose characteristics influence the performance of the extrusion/spheronization process step. The pellet cores produced using one (considered less suitable) lactose source were on average larger and more fragile, leading to breakage of the cores during subsequent fluid bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process, leading to compromised film coating.

  17. A Multivariate Magnitude Robust Control Chart for Mean Shift Detection and Change Point Estimation

    DTIC Science & Technology

    2007-03-01

    data like scratches on a desk to the width of lumber at a mill and intangible data like heart rate and microprocessor switching frequency. These...of historical data. The preferred method involves a series of designed experiments to understand what factors affect the process, and their...outcome. Since the QE is involved in every step of data collection, he/she receives exactly the data required, and possesses in-depth knowledge of the

  18. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    ERIC Educational Resources Information Center

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  19. Normalization methods in time series of platelet function assays

    PubMed Central

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

    Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented as multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suited approach for platelet function data series is discussed. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation, as calculated by the Spearman correlation test, when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be abstracted from the charts, as was the case when using all data as one dataset for normalization. PMID:27428217
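
    The four normalizations discussed above are easy to state; the sketch below applies each of them per assay across all time points, the direction the abstract identifies as preserving the correlations. The example series is made up.

    ```python
    import numpy as np

    def z_transform(x):       return (x - x.mean()) / x.std(ddof=1)
    def range_transform(x):   return (x - x.min()) / (x.max() - x.min())
    def proportion(x):        return x / x.sum()
    def iqr_transform(x):     return (x - np.median(x)) / (np.percentile(x, 75) - np.percentile(x, 25))

    # rows = assays (e.g. ADP, collagen, a ROTEM parameter), columns = time points
    series = np.array([[62., 55., 40., 38., 45.],
                       [48., 44., 30., 28., 35.],
                       [15., 14., 11., 10., 12.]])

    for name, f in [("z", z_transform), ("range", range_transform),
                    ("proportion", proportion), ("IQR", iqr_transform)]:
        normalized = np.apply_along_axis(f, 1, series)   # per assay, across all time points
        print(name, normalized.round(2))
    ```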

  20. Matching pollution with adaptive changes in mangrove plants by multivariate statistics. A case study, Rhizophora mangle from four neotropical mangroves in Brazil.

    PubMed

    Souza, Iara da Costa; Morozesk, Mariana; Duarte, Ian Drumond; Bonomo, Marina Marques; Rocha, Lívia Dorsch; Furlan, Larissa Maria; Arrivabene, Hiulana Pereira; Monferrán, Magdalena Victoria; Matsumoto, Silvia Tamie; Milanez, Camilla Rozindo Dias; Wunderlin, Daniel Alberto; Fernandes, Marisa Narciso

    2014-08-01

    Roots of mangrove trees have an important role in depurating water and sediments by retaining metals that may accumulate in different plant tissues, affecting physiological processes and anatomy. The present study aimed to evaluate adaptive changes in root of Rhizophora mangle in response to different levels of chemical elements (metals/metalloids) in interstitial water and sediments from four neotropical mangroves in Brazil. What sets this study apart from other studies is that we not only investigate adaptive modifications in R. mangle but also changes in environments where this plant grows, evaluating correspondence between physical, chemical and biological issues by a combined set of multivariate statistical methods (pattern recognition). Thus, we looked to match changes in the environment with adaptations in plants. Multivariate statistics highlighted that the lignified periderm and the air gaps are directly related to the environmental contamination. Current results provide new evidences of root anatomical strategies to deal with contaminated environments. Multivariate statistics greatly contributes to extrapolate results from complex data matrixes obtained when analyzing environmental issues, pointing out parameters involved in environmental changes and also evidencing the adaptive response of the exposed biota. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in the model and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multivariate post-processing method for historical simulations and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is addressed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
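
    A very rough sketch of the copula idea behind the post-processing step (not the authors' Bayesian implementation): paired historical simulations and observations are mapped to normal scores, a Gaussian-copula correlation is fitted, and a new simulated value is corrected by taking the conditional median in normal-score space and mapping it back through the observed quantiles. All data below are synthetic.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # paired historical monthly flows: a biased, noisy simulation vs. observation
    obs = rng.gamma(2.0, 50.0, size=600)
    sim = 0.7 * obs + rng.normal(0, 20, size=600) + 30.0

    def normal_scores(x, ref):
        """Empirical-CDF transform of x relative to sample ref, followed by a probit."""
        ranks = np.searchsorted(np.sort(ref), x, side="right") / (len(ref) + 1.0)
        return stats.norm.ppf(np.clip(ranks, 1e-6, 1 - 1e-6))

    z_obs, z_sim = normal_scores(obs, obs), normal_scores(sim, sim)
    rho = np.corrcoef(z_obs, z_sim)[0, 1]            # Gaussian-copula dependence

    def post_process(new_sim):
        z = normal_scores(np.array([new_sim]), sim)[0]
        z_cond = rho * z                             # conditional median of z_obs given z_sim
        return float(np.quantile(obs, stats.norm.cdf(z_cond)))

    print("raw sim 150 -> corrected %.1f" % post_process(150.0))
    ```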

  2. The neural basis of visual word form processing: a multivariate investigation.

    PubMed

    Nestor, Adrian; Behrmann, Marlene; Plaut, David C

    2013-07-01

    Current research on the neurobiological bases of reading points to the privileged role of a ventral cortical network in visual word processing. However, the properties of this network and, in particular, its selectivity for orthographic stimuli such as words and pseudowords remain topics of significant debate. Here, we approached this issue from a novel perspective by applying pattern-based analyses to functional magnetic resonance imaging data. Specifically, we examined whether, where and how, orthographic stimuli elicit distinct patterns of activation in the human cortex. First, at the category level, multivariate mapping found extensive sensitivity throughout the ventral cortex for words relative to false-font strings. Secondly, at the identity level, the multi-voxel pattern classification provided direct evidence that different pseudowords are encoded by distinct neural patterns. Thirdly, a comparison of pseudoword and face identification revealed that both stimulus types exploit common neural resources within the ventral cortical network. These results provide novel evidence regarding the involvement of the left ventral cortex in orthographic stimulus processing and shed light on its selectivity and discriminability profile. In particular, our findings support the existence of sublexical orthographic representations within the left ventral cortex while arguing for the continuity of reading with other visual recognition skills.

  3. Design and Assessment of Online, Interactive Tutorials That Teach Science Process Skills.

    PubMed

    Kramer, Maxwell; Olson, Dalay; Walker, J D

    2018-06-01

    Explicit emphasis on teaching science process skills leads to both gains in the skills themselves and, strikingly, deeper understanding of content. Here, we created and tested a series of online, interactive tutorials with the goal of helping undergraduate students develop science process skills. We designed the tutorials in accordance with evidence-based multimedia design principles and student feedback from usability testing. We then tested the efficacy of the tutorials in an introductory undergraduate biology class. On the basis of a multivariate ordinary least-squares regression model, students who received the tutorials are predicted to score 0.82 points higher on a 15-point science process skill assessment than their peers who received traditional textbook instruction on the same topic. This moderate but significant impact indicates that well-designed online tutorials can be more effective than traditional ways of teaching science process skills to undergraduate students. We also found trends that suggest the tutorials are especially effective for nonnative English-speaking students. However, due to a limited sample size, we were unable to confirm that these trends occurred due to more than just variation in the student group sampled.

  4. A statistical approach for segregating cognitive task stages from multivariate fMRI BOLD time series.

    PubMed

    Demanuele, Charmaine; Bähner, Florian; Plichta, Michael M; Kirsch, Peter; Tost, Heike; Meyer-Lindenberg, Andreas; Durstewitz, Daniel

    2015-01-01

    Multivariate pattern analysis can reveal new information from neuroimaging data to illuminate human cognition and its disturbances. Here, we develop a methodological approach, based on multivariate statistical/machine learning and time series analysis, to discern cognitive processing stages from functional magnetic resonance imaging (fMRI) blood oxygenation level dependent (BOLD) time series. We apply this method to data recorded from a group of healthy adults whilst performing a virtual reality version of the delayed win-shift radial arm maze (RAM) task. This task has been frequently used to study working memory and decision making in rodents. Using linear classifiers and multivariate test statistics in conjunction with time series bootstraps, we show that different cognitive stages of the task, as defined by the experimenter, namely, the encoding/retrieval, choice, reward and delay stages, can be statistically discriminated from the BOLD time series in brain areas relevant for decision making and working memory. Discrimination of these task stages was significantly reduced during poor behavioral performance in dorsolateral prefrontal cortex (DLPFC), but not in the primary visual cortex (V1). Experimenter-defined dissection of time series into class labels based on task structure was confirmed by an unsupervised, bottom-up approach based on Hidden Markov Models. Furthermore, we show that different groupings of recorded time points into cognitive event classes can be used to test hypotheses about the specific cognitive role of a given brain region during task execution. We found that whilst the DLPFC strongly differentiated between task stages associated with different memory loads but not between different visual-spatial aspects, the reverse was true for V1. Our methodology illustrates how different aspects of cognitive information processing during one and the same task can be separated and attributed to specific brain regions based on information contained in multivariate patterns of voxel activity.
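
    The unsupervised, bottom-up confirmation with Hidden Markov Models can be sketched as follows, assuming the third-party hmmlearn package and a hypothetical time-by-region BOLD matrix; it shows the general idea of recovering task stages as hidden states, not the authors' exact model.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM   # third-party package, assumed installed

        rng = np.random.default_rng(2)
        # Hypothetical BOLD data: 400 time points x 10 regions, with 4 latent stages
        true_states = np.repeat([0, 1, 2, 3], 100)
        stage_means = rng.normal(scale=2.0, size=(4, 10))
        bold = stage_means[true_states] + rng.normal(scale=1.0, size=(400, 10))

        # Fit a 4-state Gaussian HMM and decode the most likely state sequence
        hmm = GaussianHMM(n_components=4, covariance_type="diag", n_iter=200, random_state=0)
        hmm.fit(bold)
        decoded = hmm.predict(bold)

        # Compare decoded segments with the experimenter-defined stage boundaries (100, 200, 300)
        change_points = np.flatnonzero(np.diff(decoded)) + 1
        print("estimated stage boundaries near:", change_points)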

  5. Locating the Seventh Cervical Spinous Process: Development and Validation of a Multivariate Model Using Palpation and Personal Information.

    PubMed

    Ferreira, Ana Paula A; Póvoa, Luciana C; Zanier, José F C; Ferreira, Arthur S

    2017-02-01

    The aim of this study was to develop and validate a multivariate prediction model, guided by palpation and personal information, for locating the seventh cervical spinous process (C7SP). A single-blinded, cross-sectional study at a primary to tertiary health care center was conducted for model development and temporal validation. One-hundred sixty participants were prospectively included for model development (n = 80) and time-split validation stages (n = 80). The C7SP was located using the thorax-rib static method (TRSM). Participants underwent chest radiography for assessment of the inner body structure located with TRSM and using radio-opaque markers placed over the skin. Age, sex, height, body mass, body mass index, and vertex-marker distance (D_V-M) were used to predict the distance from the C7SP to the vertex (D_V-C7). Multivariate linear regression modeling, limits of agreement plot, histogram of residuals, receiver operating characteristic curves, and confusion tables were analyzed. The multivariate linear prediction model for D_V-C7 (in centimeters) was D_V-C7 = 0.986 D_V-M + 0.018(mass) + 0.014(age) - 1.008. Receiver operating characteristic curves had better discrimination of D_V-C7 (area under the curve = 0.661; 95% confidence interval = 0.541-0.782; P = .015) than D_V-M (area under the curve = 0.480; 95% confidence interval = 0.345-0.614; P = .761), with respective cutoff points at 23.40 cm (sensitivity = 41%, specificity = 63%) and 24.75 cm (sensitivity = 69%, specificity = 52%). The C7SP was correctly located more often when using predicted D_V-C7 in the validation sample than when using the TRSM in the development sample: n = 53 (66%) vs n = 32 (40%), P < .001. Better accuracy was obtained when locating the C7SP by use of a multivariate model that incorporates palpation and personal information. Copyright © 2016. Published by Elsevier Inc.
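
    For illustration, the reported prediction equation can be evaluated directly; the small function below computes D_V-C7 = 0.986 D_V-M + 0.018(mass) + 0.014(age) - 1.008, assuming distances in centimeters, body mass in kilograms and age in years.

        def predict_dv_c7(dv_m_cm, mass_kg, age_years):
            """Predicted vertex-to-C7SP distance (cm) from the published multivariate model."""
            return 0.986 * dv_m_cm + 0.018 * mass_kg + 0.014 * age_years - 1.008

        # Hypothetical example: marker placed 23.0 cm from the vertex, 70 kg participant, 45 years old
        print(round(predict_dv_c7(23.0, 70.0, 45.0), 2))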

  6. [Multivariate Adaptive Regression Splines (MARS), an alternative for the analysis of time series].

    PubMed

    Vanegas, Jairo; Vásquez, Fabián

    Multivariate Adaptive Regression Splines (MARS) is a non-parametric modelling method that extends the linear model by incorporating nonlinearities and interactions between variables. It is a flexible tool that automates the construction of predictive models: selecting relevant variables, transforming the predictor variables, handling missing values and preventing overfitting through self-testing. It is also able to predict while taking into account structural factors that might influence the outcome variable, thereby generating hypothetical models. The end result can identify relevant cut-off points in data series. It is rarely used in the health field, so it is proposed here as a tool for the evaluation of relevant public health indicators. For demonstration purposes, data series on the mortality of children under 5 years of age in Costa Rica, covering the period 1978-2008, were used. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.
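
    MARS builds its models from piecewise-linear hinge functions max(0, x - t) and max(0, t - x) placed at data-driven knots. The sketch below fits a toy spline with fixed knots by ordinary least squares using numpy, to illustrate the kind of basis MARS searches over; it is not a full MARS implementation (no adaptive knot search, interactions or pruning).

        import numpy as np

        def hinge(x, knot, direction=+1):
            """MARS-style hinge basis function: max(0, x - knot) or max(0, knot - x)."""
            return np.maximum(0.0, direction * (x - knot))

        rng = np.random.default_rng(3)
        x = np.sort(rng.uniform(0, 10, 200))
        y = np.where(x < 4, 2.0 * x, 8.0 + 0.5 * (x - 4)) + rng.normal(0, 0.5, x.size)

        # Fixed candidate knots for illustration; real MARS chooses knots and terms adaptively
        knots = [3.0, 4.0, 5.0]
        X = np.column_stack([np.ones_like(x)] +
                            [hinge(x, k, +1) for k in knots] +
                            [hinge(x, k, -1) for k in knots])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        y_hat = X @ coef
        print("residual std:", np.std(y - y_hat).round(3))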

  7. A simple rapid approach using coupled multivariate statistical methods, GIS and trajectory models to delineate areas of common oil spill risk

    NASA Astrophysics Data System (ADS)

    Guillen, George; Rainey, Gail; Morin, Michelle

    2004-04-01

    Currently, the Minerals Management Service uses the Oil Spill Risk Analysis model (OSRAM) to predict the movement of potential oil spills greater than 1000 bbl originating from offshore oil and gas facilities. OSRAM generates oil spill trajectories using meteorological and hydrological data input from either actual physical measurements or estimates generated from other hydrological models. OSRAM and many other models produce output matrices of average, maximum and minimum contact probabilities to specific landfall or target segments (columns) from oil spills at specific points (rows). Analysts and managers are often interested in identifying geographic areas or groups of facilities that pose similar risks to specific targets or groups of targets if a spill occurred. Unfortunately, due to the potentially large matrix generated by many spill models, this question is difficult to answer without the use of data reduction and visualization methods. In our study we utilized a multivariate statistical method called cluster analysis to group areas of similar risk based on the potential distribution of landfall target trajectory probabilities. We also utilized ArcView™ GIS to display spill launch point groupings. The combination of GIS and multivariate statistical techniques in the post-processing of trajectory model output is a powerful tool for identifying and delineating areas of similar risk from multiple spill sources. We strongly encourage modelers, statistical and GIS software programmers to closely collaborate to produce a more seamless integration of these technologies and approaches to analyzing data. They are complementary methods that strengthen the overall assessment of spill risks.
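
    A minimal sketch of the post-processing idea: hierarchical cluster analysis applied to the rows (launch points) of a spill-to-target contact probability matrix using SciPy. The matrix below is synthetic and the number of groups is arbitrary; it only illustrates how launch points with similar risk profiles can be grouped.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        rng = np.random.default_rng(4)
        # Hypothetical contact-probability matrix: 30 launch points x 12 coastline targets
        probs = np.vstack([rng.dirichlet(np.ones(12) * a, size=10) for a in (0.3, 1.0, 5.0)])

        # Ward clustering on Euclidean distances between launch-point probability profiles
        Z = linkage(probs, method="ward")
        groups = fcluster(Z, t=3, criterion="maxclust")   # ask for three risk groups

        for g in np.unique(groups):
            print("group", g, "-> launch points:", np.flatnonzero(groups == g))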

  8. Automated information and control complex of hydro-gas endogenous mine processes

    NASA Astrophysics Data System (ADS)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    An automated information and control complex is considered that is designed to prevent accidents related to the aerological situation in underground workings, to account for individual devices issued and returned, to transmit and display measurement data, and to form preemptive solutions. Examples of the automated workplace of an air-gas control operator using individual means are given. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained. The statistical analyses confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas-control points. An adaptive (multivariant) algorithm for processing measurement information on continuous multidimensional quantities and influencing factors has been developed.

  9. The Python Spectral Analysis Tool (PySAT) for Powerful, Flexible, and Easy Preprocessing and Machine Learning with Point Spectral Data

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T.; Morris, R. V.; Laura, J.

    2018-04-01

    The PySAT point spectra tool provides a flexible graphical interface, enabling scientists to apply a wide variety of preprocessing and machine learning methods to point spectral data, with an emphasis on multivariate regression.

  10. The association of 83 plasma proteins with CHD mortality, BMI, HDL-, and total-cholesterol in men: applying multivariate statistics to identify proteins with prognostic value and biological relevance.

    PubMed

    Heidema, A Geert; Thissen, Uwe; Boer, Jolanda M A; Bouwman, Freek G; Feskens, Edith J M; Mariman, Edwin C M

    2009-06-01

    In this study, we applied the multivariate statistical tool Partial Least Squares (PLS) to analyze the relative importance of 83 plasma proteins in relation to coronary heart disease (CHD) mortality and the intermediate end points body mass index, HDL-cholesterol and total cholesterol. From a Dutch monitoring project for cardiovascular disease risk factors, men who died of CHD between initial participation (1987-1991) and end of follow-up (January 1, 2000) (N = 44) and matched controls (N = 44) were selected. Baseline plasma concentrations of proteins were measured by a multiplex immunoassay. With the use of PLS, we identified 15 proteins with prognostic value for CHD mortality and sets of proteins associated with the intermediate end points. Subsequently, sets of proteins and intermediate end points were analyzed together by Principal Components Analysis, indicating that proteins involved in inflammation explained most of the variance, followed by proteins involved in metabolism and proteins associated with total-C. This study is one of the first in which the association of a large number of plasma proteins with CHD mortality and intermediate end points is investigated by applying multivariate statistics, providing insight into the relationships among proteins, intermediate end points and CHD mortality, and a set of proteins with prognostic value.
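
    A sketch of the core analysis with scikit-learn: PLS relating a protein matrix to case-control status, followed by ranking proteins by the magnitude of their weights on the first component. Data and dimensions are synthetic placeholders standing in for the study's 83 proteins and 88 men.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.preprocessing import scale

        rng = np.random.default_rng(5)
        n_men, n_proteins = 88, 83
        proteins = rng.normal(size=(n_men, n_proteins))      # hypothetical plasma protein levels
        chd_death = np.repeat([1, 0], n_men // 2)            # 44 cases, 44 matched controls
        proteins[chd_death == 1, :5] += 0.6                  # a few informative proteins

        pls = PLSRegression(n_components=2)
        pls.fit(scale(proteins), chd_death)

        # Rank proteins by the absolute weight on the first PLS component
        ranking = np.argsort(-np.abs(pls.x_weights_[:, 0]))
        print("top candidate proteins (column indices):", ranking[:10])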

  11. Comparative study of different approaches for multivariate image analysis in HPTLC fingerprinting of natural products such as plant resin.

    PubMed

    Ristivojević, Petar; Trifković, Jelena; Vovk, Irena; Milojković-Opsenica, Dušanka

    2017-01-01

    With the introduction of phytochemical fingerprint analysis as a method of screening complex natural products for the most bioactive compounds, the use of chemometric classification methods, the application of powerful scanning, image capture and processing devices and algorithms, and advances in the development of novel stationary phases and separation modalities, high-performance thin-layer chromatography (HPTLC) fingerprinting is becoming an attractive and fruitful field of separation science. Multivariate image analysis is crucial in the light of proper data acquisition. In the current study, different image processing procedures were studied and compared in detail on the example of HPTLC chromatograms of plant resins. In that sense, the obtained variables, such as gray intensities of pixels along the solvent front, peak areas and mean peak values, were used as input data and compared in order to obtain the best classification models. Important steps in image analysis (baseline removal, denoising, target peak alignment and normalization) were pointed out. The numerical data set based on mean values of selected bands and intensities of pixels along the solvent front proved to be the most convenient for planar-chromatographic profiling, although it requires at least basic knowledge of image processing methodology, and can be proposed for further investigation in HPTLC fingerprinting. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, the time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of a pool of many possible models and consequently combines the advantages of population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach for supporting personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
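
    The switching logic itself can be sketched in a few lines: keep a pool of candidate forecasters (here, two toy models standing in for a population model and a patient-specific model) and, at each step, issue the prediction of whichever model had the lowest recent error. This only illustrates the idea of adaptive model switching, not the authors' framework.

        import numpy as np

        def switching_forecast(y, models, window=5):
            """At each step use the candidate model with the lowest mean absolute error
            over the previous `window` steps; fall back to the first model early on."""
            preds = np.array([m(y) for m in models])   # each model maps the series to one-step-ahead predictions
            errors = np.abs(preds - y)
            out, chosen = np.empty(len(y)), np.empty(len(y), dtype=int)
            for t in range(len(y)):
                best = 0 if t < window else int(np.argmin(errors[:, t - window:t].mean(axis=1)))
                chosen[t] = best
                out[t] = preds[best, t]
            return out, chosen

        # Toy candidates: an expanding "population" mean of past values and a naive "patient-specific" model
        population_model = lambda y: np.concatenate(([y[0]], np.cumsum(y)[:-1] / np.arange(1, len(y))))
        patient_model = lambda y: np.concatenate(([y[0]], y[:-1]))   # last observed value

        rng = np.random.default_rng(6)
        y = np.sin(np.linspace(0, 6, 60)) + rng.normal(0, 0.1, 60)
        forecasts, which = switching_forecast(y, [population_model, patient_model])
        print("fraction of steps served by the patient-specific model:", (which == 1).mean())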

  13. Quality by design case study: an integrated multivariate approach to drug product and process development.

    PubMed

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate in-depth process understanding, and to offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate the effects of the design factors on manufacturability and final product CQAs, and to establish a design space to ensure the desired CQAs. Two types of analyses were performed to extract maximal information: DOE effect and response surface analysis, and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time) on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure the desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies the application of QbD principles and tools to drug product and process development.

  14. Model Based Predictive Control of Multivariable Hammerstein Processes with Fuzzy Logic Hypercube Interpolated Models

    PubMed Central

    Coelho, Antonio Augusto Rodrigues

    2016-01-01

    This paper introduces the Fuzzy Logic Hypercube Interpolator (FLHI) and demonstrates applications in control of multiple-input single-output (MISO) and multiple-input multiple-output (MIMO) processes with Hammerstein nonlinearities. FLHI consists of a Takagi-Sugeno fuzzy inference system where membership functions act as kernel functions of an interpolator. Conjunction of membership functions in a unitary hypercube space enables multivariable interpolation in N dimensions. Membership functions act as interpolation kernels, such that the choice of membership functions determines the interpolation characteristics, allowing FLHI to behave as a nearest-neighbor, linear, cubic, spline or Lanczos interpolator, to name a few. The proposed interpolator is presented as a solution to the modeling problem of static nonlinearities since it is capable of modeling both a function and its inverse function. Three case studies from the literature are presented: a single-input single-output (SISO) system, a MISO system and a MIMO system. Good results are obtained regarding performance metrics such as set-point tracking, control variation and robustness. Results demonstrate the applicability of the proposed method in modeling Hammerstein nonlinearities and their inverse functions for implementation of an output compensator with Model Based Predictive Control (MBPC), in particular Dynamic Matrix Control (DMC). PMID:27657723
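
    The core FLHI idea, membership functions acting as interpolation kernels, can be illustrated in one dimension: with triangular memberships centered on the breakpoints, the weighted Takagi-Sugeno output reduces to piecewise-linear interpolation between stored points. A small numpy sketch (not the authors' implementation, which generalizes this to N-dimensional hypercubes and other kernels):

        import numpy as np

        def triangular_membership(x, centers):
            """Degree of membership of x in a triangular fuzzy set centered on each breakpoint.
            x is assumed to lie within [centers[0], centers[-1]]."""
            mu = np.zeros(len(centers))
            for i, c in enumerate(centers):
                left = centers[i - 1] if i > 0 else c
                right = centers[i + 1] if i < len(centers) - 1 else c
                if left < x <= c:
                    mu[i] = (x - left) / (c - left)
                elif c <= x < right:
                    mu[i] = (right - x) / (right - c)
                elif x == c:
                    mu[i] = 1.0
            return mu

        def flhi_1d(x, centers, values):
            """Weighted (Takagi-Sugeno style) combination of stored values; with triangular
            kernels this reproduces linear interpolation between breakpoints."""
            mu = triangular_membership(x, centers)
            return float(np.dot(mu, values) / mu.sum())

        centers = np.array([0.0, 1.0, 2.0, 3.0])
        values = np.array([0.0, 1.0, 0.0, 2.0])   # hypothetical nonlinearity sampled at the breakpoints
        print(flhi_1d(1.5, centers, values), np.interp(1.5, centers, values))   # both give 0.5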

  15. Resilience and tipping points of an exploited fish population over six decades.

    PubMed

    Vasilakopoulos, Paraskevas; Marshall, C Tara

    2015-05-01

    Complex natural systems with eroded resilience, such as populations, ecosystems and socio-ecological systems, respond to small perturbations with abrupt, discontinuous state shifts, or critical transitions. Theory of critical transitions suggests that such systems exhibit fold bifurcations featuring folded response curves, tipping points and alternate attractors. However, there is little empirical evidence of fold bifurcations occurring in actual complex natural systems impacted by multiple stressors. Moreover, resilience of complex systems to change currently lacks clear operational measures with generic application. Here, we provide empirical evidence for the occurrence of a fold bifurcation in an exploited fish population and introduce a generic measure of ecological resilience based on the observed fold bifurcation attributes. We analyse the multivariate development of Barents Sea cod (Gadus morhua), which is currently the world's largest cod stock, over six decades (1949-2009), and identify a population state shift in 1981. Plotting a multivariate population index against a multivariate stressor index revealed the shift mechanism, suggesting that the observed population shift was a nonlinear response to the combined effects of overfishing and climate change. Annual resilience values were estimated based on the position of each year in relation to the fitted attractors and assumed tipping points of the fold bifurcation. Interpolating the annual resilience values yielded a folded stability landscape shaped as predicted by theory. The resilience assessment suggested that the population may be close to another tipping point. This study illustrates how a multivariate analysis, supported by theory of critical transitions and accompanied by a quantitative resilience assessment, can clarify shift mechanisms in data-rich complex natural systems. © 2014 John Wiley & Sons Ltd.

  16. High-sensitivity C-reactive protein and cognitive decline: the English Longitudinal Study of Ageing.

    PubMed

    Zheng, Fanfan; Xie, Wuxiang

    2018-06-01

    High-sensitivity C-reactive protein (hs-CRP) has been suggested to be involved in the process of cognitive decline. However, the results from previous studies exploring the relationship between hs-CRP concentration and cognitive decline are inconsistent. We employed data from wave 2 (2004-2005) to wave 7 (2014-2015) of the English Longitudinal Study of Ageing. Cognitive function was assessed at baseline (wave 2) and reassessed biennially at waves 3-7. A total of 5257 participants (54.9% women, mean age 65.4 ± 9.4 years) with baseline hs-CRP levels ranging from 0.2 to 210.0 mg/L (median: 2.0 mg/L, interquartile range: 0.9-4.1 mg/L) were studied. The mean follow-up duration was 8.1 ± 2.8 years, and the mean number of cognitive assessments was 4.9 ± 1.5. Linear mixed models show that a one-unit increment in natural log-transformed hs-CRP was associated with faster declines in global cognitive scores [-0.048 points/year, 95% confidence interval (CI) -0.072 to -0.023], memory scores (-0.022 points/year, 95% CI -0.031 to -0.013), and executive function scores (-0.025 points/year, 95% CI -0.043 to -0.006), after multivariable adjustment. Compared with the lowest quartile of hs-CRP, the multivariable-adjusted rate of global cognitive decline associated with the second, third, and highest quartile was faster by -0.043 points/year (95% CI -0.116 to 0.029), -0.090 points/year (95% CI -0.166 to -0.015), and -0.145 points/year (95% CI -0.221 to -0.069), respectively (p for trend <0.001). Similarly, memory and executive function also declined faster with increasing quartiles of hs-CRP. A significant association between hs-CRP concentration and long-term cognitive decline was observed in this study. Hs-CRP might serve as a biomarker for cognitive decline.
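
    The analysis reported here is a linear mixed model of repeated cognitive scores. A sketch with statsmodels (assumed available) and hypothetical long-format data follows: cognitive score regressed on follow-up time, log-transformed hs-CRP and their interaction, with random intercepts and slopes per participant; the interaction coefficient plays the role of the points/year-per-unit-log-hs-CRP estimates quoted above.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n_subj, n_waves = 200, 5
        df = pd.DataFrame({
            "id": np.repeat(np.arange(n_subj), n_waves),
            "years": np.tile(np.arange(n_waves) * 2.0, n_subj),                # biennial assessments
            "log_crp": np.repeat(np.log(rng.gamma(2.0, 1.5, n_subj)), n_waves),
        })
        subj_intercept = np.repeat(rng.normal(0, 1.0, n_subj), n_waves)
        df["cognition"] = (20 - 0.2 * df["years"] - 0.05 * df["years"] * df["log_crp"]
                           + subj_intercept + rng.normal(0, 0.5, len(df)))

        # Random intercept and random slope for time within participants
        model = smf.mixedlm("cognition ~ years * log_crp", df, groups=df["id"], re_formula="~years")
        fit = model.fit()
        print(fit.params["years:log_crp"])   # change in points/year per unit increase in log hs-CRP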

  17. About the dark and bright sides of self-efficacy: workaholism and work engagement.

    PubMed

    Del Líbano, Mario; Llorens, Susana; Salanoval, Marisa; Schaufeli, Wilmar B

    2012-07-01

    Taking the Resources-Experiences-Demands Model (RED Model) by Salanova and colleagues as our starting point, we tested how work self-efficacy relates positively to negative (i.e., work overload and work-family conflict) and positive outcomes (i.e., job satisfaction and organizational commitment), through the mediating role of workaholism (health impairment process) and work engagement (motivational process). In a sample of 386 administrative staff from a Spanish University (65% women), Structural Equation Modeling provided full evidence for the research model. In addition, Multivariate Analyses of Variance showed that self-efficacy was only related positively to one of the two dimensions of workaholism, namely, working excessively. Finally, we discuss the theoretical and practical contributions in terms of the RED Model.

  18. Departure from Normality in Multivariate Normative Comparison: The Cramer Alternative for Hotelling's T²

    ERIC Educational Resources Information Center

    Grasman, Raoul P. P. P.; Huizenga, Hilde M.; Geurts, Hilde M.

    2010-01-01

    Crawford and Howell (1998) have pointed out that the common practice of z-score inference on cognitive disability is inappropriate if a patient's performance on a task is compared with relatively few typical control individuals. Appropriate univariate and multivariate statistical tests have been proposed for these studies, but these are only valid…

  19. The effect of organizational climate on patient-centered medical home implementation.

    PubMed

    Reddy, Ashok; Shea, Judy A; Canamucio, Anne; Werner, Rachel M

    2015-01-01

    Organizational climate is a key determinant of successful adoption of innovations; however, its relation to medical home implementation is unknown. This study examined the association between primary care providers' (PCPs') perception of organizational climate and medical home implementation in the Veterans Health Administration. Multivariate regression was used to test the hypothesis that organizational climate predicts medical home implementation. This analysis of 191 PCPs found that higher scores in 2 domains of organizational climate (communication and cooperation, and orientation to quality improvement) were associated with a statistically significantly higher percentage (from 7 to 10 percentage points) of PCPs implementing structural changes to support the medical home model. In addition, some aspects of a better organizational climate were associated with improved organizational processes of care, including a higher percentage of patients contacted within 2 days of hospital discharge (by 2 to 3 percentage points) and appointments made within 3 days of a patient request (by 2 percentage points). © The Author(s) 2014.

  20. Testing for Granger Causality in the Frequency Domain: A Phase Resampling Method.

    PubMed

    Liu, Siwei; Molenaar, Peter

    2016-01-01

    This article introduces phase resampling, an existing but rarely used surrogate data method for making statistical inferences of Granger causality in frequency domain time series analysis. Granger causality testing is essential for establishing causal relations among variables in multivariate dynamic processes. However, testing for Granger causality in the frequency domain is challenging due to the nonlinear relation between frequency domain measures (e.g., partial directed coherence, generalized partial directed coherence) and time domain data. Through a simulation study, we demonstrate that phase resampling is a general and robust method for making statistical inferences even with short time series. With Gaussian data, phase resampling yields satisfactory type I and type II error rates in all but one condition we examine: when a small effect size is combined with an insufficient number of data points. Violations of normality lead to slightly higher error rates but are mostly within acceptable ranges. We illustrate the utility of phase resampling with two empirical examples involving multivariate electroencephalography (EEG) and skin conductance data.
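
    Phase randomization itself is simple to express: take the Fourier transform of a series, randomize the phases while preserving the amplitude spectrum, and invert. A minimal univariate numpy sketch follows; multivariate surrogate schemes for Granger-causality testing additionally constrain how phases are shared across channels, which is not shown here.

        import numpy as np

        def phase_randomized_surrogate(x, rng):
            """Surrogate series with the same power spectrum as x but randomized Fourier phases."""
            n = len(x)
            spectrum = np.fft.rfft(x)
            phases = rng.uniform(0, 2 * np.pi, len(spectrum))
            phases[0] = 0.0                      # keep the mean (zero-frequency) component real
            if n % 2 == 0:
                phases[-1] = 0.0                 # keep the Nyquist component real for even-length series
            surrogate_spectrum = np.abs(spectrum) * np.exp(1j * phases)
            return np.fft.irfft(surrogate_spectrum, n=n)

        rng = np.random.default_rng(8)
        x = np.sin(np.linspace(0, 20 * np.pi, 500)) + rng.normal(0, 0.3, 500)
        surrogates = np.array([phase_randomized_surrogate(x, rng) for _ in range(200)])

        # The surrogate distribution provides the null against which an observed
        # frequency-domain Granger-causality statistic would be compared.
        print(x.std().round(3), surrogates.std(axis=1).mean().round(3))   # variance approximately preserved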

  1. A hybrid approach identifies metabolic signatures of high-producers for chinese hamster ovary clone selection and process optimization.

    PubMed

    Popp, Oliver; Müller, Dirk; Didzus, Katharina; Paul, Wolfgang; Lipsmeier, Florian; Kirchner, Florian; Niklas, Jens; Mauch, Klaus; Beaucamp, Nicola

    2016-09-01

    In-depth characterization of high-producer cell lines and bioprocesses is vital to ensure robust and consistent production of recombinant therapeutic proteins in high quantity and quality for clinical applications. This requires applying appropriate methods during bioprocess development to enable meaningful characterization of CHO clones and processes. Here, we present a novel hybrid approach for supporting comprehensive characterization of metabolic clone performance. The approach combines metabolite profiling with multivariate data analysis and fluxomics to enable a data-driven mechanistic analysis of key metabolic traits associated with desired cell phenotypes. We applied the methodology to quantify and compare metabolic performance in a set of 10 recombinant CHO-K1 producer clones and a host cell line. The comprehensive characterization enabled us to derive an extended set of clone performance criteria that not only captured growth and product formation, but also incorporated information on intracellular clone physiology and on metabolic changes during the process. These criteria served to establish a quantitative clone ranking and allowed us to identify metabolic differences between high-producing CHO-K1 clones yielding comparably high product titers. Through multivariate data analysis of the combined metabolite and flux data we uncovered common metabolic traits characteristic of high-producer clones in the screening setup. This included high intracellular rates of glutamine synthesis, low cysteine uptake, reduced excretion of aspartate and glutamate, and low intracellular degradation rates of branched-chain amino acids and of histidine. Finally, the above approach was integrated into a workflow that enables standardized high-content selection of CHO producer clones in a high-throughput fashion. In conclusion, the combination of quantitative metabolite profiling, multivariate data analysis, and mechanistic network model simulations can identify metabolic traits characteristic of high-performance clones and enables informed decisions on which clones provide a good match for a particular process platform. The proposed approach also provides a mechanistic link between observed clone phenotype, process setup, and feeding regimes, and thereby offers concrete starting points for subsequent process optimization. Biotechnol. Bioeng. 2016;113: 2005-2019. © 2016 Wiley Periodicals, Inc.

  2. Diagnosis of abnormal patterns in multivariate microclimate monitoring: a case study of an open-air archaeological site in Pompeii (Italy).

    PubMed

    Merello, Paloma; García-Diego, Fernando-Juan; Zarzo, Manuel

    2014-08-01

    Chemometrics has been applied successfully since the 1990s for the multivariate statistical control of industrial processes. A new area of interest for these tools is the microclimatic monitoring of cultural heritage. Sensors record climatic parameters over time and statistical data analysis is performed to obtain valuable information for preventive conservation. A case study of an open-air archaeological site is presented here. A set of 26 temperature and relative humidity data-loggers was installed in four rooms of Ariadne's house (Pompeii). If climatic values are recorded versus time at different positions, the resulting data structure is equivalent to records of physical parameters registered at several points of a continuous chemical process. However, there is an important difference in this case: continuous processes are controlled to reach a steady state, whilst open-air sites undergo tremendous fluctuations. Although data from continuous processes are usually column-centred prior to applying principal components analysis, it turned out that another pre-treatment (row-centred data) was more convenient for the interpretation of components and to identify abnormal patterns. The detection of typical trajectories was more straightforward by dividing the whole monitored period into several sub-periods, because the marked climatic fluctuations throughout the year affect the correlation structures. The proposed statistical methodology is of interest for the microclimatic monitoring of cultural heritage, particularly in the case of open-air or semi-confined archaeological sites. Copyright © 2014 Elsevier B.V. All rights reserved.
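
    The pre-treatment choice discussed here is easy to make concrete: column-centring removes each sensor's mean level so that components follow the fluctuations shared over time, whereas row-centring removes the spatial average at each time point so that components describe contrasts between sensor positions. A small numpy sketch with a hypothetical time-by-sensor matrix (not the Ariadne's house data):

        import numpy as np

        def explained_ratios(X, k=3):
            """Explained-variance ratios from an SVD of the pre-treated data matrix (no extra centring)."""
            s = np.linalg.svd(X, compute_uv=False)
            return (s ** 2 / (s ** 2).sum())[:k]

        rng = np.random.default_rng(9)
        n_times, n_sensors = 500, 26
        season = 10 * np.sin(np.linspace(0, 4 * np.pi, n_times))[:, None]   # fluctuation shared by all sensors
        offsets = rng.normal(0, 2, n_sensors)                               # sensor-specific microclimate levels
        data = 20 + season + offsets + rng.normal(0, 0.5, (n_times, n_sensors))

        column_centred = data - data.mean(axis=0)                # classic pre-treatment: per-sensor mean removed
        row_centred = data - data.mean(axis=1, keepdims=True)    # per-time-point spatial mean removed

        print("column-centred:", np.round(explained_ratios(column_centred), 3))   # dominated by the shared fluctuation
        print("row-centred:   ", np.round(explained_ratios(row_centred), 3))      # dominated by between-sensor contrasts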

  3. Bivariate versus multivariate smart spectrophotometric calibration methods for the simultaneous determination of a quaternary mixture of mosapride, pantoprazole and their degradation products.

    PubMed

    Hegazy, M A; Yehia, A M; Moustafa, A A

    2013-05-01

    The ability of bivariate and multivariate spectrophotometric methods to resolve a quaternary mixture of mosapride, pantoprazole and their degradation products was demonstrated. The bivariate calibrations include the bivariate spectrophotometric method (BSM) and the H-point standard addition method (HPSAM), which were able to determine the two drugs simultaneously, but not in the presence of their degradation products. The results showed that simultaneous determinations could be performed in the concentration ranges of 5.0-50.0 µg/ml for mosapride and 10.0-40.0 µg/ml for pantoprazole by the bivariate spectrophotometric method, and in the concentration range of 5.0-45.0 µg/ml for both drugs by the H-point standard addition method. Moreover, the applied multivariate calibration methods were able to determine mosapride, pantoprazole and their degradation products using concentration residuals augmented classical least squares (CRACLS) and partial least squares (PLS). The proposed multivariate methods were applied to 17 synthetic samples in the concentration ranges of 3.0-12.0 µg/ml mosapride, 8.0-32.0 µg/ml pantoprazole, 1.5-6.0 µg/ml mosapride degradation products and 2.0-8.0 µg/ml pantoprazole degradation products. The proposed bivariate and multivariate calibration methods were successfully applied to the determination of mosapride and pantoprazole in their pharmaceutical preparations.

  4. Quality of life in Chronic Pancreatitis is determined by constant pain, disability/unemployment, current smoking and associated co-morbidities

    PubMed Central

    Machicado, Jorge D.; Amann, Stephen T; Anderson, Michelle A.; Abberbock, Judah; Sherman, Stuart; Conwell, Darwin; Cote, Gregory A.; Singh, Vikesh K.; Lewis, Michele; Alkaade, Samer; Sandhu, Bimaljit S.; Guda, Nalini M.; Muniraj, Thiruvengadam; Tang, Gong; Baillie, John; Brand, Randall; Gardner, Timothy B.; Gelrud, Andres; Forsmark, Christopher E.; Banks, Peter A.; Slivka, Adam; Wilcox, C. Mel; Whitcomb, David C.; Yadav, Dhiraj

    2018-01-01

    Background: Chronic pancreatitis (CP) has a profound independent effect on quality of life (QOL). Our aim was to identify factors that impact the QOL in CP patients. Methods: We used data on 1,024 CP patients enrolled in the three NAPS2 studies. Information on demographics, risk factors, co-morbidities, disease phenotype and treatments was obtained from responses to structured questionnaires. Physical (PCS) and mental (MCS) component summary scores generated using responses to the Short Form-12 (SF-12) survey were used to assess QOL at enrollment. Multivariable linear regression models determined independent predictors of QOL. Results: Mean PCS and MCS scores were 36.7±11.7 and 42.4±12.2, respectively. Significant (p<0.05) negative impact on PCS scores in multivariable analyses was noted due to constant mild-moderate pain with episodes of severe pain or constant severe pain (10 points), constant mild-moderate pain (5.2), pain-related disability/unemployment (5.1), current smoking (2.9 points) and medical co-morbidities. Significant (p<0.05) negative impact on MCS scores was related to constant pain irrespective of severity (6.8-6.9 points), current smoking (3.9 points) and pain-related disability/unemployment (2.4 points). In women, disability/unemployment resulted in an additional 3.7-point reduction in MCS score. Final multivariable models explained 27% and 18% of the variance in PCS and MCS scores, respectively. Etiology, disease duration, pancreatic morphology, diabetes, exocrine insufficiency and prior endotherapy/pancreatic surgery had no significant independent effect on QOL. Conclusion: Constant pain, pain-related disability/unemployment, current smoking, and concurrent co-morbidities significantly affect the QOL in CP. Further research is needed to identify factors impacting QOL not explained by our analyses. PMID:28244497

  5. Quality of Life in Chronic Pancreatitis is Determined by Constant Pain, Disability/Unemployment, Current Smoking, and Associated Co-Morbidities.

    PubMed

    Machicado, Jorge D; Amann, Stephen T; Anderson, Michelle A; Abberbock, Judah; Sherman, Stuart; Conwell, Darwin L; Cote, Gregory A; Singh, Vikesh K; Lewis, Michele D; Alkaade, Samer; Sandhu, Bimaljit S; Guda, Nalini M; Muniraj, Thiruvengadam; Tang, Gong; Baillie, John; Brand, Randall E; Gardner, Timothy B; Gelrud, Andres; Forsmark, Christopher E; Banks, Peter A; Slivka, Adam; Wilcox, C Mel; Whitcomb, David C; Yadav, Dhiraj

    2017-04-01

    Chronic pancreatitis (CP) has a profound independent effect on quality of life (QOL). Our aim was to identify factors that impact the QOL in CP patients. We used data on 1,024 CP patients enrolled in the three NAPS2 studies. Information on demographics, risk factors, co-morbidities, disease phenotype, and treatments was obtained from responses to structured questionnaires. Physical and mental component summary (PCS and MCS, respectively) scores generated using responses to the Short Form-12 (SF-12) survey were used to assess QOL at enrollment. Multivariable linear regression models determined independent predictors of QOL. Mean PCS and MCS scores were 36.7±11.7 and 42.4±12.2, respectively. Significant (P<0.05) negative impact on PCS scores in multivariable analyses was noted owing to constant mild-moderate pain with episodes of severe pain or constant severe pain (10 points), constant mild-moderate pain (5.2), pain-related disability/unemployment (5.1), current smoking (2.9 points), and medical co-morbidities. Significant (P<0.05) negative impact on MCS scores was related to constant pain irrespective of severity (6.8-6.9 points), current smoking (3.9 points), and pain-related disability/unemployment (2.4 points). In women, disability/unemployment resulted in an additional 3.7 point reduction in MCS score. Final multivariable models explained 27% and 18% of the variance in PCS and MCS scores, respectively. Etiology, disease duration, pancreatic morphology, diabetes, exocrine insufficiency, and prior endotherapy/pancreatic surgery had no significant independent effect on QOL. Constant pain, pain-related disability/unemployment, current smoking, and concurrent co-morbidities significantly affect the QOL in CP. Further research is needed to identify factors impacting QOL not explained by our analyses.

  6. Geochemical processes controlling water salinization in an irrigated basin in Spain: identification of natural and anthropogenic influence.

    PubMed

    Merchán, D; Auqué, L F; Acero, P; Gimeno, M J; Causapé, J

    2015-01-01

    Salinization of water bodies represents a significant risk in water systems. The salinization of waters in a small irrigated hydrological basin is studied herein through an integrated hydrogeochemical study including multivariate statistical analyses and geochemical modeling. The study zone has two well-differentiated geologic materials: (i) Quaternary sediments of low salinity and high permeability and (ii) Tertiary sediments of high salinity and very low permeability. In this work, soil samples were collected and leaching experiments conducted on them in the laboratory. In addition, water samples were collected from precipitation, irrigation, groundwater, spring and surface waters. The waters show an increase in salinity from precipitation and irrigation water to ground- and, finally, surface water. The enrichment in salinity is related to the dissolution of soluble minerals present mainly in the Tertiary materials. Cation exchange, precipitation of calcite and, probably, incongruent dissolution of dolomite have been inferred from the hydrochemical data set. Multivariate statistical analysis provided information about the structure of the data, differentiating the group of surface waters from the groundwaters and the salinization from the nitrate pollution processes. The available information was included in geochemical models in which hypotheses of consistency and thermodynamic feasibility were checked. The assessment of the collected information pointed to a natural control on salinization processes in the Lerma Basin with minimal influence of anthropogenic factors. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. A Multivariate Model for the Meta-Analysis of Study Level Survival Data at Multiple Times

    ERIC Educational Resources Information Center

    Jackson, Dan; Rollins, Katie; Coughlin, Patrick

    2014-01-01

    Motivated by our meta-analytic dataset involving survival rates after treatment for critical leg ischemia, we develop and apply a new multivariate model for the meta-analysis of study level survival data at multiple times. Our data set involves 50 studies that provide mortality rates at up to seven time points, which we model simultaneously, and…

  8. Impact of Explicit Presentation of Slopes in Three Dimensions on Students' Understanding of Derivatives in Multivariable Calculus

    ERIC Educational Resources Information Center

    McGee, Daniel Lee; Moore-Russo, Deborah

    2015-01-01

    In two dimensions (2D), representations associated with slopes are seen in numerous forms before representations associated with derivatives are presented. These include the slope between two points and the constant slope of a linear function of a single variable. In almost all multivariable calculus textbooks, however, the first discussion of…

  9. Piecewise multivariate modelling of sequential metabolic profiling data.

    PubMed

    Rantalainen, Mattias; Cloarec, Olivier; Ebbels, Timothy M D; Lundstedt, Torbjörn; Nicholson, Jeremy K; Holmes, Elaine; Trygg, Johan

    2008-02-19

    Modelling the time-related behaviour of biological systems is essential for understanding their dynamic responses to perturbations. In metabolic profiling studies, the sampling rate and number of sampling points are often restricted due to experimental and biological constraints. A supervised multivariate modelling approach with the objective to model the time-related variation in the data for short and sparsely sampled time-series is described. A set of piecewise Orthogonal Projections to Latent Structures (OPLS) models are estimated, describing changes between successive time points. The individual OPLS models are linear, but the piecewise combination of several models accommodates modelling and prediction of changes which are non-linear with respect to the time course. We demonstrate the method on both simulated and metabolic profiling data, illustrating how time related changes are successfully modelled and predicted. The proposed method is effective for modelling and prediction of short and multivariate time series data. A key advantage of the method is model transparency, allowing easy interpretation of time-related variation in the data. The method provides a competitive complement to commonly applied multivariate methods such as OPLS and Principal Component Analysis (PCA) for modelling and analysis of short time-series data.

  10. Trends in modern system theory

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1976-01-01

    The topics considered are related to linear control system design, adaptive control, failure detection, control under failure, system reliability, and large-scale systems and decentralized control. It is pointed out that the design of a linear feedback control system which regulates a process about a desirable set point or steady-state condition in the presence of disturbances is a very important problem. The linearized dynamics of the process are used for design purposes. The typical linear-quadratic design involving the solution of the optimal control problem of a linear time-invariant system with respect to a quadratic performance criterion is considered along with gain reduction theorems and the multivariable phase margin theorem. The stumbling block in many adaptive design methodologies is associated with the amount of real time computation which is necessary. Attention is also given to the desperate need to develop good theories for large-scale systems, the beginning of a microprocessor revolution, the translation of the Wiener-Hopf theory into the time domain, and advances made in dynamic team theory, dynamic stochastic games, and finite memory stochastic control.

  11. To See the World in a Grain of Sand: Recognizing the Origin of Sand Specimens by Diffuse Reflectance Infrared Fourier Transform Spectroscopy and Multivariate Exploratory Data Analysis

    ERIC Educational Resources Information Center

    Pezzolo, Alessandra De Lorenzi

    2011-01-01

    The diffuse reflectance infrared Fourier transform (DRIFT) spectra of sand samples exhibit features reflecting their composition. Basic multivariate analysis (MVA) can be used to effectively sort subsets of homogeneous specimens collected from nearby locations, as well as pointing out similarities in composition among sands of different origins.…

  12. Process analysis of recycled thermoplasts from consumer electronics by laser-induced plasma spectroscopy.

    PubMed

    Fink, Herbert; Panne, Ulrich; Niessner, Reinhard

    2002-09-01

    An experimental setup for direct elemental analysis of recycled thermoplasts from consumer electronics by laser-induced plasma spectroscopy (LIPS, or laser-induced breakdown spectroscopy, LIBS) was realized. The combination of an echelle spectrograph, featuring high resolution with broad spectral coverage, with multivariate methods, such as PLS, PCR, and variable subset selection via a genetic algorithm, resulted in considerable improvements in selectivity and sensitivity for this complex matrix. With normalization to carbon as internal standard, the limits of detection were in the ppm range. A preliminary pattern recognition study points to the possibility of polymer recognition via the line-rich echelle spectra. Several experiments at an extruder within a recycling plant successfully demonstrated the capability of LIPS for different kinds of routine on-line process analysis.

  13. Nearest neighbors by neighborhood counting.

    PubMed

    Wang, Hui

    2006-06-01

    Finding nearest neighbors is a general idea that underlies many artificial intelligence tasks, including machine learning, data mining, natural language understanding, and information retrieval. This idea is explicitly used in the k-nearest neighbors algorithm (kNN), a popular classification method. In this paper, this idea is adopted in the development of a general methodology, neighborhood counting, for devising similarity functions. We turn our focus from neighbors to neighborhoods, regions in the data space covering the data point in question. To measure the similarity between two data points, we consider all neighborhoods that cover both data points. We propose to use the number of such neighborhoods as a measure of similarity. Neighborhoods can be defined for different types of data in different ways. Here, we consider one definition of neighborhood for multivariate data and derive a formula for such similarity, called the neighborhood counting measure or NCM. NCM was tested experimentally in the framework of kNN. Experiments show that NCM is generally comparable to VDM and its variants, the state-of-the-art distance functions for multivariate data, and, at the same time, is consistently better for relatively large k values. Additionally, NCM consistently outperforms HEOM (a mixture of Euclidean and Hamming distances), the "standard" and most widely used distance function for multivariate data. NCM has a computational complexity of the same order as the standard Euclidean distance function, and NCM is task independent and works for numerical and categorical data in a conceptually uniform way. The neighborhood counting methodology is shown experimentally to be sound for multivariate data. We hope it will work for other types of data.
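
    The counting idea can be made concrete for attributes taking integer values on finite ranges: count the axis-aligned, integer-bounded hyperrectangles that cover both points, a quantity that factorizes over attributes. The sketch below is an illustrative formulation of the neighborhood-counting principle, not necessarily the exact NCM definition given in the paper.

        import numpy as np

        def neighborhood_count(a, b, lower, upper):
            """Number of axis-aligned hyperrectangles with integer bounds inside
            [lower, upper] that cover both points a and b (all arrays of equal length)."""
            a, b = np.asarray(a), np.asarray(b)
            lo, hi = np.minimum(a, b), np.maximum(a, b)
            # Per attribute: choices for the left bound times choices for the right bound
            per_attribute = (lo - lower + 1) * (upper - hi + 1)
            return int(np.prod(per_attribute))

        lower = np.array([0, 0, 0])
        upper = np.array([9, 9, 9])
        a = np.array([2, 5, 7])
        b = np.array([3, 5, 4])
        print(neighborhood_count(a, b, lower, upper))   # higher counts mean more similar points
        print(neighborhood_count(a, a, lower, upper))   # identical points are covered by the most neighborhoods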

  14. Determination of dominant biogeochemical processes in a contaminated aquifer-wetland system using multivariate statistical analysis

    USGS Publications Warehouse

    Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.

    2008-01-01

    Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.

  15. Python Spectral Analysis Tool (PySAT) for Preprocessing, Multivariate Analysis, and Machine Learning with Point Spectra

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S.; Graff, T.; Morris, R. V.; Laura, J.

    2017-06-01

    We present a Python-based library and graphical interface for the analysis of point spectra. The tool is being developed with a focus on methods used for ChemCam data, but is flexible enough to handle spectra from other instruments.

  16. Using Multivariate Regression Model with Least Absolute Shrinkage and Selection Operator (LASSO) to Predict the Incidence of Xerostomia after Intensity-Modulated Radiotherapy for Head and Neck Cancer

    PubMed Central

    Ting, Hui-Min; Chang, Liyun; Huang, Yu-Jie; Wu, Jia-Ming; Wang, Hung-Yu; Horng, Mong-Fong; Chang, Chun-Ming; Lan, Jen-Hong; Huang, Ya-Yu; Fang, Fu-Min; Leung, Stephen Wan

    2014-01-01

    Purpose: The aim of this study was to develop a multivariate logistic regression model with least absolute shrinkage and selection operator (LASSO) to make valid predictions about the incidence of moderate-to-severe patient-rated xerostomia among head and neck cancer (HNC) patients treated with IMRT. Methods and Materials: Quality of life questionnaire datasets from 206 patients with HNC were analyzed. The European Organization for Research and Treatment of Cancer QLQ-H&N35 and QLQ-C30 questionnaires were used as the endpoint evaluation. The primary endpoint (grade 3+ xerostomia) was defined as moderate-to-severe xerostomia at 3 (XER3m) and 12 months (XER12m) after the completion of IMRT. Normal tissue complication probability (NTCP) models were developed. The optimal and suboptimal numbers of prognostic factors for a multivariate logistic regression model were determined using the LASSO with bootstrapping technique. Statistical analysis was performed using the scaled Brier score, Nagelkerke R2, chi-squared test, Omnibus, Hosmer-Lemeshow test, and the AUC. Results: Eight prognostic factors were selected by LASSO for the 3-month time point: Dmean-c, Dmean-i, age, financial status, T stage, AJCC stage, smoking, and education. Nine prognostic factors were selected for the 12-month time point: Dmean-i, education, Dmean-c, smoking, T stage, baseline xerostomia, alcohol abuse, family history, and node classification. In the selection of the suboptimal number of prognostic factors by LASSO, three suboptimal prognostic factors were fine-tuned by Hosmer-Lemeshow test and AUC, i.e., Dmean-c, Dmean-i, and age for the 3-month time point. Five suboptimal prognostic factors were also selected for the 12-month time point, i.e., Dmean-i, education, Dmean-c, smoking, and T stage. The overall performance for both time points of the NTCP model in terms of scaled Brier score, Omnibus, and Nagelkerke R2 was satisfactory and corresponded well with the expected values. Conclusions: Multivariate NTCP models with LASSO can be used to predict patient-rated xerostomia after IMRT. PMID:24586971
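
    A sketch of the variable-selection step with scikit-learn: L1-penalized logistic regression refit on bootstrap resamples, recording how often each candidate predictor receives a non-zero coefficient. Predictor names and data below are placeholders; the study's full NTCP modelling and calibration checks are not reproduced.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.preprocessing import StandardScaler
        from sklearn.utils import resample

        rng = np.random.default_rng(10)
        predictors = ["Dmean_c", "Dmean_i", "age", "financial", "T_stage", "AJCC", "smoking", "education"]
        n = 206
        X = StandardScaler().fit_transform(rng.normal(size=(n, len(predictors))))
        logit = 1.2 * X[:, 0] + 0.9 * X[:, 1] + 0.5 * X[:, 2] - 0.5      # dose and age drive the toy outcome
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))                    # grade 3+ xerostomia indicator

        selection_counts = np.zeros(len(predictors))
        for _ in range(200):
            Xb, yb = resample(X, y)                                      # bootstrap resample of the cohort
            lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
            lasso.fit(Xb, yb)
            selection_counts += (np.abs(lasso.coef_[0]) > 1e-8)

        for name, freq in zip(predictors, selection_counts / 200):
            print(f"{name:12s} selected in {freq:.0%} of bootstrap fits")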

  17. Using multivariate regression model with least absolute shrinkage and selection operator (LASSO) to predict the incidence of Xerostomia after intensity-modulated radiotherapy for head and neck cancer.

    PubMed

    Lee, Tsair-Fwu; Chao, Pei-Ju; Ting, Hui-Min; Chang, Liyun; Huang, Yu-Jie; Wu, Jia-Ming; Wang, Hung-Yu; Horng, Mong-Fong; Chang, Chun-Ming; Lan, Jen-Hong; Huang, Ya-Yu; Fang, Fu-Min; Leung, Stephen Wan

    2014-01-01

    The aim of this study was to develop a multivariate logistic regression model with least absolute shrinkage and selection operator (LASSO) to make valid predictions about the incidence of moderate-to-severe patient-rated xerostomia among head and neck cancer (HNC) patients treated with IMRT. Quality of life questionnaire datasets from 206 patients with HNC were analyzed. The European Organization for Research and Treatment of Cancer QLQ-H&N35 and QLQ-C30 questionnaires were used as the endpoint evaluation. The primary endpoint (grade 3+ xerostomia) was defined as moderate-to-severe xerostomia at 3 (XER3m) and 12 months (XER12m) after the completion of IMRT. Normal tissue complication probability (NTCP) models were developed. The optimal and suboptimal numbers of prognostic factors for a multivariate logistic regression model were determined using the LASSO with bootstrapping technique. Statistical analysis was performed using the scaled Brier score, Nagelkerke R², chi-squared test, Omnibus, Hosmer-Lemeshow test, and the AUC. Eight prognostic factors were selected by LASSO for the 3-month time point: Dmean-c, Dmean-i, age, financial status, T stage, AJCC stage, smoking, and education. Nine prognostic factors were selected for the 12-month time point: Dmean-i, education, Dmean-c, smoking, T stage, baseline xerostomia, alcohol abuse, family history, and node classification. In the selection of the suboptimal number of prognostic factors by LASSO, three suboptimal prognostic factors were fine-tuned by Hosmer-Lemeshow test and AUC, i.e., Dmean-c, Dmean-i, and age for the 3-month time point. Five suboptimal prognostic factors were also selected for the 12-month time point, i.e., Dmean-i, education, Dmean-c, smoking, and T stage. The overall performance for both time points of the NTCP model in terms of scaled Brier score, Omnibus, and Nagelkerke R² was satisfactory and corresponded well with the expected values. Multivariate NTCP models with LASSO can be used to predict patient-rated xerostomia after IMRT.

  18. Systematic design of membership functions for fuzzy-logic control: A case study on one-stage partial nitritation/anammox treatment systems.

    PubMed

    Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan

    2016-10-01

    A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long-term reachability of the control objectives by the fuzzy-logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting long-term influent disturbances and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested against measurement noise levels typical of wastewater sensors and showed robustness. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy-logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Multivariate Statistical Modelling of Drought and Heat Wave Events

    NASA Astrophysics Data System (ADS)

    Manning, Colin; Widmann, Martin; Vrac, Mathieu; Maraun, Douglas; Bevaqua, Emanuele

    2016-04-01

    Compound extreme events are a combination of two or more contributing events which in themselves may not be extreme but through their joint occurrence produce an extreme impact. Compound events are noted in the latest IPCC report as an important type of extreme event that has been given little attention so far. As part of the CE:LLO project (Compound Events: muLtivariate statisticaL mOdelling) we are developing a multivariate statistical model to gain an understanding of the dependence structure of certain compound events. One focus of this project is on the interaction between drought and heat wave events. Soil moisture has both a local and non-local effect on the occurrence of heat waves, as it strongly controls the latent heat flux affecting the transfer of sensible heat to the atmosphere. These processes can create a feedback whereby a heat wave may be amplified or suppressed by the soil moisture preconditioning and, vice versa, the heat wave may in turn have an effect on soil conditions. An aim of this project is to capture this dependence in order to correctly describe the joint probabilities of these conditions and the resulting probability of their compound impact. We will show an application of Pair Copula Constructions (PCCs) to study the aforementioned compound event. PCCs allow, in theory, for the formulation of multivariate dependence structures in any dimension, where the PCC is a decomposition of a multivariate distribution into a product of bivariate components modelled using copulas. A copula is a multivariate distribution function which allows one to model the dependence structure of given variables separately from the marginal behaviour. We first look at the structure of soil moisture drought over the whole of France using the SAFRAN dataset between 1959 and 2009. Soil moisture is represented using the Standardised Precipitation Evapotranspiration Index (SPEI). Drought characteristics are computed at grid point scale, where drought conditions are identified as those with an SPEI value below -1.0. We model the multivariate dependence structure of drought events defined by certain characteristics and compute return levels of these events. We initially find that drought characteristics such as duration, mean SPEI and the maximum contiguous area to a grid point all have positive correlations, though the degree to which they are correlated can vary considerably spatially. A spatial representation of return levels may then provide insight into the areas most prone to drought conditions. As a next step, we analyse the dependence structure between soil moisture conditions preceding the onset of a heat wave and the heat wave itself.
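
    As a toy illustration of the copula idea used above (dependence modelled separately from the margins), the sketch below samples a single bivariate Gaussian copula and attaches arbitrary margins to estimate a joint exceedance probability. This is not the pair copula construction of the study; the correlation, margins, and thresholds are made up, and SciPy is assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rho = -0.6   # illustrative dependence between pre-onset soil moisture and heat-wave intensity
n = 5000

# 1) Sample the copula: correlated standard normals pushed through Phi -> uniforms
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = stats.norm.cdf(z)                       # dependence structure only, uniform margins

# 2) Attach arbitrary margins (here: SPEI-like N(0,1), heat index as a Gumbel)
spei = stats.norm.ppf(u[:, 0])
heat = stats.gumbel_r.ppf(u[:, 1], loc=30, scale=3)

# Joint probability of a compound event: drought (SPEI < -1) together with extreme heat (> 35)
p_compound = np.mean((spei < -1.0) & (heat > 35.0))
print("P(compound event) ~", p_compound)
```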

  20. Instantaneous Transfer Entropy for the Study of Cardiovascular and Cardiorespiratory Nonstationary Dynamics.

    PubMed

    Valenza, Gaetano; Faes, Luca; Citi, Luca; Orini, Michele; Barbieri, Riccardo

    2018-05-01

    Measures of transfer entropy (TE) quantify the direction and strength of coupling between two complex systems. Standard approaches assume stationarity of the observations, and therefore are unable to track time-varying changes in nonlinear information transfer with high temporal resolution. In this study, we aim to define and validate novel instantaneous measures of TE to provide an improved assessment of complex nonstationary cardiorespiratory interactions. We here propose a novel instantaneous point-process TE (ipTE) and validate its assessment as applied to cardiovascular and cardiorespiratory dynamics. In particular, heartbeat and respiratory dynamics are characterized through discrete time series, and modeled with probability density functions predicting the time of the next physiological event as a function of the past history. Likewise, nonstationary interactions between heartbeat and blood pressure dynamics are characterized as well. Furthermore, we propose a new measure of information transfer, the instantaneous point-process information transfer (ipInfTr), which is directly derived from point-process-based definitions of the Kolmogorov-Smirnov distance. Analysis on synthetic data, as well as on experimental data gathered from healthy subjects undergoing postural changes, confirms that ipTE and ipInfTr measures are able to dynamically track changes in physiological systems coupling. This novel approach opens new avenues in the study of hidden, transient, nonstationary physiological states involving multivariate autonomic dynamics in cardiovascular health and disease. The proposed method can also be tailored for the study of complex multisystem physiology (e.g., brain-heart or, more generally, brain-body interactions).
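
    The instantaneous point-process estimators above are beyond a short sketch, but the underlying quantity can be illustrated with a plain discrete-time, histogram-based transfer entropy between two binned series (one past sample, equal-frequency bins). This is a generic estimator for illustration, not the ipTE of the paper; NumPy is assumed.

```python
import numpy as np

def transfer_entropy(x, y, bins=4):
    """Plain histogram estimate of TE from y to x (one lagged sample), in nats."""
    x = np.asarray(x, float); y = np.asarray(y, float)

    def disc(v):  # equal-frequency discretization into `bins` symbols
        edges = np.quantile(v, np.linspace(0, 1, bins + 1)[1:-1])
        return np.digitize(v, edges)

    xd, yd = disc(x), disc(y)
    x_next, x_past, y_past = xd[1:], xd[:-1], yd[:-1]

    def prob(*cols):  # joint probability table from symbol columns
        idx = np.ravel_multi_index(cols, [bins] * len(cols))
        p = np.bincount(idx, minlength=bins ** len(cols)).astype(float)
        return (p / p.sum()).reshape([bins] * len(cols))

    p_xyz = prob(x_next, x_past, y_past)     # p(x_{t+1}, x_t, y_t)
    p_xz = p_xyz.sum(axis=0)                 # p(x_t, y_t)
    p_xy = p_xyz.sum(axis=2)                 # p(x_{t+1}, x_t)
    p_x = p_xyz.sum(axis=(0, 2))             # p(x_t)

    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                p = p_xyz[i, j, k]
                if p > 0:
                    # p(x+|x,y) / p(x+|x) rewritten with joint probabilities
                    te += p * np.log(p * p_x[j] / (p_xz[j, k] * p_xy[i, j]))
    return te

# Example: y drives x with a one-step delay, so TE(y -> x) should exceed TE(x -> y)
rng = np.random.default_rng(2)
y = rng.normal(size=5000)
x = np.zeros(5000)
x[1:] = 0.8 * y[:-1] + 0.2 * rng.normal(size=4999)
print("TE y->x:", transfer_entropy(x, y), "  TE x->y:", transfer_entropy(y, x))
```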

  1. Multivariate random regression analysis for body weight and main morphological traits in genetically improved farmed tilapia (Oreochromis niloticus).

    PubMed

    He, Jie; Zhao, Yunfeng; Zhao, Jingli; Gao, Jin; Han, Dandan; Xu, Pao; Yang, Runqing

    2017-11-02

    Because of their high economic importance, growth traits in fish are under continuous improvement. For growth traits that are recorded at multiple time-points in life, the use of univariate and multivariate animal models is limited because of the variable and irregular timing of these measures. Thus, the univariate random regression model (RRM) was introduced for the genetic analysis of dynamic growth traits in fish breeding. We used a multivariate random regression model (MRRM) to analyze genetic changes in growth traits recorded at multiple time-points in genetically-improved farmed tilapia. Legendre polynomials of different orders were applied to characterize the influences of fixed and random effects on growth trajectories. The final MRRM was determined by optimizing the univariate RRM for the analyzed traits separately via the adaptively penalized likelihood statistical criterion, which is superior to both the Akaike information criterion and the Bayesian information criterion. In the selected MRRM, the additive genetic effects were modeled by Legendre polynomials of order three for body weight (BWE) and body length (BL) and of order two for body depth (BD). By using the covariance functions of the MRRM, estimated heritabilities were between 0.086 and 0.628 for BWE, 0.155 and 0.556 for BL, and 0.056 and 0.607 for BD. Only heritabilities for BD measured from 60 to 140 days of age were consistently higher than those estimated by the univariate RRM. All genetic correlations between growth time-points exceeded 0.5 for either single or pairwise time-points, although correlations between early and late growth time-points were lower. Thus, for phenotypes that are measured repeatedly in aquaculture, an MRRM can enhance the efficiency of the comprehensive selection for BWE and the main morphological traits.
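
    The Legendre-polynomial covariates used in such random regression models are straightforward to build: ages are rescaled to [-1, 1] and evaluated in a Legendre basis of the chosen order. The sketch below is illustrative only (the 60-140 day range and the orders follow the abstract, but the scaling choices are assumptions); NumPy is assumed.

```python
import numpy as np
from numpy.polynomial import legendre

def legendre_basis(age, order, age_min=60.0, age_max=140.0):
    """Legendre covariate matrix for a random regression model.

    Ages are rescaled to [-1, 1]; column k holds the k-th Legendre polynomial,
    as commonly used to model additive genetic effects along a growth trajectory.
    """
    t = 2.0 * (np.asarray(age, float) - age_min) / (age_max - age_min) - 1.0
    return legendre.legvander(t, order)   # shape (n_records, order + 1)

ages = np.array([60, 80, 100, 120, 140])
Phi = legendre_basis(ages, order=3)       # e.g. order 3 for body weight, order 2 for body depth
print(Phi.round(3))
```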

  2. Kinetics of Thermal Decomposition of Ammonium Perchlorate by TG/DSC-MS-FTIR

    NASA Astrophysics Data System (ADS)

    Zhu, Yan-Li; Huang, Hao; Ren, Hui; Jiao, Qing-Jie

    2014-01-01

    The method of thermogravimetry/differential scanning calorimetry-mass spectrometry-Fourier transform infrared (TG/DSC-MS-FTIR) simultaneous analysis has been used to study thermal decomposition of ammonium perchlorate (AP). The processing of nonisothermal data at various heating rates was performed using NETZSCH Thermokinetics. The MS-FTIR spectra showed that N2O and NO2 were the main gaseous products of the thermal decomposition of AP, and there was a competition between the formation reaction of N2O and that of NO2 during the process with an iso-concentration point of N2O and NO2. The dependence of the activation energy calculated by Friedman's iso-conversional method on the degree of conversion indicated that the AP decomposition process can be divided into three stages, which are autocatalytic, low-temperature diffusion and high-temperature, stable-phase reaction. The corresponding kinetic parameters were determined by multivariate nonlinear regression and the mechanism of the AP decomposition process was proposed.
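
    The Friedman iso-conversional calculation mentioned above reduces, at each conversion level, to a linear regression of ln(dα/dt) against 1/T across heating rates, with activation energy Ea = -slope × R. A minimal sketch with made-up numbers (not the AP data of the study), assuming NumPy:

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def friedman_activation_energy(T_at_alpha, rate_at_alpha):
    """Friedman iso-conversional estimate of Ea at one conversion level.

    T_at_alpha    : temperatures (K) at which conversion alpha is reached, one per heating rate
    rate_at_alpha : corresponding conversion rates d(alpha)/dt (1/s)
    """
    slope, _ = np.polyfit(1.0 / np.asarray(T_at_alpha), np.log(rate_at_alpha), 1)
    return -slope * R   # J/mol

# Illustrative (made-up) data for a single conversion level at three heating rates
T = [560.0, 572.0, 583.0]          # K
r = [1.2e-3, 2.4e-3, 4.5e-3]       # 1/s
print(round(friedman_activation_energy(T, r) / 1000.0, 1), "kJ/mol")
```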

  3. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. The aim was to propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.
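
    A minimal sketch of the discriminant step: a PLS regression on a 0/1 class label (PLS-DA) separating normal from deviating batches, with the first-component weights used to flag the fingerprint variables that drive the separation. The data are simulated placeholders and the probabilistic variant used in the paper is not reproduced; scikit-learn is assumed.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
# Rows: batches, columns: fingerprint peak areas (illustrative data, not the paper's)
X_normal = rng.normal(0.0, 1.0, size=(20, 50))
X_fault = rng.normal(0.0, 1.0, size=(8, 50)); X_fault[:, 5] += 3.0   # one deviating peak

X = np.vstack([X_normal, X_fault])
y = np.array([0] * 20 + [1] * 8)                 # 0 = normal operation, 1 = process deviation

plsda = PLSRegression(n_components=3).fit(X, y)  # PLS-DA: regress the class label on fingerprints
scores = plsda.predict(X).ravel()
print("misclassified:", int(np.sum((scores > 0.5) != (y == 1))))

# Variables with the largest absolute weight on the first component point to the
# fingerprint regions most responsible for the deviation (root-cause candidates).
top = np.argsort(np.abs(plsda.x_weights_[:, 0]))[-5:]
print("most discriminating peak indices:", top)
```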

  4. Towards better process understanding: chemometrics and multivariate measurements in manufacturing of solid dosage forms.

    PubMed

    Matero, Sanni; van Den Berg, Frans; Poutiainen, Sami; Rantanen, Jukka; Pajander, Jari

    2013-05-01

    The manufacturing of tablets involves many unit operations that possess multivariate and complex characteristics. The interactions between the material characteristics and process-related variation are presently not comprehensively analyzed, owing to univariate detection methods. As a consequence, current best practice to control a typical process is to not allow process-related factors to vary, i.e., to lock the production parameters. The problem related to the lack of sufficient process understanding remains: the variation within process and material properties is an intrinsic feature and cannot be compensated for with constant process parameters. Instead, a more comprehensive approach based on the use of multivariate tools for investigating processes should be applied. In the pharmaceutical field these methods are referred to as Process Analytical Technology (PAT) tools, which aim to achieve a thorough understanding of and control over the production process. PAT provides the framework for measurement as well as for data analysis and control, leading to in-depth understanding and to more consistent and safer drug products with fewer batch rejections. In the optimal situation, by applying these techniques, destructive end-product testing could be avoided. In this paper the most prominent multivariate data analysis tools within tablet manufacturing and basic research on its unit operations are reviewed. Copyright © 2013 Wiley Periodicals, Inc.

  5. Nonlinear Performance Seeking Control using Fuzzy Model Reference Learning Control and the Method of Steepest Descent

    NASA Technical Reports Server (NTRS)

    Kopasakis, George

    1997-01-01

    Performance Seeking Control (PSC) attempts to find and control the process at the operating condition that will generate maximum performance. In this paper a nonlinear multivariable PSC methodology will be developed, utilizing the Fuzzy Model Reference Learning Control (FMRLC) and the method of Steepest Descent or Gradient (SDG). This PSC control methodology employs the SDG method to find the operating condition that will generate maximum performance. This operating condition is in turn passed to the FMRLC controller as a set point for the control of the process. The conventional SDG algorithm is modified in this paper in order for convergence to occur monotonically. For the FMRLC control, the conventional fuzzy model reference learning control methodology is utilized, with guidelines generated here for effective tuning of the FMRLC controller.

  6. Multivariate meta-analysis: potential and promise.

    PubMed

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-09-10

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one day 'Multivariate meta-analysis' event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various view points and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice. Copyright © 2011 John Wiley & Sons, Ltd.

  7. Multivariate meta-analysis: Potential and promise

    PubMed Central

    Jackson, Dan; Riley, Richard; White, Ian R

    2011-01-01

    The multivariate random effects model is a generalization of the standard univariate model. Multivariate meta-analysis is becoming more commonly used and the techniques and related computer software, although continually under development, are now in place. In order to raise awareness of the multivariate methods, and discuss their advantages and disadvantages, we organized a one day ‘Multivariate meta-analysis’ event at the Royal Statistical Society. In addition to disseminating the most recent developments, we also received an abundance of comments, concerns, insights, critiques and encouragement. This article provides a balanced account of the day's discourse. By giving others the opportunity to respond to our assessment, we hope to ensure that the various view points and opinions are aired before multivariate meta-analysis simply becomes another widely used de facto method without any proper consideration of it by the medical statistics community. We describe the areas of application that multivariate meta-analysis has found, the methods available, the difficulties typically encountered and the arguments for and against the multivariate methods, using four representative but contrasting examples. We conclude that the multivariate methods can be useful, and in particular can provide estimates with better statistical properties, but also that these benefits come at the price of making more assumptions which do not result in better inference in every case. Although there is evidence that multivariate meta-analysis has considerable potential, it must be even more carefully applied than its univariate counterpart in practice. Copyright © 2011 John Wiley & Sons, Ltd. PMID:21268052

  8. Enhancing e-waste estimates: Improving data quality by multivariate Input–Output Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Feng, E-mail: fwang@unu.edu; Design for Sustainability Lab, Faculty of Industrial Design Engineering, Delft University of Technology, Landbergstraat 15, 2628CE Delft; Huisman, Jaco

    2013-11-15

    Highlights: • A multivariate Input–Output Analysis method for e-waste estimates is proposed. • Applying multivariate analysis to consolidate data can enhance e-waste estimates. • We examine the influence of model selection and data quality on e-waste estimates. • Datasets of all e-waste related variables in a Dutch case study have been provided. • Accurate modeling of time-variant lifespan distributions is critical for estimate. - Abstract: Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to lack of high quality data referred to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input–Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies.

  9. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    The aim was to establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a constituent medicinal material of Yiqi Fumai lyophilized injection, by combining near infrared spectroscopy with multivariate data analysis technology. The multivariate statistical process control (MSPC) model was established based on 5 normal batches in production, and 2 test batches were monitored by PC scores, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. The application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties in real time. This established process monitoring method could provide a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
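
    A minimal sketch of the monitoring idea: fit a PCA model on normal (reference) batches, then chart Hotelling T2 on the PC scores and the squared prediction error (the quantity underlying DModX) for new data against empirical control limits. The data are simulated placeholders, not NIR spectra from the study; scikit-learn is assumed.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
# Illustrative NIR-like data: rows = time points of normal batches, columns = wavelengths
X_ref = rng.normal(size=(200, 60))
X_new = rng.normal(size=(50, 60)); X_new[25:] += 0.8   # a drift to be detected

mu, sd = X_ref.mean(0), X_ref.std(0)
pca = PCA(n_components=5).fit((X_ref - mu) / sd)

def t2_and_spe(X):
    Z = (X - mu) / sd
    T = pca.transform(Z)                                     # PC scores
    t2 = np.sum(T ** 2 / pca.explained_variance_, axis=1)    # Hotelling T^2
    resid = Z - pca.inverse_transform(T)
    spe = np.sum(resid ** 2, axis=1)                         # squared prediction error
    return t2, spe

t2_ref, spe_ref = t2_and_spe(X_ref)
t2_new, spe_new = t2_and_spe(X_new)
# Simple empirical control limits from the reference batches (99th percentile)
print("T2 alarms:", int(np.sum(t2_new > np.quantile(t2_ref, 0.99))),
      "  SPE alarms:", int(np.sum(spe_new > np.quantile(spe_ref, 0.99))))
```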

  10. MEMD-enhanced multivariate fuzzy entropy for the evaluation of complexity in biomedical signals.

    PubMed

    Azami, Hamed; Smith, Keith; Escudero, Javier

    2016-08-01

    Multivariate multiscale entropy (mvMSE) has been proposed as a combination of the coarse-graining process and multivariate sample entropy (mvSE) to quantify the irregularity of multivariate signals. However, both the coarse-graining process and mvSE may not be reliable for short signals. Although the coarse-graining process can be replaced with multivariate empirical mode decomposition (MEMD), the relative instability of mvSE for short signals remains a problem. Here, we address this issue by proposing the multivariate fuzzy entropy (mvFE) with a new fuzzy membership function. The results using white Gaussian noise show that the mvFE leads to more reliable and stable results, especially for short signals, in comparison with mvSE. Accordingly, we propose MEMD-enhanced mvFE to quantify the complexity of signals. The characteristics of brain regions influenced by partial epilepsy are investigated by focal and non-focal electroencephalogram (EEG) time series. In this sense, the proposed MEMD-enhanced mvFE and mvSE are employed to discriminate focal EEG signals from non-focal ones. The results demonstrate the MEMD-enhanced mvFE values have a smaller coefficient of variation in comparison with those obtained by the MEMD-enhanced mvSE, even for long signals. The results also show that the MEMD-enhanced mvFE has better performance to quantify focal and non-focal signals compared with multivariate multiscale permutation entropy.
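
    For reference, the coarse-graining step that mvMSE applies before computing mvSE or mvFE is simply a per-channel block average at each scale. A small sketch, illustrative only and assuming NumPy:

```python
import numpy as np

def coarse_grain(X, scale):
    """Coarse-graining step of multivariate multiscale entropy.

    X     : array of shape (n_channels, n_samples)
    scale : number of consecutive samples averaged per channel
    Returns an array of shape (n_channels, n_samples // scale).
    """
    X = np.asarray(X, float)
    n = (X.shape[1] // scale) * scale
    return X[:, :n].reshape(X.shape[0], -1, scale).mean(axis=2)

eeg = np.random.default_rng(5).normal(size=(2, 1000))   # two illustrative EEG channels
print(coarse_grain(eeg, 3).shape)                        # (2, 333)
```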

  11. Prediction of processing tomato peeling outcomes

    USDA-ARS?s Scientific Manuscript database

    Peeling outcomes of processing tomatoes were predicted using multivariate analysis of Magnetic Resonance (MR) images. Tomatoes were obtained from a whole-peel production line. Each fruit was imaged using a 7 Tesla MR system, and a multivariate data set was created from 28 different images. After ...

  12. Adjustment of automatic control systems of production facilities at coal processing plants using multivariant physico- mathematical models

    NASA Astrophysics Data System (ADS)

    Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.

    2016-10-01

    The structure of multi-variant physical and mathematical models of a control system is presented, together with its application to the adjustment of automatic control systems (ACS) of production facilities, using a coal processing plant as an example.

  13. A Hierarchical Multivariate Bayesian Approach to Ensemble Model output Statistics in Atmospheric Prediction

    DTIC Science & Technology

    2017-09-01

    ...efficacy of statistical post-processing methods downstream of these dynamical model components with a hierarchical multivariate Bayesian approach to... Keywords: Bayesian hierarchical modeling, Markov chain Monte Carlo methods, Metropolis algorithm, machine learning, atmospheric prediction. ...scale processes. However, this dissertation explores the efficacy of statistical post-processing methods downstream of these dynamical model components.

  14. Allogeneic transplantation provides durable remission in a subset of DLBCL patients relapsing after autologous transplantation.

    PubMed

    Fenske, Timothy S; Ahn, Kwang W; Graff, Tara M; DiGilio, Alyssa; Bashir, Qaiser; Kamble, Rammurti T; Ayala, Ernesto; Bacher, Ulrike; Brammer, Jonathan E; Cairo, Mitchell; Chen, Andy; Chen, Yi-Bin; Chhabra, Saurabh; D'Souza, Anita; Farooq, Umar; Freytes, Cesar; Ganguly, Siddhartha; Hertzberg, Mark; Inwards, David; Jaglowski, Samantha; Kharfan-Dabaja, Mohamed A; Lazarus, Hillard M; Nathan, Sunita; Pawarode, Attaphol; Perales, Miguel-Angel; Reddy, Nishitha; Seo, Sachiko; Sureda, Anna; Smith, Sonali M; Hamadani, Mehdi

    2016-07-01

    For diffuse large B-cell lymphoma (DLBCL) patients progressing after autologous haematopoietic cell transplantation (autoHCT), allogeneic HCT (alloHCT) is often considered, although limited information is available to guide patient selection. Using the Center for International Blood and Marrow Transplant Research (CIBMTR) database, we identified 503 patients who underwent alloHCT after disease progression/relapse following a prior autoHCT. The 3-year probabilities of non-relapse mortality, progression/relapse, progression-free survival (PFS) and overall survival (OS) were 30, 38, 31 and 37% respectively. Factors associated with inferior PFS on multivariate analysis included Karnofsky performance status (KPS) <80, chemoresistance, autoHCT to alloHCT interval <1-year and myeloablative conditioning. Factors associated with worse OS on multivariate analysis included KPS<80, chemoresistance and myeloablative conditioning. Three adverse prognostic factors were used to construct a prognostic model for PFS, including KPS<80 (4 points), autoHCT to alloHCT interval <1-year (2 points) and chemoresistant disease at alloHCT (5 points). This CIBMTR prognostic model classified patients into four groups: low-risk (0 points), intermediate-risk (2-5 points), high-risk (6-9 points) or very high-risk (11 points), predicting 3-year PFS of 40, 32, 11 and 6%, respectively, with 3-year OS probabilities of 43, 39, 19 and 11% respectively. In conclusion, the CIBMTR prognostic model identifies a subgroup of DLBCL patients experiencing long-term survival with alloHCT after a failed prior autoHCT. © 2016 John Wiley & Sons Ltd.
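
    The point-based model above translates directly into a small scoring helper. This is only a transcription of the groupings quoted in the abstract (scores 0, 2-5, 6-9, 11 and the corresponding 3-year PFS estimates), written for illustration; it is not a clinical tool.

```python
def cibmtr_risk_group(kps_below_80: bool, interval_below_1yr: bool, chemoresistant: bool):
    """Point-based prognostic model for PFS described in the abstract (inputs are booleans)."""
    score = 4 * kps_below_80 + 2 * interval_below_1yr + 5 * chemoresistant
    if score == 0:
        return score, "low risk (3-year PFS ~40%)"
    if score <= 5:
        return score, "intermediate risk (~32%)"
    if score <= 9:
        return score, "high risk (~11%)"
    return score, "very high risk (~6%)"

print(cibmtr_risk_group(True, False, True))   # -> (9, 'high risk (~11%)')
```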

  15. Estimating the decomposition of predictive information in multivariate systems

    NASA Astrophysics Data System (ADS)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

    In the study of complex systems from observed multivariate time series, insight into the evolution of one system may be under investigation, which can be explained by the information storage of the system and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over the traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.

  16. Weekly Online Quizzes to a Mathematics Course for Engineering Students

    ERIC Educational Resources Information Center

    Gaspar Martins, Sandra

    2017-01-01

    A set of weekly optional online quizzes was used with 104 students on a Multivariable Calculus course (MC), via the Moodle online system. These quizzes contributed a maximum of two extra points, and this was awarded if the student scored more than 9 points (out of 20) on the exam. All the students got the same questions and could resubmit the…

  17. [Effect of angiotensin II depot administration on bioelectric functional processes of the central nervous system].

    PubMed

    Martin, G; Baumann, H; Grieger, F

    1976-01-01

    Using the average evoked potential technique, angiotensin-II depot effects (1 mg implant = 3-4 mg/kg body weight angiotensin-II) were studied neuroelectrophysiologically in reticular, hippocampal and neocortical structures of albino rats. A multivariate variance and discriminant analysis program revealed differentiated changes in the bioelectrical processing data of the CNS. Evidence was obtained for a varying structural sensitivity of central-nervous substructures under depot administration of angiotensin-II. In later phases of angiotensin-II action, the hippocampus was characterized by an electrographic synchronization phenomenon with high-amplitude average evoked potentials. The reticular formation, and to a lesser extent the visual cortex, showed an angiotensin-induced diminution of bioelectrical excitation. However, the intensity of the change in functional CNS patterns did not always correlate with maximal blood pressure rises. The described changes in afference processing of standardized sensory stimuli, especially in hippocampal and reticular structures of the CNS following angiotensin depot action, point to a central-nervous action mechanism of angiotensin-II.

  18. Methodology for the optimal design of an integrated first and second generation ethanol production plant combined with power cogeneration.

    PubMed

    Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François

    2016-08-01

    The application of methodologies for the optimal design of integrated processes has seen increased interest in the literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology comprises process simulation, heat integration, thermo-economic evaluation, multi-variable evolutionary optimization of exergy efficiency versus capital costs, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from 210 M$ to 390 M$. The Net Present Value was positive for only two scenarios, both at low-efficiency, low-hydrolysis points. The minimum cellulosic ethanol selling price was sought to obtain a maximum NPV of zero for high-efficiency, high-hydrolysis alternatives. The obtained optimal configuration presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Particle Filter-Based Recursive Data Fusion With Sensor Indexing for Large Core Neutron Flux Estimation

    NASA Astrophysics Data System (ADS)

    Tamboli, Prakash Kumar; Duttagupta, Siddhartha P.; Roy, Kallol

    2017-06-01

    We introduce a sequential importance sampling particle filter (PF)-based multisensor multivariate nonlinear estimator for estimating the in-core neutron flux distribution of a pressurized heavy water reactor core. Many critical applications such as reactor protection and control rely upon neutron flux information, and thus their reliability is of utmost importance. The point kinetic model based on neutron transport conveniently explains the dynamics of a nuclear reactor. The neutron flux in a large, loosely coupled core is sensed by multiple sensors measuring point fluxes at various locations inside the reactor. The flux values are coupled to each other through the diffusion equation, and this coupling provides redundancy in the information. It is shown that multiple independent data about the localized flux can be fused together to greatly enhance the estimation accuracy. We also propose a sensor anomaly handling feature in the multisensor PF to maintain the estimation process even when a sensor is faulty or generates anomalous data.
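
    The reactor model itself is beyond a short example, but the sequential importance sampling and resampling mechanics can be sketched for a generic scalar state observed by several redundant noisy point sensors (the fusion step simply multiplies the per-sensor likelihoods into the weights). Everything here (dynamics, noise levels, particle count) is an assumption for illustration; NumPy is assumed.

```python
import numpy as np

rng = np.random.default_rng(6)
T, N = 100, 1000                  # time steps, particles
q, r = 0.05, 0.2                  # process and sensor noise standard deviations
n_sensors = 3

# Simulate a slowly varying "flux" level and three redundant noisy point sensors
x_true = np.cumsum(rng.normal(0, q, T)) + 1.0
obs = x_true[:, None] + rng.normal(0, r, (T, n_sensors))

particles = rng.normal(1.0, 0.5, N)
weights = np.full(N, 1.0 / N)
estimates = []

for t in range(T):
    # Propagate particles through the (assumed) random-walk dynamics
    particles = particles + rng.normal(0, q, N)
    # Update weights with the joint likelihood of all sensors (data fusion)
    for s in range(n_sensors):
        weights *= np.exp(-0.5 * ((obs[t, s] - particles) / r) ** 2)
    weights /= weights.sum()
    estimates.append(np.sum(weights * particles))
    # Multinomial resampling when the effective sample size drops below N/2
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)

print("RMSE:", np.sqrt(np.mean((np.array(estimates) - x_true) ** 2)))
```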

  20. Analytical Models of Cross-Layer Protocol Optimization in Real-Time Wireless Sensor Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    The real-time interactions among the nodes of a wireless sensor network (WSN) to cooperatively process data from multiple sensors are modeled. Quality-of-service (QoS) metrics are associated with the quality of fused information: throughput, delay, packet error rate, etc. Multivariate point process (MVPP) models of discrete random events in WSNs establish stochastic characteristics of optimal cross-layer protocols. Discrete-event, cross-layer interactions in mobile ad hoc network (MANET) protocols have been modeled using a set of concatenated design parameters and associated resource levels by the MVPPs. Characterization of the "best" cross-layer designs for a MANET is formulated by applying the general theory of martingale representations to controlled MVPPs. Performance is described in terms of concatenated protocol parameters and controlled through conditional rates of the MVPPs. Modeling limitations to determination of closed-form solutions versus explicit iterative solutions for ad hoc WSN controls are examined.

  1. The efficacy and cost of alternative strategies for systematic screening for type 2 diabetes in the U.S. population 45-74 years of age.

    PubMed

    Johnson, Susan L; Tabaei, Bahman P; Herman, William H

    2005-02-01

    To simulate the outcomes of alternative strategies for screening the U.S. population 45-74 years of age for type 2 diabetes. We simulated screening with random plasma glucose (RPG) and cut points of 100, 130, and 160 mg/dl and a multivariate equation including RPG and other variables. Over 15 years, we simulated screening at intervals of 1, 3, and 5 years. All positive screening tests were followed by a diagnostic fasting plasma glucose or an oral glucose tolerance test. Outcomes include the numbers of false-negative, true-positive, and false-positive screening tests and the direct and indirect costs. At year 15, screening every 3 years with an RPG cut point of 100 mg/dl left 0.2 million false negatives, an RPG of 130 mg/dl or the equation left 1.3 million false negatives, and an RPG of 160 mg/dl left 2.8 million false negatives. Over 15 years, the absolute difference between the most sensitive and most specific screening strategy was 4.5 million true positives and 476 million false-positives. Strategies using RPG cut points of 130 mg/dl or the multivariate equation every 3 years identified 17.3 million true positives; however, the equation identified fewer false-positives. The total cost of the most sensitive screening strategy was $42.7 billion and that of the most specific strategy was $6.9 billion. Screening for type 2 diabetes every 3 years with an RPG cut point of 130 mg/dl or the multivariate equation provides good yield and minimizes false-positive screening tests and costs.

  2. Asymptotics of bivariate generating functions with algebraic singularities

    NASA Astrophysics Data System (ADS)

    Greenwood, Torin

    Flajolet and Odlyzko (1990) derived asymptotic formulae for the coefficients of a class of univariate generating functions with algebraic singularities. Gao and Richmond (1992) and Hwang (1996, 1998) extended these results to classes of multivariate generating functions, in both cases by reducing to the univariate case. Pemantle and Wilson (2013) outlined new multivariate analytic techniques and used them to analyze the coefficients of rational generating functions. After an overview of these methods, we use them to find asymptotic formulae for the coefficients of a broad class of bivariate generating functions with algebraic singularities. Beginning with the Cauchy integral formula, we explicitly deform the contour of integration so that it hugs a set of critical points. The asymptotic contribution to the integral comes from analyzing the integrand near these points, leading to explicit asymptotic formulae. Next, we use this formula to analyze an example from current research. In the following chapter, we apply multivariate analytic techniques to quantum walks. Bressler and Pemantle (2007) found a (d + 1)-dimensional rational generating function whose coefficients described the amplitude of a particle at a position in the integer lattice after n steps. Here, the minimal critical points form a curve on the (d + 1)-dimensional unit torus. We find asymptotic formulae for the amplitude of a particle in a given position, normalized by the number of steps n, as n approaches infinity. Each critical point contributes to the asymptotics for a specific normalized position. Using Groebner bases in Maple, we compute the explicit locations of peak amplitudes. In a scaling window of width on the order of the square root of n around the peaks, each amplitude is asymptotic to an Airy function.
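
    The starting point mentioned above is coefficient extraction via the (multivariate) Cauchy integral formula; a standard statement, in notation chosen here purely for illustration, is:

```latex
% For F(\mathbf{z}) = \sum_{\mathbf{r}} a_{\mathbf{r}} \mathbf{z}^{\mathbf{r}} analytic near the origin in d variables,
\[
  a_{\mathbf{r}}
  = \frac{1}{(2\pi i)^{d}} \oint \cdots \oint_{\mathcal{T}}
    \frac{F(\mathbf{z})}{z_1^{\,r_1+1} \cdots z_d^{\,r_d+1}} \, dz_1 \cdots dz_d ,
\]
% where \mathcal{T} is a polytorus inside the domain of analyticity; the analysis described
% above deforms \mathcal{T} so that it passes close to the critical points of the singular variety.
```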

  3. Enhancing e-waste estimates: improving data quality by multivariate Input-Output Analysis.

    PubMed

    Wang, Feng; Huisman, Jaco; Stevels, Ab; Baldé, Cornelis Peter

    2013-11-01

    Waste electrical and electronic equipment (or e-waste) is one of the fastest growing waste streams, which encompasses a wide and increasing spectrum of products. Accurate estimation of e-waste generation is difficult, mainly due to lack of high quality data referred to market and socio-economic dynamics. This paper addresses how to enhance e-waste estimates by providing techniques to increase data quality. An advanced, flexible and multivariate Input-Output Analysis (IOA) method is proposed. It links all three pillars in IOA (product sales, stock and lifespan profiles) to construct mathematical relationships between various data points. By applying this method, the data consolidation steps can generate more accurate time-series datasets from available data pool. This can consequently increase the reliability of e-waste estimates compared to the approach without data processing. A case study in the Netherlands is used to apply the advanced IOA model. As a result, for the first time ever, complete datasets of all three variables for estimating all types of e-waste have been obtained. The result of this study also demonstrates significant disparity between various estimation models, arising from the use of data under different conditions. It shows the importance of applying multivariate approach and multiple sources to improve data quality for modelling, specifically using appropriate time-varying lifespan parameters. Following the case study, a roadmap with a procedural guideline is provided to enhance e-waste estimation studies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Static and dynamic factors in an information-based multi-asset artificial stock market

    NASA Astrophysics Data System (ADS)

    Ponta, Linda; Pastore, Stefano; Cincotti, Silvano

    2018-02-01

    An information-based multi-asset artificial stock market characterized by different types of stocks and populated by heterogeneous agents is presented. In the market, agents trade risky assets in exchange for cash. Besides the amount of cash and of stocks owned, each agent is characterized by sentiments, and agents share their sentiments by means of interactions that are determined by sparsely connected networks. A central market maker (clearing house mechanism) determines the price process for each stock at the intersection of the demand and the supply curves. Single stock price processes exhibit volatility clustering and fat-tailed distributions of returns, whereas the multivariate price process exhibits both static and dynamic stylized facts, i.e., the presence of static factors and common trends. Static factors are studied with reference to the cross-correlation of returns of different stocks. The common trends are investigated considering the variance-covariance matrix of prices. Results point out that the probability distribution of eigenvalues of the cross-correlation matrix of returns shows the presence of sectors, similar to those observed in real empirical data. As regards the dynamic factors, the variance-covariance matrix of prices points to a limited number of asset price series that are independent integrated processes, in close agreement with the empirical evidence from asset price time series of real stock markets. These results underline the crucial dependence of the statistical properties of a multi-asset stock market on the agents' interaction structure.
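
    The eigenvalue analysis referred to above can be reproduced in a few lines: build the cross-correlation matrix of the return series and inspect its spectrum (a dominant eigenvalue signals a common factor; clusters of large eigenvalues signal sectors). The returns below are simulated with a single common factor purely for illustration; NumPy is assumed.

```python
import numpy as np

rng = np.random.default_rng(7)
n_assets, n_steps = 50, 2000

# Illustrative returns with one common "market" factor plus idiosyncratic noise
market = rng.normal(size=n_steps)
returns = 0.3 * market[:, None] + rng.normal(size=(n_steps, n_assets))

# Cross-correlation matrix of returns and its eigenvalue spectrum
C = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(C)
print("largest eigenvalues:", np.sort(eigvals)[-3:])   # the top one reflects the common factor
```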

  5. Dynamic connectivity regression: Determining state-related changes in brain connectivity

    PubMed Central

    Cribben, Ivor; Haraldsdottir, Ragnheidur; Atlas, Lauren Y.; Wager, Tor D.; Lindquist, Martin A.

    2014-01-01

    Most statistical analyses of fMRI data assume that the nature, timing and duration of the psychological processes being studied are known. However, often it is hard to specify this information a priori. In this work we introduce a data-driven technique for partitioning the experimental time course into distinct temporal intervals with different multivariate functional connectivity patterns between a set of regions of interest (ROIs). The technique, called Dynamic Connectivity Regression (DCR), detects temporal change points in functional connectivity and estimates a graph, or set of relationships between ROIs, for data in the temporal partition that falls between pairs of change points. Hence, DCR allows for estimation of both the time of change in connectivity and the connectivity graph for each partition, without requiring prior knowledge of the nature of the experimental design. Permutation and bootstrapping methods are used to perform inference on the change points. The method is applied to various simulated data sets as well as to an fMRI data set from a study (N=26) of a state anxiety induction using a socially evaluative threat challenge. The results illustrate the method’s ability to observe how the networks between different brain regions changed with subjects’ emotional state. PMID:22484408

  6. Patient attitudes toward using computers to improve health services delivery.

    PubMed

    Sciamanna, Christopher N; Diaz, Joseph; Myne, Puja

    2002-09-11

    The aim of this study was to examine the acceptability of point of care computerized prompts to improve health services delivery among a sample of primary care patients. Primary data collection. Cross-sectional survey. Patients were surveyed after their visit with a primary care provider. Data were obtained from patients of ten community-based primary care practices in the spring of 2001. Almost all patients reported that they would support using a computer before each visit to prompt their doctor to: "do health screening tests" (92%), "counsel about health behaviors (like diet and exercise)" (92%) and "change treatments for health conditions" (86%). In multivariate testing, the only variable that was associated with acceptability of the point of care computerized prompts was patient's confidence in their ability to answer questions about their health using a computer (beta = 0.39, p =.001). Concerns about data security were expressed by 36.3% of subjects, but were not related to acceptability of the prompts. Support for using computers to generate point of care prompts to improve quality-oriented processes of care was high in our sample, but may be contingent on patients feeling familiar with their personal medical history.

  7. Combining Frequency Doubling Technology Perimetry and Scanning Laser Polarimetry for Glaucoma Detection.

    PubMed

    Mwanza, Jean-Claude; Warren, Joshua L; Hochberg, Jessica T; Budenz, Donald L; Chang, Robert T; Ramulu, Pradeep Y

    2015-01-01

    To determine the ability of frequency doubling technology (FDT) and scanning laser polarimetry with variable corneal compensation (GDx-VCC) to detect glaucoma when used individually and in combination. One hundred ten normal and 114 glaucomatous subjects were tested with FDT C-20-5 screening protocol and the GDx-VCC. The discriminating ability was tested for each device individually and for both devices combined using GDx-NFI, GDx-TSNIT, number of missed points of FDT, and normal or abnormal FDT. Measures of discrimination included sensitivity, specificity, area under the curve (AUC), Akaike's information criterion (AIC), and prediction confidence interval lengths. For detecting glaucoma regardless of severity, the multivariable model resulting from the combination of GDx-TSNIT, number of abnormal points on FDT (NAP-FDT), and the interaction GDx-TSNIT×NAP-FDT (AIC: 88.28, AUC: 0.959, sensitivity: 94.6%, specificity: 89.5%) outperformed the best single-variable model provided by GDx-NFI (AIC: 120.88, AUC: 0.914, sensitivity: 87.8%, specificity: 84.2%). The multivariable model combining GDx-TSNIT, NAP-FDT, and interaction GDx-TSNIT×NAP-FDT consistently provided better discriminating abilities for detecting early, moderate, and severe glaucoma than the best single-variable models. The multivariable model including GDx-TSNIT, NAP-FDT, and the interaction GDx-TSNIT×NAP-FDT provides the best glaucoma prediction compared with all other multivariable and univariable models. Combining the FDT C-20-5 screening protocol and GDx-VCC improves glaucoma detection compared with using GDx or FDT alone.
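
    A minimal sketch of the kind of multivariable model described above: a logistic regression of disease status on a structural measure, a functional count, and their interaction, scored by AUC. The variable names and simulated data are placeholders standing in for GDx-TSNIT and the number of abnormal FDT points; statsmodels, pandas and scikit-learn are assumed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)
n = 224   # roughly the study size (110 normal + 114 glaucoma); data here are simulated
df = pd.DataFrame({
    "tsnit": rng.normal(55, 8, n),        # stand-in for GDx-TSNIT
    "nap_fdt": rng.integers(0, 10, n),    # stand-in for number of abnormal FDT points
})
logit_p = (-3 + 0.08 * (60 - df.tsnit) + 0.5 * df.nap_fdt
           + 0.02 * (60 - df.tsnit) * df.nap_fdt)
df["glaucoma"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Multivariable model with the structure x function interaction, as in the abstract
model = smf.logit("glaucoma ~ tsnit * nap_fdt", data=df).fit(disp=False)
pred = model.predict(df)
print(model.params.round(3))
print("AUC:", round(roc_auc_score(df.glaucoma, pred), 3))
```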

  8. Regression analysis for LED color detection of visual-MIMO system

    NASA Astrophysics Data System (ADS)

    Banik, Partha Pratim; Saha, Rappy; Kim, Ki-Doo

    2018-04-01

    Color detection from a light emitting diode (LED) array using a smartphone camera is very difficult in a visual multiple-input multiple-output (visual-MIMO) system. In this paper, we propose a method to determine the LED color using a smartphone camera by applying regression analysis. We employ a multivariate regression model to identify the LED color. After taking a picture of an LED array, we select the LED array region and detect the LEDs using an image processing algorithm. We then apply the k-means clustering algorithm to determine the number of potential colors for feature extraction of each LED. Finally, we apply the multivariate regression model to predict the color of the transmitted LEDs. In this paper, we show our results for three types of environmental light conditions: room environmental light, low environmental light (560 lux), and strong environmental light (2450 lux). We compare the results of our proposed algorithm through the analysis of training and test R-Square (%) values and the percentage of closeness between transmitted and predicted colors, and we also discuss the number of distorted test data points based on the distortion bar graph in the CIE 1931 color space.
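
    A rough sketch of such a pipeline: cluster the captured colors with k-means for feature extraction, then fit a multivariate (multi-output) regression that maps camera-side features back to the transmitted LED color. The data, the cluster count, and the use of cluster distances as features are assumptions for illustration, not the paper's exact procedure; scikit-learn is assumed.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(12)
# Illustrative data: camera-side RGB values per detected LED vs. the transmitted RGB
transmitted = rng.uniform(0, 255, size=(300, 3))
captured = 0.7 * transmitted + rng.normal(0, 10, size=(300, 3)) + 20   # distorted by optics/light

# Step 1 (stand-in for the k-means feature extraction): cluster the captured colors
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(captured)
features = np.hstack([captured, kmeans.transform(captured)])   # raw pixels + cluster distances

# Step 2: multivariate (multi-output) regression predicting the transmitted LED color
reg = LinearRegression().fit(features, transmitted)
pred = reg.predict(features)
print("mean absolute error per channel:", np.abs(pred - transmitted).mean(axis=0).round(2))
```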

  9. Retro-regression--another important multivariate regression improvement.

    PubMed

    Randić, M

    2001-01-01

    We review the serious problem associated with instabilities of the coefficients of regression equations, referred to as the MRA (multivariate regression analysis) "nightmare of the first kind". This is manifested when in a stepwise regression a descriptor is included or excluded from a regression. The consequence is an unpredictable change of the coefficients of the descriptors that remain in the regression equation. We follow with consideration of an even more serious problem, referred to as the MRA "nightmare of the second kind", arising when optimal descriptors are selected from a large pool of descriptors. This process typically causes at different steps of the stepwise regression a replacement of several previously used descriptors by new ones. We describe a procedure that resolves these difficulties. The approach is illustrated on boiling points of nonanes which are considered (1) by using an ordered connectivity basis; (2) by using an ordering resulting from application of greedy algorithm; and (3) by using an ordering derived from an exhaustive search for optimal descriptors. A novel variant of multiple regression analysis, called retro-regression (RR), is outlined showing how it resolves the ambiguities associated with both "nightmares" of the first and the second kind of MRA.

  10. Evolution of the Max and Mlx networks in animals.

    PubMed

    McFerrin, Lisa G; Atchley, William R

    2011-01-01

    Transcription factors (TFs) are essential for the regulation of gene expression and often form emergent complexes to perform vital roles in cellular processes. In this paper, we focus on the parallel Max and Mlx networks of TFs because of their critical involvement in cell cycle regulation, proliferation, growth, metabolism, and apoptosis. A basic-helix-loop-helix-zipper (bHLHZ) domain mediates the competitive protein dimerization and DNA binding among Max and Mlx network members to form a complex system of cell regulation. To understand the importance of these network interactions, we identified the bHLHZ domain of Max and Mlx network proteins across the animal kingdom and carried out several multivariate statistical analyses. The presence and conservation of Max and Mlx network proteins in animal lineages stemming from the divergence of Metazoa indicate that these networks have ancient and essential functions. Phylogenetic analysis of the bHLHZ domain identified clear relationships among protein families with distinct points of radiation and divergence. Multivariate discriminant analysis further isolated specific amino acid changes within the bHLHZ domain that classify proteins, families, and network configurations. These analyses on Max and Mlx network members provide a model for characterizing the evolution of TFs involved in essential networks.

  11. A new subgrid-scale representation of hydrometeor fields using a multivariate PDF

    DOE PAGES

    Griffin, Brian M.; Larson, Vincent E.

    2016-06-03

    The subgrid-scale representation of hydrometeor fields is important for calculating microphysical process rates. In order to represent subgrid-scale variability, the Cloud Layers Unified By Binormals (CLUBB) parameterization uses a multivariate probability density function (PDF). In addition to vertical velocity, temperature, and moisture fields, the PDF includes hydrometeor fields. Previously, hydrometeor fields were assumed to follow a multivariate single lognormal distribution. Now, in order to better represent the distribution of hydrometeors, two new multivariate PDFs are formulated and introduced. The new PDFs represent hydrometeors using either a delta-lognormal or a delta-double-lognormal shape. The two new PDF distributions, plus the previous single lognormal shape, are compared to histograms of data taken from large-eddy simulations (LESs) of a precipitating cumulus case, a drizzling stratocumulus case, and a deep convective case. In conclusion, the warm microphysical process rates produced by the different hydrometeor PDFs are compared to the same process rates produced by the LES.

  12. Comparison of connectivity analyses for resting state EEG data

    NASA Astrophysics Data System (ADS)

    Olejarczyk, Elzbieta; Marzetti, Laura; Pizzella, Vittorio; Zappasodi, Filippo

    2017-06-01

    Objective. In the present work, a nonlinear measure (transfer entropy, TE) was used in a multivariate approach for the analysis of effective connectivity in high density resting state EEG data in eyes open and eyes closed. Advantages of the multivariate approach in comparison to the bivariate one were tested. Moreover, the multivariate TE was compared to an effective linear measure, i.e. directed transfer function (DTF). Finally, the existence of a relationship between the information transfer and the level of brain synchronization as measured by phase synchronization value (PLV) was investigated. Approach. The comparison between the connectivity measures, i.e. bivariate versus multivariate TE, TE versus DTF, TE versus PLV, was performed by means of statistical analysis of indexes based on graph theory. Main results. The multivariate approach is less sensitive to false indirect connections with respect to the bivariate estimates. The multivariate TE differentiated better between eyes closed and eyes open conditions compared to DTF. Moreover, the multivariate TE evidenced non-linear phenomena in information transfer, which are not evidenced by the use of DTF. We also showed that the target of information flow, in particular the frontal region, is an area of greater brain synchronization. Significance. Comparison of different connectivity analysis methods pointed to the advantages of nonlinear methods, and indicated a relationship existing between the flow of information and the level of synchronization of the brain.

  13. Multivariate Generalizations of Student's t-Distribution. ONR Technical Report. [Biometric Lab Report No. 90-3.

    ERIC Educational Resources Information Center

    Gibbons, Robert D.; And Others

    In the process of developing a conditionally-dependent item response theory (IRT) model, the problem arose of modeling an underlying multivariate normal (MVN) response process with general correlation among the items. Without the assumption of conditional independence, for which the underlying MVN cdf takes on comparatively simple forms and can be…

  14. A model for incomplete longitudinal multivariate ordinal data.

    PubMed

    Liu, Li C

    2008-12-30

    In studies where multiple outcome items are repeatedly measured over time, missing data often occur. A longitudinal item response theory model is proposed for the analysis of multivariate ordinal outcomes that are repeatedly measured. Under the MAR assumption, this model accommodates missing data at any level (missing item at any time point and/or missing time point). It allows for multiple random subject effects and the estimation of item discrimination parameters for the multiple outcome items. The covariates in the model can be at any level. Assuming either a probit or logistic response function, maximum marginal likelihood estimation is described utilizing multidimensional Gauss-Hermite quadrature for integration of the random effects. An iterative Fisher-scoring solution, which provides standard errors for all model parameters, is used. A data set from a longitudinal prevention study is used to motivate the application of the proposed model. In this study, multiple ordinal items of health behavior are repeatedly measured over time. Because of a planned missing design, subjects answered only two-thirds of all items at a given time point. Copyright 2008 John Wiley & Sons, Ltd.
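
    To make the quadrature step concrete, the one-dimensional case looks as follows: the marginal probability of a response under a standard-normal random subject effect is approximated by Gauss-Hermite nodes after a change of variables. The probit item parameters are illustrative and the model here is far simpler than the multidimensional case of the paper; NumPy and SciPy are assumed.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy import stats

# Marginal probability of a positive response when the model has a standard-normal
# random subject effect theta:  P(y = 1) = E_theta[ Phi(a + b * theta) ].
a, b = 0.3, 0.8                       # illustrative item intercept and loading
nodes, weights = hermgauss(21)        # Gauss-Hermite rule for integrals against exp(-x^2)

# Change of variables theta = sqrt(2) * x so the rule integrates against N(0, 1)
theta = np.sqrt(2.0) * nodes
marginal = np.sum(weights / np.sqrt(np.pi) * stats.norm.cdf(a + b * theta))

# Monte Carlo check of the quadrature result
mc = stats.norm.cdf(a + b * np.random.default_rng(9).normal(size=200_000)).mean()
print(round(marginal, 4), round(mc, 4))
```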

  15. Stochastic modeling of neurobiological time series: Power, coherence, Granger causality, and separation of evoked responses from ongoing activity

    NASA Astrophysics Data System (ADS)

    Chen, Yonghong; Bressler, Steven L.; Knuth, Kevin H.; Truccolo, Wilson A.; Ding, Mingzhou

    2006-06-01

    In this article we consider the stochastic modeling of neurobiological time series from cognitive experiments. Our starting point is the variable-signal-plus-ongoing-activity model. From this model a differentially variable component analysis strategy is developed from a Bayesian perspective to estimate event-related signals on a single trial basis. After subtracting out the event-related signal from recorded single trial time series, the residual ongoing activity is treated as a piecewise stationary stochastic process and analyzed by an adaptive multivariate autoregressive modeling strategy which yields power, coherence, and Granger causality spectra. Results from applying these methods to local field potential recordings from monkeys performing cognitive tasks are presented.
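
    The adaptive multivariate autoregressive step can be sketched with an off-the-shelf VAR fit and Granger causality tests on simulated two-channel "ongoing activity" in which one channel drives the other; the spectral quantities (power, coherence) would be derived from the fitted coefficients, but only order selection and the causality test are shown here. The data and lag limit are illustrative; statsmodels is assumed.

```python
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(10)
n = 2000
# Simulated two-channel "ongoing activity" where channel 0 drives channel 1
x = np.zeros((n, 2))
for t in range(1, n):
    x[t, 0] = 0.5 * x[t - 1, 0] + rng.normal()
    x[t, 1] = 0.4 * x[t - 1, 1] + 0.6 * x[t - 1, 0] + rng.normal()

res = VAR(x).fit(maxlags=10, ic="aic")        # adaptive choice of the autoregressive order
print("selected order:", res.k_ar)

# Granger causality tests in both directions
print(res.test_causality(caused=1, causing=0, kind="f").summary())
print(res.test_causality(caused=0, causing=1, kind="f").summary())
```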

  16. Multivariate statistical data analysis methods for detecting baroclinic wave interactions in the thermally driven rotating annulus

    NASA Astrophysics Data System (ADS)

    von Larcher, Thomas; Harlander, Uwe; Alexandrov, Kiril; Wang, Yongtai

    2010-05-01

    Experiments on baroclinic wave instabilities in a rotating cylindrical gap have long been performed, e.g., to identify regular waves of different zonal wave number, to better understand the transition to the quasi-chaotic regime, and to reveal the underlying dynamical processes of complex wave flows. We present the application of appropriate multivariate data analysis methods to time series data sets acquired by the use of non-intrusive measurement techniques of a quite different nature. While the highly accurate Laser-Doppler velocimetry (LDV) is used for measurements of the radial velocity component at equidistant azimuthal positions, a highly sensitive thermographic camera measures the surface temperature field. The measurements are performed at particular parameter points where our former studies show that complex wave patterns occur [1, 2]. Obviously, the temperature data set has much more information content than the velocity data set, owing to the particular measurement techniques. Both sets of time series data are analyzed by using multivariate statistical techniques. While the LDV data sets are studied by applying the Multi-Channel Singular Spectrum Analysis (M-SSA), the temperature data sets are analyzed by applying Empirical Orthogonal Functions (EOF). Our goal is (a) to verify the results yielded with the analysis of the velocity data and (b) to compare the data analysis methods. Therefore, the temperature data are processed in a way that makes them comparable to the LDV data, i.e., the size of the data set is reduced in such a manner that the temperature measurements would effectively have been performed at equidistant azimuthal positions only. This approach initially results in a great loss of information, but applying the M-SSA to the reduced temperature data sets enables us to compare the methods. [1] Th. von Larcher and C. Egbers, Experiments on transitions of baroclinic waves in a differentially heated rotating annulus, Nonlinear Processes in Geophysics, 12, 1033-1041, 2005. [2] U. Harlander, Th. von Larcher, Y. Wang and C. Egbers, PIV- and LDV-measurements of baroclinic wave interactions in a thermally driven rotating annulus, Experiments in Fluids, 2009, DOI: 10.1007/s00348-009-0792-5.
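
    Computationally, the EOF analysis mentioned above is an SVD of the space-time anomaly matrix; a minimal sketch, with random data standing in for the surface temperature fields and assuming NumPy, is given below.

```python
import numpy as np

def eof_analysis(field, n_modes=3):
    """EOF decomposition of a (time, space) data matrix via SVD.

    Returns the spatial patterns (EOFs), the principal-component time series,
    and the fraction of variance explained by each mode.
    """
    anomalies = field - field.mean(axis=0)          # remove the time mean at each point
    U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    pcs = U[:, :n_modes] * s[:n_modes]              # time coefficients
    eofs = Vt[:n_modes]                             # spatial patterns
    return eofs, pcs, explained[:n_modes]

# Illustrative surface-temperature-like field: 500 snapshots of a 40 x 40 pixel image
field = np.random.default_rng(11).normal(size=(500, 1600))
eofs, pcs, var = eof_analysis(field)
print(pcs.shape, eofs.shape, var.round(3))
```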

  17. Body proportions of circumpolar peoples as evidenced from skeletal data: Ipiutak and Tigara (Point Hope) versus Kodiak Island Inuit.

    PubMed

    Holliday, Trenton W; Hilton, Charles E

    2010-06-01

    Given the well-documented fact that human body proportions covary with climate (presumably due to the action of selection), one would expect that the Ipiutak and Tigara Inuit samples from Point Hope, Alaska, would be characterized by an extremely cold-adapted body shape. Comparison of the Point Hope Inuit samples to a large (n > 900) sample of European and European-derived, African and African-derived, and Native American skeletons (including Koniag Inuit from Kodiak Island, Alaska) confirms that the Point Hope Inuit evince a cold-adapted body form, but analyses also reveal some unexpected results. For example, one might suspect that the Point Hope samples would show a more cold-adapted body form than the Koniag, given their more extreme environment, but this is not the case. Additionally, univariate analyses seldom show the Inuit samples to be more cold-adapted in body shape than Europeans, and multivariate cluster analyses that include a myriad of body shape variables such as femoral head diameter, bi-iliac breadth, and limb segment lengths fail to effectively separate the Inuit samples from Europeans. In fact, in terms of body shape, the European and the Inuit samples tend to be cold-adapted and tend to be separated in multivariate space from the more tropically adapted Africans, especially those groups from south of the Sahara. Copyright 2009 Wiley-Liss, Inc.

  18. Multivariate Models of Parent-Late Adolescent Gender Dyads: The Importance of Parenting Processes in Predicting Adjustment

    ERIC Educational Resources Information Center

    McKinney, Cliff; Renk, Kimberly

    2008-01-01

    Although parent-adolescent interactions have been examined, relevant variables have not been integrated into a multivariate model. As a result, this study examined a multivariate model of parent-late adolescent gender dyads in an attempt to capture important predictors in late adolescents' important and unique transition to adulthood. The sample…

  19. Multivariate fault isolation of batch processes via variable selection in partial least squares discriminant analysis.

    PubMed

    Yan, Zhengbing; Kuang, Te-Hui; Yao, Yuan

    2017-09-01

    In recent years, multivariate statistical monitoring of batch processes has become a popular research topic, wherein multivariate fault isolation is an important step aiming at the identification of the faulty variables contributing most to the detected process abnormality. Although contribution plots have been commonly used in statistical fault isolation, such methods suffer from the smearing effect between correlated variables. In particular, in batch process monitoring, the high autocorrelations and cross-correlations that exist in variable trajectories make the smearing effect unavoidable. To address such a problem, a variable selection-based fault isolation method is proposed in this research, which transforms the fault isolation problem into a variable selection problem in partial least squares discriminant analysis and solves it by calculating a sparse partial least squares model. Unlike traditional methods, the proposed method emphasizes the relative importance of each process variable. Such information may help process engineers in conducting root-cause diagnosis. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
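
    The paper's method relies on sparse PLS, which scikit-learn does not provide out of the box; the hedged sketch below only illustrates the PLS discriminant-analysis framing, ranking variables by the magnitude of their PLS-DA coefficients on simulated normal and faulty batches.

      # Hedged sketch: dense PLS-DA with scikit-learn and a coefficient-based
      # variable ranking. The paper's method uses sparse PLS, which is not in
      # scikit-learn; this only illustrates the discriminant-analysis framing.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(2)
      X_normal = rng.normal(size=(40, 10))
      X_fault = rng.normal(size=(40, 10))
      X_fault[:, 3] += 2.0                    # variable 3 carries the (simulated) fault
      X = np.vstack([X_normal, X_fault])
      y = np.r_[np.zeros(40), np.ones(40)]    # class labels: normal vs. faulty batches

      pls = PLSRegression(n_components=2).fit(X, y)
      ranking = np.argsort(-np.abs(pls.coef_.ravel()))
      print('variables ranked by |PLS-DA coefficient|:', ranking)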

  20. Implementation of a process analytical technology system in a freeze-drying process using Raman spectroscopy for in-line process monitoring.

    PubMed

    De Beer, T R M; Allesø, M; Goethals, F; Coppens, A; Heyden, Y Vander; De Diego, H Lopez; Rantanen, J; Verpoort, F; Vervaet, C; Remon, J P; Baeyens, W R G

    2007-11-01

    The aim of the present study was to propose a strategy for the implementation of a Process Analytical Technology system in freeze-drying processes. Mannitol solutions, some of them supplied with NaCl, were used as models to freeze-dry. Noninvasive and in-line Raman measurements were continuously performed during lyophilization of the solutions to monitor real time the mannitol solid state, the end points of the different process steps (freezing, primary drying, secondary drying), and physical phenomena occurring during the process. At-line near-infrared (NIR) and X-ray powder diffractometry (XRPD) measurements were done to confirm the Raman conclusions and to find out additional information. The collected spectra during the processes were analyzed using principal component analysis and multivariate curve resolution. A two-level full factorial design was used to study the significant influence of process (freezing rate) and formulation variables (concentration of mannitol, concentration of NaCl, volume of freeze-dried sample) upon freeze-drying. Raman spectroscopy was able to monitor (i) the mannitol solid state (amorphous, alpha, beta, delta, and hemihydrate), (ii) several process step end points (end of mannitol crystallization during freezing, primary drying), and (iii) physical phenomena occurring during freeze-drying (onset of ice nucleation, onset of mannitol crystallization during the freezing step, onset of ice sublimation). NIR proved to be a more sensitive tool to monitor sublimation than Raman spectroscopy, while XRPD helped to unravel the mannitol hemihydrate in the samples. The experimental design results showed that several process and formulation variables significantly influence different aspects of lyophilization and that both are interrelated. Raman spectroscopy (in-line) and NIR spectroscopy and XRPD (at-line) not only allowed the real-time monitoring of mannitol freeze-drying processes but also helped (in combination with experimental design) us to understand the process.

  1. Predicting critical care unit-level complications after long-segment fusion procedures for adult spinal deformity.

    PubMed

    De la Garza-Ramos, Rafael; Nakhla, Jonathan; Gelfand, Yaroslav; Echt, Murray; Scoco, Aleka N; Kinon, Merritt D; Yassari, Reza

    2018-03-01

    To identify predictive factors for critical care unit-level complications (CCU complications) after long-segment fusion procedures for adult spinal deformity (ASD). The American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) database [2010-2014] was reviewed. Only adult patients who underwent fusion of 7 or more spinal levels for ASD were included. CCU complications included intraoperative arrest/infarction, ventilation >48 hours, pulmonary embolism, renal failure requiring dialysis, cardiac arrest, myocardial infarction, unplanned intubation, septic shock, stroke, coma, or new neurological deficit. A stepwise multivariate regression was used to identify independent predictors of CCU complications. Among 826 patients, the rate of CCU complications was 6.4%. On multivariate regression analysis, dependent functional status (P=0.004), combined approach (P=0.023), age (P=0.044), diabetes (P=0.048), and surgery for over 8 hours (P=0.080) were significantly associated with complication development. A simple scoring system was developed to predict complications, with 0 points for patients aged <50, 1 point for patients between 50 and 70, 2 points for patients aged 70 or over, 1 point for diabetes, 2 points for dependent functional status, 1 point for combined approach, and 1 point for surgery over 8 hours. The rate of CCU complications was 0.7%, 3.2%, 9.0%, and 12.6% for patients with 0, 1, 2, and 3+ points, respectively (P<0.001). The findings in this study suggest that older patients, patients with diabetes, patients who depend on others for activities of daily living, and patients who undergo combined approaches or surgery lasting over 8 hours may be at a significantly increased risk of developing a CCU-level complication after ASD surgery.
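
    The scoring system described above translates directly into a small helper function; the sketch below is an illustrative transcription, not a validated clinical tool.

      # Illustrative transcription of the point system described in the abstract.
      def ccu_risk_score(age, diabetes, dependent_status, combined_approach, long_surgery):
          """Return the point score for CCU-level complication risk (illustrative only)."""
          if age < 50:
              score = 0
          elif age < 70:
              score = 1          # patients between 50 and 70
          else:
              score = 2          # patients 70 or over
          score += 1 if diabetes else 0
          score += 2 if dependent_status else 0
          score += 1 if combined_approach else 0
          score += 1 if long_surgery else 0     # surgery lasting over 8 hours
          return score

      print(ccu_risk_score(age=72, diabetes=True, dependent_status=False,
                           combined_approach=True, long_surgery=False))   # -> 4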

  2. Brain shaving: adaptive detection for brain PET data

    NASA Astrophysics Data System (ADS)

    Grecchi, Elisabetta; Doyle, Orla M.; Bertoldo, Alessandra; Pavese, Nicola; Turkheimer, Federico E.

    2014-05-01

    The intricacy of brain biology is such that the variation of imaging end-points in health and disease exhibits an unpredictable range of spatial distributions, from the extremely localized to the very diffuse. This represents a challenge for the two standard approaches to analysis, the mass univariate and the multivariate, which exhibit either strong specificity but weaker sensitivity (the former) or poor specificity and comparatively better sensitivity (the latter). In this work, we develop an analytical methodology for positron emission tomography that extracts ('shaves') coherent patterns of signal variation while maintaining control of the type I error. The methodology operates two rotations on the image data, one local using the wavelet transform and one global using the singular value decomposition. The control of specificity is obtained by using the gap statistic, which selects, within each eigenvector, a subset of significantly coherent elements. Face validity of the algorithm is demonstrated using a paradigmatic data set with two radiotracers, [11C]-raclopride and [11C]-(R)-PK11195, measured in the same Huntington's disease patients, a disorder with a genetically based diagnosis. The algorithm is able to detect the two well-known separate but connected processes of dopamine neuronal loss (localized in the basal ganglia) and neuroinflammation (diffuse throughout the brain). These processes lie at the two extremes of the distributional envelope, the former being very sparse and the latter being perfectly Gaussian, and neither is adequately detected by the univariate or the multivariate approach.

  3. Gain-scheduling multivariable LPV control of an irrigation canal system.

    PubMed

    Bolea, Yolanda; Puig, Vicenç

    2016-07-01

    The purpose of this paper is to present a multivariable linear parameter varying (LPV) controller with a gain scheduling Smith Predictor (SP) scheme applicable to open-flow canal systems. This LPV controller based on SP is designed taking into account the uncertainty in the estimation of delay and the variation of plant parameters according to the operating point. This new methodology can be applied to a class of delay systems that can be represented by a set of models that can be factorized into a rational multivariable model in series with left/right diagonal (multiple) delays, such as, the case of irrigation canals. A multiple pool canal system is used to test and validate the proposed control approach. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  4. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  5. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  6. Practical robustness measures in multivariable control system analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Lehtomaki, N. A.

    1981-01-01

    The robustness of the stability of multivariable linear time invariant feedback control systems with respect to model uncertainty is considered using frequency domain criteria. Available robustness tests are unified under a common framework based on the nature and structure of model errors. These results are derived using a multivariable version of Nyquist's stability theorem in which the minimum singular value of the return difference transfer matrix is shown to be the multivariable generalization of the distance to the critical point on a single input, single output Nyquist diagram. Using the return difference transfer matrix, a very general robustness theorem is presented from which all of the robustness tests dealing with specific model errors may be derived. The robustness tests that explicitly utilized model error structure are able to guarantee feedback system stability in the face of model errors of larger magnitude than those robustness tests that do not. The robustness of linear quadratic Gaussian control systems are analyzed.
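
    The central quantity described above, the minimum singular value of the return difference matrix I + L(jw), can be computed numerically over a frequency grid. The sketch below uses an arbitrary illustrative 2x2 loop transfer matrix, not one from the thesis.

      # Numerical sketch of the robustness measure: minimum singular value of the
      # return difference I + L(jw) over a frequency grid, for an illustrative L(s).
      import numpy as np

      def loop_tf(s):
          """Illustrative 2x2 loop transfer matrix L(s) (not from the thesis)."""
          return np.array([[1.0 / (s + 1.0), 0.2 / (s + 2.0)],
                           [0.1 / (s + 1.5), 2.0 / (s * (s + 3.0))]])

      w = np.logspace(-2, 2, 400)                       # frequency grid, rad/s
      sigma_min = np.empty_like(w)
      for k, wk in enumerate(w):
          L = loop_tf(1j * wk)
          sigma_min[k] = np.linalg.svd(np.eye(2) + L, compute_uv=False)[-1]

      print('worst-case min singular value of I + L(jw):', sigma_min.min().round(3))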

  7. A Course in... Multivariable Control Methods.

    ERIC Educational Resources Information Center

    Deshpande, Pradeep B.

    1988-01-01

    Describes an engineering course for graduate study in process control. Lists four major topics: interaction analysis, multiloop controller design, decoupling, and multivariable control strategies. Suggests a course outline and gives information about each topic. (MVL)

  8. Centralized PI control for high dimensional multivariable systems based on equivalent transfer function.

    PubMed

    Luan, Xiaoli; Chen, Qiang; Liu, Fei

    2014-09-01

    This article presents a new scheme to design full matrix controller for high dimensional multivariable processes based on equivalent transfer function (ETF). Differing from existing ETF method, the proposed ETF is derived directly by exploiting the relationship between the equivalent closed-loop transfer function and the inverse of open-loop transfer function. Based on the obtained ETF, the full matrix controller is designed utilizing the existing PI tuning rules. The new proposed ETF model can more accurately represent the original processes. Furthermore, the full matrix centralized controller design method proposed in this paper is applicable to high dimensional multivariable systems with satisfactory performance. Comparison with other multivariable controllers shows that the designed ETF based controller is superior with respect to design-complexity and obtained performance. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Multiscale analysis of information dynamics for linear multivariate processes.

    PubMed

    Faes, Luca; Montalto, Alessandro; Stramaglia, Sebastiano; Nollo, Giandomenico; Marinazzo, Daniele

    2016-08-01

    In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving average (MA) component, we describe how to represent the resulting VARMA process using state-space (SS) models and how to exploit the SS model parameters to compute analytical measures of information storage and information transfer for the original and rescaled processes. The framework is then used to quantify multiscale information dynamics for simulated unidirectionally and bidirectionally coupled VAR processes, showing that rescaling may lead to insightful patterns of information storage and transfer but also to potentially misleading behaviors.

  10. Grammatical Analysis as a Distributed Neurobiological Function

    PubMed Central

    Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D

    2015-01-01

    Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences—inflectionally complex words and minimal phrases—and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in L middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage. PMID:25421880

  11. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era.

    PubMed

    Ferreira, Ana P; Tobyn, Mike

    2015-01-01

    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets thus contributing to increase product and process understanding which is at the core of the Food and Drug Administration's Process Analytical Tools (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls and requirements for their effective use. That is followed with an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.
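
    A minimal sketch of the two workhorse methods named above, principal component analysis and partial least squares regression, using scikit-learn on placeholder process data; the data sizes and quality attribute are assumptions for illustration only.

      # Minimal sketch of PCA (exploration) and PLS (prediction) on placeholder
      # process data X and a simulated quality attribute y.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      X = rng.normal(size=(60, 12))                    # e.g. batch process variables
      y = X[:, 0] - 0.5 * X[:, 4] + rng.normal(scale=0.1, size=60)

      Xs = StandardScaler().fit_transform(X)           # mean-centre and scale
      scores = PCA(n_components=3).fit_transform(Xs)   # PCA: exploratory overview
      pls = PLSRegression(n_components=3).fit(Xs, y)   # PLS: predict the attribute
      print('PLS R^2 on training data:', round(pls.score(Xs, y), 3))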

  12. A traditional Sami diet score as a determinant of mortality in a general northern Swedish population.

    PubMed

    Nilsson, Lena Maria; Winkvist, Anna; Brustad, Magritt; Jansson, Jan-Håkan; Johansson, Ingegerd; Lenner, Per; Lindahl, Bernt; Van Guelpen, Bethany

    2012-05-04

    To examine the relationship between "traditional Sami" dietary pattern and mortality in a general northern Swedish population. Population-based cohort study. We examined 77,319 subjects from the Västerbotten Intervention Program (VIP) cohort. A traditional Sami diet score was constructed by adding 1 point for intake above the median level of red meat, fatty fish, total fat, berries and boiled coffee, and 1 point for intake below the median of vegetables, bread and fibre. Hazard ratios (HR) for mortality were calculated by Cox regression. Increasing traditional Sami diet scores were associated with slightly elevated all-cause mortality in men [Multivariate HR per 1-point increase in score 1.04 (95% CI 1.01-1.07), p=0.018], but not for women [Multivariate HR 1.03 (95% CI 0.99-1.07), p=0.130]. This increased risk was approximately equally attributable to cardiovascular disease and cancer, though somewhat more apparent for cardiovascular disease mortality in men free from diabetes, hypertension and obesity at baseline [Multivariate HR 1.10 (95% CI 1.01-1.20), p=0.023]. A weak increased all-cause mortality was observed in men with higher traditional Sami diet scores. However, due to the complexity in defining a "traditional Sami" diet, and the limitations of our questionnaire for this purpose, the study should be considered exploratory, a first attempt to relate a "traditional Sami" dietary pattern to health endpoints. Further investigation of cohorts with more detailed information on dietary and lifestyle items relevant for traditional Sami culture is warranted.
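
    The score construction described above (one point for intake above the cohort median of each traditional item, one point for intake below the median of each non-traditional item) can be sketched with pandas as below; the column names and simulated intakes are illustrative stand-ins, not the VIP questionnaire items.

      # Hedged sketch of the diet-score construction on simulated intake data.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(4)
      cols_high = ['red_meat', 'fatty_fish', 'total_fat', 'berries', 'boiled_coffee']
      cols_low = ['vegetables', 'bread', 'fibre']
      diet = pd.DataFrame(rng.gamma(2.0, 1.0, size=(100, 8)), columns=cols_high + cols_low)

      high_pts = sum((diet[c] > diet[c].median()).astype(int) for c in cols_high)
      low_pts = sum((diet[c] < diet[c].median()).astype(int) for c in cols_low)
      score = high_pts + low_pts                  # 0-8 points per subject
      print(score.value_counts().sort_index())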

  13. Gaussian windows: A tool for exploring multivariate data

    NASA Technical Reports Server (NTRS)

    Jaeckel, Louis A.

    1990-01-01

    Presented here is a method for interactively exploring a large set of quantitative multivariate data, in order to estimate the shape of the underlying density function. It is assumed that the density function is more or less smooth, but no other specific assumptions are made concerning its structure. The local structure of the data in a given region may be examined by viewing the data through a Gaussian window, whose location and shape are chosen by the user. A Gaussian window is defined by giving each data point a weight based on a multivariate Gaussian function. The weighted sample mean and sample covariance matrix are then computed, using the weights attached to the data points. These quantities are used to compute an estimate of the shape of the density function in the window region. The local structure of the data is described by a method similar to the method of principal components. By taking many such local views of the data, we can form an idea of the structure of the data set. The method is applicable in any number of dimensions. The method can be used to find and describe simple structural features such as peaks, valleys, and saddle points in the density function, and also extended structures in higher dimensions. With some practice, we can apply our geometrical intuition to these structural features in any number of dimensions, so that we can think about and describe the structure of the data. Since the computations involved are relatively simple, the method can easily be implemented on a small computer.
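
    A direct numpy sketch of the Gaussian window idea: weight every data point by a multivariate Gaussian centred at a chosen location, then compute the weighted sample mean and covariance, whose eigenstructure summarizes the local shape of the density. The window centre and shape below are arbitrary choices for illustration.

      # Weighted mean and covariance through a Gaussian window (illustrative).
      import numpy as np

      def gaussian_window(X, m, S):
          d = X - m
          w = np.exp(-0.5 * np.einsum('ij,jk,ik->i', d, np.linalg.inv(S), d))
          w = w / w.sum()                          # normalized Gaussian weights
          mean_w = w @ X                           # weighted sample mean
          dc = X - mean_w
          cov_w = (w[:, None] * dc).T @ dc         # weighted sample covariance
          return mean_w, cov_w, w

      rng = np.random.default_rng(5)
      X = rng.normal(size=(1000, 3))
      mean_w, cov_w, _ = gaussian_window(X, m=np.zeros(3), S=0.5 * np.eye(3))
      eigvals = np.linalg.eigvalsh(cov_w)          # local principal-component view
      print('weighted mean:', mean_w.round(2), 'local eigenvalues:', eigvals.round(3))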

  14. Viewpoints: Interactive Exploration of Large Multivariate Earth and Space Science Data Sets

    NASA Astrophysics Data System (ADS)

    Levit, C.; Gazis, P. R.

    2006-05-01

    Analysis and visualization of extremely large and complex data sets may be one of the most significant challenges facing earth and space science investigators in the forthcoming decades. While advances in hardware speed and storage technology have roughly kept up with (indeed, have driven) increases in database size, the same is not true of our abilities to manage the complexity of these data. Current missions, instruments, and simulations produce so much data of such high dimensionality that they outstrip the capabilities of traditional visualization and analysis software. This problem can only be expected to get worse as data volumes increase by orders of magnitude in future missions and in ever-larger supercomputer simulations. For large multivariate data (more than 10^5 samples or records with more than 5 variables per sample) the interactive graphics response of most existing statistical analysis, machine learning, exploratory data analysis, and/or visualization tools such as Torch, MLC++, Matlab, S++/R, and IDL stutters, stalls, or stops working altogether. Fortunately, the graphics processing units (GPUs) built into all professional desktop and laptop computers currently on the market are capable of transforming, filtering, and rendering hundreds of millions of points per second. We present a prototype open-source cross-platform application which leverages much of the power latent in the GPU to enable smooth interactive exploration and analysis of large high-dimensional data using a variety of classical and recent techniques. The targeted application is the interactive analysis of large, complex, multivariate data sets, with dimensionalities that may surpass 100 and sample sizes that may exceed 10^6-10^8.

  15. Combining Frequency Doubling Technology Perimetry and Scanning Laser Polarimetry for Glaucoma Detection

    PubMed Central

    Mwanza, Jean-Claude; Warren, Joshua L.; Hochberg, Jessica T.; Budenz, Donald L.; Chang, Robert T.; Ramulu, Pradeep Y.

    2014-01-01

    Purpose To determine the ability of frequency doubling technology (FDT) and scanning laser polarimetry with variable corneal compensation (GDx-VCC) to detect glaucoma when used individually and in combination. Methods One hundred and ten normal and 114 glaucomatous subjects were tested with the FDT C-20-5 screening protocol and the GDx-VCC. The discriminating ability was tested for each device individually and for both devices combined using GDx-NFI, GDx-TSNIT, number of missed points of FDT, and normal or abnormal FDT. Measures of discrimination included sensitivity, specificity, area under the curve (AUC), Akaike's information criterion (AIC), and prediction confidence interval lengths (PIL). Results For detecting glaucoma regardless of severity, the multivariable model resulting from the combination of GDx-TSNIT, number of abnormal points on FDT (NAP-FDT), and the interaction GDx-TSNIT * NAP-FDT (AIC: 88.28, AUC: 0.959, sensitivity: 94.6%, specificity: 89.5%) outperformed the best single-variable model provided by GDx-NFI (AIC: 120.88, AUC: 0.914, sensitivity: 87.8%, specificity: 84.2%). The multivariable model combining GDx-TSNIT, NAP-FDT, and the interaction GDx-TSNIT * NAP-FDT consistently provided better discriminating abilities for detecting early, moderate, and severe glaucoma than the best single-variable models. Conclusions The multivariable model including GDx-TSNIT, NAP-FDT, and the interaction GDx-TSNIT * NAP-FDT provides the best glaucoma prediction compared to all other multivariable and univariable models. Combining the FDT C-20-5 screening protocol and GDx-VCC improves glaucoma detection compared to using GDx or FDT alone. PMID:24777046
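
    A hedged sketch of the multivariable model structure described above: a logistic model with two predictors and their interaction term, scored by ROC area under the curve. The data are simulated stand-ins for GDx-TSNIT and the number of abnormal FDT points, not the study's measurements.

      # Two-predictor logistic model with an interaction term, scored by ROC AUC
      # (simulated data, illustrative only).
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(6)
      n = 224
      tsnit = rng.normal(55, 8, n)                 # placeholder GDx-TSNIT values
      nap_fdt = rng.poisson(3, n)                  # placeholder count of abnormal FDT points
      logit = -0.15 * (tsnit - 55) + 0.6 * nap_fdt + 0.02 * (tsnit - 55) * nap_fdt
      y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

      X = np.column_stack([tsnit, nap_fdt, tsnit * nap_fdt])   # main effects + interaction
      model = LogisticRegression(max_iter=1000).fit(X, y)
      print('AUC:', round(roc_auc_score(y, model.predict_proba(X)[:, 1]), 3))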

  16. The effect of process parameters on audible acoustic emissions from high-shear granulation.

    PubMed

    Hansuld, Erin M; Briens, Lauren; Sayani, Amyn; McCann, Joe A B

    2013-02-01

    Product quality in high-shear granulation is easily compromised by minor changes in raw material properties or process conditions. It is desired to develop a process analytical technology (PAT) that can monitor the process in real-time and provide feedback for quality control. In this work, the application of audible acoustic emissions (AAEs) as a PAT tool was investigated. A condenser microphone was placed at the top of the air exhaust on a PMA-10 high-shear granulator to collect AAEs for a design of experiment (DOE) varying impeller speed, total binder volume and spray rate. The results showed the 10 Hz total power spectral densities (TPSDs) between 20 and 250 Hz were significantly affected by the changes in process conditions. Impeller speed and spray rate were shown to have statistically significant effects on granulation wetting, and impeller speed and total binder volume were significant in terms of process end-point. The DOE results were confirmed by a multivariate PLS model of the TPSDs. The scores plot showed separation based on impeller speed in the first component and spray rate in the second component. The findings support the use of AAEs to monitor changes in process conditions in real-time and achieve consistent product quality.
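
    The band-limited acoustic power measure discussed above can be sketched with a Welch power spectral density and integration over 20-250 Hz; the signal, sampling rate, and segment length below are assumptions, not the study's recording settings.

      # Welch PSD of a (simulated) microphone signal and its total power in 20-250 Hz.
      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(7)
      fs = 44100                                   # assumed microphone sampling rate
      x = rng.normal(size=10 * fs)                 # placeholder acoustic emission signal
      f, pxx = welch(x, fs=fs, nperseg=4096)
      band = (f >= 20) & (f <= 250)
      band_power = np.sum(pxx[band]) * (f[1] - f[0])   # integrate PSD over 20-250 Hz
      print('20-250 Hz band power:', band_power)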

  17. Plurigon: three dimensional visualization and classification of high-dimensionality data

    PubMed Central

    Martin, Bronwen; Chen, Hongyu; Daimon, Caitlin M.; Chadwick, Wayne; Siddiqui, Sana; Maudsley, Stuart

    2013-01-01

    High-dimensionality data is rapidly becoming the norm for biomedical sciences and many other analytical disciplines. Not only is the collection and processing time for such data becoming problematic, but it has become increasingly difficult to form a comprehensive appreciation of high-dimensionality data. Though data analysis methods for coping with multivariate data are well-documented in technical fields such as computer science, little effort is currently being expended to condense data vectors that exist beyond the realm of physical space into an easily interpretable and aesthetic form. To address this important need, we have developed Plurigon, a data visualization and classification tool for the integration of high-dimensionality visualization algorithms with a user-friendly, interactive graphical interface. Unlike existing data visualization methods, which are focused on an ensemble of data points, Plurigon places a strong emphasis upon the visualization of a single data point and its determining characteristics. Multivariate data vectors are represented in the form of a deformed sphere with a distinct topology of hills, valleys, plateaus, peaks, and crevices. The gestalt structure of the resultant Plurigon object generates an easily-appreciable model. User interaction with the Plurigon is extensive; zoom, rotation, axial and vector display, feature extraction, and anaglyph stereoscopy are currently supported. With Plurigon and its ability to analyze high-complexity data, we hope to see a unification of biomedical and computational sciences as well as practical applications in a wide array of scientific disciplines. Increased accessibility to the analysis of high-dimensionality data may increase the number of new discoveries and breakthroughs, ranging from drug screening to disease diagnosis to medical literature mining. PMID:23885241

  18. Impaired left ventricular systolic function and increased brachial-ankle pulse-wave velocity are independently associated with rapid renal function progression.

    PubMed

    Chen, Szu-Chia; Lin, Tsung-Hsien; Hsu, Po-Chao; Chang, Jer-Ming; Lee, Chee-Siong; Tsai, Wei-Chung; Su, Ho-Ming; Voon, Wen-Chol; Chen, Hung-Chun

    2011-09-01

    Heart failure and increased arterial stiffness are associated with declining renal function. Few studies have evaluated the association between left ventricular ejection fraction (LVEF) and brachial-ankle pulse-wave velocity (baPWV) and renal function progression. The aim of this study was to assess whether LVEF<40% and baPWV are associated with a decline in the estimated glomerular filtration rate (eGFR) and the progression to a renal end point of ≥25% decline in eGFR. This longitudinal study included 167 patients. The baPWV was measured with an ankle-brachial index-form device. The change in renal function was estimated by eGFR slope. The renal end point was defined as ≥25% decline in eGFR. Clinical and echocardiographic parameters were compared and analyzed. After a multivariate analysis, serum hematocrit was positively associated with eGFR slope, and diabetes mellitus, baPWV (P=0.031) and LVEF<40% (P=0.001) were negatively associated with eGFR slope. Forty patients reached the renal end point. Multivariate, forward Cox regression analysis found that lower serum albumin and hematocrit levels, higher triglyceride levels, higher baPWV (P=0.039) and LVEF<40% (P<0.001) were independently associated with progression to the renal end point. Our results show that LVEF<40% and increased baPWV are independently associated with renal function decline and progression to the renal end point.

  19. Monte Carlo algorithms for Brownian phylogenetic models.

    PubMed

    Horvilleur, Benjamin; Lartillot, Nicolas

    2014-11-01

    Brownian models have been introduced in phylogenetics for describing variation in substitution rates through time, with applications to molecular dating or to the comparative analysis of variation in substitution patterns among lineages. Thus far, however, the Monte Carlo implementations of these models have relied on crude approximations, in which the Brownian process is sampled only at the internal nodes of the phylogeny or at the midpoints along each branch, and the unknown trajectory between these sampled points is summarized by simple branchwise average substitution rates. A more accurate Monte Carlo approach is introduced, explicitly sampling a fine-grained discretization of the trajectory of the (potentially multivariate) Brownian process along the phylogeny. Generic Monte Carlo resampling algorithms are proposed for updating the Brownian paths along and across branches. Specific computational strategies are developed for efficient integration of the finite-time substitution probabilities across branches induced by the Brownian trajectory. The mixing properties and the computational complexity of the resulting Markov chain Monte Carlo sampler scale reasonably with the discretization level, allowing practical applications with up to a few hundred discretization points along the entire depth of the tree. The method can be generalized to other Markovian stochastic processes, making it possible to implement a wide range of time-dependent substitution models with well-controlled computational precision. The program is freely available at www.phylobayes.org. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. DENBRAN: A basic program for a significance test for multivariate normality of clusters from branching patterns in dendrograms

    NASA Astrophysics Data System (ADS)

    Sneath, P. H. A.

    A BASIC program is presented for significance tests to determine whether a dendrogram is derived from clustering of points that belong to a single multivariate normal distribution. The significance tests are based on statistics of the Kolmogorov-Smirnov type, obtained by comparing the observed cumulative graph of branch levels with a graph for the hypothesis of multivariate normality. The program also permits testing whether the dendrogram could be from a cluster of lower dimensionality due to character correlations. The program makes provision for three similarity coefficients: (1) Euclidean distances, (2) squared Euclidean distances, and (3) Simple Matching Coefficients; and for five cluster methods: (1) WPGMA, (2) UPGMA, (3) Single Linkage (or Minimum Spanning Trees), (4) Complete Linkage, and (5) Ward's Increase in Sums of Squares. The program is entitled DENBRAN.

  1. Advances in industrial biopharmaceutical batch process monitoring: Machine-learning methods for small data problems.

    PubMed

    Tulsyan, Aditya; Garvin, Christopher; Ündey, Cenk

    2018-04-06

    Biopharmaceutical manufacturing comprises multiple distinct processing steps that require effective and efficient monitoring of many variables simultaneously in real time. State-of-the-art real-time multivariate statistical batch process monitoring (BPM) platforms have been in use in recent years to ensure comprehensive monitoring is in place as a complementary tool for continued process verification to detect weak signals. This article addresses a longstanding, industry-wide problem in BPM, referred to as the "Low-N" problem, wherein a product has a limited production history. The current best industrial practice to address the Low-N problem is to switch from a multivariate to a univariate BPM, until sufficient product history is available to build and deploy a multivariate BPM platform. Every batch run without a robust multivariate BPM platform poses the risk of not detecting potential weak signals developing in the process that might have an impact on process and product performance. In this article, we propose an approach to solve the Low-N problem by generating an arbitrarily large number of in silico batches through a combination of hardware exploitation and machine-learning methods. To the best of the authors' knowledge, this is the first article to provide a solution to the Low-N problem in biopharmaceutical manufacturing using machine-learning methods. Several industrial case studies from bulk drug substance manufacturing are presented to demonstrate the efficacy of the proposed approach for BPM under various Low-N scenarios. © 2018 Wiley Periodicals, Inc.

  2. Does Learning to Read Improve Intelligence? A Longitudinal Multivariate Analysis in Identical Twins From Age 7 to 16

    PubMed Central

    Ritchie, Stuart J; Bates, Timothy C; Plomin, Robert

    2015-01-01

    Evidence from twin studies points to substantial environmental influences on intelligence, but the specifics of this influence are unclear. This study examined one developmental process that potentially causes intelligence differences: learning to read. In 1,890 twin pairs tested at 7, 9, 10, 12, and 16 years, a cross-lagged monozygotic-differences design was used to test for associations of earlier within-pair reading ability differences with subsequent intelligence differences. The results showed several such associations, which were not explained by differences in reading exposure and were not restricted to verbal cognitive domains. The study highlights the potentially important influence of reading ability, driven by the nonshared environment, on intellectual development and raises theoretical questions about the mechanism of this influence. PMID:25056688

  3. Multivariate analysis of meat production traits in Murciano-Granadina goat kids.

    PubMed

    Zurita-Herrera, P; Delgado, J V; Argüello, A; Camacho, M E

    2011-07-01

    Growth, carcass quality, and meat quality data from Murciano-Granadina kids (n=61) raised under three different systems were collected. Canonical discriminant analysis and cluster analysis of the entire meat production process and its stages were performed using the rearing systems as grouping criteria. All comparisons resulted in significant differences and indicated the existence of three products with different quality characteristics as a result of the three rearing systems. Differences among groups were greater when comparing carcass and meat qualities than when comparing growth. The paired analyses of canonical correlations among the groups of variables describing growth, carcass quality, and meat quality were all statistically significant, highlighting the canonical correlation between carcass quality and meat quality. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Machine processing for remotely acquired data. [using multivariate statistical analysis

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A.

    1974-01-01

    This paper is a general discussion of earth resources information systems which utilize airborne and spaceborne sensors. It points out that information may be derived by sensing and analyzing the spectral, spatial, and temporal variations of electromagnetic fields emanating from the earth's surface. After giving an overview of system organization, the two broad categories of system types are discussed: systems in which high quality imagery is essential and those that are more numerically oriented. Sensors are also discussed with this categorization of systems in mind. The multispectral approach and pattern recognition are described as an example data analysis procedure for numerically oriented systems. The steps necessary in using a pattern recognition scheme are described and illustrated with data obtained from aircraft and the Earth Resources Technology Satellite (ERTS-1).

  5. Sequence-independent construction of ordered combinatorial libraries with predefined crossover points.

    PubMed

    Jézéquel, Laetitia; Loeper, Jacqueline; Pompon, Denis

    2008-11-01

    Combinatorial libraries coding for mosaic enzymes with predefined crossover points constitute useful tools to address and model structure-function relationships and for functional optimization of enzymes based on multivariate statistics. The presented method, called sequence-independent generation of a chimera-ordered library (SIGNAL), allows easy shuffling of any predefined amino acid segment between two or more proteins. This method is particularly well adapted to the exchange of protein structural modules. The procedure could also be well suited to generate ordered combinatorial libraries independent of sequence similarities in a robotized manner. Sequence segments to be recombined are first extracted by PCR from a single-stranded template coding for an enzyme of interest using a biotin-avidin-based method. This technique allows the reduction of parental template contamination in the final library. Specific PCR primers allow amplification of two complementary mosaic DNA fragments, overlapping in the region to be exchanged. Fragments are finally reassembled using a fusion PCR. The process is illustrated via the construction of a set of mosaic CYP2B enzymes using this highly modular approach.

  6. New multivariable capabilities of the INCA program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1989-01-01

    The INteractive Controls Analysis (INCA) program was developed at NASA's Goddard Space Flight Center to provide a user friendly, efficient environment for the design and analysis of control systems, specifically spacecraft control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. The INCA program was initially developed as a comprehensive classical design analysis tool for small and large order control systems. The latest version of INCA, expected to be released in February of 1990, was expanded to include the capability to perform multivariable controls analysis and design.

  7. Modeling strategies for pharmaceutical blend monitoring and end-point determination by near-infrared spectroscopy.

    PubMed

    Igne, Benoît; de Juan, Anna; Jaumot, Joaquim; Lallemand, Jordane; Preys, Sébastien; Drennen, James K; Anderson, Carl A

    2014-10-01

    The implementation of a blend monitoring and control method based on a process analytical technology such as near infrared spectroscopy requires the selection and optimization of numerous criteria that will affect the monitoring outputs and expected blend end-point. Using a five-component formulation, the present article contrasts the modeling strategies and end-point determination of a traditional quantitative method based on the prediction of the blend parameters employing partial least-squares regression with a qualitative strategy based on principal component analysis and Hotelling's T² and residual distance to the model, called Prototype. The possibility to monitor and control blend homogeneity with multivariate curve resolution was also assessed. The implementation of the above methods in the presence of designed experiments (with variation of the amount of active ingredient and excipients) and with normal operating condition samples (nominal concentrations of the active ingredient and excipients) was tested. The impact of criteria used to stop the blends (related to precision and/or accuracy) was assessed. Results demonstrated that while all methods showed similarities in their outputs, some approaches were preferred for decision making. The selectivity of regression based methods was also contrasted with the capacity of qualitative methods to determine the homogeneity of the entire formulation. Copyright © 2014. Published by Elsevier B.V.
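
    The qualitative monitoring strategy mentioned above (principal component analysis with Hotelling's T² and a residual statistic) can be sketched in numpy as below; the reference spectra are simulated and no control limits are derived, so this illustrates the statistics only, not the Prototype method itself.

      # PCA-based monitoring statistics on simulated reference spectra (illustrative).
      import numpy as np

      rng = np.random.default_rng(8)
      X_ref = rng.normal(size=(50, 200))                  # reference blend spectra
      mu = X_ref.mean(axis=0)
      U, s, Vt = np.linalg.svd(X_ref - mu, full_matrices=False)
      k = 3
      P = Vt[:k].T                                        # loadings of k retained components
      lam = (s[:k] ** 2) / (X_ref.shape[0] - 1)           # score variances

      def monitor(x_new):
          xc = x_new - mu
          t = xc @ P                                      # scores of the new spectrum
          T2 = np.sum(t**2 / lam)                         # Hotelling's T^2
          resid = xc - t @ P.T
          Q = resid @ resid                               # squared prediction error (Q)
          return T2, Q

      print(monitor(rng.normal(size=200)))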

  8. Linear models of coregionalization for multivariate lattice data: Order-dependent and order-free cMCARs.

    PubMed

    MacNab, Ying C

    2016-08-01

    This paper is concerned with multivariate conditional autoregressive models defined by linear combinations of independent or correlated underlying spatial processes. Known as linear models of coregionalization, the method offers a systematic and unified approach for formulating multivariate extensions to a broad range of univariate conditional autoregressive models. The resulting multivariate spatial models represent classes of coregionalized multivariate conditional autoregressive models that enable flexible modelling of multivariate spatial interactions, yielding coregionalization models with symmetric or asymmetric cross-covariances of different spatial variation and smoothness. In the context of multivariate disease mapping, for example, they facilitate borrowing strength both over space and across variables, allowing for more flexible multivariate spatial smoothing. Specifically, we present a broadened coregionalization framework to include order-dependent, order-free, and order-robust multivariate models; a new class of order-free coregionalized multivariate conditional autoregressives is introduced. We tackle computational challenges and present solutions that are integral for Bayesian analysis of these models. We also discuss two ways of computing the deviance information criterion for comparison among competing hierarchical models with or without unidentifiable prior parameters. The models and related methodology are developed in the broad context of modelling multivariate data on a spatial lattice and illustrated in the context of multivariate disease mapping. The coregionalization framework and related methods also present a general approach for building spatially structured cross-covariance functions for multivariate geostatistics. © The Author(s) 2016.

  9. Multiresponse modeling of variably saturated flow and isotope tracer transport for a hillslope experiment at the Landscape Evolution Observatory

    NASA Astrophysics Data System (ADS)

    Scudeler, Carlotta; Pangle, Luke; Pasetto, Damiano; Niu, Guo-Yue; Volkmann, Till; Paniconi, Claudio; Putti, Mario; Troch, Peter

    2016-10-01

    This paper explores the challenges of model parameterization and process representation when simulating multiple hydrologic responses from a highly controlled unsaturated flow and transport experiment with a physically based model. The experiment, conducted at the Landscape Evolution Observatory (LEO), involved alternate injections of water and deuterium-enriched water into an initially very dry hillslope. The multivariate observations included point measures of water content and tracer concentration in the soil, total storage within the hillslope, and integrated fluxes of water and tracer through the seepage face. The simulations were performed with a three-dimensional finite element model that solves the Richards and advection-dispersion equations. Integrated flow, integrated transport, distributed flow, and distributed transport responses were successively analyzed, with parameterization choices at each step supported by standard model performance metrics. In the first steps of our analysis, where seepage face flow, water storage, and average concentration at the seepage face were the target responses, an adequate match between measured and simulated variables was obtained using a simple parameterization consistent with that from a prior flow-only experiment at LEO. When passing to the distributed responses, it was necessary to introduce complexity to additional soil hydraulic parameters to obtain an adequate match for the point-scale flow response. This also improved the match against point measures of tracer concentration, although model performance here was considerably poorer. This suggests that still greater complexity is needed in the model parameterization, or that there may be gaps in process representation for simulating solute transport phenomena in very dry soils.

  10. QSAR modeling of cumulative environmental end-points for the prioritization of hazardous chemicals.

    PubMed

    Gramatica, Paola; Papa, Ester; Sangion, Alessandro

    2018-01-24

    The hazard of chemicals in the environment is inherently related to the molecular structure and derives simultaneously from various chemical properties/activities/reactivities. Models based on Quantitative Structure Activity Relationships (QSARs) are useful to screen, rank and prioritize chemicals that may have an adverse impact on humans and the environment. This paper reviews a selection of QSAR models (based on theoretical molecular descriptors) developed for cumulative multivariate endpoints, which were derived by mathematical combination of multiple effects and properties. The cumulative end-points provide an integrated holistic point of view to address environmentally relevant properties of chemicals.

  11. FROM FINANCE TO COSMOLOGY: THE COPULA OF LARGE-SCALE STRUCTURE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scherrer, Robert J.; Berlind, Andreas A.; Mao, Qingqing

    2010-01-01

    Any multivariate distribution can be uniquely decomposed into marginal (one-point) distributions, and a function called the copula, which contains all of the information on correlations between the distributions. The copula provides an important new methodology for analyzing the density field in large-scale structure. We derive the empirical two-point copula for the evolved dark matter density field. We find that this empirical copula is well approximated by a Gaussian copula. We consider the possibility that the full n-point copula is also Gaussian and describe some of the consequences of this hypothesis. Future directions for investigation are discussed.
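
    A minimal sketch of estimating a Gaussian copula for a bivariate sample: transform each margin to pseudo-observations via ranks, map them to normal scores, and estimate the copula correlation. The lognormal sample below stands in for density-field values and is purely illustrative.

      # Gaussian-copula correlation via rank-based normal scores (illustrative data).
      import numpy as np
      from scipy.stats import norm, rankdata

      rng = np.random.default_rng(9)
      x = rng.lognormal(size=5000)                       # two correlated, non-Gaussian margins
      y = x * rng.lognormal(sigma=0.5, size=5000)

      def normal_scores(v):
          u = rankdata(v) / (len(v) + 1.0)               # empirical CDF (pseudo-observations)
          return norm.ppf(u)

      rho = np.corrcoef(normal_scores(x), normal_scores(y))[0, 1]
      print('Gaussian-copula correlation:', round(rho, 3))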

  12. Combining microwave resonance technology to multivariate data analysis as a novel PAT tool to improve process understanding in fluid bed granulation.

    PubMed

    Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk

    2011-08-01

    A set of 192 fluid bed granulation batches at industrial scale were in-line monitored using microwave resonance technology (MRT) to determine the moisture, temperature, and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA), and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect was revealed that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS model that a relation between the particle size and the MRT measurements can be quantitatively defined, highlighting the potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in during process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.

  13. Multivariate Visualization in Social Sciences and Survey Data

    DTIC Science & Technology

    2013-09-01

    [Fragmentary abstract snippet; the recoverable content contrasts point-based bubble charts marking Walmart store locations (where bubble size is misleading because it reflects neither the number nor the size of stores) with Yau's choropleth display, both shown in Figure 2.7 of the source document, as examples of multivariate visualization of survey data.]

  14. Estimating brain connectivity when few data points are available: Perspectives and limitations.

    PubMed

    Antonacci, Yuri; Toppi, Jlenia; Caschera, Stefano; Anzolin, Alessandra; Mattia, Donatella; Astolfi, Laura

    2017-07-01

    Methods based on the use of multivariate autoregressive modeling (MVAR) have proved to be an accurate and flexible tool for the estimation of brain functional connectivity. The multivariate approach, however, implies the use of a model whose complexity (in terms of number of parameters) increases quadratically with the number of signals included in the problem. This can often lead to an underdetermined problem and to the condition of multicollinearity. The aim of this paper is to introduce and test an approach based on Ridge Regression combined with a modified version of the statistics usually adopted for these methods, to broaden the estimation of brain connectivity to those conditions in which current methods fail, due to the lack of enough data points. We tested the performances of this new approach, in comparison with the classical approach based on ordinary least squares (OLS), by means of a simulation study implementing different ground-truth networks, under different network sizes and different levels of data points. Simulation results showed that the new approach provides better performances, in terms of accuracy of the parameters estimation and false positives/false negatives rates, in all conditions related to a low data points/model dimension ratio, and may thus be exploited to estimate and validate estimated patterns at single-trial level or when short time data segments are available.
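
    The core idea, replacing ordinary least squares with ridge regression when fitting an MVAR model from few samples, can be sketched in closed form with numpy; the lag order, penalty, and data dimensions below are illustrative assumptions, and this is not the authors' full approach, which also modifies the associated statistical validation.

      # Ridge-regularized MVAR estimation (closed form) on simulated short data.
      import numpy as np

      def fit_mvar_ridge(Y, p, alpha):
          """Y: (n_samples, n_channels); p: model order; alpha: ridge penalty."""
          n, m = Y.shape
          # lagged predictors [Y(t-1), ..., Y(t-p)] for each target time t = p..n-1
          X = np.hstack([Y[p - k - 1:n - k - 1] for k in range(p)])
          T = Y[p:]                                                  # targets
          A = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ T)
          return A        # stacked (m*p, m) coefficient matrix

      rng = np.random.default_rng(10)
      Y = rng.normal(size=(60, 8))        # few samples, several channels
      A = fit_mvar_ridge(Y, p=3, alpha=5.0)
      print(A.shape)                      # (24, 8)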

  15. Structural Equation Modeling: Applications in ecological and evolutionary biology research

    USGS Publications Warehouse

    Pugesek, Bruce H.; von Eye, Alexander; Tomer, Adrian

    2003-01-01

    This book presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems. Supplementary information can be found at the authors' website, http://www.jamesbgrace.com/. • Details why multivariate analyses should be used to study ecological systems • Exposes unappreciated weaknesses in many current popular analyses • Emphasizes the future methodological developments needed to advance our understanding of ecological systems.

  16. Experimental and simulation studies of multivariable adaptive optimization of continuous bioreactors using bilevel forgetting factors.

    PubMed

    Chang, Y K; Lim, H C

    1989-08-20

    A multivariable on-line adaptive optimization algorithm using a bilevel forgetting factor method was developed and applied to a continuous baker's yeast culture in simulation and experimental studies to maximize the cellular productivity by manipulating the dilution rate and the temperature. The algorithm showed a good optimization speed and a good adaptability and reoptimization capability. The algorithm was able to stably maintain the process around the optimum point for an extended period of time. Two cases were investigated: an unconstrained and a constrained optimization. In the constrained optimization the ethanol concentration was used as an index for the baking quality of yeast cells. An equality constraint with a quadratic penalty was imposed on the ethanol concentration to keep its level close to a hypothetical "optimum" value. The developed algorithm was experimentally applied to a baker's yeast culture to demonstrate its validity. Only unconstrained optimization was carried out experimentally. A set of tuning parameter values was suggested after evaluating the results from several experimental runs. With those tuning parameter values the optimization took 50-90 h. At the attained steady state the dilution rate was 0.310 h(-1), the temperature 32.8 degrees C, and the cellular productivity 1.50 g/L/h.

  17. Beer fermentation: monitoring of process parameters by FT-NIR and multivariate data analysis.

    PubMed

    Grassi, Silvia; Amigo, José Manuel; Lyndgaard, Christian Bøge; Foschino, Roberto; Casiraghi, Ernestina

    2014-07-15

    This work investigates the capability of Fourier-Transform near infrared (FT-NIR) spectroscopy to monitor and assess process parameters in beer fermentation at different operative conditions. For this purpose, the fermentation of wort with two different yeast strains and at different temperatures was monitored for nine days by FT-NIR. To correlate the collected spectra with °Brix, pH and biomass, different multivariate data methodologies were applied. Principal component analysis (PCA), partial least squares (PLS) and locally weighted regression (LWR) were used to assess the relationship between FT-NIR spectra and the abovementioned process parameters that define the beer fermentation. The accuracy and robustness of the obtained results clearly show the suitability of FT-NIR spectroscopy, combined with multivariate data analysis, as a quality control tool in the beer fermentation process. FT-NIR spectroscopy combined with LWR proves to be a suitable quantitative method for implementation in beer production. Copyright © 2014 Elsevier Ltd. All rights reserved.
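    Calibrations of this kind typically regress the spectra onto a reference value with PLS and judge the model by cross-validation. The snippet below is a generic sketch with synthetic placeholder data standing in for the spectra and °Brix values; the number of latent variables and the use of scikit-learn are assumptions, not details from the paper.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # X: (n_samples, n_wavelengths) FT-NIR spectra collected during
    # fermentation; y: reference values (e.g. degrees Brix).  Synthetic
    # placeholders are used here.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))
    y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)

    pls = PLSRegression(n_components=5)                    # latent variables
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()     # CV predictions
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))             # calibration metric
    print(f"RMSECV: {rmsecv:.3f}")
    ```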

  18. Creation and validation of a novel body condition scoring method for the magellanic penguin (Spheniscus magellanicus) in the zoo setting.

    PubMed

    Clements, Julie; Sanchez, Jessica N

    2015-11-01

    This research aims to validate a novel, visual body scoring system created for the Magellanic penguin (Spheniscus magellanicus) and suitable for the zoo practitioner. Magellanics go through marked seasonal fluctuations in body mass gains and losses. A standardized multi-variable visual body condition guide may provide a more sensitive and objective assessment tool compared to the previously used single-variable method. Accurate body condition scores paired with seasonal weight variation measurements give veterinary and keeper staff a clearer understanding of an individual's nutritional status. San Francisco Zoo staff previously used a nine-point body condition scale based on the classic bird standard of a single point of keel palpation with the bird restrained in hand, with no standard measure of reference assigned to each scoring category. We created a novel, visual body condition scoring system that does not require restraint and assesses subcutaneous fat and muscle at seven body landmarks using illustrations and descriptive terms. The scores range from one, the least robust or under-conditioned, to five, the most robust or over-conditioned. The ratio of body weight to wing length was used as a "gold standard" index of body condition and compared to both the novel multi-variable and previously used single-variable body condition scores. The novel multi-variable scale showed improved agreement with the weight:wing ratio compared to the single-variable scale, demonstrating greater accuracy and reliability when a trained assessor uses the multi-variable body condition scoring system. Zoo staff may use this tool to manage both the colony and the individual to assist in seasonally appropriate Magellanic penguin nutrition assessment. © 2015 Wiley Periodicals, Inc.

  19. Nitrate reductase activity of Staphylococcus carnosus affecting the color formation in cured raw ham.

    PubMed

    Bosse Née Danz, Ramona; Gibis, Monika; Schmidt, Herbert; Weiss, Jochen

    2016-07-01

    The influence of the nitrate reductase activity of two Staphylococcus carnosus strains used as starter cultures on the formation of nitrate, nitrite and color pigments in cured raw ham was investigated. In this context, microbiological, chemical and multivariate image analyses were carried out on cured raw hams, which were injected with different brines containing either nitrite or nitrate, with or without the S. carnosus starter cultures. During processing and storage, the viable counts of staphylococci remained constant at 6.5 log cfu/g in the hams inoculated with starter cultures, while the background microbiota of the hams processed without the starter cultures developed after 14 days. Those cured hams inoculated with S. carnosus LTH 7036 (high nitrate reductase activity) showed the highest decrease in nitrate and high nitrite concentrations in the end product, but still within the European legal limit. The hams cured with nitrate and without starter culture, or with the other strain, S. carnosus LTH 3838 (low nitrate reductase activity), showed higher residual nitrate levels and a lower nitrite content in the end product. The multivariate image analysis identified spatial and temporal differences in the meat pigment profiles of the differently cured hams. The cured hams inoculated with S. carnosus LTH 3838 showed an uncured core due to a delay in pigment formation. Therefore, the selection of starter cultures based on their nitrate reductase activity is a key point in the formation of curing compounds and color pigments in cured raw ham manufacture. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Untangling Magmatic Processes and Hydrothermal Alteration of in situ Superfast Spreading Ocean Crust at ODP/IODP Site 1256 with Fuzzy c-means Cluster Analysis of Rock Magnetic Properties

    NASA Astrophysics Data System (ADS)

    Dekkers, M. J.; Heslop, D.; Herrero-Bervera, E.; Acton, G.; Krasa, D.

    2014-12-01

    Ocean Drilling Program (ODP)/Integrated ODP (IODP) Hole 1256D (6°44.1'N, 91°56.1'W) on the Cocos Plate occurs in 15.2 Ma oceanic crust generated by superfast seafloor spreading. Presently, it is the only drill hole that has sampled all three oceanic crust layers in a tectonically undisturbed setting. Here we interpret down-hole trends in several rock-magnetic parameters with fuzzy c-means cluster analysis, a multivariate statistical technique. The parameters include the magnetization ratio, the coercivity ratio, the coercive force, the low-field susceptibility, and the Curie temperature. Their combined, multivariate analysis allows the effects of magmatic and hydrothermal processes to be evaluated. The optimal number of clusters - a key point in the analysis because there is no a priori information on this - was determined through a combination of approaches: by calculation of several cluster validity indices, by testing for coherent cluster distributions on non-linear-map plots, and importantly by testing for stability of the cluster solution from all possible starting points. Here, we consider a solution robust if the cluster allocation is independent of the starting configuration. The five-cluster solution appeared to be robust. Three clusters are distinguished in the extrusive segment of the Hole that express increasing hydrothermal alteration of the lavas. The sheeted dike and gabbro portions are characterized by two clusters, both with higher coercivities than in the lava samples. Extensive alteration, however, can obliterate magnetic property differences between lavas, dikes, and gabbros. The imprint of thermochemical alteration on the iron-titanium oxides is only partially related to the porosity of the rocks. All clusters display rock magnetic characteristics in line with a stable NRM. This implies that the entire sampled sequence of ocean crust can contribute to marine magnetic anomalies. Determination of the absolute paleointensity with thermal techniques is not straightforward because of the propensity for oxyexsolution during laboratory heating and/or the presence of intergrowths. The upper part of the extrusive sequence, the granoblastic portion of the dikes, and the moderately altered gabbros may contain a comparatively uncontaminated thermoremanent magnetization.
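    Fuzzy c-means itself is compact enough to sketch. The implementation below, with illustrative argument names, shows the membership and center updates; running it from several random seeds and comparing the resulting cluster allocations is one way to probe the kind of stability the abstract describes. It is not the authors' code and does not include their cluster validity indices.

    ```python
    import numpy as np

    def fuzzy_cmeans(X, c, m=2.0, n_iter=200, tol=1e-6, seed=0):
        """Minimal fuzzy c-means clustering.

        X : (n_samples, n_features) standardized rock-magnetic parameters
            (e.g. magnetization ratio, coercivity ratio, coercive force,
            susceptibility, Curie temperature; assumed layout).
        c : number of clusters; m : fuzziness exponent (> 1).
        Returns (centers, U) with U[i, k] the membership of sample i in
        cluster k.
        """
        rng = np.random.default_rng(seed)
        U = rng.dirichlet(np.ones(c), size=len(X))      # random memberships
        centers = None
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            d = np.fmax(d, 1e-12)
            ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
            U_new = 1.0 / ratio.sum(axis=2)
            if np.max(np.abs(U_new - U)) < tol:
                U = U_new
                break
            U = U_new
        return centers, U
    ```

    Comparing the hard allocations (U.argmax(axis=1)) across different seeds gives a simple robustness check in the spirit of the stability criterion described above.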

  1. Multivariate generalized multifactor dimensionality reduction to detect gene-gene interactions

    PubMed Central

    2013-01-01

    Background Recently, one of the greatest challenges in genome-wide association studies is to detect gene-gene and/or gene-environment interactions for common complex human diseases. Ritchie et al. (2001) proposed the multifactor dimensionality reduction (MDR) method for interaction analysis. MDR is a combinatorial approach that reduces multi-locus genotypes into high-risk and low-risk groups. Although MDR has been widely used for case-control studies with binary phenotypes, several extensions have been proposed. One of these methods, the generalized MDR (GMDR) proposed by Lou et al. (2007), allows adjusting for covariates and applies to both dichotomous and continuous phenotypes. GMDR uses the residual score of a generalized linear model of phenotypes to assign subjects to either the high-risk or the low-risk group, while MDR uses the ratio of cases to controls. Methods In this study, we propose multivariate GMDR, an extension of GMDR for multivariate phenotypes. Jointly analysing correlated multivariate phenotypes may have more power to detect susceptible genes and gene-gene interactions. We construct generalized estimating equations (GEE) with multivariate phenotypes to extend generalized linear models. Using the score vectors from GEE, we discriminate high-risk from low-risk groups. We applied the multivariate GMDR method to the blood pressure data of 7,546 subjects from the Korean Association Resource study: systolic blood pressure (SBP) and diastolic blood pressure (DBP). We compare the results of multivariate GMDR for SBP and DBP to the results from separate univariate GMDR analyses for SBP and DBP, respectively. We also applied the multivariate GMDR method to the repeatedly measured hypertension status from 5,466 subjects and compared its result with those of univariate GMDR at each time point. Results Results from the univariate GMDR and multivariate GMDR in the two-locus model with both blood pressures and hypertension phenotypes indicate the best combinations of SNPs whose interaction has a significant association with risk for high blood pressure or hypertension. Although the test balanced accuracy (BA) of multivariate analysis was not always greater than that of univariate analysis, the multivariate BAs were more stable, with smaller standard deviations. Conclusions In this study, we have developed the multivariate GMDR method using the GEE approach. It is useful to use multivariate GMDR with multiple correlated phenotypes of interest. PMID:24565370

  2. A stepwise, multi-objective, multi-variable parameter optimization method for the APEX model

    USDA-ARS?s Scientific Manuscript database

    Proper parameterization enables hydrological models to make reliable estimates of non-point source pollution for effective control measures. The automatic calibration of hydrologic models requires significant computational power limiting its application. The study objective was to develop and eval...

  3. Shape model of the maxillary dental arch using Fourier descriptors with an application in the rehabilitation for edentulous patient.

    PubMed

    Rijal, Omar M; Abdullah, Norli A; Isa, Zakiah M; Noor, Norliza M; Tawfiq, Omar F

    2013-01-01

    The knowledge of teeth positions on the maxillary arch is useful in the rehabilitation of the edentulous patient. A combination of angular (θ) and linear (l) variables representing the positions of four teeth was initially proposed as the shape descriptor of the maxillary dental arch. Three categories of shape were established, each having a multivariate normal distribution. It may be argued that 4 selected teeth on the standardized digital images of the dental casts are insufficient to represent shape. However, increasing the number of points would create dimensionality problems, and proving the existence of a multivariate normal distribution becomes extremely difficult. This study investigates the ability of Fourier descriptors (FD) using all maxillary teeth to find alternative shape models. Eight FD terms were sufficient to represent 21 points on the arch. Using these 8 FD terms as an alternative shape descriptor, three categories of shape were verified, each category having a complex normal distribution.
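    Fourier descriptors can be obtained directly from the FFT of the landmark coordinates treated as complex numbers. The sketch below is a generic illustration with synthetic landmarks standing in for a digitized arch; the abstract does not specify the normalization used, so the translation and scale handling here are assumptions.

    ```python
    import numpy as np

    def fourier_descriptors(points, n_terms=8):
        """Fourier descriptors of an ordered 2-D landmark contour.

        points  : (N, 2) array of landmark coordinates along the arch
                  (21 points in the study; synthetic here).
        n_terms : number of low-frequency coefficients to keep.
        """
        z = points[:, 0] + 1j * points[:, 1]   # contour as complex signal
        Z = np.fft.fft(z)
        Z = Z[1:n_terms + 1]                   # drop DC term (translation)
        return Z / np.abs(Z[0])                # normalize scale to 1st harmonic

    # Synthetic semicircular "arch" standing in for digitized landmarks.
    theta = np.linspace(0, np.pi, 21)
    arch = np.column_stack([50 * np.cos(theta), 35 * np.sin(theta)])
    print(np.round(np.abs(fourier_descriptors(arch)), 3))
    ```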

  4. Multivariable Time Series Prediction for the Icing Process on Overhead Power Transmission Line

    PubMed Central

    Li, Peng; Zhao, Na; Zhou, Donghua; Cao, Min; Li, Jingjie; Shi, Xinling

    2014-01-01

    The design of monitoring and predictive alarm systems is necessary for successfully managing overhead power transmission line icing. Given the complexity, nonlinearity, and fitfulness of the line icing process, a model based on a multivariable time series is presented here to predict the icing load of a transmission line. In this model, the time effects of micrometeorology parameters on the icing process have been analyzed. Phase-space reconstruction theory and a machine learning method were then applied to establish the prediction model, which fully utilizes the history of multivariable time series data in local monitoring systems to represent the mapping relationship between icing load and micrometeorology factors. Relevant to the fitful character of line icing, simulations were carried out within the same icing process and across different processes to test the model's prediction precision and robustness. According to the simulation results for the Tao-Luo-Xiong Transmission Line, the model demonstrates good prediction accuracy across different processes when the prediction horizon is less than two hours, and would be helpful for power grid departments when deciding to take action in advance to address potential icing disasters. PMID:25136653
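    The phase-space reconstruction step amounts to a time-delay embedding of the monitored variables. The function below is a generic sketch under assumed argument names; the embedding dimension, delay, and the learning method used in the study are not given in the abstract.

    ```python
    import numpy as np

    def delay_embed(series, dim, tau):
        """Phase-space reconstruction by time-delay embedding.

        series : (T, k) multivariate record, e.g. icing load plus
                 micrometeorological factors (assumed layout).
        dim    : embedding dimension per variable; tau : delay in samples.
        Returns an (T - (dim-1)*tau, k*dim) matrix whose rows are the
        reconstructed state vectors.
        """
        T, k = series.shape
        rows = T - (dim - 1) * tau
        cols = [series[i * tau: i * tau + rows] for i in range(dim)]
        return np.hstack(cols)

    # Each reconstructed state is paired with the icing load some steps
    # ahead, and any regressor (e.g. SVR) can be trained on these pairs.
    ```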

  5. Multivariable control altitude demonstration on the F100 turbofan engine

    NASA Technical Reports Server (NTRS)

    Lehtinen, B.; Dehoff, R. L.; Hackney, R. D.

    1979-01-01

    The F100 multivariable control synthesis (MVCS) program was aimed at demonstrating the benefits of LQR synthesis theory in the design of a multivariable engine control system for operation throughout the flight envelope. The advantages of such procedures include: (1) enhanced performance from cross-coupled controls, (2) maximum use of engine variable geometry, and (3) a systematic design procedure that can be applied efficiently to new engine systems. The control system designed under the MVCS program for the Pratt & Whitney F100 turbofan engine is described. Basic components of the control include: (1) a reference value generator for deriving a desired equilibrium state and an approximate control vector, (2) a transition model to produce compatible reference point trajectories during gross transients, (3) gain schedules for producing feedback terms appropriate to the flight condition, and (4) integral switching logic to produce acceptable steady-state performance without exceeding engine operating limits.

  6. Multivariate multiscale entropy of financial markets

    NASA Astrophysics Data System (ADS)

    Lu, Yunfan; Wang, Jun

    2017-11-01

    In quantifying the dynamical properties of complex phenomena in financial market systems, multivariate financial time series are of wide concern. In this work, considering the shortcomings and limitations of univariate multiscale entropy in analyzing multivariate time series, the multivariate multiscale sample entropy (MMSE), which can evaluate the complexity in multiple data channels over different timescales, is applied to quantify the complexity of financial markets. Its effectiveness and advantages are demonstrated in numerical simulations with two well-known synthetic noise signals. For the first time, the complexity of four generated trivariate return series for each stock trading hour in the Chinese stock markets is quantified through this interdisciplinary application of the method. We find that the complexity of the trivariate return series in each hour shows a significant decreasing trend as the trading day progresses. Further, the shuffled multivariate return series and the absolute multivariate return series are also analyzed. As another new attempt, the complexity of global stock markets (Asia, Europe and America) is quantified by analyzing their multivariate returns. Finally, we utilize the multivariate multiscale entropy to assess the relative complexity of normalized multivariate return volatility series with different degrees.
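    The scale dimension of MMSE comes from coarse-graining each channel before the multivariate sample entropy is evaluated. The helper below sketches only that step, with assumed argument names; the sample-entropy computation and the tolerance settings used in the paper are omitted.

    ```python
    import numpy as np

    def coarse_grain(X, scale):
        """Coarse-graining step of multivariate multiscale entropy.

        X     : (T, p) multivariate series (e.g. a trivariate return series).
        scale : timescale factor; non-overlapping windows of this length
                are averaged within each channel.
        """
        T, p = X.shape
        n = T // scale
        return X[:n * scale].reshape(n, scale, p).mean(axis=1)
    ```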

  7. Remote-sensing data processing with the multivariate regression analysis method for iron mineral resource potential mapping: a case study in the Sarvian area, central Iran

    NASA Astrophysics Data System (ADS)

    Mansouri, Edris; Feizi, Faranak; Jafari Rad, Alireza; Arian, Mehran

    2018-03-01

    This paper uses multivariate regression to create a mathematical model for iron skarn exploration in the Sarvian area, central Iran, applying multivariate regression for mineral prospectivity mapping (MPM). The main target of this paper is to apply multivariate regression analysis (as an MPM method) to map iron outcrops in the northeastern part of the study area in order to discover new iron deposits in other parts of the study area. Two types of multivariate regression models using two linear equations were employed to discover new mineral deposits. This method is one of the reliable methods for processing satellite images. ASTER satellite images (14 bands) were used as unique independent variables (UIVs), and iron outcrops were mapped as the dependent variable for MPM. According to the probability value (p value), the coefficient of determination (R2) and the adjusted coefficient of determination (Radj2), the second regression model (which consisted of multiple UIVs) fitted better than the other model. The accuracy of the model was confirmed by the iron outcrop map and geological observations. Based on field observations, iron mineralization occurs at the contact of limestone and intrusive rocks (skarn type).

  8. Implementation Challenges for Multivariable Control: What You Did Not Learn in School

    NASA Technical Reports Server (NTRS)

    Garg, Sanjay

    2008-01-01

    Multivariable control allows controller designs that can provide decoupled command tracking and robust performance in the presence of modeling uncertainties. Although the last two decades have seen extensive development of multivariable control theory and example applications to complex systems in software/hardware simulations, there are no production flying systems, aircraft or spacecraft, that use multivariable control. This is because of the tremendous challenges associated with the implementation of such multivariable control designs. Unfortunately, school curricula do not provide sufficient time to expose students to these implementation challenges. The objective of this paper is to share the lessons learned by a practitioner of multivariable control in the process of applying some of the modern control theory to the Integrated Flight Propulsion Control (IFPC) design for an advanced Short Take-Off Vertical Landing (STOVL) aircraft simulation.

  9. Predict or classify: The deceptive role of time-locking in brain signal classification

    NASA Astrophysics Data System (ADS)

    Rusconi, Marco; Valleriani, Angelo

    2016-06-01

    Several experimental studies claim to be able to predict the outcome of simple decisions from brain signals measured before subjects are aware of their decision. Often, these studies use multivariate pattern recognition methods with the underlying assumption that the ability to classify the brain signal is equivalent to predicting the decision itself. Here we show instead that it is possible to correctly classify a signal even if it does not contain any predictive information about the decision. We first define a simple stochastic model that mimics the random decision process between two equivalent alternatives, and generate a large number of independent trials that contain no choice-predictive information. The trials are first time-locked to the time point of the final event and then classified using standard machine-learning techniques. The resulting classification accuracy is above chance level long before the time point of time-locking. We then analyze the same trials using information theory. We demonstrate that the high classification accuracy is a consequence of time-locking and that its time behavior is simply related to the large relaxation time of the process. We conclude that when time-locking is a crucial step in the analysis of neural activity patterns, both the emergence and the timing of the classification accuracy are affected by structural properties of the network that generates the signal.
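    The effect is easy to reproduce with a toy model. The sketch below simulates unbiased random-walk "decisions" between two bounds, time-locks each trial to its final event, and classifies the outcome from the signal value a fixed number of steps before absorption; accuracy is well above chance even though the outcome is not predictable at any fixed clock time. This is an illustrative reconstruction under assumed parameters, not the authors' model or classifier.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def random_walk_trial(bound=20):
        """Unbiased random walk between absorbing bounds; the 'decision'
        is whichever bound is hit first."""
        x, path = 0, [0]
        while abs(x) < bound:
            x += rng.choice((-1, 1))
            path.append(x)
        return np.array(path), np.sign(x)

    k, n_trials, correct = 30, 2000, 0
    for _ in range(n_trials):
        path, outcome = random_walk_trial()
        idx = max(len(path) - 1 - k, 0)      # k steps before the final event
        s = np.sign(path[idx])
        # Break ties at zero with a coin flip so chance level stays at 0.5.
        correct += (s == outcome) if s != 0 else (rng.random() < 0.5)
    print("time-locked accuracy:", correct / n_trials)   # well above 0.5
    ```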

  10. Determination of boiling point of petrochemicals by gas chromatography-mass spectrometry and multivariate regression analysis of structural activity relationship.

    PubMed

    Fakayode, Sayo O; Mitchell, Breanna S; Pollard, David A

    2014-08-01

    Accurate understanding of analyte boiling points (BP) is of critical importance in gas chromatographic (GC) separation and crude oil refinery operation in petrochemical industries. This study reported the first combined use of GC separation and partial least squares (PLS1) multivariate regression analysis of petrochemical structure-activity relationships (SAR) for accurate BP determination of two commercially available (D3710 and MA VHP) calibration gas mix samples. The results of the BP determination using PLS1 multivariate regression were further compared with the results of the traditional simulated distillation method of BP determination. The developed PLS1 regression was able to correctly predict analyte BPs in the D3710 and MA VHP calibration gas mix samples, with root-mean-square percent relative errors (RMS%RE) of 6.4% and 10.8%, respectively. In contrast, the overall RMS%RE values of 32.9% and 40.4% obtained for BP determination in D3710 and MA VHP, respectively, using the traditional simulated distillation method were approximately four times larger than the corresponding values obtained with multivariate regression analysis (MRA), demonstrating the better predictive ability of MRA. The reported method is rapid, robust, and promising, and can potentially be used routinely for fast analysis, pattern recognition, and analyte BP determination in petrochemical industries. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Deformation integrity monitoring for GNSS positioning services including local, regional and large scale hazard monitoring - the Karlsruhe approach and software(MONIKA)

    NASA Astrophysics Data System (ADS)

    Jaeger, R.

    2007-05-01

    GNSS positioning services like SAPOS/ascos in Germany, and many others in Europe, America and worldwide, have quickly found interdisciplinary, country-wide use for precise geo-referencing, replacing traditional low-order geodetic networks. It therefore becomes necessary that possible changes in the reference stations' coordinates are detected promptly. The GNSS reference-station MONitoring by the KArlsruhe approach and software (MONIKA) is designed for that task. The developments at Karlsruhe University of Applied Sciences, in cooperation with the State Survey of Baden-Württemberg, are further motivated by the 2006 official resolution of the German state survey departments' association (Arbeitsgemeinschaft der Vermessungsverwaltungen Deutschland (AdV)) on coordinate monitoring as a quality-control duty of GNSS positioning service providers. The presented approach can - besides the coordinate control of GNSS positioning services - also be used to set up any GNSS service for the tasks of an area-wide geodynamical and natural-disaster-prevention service. The mathematical model of the approach, which enables a multivariate and multi-epochal design, is based on the RINEX observation data of the GNSS service, followed by fully automatic processing of baselines and/or sessions, and a near-online setup of epoch state vectors and their covariance matrices in a rigorous 3D network adjustment. In large-scale and long-term monitoring situations, standard geodynamical trends (datum drift, plate movements, etc.) are accordingly considered and included in the mathematical model of MONIKA. The coordinate-based deformation monitoring, as the third step of the stepwise adjustments, is based on the above epoch state vectors and - after splitting off geodynamic trends - on multivariate and multi-epochal congruency testing. As long as no other information exists, all points are assumed to be stable, congruent reference points. Stations that are a priori assumed to be moving - in this way local monitoring areas can be included - are monitored and analyzed with reference to the stable reference points. In this way, a high sensitivity for the detection of GNSS station displacements can be achieved, both for assumed stable points and for a priori moving points. The results for the concept are shown for a monitoring example using the MONIKA software in the 300 x 300 km area of the state of Baden-Württemberg, Germany.

  12. Multi-Fault Diagnosis of Rolling Bearings via Adaptive Projection Intrinsically Transformed Multivariate Empirical Mode Decomposition and High Order Singular Value Decomposition

    PubMed Central

    Lv, Yong; Song, Gangbing

    2018-01-01

    Rolling bearings are important components in rotary machinery systems. In the field of multi-fault diagnosis of rolling bearings, the vibration signal collected from a single channel tends to miss some fault characteristic information. Using multiple sensors to collect signals at different locations on the machine to obtain a multivariate signal can remedy this problem. The adverse effect of a power imbalance between the various channels is inevitable and unfavorable for multivariate signal processing. As a useful multivariate signal processing method, adaptive-projection intrinsically transformed multivariate empirical mode decomposition (APIT-MEMD) exhibits better performance than MEMD by adopting an adaptive projection strategy to alleviate power imbalances. The filter bank properties of APIT-MEMD are also exploited to obtain more accurate and stable intrinsic mode functions (IMFs) and to ease mode mixing problems in multi-fault frequency extraction. By aligning IMF sets into a third-order tensor, high order singular value decomposition (HOSVD) can be employed to estimate the fault number. Fault correlation factor (FCF) analysis is used to conduct correlation analysis in order to determine effective IMFs; the characteristic frequencies of multiple faults can then be extracted. Numerical simulations and application to a multi-fault situation demonstrate that the proposed method is promising for multi-fault diagnosis of multivariate rolling bearing signals. PMID:29659510
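    The tensor step can be sketched directly with NumPy: stack the aligned IMF sets into a third-order array and take the SVD of each mode-n unfolding. The function below is a generic HOSVD illustration under an assumed (channels, IMFs, samples) layout; it is not the authors' implementation of fault-number estimation.

    ```python
    import numpy as np

    def hosvd(T):
        """Higher-order SVD of a 3rd-order tensor via mode-n unfoldings.

        T : (n_channels, n_imfs, n_samples) array built by aligning the
            IMF sets from each sensor channel (assumed layout).
        Returns the core tensor and the three factor matrices; the decay
        of the mode singular values can inform the fault-number estimate.
        """
        factors = []
        for mode in range(3):
            unfolded = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
            U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
            factors.append(U)
        core = T
        for mode, U in enumerate(factors):
            core = np.moveaxis(
                np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
        return core, factors
    ```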

  13. Multi-Fault Diagnosis of Rolling Bearings via Adaptive Projection Intrinsically Transformed Multivariate Empirical Mode Decomposition and High Order Singular Value Decomposition.

    PubMed

    Yuan, Rui; Lv, Yong; Song, Gangbing

    2018-04-16

    Rolling bearings are important components in rotary machinery systems. In the field of multi-fault diagnosis of rolling bearings, the vibration signal collected from a single channel tends to miss some fault characteristic information. Using multiple sensors to collect signals at different locations on the machine to obtain a multivariate signal can remedy this problem. The adverse effect of a power imbalance between the various channels is inevitable and unfavorable for multivariate signal processing. As a useful multivariate signal processing method, adaptive-projection intrinsically transformed multivariate empirical mode decomposition (APIT-MEMD) exhibits better performance than MEMD by adopting an adaptive projection strategy to alleviate power imbalances. The filter bank properties of APIT-MEMD are also exploited to obtain more accurate and stable intrinsic mode functions (IMFs) and to ease mode mixing problems in multi-fault frequency extraction. By aligning IMF sets into a third-order tensor, high order singular value decomposition (HOSVD) can be employed to estimate the fault number. Fault correlation factor (FCF) analysis is used to conduct correlation analysis in order to determine effective IMFs; the characteristic frequencies of multiple faults can then be extracted. Numerical simulations and application to a multi-fault situation demonstrate that the proposed method is promising for multi-fault diagnosis of multivariate rolling bearing signals.

  14. Grammatical analysis as a distributed neurobiological function.

    PubMed

    Bozic, Mirjana; Fonteneau, Elisabeth; Su, Li; Marslen-Wilson, William D

    2015-03-01

    Language processing engages large-scale functional networks in both hemispheres. Although it is widely accepted that left perisylvian regions have a key role in supporting complex grammatical computations, patient data suggest that some aspects of grammatical processing could be supported bilaterally. We investigated the distribution and the nature of grammatical computations across language processing networks by comparing two types of combinatorial grammatical sequences--inflectionally complex words and minimal phrases--and contrasting them with grammatically simple words. Novel multivariate analyses revealed that they engage a coalition of separable subsystems: inflected forms triggered left-lateralized activation, dissociable into dorsal processes supporting morphophonological parsing and ventral, lexically driven morphosyntactic processes. In contrast, simple phrases activated a consistently bilateral pattern of temporal regions, overlapping with inflectional activations in L middle temporal gyrus. These data confirm the role of the left-lateralized frontotemporal network in supporting complex grammatical computations. Critically, they also point to the capacity of bilateral temporal regions to support simple, linear grammatical computations. This is consistent with a dual neurobiological framework where phylogenetically older bihemispheric systems form part of the network that supports language function in the modern human, and where significant capacities for language comprehension remain intact even following severe left hemisphere damage. Copyright © 2014 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  15. Scattered colorimetry and multivariate data processing as an objective tool for liquid mapping (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Mignani, A. G.; Ciaccheri, L.; Smith, P. R.; Cimato, A.; Attilio, C.; Huertas, R.; Melgosa Latorre, Manuel; Bertho, A. C.; O'Rourke, B.; McMillan, N. D.

    2005-05-01

    Scattered colorimetry, i.e., multi-angle and multi-wavelength absorption spectroscopy performed in the visible spectral range, was used to map three kinds of liquids: extra virgin olive oils, frying oils, and detergents in water. By multivariate processing of the spectral data, the liquids could be classified according to their intrinsic characteristics: geographic area of the extra virgin olive oils, degradation of the frying oils, and surfactant types and mixtures in water.

  16. Multidimensional stochastic approximation using locally contractive functions

    NASA Technical Reports Server (NTRS)

    Lawton, W. M.

    1975-01-01

    A Robbins-Monro type multidimensional stochastic approximation algorithm which converges in mean square and with probability one to the fixed point of a locally contractive regression function is developed. The algorithm is applied to obtain maximum likelihood estimates of the parameters for a mixture of multivariate normal distributions.
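    A generic Robbins-Monro iteration for a contractive regression function takes only a few lines. The sketch below uses harmonic step sizes and a toy noisy map; it illustrates the class of algorithm described, not the paper's specific scheme or its application to mixture-model likelihoods.

    ```python
    import numpy as np

    def robbins_monro(g_noisy, x0, n_iter=5000, a=1.0):
        """Robbins-Monro iteration for the fixed point of a contractive map.

        g_noisy(x) returns a noisy evaluation of a (locally) contractive
        regression function g; under the usual step-size conditions the
        iterate converges to x* with g(x*) = x*.
        """
        x = np.asarray(x0, dtype=float)
        for k in range(1, n_iter + 1):
            step = a / k                       # harmonic step sizes
            x = x + step * (g_noisy(x) - x)    # move toward the fixed point
        return x

    # Toy usage: g(x) = 0.5*x + 1 has fixed point 2 in each coordinate.
    rng = np.random.default_rng(0)
    g = lambda x: 0.5 * x + 1 + rng.normal(scale=0.1, size=x.shape)
    print(robbins_monro(g, np.zeros(3)))       # approx. [2, 2, 2]
    ```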

  17. Nonlinear and adaptive control

    NASA Technical Reports Server (NTRS)

    Athans, Michael

    1989-01-01

    The primary thrust of the research was to conduct fundamental research in the theories and methodologies for designing complex high-performance multivariable feedback control systems, and to conduct feasibility studies in application areas of interest to NASA sponsors that point out advantages and shortcomings of available control system design methodologies.

  18. Evaluation of a stepwise, multi-objective, multi-variable parameter optimization method for the APEX model

    USDA-ARS?s Scientific Manuscript database

    Hydrologic models are essential tools for environmental assessment of agricultural non-point source pollution. The automatic calibration of hydrologic models, though efficient, demands significant computational power, which can limit its application. The study objective was to investigate a cost e...

  19. Application of a quality by design approach to the cell culture process of monoclonal antibody production, resulting in the establishment of a design space.

    PubMed

    Nagashima, Hiroaki; Watari, Akiko; Shinoda, Yasuharu; Okamoto, Hiroshi; Takuma, Shinya

    2013-12-01

    This case study describes the application of Quality by Design elements to the process of culturing Chinese hamster ovary cells in the production of a monoclonal antibody. All steps in the cell culture process and all process parameters in each step were identified by using a cause-and-effect diagram. Prospective risk assessment using failure mode and effects analysis identified the following four potential critical process parameters in the production culture step: initial viable cell density, culture duration, pH, and temperature. These parameters and lot-to-lot variability in raw materials were then evaluated by process characterization utilizing a design of experiments approach consisting of a face-centered central composite design integrated with a full factorial design. Process characterization was conducted using a scaled-down model that had been qualified by comparison with large-scale production data. Multivariate regression analysis was used to establish statistical prediction models for performance indicators and quality attributes; with these, we constructed contour plots and conducted Monte Carlo simulation to clarify the design space. The statistical analyses, especially for raw materials, identified set point values that were most robust to lot-to-lot raw material variability while keeping product quality within the acceptance criteria. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
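    The final step, propagating parameter variability through the fitted prediction models, is straightforward to sketch. The snippet below uses a hypothetical quadratic response model and assumed variability around the set points to estimate the probability that a quality attribute stays within an acceptance limit; the coefficients, limits, and distributions are illustrative and are not taken from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def predict_quality(ph, temp):
        """Hypothetical fitted response model in coded units (pH, temperature)."""
        return 95.0 + 1.2 * ph - 0.8 * temp - 1.5 * ph * temp - 0.6 * temp ** 2

    def prob_in_spec(ph_set, temp_set, lower=93.0, n=10_000, noise=0.5):
        """Monte Carlo estimate of the probability that the attribute stays
        above the acceptance limit, given variation around the set points."""
        ph = rng.normal(ph_set, 0.1, n)        # assumed control variability
        temp = rng.normal(temp_set, 0.1, n)
        y = predict_quality(ph, temp) + rng.normal(0, noise, n)
        return np.mean(y >= lower)

    print(prob_in_spec(0.0, 0.0))              # probability at the center point
    ```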

  20. Relationship between cardiac vagal activity and mood congruent memory bias in major depression.

    PubMed

    Garcia, Ronald G; Valenza, Gaetano; Tomaz, Carlos A; Barbieri, Riccardo

    2016-01-15

    Previous studies suggest that autonomic reactivity during encoding of emotional information could modulate the neural processes mediating mood-congruent memory. In this study, we use a point-process model to determine dynamic autonomic tone in response to negative emotions and its influence on long-term memory in subjects with major depression. Forty-eight patients with major depression and 48 healthy controls were randomly assigned to either neutral or emotionally arousing audiovisual stimuli. An adaptive point-process algorithm was applied to compute instantaneous estimates of the spectral components of heart rate variability [low frequency (LF), 0.04-0.15 Hz; high frequency (HF), 0.15-0.4 Hz]. Three days later subjects were given a recall test. A significant increase in HF power was observed in depressed subjects in response to the emotionally arousing stimulus (p=0.03). The results of a multivariate analysis revealed that HF power during the emotional segment of the stimulus was independently associated with the recall test score in depressed subjects, after adjusting for age, gender and educational level (coef. 0.003, 95% CI 0.0009-0.005, p=0.008). These results can only be interpreted as responses to the elicitation of specific negative emotions; the relationship between HF changes and encoding/recall of positive stimuli should be examined further. Alterations in the parasympathetic response to emotion are involved in the mood-congruent cognitive bias observed in major depression. These findings are clinically relevant because they could constitute the mechanism by which depressed patients maintain maladaptive patterns of negative information processing that trigger and sustain depressed mood. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. A Longitudinal Study of Lexical Development in Children Learning Vietnamese and English

    PubMed Central

    Pham, Giang; Kohnert, Kathryn

    2013-01-01

    This longitudinal study modeled lexical development among children who spoke Vietnamese as a first language (L1) and English as a second language (L2). Participants (n=33, initial mean age of 7.3 years) completed a total of eight tasks (four in each language) that measured vocabulary knowledge and lexical processing at four yearly time points. Multivariate hierarchical linear modeling was used to calculate L1 and L2 trajectories within the same model for each task. Main findings included (a) positive growth in each language, (b) greater gains in English resulting in shifts toward L2 dominance, and (c) different patterns for receptive and expressive domains. Timing of shifts to L2 dominance underscored L1 skills that are resilient and vulnerable to increases in L2 proficiency. PMID:23869741

  2. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    PubMed

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

    In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique include: ① in-line collection of process spectra for different techniques; ② unfolding of the 3-D process spectra; ③ determination of the process trajectories and their normal limits; ④ monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines are reviewed briefly. Based on a comprehensive introduction of our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several practical problems that need urgent solutions are identified, and the application prospects of NIRS-based process trajectory technology are discussed. Copyright© by the Chinese Pharmaceutical Association.

  3. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently increasing substantially. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications a common strategy is to monitor production chains with several sensors coupled to a statistical process control (SPC) algorithm. The basic idea is to raise an alarm when the sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, these industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied in SPC systems and evaluate their capability to detect, e.g., known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event, and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu/index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected, as well as anomalies which are not detectable with univariate methods.
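    One standard SPC detector of this kind is the Hotelling T^2 control chart. The sketch below shows a generic implementation for a multivariate data stream; the variable names, the in-control reference sample, and the F-distribution control limit are illustrative assumptions rather than details from the study.

    ```python
    import numpy as np
    from scipy import stats

    def hotelling_t2(X_ref, X_new, alpha=0.01):
        """Hotelling T^2 chart for multivariate anomaly detection.

        X_ref : (n, p) in-control reference sample (e.g. pre-processed EO
                variables at one grid cell); X_new : (m, p) new observations.
        Returns the T^2 scores and an 'out of control' flag per observation.
        """
        mu = X_ref.mean(axis=0)
        S_inv = np.linalg.inv(np.cov(X_ref, rowvar=False))
        d = X_new - mu
        t2 = np.einsum('ij,jk,ik->i', d, S_inv, d)
        n, p = X_ref.shape
        # Phase-II control limit for single observations from the F distribution.
        ucl = p * (n - 1) * (n + 1) / (n * (n - p)) * stats.f.ppf(1 - alpha, p, n - p)
        return t2, t2 > ucl
    ```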

  4. Extracting galactic structure parameters from multivariated density estimation

    NASA Technical Reports Server (NTRS)

    Chen, B.; Creze, M.; Robin, A.; Bienayme, O.

    1992-01-01

    Multivariate statistical analysis, including cluster analysis (unsupervised classification), discriminant analysis (supervised classification) and principal component analysis (a dimensionality reduction method), together with nonparametric density estimation, has been successfully used to search for meaningful associations in the 5-dimensional space of observables between observed points and sets of simulated points generated from a synthetic approach to galaxy modelling. These methodologies can be applied as new tools to obtain information about hidden structure that is otherwise unrecognizable, and to place important constraints on the space distribution of various stellar populations in the Milky Way. In this paper, we concentrate on illustrating how to use nonparametric density estimation to substitute for the true densities of both the simulated sample and the real sample in the five-dimensional space. In order to fit model-predicted densities to reality, we derive a set of equations comprising n lines (where n is the total number of observed points) and m unknown parameters (where m is the number of predefined groups). A least-squares estimation then allows us to determine the density law of the different groups and components in the Galaxy. The output from our software, which can be used in many research fields, also gives the systematic error between the model and the observation by a Bayes rule.
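    Nonparametric density estimation of this kind can be sketched with a Gaussian kernel density estimator. The snippet below uses synthetic placeholder arrays in place of the simulated and observed samples; the bandwidth choice and the least-squares fit of group weights described in the abstract are not reproduced.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    simulated = rng.normal(size=(2000, 5))   # points from a galaxy model (placeholder)
    observed = rng.normal(size=(500, 5))     # points from a catalogue (placeholder)

    kde_sim = gaussian_kde(simulated.T)      # gaussian_kde expects (dims, n)
    kde_obs = gaussian_kde(observed.T)

    # Densities evaluated at the observed points; ratios of such terms could
    # feed a least-squares fit of group weights as described above.
    ratio = kde_obs(observed.T) / kde_sim(observed.T)
    print(ratio[:5])
    ```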

  5. Palliative Care Specialist Consultation Is Associated With Supportive Care Quality in Advanced Cancer.

    PubMed

    Walling, Anne M; Tisnado, Diana; Ettner, Susan L; Asch, Steven M; Dy, Sydney M; Pantoja, Philip; Lee, Martin; Ahluwalia, Sangeeta C; Schreibeis-Baum, Hannah; Malin, Jennifer L; Lorenz, Karl A

    2016-10-01

    Although recent randomized controlled trials support early palliative care for patients with advanced cancer, the specific processes of care associated with these findings and whether these improvements can be replicated in the broader health care system are uncertain. The aim of this study was to evaluate the occurrence of palliative care consultation and its association with specific processes of supportive care in a national cohort of Veterans, using the Cancer Quality ASSIST (Assessing Symptoms Side Effects and Indicators of Supportive Treatment) measures. We abstracted data from the medical records of 719 patients diagnosed with advanced lung, colorectal, or pancreatic cancer in 2008 who received care in the Veterans Affairs Health System, following them for three years or until death, to evaluate the association of palliative care specialty consultation with the quality of supportive care overall and by domain using a multivariate regression model. All but 54 of the 719 patients died within three years, and 293 received at least one palliative care consult. Patients evaluated by a palliative care specialist at diagnosis scored seven percentage points higher overall (P < 0.001) and 11 percentage points higher (P < 0.001) within the information and care planning domain compared with those without a consult. Early palliative care specialist consultation is associated with better quality of supportive care in three advanced cancers, predominantly driven by improvements in information and care planning. This study supports the effectiveness of early palliative care consultation in three common advanced cancers within the Veterans Affairs Health System and provides a greater understanding of which care processes palliative care teams influence. Published by Elsevier Inc.

  6. Method of identifying clusters representing statistical dependencies in multivariate data

    NASA Technical Reports Server (NTRS)

    Borucki, W. J.; Card, D. H.; Lyle, G. C.

    1975-01-01

    The approach is first to cluster the data and then to compute spatial boundaries for the resulting clusters. The next step is to compute, from a set of Monte Carlo samples obtained from scrambled data, estimates of the probabilities of obtaining at least as many points within the boundaries as were actually observed in the original data.

  7. Educational Tool for Optimal Controller Tuning Using Evolutionary Strategies

    ERIC Educational Resources Information Center

    Carmona Morales, D.; Jimenez-Hornero, J. E.; Vazquez, F.; Morilla, F.

    2012-01-01

    In this paper, an optimal tuning tool is presented for control structures based on multivariable proportional-integral-derivative (PID) control, using genetic algorithms as an alternative to traditional optimization algorithms. From an educational point of view, this tool provides students with the necessary means to consolidate their knowledge on…

  8. A Multivariate Descriptive Model of Motivation for Orthodontic Treatment.

    ERIC Educational Resources Information Center

    Hackett, Paul M. W.; And Others

    1993-01-01

    Motivation for receiving orthodontic treatment was studied among 109 young adults, and a multivariate model of the process is proposed. The combination of smallest space analysis and Partial Order Scalogram Analysis by base Coordinates (POSAC) illustrates an interesting methodology for health treatment studies and explores motivation for dental…

  9. Readiness of Primary Care Practices for Medical Home Certification

    PubMed Central

    Clark, Sarah J.; Sakshaug, Joseph W.; Chen, Lena M.; Hollingsworth, John M.

    2013-01-01

    OBJECTIVES: To assess the prevalence of medical home infrastructure among primary care practices for children and identify practice characteristics associated with medical home infrastructure. METHODS: Cross-sectional analysis of restricted data files from 2007 and 2008 of the National Ambulatory Medical Care Survey. We mapped survey items to the 2011 National Committee for Quality Assurance Patient-Centered Medical Home standards. Points were awarded for each "passed" element based on National Committee for Quality Assurance scoring, and we then calculated the percentage of the total possible points met for each practice. We used multivariate linear regression to assess associations between practice characteristics and the percentage of medical home infrastructure points attained. RESULTS: On average, pediatric practices attained 38% (95% confidence interval 34%–41%) of medical home infrastructure points, and family/general practices attained 36% (95% confidence interval 33%–38%). Practices scored higher on medical home elements related to direct patient care (eg, providing comprehensive health assessments) and lower in areas highly dependent on health information technology (eg, computerized prescriptions, test ordering, laboratory result viewing, or quality of care measurement and reporting). In multivariate analyses, smaller practice size was significantly associated with lower infrastructure scores. Practice ownership, urban versus rural location, and proportion of visits covered by public insurers were not consistently associated with a practice's infrastructure score. CONCLUSIONS: Medical home programs need effective approaches to support practice transformation in the small practices that provide the vast majority of primary care for children in the United States. PMID:23382438

  10. Multivariate Analysis and Machine Learning in Cerebral Palsy Research

    PubMed Central

    Zhang, Jing

    2017-01-01

    Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP. PMID:29312134

  11. Multivariate Analysis and Machine Learning in Cerebral Palsy Research.

    PubMed

    Zhang, Jing

    2017-01-01

    Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP.

  12. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and sources of uncertainty. Single CPT soundings were modeled as rational probability density curves by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, through numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point is calculated within the Bayesian reverse interpolation framework. The results were compared between Gaussian sequential stochastic simulation and the Bayesian method. The differences between single CPT soundings following a normal distribution and simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, whereas more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi-precision information assimilation method for other geotechnical parameters.

  13. Structural brain connectivity and cognitive ability differences: A multivariate distance matrix regression analysis.

    PubMed

    Ponsoda, Vicente; Martínez, Kenia; Pineda-Pardo, José A; Abad, Francisco J; Olea, Julio; Román, Francisco J; Barbey, Aron K; Colom, Roberto

    2017-02-01

    Neuroimaging research involves analyses of huge amounts of biological data that might or might not be related to cognition. This relationship is usually approached using univariate methods, and, therefore, correction methods are mandatory for reducing false positives. Nevertheless, the probability of false negatives is also increased. Multivariate frameworks have been proposed to help alleviate this balance. Here we apply multivariate distance matrix regression to the simultaneous analysis of biological and cognitive data, namely, structural connections among 82 brain regions and several latent factors estimating cognitive performance. We tested whether cognitive differences predict distances among individuals regarding their connectivity pattern. Beginning with 3,321 connections among regions, the 36 edges best predicted by the individuals' cognitive scores were selected. Cognitive scores were related to connectivity distances in both the full (3,321) and reduced (36) connectivity patterns. The selected edges connect regions distributed across the entire brain, and the network defined by these edges supports high-order cognitive processes such as (a) (fluid) executive control, (b) (crystallized) recognition, learning, and language processing, and (c) visuospatial processing. This multivariate study suggests that a widespread but limited number of regions in the human brain supports high-level cognitive ability differences. Hum Brain Mapp 38:803-816, 2017. © 2016 Wiley Periodicals, Inc.

  14. Professional quality of life and organizational changes: a five-year observational study in Primary Care

    PubMed Central

    Martin-Fernandez, Jesus; Gomez-Gascon, Tomas; Beamud-Lagos, Milagros; Cortes-Rubio, Jose Alfonso; Alberquilla-Menendez-Asenjo, Angel

    2007-01-01

    Background The satisfaction and the quality of life perceived by professionals have implications for the performance of health organizations. We assessed the variations in professional quality of life (PQL) and their explanatory factors during a services management decentralization process. Methods This was a longitudinal analytical observational study in a health area in Madrid, Spain. Three surveys were sent out during an ongoing management decentralization process between 2001 and 2005. The professionals surveyed were divided into three groups: group I (97.3% physicians), group II (92.5% nurses) and group III (auxiliary personnel). Analysis of the tendency and elaboration of an explanatory multivariate model were carried out. The PQL-35 questionnaire, based on Karasek's demand-control theory, was used to measure PQL. This questionnaire recognizes three PQL dimensions: management support (MS), workload (WL) and intrinsic motivation (IM). Results 1444 responses were analyzed. PQL increased 0.16 (CI 95% 0.04 – 0.28) points in each survey. Group II presents over time a PQL score 0.38 (CI 95% 0.18 – 0.59) points higher than group I. There is no difference between groups I and III. For each point that MS increases, PQL increases between 0.44 and 0.59 points. PQL decreases an average of between 0.35 and 0.49 points for each point that WL increases. Age appears to have a marginal association with PQL (CI 95% 0.00 – 0.02), as does being single or not having a stable relationship (CI 95% 0.01 – 0.41). Performing management tasks currently or in the past is related to poorer PQL perception (CI 95% -0.45 – -0.06), and the same holds for working shifts other than the morning shift (CI 95% -0.03 – -0.40 points). PQL is not related to sex, location of the centre (rural/urban), time spent working in the organization or contractual situation. Conclusion With improved work control and by avoiding increases in workload, PQL perception can be maintained despite deep organizational changes at the macro-management level. Different professional groups experience different perceptions depending on how the changes impact their position in the organization. PMID:17610728

  15. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    PubMed

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several developed and improved combinatorial techniques to optimize processing conditions and material properties of organic thin films. The combinatorial approach allows the investigation of multi-variable dependencies and is well suited to studying organic thin films for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the application of combined composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow precise trends to be identified for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we conclusively verify the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.

  16. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include the glycosylation profile of the final product along with process attributes such as viable cell density and level of antibody expression. These were related to process variables, raw material components of the chemically defined hybridoma media, concentrations of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
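
    Multivariate analyses that relate process variables and raw materials to quality attributes are commonly built with projection methods such as partial least squares (PLS). The sketch below, with entirely hypothetical data and variable counts, shows how such a model could be set up in Python with scikit-learn; the study's actual software and model structure may differ.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        # hypothetical fermentation runs: X = process variables and raw-material levels,
        # Y = product quality attributes (e.g. glycan fractions, titre)
        X = rng.normal(size=(30, 12))
        Y = X[:, :3] @ rng.normal(size=(3, 4)) + 0.1 * rng.normal(size=(30, 4))

        pls = PLSRegression(n_components=3).fit(X, Y)
        print("overall R2:", round(pls.score(X, Y), 2))
        # loading weights indicate which inputs drive each latent component
        print("first-component X weights:", np.round(pls.x_weights_[:, 0], 2))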

  17. Fining of Red Wine Monitored by Multiple Light Scattering.

    PubMed

    Ferrentino, Giovanna; Ramezani, Mohsen; Morozova, Ksenia; Hafner, Daniela; Pedri, Ulrich; Pixner, Konrad; Scampicchio, Matteo

    2017-07-12

    This work describes a new approach based on multiple light scattering to study red wine clarification processes. The whole spectral signal (1933 backscattering points along the length of each sample vial) was fitted by a multivariate kinetic model built on a three-step mechanism, implying (1) adsorption of wine colloids to fining agents, (2) aggregation into larger particles, and (3) sedimentation. Each step is characterized by a reaction rate constant. According to the first reaction, the results showed that gelatin was the most efficient fining agent with respect to the main objective, the clarification of the wine and the consequent increase in its limpidity. This trend was also discussed in relation to the results achieved by nephelometry, total phenols, ζ-potential, color, sensory, and electronic nose analyses. Also, higher concentrations of the fining agent (from 5 to 30 g/100 L) or higher temperatures (from 10 to 20 °C) sped up the process. Finally, the advantage of using the whole spectral signal vs classical univariate approaches was demonstrated by comparing the uncertainty associated with the rate constants of the proposed kinetic model. Overall, the multiple light scattering technique showed great potential for studying fining processes compared to classical univariate approaches.
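
    A three-step kinetic scheme of this kind (adsorption, aggregation, sedimentation) can be fitted to a backscattering decay curve by combining an ODE solver with nonlinear least squares. The sketch below is a simplified, hypothetical version of such a fit (synthetic data, first-order steps); the multivariate model in the paper fits the full spectral signal rather than a single trace.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import least_squares

        def rhs(t, y, k1, k2, k3):
            # c: free colloids, x: colloid-fining complexes, a: aggregates, s: sediment
            c, x, a, s = y
            return [-k1 * c, k1 * c - k2 * x, k2 * x - k3 * a, k3 * a]

        def suspended_fraction(k, t):
            sol = solve_ivp(rhs, (t[0], t[-1]), [1.0, 0.0, 0.0, 0.0], t_eval=t, args=tuple(k))
            return sol.y[:3].sum(axis=0)        # backscattering assumed proportional to suspended matter

        t_obs = np.linspace(0.0, 48.0, 60)      # hours (hypothetical sampling grid)
        rng = np.random.default_rng(0)
        y_obs = suspended_fraction([0.30, 0.15, 0.05], t_obs) + rng.normal(0, 0.01, t_obs.size)

        fit = least_squares(lambda k: suspended_fraction(k, t_obs) - y_obs,
                            x0=[0.1, 0.1, 0.1], bounds=(0.0, np.inf))
        print("estimated rate constants (1/h):", np.round(fit.x, 3))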

  18. Top-down beta oscillatory signaling conveys behavioral context in early visual cortex.

    PubMed

    Richter, Craig G; Coppola, Richard; Bressler, Steven L

    2018-05-03

    Top-down modulation of sensory processing is a critical neural mechanism subserving numerous important cognitive roles, one of which may be to inform lower-order sensory systems of the current 'task at hand' by conveying behavioral context to these systems. Accumulating evidence indicates that top-down cortical influences are carried by directed interareal synchronization of oscillatory neuronal populations, with recent results pointing to beta-frequency oscillations as particularly important for top-down processing. However, it remains to be determined if top-down beta-frequency oscillations indeed convey behavioral context. We measured spectral Granger Causality (sGC) using local field potentials recorded from microelectrodes chronically implanted in visual areas V1/V2, V4, and TEO of two rhesus macaque monkeys, and applied multivariate pattern analysis to the spatial patterns of top-down sGC. We decoded behavioral context by discriminating patterns of top-down (V4/TEO-to-V1/V2) beta-peak sGC for two different task rules governing correct responses to identical visual stimuli. The results indicate that top-down directed influences are carried to visual cortex by beta oscillations, and differentiate task demands even before visual stimulus processing. They suggest that top-down beta-frequency oscillatory processes coordinate processing of sensory information by conveying global knowledge states to early levels of the sensory cortical hierarchy independently of bottom-up stimulus-driven processing.

  19. Simulation of multivariate stationary stochastic processes using dimension-reduction representation methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui; Peng, Yongbo

    2018-03-01

    In view of the Fourier-Stieltjes integral formula of multivariate stationary stochastic processes, a unified formulation accommodating the spectral representation method (SRM) and proper orthogonal decomposition (POD) is deduced. By introducing random functions as constraints correlating the orthogonal random variables involved in the unified formulation, the dimension-reduction spectral representation method (DR-SRM) and the dimension-reduction proper orthogonal decomposition (DR-POD) are addressed. The proposed schemes are capable of representing the multivariate stationary stochastic process with a few elementary random variables, bypassing the challenges of high-dimensional random variables inherent in conventional Monte Carlo methods. In order to accelerate the numerical simulation, the Fast Fourier Transform (FFT) is integrated with the proposed schemes. For illustrative purposes, the simulation of the horizontal wind velocity field along the deck of a large-span bridge is carried out using the proposed methods with 2 and 3 elementary random variables. Numerical simulation reveals the usefulness of the dimension-reduction representation methods.
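
    For reference, the classical spectral representation method for a bivariate stationary process can be sketched as below; the auto-spectra and cross-spectrum are hypothetical. The dimension-reduction variants (DR-SRM, DR-POD) would additionally tie the 2N independent random phases to a few elementary random variables, and FFT can be used to accelerate the cosine summations.

        import numpy as np

        N, dw = 1024, 0.01                            # number of frequency intervals and step (rad/s), assumed
        w = (np.arange(N) + 0.5) * dw
        t = np.arange(0.0, 600.0, 0.1)

        S11 = 1.0 / (1.0 + w ** 2)                    # assumed auto-spectra of the two components
        S22 = 0.5 / (1.0 + (w / 2.0) ** 2)
        S12 = 0.6 * np.exp(-w) * np.sqrt(S11 * S22)   # assumed (real) cross-spectrum

        rng = np.random.default_rng(0)
        phi = rng.uniform(0.0, 2.0 * np.pi, size=(2, N))   # independent random phase angles

        x1 = np.zeros_like(t)
        x2 = np.zeros_like(t)
        for k in range(N):
            # Cholesky factor of the 2x2 cross-spectral density matrix at frequency w[k]
            H = np.linalg.cholesky([[S11[k], S12[k]], [S12[k], S22[k]]])
            amp = np.sqrt(2.0 * dw)
            x1 += amp * H[0, 0] * np.cos(w[k] * t + phi[0, k])
            x2 += amp * (H[1, 0] * np.cos(w[k] * t + phi[0, k]) +
                         H[1, 1] * np.cos(w[k] * t + phi[1, k]))
        print(np.std(x1), np.std(x2))                 # sample checks against the target spectra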

  20. A Hybrid Index for Characterizing Drought Based on a Nonparametric Kernel Estimator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Shengzhi; Huang, Qiang; Leng, Guoyong

    This study develops a nonparametric multivariate drought index, namely, the Nonparametric Multivariate Standardized Drought Index (NMSDI), by considering the variations of both precipitation and streamflow. Building upon previous efforts in constructing nonparametric multivariate drought indices, we use the nonparametric kernel estimator to derive the joint distribution of precipitation and streamflow, thus providing additional insights for drought index development. The proposed NMSDI is applied in the Wei River Basin (WRB), based on which the drought evolution characteristics are investigated. Results indicate: (1) generally, NMSDI captures drought onset similar to the Standardized Precipitation Index (SPI) and drought termination and persistence similar to the Standardized Streamflow Index (SSFI); the drought events identified by NMSDI match well with historical drought records in the WRB, and the performances are also consistent with those of an existing Multivariate Standardized Drought Index (MSDI) at various timescales, confirming the validity of the newly constructed NMSDI in drought detection; (2) an increasing risk of drought has been detected for the past decades, and will persist to a certain extent in the future in most areas of the WRB; (3) the identified change points of annual NMSDI are mainly concentrated in the early 1970s and middle 1990s, coincident with extensive water use and soil conservation practices. This study highlights the nonparametric multivariate drought index, which can be used for drought detection and prediction efficiently and comprehensively.
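
    A minimal sketch of the kernel-based construction: estimate the joint distribution of precipitation and streamflow with a bivariate Gaussian kernel, take the joint non-exceedance probability, and map it to a standard normal score. The series below are synthetic, and the actual NMSDI additionally involves aggregation over timescales and possibly empirical plotting positions.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        precip = rng.gamma(2.0, 30.0, size=600)               # hypothetical monthly precipitation
        flow = 0.6 * precip + rng.gamma(2.0, 10.0, size=600)  # hypothetical monthly streamflow

        kde = stats.gaussian_kde(np.vstack([precip, flow]))   # joint kernel density estimate

        def drought_index(p, q):
            # P(P <= p, Q <= q); lower bounds of 0 since both variables are non-negative
            joint_cdf = kde.integrate_box([0.0, 0.0], [p, q])
            return stats.norm.ppf(np.clip(joint_cdf, 1e-6, 1.0 - 1e-6))

        scores = [drought_index(p, q) for p, q in zip(precip[:12], flow[:12])]
        print(np.round(scores, 2))                            # negative values indicate drier states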

  1. Predictive model for falling in Parkinson disease patients.

    PubMed

    Custodio, Nilton; Lira, David; Herrera-Perez, Eder; Montesinos, Rosa; Castro-Suarez, Sheila; Cuenca-Alfaro, Jose; Cortijo, Patricia

    2016-12-01

    Falls are a common complication of advancing Parkinson's disease (PD). Although numerous risk factors are known, reliable predictors of future falls are still lacking. The aim of this study was to develop a multivariate model to predict falling in PD patients. Prospective cohort of forty-nine PD patients. The area under the receiver-operating characteristic curve (AUC) was calculated to evaluate the predictive performance of the proposed multivariate model. The medians of PD duration and UPDRS-III score in the cohort were 6 years and 24 points, respectively. Falls occurred in 18 PD patients (30%). Predictive factors for falling identified by univariate analysis were age, PD duration, physical activity, and scores on the UPDRS motor scale, FOG, ACE, IFS, PFAQ and GDS (p-value < 0.001), as well as fear of falling score (p-value = 0.04). The final multivariate model (PD duration, FOG, ACE, and physical activity) showed an AUC = 0.9282 (correctly classified = 89.83%; sensitivity = 92.68%; specificity = 83.33%). This study showed that our multivariate model has high performance for predicting falling in a sample of PD patients.
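
    A model of this kind (multivariable logistic regression evaluated by the area under the ROC curve) can be sketched as follows; the predictor values and effect sizes are simulated placeholders, not the study data, and cross-validation is added here to guard against the optimism of an apparent AUC.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import StratifiedKFold, cross_val_predict
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 49                                             # cohort size from the abstract
        duration = rng.normal(6, 3, n)                     # PD duration (years)
        fog = rng.normal(8, 4, n)                          # freezing-of-gait score
        ace = rng.normal(85, 8, n)                         # cognitive screening score
        activity = rng.normal(3, 1, n)                     # physical activity level
        X = np.column_stack([duration, fog, ace, activity])
        risk = 0.4 * duration + 0.3 * fog - 0.1 * ace + rng.normal(0, 2, n)
        y = (risk > np.median(risk)).astype(int)           # simulated faller / non-faller labels

        proba = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                                  cv=StratifiedKFold(5, shuffle=True, random_state=0),
                                  method="predict_proba")[:, 1]
        print("cross-validated AUC:", round(roc_auc_score(y, proba), 3))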

  2. Parent education and biologic factors influence on cognition in sickle cell anemia

    PubMed Central

    King, Allison A.; Strouse, John J.; Rodeghier, Mark J.; Compas, Bruce E.; Casella, James F.; McKinstry, Robert C.; Noetzel, Michael J.; Quinn, Charles T.; Ichord, Rebecca; Dowling, Michael M.; Miller, J. Philip; DeBaun, Michael R.

    2015-01-01

    Children with sickle cell anemia have a high prevalence of silent cerebral infarcts (SCIs) that are associated with decreased full-scale intelligence quotient (FSIQ). While the educational attainment of parents is a known strong predictor of the cognitive development of children in general, the role of parental education in sickle cell anemia, along with other factors that adversely affect cognitive function (anemia, cerebral infarcts), is not known. We tested the hypothesis that both the presence of SCI and parental education would impact FSIQ in children with sickle cell anemia. A multicenter, cross-sectional study was conducted in 19 US sites of the Silent Infarct Transfusion Trial among children with sickle cell anemia, age 5–15 years. All were screened for SCIs. Participants with and without SCI were administered the Wechsler Abbreviated Scale of Intelligence. A total of 150 participants (107 with and 43 without SCIs) were included in the analysis. In a multivariable linear regression model for FSIQ, the absence of college education for the head of household was associated with a decrease of 6.2 points (P=0.005); presence of SCI with a 5.2 point decrease (P=0.017); each $1000 of family income per capita with a 0.33 point increase (P=0.023); each increase of 1 year in age with a 0.96 point decrease (P=0.023); and each 1% (absolute) decrease in hemoglobin oxygen saturation with a 0.75 point decrease (P=0.030). In conclusion, FSIQ in children with sickle cell anemia is best accounted for by a multivariate model that includes both biologic and socioenvironmental factors. PMID:24123128

  3. Relationships between Participants' International Prostate Symptom Score and BPH Impact Index Changes and Global Ratings of Change in a Trial of Phytotherapy for Men with Lower Urinary Tract Symptoms

    PubMed Central

    Barry, Michael J.; Cantor, Alan; Roehrborn, Claus G.

    2014-01-01

    Purpose To relate changes in AUA Symptom Index (AUASI) scores to bother measures and global ratings of change among men with lower urinary tract symptoms enrolled in a trial of saw palmetto. Materials and Methods To be eligible, men were ≥45 years old, had a peak uroflow ≥4 ml/sec, and an AUASI score ≥ 8 and ≤ 24. Participants self-administered the AUASI, IPSS quality of life item (IPSS QoL), BPH Impact Index (BII) and two global change questions at baseline and 24, 48, and 72 weeks. Results Among 357 participants, global ratings of "a little better" were associated with mean decreases in AUASI scores from 2.8 to 4.1 points across three time points. The analogous range for mean decreases in BII scores was 1.0 to 1.7 points, and for the IPSS QoL item 0.5 to 0.8 points. At 72 weeks, for the first global change question, each change measure could discriminate between participants rating themselves at least a little better versus unchanged or worse 70-72% of the time. A multivariable model increased discrimination to 77%. For the second global change question, each change measure correctly discriminated ratings of at least a little better versus unchanged or worse 69-74% of the time, and a multivariable model increased discrimination to 79%. Conclusions Changes in AUASI scores could discriminate between participants rating themselves at least a little better versus unchanged or worse. Our findings support the practice of powering studies to detect group mean differences in AUASI scores of at least 3 points. PMID:23017510

  4. A General Multivariate Latent Growth Model with Applications to Student Achievement

    ERIC Educational Resources Information Center

    Bianconcini, Silvia; Cagnone, Silvia

    2012-01-01

    The evaluation of the formative process in the University system has been assuming an ever increasing importance in the European countries. Within this context, the analysis of student performance and capabilities plays a fundamental role. In this work, the authors propose a multivariate latent growth model for studying the performances of a…

  5. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.

  6. Chemical structure of wood charcoal by infrared spectroscopy and multivariate analysis

    Treesearch

    Nicole Labbe; David Harper; Timothy Rials; Thomas Elder

    2006-01-01

    In this work, the effect of temperature on charcoal structure and chemical composition is investigated for four tree species. Wood charcoal carbonized at various temperatures is analyzed by mid infrared spectroscopy coupled with multivariate analysis and by thermogravimetric analysis to characterize the chemical composition during the carbonization process. The...

  7. Parametric Cost Models for Space Telescopes

    NASA Technical Reports Server (NTRS)

    Stahl, H. Philip

    2010-01-01

    A study is in process to develop a multivariable parametric cost model for space telescopes. Cost and engineering parametric data have been collected on 30 different space telescopes. Statistical correlations have been developed among 19 of the 59 variables sampled. Single Variable and Multi-Variable Cost Estimating Relationships have been developed. Results are being published.

  8. A Simplified, General Approach to Simulating from Multivariate Copula Functions

    Treesearch

    Barry Goodwin

    2012-01-01

    Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses probability…

  9. Choosing the Greenest Synthesis: A Multivariate Metric Green Chemistry Exercise

    ERIC Educational Resources Information Center

    Mercer, Sean M.; Andraos, John; Jessop, Philip G.

    2012-01-01

    The ability to correctly identify the greenest of several syntheses is a particularly useful asset for young chemists in the growing green economy. The famous univariate metrics atom economy and environmental factor provide insufficient information to allow for a proper selection of a green process. Multivariate metrics, such as those used in…

  10. IRT-ZIP Modeling for Multivariate Zero-Inflated Count Data

    ERIC Educational Resources Information Center

    Wang, Lijuan

    2010-01-01

    This study introduces an item response theory-zero-inflated Poisson (IRT-ZIP) model to investigate psychometric properties of multiple items and predict individuals' latent trait scores for multivariate zero-inflated count data. In the model, two link functions are used to capture two processes of the zero-inflated count data. Item parameters are…

  11. MToS: A Tree of Shapes for Multivariate Images.

    PubMed

    Carlinet, Edwin; Géraud, Thierry

    2015-12-01

    The topographic map of a gray-level image, also called tree of shapes, provides a high-level hierarchical representation of the image contents. This representation, invariant to contrast changes and to contrast inversion, has been proved very useful to achieve many image processing and pattern recognition tasks. Its definition relies on the total ordering of pixel values, so this representation does not exist for color images, or more generally, multivariate images. Common workarounds, such as marginal processing, or imposing a total order on data, are not satisfactory and yield many problems. This paper presents a method to build a tree-based representation of multivariate images, which features marginally the same properties of the gray-level tree of shapes. Briefly put, we do not impose an arbitrary ordering on values, but we only rely on the inclusion relationship between shapes in the image definition domain. The interest of having a contrast invariant and self-dual representation of multivariate image is illustrated through several applications (filtering, segmentation, and object recognition) on different types of data: color natural images, document images, satellite hyperspectral imaging, multimodal medical imaging, and videos.

  12. Impact of liver volume and liver function on posthepatectomy liver failure after portal vein embolization- A multivariable cohort analysis.

    PubMed

    Alizai, Patrick H; Haelsig, Annabel; Bruners, Philipp; Ulmer, Florian; Klink, Christian D; Dejong, Cornelis H C; Neumann, Ulf P; Schmeding, Maximilian

    2018-01-01

    Liver failure remains a life-threatening complication after liver resection, and is difficult to predict preoperatively. This retrospective cohort study evaluated different preoperative factors in regard to their impact on posthepatectomy liver failure (PHLF) after extended liver resection and previous portal vein embolization (PVE). Patient characteristics, liver function and liver volumes of patients undergoing PVE and subsequent liver resection were analyzed. Liver function was determined by the LiMAx test (enzymatic capacity of cytochrome P450 1A2). Factors associated with the primary end point PHLF (according to ISGLS definition) were identified through multivariable analysis. Secondary end points were 30-day mortality and morbidity. 95 patients received PVE, of which 64 patients underwent major liver resection. PHLF occurred in 7 patients (11%). Calculated postoperative liver function was significantly lower in patients with PHLF than in patients without PHLF (67 vs. 109 μg/kg/h; p = 0.01). Other factors associated with PHLF by univariable analysis were age, future liver remnant, MELD score, ASA score, renal insufficiency and heart insufficiency. By multivariable analysis, future liver remnant was the only factor significantly associated with PHLF (p = 0.03). Mortality and morbidity rates were 4.7% and 29.7% respectively. Future liver remnant is the only preoperative factor with a significant impact on PHLF. Assessment of preoperative liver function may additionally help identify patients at risk for PHLF.

  13. Potential use of MCR-ALS for the identification of coeliac-related biochemical changes in hyperspectral Raman maps from pediatric intestinal biopsies.

    PubMed

    Fornasaro, Stefano; Vicario, Annalisa; De Leo, Luigina; Bonifacio, Alois; Not, Tarcisio; Sergo, Valter

    2018-05-14

    Raman hyperspectral imaging is an emerging practice in biological and biomedical research for label-free analysis of tissues and cells. Using this method, both spatial distribution and spectral information of analyzed samples can be obtained. The current study reports the first Raman microspectroscopic characterisation of colon tissues from patients with Coeliac Disease (CD). The aim was to assess whether Raman imaging coupled with hyperspectral multivariate image analysis is capable of detecting the alterations in the biochemical composition of intestinal tissues associated with CD. The analytical approach was based on a multi-step methodology: duodenal biopsies from healthy and coeliac patients were measured and processed with Multivariate Curve Resolution Alternating Least Squares (MCR-ALS). Based on the distribution maps and the pure spectra of the image constituents obtained from MCR-ALS, interesting biochemical differences between healthy and coeliac patients were derived. Notably, a reduced distribution of complex lipids in the pericryptic space, and a different distribution and abundance of proteins rich in beta-sheet structures, were found in CD patients. The output of the MCR-ALS analysis was then used as a starting point for two clustering algorithms (k-means clustering and hierarchical clustering). Both methods converged to similar results, providing precise segmentation over multiple Raman images of the studied tissues.

  14. Multivariate quantile mapping bias correction: an N-dimensional probability density function transform for climate model simulations of multiple variables

    NASA Astrophysics Data System (ADS)

    Cannon, Alex J.

    2018-01-01

    Most bias correction algorithms used in climatology, for example quantile mapping, are applied to univariate time series. They neglect the dependence between different variables. Those that are multivariate often correct only limited measures of joint dependence, such as Pearson or Spearman rank correlation. Here, an image processing technique designed to transfer colour information from one image to another—the N-dimensional probability density function transform—is adapted for use as a multivariate bias correction algorithm (MBCn) for climate model projections/predictions of multiple climate variables. MBCn is a multivariate generalization of quantile mapping that transfers all aspects of an observed continuous multivariate distribution to the corresponding multivariate distribution of variables from a climate model. When applied to climate model projections, changes in quantiles of each variable between the historical and projection period are also preserved. The MBCn algorithm is demonstrated on three case studies. First, the method is applied to an image processing example with characteristics that mimic a climate projection problem. Second, MBCn is used to correct a suite of 3-hourly surface meteorological variables from the Canadian Centre for Climate Modelling and Analysis Regional Climate Model (CanRCM4) across a North American domain. Components of the Canadian Forest Fire Weather Index (FWI) System, a complicated set of multivariate indices that characterizes the risk of wildfire, are then calculated and verified against observed values. Third, MBCn is used to correct biases in the spatial dependence structure of CanRCM4 precipitation fields. Results are compared against a univariate quantile mapping algorithm, which neglects the dependence between variables, and two multivariate bias correction algorithms, each of which corrects a different form of inter-variable correlation structure. MBCn outperforms these alternatives, often by a large margin, particularly for annual maxima of the FWI distribution and spatiotemporal autocorrelation of precipitation fields.
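
    A stripped-down sketch of the rotation-based idea behind the N-dimensional pdf transform / MBCn: alternate random orthogonal rotations with univariate quantile mapping in the rotated coordinates. The sketch assumes equal sample sizes and only corrects a historical period; the full MBCn algorithm additionally preserves model-projected changes in each variable's quantiles.

        import numpy as np

        def quantile_map(x, ref):
            # rank-based univariate quantile mapping (equal sample sizes assumed)
            return np.sort(ref)[np.argsort(np.argsort(x))]

        def npdft_like(obs, mod, n_iter=30, seed=0):
            rng = np.random.default_rng(seed)
            mod = mod.copy()
            d = obs.shape[1]
            for _ in range(n_iter):
                R, _ = np.linalg.qr(rng.normal(size=(d, d)))     # random orthogonal rotation
                obs_r, mod_r = obs @ R, mod @ R
                mod_r = np.column_stack([quantile_map(mod_r[:, j], obs_r[:, j]) for j in range(d)])
                mod = mod_r @ R.T                                # rotate back
            return mod

        rng = np.random.default_rng(1)
        obs = rng.multivariate_normal([0, 0, 0], [[1, .6, .2], [.6, 1, .4], [.2, .4, 1]], 1000)
        mod = rng.multivariate_normal([1, -1, 0], np.eye(3), 1000)   # biased, uncorrelated model output
        corrected = npdft_like(obs, mod)
        print(np.round(np.corrcoef(corrected, rowvar=False), 2))     # approaches the observed correlations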

  15. Nonparametric estimation of the multivariate survivor function: the multivariate Kaplan-Meier estimator.

    PubMed

    Prentice, Ross L; Zhao, Shanshan

    2018-01-01

    The Dabrowska (Ann Stat 16:1475-1489, 1988) product integral representation of the multivariate survivor function is extended, leading to a nonparametric survivor function estimator for an arbitrary number of failure time variates that has a simple recursive formula for its calculation. Empirical process methods are used to sketch proofs for this estimator's strong consistency and weak convergence properties. Summary measures of pairwise and higher-order dependencies are also defined and nonparametrically estimated. Simulation evaluation is given for the special case of three failure time variates.

  16. Bootstrapping Cox’s Regression Model.

    DTIC Science & Technology

    1985-11-01

    At crucial points the proofs use a multivariate martingale central limit theorem, involving a p × p covariance matrix Σ. Cited references include The Statistical Analysis of Failure Time Data (Wiley, New York, 1980) and Meyer, P.-A. (1971), Square Integrable Martingales: A Survey, Lecture Notes.

  17. Monitoring Human Development Goals: A Straightforward (Bayesian) Methodology for Cross-National Indices

    ERIC Educational Resources Information Center

    Abayomi, Kobi; Pizarro, Gonzalo

    2013-01-01

    We offer a straightforward framework for measurement of progress, across many dimensions, using cross-national social indices, which we classify as linear combinations of multivariate country level data onto a univariate score. We suggest a Bayesian approach which yields probabilistic (confidence type) intervals for the point estimates of country…

  18. Importance of the Correlation between Width and Length in the Shape Analysis of Nanorods: Use of a 2D Size Plot To Probe Such a Correlation.

    PubMed

    Zhao, Zhihua; Zheng, Zhiqin; Roux, Clément; Delmas, Céline; Marty, Jean-Daniel; Kahn, Myrtil L; Mingotaud, Christophe

    2016-08-22

    Analysis of nanoparticle size through a simple 2D plot is proposed in order to extract the correlation between length and width in a collection or a mixture of anisotropic particles. Compared to the usual statistics on the length associated with a second and independent statistical analysis of the width, this simple plot easily points out the various types of nanoparticles and their (an)isotropy. For each class of nano-objects, the relationship between width and length (i.e., the strong or weak correlations between these two parameters) may suggest information concerning the nucleation/growth processes. It allows one to follow the effect on the shape and size distribution of physical or chemical processes such as simple ripening. Various electron microscopy pictures from the literature or from the authors' own syntheses are used as examples to demonstrate the efficiency and simplicity of the proposed 2D plot combined with a multivariate analysis. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
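
    The proposed 2D size plot is essentially a scatter of width against length per particle, read together with their correlation. A synthetic example (two hypothetical particle populations, dimensions in nm) is sketched below.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        # population 1: anisotropic rods whose width scales with length; population 2: near-isotropic dots
        len1 = rng.normal(60, 8, 150); wid1 = 0.15 * len1 + rng.normal(0, 1.0, 150)
        len2 = rng.normal(25, 3, 100); wid2 = rng.normal(8, 1.0, 100)
        length = np.concatenate([len1, len2]); width = np.concatenate([wid1, wid2])

        r = np.corrcoef(length, width)[0, 1]
        plt.scatter(length, width, s=10, alpha=0.5)
        plt.xlabel("length (nm)"); plt.ylabel("width (nm)")
        plt.title(f"2D size plot (overall r = {r:.2f})")
        plt.show()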

  19. Igloo-Plot: a tool for visualization of multidimensional datasets.

    PubMed

    Kuntal, Bhusan K; Ghosh, Tarini Shankar; Mande, Sharmila S

    2014-01-01

    Advances in science and technology have resulted in an exponential growth of multivariate (or multi-dimensional) datasets which are being generated from various research areas especially in the domain of biological sciences. Visualization and analysis of such data (with the objective of uncovering the hidden patterns therein) is an important and challenging task. We present a tool, called Igloo-Plot, for efficient visualization of multidimensional datasets. The tool addresses some of the key limitations of contemporary multivariate visualization and analysis tools. The visualization layout, not only facilitates an easy identification of clusters of data-points having similar feature compositions, but also the 'marker features' specific to each of these clusters. The applicability of the various functionalities implemented herein is demonstrated using several well studied multi-dimensional datasets. Igloo-Plot is expected to be a valuable resource for researchers working in multivariate data mining studies. Igloo-Plot is available for download from: http://metagenomics.atc.tcs.com/IglooPlot/. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Genetic variation of growth dynamics in maize (Zea mays L.) revealed through automated non-invasive phenotyping.

    PubMed

    Muraya, Moses M; Chu, Jianting; Zhao, Yusheng; Junker, Astrid; Klukas, Christian; Reif, Jochen C; Altmann, Thomas

    2017-01-01

    Hitherto, most quantitative trait loci of maize growth and biomass yield have been identified for a single time point, usually the final harvest stage. Through this approach cumulative effects are detected, without considering genetic factors causing phase-specific differences in growth rates. To assess the genetics of growth dynamics, we employed automated non-invasive phenotyping to monitor the plant sizes of 252 diverse maize inbred lines at 11 different developmental time points; 50 k SNP array genotype data were used for genome-wide association mapping and genomic selection. The heritability of biomass was estimated to be over 71%, and the average prediction accuracy amounted to 0.39. Using the individual time point data, 12 main effect marker-trait associations (MTAs) and six pairs of epistatic interactions were detected that displayed different patterns of expression at various developmental time points. A subset of them also showed significant effects on relative growth rates in different intervals. The detected MTAs jointly explained up to 12% of the total phenotypic variation, decreasing with developmental progression. Using non-parametric functional mapping and multivariate mapping approaches, four additional marker loci affecting growth dynamics were detected. Our results demonstrate that plant biomass accumulation is a complex trait governed by many small effect loci, most of which act at certain restricted developmental phases. This highlights the need for investigation of stage-specific growth affecting genes to elucidate important processes operating at different developmental phases. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.

  1. Global spectral graph wavelet signature for surface analysis of carpal bones

    NASA Astrophysics Data System (ADS)

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A.

    2018-02-01

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose global spectral graph wavelet (GSGW) descriptor that is isometric invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly-available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework gives a much better performance compared to the global point signature embedding approach for comparing shapes of the carpal bones across populations.

  2. Global spectral graph wavelet signature for surface analysis of carpal bones.

    PubMed

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A

    2018-02-05

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose global spectral graph wavelet (GSGW) descriptor that is isometric invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly-available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework gives a much better performance compared to the global point signature embedding approach for comparing shapes of the carpal bones across populations.

  3. Measures for brain connectivity analysis: nodes centrality and their invariant patterns

    NASA Astrophysics Data System (ADS)

    da Silva, Laysa Mayra Uchôa; Baltazar, Carlos Arruda; Silva, Camila Aquemi; Ribeiro, Mauricio Watanabe; de Aratanha, Maria Adelia Albano; Deolindo, Camila Sardeto; Rodrigues, Abner Cardoso; Machado, Birajara Soares

    2017-07-01

    The high dynamical complexity of the brain is related to its small-world topology, which enables both segregated and integrated information processing capabilities. Several measures of connectivity estimation have already been employed to characterize functional brain networks from multivariate electrophysiological data. However, understanding the properties of each measure that lead to a better description of the real topology and capture the complex phenomena present in the brain remains challenging. In this work we compare four nonlinear connectivity measures and show that each method characterizes distinct features of brain interactions. The results suggest an invariance of global network parameters across different behavioral states and that a more complete description may be reached by considering local features, independently of the connectivity measure employed. Our findings also point to future perspectives in connectivity studies that combine distinct and complementary dependence measures in assembling higher-dimensional manifolds.

  4. A longitudinal study of lexical development in children learning Vietnamese and English.

    PubMed

    Pham, Giang; Kohnert, Kathryn

    2014-01-01

    This longitudinal study modeled lexical development among children who spoke Vietnamese as a first language (L1) and English as a second language (L2). Participants (n = 33, initial mean age of 7.3 years) completed a total of eight tasks (four in each language) that measured vocabulary knowledge and lexical processing at four yearly time points. Multivariate hierarchical linear modeling was used to calculate L1 and L2 trajectories within the same model for each task. Main findings included (a) positive growth in each language, (b) greater gains in English resulting in shifts toward L2 dominance, and (c) different patterns for receptive and expressive domains. Timing of shifts to L2 dominance underscored L1 skills that are resilient and vulnerable to increases in L2 proficiency. © 2013 The Authors. Child Development © 2013 Society for Research in Child Development, Inc.

  5. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meier, David E.; Coble, Jamie B.; Jordan, David V.

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
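
    The pattern-recognition step can be illustrated with a principal component model trained on "normal" gamma spectra and a Q-residual (squared reconstruction error) control limit; the spectra, channel counts and perturbation below are synthetic placeholders, not MIP Monitor data.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        baseline = np.linspace(200.0, 20.0, 512)                       # hypothetical mean spectrum
        normal = rng.poisson(baseline, size=(100, 512)).astype(float)  # "normal" process spectra
        test = rng.poisson(baseline, size=(5, 512)).astype(float)
        test[-1, 300:320] *= 1.5                                       # simulate a shifted isotope ratio

        pca = PCA(n_components=5).fit(normal)

        def q_residual(spectra):
            # SPE / Q statistic: squared reconstruction error against the "normal" model
            recon = pca.inverse_transform(pca.transform(spectra))
            return ((spectra - recon) ** 2).sum(axis=1)

        limit = np.percentile(q_residual(normal), 99)                  # crude control limit
        print(q_residual(test) > limit)                                # the perturbed spectrum is flagged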

  6. Bayesian Treed Multivariate Gaussian Process with Adaptive Design: Application to a Carbon Capture Unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konomi, Bledar A.; Karagiannis, Georgios; Sarkar, Avik

    2014-05-16

    Computer experiments (numerical simulations) are widely used in scientific research to study and predict the behavior of complex systems, which usually have responses consisting of a set of distinct outputs. The computational cost of the simulations at high resolution is often expensive, and they become impractical for parametric studies at different input values. To overcome these difficulties we develop a Bayesian treed multivariate Gaussian process (BTMGP) as an extension of the Bayesian treed Gaussian process (BTGP) in order to model and evaluate a multivariate process. A suitable choice of covariance function and the prior distributions facilitates the different Markov chain Monte Carlo (MCMC) movements. We utilize this model to sequentially sample the input space for the most informative values, taking into account model uncertainty and expertise gained. A simulation study demonstrates the use of the proposed method and compares it with alternative approaches. We apply the sequential sampling technique and BTMGP to model the multiphase flow in a full scale regenerator of a carbon capture unit. The application presented in this paper is an important tool for research into carbon dioxide emissions from thermal power plants.

  7. Tuning algorithms for fractional order internal model controllers for time delay processes

    NASA Astrophysics Data System (ADS)

    Muresan, Cristina I.; Dutta, Abhishek; Dulf, Eva H.; Pinar, Zehra; Maxim, Anca; Ionescu, Clara M.

    2016-03-01

    This paper presents two tuning algorithms for fractional-order internal model control (IMC) controllers for time delay processes. The two tuning algorithms are based on two specific closed-loop control configurations: the IMC control structure and the Smith predictor structure. In the latter, the equivalency between IMC and Smith predictor control structures is used to tune a fractional-order IMC controller as the primary controller of the Smith predictor structure. Fractional-order IMC controllers are designed in both cases in order to enhance the closed-loop performance and robustness of classical integer order IMC controllers. The tuning procedures are exemplified for both single-input-single-output as well as multivariable processes, described by first-order and second-order transfer functions with time delays. Different numerical examples are provided, including a general multivariable time delay process. Integer order IMC controllers are designed in each case, as well as fractional-order IMC controllers. The simulation results show that the proposed fractional-order IMC controller ensures an increased robustness to modelling uncertainties. Experimental results are also provided, for the design of a multivariable fractional-order IMC controller in a Smith predictor structure for a quadruple-tank system.

  8. Functional Path Analysis as a Multivariate Technique in Developing a Theory of Participation in Adult Education.

    ERIC Educational Resources Information Center

    Martin, James L.

    This paper reports on attempts by the author to construct a theoretical framework of adult education participation using a theory development process and the corresponding multivariate statistical techniques. Two problems are identified: the lack of theoretical framework in studying problems, and the limiting of statistical analysis to univariate…

  9. Outcomes and Prognostic Factors in Women With 1 to 3 Breast Cancer Brain Metastases Treated With Definitive Stereotactic Radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, T. Jonathan; Oh, Jung Hun; Folkert, Michael R.

    2014-11-01

    Background: With the continuing increase in the use of definitive stereotactic radiosurgery (SRS) for patients with limited brain metastases (BM), clinicians need more specific prognostic tools. We investigated clinical predictors of outcomes in patients with limited breast cancer BM treated with SRS alone. Methods and Materials: We identified 136 patients with breast cancer and 1-3 BM who underwent definitive SRS for 186 BM between 2000 and 2012. The Kaplan-Meier method was used to assess overall survival (OS), regional failure (RF), and local failure (LF). Associations between clinical factors and outcomes were tested using Cox regression. A point scoring system was used to stratify patients based on OS, and the predictive power was tested with the concordance probability estimate (CPE). Results: The median OS was 17.6 months. The 12-month RF and LF rates were 45% and 10%, respectively. On multivariate analysis, >1 lesion (hazard ratio [HR] = 1.6, P=.02), triple-negative (TN) disease (HR=2.0, P=.006), and active extracranial disease (ED) (HR=2.7, P<.0001) were significantly associated with worse OS. The point score system was defined using proportional simplification of the multivariate Cox proportional hazards regression function. The median OS values for patients with 3.0-4.0 points (n=37), 4.5-5.5 points (n=28), 6.0-6.5 points (n=37), and 8-8.5 points (n=34) were 9.2, 15.6, 25.1, and 45.1 months, respectively (P<.0001, CPE = 0.72). Active ED (HR=2.4, P=.0007) was significantly associated with RF. Higher risk for LF was significantly associated with larger BM size (HR=3.1, P=.0001). Conclusion: Patients with >1 BM, active ED, and TN disease had the highest risk of death after SRS. Active ED is an important prognostic factor for OS and intracranial control.

  10. Computed Tomographic Evaluation of Posttreatment Soft-Tissue Changes by Using a Lymphedema Scoring System in Patients with Oral Cancer.

    PubMed

    Akashi, Masaya; Teraoka, Shun; Kakei, Yasumasa; Kusumoto, Junya; Hasegawa, Takumi; Minamikawa, Tsutomu; Hashikawa, Kazunobu; Komori, Takahide

    2018-04-01

    This study aimed to evaluate posttreatment soft-tissue changes in patients with oral cancer with computed tomography (CT). To accomplish that purpose, a scoring system was established, referring to the criteria of lower leg lymphedema (LE). One hundred and six necks in 95 patients who underwent oral oncologic surgery with neck dissection (ND) were analyzed retrospectively using routine follow-up CT images. A two-point scoring system to evaluate soft-tissue changes (so-called "LE score") was established as follows: Necks with a "honeycombing" appearance were assigned 1 point. Necks with "taller than wide" fat lobules were assigned 1 point. Necks with neither appearance were assigned 0 points. Comparisons between patients with LE score ≥1 and LE score = 0 at 6 months postoperatively were performed using the Fisher exact test for discrete variables and the Mann-Whitney U test for continuous variables. Univariate predictors associated with posttreatment changes (i.e., LE score ≥1 at 6 months postoperatively) were entered into a multivariate logistic regression analysis. Values of p < 0.05 were considered to indicate statistical significance. The occurrence of the posttreatment soft-tissue changes was 32%. Multivariate logistic regression analysis showed that postoperative radiation therapy (RT) and bilateral ND were potential risk factors of posttreatment soft-tissue changes on CT images. Sequential evaluation of "honeycombing" and the "taller than wide" appearances on routine follow-up CT revealed the persistence of posttreatment soft-tissue changes in patients who underwent oral cancer treatment, and those potential risk factors were postoperative RT and bilateral ND.

  11. Detecting correlation changes in multivariate time series: A comparison of four non-parametric change point detection methods.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Grassmann, Mariel; Ceulemans, Eva

    2017-06-01

    Change point detection in multivariate time series is a complex task since, next to the mean, the correlation structure of the monitored variables may also alter when change occurs. DeCon was recently developed to detect such changes in mean and/or correlation by combining a moving windows approach and robust PCA. However, in the literature, several other methods have been proposed that employ other non-parametric tools: E-divisive, Multirank, and KCP. Since these methods use different statistical approaches, two issues need to be tackled. First, applied researchers may find it hard to appraise the differences between the methods. Second, a direct comparison of the relative performance of all these methods for capturing change points signaling correlation changes is still lacking. Therefore, we present the basic principles behind DeCon, E-divisive, Multirank, and KCP and the corresponding algorithms, to make them more accessible to readers. We further compared their performance through extensive simulations using the settings of Bulteel et al. (Biological Psychology, 98 (1), 29-42, 2014), implying changes in mean and in correlation structure, and those of Matteson and James (Journal of the American Statistical Association, 109 (505), 334-345, 2014), implying different numbers of (noise) variables. KCP emerged as the best method in almost all settings. However, in the case of more than two noise variables, only DeCon performed adequately in detecting correlation changes.
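
    As a simple baseline for the kind of correlation change these methods target, one can slide two adjacent windows over the multivariate series and compare their correlation matrices. The sketch below uses synthetic data with a single change point; it is not an implementation of DeCon, E-divisive, Multirank or KCP.

        import numpy as np

        def corr_change_statistic(X, width):
            # Frobenius distance between correlation matrices of adjacent windows
            stat = []
            for t in range(width, X.shape[0] - width):
                c_left = np.corrcoef(X[t - width:t], rowvar=False)
                c_right = np.corrcoef(X[t:t + width], rowvar=False)
                stat.append(np.linalg.norm(c_left - c_right))
            return np.array(stat)

        rng = np.random.default_rng(0)
        cov_before = np.eye(3)
        cov_after = np.full((3, 3), 0.8) + 0.2 * np.eye(3)    # correlations jump to 0.8
        X = np.vstack([rng.multivariate_normal(np.zeros(3), cov_before, 300),
                       rng.multivariate_normal(np.zeros(3), cov_after, 300)])

        stat = corr_change_statistic(X, width=50)
        print("estimated change point:", int(np.argmax(stat)) + 50)   # true change at t = 300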

  12. Application of dual-cloud point extraction for the trace levels of copper in serum of different viral hepatitis patients by flame atomic absorption spectrometry: A multivariate study

    NASA Astrophysics Data System (ADS)

    Arain, Salma Aslam; Kazi, Tasneem G.; Afridi, Hassan Imran; Abbasi, Abdul Rasool; Panhwar, Abdul Haleem; Naeemullah; Shanker, Bhawani; Arain, Mohammad Balal

    2014-12-01

    An efficient, innovative preconcentration method, dual-cloud point extraction (d-CPE), has been developed for the extraction and preconcentration of copper (Cu2+) in serum samples of different viral hepatitis patients prior to coupling with flame atomic absorption spectrometry (FAAS). The d-CPE procedure was based on forming complexes of elemental ions with the complexing reagent 1-(2-pyridylazo)-2-naphthol (PAN), and subsequently entrapping the complexes in nonionic surfactant (Triton X-114). The surfactant-rich phase containing the metal complexes was then treated with aqueous nitric acid solution, and the metal ions were back-extracted into the aqueous phase as a second cloud point extraction stage, and finally determined by flame atomic absorption spectrometry using conventional nebulization. A multivariate strategy was applied to estimate the optimum values of experimental variables for the recovery of Cu2+ using d-CPE. Under optimum experimental conditions, the limit of detection and the enrichment factor were 0.046 μg L-1 and 78, respectively. The validity and accuracy of the proposed method were checked by analysis of Cu2+ in a certified reference material (CRM) of serum by both the d-CPE and conventional CPE procedures on the same CRM. The proposed method was successfully applied to the determination of Cu2+ in serum samples of different viral hepatitis patients and healthy controls.

  13. Prognostic value of preoperative serum CA 242 in Esophageal squamous cell carcinoma cases.

    PubMed

    Feng, Ji-Feng; Huang, Ying; Chen, Qi-Xun

    2013-01-01

    Carbohydrate antigen (CA) 242 is inversely related to prognosis in many cancers. However, few data regarding CA 242 in esophageal cancer (EC) are available. The aim of this study was to determine the prognostic value of CA 242 and propose an optimum cut-off point for predicting survival differences in patients with esophageal squamous cell carcinoma (ESCC). A retrospective analysis was conducted of 192 cases. A receiver operating characteristic (ROC) curve for survival prediction was plotted to verify the optimum cut-off point. Univariate and multivariate analyses were performed to evaluate prognostic parameters for survival. The positive rate for CA 242 was 7.3% (14/192). The ROC curve for survival prediction gave an optimum cut-off of 2.15 U/ml. Patients with CA 242 ≤ 2.15 U/ml had significantly better 5-year survival than patients with CA 242 > 2.15 U/ml (45.4% versus 22.6%; P=0.003). Multivariate analysis showed that differentiation (P=0.033), CA 242 (P=0.017), T grade (P=0.004) and N staging (P<0.001) were independent prognostic factors. Preoperative CA 242 is a predictive factor for long-term survival in ESCC, especially in node-negative patients. We conclude that 2.15 U/ml may be the optimum cut-off point for CA 242 in predicting survival in ESCC.
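
    Determining an optimum cut-off from a survival-prediction ROC curve is often done by maximizing Youden's J (sensitivity + specificity - 1); the sketch below uses simulated CA 242 values and outcomes, so the cut-off it returns is illustrative only and need not match the study's derivation.

        import numpy as np
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(0)
        # hypothetical CA 242 levels (U/ml): 5-year survivors vs. non-survivors
        ca242 = np.concatenate([rng.lognormal(0.3, 0.8, 110), rng.lognormal(1.0, 0.8, 82)])
        died = np.concatenate([np.zeros(110, dtype=int), np.ones(82, dtype=int)])

        fpr, tpr, thresholds = roc_curve(died, ca242)
        best = int(np.argmax(tpr - fpr))               # Youden's J statistic
        print(f"optimum cut-off ~ {thresholds[best]:.2f} U/ml "
              f"(sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f})")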

  14. Multivariate pattern analysis of MEG and EEG: A comparison of representational structure in time and space.

    PubMed

    Cichy, Radoslaw Martin; Pantazis, Dimitrios

    2017-09-01

    Multivariate pattern analysis of magnetoencephalography (MEG) and electroencephalography (EEG) data can reveal the rapid neural dynamics underlying cognition. However, MEG and EEG have systematic differences in sampling neural activity. This poses the question of to what degree such measurement differences consistently bias the results of multivariate analysis applied to MEG and EEG activation patterns. To investigate, we conducted a concurrent MEG/EEG study while participants viewed images of everyday objects. We applied multivariate classification analyses to MEG and EEG data, and compared the resulting time courses to each other, and to fMRI data for an independent evaluation in space. We found that both MEG and EEG revealed the millisecond spatio-temporal dynamics of visual processing with largely equivalent results. Beyond yielding convergent results, we found that MEG and EEG also captured partly unique aspects of visual representations. Those unique components emerged earlier in time for MEG than for EEG. Identifying the sources of those unique components with fMRI, we found the locus for both MEG and EEG in high-level visual cortex, and in addition for MEG in low-level visual cortex. Together, our results show that multivariate analyses of MEG and EEG data offer a convergent and complementary view on neural processing, and motivate the wider adoption of these methods in both MEG and EEG research. Copyright © 2017 Elsevier Inc. All rights reserved.
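
    Time-resolved multivariate pattern analysis of M/EEG typically trains a classifier on the sensor pattern at each time sample and tracks cross-validated accuracy over time. The sketch below uses synthetic epochs and a linear discriminant classifier; the study's pipeline (classifier choice, trial averaging, fMRI fusion) is more elaborate.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_trials, n_sensors, n_times = 120, 64, 100
        X = rng.normal(size=(n_trials, n_sensors, n_times))    # synthetic epochs
        y = np.repeat([0, 1], n_trials // 2)                   # two stimulus categories
        X[y == 1, :10, 40:70] += 0.4                           # decodable pattern after "stimulus onset"

        accuracy = np.array([
            cross_val_score(LinearDiscriminantAnalysis(), X[:, :, t], y, cv=5).mean()
            for t in range(n_times)])
        print(f"peak accuracy {accuracy.max():.2f} at sample {int(accuracy.argmax())}")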

  15. Inferring Instantaneous, Multivariate and Nonlinear Sensitivities for the Analysis of Feedback Processes in a Dynamical System: Lorenz Model Case Study

    NASA Technical Reports Server (NTRS)

    Aires, Filipe; Rossow, William B.; Hansen, James E. (Technical Monitor)

    2001-01-01

    A new approach is presented for the analysis of feedback processes in a nonlinear dynamical system by observing its variations. The new methodology consists of statistical estimates of the sensitivities between all pairs of variables in the system based on a neural network modeling of the dynamical system. The model can then be used to estimate the instantaneous, multivariate and nonlinear sensitivities, which are shown to be essential for the analysis of the feedbacks processes involved in the dynamical system. The method is described and tested on synthetic data from the low-order Lorenz circulation model where the correct sensitivities can be evaluated analytically.
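
    One way to realize this idea is to fit a neural-network surrogate of the one-step state map of the Lorenz system and read instantaneous sensitivities off its local Jacobian. The sketch below (finite-difference Jacobian of an MLP surrogate) illustrates the concept only; the paper's network architecture and sensitivity estimator differ in detail.

        import numpy as np
        from scipy.integrate import solve_ivp
        from sklearn.neural_network import MLPRegressor

        def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
            x, y, z = s
            return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

        dt = 0.01
        sol = solve_ivp(lorenz, (0.0, 60.0), [1.0, 1.0, 1.0], t_eval=np.arange(0.0, 60.0, dt))
        states = sol.y.T
        X, Y = states[:-1], states[1:]                        # one-step input/output pairs

        surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                                 random_state=0).fit(X, Y)

        def local_sensitivities(s, eps=1e-3):
            # finite-difference Jacobian d s(t+dt) / d s(t) of the fitted surrogate
            base = surrogate.predict(s[None, :])[0]
            J = np.empty((3, 3))
            for j in range(3):
                pert = s.copy(); pert[j] += eps
                J[:, j] = (surrogate.predict(pert[None, :])[0] - base) / eps
            return J

        print(np.round(local_sensitivities(states[3000]), 2))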

  16. Analysis and assessment on heavy metal sources in the coastal soils developed from alluvial deposits using multivariate statistical methods.

    PubMed

    Li, Jinling; He, Ming; Han, Wei; Gu, Yifan

    2009-05-30

    An investigation on heavy metal sources, i.e., Cu, Zn, Ni, Pb, Cr, and Cd in the coastal soils of Shanghai, China, was conducted using multivariate statistical methods (principal component analysis, clustering analysis, and correlation analysis). All the results of the multivariate analysis showed that: (i) Cu, Ni, Pb, and Cd had anthropogenic sources (e.g., overuse of chemical fertilizers and pesticides, industrial and municipal discharges, animal wastes, sewage irrigation, etc.); (ii) Zn and Cr were associated with parent materials and therefore had natural sources (e.g., the weathering process of parent materials and subsequent pedo-genesis due to the alluvial deposits). The effect of heavy metals in the soils was greatly affected by soil formation, atmospheric deposition, and human activities. These findings provided essential information on the possible sources of heavy metals, which would contribute to the monitoring and assessment process of agricultural soils in worldwide regions.
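
    The combination of principal component analysis and correlation analysis used here for source identification can be sketched as follows; the concentrations are synthetic placeholders, and in practice clustering analysis is run on the same standardized table.

        import numpy as np
        import pandas as pd
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        anthro = rng.normal(0, 1, 60)                       # latent anthropogenic factor
        natural = rng.normal(0, 1, 60)                      # latent lithogenic factor
        soils = pd.DataFrame({
            "Cu": 30 + 6 * anthro + rng.normal(0, 2, 60),
            "Ni": 28 + 5 * anthro + rng.normal(0, 2, 60),
            "Pb": 25 + 7 * anthro + rng.normal(0, 2, 60),
            "Cd": 0.2 + 0.05 * anthro + rng.normal(0, 0.02, 60),
            "Zn": 85 + 9 * natural + rng.normal(0, 2, 60),
            "Cr": 70 + 6 * natural + rng.normal(0, 2, 60)})

        pca = PCA(n_components=2).fit(StandardScaler().fit_transform(soils))
        loadings = pd.DataFrame(pca.components_.T, index=soils.columns, columns=["PC1", "PC2"])
        print(loadings.round(2))          # metals loading on the same component share a likely source
        print(soils.corr().round(2))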

  17. Copula-based analysis of rhythm

    NASA Astrophysics Data System (ADS)

    García, J. E.; González-López, V. A.; Viola, M. L. Lanfredi

    2016-06-01

    In this paper we establish stochastic profiles of the rhythm of three languages: English, Japanese and Spanish. We model the increase or decrease of the acoustical energy collected in three bands coming from the acoustic signal. The number of parameters needed to specify a discrete multivariate Markov chain grows exponentially with the order and dimension of the chain. In this case the size of the database is not large enough for a consistent estimation of the model. We apply a strategy to estimate a multivariate process with an order greater than the order achievable using standard procedures. The new strategy consists of obtaining a partition of the state space constructed from a combination of the partitions corresponding to the three marginal processes, one for each band of energy, and the partition coming from the multivariate Markov chain. Then, all the partitions are linked using a copula in order to estimate the transition probabilities.

  18. A simple scoring system for predicting early major complications in spine surgery: the cumulative effect of age and size of surgery.

    PubMed

    Brasil, Albert Vincent Berthier; Teles, Alisson R; Roxo, Marcelo Ricardo; Schuster, Marcelo Neutzling; Zauk, Eduardo Ballverdu; Barcellos, Gabriel da Costa; Costa, Pablo Ramon Fruett da; Ferreira, Nelson Pires; Kraemer, Jorge Luiz; Ferreira, Marcelo Paglioli; Gobbato, Pedro Luis; Worm, Paulo Valdeci

    2016-10-01

    To analyze the cumulative effect of risk factors associated with early major complications in postoperative spine surgery. Retrospective analysis of 583 surgically-treated patients. Early "major" complications were defined as those that may lead to permanent detrimental effects or require further significant intervention. A balanced risk score was built using multiple logistic regression. Ninety-two early major complications occurred in 76 patients (13%). Age > 60 years and surgery of three or more levels proved to be significant independent risk factors in the multivariate analysis. The balanced scoring system was defined as: 0 points (no risk factor), 2 points (1 factor) or 4 points (2 factors). The incidence of early major complications in each category was 7% (0 points), 15% (2 points) and 29% (4 points) respectively. This balanced scoring system, based on two risk factors, represents an important tool for both surgical indication and for patient counseling before surgery.

  19. The PIT-trap-A "model-free" bootstrap procedure for inference about regression models with discrete, multivariate responses.

    PubMed

    Warton, David I; Thibaut, Loïc; Wang, Yi Alice

    2017-01-01

    Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (and thus not amenable to bootstrapping); common examples include logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap; it assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
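
    A minimal sketch of the resampling idea follows in Python, assuming independent Poisson regressions fitted per response with statsmodels; the data, covariate and sizes are hypothetical, and the authors' own implementation is not reproduced here.

      import numpy as np
      from scipy.stats import poisson
      import statsmodels.api as sm

      rng = np.random.default_rng(0)

      # Hypothetical multivariate count data: n sites x p species, one covariate.
      n, p = 50, 4
      x = rng.normal(size=n)
      X = sm.add_constant(x)
      Y = rng.poisson(np.exp(X @ rng.normal(scale=0.5, size=(2, p))))

      # Fit a separate Poisson GLM per response and store the fitted means.
      mu = np.column_stack([
          sm.GLM(Y[:, j], X, family=sm.families.Poisson()).fit().fittedvalues
          for j in range(p)
      ])

      # PIT residuals: u = F(y - 1) + V * Pr(Y = y), V ~ Uniform(0, 1);
      # approximately uniform (and pivotal) when the model is adequate.
      V = rng.uniform(size=Y.shape)
      u = poisson.cdf(Y - 1, mu) + V * poisson.pmf(Y, mu)

      # One PIT-trap resample: bootstrap whole rows of u, which keeps the
      # cross-species correlation without modelling it, then invert to counts.
      idx = rng.integers(0, n, size=n)
      Y_star = poisson.ppf(u[idx], mu).astype(int)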

  20. The PIT-trap—A “model-free” bootstrap procedure for inference about regression models with discrete, multivariate responses

    PubMed Central

    Thibaut, Loïc; Wang, Yi Alice

    2017-01-01

    Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)—common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of “model-free bootstrap”, adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods. PMID:28738071

  1. Universal portfolios generated by weakly stationary processes

    NASA Astrophysics Data System (ADS)

    Tan, Choon Peng; Pang, Sook Theng

    2014-12-01

    Recently, a universal portfolio generated by a set of independent Brownian motions where a finite number of past stock prices are weighted by the moments of the multivariate normal distribution is introduced and studied. The multivariate normal moments as polynomials in time consequently lead to a constant rebalanced portfolio depending on the drift coefficients of the Brownian motions. For a weakly stationary process, a different type of universal portfolio is proposed where the weights on the stock prices depend only on the time differences of the stock prices. An empirical study is conducted on the returns achieved by the universal portfolios generated by the Ornstein-Uhlenbeck process on selected stock-price data sets. Promising results are demonstrated for increasing the wealth of the investor by using the weakly-stationary-process-generated universal portfolios.

  2. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    PubMed

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain under a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. © 2016 American Institute of Chemical Engineers.
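
    The sketch below illustrates, with simulated placeholders, how a Bayesian predictive approach can translate posterior draws for a response-surface model into a proven acceptable range: the posterior predictive probability of meeting specification is evaluated over a grid of candidate operating conditions. The coefficients, specification limit and CPP names are invented for illustration and do not come from the study.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical posterior draws for one CQA modelled as
      # CQA = b0 + b1*temp + b2*pH + noise(sd).  In practice these draws would
      # come from MCMC on the DoE data; here they are simulated placeholders.
      n_draws = 4000
      b0 = rng.normal(95.0, 0.5, n_draws)
      b1 = rng.normal(-0.8, 0.1, n_draws)
      b2 = rng.normal(1.5, 0.2, n_draws)
      sd = np.abs(rng.normal(1.0, 0.1, n_draws))

      spec_low = 92.0                      # lower specification limit for the CQA

      # Grid of candidate operating conditions (coded CPP values).
      temps = np.linspace(-2, 2, 41)
      phs = np.linspace(-2, 2, 41)

      prob_ok = np.empty((temps.size, phs.size))
      for i, t in enumerate(temps):
          for j, ph in enumerate(phs):
              pred = rng.normal(b0 + b1 * t + b2 * ph, sd)   # posterior predictive
              prob_ok[i, j] = np.mean(pred >= spec_low)

      # The PAR can then be read off as the region where the predictive
      # probability of meeting specification exceeds a chosen threshold.
      par_mask = prob_ok >= 0.95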

  3. Post-processing of multi-model ensemble river discharge forecasts using censored EMOS

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2014-05-01

    When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. The censored EMOS model has been applied to multi-model ensemble forecasts issued on a daily basis over a period of three years. For the two catchments considered, this resulted in well calibrated and sharp forecast distributions over all lead-times from 1 to 114 h. Training observations tended to be better indicators for the dependence structure than the raw ensemble.
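
    A compact sketch of the fitting step, under simplifying assumptions, is given below: the censored-normal predictive distribution is parameterized by affine functions of the ensemble mean and variance, and the parameters are found by minimizing a sample-based CRPS approximation (E|X - y| - 0.5 E|X - X'|) rather than a closed-form CRPS expression. The Box-Cox transformation, lead-time smoothing and dependence modelling described above are omitted, and the data are invented.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(2)

      # Hypothetical training data: ensemble mean/variance and censored runoff.
      n = 300
      ens_mean = rng.gamma(3.0, 2.0, n)
      ens_var = rng.gamma(2.0, 0.5, n)
      threshold = 1.0                          # censoring threshold of the record
      obs = np.maximum(threshold, ens_mean + rng.normal(0, 1.5, n))

      # Common random numbers make the sampled CRPS objective deterministic.
      z1 = rng.normal(size=(400, n))
      z2 = rng.normal(size=(400, n))

      def mean_crps(params):
          a, b, c0, d = params
          mu = a + b * ens_mean
          sigma = np.sqrt(np.maximum(c0 + d * ens_var, 1e-6))
          x = np.maximum(threshold, mu + sigma * z1)    # censored-normal draws
          xp = np.maximum(threshold, mu + sigma * z2)
          crps = np.abs(x - obs).mean(axis=0) - 0.5 * np.abs(x - xp).mean(axis=0)
          return float(crps.mean())

      fit = minimize(mean_crps, x0=[0.0, 1.0, 0.5, 0.5], method="Nelder-Mead")
      a, b, c0, d = fit.x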

  4. Multivariable nonlinear analysis of foreign exchange rates

    NASA Astrophysics Data System (ADS)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2003-05-01

    We analyze multivariate time series of foreign exchange rates: price movements, which have often been analyzed, together with dealing time intervals and spreads between bid and ask prices. Treating dealing time intervals as event timings, analogous to neurons' firings, we use raster plots (RPs) and peri-stimulus time histograms (PSTHs), which are popular methods in the field of neurophysiology. Introducing special processing to obtain RPs and PSTHs for exchange rate time series, we discover that dynamical interaction exists among the three variables. We also find that adopting multiple variables leads to improvements in prediction accuracy.

  5. Psychological Correlates of University Students' Academic Performance: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Richardson, Michelle; Abraham, Charles; Bond, Rod

    2012-01-01

    A review of 13 years of research into antecedents of university students' grade point average (GPA) scores generated the following: a comprehensive, conceptual map of known correlates of tertiary GPA; assessment of the magnitude of average, weighted correlations with GPA; and tests of multivariate models of GPA correlates within and across…

  6. Multivariate missing data in hydrology - Review and applications

    NASA Astrophysics Data System (ADS)

    Ben Aissia, Mohamed-Aymen; Chebana, Fateh; Ouarda, Taha B. M. J.

    2017-12-01

    Water resources planning and management require complete data sets of a number of hydrological variables, such as flood peaks and volumes. However, hydrologists are often faced with the problem of missing data (MD) in hydrological databases. Several methods are used to deal with the imputation of MD. During the last decade, multivariate approaches have gained popularity in the field of hydrology, especially in hydrological frequency analysis (HFA). However, treating the MD remains neglected in the multivariate HFA literature, whereas the focus has been mainly on the modeling component. For a complete analysis and in order to optimize the use of data, MD should also be treated in the multivariate setting prior to modeling and inference. Imputation of MD in the multivariate hydrological framework can have direct implications on the quality of the estimation. Indeed, the dependence between the series represents important additional information that can be included in the imputation process. The objective of the present paper is to highlight the importance of treating MD in multivariate hydrological frequency analysis by reviewing and applying multivariate imputation methods and by comparing univariate and multivariate imputation methods. An application is carried out for multiple flood attributes on three sites in order to evaluate the performance of the different methods based on the leave-one-out procedure. The results indicate that the performance of imputation methods can be improved by adopting the multivariate setting, compared to mean substitution and interpolation methods, especially when using the copula-based approach.
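
    As a simple illustration of the gain from a multivariate setting, the sketch below masks values in correlated synthetic flood attributes and compares mean substitution with a generic multivariate imputer from scikit-learn; the copula-based imputation actually advocated in the paper is not shown, and all numbers are invented.

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer, SimpleImputer

      rng = np.random.default_rng(3)

      # Hypothetical correlated flood attributes (peak, volume, duration) at one site.
      n = 200
      cov = np.array([[1.0, 0.8, 0.6],
                      [0.8, 1.0, 0.7],
                      [0.6, 0.7, 1.0]])
      X_true = rng.multivariate_normal([10.0, 50.0, 5.0], cov, size=n)

      # Knock out 15% of the values at random to mimic missing data.
      mask = rng.uniform(size=X_true.shape) < 0.15
      X_obs = X_true.copy()
      X_obs[mask] = np.nan

      def rmse(X_filled):
          return float(np.sqrt(np.mean((X_filled[mask] - X_true[mask]) ** 2)))

      X_mean = SimpleImputer(strategy="mean").fit_transform(X_obs)
      X_multi = IterativeImputer(random_state=0).fit_transform(X_obs)

      print("mean substitution RMSE :", rmse(X_mean))
      print("multivariate imputation:", rmse(X_multi))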

  7. Total anthocyanin content determination in intact açaí (Euterpe oleracea Mart.) and palmitero-juçara (Euterpe edulis Mart.) fruit using near infrared spectroscopy (NIR) and multivariate calibration.

    PubMed

    Inácio, Maria Raquel Cavalcanti; de Lima, Kássio Michell Gomes; Lopes, Valquiria Garcia; Pessoa, José Dalton Cruz; de Almeida Teixeira, Gustavo Henrique

    2013-02-15

    The aim of this study was to evaluate the potential of near-infrared reflectance spectroscopy (NIR) and multivariate calibration as a rapid method to determine anthocyanin content in intact fruit (açaí and palmitero-juçara). Several multivariate calibration techniques, including partial least squares (PLS), interval partial least squares, genetic algorithm, successive projections algorithm, and net analyte signal, were compared and validated by establishing figures of merit. Suitable results were obtained with the PLS model (four latent variables and 5-point smoothing), with a detection limit of 6.2 g kg⁻¹, limit of quantification of 20.7 g kg⁻¹, accuracy estimated as a root mean square error of prediction of 4.8 g kg⁻¹, mean selectivity of 0.79 g kg⁻¹, sensitivity of 5.04×10⁻³ g kg⁻¹, precision of 27.8 g kg⁻¹, and signal-to-noise ratio of 1.04×10⁻³ g kg⁻¹. These results suggest that NIR spectroscopy and multivariate calibration can be effectively used to determine anthocyanin content in intact açaí and palmitero-juçara fruit. Copyright © 2012 Elsevier Ltd. All rights reserved.
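
    A bare-bones PLS calibration is sketched below on simulated spectra, to show where a figure of merit such as the RMSEP comes from; the wavelengths, reference values and number of latent variables are arbitrary and do not correspond to the study's data or its variable-selection steps.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(4)

      # Hypothetical NIR spectra (200 samples x 300 wavelengths) and anthocyanin
      # reference values; real data would come from the spectrometer and wet chemistry.
      n, w = 200, 300
      spectra = rng.normal(size=(n, w)).cumsum(axis=1)       # smooth-ish baselines
      anthocyanin = 5.0 + spectra[:, 120] - 0.5 * spectra[:, 210] + rng.normal(0, 0.3, n)

      X_cal, X_val, y_cal, y_val = train_test_split(spectra, anthocyanin,
                                                    test_size=0.3, random_state=0)

      pls = PLSRegression(n_components=4).fit(X_cal, y_cal)
      y_pred = pls.predict(X_val).ravel()

      rmsep = float(np.sqrt(np.mean((y_pred - y_val) ** 2)))
      print("RMSEP:", rmsep)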

  8. Linkage Analysis of a Model Quantitative Trait in Humans: Finger Ridge Count Shows Significant Multivariate Linkage to 5q14.1

    PubMed Central

    Medland, Sarah E; Loesch, Danuta Z; Mdzewski, Bogdan; Zhu, Gu; Montgomery, Grant W; Martin, Nicholas G

    2007-01-01

    The finger ridge count (a measure of pattern size) is one of the most heritable complex traits studied in humans and has been considered a model human polygenic trait in quantitative genetic analysis. Here, we report the results of the first genome-wide linkage scan for finger ridge count in a sample of 2,114 offspring from 922 nuclear families. Both univariate linkage to the absolute ridge count (a sum of all the ridge counts on all ten fingers), and multivariate linkage analyses of the counts on individual fingers, were conducted. The multivariate analyses yielded significant linkage to 5q14.1 (Logarithm of odds [LOD] = 3.34, pointwise-empirical p-value = 0.00025) that was predominantly driven by linkage to the ring, index, and middle fingers. The strongest univariate linkage was to 1q42.2 (LOD = 2.04, point-wise p-value = 0.002, genome-wide p-value = 0.29). In summary, the combination of univariate and multivariate results was more informative than simple univariate analyses alone. Patterns of quantitative trait loci factor loadings consistent with developmental fields were observed, and the simple pleiotropic model underlying the absolute ridge count was not sufficient to characterize the interrelationships between the ridge counts of individual fingers. PMID:17907812

  9. HYTESS 2: A Hypothetical Turbofan Engine Simplified Simulation with multivariable control and sensor analytical redundancy

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.

    1986-01-01

    A hypothetical turbofan engine simplified simulation with a multivariable control and sensor failure detection, isolation, and accommodation logic (HYTESS II) is presented. The digital program, written in FORTRAN, is self-contained, efficient, realistic and easily used. Simulated engine dynamics were developed from linearized operating point models. However, essential nonlinear effects are retained. The simulation is representative of the hypothetical, low bypass ratio turbofan engine with an advanced control and failure detection logic. Included is a description of the engine dynamics, the control algorithm, and the sensor failure detection logic. Details of the simulation including block diagrams, variable descriptions, common block definitions, subroutine descriptions, and input requirements are given. Example simulation results are also presented.

  10. Multivariate regression model for predicting lumber grade volumes of northern red oak sawlogs

    Treesearch

    Daniel A. Yaussy; Robert L. Brisbin

    1983-01-01

    A multivariate regression model was developed to predict green board-foot yields for the seven common factory lumber grades processed from northern red oak (Quercus rubra L.) factory grade logs. The model uses the standard log measurements of grade, scaling diameter, length, and percent defect. It was validated with an independent data set. The model...

  11. A note on a simplified and general approach to simulating from multivariate copula functions

    Treesearch

    Barry K. Goodwin

    2013-01-01

    Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses ‘Probability-...
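
    The note's abstract is truncated above, but the general simulation task it refers to can be sketched as the standard two-step recipe: draw dependent uniforms from a copula, then push them through the inverse marginal CDFs. The Gaussian copula and the gamma/lognormal margins below are arbitrary illustrative choices, not the note's specific procedure.

      import numpy as np
      from scipy.stats import norm, gamma, lognorm

      rng = np.random.default_rng(5)

      # Dependence of the Gaussian copula (correlation matrix R) and two margins.
      R = np.array([[1.0, 0.6],
                    [0.6, 1.0]])

      z = rng.multivariate_normal(np.zeros(2), R, size=10_000)   # correlated normals
      u = norm.cdf(z)                                            # uniform copula scale

      x1 = gamma.ppf(u[:, 0], a=2.0, scale=1.5)    # first margin: gamma
      x2 = lognorm.ppf(u[:, 1], s=0.4, scale=2.0)  # second margin: lognormal

      # (x1, x2) now has the requested margins with Gaussian-copula dependence.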

  12. Multivariate regression model for predicting yields of grade lumber from yellow birch sawlogs

    Treesearch

    Andrew F. Howard; Daniel A. Yaussy

    1986-01-01

    A multivariate regression model was developed to predict green board-foot yields for the common grades of factory lumber processed from yellow birch factory-grade logs. The model incorporates the standard log measurements of scaling diameter, length, proportion of scalable defects, and the assigned USDA Forest Service log grade. Differences in yields between band and...

  13. Advertising of tobacco products at point of sale: who are more exposed in Brazil?

    PubMed

    Ferreira-Gomes, Adriana Bacelar; Moura, Lenildo de; Araújo-Andrade, Silvânia Suely de; Lacerda-Mendes, Felipe; Perez, Cristina A; Abaakouk, Zohra

    2017-01-01

    To describe the adult population's perception of cigarette advertising at the point of sale, according to their tobacco-use status and socio-demographic characteristics such as sex, age, race/color, region, household location and schooling. A multivariable analysis was carried out using data from the Global Adult Tobacco Survey in 2008 and the National Health Survey in 2013. Both surveys showed that, among nonsmokers, women, young adults and those with over 10 years of schooling more frequently noticed cigarette advertising at the point of sale. It was also observed that among the population with fewer years of schooling these proportions increased significantly. A measure that completely bans tobacco advertising would be more effective in protecting vulnerable groups from tobacco consumption.

  14. Copula-based prediction of economic movements

    NASA Astrophysics Data System (ADS)

    García, J. E.; González-López, V. A.; Hirsh, I. D.

    2016-06-01

    In this paper we model the discretized returns of two paired time series, the BM&FBOVESPA Dividend Index and the BM&FBOVESPA Public Utilities Index, using multivariate Markov models. The discretization corresponds to three categories: high losses, high profits and the complementary periods of the series. In technical terms, the maximal memory that can be considered for a Markov model can be derived from the size of the alphabet and dataset. The number of parameters needed to specify a discrete multivariate Markov chain grows exponentially with the order and dimension of the chain. In this case the size of the database is not large enough for a consistent estimation of the model. We apply a strategy to estimate a multivariate process with an order greater than the order achieved using standard procedures. The new strategy consists of obtaining a partition of the state space, constructed from a combination of the partitions corresponding to the two marginal processes and the partition corresponding to the multivariate Markov chain. In order to estimate the transition probabilities, all the partitions are linked using a copula. In our application this strategy provides a significant improvement in the movement predictions.
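
    For orientation, the baseline estimation problem can be sketched as follows: discretize each return series into three categories and estimate the order-1 joint transition matrix by counting. This standard estimator is exactly what becomes data-hungry at higher orders, which motivates the copula-linked partition strategy described above; that strategy itself is not implemented here, and the series are simulated stand-ins for the index data.

      import numpy as np

      rng = np.random.default_rng(6)

      # Hypothetical daily returns for two indices (placeholders for the real series).
      r1, r2 = rng.normal(0, 0.01, (2, 1500))

      def discretize(r, q=0.15):
          lo, hi = np.quantile(r, [q, 1 - q])
          return np.where(r <= lo, 0, np.where(r >= hi, 2, 1))   # loss / other / profit

      s = np.stack([discretize(r1), discretize(r2)], axis=1)      # joint state per day
      codes = s[:, 0] * 3 + s[:, 1]                               # 9 joint states

      # Standard order-1 estimate of the joint transition matrix: count and normalise.
      counts = np.zeros((9, 9))
      for a, b in zip(codes[:-1], codes[1:]):
          counts[a, b] += 1
      row_sums = counts.sum(axis=1, keepdims=True)
      trans = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)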

  15. The Galileo scan platform pointing control system - A modern control theoretic viewpoint

    NASA Technical Reports Server (NTRS)

    Sevaston, G. E.; Macala, G. A.; Man, G. K.

    1985-01-01

    The current Galileo scan platform pointing control system (SPPCS) is described, and ways in which modern control concepts could serve to enhance it are considered. Of particular interest are the multi-variable design model and overall control system architecture, command input filtering, feedback compensator and command input design, stability robustness constraints for both continuous time and sampled data control systems, and digital implementation of the control system. The proposed approach leads to the design of a system that is similar to the current Galileo SPPCS configuration, but promises to be more systematic.

  16. Fuel Property Determination of Biodiesel-Diesel Blends By Terahertz Spectrum

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Zhao, Kun; Bao, Rima

    2012-05-01

    The frequency-dependent absorption characteristics of biodiesel and its blends with conventional diesel fuel were investigated in the spectral range of 0.2-1.5 THz by terahertz time-domain spectroscopy (THz-TDS). The absorption coefficient increased regularly with biodiesel content. A nonlinear multivariate model correlating the cetane number and solidifying point of biodiesel blends with the absorption coefficient was established, making the quantitative analysis of fuel properties simple. The results show that cetane number and solidifying point can be predicted by THz-TDS, indicating a bright future for this technology in practical applications.

  17. Detection of BCG bacteria using a magnetoresistive biosensor: A step towards a fully electronic platform for tuberculosis point-of-care detection.

    PubMed

    Barroso, Teresa G; Martins, Rui C; Fernandes, Elisabete; Cardoso, Susana; Rivas, José; Freitas, Paulo P

    2018-02-15

    Tuberculosis is one of the major public health concerns. This highly contagious disease affects more than 10.4 million people, being a leading cause of morbidity by infection. Tuberculosis is diagnosed at the point-of-care by the Ziehl-Neelsen sputum smear microscopy test. Ziehl-Neelsen is laborious, prone to human error and infection risk, with a limit of detection of 10⁴ cells/mL. In resource-poor nations, a more practical test, with lower detection limit, is paramount. This work uses a magnetoresistive biosensor to detect BCG bacteria for tuberculosis diagnosis. Herein we report: i) nanoparticle assembly method and specificity for tuberculosis detection; ii) demonstration of proportionality between BCG cell concentration and magnetoresistive voltage signal; iii) application of multiplicative signal correction for systematic effects removal; iv) investigation of calibration effectiveness using chemometrics methods; and v) comparison with state-of-the-art point-of-care tuberculosis biosensors. Results present a clear correspondence between voltage signal and cell concentration. Multiplicative signal correction removes baseline shifts within and between biochip sensors, allowing accurate and precise voltage signal between different biochips. The corrected signal was used for multivariate regression models, which significantly decreased the calibration standard error from 0.50 to 0.03 log₁₀(cells/mL). Results show that Ziehl-Neelsen detection limits and below are achievable with the magnetoresistive biochip, when pre-processing and chemometrics are used. Copyright © 2017 Elsevier B.V. All rights reserved.
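
    Multiplicative signal correction of the kind mentioned above can be sketched in a few lines: each sweep is regressed onto a reference signal (here the mean sweep) and the additive and multiplicative terms are removed. The toy signals below are invented, and the authors' full pre-processing chain is not reproduced.

      import numpy as np

      def msc(spectra, reference=None):
          """Multiplicative signal correction: remove additive/multiplicative
          baseline effects by regressing each sweep onto a reference."""
          ref = spectra.mean(axis=0) if reference is None else reference
          corrected = np.empty_like(spectra, dtype=float)
          for i, s in enumerate(spectra):
              b, a = np.polyfit(ref, s, 1)          # s ~ a + b*ref
              corrected[i] = (s - a) / b
          return corrected

      # Hypothetical voltage-signal sweeps from several biochips (rows = sweeps).
      rng = np.random.default_rng(7)
      base = np.sin(np.linspace(0, 3, 100))
      signals = np.array([0.2 * i + (1 + 0.1 * i) * base + rng.normal(0, 0.01, 100)
                          for i in range(5)])
      print(msc(signals).std(axis=0).max())   # baseline shifts largely removed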

  18. Reporting and Methodology of Multivariable Analyses in Prognostic Observational Studies Published in 4 Anesthesiology Journals: A Methodological Descriptive Review.

    PubMed

    Guglielminotti, Jean; Dechartres, Agnès; Mentré, France; Montravers, Philippe; Longrois, Dan; Laouénan, Cedric

    2015-10-01

    Prognostic research studies in anesthesiology aim to identify risk factors for an outcome (explanatory studies) or calculate the risk of this outcome on the basis of patients' risk factors (predictive studies). Multivariable models express the relationship between predictors and an outcome and are used in both explanatory and predictive studies. Model development demands a strict methodology and a clear reporting to assess its reliability. In this methodological descriptive review, we critically assessed the reporting and methodology of multivariable analysis used in observational prognostic studies published in anesthesiology journals. A systematic search was conducted on Medline through Web of Knowledge, PubMed, and journal websites to identify observational prognostic studies with multivariable analysis published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anaesthesia, and Anaesthesia in 2010 and 2011. Data were extracted by 2 independent readers. First, studies were analyzed with respect to reporting of outcomes, design, size, methods of analysis, model performance (discrimination and calibration), model validation, clinical usefulness, and the STROBE (i.e., Strengthening the Reporting of Observational Studies in Epidemiology) checklist. A reporting rate was calculated on the basis of 21 items of the aforementioned points. Second, they were analyzed with respect to some predefined methodological points. Eighty-six studies were included: 87.2% were explanatory and 80.2% investigated a postoperative event. The reporting was fairly good, with a median reporting rate of 79% (75% in explanatory studies and 100% in predictive studies). Six items had a reporting rate <36% (i.e., the 25th percentile), with some of them not identified in the STROBE checklist: blinded evaluation of the outcome (11.9%), reason for sample size (15.1%), handling of missing data (36.0%), assessment of colinearity (17.4%), assessment of interactions (13.9%), and calibration (34.9%). When reported, a few methodological shortcomings were observed, both in explanatory and predictive studies, such as an insufficient number of events of the outcome (44.6%), exclusion of cases with missing data (93.6%), or categorization of continuous variables (65.1%). The reporting of multivariable analysis was fairly good and could be further improved by checking reporting guidelines and the EQUATOR Network website. Limiting the number of candidate variables, including cases with missing data, and not arbitrarily categorizing continuous variables should be encouraged.

  19. On measures of association among genetic variables

    PubMed Central

    Gianola, Daniel; Manfredi, Eduardo; Simianer, Henner

    2012-01-01

    Systems involving many variables are important in population and quantitative genetics, for example, in multi-trait prediction of breeding values and in exploration of multi-locus associations. We studied departures of the joint distribution of sets of genetic variables from independence. New measures of association based on notions of statistical distance between distributions are presented. These are more general than correlations, which are pairwise measures, and lack a clear interpretation beyond the bivariate normal distribution. Our measures are based on logarithmic (Kullback-Leibler) and on relative ‘distances’ between distributions. Indexes of association are developed and illustrated for quantitative genetics settings in which the joint distribution of the variables is either multivariate normal or multivariate-t, and we show how the indexes can be used to study linkage disequilibrium in a two-locus system with multiple alleles and present applications to systems of correlated beta distributions. Two multivariate beta and multivariate beta-binomial processes are examined, and new distributions are introduced: the GMS-Sarmanov multivariate beta and its beta-binomial counterpart. PMID:22742500
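
    For the multivariate normal case mentioned above, one Kullback-Leibler-based index has a closed form: the divergence between the joint density and the product of its marginals (the mutual information) equals -0.5 log det(R) for correlation matrix R, vanishing under independence. The small numerical check below illustrates this fact and is not tied to the paper's exact index definitions.

      import numpy as np

      # KL divergence between an MVN joint and the product of its marginals,
      # which reduces to -0.5*log(1 - rho^2) in the bivariate case and is
      # zero when R is the identity (independence).
      R = np.array([[1.0, 0.5, 0.3],
                    [0.5, 1.0, 0.4],
                    [0.3, 0.4, 1.0]])
      association = -0.5 * np.log(np.linalg.det(R))
      print(association)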

  20. Quantitative Analysis of Cotton Canopy Size in Field Conditions Using a Consumer-Grade RGB-D Camera.

    PubMed

    Jiang, Yu; Li, Changying; Paterson, Andrew H; Sun, Shangpeng; Xu, Rui; Robertson, Jon

    2017-01-01

    Plant canopy structure can strongly affect crop functions such as yield and stress tolerance, and canopy size is an important aspect of canopy structure. Manual assessment of canopy size is laborious and imprecise, and cannot measure multi-dimensional traits such as projected leaf area and canopy volume. Field-based high throughput phenotyping systems with imaging capabilities can rapidly acquire data about plants in field conditions, making it possible to quantify and monitor plant canopy development. The goal of this study was to develop a 3D imaging approach to quantitatively analyze cotton canopy development in field conditions. A cotton field was planted with 128 plots, including four genotypes of 32 plots each. The field was scanned by GPhenoVision (a customized field-based high throughput phenotyping system) to acquire color and depth images with GPS information in 2016 covering two growth stages: canopy development, and flowering and boll development. A data processing pipeline was developed, consisting of three steps: plot point cloud reconstruction, plant canopy segmentation, and trait extraction. Plot point clouds were reconstructed using color and depth images with GPS information. In colorized point clouds, vegetation was segmented from the background using an excess-green (ExG) color filter, and cotton canopies were further separated from weeds based on height, size, and position information. Static morphological traits were extracted on each day, including univariate traits (maximum and mean canopy height and width, projected canopy area, and concave and convex volumes) and a multivariate trait (cumulative height profile). Growth rates were calculated for univariate static traits, quantifying canopy growth and development. Linear regressions were performed between the traits and fiber yield to identify the best traits and measurement time for yield prediction. The results showed that fiber yield was correlated with static traits after the canopy development stage (R² = 0.35-0.71) and growth rates in early canopy development stages (R² = 0.29-0.52). Multi-dimensional traits (e.g., projected canopy area and volume) outperformed one-dimensional traits, and the multivariate trait (cumulative height profile) outperformed univariate traits. The proposed approach would be useful for identification of quantitative trait loci (QTLs) controlling canopy size in genetics/genomics studies or for fiber yield prediction in breeding programs and production environments.
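
    The excess-green segmentation step of the pipeline can be sketched as below; the threshold and the toy image are arbitrary, and the height/size/position rules used to separate cotton from weeds are not included.

      import numpy as np

      def excess_green_mask(rgb, threshold=0.1):
          """Segment vegetation with the excess-green index ExG = 2g - r - b,
          computed on chromatic coordinates of an RGB image (H x W x 3, 0-255)."""
          rgb = rgb.astype(float)
          total = rgb.sum(axis=2, keepdims=True) + 1e-9
          r, g, b = np.moveaxis(rgb / total, 2, 0)
          exg = 2 * g - r - b
          return exg > threshold

      # Hypothetical image: the vegetation pixel should exceed the ExG threshold.
      img = np.zeros((2, 2, 3), dtype=np.uint8)
      img[0, 0] = (40, 180, 40)     # green canopy pixel
      img[1, 1] = (120, 110, 100)   # soil-like pixel
      print(excess_green_mask(img))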

  1. Computer-based self-organized tectonic zoning: a tentative pattern recognition for Iran

    NASA Astrophysics Data System (ADS)

    Zamani, Ahmad; Hashemi, Naser

    2004-08-01

    Conventional methods of tectonic zoning are frequently characterized by two deficiencies. The first one is the large uncertainty involved in tectonic zoning based on non-quantitative and subjective analysis. Failure to interpret accurately a large amount of data "by eye" is the second. In order to alleviate each of these deficiencies, the multivariate statistical method of cluster analysis has been utilized to seek and separate zones with similar tectonic pattern and construct automated self-organized multivariate tectonic zoning maps. This analytical method of tectonic regionalization is particularly useful for showing trends in tectonic evolution of a region that could not be discovered by any other means. To illustrate, this method has been applied for producing a general-purpose numerical tectonic zoning map of Iran. While there are some similarities between the self-organized multivariate numerical maps and the conventional maps, the cluster solution maps reveal some remarkable features that cannot be observed on the current tectonic maps. The following specific examples need to be noted: (1) The much disputed extent and rigidity of the Lut Rigid Block, described as the microplate of east Iran, is clearly revealed on the self-organized numerical maps. (2) The cluster solution maps reveal a striking similarity between this microplate and the northern Central Iran—including the Great Kavir region. (3) Contrary to the conventional map, the cluster solution maps make a clear distinction between the East Iranian Ranges and the Makran Mountains. (4) Moreover, interesting similarities between the Azarbaijan region in the northwest and the Makran Mountains in the southeast, and between the Kopet Dagh Ranges in the northeast and the Zagros Folded Belt in the southwest of Iran, are revealed in the clustering process. This new approach to tectonic zoning is a starting point and is expected to be improved and refined by collection of new data. The method is also a useful tool in studying neotectonics, seismotectonics, seismic zoning, and hazard estimation of the seismogenic regions.
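
    The clustering step behind such self-organized zoning can be sketched generically as below, using k-means on standardized grid-cell attributes as a stand-in for whichever cluster-analysis variant was actually applied; the features and the number of zones are illustrative only.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(8)

      # Hypothetical grid cells described by several tectonic variables
      # (e.g., crustal thickness, heat flow, seismicity rate, fault density).
      n_cells = 1000
      features = rng.normal(size=(n_cells, 4))

      # Standardise, then let the cluster analysis find zones of similar pattern.
      X = StandardScaler().fit_transform(features)
      zones = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

      # 'zones' assigns each grid cell a label that can be mapped back to its
      # coordinates to draw a self-organised zoning map.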

  2. Dimension Reduction of Multivariable Optical Emission Spectrometer Datasets for Industrial Plasma Processes

    PubMed Central

    Yang, Jie; McArdle, Conor; Daniels, Stephen

    2014-01-01

    A new data dimension-reduction method, called Internal Information Redundancy Reduction (IIRR), is proposed for application to Optical Emission Spectroscopy (OES) datasets obtained from industrial plasma processes. For example in a semiconductor manufacturing environment, real-time spectral emission data is potentially very useful for inferring information about critical process parameters such as wafer etch rates, however, the relationship between the spectral sensor data gathered over the duration of an etching process step and the target process output parameters is complex. OES sensor data has high dimensionality (fine wavelength resolution is required in spectral emission measurements in order to capture data on all chemical species involved in plasma reactions) and full spectrum samples are taken at frequent time points, so that dynamic process changes can be captured. To maximise the utility of the gathered dataset, it is essential that information redundancy is minimised, but with the important requirement that the resulting reduced dataset remains in a form that is amenable to direct interpretation of the physical process. To meet this requirement and to achieve a high reduction in dimension with little information loss, the IIRR method proposed in this paper operates directly in the original variable space, identifying peak wavelength emissions and the correlative relationships between them. A new statistic, Mean Determination Ratio (MDR), is proposed to quantify the information loss after dimension reduction and the effectiveness of IIRR is demonstrated using an actual semiconductor manufacturing dataset. As an example of the application of IIRR in process monitoring/control, we also show how etch rates can be accurately predicted from IIRR dimension-reduced spectral data. PMID:24451453
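
    A much-simplified sketch of the underlying idea (peak identification followed by removal of correlatively redundant channels, staying in the original wavelength space) is given below; it is not the IIRR algorithm or the MDR statistic, and the synthetic spectra and correlation cut-off are invented.

      import numpy as np
      from scipy.signal import find_peaks

      rng = np.random.default_rng(9)
      t, w = 400, 1024
      wl = np.arange(w)

      def emission_lines(centres):
          return sum(np.exp(-0.5 * ((wl - c) / 3.0) ** 2) for c in centres)

      # Two groups of emission lines driven by two different process trends.
      trend1 = 1 + 0.05 * np.sin(np.linspace(0, 6, t))
      trend2 = 1 + 0.05 * np.cos(np.linspace(0, 6, t))
      spectra = (np.outer(trend1, emission_lines([150, 400, 402])) +
                 np.outer(trend2, emission_lines([700, 900])) +
                 rng.normal(0, 0.005, (t, w)))

      # Step 1: candidate peak emission wavelengths from the mean spectrum.
      peaks, _ = find_peaks(spectra.mean(axis=0), height=0.2)

      # Step 2: keep a peak only if its time series is not nearly redundant
      # (|correlation| below the cut-off) with every peak already retained.
      kept = []
      for p in peaks:
          if all(abs(np.corrcoef(spectra[:, p], spectra[:, q])[0, 1]) < 0.95
                 for q in kept):
              kept.append(p)
      print(len(peaks), "candidate peaks ->", len(kept), "retained")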

  3. Radiation Therapy Versus No Radiation Therapy to the Neo-breast Following Skin-Sparing Mastectomy and Immediate Autologous Free Flap Reconstruction for Breast Cancer: Patient-Reported and Surgical Outcomes at 1 Year-A Mastectomy Reconstruction Outcomes Consortium (MROC) Substudy.

    PubMed

    Cooke, Andrew L; Diaz-Abele, Julian; Hayakawa, Tom; Buchel, Ed; Dalke, Kimberly; Lambert, Pascal

    2017-09-01

    To determine whether adjuvant radiation therapy (RT) is associated with adverse patient-reported outcomes and surgical complications 1 year after skin-sparing mastectomy and immediate autologous free flap reconstruction for breast cancer. We compared 24 domains of patient-reported outcome measures 1 year after autologous reconstruction between patients who received adjuvant RT and those who did not. A total of 125 patients who underwent surgery between 2012 and 2015 at our institution were included from the Mastectomy Reconstruction Outcomes Consortium study database. Adjusted multivariate models were created incorporating RT technical data, age, cancer stage, estrogen receptor, chemotherapy, breast size, body mass index, and income to determine whether RT was associated with outcomes. At 1 year after surgery, European Organisation for Research and Treatment of Cancer (EORTC) Breast Cancer-Specific Quality of Life Questionnaire breast symptoms were significantly greater in 64 patients who received RT (8-point difference on 100-point ordinal scale, P<.0001) versus 61 who did not receive RT in univariate and multivariate models. EORTC arm symptoms (20-point difference on 100-point ordinal scale, P=.0200) differed on univariate analysis but not on multivariate analysis. All other outcomes, including the Numerical Pain Rating Scale, BREAST-Q (Post-operative Reconstruction Module), Patient-Report Outcomes Measurement Information System Profile 29, McGill Pain Questionnaire-Short Form (MPQ-SF) score, Generalized Anxiety Disorder Scale, and Patient Health Questionnaire, were not statistically different between groups. Surgical complications were uncommon and did not differ by treatment. RT to the neo-breast compared with no RT following immediate autologous free flap reconstruction for breast cancer is well tolerated at 1 year following surgery despite patients undergoing RT also having a higher cancer stage and more intensive surgical and systemic treatment. Neo-breast symptoms are more common in patients receiving RT by the EORTC Breast Cancer-Specific Quality of Life Questionnaire but not by the BREAST-Q. Patient-reported results at 1 year after surgery suggest RT following immediate autologous free flap breast reconstruction is well tolerated. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Medication non-adherence and uncertainty: Information-seeking and processing in the Danish LIFESTAT survey.

    PubMed

    Kriegbaum, Margit; Lau, Sofie Rosenlund

    2017-09-23

    Statins are widely prescribed to lower cardiovascular morbidity and mortality. However, statin non-adherence is very high. The aim of this paper was to investigate reasons for stopping statin treatment in the general population and to study how aspects of information-seeking and processing are associated with statin non-adherence. This study used a population survey of 3050 Danish residents aged 45-65 years. Reasons for statin discontinuation were studied among those who were previous statin users. The association between information seeking and processing and statin discontinuation was analysed using multivariate logistic regression models. Experience of side effects and fear of side effects played an important role in the discontinuation of statin treatment. Feelings of uncertainty and confusion regarding information on statins predicted statin discontinuation. This applied to information from both mass media and general practitioners. There was no clear pattern linking information seeking to statin non-adherence. The article points to the impact of information-seeking on the decision to take cholesterol-lowering medication. This included contributions from information disseminated by media outlets. Side effects and fear of side effects should be addressed in clinical practice. Health care professionals should pay attention to emotional aspects of how information is disseminated and perceived by statin users. Copyright © 2017. Published by Elsevier Inc.

  5. Numerical modeling of polymorphic transformation of oleic acid via near-infrared spectroscopy and factor analysis.

    PubMed

    Liu, Ling; Cheng, Yuliang; Sun, Xiulan; Pi, Fuwei

    2018-05-15

    Near-infrared (NIR) spectroscopy as a tool for directly and quantitatively screening minute polymorphic transitions of bioactive fatty acids was assessed, based on a thermal heating process of oleic acid. Temperature-dependent NIR spectral profiles indicate that dynamical variations of the COOH group dominate its γ → α phase transition, while the transition from the active α to the β phase mainly relates to the conformational transfer of the acyl chain. Through operating multivariate curve resolution-alternating least squares with factor analysis, the instantaneous contribution of each active polymorph during the transition process was illustrated, displaying the progressive evolution of the functional groups. The calculated contributions reveal that the α phase of oleic acid is initially present at around -18 °C, but grows sharply around -2.2 °C through transformation of the γ phase and finally disappears at the melting point. On the other hand, the β phase of oleic acid is solely self-generated after melting, even though it appears embryonically at -2.2 °C. Such a mathematical approach based on NIR spectroscopy and factor analysis provides a versatile strategy for quantitatively exploring the transition processes of bioactive fatty acids; meanwhile, it shows promise for instantaneously quantifying each active polymorph of lipid materials. Copyright © 2018 Elsevier B.V. All rights reserved.
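
    A bare-bones multivariate curve resolution-alternating least squares loop is sketched below on synthetic two-component spectra, to show how concentration profiles and pure spectra are alternately re-estimated under a non-negativity constraint; the constraints, initial estimates and factor-analysis steps of the actual study are not reproduced, and the usual rotational and scale ambiguity of MCR is ignored.

      import numpy as np

      rng = np.random.default_rng(10)

      # Hypothetical data matrix D (temperatures x wavelengths) built from two
      # overlapping "polymorph" spectra with smoothly exchanging contributions.
      wl = np.linspace(0, 1, 200)
      S_true = np.vstack([np.exp(-0.5 * ((wl - 0.35) / 0.05) ** 2),
                          np.exp(-0.5 * ((wl - 0.55) / 0.07) ** 2)])
      temps = np.linspace(0, 1, 60)
      C_true = np.vstack([1 - temps, temps]).T
      D = C_true @ S_true + rng.normal(0, 0.01, (60, 200))

      # Bare-bones MCR-ALS: alternate least squares with non-negativity clipping.
      C = np.abs(rng.normal(size=(60, 2)))
      for _ in range(200):
          S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)
          C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)

      # Rows of S approximate the pure spectra; columns of C give each
      # component's contribution profile along the temperature axis.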

  6. Numerical modeling of polymorphic transformation of oleic acid via near-infrared spectroscopy and factor analysis

    NASA Astrophysics Data System (ADS)

    Liu, Ling; Cheng, Yuliang; Sun, Xiulan; Pi, Fuwei

    2018-05-01

    Near-infrared (NIR) spectroscopy as a tool for directly and quantitatively screening minute polymorphic transitions of bioactive fatty acids was assessed, based on a thermal heating process of oleic acid. Temperature-dependent NIR spectral profiles indicate that dynamical variations of the COOH group dominate its γ → α phase transition, while the transition from the active α to the β phase mainly relates to the conformational transfer of the acyl chain. Through operating multivariate curve resolution-alternating least squares with factor analysis, the instantaneous contribution of each active polymorph during the transition process was illustrated, displaying the progressive evolution of the functional groups. The calculated contributions reveal that the α phase of oleic acid is initially present at around -18 °C, but grows sharply around -2.2 °C through transformation of the γ phase and finally disappears at the melting point. On the other hand, the β phase of oleic acid is solely self-generated after melting, even though it appears embryonically at -2.2 °C. Such a mathematical approach based on NIR spectroscopy and factor analysis provides a versatile strategy for quantitatively exploring the transition processes of bioactive fatty acids; meanwhile, it shows promise for instantaneously quantifying each active polymorph of lipid materials.

  7. Cider fermentation process monitoring by Vis-NIR sensor system and chemometrics.

    PubMed

    Villar, Alberto; Vadillo, Julen; Santos, Jose I; Gorritxategi, Eneko; Mabe, Jon; Arnaiz, Aitor; Fernández, Luis A

    2017-04-15

    Optimization of a multivariate calibration process has been undertaken for a Visible-Near Infrared (400-1100 nm) sensor system, applied in the monitoring of the fermentation process of the cider produced in the Basque Country (Spain). The main parameters that were monitored included alcoholic proof, l-lactic acid content, glucose+fructose and acetic acid content. The multivariate calibration was carried out using a combination of different variable selection techniques, and the most suitable pre-processing strategies were selected based on the spectral characteristics obtained by the sensor system. The variable selection techniques studied in this work include the Martens Uncertainty test, interval Partial Least Square Regression (iPLS) and Genetic Algorithm (GA). This procedure arises from the need to improve the calibration models' prediction ability for cider monitoring. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Multivariate Analysis To Quantify Species in the Presence of Direct Interferents: Micro-Raman Analysis of HNO3 in Microfluidic Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, Amanda M.; Nelson, Gilbert L.; Casella, Amanda J.

    Microfluidic devices are a growing field with significant potential for application to small scale processing of solutions. Much like large scale processing, fast, reliable, and cost effective means of monitoring the streams during processing are needed. Here we apply a novel Micro-Raman probe to the on-line monitoring of streams within a microfluidic device. For either macro or micro scale process monitoring via spectroscopic response, there is the danger of interfering or confounded bands obfuscating results. By utilizing chemometric analysis, a form of multivariate analysis, species can be accurately quantified in solution despite the presence of overlapping or confounded spectroscopic bands. This is demonstrated on solutions of HNO3 and NaNO3 within micro-flow and microfluidic devices.

  9. Multivariate pattern dependence

    PubMed Central

    Saxe, Rebecca

    2017-01-01

    When we perform a cognitive task, multiple brain regions are engaged. Understanding how these regions interact is a fundamental step to uncover the neural bases of behavior. Most research on the interactions between brain regions has focused on the univariate responses in the regions. However, fine grained patterns of response encode important information, as shown by multivariate pattern analysis. In the present article, we introduce and apply multivariate pattern dependence (MVPD): a technique to study the statistical dependence between brain regions in humans in terms of the multivariate relations between their patterns of responses. MVPD characterizes the responses in each brain region as trajectories in region-specific multidimensional spaces, and models the multivariate relationship between these trajectories. We applied MVPD to the posterior superior temporal sulcus (pSTS) and to the fusiform face area (FFA), using a searchlight approach to reveal interactions between these seed regions and the rest of the brain. Across two different experiments, MVPD identified significant statistical dependence not detected by standard functional connectivity. Additionally, MVPD outperformed univariate connectivity in its ability to explain independent variance in the responses of individual voxels. In the end, MVPD uncovered different connectivity profiles associated with different representational subspaces of FFA: the first principal component of FFA shows differential connectivity with occipital and parietal regions implicated in the processing of low-level properties of faces, while the second and third components show differential connectivity with anterior temporal regions implicated in the processing of invariant representations of face identity. PMID:29155809
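
    The sketch below illustrates the spirit of the approach with simulated data and generic scikit-learn pieces: the seed region's multivariate time courses are reduced to a few dimensions, a linear model maps them to the target region's voxel responses, and out-of-sample variance explained is compared against a mean-time-course (univariate) predictor. The dimensionality, regularization and data are placeholders, not the published implementation.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import Ridge

      rng = np.random.default_rng(11)

      # Hypothetical data: time points x voxels for a seed and a target region.
      t_train, t_test, v_seed, v_targ = 300, 150, 80, 60
      n_t = t_train + t_test
      latent = rng.normal(size=(n_t, 3))                  # shared multivariate signal
      seed = latent @ rng.normal(size=(3, v_seed)) + rng.normal(0, 1, (n_t, v_seed))
      targ = latent @ rng.normal(size=(3, v_targ)) + rng.normal(0, 1, (n_t, v_targ))

      seed_tr, seed_te = seed[:t_train], seed[t_train:]
      targ_tr, targ_te = targ[:t_train], targ[t_train:]

      def r2(actual, pred):
          return 1 - ((actual - pred) ** 2).sum() / ((actual - actual.mean(0)) ** 2).sum()

      # MVPD-style model: a low-dimensional trajectory of the seed region
      # predicts the multivariate response of the target region.
      pca = PCA(n_components=3).fit(seed_tr)
      mv = Ridge(alpha=1.0).fit(pca.transform(seed_tr), targ_tr)
      mvpd_r2 = r2(targ_te, mv.predict(pca.transform(seed_te)))

      # Univariate comparison: predict the target from the seed mean time course.
      uni = Ridge(alpha=1.0).fit(seed_tr.mean(1, keepdims=True), targ_tr)
      uni_r2 = r2(targ_te, uni.predict(seed_te.mean(1, keepdims=True)))

      print(round(mvpd_r2, 3), round(uni_r2, 3))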

  10. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  11. Predictors of radiation-induced esophageal toxicity in patients with non-small-cell lung cancer treated with three-dimensional conformal radiotherapy.

    PubMed

    Singh, Anurag K; Lockett, Mary Ann; Bradley, Jeffrey D

    2003-02-01

    To evaluate the incidence and clinical/dosimetric predictors of acute and late Radiation Therapy Oncology Group Grade 3-5 esophageal toxicity in patients with non-small-cell lung cancer (NSCLC) treated with definitive three-dimensional conformal radiotherapy (3D-CRT). We retrospectively reviewed the charts of 207 consecutive patients with NSCLC who were treated with high-dose, definitive 3D-CRT between March 1991 and December 1998. This population consisted of 107 men and 100 women. The median age was 67 years (range 31-90). The following patient and treatment parameters were studied: age, gender, race, performance status, sequential chemotherapy, concurrent chemotherapy, presence of subcarinal nodes, pretreatment weight loss, mean dose to the entire esophagus, maximal point dose to the esophagus, and percentage of volume of esophagus receiving >55 Gy. All doses are reported without heterogeneity corrections. The median prescription dose to the isocenter in this population was 70 Gy (range 60-74) delivered in 2-Gy daily fractions. All patients were treated once daily. Acute and late esophageal toxicities were graded by Radiation Therapy Oncology Group criteria. Patient and clinical/dosimetric factors were coded and correlated with acute and late Grade 3-5 esophageal toxicity using univariate and multivariate regression analyses. Of 207 patients, 16 (8%) developed acute (10 patients) or late (13 patients) Grade 3-5 esophageal toxicity. Seven patients had both acute and late Grade 3-5 esophageal toxicity. One patient died (Grade 5 esophageal toxicity) of late esophageal perforation. Concurrent chemotherapy, maximal point dose to the esophagus >58 Gy, and a mean dose to the entire esophagus >34 Gy were significantly associated with a risk of Grade 3-5 esophageal toxicity on univariate analysis. Concurrent chemotherapy and maximal point dose to the esophagus >58 Gy retained significance on multivariate analysis. Of 207 patients, 53 (26%) received concurrent chemotherapy. Fourteen (88%) of the 16 patients who developed Grade 3-5 esophageal toxicity had received concurrent chemotherapy (p = 0.0001, Pearson's chi-square test). No case of Grade 3-5 esophageal toxicity occurred in patients who received a maximal point dose to the esophagus of <58 Gy (p = 0.0001, Fisher's exact test, two-tail). Only 2 patients developed Grade 3-5 esophageal toxicity in the absence of concurrent chemotherapy; both received a maximal esophageal point dose >69 Gy. All assessable patients who developed Grade 3-5 esophageal toxicity had a mean dose to the entire esophagus >34 Gy (p = 0.0351, Pearson's chi-square test). However, the mean dose was not predictive on multivariate analysis. Concurrent chemotherapy and the maximal esophageal point dose were significantly associated with a risk of Grade 3-5 esophageal toxicity in patients with NSCLC treated with high-dose 3D-CRT. In patients who received concurrent chemotherapy, the threshold maximal esophageal point dose for Grade 3-5 esophageal toxicity was 58 Gy. An insufficient number of patients developed Grade 3-5 esophageal toxicity in the absence of chemotherapy to allow a valid statistical analysis of the relationship between the maximal esophageal point dose and esophagitis.

  12. Multivariate Granger causality: an estimation framework based on factorization of the spectral density matrix

    PubMed Central

    Wen, Xiaotong; Rangarajan, Govindan; Ding, Mingzhou

    2013-01-01

    Granger causality is increasingly being applied to multi-electrode neurophysiological and functional imaging data to characterize directional interactions between neurons and brain regions. For a multivariate dataset, one might be interested in different subsets of the recorded neurons or brain regions. According to the current estimation framework, for each subset, one conducts a separate autoregressive model fitting process, introducing the potential for unwanted variability and uncertainty. In this paper, we propose a multivariate framework for estimating Granger causality. It is based on spectral density matrix factorization and offers the advantage that the estimation of such a matrix needs to be done only once for the entire multivariate dataset. For any subset of recorded data, Granger causality can be calculated through factorizing the appropriate submatrix of the overall spectral density matrix. PMID:23858479

  13. Compound-Specific Isotope Analysis of Diesel Fuels in a Forensic Investigation

    NASA Astrophysics Data System (ADS)

    Muhammad, Syahidah; Frew, Russell; Hayman, Alan

    2015-02-01

    Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, alkanes, to generate carbon and hydrogen isotopic data of the compositions of the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin, i.e., the very subtle differences in isotopic values between the samples.

  14. Compound-specific isotope analysis of diesel fuels in a forensic investigation

    PubMed Central

    Muhammad, Syahidah A.; Frew, Russell D.; Hayman, Alan R.

    2015-01-01

    Compound-specific isotope analysis (CSIA) offers great potential as a tool to provide chemical evidence in a forensic investigation. Many attempts to trace environmental oil spills were successful where isotopic values were particularly distinct. However, difficulties arise when a large data set is analyzed and the isotopic differences between samples are subtle. In the present study, discrimination of diesel oils involved in a diesel theft case was carried out to infer the relatedness of the samples to potential source samples. This discriminatory analysis used a suite of hydrocarbon diagnostic indices, alkanes, to generate carbon and hydrogen isotopic data of the compositions of the compounds, which were then processed using multivariate statistical analyses to infer the relatedness of the data set. The results from this analysis were put into context by comparing the data with the δ13C and δ2H of alkanes in commercial diesel samples obtained from various locations in the South Island of New Zealand. Based on the isotopic character of the alkanes, it is suggested that diesel fuels involved in the diesel theft case were distinguishable. This manuscript shows that CSIA, when used in tandem with multivariate statistical analysis, provides a defensible means to differentiate and source-apportion qualitatively similar oils at the molecular level. This approach was able to overcome confounding challenges posed by the near single-point source of origin, i.e., the very subtle differences in isotopic values between the samples. PMID:25774366

  15. Outcomes of Kidney Transplantation Abroad: A Single-Center Canadian Cohort Study.

    PubMed

    Quach, Kevin; Sultan, Heebah; Li, Yanhong; Famure, Olusegun; Kim, S Joseph

    2016-03-01

    An increasing demand for kidney transplantation has enticed some patients with end-stage renal disease (ESRD) to venture outside their country of residence, but their posttransplant outcomes may be suboptimal. We compared the risks and clinical outcomes among tourists, or patients who pursue a kidney transplant abroad, versus patients who received a transplant at the Toronto General Hospital (TGH). A single-center, 1:3 matched (based on age at transplant, time on dialysis, and year of transplant) cohort study was conducted. Forty-five tourists were matched with 135 domestic transplant recipients between January 1, 2000, and December 31, 2011. Multivariable Cox proportional hazards models were fitted to assess graft and patient outcomes. Among the 45 tourists, the majority (38 of 45) traveled to the Middle East or Far East Asia, and most received living donor kidney transplants (35 of 45). Multivariable Cox proportional hazards models showed that tourists had a higher risk for the composite outcome of acute rejection, death-censored graft failure, or death with graft function (DWGF; hazard ratio [HR] 2.08, 95% confidence interval [CI]: 1.06-4.07). Tourists also showed a higher risk for the individual end points of acute rejection, DWGF, and posttransplant hospitalizations. Patients going abroad for kidney transplantation may have inferior outcomes compared to domestic patients receiving kidney transplants. Patients who are contemplating an overseas transplant need to be aware of the increased risk of adverse posttransplant outcomes and should be appropriately counseled by transplant professionals during the pretransplant evaluation process. © 2016, NATCO.
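
    A minimal sketch, not from the study, of fitting a multivariable Cox proportional hazards model of the kind described above; it assumes the lifelines package is available and uses hypothetical column names and simulated data.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(1)
      n = 180
      df = pd.DataFrame({
          "time_to_event_yrs": rng.exponential(6.0, n),   # follow-up time
          "event": rng.integers(0, 2, n),                 # composite outcome indicator
          "tourist": rng.integers(0, 2, n),               # transplant abroad vs domestic
          "age_at_transplant": rng.normal(50, 12, n),
          "years_on_dialysis": rng.exponential(3.0, n),
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="time_to_event_yrs", event_col="event")
      cph.print_summary()   # hazard ratios (exp(coef)) with 95% confidence intervals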

  16. The impact of health information technology on cancer care across the continuum: a systematic review and meta-analysis.

    PubMed

    Tarver, Will L; Menachemi, Nir

    2016-03-01

    Health information technology (HIT) has the potential to play a significant role in the management of cancer. The purpose of this review is to identify and examine empirical studies that investigate the impact of HIT in cancer care on different levels of the care continuum. Electronic searches were performed in four academic databases. The authors used a three-step search process to identify 122 studies that met specific inclusion criteria. Next, a coding sheet was used to extract information from each included article to use in an analysis. Logistic regression was used to determine study-specific characteristics that were associated with positive findings. Overall, 72.4% of published analyses reported a beneficial effect of HIT. Multivariate analysis found that the impact of HIT differs across the cancer continuum with studies targeting diagnosis and treatment being, respectively, 77 (P = .001) and 39 (P = .039) percentage points less likely to report a beneficial effect when compared to those targeting prevention. In addition, studies targeting HIT to patients were 31 percentage points less likely to find a beneficial effect than those targeting providers (P = .030). Lastly, studies assessing behavior change as an outcome were 41 percentage points less likely to find a beneficial effect (P = .006), while studies targeting decision making were 27 percentage points more likely to find a beneficial effect (P = .034). Based on current evidence, HIT interventions seem to be more successful when targeting physicians, care in the prevention phase of the cancer continuum, and/or decision making. An agenda for future research is discussed. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Combining fibre optic Raman spectroscopy and tactile resonance measurement for tissue characterization

    NASA Astrophysics Data System (ADS)

    Candefjord, Stefan; Nyberg, Morgan; Jalkanen, Ville; Ramser, Kerstin; Lindahl, Olof A.

    2010-12-01

    Tissue characterization is fundamental for identification of pathological conditions. Raman spectroscopy (RS) and tactile resonance measurement (TRM) are two promising techniques that measure biochemical content and stiffness, respectively. They have the potential to complement the gold standard, histological analysis. By combining RS and TRM, complementary information about tissue content can be obtained and specific drawbacks can be avoided. The aim of this study was to develop a multivariate approach to compare RS and TRM information. The approach was evaluated on measurements at the same points on porcine abdominal tissue. The measurement points were divided into five groups by multivariate analysis of the RS data. A regression analysis was performed and receiver operating characteristic (ROC) curves were used to compare the RS and TRM data. TRM identified one group efficiently (area under ROC curve 0.99). The RS data showed that the proportion of saturated fat was high in this group. The regression analysis showed that stiffness was mainly determined by the amount of fat and its composition. We concluded that RS provided additional, important information for tissue identification that was not provided by TRM alone. The results are promising for the development of a method combining RS and TRM for intraoperative tissue characterization.
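
    A minimal sketch with simulated values, not the study's measurements, of the ROC comparison described above: the area under the ROC curve quantifies how well a stiffness-type reading separates one RS-defined group from the remaining measurement points. Group sizes and distributions are assumptions.

      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(0)
      # 1 = points in the RS-defined group of interest, 0 = all other measurement points.
      group = np.r_[np.ones(30), np.zeros(70)].astype(int)
      # Hypothetical TRM readings: the group of interest differs in mean stiffness.
      stiffness = np.r_[rng.normal(5.0, 1.0, 30), rng.normal(2.0, 1.0, 70)]

      auc = roc_auc_score(group, stiffness)
      fpr, tpr, thresholds = roc_curve(group, stiffness)
      print(f"area under ROC curve: {auc:.2f}")   # values near 1 indicate good separation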

  18. Associated Variables of Myositis in Systemic Lupus Erythematosus: A Cross-Sectional Study.

    PubMed

    Liang, Yan; Leng, Rui-Xue; Pan, Hai-Feng; Ye, Dong-Qing

    2017-05-26

    BACKGROUND This study aimed to estimate the point prevalence of myositis and identify associated variables of myositis in systemic lupus erythematosus (SLE). MATERIAL AND METHODS Clinical data of patients hospitalized with lupus at the First Affiliated Hospital of Anhui Medical University and Anhui Provincial Hospital were collected. Patients were defined as having myositis if they reported the presence of persistent invalidating muscular weakness combined with increased levels of creatine phosphokinase (CPK) and abnormal electromyography (EMG). RESULTS The study sample comprised 1701 lupus patients, of whom 44 had myositis. Patients with SLE-associated myositis were more likely to have skin rash, alopecia, pericarditis, vasculitis, anti-Sm, anti-RNP, anti-dsDNA, thrombocytopenia, leukopenia, low C3, low C4, high erythrocyte sedimentation rate (ESR), high D-dimer, and active disease. Multivariate logistic regression found positive associations of leukopenia, alopecia, and active disease with myositis. Negative associations between myositis and the use of corticosteroids or immunosuppressive drugs were revealed in univariate and multivariate analysis. CONCLUSIONS The point prevalence of myositis was 2.6% in SLE patients. The significant association of alopecia, leukopenia, and active disease with myositis suggests that organ damage, hematological abnormality, and high disease activity promote the progression of myositis in lupus patients.

  19. Optimization of cloud point extraction and solid phase extraction methods for speciation of arsenic in natural water using multivariate technique.

    PubMed

    Baig, Jameel A; Kazi, Tasneem G; Shah, Abdul Q; Arain, Mohammad B; Afridi, Hassan I; Kandhro, Ghulam A; Khan, Sumaira

    2009-09-28

    The simple and rapid pre-concentration techniques, viz. cloud point extraction (CPE) and solid phase extraction (SPE), were applied for the determination of As(3+) and total inorganic arsenic (iAs) in surface and ground water samples. The As(3+) formed a complex with ammonium pyrrolidinedithiocarbamate (APDC) and was extracted into the surfactant-rich phase of the non-ionic surfactant Triton X-114; after centrifugation, the surfactant-rich phase was diluted with 0.1 mol L(-1) HNO(3) in methanol. Total iAs in water samples was adsorbed on titanium dioxide (TiO(2)); after centrifugation, the solid phase was prepared as a slurry for determination. The extracted As species were determined by electrothermal atomic absorption spectrometry. The multivariate strategy was applied to estimate the optimum values of experimental factors for the recovery of As(3+) and total iAs by CPE and SPE. The standard addition method was used to validate the optimized methods. The obtained results showed sufficient recoveries for As(3+) and iAs (>98.0%). The concentration factor in both cases was found to be 40.

  20. Experimental study of adaptive pointing and tracking for large flexible space structures

    NASA Technical Reports Server (NTRS)

    Boussalis, D.; Bayard, D. S.; Ih, C.; Wang, S. J.; Ahmed, A.

    1991-01-01

    This paper describes an experimental study of adaptive pointing and tracking control for flexible spacecraft conducted on a complex ground experiment facility. The algorithm used in this study is based on a multivariable direct model reference adaptive control law. Several experimental validation studies were performed earlier using this algorithm for vibration damping and robust regulation, with excellent results. The current work extends previous studies by addressing the pointing and tracking problem. As is consistent with an adaptive control framework, the plant is assumed to be poorly known to the extent that only system level knowledge of its dynamics is available. Explicit bounds on the steady-state pointing error are derived as functions of the adaptive controller design parameters. It is shown that good tracking performance can be achieved in an experimental setting by adjusting adaptive controller design weightings according to the guidelines indicated by the analytical expressions for the error.

  1. The effect of health and dental insurance on US children's dental care utilization for urgent and non-urgent dental problems - 2008.

    PubMed

    Naavaal, Shillpa; Barker, Laurie K; Griffin, Susan O

    2017-12-01

    We examined the association between utilization of care for a dental problem (utilization-DP) and parent-reported dental problem (DP) urgency among children with DP by type of health care insurance coverage. We used weighted 2008 National Health Interview Survey data from 2,834 children, aged 2-17 years with at least one DP within the 6 months preceding the survey. Explanatory variables were selected based on Andersen's model of healthcare utilization. Need was considered urgent if DP included toothache, bleeding gums, broken or missing teeth, broken or missing filling, or decayed teeth and otherwise as non-urgent. The primary enabling variable, insurance, had four categories: none, private health no dental coverage (PHND), private health and dental (PHD), or Medicaid/State Children's Health Insurance Program (SCHIP). Predisposing variables included sociodemographic characteristics. We used bivariate and multivariate analyses to identify explanatory variables' association with utilization-DP. Using logistic regression, we obtained adjusted estimates of utilization-DP by urgency for each insurance category. In bivariate analyses, utilization-DP was associated with both insurance and urgency. In multivariate analyses, the difference in percent utilizing care for an urgent versus non-urgent DP among children covered by Medicaid/SCHIP was 32 percentage points; PHD, 25 percentage points; PHND, 12 percentage points; and no insurance, 14 percentage points. The difference in utilization by DP urgency was higher for children with Medicaid/SCHIP compared with either PHND or uninsured children. Expansion of Medicaid/SCHIP may permit children who might otherwise go without care for urgent DPs, due to lack of dental insurance, to receive it. © 2016 American Association of Public Health Dentistry.

  2. Multivariable PID controller design tuning using bat algorithm for activated sludge process

    NASA Astrophysics Data System (ADS)

    Atikah Nor’Azlan, Nur; Asmiza Selamat, Nur; Mat Yahya, Nafrizuan

    2018-04-01

    This project concerns the design of a multivariable PID (MPID) controller for a multi-input multi-output activated sludge process, applying four multivariable PID tuning methods: Davison, Penttinen-Koivo, Maciejowski, and a proposed combined method. The aim of this study is to investigate the performance of a selected optimization technique in tuning the parameters of the MPID controller. The selected optimization technique is the Bat Algorithm (BA). All MPID-BA tuning results are compared and analyzed, and the best MPID-BA controller is then chosen in order to determine which technique performs better based on system performance in terms of transient response.
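
    A minimal sketch, under stated assumptions and not the paper's implementation, of tuning a single-loop PID controller with a simplified bat algorithm: the plant is a hypothetical first-order process, the cost is the integral of squared error of the closed-loop step response, and the MIMO case treated in the paper would tune a gain matrix rather than a single (Kp, Ki, Kd) triple.

      import numpy as np

      def step_cost(gains, tau=5.0, dt=0.01, t_end=20.0):
          """Integral of squared error for a unit step, PID + first-order plant."""
          kp, ki, kd = gains
          y = integ = prev_err = 0.0
          cost = 0.0
          for _ in range(int(t_end / dt)):
              err = 1.0 - y
              integ += err * dt
              deriv = (err - prev_err) / dt
              u = kp * err + ki * integ + kd * deriv
              y += dt * (-y + u) / tau          # Euler step of tau*dy/dt = -y + u
              prev_err = err
              cost += err * err * dt
          return cost

      def bat_algorithm(cost_fn, bounds, n_bats=20, n_iter=60, seed=0):
          rng = np.random.default_rng(seed)
          lo, hi = np.array(bounds).T
          dim = len(bounds)
          pos = rng.uniform(lo, hi, (n_bats, dim))
          vel = np.zeros_like(pos)
          fit = np.array([cost_fn(p) for p in pos])
          best, best_fit = pos[fit.argmin()].copy(), fit.min()
          loudness, pulse = 0.9, 0.5
          for _ in range(n_iter):
              for i in range(n_bats):
                  freq = rng.uniform(0.0, 2.0)
                  vel[i] += (pos[i] - best) * freq
                  cand = np.clip(pos[i] + vel[i], lo, hi)
                  if rng.random() > pulse:      # local random walk around the best bat
                      cand = np.clip(best + 0.01 * rng.standard_normal(dim) * (hi - lo), lo, hi)
                  f_cand = cost_fn(cand)
                  if f_cand < fit[i] and rng.random() < loudness:
                      pos[i], fit[i] = cand, f_cand
                  if f_cand < best_fit:
                      best, best_fit = cand.copy(), f_cand
          return best, best_fit

      gains, ise = bat_algorithm(step_cost, bounds=[(0, 10), (0, 2), (0, 2)])
      print("Kp, Ki, Kd =", gains, "ISE =", ise)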

  3. Host-based identification is not supported by morphometrics in natural populations of Gyrodactylus salaris and G. thymalli (Platyhelminthes, Monogenea).

    PubMed

    Olstad, K; Shinn, A P; Bachmann, L; Bakke, T A

    2007-12-01

    Gyrodactylus salaris is a serious pest of wild pre-smolt Atlantic salmon (Salmo salar) in Norway. The closely related G. thymalli, originally described from grayling (Thymallus thymallus), is assumed harmless to both grayling and salmon. The 2 species are difficult to distinguish using traditional, morphometric methods or molecular approaches. The aim of this study was to explore whether there is a consistent pattern of morphometrical variation between G. salaris and G. thymalli and to analyse the morphometric variation in the context of 'diagnostic realism' (in natural populations). Specimens from the type-material for the 2 species are also included. In total, 27 point-to-point measurements from the opisthaptoral hard parts were used and analysed by digital image processing and uni- and multivariate morphometry. Each population most closely resembled its respective type material, as expected from host species, with the exception of G. thymalli from the Norwegian river Trysilelva. We therefore did not find clear support in the morphometrical variation between G. salaris and G. thymalli for an a priori species delineation based on host. The present study also indicates an urgent need for more detailed knowledge on the influence of environmental factors on the phenotype of gyrodactylid populations.

  4. Time-dependent changes in protein expression in rainbow trout muscle following hypoxia.

    PubMed

    Wulff, Tune; Jokumsen, Alfred; Højrup, Peter; Jessen, Flemming

    2012-04-18

    Adaptation to hypoxia is a complex process, and individual proteins will be up- or down-regulated in order to address the main challenges at any given time. To investigate the dynamics of the adaptation, rainbow trout (Oncorhynchus mykiss) were exposed to 30% of normal oxygen tension for 1, 2, 5 or 24 h, after which muscle samples were taken. The successful investigation of numerous proteins in a single study was achieved by selectively separating the sarcoplasmic proteins using 2-DE. In total, 46 protein spots were identified as changing in abundance in response to hypoxia using one-way ANOVA and multivariate data analysis. Proteins of interest were subsequently identified by MS/MS following tryptic digestion. The observed regulation following hypoxia in skeletal muscle was determined to be time specific, as only a limited number of proteins were regulated at more than one time point. The cellular response to hypoxia included regulation of proteins involved in maintaining iron homeostasis, energy levels and muscle structure. In conclusion, this proteome-based study presents a comprehensive investigation of the expression profiles of numerous proteins at four different time points. This increases our understanding of timed changes in protein expression in rainbow trout muscle following hypoxia. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. On the interpretation of weight vectors of linear models in multivariate neuroimaging.

    PubMed

    Haufe, Stefan; Meinecke, Frank; Görgen, Kai; Dähne, Sven; Haynes, John-Dylan; Blankertz, Benjamin; Bießmann, Felix

    2014-02-15

    The increase in spatiotemporal resolution of neuroimaging devices is accompanied by a trend towards more powerful multivariate analysis methods. Often it is desired to interpret the outcome of these methods with respect to the cognitive processes under study. Here we discuss which methods allow for such interpretations, and provide guidelines for choosing an appropriate analysis for a given experimental goal: For a surgeon who needs to decide where to remove brain tissue it is most important to determine the origin of cognitive functions and associated neural processes. In contrast, when communicating with paralyzed or comatose patients via brain-computer interfaces, it is most important to accurately extract the neural processes specific to a certain mental state. These equally important but complementary objectives require different analysis methods. Determining the origin of neural processes in time or space from the parameters of a data-driven model requires what we call a forward model of the data; such a model explains how the measured data was generated from the neural sources. Examples are general linear models (GLMs). Methods for the extraction of neural information from data can be considered as backward models, as they attempt to reverse the data generating process. Examples are multivariate classifiers. Here we demonstrate that the parameters of forward models are neurophysiologically interpretable in the sense that significant nonzero weights are only observed at channels the activity of which is related to the brain process under study. In contrast, the interpretation of backward model parameters can lead to wrong conclusions regarding the spatial or temporal origin of the neural signals of interest, since significant nonzero weights may also be observed at channels the activity of which is statistically independent of the brain process under study. As a remedy for the linear case, we propose a procedure for transforming backward models into forward models. This procedure enables the neurophysiological interpretation of the parameters of linear backward models. We hope that this work raises awareness for an often encountered problem and provides a theoretical basis for conducting better interpretable multivariate neuroimaging analyses. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
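
    A minimal sketch, with simulated data and a single linear filter, of the backward-to-forward transformation described above: the activation pattern is a = Cov(x) w / (w' Cov(x) w), so that nonzero pattern entries appear only on channels genuinely carrying the signal. The channel layout and the way the filter is estimated are assumptions, not the paper's exact pipeline.

      import numpy as np

      rng = np.random.default_rng(0)
      n_trials, n_channels = 500, 8

      # Simulate: a source projected onto channel 0 plus correlated noise on all channels.
      source = rng.standard_normal(n_trials)
      mixing = rng.standard_normal((n_channels, n_channels))
      X = 0.3 * rng.standard_normal((n_trials, n_channels)) @ mixing
      X[:, 0] += source                      # only channel 0 carries the signal

      # Backward model: a linear filter w extracting an estimate of the source from X
      # (least squares here; any linear classifier or regressor could supply w).
      w, *_ = np.linalg.lstsq(X, source, rcond=None)

      # Forward transformation: activation pattern a = Cov(X) w / Var(w' X).
      cov_x = np.cov(X, rowvar=False)
      a = cov_x @ w / np.var(X @ w)

      print("filter weights w:  ", np.round(w, 2))   # may be nonzero on pure-noise channels
      print("activation pattern:", np.round(a, 2))   # concentrates on the signal channel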

  6. Lifestyle and the risk of diabetes mellitus in a Japanese population.

    PubMed

    Tatsumi, Yukako; Ohno, Yuko; Morimoto, Akiko; Nishigaki, Yoshio; Mizuno, Shoichi; Watanabe, Shaw

    2013-06-01

    The objective was to examine the association between lifestyle and risk for diabetes. For an average of 9.9 years, this study prospectively followed a cohort of 7,211 (2,524 men and 4,687 women) community residents aged 30-69 years without diabetes at a health check-up conducted between April 1990 and March 1992 until diabetes was confirmed or until the end of 2006. The subjects were divided into 6 groups according to their total scores of Breslow's lifestyle index (1-2, 3, 4, 5, 6 and 7 points). The association between lifestyle and diabetes incidence was investigated using Cox proportional hazards regression models. The results showed that the multivariate-adjusted hazard ratios were 0.45 in subjects who scored 5 points, 0.39 in subjects who scored 6 points, and 0.31 in subjects who scored 7 points, compared with subjects who scored 1-2 points. These data indicate that healthy behaviors reduce the incidence of diabetes.

  7. Atrial Electrogram Fractionation Distribution before and after Pulmonary Vein Isolation in Human Persistent Atrial Fibrillation-A Retrospective Multivariate Statistical Analysis.

    PubMed

    Almeida, Tiago P; Chu, Gavin S; Li, Xin; Dastagir, Nawshin; Tuan, Jiun H; Stafford, Peter J; Schlindwein, Fernando S; Ng, G André

    2017-01-01

    Purpose: Complex fractionated atrial electrograms (CFAE)-guided ablation after pulmonary vein isolation (PVI) has been used for persistent atrial fibrillation (persAF) therapy. This strategy has shown suboptimal outcomes due to, among other factors, undetected changes in the atrial tissue following PVI. In the present work, we investigate CFAE distribution before and after PVI in patients with persAF using a multivariate statistical model. Methods: 207 pairs of atrial electrograms (AEGs) were collected before and after PVI, respectively, from corresponding LA regions in 18 persAF patients. Twelve attributes were measured from the AEGs, before and after PVI. Statistical models based on multivariate analysis of variance (MANOVA) and linear discriminant analysis (LDA) have been used to characterize the atrial regions and AEGs. Results: PVI significantly reduced CFAEs in the LA (70 vs. 40%; P < 0.0001). Four types of LA regions were identified, based on the AEG characteristics: (i) fractionated before PVI that remained fractionated after PVI (31% of the collected points); (ii) fractionated that converted to normal (39%); (iii) normal prior to PVI that became fractionated (9%); and (iv) normal that remained normal (21%). Individually, the attributes failed to distinguish these LA regions, but multivariate statistical models were effective in their discrimination (P < 0.0001). Conclusion: Our results have unveiled that there are LA regions resistant to PVI, while others are affected by it. Although traditional methods were unable to identify these different regions, the proposed multivariate statistical model discriminated LA regions resistant to PVI from those affected by it without prior ablation information.

  8. Estimation of parameters in Shot-Noise-Driven Doubly Stochastic Poisson processes using the EM algorithm--modeling of pre- and postsynaptic spike trains.

    PubMed

    Mino, H

    2007-01-01

    The aim is to estimate the parameters, namely the impulse response (IR) functions of the linear time-invariant systems generating the intensity processes, of Shot-Noise-Driven Doubly Stochastic Poisson Processes (SND-DSPPs), under the assumption that multivariate presynaptic spike trains and postsynaptic spike trains can be modeled by SND-DSPPs. An explicit formula for estimating the IR functions from observations of multivariate input processes of the linear systems and the corresponding counting process (output process) is derived utilizing the expectation maximization (EM) algorithm. The validity of the estimation formula was verified through Monte Carlo simulations in which two presynaptic spike trains and one postsynaptic spike train were assumed to be observable. The IR functions estimated on the basis of the proposed identification method were close to the true IR functions. The proposed method will play an important role in identifying the input-output relationship of pre- and postsynaptic neural spike trains in practical situations.

  9. Multivariate assessment of event-related potentials with the t-CWT method.

    PubMed

    Bostanov, Vladimir

    2015-11-05

    Event-related brain potentials (ERPs) are usually assessed with univariate statistical tests although they are essentially multivariate objects. Brain-computer interface applications are a notable exception to this practice, because they are based on multivariate classification of single-trial ERPs. Multivariate ERP assessment can be facilitated by feature extraction methods. One such method is t-CWT, a mathematical-statistical algorithm based on the continuous wavelet transform (CWT) and Student's t-test. This article begins with a geometric primer on some basic concepts of multivariate statistics as applied to ERP assessment in general and to the t-CWT method in particular. Further, it presents for the first time a detailed, step-by-step, formal mathematical description of the t-CWT algorithm. A new multivariate outlier rejection procedure based on principal component analysis in the frequency domain is presented as an important pre-processing step. The MATLAB and GNU Octave implementation of t-CWT is also made publicly available for the first time as free and open source code. The method is demonstrated on some example ERP data obtained in a passive oddball paradigm. Finally, some conceptually novel applications of the multivariate approach in general and of the t-CWT method in particular are suggested and discussed. Hopefully, the publication of both the t-CWT source code and its underlying mathematical algorithm along with a didactic geometric introduction to some basic concepts of multivariate statistics would make t-CWT more accessible to both users and developers in the field of neuroscience research.
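
    A minimal sketch, not the published t-CWT implementation, of the core idea: take a continuous wavelet transform of each single-trial ERP, run Student's t-tests between two conditions over the time-scale plane, and keep the largest-|t| coefficients as multivariate features. The data, the Mexican-hat wavelet, and the scales are assumptions.

      import numpy as np
      from scipy.stats import ttest_ind

      def mexican_hat(n, scale):
          t = (np.arange(n) - n // 2) / scale
          return (1 - t**2) * np.exp(-t**2 / 2)

      def cwt(signal, scales, kernel_len=129):
          return np.array([np.convolve(signal, mexican_hat(kernel_len, s), mode="same")
                           for s in scales])          # shape: (n_scales, n_samples)

      rng = np.random.default_rng(0)
      n_trials, n_samples = 60, 256
      scales = np.array([4, 8, 16, 32])

      # Simulated single-trial ERPs: condition B has an extra "P300-like" bump.
      t_axis = np.arange(n_samples)
      bump = np.exp(-((t_axis - 150) / 20.0) ** 2)
      cond_a = rng.standard_normal((n_trials, n_samples))
      cond_b = rng.standard_normal((n_trials, n_samples)) + 2.0 * bump

      cwt_a = np.array([cwt(tr, scales) for tr in cond_a])   # (trials, scales, samples)
      cwt_b = np.array([cwt(tr, scales) for tr in cond_b])

      # Pointwise t-test over the time-scale plane, then pick the strongest coefficients.
      t_vals, _ = ttest_ind(cwt_a, cwt_b, axis=0)
      top = np.argsort(np.abs(t_vals).ravel())[-5:]
      print("top |t| features (scale_idx, sample_idx):",
            [np.unravel_index(i, t_vals.shape) for i in top])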

  10. Application of dual-cloud point extraction for the trace levels of copper in serum of different viral hepatitis patients by flame atomic absorption spectrometry: a multivariate study.

    PubMed

    Arain, Salma Aslam; Kazi, Tasneem G; Afridi, Hassan Imran; Abbasi, Abdul Rasool; Panhwar, Abdul Haleem; Naeemullah; Shanker, Bhawani; Arain, Mohammad Balal

    2014-12-10

    An efficient, innovative preconcentration method, dual-cloud point extraction (d-CPE), has been developed for the extraction and preconcentration of copper (Cu(2+)) in serum samples of different viral hepatitis patients prior to coupling with flame atomic absorption spectrometry (FAAS). The d-CPE procedure was based on forming complexes of elemental ions with the complexing reagent 1-(2-pyridylazo)-2-naphthol (PAN), and subsequently entrapping the complexes in the nonionic surfactant (Triton X-114). The surfactant-rich phase containing the metal complexes was then treated with aqueous nitric acid solution, and the metal ions were back extracted into the aqueous phase as a second cloud point extraction stage and finally determined by flame atomic absorption spectrometry using conventional nebulization. The multivariate strategy was applied to estimate the optimum values of experimental variables for the recovery of Cu(2+) using d-CPE. Under optimum experimental conditions, the limit of detection and the enrichment factor were 0.046 μg L(-1) and 78, respectively. The validity and accuracy of the proposed method were checked by analysis of Cu(2+) in a certified serum reference material (CRM) by both the d-CPE and the conventional CPE procedure on the same CRM. The proposed method was successfully applied to the determination of Cu(2+) in serum samples of different viral hepatitis patients and healthy controls. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Scope of Gradient and Genetic Algorithms in Multivariable Function Optimization

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Sen, S. K.

    2007-01-01

    Global optimization of a multivariable function, whether constrained by bounds specified on each variable or unconstrained, is an important problem with several real-world applications. Deterministic methods such as gradient algorithms as well as randomized methods such as genetic algorithms may be employed to solve these problems. In fact, there are optimization problems where a genetic algorithm or an evolutionary approach is preferable, at least from the point of view of the quality (accuracy) of the results. From a cost (complexity) point of view, both gradient and genetic approaches are usually polynomial-time, so there are no serious differences in this regard. However, for certain types of problems, such as those with unacceptably erroneous numerical partial derivatives and those with physically amplified analytical partial derivatives whose numerical evaluation involves undesirable errors and/or is messy, a genetic (stochastic) approach should be a better choice. We present here the pros and cons of both approaches so that the concerned reader/user can decide which approach is most suited for the problem at hand. Also, for a function known in tabular form instead of analytical form, as is often the case in an experimental environment, we attempt to provide insight into the approaches, focusing our attention on accuracy. Such an insight will help one to decide which of the several available methods should be employed to obtain the best (least error) output.

  12. Development and validation of the San Diego Early Test Score to predict acute and early HIV infection risk in men who have sex with men.

    PubMed

    Hoenigl, Martin; Weibel, Nadir; Mehta, Sanjay R; Anderson, Christy M; Jenks, Jeffrey; Green, Nella; Gianella, Sara; Smith, Davey M; Little, Susan J

    2015-08-01

    Although men who have sex with men (MSM) represent a dominant risk group for human immunodeficiency virus (HIV), the risk of HIV infection within this population is not uniform. The objective of this study was to develop and validate a score to estimate incident HIV infection risk. Adult MSM who were tested for acute and early HIV (AEH) between 2008 and 2014 were retrospectively randomized 2:1 to a derivation and validation dataset, respectively. Using the derivation dataset, each predictor associated with an AEH outcome in the multivariate prediction model was assigned a point value that corresponded to its odds ratio. The score was validated on the validation dataset using C-statistics. Data collected at a single HIV testing encounter from 8326 unique MSM were analyzed, including 200 with AEH (2.4%). Four risk behavior variables were significantly associated with an AEH diagnosis (ie, incident infection) in multivariable analysis and were used to derive the San Diego Early Test (SDET) score: condomless receptive anal intercourse (CRAI) with an HIV-positive MSM (3 points), the combination of CRAI plus ≥5 male partners (3 points), ≥10 male partners (2 points), and diagnosis of bacterial sexually transmitted infection (2 points)-all as reported for the prior 12 months. The C-statistic for this risk score was >0.7 in both data sets. The SDET risk score may help to prioritize resources and target interventions, such as preexposure prophylaxis, to MSM at greatest risk of acquiring HIV infection. The SDET risk score is deployed as a freely available tool at http://sdet.ucsd.edu. © The Author 2015. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
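
    A minimal sketch of how the point-based score described above could be computed for a single testing encounter; the function name and arguments are illustrative and not taken from the published tool at http://sdet.ucsd.edu.

      def sdet_score(crai_with_hiv_pos, crai, male_partners, bacterial_sti):
          """All inputs refer to the prior 12 months, as in the abstract."""
          score = 0
          if crai_with_hiv_pos:            # condomless receptive anal intercourse with an HIV-positive MSM
              score += 3
          if crai and male_partners >= 5:  # CRAI combined with >= 5 male partners
              score += 3
          if male_partners >= 10:
              score += 2
          if bacterial_sti:                # diagnosis of a bacterial sexually transmitted infection
              score += 2
          return score

      # Example: CRAI with an HIV-positive partner plus 12 partners, no STI -> 3 + 3 + 2 = 8
      print(sdet_score(crai_with_hiv_pos=True, crai=True, male_partners=12, bacterial_sti=False))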

  13. Comparison of Different Risk Perception Measures in Predicting Seasonal Influenza Vaccination among Healthy Chinese Adults in Hong Kong: A Prospective Longitudinal Study

    PubMed Central

    Liao, Qiuyan; Wong, Wing Sze; Fielding, Richard

    2013-01-01

    Background Risk perception is a reported predictor of vaccination uptake, but which measures of risk perception best predict influenza vaccination uptake remains unclear. Methodology During the main influenza seasons (between January and March) of 2009 (Wave 1) and 2010 (Wave 2), 505 Chinese students and employees from a Hong Kong university completed an online survey. Multivariate logistic regression models were conducted to assess how well different risk perception measures in Wave 1 predicted vaccination uptake against seasonal influenza in Wave 2. Principal Findings The results of the multivariate logistic regression models showed that feeling at risk (β = 0.25, p = 0.021) was the better predictor compared with probability judgment, while probability judgment (β = 0.25, p = 0.029) was better than beliefs about risk in predicting subsequent influenza vaccination uptake. Beliefs about risk and feeling at risk seemed to predict the same aspect of subsequent vaccination uptake because their associations with vaccination uptake became insignificant when paired into the logistic regression model. Similarly, to compare the four scales for assessing probability judgment in predicting vaccination uptake, the 7-point verbal scale remained a significant and stronger predictor for vaccination uptake when paired with the other three scales; the 6-point verbal scale was a significant and stronger predictor when paired with the percentage scale or the 2-point verbal scale; and the percentage scale was a significant and stronger predictor only when paired with the 2-point verbal scale. Conclusions/Significance Beliefs about risk and feeling at risk are not well differentiated by Hong Kong Chinese people. Feeling at risk, an affective-cognitive dimension of risk perception, predicts subsequent vaccination uptake better than do probability judgments. Among the four scales for assessing risk probability judgment, the 7-point verbal scale offered the best predictive power for subsequent vaccination uptake. PMID:23894292

  14. Comparison of different risk perception measures in predicting seasonal influenza vaccination among healthy Chinese adults in Hong Kong: a prospective longitudinal study.

    PubMed

    Liao, Qiuyan; Wong, Wing Sze; Fielding, Richard

    2013-01-01

    Risk perception is a reported predictor of vaccination uptake, but which measures of risk perception best predict influenza vaccination uptake remains unclear. During the main influenza seasons (between January and March) of 2009 (Wave 1) and 2010 (Wave 2), 505 Chinese students and employees from a Hong Kong university completed an online survey. Multivariate logistic regression models were conducted to assess how well different risk perception measures in Wave 1 predicted vaccination uptake against seasonal influenza in Wave 2. The results of the multivariate logistic regression models showed that feeling at risk (β = 0.25, p = 0.021) was the better predictor compared with probability judgment, while probability judgment (β = 0.25, p = 0.029) was better than beliefs about risk in predicting subsequent influenza vaccination uptake. Beliefs about risk and feeling at risk seemed to predict the same aspect of subsequent vaccination uptake because their associations with vaccination uptake became insignificant when paired into the logistic regression model. Similarly, to compare the four scales for assessing probability judgment in predicting vaccination uptake, the 7-point verbal scale remained a significant and stronger predictor for vaccination uptake when paired with the other three scales; the 6-point verbal scale was a significant and stronger predictor when paired with the percentage scale or the 2-point verbal scale; and the percentage scale was a significant and stronger predictor only when paired with the 2-point verbal scale. Beliefs about risk and feeling at risk are not well differentiated by Hong Kong Chinese people. Feeling at risk, an affective-cognitive dimension of risk perception, predicts subsequent vaccination uptake better than do probability judgments. Among the four scales for assessing risk probability judgment, the 7-point verbal scale offered the best predictive power for subsequent vaccination uptake.

  15. The MIDAS processor. [Multivariate Interactive Digital Analysis System for multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Gordon, M. F.; Mclaughlin, R. H.; Marshall, R. E.

    1975-01-01

    The MIDAS (Multivariate Interactive Digital Analysis System) processor is a high-speed processor designed to process multispectral scanner data (from Landsat, EOS, aircraft, etc.) quickly and cost-effectively to meet the requirements of users of remote sensor data, especially from very large areas. MIDAS consists of a fast multipipeline preprocessor and classifier, an interactive color display and color printer, and a medium scale computer system for analysis and control. The system is designed to process data having as many as 16 spectral bands per picture element at rates of 200,000 picture elements per second into as many as 17 classes using a maximum likelihood decision rule.
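
    A minimal illustrative sketch, not MIDAS code, of the per-pixel Gaussian maximum likelihood decision rule mentioned above: each class is summarized by a mean vector and covariance matrix over the spectral bands, and a pixel is assigned to the class with the highest log-likelihood. The class statistics and band count are simulated.

      import numpy as np

      def train_ml_classifier(samples_by_class):
          """samples_by_class: list of (n_pixels_c, n_bands) arrays, one per class."""
          params = []
          for s in samples_by_class:
              mu = s.mean(axis=0)
              cov = np.cov(s, rowvar=False)
              params.append((mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1]))
          return params

      def classify(pixels, params):
          """pixels: (n_pixels, n_bands); returns the most likely class index per pixel."""
          scores = []
          for mu, cov_inv, logdet in params:
              d = pixels - mu
              mahal = np.einsum("ij,jk,ik->i", d, cov_inv, d)
              scores.append(-0.5 * (logdet + mahal))    # log-likelihood up to a constant
          return np.argmax(np.stack(scores, axis=1), axis=1)

      rng = np.random.default_rng(0)
      n_bands = 4                                        # MIDAS handled up to 16 bands
      train = [rng.normal(m, 1.0, (200, n_bands)) for m in (0.0, 3.0, 6.0)]   # 3 toy classes
      params = train_ml_classifier(train)
      print(classify(rng.normal(3.0, 1.0, (5, n_bands)), params))   # expected mostly class 1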

  16. [Analysis of variance of repeated data measured by water maze with SPSS].

    PubMed

    Qiu, Hong; Jin, Guo-qin; Jin, Ru-feng; Zhao, Wei-kang

    2007-01-01

    The aim is to introduce a method for analyzing repeated-measures data obtained from water maze experiments with SPSS 11.0, and to offer a reference statistical method to clinical and basic medicine researchers who use repeated-measures designs. The repeated-measures and multivariate analysis of variance (ANOVA) procedures of the general linear model in SPSS are used, with pairwise comparisons among different groups and among measurement times. Firstly, Mauchly's test of sphericity should be used to judge whether the repeatedly measured data are correlated. If any (P

  17. On the Bayesian Treed Multivariate Gaussian Process with Linear Model of Coregionalization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konomi, Bledar A.; Karagiannis, Georgios; Lin, Guang

    2015-02-01

    The Bayesian treed Gaussian process (BTGP) has gained popularity in recent years because it provides a straightforward mechanism for modeling non-stationary data and can alleviate computational demands by fitting models to less data. The extension of BTGP to the multivariate setting requires us to model the cross-covariance and to propose efficient algorithms that can deal with trans-dimensional MCMC moves. In this paper we extend the cross-covariance of the Bayesian treed multivariate Gaussian process (BTMGP) to the linear model of coregionalization (LMC) cross-covariance. Different strategies have been developed to improve the MCMC mixing and invert smaller matrices in the Bayesian inference. Moreover, we compare the proposed BTMGP with existing multiple BTGP and BTMGP models in test cases and in a multiphase flow computer experiment in a full-scale regenerator of a carbon capture unit. The use of the BTMGP with LMC cross-covariance helped to predict the computer experiments relatively better than existing competitors. The proposed model has a wide variety of applications, such as computer experiments and environmental data. In the case of computer experiments we also develop an adaptive sampling strategy for the BTMGP with LMC cross-covariance function.

  18. Relationship between stressfulness of claiming for injury compensation and long-term recovery: a prospective cohort study.

    PubMed

    Grant, Genevieve M; O'Donnell, Meaghan L; Spittal, Matthew J; Creamer, Mark; Studdert, David M

    2014-04-01

    Each year, millions of persons worldwide seek compensation for transport accident and workplace injuries. Previous research suggests that these claimants have worse long-term health outcomes than persons whose injuries fall outside compensation schemes. However, existing studies have substantial methodological weaknesses and have not identified which aspects of the claiming experience may drive these effects. To determine aspects of claims processes that claimants to transport accident and workers' compensation schemes find stressful and whether such stressful experiences are associated with poorer long-term recovery. Prospective cohort study of a random sample of 1010 patients hospitalized in 3 Australian states for injuries from 2004 through 2006. At 6-year follow-up, we interviewed 332 participants who had claimed compensation from transport accident and workers' compensation schemes ("claimants") to determine which aspects of the claiming experience they found stressful. We used multivariable regression analysis to test for associations between compensation-related stress and health status at 6 years, adjusting for baseline determinants of long-term health status and predisposition to stressful experiences (via propensity scores). Disability, quality of life, anxiety, and depression. Among claimants, 33.9% reported high levels of stress associated with understanding what they needed to do for their claim; 30.4%, with claim delays; 26.9%, with the number of medical assessments; and 26.1%, with the amount of compensation they received. Six years after their injury, claimants who reported high levels of stress had significantly higher levels of disability (+6.94 points, World Health Organization Disability Assessment Schedule sum score), anxiety and depression (+1.89 points and +2.61 points, respectively, Hospital Anxiety and Depression Scale), and lower quality of life (-0.73 points, World Health Organization Quality of Life instrument, overall item), compared with other claimants. Adjusting for claimants' vulnerability to stress attenuated the strength of these associations, but most remained strong and statistically significant. Many claimants experience high levels of stress from engaging with injury compensation schemes, and this experience is positively correlated with poor long-term recovery. Intervening early to boost resilience among those at risk of stressful claims experiences and redesigning compensation processes to reduce their stressfulness may improve recovery and save money.

  19. A new method for defining and managing process alarms and for correcting process operation when an alarm occurs.

    PubMed

    Brooks, Robin; Thorpe, Richard; Wilson, John

    2004-11-11

    A new mathematical treatment of alarms that considers them as multi-variable interactions between process variables has provided the first-ever method to calculate values for alarm limits. This has resulted in substantial reductions in false alarms and hence in alarm annunciation rates in field trials. It has also unified alarm management, process control and product quality control into a single mathematical framework so that operations improvement and hence economic benefits are obtained at the same time as increased process safety. Additionally, an algorithm has been developed that advises what changes should be made to manipulable process variables to clear an alarm. The multi-variable Best Operating Zone at the heart of the method is derived from existing historical data using equation-free methods. It does not require a first-principles process model or an expensive series of process identification experiments. Integral to the method is a new-format Process Operator Display that uses only existing variables to fully describe the multi-variable operating space. This combination of features makes it an affordable and maintainable solution for small plants and single items of equipment as well as for the largest plants. In many cases, it also provides the justification for the investments about to be made or already made in process historian systems. Field trials of the new geometric process control (GPC) method have been and are being conducted at IneosChlor and Mallinckrodt Chemicals, both in the UK, to improve the quality of both process operations and product by providing Process Alarms and Alerts of much higher quality than ever before. The paper describes the methods used, including a simple visual method for Alarm Rationalisation that quickly delivers large sets of Consistent Alarm Limits, and the extension to full Alert Management with highlights from the Field Trials to indicate the overall effectiveness of the method in practice.

  20. New strategy to identify radicals in a time evolving EPR data set by multivariate curve resolution-alternating least squares.

    PubMed

    Fadel, Maya Abou; de Juan, Anna; Vezin, Hervé; Duponchel, Ludovic

    2016-12-01

    Electron paramagnetic resonance (EPR) spectroscopy is a powerful technique that is able to characterize radicals formed in kinetic reactions. However, spectral characterization of individual chemical species is often limited or even unmanageable due to the severe kinetic and spectral overlap among species in kinetic processes. Therefore, we applied, for the first time, the multivariate curve resolution-alternating least squares (MCR-ALS) method to EPR time-evolving data sets to model and characterize the different constituents in a kinetic reaction. Here we demonstrate the advantage of multivariate analysis in the investigation of radicals formed along the kinetic process of hydroxycoumarin in alkaline medium. Multiset analysis of several EPR-monitored kinetic experiments performed under different conditions revealed the individual paramagnetic centres as well as their kinetic profiles. The results obtained by the MCR-ALS method demonstrate its prominent potential in the analysis of time-evolved EPR spectra. Copyright © 2016 Elsevier B.V. All rights reserved.
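
    A minimal illustrative sketch, not the authors' code, of the alternating least squares core of MCR-ALS on a data matrix D assumed to follow the bilinear model D = C S', with non-negativity imposed by clipping. The toy data, number of components, and convergence settings are assumptions.

      import numpy as np

      def mcr_als(D, n_components, n_iter=200, seed=0):
          rng = np.random.default_rng(seed)
          C = np.abs(rng.standard_normal((D.shape[0], n_components)))   # kinetic profiles
          for _ in range(n_iter):
              S = np.linalg.lstsq(C, D, rcond=None)[0].T    # spectra, given C
              S = np.clip(S, 0.0, None)
              C = np.linalg.lstsq(S, D.T, rcond=None)[0].T  # concentrations, given S
              C = np.clip(C, 0.0, None)
          return C, S

      # Toy data: two components with overlapping "spectra" and complementary kinetics.
      t = np.linspace(0, 10, 100)[:, None]
      x = np.linspace(0, 1, 80)
      true_C = np.hstack([np.exp(-0.3 * t), 1.0 - np.exp(-0.3 * t)])
      true_S = np.vstack([np.exp(-((x - 0.3) / 0.1) ** 2), np.exp(-((x - 0.6) / 0.1) ** 2)])
      D = true_C @ true_S + 0.01 * np.random.default_rng(1).standard_normal((100, 80))

      C_hat, S_hat = mcr_als(D, n_components=2)
      print(C_hat.shape, S_hat.shape)   # (100, 2) kinetic profiles, (80, 2) spectra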

  1. Quality of Life in Long-term Survivors of Muscle-Invasive Bladder Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mak, Kimberley S.; Boston Medical Center, Boston University School of Medicine, Boston, Massachusetts; Smith, Angela B.

    Purpose: Health-related quality of life (QOL) has not been well-studied in survivors of muscle-invasive bladder cancer (MIBC). The present study compared long-term QOL in MIBC patients treated with radical cystectomy (RC) versus bladder-sparing trimodality therapy (TMT). Methods and Materials: This cross-sectional bi-institutional study identified 226 patients with nonmetastatic cT2-cT4 MIBC, diagnosed in 1990 to 2011, who were eligible for RC and were disease free for ≥2 years. Six validated QOL instruments were administered: EuroQOL EQ-5D, European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Core Questionnaire and EORTC MIBC module, Expanded Prostate Cancer Index Composite bowel scale, Cancer Treatment and Perception Scale, and Impact of Cancer, version 2. Multivariable analyses of the mean QOL scores were conducted using propensity score matching. Results: The response rate was 77% (n=173). The median follow-up period was 5.6 years. Of the 173 patients, 64 received TMT and 109, RC. The median interval from diagnosis to questionnaire completion was 9 years after TMT and 7 years after RC (P=.009). No significant differences were found in age, gender, comorbidities, tobacco history, performance status, or tumor stage. On multivariable analysis, patients who received TMT had better general QOL by 9.7 points of 100 compared with those who had received RC (P=.001) and higher physical, role, social, emotional, and cognitive functioning by 6.6 to 9.9 points (P≤.04). TMT was associated with better bowel function by 4.5 points (P=.02) and fewer bowel symptoms by 2.7 to 7.1 points (P≤.05). The urinary symptom scores were similar. TMT was associated with better sexual function by 8.7 to 32.1 points (P≤.02) and body image by 14.8 points (P<.001). The patients who underwent TMT reported greater informed decision-making scores by 13.6 points (P=.01) and less concern about the negative effect of cancer by 6.8 points (P=.006). The study limitations included missing baseline QOL data and different follow-up times. Conclusions: Both TMT and RC result in good long-term QOL outcomes in MIBC survivors, supporting TMT as a good alternative to RC for selected patients. Whether TMT leads to superior QOL requires prospective validation.

  2. Multivariate moment closure techniques for stochastic kinetic models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lakatos, Eszter, E-mail: e.lakatos13@imperial.ac.uk; Ale, Angelique; Kirk, Paul D. W.

    2015-09-07

    Stochastic effects dominate many chemical and biochemical processes. Their analysis, however, can be computationally prohibitively expensive, and a range of approximation schemes have been proposed to lighten the computational burden. These, notably the increasingly popular linear noise approximation and the more general moment expansion methods, perform well for many dynamical regimes, especially linear systems. At higher levels of nonlinearity, there is an interplay between the nonlinearities and the stochastic dynamics, which is much harder to capture correctly with such approximations to the true stochastic processes. Moment-closure approaches promise to address this problem by capturing higher-order terms of the temporally evolving probability distribution. Here, we develop a set of multivariate moment-closures that allows us to describe the stochastic dynamics of nonlinear systems. Multivariate closure captures the way that correlations between different molecular species, induced by the reaction dynamics, interact with stochastic effects. We use multivariate Gaussian, gamma, and lognormal closure and illustrate their use in the context of two models that have proved challenging to previous attempts at approximating stochastic dynamics: oscillations in p53 and Hes1. In addition, we consider a larger system, Erk-mediated mitogen-activated protein kinase signalling, where conventional stochastic simulation approaches incur unacceptably high computational costs.

  3. A mathematical theory of learning control for linear discrete multivariable systems

    NASA Technical Reports Server (NTRS)

    Phan, Minh; Longman, Richard W.

    1988-01-01

    When tracking control systems are used in repetitive operations such as robots in various manufacturing processes, the controller will make the same errors repeatedly. Here consideration is given to learning controllers that look at the tracking errors in each repetition of the process and adjust the control to decrease these errors in the next repetition. A general formalism is developed for learning control of discrete-time (time-varying or time-invariant) linear multivariable systems. Methods of specifying a desired trajectory (such that the trajectory can actually be performed by the discrete system) are discussed, and learning controllers are developed. Stability criteria are obtained which are relatively easy to use to insure convergence of the learning process, and proper gain settings are discussed in light of measurement noise and system uncertainties.
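
    A minimal sketch, assuming a toy discrete-time single-input single-output plant and a simple proportional learning gain, of the repetition-to-repetition update that the paper formalizes for multivariable systems: after each trial the input is corrected with the tracking error recorded on the previous trial.

      import numpy as np

      a, b = 0.9, 0.5                            # toy plant: y[t+1] = a*y[t] + b*u[t]
      N = 50                                     # samples per repetition
      y_des = np.sin(np.linspace(0, np.pi, N))   # desired trajectory
      L = 0.8 / b                                # learning gain (a matrix in the MIMO case)

      def run_trial(u):
          y = np.zeros(N)
          for t in range(N - 1):
              y[t + 1] = a * y[t] + b * u[t]
          return y

      u = np.zeros(N)
      for trial in range(20):
          y = run_trial(u)
          e = y_des - y
          u[:-1] += L * e[1:]                    # shift by one step: u[t] affects y[t+1]
          if trial % 5 == 0:
              print(f"trial {trial:2d}: max |error| = {np.abs(e).max():.4f}")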

  4. Geometric Model of Induction Heating Process of Iron-Based Sintered Materials

    NASA Astrophysics Data System (ADS)

    Semagina, Yu V.; Egorova, M. A.

    2018-03-01

    The article studies the issue of building multivariable dependencies based on experimental data. A constructive method for solving the issue is presented in the form of equations of (n-1)-surface compartments of the extended Euclidean space E+n. The dimension of the space is taken to be equal to the sum of the number of parameters and factors of the model of the system being studied. The basis for building multivariable dependencies is a generalization to n-space of the approach used for surface compartments of 3D space. The surface is designed on the basis of the kinematic method, moving one geometric object along a certain trajectory. The proposed approach simplifies the process of building the multifactorial empirical dependencies that describe the process under investigation.

  5. A Multivariate Analysis of the Early Dropout Process

    ERIC Educational Resources Information Center

    Fiester, Alan R.; Rudestam, Kjell E.

    1975-01-01

    Principal-component factor analyses were performed on patient input (demographic and pretherapy expectations), therapist input (demographic), and patient perspective therapy process variables that significantly differentiated early dropout from nondropout outpatients at two community mental health centers. (Author)

  6. A Multivariate Methodological Workflow for the Analysis of FTIR Chemical Mapping Applied on Historic Paint Stratigraphies

    PubMed Central

    Sciutto, Giorgia; Oliveri, Paolo; Catelli, Emilio; Bonacini, Irene

    2017-01-01

    In the field of applied research in heritage science, the use of multivariate approaches is still quite limited, and the chemometric results obtained are often underinterpreted. Within this scenario, the present paper aims to disseminate the use of suitable multivariate methodologies and proposes a procedural workflow applied to a representative group of case studies of considerable importance for conservation purposes, as a guideline for the processing and interpretation of FTIR mapping data. Initially, principal component analysis (PCA) is performed and the score values are converted into chemical maps. Subsequently, the brushing approach is applied, demonstrating its usefulness for a deep understanding of the relationships between the multivariate map and the PC score space, as well as for the identification of the spectral bands mainly involved in the definition of each area localised within the score maps. PMID:29333162

  7. A multivariate analytical method to characterize sediment attributes from high-frequency acoustic backscatter and ground-truthing data (Jade Bay, German North Sea coast)

    NASA Astrophysics Data System (ADS)

    Biondo, Manuela; Bartholomä, Alexander

    2017-04-01

    One of the burning issues on the topic of acoustic seabed classification is the lack of solid, repeatable, statistical procedures that can support the verification of acoustic variability in relation to seabed properties. Acoustic sediment classification schemes often lead to biased and subjective interpretation, as they ultimately aim at an oversimplified categorization of the seabed based on conventionally defined sediment types. However, grain size variability alone cannot account for acoustic diversity, which will be ultimately affected by multiple physical processes, scale of heterogeneity, instrument settings, data quality, image processing and segmentation performance. Understanding and assessing the weight of all of these factors on backscatter is a difficult task, due to the spatially limited and fragmentary knowledge of the seabed from direct observations (e.g. grab samples, cores, videos). In particular, large-scale mapping requires an enormous availability of ground-truthing data that is often obtained from heterogeneous and multidisciplinary sources, resulting in a further chance of misclassification. Independently of all of these limitations, acoustic segments still contain signals for seabed changes that, if appropriate procedures are established, can be translated into meaningful knowledge. In this study we design a simple, repeatable method, based on multivariate procedures, with the aim of classifying a 100 km2, high-frequency (450 kHz) sidescan sonar mosaic acquired in the year 2012 in the shallow upper-mesotidal inlet of the Jade Bay (German North Sea coast). The tool used for the automated classification of the backscatter mosaic is the QTC SWATHVIEW software. The ground-truthing database included grab sample data from multiple sources (2009-2011). The method was designed to extrapolate quantitative descriptors for acoustic backscatter and model their spatial changes in relation to grain size distribution and morphology. The modelled relationships were used to: 1) assess the automated segmentation performance, 2) obtain a ranking of the most discriminant seabed attributes responsible for acoustic diversity, 3) select the best-fit ground-truthing information to characterize each acoustic class. Using a supervised Linear Discriminant Analysis (LDA), relationships between seabed parameters and acoustic class discrimination were modelled, and acoustic classes for each data point were predicted. The model achieved a prediction success rate of 63.5%. An unsupervised LDA was used to model relationships between acoustic variables and clustered seabed categories with the aim of identifying misrepresentative ground-truthing data points. This model achieved a prediction success rate of 50.8%. Misclassified data points were disregarded for the final classification. Analyses led to a clearer, more accurate appreciation of relationship patterns and improved understanding of site-specific processes affecting the acoustic signal. Value was added to the qualitative classification output by comparing the latter with a more recent set of acoustic and ground-truthing information (2014). Classification resulted in the first acoustic sediment map ever produced in the area and offered valuable knowledge of detailed sediment variability. The method proved to be a simple, repeatable strategy that may be applied to similar work and environments.

  8. MAOA, MTHFR, and TNF-β genes polymorphisms and personality traits in the pathogenesis of migraine.

    PubMed

    Ishii, Masakazu; Shimizu, Shunichi; Sakairi, Yuki; Nagamine, Ayumu; Naito, Yuika; Hosaka, Yukiko; Naito, Yuko; Kurihara, Tatsuya; Onaya, Tomomi; Oyamada, Hideto; Imagawa, Atsuko; Shida, Kenji; Takahashi, Johji; Oguchi, Katsuji; Masuda, Yutaka; Hara, Hajime; Usami, Shino; Kiuchi, Yuji

    2012-04-01

    Migraine is a multifactorial disease with various contributing factors, such as genetic polymorphisms and personality traits, but the contribution of those factors is not clear. To clarify the pathogenesis of migraine, the contributions of genetic polymorphisms and personality traits were simultaneously investigated using multivariate analysis. Ninety-one migraine patients and 119 non-headache healthy volunteers were enrolled. Analysis of 12 gene polymorphisms and the NEO-FFI personality test were performed. First, univariate analysis was performed to extract factors contributing to the pathogenesis of migraine. We then extracted the factors that independently contributed to the pathogenesis of migraine using multivariate stepwise logistic regression analysis. Using the multivariate analysis, three gene polymorphisms, including monoamine oxidase A (MAOA) T941G, methylenetetrahydrofolate reductase (MTHFR) C677T, and tumor necrosis factor beta (TNF-β) G252A, and the neuroticism and conscientiousness scores in NEO-FFI were selected as significant factors that independently contributed to the pathogenesis of migraine. Their odds ratios were 1.099 (per point of neuroticism score), 1.080 (per point of conscientiousness score), 2.272 (T and T/T or T/G vs G and G/G genotype of MAOA), 1.939 (C/T or T/T vs C/C genotype of MTHFR), and 2.748 (G/A or A/A vs G/G genotype of TNF-β), respectively. We suggest that multiple factors, such as gene polymorphisms and personality traits, contribute to the pathogenesis of migraine. The contribution of polymorphisms, such as MAOA T941G, MTHFR C677T, and TNF-β G252A, was more important than personality traits in the pathogenesis of migraine, a multifactorial disorder.

  9. Hot spots of multivariate extreme anomalies in Earth observations

    NASA Astrophysics Data System (ADS)

    Flach, M.; Sippel, S.; Bodesheim, P.; Brenning, A.; Denzler, J.; Gans, F.; Guanche, Y.; Reichstein, M.; Rodner, E.; Mahecha, M. D.

    2016-12-01

    Anomalies in Earth observations might indicate data quality issues, extremes, or a change of the underlying processes within a highly multivariate system. Considering the multivariate constellation of variables for extreme detection therefore yields crucial additional information over conventional univariate approaches. We highlight areas in which multivariate extreme anomalies are more likely to occur, i.e. hot spots of extremes in global atmospheric Earth observations that impact the biosphere. In addition, we present the year of the most unusual multivariate extreme between 2001 and 2013 and show that these years coincide with well-known high-impact extremes. Technically, we account for multivariate extremes by using three algorithms adapted from computer science applications: an ensemble of the k-nearest-neighbours mean distance, kernel density estimation, and an approach based on recurrences. However, the impact of atmospheric extremes on the biosphere might largely depend on what is considered to be normal, i.e. the shape of the mean seasonal cycle and its inter-annual variability. We identify regions with similar mean seasonality by means of dimensionality reduction in order to estimate, in each region, both the 'normal' variance and robust thresholds for detecting the extremes. In addition, we account for challenges like heteroscedasticity in Northern latitudes. Apart from hot spot areas, the anomalies in the atmospheric time series that can only be detected by a multivariate approach, and not by a simple univariate one, are of particular interest; such an anomalous constellation of atmospheric variables matters when it impacts the biosphere. The multivariate constellation of such an anomalous part of a time series is shown in one case study, indicating that multivariate anomaly detection can provide novel insights into Earth observations.
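
    One of the detectors mentioned above scores multivariate anomalies by the mean distance to the k nearest neighbours. A small illustrative sketch of that idea (not the authors' implementation; the data, k, and the 99th-percentile threshold are arbitrary choices) is:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Placeholder multivariate "atmospheric" time series: rows = time steps,
# columns = variables; a short block of shifted values plays the anomaly.
rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 5))
X[600:610] += 4.0                          # injected multivariate anomaly

k = 20
nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
dist, _ = nn.kneighbors(X)                 # first neighbour is the point itself
score = dist[:, 1:].mean(axis=1)           # mean distance to the k nearest neighbours

threshold = np.quantile(score, 0.99)       # simple empirical threshold
print(np.where(score > threshold)[0])      # indices flagged as multivariate extremes
```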

  10. Hierarchy of temporal responses of multivariate self-excited epidemic processes

    NASA Astrophysics Data System (ADS)

    Saichev, Alexander; Maillart, Thomas; Sornette, Didier

    2013-04-01

    Many natural and social systems are characterized by bursty dynamics, in which past events trigger future activity. These systems can be modelled by so-called self-excited Hawkes conditional Poisson processes. It is generally assumed that all events have similar triggering abilities. However, some systems exhibit heterogeneity and clusters with possibly different intra- and inter-cluster triggering, which can be accounted for by generalization to "multivariate" self-excited Hawkes conditional Poisson processes. We develop the general formalism of the multivariate moment generating function for the cumulative number of first-generation and of all-generation events triggered by a given mother event (the "shock") as a function of the current time t. This corresponds to studying the response function of the process. A variety of different systems have been analyzed. In particular, for systems in which triggering between events of different types proceeds through a one-dimensional directed or symmetric chain of influence in type space, we report a novel hierarchy of intermediate asymptotic power-law decays ~ 1/t^(1-(m+1)θ) of the rate of triggered events as a function of the distance m of the events to the initial shock in type space, where 0 < θ < 1 for the relevant long-memory processes characterizing many natural and social systems. The richness of the generated time dynamics comes from the cascades of intermediate events of possibly different kinds, unfolding via random changes of the genealogy of types.
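
    Because these multivariate self-excited Hawkes processes are defined through a conditional intensity that jumps at each event and then decays, realizations can be generated directly with Ogata's thinning method. The sketch below is a minimal illustration for exponential memory kernels and arbitrary parameter values; the paper itself considers general long-memory (power-law) kernels, so this is only a toy analogue:

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=None):
    """Ogata thinning for a multivariate Hawkes process with exponential kernels:
    lambda_i(t) = mu_i + sum_j sum_{t_k^j < t} alpha[i, j] * exp(-beta * (t - t_k^j)).
    Returns a list of (time, type) pairs."""
    rng = np.random.default_rng(seed)
    mu = np.asarray(mu, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    d = mu.size
    events = []
    t = 0.0

    def intensity(s):
        lam = mu.copy()
        for (tk, j) in events:
            lam += alpha[:, j] * np.exp(-beta * (s - tk))
        return lam

    while t < T:
        lam_bar = intensity(t).sum()      # valid bound: intensities only decay until the next event
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            break
        lam = intensity(t)
        if rng.uniform() * lam_bar <= lam.sum():    # accept the candidate time
            i = rng.choice(d, p=lam / lam.sum())    # attribute it to an event type
            events.append((t, i))
    return events

# Two mutually exciting event types; the branching matrix alpha / beta has
# spectral radius < 1, so the process is stationary.
events = simulate_hawkes(mu=[0.2, 0.2], alpha=[[0.3, 0.4], [0.4, 0.3]], beta=1.5, T=200.0, seed=0)
print(len(events), "events;", sum(1 for _, i in events if i == 0), "of type 0")
```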

  11. Graduate Views on Access to Higher Education: Is It Really a Case of Pulling up the Ladder?

    ERIC Educational Resources Information Center

    Webb, Rob; Watson, Duncan; Cook, Steve; Arico, Fabio

    2017-01-01

    Using the recent work of Mountford-Zimdars et al. as a starting point, the authors analyse attitudes towards expanding higher education (HE) opportunities in the UK. The authors propose that the approach of Mountford-Zimdars et al. is flawed not only in its adoption of a multivariate logistic regression but also in its interpretation of…

  12. Predicting the occurrence of embolic events: an analysis of 1456 episodes of infective endocarditis from the Italian Study on Endocarditis (SEI).

    PubMed

    Rizzi, Marco; Ravasio, Veronica; Carobbio, Alessandra; Mattucci, Irene; Crapis, Massimo; Stellini, Roberto; Pasticci, Maria Bruna; Chinello, Pierangelo; Falcone, Marco; Grossi, Paolo; Barbaro, Francesco; Pan, Angelo; Viale, Pierluigi; Durante-Mangoni, Emanuele

    2014-04-29

    Embolic events are a major cause of morbidity and mortality in patients with infective endocarditis. We analyzed the database of the prospective cohort study SEI in order to identify factors associated with the occurrence of embolic events and to develop a scoring system for the assessment of the risk of embolism. We retrospectively analyzed 1456 episodes of infective endocarditis from the multicenter study SEI. Predictors of embolism were identified. Risk factors identified in multivariate analysis as predictive of embolism in left-sided endocarditis were used for the development of a risk score: 1 point was assigned to each risk factor (total risk score range: minimum 0 points; maximum 2 points). Three categories were defined by the score: low (0 points), intermediate (1 point), or high risk (2 points); the probability of embolic events per risk category was calculated for each day on treatment (day 0 through day 30). In total, 499 episodes of infective endocarditis (34%) were complicated by ≥1 embolic event. Most embolic events occurred early in the clinical course (first week of therapy: 15.5 episodes per 1000 patient days; second week: 3.7 episodes per 1000 patient days). In the total cohort, the factors associated with the occurrence of embolism in multivariate analysis were prosthetic valve localization (odds ratio, 1.84), right-sided endocarditis (odds ratio, 3.93), Staphylococcus aureus etiology (odds ratio, 2.23) and vegetation size ≥ 13 mm (odds ratio, 1.86). In left-sided endocarditis, Staphylococcus aureus etiology (odds ratio, 2.1) and vegetation size ≥ 13 mm (odds ratio, 2.1) were independently associated with embolic events; the 30-day cumulative incidence of embolism varied with risk score category (low risk, 12%; intermediate risk, 25%; high risk, 38%; p < 0.001). Staphylococcus aureus etiology and vegetation size are associated with an increased risk of embolism. In left-sided endocarditis, a simple scoring system, which combines etiology and vegetation size with time on antimicrobials, might contribute to a better assessment of the risk of embolism and to a more individualized analysis of indications and contraindications for early surgery.

  13. Categorical speech processing in Broca's area: an fMRI study using multivariate pattern-based analysis.

    PubMed

    Lee, Yune-Sang; Turkeltaub, Peter; Granger, Richard; Raizada, Rajeev D S

    2012-03-14

    Although much effort has been directed toward understanding the neural basis of speech processing, the neural processes involved in the categorical perception of speech have been relatively less studied, and many questions remain open. In this functional magnetic resonance imaging (fMRI) study, we probed the cortical regions mediating categorical speech perception using an advanced brain-mapping technique, whole-brain multivariate pattern-based analysis (MVPA). Normal healthy human subjects (native English speakers) were scanned while they listened to 10 consonant-vowel syllables along the /ba/-/da/ continuum. Outside of the scanner, individuals' own category boundaries were measured to divide the fMRI data into /ba/ and /da/ conditions per subject. The whole-brain MVPA revealed that Broca's area and the left pre-supplementary motor area evoked distinct neural activity patterns between the two perceptual categories (/ba/ vs /da/). Broca's area was also found when the same analysis was applied to another dataset (Raizada and Poldrack, 2007), which previously yielded the supramarginal gyrus using a univariate adaptation-fMRI paradigm. The consistent MVPA findings from two independent datasets strongly indicate that Broca's area participates in categorical speech perception, with a possible role of translating speech signals into articulatory codes. The difference in results between univariate and multivariate pattern-based analyses of the same data suggests that processes in different cortical areas along the dorsal speech perception stream are distributed on different spatial scales.
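
    As a rough illustration of what a multivariate pattern-based classification step involves, the sketch below trains a linear classifier to distinguish two stimulus categories from synthetic, placeholder voxel patterns and estimates accuracy by cross-validation. It is not the authors' whole-brain searchlight pipeline, only the generic decoding idea:

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Placeholder data: one row per trial, one column per voxel in a region;
# labels 0 = /ba/ percept, 1 = /da/ percept (per-subject boundary applied upstream).
rng = np.random.default_rng(3)
patterns = rng.normal(size=(80, 500))
labels = rng.integers(0, 2, size=80)
patterns[labels == 1, :20] += 0.5          # weak multivoxel signal in a subset of voxels

clf = make_pipeline(StandardScaler(), LinearSVC(max_iter=20000))
acc = cross_val_score(clf, patterns, labels, cv=8)
print(f"cross-validated decoding accuracy: {acc.mean():.2f}")
```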

  14. Analysis of the dependence of extreme rainfalls

    NASA Astrophysics Data System (ADS)

    Padoan, Simone; Ancey, Christophe; Parlange, Marc

    2010-05-01

    The aim of spatial analysis is to quantitatively describe the behavior of environmental phenomena such as precipitation levels, wind speed or daily temperatures. A number of generic approaches to spatial modeling have been developed [1], but these are not necessarily ideal for handling extremal aspects given their focus on mean process levels. The areal modelling of the extremes of a natural process observed at points in space is important in environmental statistics; for example, understanding extremal spatial rainfall is crucial in flood protection. In light of recent concerns over climate change, the use of robust mathematical and statistical methods for such analyses has grown in importance. Multivariate extreme value models and the class of max-stable processes [2] have a similar asymptotic motivation to the univariate Generalized Extreme Value (GEV) distribution, but provide a general approach to modeling extreme processes that incorporates temporal or spatial dependence. Statistical methods for max-stable processes and data analyses of practical problems are discussed by [3] and [4]. This work illustrates methods for the statistical modelling of spatial extremes and gives examples of their use by means of a real extremal data analysis of Swiss precipitation levels. [1] Cressie, N. A. C. (1993). Statistics for Spatial Data. Wiley, New York. [2] de Haan, L. and Ferreira, A. (2006). Extreme Value Theory: An Introduction. Springer, USA. [3] Padoan, S. A., Ribatet, M. and Sisson, S. A. (2009). Likelihood-Based Inference for Max-Stable Processes. Journal of the American Statistical Association, Theory & Methods. In press. [4] Davison, A. C. and Gholamrezaee, M. (2009). Geostatistics of extremes. Journal of the Royal Statistical Society, Series B. To appear.
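
    For the univariate building block mentioned above, the GEV distribution, a minimal fitting sketch on synthetic annual maxima could look like the following (note that SciPy parameterizes the shape as c = -ξ relative to the usual extreme-value index; all numbers here are invented for illustration):

```python
import numpy as np
from scipy import stats

# Placeholder data: 60 years of annual maximum daily rainfall at one station (mm).
annual_max = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=12.0, size=60, random_state=4)

# Maximum-likelihood fit of the GEV; SciPy's shape c equals minus the usual
# extreme-value index xi, so c < 0 corresponds to a heavy (Frechet-type) tail.
c_hat, loc_hat, scale_hat = stats.genextreme.fit(annual_max)

# 100-year return level = the 1 - 1/100 quantile of the fitted distribution.
rl_100 = stats.genextreme.ppf(1 - 1 / 100, c_hat, loc=loc_hat, scale=scale_hat)
print(c_hat, loc_hat, scale_hat, rl_100)
```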

  15. A conceptual weather-type classification procedure for the Philadelphia, Pennsylvania, area

    USGS Publications Warehouse

    McCabe, Gregory J.

    1990-01-01

    A simple method of weather-type classification, based on a conceptual model of pressure systems that pass through the Philadelphia, Pennsylvania, area, has been developed. The only inputs required for the procedure are daily mean wind direction and cloud cover, which are used to index the relative position of pressure systems and fronts to Philadelphia. Daily mean wind-direction and cloud-cover data recorded at Philadelphia, Pennsylvania, from January 1954 through August 1988 were used to categorize daily weather conditions. The conceptual weather types reflect changes in daily air and dew-point temperatures, and changes in monthly mean temperature and monthly and annual precipitation. The weather-type classification produced by using the conceptual model was similar to a classification produced by using a multivariate statistical classification procedure. Even though the conceptual weather types are derived from a small amount of data, they appear to account for the variability of daily weather patterns sufficiently to describe distinct weather conditions for use in environmental analyses of weather-sensitive processes.

  16. Predicting Condom Use Using the Information-Motivation-Behavioral Skills (IMB) Model: A Multivariate Latent Growth Curve Analysis

    PubMed Central

    Senn, Theresa E.; Scott-Sheldon, Lori A. J.; Vanable, Peter A.; Carey, Michael P.

    2011-01-01

    Background The Information-Motivation-Behavioral Skills (IMB) model often guides sexual risk reduction programs even though no studies have examined covariation in the theory’s constructs in a dynamic fashion with longitudinal data. Purpose Using new developments in latent growth modeling, we explore how changes in information, motivation, and behavioral skills over 9 months relate to changes in condom use among STD clinic patients. Methods Participants (N = 1281, 50% female, 66% African American) completed measures of IMB constructs at three time points. We used parallel process latent growth modeling to examine associations among intercepts and slopes of IMB constructs. Results Initial levels of motivation, behavioral skills, and condom use were all positively associated, with behavioral skills partially mediating associations between motivation and condom use. Changes over time in behavioral skills positively related to changes in condom use. Conclusions Results support the key role of behavioral skills in sexual risk reduction, suggesting these skills should be targeted in HIV prevention interventions. PMID:21638196

  17. Structural equation modeling and natural systems

    USGS Publications Warehouse

    Grace, James B.

    2006-01-01

    This book, first published in 2006, presents an introduction to the methodology of structural equation modeling, illustrates its use, and goes on to argue that it has revolutionary implications for the study of natural systems. A major theme of this book is that we have, up to this point, attempted to study systems primarily using methods (such as the univariate model) that were designed only for considering individual processes. Understanding systems requires the capacity to examine simultaneous influences and responses. Structural equation modeling (SEM) has such capabilities. It also possesses many other traits that add strength to its utility as a means of making scientific progress. In light of the capabilities of SEM, it can be argued that much of ecological theory is currently locked in an immature state that impairs its relevance. It is further argued that the principles of SEM are capable of leading to the development and evaluation of multivariate theories of the sort vitally needed for the conservation of natural systems.

  18. Point-of-care optical tool to detect early stage of hemorrhage and shock

    NASA Astrophysics Data System (ADS)

    Gurjar, Rajan S.; Riccardi, Suzannah L.; Johnson, Blair D.; Johnson, Christopher P.; Paradis, Norman A.; Joyner, Michael J.; Wolf, David E.

    2014-02-01

    There is a critical unmet clinical need for a device that can monitor and predict the onset of shock: hemorrhagic shock or bleeding to death, septic shock or systemic infection, and cardiogenic shock or blood flow and tissue oxygenation impairment due to heart attack. Together these represent 141 M patients per year. We have developed a monitor for shock based on measuring blood flow in peripheral (skin) capillary beds using diffuse correlation spectroscopy, a form of dynamic light scattering, and have demonstrated proof-of-principle both in pigs and humans. Our results show that skin blood flow measurement, either alone or in conjunction with other hemodynamic properties such as heart rate variability, pulse pressure variability, and tissue oxygenation, can meet this unmet need in a small self-contained patch-like device in conjunction with a hand-held processing unit. In this paper we describe and discuss the experimental work and the multivariate statistical analysis performed to demonstrate proof-of-principle of the concept.

  19. What matters? Assessing and developing inquiry and multivariable reasoning skills in high school chemistry

    NASA Astrophysics Data System (ADS)

    Daftedar Abdelhadi, Raghda Mohamed

    Although the Next Generation Science Standards (NGSS) present a detailed set of Science and Engineering Practices, a finer grained representation of the underlying skills is lacking in the standards document. Therefore, it has been reported that teachers are facing challenges deciphering and effectively implementing the standards, especially with regards to the Practices. This analytical study assessed the development of high school chemistry students' (N = 41) inquiry, multivariable causal reasoning skills, and metacognition as a mediator for their development. Inquiry tasks based on concepts of element properties of the periodic table as well as reaction kinetics required students to conduct controlled thought experiments, make inferences, and declare predictions of the level of the outcome variable by coordinating the effects of multiple variables. An embedded mixed methods design was utilized for depth and breadth of understanding. Various sources of data were collected including students' written artifacts, audio recordings of in-depth observational groups and interviews. Data analysis was informed by a conceptual framework formulated around the concepts of coordinating theory and evidence, metacognition, and mental models of multivariable causal reasoning. Results of the study indicated positive change towards conducting controlled experimentation, making valid inferences and justifications. Additionally, significant positive correlation between metastrategic and metacognitive competencies, and sophistication of experimental strategies, signified the central role metacognition played. Finally, lack of consistency in indicating effective variables during the multivariable prediction task pointed towards the fragile mental models of multivariable causal reasoning the students had. Implications for teacher education, science education policy as well as classroom research methods are discussed. Finally, recommendations for developing reform-based chemistry curricula based on the Practices are presented.

  20. Review of validation and reporting of non-targeted fingerprinting approaches for food authentication.

    PubMed

    Riedl, Janet; Esslinger, Susanne; Fauhl-Hassek, Carsten

    2015-07-23

    Food fingerprinting approaches are expected to become a very potent tool in authentication processes aiming at a comprehensive characterization of complex food matrices. By non-targeted spectrometric or spectroscopic chemical analysis with a subsequent (multivariate) statistical evaluation of acquired data, food matrices can be investigated in terms of their geographical origin, species variety or possible adulterations. Although many successful research projects have already demonstrated the feasibility of non-targeted fingerprinting approaches, their uptake and implementation into routine analysis and food surveillance is still limited. In many proof-of-principle studies, the prediction ability of only one data set was explored, measured within a limited period of time using one instrument within one laboratory. Thorough validation strategies that guarantee the reliability of the underlying data and that allow conclusions about whether the respective approaches are fit for purpose have not yet been proposed. Within this review, critical steps of the fingerprinting workflow were explored to develop a generic scheme for multivariate model validation. As a result, a proposed scheme for "good practice" shall guide users through validation and reporting of non-targeted fingerprinting results. Furthermore, food fingerprinting studies were selected by a systematic search approach and reviewed with regard to (a) transparency of data processing and (b) validity of study results. Subsequently, the studies were inspected for measures of statistical model validation, analytical method validation and quality assurance measures. In this context, issues and recommendations were identified that might be considered a starting point for developing validation standards of non-targeted metabolomics approaches for food authentication in the future. Hence, this review intends to contribute to the harmonization and standardization of food fingerprinting, both required as a prior condition for the authentication of food in routine analysis and official control. Copyright © 2015 Elsevier B.V. All rights reserved.
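
    A core element of the "good practice" scheme discussed above is statistical model validation beyond a single train/test split. One common, generic safeguard (illustrated here with scikit-learn's permutation test on placeholder fingerprint data; this is an illustration of the principle, not a procedure prescribed by the review) is to check that classification accuracy collapses to chance when class labels are permuted:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import permutation_test_score, StratifiedKFold

# Placeholder fingerprints: rows = samples, columns = spectral variables;
# y encodes e.g. geographical origin (two classes).
rng = np.random.default_rng(5)
X = rng.normal(size=(60, 300))
y = rng.integers(0, 2, size=60)
X[y == 1, :10] += 0.8                      # weak class-related signal

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=5000))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
score, perm_scores, pvalue = permutation_test_score(
    model, X, y, cv=cv, n_permutations=200, random_state=0)
print(f"accuracy = {score:.2f}, chance level ~ {perm_scores.mean():.2f}, p = {pvalue:.3f}")
```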

  1. Quantitative image processing in fluid mechanics

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus; Helman, James; Ning, Paul

    1992-01-01

    The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

  2. Nonlinear Decoupling Control With ANFIS-Based Unmodeled Dynamics Compensation for a Class of Complex Industrial Processes.

    PubMed

    Zhang, Yajun; Chai, Tianyou; Wang, Hong; Wang, Dianhui; Chen, Xinkai

    2018-06-01

    Complex industrial processes are multivariable, generally exhibit strong coupling among their control loops, and are heavily nonlinear. This makes it very difficult to obtain an accurate model, so conventional and data-driven control methods are difficult to apply. Using a twin-tank level control system as an example, a novel multivariable decoupling control algorithm with adaptive neural-fuzzy inference system (ANFIS)-based unmodeled dynamics (UD) compensation is proposed in this paper for a class of complex industrial processes. First, a nonlinear multivariable decoupling controller with UD compensation is introduced. Different from existing methods, a decomposition estimation algorithm using ANFIS is employed to estimate the UD, and the desired estimation and decoupling control effects are achieved. Second, the proposed method does not require the complicated switching mechanism that has been commonly used in the literature, which significantly simplifies the decoupling algorithm and its realization. Third, based on some new lemmas and theorems, the conditions for stability and convergence of the closed-loop system are analyzed to show the uniform boundedness of all variables. This is followed by a summary of experimental tests on a heavily coupled nonlinear twin-tank system that demonstrate the effectiveness and practicability of the proposed method.

  3. Balance between transmitted HLA preadapted and nonassociated polymorphisms is a major determinant of HIV-1 disease progression.

    PubMed

    Mónaco, Daniela C; Dilernia, Dario A; Fiore-Gartland, Andrew; Yu, Tianwei; Prince, Jessica L; Dennis, Kristine K; Qin, Kai; Schaefer, Malinda; Claiborne, Daniel T; Kilembe, William; Tang, Jianming; Price, Matt A; Farmer, Paul; Gilmour, Jill; Bansal, Anju; Allen, Susan; Goepfert, Paul; Hunter, Eric

    2016-09-19

    HIV-1 adapts to a new host through mutations that facilitate immune escape. Here, we evaluate the impact on viral control and disease progression of transmitted polymorphisms that were either preadapted to or nonassociated with the new host's HLA. In a cohort of 169 Zambian heterosexual transmission pairs, we found that almost one-third of possible HLA-linked target sites in the transmitted virus Gag protein are already adapted, and that this transmitted preadaptation significantly reduced early immune recognition of epitopes. Transmitted preadapted and nonassociated polymorphisms showed opposing effects on set-point VL and the balance between the two was significantly associated with higher set-point VLs in a multivariable model including other risk factors. Transmitted preadaptation was also significantly associated with faster CD4 decline (<350 cells/µl) and this association was stronger after accounting for nonassociated polymorphisms, which were linked with slower CD4 decline. Overall, the relative ratio of the two classes of polymorphisms was found to be the major determinant of CD4 decline in a multivariable model including other risk factors. This study reveals that, even before an immune response is mounted in the new host, the balance of these opposing factors can significantly influence the outcome of HIV-1 infection. © 2016 Mónaco et al.

  4. Associated Variables of Myositis in Systemic Lupus Erythematosus: A Cross-Sectional Study

    PubMed Central

    Liang, Yan; Leng, Rui-Xue; Pan, Hai-Feng; Ye, Dong-Qing

    2017-01-01

    Background This study aimed to estimate the point prevalence of myositis and identify associated variables of myositis in systemic lupus erythematosus (SLE). Material/Methods Clinical data of patients hospitalized with lupus at the First Affiliated Hospital of Anhui Medical University and Anhui Provincial Hospital were collected. Patients were defined as having myositis if they reported the presence of persistent invalidating muscular weakness combined with increased levels of creatine phosphokinase (CPK) and abnormal electromyography (EMG). Results The study sample comprised 1701 lupus patients, of whom 44 had myositis. Patients with SLE-associated myositis were more likely to have skin rash, alopecia, pericarditis, vasculitis, anti-Sm, anti-RNP, anti-dsDNA, thrombocytopenia, leukopenia, low C3, low C4, high erythrocyte sedimentation rate (ESR), high D-dimer, and active disease. Multivariate logistic regression found positive associations between leukopenia, alopecia, and active disease with myositis. Negative associations between myositis and the use of corticosteroids or immunosuppressive drugs were revealed in univariate and multivariate analysis. Conclusions The point prevalence of myositis was 2.6% in SLE patients. The significant association of alopecia, leukopenia, and active disease with myositis suggests that organ damage, hematological abnormality, and high disease activity promote the progression of myositis in lupus patients.

  5. Construction of inorganic elemental fingerprint and multivariate statistical analysis of marine traditional Chinese medicine Meretricis concha from Rushan Bay

    NASA Astrophysics Data System (ADS)

    Wu, Xia; Zheng, Kang; Zhao, Fengjia; Zheng, Yongjun; Li, Yantuan

    2014-08-01

    Meretricis concha is a kind of marine traditional Chinese medicine (TCM) that has been commonly used for the treatment of asthma and scald burns. In order to investigate the relationship between the inorganic elemental fingerprint and the geographical origin identification of Meretricis concha, the elemental contents of M. concha from five sampling points in Rushan Bay were determined by means of inductively coupled plasma optical emission spectrometry (ICP-OES). Based on the contents of 14 inorganic elements (Al, As, Cd, Co, Cr, Cu, Fe, Hg, Mn, Mo, Ni, Pb, Se, and Zn), an inorganic elemental fingerprint that well reflects the elemental characteristics was constructed. All the data from the five sampling points were accurately discriminated through hierarchical cluster analysis (HCA) and principal component analysis (PCA): a four-factor model that could explain approximately 80% of the detection data was established, and the elements Al, As, Cd, Cu, Ni and Pb could be viewed as the characteristic elements. This investigation suggests that the inorganic elemental fingerprint combined with multivariate statistical analysis is a promising method for verifying the geographical origin of M. concha, and this strategy should be valuable for the authenticity discrimination of some marine TCM.
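
    A hedged sketch of the kind of multivariate workflow described here (principal component analysis plus hierarchical clustering of element concentration profiles), using scikit-learn and SciPy on a placeholder concentration table rather than the published measurements:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Placeholder table: rows = samples from different sampling points,
# columns = 14 element concentrations (Al, As, Cd, ...).
rng = np.random.default_rng(6)
conc = rng.lognormal(mean=0.0, sigma=0.5, size=(25, 14))

Z = StandardScaler().fit_transform(conc)          # autoscale each element

pca = PCA(n_components=4).fit(Z)                  # four-factor model
print("explained variance:", pca.explained_variance_ratio_.sum())
scores = pca.transform(Z)

# Hierarchical cluster analysis on the autoscaled profiles (Ward linkage).
tree = linkage(Z, method="ward")
groups = fcluster(tree, t=5, criterion="maxclust")
print(groups)                                     # cluster label per sample
```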

  6. FABP4 and Cardiovascular Events in Peripheral Arterial Disease.

    PubMed

    Höbaus, Clemens; Herz, Carsten Thilo; Pesau, Gerfried; Wrba, Thomas; Koppensteiner, Renate; Schernthaner, Gerit-Holger

    2018-05-01

    Fatty acid-binding protein 4 (FABP4) is a possible biomarker of atherosclerosis. We evaluated FABP4 levels, for the first time, in patients with peripheral artery disease (PAD) and the possible association between baseline FABP4 levels and cardiovascular events over time. Patients (n = 327; mean age 69 ± 10 years) with stable PAD were enrolled in this study. Serum FABP4 was measured by bead-based multiplex assay. Cardiovascular events were analyzed by FABP4 tertiles using Kaplan-Meier and Cox regression analyses after 5 years. Serum FABP4 levels showed a significant association with the classical 3-point major adverse cardiovascular event (MACE) end point (including death, nonlethal myocardial infarction, or nonfatal stroke) in patients with PAD (P = .038). A standard deviation increase of FABP4 resulted in a hazard ratio (HR) of 1.33 (95% confidence interval [95% CI]: 1.03-1.71) for MACE. This association increased (HR: 1.47, 95% CI: 1.03-1.71) after multivariable adjustment (P = .020). Additionally, in multivariable linear regression analysis, FABP4 was linked to estimated glomerular filtration rate (P < .001), gender (P = .005), fasting triglycerides (P = .048), and body mass index (P < .001). Circulating FABP4 may be a useful additional biomarker to evaluate patients with stable PAD at risk of major cardiovascular complications.

  7. Acute Consumption of Flavan-3-ol-Enriched Dark Chocolate Affects Human Endogenous Metabolism.

    PubMed

    Ostertag, Luisa M; Philo, Mark; Colquhoun, Ian J; Tapp, Henri S; Saha, Shikha; Duthie, Garry G; Kemsley, E Kate; de Roos, Baukje; Kroon, Paul A; Le Gall, Gwénaëlle

    2017-07-07

    Flavan-3-ols and methylxanthines have potential beneficial effects on human health including reducing cardiovascular risk. We performed a randomized controlled crossover intervention trial to assess the acute effects of consumption of flavan-3-ol-enriched dark chocolate, compared with standard dark chocolate and white chocolate, on the human metabolome. We assessed the metabolome in urine and blood plasma samples collected before and at 2 and 6 h after consumption of chocolates in 42 healthy volunteers using a nontargeted metabolomics approach. Plasma samples were assessed and showed differentiation between time points with no further separation among the three chocolate treatments. Multivariate statistics applied to urine samples could readily separate the postprandial time points and distinguish between the treatments. Most of the markers responsible for the multivariate discrimination between the chocolates were of dietary origin. Interestingly, small but significant level changes were also observed for a subset of endogenous metabolites. 1H NMR revealed that flavan-3-ol-enriched dark chocolate and standard dark chocolate reduced urinary levels of creatinine, lactate, some amino acids, and related degradation products and increased the levels of pyruvate and 4-hydroxyphenylacetate, a phenolic compound of bacterial origin. This study demonstrates that an acute chocolate intervention can significantly affect human metabolism.

  8. External validation of the modified Glasgow prognostic score for renal cancer

    PubMed Central

    Tai, Caroline G.; Johnson, Timothy V.; Abbasi, Ammara; Herrell, Lindsey; Harris, Wayne B.; Kucuk, Omer; Canter, Daniel J.; Ogan, Kenneth; Pattaras, John G.; Nieh, Peter T.; Master, Viraj A.

    2014-01-01

    Purpose: The modified Glasgow Prognostic Score (mGPS) incorporates C-reactive protein and albumin as a clinically useful marker of tumor behavior. The ability of the mGPS to predict metastasis in localized renal cell carcinoma (RCC) remains unknown in an external validation cohort. Patients and Methods: Patients with clinically localized clear cell RCC were followed for 1 year post-operatively. Metastases were identified radiologically. Patients were categorized by mGPS score as low-risk (mGPS = 0 points), intermediate-risk (mGPS = 1 point) and high-risk (mGPS = 2 points). Univariate, Kaplan-Meier and multivariate Cox regression analyses examined recurrence-free survival (RFS) across patient and disease characteristics. Results: Of the 129 patients in this study, 23.3% developed metastases. Of low, intermediate and high risk patients, 10.1%, 38.9% and 89.9% recurred during the study. After accounting for various patient and tumor characteristics in multivariate analysis including stage and grade, only mGPS was significantly associated with RFS. Compared with low-risk patients, intermediate- and high-risk patients experienced a 4-fold (hazard ratios [HR]: 4.035, 95% confidence interval [CI]: 1.312-12.415, P = 0.015) and 7-fold (HR: 7.012, 95% CI: 2.126-23.123, P < 0.001) risk of metastasis, respectively. Conclusions: mGPS is a robust predictor of metastasis following potentially curative nephrectomy for localized RCC. Clinicians may consider mGPS as an adjunct to identify high-risk patients for possible enrollment into clinical trials or for patient counseling. PMID:24497679

  9. Involvement of Difference in Decrease of Hemoglobin Level in Poor Prognosis of Stage I and II Nasopharyngeal Carcinoma: Implication in Outcome of Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao Jin; State Key Laboratory of Oncology in Southern China, Sun Yat-Sen University, Guangzhou; Department of Radiation Oncology, Anhui provincial hospital, Hefei

    2012-03-15

    Purpose: To investigate the effect of hemoglobin (Hb) concentration and the difference in its decrease during treatment on the outcome of radiotherapy (RT) alone for patients with Stage I and II nasopharyngeal carcinoma. Methods and Materials: A total of 572 patients with Stage I-II nasopharyngeal carcinoma treated with RT alone between January 2001 and December 2004 were retrospectively analyzed. Patient characteristics, tumor variables, and Hb level, including pre-RT Hb, mid-RT Hb, and the dynamic change of Hb between pre- and post-RT and its difference in decrease (ΔHb), were subjected to univariate and multivariable analysis to identify factors that predict disease-specific survival (DSS), local regional recurrence-free survival (LRFS), and metastases-free survival (MFS). Results: The 5-year DSS was poorer in the Hb continuous decrease group than in the Hb noncontinuous decrease group (84% vs. 89%; p = 0.008). The 5-year DSS was also poorer in patients with ΔHb of >11.5 g/L than in those with ΔHb of ≤11.5 g/L (82% vs. 89%; p = 0.001), as was LRFS (79% vs. 83%; p = 0.035). Univariate and multivariate analysis showed that a difference in Hb decrease of greater than 11.5 g/L was an independent prognostic factor for DSS and LRFS. Conclusions: The difference in decrease of Hb level during the course of radiation treatment appeared as a poor prognostic factor in Stage I and II nasopharyngeal carcinoma patients.

  10. Predictors of responses to corticosteroids for anorexia in advanced cancer patients: a multicenter prospective observational study.

    PubMed

    Matsuo, Naoki; Morita, Tatsuya; Matsuda, Yoshinobu; Okamoto, Kenichiro; Matsumoto, Yoshihisa; Kaneishi, Keisuke; Odagiri, Takuya; Sakurai, Hiroki; Katayama, Hideki; Mori, Ichiro; Yamada, Hirohide; Watanabe, Hiroaki; Yokoyama, Taro; Yamaguchi, Takashi; Nishi, Tomohiro; Shirado, Akemi; Hiramoto, Shuji; Watanabe, Toshio; Kohara, Hiroyuki; Shimoyama, Satofumi; Aruga, Etsuko; Baba, Mika; Sumita, Koki; Iwase, Satoru

    2017-01-01

    Although corticosteroids are widely used to relieve anorexia, information regarding the factors predicting responses to corticosteroids remains limited. The purpose of this study was to identify potential factors predicting responses to corticosteroids for anorexia in advanced cancer patients. Patients were eligible for this multicenter prospective observational study if they had metastatic or locally advanced cancer and an anorexia intensity score of 4 or more on a 0-10 Numerical Rating Scale (NRS). Univariate and multivariate analyses were conducted to identify the factors predicting a ≥2-point reduction in NRS on day 3. Among 180 patients who received corticosteroids, 99 (55 %; 95 % confidence interval [CI], 47-62 %) had a response with a ≥2-point reduction. Factors that significantly predicted responses were Palliative Performance Scale (PPS) > 40 and absence of drowsiness. In addition, factors that tended to be associated with a ≥2-point reduction in NRS included PS 0-3, absence of diabetes mellitus, absence of peripheral edema, presence of lung metastasis, absence of peritoneal metastasis, baseline anorexia NRS of >6, presence of pain, and presence of constipation. A multivariate analysis showed that the independent factors predicting responses were PPS of >40 (odds ratio = 2.7 [95 % CI = 1.4-5.2]), absence of drowsiness (2.6 [1.3-5.0]), and baseline NRS of >6 (2.4 [1.1-4.8]). Treatment responses to corticosteroids for anorexia may be predicted by PPS, drowsiness, and baseline symptom intensity. Larger prospective studies are needed to confirm these results.

  11. Predictors of suicide and suicide attempt in subway stations: a population-based ecological study.

    PubMed

    Niederkrotenthaler, Thomas; Sonneck, Gernot; Dervic, Kanita; Nader, Ingo W; Voracek, Martin; Kapusta, Nestor D; Etzersdorfer, Elmar; Mittendorfer-Rutz, Ellenor; Dorner, Thomas

    2012-04-01

    Suicidal behavior on the subway often involves young people and has a considerable impact on public life, but little is known about factors associated with suicides and suicide attempts in specific subway stations. Between 1979 and 2009, 185 suicides and 107 suicide attempts occurred on the subway in Vienna, Austria. Station-specific suicide and suicide attempt rates (defined as the frequency of suicidal incidents per time period) were modeled as the outcome variables in bivariate and multivariate Poisson regression models. Structural station characteristics (presence of a surveillance unit, train types used, and construction on street level versus other construction), contextual station characteristics (neighborhood to historical sites, size of the catchment area, and in operation during time period of extensive media reporting on subway suicides), and passenger-based characteristics (number of passengers getting on the trains per day, use as meeting point by drug users, and socioeconomic status of the population in the catchment area) were used as the explanatory variables. In the multivariate analyses, subway suicides increased when stations were served by the faster train type. Subway suicide attempts increased with the daily number of passengers getting on the trains and with the stations' use as meeting points by drug users. The findings indicate that there are some differences between subway suicides and suicide attempts. Completed suicides seem to vary most with train type used. Suicide attempts seem to depend mostly on passenger-based characteristics, specifically on the station's crowdedness and on its use as meeting point by drug users. Suicide-preventive interventions should concentrate on crowded stations and on stations frequented by risk groups.
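
    The station-level modelling described here is a Poisson regression of incident counts on structural, contextual, and passenger-based covariates. A generic sketch of such a model (statsmodels on fabricated station data with hypothetical variable names; observation time enters as an exposure offset) might be:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Fabricated station-level data: one row per station.
rng = np.random.default_rng(7)
n = 40
df = pd.DataFrame({
    "fast_train":   rng.integers(0, 2, n),           # served by the faster train type
    "passengers":   rng.lognormal(9, 0.6, n),        # daily boardings
    "drug_meeting": rng.integers(0, 2, n),           # used as a meeting point by drug users
    "years_open":   rng.uniform(5, 31, n),           # observation time (exposure)
})
rate = np.exp(-3.5 + 0.7 * df.fast_train + 0.3 * np.log(df.passengers / 1000))
df["suicides"] = rng.poisson(rate * df.years_open)

X = sm.add_constant(df[["fast_train", "passengers", "drug_meeting"]])
fit = sm.GLM(df["suicides"], X, family=sm.families.Poisson(),
             offset=np.log(df["years_open"])).fit()
print(np.exp(fit.params))                             # incidence-rate ratios per covariate
```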

  12. Algorithms for Robust Identification and Control of Large Space Structures. Phase 1.

    DTIC Science & Technology

    1988-05-14

    Variate Analysis," Proc. Amer. Control Conf., San Francisco, pp. 445-451. LECTIQUE, J., Rault, A., Tessier, M., and Testud, J.L. (1978), "Multivariable...Rault, J.L. Testud, and J. Papon (1978), "Model Predictive Heuristic Control: Applications to Industrial Processes," Automatica, Vol. 14, pp. 413...Control Conference, Minneapolis, MN, June. TESTUD, J.L. (1979), "Commande Numerique Multivariable du Ballon de Recuperation de Vapeur," Adersa/Gerbios

  13. Recurrence networks from multivariate signals for uncovering dynamic transitions of horizontal oil-water stratified flows

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Zhang, Xin-Wang; Jin, Ning-De; Donner, Reik V.; Marwan, Norbert; Kurths, Jürgen

    2013-09-01

    Characterizing the mechanism of drop formation at the interface of horizontal oil-water stratified flows is a fundamental problem eliciting a great deal of attention from different disciplines. We experimentally and theoretically investigate the formation and transition of horizontal oil-water stratified flows. We design a new multi-sector conductance sensor and measure multivariate signals from two different stratified flow patterns. Using the Adaptive Optimal Kernel Time-Frequency Representation (AOK TFR) we first characterize the flow behavior from an energy and frequency point of view. Then, we infer multivariate recurrence networks from the experimental data and investigate the cross-transitivity for each constructed network. We find that the cross-transitivity allows us to quantitatively uncover the flow behavior as the stratified flow evolves from a stable state to an unstable one, and to gain deeper insights into the mechanism governing the formation of droplets at the interface of stratified flows, a task at which existing methods based on AOK TFR fail. These findings present a first step towards an improved understanding of the dynamic mechanism leading to the transition of horizontal oil-water stratified flows from a complex-network perspective.
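
    As a rough illustration of the recurrence-network idea used here (not the authors' multi-sector sensor analysis or their cross-transitivity measure), the sketch below builds a recurrence network from a multivariate signal by thresholding pairwise state-space distances and then computes the network transitivity with NetworkX:

```python
import numpy as np
import networkx as nx
from scipy.spatial.distance import pdist, squareform

# Placeholder multivariate signal: rows = time points, columns = sensor channels.
rng = np.random.default_rng(8)
t = np.linspace(0, 20 * np.pi, 800)
signal = np.column_stack([np.sin(t), np.cos(1.3 * t)]) + 0.05 * rng.normal(size=(800, 2))

# Recurrence matrix: two states are "recurrent" if closer than a fixed fraction
# of the maximum distance; the binary matrix is read as a network adjacency matrix.
D = squareform(pdist(signal))
eps = 0.1 * D.max()
A = (D < eps).astype(int)
np.fill_diagonal(A, 0)

G = nx.from_numpy_array(A)
print("recurrence rate:", A.mean())
print("transitivity:", nx.transitivity(G))
```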

  14. An approach to multivariable control of manipulators

    NASA Technical Reports Server (NTRS)

    Seraji, H.

    1987-01-01

    The paper presents simple schemes for multivariable control of multiple-joint robot manipulators in joint and Cartesian coordinates. The joint control scheme consists of two independent multivariable feedforward and feedback controllers. The feedforward controller is the minimal inverse of the linearized model of robot dynamics and contains only proportional-double-derivative (PD2) terms - implying feedforward from the desired position, velocity and acceleration. This controller ensures that the manipulator joint angles track any reference trajectories. The feedback controller is of proportional-integral-derivative (PID) type and is designed to achieve pole placement. This controller reduces any initial tracking error to zero as desired and also ensures that robust steady-state tracking of step-plus-exponential trajectories is achieved by the joint angles. Simple and explicit expressions of computation of the feedforward and feedback gains are obtained based on the linearized model of robot dynamics. This leads to computationally efficient schemes for either on-line gain computation or off-line gain scheduling to account for variations in the linearized robot model due to changes in the operating point. The joint control scheme is extended to direct control of the end-effector motion in Cartesian space. Simulation results are given for illustration.
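
    A minimal single-joint sketch of the two-part structure described above, a feedforward term built from the desired position, velocity, and acceleration plus a PID feedback term; the toy plant model and the gains below are illustrative placeholders, not values or equations from the paper:

```python
import numpy as np

# Toy joint model: inertia J, viscous friction b  =>  J*qdd + b*qd = torque.
J, b, dt = 1.2, 0.4, 0.001
Kp, Ki, Kd = 80.0, 60.0, 15.0                  # PID feedback gains (placeholder)

def desired(t):                                # smooth reference trajectory
    return 0.5 * (1 - np.cos(t)), 0.5 * np.sin(t), 0.5 * np.cos(t)

q = qdot = integ = 0.0
for k in range(10000):
    t = k * dt
    qd, qd_dot, qd_ddot = desired(t)
    # Feedforward: inverse of the (linearized) joint model applied to the
    # desired position, velocity and acceleration ("PD2"-style terms).
    u_ff = J * qd_ddot + b * qd_dot
    # PID feedback on the tracking error.
    e = qd - q
    integ += e * dt
    u_fb = Kp * e + Ki * integ + Kd * (qd_dot - qdot)
    u = u_ff + u_fb
    # Integrate the joint dynamics one step (explicit Euler).
    qdd = (u - b * qdot) / J
    qdot += qdd * dt
    q += qdot * dt

print("final tracking error:", qd - q)
```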

  15. The roles of associative and executive processes in creative cognition.

    PubMed

    Beaty, Roger E; Silvia, Paul J; Nusbaum, Emily C; Jauk, Emanuel; Benedek, Mathias

    2014-10-01

    How does the mind produce creative ideas? Past research has pointed to important roles of both executive and associative processes in creative cognition. But such work has largely focused on the influence of one ability or the other (executive or associative), so the extent to which both abilities may jointly affect creative thought remains unclear. Using multivariate structural equation modeling, we conducted two studies to determine the relative influences of executive and associative processes in domain-general creative cognition (i.e., divergent thinking). Participants completed a series of verbal fluency tasks, and their responses were analyzed by means of latent semantic analysis (LSA) and scored for semantic distance as a measure of associative ability. Participants also completed several measures of executive function, including broad retrieval ability (Gr) and fluid intelligence (Gf). Across both studies, we found substantial effects of both associative and executive abilities: As the average semantic distance between verbal fluency responses and cues increased, so did the creative quality of divergent-thinking responses (Study 1 and Study 2). Moreover, the creative quality of divergent-thinking responses was predicted by the executive variables: Gr (Study 1) and Gf (Study 2). Importantly, the effects of semantic distance and the executive function variables remained robust in the same structural equation model predicting divergent thinking, suggesting unique contributions of both constructs. The present research extends recent applications of LSA in creativity research and provides support for the notion that both associative and executive processes underlie the production of novel ideas.
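
    The associative-ability measure described here rests on semantic distance computed in a latent semantic space. A toy sketch of that computation (TF-IDF plus truncated SVD as a stand-in for LSA, scored as cosine distance between a cue and candidate responses; the tiny corpus and word choices are fabricated, and a real LSA space would be trained on a large text base) is:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Tiny fabricated corpus standing in for the text base of an LSA space.
corpus = [
    "the cat sat on the warm mat",
    "dogs and cats are common household animals",
    "the rocket carried a satellite into orbit",
    "astronauts train for life in orbit and space",
    "a warm blanket on the sofa for the dog",
    "the satellite transmits data back to earth",
]
vec = TfidfVectorizer()
X = vec.fit_transform(corpus)
svd = TruncatedSVD(n_components=3, random_state=0).fit(X)

def embed(texts):
    # Project texts into the reduced "semantic" space.
    return svd.transform(vec.transform(texts))

cue = embed(["cats"])
words = ["dog", "blanket", "orbit"]
semantic_distance = 1 - cosine_similarity(cue, embed(words))[0]
print(dict(zip(words, semantic_distance.round(3))))   # larger = more remote association
```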

  16. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    PubMed

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
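
    To illustrate the last step described above, a partial least-squares model that predicts water content for every pixel of a chemical image, here is a generic sketch with scikit-learn's PLSRegression on simulated spectra; the band position, image size, and calibration values are invented for illustration and are not the study's data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(9)
wavelengths = np.linspace(1100, 2500, 200)           # nm, invented grid

def spectrum(water):
    # Toy spectrum: an absorption band near 1940 nm whose depth scales with water content.
    band = np.exp(-0.5 * ((wavelengths - 1940) / 40) ** 2)
    return 0.2 + water * band + 0.01 * rng.normal(size=wavelengths.size)

# Calibration set: reference water contents (%) with matching spectra.
y_cal = rng.uniform(0.5, 5.0, 40)
X_cal = np.array([spectrum(w) for w in y_cal])

pls = PLSRegression(n_components=3).fit(X_cal, y_cal)

# "Chemical image": a 20 x 20 pixel map, one spectrum per pixel.
true_map = rng.uniform(0.5, 5.0, (20, 20))
X_img = np.array([spectrum(w) for w in true_map.ravel()])
water_map = pls.predict(X_img).reshape(20, 20)        # predicted water content per pixel
print(water_map.mean(), true_map.mean())
```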

  17. Mass Spectrometry Imaging and Identification of Peptides Associated with Cephalic Ganglia Regeneration in Schmidtea mediterranea*

    PubMed Central

    Ong, Ta-Hsuan; Romanova, Elena V.; Roberts-Galbraith, Rachel H.; Yang, Ning; Zimmerman, Tyler A.; Collins, James J.; Lee, Ji Eun; Kelleher, Neil L.; Newmark, Phillip A.; Sweedler, Jonathan V.

    2016-01-01

    Tissue regeneration is a complex process that involves a mosaic of molecules that vary spatially and temporally. Insights into the chemical signaling underlying this process can be achieved with a multiplex and untargeted chemical imaging method such as mass spectrometry imaging (MSI), which can enable de novo studies of nervous system regeneration. A combination of MSI and multivariate statistics was used to differentiate peptide dynamics in the freshwater planarian flatworm Schmidtea mediterranea at different time points during cephalic ganglia regeneration. A protocol was developed to make S. mediterranea tissues amenable for MSI. MS ion images of planarian tissue sections allow changes in peptides and unknown compounds to be followed as a function of cephalic ganglia regeneration. In conjunction with fluorescence imaging, our results suggest that even though the cephalic ganglia structure is visible after 6 days of regeneration, the original chemical composition of these regenerated structures is regained only after 12 days. Differences were observed in many peptides, such as those derived from secreted peptide 4 and EYE53-1. Peptidomic analysis further identified multiple peptides from various known prohormones, histone proteins, and DNA- and RNA-binding proteins as being associated with the regeneration process. Mass spectrometry data also facilitated the identification of a new prohormone, which we have named secreted peptide prohormone 20 (SPP-20), and is up-regulated during regeneration in planarians. PMID:26884331

  18. Feasibility Study on the Use of On-line Multivariate Statistical Process Control for Safeguards Applications in Natural Uranium Conversion Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ladd-Lively, Jennifer L

    2014-01-01

    The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a 100 metric ton of uranium (MTU) per year natural uranium conversion plant (NUCP) using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP. However, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case or normal operating conditions of the PCA model were established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall this study was successful at meeting the stated objective.
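
    A generic sketch of the PCA-based monitoring idea described above: fit a principal-component model on data representing normal operation, then score new observations with Hotelling's T² and the residual Q (squared prediction error) statistics and flag points that exceed control limits. Everything below (the simulated data, the number of components, the empirical 99th-percentile limits) is an illustrative assumption, not the report's implementation:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(10)
W = rng.normal(size=(8, 8))                       # fixed mixing so the variables are correlated

# "Normal operation" data: rows = time steps, columns = process measurements
# (e.g. flows and concentrations entering the component mole balances).
X_noc = rng.normal(size=(500, 8)) @ W

scaler = StandardScaler().fit(X_noc)
pca = PCA(n_components=3).fit(scaler.transform(X_noc))

def t2_q(X):
    Z = scaler.transform(X)
    scores = pca.transform(Z)
    t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)   # Hotelling's T^2
    resid = Z - pca.inverse_transform(scores)
    q = np.sum(resid ** 2, axis=1)                               # Q / squared prediction error
    return t2, q

t2_noc, q_noc = t2_q(X_noc)
t2_lim, q_lim = np.quantile(t2_noc, 0.99), np.quantile(q_noc, 0.99)   # empirical limits

# New data with a simulated fault: a sustained shift in one measurement.
X_new = rng.normal(size=(100, 8)) @ W
X_new[50:, 2] += 6.0
t2_new, q_new = t2_q(X_new)
print("flagged samples:", np.where((t2_new > t2_lim) | (q_new > q_lim))[0])
```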

  19. Using Gaussian windows to explore a multivariate data set

    NASA Technical Reports Server (NTRS)

    Jaeckel, Louis A.

    1991-01-01

    In an earlier paper, I recounted an exploratory analysis, using Gaussian windows, of a data set derived from the Infrared Astronomical Satellite. Here, my goals are to develop strategies for finding structural features in a data set in a many-dimensional space, and to find ways to describe the shape of such a data set. After a brief review of Gaussian windows, I describe the current implementation of the method. I give some ways of describing features that we might find in the data, such as clusters and saddle points, and also extended structures such as a 'bar', which is an essentially one-dimensional concentration of data points. I then define a distance function, which I use to determine which data points are 'associated' with a feature. Data points not associated with any feature are called 'outliers'. I then explore the data set, giving the strategies that I used and quantitative descriptions of the features that I found, including clusters, bars, and a saddle point. I tried to use strategies and procedures that could, in principle, be used in any number of dimensions.

  20. Multivariate statistical process control (MSPC) using Raman spectroscopy for in-line culture cell monitoring considering time-varying batches synchronized with correlation optimized warping (COW).

    PubMed

    Liu, Ya-Juan; André, Silvère; Saint Cristau, Lydia; Lagresle, Sylvain; Hannas, Zahia; Calvosa, Éric; Devos, Olivier; Duponchel, Ludovic

    2017-02-01

    Multivariate statistical process control (MSPC) is increasingly popular for addressing the challenge posed by the large multivariate datasets that analytical instruments such as Raman spectroscopy generate when monitoring complex cell cultures in the biopharmaceutical industry. However, Raman spectroscopy for in-line monitoring often produces unsynchronized data sets, resulting in time-varying batches. Unsynchronized data sets are common in cell culture monitoring because spectroscopic measurements are generally recorded in an alternating fashion, with more than one optical probe connected in parallel to the same spectrometer. Synchronized batches are a prerequisite for the application of multivariate analysis such as multi-way principal component analysis (MPCA) for MSPC monitoring. Correlation optimized warping (COW) is a popular data alignment method with satisfactory performance; however, it had never before been applied to synchronize the acquisition times of spectroscopic datasets in an MSPC application. In this paper we propose, for the first time, to use COW to synchronize batches of varying duration analyzed with Raman spectroscopy. In a second step, we developed MPCA models at different time intervals based on the normal operating condition (NOC) batches synchronized by COW. New batches are finally projected onto the corresponding MPCA model. We monitored the evolution of the batches using two multivariate control charts based on Hotelling's T2 and Q. As illustrated by the results, the MSPC model was able to identify abnormal operating conditions, including contaminated batches, which is of prime importance in cell culture monitoring. We showed that Raman-based MSPC monitoring can be used to diagnose batches deviating from the normal condition, with higher efficacy than traditional diagnosis, which would save time and money in the biopharmaceutical industry. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Is a multivariate consensus representation of genetic relationships among populations always meaningful?

    PubMed Central

    Moazami-Goudarzi, K; Laloë, D

    2002-01-01

    To determine the relationships among closely related populations or species, two methods are commonly used in the literature: phylogenetic reconstruction or multivariate analysis. The aim of this article is to assess the reliability of multivariate analysis. We describe a method that is based on principal component analysis and Mantel correlations, using a two-step process: The first step consists of a single-marker analysis and the second step tests if each marker reveals the same typology concerning population differentiation. We conclude that if single markers are not congruent, the compromise structure is not meaningful. Our model is not based on any particular mutation process and it can be applied to most of the commonly used genetic markers. This method is also useful to determine the contribution of each marker to the typology of populations. We test whether our method is efficient with two real data sets based on microsatellite markers. Our analysis suggests that for closely related populations, it is not always possible to accept the hypothesis that an increase in the number of markers will increase the reliability of the typology analysis. PMID:12242255
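
    The congruence check at the heart of this approach can be illustrated with a simple Mantel test: correlate two between-population distance matrices and assess significance by permuting population labels. The sketch below is a generic implementation on fabricated distance matrices, not the authors' single-marker PCA and compromise procedure:

```python
import numpy as np
from scipy.spatial.distance import squareform
from scipy.stats import pearsonr

def mantel(D1, D2, n_perm=999, seed=None):
    """Mantel correlation between two square distance matrices, with a
    permutation p-value (rows and columns of D2 are permuted jointly)."""
    rng = np.random.default_rng(seed)
    v1, v2 = squareform(D1, checks=False), squareform(D2, checks=False)
    r_obs = pearsonr(v1, v2)[0]
    n, count = D1.shape[0], 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        count += pearsonr(v1, squareform(D2[np.ix_(p, p)], checks=False))[0] >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)

# Fabricated pairwise genetic distances among 10 populations for two markers
# that share most of their structure (so they should appear congruent).
rng = np.random.default_rng(11)
base = squareform(rng.uniform(0.05, 0.4, 45))
D_marker1 = base + squareform(rng.uniform(0, 0.05, 45))
D_marker2 = base + squareform(rng.uniform(0, 0.05, 45))
print(mantel(D_marker1, D_marker2, seed=0))
```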

  2. An Improved Method to Control the Critical Parameters of a Multivariable Control System

    NASA Astrophysics Data System (ADS)

    Subha Hency Jims, P.; Dharmalingam, S.; Wessley, G. Jims John

    2017-10-01

    The role of control systems is to cope with process deficiencies and the undesirable effects of external disturbances. Most multivariable processes are highly interactive and complex in nature. Aircraft systems, modern power plants, refineries, and robotic systems are a few such complex systems that involve numerous critical parameters that need to be monitored and controlled. Control of these important parameters is not only tedious and cumbersome but also crucial from an environmental, safety, and quality perspective. In this paper, one such multivariable system, namely a utility boiler, is considered. A modern power plant is a complex arrangement of pipework and machinery with numerous interacting control loops and support systems. The calculation of controller parameters based on classical tuning concepts is presented, and the controller parameters thus obtained were used to control the critical parameters of a boiler during fuel-switching disturbances. The proposed method can also be applied to control critical aircraft parameters such as the elevator, aileron, and rudder, the elevator, rudder, and aileron trims, and the flap control systems.

  3. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    PubMed

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-symptom change were explored. TSPA allowed a prototypical process pattern to be identified, in which the patient's alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, the therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
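
    Because TSPA builds on vector auto-regression, a brief sketch of fitting a VAR(1) model to one simulated patient's session-by-session scores may clarify the building block. The variable names and the use of statsmodels are illustrative assumptions, not the study's data or software.

    ```python
    # Hedged sketch: VAR(1) fit to simulated post-session questionnaire scores.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(1)
    sessions = pd.DataFrame(rng.normal(size=(25, 3)),
                            columns=["alliance", "self_efficacy", "mastery"])

    result = VAR(sessions).fit(1)   # each variable regressed on all lag-1 values
    print(result.params)            # cross-lagged coefficients (temporal associations)
    ```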

  4. Multivariate analysis of longitudinal rates of change.

    PubMed

    Bryan, Matthew; Heagerty, Patrick J

    2016-12-10

    Longitudinal data allow direct comparison of the change in patient outcomes associated with treatment or exposure. Frequently, several longitudinal measures are collected that either reflect a common underlying health status, or characterize processes that are influenced in a similar way by covariates such as exposure or demographic characteristics. Statistical methods that can combine multivariate response variables into common measures of covariate effects have been proposed in the literature. Current methods for characterizing the relationship between covariates and the rate of change in multivariate outcomes are limited to select models. For example, 'accelerated time' methods have been developed which assume that covariates rescale time in longitudinal models for disease progression. In this manuscript, we detail an alternative multivariate model formulation that directly structures longitudinal rates of change and that permits a common covariate effect across multiple outcomes. We detail maximum likelihood estimation for a multivariate longitudinal mixed model. We show via asymptotic calculations the potential gain in power that may be achieved with a common analysis of multiple outcomes. We apply the proposed methods to the analysis of a trivariate outcome for infant growth and compare rates of change for HIV infected and uninfected infants. Copyright © 2016 John Wiley & Sons, Ltd.

  5. Voxelwise multivariate analysis of multimodality magnetic resonance imaging

    PubMed Central

    Naylor, Melissa G.; Cardenas, Valerie A.; Tosun, Duygu; Schuff, Norbert; Weiner, Michael; Schwartzman, Armin

    2015-01-01

    Most brain magnetic resonance imaging (MRI) studies concentrate on a single MRI contrast or modality, frequently structural MRI. By performing an integrated analysis of several modalities, such as structural, perfusion-weighted, and diffusion-weighted MRI, new insights may be attained to better understand the underlying processes of brain diseases. We compare two voxelwise approaches: (1) fitting multiple univariate models, one for each outcome and then adjusting for multiple comparisons among the outcomes and (2) fitting a multivariate model. In both cases, adjustment for multiple comparisons is performed over all voxels jointly to account for the search over the brain. The multivariate model is able to account for the multiple comparisons over outcomes without assuming independence because the covariance structure between modalities is estimated. Simulations show that the multivariate approach is more powerful when the outcomes are correlated and, even when the outcomes are independent, the multivariate approach is just as powerful or more powerful when at least two outcomes are dependent on predictors in the model. However, multiple univariate regressions with Bonferroni correction remains a desirable alternative in some circumstances. To illustrate the power of each approach, we analyze a case control study of Alzheimer's disease, in which data from three MRI modalities are available. PMID:23408378

  6. Sensitivity assessment of freshwater macroinvertebrates to pesticides using biological traits.

    PubMed

    Ippolito, A; Todeschini, R; Vighi, M

    2012-03-01

    Assessing the sensitivity of different species to chemicals is one of the key points in predicting the effects of toxic compounds in the environment. Trait-based prediction methods have proved to be extremely efficient for assessing the sensitivity of macroinvertebrates toward compounds with nonspecific toxicity (narcotics). Nevertheless, predicting the sensitivity of organisms toward compounds with specific toxicity is much more complex, since it depends on the mode of action of the chemical. The aim of this work was to predict the sensitivity of several freshwater macroinvertebrates toward three classes of plant protection products: organophosphates, carbamates and pyrethroids. Two databases were built: one with sensitivity data (retrieved, evaluated and selected from the U.S. Environmental Protection Agency ECOTOX database) and the other with biological traits. Aside from the "traditional" traits usually considered in ecological analysis (e.g. body size, respiration technique, feeding habits), multivariate analysis was used to relate the sensitivity of organisms to other characteristics which may be involved in the process of intoxication. Results confirmed that, besides traditional biological traits related to uptake capability (e.g. body size and body shape), some traits more related to particular metabolic characteristics or patterns have good predictive capacity for sensitivity to these kinds of toxic substances. For example, behavioral complexity, assumed to be an indicator of nervous system complexity, proved to be an important predictor of sensitivity towards these compounds. These results confirm the need for more complex traits to predict the effects of highly specific substances. One key point for achieving a complete mechanistic understanding of the process is the choice of traits, whose role in the discrimination of sensitivity should be clearly interpretable, and not only statistically significant.

  7. Optical monitoring of Disinfection By-product Precursors with Fluorescence Excitation-Emission Mapping (F-EEM): Practical Application Issues for Drinking, Waste and Reuse Water Industry

    NASA Astrophysics Data System (ADS)

    Gilmore, A. M.

    2012-12-01

    Drinking water, wastewater and reuse plants must deal with regulations associated with bacterial contamination and halogen disinfection procedures that can generate harmful disinfection by-products (DBPs), including trihalomethanes (THMs), haloacetic acids (HOAAs) and other compounds. Natural fluorescent chromophoric dissolved organic matter (CDOM) is regulated as the major DBP precursor. This study outlines the advantages and current limitations associated with optical monitoring of water treatment processes using contemporary Fluorescence Excitation-Emission Mapping (F-EEM). The F-EEM method, coupled with practical peak indexing and multivariate analyses, is potentially superior in terms of cost, speed and sensitivity to conventional total organic carbon (TOC) meters and specific UV-absorbance (SUVA) measurements. Hence there is strong interest in developing revised environmental regulations around the F-EEM technique, whose instruments can incidentally measure the SUVA and DOC parameters simultaneously. Importantly, the F-EEM technique, in contrast to single-point TOC and SUVA signals, can resolve CDOM classes and distinguish those that strongly cause DBPs. The F-EEM DBP prediction method can be applied to surface water sources to evaluate DBP potential as a function of point sources and reservoir depth profiles. It can also be applied in-line to rapidly adjust DOC removal processes including sedimentation-flocculation, microfiltration, reverse osmosis, and ozonation. Limitations and interferences for F-EEMs are discussed, including those common to SUVA and TOC, in contrast to the advantages: F-EEMs are less prone to interference from inorganic carbon and metal contamination and require little if any chemical preparation. In conclusion, the F-EEM method is discussed not only in terms of the DBP problem but also as a means of predicting (concurrently with DBP monitoring) organic membrane fouling in water-reuse and desalination plants.

  8. Integrating remote sensing with species distribution models; Mapping tamarisk invasions using the Software for Assisted Habitat Modeling (SAHM)

    USGS Publications Warehouse

    West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-01-01

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  9. Integrating Remote Sensing with Species Distribution Models; Mapping Tamarisk Invasions Using the Software for Assisted Habitat Modeling (SAHM).

    PubMed

    West, Amanda M; Evangelista, Paul H; Jarnevich, Catherine S; Young, Nicholas E; Stohlgren, Thomas J; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan

    2016-10-11

    Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.

  10. In line NIR quantification of film thickness on pharmaceutical pellets during a fluid bed coating process.

    PubMed

    Lee, Min-Jeong; Seo, Da-Young; Lee, Hea-Eun; Wang, In-Chun; Kim, Woo-Sik; Jeong, Myung-Yung; Choi, Guang J

    2011-01-17

    Along with the risk-based approach, process analytical technology (PAT) has emerged as one of the key elements in fully implementing QbD (quality-by-design). Near-infrared (NIR) spectroscopy has been extensively applied as an in-line/on-line analytical tool in the biomedical and chemical industries. In this study, the film thickness on pharmaceutical pellets was quantified using in-line NIR spectroscopy during a fluid-bed coating process. Precise monitoring of coating thickness and its prediction with a suitable control strategy are crucial to the quality assurance of solid dosage forms, including their dissolution characteristics. Pellets of a test formulation were manufactured and coated in a fluid bed by spraying a hydroxypropyl methylcellulose (HPMC) coating solution. NIR spectra were acquired via a fiber-optic probe during the coating process, followed by multivariate analysis utilizing partial least squares (PLS) calibration models. The actual coating thickness of the pellets was measured by two separate methods, confocal laser scanning microscopy (CLSM) and laser diffraction particle size analysis (LD-PSA). Both characterization methods gave superb correlation results, and all determination coefficient (R(2)) values exceeded 0.995. In addition, a prediction coating experiment over 70 min demonstrated that the end-point can be accurately designated via NIR in-line monitoring with appropriate calibration models. In conclusion, our approach combining in-line NIR monitoring with CLSM and LD-PSA can be applied as an effective PAT tool for fluid-bed pellet coating processes. Copyright © 2010 Elsevier B.V. All rights reserved.
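
    A hedged sketch of the multivariate calibration step is given below: a PLS model relating simulated NIR spectra to reference coating-thickness values, evaluated by cross-validated prediction. The simulated data, the number of latent variables, and the scikit-learn implementation are assumptions, not the published method.

    ```python
    # Hedged sketch: PLS calibration of coating thickness from (simulated) NIR spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    thickness = rng.uniform(5, 50, size=80)                # reference values, e.g. from CLSM
    spectra = np.outer(thickness, rng.normal(size=600)) \
              + rng.normal(scale=5.0, size=(80, 600))      # synthetic spectra

    pred = cross_val_predict(PLSRegression(n_components=4), spectra, thickness, cv=10)
    ss_res = np.sum((thickness - pred.ravel()) ** 2)
    ss_tot = np.sum((thickness - thickness.mean()) ** 2)
    print(f"cross-validated R^2 = {1 - ss_res / ss_tot:.3f}")
    ```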

  11. Work and retirement among a cohort of older men in the United States, 1966-1983.

    PubMed

    Hayward, M D; Grady, W R

    1990-08-01

    Multivariate increment-decrement working life tables are estimated for a cohort of older men in the United States for the period 1966-1983. The approach taken allows multiple processes to be simultaneously incorporated into a single model, resulting in a more realistic portrayal of a cohort's late-life labor force behavior. In addition, because the life table model is developed from multivariate hazard equations, we identify the effects of sociodemographic characteristics on the potentially complex process by which the labor force career is ended. In contrast to the assumed homogeneity of previous working life table analyses, the present study shows marked differences in labor force mobility and working and nonworking life expectancy according to occupation, class of worker, education, race, and marital status. We briefly discuss the implications of these findings for inequities of access to retirement, private and public pension consumption, and future changes in the retirement process.

  12. Reverse inference of memory retrieval processes underlying metacognitive monitoring of learning using multivariate pattern analysis.

    PubMed

    Stiers, Peter; Falbo, Luciana; Goulas, Alexandros; van Gog, Tamara; de Bruin, Anique

    2016-05-15

    Monitoring of learning is only accurate at some time after learning. It is thought that immediate monitoring is based on working memory, whereas later monitoring requires re-activation of stored items, yielding accurate judgements. Such interpretations are difficult to test because they require reverse inference, which presupposes specificity of brain activity for the hidden cognitive processes. We investigated whether multivariate pattern classification can provide this specificity. We used a word recall task to create single trial examples of immediate and long term retrieval and trained a learning algorithm to discriminate them. Next, participants performed a similar task involving monitoring instead of recall. The recall-trained classifier recognized the retrieval patterns underlying immediate and long term monitoring and classified delayed monitoring examples as long-term retrieval. This result demonstrates the feasibility of decoding cognitive processes, instead of their content. Copyright © 2016 Elsevier Inc. All rights reserved.
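
    The cross-task decoding idea can be sketched as follows, with random arrays standing in for the activity patterns; the classifier choice and preprocessing are assumptions rather than the study's pipeline.

    ```python
    # Hedged sketch: train on recall-task patterns, classify monitoring-task patterns.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X_recall = rng.normal(size=(120, 500))      # trials x voxels, recall task
    y_recall = rng.integers(0, 2, size=120)     # 0 = immediate, 1 = long-term retrieval
    X_monitor = rng.normal(size=(40, 500))      # monitoring-task trials to classify

    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    clf.fit(X_recall, y_recall)
    print(clf.predict(X_monitor))               # inferred retrieval process per trial
    ```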

  13. Robustness of reduced-order multivariable state-space self-tuning controller

    NASA Technical Reports Server (NTRS)

    Yuan, Zhuzhi; Chen, Zengqiang

    1994-01-01

    In this paper, we present a quantitative analysis of the robustness of a reduced-order pole-assignment state-space self-tuning controller for a multivariable adaptive control system in which the order of the real process is higher than that of the model used in the controller design. The result of the stability analysis shows that, under a specific bounded modelling error, the adaptively controlled closed-loop real system via the reduced-order state-space self-tuner is BIBO stable in the presence of unmodelled dynamics.

  14. Processes of Heat Transfer in Rheologically Unstable Mixtures of Organic Origin

    NASA Astrophysics Data System (ADS)

    Tkachenko, S. I.; Pishenina, N. V.; Rumyantseva, T. Yu.

    2014-05-01

    The dependence of the coefficient of heat transfer from the heat-exchange surface to a rheologically unstable organic mixture on the thermohydrodynamic state of the mixture and its prehistory has been established. A method for multivariant investigation of the process of heat transfer in compound organic mixtures has been proposed; this method makes it possible to evaluate the character and peculiarities of change in the rheological structure of the mixture as functions of the thermohydrodynamic conditions of its treatment. It has been shown that the intensity of heat transfer in a biotechnological system for the production of energy carriers can be evaluated at the design stage by multivariant investigation of the heat-transfer intensity in rheologically unstable organic mixtures, taking their prehistory into account.

  15. Numerical Simulation and Optimization of Directional Solidification Process of Single Crystal Superalloy Casting

    PubMed Central

    Zhang, Hang; Xu, Qingyan; Liu, Baicheng

    2014-01-01

    The rapid development of numerical modeling techniques has led to more accurate results in modeling metal solidification processes. In this study, the cellular automaton-finite difference (CA-FD) method was used to simulate the directional solidification (DS) process of single crystal (SX) superalloy blade samples. Experiments were carried out to validate the simulation results. Meanwhile, an intelligent model based on fuzzy control theory was built to optimize the complicated DS process. Several key parameters, such as the mushy zone width and the temperature difference at the cast-mold interface, were selected as input variables. The input variables were processed with the multivariable fuzzy rule to obtain the output adjustment of the withdrawal rate (v), a key technological parameter. The multivariable fuzzy rule was built based on structural features of the casting, such as the relationship between the section area and the delay time of the temperature response to a change in v, as well as the professional experience of the operator. The fuzzy control model coupled with the CA-FD method could then be used to optimize v in real time during the manufacturing process. The optimized process proved to be more flexible and adaptive, yielding a steady, stray-grain-free DS process. PMID:28788535

  16. Multivariate statistical monitoring as applied to clean-in-place (CIP) and steam-in-place (SIP) operations in biopharmaceutical manufacturing.

    PubMed

    Roy, Kevin; Undey, Cenk; Mistretta, Thomas; Naugle, Gregory; Sodhi, Manbir

    2014-01-01

    Multivariate statistical process monitoring (MSPM) is becoming increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add to delays and corrective actions may require additional setup times. Moreover, this conventional approach does not take interactive effects of process variables into account and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers.

  17. Confidence limits for contribution plots in multivariate statistical process control using bootstrap estimates.

    PubMed

    Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund

    2016-02-18

    In Multivariate Statistical Process Control, when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits (CLs) for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy to estimate CLs is compared to the previously reported CLs for contribution plots. An industrial batch process dataset was used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
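
    A simplified sketch of the bootstrap idea follows: NOC runs are resampled with replacement, a PCA model is refitted on each resample, and percentile limits are taken over the resulting per-variable residual (Q) contributions. The contribution definition and limit construction here are generic assumptions, not the paper's exact procedure.

    ```python
    # Hedged sketch: percentile bootstrap limits for PCA residual contributions.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    noc = rng.normal(size=(60, 30))                 # NOC runs x process variables

    def residual_contributions(model, mean, x):
        """Per-variable squared residual contribution of one observation."""
        xc = x - mean
        recon = model.inverse_transform(model.transform(xc[None, :]))[0]
        return (xc - recon) ** 2

    boots = []
    for _ in range(500):
        sample = noc[rng.integers(0, len(noc), len(noc))]   # resample runs with replacement
        mean = sample.mean(0)
        pca = PCA(n_components=3).fit(sample - mean)
        boots.append(np.array([residual_contributions(pca, mean, x) for x in sample]))

    upper_cl = np.percentile(np.concatenate(boots), 95, axis=0)   # per-variable 95% limit
    print(upper_cl.shape)                                         # (30,)
    ```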

  18. Marked point process for modelling seismic activity (case study in Sumatra and Java)

    NASA Astrophysics Data System (ADS)

    Pratiwi, Hasih; Sulistya Rini, Lia; Wayan Mangku, I.

    2018-05-01

    Earthquakes are natural phenomena that are random and irregular in space and time. Forecasting the occurrence of an earthquake at a given location remains difficult, so earthquake forecasting methodology continues to be developed from both the seismological and the stochastic points of view. A point process approach can be used to describe such random phenomena in space and time. There are two types of point processes: temporal point processes and spatial point processes. A temporal point process relates to events observed over time as a sequence of time points, whereas a spatial point process describes the location of objects in two- or three-dimensional space. The points of a point process can be labelled with additional information called marks. A marked point process can be considered as a pair (x, m), where x is the location of the point and m is the mark attached to it. This study aims to model a marked point process indexed by time using earthquake data from Sumatra Island and Java Island. The model can be used to analyse seismic activity through its intensity function, which conditions on the history of the process up to time t. Based on data obtained from the U.S. Geological Survey from 1973 to 2017 with a magnitude threshold of 5, we obtained maximum likelihood estimates for the parameters of the intensity function. The estimated model parameters show that seismic activity in Sumatra Island is greater than in Java Island.
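
    The maximum likelihood estimation mentioned above rests on the standard log-likelihood of a marked temporal point process with conditional intensity λ(t | H_t) and mark density f(m | t, H_t). The generic form is sketched below; it is not the paper's specific parametrisation.

    ```latex
    % Generic log-likelihood of a marked temporal point process observed on [0, T]
    % (standard form; the particular intensity model of the paper is not reproduced):
    \log L = \sum_{i\,:\,t_i \le T} \log \lambda(t_i \mid H_{t_i})
           + \sum_{i\,:\,t_i \le T} \log f(m_i \mid t_i, H_{t_i})
           - \int_{0}^{T} \lambda(t \mid H_t)\,\mathrm{d}t .
    ```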

  19. Applications of modern statistical methods to analysis of data in physical science

    NASA Astrophysics Data System (ADS)

    Wicker, James Eric

    Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970's, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960's, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960's and 1970's respectively, formed the basis of multivariate cluster analysis methodology for many years. However, several shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic algorithm based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We will showcase how these algorithms can be used to process multivariate data from astronomical observations.

  20. Boiling points of halogenated aliphatic compounds: a quantitative structure-property relationship for prediction and validation.

    PubMed

    Oberg, Tomas

    2004-01-01

    Halogenated aliphatic compounds have many technical uses, but substances within this group are also ubiquitous environmental pollutants that can affect the ozone layer and contribute to global warming. The establishment of quantitative structure-property relationships is of interest not only to fill in gaps in the available database but also to validate experimental data already acquired. The three-dimensional structures of 240 compounds were modeled with molecular mechanics prior to the generation of empirical descriptors. Two bilinear projection methods, principal component analysis (PCA) and partial-least-squares regression (PLSR), were used to identify outliers. PLSR was subsequently used to build a multivariate calibration model by extracting the latent variables that describe most of the covariation between the molecular structure and the boiling point. Boiling points were also estimated with an extension of the group contribution method of Stein and Brown.

  1. Point-of-Sale Tobacco Marketing to Youth in New York State.

    PubMed

    Waddell, Elizabeth Needham; Sacks, Rachel; Farley, Shannon M; Johns, Michael

    2016-09-01

    To assess youth exposure to menthol versus nonmenthol cigarette advertising, we examined whether menthol cigarette promotions are more likely in neighborhoods with relatively high youth populations. We linked 2011 New York State Retail Advertising Tobacco Survey observational data with U.S. Census and American Community Survey demographic data. Multivariable models assessed the relationship between neighborhood youth population and point-of-sale cigarette promotions for three brands of cigarettes, adjusting for neighborhood demographic characteristics including race/ethnicity and poverty. Menthol cigarette point-of-sale marketing was more likely in neighborhoods with higher proportions of youth, adjusting for presence of nonmenthol brand marketing, neighborhood race/ethnicity, neighborhood poverty, and urban geography. Data from the 2011 Retail Advertising Tobacco Study linked to block level census data clearly indicate that price reduction promotions for menthol cigarettes are disproportionately targeted to youth markets in New York State. Published by Elsevier Inc.

  2. A Search for Point Sources of EeV Photons

    NASA Astrophysics Data System (ADS)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Barber, K. B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Coutu, S.; Covault, C. E.; Criss, A.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Díaz Castro, M. L.; Diep, P. N.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; Dorosti Hasankiadeh, Q.; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fox, B. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Fuji, T.; Gaior, R.; García, B.; Garcia Roca, S. T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gomez Albarracin, F.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; Gookin, B.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Islo, K.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; La Rosa, G.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Malacari, M.; Maldera, S.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Mariş, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, A. 
J.; Matthews, J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Messina, S.; Meyhandan, R.; Mićanović, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morello, C.; Moreno, J. C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, L.; Ochilo, L.; Olinto, A.; Oliveira, M.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Pastor, S.; Paul, T.; Pech, M.; Peķala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Peters, C.; Petrera, S.; Petrolini, A.; Petrov, Y.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez Cabo, I.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Rühle, C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Scholten, O.; Schoorlemmer, H.; Schovánek, P.; Schulz, A.; Schulz, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanič, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. A.; Tapia, A.; Tartare, M.; Thao, N. T.; Theodoro, V. M.; Tiffenberg, J.; Timmermans, C.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Whelan, B. J.; Widom, A.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Auger Collaboration102, The Pierre

    2014-07-01

    Measurements of air showers made using the hybrid technique developed with the fluorescence and surface detectors of the Pierre Auger Observatory allow a sensitive search for point sources of EeV photons anywhere in the exposed sky. A multivariate analysis reduces the background of hadronic cosmic rays. The search is sensitive to a declination band from -85° to +20°, in an energy range from 10^17.3 eV to 10^18.5 eV. No photon point source has been detected. An upper limit on the photon flux has been derived for every direction. The mean value of the energy flux limit that results from this, assuming a photon spectral index of -2, is 0.06 eV cm^-2 s^-1, and no celestial direction exceeds 0.25 eV cm^-2 s^-1. These upper limits constrain scenarios in which EeV cosmic ray protons are emitted by non-transient sources in the Galaxy.

  3. The intrinsic dependence structure of peak, volume, duration, and average intensity of hyetographs and hydrographs

    NASA Astrophysics Data System (ADS)

    Serinaldi, Francesco; Kilsby, Chris G.

    2013-06-01

    The information contained in hyetographs and hydrographs is often synthesized by using key properties such as the peak or maximum value Xp, volume V, duration D, and average intensity I. These variables play a fundamental role in hydrologic engineering as they are used, for instance, to define design hyetographs and hydrographs as well as to model and simulate the rainfall and streamflow processes. Given their inherent variability and the empirical evidence of the presence of a significant degree of association, such quantities have been studied as correlated random variables suitable to be modeled by multivariate joint distribution functions. The advent of copulas in geosciences simplified the inference procedures allowing for splitting the analysis of the marginal distributions and the study of the so-called dependence structure or copula. However, the attention paid to the modeling task has overlooked a more thorough study of the true nature and origin of the relationships that link Xp, V, D, and I. In this study, we apply a set of ad hoc bootstrap algorithms to investigate these aspects by analyzing the hyetographs and hydrographs extracted from 282 daily rainfall series from central eastern Europe, three 5 min rainfall series from central Italy, 80 daily streamflow series from the continental United States, and two sets of 200 simulated universal multifractal time series. Our results show that all the pairwise dependence structures between Xp, V, D, and I exhibit some key properties that can be reproduced by simple bootstrap algorithms that rely on a standard univariate resampling without resort to multivariate techniques. Therefore, the strong similarities between the observed dependence structures and the agreement between the observed and bootstrap samples suggest the existence of a numerical generating mechanism based on the superposition of the effects of sampling data at finite time steps and the process of summing realizations of independent random variables over random durations. We also show that the pairwise dependence structures are weakly dependent on the internal patterns of the hyetographs and hydrographs, meaning that the temporal evolution of the rainfall and runoff events marginally influences the mutual relationships of Xp, V, D, and I. Finally, our findings point out that subtle and often overlooked deterministic relationships between the properties of the event hyetographs and hydrographs exist. Confusing these relationships with genuine stochastic relationships can lead to an incorrect application of multivariate distributions and copulas and to misleading results.
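
    The generating mechanism the authors point to, finite-step sampling combined with summation of independent values over random durations, can be illustrated in a few lines; the distributions and the event construction below are placeholders, not the paper's algorithms.

    ```python
    # Hedged sketch: reproduce dependence among Xp, V, D and I by univariate resampling.
    import numpy as np
    from scipy.stats import kendalltau

    rng = np.random.default_rng(0)
    daily = rng.gamma(shape=0.8, scale=5.0, size=5000)     # stand-in for observed daily values

    events = []
    for _ in range(1000):
        d = rng.integers(1, 15)                            # random event duration (days)
        values = rng.choice(daily, size=d, replace=True)   # univariate resampling
        events.append((values.max(), values.sum(), d, values.sum() / d))

    xp, v, dur, inten = map(np.array, zip(*events))
    print("tau(Xp, V) =", kendalltau(xp, v)[0])
    print("tau(V, D)  =", kendalltau(v, dur)[0])
    ```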

  4. Some New Approaches to Multivariate Probability Distributions.

    DTIC Science & Technology

    1986-12-01

    Krishnaiah (1977). The following example may serve as an illustration of this point. EXAMPLE 2 (Fréchet's bivariate continuous distribution) ... the error in the theorem of Prakasa Rao (1974) and to Dr. P.R. Krishnaiah for his valuable comments on the initial draft, his monumental patience and ... M. and Proschan, F. (1984). Nonparametric Concepts and Methods in Reliability, Handbook of Statistics, 4, 613-655, (eds. P.R. Krishnaiah and P.K

  5. Behavioral and Psychosocial Correlates of HIV Testing Among Male Clients of Female Sex Workers in Tijuana, Mexico.

    PubMed

    Fleming, Paul J; Patterson, Thomas L; Chavarin, Claudia V; Semple, Shirley J; Magis-Rodriguez, Carlos; Pitpitan, Eileen V

    2017-08-01

    We use data collected from a sample of 400 male clients of female sex workers (FSW) to examine their HIV testing behavior. We present frequencies of HIV testing and used bivariate and multivariable analyses to assess its socio-demographic, behavioral, and psychosocial correlates. We found that the majority (55 %) of male clients of FSW in Tijuana, Mexico had never had an HIV test and the prevalence of HIV testing within the past year was low (9 %). In multivariable analyses, significant correlates of having ever tested for HIV were higher age, higher HIV knowledge score, lower sexual compulsiveness score, lower misogynistic attitudes score, having a condom break during sex with a FSW, and higher frequency of sex with a FSW while she was high. Our findings represent an important starting point for developing effective interventions to address the need to promote HIV testing among this population.

  6. The upper respiratory pyramid: early factors and later treatment utilization in World Trade Center exposed firefighters.

    PubMed

    Niles, Justin K; Webber, Mayris P; Liu, Xiaoxue; Zeig-Owens, Rachel; Hall, Charles B; Cohen, Hillel W; Glaser, Michelle S; Weakley, Jessica; Schwartz, Theresa M; Weiden, Michael D; Nolan, Anna; Aldrich, Thomas K; Glass, Lara; Kelly, Kerry J; Prezant, David J

    2014-08-01

    We investigated early post 9/11 factors that could predict rhinosinusitis healthcare utilization costs up to 11 years later in 8,079 World Trade Center-exposed rescue/recovery workers. We used bivariate and multivariate analytic techniques to investigate utilization outcomes; we also used a pyramid framework to describe rhinosinusitis healthcare groups at early (by 9/11/2005) and late (by 9/11/2012) time points. Multivariate models showed that pre-9/11/2005 chronic rhinosinusitis diagnoses and nasal symptoms predicted final year healthcare utilization outcomes more than a decade after WTC exposure. The relative proportion of workers on each pyramid level changed significantly during the study period. Diagnoses of chronic rhinosinusitis within 4 years of a major inhalation event only partially explain future healthcare utilization. Exposure intensity, early symptoms and other factors must also be considered when anticipating future healthcare needs. © 2014 Wiley Periodicals, Inc.

  7. Prolonged instability prior to a regime shift

    USGS Publications Warehouse

    Spanbauer, Trisha; Allen, Craig R.; Angeler, David G.; Eason, Tarsha; Fritz, Sherilyn C.; Garmestani, Ahjond S.; Nash, Kirsty L.; Stone, Jeffery R.

    2014-01-01

    Regime shifts are generally defined as the point of ‘abrupt’ change in the state of a system. However, a seemingly abrupt transition can be the product of a system reorganization that has been ongoing much longer than is evident in statistical analysis of a single component of the system. Using both univariate and multivariate statistical methods, we tested a long-term high-resolution paleoecological dataset with a known change in species assemblage for a regime shift. Analysis of this dataset with Fisher Information and multivariate time series modeling showed that there was a ~2000 year period of instability prior to the regime shift. This period of instability and the subsequent regime shift coincide with regional climate change, indicating that the system is undergoing extrinsic forcing. Paleoecological records offer a unique opportunity to test tools for the detection of thresholds and stable-states, and thus to examine the long-term stability of ecosystems over periods of multiple millennia.

  8. Simulation of a turbofan engine for evaluation of multivariable optimal control concepts. [(computerized simulation)]

    NASA Technical Reports Server (NTRS)

    Seldner, K.

    1976-01-01

    The development of control systems for jet engines requires a real-time computer simulation. The simulation provides an effective tool for evaluating control concepts and problem areas prior to actual engine testing. The development and use of a real-time simulation of the Pratt and Whitney F100-PW100 turbofan engine are described. The simulation was used in a multivariable optimal controls research program based on linear quadratic regulator (LQR) theory. The simulation is used to generate linear engine models at selected operating points and to evaluate the control algorithm. To reduce the complexity of the design, it is desirable to reduce the order of the linear model; a technique for doing so is discussed, and selected results from the high- and low-order models are compared. The LQR control algorithms can be programmed on a digital computer, which will control the engine simulation over the desired flight envelope.
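
    As a hedged illustration of the LQR design step applied to such linear engine models, the sketch below solves the continuous algebraic Riccati equation for an arbitrary two-state placeholder model; the matrices are not taken from the F100 simulation.

    ```python
    # Hedged sketch: LQR state-feedback gain for x' = Ax + Bu with placeholder matrices.
    import numpy as np
    from scipy.linalg import solve_continuous_are

    A = np.array([[-1.0, 0.5], [0.0, -2.0]])   # hypothetical linearised model
    B = np.array([[0.0], [1.0]])
    Q = np.eye(2)                              # state weighting
    R = np.array([[0.1]])                      # control weighting

    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)            # optimal gain: u = -K x
    print(K)
    ```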

  9. Dissolution comparisons using a Multivariate Statistical Distance (MSD) test and a comparison of various approaches for calculating the measurements of dissolution profile comparison.

    PubMed

    Cardot, J-M; Roudier, B; Schütz, H

    2017-07-01

    The f2 test is generally used for comparing dissolution profiles. In cases of high variability, the f2 test is not applicable, and the Multivariate Statistical Distance (MSD) test is frequently proposed as an alternative by the FDA and EMA. The guidelines provide only general recommendations. MSD tests can be performed either on raw data, with or without time as a variable, or on the parameters of fitted models. In addition, the data can be limited, as in the case of the f2 test, to dissolutions of up to 85%, or all available data can be used. In the context of the present paper, the recommended calculation included all raw dissolution data up to the first point greater than 85% as variables, without the various times as parameters. The proposed MSD overcomes several drawbacks found in other methods.
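
    For reference, the f2 similarity factor discussed above can be computed as in the sketch below; the example profiles are invented, and the MSD calculation itself is not reproduced here.

    ```python
    # The f2 similarity factor between mean dissolution profiles R and T (same time points).
    import numpy as np

    def f2(reference, test):
        r, t = np.asarray(reference, float), np.asarray(test, float)
        msd = np.mean((r - t) ** 2)                    # mean squared difference
        return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

    # Profiles are conventionally judged similar when f2 >= 50 (hypothetical data):
    print(f2([20, 45, 70, 88], [18, 42, 67, 85]))
    ```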

  10. The Upper Respiratory Pyramid: Early Factors and Later Treatment Utilization in World Trade Center Exposed Firefighters

    PubMed Central

    Niles, Justin K.; Webber, Mayris P.; Liu, Xiaoxue; Zeig-Owens, Rachel; Hall, Charles B.; Cohen, Hillel W.; Glaser, Michelle S.; Weakley, Jessica; Schwartz, Theresa M.; Weiden, Michael D.; Nolan, Anna; Aldrich, Thomas K.; Glass, Lara; Kelly, Kerry J.; Prezant, David J.

    2015-01-01

    Background We investigated early post 9/11 factors that could predict rhinosinusitis healthcare utilization costs up to 11 years later in 8,079 World Trade Center-exposed rescue/recovery workers. Methods We used bivariate and multivariate analytic techniques to investigate utilization outcomes; we also used a pyramid framework to describe rhinosinusitis healthcare groups at early (by 9/11/2005) and late (by 9/11/2012) time points. Results Multivariate models showed that pre-9/11/2005 chronic rhinosinusitis diagnoses and nasal symptoms predicted final year healthcare utilization outcomes more than a decade after WTC exposure. The relative proportion of workers on each pyramid level changed significantly during the study period. Conclusions Diagnoses of chronic rhinosinusitis within 4 years of a major inhalation event only partially explain future healthcare utilization. Exposure intensity, early symptoms and other factors must also be considered when anticipating future healthcare needs. PMID:24898816

  11. Pneumonia in multiple injured patients: a prospective controlled trial on early prediction using clinical and immunological parameters.

    PubMed

    Andermahr, J; Greb, A; Hensler, T; Helling, H J; Bouillon, B; Sauerland, S; Rehm, K E; Neugebauer, E

    2002-05-01

    In a prospective trial, 266 multiple injured patients were included to evaluate clinical risk factors and immune parameters related to pneumonia. Clinical and humoral parameters were assessed and multivariate analysis performed. The multivariate analysis (odds ratio with 95% confidence interval (CI)) revealed male gender (3.65), traumatic brain injury (TBI) (2.52), thorax trauma (AIS(thorax) > or = 3) (2.05), antibiotic prophylaxis (1.30), injury severity score (ISS) (1.03 per ISS point) and age (1.02 per year) as risk factors for pneumonia. The main pathogens were Acinetobacter baumannii (40%) and Staphylococcus aureus (25%). A tendency towards higher procalcitonin (PCT) and interleukin (IL)-6 levels two days after trauma was observed for pneumonia patients. The immune parameters (PCT, IL-6, IL-10, soluble tumor necrosis factor p-55 and p-75) could not confirm the diagnosis of pneumonia earlier than the clinical parameters.

  12. Prefrontal gray matter volume mediates genetic risks for obesity.

    PubMed

    Opel, N; Redlich, R; Kaehler, C; Grotegerd, D; Dohm, K; Heindel, W; Kugel, H; Thalamuthu, A; Koutsouleris, N; Arolt, V; Teuber, A; Wersching, H; Baune, B T; Berger, K; Dannlowski, U

    2017-05-01

    Genetic and neuroimaging research has identified neurobiological correlates of obesity. However, evidence for an integrated model of genetic risk and brain structural alterations in the pathophysiology of obesity is still absent. Here we investigated the relationship between polygenic risk for obesity, gray matter structure and body mass index (BMI) by the use of univariate and multivariate analyses in two large, independent cohorts (n=330 and n=347). Higher BMI and higher polygenic risk for obesity were significantly associated with medial prefrontal gray matter decrease, and prefrontal gray matter was further shown to significantly mediate the effect of polygenic risk for obesity on BMI in both samples. Building on this, the successful individualized prediction of BMI by means of multivariate pattern classification algorithms trained on whole-brain imaging data and external validations in the second cohort points to potential clinical applications of this imaging trait marker.

  13. Specific prognostic factors for secondary pancreatic infection in severe acute pancreatitis.

    PubMed

    Armengol-Carrasco, M; Oller, B; Escudero, L E; Roca, J; Gener, J; Rodríguez, N; del Moral, P; Moreno, P

    1999-01-01

    The aim of the present study was to investigate whether there are specific prognostic factors to predict the development of secondary pancreatic infection (SPI) in severe acute pancreatitis, in order to perform a computed tomography-fine needle aspiration with bacteriological sampling at the right moment and confirm the diagnosis. Twenty-five clinical and laboratory parameters were determined sequentially in 150 patients with severe acute pancreatitis (SAP), and univariate and multivariate regression analyses were performed looking for correlations with the development of SPI. Only the APACHE II score and C-reactive protein levels were related to the development of SPI in the multivariate analysis. A regression equation was designed using these two parameters, and empiric cut-off points defined the subgroup of patients at high risk of developing secondary pancreatic infection. The results showed that it is possible to predict SPI during SAP, allowing bacteriological confirmation and early treatment of this severe condition.

  14. Rheumatoid Arthritis Risk Allele PTPRC Is Also Associated With Response to Anti–Tumor Necrosis Factor α Therapy

    PubMed Central

    Cui, Jing; Saevarsdottir, Saedis; Thomson, Brian; Padyukov, Leonid; van der Helm-van Mil, Annette H. M.; Nititham, Joanne; Hughes, Laura B.; de Vries, Niek; Raychaudhuri, Soumya; Alfredsson, Lars; Askling, Johan; Wedrén, Sara; Ding, Bo; Guiducci, Candace; Wolbink, Gert Jan; Crusius, J. Bart A.; van der Horst-Bruinsma, Irene E.; Herenius, Marieke; Weinblatt, Michael E.; Shadick, Nancy A.; Worthington, Jane; Batliwalla, Franak; Kern, Marlena; Morgan, Ann W.; Wilson, Anthony G.; Isaacs, John D.; Hyrich, Kimme; Seldin, Michael F.; Moreland, Larry W.; Behrens, Timothy W.; Allaart, Cornelia F.; Criswell, Lindsey A.; Huizinga, Tom W. J.; Tak, Paul P.; Bridges, S. Louis; Toes, Rene E. M.; Barton, Anne; Klareskog, Lars; Gregersen, Peter K.; Karlson, Elizabeth W.; Plenge, Robert M.

    2013-01-01

    Objective Anti–tumor necrosis factor α (anti-TNF) therapy is a mainstay of treatment in rheumatoid arthritis (RA). The aim of the present study was to test established RA genetic risk factors to determine whether the same alleles also influence the response to anti-TNF therapy. Methods A total of 1,283 RA patients receiving etanercept, infliximab, or adalimumab therapy were studied from among an international collaborative consortium of 9 different RA cohorts. The primary end point compared RA patients with a good treatment response according to the European League Against Rheumatism (EULAR) response criteria (n = 505) with RA patients considered to be nonresponders (n = 316). The secondary end point was the change from baseline in the level of disease activity according to the Disease Activity Score in 28 joints (ΔDAS28). Clinical factors such as age, sex, and concomitant medications were tested as possible correlates of treatment response. Thirty-one single-nucleotide polymorphisms (SNPs) associated with the risk of RA were genotyped and tested for any association with treatment response, using univariate and multivariate logistic regression models. Results Of the 31 RA-associated risk alleles, a SNP at the PTPRC (also known as CD45) gene locus (rs10919563) was associated with the primary end point, a EULAR good response versus no response (odds ratio [OR] 0.55, P = 0.0001 in the multivariate model). Similar results were obtained using the secondary end point, the ΔDAS28 (P = 0.0002). There was suggestive evidence of a stronger association in autoantibody-positive patients with RA (OR 0.55, 95% confidence interval [95% CI] 0.39–0.76) as compared with autoantibody-negative patients (OR 0.90, 95% CI 0.41–1.99). Conclusion Statistically significant associations were observed between the response to anti-TNF therapy and an RA risk allele at the PTPRC gene locus. Additional studies will be required to replicate this finding in additional patient collections. PMID:20309874

  15. Gastrointestinal Dose-Histogram Effects in the Context of Dose-Volume–Constrained Prostate Radiation Therapy: Analysis of Data From the RADAR Prostate Radiation Therapy Trial

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ebert, Martin A., E-mail: Martin.Ebert@health.wa.gov.au; School of Physics, University of Western Australia, Perth, Western Australia; Foo, Kerwyn

    Purpose: To use a high-quality multicenter trial dataset to determine dose-volume effects for gastrointestinal (GI) toxicity following radiation therapy for prostate carcinoma. Influential dose-volume histogram regions were to be determined as functions of dose, anatomical location, toxicity, and clinical endpoint. Methods and Materials: Planning datasets for 754 participants in the TROG 03.04 RADAR trial were available, with Late Effects of Normal Tissues (LENT) Subjective, Objective, Management, and Analytic (SOMA) toxicity assessment to a median of 72 months. A rank sum method was used to define dose-volume cut-points as near-continuous functions of dose to 3 GI anatomical regions, together with a comprehensive assessment of significance. Univariate and multivariate ordinal regression was used to assess the importance of cut-points at each dose. Results: Dose ranges providing significant cut-points tended to be consistent with those showing significant univariate regression odds-ratios (representing the probability of a unitary increase in toxicity grade per percent relative volume). Ranges of significant cut-points for rectal bleeding validated previously published results. Separation of the lower GI anatomy into complete anorectum, rectum, and anal canal showed the impact of mid-low doses to the anal canal on urgency and tenesmus, completeness of evacuation and stool frequency, and mid-high doses to the anorectum on bleeding and stool frequency. Derived multivariate models emphasized the importance of the high-dose region of the anorectum and rectum for rectal bleeding and mid- to low-dose regions for diarrhea and urgency and tenesmus, and low-to-mid doses to the anal canal for stool frequency, diarrhea, evacuation, and bleeding. Conclusions: Results confirm anatomical dependence of specific GI toxicities. They provide an atlas summarizing dose-histogram effects and derived constraints as functions of anatomical region, dose, toxicity, and endpoint for informing future radiation therapy planning.

  16. Structural Equation Modeling of Multivariate Time Series

    ERIC Educational Resources Information Center

    du Toit, Stephen H. C.; Browne, Michael W.

    2007-01-01

    The covariance structure of a vector autoregressive process with moving average residuals (VARMA) is derived. It differs from other available expressions for the covariance function of a stationary VARMA process and is compatible with current structural equation methodology. Structural equation modeling programs, such as LISREL, may therefore be…

  17. Relationships among participant international prostate symptom score, benign prostatic hyperplasia impact index changes and global ratings of change in a trial of phytotherapy in men with lower urinary tract symptoms.

    PubMed

    Barry, Michael J; Cantor, Alan; Roehrborn, Claus G

    2013-03-01

    We related changes in American Urological Association symptom index scores with bother measures and global ratings of change in men with lower urinary tract symptoms who were enrolled in a saw palmetto trial. To be eligible for the study, men had to be 45 years old or older, with a peak uroflow of 4 ml per second or greater and an American Urological Association symptom index score of 8 to 24. Participants self-administered the American Urological Association symptom index, International Prostate Symptom Score quality of life item, Benign Prostatic Hyperplasia Impact Index and 2 global change questions at baseline, and at 24, 48 and 72 weeks. In 357 participants, global ratings of a little better were associated with a mean decrease in American Urological Association symptom index scores of 2.8 to 4.1 points across the 3 time points. The analogous range for mean decreases in Benign Prostatic Hyperplasia Impact Index scores was 1.0 to 1.7 points, and for the International Prostate Symptom Score quality of life item it was 0.5 to 0.8 points. At 72 weeks, for the first global change question, each change measure discriminated between participants who rated themselves at least a little better vs unchanged or worse 70% to 72% of the time. A multivariate model increased discrimination to 77%. For the second global change question, each change measure correctly discriminated ratings of at least a little better vs unchanged or worse 69% to 74% of the time, and a multivariate model increased discrimination to 79%. Changes in American Urological Association symptom index scores could discriminate between participants rating themselves at least a little better vs unchanged or worse. Our findings support the practice of powering studies to detect group mean differences in American Urological Association symptom index scores of at least 3 points. Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  18. Coating process optimization through in-line monitoring for coating weight gain using Raman spectroscopy and design of experiments.

    PubMed

    Kim, Byungsuk; Woo, Young-Ah

    2018-05-30

    In this study, the authors developed a real-time Process Analytical Technology (PAT) method for a coating process by applying in-line Raman spectroscopy to evaluate the coating weight gain, i.e., a quantitative analysis of the film coating layer. The wide area illumination (WAI) Raman probe was connected to the pan coater for real-time monitoring of changes in the weight gain of coating layers. Under the proposed in-line Raman scheme, a non-contact, non-destructive analysis was performed using WAI Raman probes with a spot size of 6 mm. The in-line Raman probe maintained a focal length of 250 mm, and a compressed air line was designed to protect the lens surface from spray droplets. Design of Experiments (DOE) was applied to identify factors affecting the background of the Raman spectra under laser irradiation. The factors selected for DOE were the strength of the compressed air connected to the probe and the shielding of light by the transparent door connecting the probe to the pan coater. To develop a quantitative model, partial least squares (PLS) models were built as multivariate calibrations based on the three regions showing the specificity of TiO2, individually or in combination. For the three single peaks (636 cm-1, 512 cm-1, 398 cm-1), the least squares method (LSM) was applied to develop three univariate quantitative analysis models. The best multivariate quantitative model, with a single PLS factor, gave the lowest RMSEP values of 0.128, 0.129, and 0.125 for the prediction batches. When LSM was applied to the single peak at 636 cm-1, the univariate quantitative model with an R2 of 0.9863, a slope of 0.5851, and a y-intercept of 0.8066 had the lowest RMSEP values of 0.138, 0.144, and 0.153 for the prediction batches. The in-line Raman spectroscopic method for the analysis of coating weight gain was verified by considering system suitability and parameters such as specificity, range, linearity, accuracy, and precision in accordance with ICH Q2 on method validation. The proposed in-line Raman spectroscopy can be utilized as a PAT tool for product quality assurance, as it offers real-time monitoring of quantitative changes in coating weight gain and process end-points during the film coating process. Copyright © 2018 Elsevier B.V. All rights reserved.
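
    The calibration strategy described above can be sketched in a few lines. The example below is a hedged illustration with synthetic data, not the authors' code: variable names such as `spectra` and `weight_gain` are invented, a scikit-learn PLS model with one latent factor stands in for the paper's multivariate calibration, and an ordinary least-squares fit on a single peak serves as the univariate counterpart.

```python
# Hypothetical sketch: PLS (multivariate) vs. single-peak (univariate) calibration
# for predicting coating weight gain from Raman spectra. Data are synthetic;
# names such as `spectra` and `weight_gain` are illustrative, not from the paper.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 60, 500
weight_gain = rng.uniform(0.0, 4.0, n_samples)            # % weight gain (synthetic)
peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 250) / 8.0) ** 2)
spectra = weight_gain[:, None] * peak + 0.05 * rng.standard_normal((n_samples, n_wavenumbers))

train, test = np.arange(0, 40), np.arange(40, 60)

# Multivariate calibration: PLS with a single latent factor (cf. "a single PLS factor").
pls = PLSRegression(n_components=1).fit(spectra[train], weight_gain[train])
rmsep_pls = np.sqrt(mean_squared_error(weight_gain[test], pls.predict(spectra[test]).ravel()))

# Univariate calibration: ordinary least squares on one peak intensity.
peak_intensity = spectra[:, 250].reshape(-1, 1)
lsm = LinearRegression().fit(peak_intensity[train], weight_gain[train])
rmsep_lsm = np.sqrt(mean_squared_error(weight_gain[test], lsm.predict(peak_intensity[test])))

print(f"RMSEP (PLS, 1 factor): {rmsep_pls:.3f}")
print(f"RMSEP (univariate LSM): {rmsep_lsm:.3f}")
```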

  19. In situ X-ray diffraction analysis of (CFx)n batteries: signal extraction by multivariate analysis

    DOE PAGES

    Rodriguez, Mark A.; Keenan, Michael R.; Nagasubramanian, Ganesan

    2007-11-10

    In this study, the (CFx)n cathode reaction during discharge has been investigated using in situ X-ray diffraction (XRD). Mathematical treatment of the in situ XRD data set was performed using multivariate curve resolution with alternating least squares (MCR–ALS), a technique of multivariate analysis. MCR–ALS analysis successfully separated the relatively weak XRD signal intensity due to the chemical reaction from the other inert cell component signals. The resulting dynamic reaction component revealed the loss of (CFx)n cathode signal together with the simultaneous appearance of LiF by-product intensity. Careful examination of the XRD data set revealed an additional dynamic component which may be associated with the formation of an intermediate compound during the discharge process.
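
    As a rough illustration of how MCR-ALS factorizes such a data set, the sketch below alternates two least-squares solves with a crude non-negativity constraint on a synthetic two-component matrix. It is not the authors' implementation (dedicated packages such as pymcr provide more complete constraint handling); all names and data are invented.

```python
# Minimal MCR-ALS sketch (not the authors' implementation): factor a data matrix
# D (scans x diffraction angles) into D ~ C @ S by alternating least squares with
# a simple non-negativity constraint. Synthetic two-component data for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_scans, n_channels = 50, 300

# Synthetic "pure" signals S_true (e.g., reactant and product patterns) and profiles C_true.
x = np.linspace(0, 1, n_channels)
S_true = np.vstack([np.exp(-((x - 0.3) / 0.05) ** 2),
                    np.exp(-((x - 0.7) / 0.05) ** 2)])
t = np.linspace(0, 1, n_scans)
C_true = np.column_stack([1 - t, t])                     # reactant decays, product grows
D = C_true @ S_true + 0.01 * rng.standard_normal((n_scans, n_channels))

# Initial guess for the spectra: the first and last scans of the data set.
S = D[[0, -1], :].copy()

for _ in range(200):
    # Solve D ~ C S for C with S fixed, then for S with C fixed (least squares),
    # clipping negative values as a crude non-negativity constraint.
    C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)

residual = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
print(f"relative residual after ALS: {residual:.3f}")
```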

  20. Information extraction from multivariate images

    NASA Technical Reports Server (NTRS)

    Park, S. K.; Kegley, K. A.; Schiess, J. R.

    1986-01-01

    An overview of several multivariate image processing techniques is presented, with emphasis on techniques based upon the principal component transformation (PCT). A multiimage has a multivariate pixel value associated with each pixel location, scaled and quantized into a gray-level vector; the bivariate correlation measures the extent to which two component images are correlated. The PCT decorrelates the multiimage, reducing its dimensionality and revealing intercomponent dependencies when some off-diagonal covariance elements are not small; for display purposes, the principal component images must be postprocessed back into multiimage format. The principal component analysis of a multiimage is a statistical analysis based upon the PCT whose primary application is to determine the intrinsic component dimensionality of the multiimage. Computational considerations are also discussed.
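
    A minimal sketch of the PCT on a synthetic three-band multiimage is given below, assuming the usual formulation via the inter-band covariance matrix; the band data and variable names are invented for illustration.

```python
# Illustrative sketch (not from the paper): principal component transformation (PCT)
# of a multiimage. Each pixel of a synthetic 3-band image is treated as a
# multivariate observation; the PCT decorrelates the bands and orders them by variance.
import numpy as np

rng = np.random.default_rng(2)
rows, cols, bands = 64, 64, 3

# Synthetic multiimage: three correlated bands derived from one spatial pattern.
base = rng.standard_normal((rows, cols))
multiimage = np.stack([base + 0.1 * rng.standard_normal((rows, cols)) for _ in range(bands)], axis=-1)

pixels = multiimage.reshape(-1, bands)                 # (n_pixels, n_bands)
pixels_centered = pixels - pixels.mean(axis=0)

cov = np.cov(pixels_centered, rowvar=False)            # inter-band covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)                 # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

pc_images = (pixels_centered @ eigvecs).reshape(rows, cols, bands)
explained = eigvals / eigvals.sum()
print("variance explained per principal component image:", np.round(explained, 3))
```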

  1. Efficiency gain of solid oxide fuel cell systems by using anode offgas recycle - Results for a small scale propane driven unit

    NASA Astrophysics Data System (ADS)

    Dietrich, Ralph-Uwe; Oelze, Jana; Lindermeir, Andreas; Spitta, Christian; Steffen, Michael; Küster, Torben; Chen, Shaofei; Schlitzberger, Christian; Leithner, Reinhard

    Transferring the high electrical efficiencies of solid oxide fuel cells (SOFC) into practice requires appropriate system concepts. One option is the anode-offgas recycling (AOGR) approach, which is based on the integration of waste heat using the principle of a chemical heat pump. The AOGR concept allows combined steam- and dry-reforming of the hydrocarbon fuel using the fuel cell products steam and carbon dioxide, yielding SOFC fuel gas of higher quantity and quality. In combination with the internal reuse of waste heat, the system efficiency increases compared with the usual path of partial oxidation (POX). Demonstrating the AOGR concept with a 300 Wel SOFC stack running on propane required: a combined reformer/burner reactor operating in POX (start-up) and AOGR mode; a hot-gas injector for anode-offgas recycling to the reformer; a dynamic process model; a multi-variable process controller; and full system operation for experimental proof of the efficiency gain. Experimental results demonstrate an efficiency gain of 18 percentage points (η_POX = 23%, η_AOGR = 41%) under idealized lab conditions. Nevertheless, further improvements in injector performance, stack fuel utilization, and additional reduction of the reformer O/C ratio and system pressure drop are required to bring this approach into self-sustaining operation.

  2. A nonlinear heartbeat dynamics model approach for personalized emotion recognition.

    PubMed

    Valenza, Gaetano; Citi, Luca; Lanatà, Antonio; Scilingo, Enzo Pasquale; Barbieri, Riccardo

    2013-01-01

    Emotion recognition based on autonomic nervous system signs is one of the ambitious goals of affective computing. It is well accepted that standard signal processing techniques require relatively long time series of multivariate records to ensure reliability and robustness of recognition and classification algorithms. In this work, we present a novel methodology able to assess cardiovascular dynamics during short-time (i.e., < 10 seconds) affective stimuli, thus overcoming some of the limitations of current emotion recognition approaches. We developed a personalized, fully parametric probabilistic framework based on point-process theory, where heartbeat events are modelled using a second-order nonlinear autoregressive integrative structure in order to achieve effective performance in short-time affective assessment. Experimental results show a comprehensive emotional characterization of 4 subjects undergoing passive affective elicitation using a sequence of standardized images gathered from the International Affective Picture System (IAPS). Each picture was identified by its IAPS arousal and valence scores as well as by a self-reported emotional label associating a subjective positive or negative emotion. Results show a clear classification of two defined levels of arousal, valence and self-reported emotional state using features coming from the instantaneous spectrum and bispectrum of the considered RR intervals, reaching up to 90% recognition accuracy.

  3. Neural representation of form-contingent color filling-in in the early visual cortex.

    PubMed

    Hong, Sang Wook; Tong, Frank

    2017-11-01

    Perceptual filling-in exemplifies the constructive nature of visual processing. Color, a prominent surface property of visual objects, can appear to spread to neighboring areas that lack any color. We investigated cortical responses to a color filling-in illusion that effectively dissociates perceived color from the retinal input (van Lier, Vergeer, & Anstis, 2009). Observers adapted to a star-shaped stimulus with alternating red- and cyan-colored points to elicit a complementary afterimage. By presenting an achromatic outline that enclosed one of the two afterimage colors, perceptual filling-in of that color was induced in the unadapted central region. Visual cortical activity was monitored with fMRI, and analyzed using multivariate pattern analysis. Activity patterns in early visual areas (V1-V4) reliably distinguished between the two color-induced filled-in conditions, but only higher extrastriate visual areas showed the predicted correspondence with color perception. Activity patterns allowed for reliable generalization between filled-in colors and physical presentations of perceptually matched colors in areas V3 and V4, but not in earlier visual areas. These findings suggest that the perception of filled-in surface color likely requires more extensive processing by extrastriate visual areas, in order for the neural representation of surface color to become aligned with perceptually matched real colors.

  4. Effectiveness of Family Planning Policies: The Abortion Paradox

    PubMed Central

    Bajos, Nathalie; Le Guen, Mireille; Bohet, Aline; Panjo, Henri; Moreau, Caroline

    2014-01-01

    Objective The relation between levels of contraceptive use and the incidence of induced abortion remains a topic of heated debate. Many of the contradictions are likely due to the fact that abortion is the end point of a process that starts with sexual activity, contraceptive use (or non-use), followed by unwanted pregnancy, a decision to terminate, and access to abortion. Trends in abortion rates reflect changes in each step of this process, and opposing trends may cancel each other out. This paper aims to investigate the roles played by the dissemination of contraception and the evolving norms of motherhood on changes in abortion rates. Methods Drawing data from six national probability surveys that explored contraception and pregnancy wantedness in France from 1978 through 2010, we used multivariate linear regression to explore the associations between trends in contraceptive rates and trends in (i) abortion rates, (ii) unwanted pregnancy rates, (iii) and unwanted birth rates, and to determine which of these 3 associations was strongest. Findings The association between contraceptive rates and abortion rates over time was weaker than that between contraception rates and unwanted pregnancy rates (p = 0.003). Similarly, the association between contraceptive rates and unwanted birth rates over time was weaker than that between contraceptive rates and unwanted pregnancy rates (p = 0.000). PMID:24670784

  5. Mass spectrometry for the characterization of brewing process.

    PubMed

    Vivian, Adriana Fu; Aoyagui, Caroline Tiemi; de Oliveira, Diogo Noin; Catharino, Rodrigo Ramos

    2016-11-01

    Beer is a carbonated alcoholic beverage produced by fermenting ingredients containing starch, especially malted cereals, and other compounds such as water, hops and yeast. The process comprises five main steps: malting, mashing, boiling, fermentation and maturation. There has been growing interest in the subject, since there is increasing demand for beer quality aspects and beer is a ubiquitous alcoholic beverage in the world. This study is based on the manufacturing process of a Brazilian craft brewery, which is characterized by withdrawing samples during key production stages and using electrospray ionization (ESI) high-resolution mass spectrometry (HRMS), a selective and reliable technique used in the identification of substances in an expeditious and practical way. Multivariate data analysis, namely partial least squares discriminant analysis (PLS-DA) is used to define its markers. In both positive and negative modes of PLS-DA score plot, it is possible to notice differences between each stage. VIP score analysis pointed out markers coherent with the process, such as barley components ((+)-catechin), small peptide varieties, hop content (humulone), yeast metabolic compounds and, in maturation, flavoring compounds (caproic acid, glutaric acid and 2,3-butanediol). Besides that, it was possible to identify other important substances such as off-flavor precursors and other different trace compounds, according to the focus given. This is an attractive alternative for the control of food and beverage industry, allowing a quick assessment of process status before it is finished, preventing higher production costs, ensuring quality and helping the control of desirable features, as flavor, foam stability and drinkability. Covering different classes of compounds, this approach suggests a novel analytical strategy: "processomics", aiming at understanding processes in detail, promoting control and being able to make improvements. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Predicting the occurrence of embolic events: an analysis of 1456 episodes of infective endocarditis from the Italian Study on Endocarditis (SEI)

    PubMed Central

    2014-01-01

    Background Embolic events are a major cause of morbidity and mortality in patients with infective endocarditis. We analyzed the database of the prospective cohort study SEI in order to identify factors associated with the occurrence of embolic events and to develop a scoring system for the assessment of the risk of embolism. Methods We retrospectively analyzed 1456 episodes of infective endocarditis from the multicenter study SEI. Predictors of embolism were identified. Risk factors identified at multivariate analysis as predictive of embolism in left-sided endocarditis, were used for the development of a risk score: 1 point was assigned to each risk factor (total risk score range: minimum 0 points; maximum 2 points). Three categories were defined by the score: low (0 points), intermediate (1 point), or high risk (2 points); the probability of embolic events per risk category was calculated for each day on treatment (day 0 through day 30). Results There were 499 episodes of infective endocarditis (34%) that were complicated by ≥ 1 embolic event. Most embolic events occurred early in the clinical course (first week of therapy: 15.5 episodes per 1000 patient days; second week: 3.7 episodes per 1000 patient days). In the total cohort, the factors associated with the occurrence of embolism at multivariate analysis were prosthetic valve localization (odds ratio, 1.84), right-sided endocarditis (odds ratio, 3.93), Staphylococcus aureus etiology (odds ratio, 2.23) and vegetation size ≥ 13 mm (odds ratio, 1.86). In left-sided endocarditis, Staphylococcus aureus etiology (odds ratio, 2.1) and vegetation size ≥ 13 mm (odds ratio, 2.1) were independently associated with embolic events; the 30-day cumulative incidence of embolism varied with risk score category (low risk, 12%; intermediate risk, 25%; high risk, 38%; p < 0.001). Conclusions Staphylococcus aureus etiology and vegetation size are associated with an increased risk of embolism. In left-sided endocarditis, a simple scoring system, which combines etiology and vegetation size with time on antimicrobials, might contribute to a better assessment of the risk of embolism, and to a more individualized analysis of indications and contraindications for early surgery. PMID:24779617

  7. The environmental zero-point problem in evolutionary reaction norm modeling.

    PubMed

    Ergon, Rolf

    2018-04-01

    There is a potential problem in present quantitative genetics evolutionary modeling based on reaction norms. Such models are state-space models, where the multivariate breeder's equation in some form is used as the state equation that propagates the population state forward in time. These models use the implicit assumption of a constant reference environment, in many cases set to zero. This zero-point is often the environment a population is adapted to, that is, where the expected geometric mean fitness is maximized. Such environmental reference values follow from the state of the population system, and they are thus population properties. The environment the population is adapted to, is, in other words, an internal population property, independent of the external environment. It is only when the external environment coincides with the internal reference environment, or vice versa, that the population is adapted to the current environment. This is formally a result of state-space modeling theory, which is an important theoretical basis for evolutionary modeling. The potential zero-point problem is present in all types of reaction norm models, parametrized as well as function-valued, and the problem does not disappear when the reference environment is set to zero. As the environmental reference values are population characteristics, they ought to be modeled as such. Whether such characteristics are evolvable is an open question, but considering the complexity of evolutionary processes, such evolvability cannot be excluded without good arguments. As a straightforward solution, I propose to model the reference values as evolvable mean traits in their own right, in addition to other reaction norm traits. However, solutions based on an evolvable G matrix are also possible.
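
    As a purely conceptual toy (not the paper's model), the sketch below treats a linear reaction norm z(eps) = a + b(eps - eps_ref), includes the reference environment eps_ref as an evolvable mean trait alongside a and b, and propagates the mean trait vector with the multivariate breeder's equation, delta z-bar = G beta. The G matrix, the selection gradient, and all numbers are invented assumptions.

```python
# Conceptual toy (not the paper's model): a linear reaction norm with the reference
# environment included as an evolvable mean trait, propagated by the multivariate
# breeder's equation delta_zbar = G @ beta. All numbers are invented.
import numpy as np

zbar = np.array([10.0, 0.5, 0.0])          # mean intercept a, slope b, reference env eps_ref
G = np.diag([0.2, 0.05, 0.1])              # additive genetic (co)variance matrix (assumed)

def selection_gradient(zbar, eps_env):
    """Toy directional selection pushing the phenotype expressed at eps_env toward an
    optimum of 12; the gradient follows from the chain rule through the reaction norm."""
    a, b, eps_ref = zbar
    phenotype = a + b * (eps_env - eps_ref)
    mismatch = 12.0 - phenotype
    # d phenotype / d (a, b, eps_ref) = (1, eps_env - eps_ref, -b); 0.1 is a step scale.
    return mismatch * np.array([1.0, eps_env - eps_ref, -b]) * 0.1

eps_env = 2.0                              # current external environment
for generation in range(50):
    zbar = zbar + G @ selection_gradient(zbar, eps_env)

print("evolved means (a, b, eps_ref):", np.round(zbar, 3))
```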

  8. The Python Spectral Analysis Tool (PySAT): A Powerful, Flexible, Preprocessing and Machine Learning Library and Interface

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T. G.; Morris, R. V.; Laura, J.; Gaddis, L. R.

    2017-12-01

    Machine learning is a powerful but underutilized approach that can enable planetary scientists to derive meaningful results from the rapidly-growing quantity of available spectral data. For example, regression methods such as Partial Least Squares (PLS) and Least Absolute Shrinkage and Selection Operator (LASSO), can be used to determine chemical concentrations from ChemCam and SuperCam Laser-Induced Breakdown Spectroscopy (LIBS) data [1]. Many scientists are interested in testing different spectral data processing and machine learning methods, but few have the time or expertise to write their own software to do so. We are therefore developing a free open-source library of software called the Python Spectral Analysis Tool (PySAT) along with a flexible, user-friendly graphical interface to enable scientists to process and analyze point spectral data without requiring significant programming or machine-learning expertise. A related but separately-funded effort is working to develop a graphical interface for orbital data [2]. The PySAT point-spectra tool includes common preprocessing steps (e.g. interpolation, normalization, masking, continuum removal, dimensionality reduction), plotting capabilities, and capabilities to prepare data for machine learning such as creating stratified folds for cross validation, defining training and test sets, and applying calibration transfer so that data collected on different instruments or under different conditions can be used together. The tool leverages the scikit-learn library [3] to enable users to train and compare the results from a variety of multivariate regression methods. It also includes the ability to combine multiple "sub-models" into an overall model, a method that has been shown to improve results and is currently used for ChemCam data [4]. Although development of the PySAT point-spectra tool has focused primarily on the analysis of LIBS spectra, the relevant steps and methods are applicable to any spectral data. The tool is available at https://github.com/USGS-Astrogeology/PySAT_Point_Spectra_GUI. [1] Clegg, S.M., et al. (2017) Spectrochim Acta B. 129, 64-85. [2] Gaddis, L. et al. (2017) 3rd Planetary Data Workshop, #1986. [3] http://scikit-learn.org/ [4] Anderson, R.B., et al. (2017) Spectrochim. Acta B. 129, 49-57.
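
    The kind of regression workflow the tool wraps can be sketched with plain scikit-learn. The example below is a stand-in under stated assumptions (synthetic "LIBS-like" spectra, invented channel indices), not PySAT's actual API, showing LASSO with K-fold cross-validation for predicting a composition value.

```python
# Illustrative sketch only: LASSO regression with K-fold cross-validation on
# synthetic "LIBS-like" spectra, analogous to (but not using) the PySAT tool.
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n_spectra, n_channels = 120, 1000
spectra = rng.standard_normal((n_spectra, n_channels))
true_coef = np.zeros(n_channels)
true_coef[[100, 400, 750]] = [2.0, -1.5, 1.0]            # a few informative emission lines
concentration = spectra @ true_coef + 0.1 * rng.standard_normal(n_spectra)

model = make_pipeline(StandardScaler(), Lasso(alpha=0.05, max_iter=10000))
scores = cross_val_score(model, spectra, concentration,
                         cv=KFold(n_splits=5, shuffle=True, random_state=0),
                         scoring="neg_root_mean_squared_error")
print("cross-validated RMSE per fold:", np.round(-scores, 3))
```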

  9. Prospective Study of Insufficient Sleep and Neurobehavioral Functioning Among School-Age Children.

    PubMed

    Taveras, Elsie M; Rifas-Shiman, Sheryl L; Bub, Kristen L; Gillman, Matthew W; Oken, Emily

    2017-08-01

    To examine associations between insufficient sleep and neurobehavioral functioning in childhood as reported by mothers and teachers. Participants were 1046 children in a prebirth cohort study. Main exposures were insufficient sleep durations at 3 time points: 6 months to 2 years, defined as sleep <11 h/d, 11 to <12 h/d (vs ≥12); 3 to 4 years, defined as sleep <10 h/d, 10 to <11 h/d (vs ≥11); and 5 to 7 years, sleep <9 h/d, 9 to <10 h/d (vs ≥10). Outcomes at age 7 years were executive function, behavior, and social-emotional functioning, assessed using the Behavioral Rating Inventory of Executive Function (BRIEF) and the Strengths and Difficulties Questionnaire (SDQ). Higher scores indicate poorer functioning. Mothers and teachers completed both instruments independently. At age 7 years, mean (SD) mother and teacher report of the BRIEF global executive composite scale were 48.3 (7.9) and 50.7 (9.4) points, respectively, and of the SDQ total difficulties score was 6.5 (4.7) and 6.2 (5.7). In multivariable models, children who slept <10 h/d at 3 to 4 years had worse maternal-reported scores for the BRIEF (2.11 points; 95% confidence interval, 0.17-4.05) and SDQ (1.91 points; 95% confidence interval, 0.78-3.05) than those with age-appropriate sleep. Children who slept <9 h/d at 5 to 7 years also had worse scores. At both ages, associations with teacher-reported results were consistent with those of mothers. Infants who slept 11 to <12 h/d had higher teacher- but not mother-reported scores. Insufficient sleep in the preschool and early school years is associated with poorer mother- and teacher-reported neurobehavioral processes in midchildhood. Copyright © 2017 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  10. Radiotherapy and Hyperthermia for Treatment of Primary Locally Advanced Cervix Cancer: Results in 378 Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franckena, Martine; Lutgens, Ludy C.; Koper, Peter C.

    2009-01-01

    Purpose: To report response rate, pelvic tumor control, survival, and late toxicity after treatment with combined radiotherapy and hyperthermia (RHT) for patients with locally advanced cervical carcinoma (LACC) and compare the results with other published series. Methods and Materials: From 1996 to 2005, a total of 378 patients with LACC (International Federation of Gynecology and Obstetrics Stage IB2-IVA) were treated with RHT. External beam radiotherapy (RT) was applied to 46-50.4 Gy and combined with brachytherapy. The hyperthermia (HT) was prescribed once weekly. Primary end points were complete response (CR) and local control. Secondary end points were overall survival, disease-specific survival, and late toxicity. Patient, tumor, and treatment characteristics predictive for the end points were identified in univariate and multivariate analyses. Results: Overall, a CR was achieved in 77% of patients. At 5 years, local control, disease-specific survival, and incidence of late toxicity Common Terminology Criteria for Adverse Events Grade 3 or higher were 53%, 47%, and 12%, respectively. In multivariate analysis, number of HT treatments emerged as a predictor of outcome in addition to commonly identified prognostic factors. Conclusions: The CR, local control, and survival rates are similar to previously observed results of RHT in the randomized Dutch Deep Hyperthermia Trial. Reported treatment results for currently applied combined treatment modalities (i.e., RT with chemotherapy and/or HT) do not permit definite conclusions about which combination is superior. The present results confirm previously shown beneficial effects from adding HT to RT and justify the application of RHT as first-line treatment in patients with LACC as an alternative to chemoradiation.

  11. Mediterranean and Nordic diet scores and long-term changes in body weight and waist circumference: results from a large cohort study.

    PubMed

    Li, Yingjun; Roswall, Nina; Ström, Peter; Sandin, Sven; Adami, Hans-Olov; Weiderpass, Elisabete

    2015-12-28

    Dietary patterns, which represent a broader picture of food and nutrient consumption, have gained increasing interest over the last decades. In a cohort design, we followed 27 544 women aged 29-49 years from baseline in 1991-1992. We collected data from an FFQ at baseline and body weight (BW) and waist circumference (WC) data both at baseline and at follow-up in 2003. We calculated the Mediterranean diet score (MDS, ranging from 0 to 9) and the Nordic diet score (NDS, ranging from 0 to 6). We used linear regression to examine the association between MDS and NDS (exposures) with subsequent BW change (ΔBW) and WC change (ΔWC) (outcomes) both continuously and categorically. Higher adherence to the MDS or NDS was not associated with ΔBW. The multivariable population average increment in BW was 0·03 kg (95 % CI -0·03, 0·09) per 1-point increase in MDS and 0·04 kg (95 % CI -0·02, 0·10) per 1-point increase in NDS. In addition, higher adherence to the MDS was not associated with ΔWC, with the multivariable population average increment per 1-point increase in MDS being 0·05 cm (95 % CI -0·03, 0·13). Higher adherence to the NDS was not significantly associated with gain in WC when adjusted for concurrent ΔBW. In conclusion, a higher adherence to the MDS or NDS was not associated with changes in average BW or WC in the present cohort followed for 12 years.

  12. Can preoperative and postoperative CA19-9 levels predict survival and early recurrence in patients with resectable hilar cholangiocarcinoma?

    PubMed

    Wang, Jun-Ke; Hu, Hai-Jie; Shrestha, Anuj; Ma, Wen-Jie; Yang, Qin; Liu, Fei; Cheng, Nan-Sheng; Li, Fu-Yu

    2017-07-11

    To investigate the predictive values of preoperative and postoperative serum CA19-9 levels on survival and other prognostic factors, including early recurrence, in patients with resectable hilar cholangiocarcinoma. Ninety-eight patients who had undergone curative surgery for hilar cholangiocarcinoma between 1995 and 2014 in our institution were selected for the study. The correlations of preoperative and postoperative serum CA19-9 levels, on the basis of different cut-off points, with survival and various tumor factors were retrospectively analyzed with univariate and multivariate methods. In univariate analysis, increased preoperative and postoperative CA19-9 levels at the different cut-off points (37, 100, 150, 200, 400, 1000 U/ml) were significantly associated with poor survival outcomes, of which the cut-off point of 150 U/ml showed the strongest predictive value (both P < 0.001). A preoperative to postoperative increase in CA19-9 level was also correlated with poor survival outcome (P < 0.001). In multivariate analysis, preoperative CA19-9 level > 150 U/ml was significantly associated with lymph node metastasis (OR = 3.471, 95% CI 1.216-9.905; P = 0.020) and early recurrence (OR = 8.280, 95% CI 2.391-28.674; P = 0.001). Meanwhile, postoperative CA19-9 level > 150 U/ml was also correlated with early recurrence (OR = 4.006, 95% CI 1.107-14.459; P = 0.034). In patients with resectable hilar cholangiocarcinoma, serum CA19-9 levels predict survival and early recurrence. Patients with increased preoperative and postoperative CA19-9 levels have poor survival outcomes and a higher tendency of early recurrence.

  13. Beta-blockers influence the short-term and long-term prognostic information of natriuretic peptides and catecholamines in chronic heart failure independent from specific agents.

    PubMed

    Frankenstein, Lutz; Nelles, Manfred; Slavutsky, Maxim; Schellberg, Dieter; Doesch, Andreas; Katus, Hugo; Remppis, Andrew; Zugck, Christian

    2007-10-01

    In chronic heart failure (CHF), the physiologic effects of natriuretic peptides and catecholamines are interdependent. Furthermore, reports state an agent-dependent effect of individual beta-blockers on biomarkers. Data on the short-term and long-term predictive power comparing these biomarkers as well as accounting for the influence of beta-blocker treatment both on the marker or the resultant prognostic information are scarce. We included 513 consecutive patients with systolic CHF, measured atrial natriuretic peptide (ANP), N-terminal prohormone brain natriuretic peptide (NTproBNP), noradrenaline, and adrenaline, and monitored them for 90 +/- 25 months. Death or the combination of death and cardiac transplantation at 1 year, 5 years, and overall follow-up were considered end points. Compared with patients not taking beta-blockers, patients taking beta-blockers had significantly lower levels of catecholamines but not natriuretic peptides. Only for adrenaline was the amount of this effect related to the specific beta-blocker chosen. Receiver operating characteristic curves demonstrated superior prognostic accuracy for NTproBNP both at the 1- and 5-year follow-up compared with ANP, noradrenaline, and adrenaline. In multivariate analysis including established risk markers (New York Heart Association functional class, left ventricular ejection fraction, peak oxygen uptake, and 6-minute walk test), of all neurohumoral parameters, only NTproBNP remained an independent predictor for both end points. Long-term beta-blocker therapy is associated with decreased levels of plasma catecholamines but not natriuretic peptides. This effect is independent from the actual beta-blocker chosen for natriuretic peptides and noradrenaline. In multivariate analysis, both for short-term and long-term prediction of mortality or the combined end point of death and cardiac transplantation, only NTproBNP remained independent from established clinical risk markers.

  14. Carotid Plaque Score and Risk of Cardiovascular Mortality in the Oldest Old: Results from the TOOTH Study.

    PubMed

    Hirata, Takumi; Arai, Yasumichi; Takayama, Michiyo; Abe, Yukiko; Ohkuma, Kiyoshi; Takebayashi, Toru

    2018-01-01

    Accumulating evidence suggests that predictability of traditional cardiovascular risk factors declines with advancing age. We investigated whether carotid plaque scores (CPSs) were associated with cardiovascular disease (CVD) death in the oldest old, and whether asymmetrical dimethylarginine (ADMA), a marker of endothelial dysfunction, moderated the association between the CPS and CVD death. We conducted a prospective cohort study of Japanese subjects aged ≥85 years without CVD at baseline. We followed this cohort for 6 years to investigate the association of CPS with CVD death via multivariable Cox proportional hazard analysis. We divided participants into three groups according to CPS (no, 0 points; low, 1.2-4.9 points; high, ≥5.0 points). The predictive value of CPS for estimating CVD death risk over CVD risk factors, including ADMA, was examined using C-statistics. We analyzed 347 participants (151 men, 196 women; mean age, 87.6 years), of which 135 (38.9%) had no carotid plaque at baseline, and 48 (13.8%) had high CPS. Of the total, 29 (8.4%) participants experienced CVD-related death during the study period. Multivariable analysis revealed a significant association of high CPS with CVD-related mortality relative to no CPS (hazard ratio, 3.90; 95% confidence interval: 1.47-10.39). ADMA was not associated with CVD death, but the significant association between CPS and CVD death was observed only in lower ADMA level. The addition of CPS to other risk factors improved the predictability of CVD death (p=0.032). High CPS correlated significantly with a higher CVD death risk in the oldest old with low cardiovascular risk. Ultrasound carotid plaque evaluation might facilitate risk evaluations of CVD death in the very old.

  15. A Simple Score That Predicts Paroxysmal Atrial Fibrillation on Outpatient Cardiac Monitoring after Embolic Stroke of Unknown Source.

    PubMed

    Ricci, Brittany; Chang, Andrew D; Hemendinger, Morgan; Dakay, Katarina; Cutting, Shawna; Burton, Tina; Mac Grory, Brian; Narwal, Priya; Song, Christopher; Chu, Antony; Mehanna, Emile; McTaggart, Ryan; Jayaraman, Mahesh; Furie, Karen; Yaghi, Shadi

    2018-06-01

    Occult paroxysmal atrial fibrillation (AF) is detected in 16%-30% of patients with embolic stroke of unknown source (ESUS). The identification of AF predictors on outpatient cardiac monitoring can help guide clinicians in deciding on a duration or method of cardiac monitoring after ESUS. We included all patients with ESUS who underwent an inpatient diagnostic evaluation and outpatient cardiac monitoring between January 1, 2013, and December 31, 2016. Patients were divided into 2 groups based on detection of AF or atrial flutter during monitoring. We compared demographic data, clinical risk factors, and cardiac biomarkers between the 2 groups. Multivariable logistic regression was used to determine predictors of AF. We identified 296 consecutive patients during the study period; 38 (12.8%) patients had AF detected on outpatient cardiac monitoring. In a multivariable regression analysis, advanced age (ages 65-74: odds ratio [OR] 2.36, 95% confidence interval [CI] .85-6.52; ages 75 or older: OR 4.08, 95% CI 1.58-10.52) and moderate-to-severe left atrial enlargement (OR 4.66, 95% CI 1.79-12.12) were predictors of AF on outpatient monitoring. We developed the Brown ESUS-AF score: age (65-74 years: 1 point, 75 years or older: 2 points) plus left atrial enlargement (moderate or severe: 2 points), with good prediction of AF (area under the curve .725); the score was internally validated using bootstrapping. The percentage of patients with AF detected in each score category was as follows: 0: 4.2%; 1: 14.8%; 2: 20.8%; 3: 22.2%; 4: 55.6%. The Brown ESUS-AF score predicts AF on prolonged outpatient monitoring after ESUS. More studies are needed to externally validate our findings. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
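
    The point score itself is simple enough to transcribe directly; the function below restates the scoring rule given in the abstract together with the reported AF detection rates per category. It is only an illustration of the published rule, not clinical software.

```python
# Hedged illustration of the Brown ESUS-AF point score as described in the abstract:
# age 65-74 years -> 1 point, 75+ -> 2 points; moderate or severe left atrial
# enlargement -> 2 points.
def brown_esus_af_score(age_years: int, la_enlargement: str) -> int:
    """Return the Brown ESUS-AF score (0-4)."""
    score = 0
    if 65 <= age_years <= 74:
        score += 1
    elif age_years >= 75:
        score += 2
    if la_enlargement in ("moderate", "severe"):
        score += 2
    return score

# Observed AF detection rates per score category reported in the abstract.
af_rate_by_score = {0: 0.042, 1: 0.148, 2: 0.208, 3: 0.222, 4: 0.556}

score = brown_esus_af_score(age_years=78, la_enlargement="moderate")
print(score, af_rate_by_score[score])   # -> 4 0.556
```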

  16. Mining Input Data for Multivariate Probabilistic Modeling of Rainfall-Induced Landslide Hazard in the Lake ATITLÁN Watershed in Guatemala

    NASA Astrophysics Data System (ADS)

    Cobin, P. F.; Oommen, T.; Gierke, J. S.

    2013-12-01

    The Lake Atitlán watershed is home to approximately 200,000 people and is located in the western highlands of Guatemala. Steep slopes, highly susceptible to landslides during the rainy season, characterize the region. Typically these landslides occur during high-intensity precipitation events. Hurricane Stan hit Guatemala in October 2005; the resulting flooding and landslides devastated the region. Locations of landslide and non-landslide points were obtained from field observations and orthophotos taken following Hurricane Stan. Different datasets of landslide and non-landslide points across the watershed were used to compare model success at a small scale and regional scale. This study used data from multiple attributes: geology, geomorphology, distance to faults and streams, land use, slope, aspect, curvature, plan curvature, profile curvature and topographic wetness index. The open source software Weka was used for the data mining. Several attribute selection methods were applied to the data to predetermine the potential landslide causative influence. Different multivariate algorithms were then evaluated for their ability to predict landslide occurrence. The following statistical parameters were used to evaluate model accuracy: precision, recall, F measure and area under the receiver operating characteristic (ROC) curve. The attribute combinations of the most successful models were compared to the attribute evaluator results. The algorithm BayesNet yielded the most accurate model and was used to build a probability map of landslide initiation points for the regions selected in the watershed. The ultimate aim of this study is to share the methodology and results with municipal contacts from the author's time as a U.S. Peace Corps volunteer, to facilitate more effective future landslide hazard planning and mitigation.
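
    The evaluation pipeline can be illustrated with an analogous sketch: the study used Weka's BayesNet, but below a scikit-learn Gaussian naive Bayes classifier stands in on synthetic attributes, scored with the same metrics named in the abstract (precision, recall, F-measure, ROC AUC). The attribute model and thresholds are invented.

```python
# Analogous sketch only (not the study's Weka workflow): classify landslide vs.
# non-landslide cells from synthetic terrain attributes and report the metrics
# listed in the abstract.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, recall_score, f1_score, roc_auc_score

rng = np.random.default_rng(4)
n = 500
slope = rng.uniform(0, 45, n)                      # degrees
wetness = rng.uniform(0, 20, n)                    # topographic wetness index
dist_fault = rng.uniform(0, 5000, n)               # metres
# Synthetic rule: steeper, wetter, fault-proximal cells are more landslide-prone.
logit = 0.15 * slope + 0.2 * wetness - 0.001 * dist_fault - 3.0
landslide = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([slope, wetness, dist_fault])
X_tr, X_te, y_tr, y_te = train_test_split(X, landslide, test_size=0.3, random_state=0)

clf = GaussianNB().fit(X_tr, y_tr)
pred = clf.predict(X_te)
prob = clf.predict_proba(X_te)[:, 1]
print("precision", precision_score(y_te, pred))
print("recall   ", recall_score(y_te, pred))
print("F-measure", f1_score(y_te, pred))
print("ROC AUC  ", roc_auc_score(y_te, prob))
```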

  17. Racial differences in vascular risk factors and outcomes of patients with intracranial atherosclerotic arterial stenosis.

    PubMed

    Waddy, Salina P; Cotsonis, George; Lynn, Michael J; Frankel, Michael R; Chaturvedi, Seemant; Williams, Janice E; Chimowitz, Marc

    2009-03-01

    Atherosclerotic intracranial stenosis is an important cause of stroke in blacks, yet there are limited data on vascular risk factors and outcome. We analyzed the vascular risk factors and outcomes of blacks and whites in the Warfarin versus Aspirin for Symptomatic Intracranial Disease (WASID) trial. Baseline characteristics and outcomes (ischemic stroke, brain hemorrhage, or vascular death combined and ischemic stroke alone) were compared between blacks (n=174) and whites (n=331) using univariate and multivariate analyses. Blacks were significantly (P<0.05) more likely than whites to be/have: female, hypertension history, diabetes history, higher LDL, higher total cholesterol, lower triglycerides, unmarried, unemployed, nonprivate insurance, no insurance, stroke as qualifying event, <70% stenosis, symptomatic anterior circulation vessel, no antithrombotic medication before qualifying event, and no family history of myocardial infarction. Blacks more frequently reached an end point of ischemic stroke, brain hemorrhage or vascular death (28% versus 20%; hazard ratio of 1.49, 95% CI 1.03 to 2.17, P=0.03), had a higher 2-year event rate (0.28 versus 0.19), and reached the end point of ischemic stroke alone (25% versus 16% at 2 years; hazard ratio of 1.62, P=0.017). In multivariate analysis, race was associated with ischemic stroke (P=0.0488) but not with the end point ischemic stroke, brain hemorrhage or vascular death (P=0.188). Blacks with intracranial stenosis are at higher risk of stroke recurrence than whites. This risk warrants additional study of factors contributing to stroke in blacks and highlights the need for aggressive risk factor management in blacks to prevent recurrence.

  18. Research on Scheduling Algorithm for Multi-satellite and Point Target Task on Swinging Mode

    NASA Astrophysics Data System (ADS)

    Wang, M.; Dai, G.; Peng, L.; Song, Z.; Chen, G.

    2012-12-01

    Observing the ground from satellites is now a major method for obtaining ground information. With the development of space technology, fields such as the military and the economy place ever-increasing demands on it, owing to satellites' wide coverage, timeliness, and independence of area and country. At the same time, because of the wide use of many kinds of satellites, sensors, repeater satellites and ground receiving stations, ground control systems face great challenges, and making the best use of satellite resources has become an important problem for them. Satellite scheduling distributes resources to tasks without conflict, so as to complete as many tasks as possible and meet user requirements while respecting the constraints of satellites, sensors and ground receiving stations. According to their size, tasks can be divided into point tasks and area tasks; this paper considers only point targets. The paper first describes the satellite scheduling problem and briefly introduces the theory of satellite scheduling, analyzing the resource and task constraints involved and outlining the input and output flow of the scheduling process. On this basis, we propose a scheduling model, a multi-variable optimization model for multi-satellite, point-target tasks in swinging mode, in which the scheduling problem is transformed into a parametric optimization problem; the parameters to be optimized are the swinging angles of the time windows. With efficiency and accuracy in view, several issues related to satellite scheduling are analyzed and discussed, such as the angular relation between satellites and ground targets, positive and negative swinging angles, and the computation of time windows, and several strategies to improve the efficiency of the model are put forward. To solve the model, we introduce the concept of an activity sequence map, which separates the choice of activity from the choice of its start time, and we propose three neighborhood operators to search the solution space. The front-movement remaining time and the back-movement remaining time are used to analyze the feasibility of generating solutions from the neighborhood operators. Finally, a solution algorithm based on a genetic algorithm is put forward, as sketched after this abstract; population initialization, crossover, mutation, individual evaluation, collision-decrease, selection and collision-elimination operators are designed. The scheduling result and a simulation for a practical example with 5 satellites and 100 point targets in swinging mode are given, and the scheduling performance is analyzed for swinging angles of 0, 5, 10, 15 and 25. The results show that the model and the algorithm are more effective than those without swinging mode.
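
    A heavily simplified genetic-algorithm skeleton for this kind of problem is sketched below; it is not the authors' algorithm. Each chromosome assigns one swinging angle per time window, fitness counts the point targets whose (window, angle) requirement is met, and the observation geometry is replaced by invented toy data.

```python
# Toy GA sketch (not the authors' algorithm): choose a swinging angle for each
# observation time window so that as many point targets as possible are covered.
import random

random.seed(0)
N_WINDOWS, ANGLES = 20, [0, 5, 10, 15, 25]          # candidate swing angles (degrees)
# Toy geometry: target j is observable in one window only under one specific angle.
targets = [(random.randrange(N_WINDOWS), random.choice(ANGLES)) for _ in range(100)]

def fitness(chromosome):
    """Number of targets whose required (window, angle) pair is satisfied."""
    return sum(1 for w, a in targets if chromosome[w] == a)

def crossover(p1, p2):
    cut = random.randrange(1, N_WINDOWS)
    return p1[:cut] + p2[cut:]

def mutate(ch, rate=0.05):
    return [random.choice(ANGLES) if random.random() < rate else g for g in ch]

population = [[random.choice(ANGLES) for _ in range(N_WINDOWS)] for _ in range(40)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:20]                        # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    population = parents + children

best = max(population, key=fitness)
print("targets scheduled:", fitness(best), "of", len(targets))
```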

  19. The effects of HMO ownership on hospital costs and revenues: is there a difference between for-profit and nonprofit plans?

    PubMed

    Shen, Yu-Chu; Melnick, Glenn

    2004-01-01

    We conducted multivariate analyses to examine whether high health maintenance organization (HMO) penetration and a large share of for-profit health plans in a market reduced hospital cost and revenue growth rates between 1989 and 1998. We found that hospitals in high HMO areas experienced revenue and cost growth rates that were 21 and 18 percentage points, respectively, below those of hospitals in low HMO areas. We also found that, conditional on the overall HMO penetration level, hospitals in areas with high for-profit HMO penetration experienced revenue and cost growth rates that were 10 percentage points below those of hospitals in areas with low for-profit penetration; the difference was especially evident within high HMO penetration areas.

  20. Multivariate curve resolution of incomplete fused multiset data from chromatographic and spectrophotometric analyses for drug photostability studies.

    PubMed

    De Luca, Michele; Ragno, Gaetano; Ioele, Giuseppina; Tauler, Romà

    2014-07-21

    An advanced and powerful chemometric approach is proposed for the analysis of incomplete multiset data obtained by fusion of hyphenated liquid chromatographic DAD/MS data with UV spectrophotometric data from acid-base titration and kinetic degradation experiments. Column- and row-wise augmented data blocks were combined and simultaneously processed by means of a new version of the multivariate curve resolution-alternating least squares (MCR-ALS) technique, including the simultaneous analysis of incomplete multiset data from different instrumental techniques. The proposed procedure was applied to the detailed study of the kinetic photodegradation process of the amiloride (AML) drug. All chemical species involved in the degradation and equilibrium reactions were resolved and the pH dependent kinetic pathway described. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Processes and subdivisions in diogenites, a multivariate statistical analysis

    NASA Technical Reports Server (NTRS)

    Harriott, T. A.; Hewins, R. H.

    1984-01-01

    Multivariate statistical techniques used on diogenite orthopyroxene analyses show the relationships that occur within diogenites and the two orthopyroxenite components (class I and II) in the polymict diogenite Garland. Cluster analysis shows that only Peckelsheim is similar to Garland class I (Fe-rich) and the other diogenites resemble Garland class II. The unique diogenite Y 75032 may be related to type I by fractionation. Factor analysis confirms the subdivision and shows that Fe does not correlate with the weakly incompatible elements across the entire pyroxene composition range, indicating that igneous fractionation is not the process controlling total diogenite composition variation. The occurrence of two groups of diogenites is interpreted as the result of sampling or mixing of two main sequences of orthopyroxene cumulates with slightly different compositions.

  2. Characterization of Interfacial Chemistry of Adhesive/Dentin Bond Using FTIR Chemical Imaging With Univariate and Multivariate Data Processing

    PubMed Central

    Wang, Yong; Yao, Xiaomei; Parthasarathy, Ranganathan

    2008-01-01

    Fourier transform infrared (FTIR) chemical imaging can be used to investigate molecular chemical features of adhesive/dentin interfaces. However, the information is not straightforward and is not easily extracted. The objective of this study was to use multivariate analysis methods, principal component analysis and fuzzy c-means clustering, to analyze the spectral data and compare the results with univariate analysis. The spectral imaging data collected from both adhesive/healthy dentin and adhesive/caries-affected dentin specimens were used and compared. Univariate statistical methods, such as mapping the intensities of a specific functional group, do not always accurately identify functional group locations and concentrations because of varying degrees of band overlap between adhesive and dentin. Apart from the ease with which information can be extracted, multivariate methods highlight subtle and often important changes in the spectra that are difficult to observe using univariate methods. The results showed that the multivariate methods gave more satisfactory, interpretable results than univariate methods and demonstrated that they can discriminate and classify differences between healthy dentin and caries-affected dentin within the interfacial regions. The multivariate FTIR imaging approaches can thus be used for the rapid characterization of heterogeneous, complex structures. PMID:18980198
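
    Fuzzy c-means itself can be written in a few lines of NumPy; the sketch below is a from-scratch illustration on synthetic "pixel spectra" (not the software or data used in the study), clustering pixels into two fuzzy groups such as adhesive-rich and dentin-rich regions.

```python
# Minimal fuzzy c-means sketch (a from-scratch NumPy illustration, not the study's
# software), clustering synthetic pixel spectra into two fuzzy groups.
import numpy as np

rng = np.random.default_rng(5)
n_pixels, n_bands, c, m = 200, 30, 2, 2.0          # m is the fuzziness exponent

# Two synthetic spectral end-members plus noise.
end_members = rng.random((c, n_bands))
labels_true = rng.integers(0, c, n_pixels)
X = end_members[labels_true] + 0.05 * rng.standard_normal((n_pixels, n_bands))

U = rng.random((c, n_pixels))
U /= U.sum(axis=0)                                  # memberships sum to 1 per pixel

for _ in range(100):
    Um = U ** m
    centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
    dist = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
    # u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
    U = 1.0 / ((dist[:, None, :] / dist[None, :, :]) ** (2 / (m - 1))).sum(axis=1)

print("hard cluster sizes:", np.bincount(U.argmax(axis=0)))
```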

  3. Voxelwise multivariate analysis of multimodality magnetic resonance imaging.

    PubMed

    Naylor, Melissa G; Cardenas, Valerie A; Tosun, Duygu; Schuff, Norbert; Weiner, Michael; Schwartzman, Armin

    2014-03-01

    Most brain magnetic resonance imaging (MRI) studies concentrate on a single MRI contrast or modality, frequently structural MRI. By performing an integrated analysis of several modalities, such as structural, perfusion-weighted, and diffusion-weighted MRI, new insights may be attained to better understand the underlying processes of brain diseases. We compare two voxelwise approaches: (1) fitting multiple univariate models, one for each outcome and then adjusting for multiple comparisons among the outcomes and (2) fitting a multivariate model. In both cases, adjustment for multiple comparisons is performed over all voxels jointly to account for the search over the brain. The multivariate model is able to account for the multiple comparisons over outcomes without assuming independence because the covariance structure between modalities is estimated. Simulations show that the multivariate approach is more powerful when the outcomes are correlated and, even when the outcomes are independent, the multivariate approach is just as powerful or more powerful when at least two outcomes are dependent on predictors in the model. However, multiple univariate regressions with Bonferroni correction remain a desirable alternative in some circumstances. To illustrate the power of each approach, we analyze a case control study of Alzheimer's disease, in which data from three MRI modalities are available. Copyright © 2013 Wiley Periodicals, Inc.
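
    The contrast between the two approaches can be illustrated at a single voxel with synthetic data: three correlated outcomes compared between two groups, first with separate t-tests and a Bonferroni correction over outcomes, then with one multivariate test (here Hotelling's T^2 as a simple stand-in for the fitted multivariate model). All data and effect sizes are invented.

```python
# Single-"voxel" illustration under stated assumptions (synthetic data, two groups,
# three correlated MRI-derived outcomes): multiple univariate tests with Bonferroni
# correction versus one multivariate test (Hotelling's T^2).
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n1, n2, p = 30, 30, 3
cov = np.array([[1.0, 0.6, 0.5],
                [0.6, 1.0, 0.4],
                [0.5, 0.4, 1.0]])
controls = rng.multivariate_normal(np.zeros(p), cov, size=n1)
patients = rng.multivariate_normal(np.array([0.4, 0.4, 0.3]), cov, size=n2)

# (1) Multiple univariate tests, Bonferroni-adjusted over the 3 outcomes.
univ_p = [stats.ttest_ind(patients[:, j], controls[:, j]).pvalue for j in range(p)]
print("Bonferroni-adjusted univariate p-values:", np.minimum(np.array(univ_p) * p, 1.0))

# (2) Hotelling's T^2, which models the covariance between outcomes explicitly.
diff = patients.mean(axis=0) - controls.mean(axis=0)
S_pooled = ((n1 - 1) * np.cov(controls, rowvar=False) +
            (n2 - 1) * np.cov(patients, rowvar=False)) / (n1 + n2 - 2)
T2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(S_pooled, diff)
F = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * T2
p_multivariate = stats.f.sf(F, p, n1 + n2 - p - 1)
print("Hotelling's T^2 p-value:", p_multivariate)
```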

  4. Identification of Reliable Components in Multivariate Curve Resolution-Alternating Least Squares (MCR-ALS): a Data-Driven Approach across Metabolic Processes.

    PubMed

    Motegi, Hiromi; Tsuboi, Yuuri; Saga, Ayako; Kagami, Tomoko; Inoue, Maki; Toki, Hideaki; Minowa, Osamu; Noda, Tetsuo; Kikuchi, Jun

    2015-11-04

    There is an increasing need to use multivariate statistical methods for understanding biological functions, identifying the mechanisms of diseases, and exploring biomarkers. In addition to classical analyses such as hierarchical cluster analysis, principal component analysis, and partial least squares discriminant analysis, various multivariate strategies, including independent component analysis, non-negative matrix factorization, and multivariate curve resolution, have recently been proposed. However, determining the number of components is problematic. Despite the proposal of several different methods, no satisfactory approach has yet been reported. To resolve this problem, we implemented a new idea: classifying a component as "reliable" or "unreliable" based on the reproducibility of its appearance, regardless of the number of components in the calculation. Using the clustering method for classification, we applied this idea to multivariate curve resolution-alternating least squares (MCR-ALS). Comparisons between conventional and modified methods applied to proton nuclear magnetic resonance ((1)H-NMR) spectral datasets derived from known standard mixtures and biological mixtures (urine and feces of mice) revealed that more plausible results are obtained by the modified method. In particular, clusters containing little information were detected with reliability. This strategy, named "cluster-aided MCR-ALS," will facilitate the attainment of more reliable results in the metabolomics datasets.

  5. Multivariate Analysis of Longitudinal Rates of Change

    PubMed Central

    Bryan, Matthew; Heagerty, Patrick J.

    2016-01-01

    Longitudinal data allow direct comparison of the change in patient outcomes associated with treatment or exposure. Frequently, several longitudinal measures are collected that either reflect a common underlying health status, or characterize processes that are influenced in a similar way by covariates such as exposure or demographic characteristics. Statistical methods that can combine multivariate response variables into common measures of covariate effects have been proposed by Roy and Lin [1]; Proust-Lima, Letenneur and Jacqmin-Gadda [2]; and Gray and Brookmeyer [3] among others. Current methods for characterizing the relationship between covariates and the rate of change in multivariate outcomes are limited to select models. For example, Gray and Brookmeyer [3] introduce an “accelerated time” method which assumes that covariates rescale time in longitudinal models for disease progression. In this manuscript we detail an alternative multivariate model formulation that directly structures longitudinal rates of change, and that permits a common covariate effect across multiple outcomes. We detail maximum likelihood estimation for a multivariate longitudinal mixed model. We show via asymptotic calculations the potential gain in power that may be achieved with a common analysis of multiple outcomes. We apply the proposed methods to the analysis of a trivariate outcome for infant growth and compare rates of change for HIV infected and uninfected infants. PMID:27417129

  6. Risk factors for baclofen pump infection in children: a multivariate analysis.

    PubMed

    Spader, Heather S; Bollo, Robert J; Bowers, Christian A; Riva-Cambrin, Jay

    2016-06-01

    OBJECTIVE Intrathecal baclofen infusion systems to manage severe spasticity and dystonia are associated with higher infection rates in children than in adults. Factors unique to this population, such as poor nutrition and physical limitations for pump placement, have been hypothesized as the reasons for this disparity. The authors assessed potential risk factors for infection in a multivariate analysis. METHODS Patients who underwent implantation of a programmable pump and intrathecal catheter for baclofen infusion at a single center between January 1, 2000, and March 1, 2012, were identified in this retrospective cohort study. The primary end point was infection. Potential risk factors investigated included preoperative (i.e., demographics, body mass index [BMI], gastrostomy tube, tracheostomy, previous spinal fusion), intraoperative (i.e., surgeon, antibiotics, pump size, catheter location), and postoperative (i.e., wound dehiscence, CSF leak, and number of revisions) factors. Univariate analysis was performed, and a multivariate logistic regression model was created to identify independent risk factors for infection. RESULTS A total of 254 patients were evaluated. The overall infection rate was 9.8%. Univariate analysis identified young age, shorter height, lower weight, dehiscence, CSF leak, and number of revisions within 6 months of pump placement as significantly associated with infection. Multivariate analysis identified young age, dehiscence, and number of revisions as independent risk factors for infection. CONCLUSIONS Young age, wound dehiscence, and number of revisions were independent risk factors for infection in this pediatric cohort. A low BMI and the presence of either a gastrostomy or tracheostomy were not associated with infection and may not be contraindications for this procedure.

  7. Investigating conflict in ICUs - Is the clinicians’ perspective enough?

    PubMed Central

    Schuster, Rachel A.; Hong, Seo Yeon; Arnold, Robert M.; White, Douglas B.

    2013-01-01

    Objective Most studies have assessed conflict between clinicians and surrogate decision makers in ICUs from only clinicians’ perspectives. It is unknown if surrogates’ perceptions differ from clinicians’. We sought to determine the degree of agreement between physicians and surrogates about conflict, and to identify predictors of physician-surrogate conflict. Design Prospective cohort study. Setting Four ICUs of two hospitals in San Francisco, California. Patients 230 surrogate decision makers and 100 physicians of 175 critically ill patients. Measurements Questionnaires addressing participants’ perceptions of whether there was physician-surrogate conflict, as well as attitudes and preferences about clinician-surrogate communication; kappa scores to quantify physician-surrogate concordance about the presence of conflict; and hierarchical multivariate modeling to determine predictors of conflict. Main Results Either the physician or surrogate identified conflict in 63% of cases. Physicians were less likely to perceive conflict than surrogates (27.8% vs 42.3%; p=0.007). Agreement between physicians and surrogates about conflict was poor (kappa = 0.14). Multivariable analysis with surrogate-assessed conflict as the outcome revealed that higher levels of surrogates’ satisfaction with physicians’ bedside manner were associated with lower odds of conflict (OR: 0.75 per 1 point increase in satisfaction, 95% CI 0.59–0.96). Multivariable analysis with physician-assessed conflict as the outcome revealed that the surrogate having felt discriminated against in the healthcare setting was associated with higher odds of conflict (OR 17.5, 95% CI 1.6–190.1) while surrogates’ satisfaction with physicians’ bedside manner was associated with lower odds of conflict (0–10 scale, OR 0.76 per 1 point increase, 95% CI 0.58–0.99). Conclusions Conflict between physicians and surrogates is common in ICUs. There is little agreement between physicians and surrogates about whether physician-surrogate conflict has occurred. Further work is needed to develop reliable and valid methods to assess conflict. In the interim, future studies should assess conflict from the perspective of both clinicians and surrogates. PMID:24434440

  8. Mannitol and Outcome in Intracerebral Hemorrhage: Propensity Score and Multivariable Intensive Blood Pressure Reduction in Acute Cerebral Hemorrhage Trial 2 Results.

    PubMed

    Wang, Xia; Arima, Hisatomi; Yang, Jie; Zhang, Shihong; Wu, Guojun; Woodward, Mark; Muñoz-Venturelli, Paula; Lavados, Pablo M; Stapf, Christian; Robinson, Thompson; Heeley, Emma; Delcourt, Candice; Lindley, Richard I; Parsons, Mark; Chalmers, John; Anderson, Craig S

    2015-10-01

    Mannitol is often used to reduce cerebral edema in acute intracerebral hemorrhage but without strong supporting evidence of benefit. We aimed to determine the impact of mannitol on outcome among participants of the Intensive Blood Pressure Reduction in Acute Cerebral Hemorrhage Trial (INTERACT2). INTERACT2 was an international, open, blinded end point, randomized controlled trial of 2839 patients with spontaneous intracerebral hemorrhage (<6 hours) and elevated systolic blood pressure allocated to intensive (target systolic blood pressure, <140 mm Hg within 1 hour) or guideline-recommended (target systolic blood pressure, <180 mm Hg) blood pressure-lowering treatment. Propensity score and multivariable analyses were performed to investigate the relationship between mannitol treatment (within 7 days) and poor outcome, defined by death or major disability on the modified Rankin Scale score (3-6) at 90 days. There was no significant difference in poor outcome between mannitol (n=1533) and nonmannitol (n=993) groups: propensity score-matched odds ratio of 0.90 (95% confidence interval, 0.75-1.09; P=0.30) and multivariable odds ratio of 0.87 (95% confidence interval, 0.71-1.07; P=0.18). Although a better outcome was suggested in patients with larger (≥15 mL) than those with smaller (<15 mL) baseline hematomas who received mannitol (odds ratio, 0.52 [95% confidence interval, 0.35-0.78] versus odds ratio, 0.91 [95% confidence interval, 0.72-1.15]; P homogeneity<0.03 in propensity score analyses), the association was not consistent in analyses across other cutoff points (≥10 and ≥20 mL) and for differing grades of neurological severity. Mannitol was not associated with excess serious adverse events. Mannitol seems safe but might not improve outcome in patients with acute intracerebral hemorrhage. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00716079. © 2015 American Heart Association, Inc.

  9. GC × GC-TOFMS and supervised multivariate approaches to study human cadaveric decomposition olfactive signatures.

    PubMed

    Stefanuto, Pierre-Hugues; Perrault, Katelynn A; Stadler, Sonja; Pesesse, Romain; LeBlanc, Helene N; Forbes, Shari L; Focant, Jean-François

    2015-06-01

    In forensic thanato-chemistry, the understanding of the process of soft tissue decomposition is still limited. A better understanding of the decomposition process and the characterization of the associated volatile organic compounds (VOC) can help to improve the training of victim recovery (VR) canines, which are used to search for trapped victims in natural disasters or to locate corpses during criminal investigations. The complexity of matrices and the dynamic nature of this process require the use of comprehensive analytical methods for investigation. Moreover, the variability of the environment and between individuals creates additional difficulties in terms of normalization. The resolution of the complex mixture of VOCs emitted by a decaying corpse can be improved using comprehensive two-dimensional gas chromatography (GC × GC), compared to classical single-dimensional gas chromatography (1DGC). This study combines the analytical advantages of GC × GC coupled to time-of-flight mass spectrometry (TOFMS) with the data handling robustness of supervised multivariate statistics to investigate the VOC profile of human remains during early stages of decomposition. Various supervised multivariate approaches are compared to interpret the large data set. Moreover, early decomposition stages of pig carcasses (typically used as human surrogates in field studies) are also monitored to obtain a direct comparison of the two VOC profiles and estimate the robustness of this human decomposition analog model. In this research, we demonstrate that pig and human decomposition processes can be described by the same trends for the major compounds produced during the early stages of soft tissue decomposition.

  10. Obtaining appropriate interval estimates for age when multiple indicators are used: evaluation of an ad-hoc procedure.

    PubMed

    Fieuws, Steffen; Willems, Guy; Larsen-Tangmose, Sara; Lynnerup, Niels; Boldsen, Jesper; Thevissen, Patrick

    2016-03-01

    When an estimate of age is needed, typically multiple indicators are present as found in skeletal or dental information. There exists a vast literature on approaches to estimate age from such multivariate data. Application of Bayes' rule has been proposed to overcome drawbacks of classical regression models but becomes less trivial as soon as the number of indicators increases. Each of the age indicators can lead to a different point estimate ("the most plausible value for age") and a prediction interval ("the range of possible values"). The major challenge in the combination of multiple indicators is not the calculation of a combined point estimate for age but the construction of an appropriate prediction interval. Ignoring the correlation between the age indicators results in intervals being too small. Boldsen et al. (2002) presented an ad-hoc procedure to construct an approximate confidence interval without the need to model the multivariate correlation structure between the indicators. The aim of the present paper is to draw attention to this pragmatic approach and to evaluate its performance in a practical setting. This is all the more needed since recent publications ignore the need for interval estimation. To illustrate and evaluate the method, third molar scores from Köhler et al. (1995) are used to estimate age in a dataset of 3200 male subjects in the juvenile age range.


  11. Clinical value of circulating endothelial cell levels in metastatic colorectal cancer patients treated with first-line chemotherapy and bevacizumab.

    PubMed

    Malka, D; Boige, V; Jacques, N; Vimond, N; Adenis, A; Boucher, E; Pierga, J Y; Conroy, T; Chauffert, B; François, E; Guichard, P; Galais, M P; Cvitkovic, F; Ducreux, M; Farace, F

    2012-04-01

    We investigated whether circulating endothelial cells (CECs) predict clinical outcome of first-line chemotherapy and bevacizumab in metastatic colorectal cancer (mCRC) patients. In a substudy of the randomized phase II FNCLCC ACCORD 13/0503 trial, CECs (CD45- CD31+ CD146+ 7-amino-actinomycin- cells) were enumerated in 99 patients by four-color flow cytometry at baseline and after one cycle of treatment. We correlated CEC levels with objective response rate (ORR), 6-month progression-free survival (PFS) rate (primary end point of the trial), PFS, and overall survival (OS). Multivariate analyses of potential prognostic factors, including CEC counts and Köhne score, were carried out. By multivariate analysis, high baseline CEC levels were the only independent prognostic factor for 6-month PFS rate (P < 0.01) and were independently associated with worse PFS (P = 0.02). High CEC levels after one cycle were the only independent prognostic factor for ORR (P = 0.03). High CEC levels at both time points independently predicted worse ORR (P = 0.025), 6-month PFS rate (P = 0.007), and PFS (P = 0.02). Köhne score was the only variable associated with OS. CEC levels at baseline and after one treatment cycle may independently predict ORR and PFS in mCRC patients starting first-line bevacizumab and chemotherapy.

  12. Analysis of algae growth mechanism and water bloom prediction under the effect of multi-affecting factor.

    PubMed

    Wang, Li; Wang, Xiaoyi; Jin, Xuebo; Xu, Jiping; Zhang, Huiyan; Yu, Jiabin; Sun, Qian; Gao, Chong; Wang, Lingbin

    2017-03-01

    Current methods describe the formation process of algae inaccurately and predict water blooms with low precision. In this paper, the chemical mechanism of algae growth is analyzed, and a correlation analysis of chlorophyll-a and algal density is conducted by chemical measurement. Taking into account the influence of multiple factors on algae growth and water blooms, a comprehensive prediction method combining multivariate time series analysis and intelligent models is put forward. First, the main factors that affect the reproduction of algae are analyzed through the process of photosynthesis. A compensation prediction method for multivariate time series analysis, based on a neural network and a Support Vector Machine, is proposed and combined with Kernel Principal Component Analysis to reduce the dimension of the bloom-influencing factors. A Genetic Algorithm is then applied to improve the generalization ability of the BP network and the Least Squares Support Vector Machine. Experimental results show that this method better compensates the multivariate time series prediction model and is an effective way to improve the description accuracy of algae growth and the prediction precision of water blooms.
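    A much-reduced sketch of the pipeline described above: Kernel Principal Component Analysis compresses a set of hypothetical environmental drivers, and a support vector regressor (standing in for the Least Squares SVM, with the Genetic Algorithm tuning step omitted) predicts a chlorophyll-a proxy. All variables and parameter values are synthetic.

      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.svm import SVR
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import mean_absolute_error

      rng = np.random.default_rng(3)
      n = 300
      # hypothetical multivariate drivers: temperature, irradiance, total P, total N, pH, flow
      X = rng.normal(size=(n, 6))
      chl_a = 2.0 + 1.5 * X[:, 0] + 0.8 * X[:, 2] * X[:, 3] + 0.3 * rng.normal(size=n)  # chlorophyll-a proxy

      X_tr, X_te, y_tr, y_te = train_test_split(X, chl_a, random_state=0)
      model = make_pipeline(StandardScaler(),
                            KernelPCA(n_components=4, kernel="rbf", gamma=0.2),  # nonlinear dimension reduction
                            SVR(C=10.0, epsilon=0.1))                            # stand-in for LSSVM; GA tuning omitted
      model.fit(X_tr, y_tr)
      print("MAE on held-out data:", round(mean_absolute_error(y_te, model.predict(X_te)), 3))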

  13. The influence of television and video game use on attention and school problems: a multivariate analysis with other risk factors controlled.

    PubMed

    Ferguson, Christopher J

    2011-06-01

    Research on youth mental health has increasingly indicated the importance of multivariate analyses of multiple risk factors for negative outcomes. Television and video game use have often been posited as potential contributors to attention problems, but previous studies have not always been well-controlled or used well-validated outcome measures. The current study examines the multivariate nature of risk factors for attention problems symptomatic of attention deficit hyperactivity disorder and poor school performance. A predominantly Hispanic population of 603 children (ages 10-14) and their parents/guardians responded to multiple behavioral measures. Outcome measures included parent and child reported attention problem behaviors on the Child Behavior Checklist (CBCL) as well as poor school performance as measured by grade point average (GPA). Results found that internal factors such as male gender, antisocial traits, family environment and anxiety best predicted attention problems. School performance was best predicted by family income. Television and video game use, whether total time spent using, or exposure to violent content specifically, did not predict attention problems or GPA. Television and video game use do not appear to be significant predictors of childhood attention problems. Intervention and prevention efforts may be better spent on other risk factors. Copyright © 2010 Elsevier Ltd. All rights reserved.

  14. Rapid and Simultaneous Prediction of Eight Diesel Quality Parameters through ATR-FTIR Analysis.

    PubMed

    Nespeca, Maurilio Gustavo; Hatanaka, Rafael Rodrigues; Flumignan, Danilo Luiz; de Oliveira, José Eduardo

    2018-01-01

    Quality assessment of diesel fuel is highly necessary for society, but the costs and time spent are very high while using standard methods. Therefore, this study aimed to develop an analytical method capable of simultaneously determining eight diesel quality parameters (density; flash point; total sulfur content; distillation temperatures at 10% (T10), 50% (T50), and 85% (T85) recovery; cetane index; and biodiesel content) through attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy and the multivariate regression method, partial least squares (PLS). For this purpose, the quality parameters of 409 samples were determined using standard methods, and their spectra were acquired in the range of 4000–650 cm−1. The use of the multivariate filters, generalized least squares weighting (GLSW) and orthogonal signal correction (OSC), was evaluated to improve the signal-to-noise ratio of the models. Likewise, four variable selection approaches were tested: manual exclusion, forward interval PLS (FiPLS), backward interval PLS (BiPLS), and genetic algorithm (GA). The multivariate filters and variable selection algorithms generated better-fitted and more accurate PLS models. According to the validation, the FTIR/PLS models presented accuracy comparable to the reference methods and, therefore, the proposed method can be applied in routine diesel monitoring to significantly reduce costs and analysis time.
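    The core FTIR/PLS step can be sketched as follows with scikit-learn, using synthetic "spectra" in place of the 409 measured samples and only two of the eight quality parameters; the multivariate filters (GLSW, OSC) and the variable selection algorithms are omitted. Component numbers and noise levels are arbitrary assumptions.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(4)
      n_samples, n_wavenumbers = 120, 800           # stand-in for a 4000-650 cm-1 ATR-FTIR grid
      loadings = rng.normal(size=(3, n_wavenumbers))
      scores = rng.normal(size=(n_samples, 3))
      X = scores @ loadings + 0.05 * rng.normal(size=(n_samples, n_wavenumbers))   # synthetic spectra
      Y = scores @ rng.normal(size=(3, 2)) + 0.1 * rng.normal(size=(n_samples, 2)) # e.g. density, cetane index

      pls = PLSRegression(n_components=5)           # latent variables chosen by cross-validation in practice
      Y_cv = cross_val_predict(pls, X, Y, cv=10)
      for j, name in enumerate(["density", "cetane index"]):
          print(name, "cross-validated R2:", round(r2_score(Y[:, j], Y_cv[:, j]), 3))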

  15. Rapid and Simultaneous Prediction of Eight Diesel Quality Parameters through ATR-FTIR Analysis

    PubMed Central

    Hatanaka, Rafael Rodrigues; Flumignan, Danilo Luiz; de Oliveira, José Eduardo

    2018-01-01

    Quality assessment of diesel fuel is highly necessary for society, but the costs and time spent are very high while using standard methods. Therefore, this study aimed to develop an analytical method capable of simultaneously determining eight diesel quality parameters (density; flash point; total sulfur content; distillation temperatures at 10% (T10), 50% (T50), and 85% (T85) recovery; cetane index; and biodiesel content) through attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy and the multivariate regression method, partial least squares (PLS). For this purpose, the quality parameters of 409 samples were determined using standard methods, and their spectra were acquired in the range of 4000–650 cm−1. The use of the multivariate filters, generalized least squares weighting (GLSW) and orthogonal signal correction (OSC), was evaluated to improve the signal-to-noise ratio of the models. Likewise, four variable selection approaches were tested: manual exclusion, forward interval PLS (FiPLS), backward interval PLS (BiPLS), and genetic algorithm (GA). The multivariate filters and variable selection algorithms generated better-fitted and more accurate PLS models. According to the validation, the FTIR/PLS models presented accuracy comparable to the reference methods and, therefore, the proposed method can be applied in routine diesel monitoring to significantly reduce costs and analysis time. PMID:29629209

  16. Non-parametric identification of multivariable systems: A local rational modeling approach with application to a vibration isolation benchmark

    NASA Astrophysics Data System (ADS)

    Voorhoeve, Robbert; van der Maas, Annemiek; Oomen, Tom

    2018-05-01

    Frequency response function (FRF) identification is often used as a basis for control systems design and as a starting point for subsequent parametric system identification. The aim of this paper is to develop a multiple-input multiple-output (MIMO) local parametric modeling approach for FRF identification of lightly damped mechanical systems with improved speed and accuracy. The proposed method is based on local rational models, which can efficiently handle the lightly damped resonant dynamics. A key aspect herein is the freedom in the multivariable rational model parametrizations. Several choices for such multivariable rational model parametrizations are proposed and investigated. For systems with many inputs and outputs, the required number of model parameters can rapidly increase, adversely affecting the performance of the local modeling approach. Therefore, low-order model structures are investigated. The structure of these low-order parametrizations leads to an undesired directionality in the identification problem. To address this, an iterative local rational modeling algorithm is proposed. As a special case, recently developed SISO algorithms are recovered. The proposed approach is successfully demonstrated on simulations and on an active vibration isolation system benchmark, confirming good performance of the method using significantly fewer parameters compared with alternative approaches.
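    A single-input single-output sketch of the local rational modeling idea: around each frequency bin a rational model is fitted to neighbouring input/output spectra by linearized (Levi-type) least squares, and the centre-bin value is taken as the FRF estimate. The second-order test system, window size and noise level are assumptions, and the transient/leakage terms and MIMO parametrizations discussed in the paper are left out.

      import numpy as np

      rng = np.random.default_rng(15)

      # Lightly damped second-order system, evaluated on a DFT frequency grid
      N, fs = 2048, 1000.0
      f = np.arange(1, N // 2) * fs / N
      w = 2 * np.pi * f
      wn, zeta = 2 * np.pi * 80.0, 0.005
      G_true = wn**2 / (wn**2 - w**2 + 2j * zeta * wn * w)

      # Noisy "measured" input/output spectra (leakage/transient terms are ignored here)
      U = rng.standard_normal(f.size) + 1j * rng.standard_normal(f.size)
      Y = G_true * U + 0.02 * (rng.standard_normal(f.size) + 1j * rng.standard_normal(f.size))

      def lrm(U, Y, k, R=5):
          """Local rational model around bin k, linearized in the Levi sense:
             (1 + a1*r) * Y = (b0 + b1*r) * U  ->  solve for [b0, b1, a1] by least squares."""
          r = np.arange(-R, R + 1)
          r = r[(k + r >= 0) & (k + r < U.size)]
          Uk, Yk = U[k + r], Y[k + r]
          A = np.column_stack([Uk, r * Uk, -r * Yk])
          theta, *_ = np.linalg.lstsq(A, Yk, rcond=None)
          return theta[0]                      # G_hat at the centre bin

      G_lrm = np.array([lrm(U, Y, k) for k in range(f.size)])
      G_etfe = Y / U                           # raw empirical estimate for comparison
      print("rms error, raw estimate  :", float(np.sqrt(np.mean(np.abs(G_etfe - G_true) ** 2))))
      print("rms error, local rational:", float(np.sqrt(np.mean(np.abs(G_lrm - G_true) ** 2))))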

  17. Spatial Elucidation of Spinal Cord Lipid- and Metabolite- Regulations in Amyotrophic Lateral Sclerosis

    NASA Astrophysics Data System (ADS)

    Hanrieder, Jörg; Ewing, Andrew G.

    2014-06-01

    Amyotrophic lateral sclerosis (ALS) is a devastating, rapidly progressing disease of the central nervous system that is characterized by motor neuron degeneration in the brain stem and the spinal cord. We employed time of flight secondary ion mass spectrometry (ToF-SIMS) to profile spatial lipid- and metabolite- regulations in post mortem human spinal cord tissue from ALS patients to investigate chemical markers of ALS pathogenesis. ToF-SIMS scans and multivariate analysis of image and spectral data were performed on thoracic human spinal cord sections. Multivariate statistics of the image data allowed delineation of anatomical regions of interest based on their chemical identity. Spectral data extracted from these regions were compared using two different approaches for multivariate statistics, for investigating ALS related lipid and metabolite changes. The results show a significant decrease for cholesterol, triglycerides, and vitamin E in the ventral horn of ALS samples, which is presumably a consequence of motor neuron degeneration. Conversely, the biogenic mediator lipid lysophosphatidylcholine and its fragments were increased in ALS ventral spinal cord, pointing towards neuroinflammatory mechanisms associated with neuronal cell death. ToF-SIMS imaging is a promising approach for chemical histology and pathology for investigating the subcellular mechanisms underlying motor neuron degeneration in amyotrophic lateral sclerosis.

  18. Precise control of flexible manipulators

    NASA Technical Reports Server (NTRS)

    Cannon, R. H., Jr.; Bindford, T. O.; Schmitz, E.

    1984-01-01

    The design and experimental testing of end point position controllers for a very flexible one link lightweight manipulator are summarized. The latest upgraded version of the experimental setup and the basic differences between conventional joint angle feedback and end point position feedback are described. A general procedure for application of modern control methods to the problem is outlined. The relationship between weighting parameters and the bandwidth and control stiffness of the resulting end point position closed loop system is shown. It is found that joint rate angle feedback in addition to the primary end point position sensor is essential for adequate disturbance rejection capability of the closed loop system. The use of a low order multivariable compensator design computer code, called Sandy, is documented. A solution to the problem of control mode switching between position sensor sets is outlined. The proof of concept for end point position feedback for a one link flexible manipulator was demonstrated. The bandwidth obtained with the experimental end point position controller is about twice as fast as the beam's first natural cantilevered frequency, and comes within a factor of four of the absolute physical speed limit imposed by the wave propagation time of the beam.

  19. A GPU-Based Implementation of the Firefly Algorithm for Variable Selection in Multivariate Calibration Problems

    PubMed Central

    de Paula, Lauro C. M.; Soares, Anderson S.; de Lima, Telma W.; Delbem, Alexandre C. B.; Coelho, Clarimar J.; Filho, Arlindo R. G.

    2014-01-01

    Several variable selection algorithms in multivariate calibration can be accelerated using Graphics Processing Units (GPU). Among these algorithms, the Firefly Algorithm (FA) is a recently proposed metaheuristic that may be used for variable selection. This paper presents a GPU-based FA (FA-MLR) with multiobjective formulation for variable selection in multivariate calibration problems and compares it with some traditional sequential algorithms in the literature. The advantage of the proposed implementation is demonstrated in an example involving a relatively large number of variables. The results showed that the FA-MLR, in comparison with the traditional algorithms, is a more suitable choice and a relevant contribution to the variable selection problem. Additionally, the results also demonstrated that the FA-MLR implemented on a GPU can be five times faster than its sequential implementation. PMID:25493625
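    A compact, CPU-only sketch of how a firefly search can drive variable selection for multiple linear regression: real-coded firefly positions are thresholded into variable subsets, and brightness is a penalized least-squares fit. The GPU parallelization and the paper's exact multiobjective formulation are not reproduced; all parameter values are arbitrary.

      import numpy as np

      rng = np.random.default_rng(5)

      # Synthetic multivariate calibration problem: 40 candidate variables, 5 truly informative
      n, p = 80, 40
      X = rng.normal(size=(n, p))
      beta_true = np.zeros(p); beta_true[:5] = [1.5, -2.0, 1.0, 0.8, -1.2]
      y = X @ beta_true + 0.2 * rng.normal(size=n)

      def fitness(mask):
          """Negative residual error of an MLR model restricted to the selected variables."""
          idx = np.flatnonzero(mask)
          if idx.size == 0:
              return -np.inf
          coef, *_ = np.linalg.lstsq(X[:, idx], y, rcond=None)
          resid = y - X[:, idx] @ coef
          return -(resid @ resid) - 0.5 * idx.size          # penalize large subsets (multiobjective stand-in)

      # Real-coded firefly positions in [0,1]^p; a variable is "selected" when its coordinate > 0.5
      n_fireflies, n_iter, beta0, gamma, alpha = 20, 60, 1.0, 1.0, 0.1
      pos = rng.random((n_fireflies, p))
      light = np.array([fitness(x > 0.5) for x in pos])
      for _ in range(n_iter):
          for i in range(n_fireflies):
              for j in range(n_fireflies):
                  if light[j] > light[i]:                   # move dimmer firefly i toward brighter j
                      r2 = np.sum((pos[i] - pos[j]) ** 2)
                      pos[i] += beta0 * np.exp(-gamma * r2) * (pos[j] - pos[i]) \
                                + alpha * (rng.random(p) - 0.5)
                      pos[i] = np.clip(pos[i], 0, 1)
                      light[i] = fitness(pos[i] > 0.5)
      best = pos[np.argmax(light)] > 0.5
      print("selected variables:", np.flatnonzero(best))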

  20. A GPU-Based Implementation of the Firefly Algorithm for Variable Selection in Multivariate Calibration Problems.

    PubMed

    de Paula, Lauro C M; Soares, Anderson S; de Lima, Telma W; Delbem, Alexandre C B; Coelho, Clarimar J; Filho, Arlindo R G

    2014-01-01

    Several variable selection algorithms in multivariate calibration can be accelerated using Graphics Processing Units (GPU). Among these algorithms, the Firefly Algorithm (FA) is a recently proposed metaheuristic that may be used for variable selection. This paper presents a GPU-based FA (FA-MLR) with multiobjective formulation for variable selection in multivariate calibration problems and compares it with some traditional sequential algorithms in the literature. The advantage of the proposed implementation is demonstrated in an example involving a relatively large number of variables. The results showed that the FA-MLR, in comparison with the traditional algorithms, is a more suitable choice and a relevant contribution to the variable selection problem. Additionally, the results also demonstrated that the FA-MLR implemented on a GPU can be five times faster than its sequential implementation.

  1. A system to build distributed multivariate models and manage disparate data sharing policies: implementation in the scalable national network for effectiveness research.

    PubMed

    Meeker, Daniella; Jiang, Xiaoqian; Matheny, Michael E; Farcas, Claudiu; D'Arcy, Michel; Pearlman, Laura; Nookala, Lavanya; Day, Michele E; Kim, Katherine K; Kim, Hyeoneui; Boxwala, Aziz; El-Kareh, Robert; Kuo, Grace M; Resnic, Frederic S; Kesselman, Carl; Ohno-Machado, Lucila

    2015-11-01

    Centralized and federated models for sharing data in research networks currently exist. To build multivariate data analysis for centralized networks, transfer of patient-level data to a central computation resource is necessary. The authors implemented distributed multivariate models for federated networks in which patient-level data is kept at each site and data exchange policies are managed in a study-centric manner. The objective was to implement infrastructure that supports the functionality of some existing research networks (e.g., cohort discovery, workflow management, and estimation of multivariate analytic models on centralized data) while adding additional important new features, such as algorithms for distributed iterative multivariate models, a graphical interface for multivariate model specification, synchronous and asynchronous response to network queries, investigator-initiated studies, and study-based control of staff, protocols, and data sharing policies. Based on the requirements gathered from statisticians, administrators, and investigators from multiple institutions, the authors developed infrastructure and tools to support multisite comparative effectiveness studies using web services for multivariate statistical estimation in the SCANNER federated network. The authors implemented massively parallel (map-reduce) computation methods and a new policy management system to enable each study initiated by network participants to define the ways in which data may be processed, managed, queried, and shared. The authors illustrated the use of these systems among institutions with highly different policies and operating under different state laws. Federated research networks need not limit distributed query functionality to count queries, cohort discovery, or independently estimated analytic models. Multivariate analyses can be efficiently and securely conducted without patient-level data transport, allowing institutions with strict local data storage requirements to participate in sophisticated analyses based on federated research networks. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
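    The notion of estimating a multivariate model without moving patient-level data can be sketched with a distributed Newton-Raphson for logistic regression: each simulated "site" returns only its local gradient and Hessian, which a coordinator aggregates. This is a generic illustration of distributed iterative estimation, not the SCANNER services or their map-reduce implementation.

      import numpy as np

      rng = np.random.default_rng(6)

      # Three "sites", each holding its own patient-level data that never leaves the site
      def make_site(n):
          X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # intercept + 2 covariates
          p = 1 / (1 + np.exp(-(X @ np.array([-0.5, 1.0, -0.8]))))
          return X, rng.binomial(1, p)

      sites = [make_site(n) for n in (300, 500, 200)]

      def local_grad_hess(beta, X, y):
          """Each site returns only aggregate statistics (gradient and Hessian), not records."""
          p = 1 / (1 + np.exp(-(X @ beta)))
          return X.T @ (y - p), -(X.T * (p * (1 - p))) @ X

      # Coordinator: distributed iterative Newton-Raphson for a shared logistic model
      beta = np.zeros(3)
      for _ in range(25):
          grads, hesses = zip(*(local_grad_hess(beta, X, y) for X, y in sites))
          beta -= np.linalg.solve(sum(hesses), sum(grads))
      print("federated coefficient estimates:", np.round(beta, 3))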

  2. Non-invasive evaluation of stable renal allograft function using point shear-wave elastography.

    PubMed

    Kim, Bom Jun; Kim, Chan Kyo; Park, Jung Jae

    2018-01-01

    To investigate the feasibility of point shear-wave elastography (SWE) in evaluating patients with stable renal allograft function who underwent protocol biopsies. 95 patients with stable renal allograft function who underwent ultrasound-guided biopsies at predefined time points (10 days or 1 year after transplantation) were enrolled. Ultrasound and point SWE examinations were performed immediately before protocol biopsies. Patients were categorized into two groups: subclinical rejection (SCR) and non-SCR. Tissue elasticity (kPa) on SWE was measured in the cortex of all renal allografts. SCR was pathologically confirmed in 34 patients. Tissue elasticity of the SCR group (31.0 kPa) was significantly greater than that of the non-SCR group (24.5 kPa) (p = 0.016), while the resistive index did not show a significant difference between the two groups (p = 0.112). Tissue elasticity in renal allografts demonstrated a significant, moderate negative correlation with estimated glomerular filtration rate (correlation coefficient = -0.604, p < 0.001). Tissue elasticity was not an independent factor for SCR prediction on multivariate analysis. As a non-invasive tool, point SWE appears feasible in distinguishing between patients with and without SCR in stable functioning renal allografts. Moreover, it may demonstrate the functional state of renal allografts. Advances in knowledge: On point SWE, SCR has greater tissue elasticity than non-SCR.

  3. Network meta-analysis of multiple outcome measures accounting for borrowing of information across outcomes.

    PubMed

    Achana, Felix A; Cooper, Nicola J; Bujkiewicz, Sylwia; Hubbard, Stephanie J; Kendrick, Denise; Jones, David R; Sutton, Alex J

    2014-07-21

    Network meta-analysis (NMA) enables simultaneous comparison of multiple treatments while preserving randomisation. When summarising evidence to inform an economic evaluation, it is important that the analysis accurately reflects the dependency structure within the data, as correlations between outcomes may have implications for estimating the net benefit associated with treatment. A multivariate NMA offers a framework for evaluating multiple treatments across multiple outcome measures while accounting for the correlation structure between outcomes. The standard NMA model is extended to multiple outcome settings in two stages. In the first stage, information is borrowed across outcomes as well as across studies through modelling the within-study and between-study correlation structure. In the second stage, we make use of the additional assumption that intervention effects are exchangeable between outcomes to predict effect estimates for all outcomes, including effect estimates on outcomes where evidence is either sparse or the treatment had not been considered by any one of the studies included in the analysis. We apply the methods to binary outcome data from a systematic review evaluating the effectiveness of nine home safety interventions on uptake of three poisoning prevention practices (safe storage of medicines, safe storage of other household products, and possession of a poison control centre telephone number) in households with children. Analyses are conducted in WinBUGS using Markov Chain Monte Carlo (MCMC) simulations. Univariate and the first stage multivariate models produced broadly similar point estimates of intervention effects but the uncertainty around the multivariate estimates varied depending on the prior distribution specified for the between-study covariance structure. The second stage multivariate analyses produced more precise effect estimates while enabling intervention effects to be predicted for all outcomes, including intervention effects on outcomes not directly considered by the studies included in the analysis. Accounting for the dependency between outcomes in a multivariate meta-analysis may or may not improve the precision of effect estimates from a network meta-analysis compared to analysing each outcome separately.

  4. High-order interactions observed in multi-task intrinsic networks are dominant indicators of aberrant brain function in schizophrenia

    PubMed Central

    Plis, Sergey M; Sui, Jing; Lane, Terran; Roy, Sushmita; Clark, Vincent P; Potluru, Vamsi K; Huster, Rene J; Michael, Andrew; Sponheim, Scott R; Weisend, Michael P; Calhoun, Vince D

    2013-01-01

    Identifying the complex activity relationships present in rich, modern neuroimaging data sets remains a key challenge for neuroscience. The problem is hard because (a) the underlying spatial and temporal networks may be nonlinear and multivariate and (b) the observed data may be driven by numerous latent factors. Further, modern experiments often produce data sets containing multiple stimulus contexts or tasks processed by the same subjects. Fusing such multi-session data sets may reveal additional structure, but raises further statistical challenges. We present a novel analysis method for extracting complex activity networks from such multifaceted imaging data sets. Compared to previous methods, we choose a new point in the trade-off space, sacrificing detailed generative probability models and explicit latent variable inference in order to achieve robust estimation of multivariate, nonlinear group factors (“network clusters”). We apply our method to identify relationships of task-specific intrinsic networks in schizophrenia patients and control subjects from a large fMRI study. After identifying network-clusters characterized by within- and between-task interactions, we find significant differences between patient and control groups in interaction strength among networks. Our results are consistent with known findings of brain regions exhibiting deviations in schizophrenic patients. However, we also find high-order, nonlinear interactions that discriminate groups but that are not detected by linear, pair-wise methods. We additionally identify high-order relationships that provide new insights into schizophrenia but that have not been found by traditional univariate or second-order methods. Overall, our approach can identify key relationships that are missed by existing analysis methods, without losing the ability to find relationships that are known to be important. PMID:23876245

  5. Executive function, but not memory, associates with incident coronary heart disease and stroke.

    PubMed

    Rostamian, Somayeh; van Buchem, Mark A; Westendorp, Rudi G J; Jukema, J Wouter; Mooijaart, Simon P; Sabayan, Behnam; de Craen, Anton J M

    2015-09-01

    To evaluate the association of performance in cognitive domains executive function and memory with incident coronary heart disease and stroke in older participants without dementia. We included 3,926 participants (mean age 75 years, 44% male) at risk for cardiovascular diseases from the Prospective Study of Pravastatin in the Elderly at Risk (PROSPER) with Mini-Mental State Examination score ≥24 points. Scores on the Stroop Color-Word Test (selective attention) and the Letter Digit Substitution Test (processing speed) were converted to Z scores and averaged into a composite executive function score. Likewise, scores of the Picture Learning Test (immediate and delayed memory) were transformed into a composite memory score. Associations of executive function and memory were longitudinally assessed with risk of coronary heart disease and stroke using multivariable Cox regression models. During 3.2 years of follow-up, incidence rates of coronary heart disease and stroke were 30.5 and 12.4 per 1,000 person-years, respectively. In multivariable models, participants in the lowest third of executive function, as compared to participants in the highest third, had 1.85-fold (95% confidence interval [CI] 1.39-2.45) higher risk of coronary heart disease and 1.51-fold (95% CI 0.99-2.30) higher risk of stroke. Participants in the lowest third of memory had no increased risk of coronary heart disease (hazard ratio 0.99, 95% CI 0.74-1.32) or stroke (hazard ratio 0.87, 95% CI 0.57-1.32). Lower executive function, but not memory, is associated with higher risk of coronary heart disease and stroke. Lower executive function, as an independent risk indicator, might better reflect brain vascular pathologies. © 2015 American Academy of Neurology.

  6. Visualization of 3-D tensor fields

    NASA Technical Reports Server (NTRS)

    Hesselink, L.

    1996-01-01

    Second-order tensor fields have applications in many different areas of physics, such as general relativity and fluid mechanics. The wealth of multivariate information in tensor fields makes them more complex and abstract than scalar and vector fields. Visualization is a good technique for scientists to gain new insights from them. Visualizing a 3-D continuous tensor field is equivalent to simultaneously visualizing its three eigenvector fields. In the past, research has been conducted in the area of two-dimensional tensor fields. It was shown that degenerate points, defined as points where eigenvalues are equal to each other, are the basic singularities underlying the topology of tensor fields. Moreover, it was shown that eigenvectors never cross each other except at degenerate points. Since we live in a three-dimensional world, it is important for us to understand the underlying physics of this world. In this report, we describe a new method for locating degenerate points along with the conditions for classifying them in three-dimensional space. Finally, we discuss some topological features of three-dimensional tensor fields, and interpret topological patterns in terms of physical properties.
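    A brute-force numerical sketch of the degenerate-point idea: a symmetric tensor field is sampled on a grid and locations where two eigenvalues (nearly) coincide are flagged. The particular field, grid and tolerance are arbitrary, and the report's analytic classification conditions are not reproduced.

      import numpy as np

      def tensor_field(x, y, z):
          """A smooth symmetric 3x3 tensor field (purely illustrative)."""
          return np.array([[x,  y,  0.0],
                           [y, -x,  z],
                           [0.0, z,  x]])

      # Degenerate points are where at least two eigenvalues coincide; scan a grid and
      # flag locations where the smallest eigenvalue gap falls below a tolerance.
      grid = np.linspace(-1, 1, 21)
      tol, hits = 1e-2, []
      for x in grid:
          for y in grid:
              for z in grid:
                  w = np.linalg.eigvalsh(tensor_field(x, y, z))      # sorted eigenvalues
                  if min(w[1] - w[0], w[2] - w[1]) < tol:
                      hits.append((round(x, 2), round(y, 2), round(z, 2)))
      print(f"{len(hits)} near-degenerate grid points, e.g.:", hits[:5])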

  7. Application of Monte Carlo Method for Evaluation of Uncertainties of ITS-90 by Standard Platinum Resistance Thermometer

    NASA Astrophysics Data System (ADS)

    Palenčár, Rudolf; Sopkuliak, Peter; Palenčár, Jakub; Ďuriš, Stanislav; Suroviak, Emil; Halaj, Martin

    2017-06-01

    Evaluation of uncertainties of the temperature measurement by a standard platinum resistance thermometer calibrated at the defining fixed points according to ITS-90 is a problem that can be solved in different ways. The paper presents a procedure based on the propagation of distributions using the Monte Carlo method. The procedure employs generation of pseudo-random numbers for the input variables of resistances at the defining fixed points, assuming a multivariate Gaussian distribution for the input quantities. This allows taking into account the correlations among resistances at the defining fixed points. The assumption of a Gaussian probability density function is acceptable with respect to the several sources of uncertainty in the resistances. In the case of uncorrelated resistances at the defining fixed points, the method is applicable to any probability density function. Validation of the law of propagation of uncertainty using the Monte Carlo method is presented using specific data for a 25 Ω standard platinum resistance thermometer in the temperature range from 0 to 660 °C. Using this example, we demonstrate the suitability of the method by validating its results.
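    The propagation-of-distributions step can be sketched directly with NumPy: correlated fixed-point resistances are drawn from a multivariate Gaussian and pushed through the measurand (here a simple resistance ratio W). The numerical values, uncertainties and correlation coefficient are illustrative, not the paper's 25 Ω SPRT data.

      import numpy as np

      rng = np.random.default_rng(7)

      # Measured resistances (ohms) at the triple point of water and at one fixed point,
      # with standard uncertainties and a correlation coefficient (all values illustrative).
      mean = np.array([25.5500, 65.0000])          # [R_tpw, R_fixed_point]
      u = np.array([0.00020, 0.00050])
      rho = 0.6                                    # correlation from shared calibration influences
      cov = np.array([[u[0]**2,           rho * u[0] * u[1]],
                      [rho * u[0] * u[1], u[1]**2]])

      # Propagation of distributions: draw correlated inputs, evaluate the measurand W = R / R_tpw
      draws = rng.multivariate_normal(mean, cov, size=200_000)
      W = draws[:, 1] / draws[:, 0]
      lo, hi = np.percentile(W, [2.5, 97.5])
      print(f"W = {W.mean():.6f}, u(W) = {W.std(ddof=1):.2e}, 95% interval [{lo:.6f}, {hi:.6f}]")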

  8. Hierarchical Multinomial Processing Tree Models: A Latent-Trait Approach

    ERIC Educational Resources Information Center

    Klauer, Karl Christoph

    2010-01-01

    Multinomial processing tree models are widely used in many areas of psychology. A hierarchical extension of the model class is proposed, using a multivariate normal distribution of person-level parameters with the mean and covariance matrix to be estimated from the data. The hierarchical model allows one to take variability between persons into…

  9. Recoding Numerics to Geometrics for Complex Discrimination Tasks; A Feasibility Study of Coding Strategy.

    ERIC Educational Resources Information Center

    Simpkins, John D.

    Processing complex multivariate information effectively when relational properties of information sub-groups are ambiguous is difficult for man and man-machine systems. However, the information processing task is made easier through code study, cybernetic planning, and accurate display mechanisms. An exploratory laboratory study designed for the…

  10. Texture as a basis for acoustic classification of substrate in the nearshore region

    NASA Astrophysics Data System (ADS)

    Dennison, A.; Wattrus, N. J.

    2016-12-01

    Segmentation and classification of substrate type at two locations in Lake Superior are predicted using multivariate statistical processing of textural measures derived from shallow-water, high-resolution multibeam bathymetric data. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical characteristics of a sonar backscatter mosaic are dependent on substrate type. While classifying the bottom-type on the basis of backscatter alone can accurately predict and map bottom-type, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat mapping studies. Statistical processing can capture the pertinent details about the bottom-type that are rich in textural information. Further multivariate statistical processing can then isolate characteristic features, and provide the basis for an accurate classification scheme. Preliminary results from an analysis of bathymetric data and ground-truth samples collected from the Amnicon River, Superior, Wisconsin, and the Lester River, Duluth, Minnesota, demonstrate the ability to process and develop a novel classification scheme of the bottom type in two geomorphologically distinct areas.
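    A toy version of the texture-based workflow: simple textural measures are computed in windows over a synthetic bathymetric grid, reduced with PCA and clustered into substrate classes. The features, window size and two-class K-means step are placeholders for the multivariate processing described above.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(8)

      # Synthetic bathymetry: a smooth slope with a rough patch (stand-in for multibeam depths)
      ny, nx = 120, 120
      yy, xx = np.mgrid[0:ny, 0:nx]
      depth = 0.05 * xx + 0.3 * rng.standard_normal((ny, nx)) * (xx > 60)   # rough half

      def window_features(z, w=8):
          """Simple textural measures (local std, mean absolute gradient, range) on non-overlapping windows."""
          gy, gx = np.gradient(z)
          feats, coords = [], []
          for i in range(0, ny - w, w):
              for j in range(0, nx - w, w):
                  blk = z[i:i + w, j:j + w]
                  g = np.abs(gx[i:i + w, j:j + w]) + np.abs(gy[i:i + w, j:j + w])
                  feats.append([blk.std(), g.mean(), np.ptp(blk)])
                  coords.append((i, j))
          return np.array(feats), coords

      F, coords = window_features(depth)
      scores = PCA(n_components=2).fit_transform((F - F.mean(0)) / F.std(0))   # multivariate reduction
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
      print("windows per substrate class:", np.bincount(labels))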

  11. Multivariate data analysis on historical IPV production data for better process understanding and future improvements.

    PubMed

    Thomassen, Yvonne E; van Sprang, Eric N M; van der Pol, Leo A; Bakker, Wilfried A M

    2010-09-01

    Historical manufacturing data can potentially harbor a wealth of information for process optimization and enhancement of efficiency and robustness. To extract useful information, multivariate data analysis (MVDA) using projection methods is often applied. In this contribution, the results obtained from applying MVDA on data from inactivated polio vaccine (IPV) production runs are described. Data from over 50 batches at two different production scales (700-L and 1,500-L) were available. The explorative analysis performed on single unit operations indicated consistent manufacturing. Known outliers (e.g., rejected batches) were identified using principal component analysis (PCA). The source of operational variation was pinpointed to variation in inputs such as media. Other relevant process parameters were in control and, using this manufacturing data, could not be correlated to product quality attributes. The knowledge gained of the IPV production process, not only from the MVDA, but also from digitalizing the available historical data, has proven to be useful for troubleshooting, understanding the limitations of the available data and seeing opportunities for improvement. 2010 Wiley Periodicals, Inc.
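    A generic sketch of PCA-based detection of outlying batches from a historical batch table, using Hotelling's T² in the score space with an approximate F-based limit. The batch table is simulated and the number of components and control limit are arbitrary choices, not values from the IPV data set.

      import numpy as np
      from sklearn.decomposition import PCA
      from scipy import stats

      rng = np.random.default_rng(9)

      # Hypothetical batch summary table: rows = production runs, columns = unit-operation variables
      n_batches, n_vars = 55, 12
      X = rng.normal(size=(n_batches, n_vars))
      X[50:] += 2.5                                   # a few deliberately deviating ("rejected") runs

      Xc = (X - X.mean(0)) / X.std(0)
      pca = PCA(n_components=3).fit(Xc)
      T = pca.transform(Xc)

      # Hotelling's T2 in the score space with an approximate F-based control limit
      t2 = np.sum(T**2 / pca.explained_variance_, axis=1)
      a, n = 3, n_batches
      limit = a * (n - 1) * (n + 1) / (n * (n - a)) * stats.f.ppf(0.99, a, n - a)
      print("flagged batches:", np.flatnonzero(t2 > limit))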

  12. Process design and control of a twin screw hot melt extrusion for continuous pharmaceutical tamper-resistant tablet production.

    PubMed

    Baronsky-Probst, J; Möltgen, C-V; Kessler, W; Kessler, R W

    2016-05-25

    Hot melt extrusion (HME) is a well-known process within the plastic and food industries that has been utilized for the past several decades and is increasingly accepted by the pharmaceutical industry for continuous manufacturing. For tamper-resistant formulations of e.g. opioids, HME is the most efficient production technique. The focus of this study is thus to evaluate the manufacturability of the HME process for tamper-resistant formulations. Parameters such as the specific mechanical energy (SME), as well as the melt pressure and its standard deviation, are important and will be discussed in this study. In the first step, the existing process data are analyzed by means of multivariate data analysis. Key critical process parameters such as feed rate, screw speed, and the concentration of the API in the polymers are identified, and critical quality parameters of the tablet are defined. In the second step, a relationship between the critical material, product and process quality attributes are established by means of Design of Experiments (DoEs). The resulting SME and the temperature at the die are essential data points needed to indirectly qualify the degradation of the API, which should be minimal. NIR-spectroscopy is used to monitor the material during the extrusion process. In contrast to most applications in which the probe is directly integrated into the die, the optical sensor is integrated into the cooling line of the strands. This saves costs in the probe design and maintenance and increases the robustness of the chemometric models. Finally, a process measurement system is installed to monitor and control all of the critical attributes in real-time by means of first principles, DoE models, soft sensor models, and spectroscopic information. Overall, the process is very robust as long as the screw speed is kept low. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Unified theory for stochastic modelling of hydroclimatic processes: Preserving marginal distributions, correlation structures, and intermittency

    NASA Astrophysics Data System (ADS)

    Papalexiou, Simon Michael

    2018-05-01

    Hydroclimatic processes come in all "shapes and sizes". They are characterized by different spatiotemporal correlation structures and probability distributions that can be continuous, mixed-type, discrete or even binary. Simulating such processes by reproducing precisely their marginal distribution and linear correlation structure, including features like intermittency, can greatly improve hydrological analysis and design. Traditionally, modelling schemes are case specific and typically attempt to preserve few statistical moments providing inadequate and potentially risky distribution approximations. Here, a single framework is proposed that unifies, extends, and improves a general-purpose modelling strategy, based on the assumption that any process can emerge by transforming a specific "parent" Gaussian process. A novel mathematical representation of this scheme, introducing parametric correlation transformation functions, enables straightforward estimation of the parent-Gaussian process yielding the target process after the marginal back transformation, while it provides a general description that supersedes previous specific parameterizations, offering a simple, fast and efficient simulation procedure for every stationary process at any spatiotemporal scale. This framework, also applicable for cyclostationary and multivariate modelling, is augmented with flexible parametric correlation structures that parsimoniously describe observed correlations. Real-world simulations of various hydroclimatic processes with different correlation structures and marginals, such as precipitation, river discharge, wind speed, humidity, extreme events per year, etc., as well as a multivariate example, highlight the flexibility, advantages, and complete generality of the method.
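    The parent-Gaussian idea can be sketched in a few lines: a Gaussian AR(1) "parent" series is mapped through its CDF and then through the inverse CDF of a mixed (zero-inflated Gamma) target marginal, producing an intermittent, skewed series. Note that the sketch omits the paper's correlation transformation functions, so the lag-1 correlation of the generated series comes out lower than the parent's 0.7; all distribution parameters are arbitrary.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)

      # Parent Gaussian process: a stationary AR(1) with lag-1 correlation 0.7
      n, rho = 10_000, 0.7
      g = np.empty(n)
      g[0] = rng.standard_normal()
      for t in range(1, n):
          g[t] = rho * g[t - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()

      # Marginal back-transformation to an intermittent, skewed target (zero-inflated Gamma):
      # u = Phi(g); values below the dry probability become zero, the rest follow a Gamma.
      p_dry, shape, scale = 0.6, 0.8, 5.0
      u = stats.norm.cdf(g)
      x = np.where(u < p_dry, 0.0, stats.gamma.ppf((u - p_dry) / (1 - p_dry), a=shape, scale=scale))

      print("probability dry:", round(np.mean(x == 0), 3))
      print("lag-1 correlation of the generated series:", round(np.corrcoef(x[:-1], x[1:])[0, 1], 3))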

  14. Relating dynamic brain states to dynamic machine states: Human and machine solutions to the speech recognition problem

    PubMed Central

    Liu, Xunying; Zhang, Chao; Woodland, Phil; Fonteneau, Elisabeth

    2017-01-01

    There is widespread interest in the relationship between the neurobiological systems supporting human cognition and emerging computational systems capable of emulating these capacities. Human speech comprehension, poorly understood as a neurobiological process, is an important case in point. Automatic Speech Recognition (ASR) systems with near-human levels of performance are now available, which provide a computationally explicit solution for the recognition of words in continuous speech. This research aims to bridge the gap between speech recognition processes in humans and machines, using novel multivariate techniques to compare incremental ‘machine states’, generated as the ASR analysis progresses over time, to the incremental ‘brain states’, measured using combined electro- and magneto-encephalography (EMEG), generated as the same inputs are heard by human listeners. This direct comparison of dynamic human and machine internal states, as they respond to the same incrementally delivered sensory input, revealed a significant correspondence between neural response patterns in human superior temporal cortex and the structural properties of ASR-derived phonetic models. Spatially coherent patches in human temporal cortex responded selectively to individual phonetic features defined on the basis of machine-extracted regularities in the speech to lexicon mapping process. These results demonstrate the feasibility of relating human and ASR solutions to the problem of speech recognition, and suggest the potential for further studies relating complex neural computations in human speech comprehension to the rapidly evolving ASR systems that address the same problem domain. PMID:28945744

  15. Bayesian soft X-ray tomography using non-stationary Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Li, Dong; Svensson, J.; Thomsen, H.; Medina, F.; Werner, A.; Wolf, R.

    2013-08-01

    In this study, a Bayesian non-stationary Gaussian Process (GP) method for the inference of the soft X-ray emissivity distribution along with its associated uncertainties has been developed. For the investigation of equilibrium conditions and fast magnetohydrodynamic behaviors in nuclear fusion plasmas, it is of importance to infer, especially in the plasma center, spatially resolved soft X-ray profiles from a limited number of noisy line integral measurements. For this ill-posed inversion problem, Bayesian probability theory can provide a posterior probability distribution over all possible solutions under given model assumptions. Specifically, the use of a non-stationary GP to model the emission allows the model to adapt to the varying length scales of the underlying diffusion process. In contrast to other conventional methods, the prior regularization is realized in a probabilistic form, which enhances the capability of uncertainty analysis; as a consequence, scientists who are concerned about the reliability of their results will benefit from it. Under the assumption of normally distributed noise, the posterior distribution evaluated at a discrete number of points becomes a multivariate normal distribution whose mean and covariance are analytically available, making inversions and calculation of uncertainty fast. Additionally, the hyper-parameters embedded in the model assumption can be optimized through a Bayesian Occam's Razor formalism and thereby automatically adjust the model complexity. This method is shown to produce convincing reconstructions and good agreement with independently calculated results from the Maximum Entropy and Equilibrium-Based Iterative Tomography Algorithm methods.
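    For a linear forward model with Gaussian noise and a Gaussian process prior, the posterior used above is available in closed form. The sketch below reconstructs a one-dimensional "emissivity" profile from a handful of synthetic line integrals with a stationary squared-exponential prior; the non-stationary covariance and the hyper-parameter optimization described in the paper are not included.

      import numpy as np

      rng = np.random.default_rng(11)

      # Discretized 1-D emissivity profile and a few noisy line-integral measurements
      n_pix, n_chords = 60, 12
      x = np.linspace(0, 1, n_pix)
      true_f = np.exp(-((x - 0.5) / 0.15) ** 2)                 # peaked "emission" profile
      A = rng.random((n_chords, n_pix)) < 0.25                  # each chord integrates a subset of pixels
      A = A.astype(float) / n_pix
      sigma_n = 0.005
      d = A @ true_f + sigma_n * rng.standard_normal(n_chords)

      # (Stationary) squared-exponential GP prior on the profile; the paper uses a
      # non-stationary version whose length scale varies in space.
      ell, s2 = 0.1, 1.0
      K = s2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2) + 1e-8 * np.eye(n_pix)

      # Gaussian likelihood + Gaussian prior => Gaussian posterior, available in closed form
      S = A @ K @ A.T + sigma_n**2 * np.eye(n_chords)
      gain = K @ A.T @ np.linalg.inv(S)
      post_mean = gain @ d
      post_cov = K - gain @ A @ K
      print("max posterior std:", float(np.sqrt(np.diag(post_cov)).max()))
      print("reconstruction error (rms):", float(np.sqrt(np.mean((post_mean - true_f) ** 2))))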

  16. Bayesian soft X-ray tomography using non-stationary Gaussian Processes.

    PubMed

    Li, Dong; Svensson, J; Thomsen, H; Medina, F; Werner, A; Wolf, R

    2013-08-01

    In this study, a Bayesian non-stationary Gaussian Process (GP) method for the inference of the soft X-ray emissivity distribution along with its associated uncertainties has been developed. For the investigation of equilibrium conditions and fast magnetohydrodynamic behaviors in nuclear fusion plasmas, it is of importance to infer, especially in the plasma center, spatially resolved soft X-ray profiles from a limited number of noisy line integral measurements. For this ill-posed inversion problem, Bayesian probability theory can provide a posterior probability distribution over all possible solutions under given model assumptions. Specifically, the use of a non-stationary GP to model the emission allows the model to adapt to the varying length scales of the underlying diffusion process. In contrast to other conventional methods, the prior regularization is realized in a probabilistic form, which enhances the capability of uncertainty analysis; as a consequence, scientists who are concerned about the reliability of their results will benefit from it. Under the assumption of normally distributed noise, the posterior distribution evaluated at a discrete number of points becomes a multivariate normal distribution whose mean and covariance are analytically available, making inversions and calculation of uncertainty fast. Additionally, the hyper-parameters embedded in the model assumption can be optimized through a Bayesian Occam's Razor formalism and thereby automatically adjust the model complexity. This method is shown to produce convincing reconstructions and good agreement with independently calculated results from the Maximum Entropy and Equilibrium-Based Iterative Tomography Algorithm methods.

  17. Arabidopsis phenotyping through Geometric Morphometrics.

    PubMed

    Manacorda, Carlos A; Asurmendi, Sebastian

    2018-06-18

    Recently, much technical progress was achieved in the field of plant phenotyping. High-throughput platforms and the development of improved algorithms for rosette image segmentation make it now possible to extract shape and size parameters for genetic, physiological and environmental studies on a large scale. The development of low-cost phenotyping platforms and freeware resources make it possible to widely expand phenotypic analysis tools for Arabidopsis. However, objective descriptors of shape parameters that could be used independently of platform and segmentation software used are still lacking and shape descriptions still rely on ad hoc or even sometimes contradictory descriptors, which could make comparisons difficult and perhaps inaccurate. Modern geometric morphometrics is a family of methods in quantitative biology proposed to be the main source of data and analytical tools in the emerging field of phenomics studies. Based on the location of landmarks (corresponding points) over imaged specimens and by combining geometry, multivariate analysis and powerful statistical techniques, these tools offer the possibility to reproducibly and accurately account for shape variations amongst groups and measure them in shape distance units. Here, a particular scheme of landmarks placement on Arabidopsis rosette images is proposed to study shape variation in the case of viral infection processes. Shape differences between controls and infected plants are quantified throughout the infectious process and visualized. Quantitative comparisons between two unrelated ssRNA+ viruses are shown and reproducibility issues are assessed. Combined with the newest automated platforms and plant segmentation procedures, geometric morphometric tools could boost phenotypic features extraction and processing in an objective, reproducible manner.
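    A minimal landmark-based sketch: synthetic rosette-like landmark configurations for two groups are superimposed by Procrustes analysis and summarized with PCA of the aligned coordinates. Aligning everything to the first specimen is a shortcut for full generalized Procrustes analysis, and the landmark scheme and "infection" effect are invented for illustration.

      import numpy as np
      from scipy.spatial import procrustes
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(12)

      # Hypothetical landmark data: 12 (x, y) landmarks per rosette, two groups of plants
      n_landmarks, n_per_group = 12, 20
      angles = np.linspace(0, 2 * np.pi, n_landmarks, endpoint=False)
      base = np.column_stack([np.cos(angles), np.sin(angles)])

      def specimen(infected):
          shape = base.copy()
          if infected:
              shape[:, 1] *= 0.7                        # infected rosettes drawn as "flattened"
          # random scale, rotation, translation + landmark placement noise
          th = rng.uniform(0, 2 * np.pi)
          R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
          return (1 + rng.uniform(-0.2, 0.2)) * shape @ R + rng.uniform(-5, 5, 2) \
                 + 0.03 * rng.standard_normal(shape.shape)

      shapes = [specimen(i >= n_per_group) for i in range(2 * n_per_group)]
      labels = np.repeat(["control", "infected"], n_per_group)

      # Superimpose every configuration on the first one (full GPA would iterate to a mean shape)
      aligned = np.array([procrustes(shapes[0], s)[1].ravel() for s in shapes])
      scores = PCA(n_components=2).fit_transform(aligned)
      for g in ["control", "infected"]:
          print(g, "mean PC1 score:", round(scores[labels == g, 0].mean(), 3))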

  18. Quality-by-Design approach to monitor the operation of a batch bioreactor in an industrial avian vaccine manufacturing process.

    PubMed

    Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano

    2015-10-10

    Monitoring batch bioreactors is a complex task, due to the fact that several sources of variability can affect a running batch and impact on the final product quality. Additionally, the product quality itself may not be measurable on line, but requires sampling and lab analysis taking several days to be completed. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of batches already completed, and they are used to enable the real time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to a reduction of final product rejections, as well as to a reduction of the product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one side that reducing the variability during this period is crucial, and on the other side that the batch length can possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas the role of end-point product testing can progressively reduce its importance in product manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Algebraic reasoning for the enhancement of data-driven building reconstructions

    NASA Astrophysics Data System (ADS)

    Meidow, Jochen; Hammer, Horst

    2016-04-01

    Data-driven approaches for the reconstruction of buildings feature the flexibility needed to capture objects of arbitrary shape. To recognize man-made structures, geometric relations such as orthogonality or parallelism have to be detected. These constraints are typically formulated as sets of multivariate polynomials. For the enforcement of the constraints within an adjustment process, a set of independent and consistent geometric constraints has to be determined. Gröbner bases are an ideal tool to identify such sets exactly. A complete workflow for geometric reasoning is presented to obtain boundary representations of solids based on given point clouds. The constraints are formulated in homogeneous coordinates, which results in simple polynomials suitable for the successful derivation of Gröbner bases for algebraic reasoning. Strategies for the reduction of the algebraic complexity are presented. To enforce the constraints, an adjustment model is introduced, which is able to cope with homogeneous coordinates along with their singular covariance matrices. The feasibility and the potential of the approach are demonstrated by the analysis of a real data set.
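
    The polynomial constraints below are hypothetical orthogonality/parallelism-style relations, used only to illustrate how a Gröbner basis exposes an equivalent, non-redundant set of generators; SymPy is an assumed tool for the sketch, not necessarily the computer algebra system used by the authors.

    ```python
    # Minimal sketch: reducing redundant geometric constraints with a Groebner basis.
    # The polynomials are hypothetical relations on homogeneous line coordinates.
    from sympy import symbols, groebner

    a1, a2 = symbols('a1 a2')      # normal of line a
    b1, b2 = symbols('b1 b2')      # normal of line b
    c1, c2 = symbols('c1 c2')      # normal of line c

    constraints = [
        a1*b1 + a2*b2,          # lines a and b orthogonal
        b1*c2 - b2*c1,          # lines b and c parallel
        c2*(a1*b1 + a2*b2),     # redundant: a consequence of the first constraint
    ]

    # A lexicographic Groebner basis gives an equivalent, reduced generating set
    # for the ideal spanned by the constraints.
    gb = groebner(constraints, a1, a2, b1, b2, c1, c2, order='lex')
    print(list(gb.exprs))
    ```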

  20. Electroencephalography signatures of attention-deficit/hyperactivity disorder: clinical utility.

    PubMed

    Alba, Guzmán; Pereda, Ernesto; Mañas, Soledad; Méndez, Leopoldo D; González, Almudena; González, Julián J

    2015-01-01

    This work reviews the techniques for extracting different measures from the electroencephalogram (EEG), and the most important results obtained with them, that can be clinically useful for studying subjects with attention-deficit/hyperactivity disorder (ADHD). First, we discuss briefly and in simple terms the EEG analysis and processing techniques most used in the context of ADHD. We review techniques that analyze individual EEG channels (univariate measures) as well as techniques that study the statistical interdependence between different EEG channels (multivariate measures), the so-called functional brain connectivity. Among the former, we review the classical indices of absolute and relative spectral power and estimators of channel complexity, such as the approximate entropy and the Lempel-Ziv complexity. Among the latter, we focus on the magnitude-squared coherence and on different measures based on the concept of generalized synchronization and its estimation in state space. Second, from a historical point of view, we present the most important results achieved with these techniques and their clinical utility (sensitivity, specificity, and accuracy) for diagnosing ADHD. Finally, we propose future research lines based on these results.
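
    For concreteness, the sketch below computes two of the reviewed measure families on simulated signals: relative spectral band power (univariate) and magnitude-squared coherence (multivariate). The band choice and signal model are illustrative assumptions, not taken from the review.

    ```python
    # Minimal sketch (simulated two-channel "EEG"): relative theta power of one
    # channel and magnitude-squared coherence between two channels, with SciPy.
    import numpy as np
    from scipy import signal

    fs = 256                                          # sampling rate [Hz]
    rng = np.random.default_rng(12)
    t = np.arange(0, 30, 1 / fs)
    ch1 = np.sin(2 * np.pi * 6 * t) + rng.normal(0, 1, t.size)   # theta-dominated
    ch2 = 0.5 * ch1 + rng.normal(0, 1, t.size)                    # partially coupled

    # Relative theta (4-8 Hz) power of channel 1 from a Welch spectrum
    f, pxx = signal.welch(ch1, fs=fs, nperseg=fs * 2)
    theta = (f >= 4) & (f <= 8)
    rel_theta_power = pxx[theta].sum() / pxx.sum()

    # Magnitude-squared coherence between the two channels in the theta band
    f, cxy = signal.coherence(ch1, ch2, fs=fs, nperseg=fs * 2)
    print(rel_theta_power, cxy[(f >= 4) & (f <= 8)].mean())
    ```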

  1. Different spatio-temporal electroencephalography features drive the successful decoding of binaural and monaural cues for sound localization.

    PubMed

    Bednar, Adam; Boland, Francis M; Lalor, Edmund C

    2017-03-01

    The human ability to localize sound is essential for monitoring our environment and helps us to analyse complex auditory scenes. Although the acoustic cues mediating sound localization have been established, it remains unknown how these cues are represented in human cortex. In particular, it is still a point of contention whether binaural and monaural cues are processed by the same or distinct cortical networks. In this study, participants listened to a sequence of auditory stimuli from different spatial locations while we recorded their neural activity using electroencephalography (EEG). The stimuli were presented over a loudspeaker array, which allowed us to deliver realistic, free-field stimuli in both the horizontal and vertical planes. Using a multivariate classification approach, we showed that it is possible to decode sound source location from scalp-recorded EEG. Robust and consistent decoding was shown for stimuli that provide binaural cues (i.e. Left vs. Right stimuli). Decoding location when only monaural cues were available (i.e. Front vs. Rear and elevational stimuli) was successful for a subset of subjects and showed less consistency. Notably, the spatio-temporal pattern of EEG features that facilitated decoding differed based on the availability of binaural and monaural cues. In particular, we identified neural processing of binaural cues at around 120 ms post-stimulus and found that monaural cues are processed later between 150 and 200 ms. Furthermore, different spatial activation patterns emerged for binaural and monaural cue processing. These spatio-temporal dissimilarities suggest the involvement of separate cortical mechanisms in monaural and binaural acoustic cue processing. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  2. 1H NMR-based metabolic profiling for evaluating poppy seed rancidity and brewing.

    PubMed

    Jawień, Ewa; Ząbek, Adam; Deja, Stanisław; Łukaszewicz, Marcin; Młynarz, Piotr

    2015-12-01

    Poppy seeds are widely used in household and commercial confectionery. The aim of this study was to demonstrate the application of metabolic profiling for industrial monitoring of the molecular changes which occur during minced poppy seed rancidity and during brewing processes performed on raw seeds. Both forms of poppy seeds were obtained from a confectionery company. Proton nuclear magnetic resonance (1H NMR) was applied as the analytical method of choice, together with multivariate statistical data analysis. Metabolic fingerprinting was applied as a bioprocess control tool to monitor the trajectory of change during rancidity and the progression of brewing. Low molecular weight compounds were found to be statistically significant biomarkers of these bioprocesses. Changes in the concentrations of chemical compounds were explained relative to the biochemical processes and external conditions. The results provide valuable and comprehensive information for a better understanding of the biology of the rancidity and brewing processes, and demonstrate the potential of NMR spectroscopy combined with multivariate data analysis tools for quality control in food industries involved in the processing of oilseeds.

  3. Solfatara volcano subsurface imaging: two different approaches to process and interpret multi-variate data sets

    NASA Astrophysics Data System (ADS)

    Bernardinetti, Stefano; Bruno, Pier Paolo; Lavoué, François; Gresse, Marceau; Vandemeulebrouck, Jean; Revil, André

    2017-04-01

    Reducing model uncertainty and producing more reliable geophysical imaging and interpretations is nowadays a fundamental requirement for geophysical techniques applied in complex environments such as Solfatara Volcano. The use of independent geophysical methods provides complementary information on the subsurface because of the different sensitivities of the data to parameters such as compressional and shear wave velocities, bulk electrical conductivity, or density. The joint processing of these multiple physical properties can lead to a very detailed characterization of the subsurface and therefore enhance both imaging and interpretation. In this work, we develop two different processing approaches based on reflection seismology and seismic P-wave tomography on the one hand, and on electrical data acquired over the same line on the other. From these data, we obtain an image-guided electrical resistivity tomography and a post-processing integration of the tomographic results. The image-guided electrical resistivity tomography is obtained by regularizing the inversion of the electrical data with structural constraints extracted from a migrated seismic section using image processing tools. This approach focuses the reconstruction of electrical resistivity anomalies along the features visible in the seismic section, and acts as a guide for interpretation in terms of subsurface structures and processes. To integrate co-registered P-wave velocity and electrical resistivity values, we apply a data mining tool, the k-means algorithm, to identify relationships between the two sets of variables. The algorithm partitions the multivariate data set into clusters so as to minimize the sum of squared Euclidean distances within each cluster and maximize it between clusters. We obtain a partitioning of the multivariate data set into a finite number of well-correlated clusters, representative of the optimum clustering of our geophysical variables (P-wave velocities and electrical resistivities). The result is an integrated tomography that shows a finite number of homogeneous geophysical facies and therefore highlights the main geological features of the subsurface.
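
    The following sketch reproduces the general idea of the k-means integration step on synthetic co-registered values; the variable ranges, scaling choices and the number of clusters are assumptions, not the authors' settings.

    ```python
    # Minimal sketch (not the authors' code): k-means partitioning of co-registered
    # P-wave velocity and electrical resistivity values into geophysical "facies".
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for co-registered model cells: Vp [m/s], resistivity [ohm.m]
    vp = np.concatenate([rng.normal(1500, 100, 200), rng.normal(2500, 150, 200)])
    rho = np.concatenate([rng.normal(5, 1, 200), rng.normal(50, 10, 200)])
    X = np.column_stack([vp, np.log10(rho)])       # log-resistivity is common practice

    # Standardize so both variables contribute comparably to the Euclidean distance
    Xs = StandardScaler().fit_transform(X)

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Xs)
    # 'labels' maps every cell to a facies for the integrated tomography
    print(np.bincount(labels))
    ```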

  4. Emergency department blood alcohol level associates with injury factors and six-month outcome after uncomplicated mild traumatic brain injury.

    PubMed

    Yue, John K; Ngwenya, Laura B; Upadhyayula, Pavan S; Deng, Hansen; Winkler, Ethan A; Burke, John F; Lee, Young M; Robinson, Caitlin K; Ferguson, Adam R; Lingsma, Hester F; Cnossen, Maryse C; Pirracchio, Romain; Korley, Frederick K; Vassar, Mary J; Yuh, Esther L; Mukherjee, Pratik; Gordon, Wayne A; Valadka, Alex B; Okonkwo, David O; Manley, Geoffrey T

    2017-11-01

    The relationship between blood alcohol level (BAL) and mild traumatic brain injury (mTBI) remains in need of improved characterization. Adult patients suffering mTBI without intracranial pathology on computed tomography (CT) from the prospective Transforming Research and Clinical Knowledge in Traumatic Brain Injury Pilot study with emergency department (ED) Glasgow Coma Scale (GCS) 13-15 and recorded blood alcohol level (BAL) were extracted. BAL≥80-mg/dl was set as proxy for excessive use. Multivariable regression was performed for patients with six-month Glasgow Outcome Scale-Extended (GOSE; functional recovery) and Wechsler Adult Intelligence Scale Processing Speed Index Composite Score (WAIS-PSI; nonverbal processing speed), using BAL≥80-mg/dl and <80-mg/dl cohorts, adjusting for demographic/injury factors. Overall, 107 patients were aged 42.7±16.8-years, 67.3%-male, and 80.4%-Caucasian; 65.4% had BAL=0-mg/dl, 4.6% BAL<80-mg/dl, and 30.0% BAL≥80-mg/dl (range 100-440-mg/dl). BAL differed across loss of consciousness (LOC; none: median 0-mg/dl [interquartile range (IQR) 0-0], <30-min: 0-mg/dl [0-43], ≥30-min: 224-mg/dl [50-269], unknown: 108-mg/dl [0-232]; p=0.002). GCS<15 associated with higher BAL (19-mg/dl [0-204] vs. 0-mg/dl [0-20]; p=0.013). On univariate analysis, BAL≥80-mg/dl associated with less-than-full functional recovery (GOSE≤7; 38.1% vs. 11.5%; p=0.025) and lower WAIS-PSI (92.4±12.7, 30th-percentile vs. 105.1±11.7, 63rd-percentile; p<0.001). On multivariable regression BAL≥80-mg/dl demonstrated an odds ratio of 8.05 (95% CI [1.35-47.92]; p=0.022) for GOSE≤7 and an adjusted mean decrease of 8.88-points (95% CI [0.67-17.09]; p=0.035) on WAIS-PSI. Day-of-injury BAL>80-mg/dl after uncomplicated mTBI was associated with decreased GCS score and prolongation of reported LOC. BAL may be a biomarker for impaired return to baseline function and decreased nonverbal processing speed at six-months postinjury. Future confirmatory studies are needed. Published by Elsevier Ltd.

  5. Development and evaluation of spatial point process models for epidermal nerve fibers.

    PubMed

    Olsbo, Viktor; Myllymäki, Mari; Waller, Lance A; Särkkä, Aila

    2013-06-01

    We propose two spatial point process models for the spatial structure of epidermal nerve fibers (ENFs) across human skin. The models derive from two point processes, Φb and Φe, describing the locations of the base and end points of the fibers. Each point of Φe (the end point process) is connected to a unique point in Φb (the base point process). In the first model, both Φe and Φb are Poisson processes, yielding a null model of uniform coverage of the skin by end points and general baseline results and reference values for moments of key physiologic indicators. The second model provides a mechanistic model to generate end points for each base, and we model the branching structure more directly by defining Φe as a cluster process conditioned on the realization of Φb as its parent points. In both cases, we derive distributional properties for observable quantities of direct interest to neurologists such as the number of fibers per base, and the direction and range of fibers on the skin. We contrast both models by fitting them to data from skin blister biopsy images of ENFs and provide inference regarding physiological properties of ENFs. Copyright © 2013 Elsevier Inc. All rights reserved.
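
    A minimal simulation of the second, cluster-type model is sketched below; the window size, intensities and dispersion are arbitrary placeholders rather than the fitted values from the biopsy data.

    ```python
    # Minimal sketch (assumed parameters, not from the paper): Poisson base points,
    # with a Poisson number of end points (fibers) generated around each base.
    import numpy as np

    rng = np.random.default_rng(1)
    width, height = 1.0, 1.0           # observation window (arbitrary units)
    lam_base = 50                      # intensity of the base point process
    mean_fibers = 3                    # mean number of end points per base
    sigma = 0.02                       # spread of end points around their base

    n_base = rng.poisson(lam_base * width * height)
    bases = rng.uniform([0, 0], [width, height], size=(n_base, 2))

    end_points, parents = [], []
    for i, b in enumerate(bases):
        n_end = rng.poisson(mean_fibers)
        end_points.append(b + rng.normal(0.0, sigma, size=(n_end, 2)))
        parents += [i] * n_end
    end_points = np.vstack(end_points)

    # Observable summaries of direct interest: fibers per base, fiber lengths
    fibers_per_base = np.bincount(parents, minlength=n_base)
    lengths = np.linalg.norm(end_points - bases[parents], axis=1)
    print(fibers_per_base.mean(), lengths.mean())
    ```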

  6. Using the Mean Shift Algorithm to Make Post Hoc Improvements to the Accuracy of Eye Tracking Data Based on Probable Fixation Locations

    DTIC Science & Technology

    2010-08-01

    astigmatism and other sources, and stay constant from time to time (LC Technologies, 2000). Systematic errors can sometimes reach many degrees of visual angle...Taking the average of all disparities would mean treating each as equally important regardless of whether they are from correct or incorrect mappings. In...likely stop somewhere near the centroid because the large hM basically treats every point equally (or nearly equally if using the multivariate

  7. The use of experimental design to find the operating maximum power point of PEM fuel cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crăciunescu, Aurelian; Pătularu, Laurenţiu; Ciumbulea, Gloria

    2015-03-10

    Proton Exchange Membrane (PEM) fuel cells are difficult to model due to their complex nonlinear nature. In this paper, the development of a PEM fuel cell mathematical model based on the Design of Experiments methodology is described. Design of Experiments provides a very efficient methodology for obtaining a mathematical model of the studied multivariable system with only a few experiments. The obtained results can be used for the optimization and control of PEM fuel cell systems.

  8. Exploring the spatio-temporal neural basis of face learning

    PubMed Central

    Yang, Ying; Xu, Yang; Jew, Carol A.; Pyles, John A.; Kass, Robert E.; Tarr, Michael J.

    2017-01-01

    Humans are experts at face individuation. Although previous work has identified a network of face-sensitive regions and some of the temporal signatures of face processing, as yet, we do not have a clear understanding of how such face-sensitive regions support learning at different time points. To study the joint spatio-temporal neural basis of face learning, we trained subjects to categorize two groups of novel faces and recorded their neural responses using magnetoencephalography (MEG) throughout learning. A regression analysis of neural responses in face-sensitive regions against behavioral learning curves revealed significant correlations with learning in the majority of the face-sensitive regions in the face network, mostly between 150–250 ms, but also after 300 ms. However, the effect was smaller in nonventral regions (within the superior temporal areas and prefrontal cortex) than that in the ventral regions (within the inferior occipital gyri (IOG), midfusiform gyri (mFUS) and anterior temporal lobes). A multivariate discriminant analysis also revealed that IOG and mFUS, which showed strong correlation effects with learning, exhibited significant discriminability between the two face categories at different time points both between 150–250 ms and after 300 ms. In contrast, the nonventral face-sensitive regions, where correlation effects with learning were smaller, did exhibit some significant discriminability, but mainly after 300 ms. In sum, our findings indicate that early and recurring temporal components arising from ventral face-sensitive regions are critically involved in learning new faces. PMID:28570739

  9. Exploring the spatio-temporal neural basis of face learning.

    PubMed

    Yang, Ying; Xu, Yang; Jew, Carol A; Pyles, John A; Kass, Robert E; Tarr, Michael J

    2017-06-01

    Humans are experts at face individuation. Although previous work has identified a network of face-sensitive regions and some of the temporal signatures of face processing, as yet, we do not have a clear understanding of how such face-sensitive regions support learning at different time points. To study the joint spatio-temporal neural basis of face learning, we trained subjects to categorize two groups of novel faces and recorded their neural responses using magnetoencephalography (MEG) throughout learning. A regression analysis of neural responses in face-sensitive regions against behavioral learning curves revealed significant correlations with learning in the majority of the face-sensitive regions in the face network, mostly between 150-250 ms, but also after 300 ms. However, the effect was smaller in nonventral regions (within the superior temporal areas and prefrontal cortex) than that in the ventral regions (within the inferior occipital gyri (IOG), midfusiform gyri (mFUS) and anterior temporal lobes). A multivariate discriminant analysis also revealed that IOG and mFUS, which showed strong correlation effects with learning, exhibited significant discriminability between the two face categories at different time points both between 150-250 ms and after 300 ms. In contrast, the nonventral face-sensitive regions, where correlation effects with learning were smaller, did exhibit some significant discriminability, but mainly after 300 ms. In sum, our findings indicate that early and recurring temporal components arising from ventral face-sensitive regions are critically involved in learning new faces.

  10. Clinical risk stratification model for advanced colorectal neoplasia in persons with negative fecal immunochemical test results.

    PubMed

    Jung, Yoon Suk; Park, Chan Hyuk; Kim, Nam Hee; Park, Jung Ho; Park, Dong Il; Sohn, Chong Il

    2018-01-01

    The fecal immunochemical test (FIT) has low sensitivity for detecting advanced colorectal neoplasia (ACRN); thus, a considerable portion of FIT-negative persons may have ACRN. We aimed to develop a risk-scoring model for predicting ACRN in FIT-negative persons. We reviewed the records of participants aged ≥40 years who underwent a colonoscopy and FIT during a health check-up. We developed a risk-scoring model for predicting ACRN in FIT-negative persons. Of 11,873 FIT-negative participants, 255 (2.1%) had ACRN. On the basis of the multivariable logistic regression model, point scores were assigned as follows among FIT-negative persons: age (per year from 40 years old), 1 point; current smoker, 10 points; overweight, 5 points; obese, 7 points; hypertension, 6 points; old cerebrovascular accident (CVA), 15 points. Although the proportion of ACRN in FIT-negative persons increased as risk scores increased (from 0.6% in the group with 0-4 points to 8.1% in the group with 35-39 points), it was significantly lower than that in FIT-positive persons (14.9%). However, there was no statistical difference between the proportion of ACRN in FIT-negative persons with ≥40 points and in FIT-positive persons (10.5% vs. 14.9%, P = 0.321). FIT-negative persons may need to undergo screening colonoscopy if they clinically have a high risk of ACRN. The scoring model based on age, smoking habits, overweight or obesity, hypertension, and old CVA may be useful in selecting and prioritizing FIT-negative persons for screening colonoscopy.
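
    The published point assignments translate directly into a small calculator; the helper below is a hypothetical convenience function written for illustration, not software from the study.

    ```python
    # Minimal sketch (assumed helper): the published point assignments for the
    # FIT-negative ACRN risk model turned into a simple calculator.
    def acrn_risk_score(age, current_smoker, overweight, obese, hypertension, old_cva):
        """Return the ACRN risk score described in the abstract."""
        score = max(age - 40, 0) * 1           # 1 point per year of age beyond 40
        score += 10 if current_smoker else 0
        score += 5 if overweight else 0
        score += 7 if obese else 0             # overweight and obese assumed exclusive
        score += 6 if hypertension else 0
        score += 15 if old_cva else 0          # prior cerebrovascular accident
        return score

    # Example: a 62-year-old current smoker with hypertension
    s = acrn_risk_score(62, True, False, False, True, False)
    print(s)  # 22 + 10 + 6 = 38 points -> one of the higher-risk strata
    ```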

  11. Analyzing developmental processes on an individual level using nonstationary time series modeling.

    PubMed

    Molenaar, Peter C M; Sinclair, Katerina O; Rovine, Michael J; Ram, Nilam; Corneal, Sherry E

    2009-01-01

    Individuals change over time, often in complex ways. Generally, studies of change over time have combined individuals into groups for analysis, which is inappropriate in most, if not all, studies of development. The authors explain how to identify appropriate levels of analysis (individual vs. group) and demonstrate how to estimate changes in developmental processes over time using a multivariate nonstationary time series model. They apply this model to describe the changing relationships between a biological son and father and a stepson and stepfather at the individual level. The authors also explain how to use an extended Kalman filter with iteration and smoothing estimator to capture how dynamics change over time. Finally, they suggest further applications of the multivariate nonstationary time series model and detail the next steps in the development of statistical models used to analyze individual-level data.

  12. On the reliability of Shewhart-type control charts for multivariate process variability

    NASA Astrophysics Data System (ADS)

    Djauhari, Maman A.; Salleh, Rohayu Mohd; Zolkeply, Zunnaaim; Li, Lee Siaw

    2017-05-01

    We show that, in the current practice of multivariate process variability monitoring, the reliability of Shewhart-type control charts cannot be measured except when the sub-group size n tends to infinity. However, requiring a large n is meaningless not only in the manufacturing industry, where n is small, but also in the service industry, where n is moderate. In this paper, we introduce a new definition of control limits for two of the most widely used control charts in the literature, i.e., the improved generalized variance chart (IGV-chart) and the vector variance chart (VV-chart). With the new definition of control limits, the reliability of the control charts can be determined. Some important properties of the new control limits are derived, and a computational technique for the probability of a false alarm is presented.
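
    As background to the charts discussed above, the sketch below monitors the generalized variance |S| of small subgroups against empirically estimated limits; it does not reproduce the paper's new control-limit definition, and all settings are illustrative.

    ```python
    # Minimal sketch (illustrative only): charting the generalized variance |S| of
    # small subgroups against empirical Phase-I limits.
    import numpy as np

    rng = np.random.default_rng(2)
    p, n, m = 3, 5, 2000                   # variables, subgroup size, Phase-I subgroups

    # Phase I: estimate the in-control distribution of |S| empirically
    phase1 = rng.multivariate_normal(np.zeros(p), np.eye(p), size=(m, n))
    gv_phase1 = np.array([np.linalg.det(np.cov(sub, rowvar=False)) for sub in phase1])
    lcl, ucl = np.quantile(gv_phase1, [0.00135, 0.99865])   # 3-sigma-equivalent tails

    # Phase II: chart a new subgroup against the empirical limits
    new_subgroup = rng.multivariate_normal(np.zeros(p), 2.0 * np.eye(p), size=n)
    gv_new = np.linalg.det(np.cov(new_subgroup, rowvar=False))
    print(gv_new, "out of control" if not (lcl <= gv_new <= ucl) else "in control")
    ```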

  13. Unsupervised pattern recognition methods in ciders profiling based on GCE voltammetric signals.

    PubMed

    Jakubowska, Małgorzata; Sordoń, Wanda; Ciepiela, Filip

    2016-07-15

    This work presents a complete methodology for distinguishing between different cider brands and degrees of ageing, based on voltammetric signals, using dedicated data preprocessing procedures and unsupervised multivariate analysis. It was demonstrated that voltammograms recorded on a glassy carbon electrode in Britton-Robinson buffer at pH 2 are reproducible for each brand. By applying clustering algorithms and principal component analysis, visibly homogeneous clusters were obtained. An advanced signal processing strategy, which included automatic baseline correction, interval scaling and a continuous wavelet transform with a dedicated mother wavelet, was a key step in the correct recognition of the objects. The results show that voltammetry combined with optimized univariate and multivariate data processing is a sufficient tool to distinguish between ciders from various brands and to evaluate their freshness. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Application of quality by design concepts in the development of fluidized bed granulation and tableting processes.

    PubMed

    Djuris, Jelena; Medarevic, Djordje; Krstic, Marko; Djuric, Zorica; Ibric, Svetlana

    2013-06-01

    This study illustrates the application of experimental design and multivariate data analysis in defining the design space for granulation and tableting processes. According to the quality by design concepts, critical quality attributes (CQAs) of granules and tablets, as well as critical parameters of granulation and tableting processes, were identified and evaluated. Acetaminophen was used as the model drug, and one of the study aims was to investigate the possibility of the development of immediate- or extended-release acetaminophen tablets. Granulation experiments were performed in the fluid bed processor using polyethylene oxide polymer as a binder in the direct granulation method. Tablets were compressed in a laboratory eccentric tablet press. The first set of experiments was organized according to a Plackett-Burman design, followed by a full factorial experimental design. Principal component analysis and partial least squares regression were applied as the multivariate analysis techniques. By using these different methods, CQAs and process parameters were identified and quantified. Furthermore, an in-line method was developed to monitor the temperature during the fluidized bed granulation process, to foresee possible defects in granule CQAs. Various control strategies that are based on process understanding and assure the desired quality attributes of the product are proposed. Copyright © 2013 Wiley Periodicals, Inc.

  15. Fault Detection and Diagnosis In Hall-Héroult Cells Based on Individual Anode Current Measurements Using Dynamic Kernel PCA

    NASA Astrophysics Data System (ADS)

    Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey

    2018-04-01

    Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
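
    A simplified sketch of the dynamic kernel PCA idea is given below: lag-augmented (dynamic) data, a kernel PCA model refitted on a moving window, and an illustrative Hotelling-style statistic. The monitoring statistic, kernel settings and control limit are assumptions rather than the authors' implementation.

    ```python
    # Minimal sketch (assumed, simplified): "dynamic" kernel PCA on a moving window of
    # anode-current-like data -- lagged variables capture dynamics, KernelPCA the
    # nonlinearity. All settings below are illustrative only.
    import numpy as np
    from sklearn.decomposition import KernelPCA

    def add_lags(X, n_lags):
        """Augment each sample with its n_lags previous samples (time-lagged matrix)."""
        rows = [X[i - n_lags:i + 1].ravel() for i in range(n_lags, len(X))]
        return np.array(rows)

    rng = np.random.default_rng(3)
    X = rng.normal(size=(500, 4))               # stand-in for 4 anode-current signals

    window, n_lags = 200, 2
    Xd = add_lags(X, n_lags)                     # dynamic (lag-augmented) data

    # Moving-window model: refit on the most recent `window` reference samples
    ref = Xd[-window:]
    kpca = KernelPCA(n_components=5, kernel='rbf', gamma=0.1).fit(ref)

    # Hotelling-style T^2 on the kernel scores, with an empirical 99% limit
    scores = kpca.transform(ref)
    t2 = np.sum((scores / scores.std(axis=0)) ** 2, axis=1)
    limit = np.quantile(t2, 0.99)

    t2_new = np.sum((kpca.transform(Xd[-1:]) / scores.std(axis=0)) ** 2)
    print(t2_new > limit)
    ```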

  16. Examining the Role of the Human Hippocampus in Approach-Avoidance Decision Making Using a Novel Conflict Paradigm and Multivariate Functional Magnetic Resonance Imaging.

    PubMed

    O'Neil, Edward B; Newsome, Rachel N; Li, Iris H N; Thavabalasingam, Sathesan; Ito, Rutsuko; Lee, Andy C H

    2015-11-11

    Rodent models of anxiety have implicated the ventral hippocampus in approach-avoidance conflict processing. Few studies have, however, examined whether the human hippocampus plays a similar role. We developed a novel decision-making paradigm to examine neural activity when participants made approach/avoidance decisions under conditions of high or absent approach-avoidance conflict. Critically, our task required participants to learn the associated reward/punishment values of previously neutral stimuli and controlled for mnemonic and spatial processing demands, both important issues given approach-avoidance behavior in humans is less tied to predation and foraging compared to rodents. Participants played a points-based game where they first attempted to maximize their score by determining which of a series of previously neutral image pairs should be approached or avoided. During functional magnetic resonance imaging, participants were then presented with novel pairings of these images. These pairings consisted of images of congruent or opposing learned valences, the latter creating conditions of high approach-avoidance conflict. A data-driven partial least squares multivariate analysis revealed two reliable patterns of activity, each revealing differential activity in the anterior hippocampus, the homolog of the rodent ventral hippocampus. The first was associated with greater hippocampal involvement during trials with high as opposed to no approach-avoidance conflict, regardless of approach or avoidance behavior. The second pattern encompassed greater hippocampal activity in a more anterior aspect during approach compared to avoid responses, for conflict and no-conflict conditions. Multivoxel pattern classification analyses yielded converging findings, underlining a role of the anterior hippocampus in approach-avoidance conflict decision making. Approach-avoidance conflict has been linked to anxiety and occurs when a stimulus or situation is associated with reward and punishment. Although rodent work has implicated the hippocampus in approach-avoidance conflict processing, there is limited data on whether this role applies to learned, as opposed to innate, incentive values, and whether the human hippocampus plays a similar role. Using functional neuroimaging with a novel decision-making task that controlled for perceptual and mnemonic processing, we found that the human hippocampus was significantly active when approach-avoidance conflict was present for stimuli with learned incentive values. These findings demonstrate a role for the human hippocampus in approach-avoidance decision making that cannot be explained easily by hippocampal-dependent long-term memory or spatial cognition. Copyright © 2015 the authors 0270-6474/15/3515040-11$15.00/0.

  17. Computer program documentation: ISOCLS iterative self-organizing clustering program, program C094

    NASA Technical Reports Server (NTRS)

    Minter, R. T. (Principal Investigator)

    1972-01-01

    The author has identified the following significant results. This program implements an algorithm which, ideally, sorts a given set of multivariate data points into similar groups or clusters. The program is intended for use in the evaluation of multispectral scanner data; however, the algorithm could be used for other data types as well. The user may specify a set of initial estimated cluster means to begin the procedure, or he may begin with the assumption that all the data belongs to one cluster. The procedure is initialized by assigning each data point to the nearest (in absolute distance) cluster mean. If no initial cluster means were input, all of the data is assigned to cluster 1. The means and standard deviations are calculated for each cluster.
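
    One assignment pass of an ISOCLS-style procedure can be sketched as follows (this is not the original program); the L1 reading of "absolute distance" is an assumption, and empty clusters are not handled in this sketch.

    ```python
    # Minimal sketch of one ISOCLS-style pass: assign each multivariate sample to the
    # nearest cluster mean in absolute (L1) distance, then recompute per-cluster
    # means and standard deviations.
    import numpy as np

    def isocls_pass(data, means):
        # distances[i, k] = L1 distance from sample i to cluster mean k
        distances = np.abs(data[:, None, :] - means[None, :, :]).sum(axis=2)
        labels = distances.argmin(axis=1)
        new_means = np.array([data[labels == k].mean(axis=0) for k in range(len(means))])
        stds = np.array([data[labels == k].std(axis=0) for k in range(len(means))])
        return labels, new_means, stds

    rng = np.random.default_rng(4)
    data = rng.normal(size=(300, 4))             # e.g. 4-channel multispectral samples
    init_means = data[rng.choice(len(data), 3, replace=False)]   # 3 initial clusters
    labels, means, stds = isocls_pass(data, init_means)
    print(np.bincount(labels))
    ```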

  18. A Quality by Design approach to investigate tablet dissolution shift upon accelerated stability by multivariate methods.

    PubMed

    Huang, Jun; Goolcharran, Chimanlall; Ghosh, Krishnendu

    2011-05-01

    This paper presents the use of experimental design, optimization and multivariate techniques to investigate the root cause of a tablet dissolution shift (slow-down) upon stability and to develop control strategies for a drug product during formulation and process development. The effectiveness and usefulness of these methodologies were demonstrated through two application examples. In both applications, dissolution slow-down was observed during a 4-week accelerated stability test under 51°C/75%RH storage conditions. In Application I, an experimental design was carried out to evaluate the interactions and effects of the design factors on the critical quality attribute (CQA) of dissolution upon stability. The design space was studied by design of experiment (DOE) and multivariate analysis to ensure the desired dissolution profile and a minimal dissolution shift upon stability. Multivariate techniques, such as multi-way principal component analysis (MPCA) of the entire dissolution profiles upon stability, were performed to reveal batch relationships and to evaluate the impact of design factors on dissolution. In Application II, an experiment was conducted to study the impact of varying tablet breaking force on dissolution upon stability utilizing MPCA. It was demonstrated that the use of multivariate methods, defined as Quality by Design (QbD) principles and tools in the ICH-Q8 guidance, provides an effective means to achieve a greater understanding of tablet dissolution upon stability. Copyright © 2010 Elsevier B.V. All rights reserved.

  19. Spatio-temporal interpolation of precipitation during monsoon periods in Pakistan

    NASA Astrophysics Data System (ADS)

    Hussain, Ijaz; Spöck, Gunter; Pilz, Jürgen; Yu, Hwa-Lung

    2010-08-01

    Spatio-temporal estimation of precipitation over a region is essential to the modeling of hydrologic processes for water resources management. The changes in magnitude and the space-time heterogeneity of rainfall observations make space-time estimation of precipitation a challenging task. In this paper we propose a Box-Cox transformed hierarchical Bayesian multivariate spatio-temporal interpolation method for the skewed response variable. The proposed method is applied to estimate space-time monthly precipitation in the monsoon periods during 1974-2000, and 27-year monthly average precipitation data are obtained from 51 stations in Pakistan. The results of transformed hierarchical Bayesian multivariate spatio-temporal interpolation are compared to those of non-transformed hierarchical Bayesian interpolation by using cross-validation. The software developed by [11] is used for Bayesian non-stationary multivariate space-time interpolation. It is observed that the transformed hierarchical Bayesian method is more accurate than the non-transformed hierarchical Bayesian method.
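
    The Box-Cox step can be illustrated in isolation: SciPy's boxcox estimates the transformation parameter by maximum likelihood and reduces the skewness of a precipitation-like variable. The data below are synthetic and unrelated to the Pakistani station records.

    ```python
    # Minimal sketch: Box-Cox transformation of skewed, strictly positive
    # precipitation-like totals before interpolation.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(10)
    precip = rng.gamma(shape=2.0, scale=30.0, size=500) + 0.1   # skewed, > 0

    transformed, lam = stats.boxcox(precip)
    print(lam, stats.skew(precip), stats.skew(transformed))     # skewness is reduced
    ```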

  20. A hybrid clustering approach for multivariate time series - A case study applied to failure analysis in a gas turbine.

    PubMed

    Fontes, Cristiano Hora; Budman, Hector

    2017-11-01

    A clustering problem involving multivariate time series (MTS) requires the selection of similarity metrics. This paper shows the limitations of the PCA similarity factor (SPCA) as a single metric in nonlinear problems where there are differences in magnitude of the same process variables due to expected changes in operation conditions. A novel method for clustering MTS based on a combination between SPCA and the average-based Euclidean distance (AED) within a fuzzy clustering approach is proposed. Case studies involving either simulated or real industrial data collected from a large scale gas turbine are used to illustrate that the hybrid approach enhances the ability to recognize normal and fault operating patterns. This paper also proposes an oversampling procedure to create synthetic multivariate time series that can be useful in commonly occurring situations involving unbalanced data sets. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
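
    A sketch of the two metrics being combined is given below. The average-based Euclidean distance is taken here as the distance between the variable-wise means of the two series, which is an assumption about the paper's definition, and the data are synthetic.

    ```python
    # Minimal sketch: PCA similarity factor (S_PCA) between two multivariate time
    # series, plus an average-based Euclidean distance between their mean vectors.
    import numpy as np

    def pca_loadings(X, k):
        Xc = X - X.mean(axis=0)
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        return vt[:k].T                       # (n_variables, k) principal directions

    def spca(X, Y, k=2):
        L1, L2 = pca_loadings(X, k), pca_loadings(Y, k)
        # Krzanowski-type similarity: average squared cosine between the subspaces
        return np.trace(L1.T @ L2 @ L2.T @ L1) / k

    def aed(X, Y):
        return np.linalg.norm(X.mean(axis=0) - Y.mean(axis=0))

    rng = np.random.default_rng(5)
    W = rng.normal(size=(2, 5))                              # shared loading structure
    A = rng.normal(size=(400, 2)) @ W + 0.1 * rng.normal(size=(400, 5))
    B = 3.0 + rng.normal(size=(400, 2)) @ W + 0.1 * rng.normal(size=(400, 5))
    print(spca(A, B), aed(A, B))   # similar correlation structure (high S_PCA)
                                   # but different operating level (large AED)
    ```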

  1. Embedding of multidimensional time-dependent observations.

    PubMed

    Barnard, J P; Aldrich, C; Gerber, M

    2001-10-01

    A method is proposed to reconstruct dynamic attractors by embedding of multivariate observations of dynamic nonlinear processes. The Takens embedding theory is combined with independent component analysis to transform the embedding into a vector space of linearly independent vectors (phase variables). The method is successfully tested against prediction of the unembedded state vector in two case studies of simulated chaotic processes.

  2. Embedding of multidimensional time-dependent observations

    NASA Astrophysics Data System (ADS)

    Barnard, Jakobus P.; Aldrich, Chris; Gerber, Marius

    2001-10-01

    A method is proposed to reconstruct dynamic attractors by embedding of multivariate observations of dynamic nonlinear processes. The Takens embedding theory is combined with independent component analysis to transform the embedding into a vector space of linearly independent vectors (phase variables). The method is successfully tested against prediction of the unembedded state vector in two case studies of simulated chaotic processes.
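
    A simplified sketch of the embedding-plus-ICA idea follows: a delay embedding of an observed signal is passed to FastICA to obtain linearly independent phase variables. The embedding dimension, delay and signal are illustrative choices, not those of the case studies above.

    ```python
    # Minimal sketch (assumed, simplified): delay embedding followed by independent
    # component analysis, in the spirit of the method described above.
    import numpy as np
    from sklearn.decomposition import FastICA

    def delay_embed(x, dim, tau):
        """Rows are (x_t, x_{t+tau}, ..., x_{t+(dim-1)tau}) for valid t."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    # Synthetic observation of a nonlinear process (a noisy two-tone signal stands in)
    t = np.linspace(0, 60, 3000)
    x = (np.sin(t) + 0.4 * np.sin(2.3 * t)
         + 0.05 * np.random.default_rng(6).normal(size=t.size))

    E = delay_embed(x, dim=5, tau=10)          # Takens-style embedding
    ica = FastICA(n_components=3, random_state=0)
    phase_vars = ica.fit_transform(E)          # linearly independent "phase variables"
    print(phase_vars.shape)
    ```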

  3. "Photographing money" task pricing

    NASA Astrophysics Data System (ADS)

    Jia, Zhongxiang

    2018-05-01

    "Photographing money" [1]is a self-service model under the mobile Internet. The task pricing is reasonable, related to the success of the commodity inspection. First of all, we analyzed the position of the mission and the membership, and introduced the factor of membership density, considering the influence of the number of members around the mission on the pricing. Multivariate regression of task location and membership density using MATLAB to establish the mathematical model of task pricing. At the same time, we can see from the life experience that membership reputation and the intensity of the task will also affect the pricing, and the data of the task success point is more reliable. Therefore, the successful point of the task is selected, and its reputation, task density, membership density and Multiple regression of task positions, according to which a nhew task pricing program. Finally, an objective evaluation is given of the advantages and disadvantages of the established model and solution method, and the improved method is pointed out.

  4. Detecting determinism from point processes.

    PubMed

    Andrzejak, Ralph G; Mormann, Florian; Kreuz, Thomas

    2014-12-01

    The detection of a nonrandom structure from experimental data can be crucial for the classification, understanding, and interpretation of the generating process. We here introduce a rank-based nonlinear predictability score to detect determinism from point process data. Thanks to its modular nature, this approach can be adapted to whatever signature in the data one considers indicative of deterministic structure. After validating our approach using point process signals from deterministic and stochastic model dynamics, we show an application to neuronal spike trains recorded in the brain of an epilepsy patient. While we illustrate our approach in the context of temporal point processes, it can be readily applied to spatial point processes as well.

  5. Response Surface Modeling Using Multivariate Orthogonal Functions

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; DeLoach, Richard

    2001-01-01

    A nonlinear modeling technique was used to characterize response surfaces for non-dimensional longitudinal aerodynamic force and moment coefficients, based on wind tunnel data from a commercial jet transport model. Data were collected using two experimental procedures - one based on modern design of experiments (MDOE), and one using a classical one-factor-at-a-time (OFAT) approach. The nonlinear modeling technique used multivariate orthogonal functions generated from the independent variable data as modeling functions in a least squares context to characterize the response surfaces. Model terms were selected automatically using a prediction error metric. Prediction error bounds computed from the modeling data alone were found to be a good measure of actual prediction error for prediction points within the inference space. Root-mean-square model fit error and prediction error were less than 4 percent of the mean response value in all cases. Efficacy and prediction performance of the response surface models identified from both MDOE and OFAT experiments were investigated.

  6. Automatic and objective oral cancer diagnosis by Raman spectroscopic detection of keratin with multivariate curve resolution analysis

    NASA Astrophysics Data System (ADS)

    Chen, Po-Hsiung; Shimada, Rintaro; Yabumoto, Sohshi; Okajima, Hajime; Ando, Masahiro; Chang, Chiou-Tzu; Lee, Li-Tzu; Wong, Yong-Kie; Chiou, Arthur; Hamaguchi, Hiro-O.

    2016-01-01

    We have developed an automatic and objective method for detecting human oral squamous cell carcinoma (OSCC) tissues with Raman microspectroscopy. We measure 196 independent Raman spectra from 196 different points of one oral tissue sample and globally analyze these spectra using a Multivariate Curve Resolution (MCR) analysis. Discrimination of OSCC tissues is made automatically and objectively by spectral matching between the MCR-decomposed Raman spectra and the standard Raman spectrum of keratin, a well-established molecular marker of OSCC. We use a total of 24 tissue samples: 10 OSCC and 10 normal tissues from the same 10 patients, and 3 OSCC and 1 normal tissue from different patients. Following the newly developed protocol presented here, we have been able to detect OSCC tissues with 77 to 92% sensitivity (depending on how positivity is defined) and 100% specificity. The present approach lends itself to a reliable clinical diagnosis of OSCC substantiated by the "molecular fingerprint" of keratin.

  7. Detection of Leukemia with Blood Samples Using Raman Spectroscopy and Multivariate Analysis

    NASA Astrophysics Data System (ADS)

    Martínez-Espinosa, J. C.; González-Solís, J. L.; Frausto-Reyes, C.; Miranda-Beltrán, M. L.; Soria-Fregoso, C.; Medina-Valtierra, J.

    2009-06-01

    The use of Raman spectroscopy to analyze blood biochemistry and hence distinguish between normal and abnormal blood was investigated. Blood samples were obtained from 6 patients who were clinically diagnosed with leukemia and 6 healthy volunteers. Each sample imprint was placed under the microscope and several points were chosen for Raman measurement. All spectra were collected with a confocal Raman micro-spectrometer (Renishaw) using a NIR 830 nm laser. It is shown that the serum samples from patients with leukemia and from the control group can be discriminated when the multivariate statistical methods of principal component analysis (PCA) and linear discriminant analysis (LDA) are applied to their Raman spectra. The ratios of some band intensities were analyzed; several band ratios were significant and corresponded to proteins, phospholipids, and polysaccharides. The preliminary results suggest that Raman spectroscopy could be a new technique to study the degree of damage to the bone marrow using just blood samples instead of biopsies, a procedure that is very painful for patients.
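
    The PCA-plus-LDA pipeline mentioned above can be sketched with scikit-learn on synthetic spectra; class sizes, component counts and the cross-validation scheme are assumptions for illustration, not the study's settings.

    ```python
    # Minimal sketch (synthetic spectra, not the study's data): PCA scores of Raman
    # spectra fed to a linear discriminant analysis classifier.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(7)
    n_points = 600                               # wavenumber channels per spectrum
    healthy = rng.normal(0.0, 1.0, size=(20, n_points))
    leukemia = rng.normal(0.3, 1.0, size=(20, n_points))   # shifted band intensities
    X = np.vstack([healthy, leukemia])
    y = np.array([0] * 20 + [1] * 20)

    model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
    print(cross_val_score(model, X, y, cv=5).mean())
    ```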

  8. Climate and Leishmaniasis in French Guiana

    PubMed Central

    Roger, Amaury; Nacher, Mathieu; Hanf, Matthieu; Drogoul, Anne Sophie; Adenis, Antoine; Basurko, Celia; Dufour, Julie; Sainte Marie, Dominique; Blanchet, Denis; Simon, Stephane; Carme, Bernard; Couppié, Pierre

    2013-01-01

    To study the link between climatic variables and the incidence of leishmaniasis, a study was conducted in Cayenne, French Guiana, including patients infected between January 1994 and December 2010. Meteorological data were studied in relation to the incidence of leishmaniasis using an ARIMA model. In the final model, the infections were negatively correlated with rainfall (with a 2-month lag) and with the number of days with rainfall > 50 mm (lags of 4 and 7 months). The variables that were positively correlated were temperature and the Multivariate El Niño Southern Oscillation Index, with lags of 8 and 4 months, respectively. Significantly greater correlations were observed in March for rainfall and in November for the Multivariate El Niño/Southern Oscillation Index. Climate thus seems to be a non-negligible explanatory variable for the fluctuations of leishmaniasis. A decrease in rainfall is linked to increased cases 2 months later. This easily perceptible point could lead to an interesting prevention message. PMID:23939706
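
    A minimal sketch of the modelling strategy, with synthetic monthly data and a hypothetical 2-month rainfall lag as the exogenous regressor, is shown below; it is not the authors' model specification.

    ```python
    # Minimal sketch (synthetic data): monthly case counts regressed on lagged
    # rainfall within an ARIMA model, using statsmodels.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(8)
    months = pd.date_range("1994-01", periods=204, freq="MS")   # Jan 1994 - Dec 2010
    rain = pd.Series(100 + 50 * np.sin(np.arange(204) * 2 * np.pi / 12)
                     + rng.normal(0, 10, 204), index=months)
    cases = pd.Series(30 - 0.05 * rain.shift(2).fillna(rain.mean())
                      + rng.normal(0, 2, 204), index=months)

    exog = rain.shift(2).bfill().rename("rain_lag2").to_frame()  # 2-month rainfall lag
    fit = ARIMA(cases, exog=exog, order=(1, 0, 1)).fit()
    print(fit.params["rain_lag2"])              # expected to be negative, as reported
    ```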

  9. Meal Detection in Patients With Type 1 Diabetes: A New Module for the Multivariable Adaptive Artificial Pancreas Control System.

    PubMed

    Turksoy, Kamuran; Samadi, Sediqeh; Feng, Jianyuan; Littlejohn, Elizabeth; Quinn, Laurie; Cinar, Ali

    2016-01-01

    A novel meal-detection algorithm is developed based on continuous glucose measurements. Bergman's minimal model is modified and used in an unscented Kalman filter for state estimations. The estimated rate of appearance of glucose is used for meal detection. Data from nine subjects are used to assess the performance of the algorithm. The results indicate that the proposed algorithm works successfully with high accuracy. The average change in glucose levels between the meals and the detection points is 16(±9.42) [mg/dl] for 61 successfully detected meals and snacks. The algorithm is developed as a new module of an integrated multivariable adaptive artificial pancreas control system. Meal detection with the proposed method is used to administer insulin boluses and prevent most of postprandial hyperglycemia without any manual meal announcements. A novel meal bolus calculation method is proposed and tested with the UVA/Padova simulator. The results indicate significant reduction in hyperglycemia.

  10. Extraction, isolation, and purification of analytes from samples of marine origin--a multivariate task.

    PubMed

    Liguori, Lucia; Bjørsvik, Hans-René

    2012-12-01

    The development of a multivariate study for the quantitative analysis of six different polybrominated diphenyl ethers (PBDEs) in tissue of Atlantic Salmo salar L. is reported. An extraction, isolation, and purification process based on an accelerated solvent extraction system was designed, investigated, and optimized by means of statistical experimental design and multivariate data analysis and regression. An accompanying gas chromatography-mass spectrometry analytical method was developed for the identification and quantification of the analytes, BDE 28, BDE 47, BDE 99, BDE 100, BDE 153, and BDE 154. These PBDEs have been used in commercial blends employed as flame retardants for a variety of materials, including electronic devices, synthetic polymers and textiles. The present study revealed that an extracting solvent mixture composed of hexane and CH₂Cl₂ (10:90) provided excellent recoveries of all six PBDEs studied herein. A somewhat lower polarity of the extracting solvent, hexane and CH₂Cl₂ (40:60), decreased the analyte %-recoveries, which nevertheless remained acceptable and satisfactory. The study demonstrates the necessity of a thorough investigation of the extraction and purification process in order to achieve quantitative isolation of the analytes from the specific matrix. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Performance of the Prognocean Plus system during the El Niño 2015/2016: predictions of sea level anomalies as tools for forecasting El Niño

    NASA Astrophysics Data System (ADS)

    Świerczyńska-Chlaściak, Małgorzata; Niedzielski, Tomasz; Miziński, Bartłomiej

    2017-04-01

    The aim of this paper is to present the performance of the Prognocean Plus system, which produces long-term predictions of sea level anomalies, during the El Niño 2015/2016. The main objective of this work is to identify ocean areas in which long-term forecasts of sea level anomalies during El Niño 2015/2016 reveal considerable accuracy. At present, the system produces prognoses using four data-based models and their combinations: a polynomial-harmonic model, an autoregressive model, a threshold autoregressive model and a multivariate autoregressive model. The system offers weekly forecasts, with lead times of up to 12 weeks. Several statistics that describe the efficiency of the available prediction models in the four seasons used for estimating the Oceanic Niño Index (ONI) are calculated. The accuracies/skills of the prediction models were computed at specific locations in the equatorial Pacific, namely the geometrically determined central points of all Niño regions. For these locations, we focused on the forecasts that targeted the local maximum of sea level driven by the El Niño 2015/2016. As a result, a series of "spaghetti" graphs (for each point, season and model), as well as plots presenting the prognostic performance of every model for all lead times, seasons and locations, were created. It is found that the Prognocean Plus system has the potential to become a new solution that may enhance diagnostic discussions of El Niño development. The forecasts produced by the threshold autoregressive model, for lead times of 5-6 weeks and 9 weeks, within the Niño1+2 region for the November-to-January (NDJ) season anticipated the culmination of the El Niño 2015/2016. The longest forecasts (8-12 weeks) were found to be the most accurate in the phase of transition from El Niño to normal conditions (the multivariate autoregressive model, central point of the Niño1+2 region, the December-to-February season). The study was conducted to verify the ability and usefulness of sea level anomaly forecasts in predicting phenomena that are controlled by ocean-atmosphere processes, such as the El Niño Southern Oscillation or the North Atlantic Oscillation. The results may support further investigations into long-term forecasting of the quantitative indices of these oscillations, based solely on prognoses of sea level change. In particular, comparing the accuracies of prognoses of the North Atlantic Oscillation index remains one of the tasks of the research project no. 2016/21/N/ST10/03231, financed by the National Science Center of Poland.

  12. The PROPKD Score: A New Algorithm to Predict Renal Survival in Autosomal Dominant Polycystic Kidney Disease.

    PubMed

    Cornec-Le Gall, Emilie; Audrézet, Marie-Pierre; Rousseau, Annick; Hourmant, Maryvonne; Renaudineau, Eric; Charasse, Christophe; Morin, Marie-Pascale; Moal, Marie-Christine; Dantal, Jacques; Wehbe, Bassem; Perrichot, Régine; Frouget, Thierry; Vigneau, Cécile; Potier, Jérôme; Jousset, Philippe; Guillodo, Marie-Paule; Siohan, Pascale; Terki, Nazim; Sawadogo, Théophile; Legrand, Didier; Menoyo-Calonge, Victorio; Benarbia, Seddik; Besnier, Dominique; Longuet, Hélène; Férec, Claude; Le Meur, Yannick

    2016-03-01

    The course of autosomal dominant polycystic kidney disease (ADPKD) varies among individuals, with some reaching ESRD before 40 years of age and others never requiring RRT. In this study, we developed a prognostic model to predict renal outcomes in patients with ADPKD on the basis of genetic and clinical data. We conducted a cross-sectional study of 1341 patients from the Genkyst cohort and evaluated the influence of clinical and genetic factors on renal survival. Multivariate survival analysis identified four variables that were significantly associated with age at ESRD onset, and a scoring system from 0 to 9 was developed as follows: being male: 1 point; hypertension before 35 years of age: 2 points; first urologic event before 35 years of age: 2 points; PKD2 mutation: 0 points; nontruncating PKD1 mutation: 2 points; and truncating PKD1 mutation: 4 points. Three risk categories were subsequently defined as low risk (0-3 points), intermediate risk (4-6 points), and high risk (7-9 points) of progression to ESRD, with corresponding median ages for ESRD onset of 70.6, 56.9, and 49 years, respectively. Whereas a score ≤3 eliminates evolution to ESRD before 60 years of age with a negative predictive value of 81.4%, a score >6 forecasts ESRD onset before 60 years of age with a positive predictive value of 90.9%. This new prognostic score accurately predicts renal outcomes in patients with ADPKD and may enable the personalization of therapeutic management of ADPKD. Copyright © 2016 by the American Society of Nephrology.
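
    The published PROPKD point assignments and risk categories translate directly into a small calculator; the helper below is hypothetical code written for illustration, not software from the study.

    ```python
    # Minimal sketch (assumed helper): the PROPKD point assignments and risk
    # categories reported in the abstract.
    def propkd_score(male, hypertension_before_35, urologic_event_before_35, mutation):
        """mutation: one of 'PKD2', 'PKD1_nontruncating', 'PKD1_truncating'."""
        score = 1 if male else 0
        score += 2 if hypertension_before_35 else 0
        score += 2 if urologic_event_before_35 else 0
        score += {"PKD2": 0, "PKD1_nontruncating": 2, "PKD1_truncating": 4}[mutation]
        return score

    def risk_category(score):
        if score <= 3:
            return "low risk"            # median age at ESRD onset 70.6 years
        if score <= 6:
            return "intermediate risk"   # median age at ESRD onset 56.9 years
        return "high risk"               # median age at ESRD onset 49 years

    s = propkd_score(male=True, hypertension_before_35=True,
                     urologic_event_before_35=False, mutation="PKD1_truncating")
    print(s, risk_category(s))           # 7 -> high risk
    ```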

  13. Recent applications of multivariate data analysis methods in the authentication of rice and the most analyzed parameters: A review.

    PubMed

    Maione, Camila; Barbosa, Rommel Melgaço

    2018-01-24

    Rice is one of the most important staple foods around the world. Authentication of rice is one of the most frequently addressed concerns in the current literature and includes recognition of its geographical origin and variety, certification of organic rice and many other issues. Good results have been achieved by multivariate data analysis and data mining techniques when combined with specific parameters for ascertaining authenticity and many other useful characteristics of rice, such as quality and yield. This paper reviews recent research on the discrimination and authentication of rice using multivariate data analysis and data mining techniques. We found that data obtained from image processing, molecular and atomic spectroscopy, elemental fingerprinting, genetic markers, molecular content and other sources are promising sources of information regarding the geographical origin, variety and other aspects of rice, and are widely used in combination with multivariate data analysis techniques. Principal component analysis and linear discriminant analysis are the preferred methods, but several other classification techniques, such as support vector machines and artificial neural networks, are also frequently used and show high performance for the discrimination of rice.

  14. Southeast Atlantic Cloud Properties in a Multivariate Statistical Model - How Relevant is Air Mass History for Local Cloud Properties?

    NASA Astrophysics Data System (ADS)

    Fuchs, Julia; Cermak, Jan; Andersen, Hendrik

    2017-04-01

    This study aims at untangling the impacts of external dynamics and local conditions on cloud properties in the Southeast Atlantic (SEA) by combining satellite and reanalysis data using multivariate statistics. The understanding of clouds and their determinants at different scales is important for constraining the Earth's radiative budget, and thus prominent in climate-system research. In this study, SEA stratocumulus cloud properties are observed not only as the result of local environmental conditions but also as affected by external dynamics and spatial origins of air masses entering the study area. In order to assess to what extent cloud properties are impacted by aerosol concentration, air mass history, and meteorology, a multivariate approach is conducted using satellite observations of aerosol and cloud properties (MODIS, SEVIRI), information on aerosol species composition (MACC) and meteorological context (ERA-Interim reanalysis). To account for the often-neglected but important role of air mass origin, information on air mass history based on HYSPLIT modeling is included in the statistical model. This multivariate approach is intended to lead to a better understanding of the physical processes behind observed stratocumulus cloud properties in the SEA.

  15. Field applications of stand-off sensing using visible/NIR multivariate optical computing

    NASA Astrophysics Data System (ADS)

    Eastwood, DeLyle; Soyemi, Olusola O.; Karunamuni, Jeevanandra; Zhang, Lixia; Li, Hongli; Myrick, Michael L.

    2001-02-01

    A novel multivariate visible/NIR optical computing approach applicable to standoff sensing will be demonstrated with porphyrin mixtures as examples. The ultimate goal is to develop environmental or counter-terrorism sensors for chemicals such as organophosphorus (OP) pesticides or chemical warfare simulants in the near infrared spectral region. The mathematical operation that characterizes prediction of properties via regression from optical spectra is a calculation of inner products between the spectrum and the pre-determined regression vector. The result is scaled appropriately and offset to correspond to the basis from which the regression vector is derived. The process involves collecting spectroscopic data and synthesizing a multivariate vector using a pattern recognition method. Then, an interference coating is designed that reproduces the pattern of the multivariate vector in its transmission or reflection spectrum, and appropriate interference filters are fabricated. High and low refractive index materials such as Nb2O5 and SiO2 are excellent choices for the visible and near infrared regions. The proof of concept has now been established for this system in the visible and will later be extended to chemicals such as OP compounds in the near and mid-infrared.
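
    The stated operation, an inner product between the measured spectrum and a pre-determined regression vector followed by scaling and an offset, can be written in a few lines; all numerical values below are placeholders.

    ```python
    # Minimal sketch of the stated operation: the predicted property is an inner
    # product of the spectrum with a regression vector, scaled and offset to the
    # calibration basis. All numbers are placeholders.
    import numpy as np

    wavelengths = np.linspace(400, 1000, 301)             # nm, visible/NIR grid
    spectrum = np.exp(-((wavelengths - 650) / 40) ** 2)   # placeholder measured spectrum
    regression_vector = np.sin(wavelengths / 60)          # placeholder from calibration
    scale, offset = 0.8, 0.1                              # calibration scaling constants

    predicted_property = scale * np.dot(spectrum, regression_vector) + offset
    print(predicted_property)
    ```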

  16. Higher-order Multivariable Polynomial Regression to Estimate Human Affective States

    NASA Astrophysics Data System (ADS)

    Wei, Jie; Chen, Tong; Liu, Guangyuan; Yang, Jiemin

    2016-03-01

    Over the past decade, computational models such as multivariate linear regression, support vector regression and artificial neural networks have been proposed to estimate human affective states from direct observations of facial, vocal, gestural, physiological and central nervous signals. Among these models, linear models generally lack precision because they ignore the intrinsic nonlinearities of complex psychophysiological processes, while nonlinear models commonly rely on complicated algorithms. To improve accuracy and simplify the model, we introduce a new computational modeling method, higher-order multivariable polynomial regression, to estimate human affective states. The study employs standardized pictures from the International Affective Picture System to induce affective states in thirty subjects, and obtains pure affective patterns of skin conductance as input variables to the higher-order multivariable polynomial model for predicting affective valence and arousal. Experimental results show that the method achieves correlation coefficients of 0.98 and 0.96 for the estimation of affective valence and arousal, respectively. Moreover, the method may provide indirect evidence that valence and arousal originate in the brain's motivational circuits. The proposed method can thus serve as a novel, efficient approach for estimating human affective states.
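
    A generic sketch of higher-order multivariable polynomial regression, via an explicit polynomial feature expansion and ordinary least squares on synthetic data, is shown below; it is not the study's model of skin-conductance features.

    ```python
    # Minimal sketch (synthetic data): higher-order multivariable polynomial
    # regression via polynomial feature expansion plus ordinary least squares.
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(9)
    X = rng.uniform(-1, 1, size=(200, 3))        # e.g. 3 physiological input variables
    y = 1 + 2*X[:, 0] - X[:, 1]**2 + 0.5*X[:, 0]*X[:, 2] + rng.normal(0, 0.05, 200)

    model = make_pipeline(PolynomialFeatures(degree=3, include_bias=False),
                          LinearRegression())
    model.fit(X, y)
    print(model.score(X, y))                      # R^2 of the cubic polynomial fit
    ```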

  17. Risk factors for incidental durotomy during lumbar surgery: a retrospective study by multivariate analysis.

    PubMed

    Chen, Zhixiang; Shao, Peng; Sun, Qizhao; Zhao, Dong

    2015-03-01

    The purpose of the present study was to use prospectively collected data to evaluate the rate of incidental durotomy (ID) during lumbar surgery and to determine the associated risk factors using univariate and multivariate analysis. We retrospectively reviewed 2184 patients who underwent lumbar surgery from January 1, 2009 to December 31, 2011 at a single hospital. Patients with ID (n=97) were compared with patients without ID (n=2019). The influence of several potential risk factors on the occurrence of ID was assessed using univariate and multivariate analyses. The overall incidence of ID was 4.62%. Univariate analysis demonstrated that older age, diabetes, lumbar central stenosis, posterior approach, revision surgery, prior lumbar surgery and minimally invasive surgery were risk factors for ID during lumbar surgery. However, multivariate analysis identified older age, prior lumbar surgery, revision surgery, and minimally invasive surgery as independent risk factors for ID during lumbar surgery. These findings may guide clinicians in future surgical decisions regarding ID and aid in patient counseling to mitigate risks and complications. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Higher-order Multivariable Polynomial Regression to Estimate Human Affective States

    PubMed Central

    Wei, Jie; Chen, Tong; Liu, Guangyuan; Yang, Jiemin

    2016-01-01

    Computational models that estimate human affective states from direct observations of facial, vocal, gestural, physiological, and central nervous signals, such as multivariate linear-regression analysis, support vector regression, and artificial neural networks, have been proposed in the past decade. Among these, linear models generally lack precision because they ignore the intrinsic nonlinearities of complex psychophysiological processes, while nonlinear models commonly rely on complicated algorithms. To improve accuracy while keeping the model simple, we introduce a new computational modeling method, higher-order multivariable polynomial regression, to estimate human affective states. The study employs standardized pictures from the International Affective Picture System to induce affective states in thirty subjects and uses pure affective patterns of skin conductance as input variables to the higher-order multivariable polynomial model for predicting affective valence and arousal. Experimental results show that the method obtains correlation coefficients of 0.98 and 0.96 for the estimation of affective valence and arousal, respectively. Moreover, the method may provide indirect evidence that valence and arousal originate in the brain's motivational circuits. The proposed method can thus serve as an efficient novel approach to estimating human affective states. PMID:26996254

  19. Southern Ocean Mixed-Layer Seasonal and Interannual Variations From Combined Satellite and In Situ Data

    NASA Astrophysics Data System (ADS)

    Buongiorno Nardelli, B.; Guinehut, S.; Verbrugge, N.; Cotroneo, Y.; Zambianchi, E.; Iudicone, D.

    2017-12-01

    The depth of the upper ocean mixed layer provides fundamental information on the amount of seawater that directly interacts with the atmosphere. Its space-time variability modulates water mass formation and carbon sequestration processes related to both the physical and biological pumps. These processes are particularly relevant in the Southern Ocean, where surface mixed-layer depth estimates are generally obtained either as climatological fields derived from in situ observations or through numerical simulations. Here we demonstrate that weekly observation-based reconstructions can be used to describe the variations of the mixed-layer depth in the upper ocean over a range of space and time scales. We compare and validate four different products obtained by combining satellite measurements of the sea surface temperature, salinity, and dynamic topography and in situ Argo profiles. We also compute an ensemble mean and use the corresponding spread to estimate mixed-layer depth uncertainties and to identify the more reliable products. The analysis highlights the advantage of synergistic approaches that include as input sea surface salinity observations obtained through multivariate optimal interpolation. These data allow assessment of the seasonal and interannual variability of the mixed-layer depth. Specifically, the maximum correlations between mixed-layer anomalies and the Southern Annular Mode are found at different time lags, related to distinct summer/winter responses in the Antarctic Intermediate Water and Sub-Antarctic Mode Waters main formation areas.

  20. A multivariate assessment of changes in wetland habitat for waterbirds at Moosehorn National Wildlife Refuge, Maine, USA

    USGS Publications Warehouse

    Hierl, L.A.; Loftin, C.S.; Longcore, J.R.; McAuley, D.G.; Urban, D.L.

    2007-01-01

    We assessed changes in vegetative structure of 49 impoundments at Moosehorn National Wildlife Refuge (MNWR), Maine, USA, between 1984-1985 and 2002 with a multivariate, adaptive approach that may be useful in a variety of wetland and other habitat management situations. We used Mahalanobis Distance (MD) analysis to classify the refuge's wetlands as poor or good waterbird habitat based on five variables: percent emergent vegetation, percent shrub, percent open water, relative richness of vegetative types, and an interspersion juxtaposition index that measures adjacency of vegetation patches. Mahalanobis Distance is a multivariate statistic that examines whether a particular data point is an outlier or a member of a data cluster while accounting for correlations among inputs. For each wetland, we used MD analysis to quantify a distance from a reference condition defined a priori by habitat conditions measured in MNWR wetlands used by waterbirds. Twenty-five wetlands declined in quality between the two periods, whereas 23 wetlands improved. We identified specific wetland characteristics that may be modified to improve habitat conditions for waterbirds. The MD analysis seems ideal for instituting an adaptive wetland management approach because metrics can be easily added or removed, ranges of target habitat conditions can be defined by field-collected data, and the analysis can identify priorities for single or multiple management objectives.
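    The Mahalanobis distance calculation described above can be sketched directly: the distance of one wetland's habitat metrics from a reference condition, accounting for correlations among the metrics. The metric values and reference data below are illustrative assumptions, not the refuge's actual measurements.

```python
# Sketch of the Mahalanobis distance of a candidate wetland from a reference
# condition defined by wetlands known to be used by waterbirds.
import numpy as np

rng = np.random.default_rng(2)
# Rows: reference wetlands. Columns (illustrative): % emergent vegetation,
# % shrub, % open water, richness of vegetative types, interspersion index.
reference = rng.normal(loc=[40, 20, 30, 5, 60], scale=[8, 5, 6, 1, 10], size=(25, 5))

mu = reference.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

def mahalanobis_distance(x, mu=mu, cov_inv=cov_inv):
    """Distance of one wetland's metrics x from the reference centroid."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

candidate = np.array([55, 10, 20, 3, 40])  # one wetland to evaluate
print(f"Mahalanobis distance from reference condition: {mahalanobis_distance(candidate):.2f}")
```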

  1. Estimating multivariate response surface model with data outliers, case study in enhancing surface layer properties of an aircraft aluminium alloy

    NASA Astrophysics Data System (ADS)

    Widodo, Edy; Kariyam

    2017-03-01

    Response Surface Methodology (RSM) is used to determine the input variable settings that create the optimal compromise among the response variables. There are three primary steps in an RSM problem, namely data collection, modelling, and optimization. This study focuses on the construction of response surface models, under the assumption that the collected data are correct. The response surface model parameters are usually estimated by OLS; however, this method is highly sensitive to outliers. Outliers can generate substantial residuals and often distort the estimated models, which may become biased and lead to errors in determining the optimal point, so that the main purpose of RSM is not achieved. Moreover, in practice the collected data often contain several response variables together with a set of independent variables. Treating each response separately and applying single-response procedures can result in wrong interpretations, so a model for the multi-response case is needed. A multivariate response surface model that is resistant to outliers is therefore required. As an alternative, this study discusses M-estimation as a parameter estimator for multivariate response surface models containing outliers. As an illustration, a case study is presented on experimental results for enhancing the surface layer of an aircraft aluminium alloy by shot peening.
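    As a hedged sketch of the outlier-resistant idea discussed above (not the paper's multi-response estimator), a second-order response surface can be fitted with a Huber M-estimator and compared against OLS. The data, the two coded factors, and the single-response simplification are assumptions for illustration.

```python
# Robust (Huber M-estimation) fit of a quadratic response surface, contrasted
# with OLS on the same data after a few outliers are injected.
import numpy as np
import statsmodels.api as sm
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(30, 2))                     # two coded process factors
y = (5 + 2 * X[:, 0] - X[:, 1] + 1.5 * X[:, 0] * X[:, 1]
     - 3 * X[:, 0] ** 2 + rng.normal(0, 0.2, 30))
y[:3] += 8.0                                             # inject a few outliers

# Second-order design matrix with intercept, linear, interaction, and square terms
design = PolynomialFeatures(degree=2, include_bias=True).fit_transform(X)

ols_fit = sm.OLS(y, design).fit()
rlm_fit = sm.RLM(y, design, M=sm.robust.norms.HuberT()).fit()

print("OLS coefficients:   ", np.round(ols_fit.params, 2))
print("Huber M-estimates:  ", np.round(rlm_fit.params, 2))
```

    The M-estimates stay close to the underlying surface despite the contaminated observations, which is the property that makes the robust fit a safer basis for locating the optimum.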

  2. Evaluating the predictive power of multivariate tensor-based morphometry in Alzheimer's disease progression via convex fused sparse group Lasso

    NASA Astrophysics Data System (ADS)

    Tsao, Sinchai; Gajawelli, Niharika; Zhou, Jiayu; Shi, Jie; Ye, Jieping; Wang, Yalin; Lepore, Natasha

    2014-03-01

    Prediction of Alzheimer's disease (AD) progression based on baseline measures allows us to understand disease progression and has implications for decisions concerning treatment strategy. To this end we combine a predictive multi-task machine learning method [1] with a novel MR-based multivariate morphometric surface map of the hippocampus [2] to predict future cognitive scores of patients. Previous work by Zhou et al. [1] has shown that a multi-task learning framework that predicts all future time points (or tasks) simultaneously can encode both sparsity and temporal smoothness, and can be used to predict cognitive outcomes of Alzheimer's Disease Neuroimaging Initiative (ADNI) subjects based on FreeSurfer-derived baseline MRI features, MMSE score, demographic information, and ApoE status. While volumetric information may hold generalized information on brain status, we hypothesized that hippocampus-specific information may be more useful for predictive modeling of AD. To this end, we applied the multivariate tensor-based morphometry (mTBM) parametric surface analysis method recently developed by Shi et al. [2] to extract features from the hippocampal surface. We show that by combining the power of the multi-task framework with the sensitivity of mTBM features of the hippocampal surface, we are able to significantly improve the predictive performance for ADAS cognitive scores 6, 12, 24, 36 and 48 months from baseline.
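    The multi-task idea can be sketched in simplified form: predict cognitive scores at several future time points jointly from baseline features, so that feature selection is shared across time points. The snippet below uses scikit-learn's MultiTaskLasso as a stand-in; the paper's convex fused sparse group Lasso additionally enforces temporal smoothness. Data shapes and feature semantics are assumptions.

```python
# Joint prediction of scores at multiple future time points with shared sparsity.
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(4)
n_subjects, n_features, n_timepoints = 100, 50, 5    # e.g. 6, 12, 24, 36, 48 months

X = rng.normal(size=(n_subjects, n_features))        # baseline surface/volumetric features
true_coef = np.zeros((n_timepoints, n_features))
true_coef[:, :5] = rng.normal(size=(n_timepoints, 5))  # only a few features matter
Y = X @ true_coef.T + rng.normal(scale=0.5, size=(n_subjects, n_timepoints))

model = MultiTaskLasso(alpha=0.1).fit(X, Y)
selected = np.flatnonzero(np.abs(model.coef_).sum(axis=0) > 1e-8)
print("features selected jointly across all time points:", selected)
```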

  3. Prognostic value of indeterminable anaerobic threshold in heart failure.

    PubMed

    Agostoni, Piergiuseppe; Corrà, Ugo; Cattadori, Gaia; Veglia, Fabrizio; Battaia, Elisa; La Gioia, Rocco; Scardovi, Angela B; Emdin, Michele; Metra, Marco; Sinagra, Gianfranco; Limongelli, Giuseppe; Raimondo, Rosa; Re, Federica; Guazzi, Marco; Belardinelli, Romualdo; Parati, Gianfranco; Magrì, Damiano; Fiorentini, Cesare; Cicoira, Mariantonietta; Salvioni, Elisabetta; Giovannardi, Marta; Mezzani, Alessandro; Scrutinio, Domenico; Di Lenarda, Andrea; Mantegazza, Valentina; Ricci, Roberto; Apostolo, Anna; Iorio, Annamaria; Paolillo, Stefania; Palermo, Pietro; Contini, Mauro; Vassanelli, Corrado; Passino, Claudio; Piepoli, Massimo F

    2013-09-01

    In patients with heart failure (HF), the anaerobic threshold (AT) is not always identified during a maximal cardiopulmonary exercise test. We evaluated whether this finding has prognostic meaning. We recruited and prospectively followed up, in 14 dedicated HF units, 3058 patients with systolic (left ventricular ejection fraction <40%) HF in stable clinical conditions, New York Heart Association class I to III, who underwent clinical, laboratory, echocardiographic, and cardiopulmonary exercise test investigations at study enrollment. We excluded 921 patients who did not perform a maximal exercise, based on lack of achievement of anaerobic metabolism (peak respiratory quotient ≤1.05). The primary study end point was a composite of cardiovascular death and urgent cardiac transplant, and the secondary end point was all-cause death. Median follow-up was 3.01 (1.39-4.98) years. AT was identified in 1935 out of 2137 patients (90.54%). At multivariable logistic analysis, failure to detect AT was significantly associated with reduced peak oxygen uptake and a higher metabolic exercise and cardiac and kidney index score, a powerful prognostic composite HF index (P<0.001). At multivariable analysis, the following variables were significantly associated with the primary study end point: peak oxygen uptake (% pred; P<0.001; hazard ratio [HR]=0.977; confidence interval [CI]=0.97-0.98), ventilatory efficiency slope (P=0.01; HR=1.02; CI=1.01-1.03), hemoglobin (P<0.05; HR=0.931; CI=0.87-1.00), left ventricular ejection fraction (P<0.001; HR=0.948; CI=0.94-0.96), renal function (modification of diet in renal disease; P<0.001; HR=0.990; CI=0.98-0.99), sodium (P<0.05; HR=0.967; CI=0.94-0.99), and AT nonidentification (P<0.05; HR=1.41; CI=1.06-1.89). Nonidentification of AT remained associated with prognosis even when compared with the metabolic exercise and cardiac and kidney index score (P<0.01; HR=1.459; CI=1.09-1.10). Similar results were obtained for the secondary study end point. The inability to identify AT most often occurs in patients with severe HF, and it has an independent prognostic role in HF.

  4. Developing points-based risk-scoring systems in the presence of competing risks.

    PubMed

    Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P

    2016-09-30

    Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
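    For orientation, points-based systems are commonly derived by translating regression coefficients into integer points relative to a reference effect (the Sullivan approach). The sketch below illustrates only that generic translation step; the coefficients are invented, and the paper's competing-risks modelling (and its R code) is not reproduced here.

```python
# Illustrative conversion of model coefficients into an integer points system.
coefficients = {          # e.g. log hazard ratios from a fitted competing-risks model
    "age_per_decade": 0.45,
    "diabetes": 0.30,
    "prior_mi": 0.60,
    "low_sbp": 0.90,
}
reference = coefficients["age_per_decade"]   # 1 point := effect of one decade of age

points = {name: round(beta / reference) for name, beta in coefficients.items()}
print(points)   # {'age_per_decade': 1, 'diabetes': 1, 'prior_mi': 1, 'low_sbp': 2}

# A patient's total score is the sum of points for the factors they carry.
patient = {"age_per_decade": 7, "diabetes": 1, "prior_mi": 0, "low_sbp": 1}
total = sum(points[k] * v for k, v in patient.items())
print("total score:", total)
```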

  5. Optimal Parameter Exploration for Online Change-Point Detection in Activity Monitoring Using Genetic Algorithms

    PubMed Central

    Khan, Naveed; McClean, Sally; Zhang, Shuai; Nugent, Chris

    2016-01-01

    In recent years, smart phones with inbuilt sensors have become popular devices to facilitate activity recognition. The sensors capture a large amount of data, containing meaningful events, in a short period of time. The change points in this data are used to specify transitions to distinct events and can be used in various scenarios such as identifying change in a patient’s vital signs in the medical domain or requesting activity labels for generating real-world labeled activity datasets. Our work focuses on change-point detection to identify a transition from one activity to another. Within this paper, we extend our previous work on multivariate exponentially weighted moving average (MEWMA) algorithm by using a genetic algorithm (GA) to identify the optimal set of parameters for online change-point detection. The proposed technique finds the maximum accuracy and F_measure by optimizing the different parameters of the MEWMA, which subsequently identifies the exact location of the change point from an existing activity to a new one. Optimal parameter selection facilitates an algorithm to detect accurate change points and minimize false alarms. Results have been evaluated based on two real datasets of accelerometer data collected from a set of different activities from two users, with a high degree of accuracy from 99.4% to 99.8% and F_measure of up to 66.7%. PMID:27792177
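    The MEWMA statistic that the genetic algorithm tunes can be written compactly: each new observation updates an exponentially weighted mean vector, and an alarm is raised when its scaled squared distance from the in-control mean exceeds a control limit. The smoothing weight and control limit below are exactly the kind of parameters the paper optimizes; their values and the synthetic data are illustrative assumptions.

```python
# Minimal multivariate EWMA (MEWMA) change-point detector.
import numpy as np

def mewma_statistics(X, mean, cov, lam=0.2):
    """Return the MEWMA T^2 statistic for each observation (rows of X)."""
    p = X.shape[1]
    z = np.zeros(p)
    stats = []
    for t, x in enumerate(X, start=1):
        z = lam * (x - mean) + (1 - lam) * z
        # Exact covariance of z at step t
        cov_z = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * t)) * cov
        stats.append(float(z @ np.linalg.solve(cov_z, z)))
    return np.array(stats)

rng = np.random.default_rng(5)
baseline = rng.normal(size=(200, 3))                 # e.g. tri-axial accelerometer stream
shifted = rng.normal(loc=0.8, size=(100, 3))         # a change in activity
X = np.vstack([baseline, shifted])

t2 = mewma_statistics(X, mean=np.zeros(3), cov=np.eye(3), lam=0.2)
h = 12.0                                             # illustrative control limit
print("first alarm at observation:", int(np.argmax(t2 > h)))
```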

  6. Predictors of health-related quality of life in 500 severely obese patients.

    PubMed

    Warkentin, Lindsey M; Majumdar, Sumit R; Johnson, Jeffrey A; Agborsangaya, Calypse B; Rueda-Clausen, Christian F; Sharma, Arya M; Klarenbach, Scott W; Birch, Daniel W; Karmali, Shahzeer; McCargar, Linda; Fassbender, Konrad; Padwal, Raj S

    2014-05-01

    To characterize health-related quality of life (HRQL) impairment in severely obese subjects, using several validated instruments. A cross-sectional analysis of 500 severely obese subjects was completed. Short-Form (SF)-12 [Physical (PCS) and Mental (MCS) component summary scores], EuroQol (EQ)-5D [Index and Visual Analog Scale (VAS)], and Impact of Weight on Quality of Life (IWQOL)-Lite were administered. Multivariable linear regression models were performed to identify independent predictors of HRQL. Increasing BMI was associated with lower PCS (-1.33 points per 5 kg/m² heavier; P < 0.001), EQ-index (-0.02; P < 0.001), EQ-VAS (-1.71; P = 0.003), and IWQOL-Lite (-3.72; P = 0.002), but not MCS (P = 0.69). The strongest predictors (all P < 0.005) for impairment in each instrument were: fibromyalgia for PCS (-5.84 points), depression for MCS (-7.49 points), stroke for EQ-index (-0.17 points), less than full-time employment for EQ-VAS (-7.06 points), and coronary disease for IWQOL-Lite (-10.86 points). Chronic pain, depression, and sleep apnea were associated with reduced HRQL using all instruments. The clinical impact of BMI on physical and general HRQL was small, and mental health scores were not associated with BMI. Chronic pain, depression, and sleep apnea were consistently associated with lower HRQL. Copyright © 2014 The Obesity Society.

  7. Multivariate Analysis of the Visual Information Processing of Numbers

    ERIC Educational Resources Information Center

    Levine, David M.

    1977-01-01

    Nonmetric multidimensional scaling and hierarchical clustering procedures are applied to a confusion matrix of numerals. Two dimensions were interpreted: straight versus curved, and locus of curvature. Four major clusters of numerals were developed. (Author/JKS)

  8. Derivation and Validation of a Serum Biomarker Panel to Identify Infants With Acute Intracranial Hemorrhage.

    PubMed

    Berger, Rachel Pardes; Pak, Brian J; Kolesnikova, Mariya D; Fromkin, Janet; Saladino, Richard; Herman, Bruce E; Pierce, Mary Clyde; Englert, David; Smith, Paul T; Kochanek, Patrick M

    2017-06-05

    Abusive head trauma is the leading cause of death from physical abuse. Missing the diagnosis of abusive head trauma, particularly in its mild form, is common and contributes to increased morbidity and mortality. Serum biomarkers may have potential as quantitative point-of-care screening tools to alert physicians to the possibility of intracranial hemorrhage. To identify and validate a set of biomarkers that could be the basis of a multivariable model to identify intracranial hemorrhage in well-appearing infants using the Ziplex System. Binary logistic regression was used to develop a multivariable model incorporating 3 serum biomarkers (matrix metallopeptidase-9, neuron-specific enolase, and vascular cellular adhesion molecule-1) and 1 clinical variable (total hemoglobin). The model was then prospectively validated. Multiplex biomarker measurements were performed using Flow-Thru microarray technology on the Ziplex System, which has potential as a point-of-care system. The model was tested at 3 pediatric emergency departments in level I pediatric trauma centers (Children's Hospital of Pittsburgh of University of Pittsburgh Medical Center, Pittsburgh, Pennsylvania; Primary Children's Hospital, Salt Lake City, Utah; and Lurie Children's Hospital, Chicago, Illinois) among well-appearing infants who presented for care owing to symptoms that placed them at increased risk of abusive head trauma. The study took place from November 2006 to April 2014 at Children's Hospital of Pittsburgh, June 2010 to August 2013 at Primary Children's Hospital, and January 2011 to August 2013 at Lurie Children's Hospital. A mathematical model that can predict acute intracranial hemorrhage in infants at increased risk of abusive head trauma. The multivariable model, Biomarkers for Infant Brain Injury Score, was applied prospectively to 599 patients. The mean (SD) age was 4.7 (3.1) months. Fifty-two percent were boys, 78% were white, and 8% were Hispanic. At a cutoff of 0.182, the model was 89.3% sensitive (95% CI, 87.7-90.4) and 48.0% specific (95% CI, 47.3-48.9) for acute intracranial hemorrhage. Positive and negative predictive values were 21.3% and 95.6%, respectively. The model was neither sensitive nor specific for atraumatic brain abnormalities, isolated skull fractures, or chronic intracranial hemorrhage. The Biomarkers for Infant Brain Injury Score, a multivariable model using 3 serum biomarker concentrations and serum hemoglobin, can identify infants with acute intracranial hemorrhage. Accurate and timely identification of intracranial hemorrhage in infants without a history of trauma in whom trauma may not be part of the differential diagnosis has the potential to decrease morbidity and mortality from abusive head trauma.
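    The modelling step described above is a binary logistic regression over three biomarker concentrations plus haemoglobin, with a probability cutoff used to flag possible intracranial haemorrhage. The sketch below illustrates that generic step; the data and fitted coefficients are synthetic, and only the cutoff value (0.182) is taken from the abstract.

```python
# Generic sketch: logistic regression over biomarkers, thresholded at 0.182.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 300
# Columns (illustrative): MMP-9, NSE, VCAM-1 (arbitrary units), haemoglobin (g/dL)
X = rng.normal(loc=[100, 10, 500, 11], scale=[30, 3, 120, 1.5], size=(n, 4))
# Synthetic outcome loosely tied to the predictors, for demonstration only
logit = 0.01 * (X[:, 0] - 100) + 0.2 * (X[:, 1] - 10) - 0.5 * (X[:, 3] - 11) - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
risk = model.predict_proba(X)[:, 1]
flagged = risk >= 0.182            # cutoff reported in the abstract
print("screen-positive fraction:", round(flagged.mean(), 3))
```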

  9. Hidden Process Models

    DTIC Science & Technology

    2009-12-18

    cannot be detected with univariate techniques, but require multivariate analysis instead (Kamitani and Tong [2005]). Two other time series analysis ... learning for time series analysis. The historical record of DBNs can be traced back to Dean and Kanazawa [1988] and Dean and Wellman [1991], with ... Keywords: Hidden Process Models, probabilistic time series modeling, functional Magnetic Resonance Imaging

  10. A Multivariate Quality Loss Function Approach for Optimization of Spinning Processes

    NASA Astrophysics Data System (ADS)

    Chakraborty, Shankar; Mitra, Ankan

    2018-05-01

    Recent advancements in the textile industry have given rise to several spinning techniques, such as ring spinning, rotor spinning etc., which can be used to produce a wide variety of textile apparels so as to fulfil the end requirements of the customers. To achieve the best out of these processes, they should be operated at their optimal parametric settings. However, in the presence of multiple yarn characteristics which are often conflicting in nature, it becomes a challenging task for spinning industry personnel to identify the best parametric mix which would simultaneously optimize all the responses. Hence, in this paper, the applicability of a new systematic approach in the form of the multivariate quality loss function technique is explored for optimizing multiple quality characteristics of yarns while identifying the ideal settings of two spinning processes. It is observed that this approach performs well against other multi-objective optimization techniques, such as the desirability function, distance function and mean squared error methods. With slight modifications in the upper and lower specification limits of the considered quality characteristics, and in the constraints of the non-linear optimization problem, it can be successfully applied to other processes in the textile industry to determine their optimal parametric settings.
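    For intuition, a multivariate quality loss function of the Taguchi type can be written as a quadratic form penalising deviations of several quality characteristics from their targets, with a cost matrix weighting (and, if desired, coupling) the responses. The responses, targets, and cost matrix below are illustrative assumptions, not the paper's values.

```python
# Illustrative multivariate quadratic quality loss over several yarn responses.
import numpy as np

targets = np.array([15.0, 5.0, 300.0])      # e.g. strength, hairiness, unevenness targets
K = np.diag([2.0, 5.0, 0.01])               # relative costs of deviation per response

def multivariate_quality_loss(y, m=targets, K=K):
    d = np.asarray(y) - m
    return float(d @ K @ d)

candidate_settings = {
    "setting_A": [14.2, 5.5, 310.0],
    "setting_B": [15.1, 6.2, 295.0],
}
losses = {name: multivariate_quality_loss(y) for name, y in candidate_settings.items()}
print(losses)   # the parametric setting with the smaller loss is preferred
```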

  11. UV-visible-DAD and 1H-NMR spectroscopy data fusion for studying the photodegradation process of azo-dyes using MCR-ALS.

    PubMed

    Fernández, Cristina; Pilar Callao, M; Larrechi, M Soledad

    2013-12-15

    The photodegradation process of three azo-dyes - Acid Orange 61, Acid Red 97 and Acid Brown 425 - was monitored simultaneously by ultraviolet-visible spectroscopy with diode array detector (UV-vis-DAD) and ¹H-nuclear magnetic resonance (¹H-NMR). Multivariate curve resolution-alternating least squares (MCR-ALS) was applied to obtain the concentration and spectral profiles of the chemical compounds involved in the process. The analysis of the ¹H-NMR data suggests there are more intermediate compounds than those obtained with the UV-vis-DAD data. The fusion of the UV-vis-DAD and ¹H-NMR signals before the multivariate analysis provides better results than when only one of the two detector signals was used. It was concluded that three degradation products were present in the medium when the three azo-dyes had practically degraded. This study is the first application of UV-vis-DAD and ¹H-NMR spectroscopy data fusion in this field and illustrates its potential as a quick method for evaluating the evolution of the azo-dye photodegradation process. © 2013 Elsevier B.V. All rights reserved.
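    At its core, MCR-ALS factors a data matrix of spectra over time into concentration profiles and spectral profiles by alternating least squares under constraints such as non-negativity. The bare-bones sketch below illustrates only that alternating core on synthetic data; proper non-negative least squares, initialisation strategies, convergence criteria, and the multi-detector data fusion used in the paper are omitted.

```python
# Bare-bones alternating-least-squares factorisation D ~ C @ S.T with a crude
# non-negativity projection (clipping) after each step.
import numpy as np

rng = np.random.default_rng(7)
n_times, n_channels, n_components = 60, 120, 3

C_true = np.abs(rng.random((n_times, n_components)))
S_true = np.abs(rng.random((n_channels, n_components)))
D = C_true @ S_true.T + rng.normal(scale=0.01, size=(n_times, n_channels))

S = np.abs(rng.random((n_channels, n_components)))   # crude initial spectral guess
for _ in range(200):
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)

residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(f"relative reconstruction error: {residual:.4f}")
```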

  12. Evaluation of multivariate calibration models with different pre-processing and processing algorithms for a novel resolution and quantitation of spectrally overlapped quaternary mixture in syrup

    NASA Astrophysics Data System (ADS)

    Moustafa, Azza A.; Hegazy, Maha A.; Mohamed, Dalia; Ali, Omnia

    2016-02-01

    A novel approach for the resolution and quantitation of a severely overlapped quaternary mixture of carbinoxamine maleate (CAR), pholcodine (PHL), ephedrine hydrochloride (EPH) and sunset yellow (SUN) in syrup was demonstrated utilizing different spectrophotometric multivariate calibration methods. The applied methods use different processing and pre-processing algorithms. The proposed methods were partial least squares (PLS), concentration residuals augmented classical least squares (CRACLS), and a novel method, continuous wavelet transform coupled with partial least squares (CWT-PLS). These methods were applied to a training set in the concentration ranges of 40-100 μg/mL, 40-160 μg/mL, 100-500 μg/mL and 8-24 μg/mL for the four components, respectively. The methods did not require any preliminary separation step or chemical pretreatment. The validity of the methods was evaluated with an external validation set. The selectivity of the developed methods was demonstrated by analyzing the drugs in their combined pharmaceutical formulation without any interference from additives. The obtained results were statistically compared with the official and reported methods, with no significant difference observed in either accuracy or precision.
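    The PLS step named above can be sketched as a regression of component concentrations on mixture absorbance spectra. The spectra and concentration ranges below are synthetic stand-ins, and the wavelet pre-processing used in CWT-PLS is not shown.

```python
# Partial least squares calibration of a four-component mixture from spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(8)
n_mixtures, n_wavelengths, n_analytes = 25, 180, 4

# Synthetic pure-component spectra and a calibration (training) set
pure_spectra = np.abs(rng.normal(size=(n_analytes, n_wavelengths)))
concentrations = rng.uniform(0.2, 1.0, size=(n_mixtures, n_analytes))
absorbance = concentrations @ pure_spectra + rng.normal(scale=0.01, size=(n_mixtures, n_wavelengths))

pls = PLSRegression(n_components=4).fit(absorbance, concentrations)

# Predict concentrations for a new, unseen mixture
new_conc = rng.uniform(0.2, 1.0, size=(1, n_analytes))
new_spec = new_conc @ pure_spectra
print("true:     ", np.round(new_conc, 2))
print("predicted:", np.round(pls.predict(new_spec), 2))
```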

  13. Impact of “Sick” and “Recovery” Roles on Brain Injury Rehabilitation Outcomes

    PubMed Central

    Barclay, David A.

    2012-01-01

    This study utilizes a multivariate, correlational, ex post facto research design to examine Parsons' "sick role" as a dynamic, time-sensitive process of "sick role" and "recovery role" and the impact of this process on goal attainment (H1) and psychosocial distress (H2) of adult survivors of acquired brain injury. Measures used include the Brief Symptom Inventory-18, a Goal Attainment Scale, and an original instrument to measure sick role process. Sixty survivors of ABI enrolled in community reentry rehabilitation participated. Stepwise regression analyses did not fully support the multivariate hypotheses. Two models emerged from the stepwise analyses. Goal attainment, gender, and postrehab responsibilities accounted for 40% of the shared variance of psychosocial distress. Anxiety and depression accounted for 22% of the shared variance of goal attainment, with anxiety contributing the majority of the explained variance. Bivariate analysis found sick role variables, anxiety, somatization, depression, gender, and goal attainment to be significant. The study has implications for ABI rehabilitation in placing greater emphasis on sick role processes, anxiety, gender, and goal attainment in guiding program planning and future research with survivors of ABI. PMID:23119164

  14. Diagnostic tools for mixing models of stream water chemistry

    USGS Publications Warehouse

    Hooper, Richard P.

    2003-01-01

    Mixing models provide a useful null hypothesis against which to evaluate processes controlling stream water chemical data. Because conservative mixing of end‐members with constant concentration is a linear process, a number of simple mathematical and multivariate statistical methods can be applied to this problem. Although mixing models have been most typically used in the context of mixing soil and groundwater end‐members, an extension of the mathematics of mixing models is presented that assesses the “fit” of a multivariate data set to a lower dimensional mixing subspace without the need for explicitly identified end‐members. Diagnostic tools are developed to determine the approximate rank of the data set and to assess lack of fit of the data. This permits identification of processes that violate the assumptions of the mixing model and can suggest the dominant processes controlling stream water chemical variation. These same diagnostic tools can be used to assess the fit of the chemistry of one site into the mixing subspace of a different site, thereby permitting an assessment of the consistency of controlling end‐members across sites. This technique is applied to a number of sites at the Panola Mountain Research Watershed located near Atlanta, Georgia.
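    The rank and lack-of-fit diagnostics described above can be sketched by projecting standardised stream-chemistry data onto low-dimensional subspaces obtained from a singular value decomposition and inspecting the residuals. The solutes and data below are synthetic assumptions, not Panola Mountain measurements.

```python
# PCA/SVD-based diagnostic: how well does a rank-k mixing subspace reproduce
# the standardised stream chemistry?
import numpy as np

rng = np.random.default_rng(9)
n_samples, n_solutes = 200, 6

# Synthetic data generated by conservative mixing of three end-members
# (so the centred data have approximate rank 2), plus measurement noise
end_members = rng.uniform(0, 100, size=(3, n_solutes))
fractions = rng.dirichlet(np.ones(3), size=n_samples)
chem = fractions @ end_members + rng.normal(scale=0.5, size=(n_samples, n_solutes))

X = (chem - chem.mean(axis=0)) / chem.std(axis=0)     # standardise each solute
U, s, Vt = np.linalg.svd(X, full_matrices=False)

for rank in range(1, n_solutes + 1):
    X_hat = U[:, :rank] * s[:rank] @ Vt[:rank]
    rrmse = np.linalg.norm(X - X_hat) / np.linalg.norm(X)
    print(f"rank {rank}: relative residual {rrmse:.3f}")
# A sharp drop in the residual at low rank is consistent with conservative
# mixing of a few end-members; structured residuals suggest other processes.
```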

  15. Up-scaling of multi-variable flood loss models from objects to land use units at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Schröter, Kai; Merz, Bruno

    2016-05-01

    Flood risk management increasingly relies on risk analyses, including loss modelling. Most of the flood loss models usually applied in standard practice have in common that complex damaging processes are described by simple approaches like stage-damage functions. Novel multi-variable models significantly improve loss estimation on the micro-scale and may also be advantageous for large-scale applications. However, more input parameters also introduce additional uncertainty, even more so in up-scaling procedures for meso-scale applications, where the parameters need to be estimated on a regional, area-wide basis. To gain more knowledge about the challenges associated with up-scaling multi-variable flood loss models, the following approach is applied: single- and multi-variable micro-scale flood loss models are up-scaled and applied on the meso-scale, namely on the basis of ATKIS land-use units. Application and validation are undertaken in 19 municipalities, which were affected during the 2002 flood by the River Mulde in Saxony, Germany, by comparison to official loss data provided by the Saxon Relief Bank (SAB). In the meso-scale, case-study-based model validation, most multi-variable models show smaller errors than the uni-variable stage-damage functions. The results show the suitability of the up-scaling approach and, in accordance with micro-scale validation studies, that multi-variable models are an improvement in flood loss modelling on the meso-scale as well. However, uncertainties remain high, stressing the importance of uncertainty quantification. Thus, the development of probabilistic loss models, like BT-FLEMO used in this study, which inherently provide uncertainty information, is the way forward.

  16. Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations

    NASA Technical Reports Server (NTRS)

    Chanchio, Kasidit; Sun, Xian-He

    1996-01-01

    This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points, whereas necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.

  17. The importance of topographically corrected null models for analyzing ecological point processes.

    PubMed

    McDowall, Philip; Lynch, Heather J

    2017-07-01

    Analyses of point process patterns and related techniques (e.g., MaxEnt) make use of the expected number of occurrences per unit area and second-order statistics based on the distance between occurrences. Ecologists working with point process data often assume that points exist on a two-dimensional x-y plane or within a three-dimensional volume, when in fact many observed point patterns are generated on a two-dimensional surface existing within three-dimensional space. For many surfaces, however, such as the topography of landscapes, the projection from the surface to the x-y plane preserves neither area nor distance. As such, when these point patterns are implicitly projected to and analyzed in the x-y plane, our expectations of the point pattern's statistical properties may not be met. When used in hypothesis testing, we find that the failure to account for the topography of the generating surface may bias statistical tests that incorrectly identify clustering and, furthermore, may bias coefficients in inhomogeneous point process models that incorporate slope as a covariate. We demonstrate the circumstances under which this bias is significant, and present simple methods that allow point processes to be simulated with corrections for topography. These point patterns can then be used to generate "topographically corrected" null models against which observed point processes can be compared. © 2017 by the Ecological Society of America.
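    One ingredient of the "topographically corrected" null models described above can be sketched simply: the local surface-area factor of a terrain grid, sqrt(1 + |grad z|²), rescales how many completely random points a planar cell should receive when the pattern is actually generated on the 3-D surface. The terrain, grid, and sampling scheme below are illustrative assumptions.

```python
# Simulate a surface-area-weighted "complete spatial randomness" null pattern.
import numpy as np

rng = np.random.default_rng(10)
nx = ny = 100
dx = 10.0                                        # grid spacing in metres
x = np.arange(nx) * dx
y = np.arange(ny) * dx
# Synthetic topography: a smooth hill
z = 200.0 * np.exp(-((x[None, :] - 500) ** 2 + (y[:, None] - 500) ** 2) / 1e5)

dz_dy, dz_dx = np.gradient(z, dx)
area_factor = np.sqrt(1.0 + dz_dx ** 2 + dz_dy ** 2)   # true surface area per planar cell

# Cell selection probabilities proportional to true surface area, not planar area
prob = (area_factor / area_factor.sum()).ravel()
n_points = 500
cells = rng.choice(prob.size, size=n_points, p=prob)
rows, cols = np.unravel_index(cells, z.shape)
points_xy = np.column_stack([x[cols], y[rows]]) + rng.uniform(0, dx, size=(n_points, 2))
print("simulated", len(points_xy), "null points weighted by surface area")
```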

  18. SMART-COP: a tool for predicting the need for intensive respiratory or vasopressor support in community-acquired pneumonia.

    PubMed

    Charles, Patrick G P; Wolfe, Rory; Whitby, Michael; Fine, Michael J; Fuller, Andrew J; Stirling, Robert; Wright, Alistair A; Ramirez, Julio A; Christiansen, Keryn J; Waterer, Grant W; Pierce, Robert J; Armstrong, John G; Korman, Tony M; Holmes, Peter; Obrosky, D Scott; Peyrani, Paula; Johnson, Barbara; Hooy, Michelle; Grayson, M Lindsay

    2008-08-01

    Existing severity assessment tools, such as the pneumonia severity index (PSI) and CURB-65 (tool based on confusion, urea level, respiratory rate, blood pressure, and age ≥65 years), predict 30-day mortality in community-acquired pneumonia (CAP) and have limited ability to predict which patients will require intensive respiratory or vasopressor support (IRVS). The Australian CAP Study (ACAPS) was a prospective study of 882 episodes in which each patient had a detailed assessment of severity features, etiology, and treatment outcomes. Multivariate logistic regression was performed to identify features at initial assessment that were associated with receipt of IRVS. These results were converted into a simple points-based severity tool that was validated in 5 external databases, totaling 7464 patients. In ACAPS, 10.3% of patients received IRVS, and the 30-day mortality rate was 5.7%. The features statistically significantly associated with receipt of IRVS were low systolic blood pressure (2 points), multilobar chest radiography involvement (1 point), low albumin level (1 point), high respiratory rate (1 point), tachycardia (1 point), confusion (1 point), poor oxygenation (2 points), and low arterial pH (2 points): SMART-COP. A SMART-COP score of ≥3 points identified 92% of patients who received IRVS, including 84% of patients who did not need immediate admission to the intensive care unit. Accuracy was also high in the 5 validation databases. Sensitivities of PSI and CURB-65 for identifying the need for IRVS were 74% and 39%, respectively. SMART-COP is a simple, practical clinical tool for accurately predicting the need for IRVS that is likely to assist clinicians in determining CAP severity.
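    The point weights and the ≥3 decision rule listed in the abstract translate directly into a small scoring function. The clinical thresholds that define each "low"/"high" criterion are not given in the abstract, so the inputs below are assumed to be pre-evaluated booleans.

```python
# SMART-COP point weights as listed in the abstract; a score >= 3 flags
# likely need for intensive respiratory or vasopressor support (IRVS).
SMART_COP_POINTS = {
    "low_systolic_bp": 2,
    "multilobar_cxr_involvement": 1,
    "low_albumin": 1,
    "high_respiratory_rate": 1,
    "tachycardia": 1,
    "confusion": 1,
    "poor_oxygenation": 2,
    "low_arterial_ph": 2,
}

def smart_cop_score(findings: dict) -> int:
    """Sum the points for each criterion the patient meets."""
    return sum(pts for name, pts in SMART_COP_POINTS.items() if findings.get(name, False))

patient = {"high_respiratory_rate": True, "poor_oxygenation": True, "confusion": False}
score = smart_cop_score(patient)
print(score, "-> IRVS likely needed" if score >= 3 else "-> lower risk")
```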

  19. Transfers between libration-point orbits in the elliptic restricted problem

    NASA Astrophysics Data System (ADS)

    Hiday-Johnston, L. A.; Howell, K. C.

    1994-04-01

    A strategy is formulated to design optimal time-fixed impulsive transfers between three-dimensional libration-point orbits in the vicinity of the interior L1 libration point of the Sun-Earth/Moon barycenter system. The adjoint equation in terms of rotating coordinates in the elliptic restricted three-body problem is shown to be of a distinctly different form from that obtained in the analysis of trajectories in the two-body problem. Also, the necessary conditions for a time-fixed two-impulse transfer to be optimal are stated in terms of the primer vector. Primer vector theory is then extended to nonoptimal impulsive trajectories in order to establish a criterion whereby the addition of an interior impulse reduces total fuel expenditure. The necessary conditions for the local optimality of a transfer containing additional impulses are satisfied by requiring continuity of the Hamiltonian and the derivative of the primer vector at all interior impulses. Determination of location, orientation, and magnitude of each additional impulse is accomplished by the unconstrained minimization of the cost function using a multivariable search method. Results indicate that substantial savings in fuel can be achieved by the addition of interior impulsive maneuvers on transfers between libration-point orbits.

  20. The Supplemental Nutrition Assistance Program, Food Insecurity, Dietary Quality, and Obesity Among U.S. Adults.

    PubMed

    Nguyen, Binh T; Shuval, Kerem; Bertmann, Farryl; Yaroch, Amy L

    2015-07-01

    We examined whether Supplemental Nutrition Assistance Program (SNAP) participation changes associations between food insecurity, dietary quality, and weight among US adults. We analyzed adult dietary intake data (n = 8333) from the 2003 to 2010 National Health and Nutrition Examination Survey. Bivariate and multivariable methods assessed associations of SNAP participation and 4 levels of food security with diet and weight. Measures of dietary quality were the Healthy Eating Index 2010, total caloric intake, empty calories, and solid fat; weight measures were body mass index (BMI), overweight, and obesity. SNAP participants with marginal food security had lower BMI (1.83 kg/m2; P < .01) and lower probability of obesity (9 percentage points; P < .05). SNAP participants with marginal (3.46 points; P < .01), low (1.98 points; P < .05), and very low (3.84 points; P < .01) food security had better diets, as illustrated by the Healthy Eating Index. Associations between SNAP participation and improved diet and weight were stronger among Whites than Blacks and Hispanics. Our research highlights the role of SNAP in helping individuals who are at risk for food insecurity to obtain a healthier diet and better weight status.
