Science.gov

Sample records for aic akaike information

  1. Improving data analysis in herpetology: Using Akaike's information criterion (AIC) to assess the strength of biological hypotheses

    USGS Publications Warehouse

    Mazerolle, M.J.

    2006-01-01

    In ecology, researchers frequently use observational studies to explain a given pattern, such as the number of individuals in a habitat patch, with a large number of explanatory (i.e., independent) variables. To elucidate such relationships, ecologists have long relied on hypothesis testing to include or exclude variables in regression models, although the conclusions often depend on the approach used (e.g., forward, backward, stepwise selection). Though better tools have been available since the mid-1970s, they are still underutilized in certain fields, particularly in herpetology. This is the case for the Akaike information criterion (AIC), which is markedly superior to hypothesis-based approaches for model selection (i.e., variable selection). It is simple to compute and easy to understand, but more importantly, for a given data set, it provides a measure of the strength of evidence for each model that represents a plausible biological hypothesis relative to the entire set of models considered. Using this approach, one can then compute a weighted average of the estimate and standard error for any given variable of interest across all the models considered. This procedure, termed model-averaging or multimodel inference, yields precise and robust estimates. In this paper, I illustrate the use of the AIC in model selection and inference, as well as the interpretation of results analysed in this framework, with two real herpetological data sets. The AIC and measures derived from it should be routinely adopted by herpetologists. © Koninklijke Brill NV 2006.
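    As an illustration of the workflow described in this abstract, the sketch below computes AIC, delta-AIC, Akaike weights, and a model-averaged coefficient for a small candidate set. It is a minimal example with invented log-likelihoods, parameter counts, and coefficient estimates, not the paper's herpetological data.

    ```python
    # Minimal AIC / Akaike-weight / model-averaging sketch (illustrative numbers only).
    import numpy as np

    log_liks = np.array([-212.4, -210.9, -210.1])   # maximized log-likelihoods (assumed)
    n_params = np.array([2, 3, 4])                  # estimated parameters per model (assumed)
    betas    = np.array([0.52, 0.47, 0.45])         # estimate of one shared coefficient (assumed)

    aic   = -2 * log_liks + 2 * n_params            # AIC_i = -2 ln L_i + 2 k_i
    delta = aic - aic.min()                         # delta-AIC relative to the best model
    w     = np.exp(-0.5 * delta)
    w    /= w.sum()                                 # Akaike weights (sum to 1)

    beta_avg = np.sum(w * betas)                    # model-averaged estimate of the coefficient
    for i, (a, d, wi) in enumerate(zip(aic, delta, w), start=1):
        print(f"model {i}: AIC={a:7.1f}  dAIC={d:5.1f}  weight={wi:.2f}")
    print(f"model-averaged coefficient: {beta_avg:.3f}")
    ```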

  2. Model Selection and Psychological Theory: A Discussion of the Differences between the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC)

    ERIC Educational Resources Information Center

    Vrieze, Scott I.

    2012-01-01

    This article reviews the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) in model selection and the appraisal of psychological theory. The focus is on latent variable models, given their growing use in theory testing and construction. Theoretical statistical results in regression are discussed, and more important…

  3. Model selection and psychological theory: A discussion of the differences between the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC)

    PubMed Central

    Vrieze, Scott I.

    2012-01-01

    This article reviews the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) in model selection and the appraisal of psychological theory. The focus is on latent variable models given their growing use in theory testing and construction. We discuss theoretical statistical results in regression and illustrate more important issues with novel simulations involving latent variable models including factor analysis, latent profile analysis, and factor mixture models. Asymptotically, the BIC is consistent, in that it will select the true model if, among other assumptions, the true model is among the candidate models considered. The AIC is not consistent under these circumstances. When the true model is not in the candidate model set the AIC is efficient, in that it will asymptotically choose whichever model minimizes the mean squared error of prediction/estimation. The BIC is not efficient under these circumstances. Unlike the BIC, the AIC also has a minimax property, in that it can minimize the maximum possible risk in finite sample sizes. In sum, the AIC and BIC have quite different properties that require different assumptions, and applied researchers and methodologists alike will benefit from improved understanding of the asymptotic and finite-sample behavior of these criteria. The ultimate decision to use AIC or BIC depends on many factors, including: the loss function employed, the study's methodological design, the substantive research question, and the notion of a true model and its applicability to the study at hand. PMID:22309957
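    For reference, the two criteria discussed in this abstract are conventionally defined as follows, where \hat{L} is the maximized likelihood of a candidate model, k its number of freely estimated parameters, and n the sample size:

        \mathrm{AIC} = -2\ln\hat{L} + 2k, \qquad \mathrm{BIC} = -2\ln\hat{L} + k\ln n .

    The BIC penalty grows with n, which underlies its consistency; the fixed penalty of 2 per parameter underlies the AIC's efficiency and minimax behaviour described above.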

  4. Akaike information criterion to select well-fit resist models

    NASA Astrophysics Data System (ADS)

    Burbine, Andrew; Fryer, David; Sturtevant, John

    2015-03-01

    In the field of model design and selection, there is always a risk that a model is over-fit to the data used to train it. A model is well suited when it describes the physical system and not the stochastic behavior of the particular data collected. K-fold cross validation is a method to check for this potential over-fitting by calibrating on k folds of the data, typically between 4 and 10. Model training is a computationally expensive operation, however, and given a wide choice of candidate models, calibrating each one repeatedly becomes prohibitively time consuming. The Akaike information criterion (AIC) is an information-theoretic approach to model selection based on the maximized log-likelihood for a given model that needs only a single calibration per model. It is used in this study to demonstrate model ranking and selection among compact resist modelforms that have various numbers and types of terms to describe photoresist behavior. It is shown that there is a good correspondence of AIC to K-fold cross validation in selecting the best modelform, and it is further shown that over-fitting is, in most cases, not indicated. In modelforms with more than 40 fitting parameters, the size of the calibration data set benefits from additional parameters, statistically validating the model complexity.
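    The following self-contained sketch illustrates the kind of comparison the abstract describes, ranking nested polynomial models by single-calibration AIC and by K-fold cross validation; the synthetic data and least-squares model forms are placeholders, not the authors' compact resist models.

    ```python
    # Compare AIC ranking with K-fold cross-validation ranking for nested
    # polynomial models fit by least squares (synthetic data, illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 120)
    y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.1, x.size)  # "true" model is quadratic
    n, k_folds = x.size, 5

    def rss(x_train, y_train, x_test, y_test, deg):
        """Residual sum of squares on the test split for a degree-`deg` fit."""
        coeffs = np.polyfit(x_train, y_train, deg)
        return np.sum((np.polyval(coeffs, x_test) - y_test) ** 2)

    for deg in range(1, 6):
        # single-calibration AIC, Gaussian-error least-squares form
        aic = n * np.log(rss(x, y, x, y, deg) / n) + 2 * (deg + 1)
        # K-fold cross-validation mean squared prediction error
        fold = np.arange(n) % k_folds
        cv_mse = sum(rss(x[fold != f], y[fold != f], x[fold == f], y[fold == f], deg)
                     for f in range(k_folds)) / n
        print(f"degree {deg}: AIC = {aic:8.1f}   CV-MSE = {cv_mse:.5f}")
    ```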

  5. Assessing Fit and Dimensionality in Least Squares Metric Multidimensional Scaling Using Akaike's Information Criterion

    ERIC Educational Resources Information Center

    Ding, Cody S.; Davison, Mark L.

    2010-01-01

    Akaike's information criterion is suggested as a tool for evaluating fit and dimensionality in metric multidimensional scaling that uses least squares methods of estimation. This criterion combines the least squares loss function with the number of estimated parameters. Numerical examples are presented. The results from analyses of both simulation…

  6. Linear and curvilinear correlations of brain gray matter volume and density with age using voxel-based morphometry with the Akaike information criterion in 291 healthy children.

    PubMed

    Taki, Yasuyuki; Hashizume, Hiroshi; Thyreau, Benjamin; Sassa, Yuko; Takeuchi, Hikaru; Wu, Kai; Kotozaki, Yuka; Nouchi, Rui; Asano, Michiko; Asano, Kohei; Fukuda, Hiroshi; Kawashima, Ryuta

    2013-08-01

    We examined linear and curvilinear correlations of gray matter volume and density in cortical and subcortical gray matter with age using magnetic resonance images (MRI) in a large number of healthy children. We applied voxel-based morphometry (VBM) and region-of-interest (ROI) analyses with the Akaike information criterion (AIC), which was used to determine the best-fit model by selecting which predictor terms should be included. We collected data on brain structural MRI in 291 healthy children aged 5-18 years. Structural MRI data were segmented and normalized using a custom template by applying the diffeomorphic anatomical registration using exponentiated lie algebra (DARTEL) procedure. Next, we analyzed the correlations of gray matter volume and density with age in VBM with AIC by estimating linear, quadratic, and cubic polynomial functions. Several regions such as the prefrontal cortex, the precentral gyrus, and cerebellum showed significant linear or curvilinear correlations between gray matter volume and age on an increasing trajectory, and between gray matter density and age on a decreasing trajectory in VBM and ROI analyses with AIC. Because the trajectory of gray matter volume and density with age suggests the progress of brain maturation, our results may contribute to clarifying brain maturation in healthy children from the viewpoint of brain structure. PMID:22505237

  7. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.

    PubMed

    Zanini, Andrea; Woodbury, Allan D

    2016-01-01

    The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The results are discussed and compared to the geostatistical approach of Kitanidis (1995). PMID:26836200

  8. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion

    NASA Astrophysics Data System (ADS)

    Zanini, Andrea; Woodbury, Allan D.

    2016-02-01

    The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The results are discussed and compared to the geostatistical approach of Kitanidis (1995).

  9. Model Selection Information Criteria for Non-Nested Latent Class Models.

    ERIC Educational Resources Information Center

    Lin, Ting Hsiang; Dayton, C. Mitchell

    1997-01-01

    The use of three model selection information criteria for latent class models was studied for non-nested models: (1) Akaike's information criterion (H. Akaike, 1973) (AIC); (2) the Schwarz information criterion (G. Schwarz, 1978) (SIC); and (3) the Bozdogan version of the AIC (CAIC) (H. Bozdogan, 1987). Situations in which each is preferable…

  10. Multidimensional Rasch Model Information-Based Fit Index Accuracy

    ERIC Educational Resources Information Center

    Harrell-Williams, Leigh M.; Wolfe, Edward W.

    2013-01-01

    Most research on confirmatory factor analysis using information-based fit indices (Akaike information criterion [AIC], Bayesian information criteria [BIC], bias-corrected AIC [AICc], and consistent AIC [CAIC]) has used a structural equation modeling framework. Minimal research has been done concerning application of these indices to item response…

  11. An Evaluation of Information Criteria Use for Correct Cross-Classified Random Effects Model Selection

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Murphy, Daniel L.

    2013-01-01

    The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannon and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…

  12. AIC, BIC, Bayesian evidence against the interacting dark energy model

    NASA Astrophysics Data System (ADS)

    Szydłowski, Marek; Krawiec, Adam; Kurek, Aleksandra; Kamionka, Michał

    2015-01-01

    Recent astronomical observations have indicated that the Universe is in a phase of accelerated expansion. While there are many cosmological models which try to explain this phenomenon, we focus on the interacting ΛCDM model, where an interaction between the dark energy and dark matter sectors takes place. This model is compared to its simpler alternative—the ΛCDM model. To choose between these models, the likelihood ratio test was applied as well as model comparison methods (employing Occam's principle): the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and the Bayesian evidence. Using the current astronomical data: type Ia supernovae (Union2.1), H(z), baryon acoustic oscillations, the Alcock-Paczynski test, and the cosmic microwave background data, we evaluated both models. The analyses based on the AIC indicated that there is less support for the interacting ΛCDM model when compared to the ΛCDM model, while those based on the BIC indicated that there is strong evidence against it in favor of the ΛCDM model. Given the weak or almost non-existent support for the interacting ΛCDM model, and bearing in mind Occam's razor, we are inclined to reject this model.

  13. Derivation of 3-D surface deformation from an integration of InSAR and GNSS measurements based on Akaike's Bayesian Information Criterion

    NASA Astrophysics Data System (ADS)

    Luo, Haipeng; Liu, Yang; Chen, Ting; Xu, Caijun; Wen, Yangmao

    2016-01-01

    We present a new method to derive 3-D surface deformation from an integration of interferometric synthetic aperture radar (InSAR) images and Global Navigation Satellite System (GNSS) observations based on Akaike's Bayesian Information Criterion (ABIC), considering the relationship between deformations at neighbouring locations. This method avoids interpolation errors by not interpolating the GNSS observations to the same spatial resolution as the InSAR images, and it harnesses the data sets and the prior smoothness constraints on surface deformation objectively and simultaneously by using ABIC, issues which were inherently unresolved in previous studies. In particular, we define a surface roughness measure of the degree of smoothing to evaluate the performance of the prior constraints, and we deduce the formula for the covariance of the estimation errors to quantify the uncertainty of the modelled solution. We validate this method using synthetic tests and the 2008 Mw 7.9 Wenchuan earthquake. We find that the optimal weights associated with the ABIC minimum are generally at trade-off locations that balance contributions from the InSAR and GNSS data sets and the prior constraints. We use this method to evaluate the influence of interpolation errors from the Ordinary Kriging algorithm on the derivation of surface deformation. Tests show that the interpolation errors may bias the weights imposed on Kriged GNSS data towards very large values, suggesting that fixing the relative weights is required in this case. We also make a comparison with the SISTEM method, indicating that our method obtains better estimates even with sparse GNSS observations. In addition, this method can be generalized to provide a solution for situations where some types of data sets are lacking, and it can be exploited further to account for data sets such as displacements along radar lines and offsets along satellite tracks.

  14. Autonomic Intelligent Cyber Sensor (AICS) Version 1.0.1

    SciTech Connect

    2015-03-01

    The Autonomic Intelligent Cyber Sensor (AICS) provides cyber security and industrial network state awareness for Ethernet based control network implementations. The AICS utilizes collaborative mechanisms based on Autonomic Research and a Service Oriented Architecture (SOA) to: 1) identify anomalous network traffic; 2) discover network entity information; 3) deploy deceptive virtual hosts; and 4) implement self-configuring modules. AICS achieves these goals by dynamically reacting to the industrial human-digital ecosystem in which it resides. Information is transported internally and externally on a standards based, flexible two-level communication structure.

  15. Autonomic Intelligent Cyber Sensor (AICS) Version 1.0.1

    2015-03-01

    The Autonomic Intelligent Cyber Sensor (AICS) provides cyber security and industrial network state awareness for Ethernet based control network implementations. The AICS utilizes collaborative mechanisms based on Autonomic Research and a Service Oriented Architecture (SOA) to: 1) identify anomalous network traffic; 2) discover network entity information; 3) deploy deceptive virtual hosts; and 4) implement self-configuring modules. AICS achieves these goals by dynamically reacting to the industrial human-digital ecosystem in which it resides. Information is transported internally and externally on a standards based, flexible two-level communication structure.

  16. Dynamic microphones M-87/AIC and M-101/AIC and earphone H-143/AIC. [for space shuttle

    NASA Technical Reports Server (NTRS)

    Reiff, F. H.

    1975-01-01

    The electrical characteristics of the M-87/AIC and M-101/AIC dynamic microphones and H-143 earphones were tested for the purpose of establishing the relative performance levels of units supplied by four vendors. The microphones and earphones were tested for frequency response, sensitivity, linearity, impedance and noise cancellation. Test results are presented and discussed.

  17. Information criteria and selection of vibration models.

    PubMed

    Ruzek, Michal; Guyader, Jean-Louis; Pézerat, Charles

    2014-12-01

    This paper presents a method of determining an appropriate equation of motion of two-dimensional plane structures like membranes and plates from vibration response measurements. The local steady-state vibration field is used as input for the inverse problem that approximately determines the dispersion curve of the structure. This dispersion curve is then statistically treated with Akaike information criterion (AIC), which compares the experimentally measured curve to several candidate models (equations of motion). The model with the lowest AIC value is then chosen, and the utility of other models can also be assessed. This method is applied to three experimental case studies: A red cedar wood plate for musical instruments, a thick paper subjected to unknown membrane tension, and a thick composite sandwich panel. These three cases give three different situations of a model selection. PMID:25480053

  18. Regularization Parameter Selections via Generalized Information Criterion

    PubMed Central

    Zhang, Yiyun; Li, Runze; Tsai, Chih-Ling

    2009-01-01

    We apply the nonconcave penalized likelihood approach to obtain variable selections as well as shrinkage estimators. This approach relies heavily on the choice of regularization parameter, which controls the model complexity. In this paper, we propose employing the generalized information criterion (GIC), encompassing the commonly used Akaike information criterion (AIC) and Bayesian information criterion (BIC), for selecting the regularization parameter. Our proposal makes a connection between the classical variable selection criteria and the regularization parameter selections for the nonconcave penalized likelihood approaches. We show that the BIC-type selector enables identification of the true model consistently, and the resulting estimator possesses the oracle property in the terminology of Fan and Li (2001). In contrast, however, the AIC-type selector tends to overfit with positive probability. We further show that the AIC-type selector is asymptotically loss efficient, while the BIC-type selector is not. Our simulation results confirm these theoretical findings, and an empirical example is presented. Some technical proofs are given in the online supplementary material. PMID:20676354
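    As a rough sketch of the family of criteria involved (a generic form, not necessarily the paper's exact notation), the regularization parameter λ is selected by minimizing a penalized fit measure

        \mathrm{GIC}_{\kappa_n}(\lambda) = -2\,\ell(\hat{\beta}_\lambda) + \kappa_n\,\mathrm{df}(\lambda),

    where \ell(\hat{\beta}_\lambda) is the log-likelihood at the penalized estimate, \mathrm{df}(\lambda) counts the selected (nonzero) coefficients, and \kappa_n = 2 gives an AIC-type selector while \kappa_n = \log n gives a BIC-type selector, matching the overfitting and consistency behaviour described above.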

  19. Automatic picking based on an AR-AIC cost-function approach applied to tele-, regional- and induced seismic datasets

    NASA Astrophysics Data System (ADS)

    Olbert, Kai; Meier, Thomas; Cristiano, Luigia

    2015-04-01

    A quick picking procedure is an important tool to process large datasets in seismology. Identifying phases and determining the precise onset times at seismological stations is essential not just for localization procedures but also for seismic body-wave tomography. The automated picking procedure should be fast, robust, precise and consistent. In manual processing the speed and consistency are not guaranteed and therefore unreproducible errors may be introduced, especially for large amounts of data. In this work an offline P- and S-phase picker based on an autoregressive-prediction approach is optimized and applied to different data sets. The onset time can be described as the sum of the event source time, the theoretical travel time according to a reference velocity model and a deviation from the theoretical travel time due to lateral heterogeneity or errors in the source location. With this approach the onset time at each station can be found around the theoretical travel time within a time window smaller than the maximum lateral heterogeneity. Around the theoretical travel time an autoregressive prediction error is calculated from one or several components as the characteristic function (CF) of the waveform. The minimum of the Akaike-Information-Criterion of the characteristic function identifies the phase. As was shown by Küperkoch et al. (2012), the Akaike-Information-Criterion has a tendency to pick too late. Therefore, an additional processing step for precise picking is needed. In the vicinity of the minimum of the Akaike-Information-Criterion a cost function is defined and used to find the optimal estimate of the arrival time. The cost function is composed of the CF and three side conditions. The idea behind the use of a cost function is to find the phase pick in the last minimum before the CF rises due to the phase onset. The final onset time is picked at the minimum of the cost function. The automatic picking procedure is applied to datasets recorded at stations of the
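    As a concrete point of reference, the sketch below implements a simple single-trace, variance-based AIC onset picker (the widely used Maeda-style formulation); it is not the authors' AR-prediction cost-function approach, and the synthetic trace and onset are illustrative.

    ```python
    # Variance-based AIC onset picker for a single trace (Maeda-style formulation).
    import numpy as np

    def aic_pick(trace):
        """Return the sample index that minimizes AIC(k) over the trace."""
        n = trace.size
        aic = np.full(n, np.inf)
        for k in range(2, n - 1):
            var_before = np.var(trace[:k])     # variance of the leading segment
            var_after = np.var(trace[k:])      # variance of the trailing segment
            if var_before > 0.0 and var_after > 0.0:
                aic[k] = k * np.log(var_before) + (n - k - 1) * np.log(var_after)
        return int(np.argmin(aic))

    rng = np.random.default_rng(1)
    noise = rng.normal(0.0, 0.2, 300)
    onset = np.concatenate([np.zeros(180), np.sin(np.linspace(0.0, 20.0, 120))])
    trace = noise + onset                      # true onset near sample 180
    print("picked onset sample:", aic_pick(trace))
    ```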

  20. Depth-map and albedo estimation with superior information-theoretic performance

    NASA Astrophysics Data System (ADS)

    Harrison, Adam P.; Joseph, Dileepan

    2015-02-01

    Lambertian photometric stereo (PS) is a seminal computer vision method. However, using depth maps in the image formation model, instead of surface normals as in PS, reduces model parameters by a third, making it preferred from an information-theoretic perspective. The Akaike information criterion (AIC) quantifies this trade-off between goodness of fit and overfitting. Obtaining superior AIC values requires an effective maximum likelihood (ML) depth-map & albedo estimation method. Recently, the authors published an ML estimation method that uses a two-step approach based on PS. While effective, approximations of noise distributions and decoupling of depth-map & albedo estimation have limited its accuracy. Overcoming these limitations, this paper presents an ML method operating directly on images. The previous two-step ML method provides a robust initial solution, which kick starts a new nonlinear estimation process. An innovative formulation of the estimation task, including a separable nonlinear least-squares approach, reduces the computational burden of the optimization process. Experiments demonstrate visual improvements under noisy conditions by avoiding overfitting. As well, a comprehensive analysis shows that refined depth maps & albedos produce superior AIC metrics and enjoy better predictive accuracy than with literature methods. The results indicate that the new method is a promising means for depth-map & albedo estimation with superior information-theoretic performance.

  1. Advances on BYY harmony learning: information theoretic perspective, generalized projection geometry, and independent factor autodetermination.

    PubMed

    Xu, Lei

    2004-07-01

    The nature of Bayesian Ying-Yang harmony learning is reexamined from an information theoretic perspective. Not only is its ability for model selection and regularization explained with new insights, but its relations to and differences from studies of minimum description length (MDL), the Bayesian approach, the bits-back based MDL, the Akaike information criterion (AIC), maximum likelihood, information geometry, Helmholtz machines, and variational approximation are also discussed. Moreover, a generalized projection geometry is introduced for further understanding this new mechanism. Furthermore, new algorithms are developed for implementing Gaussian factor analysis (FA) and non-Gaussian factor analysis (NFA) such that appropriate factors are selected automatically during parameter learning. PMID:15461081

  2. Mission science value-cost savings from the Advanced Imaging Communication System (AICS)

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1984-01-01

    An Advanced Imaging Communication System (AICS) was proposed in the mid-1970s as an alternative to the Voyager data/communication system architecture. The AICS achieved virtually error-free communication with little loss in the downlink data rate by concatenating a powerful Reed-Solomon block code with the Voyager convolutionally coded, Viterbi-decoded downlink channel. The clean channel allowed AICS to apply sophisticated adaptive data compression techniques. Both the Voyager and Galileo missions have implemented AICS components, and the concatenated channel itself is heading for international standardization. An analysis that assigns a dollar value/cost savings to AICS mission performance gains is presented. A conservative value or savings of $3 million for Voyager, $4.5 million for Galileo, and as much as $7 to 9.5 million per mission for future projects such as the proposed Mariner Mark II series is shown.

  3. A Comparative Study of Information-Based Source Number Estimation Methods and Experimental Validations on Mechanical Systems

    PubMed Central

    Cheng, Wei; Zhang, Zhousuo; Cao, Hongrui; He, Zhengjia; Zhu, Guanwen

    2014-01-01

    This paper investigates one eigenvalue decomposition-based source number estimation method and three information-based source number estimation methods, namely the Akaike Information Criterion (AIC), Minimum Description Length (MDL) and Bayesian Information Criterion (BIC), and improves the BIC into an Improved BIC (IBIC) that is more efficient and easier to calculate. The performances of the abovementioned source number estimation methods are studied comparatively with numerical case studies, which contain a linear superposition case and a combined linear superposition and nonlinear modulation mixing case. A test bed with three sound sources is constructed to test the performances of these methods on mechanical systems, and source separation is carried out to validate the effectiveness of the experimental studies. This work can benefit model order selection, complexity analysis of a system, and applications of source separation to mechanical systems for condition monitoring and fault diagnosis purposes. PMID:24776935
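    For orientation, the classical eigenvalue-based criteria of this kind (the Wax-Kailath formulation; the paper's variants, including the IBIC, may differ in detail) estimate the source number as the k minimizing

        \mathrm{AIC}(k) = -2N(p-k)\,\ln\!\left[\frac{\bigl(\prod_{i=k+1}^{p}\lambda_i\bigr)^{1/(p-k)}}{\frac{1}{p-k}\sum_{i=k+1}^{p}\lambda_i}\right] + 2k(2p-k),
        \qquad
        \mathrm{MDL}(k) = -N(p-k)\,\ln\!\left[\frac{\bigl(\prod_{i=k+1}^{p}\lambda_i\bigr)^{1/(p-k)}}{\frac{1}{p-k}\sum_{i=k+1}^{p}\lambda_i}\right] + \tfrac{1}{2}k(2p-k)\ln N,

    where \lambda_1 \ge \dots \ge \lambda_p are the eigenvalues of the sample covariance matrix formed from N snapshots on p channels.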

  4. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    NASA Astrophysics Data System (ADS)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) under a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria that are widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator and has better performance at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. We therefore propose an SNR-adaptive method, based on subspace analysis and a trained genetic algorithm, that attains the performance of both. Moreover, our method uses only a single antenna, unlike previous methods, which reduces hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the efficiency of the method.

  5. A novel hybrid dimension reduction technique for undersized high dimensional gene expression data sets using information complexity criterion for cancer classification.

    PubMed

    Pamukçu, Esra; Bozdogan, Hamparsum; Çalık, Sinan

    2015-01-01

    Gene expression data typically are large, complex, and highly noisy. Their dimension is high, with several thousand genes (i.e., features) but only a limited number of observations (i.e., samples). Although the classical principal component analysis (PCA) method is widely used as a first standard step in dimension reduction and in supervised and unsupervised classification, it suffers from several shortcomings in the case of data sets involving undersized samples, since the sample covariance matrix degenerates and becomes singular. In this paper we address these limitations within the context of probabilistic PCA (PPCA) by introducing and developing a new and novel approach using the maximum entropy covariance matrix and its hybridized smoothed covariance estimators. To reduce the dimensionality of the data and to choose the number of probabilistic PCs (PPCs) to be retained, we further introduce and develop the celebrated Akaike information criterion (AIC), the consistent Akaike information criterion (CAIC), and the information theoretic measure of complexity (ICOMP) criterion of Bozdogan. Six publicly available undersized benchmark data sets were analyzed to show the utility, flexibility, and versatility of our approach with hybridized smoothed covariance matrix estimators, which do not degenerate, in performing the PPCA to reduce the dimension and to carry out supervised classification of cancer groups in high dimensions. PMID:25838836

  6. The T cell-selective IL-2 mutant AIC284 mediates protection in a rat model of Multiple Sclerosis.

    PubMed

    Weishaupt, Andreas; Paulsen, Daniela; Werner, Sandra; Wolf, Nelli; Köllner, Gabriele; Rübsamen-Schaeff, Helga; Hünig, Thomas; Kerkau, Thomas; Beyersdorf, Niklas

    2015-05-15

    Targeting regulatory T cells (Treg cells) with interleukin-2 (IL-2) constitutes a novel therapeutic approach for autoimmunity. As anti-cancer therapy with IL-2 has revealed substantial toxicities, a mutated human IL-2 molecule, termed AIC284 (formerly BAY 50-4798), has been developed to reduce these side effects. To assess whether AIC284 is efficacious in autoimmunity, we studied its therapeutic potential in an animal model for Multiple Sclerosis. Treatment of Lewis rats with AIC284 increased Treg cell numbers and protected the rats from Experimental Autoimmune Encephalomyelitis (EAE). AIC284 might, thus, also efficiently prevent progression of autoimmune diseases in humans. PMID:25903730

  7. The Development of the Extended Adolescent Injury Checklist (E-AIC): A Measure for Injury Prevention Program Evaluation

    ERIC Educational Resources Information Center

    Chapman, Rebekah; Buckley, Lisa; Sheehan, Mary

    2011-01-01

    The Extended Adolescent Injury Checklist (E-AIC), a self-report measure of injury based on the model of the Adolescent Injury Checklist (AIC), was developed for use in the evaluation of school-based interventions. The three stages of this development involved focus groups with adolescents and consultations with medical staff, pilot testing of the…

  8. Test procedures, AN/AIC-27 system and component units. [for space shuttle

    NASA Technical Reports Server (NTRS)

    Reiff, F. H.

    1975-01-01

    The AN/AIC-27 (v) intercommunication system is a 30-channel audio distribution system which consists of: air crew station units, maintenance station units, and a central control unit. A test procedure for each of the above units, as well as a test procedure for the system, is presented. The intent of the tests is to provide data for use in shuttle audio subsystem design.

  9. AIC649 Induces a Bi-Phasic Treatment Response in the Woodchuck Model of Chronic Hepatitis B

    PubMed Central

    Paulsen, Daniela; Weber, Olaf; Ruebsamen-Schaeff, Helga; Tennant, Bud C.; Menne, Stephan

    2015-01-01

    AIC649 has been shown to directly address the antigen presenting cell arm of the host immune defense, leading to a regulated cytokine release and activation of T cell responses. In the present study we analyzed the antiviral efficacy of AIC649 as well as its potential to induce functional cure in animal models for chronic hepatitis B. Hepatitis B virus transgenic mice and chronically woodchuck hepatitis virus (WHV)-infected woodchucks were treated with AIC649, respectively. In the mouse system AIC649 decreased the hepatitis B virus titer as effectively as the “gold standard”, Tenofovir. Interestingly, AIC649-treated chronically WHV-infected woodchucks displayed a bi-phasic pattern of response: The marker for functional cure—hepatitis surface antigen—first increased but subsequently decreased even after cessation of treatment to significantly reduced levels. We hypothesize that the observed bi-phasic response pattern to AIC649 treatment reflects a physiologically “concerted”, reconstituted immune response against WHV and therefore may indicate a potential for inducing functional cure in HBV-infected patients. PMID:26656974

  10. AIC649 Induces a Bi-Phasic Treatment Response in the Woodchuck Model of Chronic Hepatitis B.

    PubMed

    Paulsen, Daniela; Weber, Olaf; Ruebsamen-Schaeff, Helga; Tennant, Bud C; Menne, Stephan

    2015-01-01

    AIC649 has been shown to directly address the antigen presenting cell arm of the host immune defense, leading to a regulated cytokine release and activation of T cell responses. In the present study we analyzed the antiviral efficacy of AIC649 as well as its potential to induce functional cure in animal models for chronic hepatitis B. Hepatitis B virus transgenic mice and chronically woodchuck hepatitis virus (WHV)-infected woodchucks were treated with AIC649, respectively. In the mouse system AIC649 decreased the hepatitis B virus titer as effectively as the "gold standard", Tenofovir. Interestingly, AIC649-treated chronically WHV-infected woodchucks displayed a bi-phasic pattern of response: The marker for functional cure--hepatitis surface antigen--first increased but subsequently decreased even after cessation of treatment to significantly reduced levels. We hypothesize that the observed bi-phasic response pattern to AIC649 treatment reflects a physiologically "concerted", reconstituted immune response against WHV and therefore may indicate a potential for inducing functional cure in HBV-infected patients. PMID:26656974

  11. Use of the AIC with the EM algorithm: A demonstration of a probability model selection technique

    SciTech Connect

    Glosup, J.G.; Axelrod, M.C.

    1994-11-15

    The problem of discriminating between two potential probability models, a Gaussian distribution and a mixture of Gaussian distributions, is considered. The focus of our interest is a case where the models are potentially non-nested and the parameters of the mixture model are estimated through the EM algorithm. The AIC, which is frequently used as a criterion for discriminating between non-nested models, is modified to work with the EM algorithm and is shown to provide a model selection tool for this situation. A particular problem involving an infinite mixture distribution known as Middleton's Class A model is used to demonstrate the effectiveness and limitations of this method.
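    A minimal sketch of the same kind of comparison, using scikit-learn's EM-based GaussianMixture rather than the authors' modified AIC or the Middleton Class A setting: fit a single Gaussian and a two-component mixture to the same data and compare their AIC values.

    ```python
    # Fit 1- and 2-component Gaussian models by EM and compare AIC (synthetic data).
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(2)
    # synthetic sample actually drawn from a two-component mixture
    x = np.concatenate([rng.normal(0.0, 1.0, 400),
                        rng.normal(4.0, 1.0, 200)]).reshape(-1, 1)

    for k in (1, 2):
        gm = GaussianMixture(n_components=k, random_state=0).fit(x)   # EM estimation
        print(f"{k}-component model: AIC = {gm.aic(x):.1f}")
    # the lower AIC should point to the two-component mixture here
    ```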

  12. Perturbation of energy metabolism by fatty-acid derivative AIC-47 and imatinib in BCR-ABL-harboring leukemic cells.

    PubMed

    Shinohara, Haruka; Kumazaki, Minami; Minami, Yosuke; Ito, Yuko; Sugito, Nobuhiko; Kuranaga, Yuki; Taniguchi, Kohei; Yamada, Nami; Otsuki, Yoshinori; Naoe, Tomoki; Akao, Yukihiro

    2016-02-01

    In Ph-positive leukemia, imatinib brought marked clinical improvement; however, further improvement is needed to prevent relapse. Cancer cells efficiently use limited energy sources, and drugs targeting cellular metabolism improve the efficacy of therapy. In this study, we characterized the effects of the novel anti-cancer fatty-acid derivative AIC-47 and imatinib, focusing on cancer-specific energy metabolism in chronic myeloid leukemia cells. AIC-47 and imatinib in combination exhibited a significant synergistic cytotoxicity. Imatinib inhibited only the phosphorylation of BCR-ABL, whereas AIC-47 suppressed the expression of the protein itself. Both AIC-47 and imatinib modulated the expression of pyruvate kinase M (PKM) isoforms from PKM2 to PKM1 through the down-regulation of polypyrimidine tract-binding protein 1 (PTBP1). PTBP1 functions as an alternative splicing repressor of PKM1, resulting in expression of PKM2, which is an inactive form of pyruvate kinase for the last step of glycolysis. Although inactivation of BCR-ABL by imatinib strongly suppressed glycolysis, compensatory activation of fatty-acid oxidation (FAO) supported glucose-independent cell survival by up-regulating CPT1C, the rate-limiting FAO enzyme. In contrast, AIC-47 inhibited the expression of CPT1C and directly inhibited fatty-acid metabolism. These findings were also observed in the CD34(+) fraction of Ph-positive acute lymphoblastic leukemia cells. These results suggest that AIC-47 in combination with imatinib strengthened the attack on cancer energy metabolism, in terms of both glycolysis and the compensatory activation of FAO. PMID:26607903

  13. Use of the AIC with the EM algorithm: A demonstration of a probability model selection technique

    SciTech Connect

    Glosup, J.G.; Axelrod, M.C.

    1994-08-12

    The problem of discriminating between two potential probability models, a Gaussian distribution and a mixture of Gaussian distributions, is considered. The focus of interest is a case where the models are potentially non-nested and the parameters of the mixture model are estimated through the EM algorithm. The AIC, which is frequently used as a criterion for discriminating between non-nested models, is modified to work with the EM algorithm and is shown to provide a model selection tool for this situation. A particular problem involving an infinite mixture distribution known as Middleton's Class A model is used to demonstrate the effectiveness and limitations of this method. The problem involves a probability model for underwater noise due to distant shipping.

  14. Tightening the Noose on LMXB Formation of MSPs: Need for AIC ?

    NASA Astrophysics Data System (ADS)

    Grindlay, J. E.; Yi, I.

    1997-12-01

    The origin of millisecond pulsars (MSPs) remains an outstanding problem despite the early and considerable evidence that they are the descendants of neutron stars spun up by accretion in low mass X-ray binaries (LMXBs). The route to MSPs from LMXBs may pass through the high luminosity Z-source LMXBs but is severely limited by the very small population (and apparent birth rate) of available Z-sources. The more numerous X-ray bursters, the Atoll sources, are likely to still be too few in numbers or birth rate, and are now also found to be likely inefficient in the spin-up torques they can provide: accretion in these relatively low accretion rate systems is likely dominated by an advection dominated flow, in which matter accretes onto the NS via sub-Keplerian flows that transfer correspondingly less angular momentum to the NS. We investigate the implications of the possible ADAF flows in low luminosity NS-LMXBs and find it unlikely that they can produce MSPs. The standard model can still be allowed if most NS-LMXBs are quiescent and undergo transient-like outbursts similar to the soft X-ray transients (which mostly contain black holes). However, apart from Cen X-4 and Aql X-1, few such systems have been found, and the SXTs appear instead to be significantly deficient in NS systems. Direct production of MSPs by the accretion induced collapse (AIC) of white dwarfs has been previously suggested to solve the MSP vs. LMXB birth rate problem. We re-examine AIC models in light of the new constraints on direct LMXB production and the additional difficulty imposed by ADAF flows and constraints on SXT populations, and derive constraints on the progenitor WD spin and magnetic fields.

  15. Predicting the potential distribution of invasive exotic species using GIS and information-theoretic approaches: A case of ragweed (Ambrosia artemisiifolia L.) distribution in China

    USGS Publications Warehouse

    Chen, H.; Chen, L.; Albright, T.P.

    2007-01-01

    Invasive exotic species pose a growing threat to the economy, public health, and ecological integrity of nations worldwide. Explaining and predicting the spatial distribution of invasive exotic species is of great importance to prevention and early warning efforts. We are investigating the potential distribution of invasive exotic species, the environmental factors that influence these distributions, and the ability to predict them using statistical and information-theoretic approaches. For some species, detailed presence/absence occurrence data are available, allowing the use of a variety of standard statistical techniques. However, for most species, absence data are not available. Presented with the challenge of developing a model based on presence-only information, we developed an improved logistic regression approach using Information Theory and Frequency Statistics to produce a relative suitability map. This paper generated a variety of distributions of ragweed (Ambrosia artemisiifolia L.) from logistic regression models applied to herbarium specimen location data and a suite of GIS layers including climatic, topographic, and land cover information. Our logistic regression model was based on Akaike's Information Criterion (AIC) from a suite of ecologically reasonable predictor variables. Based on the results we provided a new Frequency Statistical method to compartmentalize habitat-suitability in the native range. Finally, we used the model and the compartmentalized criterion developed in native ranges to "project" a potential distribution onto the exotic ranges to build habitat-suitability maps. © Science in China Press 2007.

  16. AN/AIC-22(V) Intercommunications Set (ICS) fiber optic link engineering analysis report

    NASA Astrophysics Data System (ADS)

    Minter, Richard; Blocksom, Roland; Ling, Christopher

    1990-08-01

    Electromagnetic interference (EMI) problems constitute a serious threat to operational Navy aircraft systems. The application of fiber optic technology is a potential solution to these problems. Reported EMI problems in the P-3 patrol aircraft AN/AIC-22(V) Intercommunications System (ICS) were selected from an EMI problem database for investigation and possible application of fiber optic technology. A proof-of-concept experiment was performed to demonstrate the level of EMI immunity of fiber optics when used in an ICS. A full duplex, single-channel fiber optic audio link was designed and assembled from modified government furnished equipment (GFE) previously used in another Navy fiber optic application. The link was taken to the Naval Air Test Center (NATC) Patuxent River, Maryland and temporarily installed in a Naval Research Laboratory (NRL) P-3A aircraft for a side-by-side comparison test with the installed ICS. With regard to noise reduction, the fiber optic link provided a qualitative improvement over the conventional ICS. In an effort to obtain a quantitative measure of comparison, measurements were made over the audio frequency range both with and without operation of the aircraft VHF and UHF radio transmitters.

  17. The characteristic of correspondence analysis estimator to estimate latent variable model method using high-dimensional AIC

    NASA Astrophysics Data System (ADS)

    Bambang Avip Priatna, M.; Lukman, Sumiaty, Encum

    2016-02-01

    This paper aims to determine the properties of the Correspondence Analysis (CA) estimator for estimating latent variable models. The method used is the High-Dimensional AIC (HAIC) method with simulated Bernoulli-distributed data. The stages are: (1) determine the CA matrix; (2) create a model of the CA estimator to estimate the latent variables by using HAIC; (3) simulate the Bernoulli-distributed data with 1,000,748 repetitions. The simulation results show that the CA estimator models work well.

  18. Perceived challenges and attitudes to regimen and product selection from Italian haemophilia treaters: the 2013 AICE survey.

    PubMed

    Franchini, M; Coppola, A; Rocino, A; Zanon, E; Morfini, M; Accorsi, Arianna; Aru, Anna Brigida; Biasoli, Chiara; Cantori, Isabella; Castaman, Giancarlo; Cesaro, Simone; Ciabatta, Carlo; De Cristofaro, Raimondo; Delios, Grazia; Di Minno, Giovanni; D'Incà, Marco; Dragani, Alfredo; Ettorre, Cosimo Pietro; Gagliano, Fabio; Gamba, Gabriella; Gandini, Giorgio; Giordano, Paola; Giuffrida, Gaetano; Gresele, Paolo; Latella, Caterina; Luciani, Matteo; Margaglione, Maurizio; Marietta, Marco; Mazzucconi, Maria Gabriella; Messina, Maria; Molinari, Angelo Claudio; Notarangelo, Lucia Dora; Oliovecchio, Emily; Peyvandi, Flora; Piseddu, Gavino; Rossetti, Gina; Rossi, Vincenza; Santagostino, Elena; Schiavoni, Mario; Schinco, Piercarla; Serino, Maria Luisa; Tagliaferri, Annarita; Testa, Sophie

    2014-03-01

    Despite great advances in haemophilia care in the last 20 years, a number of questions on haemophilia therapy remain unanswered. These debated issues primarily involve the choice of product type (plasma-derived vs. recombinant) for patients with different characteristics: specifically, whether they have been infected by blood-borne viruses, and whether they carry a high or low risk of inhibitor development. In addition, the most appropriate treatment regimens for non-inhibitor and inhibitor patients compel physicians operating at the haemophilia treatment centres (HTCs) to take important therapeutic decisions, which are often based on their personal clinical experience rather than on evidence-based recommendations from the published literature. To learn the opinions of Italian expert physicians, who are responsible for common clinical practice and therapeutic decisions, on the most controversial aspects of haemophilia care, we conducted a survey among the Directors of HTCs affiliated to the Italian Association of Haemophilia Centres (AICE). A questionnaire consisting of 19 questions covering the most important topics related to haemophilia treatment was sent to the Directors of all 52 Italian HTCs. Forty of the 52 Directors (76.9%) responded, accounting for the large majority of HTCs affiliated to the AICE throughout Italy. The results of this survey provide for the first time a picture of the attitudes towards clotting factor concentrate use and product selection of clinicians working at Italian HTCs. PMID:24533954

  19. Teacher's Corner: Conducting Specification Searches with Amos

    ERIC Educational Resources Information Center

    Schumacker, Randall E.

    2006-01-01

    Amos 5.0 (Arbuckle, 2003) permits exploratory specification searches for the best theoretical model given an initial model using the following fit function criteria: chi-square (C), chi-square - df (C - df), Akaike Information Criteria (AIC), Browne-Cudeck criterion (BCC), Bayes Information Criterion (BIC), chi-square divided by the degrees of…

  20. Egg distributions and the information a solitary parasitoid has and uses for its oviposition decisions.

    PubMed

    Hemerik, Lia; van der Hoeven, Nelly; van Alphen, Jacques J M

    2002-01-01

    Approximately three decades ago, the question of whether parasitoids are able to assess the number or origin of eggs in a host was first answered for a solitary parasitoid, Leptopilina heterotoma, by fitting theoretically derived distributions to empirical ones. We extend the set of theoretically postulated distributions of eggs among hosts by combining searching modes and abilities in assessing host quality. In the models, parasitoids search either randomly (Poisson) (1) or by vibrotaxis (Negative Binomial) (2). Parasitoids are: (a) assumed to treat all hosts equally, (b) able to distinguish only between unparasitised and parasitised hosts, (c) able to distinguish them by the number of eggs they contain, or (d) able to recognise their own eggs. Mathematically tractable combinations of searching mode (1 and 2) and abilities (a, b, c, d) result in seven different models (M1a, M1b, M1c, M1d, M2a, M2b and M2c). These models have been simulated for a varying number of searching parasitoids and various mean numbers of eggs per host. Each resulting distribution is fitted to all theoretical models. The model with the minimum Akaike's information criterion (AIC) is chosen as the best fitting for each simulated distribution. We thus investigate the power of the AIC and, for each distribution with a specified mean number of eggs per host, we derive a frequency distribution for classification. Firstly, we discuss the simulations of models including random search (M1a, M1b, M1c and M1d). For M1a, M1c and M1d the simulated distributions are correctly classified in at least 70% of all cases. However, in a few cases model M1b is only properly classified for intermediate mean values of eggs per host. The models including vibrotaxis as searching behaviour (M2a, M2b and M2c) cannot be distinguished from those with random search if the mean number of eggs per host is low. Among the models incorporating vibrotaxis, the three abilities are detected analogously as in models with

  1. The role of multicollinearity in landslide susceptibility assessment by means of Binary Logistic Regression: comparison between VIF and AIC stepwise selection

    NASA Astrophysics Data System (ADS)

    Cama, Mariaelena; Cristi Nicu, Ionut; Conoscenti, Christian; Quénéhervé, Geraldine; Maerker, Michael

    2016-04-01

    Landslide susceptibility can be defined as the likelihood of a landslide occurring in a given area on the basis of local terrain conditions. In recent decades, much research has focused on its evaluation by means of stochastic approaches under the assumption that 'the past is the key to the future', which means that if a model is able to reproduce a known landslide spatial distribution, it will be able to predict the future locations of new (i.e. unknown) slope failures. Among the various stochastic approaches, Binary Logistic Regression (BLR) is one of the most used because it calculates the susceptibility in probabilistic terms and its results are easily interpretable from a geomorphological point of view. However, very often little importance is given to the assessment of multicollinearity, whose effect is that the coefficient estimates become unstable, with opposite signs, and therefore difficult to interpret. It should therefore be evaluated every time in order to obtain a model whose results are geomorphologically sound. In this study the effects of multicollinearity on the predictive performance and robustness of landslide susceptibility models are analyzed. In particular, multicollinearity is estimated by means of the Variance Inflation Factor (VIF), which is also used as a selection criterion for the independent variables (VIF stepwise selection) and compared to the more commonly used AIC stepwise selection. The robustness of the results is evaluated through 100 replicates of the dataset. The study area selected to perform this analysis is the Moldavian Plateau, where landslides are among the most frequent geomorphological processes. This area has an increasing trend of urbanization and a very high potential regarding cultural heritage, being the place of discovery of the largest settlement belonging to the Cucuteni Culture in Eastern Europe (which led to the development of the great Cucuteni-Trypillia complex). Therefore, identifying the areas susceptible to
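    For reference, the multicollinearity diagnostic used here is computed, for the j-th predictor, as

        \mathrm{VIF}_j = \frac{1}{1 - R_j^2},

    where R_j^2 is the coefficient of determination from regressing the j-th predictor on all remaining predictors; predictors whose VIF exceeds a chosen threshold (values around 5 or 10 are common rules of thumb) are flagged as collinear and excluded during the stepwise selection.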

  2. Model weights and the foundations of multimodel inference

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    2006-01-01

    Statistical thinking in wildlife biology and ecology has been profoundly influenced by the introduction of AIC (Akaike's information criterion) as a tool for model selection and as a basis for model averaging. In this paper, we advocate the Bayesian paradigm as a broader framework for multimodel inference, one in which model averaging and model selection are naturally linked, and in which the performance of AIC-based tools is naturally evaluated. Prior model weights implicitly associated with the use of AIC are seen to highly favor complex models: in some cases, all but the most highly parameterized models in the model set are virtually ignored a priori. We suggest the usefulness of the weighted BIC (Bayesian information criterion) as a computationally simple alternative to AIC, based on explicit selection of prior model probabilities rather than acceptance of default priors associated with AIC. We note, however, that both procedures are only approximate to the use of exact Bayes factors. We discuss and illustrate technical difficulties associated with Bayes factors, and suggest approaches to avoiding these difficulties in the context of model selection for a logistic regression. Our example highlights the predisposition of AIC weighting to favor complex models and suggests a need for caution in using the BIC for computing approximate posterior model weights.
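    For reference, the BIC-based model weights advocated above take the same functional form as Akaike weights and, under equal prior model probabilities, approximate posterior model probabilities:

        w_i = \frac{\exp\!\left(-\tfrac{1}{2}\Delta_i\right)}{\sum_j \exp\!\left(-\tfrac{1}{2}\Delta_j\right)}, \qquad \Delta_i = \mathrm{BIC}_i - \min_j \mathrm{BIC}_j,

    with the analogous AIC weights obtained by substituting \Delta\mathrm{AIC}_i for \Delta_i.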

  3. Information-theoretic model selection and model averaging for closed-population capture-recapture studies

    USGS Publications Warehouse

    Stanley, T.R.; Burnham, K.P.

    1998-01-01

    Specification of an appropriate model is critical to valid statistical inference. Given that the "true model" for the data is unknown, the goal of model selection is to select a plausible approximating model that balances model bias and sampling variance. Model selection based on information criteria such as AIC or its variant AICc, or criteria like CAIC, has proven useful in a variety of contexts including the analysis of open-population capture-recapture data. These criteria have not been intensively evaluated for closed-population capture-recapture models, which are integer parameter models used to estimate population size (N), and there is concern that they will not perform well. To address this concern, we evaluated AIC, AICc, and CAIC model selection for closed-population capture-recapture models by empirically assessing the quality of inference for the population size parameter N. We found that AIC-, AICc-, and CAIC-selected models had smaller relative mean squared errors than randomly selected models, but that confidence interval coverage on N was poor unless unconditional variance estimates (which incorporate model uncertainty) were used to compute confidence intervals. Overall, AIC and AICc outperformed CAIC, and are preferred to CAIC for selection among the closed-population capture-recapture models we investigated. A model averaging approach to estimation, using AIC, AICc, or CAIC to estimate weights, was also investigated and proved superior to estimation using AIC-, AICc-, or CAIC-selected models. Our results suggest that, for model averaging, AIC or AICc should be favored over CAIC for estimating weights.
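    For reference, the small-sample and consistent variants evaluated here are conventionally written, with \hat{L} the maximized likelihood, k the number of estimated parameters, and n the effective sample size, as

        \mathrm{AICc} = \mathrm{AIC} + \frac{2k(k+1)}{n-k-1}, \qquad \mathrm{CAIC} = -2\ln\hat{L} + k(\ln n + 1).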

  4. End-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1982-01-01

    The efficiencies of various deep space communication systems, which are required to transmit both imaging and a typically error-sensitive class of data called general science and engineering (gse), are compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as a two orders of magnitude increase in imaging information rate compared to a single-channel uncoded, uncompressed system while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts, as well as efforts to apply them, are provided in support of the system analysis.

  5. Time series ARIMA models for daily price of palm oil

    NASA Astrophysics Data System (ADS)

    Ariff, Noratiqah Mohd; Zamhawari, Nor Hashimah; Bakar, Mohd Aftar Abu

    2015-02-01

    Palm oil is deemed one of the most important commodities forming the economic backbone of Malaysia. Modeling and forecasting the daily price of palm oil is therefore of great interest for Malaysia's economic growth. In this study, time series ARIMA models are used to fit the daily price of palm oil. The Akaike Information Criterion (AIC), the Akaike Information Criterion with a correction for finite sample sizes (AICc) and the Bayesian Information Criterion (BIC) are used to compare the different ARIMA models considered. It is found that the ARIMA(1,2,1) model is suitable for the daily price of crude palm oil in Malaysia for the years 2010 to 2012.
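    A minimal sketch of this kind of order search in Python (using statsmodels, with placeholder data rather than the actual palm oil series) is shown below; each candidate ARIMA(p,d,q) is fitted and ranked by its information criteria.

        import itertools
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.arima.model import ARIMA

        price = pd.Series(np.random.lognormal(mean=7.8, sigma=0.02, size=500))  # placeholder series

        results = []
        for p, d, q in itertools.product(range(3), range(3), range(3)):
            try:
                fit = ARIMA(price, order=(p, d, q)).fit()
                results.append(((p, d, q), fit.aic, fit.bic))
            except Exception:
                continue  # skip orders that fail to converge

        best_order = min(results, key=lambda r: r[1])[0]  # smallest AIC
        print("best order by AIC:", best_order)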

  6. Model selection for multi-component frailty models.

    PubMed

    Ha, Il Do; Lee, Youngjo; MacKenzie, Gilbert

    2007-11-20

    Various frailty models have been developed and are now widely used for analysing multivariate survival data. It is therefore important to develop an information criterion for model selection. However, in frailty models there are several alternative ways of forming a criterion and the particular criterion chosen may not be uniformly best. In this paper, we study an Akaike information criterion (AIC) on selecting a frailty structure from a set of (possibly) non-nested frailty models. We propose two new AIC criteria, based on a conditional likelihood and an extended restricted likelihood (ERL) given by Lee and Nelder (J. R. Statist. Soc. B 1996; 58:619-678). We compare their performance using well-known practical examples and demonstrate that the two criteria may yield rather different results. A simulation study shows that the AIC based on the ERL is recommended, when attention is focussed on selecting the frailty structure rather than the fixed effects. PMID:17476647

  7. Model Selection for Geostatistical Models

    SciTech Connect

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.

  8. On the predictive information criteria for model determination in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    Many statistical tools have been developed for evaluating, understanding, and comparing models, from both frequentist and Bayesian perspectives. In particular, the problem of model selection can be addressed according to whether the primary goal is explanation or, alternatively, prediction. In the former case, the criteria for model selection are defined over the parameter space, whose physical interpretation can be difficult; in the latter case, they are defined over the space of the observations, which has a more direct physical meaning. In the frequentist approaches, model selection is generally based on an asymptotic approximation which may be poor for small data sets (e.g. the F-test, the Kolmogorov-Smirnov test, etc.); moreover, these methods often apply only under specific assumptions on the models (e.g. models have to be nested in the likelihood ratio test). In the Bayesian context, among the criteria for explanation, the ratio of the observed marginal densities for two competing models, named the Bayes Factor (BF), is commonly used for both model choice and model averaging (Kass and Raftery, J. Am. Stat. Ass., 1995). But the BF does not apply to improper priors and, even when the prior is proper, it is not robust to the specification of the prior. These limitations also extend to two well-known penalized likelihood methods, the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC), since both can be shown to be approximations of -2 log BF. In the perspective that a model is as good as its predictions, the predictive information criteria aim at evaluating the predictive accuracy of Bayesian models or, in other words, at estimating the expected out-of-sample prediction error using a bias-correction adjustment of within-sample error (Gelman et al., Stat. Comput., 2014). In particular, the Watanabe criterion is fully Bayesian because it averages the predictive distribution over the posterior distribution of parameters rather than conditioning on a point

  9. AICE Survey of USSR Air Pollution Literature, Volume 13: Technical Papers from the Leningrad International Symposium on the Meteorological Aspects of Atmospheric Pollution, Part 2.

    ERIC Educational Resources Information Center

    Nuttonson, M. Y., Ed.

    Twelve papers were translated from Russian: Automation of Information Processing Involved in Experimental Studies of Atmospheric Diffusion, Micrometeorological Characteristics of Atmospheric Pollution Conditions, Study of the Influence of Irregularities of the Earth's Surface on the Air Flow Characteristics in a Wind Tunnel, Use of Parameters of…

  10. Variable selection with stepwise and best subset approaches

    PubMed Central

    2016-01-01

    While purposeful selection is performed partly by software and partly by hand, the stepwise and best subset approaches are automatically performed by software. Two R functions stepAIC() and bestglm() are well designed for stepwise and best subset regression, respectively. The stepAIC() function begins with a full or null model, and methods for stepwise regression can be specified in the direction argument with character values “forward”, “backward” and “both”. The bestglm() function begins with a data frame containing explanatory variables and response variables. The response variable should be in the last column. Varieties of goodness-of-fit criteria can be specified in the IC argument. The Bayesian information criterion (BIC) usually results in a more parsimonious model than the Akaike information criterion. PMID:27162786
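    For readers working outside R, the idea behind stepAIC(direction = "forward") can be sketched in Python as a greedy search that adds, at each step, the predictor giving the largest drop in AIC; the DataFrame df with response column "y" is assumed, and this is a conceptual analogue rather than a reimplementation of either R function.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        def forward_stepwise_aic(df, response="y"):
            """Greedy forward selection: add the predictor that lowers AIC most."""
            remaining = [c for c in df.columns if c != response]
            selected = []
            best_aic = sm.OLS(df[response], np.ones(len(df))).fit().aic  # intercept-only model
            improved = True
            while improved and remaining:
                improved = False
                scores = [(sm.OLS(df[response], sm.add_constant(df[selected + [c]])).fit().aic, c)
                          for c in remaining]
                aic, cand = min(scores)
                if aic < best_aic:
                    best_aic, improved = aic, True
                    selected.append(cand)
                    remaining.remove(cand)
            return selected, best_aic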

  11. Chemical shift prediction for protein structure calculation and quality assessment using an optimally parameterized force field

    PubMed Central

    Nielsen, Jakob T.; Eghbalnia, Hamid R.; Nielsen, Niels Chr.

    2011-01-01

    The exquisite sensitivity of chemical shifts as reporters of structural information, and the ability to measure them routinely and accurately, gives great import to formulations that elucidate the structure-chemical-shift relationship. Here we present a new and highly accurate, precise, and robust formulation for the prediction of NMR chemical shifts from protein structures. Our approach, shAIC (shift prediction guided by Akaike's Information Criterion), capitalizes on mathematical ideas and an information-theoretic principle to represent the functional form of the relationship between structure and chemical shift as a parsimonious sum of smooth analytical potentials which optimally takes into account short-, medium-, and long-range parameters in a nuclei-specific manner to capture potential chemical shift perturbations caused by distant nuclei. shAIC outperforms the state-of-the-art methods that use analytical formulations. Moreover, for structures derived by NMR or structures with novel folds, shAIC delivers better overall results, even when it is compared to sophisticated machine learning approaches. shAIC provides for a computationally lightweight implementation that is unimpeded by molecular size, making it ideal for use as a force field. PMID:22293396

  12. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.

  13. A hybrid model to simulate the annual runoff of the Kaidu River in northwest China

    NASA Astrophysics Data System (ADS)

    Xu, Jianhua; Chen, Yaning; Bai, Ling; Xu, Yiwen

    2016-04-01

    Fluctuating and complicated hydrological processes can result in uncertainty in runoff forecasting. Thus, it is necessary to apply multi-method integrated modeling approaches to simulate runoff. Integrating ensemble empirical mode decomposition (EEMD), a back-propagation artificial neural network (BPANN) and a nonlinear regression equation, we put forward a hybrid model to simulate the annual runoff (AR) of the Kaidu River in northwest China. We validate the simulation results using the coefficient of determination (R2) and the Akaike information criterion (AIC), based on observed data from 1960 to 2012 at the Dashankou hydrological station. The average absolute and relative errors show the high simulation accuracy of the hybrid model. R2 and AIC both illustrate that the hybrid model performs much better than the single BPANN. The hybrid model and integrated approach developed in this study can be applied to simulate the annual runoff of similar rivers in northwest China.
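    The two fit measures used above can be computed directly from observed and simulated runoff; the sketch below assumes NumPy arrays obs and pred and a parameter count k for the model being scored, and uses a Gaussian-likelihood form of AIC (up to an additive constant).

        import numpy as np

        def r2_and_aic(obs, pred, k):
            """Coefficient of determination and AIC computed from residuals."""
            obs, pred = np.asarray(obs, dtype=float), np.asarray(pred, dtype=float)
            n = obs.size
            rss = np.sum((obs - pred) ** 2)
            r2 = 1.0 - rss / np.sum((obs - obs.mean()) ** 2)
            aic = n * np.log(rss / n) + 2 * k  # up to an additive constant
            return r2, aic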

  14. Clinical-dosimetric relationship between lacrimal gland dose and ocular toxicity after intensity-modulated radiotherapy for sinonasal tumours

    PubMed Central

    Batth, S S; Sreeraman, R; Dienes, E; Beckett, L A; Daly, M E; Cui, J; Mathai, M; Purdy, J A

    2013-01-01

    Objective: To characterise the relationship between lacrimal gland dose and ocular toxicity among patients treated by intensity-modulated radiotherapy (IMRT) for sinonasal tumours. Methods: 40 patients with cancers involving the nasal cavity and paranasal sinuses were treated with IMRT to a median dose of 66.0 Gy. Toxicity was scored using the Radiation Therapy Oncology Group morbidity criteria based on conjunctivitis, corneal ulceration and keratitis. The paired lacrimal glands were contoured as organs at risk, and the mean dose, maximum dose, V10, V20 and V30 were determined. Statistical analysis was performed using logistic regression and the Akaike information criterion (AIC). Results: The maximum and mean dose to the ipsilateral lacrimal gland were 19.2 Gy (range, 1.4–75.4 Gy) and 14.5 Gy (range, 11.1–67.8 Gy), respectively. The mean V10, V20 and V30 values were 50%, 25% and 17%, respectively. The incidence of acute and late Grade 3+ toxicities was 23% and 19%, respectively. Based on logistic regression and AIC, the maximum dose to the ipsilateral lacrimal gland was identified as a more significant predictor of acute toxicity (AIC, 53.89) and late toxicity (AIC, 32.94) than the mean dose (AIC, 56.13 and 33.83, respectively). The V20 was identified as the most significant predictor of late toxicity (AIC, 26.81). Conclusion: A dose–response relationship between maximum dose to the lacrimal gland and ocular toxicity was established. Our data suggesting a threshold relationship may be useful in establishing dosimetric guidelines for IMRT planning that may decrease the risk of acute and late lacrimal toxicities in the future. Advances in knowledge: A threshold relationship between radiation dose to the lacrimal gland and ocular toxicity was demonstrated, which may aid in treatment planning and reducing the morbidity of radiotherapy for sinonasal tumours. PMID:24167183
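    The kind of comparison reported above can be sketched as single-predictor logistic regressions ranked by AIC; the column names ("toxicity", "max_dose", "mean_dose", "V20") and the DataFrame df are hypothetical, not the study's data.

        import statsmodels.api as sm

        def aic_for_predictor(df, predictor, outcome="toxicity"):
            """AIC of a logistic regression of a binary outcome on one dose metric."""
            X = sm.add_constant(df[[predictor]])
            return sm.Logit(df[outcome], X).fit(disp=0).aic

        # aics = {p: aic_for_predictor(df, p) for p in ["max_dose", "mean_dose", "V20"]}
        # best_metric = min(aics, key=aics.get)  # smallest AIC = most supported predictor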

  15. Rubber yield prediction by meteorological conditions using mixed models and multi-model inference techniques.

    PubMed

    Golbon, Reza; Ogutu, Joseph Ochieng; Cotter, Marc; Sauerborn, Joachim

    2015-12-01

    Linear mixed models were developed and used to predict rubber (Hevea brasiliensis) yield based on meteorological conditions to which rubber trees had been exposed for periods ranging from 1 day to 2 months prior to tapping events. Predictors included a range of moving averages of meteorological covariates spanning different windows of time before the date of the tapping events. Serial autocorrelation in the latex yield measurements was accounted for using random effects and a spatial generalization of the autoregressive error covariance structure suited to data sampled at irregular time intervals. Information theoretics, specifically the Akaike information criterion (AIC), AIC corrected for small sample size (AICc), and Akaike weights, was used to select models with the greatest strength of support in the data from a set of competing candidate models. The predictive performance of the selected best model was evaluated using both leave-one-out cross-validation (LOOCV) and an independent test set. Moving averages of precipitation, minimum and maximum temperature, and maximum relative humidity with a 30-day lead period were identified as the best yield predictors. Prediction accuracy expressed in terms of the percentage of predictions within a measurement error of 5 g for cross-validation and also for the test dataset was above 99 %. PMID:25824122
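    For reference, the small-sample correction and the Akaike weights mentioned above can be written compactly as below; the AIC values, parameter counts and sample size are purely illustrative.

        import numpy as np

        def aicc(aic, k, n):
            """Small-sample corrected AIC."""
            return aic + (2 * k * (k + 1)) / (n - k - 1)

        aic = np.array([812.4, 810.9, 815.3])   # hypothetical candidate models
        k = np.array([5, 7, 9])                 # numbers of estimated parameters
        n = 120                                 # number of observations
        vals = aicc(aic, k, n)
        delta = vals - vals.min()
        weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()  # Akaike weights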

  16. Rubber yield prediction by meteorological conditions using mixed models and multi-model inference techniques

    NASA Astrophysics Data System (ADS)

    Golbon, Reza; Ogutu, Joseph Ochieng; Cotter, Marc; Sauerborn, Joachim

    2015-12-01

    Linear mixed models were developed and used to predict rubber ( Hevea brasiliensis) yield based on meteorological conditions to which rubber trees had been exposed for periods ranging from 1 day to 2 months prior to tapping events. Predictors included a range of moving averages of meteorological covariates spanning different windows of time before the date of the tapping events. Serial autocorrelation in the latex yield measurements was accounted for using random effects and a spatial generalization of the autoregressive error covariance structure suited to data sampled at irregular time intervals. Information theoretics, specifically the Akaike information criterion (AIC), AIC corrected for small sample size (AICc), and Akaike weights, was used to select models with the greatest strength of support in the data from a set of competing candidate models. The predictive performance of the selected best model was evaluated using both leave-one-out cross-validation (LOOCV) and an independent test set. Moving averages of precipitation, minimum and maximum temperature, and maximum relative humidity with a 30-day lead period were identified as the best yield predictors. Prediction accuracy expressed in terms of the percentage of predictions within a measurement error of 5 g for cross-validation and also for the test dataset was above 99 %.

  17. Does weather confound or modify the association of particulate air pollution with mortality? An analysis of the Philadelphia data, 1973--1980

    SciTech Connect

    Samet, J.; Zeger, S.; Kelsall, J.; Xu, J.; Kalkstein, L.

    1998-04-01

    This report considers the consequences of using alternative approaches to controlling for weather and explores modification of air pollution effects by weather, as weather patterns could plausibly alter air pollution's effect on health. The authors analyzed 1973-1980 total mortality data for Philadelphia using four weather models and compared estimates of the effects of TSP and SO₂ on mortality using a Poisson regression model. Two synoptic categories developed by Kalkstein were selected--the Temporal Synoptic Index (TSI) and the Spatial Synoptic Classification (SSC)--and compared with (1) descriptive models developed by Schwartz and Dockery (S-D); and (2) LOESS, a nonparametric function of the previous day's temperature and dew point. The authors considered model fit using Akaike's Information Criterion (AIC) and changes in the estimated effects of TSP and SO₂. In the full-year analysis, S-D is better than LOESS at predicting mortality, and S-D and LOESS are better than TSI, as measured by AIC. When TSP or SO₂ was fit alone, the results were qualitatively similar, regardless of how weather was controlled; when TSP and SO₂ were fit simultaneously, the S-D and LOESS models give qualitatively different results than TSI, which attributes more of the pollution effect to SO₂ than to TSP. Model fit is substantially poorer with TSI.

  18. Algorithm for systematic peak extraction from atomic pair distribution functions.

    PubMed

    Granlund, L; Billinge, S J L; Duxbury, P M

    2015-07-01

    The study presents an algorithm, ParSCAPE, for model-independent extraction of peak positions and intensities from atomic pair distribution functions (PDFs). It provides a statistically motivated method for determining parsimony of extracted peak models using the information-theoretic Akaike information criterion (AIC) applied to plausible models generated within an iterative framework of clustering and chi-square fitting. All parameters the algorithm uses are in principle known or estimable from experiment, though careful judgment must be applied when estimating the PDF baseline of nanostructured materials. ParSCAPE has been implemented in the Python program SrMise. Algorithm performance is examined on synchrotron X-ray PDFs of 16 bulk crystals and two nanoparticles using AIC-based multimodeling techniques, and particularly the impact of experimental uncertainties on extracted models. It is quite resistant to misidentification of spurious peaks coming from noise and termination effects, even in the absence of a constraining structural model. Structure solution from automatically extracted peaks using the Liga algorithm is demonstrated for 14 crystals and for C60. Special attention is given to the information content of the PDF, theory and practice of the AIC, as well as the algorithm's limitations. PMID:26131896

  19. [Species-abundance distribution patterns along succession series of Phyllostachys glauca forest in a limestone mountain].

    PubMed

    Shi, Jian-min; Fan, Cheng-fang; Liu, Yang; Yang, Qing-pei; Fang, Kai; Fan, Fang-li; Yang, Guang-yao

    2015-12-01

    To detect the ecological process of the succession series of Phyllostachys glauca forest in a limestone mountain, five niche models, i.e., broken stick model (BSM), niche preemption model (NPM), dominance preemption model (DPM), random assortment model (RAM) and overlapping niche model (ONM) were employed to describe the species-abundance distribution patterns (SDPs) of 15 samples. χ² test and Akaike information criterion (AIC) were used to test the fitting effects of the five models. The results showed that the optimal SDP models for P. glauca forest, bamboo-broadleaved mixed forest and broadleaved forest were DPM (χ² = 35.86, AIC = -69.77), NPM (χ² = 1.60, AIC = -94.68) and NPM (χ² = 0.35, AIC = -364.61), respectively. BSM also well fitted the SDP of bamboo-broadleaved mixed forest and broadleaved forest, while it was unsuitable to describe the SDP of P. glauca forest. The fittings of RAM and ONM in the three forest types were all rejected by the χ² test and AIC. With the development of community succession from P. glauca forest to broadleaved forest, the species richness and evenness increased, and the optimal SDP model changed from DPM to NPM. It was inferred that the change of ecological process from habitat filtration to interspecific competition was the main driving force of the forest succession. The results also indicated that the application of multiple SDP models and test methods would be beneficial to select the best model and deeply understand the ecological process of community succession. PMID:27111994

  20. Relating body condition to inorganic contaminant concentrations of diving ducks wintering in coastal California

    USGS Publications Warehouse

    Takekawa, J.Y.; Wainwright-De La Cruz, S.E.; Hothem, R.L.; Yee, J.

    2002-01-01

    In wild waterfowl, poor winter body condition may negatively affect migration, survival, and reproduction. Environmental contaminants have been shown to adversely affect the body condition of captive birds, but few field studies have examined body condition and contaminants in wild birds during the winter. We assessed the body condition of carcasses from a collection of canvasbacks (Aythya valisineria) and lesser (A. affinis) and greater scaup (A. marila) wintering in coastal California. We used Akaike information criterion (AIC) to select the model with the best balance of parsimony and goodness of fit that related indices of body condition with concentrations of Cd, Cu, Hg, Se, and Zn. Total ash-free protein in canvasbacks decreased with increasing Se concentrations, and pancreas mass decreased with increasing Hg. We combined the closely related lesser and greater scaup in analyses and found that total carcass fat, pancreas mass, and carcass mass decreased with increasing Zn concentrations, and pancreas mass decreased with increasing Hg. Our AIC analysis indicated that some indices of body condition in diving ducks were inversely related to some environmental contaminants in this collection, but additional AIC analyses should be conducted across a wider range of contaminant concentrations to corroborate our findings.

  1. Relating body condition to inorganic contaminant concentrations of diving ducks wintering in coastal California.

    PubMed

    Takekawa, J Y; Wainwright-De La Cruz, S E; Hothem, R L; Yee, J

    2002-01-01

    In wild waterfowl, poor winter body condition may negatively affect migration, survival, and reproduction. Environmental contaminants have been shown to adversely affect the body condition of captive birds, but few field studies have examined body condition and contaminants in wild birds during the winter. We assessed the body condition of carcasses from a collection of canvasbacks (Aythya valisineria) and lesser (A. affinis) and greater scaup (A. marila) wintering in coastal California. We used Akaike information criterion (AIC) to select the model with the best balance of parsimony and goodness of fit that related indices of body condition with concentrations of Cd, Cu, Hg, Se, and Zn. Total ash-free protein in canvasbacks decreased with increasing Se concentrations, and pancreas mass decreased with increasing Hg. We combined the closely related lesser and greater scaup in analyses and found that total carcass fat, pancreas mass, and carcass mass decreased with increasing Zn concentrations, and pancreas mass decreased with increasing Hg. Our AIC analysis indicated that some indices of body condition in diving ducks were inversely related to some environmental contaminants in this collection, but additional AIC analyses should be conducted across a wider range of contaminant concentrations to corroborate our findings. PMID:11706369

  2. Robust automatic P-phase picking: an on-line implementation in the analysis of broadband seismogram recordings

    NASA Astrophysics Data System (ADS)

    Sleeman, Reinoud; van Eck, Torild

    1999-06-01

    The onset of a seismic signal is determined through joint AR modeling of the noise and the seismic signal, and the application of the Akaike Information Criterion (AIC) using the onset time as parameter. This so-called AR-AIC phase picker has been tested successfully and implemented on the Z-component of the broadband station HGN to provide automatic P-phase picks for a rapid warning system. The AR-AIC picker is shown to provide accurate and robust automatic picks on a large experimental database. Out of 1109 P-phase onsets with signal-to-noise ratio (SNR) above 1 from local, regional and teleseismic earthquakes, our implementation detects 71% and gives a mean difference with manual picks of 0.1 s. An optimal version of the well-established picker of Baer and Kradolfer [Baer, M., Kradolfer, U., An automatic phase picker for local and teleseismic events, Bull. Seism. Soc. Am. 77 (1987) 1437-1445] detects less than 41% and gives a mean difference with manual picks of 0.3 s using the same dataset.
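    A much simplified, variance-based AIC onset picker conveys the core idea (the paper itself uses a fuller joint-AR formulation of noise and signal); the sketch below scans a single trace and returns the sample index minimizing the two-segment AIC.

        import numpy as np

        def aic_onset(trace):
            """Index minimizing k*log(var(x[:k])) + (n-k-1)*log(var(x[k:]))."""
            x = np.asarray(trace, dtype=float)
            n = x.size
            aic = np.full(n, np.inf)
            for k in range(2, n - 2):
                v1, v2 = np.var(x[:k]), np.var(x[k:])
                if v1 > 0 and v2 > 0:
                    aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
            return int(np.argmin(aic))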

  3. Comparative Study of Four Growth Models Applied to Weight and Height Growth Data in a Cohort of US Children from Birth to 9 Years

    PubMed Central

    Regnault, N.; Gillman, M. W.; Kleinman, K.; Rifas-Shiman, S.; Botton, J.

    2016-01-01

    Background/Aims The objective of our study was to compare the fit of four growth models for weight and height in contemporary US children between birth and 9 years. Methods In Project Viva, we collected weight and height growth data between birth and 9 years. We compared the Jenss model, the adapted Jenss model that adds a quadratic term, and the Reed 1st and 2nd order models. We used the log likelihood ratio test to compare nested models and the Akaike (AIC)/Bayesian information criterion (BIC) to compare nonnested models. Results For weight and height, the adapted Jenss model had a better fit than the Jenss model (for weight: p < 0.0001), and the Reed 2nd order model had a better fit than the Reed 1st order model (for weight: p < 0.0001). Compared with the Reed 2nd order model, the adapted Jenss model had a better fit for both weight (adapted Jenss vs. Reed 2nd order, AIC: 66,974 vs. 82,791, BIC: 67,066 vs. 82,883) and height (adapted Jenss vs. Reed 2nd order, AIC: 87,108 vs. 87,612, BIC: 87,196 vs. 87,700). Conclusions In this pre-birth study of children aged 0–9 years, for both weight and height the adapted Jenss model presented the best fit of all four tested models. PMID:25413655

  4. Comparing Smoothing Techniques for Fitting the Nonlinear Effect of Covariate in Cox Models

    PubMed Central

    Roshani, Daem; Ghaderi, Ebrahim

    2016-01-01

    Background and Objective: The Cox model is a popular model in survival analysis which assumes that a covariate has a linear effect on the log hazard function. However, continuous covariates can affect the hazard through more complicated nonlinear functional forms, and Cox models with continuous covariates are therefore prone to misspecification due to not fitting the correct functional form. In this study, a smooth nonlinear covariate effect was approximated by different spline functions. Material and Methods: We applied three flexible nonparametric smoothing techniques for nonlinear covariate effects in the Cox model: penalized splines, restricted cubic splines and natural splines. The Akaike information criterion (AIC) and degrees of freedom were used for smoothing parameter selection in the penalized splines model. The ability of the nonparametric methods to recover the true functional form of linear, quadratic and nonlinear functions was evaluated using different simulated sample sizes. Data analysis was carried out using R 2.11.0 software and significance levels were set at 0.05. Results: Based on AIC, the penalized spline method had consistently lower mean square error compared to the others for selection of the smoothing parameter. The same result was obtained with real data. Conclusion: The penalized spline smoothing method, with AIC for smoothing parameter selection, was more accurate in evaluating the relation between a covariate and the log hazard function than the other methods. PMID:27041809

  5. Measure the Semantic Similarity of GO Terms Using Aggregate Information Content.

    PubMed

    Song, Xuebo; Li, Lin; Srimani, Pradip K; Yu, Philip S; Wang, James Z

    2014-01-01

    The rapid development of gene ontology (GO) and huge amount of biomedical data annotated by GO terms necessitate computation of semantic similarity of GO terms and, in turn, measurement of functional similarity of genes based on their annotations. In this paper we propose a novel and efficient method to measure the semantic similarity of GO terms. The proposed method addresses the limitations in existing GO term similarity measurement techniques; it computes the semantic content of a GO term by considering the information content of all of its ancestor terms in the graph. The aggregate information content (AIC) of all ancestor terms of a GO term implicitly reflects the GO term's location in the GO graph and also represents how human beings use this GO term and all its ancestor terms to annotate genes. We show that semantic similarity of GO terms obtained by our method closely matches the human perception. Extensive experimental studies show that this novel method also outperforms all existing methods in terms of the correlation with gene expression data. We have developed web services for measuring semantic similarity of GO terms and functional similarity of genes using the proposed AIC method and other popular methods. These web services are available at http://bioinformatics.clemson.edu/G-SESAME. PMID:26356015

  6. Ground surface paleotemperature reconstruction using information measures and empirical Bayes

    NASA Astrophysics Data System (ADS)

    Woodbury, Allan D.; Ferguson, Grant

    2006-03-01

    We outline an empirical Bayesian approach to ground-surface temperature (GST) reconstruction that utilizes Akaike's Bayesian information criterion (ABIC). Typical unknown statistical quantities, such as the noise variance, are automatically determined through the analysis. We compare the ABIC inversion to the singular value decomposition on a synthetic downhole temperature data set. In comparing the root mean square errors between the synthetic climatic signal and each of the reconstructions (singular value and ABIC) from 1900 to 2002, we see that the ABIC solution produced the 'best' reconstruction in a mean square sense. We also carry out an analysis of the Canadian borehole data set in which we use 221 temperature profiles. The reconstructed GST record shows warming between 1800 and 1949 of approximately 1.0 K, with the maximum rate of warming occurring between 1900 and 1949.

  7. The optimum order of a Markov chain model for daily rainfall in Nigeria

    NASA Astrophysics Data System (ADS)

    Jimoh, O. D.; Webster, P.

    1996-11-01

    Markov type models are often used to describe the occurrence of daily rainfall. Although models of Order 1 have been successfully employed, there remains uncertainty concerning the optimum order for such models. This paper is concerned with estimation of the optimum order of Markov chains and, in particular, the use of objective criteria of the Akaike and Bayesian Information Criteria (AIC and BIC, respectively). Using daily rainfall series for five stations in Nigeria, it has been found that the AIC and BIC estimates vary with month as well as the value of the rainfall threshold used to define a wet day. There is no apparent system to this variation, although AIC estimates are consistently greater than or equal to BIC estimates, with values of the latter limited to zero or unity. The optimum order is also investigated through generation of synthetic sequences of wet and dry days using the transition matrices of zero-, first- and second-order Markov chains. It was found that the first-order model is superior to the zero-order model in representing the characteristics of the historical sequence as judged using frequency duration curves. There was no discernible difference between the model performance for first- and second-order models. There was no seasonal variation in the model performance, which contrasts with the optimum models identified using AIC and BIC estimates. It is concluded that caution is needed with the use of objective criteria for determining the optimum order of the Markov model and that the use of frequency duration curves can provide a robust alternative method of model identification. Comments are also made on the importance of record length and non-stationarity for model identification.
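    Estimating the AIC of competing Markov chain orders from a wet/dry sequence reduces to counting transitions; the sketch below assumes a list of 0/1 values (dry/wet days) and is a generic illustration, not the authors' code.

        from collections import Counter
        import numpy as np

        def markov_aic(seq, order):
            """AIC of a Markov chain of the given order fitted to a 0/1 sequence."""
            trans = Counter((tuple(seq[i:i + order]), seq[i + order])
                            for i in range(len(seq) - order))
            totals = Counter()
            for (state, _), c in trans.items():
                totals[state] += c
            loglik = sum(c * np.log(c / totals[state]) for (state, _), c in trans.items())
            k = 2 ** order  # one free transition probability per state for a binary chain
            return -2 * loglik + 2 * k

        # aics = {m: markov_aic(wet_dry, m) for m in (0, 1, 2)}  # wet_dry: hypothetical 0/1 list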

  8. Acceleration of the universe: a reconstruction of the effective equation of state

    NASA Astrophysics Data System (ADS)

    Mukherjee, Ankan

    2016-04-01

    The present work is based upon a parametric reconstruction of the effective or total equation of state in a model for the universe with accelerated expansion. The constraints on the model parameters are obtained by maximum likelihood analysis using the supernova distance modulus data, observational Hubble data, baryon acoustic oscillation data and cosmic microwave background shift parameter data. For statistical comparison, the same analysis has also been carried out for the wCDM dark energy model. Different model selection criteria, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), give a clear indication that the reconstructed model is well consistent with the wCDM model. Both models (the weff(z) model and the wCDM model) are also presented in the (q0, j0) parameter space. Tighter constraints on the present values of the dark energy equation of state parameter (wDE(z = 0)) and the cosmological jerk (j0) have been achieved for the reconstructed model.

  9. Predicting road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and from this model the number of road accidents has been predicted using the structural time series approach. The models are developed using a stepwise method and the residuals of each step are analyzed. The accuracy of the model is assessed using the mean absolute percentage error (MAPE) and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model is the best model to represent the road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving on the conventional time series method.

  10. Spot counting on fluorescence in situ hybridization in suspension images using Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Liu, Sijia; Sa, Ruhan; Maguire, Orla; Minderman, Hans; Chaudhary, Vipin

    2015-03-01

    Cytogenetic abnormalities are important diagnostic and prognostic criteria for acute myeloid leukemia (AML). A flow cytometry-based imaging approach for FISH in suspension (FISH-IS) was established that enables the automated analysis of a several-log-magnitude higher number of cells compared to microscopy-based approaches. Rotational positioning of cells can occur, leading to discordance in spot counts. To address counting errors caused by overlapping spots, in this study a Gaussian mixture model (GMM) based classification method is proposed. The Akaike information criterion (AIC) and Bayesian information criterion (BIC) of the GMM are used as global image features for this classification method. Using a Random Forest classifier, the results show that the proposed method is able to detect closely overlapping spots which cannot be separated by existing image-segmentation-based spot detection methods. The experimental results show that the proposed method yields a significant improvement in spot counting accuracy.
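    The AIC/BIC features of a fitted Gaussian mixture referred to above are available directly in scikit-learn; the sketch below assumes an (n_samples, 2) array of candidate spot coordinates and is illustrative only.

        from sklearn.mixture import GaussianMixture

        def gmm_aic_bic(points, max_components=5):
            """AIC and BIC of Gaussian mixtures with 1..max_components components."""
            scores = []
            for k in range(1, max_components + 1):
                gmm = GaussianMixture(n_components=k, random_state=0).fit(points)
                scores.append((k, gmm.aic(points), gmm.bic(points)))
            return scores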

  11. Acceleration of the universe: a reconstruction of the effective equation of state

    NASA Astrophysics Data System (ADS)

    Mukherjee, Ankan

    2016-07-01

    The present work is based upon a parametric reconstruction of the effective or total equation of state in a model for the universe with accelerated expansion. The constraints on the model parameters are obtained by maximum likelihood analysis using the supernova distance modulus data, observational Hubble data, baryon acoustic oscillation data and cosmic microwave background shift parameter data. For statistical comparison, the same analysis has also been carried out for the wCDM dark energy model. Different model selection criteria, the Akaike information criterion (AIC) and the Bayesian information criterion (BIC), give a clear indication that the reconstructed model is well consistent with the wCDM model. Both models (the w_{eff}(z) model and the wCDM model) are also presented in the (q_0, j_0) parameter space. Tighter constraints on the present values of the dark energy equation of state parameter (w_{DE}(z = 0)) and the cosmological jerk (j_0) have been achieved for the reconstructed model.

  12. Power-law ansatz in complex systems: Excessive loss of information.

    PubMed

    Tsai, Sun-Ting; Chang, Chin-De; Chang, Ching-Hao; Tsai, Meng-Xue; Hsu, Nan-Jung; Hong, Tzay-Ming

    2015-12-01

    The ubiquity of power-law relations in empirical data displays physicists' love of simple laws and uncovering common causes among seemingly unrelated phenomena. However, many reported power laws lack statistical support and mechanistic backings, not to mention discrepancies with real data are often explained away as corrections due to finite size or other variables. We propose a simple experiment and rigorous statistical procedures to look into these issues. Making use of the fact that the occurrence rate and pulse intensity of crumple sound obey a power law with an exponent that varies with material, we simulate a complex system with two driving mechanisms by crumpling two different sheets together. The probability function of the crumple sound is found to transit from two power-law terms to a bona fide power law as compaction increases. In addition to showing the vicinity of these two distributions in the phase space, this observation nicely demonstrates the effect of interactions to bring about a subtle change in macroscopic behavior and more information may be retrieved if the data are subject to sorting. Our analyses are based on the Akaike information criterion that is a direct measurement of information loss and emphasizes the need to strike a balance between model simplicity and goodness of fit. As a show of force, the Akaike information criterion also found the Gutenberg-Richter law for earthquakes and the scale-free model for a brain functional network, a two-dimensional sandpile, and solar flare intensity to suffer an excessive loss of information. They resemble more the crumpled-together ball at low compactions in that there appear to be two driving mechanisms that take turns occurring. PMID:26764792

  13. Power-law ansatz in complex systems: Excessive loss of information

    NASA Astrophysics Data System (ADS)

    Tsai, Sun-Ting; Chang, Chin-De; Chang, Ching-Hao; Tsai, Meng-Xue; Hsu, Nan-Jung; Hong, Tzay-Ming

    2015-12-01

    The ubiquity of power-law relations in empirical data displays physicists' love of simple laws and uncovering common causes among seemingly unrelated phenomena. However, many reported power laws lack statistical support and mechanistic backings, not to mention discrepancies with real data are often explained away as corrections due to finite size or other variables. We propose a simple experiment and rigorous statistical procedures to look into these issues. Making use of the fact that the occurrence rate and pulse intensity of crumple sound obey a power law with an exponent that varies with material, we simulate a complex system with two driving mechanisms by crumpling two different sheets together. The probability function of the crumple sound is found to transit from two power-law terms to a bona fide power law as compaction increases. In addition to showing the vicinity of these two distributions in the phase space, this observation nicely demonstrates the effect of interactions to bring about a subtle change in macroscopic behavior and more information may be retrieved if the data are subject to sorting. Our analyses are based on the Akaike information criterion that is a direct measurement of information loss and emphasizes the need to strike a balance between model simplicity and goodness of fit. As a show of force, the Akaike information criterion also found the Gutenberg-Richter law for earthquakes and the scale-free model for a brain functional network, a two-dimensional sandpile, and solar flare intensity to suffer an excessive loss of information. They resemble more the crumpled-together ball at low compactions in that there appear to be two driving mechanisms that take turns occurring.

  14. EFFECT OF DIET QUALITY ON NUTRIENT ALLOCATION TO THE TEST AND ARISTOTLE’S LANTERN IN THE SEA URCHIN LYTECHINUS VARIEGATUS (LAMARCK, 1816)

    PubMed Central

    Heflin, Laura Elizabeth; Gibbs, Victoria K; Powell, Mickie L; Makowsky, Robert; Lawrence, Addison L; Lawrence, John M

    2014-01-01

    Small adult (19.50 ± 2.01g wet weight) Lytechinus variegatus were fed eight formulated diets with different protein (12 to 36% dry weight as fed) and carbohydrate (21 to 39 % dry weight) levels. Each sea urchin (n = 8 per treatment) was fed a daily ration of 1.5% of the average body weight of all individuals for 9 weeks. Akaike information criterion scores were used to compare six different dietary composition hypotheses for eight growth measurements. For each physical growth response, different mathematical models representing a priori hypotheses were compared using the Akaike Information Criterion (AIC) score. The AIC is one of many information-theoretic approaches that allows for direct comparison of non-nested models with varying number of parameters. Dietary protein level and protein: energy ratio were the best models for prediction of test diameter increase. Dietary protein level was the best model of test with spines wet weight gain and test with spines dry matter production. When the Aristotle’s lantern was corrected for size of the test, there was an inverse relationship with dietary protein level. Log transformed lantern to test with spines index was also best associated with the dietary protein model. Dietary carbohydrate level was a poor predictor for growth parameters. However, the protein × carbohydrate interaction model was the best model of organic content (% dry weight) of the test without spines. These data suggest that there is a differential allocation of resources when dietary protein is limiting and the test with spines, but not the Aristotle’s lantern, is affected by availability of dietary nutrients. PMID:25431520

  15. Cluster Analysis and Gaussian Mixture Estimation of Correlated Time-Series by Means of Multi-dimensional Scaling

    NASA Astrophysics Data System (ADS)

    Ibuki, Takero; Suzuki, Sei; Inoue, Jun-ichi

    We investigate cross-correlations between typical Japanese stocks collected through Yahoo!Japan website ( http://finance.yahoo.co.jp/ ). By making use of multi-dimensional scaling (MDS) for the cross-correlation matrices, we draw two-dimensional scattered plots in which each point corresponds to each stock. To make a clustering for these data plots, we utilize the mixture of Gaussians to fit the data set to several Gaussian densities. By minimizing the so-called Akaike Information Criterion (AIC) with respect to parameters in the mixture, we attempt to specify the best possible mixture of Gaussians. It might be naturally assumed that all the two-dimensional data points of stocks shrink into a single small region when some economic crisis takes place. The justification of this assumption is numerically checked for the empirical Japanese stock data, for instance, those around 11 March 2011.

  16. Particle-size distribution models for the conversion of Chinese data to FAO/USDA system.

    PubMed

    Shangguan, Wei; Dai, YongJiu; García-Gutiérrez, Carlos; Yuan, Hua

    2014-01-01

    We investigated eleven particle-size distribution (PSD) models to determine the appropriate models for describing the PSDs of 16349 Chinese soil samples. These data are based on three soil texture classification schemes, including one ISSS (International Society of Soil Science) scheme with four data points and two Katschinski's schemes with five and six data points, respectively. The adjusted coefficient of determination (r²), Akaike's information criterion (AIC), and geometric mean error ratio (GMER) were used to evaluate the model performance. The soil data were converted to the USDA (United States Department of Agriculture) standard using PSD models and the fractal concept. The performance of PSD models was affected by soil texture and classification of fraction schemes. The performance of PSD models also varied with clay content of soils. The Anderson, Fredlund, modified logistic growth, Skaggs, and Weibull models were the best. PMID:25121108

  17. A model of multisecond timing behaviour under peak-interval procedures.

    PubMed

    Hasegawa, Takayuki; Sakata, Shogo

    2015-04-01

    In this study, the authors developed a fundamental theory of interval timing behaviour, inspired by the learning-to-time (LeT) model and the scalar expectancy theory (SET) model, and based on quantitative analyses of such timing behaviour. Our experiments used the peak-interval procedure with rats. The proposed model of timing behaviour comprises clocks, a regulator, a mixer, a response, and memory. Using our model, we calculated the basic clock speeds indicated by the subjects' behaviour under such peak procedures. In this model, the scalar property can be defined as a kind of transposition, which can then be measured quantitatively. The Akaike information criterion (AIC) values indicated that the current model fit the data slightly better than did the SET model. Our model may therefore provide a useful addition to SET for the analysis of timing behaviour. PMID:25539685

  18. Particle-Size Distribution Models for the Conversion of Chinese Data to FAO/USDA System

    PubMed Central

    Dai, YongJiu; García-Gutiérrez, Carlos; Yuan, Hua

    2014-01-01

    We investigated eleven particle-size distribution (PSD) models to determine the appropriate models for describing the PSDs of 16349 Chinese soil samples. These data are based on three soil texture classification schemes, including one ISSS (International Society of Soil Science) scheme with four data points and two Katschinski's schemes with five and six data points, respectively. The adjusted coefficient of determination (r²), Akaike's information criterion (AIC), and geometric mean error ratio (GMER) were used to evaluate the model performance. The soil data were converted to the USDA (United States Department of Agriculture) standard using PSD models and the fractal concept. The performance of PSD models was affected by soil texture and classification of fraction schemes. The performance of PSD models also varied with clay content of soils. The Anderson, Fredlund, modified logistic growth, Skaggs, and Weibull models were the best. PMID:25121108

  19. Bivariate copula in fitting rainfall data

    NASA Astrophysics Data System (ADS)

    Yee, Kong Ching; Suhaila, Jamaludin; Yusof, Fadhilah; Mean, Foo Hui

    2014-07-01

    The use of copulas to determine the joint distribution between two variables is widespread in various areas. The joint distribution of rainfall characteristics obtained using a copula model is more suitable than standard bivariate modelling, as the copula is believed to overcome some of its limitations. Six copula models are applied to obtain the most suitable bivariate distribution between two rain gauge stations. The copula models are Ali-Mikhail-Haq (AMH), Clayton, Frank, Galambos, Gumbel-Hougaard (GH) and Plackett. The rainfall data used in the study are selected from rain gauge stations located in the southern part of Peninsular Malaysia during the period from 1980 to 2011. The goodness-of-fit test in this study is based on the Akaike information criterion (AIC).

  20. Correlation between the phase of the moon and the occurrences of microearthquakes in the Tamba region through point-process modeling

    NASA Astrophysics Data System (ADS)

    Iwata, Takaki; Katao, Hiroshi

    2006-04-01

    We study the correlation between the phase of the moon and the occurrence of microearthquakes in the Tamba region, close to the fault of the 1995 Kobe earthquake. The existence of the correlation during the two-year period following the Kobe earthquake was suggested in a previous study. First, in this study, we investigate the statistical significance of such correlation. Using point-process modeling and AIC (Akaike Information Criterion), we confirm that the existence of the correlation is statistically significant. Second, we investigate the temporal variation of the correlation during the four-year period following the Kobe earthquake. The result of the second analysis indicates that the correlation is strongest just after the Kobe earthquake and that it then becomes weaker year by year.

  1. A K-BKZ Formulation for Soft-Tissue Viscoelasticity

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.; Diethelm, Kai

    2005-01-01

    A viscoelastic model of the K-BKZ (Kaye 1962; Bernstein et al. 1963) type is developed for isotropic biological tissues, and applied to the fat pad of the human heel. To facilitate this pursuit, a class of elastic solids is introduced through a novel strain-energy function whose elements possess strong ellipticity, and therefore lead to stable material models. The standard fractional-order viscoelastic (FOV) solid is used to arrive at the overall elastic/viscoelastic structure of the model, while the elastic potential via the K-BKZ hypothesis is used to arrive at the tensorial structure of the model. Candidate sets of functions are proposed for the elastic and viscoelastic material functions present in the model, including a regularized fractional derivative that was determined to be the best. The Akaike information criterion (AIC) is advocated for performing multi-model inference, enabling an objective selection of the best material function from within a candidate set.

  2. Thermal Signature Identification System (TheSIS)

    NASA Technical Reports Server (NTRS)

    Merritt, Scott; Bean, Brian

    2015-01-01

    We characterize both nonlinear and high order linear responses of fiber-optic and optoelectronic components using spread spectrum temperature cycling methods. This Thermal Signature Identification System (TheSIS) provides much more detail than conventional narrowband or quasi-static temperature profiling methods. This detail allows us to match components more thoroughly, detect subtle reversible shifts in performance, and investigate the cause of instabilities or irreversible changes. In particular, we create parameterized models of athermal fiber Bragg gratings (FBGs), delay line interferometers (DLIs), and distributed feedback (DFB) lasers, then subject the alternative models to selection via the Akaike Information Criterion (AIC). Detailed pairing of components, e.g. FBGs, is accomplished by means of weighted distance metrics or norms, rather than on the basis of a single parameter, such as center wavelength.

  3. Seasonal fractional integrated time series models for rainfall data in Nigeria

    NASA Astrophysics Data System (ADS)

    Yaya, Olaoluwa S.; Fashae, Olutoyin A.

    2015-04-01

    Rainfall variability, seasonality and extremity have a lot of consequences in planning and decision making of every sphere of human endeavour especially in Nigeria where majority of agricultural practices and planning is dependent on rainfed agriculture. For this reason, an extensive understanding of rainfall regime is an important prerequisite in such planning. We approach this work using time series approach. Seasonality and possibility of long-term dependence in rainfall data are considered, and these have significant effects in explaining the distribution of rainfall in each state of the six geopolitical zones of Nigeria. The estimated seasonal autoregressive fractionally integrated moving average (SARFIMA) model for each of the six rainfall zones was found to perform better in predicting rainfall distribution than the corresponding seasonal autoregressive moving average (SARMA) model in terms of minimum Akaike information criterion (AIC) and other model diagnostic measures.

  4. Method for identifying electromagnetically induced transparency in a tunable circuit quantum electrodynamics system

    NASA Astrophysics Data System (ADS)

    Liu, Qi-Chun; Li, Tie-Fu; Luo, Xiao-Qing; Zhao, Hu; Xiong, Wei; Zhang, Ying-Shan; Chen, Zhen; Liu, J. S.; Chen, Wei; Nori, Franco; Tsai, J. S.; You, J. Q.

    2016-05-01

    Electromagnetically induced transparency (EIT) has been realized in atomic systems, but fulfilling the EIT conditions for artificial atoms made from superconducting circuits is a more difficult task. Here we report an experimental observation of the EIT in a tunable three-dimensional transmon by probing the cavity transmission. To fulfill the EIT conditions, we tune the transmon to adjust its damping rates by utilizing the effect of the cavity on the transmon states. From the experimental observations, we clearly identify the EIT and Autler-Townes splitting (ATS) regimes as well as the transition regime in between. Also, the experimental data demonstrate that the threshold ΩAIC determined by the Akaike information criterion can describe the EIT-ATS transition better than the threshold ΩEIT given by the EIT theory.

  5. Bayesian decision tree for the classification of the mode of motion in single-molecule trajectories.

    PubMed

    Türkcan, Silvan; Masson, Jean-Baptiste

    2013-01-01

    Membrane proteins move in heterogeneous environments with spatially (sometimes temporally) varying friction and with biochemical interactions with various partners. It is important to reliably distinguish different modes of motion to improve our knowledge of the membrane architecture and to understand the nature of interactions between membrane proteins and their environments. Here, we present an analysis technique for single molecule tracking (SMT) trajectories that can determine the preferred model of motion that best matches observed trajectories. The method is based on Bayesian inference to calculate the posterior probability of an observed trajectory according to a certain model. Information theory criteria, such as the Bayesian information criterion (BIC), the Akaike information criterion (AIC), and modified AIC (AICc), are used to select the preferred model. The considered group of models includes free Brownian motion, and confined motion in 2nd or 4th order potentials. We determine the best information criteria for classifying trajectories. We tested its limits through simulations matching large sets of experimental conditions and we built a decision tree. This decision tree first uses the BIC to distinguish between free Brownian motion and confined motion. In a second step, it classifies the confining potential further using the AIC. We apply the method to experimental Clostridium perfringens [Formula: see text]-toxin (CP[Formula: see text]T) receptor trajectories to show that these receptors are confined by a spring-like potential. An adaptation of this technique was applied on a sliding window in the temporal dimension along the trajectory. We applied this adaptation to experimental CP[Formula: see text]T trajectories that lose confinement due to disaggregation of confining domains. This new technique adds another dimension to the discussion of SMT data. The mode of motion of a receptor might hold more biologically relevant information than the diffusion

  6. MMA, A Computer Code for Multi-Model Analysis

    SciTech Connect

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.

  7. Empirical extensions of the lasso penalty to reduce the false discovery rate in high-dimensional Cox regression models.

    PubMed

    Ternès, Nils; Rotolo, Federico; Michiels, Stefan

    2016-07-10

    Correct selection of prognostic biomarkers among multiple candidates is becoming increasingly challenging as the dimensionality of biological data becomes higher. Therefore, minimizing the false discovery rate (FDR) is of primary importance, while a low false negative rate (FNR) is a complementary measure. The lasso is a popular selection method in Cox regression, but its results depend heavily on the penalty parameter λ. Usually, λ is chosen using maximum cross-validated log-likelihood (max-cvl). However, this method has often a very high FDR. We review methods for a more conservative choice of λ. We propose an empirical extension of the cvl by adding a penalization term, which trades off between the goodness-of-fit and the parsimony of the model, leading to the selection of fewer biomarkers and, as we show, to the reduction of the FDR without large increase in FNR. We conducted a simulation study considering null and moderately sparse alternative scenarios and compared our approach with the standard lasso and 10 other competitors: Akaike information criterion (AIC), corrected AIC, Bayesian information criterion (BIC), extended BIC, Hannan and Quinn information criterion (HQIC), risk information criterion (RIC), one-standard-error rule, adaptive lasso, stability selection, and percentile lasso. Our extension achieved the best compromise across all the scenarios between a reduction of the FDR and a limited raise of the FNR, followed by the AIC, the RIC, and the adaptive lasso, which performed well in some settings. We illustrate the methods using gene expression data of 523 breast cancer patients. In conclusion, we propose to apply our extension to the lasso whenever a stringent FDR with a limited FNR is targeted. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26970107

  8. The evaluation of different forest structural indices to predict the stand aboveground biomass of even-aged Scotch pine (Pinus sylvestris L.) forests in Kunduz, Northern Turkey.

    PubMed

    Ercanli, İlker; Kahriman, Aydın

    2015-03-01

    We assessed the effect of stand structural diversity, including the Shannon, improved Shannon, Simpson, McIntosh, Margalef, and Berger-Parker indices, on stand aboveground biomass (AGB) and developed statistical prediction models for the stand AGB values, including stand structural diversity indices and some stand attributes. The AGB prediction model, including only stand attributes, accounted for 85 % of the total variance in AGB (R²) with an Akaike's information criterion (AIC) of 807.2407, Bayesian information criterion (BIC) of 809.5397, Schwarz Bayesian criterion (SBC) of 818.0426, and root mean square error (RMSE) of 38.529 Mg. After inclusion of the stand structural diversity into the model structure, considerable improvement was observed in statistical accuracy: the model explained 97.5 % of the total variance in AGB, with an AIC of 614.1819, BIC of 617.1242, SBC of 633.0853, and RMSE of 15.8153 Mg. The predictive fitting results indicate that some indices describing the stand structural diversity can be employed as significant independent variables to predict the AGB production of the Scotch pine stand. Further, including the stand diversity indices in the AGB prediction model with the stand attributes provided important predictive contributions in estimating the total variance in AGB. PMID:25663395
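
    As an illustration of the kind of comparison reported above (a model with stand attributes only versus one that adds a diversity index), the sketch below fits two ordinary least-squares models to simulated data and compares their AIC and BIC with statsmodels; the variable names and data are invented for the example.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 80
        basal_area = rng.uniform(20, 60, n)        # illustrative stand attribute
        shannon = rng.uniform(0.5, 2.5, n)         # illustrative diversity index
        agb = 5 + 1.8 * basal_area + 12 * shannon + rng.normal(0, 10, n)

        X1 = sm.add_constant(basal_area)                              # attributes only
        X2 = sm.add_constant(np.column_stack([basal_area, shannon]))  # + diversity index

        m1 = sm.OLS(agb, X1).fit()
        m2 = sm.OLS(agb, X2).fit()
        print("attributes only    : AIC=%.1f  BIC=%.1f" % (m1.aic, m1.bic))
        print("with diversity idx : AIC=%.1f  BIC=%.1f" % (m2.aic, m2.bic))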

  9. An improved automatic time-of-flight picker for medical ultrasound tomography

    PubMed Central

    Li, Cuiping; Huang, Lianjie; Duric, Nebojsa; Zhang, Haijiang; Rowe, Charlotte

    2014-01-01

    Objective and motivation Time-of-flight (TOF) tomography used by a clinical ultrasound tomography device can efficiently and reliably produce sound-speed images of the breast for cancer diagnosis. Accurate picking of TOFs of transmitted ultrasound signals is extremely important to ensure high-resolution and high-quality ultrasound sound-speed tomograms. Since manual picking is time-consuming for large datasets, we developed an improved automatic TOF picker based on the Akaike information criterion (AIC), as described in this paper. Methods We make use of an approach termed multi-model inference (model averaging), based on the calculated AIC values, to improve the accuracy of TOF picks. By using multi-model inference, our picking method incorporates all the information near the TOF of ultrasound signals. Median filtering and reciprocal pair comparison are also incorporated in our AIC picker to effectively remove outliers. Results We validate our AIC picker using synthetic ultrasound waveforms, and demonstrate that our automatic TOF picker can accurately pick TOFs in the presence of random noise with absolute amplitudes up to 80% of the maximum absolute signal amplitude. We apply the new method to 1160 in vivo breast ultrasound waveforms, and compare the picked TOFs with manual picks and amplitude threshold picks. The mean value and standard deviation between our TOF picker and manual picking are 0.4 μs and 0.29 μs, while for the amplitude threshold picker the values are 1.02 μs and 0.9 μs, respectively. Tomograms for in vivo breast data with high signal-to-noise ratio (SNR) (~25 dB) and low SNR (~18 dB) clearly demonstrate that our AIC picker is much less sensitive to the SNRs of the data, compared to the amplitude threshold picker. Discussion and conclusions The picking routine developed here is aimed at determining reliable quantitative values, necessary for adding diagnostic information to our clinical ultrasound tomography device – CURE. It has been
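
    The record does not spell out the picker equations; a common variance-based AIC onset picker (often attributed to Maeda, 1985) scores every candidate split point by the log-variances of the samples before and after it and takes the minimum of the resulting AIC curve as the arrival. The sketch below applies that textbook form to a synthetic noisy trace; it is an assumption, not a statement about the CURE picker's exact formulation.

        import numpy as np

        def aic_curve(x):
            """Variance-based AIC curve; the onset lies near the curve's minimum."""
            n = len(x)
            aic = np.full(n, np.nan)
            for k in range(1, n - 1):
                v1 = np.var(x[:k])
                v2 = np.var(x[k:])
                if v1 > 0 and v2 > 0:
                    aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
            return aic

        # Synthetic trace: noise followed by a signal arriving at sample 300
        rng = np.random.default_rng(1)
        trace = rng.normal(0, 0.1, 1000)
        trace[300:] += np.sin(2 * np.pi * 0.02 * np.arange(700))
        pick = np.nanargmin(aic_curve(trace))
        print("picked onset sample:", pick)   # expected to be close to 300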

  10. The growth response of ostrich (Struthio camelus var. domesticus) chicks fed on diets with three different dietary protein and amino acid concentrations.

    PubMed

    Carstens, P D; Sharifi, A R; Brand, T S; Hoffman, L C

    2014-01-01

    1. Feeding costs are the largest expense in an ostrich production system, and protein is one of the more expensive components of the diet. This study evaluated the growth response of ostrich chicks on diets containing different concentrations of protein (amino acids). The diets were formulated to contain three concentrations of protein (one diet with 20% less protein than the conventional concentration, L; one diet with the conventional concentration of protein, M; and one diet with 20% more protein than the conventional concentration, H) for each of the phase diets. The phase diets were pre-starter, starter, grower and finisher. 2. This study includes the analysis of ostrich body weight (BW) by modelling growth with linear polynomial and non-linear functions for all the data not separated for treatments. In total, 3378 BW recordings of 90 animals were collected weekly from hatch (d 0) to 287 d (41 weeks) of age. 3. Seven non-linear growth models and three linear polynomial models were fitted to the data. The growth functions were compared by using Akaike's information criterion (AIC). For the non-linear models, the Bridges and Janoschek models had the lowest AIC values for the H treatment, while the Richards curve had the lowest value for M and the von Bertalanffy for the L treatment. 4. For the linear polynomial models, the linear polynomial of the third degree had the lowest AIC values for all three treatments, thus making it the most suitable model for the data; therefore, the predictions of this model were used to interpret the growth data. Significant differences were found between treatments for growth data. 5. The results from this study can aid in describing the growth of ostriches subjected to optimum feeding conditions. This information can also be used in research when modelling the nutrient requirements of growing birds. PMID:25132424
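
    A minimal sketch of the model-comparison step described above: two of the candidate growth functions (Gompertz and logistic) are fitted by nonlinear least squares to simulated body-weight data and ranked by AIC under a Gaussian-error assumption; the parameter values and data are illustrative, not the study's.

        import numpy as np
        from scipy.optimize import curve_fit

        def gompertz(t, a, b, c):
            return a * np.exp(-b * np.exp(-c * t))

        def logistic(t, a, b, c):
            return a / (1 + b * np.exp(-c * t))

        def aic_gaussian(y, yhat, k):
            """AIC for a least-squares fit assuming i.i.d. Gaussian errors."""
            n = len(y)
            rss = np.sum((y - yhat) ** 2)
            return n * np.log(rss / n) + 2 * (k + 1)   # +1 for the error variance

        # Simulated weekly body weights from hatch to 41 weeks (illustrative)
        rng = np.random.default_rng(2)
        t = np.arange(0, 42)
        y = gompertz(t, 100, 4.5, 0.12) + rng.normal(0, 3, t.size)

        for name, f in [("Gompertz", gompertz), ("logistic", logistic)]:
            p, _ = curve_fit(f, t, y, p0=[100, 4, 0.1], maxfev=10000)
            print(name, "AIC =", round(aic_gaussian(y, f(t, *p), k=3), 1))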

  11. Influence of plaque characteristics on fractional flow reserve for coronary lesions with intermediate to obstructive stenosis: insights from integrated-backscatter intravascular ultrasound analysis.

    PubMed

    Sakurai, Shinichiro; Takashima, Hiroaki; Waseda, Katsuhisa; Gosho, Masahiko; Kurita, Akiyoshi; Ando, Hirohiko; Maeda, Kazuyuki; Suzuki, Akihiro; Fujimoto, Masanobu; Amano, Tetsuya

    2015-10-01

    The aim of this study was to determine the correlation between the fractional flow reserve (FFR) values and volumetric intravascular ultrasound (IVUS) parameters derived from classic gray-scale IVUS and integrated backscatter (IB)-IVUS, taking into account known confounding factors. Patients with unstable angina pectoris, in whom vulnerable plaques frequently develop, often show a discrepancy between the FFR value and the quantitative coronary angiography findings. Our target population was 107 consecutive subjects with 114 isolated lesions who were scheduled for elective coronary angiography. The FFR was calculated as the mean distal coronary pressure divided by the mean aortic pressure during maximal hyperemia. Various volumetric parameters such as lipid plaque volume (LPV) and percentage of LPV (%LPV) were measured using IB-IVUS. Simple and multivariate linear regression analysis was employed to evaluate the correlation between FFR values and various classic gray-scale IVUS and IB-IVUS parameters. The Akaike information criterion (AIC) was used to compare the goodness of fit of each model. Both the %LPV (r = -0.24; p = 0.01) and LPV (r = -0.40; p < 0.01) were significantly correlated with the FFR value. Only the LPV (AIC = -147.0; p = 0.006) and %LPV (AIC = -152.9; p = 0.005) proved to be independent predictors for the FFR value even after adjustment for known confounding factors. The volumetric assessment by IB-IVUS could provide better information in terms of the relationship between plaque morphology and the FFR values as compared to the classic IVUS 2-dimensional gray-scale analysis. PMID:26129657

  12. Assessing bimodality to detect the presence of a dual cognitive process.

    PubMed

    Freeman, Jonathan B; Dale, Rick

    2013-03-01

    Researchers have long sought to distinguish between single-process and dual-process cognitive phenomena, using responses such as reaction times and, more recently, hand movements. Analysis of a response distribution's modality has been crucial in detecting the presence of dual processes, because they tend to introduce bimodal features. Rarely, however, have bimodality measures been systematically evaluated. We carried out tests of readily available bimodality measures that any researcher may easily employ: the bimodality coefficient (BC), Hartigan's dip statistic (HDS), and the difference in Akaike's information criterion between one-component and two-component distribution models (AIC(diff)). We simulated distributions containing two response populations and examined the influences of (1) the distances between populations, (2) proportions of responses, (3) the amount of positive skew present, and (4) sample size. Distance always had a stronger effect than did proportion, and the effects of proportion greatly differed across the measures. Skew biased the measures by increasing bimodality detection, in some cases leading to anomalous interactive effects. BC and HDS were generally convergent, but a number of important discrepancies were found. AIC(diff) was extremely sensitive to bimodality and identified nearly all distributions as bimodal. However, all measures served to detect the presence of bimodality in comparison to unimodal simulations. We provide a validation with experimental data, discuss methodological and theoretical implications, and make recommendations regarding the choice of analysis. PMID:22806703
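
    A small sketch of the AIC(diff) measure using scikit-learn's Gaussian mixtures: one-component and two-component models are fitted to simulated responses drawn from two populations and their AIC values are differenced. The sign convention shown (one-component minus two-component, so positive values favour bimodality) is an assumption about the original definition, and the data are illustrative.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(3)
        # Two response populations (illustrative), mixed 60/40
        x = np.concatenate([rng.normal(450, 40, 600), rng.normal(650, 60, 400)])
        X = x.reshape(-1, 1)

        aic1 = GaussianMixture(n_components=1, random_state=0).fit(X).aic(X)
        aic2 = GaussianMixture(n_components=2, random_state=0).fit(X).aic(X)
        aic_diff = aic1 - aic2   # positive values favour the two-component (bimodal) model
        print("AIC(diff) =", round(aic_diff, 1))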

  13. Age and Growth of the Round Stingray Urotrygon rogersi, a Particularly Fast-Growing and Short-Lived Elasmobranch

    PubMed Central

    Mejía-Falla, Paola A.; Cortés, Enric; Navia, Andrés F.; Zapata, Fernando A.

    2014-01-01

    We examined the age and growth of Urotrygon rogersi on the Colombian coast of the Eastern Tropical Pacific Ocean by directly estimating age using vertebral centra. We verified annual deposition of growth increments with marginal increment analysis. Eight growth curves were fitted to four data sets defined on the basis of the reproductive cycle (unadjusted or adjusted for age at first band) and size variables (disc width or total length). Model performance was evaluated using Akaike's Information Criterion (AIC), AIC weights and multi-model inference criteria. A two-phase growth function with adjusted age provided the best description of growth for females (based on five parameters, DW∞ = 20.1 cm, k = 0.22 yr⁻¹) and males (based on four and five parameters, DW∞ = 15.5 cm, k = 0.65 yr⁻¹). Median maturity of female and male U. rogersi is reached very fast (mean ± SE = 1.0 ± 0.1 year). This is the first age and growth study for a species of the genus Urotrygon and results indicate that U. rogersi attains a smaller maximum size and has a shorter lifespan and lower median age at maturity than species of closely related genera. These life history traits are in contrast with those typically reported for other elasmobranchs. PMID:24776963

  14. Simple and Efficient Algorithm for Improving the MDL Estimator of the Number of Sources

    PubMed Central

    Guimarães, Dayan A.; de Souza, Rausley A. A.

    2014-01-01

    We propose a simple algorithm for improving the MDL (minimum description length) estimator of the number of sources of signals impinging on multiple sensors. The algorithm is based on the norms of vectors whose elements are the normalized and nonlinearly scaled eigenvalues of the received signal covariance matrix and the corresponding normalized indexes. Such norms are used to discriminate the largest eigenvalues from the remaining ones, thus allowing for the estimation of the number of sources. The MDL estimate is used as the input data of the algorithm. Numerical results unveil that the so-called norm-based improved MDL (iMDL) algorithm can achieve performances that are better than those achieved by the MDL estimator alone. Comparisons are also made with the well-known AIC (Akaike information criterion) estimator and with a recently-proposed estimator based on the random matrix theory (RMT). It is shown that our algorithm can also outperform the AIC and the RMT-based estimator in some situations. PMID:25330050
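
    For context on the AIC estimator referred to above, a common textbook form (Wax and Kailath, 1985) scores each candidate source count by how uniform the remaining "noise" eigenvalues of the sample covariance matrix are; the sketch below uses placeholder eigenvalues and snapshot count, and is not the proposed iMDL algorithm itself.

        import numpy as np

        def aic_num_sources(eigvals, n_snapshots):
            """Wax-Kailath AIC estimate of the number of sources.

            eigvals     : eigenvalues of the sample covariance matrix (any order)
            n_snapshots : number of snapshots used to form the covariance matrix
            """
            lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
            m = lam.size
            scores = []
            for k in range(m):                       # candidate number of sources
                tail = lam[k:]
                geo = np.exp(np.mean(np.log(tail)))  # geometric mean of noise eigenvalues
                ari = np.mean(tail)                  # arithmetic mean of noise eigenvalues
                scores.append(-2 * n_snapshots * (m - k) * np.log(geo / ari)
                              + 2 * k * (2 * m - k))
            return int(np.argmin(scores)), scores

        # Placeholder eigenvalues: two dominant (signal) plus noise-level eigenvalues
        eigs = [9.1, 6.3, 1.05, 1.02, 0.98, 0.95]
        k_hat, _ = aic_num_sources(eigs, n_snapshots=200)
        print("estimated number of sources:", k_hat)   # expected 2 for these values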

  15. Pharmacokinetic Analysis of 64Cu-ATSM Dynamic PET in Human Xenograft Tumors in Mice

    PubMed Central

    Li, Fan; Jørgensen, Jesper Tranekjær; Madsen, Jacob; Kjaer, Andreas

    2015-01-01

    The aim of this study was to evaluate the feasibility of performing voxel-wise kinetic modeling on datasets obtained from tumor-bearing mice that underwent dynamic PET scans with 64Cu-ATSM and extracting useful physiological parameters. Methods: Tumor-bearing mice underwent 90-min dynamic PET scans with 64Cu-ATSM and CT scans with contrast. Irreversible and reversible two-tissue compartment models were fitted to time activity curves (TACs) obtained from whole tumor volumes and compared using the Akaike information criterion (AIC). Based on voxel-wise pharmacokinetic analysis, parametric maps of model rate constants k1, k3 and Ki were generated and compared to 64Cu-ATSM uptake. Results: Based on the AIC, an irreversible two-tissue compartment model was selected for voxel-wise pharmacokinetic analysis. Of the extracted parameters, k1 (~perfusion) showed a strong correlation with early tracer uptake (mean Spearman R = 0.88) 5 min post injection (pi). Moreover, positive relationships were found between late tracer uptake (90 min pi) and both k3 and the net influx rate constant, Ki (mean Spearman R = 0.56 and R = 0.86, respectively). Conclusion: This study shows the feasibility of extracting relevant parameters from voxel-wise pharmacokinetic analysis to be used for preclinical validation of 64Cu-ATSM as a hypoxia-specific PET tracer. PMID:26854145

  16. Double point source W-phase inversion: Real-time implementation and automated model selection

    NASA Astrophysics Data System (ADS)

    Nealy, Jennifer L.; Hayes, Gavin P.

    2015-12-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique, which can be efficiently implemented in real-time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion with centroid locations fixed at the single source solution location can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is most appropriately described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed with extended analyses of the September 29, 2009 magnitude 8.1 Samoa earthquake and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquake. The AIC test is shown to be able to accurately select the most appropriate model and the selected W-phase inversion is shown to yield reliable solutions that match published analyses of the same events.

  17. Delirium and other clinical factors with Clostridium difficile infection that predict mortality in hospitalized patients

    PubMed Central

    Archbald-Pannone, Laurie R.; McMurry, Timothy L.; Guerrant, Richard L.; Warren, Cirle A.

    2015-01-01

    Background Clostridium difficile infection (CDI) severity has increased, especially among hospitalized elderly. We evaluated clinical factors to predict mortality following CDI. Methods We collected data from inpatients diagnosed with CDI at a US academic medical center (HSR-IRB# 13630). We evaluated age, Charlson comorbidity index (CCI), admission from a long-term care facility (LTCF), intensive care unit (ICU) at time of diagnosis, white blood cell count (WBC), blood urea nitrogen (BUN), low body mass index (BMI), and delirium as possible predictors. A parsimonious predictive model was chosen using the Akaike information criterion (AIC) and a best subsets model selection algorithm. Area under the ROC curve was used to assess the model's predictive performance, with AIC as the selection criterion across all subsets to measure fit while controlling for over-fitting. Results From 362 subjects, the selected model included CCI, WBC, BUN, ICU, and delirium. The logistic regression coefficients were converted to a points scale and calibrated so that each unit on the CCI contributed 2 points, ICU contributed 5, unit of WBC (natural log scale) contributed 3, unit of BUN contributed 5, and delirium contributed 11. Discussion Our model shows substantial ability to predict short-term mortality in patients hospitalized with CDI. Conclusion Patients who were diagnosed in the ICU and developed delirium are at highest risk for dying within 30 days of CDI diagnosis. PMID:25920706
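
    A rough sketch of an AIC-scored best-subsets search for a logistic model of the kind described above, using statsmodels; the predictor names echo the abstract but the data are simulated stand-ins, so the selected subset carries no clinical meaning.

        import itertools
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 362
        # Simulated stand-ins for the candidate predictors
        data = {
            "cci": rng.poisson(3, n),
            "icu": rng.binomial(1, 0.2, n),
            "log_wbc": rng.normal(2.3, 0.5, n),
            "bun": rng.normal(25, 10, n),
            "delirium": rng.binomial(1, 0.3, n),
        }
        names = list(data.keys())
        X_full = np.column_stack(list(data.values()))
        logit_true = -4 + 0.3 * data["cci"] + 1.2 * data["icu"] + 0.9 * data["delirium"]
        y = rng.binomial(1, 1 / (1 + np.exp(-logit_true)))

        best = (np.inf, None)
        for r in range(1, len(names) + 1):
            for subset in itertools.combinations(range(len(names)), r):
                X = sm.add_constant(X_full[:, subset])
                fit = sm.Logit(y, X).fit(disp=0)
                if fit.aic < best[0]:
                    best = (fit.aic, [names[i] for i in subset])
        print("lowest-AIC subset:", best[1], "AIC =", round(best[0], 1))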

  18. The Hyper-Envelope Modeling Interface (HEMI): A Novel Approach Illustrated Through Predicting Tamarisk (Tamarix spp.) Habitat in the Western USA

    USGS Publications Warehouse

    Graham, Jim; Young, Nick; Jarnevich, Catherine S.; Newman, Greg; Evangelista, Paul; Stohlgren, Thomas J.

    2013-01-01

    Habitat suitability maps are commonly created by modeling a species’ environmental niche from occurrences and environmental characteristics. Here, we introduce the hyper-envelope modeling interface (HEMI), providing a new method for creating habitat suitability models using Bezier surfaces to model a species niche in environmental space. HEMI allows modeled surfaces to be visualized and edited in environmental space based on expert knowledge and does not require absence points for model development. The modeled surfaces require relatively few parameters compared to similar modeling approaches and may produce models that better match ecological niche theory. As a case study, we modeled the invasive species tamarisk (Tamarix spp.) in the western USA. We compare results from HEMI with those from existing similar modeling approaches (including BioClim, BioMapper, and Maxent). We used synthetic surfaces to create visualizations of the various models in environmental space and used the modified area under the curve (AUC) statistic and the Akaike information criterion (AIC) as measures of model performance. We show that HEMI produced slightly better AUC values than all methods except Maxent, and better AIC values overall. HEMI created a model with only ten parameters while Maxent produced a model with over 100 and BioClim used only eight. Additionally, HEMI allowed visualization and editing of the model in environmental space to develop alternative potential habitat scenarios. The use of Bezier surfaces can provide simple models that match our expectations of biological niche models and, at least in some cases, outperform more complex approaches.

  19. Modelling the growth of tambaqui, Colossoma macropomum (Cuvier, 1816) in floodplain lakes: model selection and multimodel inference.

    PubMed

    Costa, L R F; Barthem, R B; Albernaz, A L; Bittencourt, M M; Villacorta-Corrêa, M A

    2013-05-01

    The tambaqui, Colossoma macropomum, is one of the most commercially valuable Amazonian fish species, and in the floodplains of the region, they are caught in both rivers and lakes. Most growth studies on this species to date have fitted only one growth model, the von Bertalanffy, without considering its possible uncertainties. In this study, four different models (von Bertalanffy, Logistic, Gompertz and the general model of Schnute-Richards) were fitted to a data set of fish caught within lakes from the middle Solimões River. These models were fitted as non-linear equations, using the sample size of each age class as its weight. The fit of each model was evaluated based on the Akaike Information Criterion (AIC), the AIC differences between models (Δi) and the evidence weights (wi). Both the Logistic (Δi = 0.0) and Gompertz (Δi = 1.12) models were supported by the data, but neither of them was clearly superior (wi, respectively 52.44 and 29.95%). Thus, we propose the use of an averaged model to estimate the asymptotic length (L∞). The averaged model, based on the Logistic and Gompertz models, resulted in an estimate of L∞ = 90.36, indicating that the tambaqui would take approximately 25 years to reach this asymptotic size. PMID:23917568

  20. Marginal Likelihood Estimate Comparisons to Obtain Optimal Species Delimitations in Silene sect. Cryptoneurae (Caryophyllaceae)

    PubMed Central

    Aydin, Zeynep; Marcussen, Thomas; Ertekin, Alaattin Selcuk; Oxelman, Bengt

    2014-01-01

    Coalescent-based inference of phylogenetic relationships among species takes into account gene tree incongruence due to incomplete lineage sorting, but for such methods to make sense species have to be correctly delimited. Because alternative assignments of individuals to species result in different parametric models, model selection methods can be applied to choose among models of species classification. In a Bayesian framework, Bayes factors (BF), based on marginal likelihood estimates, can be used to test a range of possible classifications for the group under study. Here, we explore BF and the Akaike Information Criterion (AIC) to discriminate between different species classifications in the flowering plant lineage Silene sect. Cryptoneurae (Caryophyllaceae). We estimated marginal likelihoods for different species classification models via the Path Sampling (PS), Stepping Stone sampling (SS), and Harmonic Mean Estimator (HME) methods implemented in BEAST. To select among alternative species classification models, a posterior simulation-based analog of the AIC through Markov chain Monte Carlo analysis (AICM) was also performed. The results are compared to outcomes from the software BP&P. Our results agree with another recent study in finding that marginal likelihood estimates from the PS and SS methods are useful for comparing different species classifications, and strongly support the recognition of the newly described species S. ertekinii. PMID:25216034

  1. An Investigation of State-Space Model Fidelity for SSME Data

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2008-01-01

    In previous studies, a variety of unsupervised anomaly detection techniques were applied to SSME (Space Shuttle Main Engine) data. The observed results indicated that the identification of certain anomalies was specific to the algorithmic method under consideration. This is the reason why one of the follow-on goals of these previous investigations was to build an architecture to support the best capabilities of all algorithms. We address that goal here by investigating a cascade, serial architecture for the best performing and most suitable candidates from previous studies. As a precursor to a formal ROC (Receiver Operating Characteristic) curve analysis for validation of the resulting anomaly detection algorithms, our primary focus here is to investigate the model fidelity, as measured by variants of the AIC (Akaike Information Criterion), for state-space based models. We show that placing constraints on a state-space model during or after the training of the model introduces a modest level of suboptimality. Furthermore, we compare the fidelity of all candidate models, including those embodying the cascade, serial architecture. We make recommendations on the most suitable candidates for application to subsequent anomaly detection studies, as measured by AIC-based criteria.

  2. Double point source W-phase inversion: Real-time implementation and automated model selection

    USGS Publications Warehouse

    Nealy, Jennifer; Hayes, Gavin

    2015-01-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique, which can be efficiently implemented in real-time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion with centroid locations fixed at the single source solution location can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is most appropriately described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed with extended analyses of the September 29, 2009 magnitude 8.1 Samoa earthquake and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquake. The AIC test is shown to be able to accurately select the most appropriate model and the selected W-phase inversion is shown to yield reliable solutions that match published analyses of the same events.

  3. Prediction of Vigilant Attention and Cognitive Performance Using Self-Reported Alertness, Circadian Phase, Hours since Awakening, and Accumulated Sleep Loss

    PubMed Central

    Bermudez, Eduardo B.; Klerman, Elizabeth B.; Czeisler, Charles A.; Cohen, Daniel A.; Wyatt, James K.; Phillips, Andrew J. K.

    2016-01-01

    Sleep restriction causes impaired cognitive performance that can result in adverse consequences in many occupational settings. Individuals may rely on self-perceived alertness to decide if they are able to adequately perform a task. It is therefore important to determine the relationship between an individual’s self-assessed alertness and their objective performance, and how this relationship depends on circadian phase, hours since awakening, and cumulative lost hours of sleep. Healthy young adults (aged 18–34) completed an inpatient schedule that included forced desynchrony of sleep/wake and circadian rhythms with twelve 42.85-hour “days” and either a 1:2 (n = 8) or 1:3.3 (n = 9) ratio of sleep-opportunity:enforced-wakefulness. We investigated whether subjective alertness (visual analog scale), circadian phase (melatonin), hours since awakening, and cumulative sleep loss could predict objective performance on the Psychomotor Vigilance Task (PVT), an Addition/Calculation Test (ADD) and the Digit Symbol Substitution Test (DSST). Mathematical models that allowed nonlinear interactions between explanatory variables were evaluated using the Akaike Information Criterion (AIC). Subjective alertness was the single best predictor of PVT, ADD, and DSST performance. Subjective alertness alone, however, was not an accurate predictor of PVT performance. The best AIC scores for PVT and DSST were achieved when all explanatory variables were included in the model. The best AIC score for ADD was achieved with circadian phase and subjective alertness variables. We conclude that subjective alertness alone is a weak predictor of objective vigilant or cognitive performance. Predictions can, however, be improved by knowing an individual’s circadian phase, current wake duration, and cumulative sleep loss. PMID:27019198

  4. Comparison of Regression Methods to Compute Atmospheric Pressure and Earth Tidal Coefficients in Water Level Associated with Wenchuan Earthquake of 12 May 2008

    NASA Astrophysics Data System (ADS)

    He, Anhua; Singh, Ramesh P.; Sun, Zhaohua; Ye, Qing; Zhao, Gang

    2016-05-01

    Earth tides, atmospheric pressure, precipitation and earthquakes all influence water well levels; earthquakes in particular have a strong impact, and anomalous co-seismic changes in groundwater levels have been observed. In this paper, we have used four different models, simple linear regression (SLR), multiple linear regression (MLR), principal component analysis (PCA) and partial least squares (PLS), to compute the atmospheric pressure and earth tidal effects on water level. Furthermore, we have used the Akaike information criterion (AIC) to study the performance of the various models. Based on the lowest AIC and sum of squares for error values, the best estimate of the effects of atmospheric pressure and earth tide on water level is found using the MLR model. However, the MLR model does not account for multicollinearity between inputs; as a result, the atmospheric pressure and earth tidal response coefficients fail to reflect the mechanisms associated with the groundwater level fluctuations. Among the models that resolve the serious multicollinearity of the inputs, the PLS model shows the minimum AIC value. The atmospheric pressure and earth tidal response coefficients obtained with the PLS model agree closely with the observations. The atmospheric pressure and the earth tidal response coefficients are found to be sensitive to the stress-strain state using the data observed at the Chuan 03# well for the period 1 April-8 June 2008. The transient enhancement of porosity of the rock mass around the Chuan 03# well associated with the Wenchuan earthquake (Mw = 7.9, 12 May 2008), which returned to its original pre-seismic level after 13 days, indicates that the sharp co-seismic rise of the water well level could be induced by static stress change, rather than by the development of new fractures.

  5. Comparison of Regression Methods to Compute Atmospheric Pressure and Earth Tidal Coefficients in Water Level Associated with Wenchuan Earthquake of 12 May 2008

    NASA Astrophysics Data System (ADS)

    He, Anhua; Singh, Ramesh P.; Sun, Zhaohua; Ye, Qing; Zhao, Gang

    2016-07-01

    Earth tides, atmospheric pressure, precipitation and earthquakes all influence water well levels; earthquakes in particular have a strong impact, and anomalous co-seismic changes in groundwater levels have been observed. In this paper, we have used four different models, simple linear regression (SLR), multiple linear regression (MLR), principal component analysis (PCA) and partial least squares (PLS), to compute the atmospheric pressure and earth tidal effects on water level. Furthermore, we have used the Akaike information criterion (AIC) to study the performance of the various models. Based on the lowest AIC and sum of squares for error values, the best estimate of the effects of atmospheric pressure and earth tide on water level is found using the MLR model. However, the MLR model does not account for multicollinearity between inputs; as a result, the atmospheric pressure and earth tidal response coefficients fail to reflect the mechanisms associated with the groundwater level fluctuations. Among the models that resolve the serious multicollinearity of the inputs, the PLS model shows the minimum AIC value. The atmospheric pressure and earth tidal response coefficients obtained with the PLS model agree closely with the observations. The atmospheric pressure and the earth tidal response coefficients are found to be sensitive to the stress-strain state using the data observed at the Chuan 03# well for the period 1 April-8 June 2008. The transient enhancement of porosity of the rock mass around the Chuan 03# well associated with the Wenchuan earthquake (Mw = 7.9, 12 May 2008), which returned to its original pre-seismic level after 13 days, indicates that the sharp co-seismic rise of the water well level could be induced by static stress change, rather than by the development of new fractures.

  6. A Test of the DSM-5 Severity Scale for Alcohol Use Disorder

    PubMed Central

    Fazzino, Tera L.; Rose, Gail L.; Burt, Keith B.; Helzer, John E.

    2014-01-01

    BACKGROUND For the DSM-5-defined alcohol use disorder (AUD) diagnosis, a tri-categorized scale that designates mild, moderate, and severe AUD was selected over a fully dimensional scale to represent AUD severity. The purpose of this study was to test whether the DSM-5-defined AUD severity measure was as proficient a predictor of alcohol use following a brief intervention as a fully dimensional scale. METHODS Heavy drinking primary care patients (N=246) received a physician-delivered brief intervention (BI), and then reported daily alcohol consumption for six months using an Interactive Voice Response (IVR) system. The dimensional AUD measure we constructed was a summation of all AUD criteria met at baseline (mean = 6.5; SD = 2.5). A multi-model inference technique was used to determine whether the DSM-5 tri-categorized severity measure or a dimensional approach would provide a more precise prediction of change in weekly alcohol consumption following a BI. RESULTS The Akaike information criterion (AIC) for the dimensional AUD model (AIC=7623.88) was four points lower than that of the tri-categorized model (AIC=7627.88), and weight-of-evidence calculations indicated there was an 88% likelihood that the dimensional model was the better approximating model. The dimensional model significantly predicted change in alcohol consumption (p =.04) whereas the DSM-5 tri-categorized model did not. CONCLUSION A dimensional AUD measure was superior, detecting treatment effects that were not apparent with the tri-categorized severity model as defined by the DSM-5. We recommend using a dimensional measure for determining AUD severity. PMID:24893979
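
    The quoted weight of evidence follows directly from the two reported AIC values via Akaike weights, as the short sketch below reproduces (roughly 0.88 for the dimensional model).

        import numpy as np

        aic = np.array([7623.88, 7627.88])   # dimensional model, tri-categorized model
        delta = aic - aic.min()
        w = np.exp(-0.5 * delta)
        w /= w.sum()
        print(np.round(w, 3))   # approximately [0.88, 0.12]: the dimensional model is
                                # ~88% likely to be the better approximating model of the two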

  7. Prediction of Vigilant Attention and Cognitive Performance Using Self-Reported Alertness, Circadian Phase, Hours since Awakening, and Accumulated Sleep Loss.

    PubMed

    Bermudez, Eduardo B; Klerman, Elizabeth B; Czeisler, Charles A; Cohen, Daniel A; Wyatt, James K; Phillips, Andrew J K

    2016-01-01

    Sleep restriction causes impaired cognitive performance that can result in adverse consequences in many occupational settings. Individuals may rely on self-perceived alertness to decide if they are able to adequately perform a task. It is therefore important to determine the relationship between an individual's self-assessed alertness and their objective performance, and how this relationship depends on circadian phase, hours since awakening, and cumulative lost hours of sleep. Healthy young adults (aged 18-34) completed an inpatient schedule that included forced desynchrony of sleep/wake and circadian rhythms with twelve 42.85-hour "days" and either a 1:2 (n = 8) or 1:3.3 (n = 9) ratio of sleep-opportunity:enforced-wakefulness. We investigated whether subjective alertness (visual analog scale), circadian phase (melatonin), hours since awakening, and cumulative sleep loss could predict objective performance on the Psychomotor Vigilance Task (PVT), an Addition/Calculation Test (ADD) and the Digit Symbol Substitution Test (DSST). Mathematical models that allowed nonlinear interactions between explanatory variables were evaluated using the Akaike Information Criterion (AIC). Subjective alertness was the single best predictor of PVT, ADD, and DSST performance. Subjective alertness alone, however, was not an accurate predictor of PVT performance. The best AIC scores for PVT and DSST were achieved when all explanatory variables were included in the model. The best AIC score for ADD was achieved with circadian phase and subjective alertness variables. We conclude that subjective alertness alone is a weak predictor of objective vigilant or cognitive performance. Predictions can, however, be improved by knowing an individual's circadian phase, current wake duration, and cumulative sleep loss. PMID:27019198

  8. A Potential Role for Allostatic Load in Preeclampsia

    PubMed Central

    Hux, Vanessa J.; Roberts, James M.

    2014-01-01

    Objective Preeclampsia is a multisystemic disorder of pregnancy associated with maternal and fetal complications as well as later-life cardiovascular disease. Its exact cause is not known. We developed a pregnancy-specific multisystem index score of physiologic risk and chronic stress, allostatic load (AL), early in pregnancy. Our objective was to determine whether AL measured early in pregnancy was associated with increased odds of developing preeclampsia. Methods Data were from a single-center, prospectively collected database in a 1:2 individual-matched case control of women enrolled at <15 weeks gestation. We matched 38 preeclamptic cases to 75 uncomplicated, term deliveries on age, parity, and lifetime smoking status. AL was determined using 9 measures of cardiovascular, metabolic, and inflammatory function. Cases and matched controls were compared using conditional logistic regression. We compared the model's association with preeclampsia to that of obesity, a well-known risk factor for preeclampsia, by assessing goodness-of-fit by Akaike information criterion (AIC), where a difference >1-2 suggests better fit. Results Early pregnancy AL was higher in women with preeclampsia (1.25 +/- 0.68 vs. 0.83 +/- 0.62, p=0.002); women with higher AL had increasing odds of developing preeclampsia (OR 2.91, 95% CI 1.50-5.65). The difference between AIC for AL and obesity was >2 (AIC 74.4 vs. 84.4), indicating AL had a stronger association with preeclampsia. Conclusion Higher allostatic load in early pregnancy is associated with increasing odds of preeclampsia. This work supports a possible role of multiple maternal systems and chronic stress early in pregnancy in the development of preeclampsia. PMID:24939173

  9. Spatial Distribution of Black Bear Incident Reports in Michigan.

    PubMed

    McFadden-Hiller, Jamie E; Beyer, Dean E; Belant, Jerrold L

    2016-01-01

    Interactions between humans and carnivores have existed for centuries due to competition for food and space. American black bears are increasing in abundance and populations are expanding geographically in many portions of their range, including areas that are also increasing in human density, often resulting in associated increases in human-bear conflict (hereafter, bear incidents). We used public reports of bear incidents in Michigan, USA, from 2003-2011 to assess the relative contributions of ecological and anthropogenic variables in explaining the spatial distribution of bear incidents and estimated the potential risk of bear incidents. We used the weighted Normalized Difference Vegetation Index mean as an index of primary productivity, region (i.e., Upper Peninsula or Lower Peninsula), primary and secondary road densities, and percentage land cover type within 6.5-km² circular buffers around bear incidents and random points. We developed 22 a priori models and used generalized linear models and Akaike's Information Criterion (AIC) to rank models. The global model was the best compromise between model complexity and model fit (w = 0.99), with a ΔAIC of 8.99 units relative to the second-best performing model. We found that as deciduous forest cover increased, the probability of bear incident occurrence increased. Among the measured anthropogenic variables, cultivated crops and primary roads were the most important in our AIC-best model and were both positively related to the probability of bear incident occurrence. The spatial distribution of relative bear incident risk varied markedly throughout Michigan. Forest cover fragmented with agriculture and other anthropogenic activities presents an environment that likely facilitates bear incidents. Our map can help wildlife managers identify areas of bear incident occurrence, which in turn can be used to help develop strategies aimed at reducing incidents. Researchers and wildlife managers can use similar mapping techniques to

  10. Comparison of manual and automatic onset Time picking for local earthquake in North Eastern Italy.

    NASA Astrophysics Data System (ADS)

    Spallarossa, D.; Tiberi, L.; Costa, G.

    2012-04-01

    Automatic estimation of earthquake parameters continues to be of considerable interest to the seismological community. The automatic processing of seismic data, whether for real-time seismic warning systems or for reprocessing large amounts of seismic recordings, is increasingly being demanded by seismologists. This study presents a new method for automatic phase picking (P and S) that includes envelope function calculation, STA/LTA detectors and AR picking algorithms based on the Akaike information criterion (AIC). The main characteristics of the proposed picking algorithm are: a) Pre-filtering and envelope calculation to prearrange the onset; b) Preliminary detection of the P onset using both the AIC-based picker and the STA/LTA picker; c) S/N analysis, P validation, filtering and re-picking; d) Preliminary earthquake location; e) Detection of the S onset adopting the AIC-based picker; f) S/N analysis, S validation; g) Earthquake location. The algorithm is applied to a reference data set composed of 200 events with very heterogeneous qualities of P and S onsets, acquired by the South Eastern Alps Transfrontier network from 01/01/2008 to 03/31/2008 in North Eastern Italy and surrounding regions. These data are collected through the use of the software Antelope, an integrated collection of programs for data management and seismic data analysis. The reliability and robustness of the proposed algorithm are tested by comparing manually derived P and S readings (determined by an experienced seismic analyst), serving as reference picks, with the corresponding automatically estimated P and S arrival times. An additional analysis compares these automatic picks with the ones produced by Antelope, which uses only STA/LTA detectors, and finally studies the effect of these different sets of arrival times on the resulting locations for each database event. Preliminary results indicate that seismic detectors which integrate different techniques could improve the stability of the

  11. A procedure for seiche analysis with Bayesian information criterion

    NASA Astrophysics Data System (ADS)

    Aichi, Masaatsu

    2016-04-01

    Seiche is a standing wave in an enclosed or semi-enclosed water body. Its amplitude changes irregularly in time due to weather conditions, among other factors. Extracting the seiche signal is therefore not easy with usual methods for time series analysis such as the fast Fourier transform (FFT). In this study, a new method for time series analysis with a Bayesian information criterion was developed to decompose seiche, tide, long-term trend and residual components from time series data of tide stations. The method was developed based on the maximum marginal likelihood estimation of tide amplitudes, seiche amplitude, and trend components. The seiche amplitude and trend components were assumed to change gradually, with their second derivatives in time close to zero. These assumptions were incorporated as prior distributions. The variances of the prior distributions were estimated by minimizing the Akaike Bayesian information criterion (ABIC). The frequency of the seiche was determined by Newton's method with an initial guess from the FFT. The accuracy of the proposed method was checked by analyzing synthetic time series data composed of known components. The reproducibility of the original components was quite good. The proposed method was also applied to the actual time series data of sea level observed by a tide station and the strain of coastal rock masses observed by a fiber Bragg grating sensor in Aburatsubo Bay, Japan. The seiche in the bay and the corresponding response of the rock masses were successfully extracted.

  12. Anterior Insular Cortex and Emotional Awareness

    PubMed Central

    Gu, Xiaosi; Hof, Patrick R.; Friston, Karl J.; Fan, Jin

    2014-01-01

    This paper reviews the foundation for a role of the human anterior insular cortex (AIC) in emotional awareness, defined as the conscious experience of emotions. We first introduce the neuroanatomical features of AIC and existing findings on emotional awareness. Using empathy, the awareness and understanding of other people’s emotional states, as a test case, we then present evidence to demonstrate: 1) AIC and anterior cingulate cortex (ACC) are commonly coactivated as revealed by a meta-analysis, 2) AIC is functionally dissociable from ACC, 3) AIC integrates stimulus-driven and top-down information, and 4) AIC is necessary for emotional awareness. We propose a model in which AIC serves two major functions: integrating bottom-up interoceptive signals with top-down predictions to generate a current awareness state and providing descending predictions to visceral systems that provide a point of reference for autonomic reflexes. We argue that AIC is critical and necessary for emotional awareness. PMID:23749500

  13. Anterior insular cortex and emotional awareness.

    PubMed

    Gu, Xiaosi; Hof, Patrick R; Friston, Karl J; Fan, Jin

    2013-10-15

    This paper reviews the foundation for a role of the human anterior insular cortex (AIC) in emotional awareness, defined as the conscious experience of emotions. We first introduce the neuroanatomical features of AIC and existing findings on emotional awareness. Using empathy, the awareness and understanding of other people's emotional states, as a test case, we then present evidence to demonstrate: 1) AIC and anterior cingulate cortex (ACC) are commonly coactivated as revealed by a meta-analysis, 2) AIC is functionally dissociable from ACC, 3) AIC integrates stimulus-driven and top-down information, and 4) AIC is necessary for emotional awareness. We propose a model in which AIC serves two major functions: integrating bottom-up interoceptive signals with top-down predictions to generate a current awareness state and providing descending predictions to visceral systems that provide a point of reference for autonomic reflexes. We argue that AIC is critical and necessary for emotional awareness. PMID:23749500

  14. Difference image analysis: automatic kernel design using information criteria

    NASA Astrophysics Data System (ADS)

    Bramich, D. M.; Horne, Keith; Alsubai, K. A.; Bachelet, E.; Mislis, D.; Parley, N.

    2016-03-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components; namely, a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularization. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unregularized delta basis functions, combined with either the Akaike or Takeuchi information criterion, is the best kernel solution method in terms of photometric accuracy. Our results are validated by tests performed on two independent sets of real data. Finally, we provide some important recommendations for software implementations of difference image analysis.

  15. Information Economics: Valuing Information.

    ERIC Educational Resources Information Center

    Brinberg, Herbert R.

    1989-01-01

    Addresses the question of why previous articles and studies on the value of information have failed to provide meaningful techniques for measuring that value. The discussion covers four principle causes for confusion surrounding the valuation of information and draws conclusions about the value added model of information. (seven references) (CLB)

  16. Theorizing Information for Information Science.

    ERIC Educational Resources Information Center

    Cornelius, Ian

    2002-01-01

    Considers whether information science has a theory of information. Highlights include guides to information and its theory; constructivism; information outside information science; process theories; cognitive views of information; measuring information; meaning; and misinformation. (Contains 89 references.) (LRW)

  17. Average Information Content Maximization—A New Approach for Fingerprint Hybridization and Reduction

    PubMed Central

    Śmieja, Marek; Warszycki, Dawid

    2016-01-01

    Fingerprints, bit representations of compound chemical structure, have been widely used in cheminformatics for many years. Although fingerprints with the highest resolution display satisfactory performance in virtual screening campaigns, the presence of a relatively high number of irrelevant bits introduces noise into data and makes their application more time-consuming. In this study, we present a new method of hybrid reduced fingerprint construction, the Average Information Content Maximization algorithm (AIC-Max algorithm), which selects the most informative bits from a collection of fingerprints. This methodology, applied to the ligands of five cognate serotonin receptors (5-HT2A, 5-HT2B, 5-HT2C, 5-HT5A, 5-HT6), proved that 100 bits selected from four non-hashed fingerprints reflect almost all structural information required for a successful in silico discrimination test. A classification experiment indicated that a reduced representation is able to achieve even slightly better performance than the state-of-the-art 10-times-longer fingerprints and in a significantly shorter time. PMID:26784447

  18. Some novel growth functions and their application with reference to growth in ostrich.

    PubMed

    Faridi, A; López, S; Ammar, H; Salwa, K S; Golian, A; Thornley, J H M; France, J

    2015-06-01

    Four novel growth functions, namely, Pareto, extreme value distribution (EVD), Lomolino, and cumulative β-P distribution (CBP), are derived, and their ability to describe ostrich growth curves is evaluated. The functions were compared with standard growth equations, namely, the monomolecular, Michaelis-Menten (MM), Gompertz, Richards, and generalized MM (gMM). For this purpose, 2 separate comparisons were conducted. In the first, all the functions were fitted to 40 individual growth curves (5 males and 35 females) of ostriches using nonlinear regression. In the second, performance of the functions was assessed when data from 71 individuals were composited (570 data points). This comparison was undertaken using nonlinear mixed models and considering 3 approaches: 1) models with no random effect, 2) random effect incorporated as the intercept, and 3) random effect incorporated into the asymptotic weight parameter (Wf). The results from the first comparison showed that the functions generally gave acceptable values of R² and residual variance. On the basis of the Akaike information criterion (AIC), CBP gave the best fit, whereas the Gompertz and Lomolino equations were the preferred functions on the basis of corrected AIC (AICc). Bias, accuracy factor, the Durbin-Watson statistic, and the number of runs of sign were used to analyze the residuals. CBP gave the best distribution of residuals but also produced more residual autocorrelation (significant Durbin-Watson statistic). The functions were applied to sample data for a more conventional farm species (2 breeds of cattle) to verify the results of the comparison of fit among functions and their applicability across species. In the second comparison, analysis of mixed models showed that incorporation of a random effect into Wf gave the best fit, resulting in smaller AIC and AICc values compared with those in the other 2 approaches. On the basis of AICc, best fit was achieved with CBP, followed by gMM, Lomolino, and

  19. Effects of error covariance structure on estimation of model averaging weights and predictive performance

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Meyer, Philip D.; Curtis, Gary P.; Shi, Xiaoqing; Niu, Xu-Feng; Yabusaki, Steve B.

    2013-01-01

    When conducting model averaging for assessing groundwater conceptual model uncertainty, the averaging weights are often evaluated using model selection criteria such as AIC, AICc, BIC, and KIC (Akaike Information Criterion, Corrected Akaike Information Criterion, Bayesian Information Criterion, and Kashyap Information Criterion, respectively). However, this method often leads to an unrealistic situation in which the best model receives overwhelmingly large averaging weight (close to 100%), which cannot be justified by available data and knowledge. It was found in this study that this problem was caused by using the covariance matrix, CE, of measurement errors for estimating the negative log likelihood function common to all the model selection criteria. This problem can be resolved by using the covariance matrix, CEk, of total errors (including model errors and measurement errors) to account for the correlation between the total errors. An iterative two-stage method was developed in the context of maximum likelihood inverse modeling to iteratively infer the unknown CEk from the residuals during model calibration. The inferred CEk was then used in the evaluation of model selection criteria and model averaging weights. While this method was limited to serial data using time series techniques in this study, it can be extended to spatial data using geostatistical techniques. The method was first evaluated in a synthetic study and then applied to an experimental study, in which alternative surface complexation models were developed to simulate column experiments of uranium reactive transport. It was found that the total errors of the alternative models were temporally correlated due to the model errors. The iterative two-stage method using CEk resolved the problem that the best model receives 100% model averaging weight, and the resulting model averaging weights were supported by the calibration results and physical understanding of the alternative models. Using CEk
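
    The central quantity in this approach is the Gaussian negative log-likelihood evaluated with a full covariance matrix of correlated total errors rather than a diagonal measurement-error covariance. A generic sketch of that evaluation using a Cholesky factorization is shown below; the AR(1)-style covariance and residuals are placeholders, not the study's calibration residuals.

        import numpy as np

        def neg_log_likelihood(residuals, cov):
            """-2 ln L for Gaussian residuals with full covariance matrix `cov`."""
            r = np.asarray(residuals, dtype=float)
            n = r.size
            L = np.linalg.cholesky(cov)                 # cov = L L^T
            z = np.linalg.solve(L, r)                   # whitened residuals
            log_det = 2.0 * np.sum(np.log(np.diag(L)))
            return n * np.log(2 * np.pi) + log_det + z @ z

        # Placeholder: AR(1)-correlated total errors versus an uncorrelated assumption
        n = 50
        rho, sigma = 0.6, 0.3
        idx = np.arange(n)
        cov_corr = sigma**2 * rho ** np.abs(idx[:, None] - idx[None, :])
        rng = np.random.default_rng(5)
        res = rng.multivariate_normal(np.zeros(n), cov_corr)

        print("-2lnL, correlated   :", round(neg_log_likelihood(res, cov_corr), 1))
        print("-2lnL, diagonal only:", round(neg_log_likelihood(res, np.eye(n) * sigma**2), 1))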

  20. Identification of sorption processes and parameters for radionuclide transport in fractured rock

    NASA Astrophysics Data System (ADS)

    Dai, Zhenxue; Wolfsberg, Andrew; Reimus, Paul; Deng, Hailin; Kwicklis, Edward; Ding, Mei; Ware, Doug; Ye, Ming

    2012-01-01

    Identification of chemical reaction processes in subsurface environments is a key issue for reactive transport modeling because simulating different processes requires developing different chemical-mathematical models. In this paper, two sorption processes (equilibrium and kinetics) are considered for modeling neptunium and uranium sorption in fractured rock. Based on different conceptualizations of the two processes occurring in fracture and/or matrix media, seven dual-porosity, multi-component reactive transport models are developed. The process models are identified with a stepwise strategy by using multi-tracer concentration data obtained from a series of transport experiments. In the first step, breakthrough data of a conservative tracer (tritium) obtained from four experiments are used to estimate the flow and non-reactive transport parameters (i.e., mean fluid residence time in fracture, fracture aperture, and matrix tortuosity) common to all the reactive transport models. In the second and third steps, by fixing the common non-reactive flow and transport parameters, the sorption parameters (retardation factor, sorption coefficient, and kinetic rate constant) of each model are estimated using the breakthrough data of the reactive tracers, neptunium and uranium, respectively. Based on the inverse modeling results, the seven sorption-process models are discriminated using four model discrimination (or selection) criteria: the Akaike information criterion (AIC), modified Akaike information criterion (AICc), Bayesian information criterion (BIC) and Kashyap information criterion (KIC). These criteria suggest the kinetic sorption process for modeling reactive transport of neptunium and uranium in both fracture and matrix. This conclusion is confirmed by two chemical criteria, the half-reaction time and the Damköhler number criterion.

  1. Incidence and description of autoimmune cytopenias during treatment with ibrutinib for chronic lymphocytic leukemia.

    PubMed

    Rogers, K A; Ruppert, A S; Bingman, A; Andritsos, L A; Awan, F T; Blum, K A; Flynn, J M; Jaglowski, S M; Lozanski, G; Maddocks, K J; Byrd, J C; Woyach, J A; Jones, J A

    2016-02-01

    Chronic lymphocytic leukemia (CLL) is frequently complicated by secondary autoimmune cytopenias (AICs). Ibrutinib is an irreversible inhibitor of Bruton's tyrosine kinase approved for the treatment of relapsed CLL and CLL with del(17p). The effect of ibrutinib treatment on the incidence of AIC is currently unknown. We reviewed medical records of 301 patients treated with ibrutinib, as participants in therapeutic clinical trials at The Ohio State University Comprehensive Cancer Center between July 2010 and July 2014. Subjects were reviewed with respect to past history of AIC, and treatment-emergent AIC cases were identified. Before starting ibrutinib treatment, 26% of patients had experienced AIC. Information was available for a total of 468 patient-years of ibrutinib exposure, during which there were six cases of treatment-emergent AIC. This corresponds to an estimated incidence rate of 13 episodes for every 1000 patient-years of ibrutinib treatment. We further identified 22 patients receiving therapy for AIC at the time ibrutinib was started. Of these 22 patients, 19 were able to discontinue AIC therapy. We found that ibrutinib treatment is associated with a low rate of treatment-emergent AIC. Patients with an existing AIC have been successfully treated with ibrutinib and subsequently discontinued AIC therapy. PMID:26442611

  3. Information-based ranking of 10 compartment models of diffusion-weighted signal attenuation in fixed prostate tissue.

    PubMed

    Liang, Sisi; Panagiotaki, Eleftheria; Bongers, Andre; Shi, Peng; Sved, Paul; Watson, Geoffrey; Bourne, Roger

    2016-05-01

    This study compares the theoretical information content of single- and multi-compartment models of diffusion-weighted signal attenuation in prostate tissue. Diffusion-weighted imaging (DWI) was performed at 9.4 T with multiple diffusion times and an extended range of b values in four whole formalin-fixed prostates. Ten models, including different combinations of isotropic, anisotropic and restricted components, were tested. Models were ranked using the Akaike information criterion. In all four prostates, two-component models, comprising an anisotropic Gaussian component and an isotropic restricted component, ranked highest in the majority of voxels. Single-component models, whether isotropic (apparent diffusion coefficient, ADC) or anisotropic (diffusion tensor imaging, DTI), consistently ranked lower than multi-component models. Model ranking trends were independent of voxel size and maximum b value in the range tested (1.6-16 mm³ and 3000-10,000 s/mm²). This study characterizes the two major water components previously identified by biexponential models and shows that models incorporating both anisotropic and restricted components provide more information-rich descriptions of DWI signals in prostate tissue than single- or multi-component anisotropic models and models that do not account for restricted diffusion. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26999065

  4. VLP Source Inversion and Evaluation of Error Analysis Techniques at Fuego Volcano, Guatemala

    NASA Astrophysics Data System (ADS)

    Brill, K. A.; Waite, G. P.

    2015-12-01

    In January of 2012, our team occupied 10 sites around Fuego volcano with broadband seismometers, two of which were collocated with infrasound microphone arrays and tiltmeters (see Figure 1 for full deployment details). Our radial coverage around Fuego during the 2012 campaign satisfies conditions outlined by Dawson et al. [2011] for good network coverage. Very-long-period (VLP) events that accompany small-scale explosions were classified by waveform and eruption style. We located these VLP event families, which have been persistent at Fuego since at least 2008, through inversion in the same manner employed by Lyons and Waite [2011], with improved radial coverage in our network. We compare results for source inversions performed with independent tilt data against inversions incorporating tilt data extracted from the broadband records. The current best-practice method for choosing an optimum solution for inversion results is based on each solution's residual error, the relevance of free parameters used in the model, and the physical significance of the source mechanism. Error analysis was performed through bootstrapping in order to explore the source location uncertainty and the significance of components of the moment tensor. The significance of the number of free parameters has mostly been evaluated by calculating Akaike's Information Criterion (AIC), but little has been done to evaluate the sensitivity of AIC or other criteria (i.e., the Bayesian Information Criterion) to the number of model parameters. We compare solutions as chosen by these alternate methods with more standard techniques for our real data set, as well as through the use of synthetic data, and make recommendations as to best practices. Figure 1: a) Map of 2012 station network: stations highlighted in red were collocated with infrasound arrays. b) Location of Fuego within Guatemala and view of the complex from the west with different eruptive centers labeled. c) Operational times for each of the stations and cameras.

  5. Using Post-Traumatic Amnesia To Predict Outcome after Traumatic Brain Injury.

    PubMed

    Ponsford, Jennie L; Spitz, Gershon; McKenzie, Dean

    2016-06-01

    Duration of post-traumatic amnesia (PTA) has emerged as a strong measure of injury severity after traumatic brain injury (TBI). Despite the growing international adoption of this measure, there remains a lack of consistency in the way in which PTA duration is used to classify severity of injury. This study aimed to establish the classification of PTA that would best predict functional or productivity outcomes. We conducted a cohort study of 1041 persons recruited from inpatient admissions to a TBI rehabilitation center between 1985 and 2013. Participants had a primary diagnosis of TBI, emerged from PTA before discharge from inpatient hospital, and engaged in productive activities before injury. Eight models that classify duration of PTA were evaluated: six based on the literature and two that were statistically driven. Models were assessed using area under the receiver operating characteristic curve (AUC) as well as model-based Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) statistics. All categorization models showed longer PTA to be associated with a greater likelihood of being nonproductive at 1 year after TBI. Classification systems with a greater number of categories performed better than two-category systems. The dimensional (continuous) form of PTA resulted in the greatest AUC, and the lowest AIC as well as BIC, of the classification systems examined. This finding indicates that the greatest accuracy in prognosis is likely to be achieved using PTA as a continuous variable. This enables the probability of productive outcomes to be estimated with far greater precision than that possible using a classification system. Categorizing PTA to classify severity of injury may be reducing the precision with which clinicians can plan the treatment of patients after TBI. PMID:26234939
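
    The comparison of categorized against continuous PTA can be illustrated with a small, hedged sketch using statsmodels and scikit-learn; the simulated data, cut points and variable names below are hypothetical and are not the study's cohort or categories.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      from sklearn.metrics import roc_auc_score

      # Hypothetical data: PTA duration (days) and productivity at one year (1 = nonproductive).
      rng = np.random.default_rng(0)
      pta = rng.gamma(shape=2.0, scale=10.0, size=500)
      nonproductive = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.05 * pta - 1.5))))

      def fit_and_score(X, y):
          # Logistic regression scored by AIC, BIC and area under the ROC curve.
          model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
          auc = roc_auc_score(y, model.predict(sm.add_constant(X)))
          return model.aic, model.bic, auc

      # Continuous PTA versus an illustrative three-category split.
      cats = pd.get_dummies(pd.cut(pta, [0, 7, 28, np.inf]), drop_first=True, dtype=float)
      print("continuous :", fit_and_score(pta, nonproductive))
      print("categorized:", fit_and_score(cats.to_numpy(), nonproductive))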

  6. Distinguishing between invasions and habitat changes as drivers of diversity loss among California's freshwater fishes.

    PubMed

    Light, Theo; Marchetti, Michael P

    2007-04-01

    Many of California's native populations of freshwater fish are in serious decline, as are freshwater faunas worldwide. Habitat loss and alteration, hydrologic modification, water pollution, and invasions have been identified as major drivers of these losses. Because these potential causes of decline are frequently correlated, it is difficult to separate direct from indirect effects of each factor and to appropriately rank their importance for conservation action. Recently a few authors have questioned the conservation significance of invasions, suggesting that they are "passengers" rather than "drivers" of ecological change. We compiled an extensive, watershed-level data set of fish presence and conservation status, land uses, and hydrologic modifications in California and used an information theoretic approach (Akaike's information criterion, AIC) and path analysis to evaluate competing models of native fish declines. Hydrologic modification (impoundments and diversions), invasions, and proportion of developed land were all predictive of the number of extinct and at-risk native fishes in California watersheds in the AIC analysis. Although nonindigenous fish richness was the best single predictor (after native richness) of fishes of conservation concern, the combined ranking of models containing hydrologic modification variables was slightly higher than that of models containing nonindigenous richness. Nevertheless, the path analysis indicated that the effects of both hydrologic modification and development on fishes of conservation concern were largely indirect, through their positive effects on nonindigenous fish richness. The best-fitting path model was the driver model, which included no direct effects of abiotic disturbance on native fish declines. Our results suggest that, for California freshwater fishes, invasions are the primary direct driver of extinctions and population declines, whereas the most damaging effect of habitat alteration is the tendency of

  7. Estimating Dbh of Trees Employing Multiple Linear Regression of the best Lidar-Derived Parameter Combination Automated in Python in a Natural Broadleaf Forest in the Philippines

    NASA Astrophysics Data System (ADS)

    Ibanez, C. A. G.; Carcellar, B. G., III; Paringit, E. C.; Argamosa, R. J. L.; Faelga, R. A. G.; Posilero, M. A. V.; Zaragosa, G. P.; Dimayacyac, N. A.

    2016-06-01

    Diameter-at-breast-height (DBH) estimation is a prerequisite in various allometric equations estimating important forestry indices like stem volume, basal area, biomass and carbon stock. LiDAR technology provides a means of directly obtaining different forest parameters, except DBH, from the behavior and characteristics of the point cloud, which are unique to different forest classes. An extensive tree inventory was done on an established two-hectare sample plot in a natural-growth forest on Mt. Makiling, Laguna. Coordinates, height, and canopy cover were measured, and species were identified, for comparison with LiDAR derivatives. Multiple linear regression was used to derive LiDAR-derived DBH by integrating field-derived DBH and 27 LiDAR-derived parameters at 20 m, 10 m, and 5 m grid resolutions. To find the best combination of parameters for DBH estimation, all possible combinations of parameters were generated and evaluated automatically using Python scripts, with additional regression-related libraries such as NumPy, SciPy, and scikit-learn. The combination that yields the highest r-squared (coefficient of determination) and the lowest AIC (Akaike's Information Criterion) and BIC (Bayesian Information Criterion) was determined to be the best equation. The best equation used 11 parameters at the 10 m grid size, with an r-squared of 0.604, an AIC of 154.04 and a BIC of 175.08. The best combination of parameters may differ among forest classes and warrants further study. Additional statistical tests, such as the Kaiser-Meyer-Olkin (KMO) coefficient and Bartlett's Test of Sphericity (BTS), can be used to help determine the correlation among parameters.
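
    The abstract's exhaustive search can be sketched in a few lines of Python; the snippet below scores every predictor combination of an ordinary least-squares fit by AIC and BIC under a Gaussian error assumption. It is a hedged sketch of the general idea, not the authors' scripts, and all names are illustrative.

      import itertools
      import numpy as np

      def ols_aic_bic(X, y):
          # Ordinary least squares; returns (AIC, BIC, R^2) assuming Gaussian errors.
          X1 = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
          resid = y - X1 @ beta
          n, k = len(y), X1.shape[1] + 1                    # +1 for the error variance
          nll2 = n * np.log(2.0 * np.pi * np.mean(resid ** 2)) + n
          r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
          return nll2 + 2 * k, nll2 + k * np.log(n), r2

      def best_combination(X, y, names, max_size=None):
          # Exhaustively search predictor combinations; rank by AIC, then BIC.
          best = None
          for size in range(1, (max_size or X.shape[1]) + 1):
              for combo in itertools.combinations(range(X.shape[1]), size):
                  aic, bic, r2 = ols_aic_bic(X[:, combo], y)
                  if best is None or (aic, bic) < (best[0], best[1]):
                      best = (aic, bic, r2, [names[i] for i in combo])
          return best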

  8. Validation of the Chinese Version of the Quality of Nursing Work Life Scale

    PubMed Central

    Fu, Xia; Xu, Jiajia; Song, Li; Li, Hua; Wang, Jing; Wu, Xiaohua; Hu, Yani; Wei, Lijun; Gao, Lingling; Wang, Qiyi; Lin, Zhanyi; Huang, Huigen

    2015-01-01

    Quality of Nursing Work Life (QNWL) serves as a predictor of a nurse’s intent to leave and hospital nurse turnover. However, QNWL measurement tools that have been validated for use in China are lacking. The present study evaluated the construct validity of the QNWL scale in China. A cross-sectional study using convenience sampling was conducted from June 2012 to January 2013 at five hospitals in Guangzhou, which employ 1938 nurses. The participants were asked to complete the QNWL scale and the World Health Organization Quality of Life abbreviated version (WHOQOL-BREF). A total of 1922 nurses provided the final data used for analyses. Sixty-five nurses from the first investigated division were re-measured two weeks later to assess the test-retest reliability of the scale. The internal consistency reliability of the QNWL scale was assessed using Cronbach’s α. Test-retest reliability was assessed using the intra-class correlation coefficient (ICC). Criterion-related validity was assessed using the correlation of the total scores of the QNWL and the WHOQOL-BREF. Construct validity was assessed with the following indices: χ2 statistics and degrees of freedom; root mean square error of approximation (RMSEA); the Akaike information criterion (AIC); the consistent Akaike information criterion (CAIC); the goodness-of-fit index (GFI); the adjusted goodness-of-fit index; and the comparative fit index (CFI). The findings demonstrated high internal consistency (Cronbach’s α = 0.912) and test-retest reliability (intra-class correlation coefficient = 0.74) for the QNWL scale. The chi-square test (χ2 = 13879.60, df [degrees of freedom] = 813, P = 0.0001) was significant. The RMSEA value was 0.091, and AIC = 1806.00, CAIC = 7730.69, CFI = 0.93, and GFI = 0.74. The correlation coefficient between the QNWL total scores and the WHOQOL-BREF total scores was 0.605 (p<0.01). The QNWL scale was reliable and valid in Chinese-speaking nurses and could be used as a clinical and research

  9. Information management

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell; Corker, Kevin

    1990-01-01

    Primary Flight Display (PFD) information management and cockpit display of information management research is presented in viewgraph form. The information management problem in the cockpit, information management burdens, the key characteristics of an information manager, the interface management system handling the flow of information and the dialogs between the system and the pilot, and overall system architecture are covered.

  10. A universal approximate cross-validation criterion for regular risk functions.

    PubMed

    Commenges, Daniel; Proust-Lima, Cécile; Samieri, Cécilia; Liquet, Benoit

    2015-05-01

    Selection of estimators is an essential task in modeling. A general framework is that the estimators of a distribution are obtained by minimizing a function (the estimating function) and assessed using another function (the assessment function). A classical case is that both functions estimate an information risk (specifically cross-entropy); this corresponds to using maximum likelihood estimators and assessing them by the Akaike information criterion (AIC). In more general cases, the assessment risk can be estimated by leave-one-out cross-validation. Since leave-one-out cross-validation is computationally very demanding, we propose in this paper a universal approximate cross-validation criterion under regularity conditions (UACVR). This criterion can be adapted to different types of estimators, including penalized likelihood and maximum a posteriori estimators, and also to different assessment risk functions, including information risk functions and the continuous rank probability score (CRPS). UACVR reduces to the Takeuchi information criterion (TIC) when cross-entropy is the risk for both estimation and assessment. We provide the asymptotic distributions of UACVR and of a difference of UACVR values for two estimators. We validate UACVR using simulations and provide an illustration on real data in a psychometric context, where estimators of the distributions of ordered categorical data derived from threshold models are compared with estimators from models based on continuous approximations. PMID:25849800
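
    The link between AIC and leave-one-out assessment that motivates UACVR can be illustrated with a toy check: for nested Gaussian linear models, AIC and a plug-in leave-one-out deviance should rank candidates similarly, even though the numbers do not coincide. The sketch below uses simulated data and statsmodels and is purely illustrative.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      x = rng.normal(size=120)
      y = 1.0 + 0.5 * x + rng.normal(scale=0.8, size=120)
      X = sm.add_constant(np.column_stack([x, x ** 2, x ** 3]))

      def loo_deviance(X, y):
          # Leave-one-out -2 * log predictive density with plug-in Gaussian parameters.
          dev = 0.0
          for i in range(len(y)):
              mask = np.arange(len(y)) != i
              fit = sm.OLS(y[mask], X[mask]).fit()
              mu, s2 = fit.predict(X[i:i + 1])[0], fit.mse_resid
              dev += np.log(2.0 * np.pi * s2) + (y[i] - mu) ** 2 / s2
          return dev

      for k in (2, 3, 4):                  # intercept plus 1 to 3 polynomial terms
          print(k, sm.OLS(y, X[:, :k]).fit().aic, loo_deviance(X[:, :k], y))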

  11. Evaluating a coupled discrete wavelet transform and support vector regression for daily and monthly streamflow forecasting

    NASA Astrophysics Data System (ADS)

    Liu, Zhiyong; Zhou, Ping; Chen, Gang; Guo, Ledong

    2014-11-01

    This study investigated the performance and potential of a hybrid model that combined the discrete wavelet transform and support vector regression (the DWT-SVR model) for daily and monthly streamflow forecasting. Three key factors of the wavelet decomposition phase (mother wavelet, decomposition level, and edge effect) were proposed for consideration in improving the accuracy of the DWT-SVR model. The performance of DWT-SVR models with different combinations of these three factors was compared with the regular SVR model. The effectiveness of these models was evaluated using the root-mean-squared error (RMSE) and Nash-Sutcliffe model efficiency coefficient (NSE). Daily and monthly streamflow data observed at two stations in Indiana, United States, were used to test the forecasting skill of these models. The results demonstrated that the different hybrid models did not always outperform the SVR model for 1-day and 1-month lead time streamflow forecasting. This suggests that it is crucial to consider and compare the three key factors when using the DWT-SVR model (or other machine learning methods coupled with the wavelet transform), rather than choosing them based on personal preferences. We then combined forecasts from multiple candidate DWT-SVR models using a model averaging technique based upon Akaike's information criterion (AIC). This ensemble prediction was superior to the single best DWT-SVR model and regular SVR model for both 1-day and 1-month ahead predictions. With respect to longer lead times (i.e., 2- and 3-day and 2-month), the ensemble predictions using the AIC averaging technique were consistently better than the best DWT-SVR model and SVR model. Therefore, integrating model averaging techniques with the hybrid DWT-SVR model would be a promising approach for daily and monthly streamflow forecasting. Additionally, we strongly recommend considering these three key factors when using wavelet-based SVR models (or other wavelet-based forecasting models).
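
    The AIC-based averaging of candidate forecasts mentioned above can be written compactly; the sketch below converts AIC values from a calibration period into Akaike weights and applies them to the candidate forecasts. The numbers and names are hypothetical, not results from the study.

      import numpy as np

      def akaike_weights(aic_values):
          delta = np.asarray(aic_values, dtype=float) - np.min(aic_values)
          w = np.exp(-0.5 * delta)
          return w / w.sum()

      def ensemble_forecast(forecasts, aic_values):
          # forecasts  : array of shape (n_models, n_times)
          # aic_values : AIC of each candidate model on the calibration period
          return akaike_weights(aic_values) @ np.asarray(forecasts)

      # Hypothetical example with three candidate DWT-SVR configurations.
      f = [[10.2, 11.0, 9.7], [10.8, 11.4, 9.9], [9.9, 10.6, 9.5]]
      print(ensemble_forecast(f, [101.2, 103.5, 107.9]))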

  12. Water availability determines the richness and density of fig trees within Brazilian semideciduous forest landscapes

    NASA Astrophysics Data System (ADS)

    Coelho, Luís Francisco Mello; Ribeiro, Milton Cezar; Pereira, Rodrigo Augusto Santinelo

    2014-05-01

    The success of fig trees in tropical ecosystems is evidenced by the great diversity (more than 750 species) and wide geographic distribution of the genus. We assessed the contribution of environmental variables to the species richness and density of fig trees in fragments of seasonal semideciduous forest (SSF) in Brazil. We assessed 20 forest fragments in three regions in Sao Paulo State, Brazil. Fig tree richness and density were estimated in rectangular plots, comprising 31.4 ha sampled. Both richness and fig tree density were linearly modeled as functions of variables representing (1) fragment metrics, (2) forest structure, and (3) landscape metrics expressing water drainage in the fragments. Model selection was performed by comparing the AIC values (Akaike Information Criterion) and the relative weight of each model (wAIC). Both species richness and fig tree density were better explained by the water availability in the fragment (meters of stream per hectare): wAICrichness = 0.45, wAICdensity = 0.96. The remaining variables, related to anthropic perturbation and forest structure, carried little weight in the models. The rainfall seasonality in SSF seems to select for both establishment strategies and morphological adaptations in the hemiepiphytic fig tree species. In the studied SSF, hemiepiphytes established at lower heights in their host trees than reported for fig trees in evergreen rainforests. Some hemiepiphytic fig species evolved superficial roots extending up to 100 m from their trunks, resulting in hectare-scale root zones that allow them to efficiently forage for water and soil nutrients. The community of fig trees was robust to variation in forest structure and conservation level of SSF fragments, making this group of plants an important element for the functioning of seasonal tropical forests.

  13. Comparison of non-linear models to describe the lactation curves for milk yield and composition in buffaloes (Bubalus bubalis).

    PubMed

    Ghavi Hossein-Zadeh, N

    2016-02-01

    In order to describe the lactation curves of milk yield (MY) and composition in buffaloes, seven non-linear mathematical equations (Wood, Dhanoa, Sikka, Nelder, Brody, Dijkstra and Rook) were used. Data were 116,117 test-day records for MY, fat (FP) and protein (PP) percentages of milk from the first three lactations of buffaloes, which were collected from 893 herds in the period from 1992 to 2012 by the Animal Breeding Center of Iran. Each model was fitted to monthly production records of dairy buffaloes using the NLIN and MODEL procedures in SAS and the parameters were estimated. The models were tested for goodness of fit using the adjusted coefficient of determination (R²adj), root mean square error (RMSE), the Durbin-Watson statistic and Akaike's information criterion (AIC). The Dijkstra model provided the best fit of MY and PP of milk for the first three parities of buffaloes due to its lower values of RMSE and AIC than the other models. For the first-parity buffaloes, the Sikka and Brody models provided the best fit of FP, but for the second- and third-parity buffaloes, the Sikka model and the Brody equation, respectively, provided the best fit of the lactation curve for FP. The results of this study showed that the Wood and Dhanoa equations were able to estimate the time to peak MY more accurately than the other equations. In addition, the Nelder and Dijkstra equations were able to estimate the peak time at the second and third parities, respectively, more accurately than the other equations. The Brody function provided more accurate predictions of peak MY over the first three parities of buffaloes. There was generally a positive relationship between 305-day MY and persistency measures, and also between peak yield and 305-day MY, calculated by the different models within each lactation in the current study. Overall, evaluation of the different equations used in the current study indicated the potential of non-linear models for fitting the monthly productive records of buffaloes. PMID:26354679
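
    As an example of fitting one of the equations named above, the hedged sketch below fits the Wood curve, y = a * t**b * exp(-c * t), to hypothetical monthly records with SciPy and reports RMSE, a Gaussian AIC and the implied time to peak yield (b / c). The data and starting values are invented for illustration.

      import numpy as np
      from scipy.optimize import curve_fit

      def wood(t, a, b, c):
          # Wood lactation curve: yield = a * t**b * exp(-c * t)
          return a * np.power(t, b) * np.exp(-c * t)

      # Hypothetical monthly test-day records (month in lactation, kg/day).
      t = np.arange(1, 11, dtype=float)
      y = np.array([7.9, 9.4, 9.8, 9.5, 9.0, 8.4, 7.8, 7.1, 6.5, 5.9])

      params, _ = curve_fit(wood, t, y, p0=[6.0, 0.2, 0.05])
      resid = y - wood(t, *params)
      n, k = len(y), len(params) + 1                    # +1 for the error variance
      rmse = np.sqrt(np.mean(resid ** 2))
      aic = n * np.log(2.0 * np.pi * np.mean(resid ** 2)) + n + 2 * k
      peak_time = params[1] / params[2]                 # dy/dt = 0 at t = b / c
      print(params, rmse, aic, peak_time)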

  14. Impact of Schedule Duration on Head and Neck Radiotherapy: Accelerated Tumor Repopulation Versus Compensatory Mucosal Proliferation

    SciTech Connect

    Fenwick, John D.; Pardo-Montero, Juan; Nahum, Alan E.; Malik, Zafar I.

    2012-02-01

    Purpose: To determine how modelled maximum tumor control rates, achievable without exceeding mucositis tolerance (tcp_max-early), vary with schedule duration for head and neck squamous cell carcinoma (HNSCC). Methods and materials: Using maximum-likelihood techniques, we have fitted a range of tcp models to two HNSCC datasets (Withers' and British Institute of Radiology [BIR]), characterizing the dependence of tcp on duration and equivalent dose in 2 Gy fractions (EQD2). Models likely to best describe future data have been selected using the Akaike information criterion (AIC) and its quasi-AIC extension to overdispersed data. Setting EQD2s in the selected tcp models to levels just tolerable for mucositis, we have plotted tcp_max-early against schedule duration. Results: While BIR dataset tcp fits describe dose levels isoeffective for tumor control as rising significantly with schedule protraction, indicative of accelerated tumor repopulation, repopulation terms in fits to Withers' dataset do not reach significance after accounting for overdispersion of the data. The tcp_max-early curves calculated from tcp fits to the overall Withers' and BIR datasets rise by 8% and 0-4%, respectively, between 20 and 50 days duration; likewise, tcp_max-early curves calculated for stage-specific cohorts also generally rise slowly with increasing duration. However, none of the increases in tcp_max-early calculated from the overall or stage-specific fits reach significance. Conclusions: Local control rates modeled for treatments which lie just within mucosal tolerance rise slowly but insignificantly with increasing schedule length. This finding suggests that whereas useful gains may be made by accelerating unnecessarily slow schedules until they approach early reaction tolerance, little is achieved by shortening schedules further while reducing doses to remain within mucosal tolerance, an approach that may slightly worsen outcomes.

  15. Does transport time help explain the high trauma mortality rates in rural areas? New and traditional predictors assessed by new and traditional statistical methods

    PubMed Central

    Røislien, Jo; Lossius, Hans Morten; Kristiansen, Thomas

    2015-01-01

    Background Trauma is a leading global cause of death. Trauma mortality rates are higher in rural areas, constituting a challenge for quality and equality in trauma care. The aim of the study was to explore population density and transport time to hospital care as possible predictors of geographical differences in mortality rates, and to what extent choice of statistical method might affect the analytical results and accompanying clinical conclusions. Methods Using data from the Norwegian Cause of Death registry, deaths from external causes 1998–2007 were analysed. Norway consists of 434 municipalities, and municipality population density and travel time to hospital care were entered as predictors of municipality mortality rates in univariate and multiple regression models of increasing model complexity. We fitted linear regression models with continuous and categorised predictors, as well as piecewise linear and generalised additive models (GAMs). Models were compared using Akaike's information criterion (AIC). Results Population density was an independent predictor of trauma mortality rates, while the contribution of transport time to hospital care was highly dependent on choice of statistical model. A multiple GAM or piecewise linear model was superior, and similar, in terms of AIC. However, while transport time was statistically significant in multiple models with piecewise linear or categorised predictors, it was not in GAM or standard linear regression. Conclusions Population density is an independent predictor of trauma mortality rates. The added explanatory value of transport time to hospital care is marginal and model-dependent, highlighting the importance of exploring several statistical models when studying complex associations in observational data. PMID:25972600

  16. Effects of human recreation on the incubation behavior of American Oystercatchers

    USGS Publications Warehouse

    McGowan, C.P.; Simons, T.R.

    2006-01-01

    Human recreational disturbance and its effects on wildlife demographics and behavior is an increasingly important area of research. We monitored the nesting success of American Oystercatchers (Haematopus palliatus) in coastal North Carolina in 2002 and 2003. We also used video monitoring at nests to measure the response of incubating birds to human recreation. We counted the number of trips per hour made by adult birds to and from the nest, and we calculated the percent time that adults spent incubating. We asked whether human recreational activities (truck, all-terrain vehicle [ATV], and pedestrian traffic) were correlated with parental behavioral patterns. Eleven a priori models of nest survival and behavioral covariates were evaluated using Akaike's Information Criterion (AIC) to see whether incubation behavior influenced nest survival. Factors associated with birds leaving their nests (n = 548) included ATV traffic (25%), truck traffic (17%), pedestrian traffic (4%), aggression with neighboring oystercatchers or paired birds exchanging incubation duties (26%), airplane traffic (1%) and unknown factors (29%). ATV traffic was positively associated with the rate of trips to and away from the nest (β1 = 0.749, P < 0.001) and negatively correlated with percent time spent incubating (β1 = -0.037, P = 0.025). Other forms of human recreation apparently had little effect on incubation behaviors. Nest survival models incorporating the frequency of trips by adults to and from the nest, and the percentage of time adults spent incubating, were somewhat supported in the AIC analyses. A low frequency of trips to and from the nest and, counter to expectations, low percent time spent incubating were associated with higher daily nest survival rates. These data suggest that changes in incubation behavior might be one mechanism by which human recreation affects the reproductive success of American Oystercatchers.

  17. Canada lynx Lynx canadensis habitat and forest succession in northern Maine, USA

    USGS Publications Warehouse

    Hoving, C.L.; Harrison, D.J.; Krohn, W.B.; Jakubas, W.J.; McCollough, M.A.

    2004-01-01

    The contiguous United States population of Canada lynx Lynx canadensis was listed as threatened in 2000. The long-term viability of lynx populations at the southern edge of their geographic range has been hypothesized to be dependent on old growth forests; however, lynx are a specialist predator on snowshoe hare Lepus americanus, a species associated with early-successional forests. To quantify the effects of succession and forest management on landscape-scale (100 km2) patterns of habitat occupancy by lynx, we compared landscape attributes in northern Maine, USA, where lynx had been detected on snow track surveys to landscape attributes where surveys had been conducted, but lynx tracks had not been detected. Models were constructed a priori and compared using logistic regression and Akaike's Information Criterion (AIC), which quantitatively balances data fit and parsimony. In the models with the lowest (i.e. best) AIC, lynx were more likely to occur in landscapes with much regenerating forest, and less likely to occur in landscapes with much recent clearcut, partial harvest and forested wetland. Lynx were not associated positively or negatively with mature coniferous forest. A probabilistic map of the model indicated a patchy distribution of lynx habitat in northern Maine. According to an additional survey of the study area for lynx tracks during the winter of 2003, the model correctly classified 63.5% of the lynx occurrences and absences. Lynx were more closely associated with young forests than mature forests; however, old-growth forests were functionally absent from the landscape. Lynx habitat could be reduced in northern Maine, given recent trends in forest management practices. Harvest strategies have shifted from clearcutting to partial harvesting. If this trend continues, future landscapes will shift away from extensive regenerating forests and toward landscapes dominated by pole-sized and larger stands. Because Maine presently supports the only verified

  18. Monthly streamflow prediction in the Volta Basin of West Africa: A SISO NARMAX polynomial modelling

    NASA Astrophysics Data System (ADS)

    Amisigo, B. A.; van de Giesen, N.; Rogers, C.; Andah, W. E. I.; Friesen, J.

    Single-input-single-output (SISO) non-linear system identification techniques were employed to model monthly catchment runoff at selected gauging sites in the Volta Basin of West Africa. NARMAX (Non-linear Autoregressive Moving Average with eXogenous Input) polynomial models were fitted to basin monthly rainfall and gauging station runoff data for each of the selected sites and used to predict monthly runoff at the sites. An error reduction ratio (ERR) algorithm was used to order regressors for various combinations of input, output and noise lags (various model structures), and the significant regressors for each model were selected by applying an Akaike Information Criterion (AIC) to independent rainfall-runoff validation series. Model parameters were estimated with the Matlab REGRESS function (an orthogonal least squares method). In each case, the sub-model without noise terms was fitted first, followed by a fitting of the noise model. The coefficient of determination (R-squared), the Nash-Sutcliffe Efficiency criterion (NSE) and the F statistic for the estimation (training) series were used to evaluate the significance of fit of each model to this series, while model selection from the range of models fitted for each gauging site was done by examining the NSEs and the AICs of the validation series. Monthly runoff predictions from the selected models were very good, and the polynomial models appeared to have captured a good part of the rainfall-runoff non-linearity. The results indicate that the NARMAX modelling framework is suitable for monthly river runoff prediction in the Volta Basin. The several good models made available by the NARMAX modelling framework could be useful in the selection of model structures that also provide insights into the physical behaviour of the catchment rainfall-runoff system.

  19. Pharmacokinetic Modeling of Intranasal Scopolamine in Plasma Saliva and Urine

    NASA Technical Reports Server (NTRS)

    Wu, L.; Chow, D. S. L.; Tam, V.; Putcha, L.

    2014-01-01

    An intranasal gel formulation of scopolamine (INSCOP) was developed for the treatment of Space Motion Sickness. The bioavailability and pharmacokinetics (PK) were evaluated under the Food and Drug Administration guidelines for clinical trials for an Investigational New Drug (IND). The aim of this project was to develop a PK model that can predict the relationship between plasma, saliva and urinary scopolamine concentrations using data collected from the IND clinical trial with INSCOP. METHODS: Twelve healthy human subjects were administered three dose levels (0.1, 0.2 and 0.4 mg) of INSCOP. Serial blood, saliva and urine samples were collected between 5 min and 24 h after dosing, and scopolamine concentrations were measured using a validated LC-MS-MS assay. Pharmacokinetic compartmental models, using actual dosing and sampling times, were built using Phoenix (version 1.2). Model discrimination was performed by minimizing the Akaike Information Criterion (AIC), maximizing the coefficient of determination (r²) and by comparison of the quality-of-fit plots. RESULTS: The best structural model to describe scopolamine disposition after INSCOP administration (minimal AIC = 907.2) consisted of one compartment each for plasma, saliva and urine, inter-connected with different rate constants. The estimated values of PK parameters were compiled in Table 1. The model fitting exercises revealed non-linear PK for scopolamine between the plasma and saliva compartments for K21, Vmax and Km. CONCLUSION: A PK model for INSCOP was developed and for the first time it satisfactorily predicted the PK of scopolamine in plasma, saliva and urine after INSCOP administration. Using non-linear PK yielded the best structural model to describe scopolamine disposition between the plasma and saliva compartments, and inclusion of non-linear PK resulted in significantly improved model fitting. The model can be utilized to predict scopolamine plasma concentration using saliva and/or urine data that

  20. Predicting crappie recruitment in Ohio reservoirs with spawning stock size, larval density, and chlorophyll concentrations

    USGS Publications Warehouse

    Bunnell, David B.; Hale, R. Scott; Vanni, Michael J.; Stein, Roy A.

    2006-01-01

    Stock-recruit models typically use only spawning stock size as a predictor of recruitment to a fishery. In this paper, however, we used spawning stock size as well as larval density and key environmental variables to predict recruitment of white crappies Pomoxis annularis and black crappies P. nigromaculatus, a genus notorious for variable recruitment. We sampled adults and recruits from 11 Ohio reservoirs and larvae from 9 reservoirs during 1998-2001. We sampled chlorophyll as an index of reservoir productivity and obtained daily estimates of water elevation to determine the impact of hydrology on recruitment. Akaike's information criterion (AIC) revealed that Ricker and Beverton-Holt stock-recruit models that included chlorophyll best explained the variation in larval density and age-2 recruits. Specifically, spawning stock catch per effort (CPE) and chlorophyll explained 63-64% of the variation in larval density. In turn, larval density and chlorophyll explained 43-49% of the variation in age-2 recruit CPE. Finally, spawning stock CPE and chlorophyll were the best predictors of recruit CPE (i.e., 74-86%). Although larval density and recruitment increased with chlorophyll, neither was related to seasonal water elevation. Also, the AIC generally did not distinguish between Ricker and Beverton-Holt models. From these relationships, we concluded that crappie recruitment can be limited by spawning stock CPE and larval production when spawning stock sizes are low (i.e., CPE < 5 crappies/net-night). At higher levels of spawning stock sizes, spawning stock CPE and recruitment were less clearly related. To predict recruitment in Ohio reservoirs, managers should assess spawning stock CPE with trap nets and estimate chlorophyll concentrations. To increase crappie recruitment in reservoirs where recruitment is consistently poor, managers should use regulations to increase spawning stock size, which, in turn, should increase larval production and recruits to the fishery.
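
    The Ricker and Beverton-Holt comparison by AIC can be sketched as follows; the spawning-stock and recruit values are hypothetical, and the Gaussian-error AIC is one common choice rather than necessarily the authors' exact formulation.

      import numpy as np
      from scipy.optimize import curve_fit

      def ricker(s, a, b):
          # Ricker stock-recruit curve: R = a * S * exp(-b * S)
          return a * s * np.exp(-b * s)

      def beverton_holt(s, a, b):
          # Beverton-Holt stock-recruit curve: R = a * S / (1 + b * S)
          return a * s / (1.0 + b * s)

      def aic_gaussian(obs, pred, n_params):
          n = len(obs)
          sigma2 = np.mean((obs - pred) ** 2)
          return n * np.log(2.0 * np.pi * sigma2) + n + 2 * (n_params + 1)

      # Hypothetical spawning-stock CPE and age-2 recruit CPE.
      s = np.array([1.2, 2.5, 3.8, 5.1, 6.7, 8.4, 10.2, 12.5])
      r = np.array([2.1, 3.9, 4.8, 5.6, 5.9, 5.7, 5.2, 4.6])

      for name, f in [("Ricker", ricker), ("Beverton-Holt", beverton_holt)]:
          p, _ = curve_fit(f, s, r, p0=[1.0, 0.1])
          print(name, p, aic_gaussian(r, f(s, *p), n_params=2))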

  1. Assessing the wildlife habitat value of New England salt marshes: II. Model testing and validation.

    PubMed

    McKinney, Richard A; Charpentier, Michael A; Wigand, Cathleen

    2009-07-01

    We tested a previously described model to assess the wildlife habitat value of New England salt marshes by comparing modeled habitat values and scores with bird abundance and species richness at sixteen salt marshes in Narragansett Bay, Rhode Island USA. As a group, wildlife habitat value assessment scores for the marshes ranged from 307 to 509, or 31-67% of the maximum attainable score. We recorded 6 species of wading birds (Ardeidae; herons, egrets, and bitterns) at the sites during biweekly surveys. Species richness (r² = 0.24, F = 4.53, p = 0.05) and abundance (r² = 0.26, F = 5.00, p = 0.04) of wading birds significantly increased with increasing assessment score. We optimized our assessment model for wading birds by using Akaike information criteria (AIC) to compare a series of models comprised of specific components and categories of our model that best reflect their habitat use. The model incorporating pre-classification, wading bird habitat categories, and natural land surrounding the sites was substantially supported by AIC analysis as the best model. The abundance of wading birds significantly increased with increasing assessment scores generated with the optimized model (r² = 0.48, F = 12.5, p = 0.003), demonstrating that optimizing models can be helpful in improving the accuracy of the assessment for a given species or species assemblage. In addition to validating the assessment model, our results show that in spite of their urban setting our study marshes provide substantial wildlife habitat value. This suggests that even small wetlands in highly urbanized coastal settings can provide important wildlife habitat value if key habitat attributes (e.g., natural buffers, habitat heterogeneity) are present. PMID:18597178

  2. A Comparison of Dose-Response Models for the Parotid Gland in a Large Group of Head-and-Neck Cancer Patients

    SciTech Connect

    Houweling, Antonetta C.; Philippens, Marielle E.P.; Dijkema, Tim; Roesink, Judith M.; Terhaard, Chris H.J.; Schilstra, Cornelis; Ten Haken, Randall K.; Eisbruch, Avraham; Raaijmakers, Cornelis P.J.

    2010-03-15

    Purpose: The dose-response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that describe the parotid gland dose response 1 year after radiotherapy. Methods and Materials: A total of 347 patients with head-and-neck tumors were included in this prospective parotid gland dose-response study. The patients were treated with either conventional radiotherapy or intensity-modulated radiotherapy. Dose-volume histograms for the parotid glands were derived from three-dimensional dose calculations using computed tomography scans. Stimulated salivary flow rates were measured before and 1 year after radiotherapy. A threshold of 25% of the pretreatment flow rate was used to define a complication. The evaluated models included the Lyman-Kutcher-Burman model, the mean dose model, the relative seriality model, the critical volume model, the parallel functional subunit model, and the dose-threshold model. The goodness of fit (GOF) was determined by the deviance and a Monte Carlo hypothesis test. Ranking of the models was based on Akaike's information criterion (AIC). Results: None of the models was rejected based on the evaluation of the GOF. The mean dose model was ranked as the best model based on the AIC. The TD50 in these models was approximately 39 Gy. Conclusions: The mean dose model was preferred for describing the dose-response relationship of the parotid gland.

  3. Preliminary analysis using multi-atlas labeling algorithms for tracing longitudinal change.

    PubMed

    Kim, Regina E Y; Lourens, Spencer; Long, Jeffrey D; Paulsen, Jane S; Johnson, Hans J

    2015-01-01

    Multicenter longitudinal neuroimaging has great potential to provide efficient and consistent biomarkers for research of neurodegenerative diseases and aging. In rare disease studies it is of primary importance to have a reliable tool that performs consistently for data from many different collection sites to increase study power. A multi-atlas labeling algorithm is a powerful brain image segmentation approach that is becoming increasingly popular in image processing. The present study examined the performance of multi-atlas labeling tools for subcortical identification using two types of in-vivo image database: Traveling Human Phantom (THP) and PREDICT-HD. We compared the accuracy (Dice Similarity Coefficient, DSC, and intraclass correlation, ICC), multicenter reliability (coefficient of variation, CV), and longitudinal reliability (volume trajectory smoothness and Akaike Information Criterion, AIC) of three automated segmentation approaches: two multi-atlas labeling tools, MABMIS and MALF, and a machine-learning-based tool, BRAINSCut. In general, MALF showed the best performance (higher DSC and ICC; lower CV and AIC; smoother trajectory), with a couple of exceptions. First, for the accumbens, where BRAINSCut showed higher reliability, it is still premature to discuss reliability levels since validity remains in doubt (DSC < 0.7, ICC < 0.7). For the caudate, BRAINSCut presented slightly better accuracy while MALF showed a significantly smoother longitudinal trajectory. We discuss advantages and limitations of these performance variations and conclude that improved segmentation quality can be achieved using multi-atlas labeling methods. While multi-atlas labeling methods are likely to help improve overall segmentation quality, caution has to be taken when one chooses an approach, as our results suggest that segmentation outcome can vary depending on research interest. PMID:26236182

  4. Postural laterality in Iberian ibex Capra pyrenaica: effects of age, sex and nursing suggest stress and social information.

    PubMed

    Sarasa, Mathieu; Soriguer, Ramón C; Serrano, Emmanuel; Granados, José-Enrique; Pérez, Jesús M

    2014-01-01

    Most studies of lateralized behaviour have to date focused on active behaviour such as sensorial perception and locomotion and little is known about lateralized postures, such as lying, that can potentially magnify the effectiveness of lateralized perception and reaction. Moreover, the relative importance of factors such as sex, age and the stress associated with social status in laterality is now a subject of increasing interest. In this study, we assess the importance of sex, age and reproductive investment in females in lying laterality in the Iberian ibex (Capra pyrenaica). Using generalized additive models under an information-theoretic approach based on the Akaike information criterion, we analyzed lying laterality of 78 individually marked ibexes. Sex, age and nursing appeared as key factors associated, in interaction and non-linearly, with lying laterality. Beyond the benefits of studying laterality with non-linear models, our results highlight the fact that a combination of static factors such as sex, and dynamic factors such as age and stress associated with parental care, are associated with postural laterality. PMID:24611891

  5. Human benzene metabolism following occupational and environmental exposures.

    PubMed

    Rappaport, Stephen M; Kim, Sungkyoon; Lan, Qing; Li, Guilan; Vermeulen, Roel; Waidyanatha, Suramya; Zhang, Luoping; Yin, Songnian; Smith, Martyn T; Rothman, Nathaniel

    2010-03-19

    We previously reported evidence that humans metabolize benzene via two enzymes, including a hitherto unrecognized high-affinity enzyme that was responsible for an estimated 73% of total urinary metabolites [sum of phenol (PH), hydroquinone (HQ), catechol (CA), E,E-muconic acid (MA), and S-phenylmercapturic acid (SPMA)] in nonsmoking females exposed to benzene at sub-saturating (ppb) air concentrations. Here, we used the same Michaelis-Menten-like kinetic models to individually analyze urinary levels of PH, HQ, CA and MA from 263 nonsmoking Chinese women (179 benzene-exposed workers and 84 control workers) with estimated benzene air concentrations ranging from less than 0.001 to 299 ppm. One model depicted benzene metabolism as a single enzymatic process (1-enzyme model) and the other as two enzymatic processes which competed for access to benzene (2-enzyme model). We evaluated model fits based upon the difference in values of Akaike's Information Criterion (ΔAIC), and we gauged the weights of evidence favoring the two models based upon the associated Akaike weights and Evidence Ratios. For each metabolite, the 2-enzyme model provided a better fit than the 1-enzyme model, with ΔAIC values decreasing in the order 9.511 for MA, 7.379 for PH, 1.417 for CA, and 0.193 for HQ. The corresponding weights of evidence favoring the 2-enzyme model (Evidence Ratios) were: 116.2:1 for MA, 40.0:1 for PH, 2.0:1 for CA and 1.1:1 for HQ. These results indicate that our earlier findings from models of total metabolites were driven largely by MA, representing the ring-opening pathway, and by PH, representing the ring-hydroxylation pathway. The predicted percentage of benzene metabolized by the putative high-affinity enzyme at an air concentration of 0.001 ppm was 88% based upon urinary MA and was 80% based upon urinary PH. As benzene concentrations increased, the respective percentages of benzene metabolized to MA and PH by the high-affinity enzyme decreased successively to 66 and
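
    The ΔAIC values, Akaike weights and evidence ratios quoted above are related by w ∝ exp(-ΔAIC/2); the small sketch below reproduces the reported ~116:1 evidence ratio for MA directly from its ΔAIC of 9.511 (the absolute AIC values used are arbitrary placeholders).

      import numpy as np

      def evidence_report(aic_values, labels):
          # Delta AIC, Akaike weights, and evidence ratios relative to the best model.
          aic = np.asarray(aic_values, dtype=float)
          delta = aic - aic.min()
          w = np.exp(-0.5 * delta)
          w /= w.sum()
          best = int(np.argmax(w))
          for lab, d, wi in zip(labels, delta, w):
              print(f"{lab}: dAIC = {d:.3f}, weight = {wi:.3f}, "
                    f"evidence ratio = {w[best] / wi:.1f}:1")

      # exp(0.5 * 9.511) is approximately 116, matching the reported 116.2:1 for MA.
      evidence_report([100.0, 109.511], ["2-enzyme", "1-enzyme"])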

  6. Inferential Statistics from Black Hispanic Breast Cancer Survival Data

    PubMed Central

    Khan, Hafiz M. R.; Saxena, Anshul; Ross, Elizabeth

    2014-01-01

    In this paper we test statistical probability models for breast cancer survival data by race and ethnicity. Data were collected from breast cancer patients diagnosed in the United States during the years 1973–2009. We selected a stratified random sample of Black Hispanic female patients from the Surveillance Epidemiology and End Results (SEER) database to derive the statistical probability models. We used three common model building criteria, the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and Deviance Information Criterion (DIC), to measure goodness of fit, and it was found that the Black Hispanic female patients' survival data were better fit by the exponentiated exponential probability model. A novel Bayesian method was used to derive the posterior density function for the model parameters as well as to derive the predictive inference for future response. We specifically focused on the Black Hispanic race. The Markov Chain Monte Carlo (MCMC) method was used for obtaining the summary results of the posterior parameters. Additionally, we reported predictive intervals for future survival times. These findings would be of great significance in treatment planning and healthcare resource allocation. PMID:24678273

  7. Prognosis Evaluation in Patients with Hepatocellular Carcinoma after Hepatectomy: Comparison of BCLC, TNM and Hangzhou Criteria Staging Systems

    PubMed Central

    Lu, Wu-sheng; Yan, Lu-nan; Xiao, Guang-qin; Jiang, Li; Yang, Jian; Yang, Jia-yin

    2014-01-01

    Purpose: This study aimed to evaluate the Hangzhou criteria (HC) for patients with HCC undergoing surgical resection and to identify whether this staging system is superior to other staging systems in predicting the survival of resectable HCC. Method: 774 HCC patients who underwent surgical resection between 2007 and 2009 in West China Hospital were enrolled retrospectively. Predictors of survival were identified using the Kaplan–Meier method and the Cox model. The disease state was staged by the HC, as well as by the TNM and BCLC staging systems. Prognostic powers were quantified using a linear trend χ2 test, the c-index, and the likelihood ratio (LHR) χ2 test, and correlated using Cox's regression model adjusted using the Akaike information criterion (AIC). Results: Serum AFP level (P = 0.02), tumor size (P<0.001), tumor number (P<0.001), portal vein invasion (P<0.001), hepatic vein invasion (P<0.001), tumor differentiation (P<0.001), and distant organ (P = 0.016) and lymph node metastasis (P<0.001) were identified as independent risk factors of survival after resection by multivariate analysis. The comparison of the different staging system results showed that the BCLC had the best homogeneity (likelihood ratio χ2 test 151.119, P<0.001), the TNM system had the best monotonicity of gradients (linear trend χ2 test 137.523, P<0.001), and discriminatory ability was the highest for the BCLC (the AUC for 1-year mortality was 0.759) and TNM staging systems (the AUCs for 3- and 5-year mortality were 0.738 and 0.731, respectively). However, based on the c-index and AIC, the HC was the most informative staging system in predicting survival (c-index 0.6866, AIC 5924.4729). Conclusions: The HC can provide important prognostic information after surgery. The HC were shown to be a promising survival predictor in a Chinese cohort of patients with resectable HCC. PMID:25133493

  8. The Influence of Physical Variables on Whole-Stream Metabolism in an Arctic Tundra River

    NASA Astrophysics Data System (ADS)

    Cappelletti, C. K.; Bowden, W.

    2005-05-01

    We examined the influence of light, temperature, nutrients, and discharge on whole-stream metabolism (WSM) in three experimental reaches of the Kuparuk River, Alaska, using the open-system, single-station method. Ambient PO4 levels in the reference reach were ~0.05μM, while addition of phosphoric acid since 1983 in the fertilized reach and since 2004 in the ultra-fertilized reach increased PO4 levels to ~0.30μM and ~0.90μM, respectively. Among all reaches, gross primary production (GPP) was positively correlated with light, temperature, and PO4 and negatively correlated with discharge. Temperature explained most of the variance in GPP. Among all reaches, community respiration (CR) was weakly correlated with light, temperature, PO4, and discharge. However, CR showed a greater response to temperature in the fertilized reaches. Benthic respiration by mosses in the fertilized reaches responds to temperature while heterotrophic respiration in the hyporheic zone is similar in all reaches and does not respond to temperature due to thermal buffering. Light, temperature, and discharge were moderately intercorrelated. An Information-Theoretic approach using Akaike's Information Criterion (AIC) was used to examine the relative importance of each physical variable on WSM and to develop a photosynthesis model.

  9. Effect of ultrasound pre-treatment on the drying kinetics of brown seaweed Ascophyllum nodosum.

    PubMed

    Kadam, Shekhar U; Tiwari, Brijesh K; O'Donnell, Colm P

    2015-03-01

    The effect of ultrasound pre-treatment on the drying kinetics of the brown seaweed Ascophyllum nodosum under hot-air convective drying was investigated. Pre-treatments were carried out at ultrasound intensity levels ranging from 7.00 to 75.78 W cm⁻² for 10 min using an ultrasonic probe system. It was observed that ultrasound pre-treatments reduced the drying time required. The shortest drying times were obtained from samples pre-treated at 75.78 W cm⁻². The fit quality of 6 thin-layer drying models was also evaluated using the coefficient of determination (R²), root mean square error (RMSE), AIC (Akaike information criterion) and BIC (Bayesian information criterion). Drying kinetics were modelled using the Newton, Henderson and Pabis, Page, Wang and Singh, Midilli et al. and Weibull models. The Newton, Wang and Singh, and Midilli et al. models showed the best fit to the experimental drying data. The color of ultrasound pre-treated dried seaweed samples was lighter compared to control samples. It was concluded that ultrasound pre-treatment can be effectively used to reduce the energy cost and drying time for drying of A. nodosum. PMID:25454823

  10. Real time detection of farm-level swine mycobacteriosis outbreak using time series modeling of the number of condemned intestines in abattoirs.

    PubMed

    Adachi, Yasumoto; Makita, Kohei

    2015-09-01

    Mycobacteriosis in swine is a common zoonosis found in abattoirs during meat inspections, and the veterinary authority is expected to inform the producer for corrective actions when an outbreak is detected. The expected value of the number of condemned carcasses due to mycobacteriosis therefore would be a useful threshold to detect an outbreak, and the present study aims to develop such an expected value through time series modeling. The model was developed using eight years of inspection data (2003 to 2010) obtained at 2 abattoirs of the Higashi-Mokoto Meat Inspection Center, Japan. The resulting model was validated by comparing the predicted time-dependent values for the subsequent 2 years with the actual data for 2 years between 2011 and 2012. For the modeling, at first, periodicities were checked using Fast Fourier Transformation, and the ensemble average profiles for weekly periodicities were calculated. An Auto-Regressive Integrated Moving Average (ARIMA) model was fitted to the residual of the ensemble average on the basis of minimum Akaike's information criterion (AIC). The sum of the ARIMA model and the weekly ensemble average was regarded as the time-dependent expected value. During 2011 and 2012, the number of whole or partial condemned carcasses exceeded the 95% confidence interval of the predicted values 20 times. All of these events were associated with the slaughtering of pigs from three producers with the highest rate of condemnation due to mycobacteriosis. PMID:25913899
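
    The minimum-AIC ARIMA selection described here can be sketched with statsmodels; the grid bounds, simulated residual series and forecast horizon below are illustrative only and do not reproduce the inspection-center data.

      import itertools
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      def fit_min_aic(series, max_p=3, max_q=3, d=0):
          # Fit ARIMA(p, d, q) over a small grid and keep the minimum-AIC result.
          best = None
          for p, q in itertools.product(range(max_p + 1), range(max_q + 1)):
              try:
                  res = ARIMA(series, order=(p, d, q)).fit()
              except Exception:
                  continue
              if best is None or res.aic < best.aic:
                  best = res
          return best

      # Hypothetical weekly residuals (observed counts minus the weekly ensemble average).
      rng = np.random.default_rng(1)
      resid = rng.normal(size=8 * 52)
      model = fit_min_aic(resid)
      ci = model.get_forecast(steps=52).conf_int(alpha=0.05)
      # Counts above the upper bound of ci would flag a possible outbreak.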

  11. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES Beta

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are firstly classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, the variety of wind speed values and wind power curtailment.
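
    The sketch below illustrates the component-selection step: a one-component and a two-component Weibull mixture are fitted by numerical maximum likelihood and compared by AIC and BIC. The synthetic sample, starting values, and bounds are assumptions for illustration only, not the aggregated wind power data of the study.

```python
# Sketch: compare 1- vs 2-component Weibull mixtures by AIC/BIC.
# Synthetic data; not the aggregated wind power measurements of the study.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

x = np.concatenate([
    weibull_min.rvs(1.8, scale=4.0, size=600, random_state=1),
    weibull_min.rvs(4.0, scale=9.0, size=400, random_state=2),
])

def nll_1(theta):
    c, s = theta
    return -np.sum(weibull_min.logpdf(x, c, scale=s))

def nll_2(theta):
    w, c1, s1, c2, s2 = theta
    pdf = (w * weibull_min.pdf(x, c1, scale=s1)
           + (1 - w) * weibull_min.pdf(x, c2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))

fit1 = minimize(nll_1, x0=[2.0, 5.0], bounds=[(0.1, 20), (0.1, 50)])
fit2 = minimize(nll_2, x0=[0.5, 1.5, 3.0, 3.0, 8.0],
                bounds=[(0.01, 0.99), (0.1, 20), (0.1, 50), (0.1, 20), (0.1, 50)])

n = x.size
for name, fit, k in [("1-Weibull", fit1, 2), ("2-Weibull mix", fit2, 5)]:
    aic = 2 * k + 2 * fit.fun
    bic = k * np.log(n) + 2 * fit.fun
    print(f"{name:14s} AIC={aic:9.1f}  BIC={bic:9.1f}")
```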

  12. The Interdependence between Rainfall and Temperature: Copula Analyses

    PubMed Central

    Cong, Rong-Gang; Brady, Mark

    2012-01-01

    Rainfall and temperature are important climatic inputs for agricultural production, especially in the context of climate change. However, accurate analysis and simulation of the joint distribution of rainfall and temperature are difficult due to possible interdependence between them. As one possible approach to this problem, five families of copula models are employed to model the interdependence between rainfall and temperature. Scania is a leading agricultural province in Sweden and is affected by a maritime climate. Historical climatic data for Scania are used to demonstrate the modeling process. Heteroscedasticity and autocorrelation of the sample data are also considered to eliminate the possibility of observation error. The results indicate that for Scania there are negative correlations between rainfall and temperature for the months from April to July and September. The Student's t copula is found to be the most suitable for modeling the bivariate distribution of rainfall and temperature based on the Akaike information criterion (AIC) and Bayesian information criterion (BIC). Using the Student's t copula, we simulate temperature and rainfall simultaneously. The resulting models can be integrated with research on agricultural production and planning to study the effects of changing climate on crop yields. PMID:23213286
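
    The sketch below shows the general pattern of fitting one copula family to rainfall-temperature pseudo-observations and scoring it with AIC; here a Gaussian copula is used because its density is compact, and the Student's t, Frank and other families would be fitted and compared analogously. The data are synthetic placeholders, not the Scania records.

```python
# Sketch: fit a bivariate Gaussian copula to pseudo-observations and compute AIC.
# Other families (Student's t, Frank, ...) would be fitted and compared the same way.
# Synthetic data; not the Scania rainfall/temperature records.
import numpy as np
from scipy.stats import norm, rankdata
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
temp = rng.normal(15, 3, 300)
rain = 60 - 1.5 * temp + rng.normal(0, 8, 300)   # negatively related, as in summer months

def pseudo_obs(v):
    return rankdata(v) / (len(v) + 1.0)

z1 = norm.ppf(pseudo_obs(rain))
z2 = norm.ppf(pseudo_obs(temp))

def neg_loglik(rho):
    # bivariate Gaussian copula log-density evaluated at (z1, z2)
    ll = (-0.5 * np.log(1 - rho**2)
          + (2 * rho * z1 * z2 - rho**2 * (z1**2 + z2**2)) / (2 * (1 - rho**2)))
    return -np.sum(ll)

fit = minimize_scalar(neg_loglik, bounds=(-0.99, 0.99), method="bounded")
aic = 2 * 1 + 2 * fit.fun                        # one free parameter (rho)
print(f"rho_hat={fit.x:.3f}, loglik={-fit.fun:.1f}, AIC={aic:.1f}")
```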

  13. Prediction and extension of curves of distillation of vacuum residue using probability functions

    NASA Astrophysics Data System (ADS)

    León, A. Y.; Riaño, P. A.; Laverde, D.

    2016-02-01

    Probability functions have been used to predict crude-oil distillation curves in various characterization studies for refining processes. In this work, four probability functions (Weibull extreme, Weibull, Kumaraswamy and Riazi) were analyzed for fitting distillation curves of vacuum residues. After analysis of the experimental data, the Weibull extreme function was selected as the best prediction function, and its fitting capability was validated using the AIC (Akaike Information Criterion), BIC (Bayesian Information Criterion), and the correlation coefficient R2 as estimation criteria. Fifty-five (55) vacuum residues derived from different hydrocarbon mixtures were selected to cover a wide range of compositions. The parameters of the Weibull extreme probability function were adjusted from simple measured properties such as Conradson Carbon Residue (CCR) and compositional SARA analysis (saturates, aromatics, resins and asphaltenes). The proposed method is an appropriate tool for describing the trend of distillation curves and offers a practical approach to the classification of vacuum residues.

  14. Joint inversion of T1-T2 spectrum combining the iterative truncated singular value decomposition and the parallel particle swarm optimization algorithms

    NASA Astrophysics Data System (ADS)

    Ge, Xinmin; Wang, Hua; Fan, Yiren; Cao, Yingchang; Chen, Hua; Huang, Rui

    2016-01-01

    With more information than the conventional one-dimensional (1D) longitudinal relaxation time (T1) and transversal relaxation time (T2) spectra, a two-dimensional (2D) T1-T2 spectrum in low-field nuclear magnetic resonance (NMR) has been developed to discriminate the relaxation components of fluids such as water, oil and gas in porous rock. However, the accuracy and efficiency of the T1-T2 spectrum are limited by the existing inversion algorithms and data acquisition schemes. We introduce a joint method to invert the T1-T2 spectrum, which combines iterative truncated singular value decomposition (TSVD) and a parallel particle swarm optimization (PSO) algorithm to obtain fast computational speed and stable solutions. We reorganize the Fredholm integral equation of the first kind with two kernels into a nonlinear optimization problem with non-negative constraints, and then solve the ill-conditioned problem by iterative TSVD. Truncating positions of the two diagonal matrices are obtained by the Akaike information criterion (AIC). With the initial values obtained by TSVD, we use a PSO with a parallel structure to obtain global optimal solutions with high computational speed. We use synthetic data with different signal-to-noise ratios (SNR) to test the performance of the proposed method. The results show that the new inversion algorithm can achieve favorable solutions for signals with SNR larger than 10, and that the inversion precision increases as the number of components in the porous rock decreases.

  15. The kinetics of fluoride sorption by zeolite: Effects of cadmium, barium and manganese.

    PubMed

    Cai, Qianqian; Turner, Brett D; Sheng, Daichao; Sloan, Scott

    2015-01-01

    Industrial wastewaters often consist of a complex chemical cocktail, with treatment of target contaminants complicated by adverse chemical reactions. The impact of metal ions (Cd(2+), Ba(2+) and Mn(2+)) on the kinetics of fluoride removal from solution by natural zeolite was investigated. In order to better understand the kinetics, the pseudo-second order (PSO), Hill (Hill 4 and Hill 5) and intra-particle diffusion (IPD) models were applied. Model fitting was compared using the Akaike Information Criterion (AIC) and the Schwarz Bayesian Information Criterion (BIC). The Hill models (Hill 4 and Hill 5) were found to be superior in describing the fluoride removal processes due to the sigmoidal nature of the kinetics. Results indicate that the presence of Mn (100 mg L(-1)) and Cd (100 mg L(-1)) increases the rate of fluoride sorption by factors of ~28.3 and ~10.9, respectively, while the maximum sorption capacity is increased by factors of ~2.2 and ~1.7. The presence of Ba (100 mg L(-1)) initially inhibited fluoride removal and very poor fits were obtained for all models. Fitting was best described with a biphasic sigmoidal model, with the degree of inhibition decreasing with increasing temperature, suggesting that at least two processes are involved in fluoride sorption onto natural zeolite in the presence of Ba. PMID:25909159

  16. The kinetics of fluoride sorption by zeolite: Effects of cadmium, barium and manganese

    NASA Astrophysics Data System (ADS)

    Cai, Qianqian; Turner, Brett D.; Sheng, Daichao; Sloan, Scott

    2015-06-01

    Industrial wastewaters often consist of a complex chemical cocktail, with treatment of target contaminants complicated by adverse chemical reactions. The impact of metal ions (Cd2+, Ba2+ and Mn2+) on the kinetics of fluoride removal from solution by natural zeolite was investigated. In order to better understand the kinetics, the pseudo-second order (PSO), Hill (Hill 4 and Hill 5) and intra-particle diffusion (IPD) models were applied. Model fitting was compared using the Akaike Information Criterion (AIC) and the Schwarz Bayesian Information Criterion (BIC). The Hill models (Hill 4 and Hill 5) were found to be superior in describing the fluoride removal processes due to the sigmoidal nature of the kinetics. Results indicate that the presence of Mn (100 mg L-1) and Cd (100 mg L-1) increases the rate of fluoride sorption by factors of ~28.3 and ~10.9, respectively, while the maximum sorption capacity is increased by factors of ~2.2 and ~1.7. The presence of Ba (100 mg L-1) initially inhibited fluoride removal and very poor fits were obtained for all models. Fitting was best described with a biphasic sigmoidal model, with the degree of inhibition decreasing with increasing temperature, suggesting that at least two processes are involved in fluoride sorption onto natural zeolite in the presence of Ba.

  17. Flexible and fixed mathematical models describing growth patterns of chukar partridges

    NASA Astrophysics Data System (ADS)

    Aygün, Ali; Narinç, Doǧan

    2016-04-01

    In animal science, the nonlinear regression models used for analysis of growth patterns are separated into two groups, called fixed and flexible, according to their point of inflection. The aims of this study were to compare fixed and flexible growth functions and to determine the best-fitting model for the growth data of chukar partridges. With this aim, the growth data of partridges were modeled with widely used models such as the Gompertz, Logistic and Von Bertalanffy functions, as well as flexible functions such as the Richards, Janoschek and Levakovich functions. To evaluate the growth functions, the R2 (coefficient of determination), adjusted R2 (adjusted coefficient of determination), MSE (mean square error), AIC (Akaike's information criterion) and BIC (Bayesian information criterion) goodness-of-fit criteria were used. According to these goodness-of-fit criteria, the best-fitting model for the chukar partridge growth data was the Janoschek function, which has a flexible structure. The Janoschek model is important not only because it has a higher number of parameters with biological meaning than the other functions (the mature weight and initial weight parameters), but also because it had not previously been used in modeling chukar partridge growth.

  18. Evaluating Key Watershed Components of Low Flow Regimes in New England Streams.

    PubMed

    Morrison, Alisa C; Gold, Arthur J; Pelletier, Marguerite C

    2016-05-01

    Water resource managers seeking to optimize stream ecosystem services and abstractions of water from watersheds need an understanding of the importance of land use, physical and climatic characteristics, and hydrography on different low flow components of stream hydrographs. Within 33 USGS gaged watersheds of southern New England, we assessed relationships between watershed variables and a set of low flow parameters by using an information-theoretic approach. The key variables identified by Akaike Information Criterion (AIC) weights as generating positive relationships with low flow events included percent stratified drift, mean elevation, drainage area, and mean August precipitation. The extent of wetlands in the watershed was negatively related to low flow magnitudes. Of the various land use variables, the percentage of developed land was found to have the highest importance and a negative relationship with low flow magnitudes, but was less important than wetlands and physical and climatic features. Our results suggest that management practices aimed at sustaining low flows in fluvial systems can benefit from attention to specific watershed features. We draw attention to the finding that streams located in watersheds with high proportions of wetlands may require more stringent approaches to withdrawals to sustain fluvial ecosystems during drought periods, particularly in watersheds with extensive development and limited deposits of stratified drift. PMID:27136170
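
    A minimal sketch of how Akaike weights can be computed from the AIC values of a candidate model set, and summed to obtain the relative importance of a watershed variable, is given below. The model names and AIC values are invented placeholders, not results from the New England watershed analysis.

```python
# Sketch: Akaike weights and a variable-importance sum from candidate-model AICs.
# Model names and AIC values are invented placeholders.
import numpy as np

models = {
    "drift + elevation": 212.4,
    "drift + wetlands": 214.9,
    "drift + elevation + wetlands": 213.1,
    "developed + wetlands": 219.6,
}

aics = np.array(list(models.values()))
delta = aics - aics.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()

for (name, aic), w in zip(models.items(), weights):
    print(f"{name:30s} AIC={aic:6.1f}  dAIC={aic - aics.min():5.1f}  w={w:.3f}")

# relative importance of 'wetlands' = sum of weights of models containing it
importance = sum(w for (name, _), w in zip(models.items(), weights) if "wetlands" in name)
print("relative importance of wetlands:", round(importance, 3))
```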

  19. Modelling lactation curve for milk fat to protein ratio in Iranian buffaloes (Bubalus bubalis) using non-linear mixed models.

    PubMed

    Hossein-Zadeh, Navid Ghavi

    2016-08-01

    The aim of this study was to compare seven non-linear mathematical models (Brody, Wood, Dhanoa, Sikka, Nelder, Rook and Dijkstra) to examine their efficiency in describing the lactation curves for milk fat to protein ratio (FPR) in Iranian buffaloes. Data were 43 818 test-day records for FPR from the first three lactations of Iranian buffaloes which were collected on 523 dairy herds in the period from 1996 to 2012 by the Animal Breeding Center of Iran. Each model was fitted to monthly FPR records of buffaloes using the non-linear mixed model procedure (PROC NLMIXED) in SAS and the parameters were estimated. The models were tested for goodness of fit using Akaike's information criterion (AIC), Bayesian information criterion (BIC) and log maximum likelihood (-2 Log L). The Nelder and Sikka mixed models provided the best fit of lactation curve for FPR in the first and second lactations of Iranian buffaloes, respectively. However, Wood, Dhanoa and Sikka mixed models provided the best fit of lactation curve for FPR in the third parity buffaloes. Evaluation of first, second and third lactation features showed that all models, except for Dijkstra model in the third lactation, under-predicted test time at which daily FPR was minimum. On the other hand, minimum FPR was over-predicted by all equations. Evaluation of the different models used in this study indicated that non-linear mixed models were sufficient for fitting test-day FPR records of Iranian buffaloes. PMID:27600968

  20. Multimodel Predictive System for Carbon Dioxide Solubility in Saline Formation Waters

    SciTech Connect

    Wang, Zan; Small, Mitchell J; Karamalidis, Athanasios K

    2013-02-05

    The prediction of carbon dioxide solubility in brine at conditions relevant to carbon sequestration (i.e., high temperature, pressure, and salt concentration (T-P-X)) is crucial when this technology is applied. Eleven mathematical models for predicting CO{sub 2} solubility in brine are compared and considered for inclusion in a multimodel predictive system. Model goodness of fit is evaluated over the temperature range 304–433 K, pressure range 74–500 bar, and salt concentration range 0–7 m (NaCl equivalent), using 173 published CO{sub 2} solubility measurements, particularly selected for those conditions. The performance of each model is assessed using various statistical methods, including the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Different models emerge as best fits for different subranges of the input conditions. A classification tree is generated using machine learning methods to predict the best-performing model under different T-P-X subranges, allowing development of a multimodel predictive system (MMoPS) that selects and applies the model expected to yield the most accurate CO{sub 2} solubility prediction. Statistical analysis of the MMoPS predictions, including a stratified 5-fold cross validation, shows that MMoPS outperforms each individual model and increases the overall accuracy of CO{sub 2} solubility prediction across the range of T-P-X conditions likely to be encountered in carbon sequestration applications.

  1. Modeling Dark Energy Through AN Ising Fluid with Network Interactions

    NASA Astrophysics Data System (ADS)

    Luongo, Orlando; Tommasini, Damiano

    2014-12-01

    We show that dark energy (DE) effects can be modeled by using an Ising perfect fluid with network interactions, whose low-redshift equation of state (EoS), i.e. ω0, becomes ω0 = -1 as in the ΛCDM model. In our picture, DE is characterized by a barotropic fluid on a lattice in the equilibrium configuration. Thus, mimicking the spin interaction by replacing the spin variable with an occupational number, the pressure naturally becomes negative. We find that the corresponding EoS mimics the effects of a variable DE term, whose limiting case reduces to the cosmological constant Λ. This permits us to avoid introducing a vacuum energy as the DE source by hand, alleviating the coincidence and fine-tuning problems. We find fairly good cosmological constraints by performing three tests with supernovae Ia (SNeIa), baryonic acoustic oscillation (BAO) and cosmic microwave background (CMB) measurements. Finally, we apply the Akaike information criterion (AIC) and Bayesian information criterion (BIC) selection criteria, showing that our model is statistically favored with respect to the Chevallier-Polarski-Linder (CPL) parametrization.

  2. An investigation on the estimation of evaporation by combining artificial neural network and dynamic factor analysis

    NASA Astrophysics Data System (ADS)

    Sun, W.; Chiang, Y.; Chang, F.

    2010-12-01

    Evaporation is a substantial factor in the hydrological cycle and an important reference for the management of both water resources and agricultural irrigation. In general, evaporation can be measured directly by an evaporation pan. As for its estimation, the accuracy of traditional empirical equations is limited. Therefore, in this study Dynamic Factor Analysis (DFA) is first applied to investigate the interaction and the tendency of each gauging station. Additionally, the analysis can effectively establish the common trend at each gauging station by evaluating the corresponding AIC (Akaike Information Criterion) values. Furthermore, meteorological factors such as relative humidity and temperature are also examined to identify the explanatory variables which are most strongly related to evaporation. These variables are then used as inputs to a Back-Propagation Neural Network (BPNN) and are expected to provide meaningful information for successfully estimating evaporation. The applicability and reliability of the BPNN were demonstrated by comparing its performance with that of empirical formulas. Keywords: Evaporation, Dynamic Factor Analysis, Artificial Neural Network.

  3. Comparison of Two Gas Selection Methodologies: An Application of Bayesian Model Averaging

    SciTech Connect

    Renholds, Andrea S.; Thompson, Sandra E.; Anderson, Kevin K.; Chilton, Lawrence K.

    2006-03-31

    One goal of hyperspectral imagery analysis is the detection and characterization of plumes. Characterization includes identifying the gases in the plumes, which is a model selection problem. Two gas selection methods compared in this report are Bayesian model averaging (BMA) and minimum Akaike information criterion (AIC) stepwise regression (SR). Simulated spectral data from a three-layer radiance transfer model were used to compare the two methods. Test gases were chosen to span the types of spectra observed, which exhibit peaks ranging from broad to sharp. The size and complexity of the search libraries were varied. Background materials were chosen to either replicate a remote area of eastern Washington or feature many common background materials. For many cases, BMA and SR performed the detection task comparably in terms of the receiver operating characteristic curves. For some gases, BMA performed better than SR when the size and complexity of the search library increased. This is encouraging because we expect improved BMA performance upon incorporation of prior information on background materials and gases.
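
    The sketch below shows a generic forward-stepwise regression driven by minimum AIC, in the spirit of the SR method compared in this report; the simulated predictors and response are assumptions, not the radiance-transfer spectra or gas libraries used in the study.

```python
# Sketch: forward stepwise OLS selection by minimum AIC (generic illustration,
# not the report's SR code). Synthetic predictors and response.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 200
X = pd.DataFrame(rng.normal(size=(n, 6)), columns=[f"g{i}" for i in range(6)])
y = 1.0 + 2.0 * X["g0"] - 1.5 * X["g3"] + rng.normal(0, 1, n)

selected, remaining = [], list(X.columns)
current_aic = sm.OLS(y, np.ones((n, 1))).fit().aic    # intercept-only model

while remaining:
    trials = []
    for col in remaining:
        aic = sm.OLS(y, sm.add_constant(X[selected + [col]])).fit().aic
        trials.append((aic, col))
    best_aic, best_col = min(trials)
    if best_aic >= current_aic:        # stop when no candidate lowers AIC
        break
    selected.append(best_col)
    remaining.remove(best_col)
    current_aic = best_aic

print("selected predictors:", selected, "AIC =", round(current_aic, 1))
```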

  4. Selecting best-fit models for estimating the body mass from 3D data of the human calcaneus.

    PubMed

    Jung, Go-Un; Lee, U-Young; Kim, Dong-Ho; Kwak, Dai-Soon; Ahn, Yong-Woo; Han, Seung-Ho; Kim, Yi-Suk

    2016-05-01

    Body mass (BM) estimation could facilitate the interpretation of skeletal materials in terms of the individual's body size and physique in forensic anthropology. However, few metric studies have tried to estimate BM by focusing on prominent biomechanical properties of the calcaneus. The purpose of this study was to prepare best-fit models for estimating BM from the 3D human calcaneus by two major linear regression approaches (the heuristic statistical and all-possible-regressions techniques) and to validate the models through predicted residual sum of squares (PRESS) statistics. A metric analysis was conducted on 70 human calcaneus samples (29 males and 41 females) taken from 3D models in the Digital Korean Database, and 10 variables were measured for each sample. Three best-fit models were selected by F-statistics, Mallows' Cp, and the Akaike information criterion (AIC) and Bayesian information criterion (BIC) from the available candidate models. The most accurate regression model yields the lowest %SEE and an R(2) of 0.843. Leave-one-out cross-validation indicated a high level of predictive accuracy. This study also confirms that the equations for estimating BM using 3D models of the human calcaneus will be helpful for establishing identification in forensic cases with consistent reliability. PMID:26970867
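
    To illustrate the PRESS validation step described above, the sketch below computes the PRESS statistic for an ordinary-least-squares model from leave-one-out (deleted) residuals obtained through the hat matrix; the predictors and response are synthetic stand-ins, not the calcaneus measurements.

```python
# Sketch: PRESS statistic for an OLS model via the hat matrix
# (deleted residuals e_i / (1 - h_ii)); synthetic data, not the calcaneus set.
import numpy as np

rng = np.random.default_rng(5)
n = 70
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])  # intercept + 3 measurements
beta_true = np.array([60.0, 5.0, -2.0, 3.0])
y = X @ beta_true + rng.normal(0, 4, n)                     # body mass, hypothetical

beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
H = X @ np.linalg.inv(X.T @ X) @ X.T
h = np.diag(H)

press = float(np.sum((resid / (1 - h))**2))
ss_tot = float(np.sum((y - y.mean())**2))
print(f"PRESS = {press:.1f}, predictive R2 = {1 - press / ss_tot:.3f}")
```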

  5. Calu-3 model under AIC and LCC conditions and application for protein permeability studies.

    PubMed

    Marušić, Maja; Djurdjevič, Ida; Drašlar, Kazimir; Caserman, Simon

    2014-01-01

    The broad area of the respiratory epithelium, with its mild surface conditions, is an attractive possibility when trans-mucosal delivery of protein drugs is considered. The mucus and cellular barrier of the respiratory epithelium can be modelled in vitro by the Calu-3 cell line. We monitored the morphology and barrier properties of Calu-3 cultures on permeable supports while they developed into liquid-covered or air-interfaced, mucus-lined cellular barriers. Besides morphological differences, the cultures differed in electrical resistance and permeability to proteins as well. The accelerated permeability to proteins in these models, due to the permeability modulator MP C16, was examined. The effect on the electrical resistance of the cellular layer was rapid in both cultures, suggesting easy access of MP C16 to the cells, even though its overall impact on cell permeability was strongly reduced in the mucus-covered culture. Differences in the properties of the two models enable better understanding of protein transmucosal permeability, suggesting the route of transport and the action of the MP C16 modulator. PMID:24664333

  6. INFORMATION COLLECTION RULE INFORMATION SYSTEM

    EPA Science Inventory

    Resource Purpose: The Information Collection Rule (ICR) Information System was developed to store and distribute the information collected in the ICR for DBPs and microbiological research. It is a research database. The information system consists of four parts: laboratory...

  7. 16. Information.

    PubMed

    2014-05-01

    Effective disaster management requires systems for data acquisition and information management that enable responders to rapidly collect, process, interpret, distribute, and access the data and information required for disaster management. Effective information sharing depends on the types of users, the type of damage, alterations of the functional status of the affected society, and how the information is structured. Those in need of information should be provided with the information necessary for their tasks and not be overloaded with unnecessary information that could serve as a distraction. Such information systems must be designed and exercised. To disseminate and share data with the relevant users, all disaster responses must include effective and reliable information systems. This information includes that acquired from repeated assessments in terms of available and needed human and material resources, which resources no longer are needed, and the status of the relief and recovery workers. It is through this information system that vital decisions are made that are congruent with the overall picture as perceived by the most relevant coordination and control centre. It is essential that information systems be designed and tested regularly as part of preparedness. Such systems must have the capacity to acquire, classify, and present information in an organised and useful manner. PMID:24785813

  8. Double-input compartmental modeling and spectral analysis for the quantification of positron emission tomography data in oncology

    NASA Astrophysics Data System (ADS)

    Tomasi, G.; Kimberley, S.; Rosso, L.; Aboagye, E.; Turkheimer, F.

    2012-04-01

    In positron emission tomography (PET) studies involving organs different from the brain, ignoring the metabolite contribution to the tissue time-activity curves (TAC), as in the standard single-input (SI) models, may compromise the accuracy of the estimated parameters. We employed here double-input (DI) compartmental modeling (CM), previously used for [11C]thymidine, and a novel DI spectral analysis (SA) approach on the tracers 5-[18F]fluorouracil (5-[18F]FU) and [18F]fluorothymidine ([18F]FLT). CM and SA were performed initially with a SI approach using the parent plasma TAC as an input function. These methods were then employed using a DI approach with the metabolite plasma TAC as an additional input function. Regions of interest (ROIs) corresponding to healthy liver, kidneys and liver metastases for 5-[18F]FU and to tumor, vertebra and liver for [18F]FLT were analyzed. For 5-[18F]FU, the improvement of the fit quality with the DI approaches was remarkable; in CM, the Akaike information criterion (AIC) always selected the DI over the SI model. Volume of distribution estimates obtained with DI CM and DI SA were in excellent agreement, for both parent 5-[18F]FU (R2 = 0.91) and metabolite [18F]FBAL (R2 = 0.99). For [18F]FLT, the DI methods provided notable improvements but less substantial than for 5-[18F]FU due to the lower rate of metabolism of [18F]FLT. On the basis of the AIC values, agreement between [18F]FLT Ki estimated with the SI and DI models was good (R2 = 0.75) for the ROIs where the metabolite contribution was negligible, indicating that the additional input did not bias the parent tracer only-related estimates. When the AIC suggested a substantial contribution of the metabolite [18F]FLT-glucuronide, on the other hand, the change in the parent tracer only-related parameters was significant (R2 = 0.33 for Ki). Our results indicated that improvements of DI over SI approaches can range from moderate to substantial and are more significant for tracers with

  9. Informed Consent

    MedlinePlus

    Learn about informed consent, a process you go through before getting a ...

  10. Information "Literacies"

    ERIC Educational Resources Information Center

    Anderson, Byron

    2007-01-01

    As communication technologies change, so do libraries. Library instruction programs are now focused on teaching information literacy, a term that may just as well be referred to as information "literacies." The new media age involves information in a wide variety of mediums. Educators everywhere are realizing media's power to communicate and…

  11. Information Integrity

    ERIC Educational Resources Information Center

    Graves, Eric

    2013-01-01

    This dissertation introduces the concept of Information Integrity, which is the detection and possible correction of information manipulation by any intermediary node in a communication system. As networks continue to grow in complexity, information theoretic security has failed to keep pace. As a result, many parties who want to communicate,…

  12. Population demographics of two local South Carolina mourning dove populations

    USGS Publications Warehouse

    McGowan, D.P., Jr.; Otis, D.L.

    1998-01-01

    The mourning dove (Zenaida macroura) call-count index had a significant (P 2,300 doves and examined >6,000 individuals during harvest bag checks. An age-specific band recovery model with time- and area-specific recovery rates, and constant survival rates, was chosen for estimation via Akaike's Information Criterion (AIC), likelihood ratio, and goodness-of-fit criteria. After-hatching-year (AHY) annual survival rate was 0.359 (SE = 0.056), and hatching-year (HY) annual survival rate was 0.118 (SE = 0.042). Average estimated recruitment per adult female into the prehunting season population was 3.40 (SE = 1.25) and 2.32 (SE = 0.46) for the 2 study areas. Our movement data support earlier hypotheses of nonmigratory breeding and harvested populations in South Carolina. Low survival rates and estimated population growth rate in the study areas may be representative only of small-scale areas that are heavily managed for dove hunting. Source-sink theory was used to develop a model of region-wide populations that is composed of source areas with positive growth rates and sink areas of declining growth. We suggest management of mourning doves in the Southeast might benefit from improved understanding of local population dynamics, as opposed to regional-scale population demographics.

  13. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation which results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example reproductive state of male and female and operational sex-ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, occurrence of female copulation calls, sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted by time or energy-allocation problems. PMID:21710159

  14. Genomic evidence for polyphyletic origins and interlineage gene flow within complex taxa: a case study of Picea brachytyla in the Qinghai-Tibet Plateau.

    PubMed

    Ru, Dafu; Mao, Kangshan; Zhang, Lei; Wang, Xiaojuan; Lu, Zhiqiang; Sun, Yongshuai

    2016-06-01

    Hybridization and introgression are believed to play important roles in plant evolution. However, few empirical studies have been designed to clarify the ways in which these processes complicate taxonomic delimitation. Recent phylogenetic studies based on a number of different DNA fragments have indicated that Picea brachytyla in the eastern Qinghai-Tibet Plateau is polyphyletic, a finding that contrasts with traditional taxonomy based on morphological traits. We aimed to test this conflict using transcriptomic data from 26 trees collected from multiple localities for this and related species. Our phylogenomic analyses suggest that the sampled trees of P. brachytyla cluster into two distinct lineages corresponding to the two taxonomically recognized intraspecific varieties: var. brachytyla and var. complanata. However, var. complanata nested within Picea likiangensis and was sister to one of its three varieties, while var. brachytyla comprised an isolated lineage. The polyphyletic origin hypothesis was further supported by likelihood tree comparisons using Akaike's information criterion (AIC) and by coalescent analyses under the snapp model. However, our abba-baba and ∂a∂i analyses suggest that gene flow between these two independently evolved lineages has been extensive and bidirectional. Introgression, as well as parallel evolution in the arid habitats common to both lineages, may have given rise to their morphological similarity. Our study highlights the importance of genomic evidence and the use of newly developed coalescent analysis methods for clarifying the evolutionary complexity of certain plant taxa. PMID:27093071

  15. Land-use and land-cover change in Western Ghats of India.

    PubMed

    Kale, Manish P; Chavan, Manoj; Pardeshi, Satish; Joshi, Chitiz; Verma, Prabhakar A; Roy, P S; Srivastav, S K; Srivastava, V K; Jha, A K; Chaudhari, Swapnil; Giri, Yogesh; Krishna Murthy, Y V N

    2016-07-01

    The Western Ghats (WG) of India, one of the hottest biodiversity hotspots in the world, has witnessed major land-use and land-cover (LULC) change in recent times. The present research was aimed at studying the patterns of LULC change in WG during 1985-1995-2005, understanding the major drivers that caused such change, and projecting the future (2025) spatial distribution of forest using coupled logistic regression and Markov model. The International Geosphere Biosphere Program (IGBP) classification scheme was mainly followed in LULC characterization and change analysis. The single-step Markov model was used to project the forest demand. The spatial allocation of such forest demand was based on the predicted probabilities derived through logistic regression model. The R statistical package was used to set the allocation rules. The projection model was selected based on Akaike information criterion (AIC) and area under receiver operating characteristic (ROC) curve. The actual and projected areas of forest in 2005 were compared before making projection for 2025. It was observed that forest degradation has reduced from 1985-1995 to 1995-2005. The study obtained important insights about the drivers and their impacts on LULC simulations. To the best of our knowledge, this is the first attempt where projection of future state of forest in entire WG is made based on decadal LULC and socio-economic datasets at the Taluka (sub-district) level. PMID:27256392

  16. Molecular detection of hematozoa infections in tundra swans relative to migration patterns and ecological conditions at breeding grounds

    USGS Publications Warehouse

    Ramey, Andrew M.; Ely, Craig R.; Schmutz, Joel A.; Pearce, John M.; Heard, Darryl J.

    2012-01-01

    Tundra swans (Cygnus columbianus) are broadly distributed in North America, use a wide variety of habitats, and exhibit diverse migration strategies. We investigated patterns of hematozoa infection in three populations of tundra swans that breed in Alaska using satellite tracking to infer host movement and molecular techniques to assess the prevalence and genetic diversity of parasites. We evaluated whether migratory patterns and environmental conditions at breeding areas explain the prevalence of blood parasites in migratory birds by contrasting the fit of competing models formulated in an occupancy modeling framework and calculating the detection probability of the top model using the Akaike Information Criterion (AIC). We described genetic diversity of blood parasites in each population of swans by calculating the number of unique parasite haplotypes observed. Blood parasite infection was significantly different between populations of Alaska tundra swans, with the highest estimated prevalence occurring among birds occupying breeding areas with lower mean daily wind speeds and higher daily summer temperatures. Models including covariates of wind speed and temperature during summer months at breeding grounds better predicted hematozoa prevalence than those that included annual migration distance or duration. Genetic diversity of blood parasites in populations of tundra swans appeared to be related to hematozoa prevalence. Our results suggest ecological conditions at breeding grounds may explain differences of hematozoa infection among populations of tundra swans that breed in Alaska.

  17. Persistent disturbance by commercial navigation alters the relative abundance of channel-dwelling fishes in a large river

    USGS Publications Warehouse

    Gutreuter, S.; Vallazza, J.M.; Knights, B.C.

    2006-01-01

    We provide the first evidence for chronic effects of disturbance by commercial vessels on the spatial distribution and abundance of fishes in the channels of a large river. Most of the world's large rivers are intensively managed to satisfy increasing demands for commercial shipping, but little research has been conducted to identify and alleviate any adverse consequences of commercial navigation. We used a combination of a gradient sampling design incorporating quasicontrol areas with Akaike's information criterion (AIC)-weighted model averaging to estimate effects of disturbances by commercial vessels on fishes in the upper Mississippi River. Species density, which mainly measured species evenness, decreased with increasing disturbance frequency. The most abundant species - gizzard shad (Dorosoma cepedianum) and freshwater drum (Aplodinotus grunniens) - and the less abundant shovelnose sturgeon (Scaphirhynchus platorhynchus) and flathead catfish (Pylodictis olivaris) were seemingly unaffected by traffic disturbance. In contrast, the relative abundance of the toothed herrings (Hiodon spp.), redhorses (Moxostoma spp.), buffaloes (Ictiobus spp.), channel catfish (Ictalurus punctatus), sauger (Sander canadensis), and white bass (Morone chrysops) decreased with increasing traffic in the navigation channel. We hypothesized that the combination of alteration of hydraulic features within navigation channels and rehabilitation of secondary channels might benefit channel-dependent species. © 2006 NRC.
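
    As a sketch of AIC-weighted model averaging, the code below averages a traffic-effect coefficient across a small candidate set of OLS models using Akaike weights, treating models that exclude the term as contributing zero (the full-averaging convention). All data, model sets, and variable names are synthetic placeholders, not the upper Mississippi River catch data.

```python
# Sketch: AIC-weighted model averaging of one coefficient across candidate OLS models.
# Models omitting the coefficient contribute 0 (full model-averaging convention).
# Synthetic data; not the upper Mississippi River fish-catch data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(21)
n = 150
df = pd.DataFrame({
    "traffic": rng.poisson(8, n),            # vessel passages (hypothetical)
    "depth": rng.uniform(1, 8, n),
    "velocity": rng.uniform(0.1, 1.5, n),
})
df["catch"] = 5 - 0.2 * df["traffic"] + 0.4 * df["depth"] + rng.normal(0, 1, n)

candidates = [["traffic"], ["traffic", "depth"], ["depth", "velocity"],
              ["traffic", "depth", "velocity"]]

fits = [sm.OLS(df["catch"], sm.add_constant(df[c])).fit() for c in candidates]
aics = np.array([f.aic for f in fits])
w = np.exp(-0.5 * (aics - aics.min()))
w /= w.sum()

beta = np.array([f.params.get("traffic", 0.0) for f in fits])
print("model-averaged traffic effect:", round(float(np.sum(w * beta)), 3))
```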

  18. Determining Individual Variation in Growth and Its Implication for Life-History and Population Processes Using the Empirical Bayes Method

    PubMed Central

    Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J.; Munch, Stephan; Skaug, Hans J.

    2014-01-01

    The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and asymptotic size. Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish. PMID:25211603

  19. Impact of Large-scale Circulation Patterns on Surface Ozone Variability in Houston-Galveston-Brazoria

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Jia, B.; Xie, Y.

    2015-12-01

    The Bermuda High (BH) is a key driver of large-scale circulation patterns for Southeastern Texas and other Gulf coast states in summer, with the expected influence on surface ozone through its modulation of marine air inflow with lower ozone background from the Gulf of Mexico. We develop a statistical relationship through multiple linear regression (MLR) to quantify the impact of BH variations on surface ozone variability during the ozone season in the Houston-Galveston-Brazoria (HGB) area, a major ozone nonattainment region on the Gulf Coast. We find that the variability in BH location, represented by a longitude index of the BH west edge (BH-Lon) in the MLR, explains 50-60% of the year-to-year variability in monthly mean ozone over HGB for June and July during 1998-2013; the corresponding figure for August and September is 20%. An additional 30-40% of the ozone variability for August and September can be explained by the variability in BH strength, represented by two BH intensity indices (BHI) in the MLR, but its contribution is only 5% for June and not significant for July. Through stepwise regression based on the Akaike Information Criterion (AIC), the MLR model captures 58-72% of monthly ozone variability during June-September with a cross-validation R2 of 0.5. This observation-derived statistical relationship will be valuable for constraining model simulations of ozone variability attributable to large-scale circulation patterns.

  20. ToPS: a framework to manipulate probabilistic models of sequence data.

    PubMed

    Kashiwabara, André Yoshiaki; Bonadio, Igor; Onuchic, Vitor; Amado, Felipe; Mathias, Rafael; Durham, Alan Mitchell

    2013-01-01

    Discrete Markovian models can be used to characterize patterns in sequences of values and have many applications in biological sequence analysis, including gene prediction, CpG island detection, alignment, and protein profiling. We present ToPS, a computational framework that can be used to implement different applications in bioinformatics analysis by combining eight kinds of models: (i) independent and identically distributed process; (ii) variable-length Markov chain; (iii) inhomogeneous Markov chain; (iv) hidden Markov model; (v) profile hidden Markov model; (vi) pair hidden Markov model; (vii) generalized hidden Markov model; and (viii) similarity based sequence weighting. The framework includes functionality for training, simulation and decoding of the models. Additionally, it provides two methods to help parameter setting: Akaike and Bayesian information criteria (AIC and BIC). The models can be used stand-alone, combined in Bayesian classifiers, or included in more complex, multi-model, probabilistic architectures using GHMMs. In particular the framework provides a novel, flexible, implementation of decoding in GHMMs that detects when the architecture can be traversed efficiently. PMID:24098098
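
    Below is a minimal sketch of one parameter-setting choice that AIC and BIC can guide in this setting: selecting the order of a Markov chain by counting context-symbol transitions in a nucleotide sequence. The random sequence and candidate orders are illustrative assumptions; this is not ToPS code.

```python
# Sketch: choose a Markov chain order for a nucleotide sequence with AIC/BIC.
# Random sequence for illustration; this is not ToPS itself.
import numpy as np
from collections import Counter

rng = np.random.default_rng(13)
alphabet = "ACGT"
seq = "".join(rng.choice(list(alphabet), size=5000))

def score(seq, order):
    ctx_counts, pair_counts = Counter(), Counter()
    for i in range(order, len(seq)):
        ctx = seq[i - order:i]
        ctx_counts[ctx] += 1
        pair_counts[(ctx, seq[i])] += 1
    loglik = sum(c * np.log(c / ctx_counts[ctx]) for (ctx, _), c in pair_counts.items())
    k = (len(alphabet) ** order) * (len(alphabet) - 1)   # free parameters
    n = len(seq) - order
    return -2 * loglik + 2 * k, -2 * loglik + k * np.log(n)

for order in (1, 2, 3):
    aic, bic = score(seq, order)
    print(f"order {order}: AIC={aic:10.1f}  BIC={bic:10.1f}")
```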

  1. Estimating annual survival and movement rates of adults within a metapopulation of roseate terns

    USGS Publications Warehouse

    Spendelow, J.A.; Nichols, J.D.; Nisbet, I.C.T.; Hays, H.; Cormons, G.D.; Burger, J.; Safina, C.; Hines, J.E.; Gochfeld, M.

    1995-01-01

    Several multistratum capture-recapture models were used to test various hypotheses about possible geographic and temporal variation in survival, movement, and recapture/resighting probabilities of 2399 adult Roseate Terns (Sterna dougallii) color-banded from 1988 to 1992 at the sites of the four largest breeding colonies of this species in the northeastern USA. Linear-logistic ultrastructural models also were developed to investigate possible correlates of geographic variation in movement probabilities. Based on goodness-of-fit tests and comparisons of Akaike's Information Criterion (AIC) values, the fully parameterized model (Model A) with time- and location-specific survival, movement, and capture probabilities, was selected as the most appropriate model for this metapopulation structure. With almost all movement accounted for, on average >90% of the surviving adults from each colony site returned to the same site the following year. Variations in movement probabilities were more closely associated with the identity of the destination colony site than with either the identity of the colony site of origin or the distance between colony sites. The average annual survival estimates (0.74-0.84) of terns from all four sites indicate a high rate of annual mortality relative to that of other species of marine birds.

  2. Accretion Timescales from Kepler AGN

    NASA Astrophysics Data System (ADS)

    Kasliwal, Vishal P.; Vogeley, Michael S.; Richards, Gordon T.

    2015-01-01

    We constrain AGN accretion disk variability mechanisms using the optical light curves of AGN observed by Kepler. AGN optical fluxes are known to exhibit stochastic variations on timescales of hours, days, months and years. The excellent sampling properties of the original Kepler mission - high S/N ratio (105), short sampling interval (30 minutes), and long sampling duration (~ 3.5 years) - allow for a detailed examination of the differences between the variability processes present in various sub-types of AGN such as Type I and II Seyferts, QSOs, and Blazars. We model the flux data using the Auto-Regressive Moving Average (ARMA) representation from the field of time series analysis. We use the Kalman filter to determine optimal model parameters and use the Akaike Information Criterion (AIC) to select the optimal model. We find that optical light curves from Kepler AGN cannot be fit by low order statistical models such as the popular AR(1) process or damped random walk. Kepler light curves exhibit complicated power spectra and are better modeled by higher order ARMA processes. We find that Kepler AGN typically exhibit power spectra that change from a bending power law (PSD ~ 1/f^a) to a flat power spectrum on timescales in the range of ~ 5 - 100 days consistent with the orbital and thermal timescales of a typical 107 solar mass black hole.

  3. Predictive occurrence models for coastal wetland plant communities: Delineating hydrologic response surfaces with multinomial logistic regression

    NASA Astrophysics Data System (ADS)

    Snedden, Gregg A.; Steyer, Gregory D.

    2013-02-01

    Understanding plant community zonation along estuarine stress gradients is critical for effective conservation and restoration of coastal wetland ecosystems. We related the presence of plant community types to estuarine hydrology at 173 sites across coastal Louisiana. Percent relative cover by species was assessed at each site near the end of the growing season in 2008, and hourly water level and salinity were recorded at each site Oct 2007-Sep 2008. Nine plant community types were delineated with k-means clustering, and indicator species were identified for each of the community types with indicator species analysis. An inverse relation between salinity and species diversity was observed. Canonical correspondence analysis (CCA) effectively segregated the sites across ordination space by community type, and indicated that salinity and tidal amplitude were both important drivers of vegetation composition. Multinomial logistic regression (MLR) and Akaike's Information Criterion (AIC) were used to predict the probability of occurrence of the nine vegetation communities as a function of salinity and tidal amplitude, and probability surfaces obtained from the MLR model corroborated the CCA results. The weighted kappa statistic, calculated from the confusion matrix of predicted versus actual community types, was 0.7 and indicated good agreement between observed community types and model predictions. Our results suggest that models based on a few key hydrologic variables can be valuable tools for predicting vegetation community development when restoring and managing coastal wetlands.

  4. IDF relationships using bivariate copula for storm events in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Ariff, N. M.; Jemain, A. A.; Ibrahim, K.; Wan Zin, W. Z.

    2012-11-01

    Intensity-duration-frequency (IDF) curves are used in many hydrologic designs for the purposes of water management and flood prevention. The IDF curves available in Malaysia are those obtained from a univariate analysis approach which only considers the intensity of rainfall at fixed time intervals. As several rainfall variables are correlated with each other, such as intensity and duration, this paper aims to derive IDF points for storm events in Peninsular Malaysia by means of bivariate frequency analysis. This is achieved by utilizing the relationship between storm intensities and durations using the copula method. Four types of copulas, namely the Ali-Mikhail-Haq (AMH), Frank, Gaussian and Farlie-Gumbel-Morgenstern (FGM) copulas, are considered because the correlation between storm intensity, I, and duration, D, is negative and these copulas are appropriate when the relationship between the variables is negative. The correlations are obtained by means of Kendall's τ estimation. The analysis was performed on twenty rainfall stations with hourly data across Peninsular Malaysia. Using Akaike's Information Criterion (AIC) for testing goodness-of-fit, both the Frank and Gaussian copulas are found to be suitable to represent the relationship between I and D. The IDF points found by the copula method are compared to the IDF curves yielded by the typical IDF empirical formula of the univariate approach. This study indicates that storm intensities obtained from both methods are in agreement with each other for any given storm duration and for various return periods.

  5. Negative binomial models for abundance estimation of multiple closed populations

    USGS Publications Warehouse

    Boyce, Mark S.; MacKenzie, Darry I.; Manly, Bryan F.J.; Haroldson, Mark A.; Moody, David W.

    2001-01-01

    Counts of uniquely identified individuals in a population offer opportunities to estimate abundance. However, for various reasons such counts may be burdened by heterogeneity in the probability of being detected. Theoretical arguments and empirical evidence demonstrate that the negative binomial distribution (NBD) is a useful characterization for counts from biological populations with heterogeneity. We propose a method that focuses on estimating multiple populations by simultaneously using a suite of models derived from the NBD. We used this approach to estimate the number of female grizzly bears (Ursus arctos) with cubs-of-the-year in the Yellowstone ecosystem, for each year, 1986-1998. Akaike's Information Criteria (AIC) indicated that a negative binomial model with a constant level of heterogeneity across all years was best for characterizing the sighting frequencies of female grizzly bears. A lack-of-fit test indicated the model adequately described the collected data. Bootstrap techniques were used to estimate standard errors and 95% confidence intervals. We provide a Monte Carlo technique, which confirms that the Yellowstone ecosystem grizzly bear population increased during the period 1986-1998.
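
    The sketch below contrasts Poisson and negative binomial fits to overdispersed count data by AIC, the same kind of comparison that motivates using the NBD for heterogeneous sighting frequencies; the simulated counts are placeholders, not the Yellowstone grizzly bear data.

```python
# Sketch: compare Poisson and negative binomial fits to overdispersed counts by AIC.
# Simulated sighting frequencies; not the Yellowstone grizzly bear data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 300
# negative binomial counts via a gamma-Poisson mixture (heterogeneous detection)
lam = rng.gamma(shape=2.0, scale=1.5, size=n)
counts = rng.poisson(lam)

X = np.ones((n, 1))                         # intercept-only models
pois = sm.Poisson(counts, X).fit(disp=0)
negb = sm.NegativeBinomial(counts, X).fit(disp=0)

print(f"Poisson AIC           = {pois.aic:.1f}")
print(f"Negative binomial AIC = {negb.aic:.1f}")
```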

  6. Construction of a cancer-perturbed protein-protein interaction network for discovery of apoptosis drug targets

    PubMed Central

    Chu, Liang-Hui; Chen, Bor-Sen

    2008-01-01

    Background: Cancer is caused by genetic abnormalities, such as mutations of oncogenes or tumor suppressor genes, which alter downstream signal transduction pathways and protein-protein interactions. Comparisons of the interactions of proteins in cancerous and normal cells can shed light on the mechanisms of carcinogenesis. Results: We constructed initial networks of protein-protein interactions involved in the apoptosis of cancerous and normal cells by use of two human yeast two-hybrid data sets and four online databases. Next, we applied a nonlinear stochastic model, maximum likelihood parameter estimation, and the Akaike Information Criterion (AIC) to eliminate false-positive protein-protein interactions in our initial protein interaction networks by use of microarray data. Comparisons of the networks of apoptosis in HeLa (human cervical carcinoma) cells and in normal primary lung fibroblasts provided insight into the mechanism of apoptosis and allowed identification of potential drug targets. The potential targets include BCL2, caspase-3 and TP53. Our comparison of cancerous and normal cells also allowed derivation of several party hubs and date hubs in the human protein-protein interaction networks involved in caspase activation. Conclusion: Our method allows identification of cancer-perturbed protein-protein interactions involved in apoptosis and identification of potential molecular targets for development of anti-cancer drugs. PMID:18590547

  7. Spatiotemporal analysis of aquifers salinization in coastal area of Yunlin, Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, P.-C.; Tan, Y.-C.

    2012-04-01

    In the past, time and space characteristics were often discussed separately. This study adopts regionalized variables theory and describes water quality in terms of its structure in time and space to assess the situation in Yunlin. The study applied the Quantum Bayesian Maximum Entropy Toolbox (QtBME), a spatiotemporal statistics tool that can be used to estimate and map a non-stationary and non-homogeneous spatiotemporal process under the platform of the Quantum GIS (QGIS) software. A kernel smoothing method is used to divide the original process into a deterministic trend and a stationary and homogeneous spatiotemporal process, assuming that a spatiotemporal process can be divided into high- and low-frequency parts. The covariance model of the high-frequency process is selected objectively by the particle swarm optimization (PSO) method and Akaike's information criterion (AIC). The Bayesian maximum entropy method is then applied to spatiotemporal mapping of the variable of interest. In this study, QtBME was used to estimate aquifer salinization in the Yunlin coastal area from 1992 to 2010. Finally, the impact of rainfall on aquifer salinization was investigated.

  8. Determination of Original Infection Source of H7N9 Avian Influenza by Dynamical Model

    NASA Astrophysics Data System (ADS)

    Zhang, Juan; Jin, Zhen; Sun, Gui-Quan; Sun, Xiang-Dong; Wang, You-Ming; Huang, Baoxu

    2014-05-01

    H7N9, a virus newly emerging in China, circulates among poultry and humans. Although H7N9 has not caused massive outbreaks, its recurrence in the second half of 2013 makes it essential to control the spread. It is believed that the most effective control measure is to locate the original infection source and cut off the source of infection from humans. However, the original infection source and the internal transmission mechanism of the new virus are not totally clear. In order to determine the original infection source of H7N9, we establish a dynamical model with migratory bird, resident bird, domestic poultry and human populations, and view migratory birds, resident birds and domestic poultry, respectively, as the original infection source to fit the true dynamics during the 2013 pandemic. By comparing the data-fitting results and the corresponding Akaike Information Criterion (AIC) values, we conclude that migratory birds are most likely the original infection source. In addition, we obtain the basic reproduction number in poultry and carry out a sensitivity analysis of some parameters.

  9. Mapping the mean monthly precipitation of a small island using kriging with external drifts

    NASA Astrophysics Data System (ADS)

    Cantet, Philippe

    2015-09-01

    This study focuses on the spatial distribution of mean annual and monthly precipitation on a small island (1128 km2) named Martinique, located in the Lesser Antilles. Only 35 meteorological stations are available on the territory, which has a complex topography. With a digital elevation model (DEM), 17 covariates that are likely to explain precipitation were built. Several interpolation methods, such as regression-kriging (MLRK, PCRK, and PLSK) and external drift kriging (EDK), were tested using a cross-validation procedure. For the regression methods, predictors were chosen by established techniques, whereas a new approach, based on stepwise model selection by the Akaike Information Criterion (AIC), is proposed to select external drifts in kriging. The prediction accuracy was assessed at validation sites with three different skill scores. Results show that using methods with no predictors, such as inverse distance weighting (IDW) or universal kriging (UK), is inappropriate in such a territory. EDK appears to outperform the regression methods for any criterion, and selecting predictors by our approach improves the prediction of mean annual precipitation compared to kriging with only elevation as a drift. Finally, the predictive performance was also studied by varying the size of the training set, leading to less conclusive results for EDK. Nevertheless, the proposed method seems to be a good way to improve the mapping of climatic variables on a small island.

  10. Lee-Carter state space modeling: Application to the Malaysia mortality data

    NASA Astrophysics Data System (ADS)

    Zakiyatussariroh, W. H. Wan; Said, Z. Mohammad; Norazan, M. R.

    2014-06-01

    This article presents an approach that formalizes the Lee-Carter (LC) model as a state space model. Maximum likelihood estimation through the Expectation-Maximization (EM) algorithm was used to fit the model. The methodology is applied to Malaysia's total population mortality data. Malaysia's mortality was modeled using age-specific death rate (ASDR) data from 1971-2009. The fitted ASDR are compared to the actual observed values. The comparison shows that the fitted values from the LC-SS model and the original LC model are quite close. In addition, there is little difference between the values of the root mean squared error (RMSE) and the Akaike information criterion (AIC) for the two models. The LC-SS model estimated in this study can be extended for forecasting ASDR in Malaysia. The accuracy of the LC-SS model relative to the original LC model can then be further examined by verifying forecasting power with an out-of-sample comparison.

  11. Extensions to minimum relative entropy inversion for noisy data

    NASA Astrophysics Data System (ADS)

    Ulrych, Tadeusz J.; Woodbury, Allan D.

    2003-12-01

    Minimum relative entropy (MRE) and Tikhonov regularization (TR) were compared by Neupauer et al. [Water Resour. Res. 36 (2000) 2469] on the basis of an example plume source reconstruction problem originally proposed by Skaggs and Kabala [Water Resour. Res. 30 (1994) 71] and a boxcar-like function. Although Neupauer et al. [Water Resour. Res. 36 (2000) 2469] were careful in their conclusions to note the basis of these comparisons, we show that TR does not perform well on problems in which delta-like sources are convolved with diffuse-groundwater contamination response functions, particularly in the presence of noise. We also show that it is relatively easy to estimate an appropriate value for ɛ, the hyperparameter needed in the minimum relative entropy solution for the inverse problem in the presence of noise. This can be estimated in a variety of ways, including estimation from the data themselves, analysis of data residuals, and a rigorous approach using the real cepstrum and the Akaike Information Criterion (AIC). Regardless of the approach chosen, for the sample problem reported herein, excellent resolution of multiple delta-like spikes is produced from MRE from noisy, diffuse data. The usefulness of MRE for noisy inverse problems has been demonstrated.

  12. Dynamically tunable plasmonically induced transparency in sinusoidally curved and planar graphene layers.

    PubMed

    Xia, Sheng-Xuan; Zhai, Xiang; Wang, Ling-Ling; Sun, Bin; Liu, Jian-Qiang; Wen, Shuang-Chun

    2016-08-01

    To achieve plasmonically induced transparency (PIT), typical near-field plasmonic systems based on couplings between localized plasmon resonances of nanostructures rely heavily on well-designed interantenna separations. However, the implementation of such devices and techniques encounters great difficulties, mainly due to the very small dimensions of the nanostructures and of the gaps between them. Here, we propose and numerically demonstrate that PIT can be achieved by using two graphene layers, composed of an upper sinusoidally curved layer and a lower planar layer, avoiding any patterning of the graphene sheets. Both analytical fitting and the Akaike Information Criterion (AIC) method are employed to characterize the induced window, which is found to be more likely caused by Autler-Townes splitting (ATS) than by electromagnetically induced transparency (EIT). Moreover, our results show that the resonant modes can be tuned dramatically not only by geometrically changing the grating amplitude and the interlayer spacing, but also by dynamically varying the Fermi energy of the graphene sheets. Potential applications of the proposed system can be expected in various photonic functional devices, including optical switches and plasmonic sensors. PMID:27505756

  13. Predictors of CNS injury as measured by proton magnetic resonance spectroscopy in the setting of chronic HIV infection and CART.

    PubMed

    Harezlak, J; Cohen, R; Gongvatana, A; Taylor, M; Buchthal, S; Schifitto, G; Zhong, J; Daar, E S; Alger, J R; Brown, M; Singer, E J; Campbell, T B; McMahon, D; So, Y T; Yiannoutsos, C T; Navia, B A

    2014-06-01

    The reasons for persistent brain dysfunction in chronically HIV-infected persons on stable combined antiretroviral therapies (CART) remain unclear. Host and viral factors along with their interactions were examined in 260 HIV-infected subjects who underwent magnetic resonance spectroscopy (MRS). Metabolite concentrations (NAA/Cr, Cho/Cr, MI/Cr, and Glx/Cr) were measured in the basal ganglia, the frontal white matter, and gray matter, and the best predictive models were selected using a bootstrap-enhanced Akaike information criterion (AIC). Depending on the metabolite and brain region, age, race, HIV RNA concentration, ADC stage, duration of HIV infection, nadir CD4, and/or their interactions were predictive of metabolite concentrations, particularly the basal ganglia NAA/Cr and the mid-frontal NAA/Cr and Glx/Cr, whereas current CD4 and the CPE index rarely or did not predict these changes. These results show for the first time that host and viral factors related to both current and past HIV status contribute to persisting cerebral metabolite abnormalities and provide a framework for further understanding neurological injury in the setting of chronic and stable disease. PMID:24696364
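
    The bootstrap-enhanced AIC selection mentioned above is not spelled out in the abstract; one plausible reading is to repeat AIC-based selection over bootstrap resamples and tally how often each candidate predictor set wins. The sketch below follows that reading with simulated stand-in variables (age, nadir CD4 and infection duration) rather than the study's actual predictors.

```python
# Hypothetical sketch of bootstrap-enhanced AIC model selection:
# resample the rows, refit every candidate model, record the AIC winner,
# and report each candidate's selection frequency.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 260
age = rng.uniform(20, 70, n)
nadir_cd4 = rng.uniform(0, 800, n)
duration = rng.uniform(0, 25, n)
# simulated metabolite ratio (stand-in outcome, e.g. NAA/Cr)
naa_cr = 1.5 - 0.004 * age - 0.0003 * nadir_cd4 + rng.normal(scale=0.1, size=n)

candidates = {  # candidate predictor sets (assumed, for illustration)
    "age": ["age"],
    "age+nadir": ["age", "nadir_cd4"],
    "age+nadir+duration": ["age", "nadir_cd4", "duration"],
}
data = {"age": age, "nadir_cd4": nadir_cd4, "duration": duration}

wins = {name: 0 for name in candidates}
for _ in range(200):                       # bootstrap replicates
    idx = rng.integers(0, n, n)
    aics = {}
    for name, cols in candidates.items():
        Xb = sm.add_constant(np.column_stack([data[c][idx] for c in cols]))
        aics[name] = sm.OLS(naa_cr[idx], Xb).fit().aic
    wins[min(aics, key=aics.get)] += 1

for name, w in wins.items():
    print(f"{name:20s} selected in {w/200:.0%} of bootstrap samples")
```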

  14. Estimation of exposure to toxic releases using spatial interaction modeling

    PubMed Central

    2011-01-01

    Background The United States Environmental Protection Agency's Toxic Release Inventory (TRI) data are frequently used to estimate a community's exposure to pollution. However, this estimation process often uses underdeveloped geographic theory. Spatial interaction modeling provides a more realistic approach to this estimation process. This paper uses four sets of data: lung cancer age-adjusted mortality rates from the years 1990 through 2006 inclusive from the National Cancer Institute's Surveillance Epidemiology and End Results (SEER) database, TRI releases of carcinogens from 1987 to 1996, covariates associated with lung cancer, and the EPA's Risk-Screening Environmental Indicators (RSEI) model. Results The impact of the volume of carcinogenic TRI releases on each county's lung cancer mortality rates was calculated using six spatial interaction functions (containment, buffer, power decay, exponential decay, quadratic decay, and RSEI estimates) and evaluated with four multivariate regression methods (linear, generalized linear, spatial lag, and spatial error). Akaike Information Criterion values and P values of spatial interaction terms were computed. The impacts calculated from the interaction models were also mapped. Buffer and quadratic interaction functions had the lowest AIC values (22298 and 22525 respectively), although the gains from including the spatial interaction terms were diminished with spatial error and spatial lag regression. Conclusions The use of different methods for estimating the spatial risk posed by pollution from TRI sites can give different results about the impact of those sites on health outcomes. The most reliable estimates did not always come from the most complex methods. PMID:21418644

  15. Towards a Model Selection Rule for Quantum State Tomography

    NASA Astrophysics Data System (ADS)

    Scholten, Travis; Blume-Kohout, Robin

    Quantum tomography on large and/or complex systems will rely heavily on model selection techniques, which permit on-the-fly selection of small efficient statistical models (e.g. small Hilbert spaces) that accurately fit the data. Many model selection tools, such as hypothesis testing or Akaike's AIC, rely implicitly or explicitly on the Wilks Theorem, which predicts the behavior of the loglikelihood ratio statistic (LLRS) used to choose between models. We used Monte Carlo simulations to study the behavior of the LLRS in quantum state tomography, and found that it disagrees dramatically with Wilks' prediction. We propose a simple explanation for this behavior; namely, that boundaries (in state space and between models) play a significant role in determining the distribution of the LLRS. The resulting distribution is very complex, depending strongly both on the true state and the nature of the data. We consider a simplified model that neglects anisotropy in the Fisher information, derive an almost analytic prediction for the mean value of the LLRS, and compare it to numerical experiments. While our simplified model outperforms the Wilks Theorem, it still does not predict the LLRS accurately, implying that alternative methods may be necessary for tomographic model selection. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE.

  16. Verification of deforestation in East Asia by spatial logit models due to population and relief energy

    NASA Astrophysics Data System (ADS)

    Tanaka, Shojiro; Nishii, Ryuei

    2005-10-01

    Deforestation is the result of complex causality chains in most cases, but identifying a limited number of factors can provide a comprehensive general understanding of this vital phenomenon at a broad scale, as well as projections for the future. Only two factors -- human population size (N) and relief energy (R: the difference between the maximum and minimum altitude in a sampled area) -- were found to give a sufficient account of deforestation in nonlinear logit regression models, whose functional forms were suggested by step functions fitted to one-kilometer square high-precision grid-cell data in Japan (n=6825). A likelihood with spatial dependency was derived, and several deforestation models were selected for application to East Asia by calculating their relative appropriateness to the data. As the measure of appropriateness, Akaike's Information Criterion (AIC) was used. The logit model is employed to avoid anomalies in the asymptotic lower and upper bounds, so that the forest areal rate satisfies 0 < F < 1. To formulate the East-Asian dataset, a landcover dataset estimated from NOAA observations available at UNEP, Tsukuba was used for F, the gridded population of the world from CIESIN, US for N, and GTOPO30 from USGS for R. The resolutions were matched by taking their common multiple, a 20-minute square grid. It was suggested that data with full forest coverage, F = 1.0, which were excluded from the calculations here because of the logit transformation, should play an important role in stabilizing the parameter estimates.

  17. Stochastic Spatio-Temporal Dynamic Model for Gene/Protein Interaction Network in Early Drosophila Development

    PubMed Central

    Li, Cheng-Wei; Chen, Bor-Sen

    2009-01-01

    In order to investigate the possible mechanisms for eve stripe formation in the Drosophila embryo, a spatio-temporal gene/protein interaction network model is proposed to mimic dynamic behaviors of protein synthesis, protein decay, mRNA decay, protein diffusion, transcription regulations and autoregulation, and to analyze the interplay of genes and proteins in different compartments in early embryogenesis. In this study, we use the maximum likelihood (ML) method to identify the stochastic 3-D Embryo Space-Time (3-DEST) dynamic model for the gene/protein interaction network via 3-D mRNA and protein expression data and then use the Akaike Information Criterion (AIC) to prune the gene/protein interaction network. The identified gene/protein interaction network allows us not only to analyze the dynamic interplay of genes and proteins on the border of eve stripes but also to infer that eve stripes are established and maintained by network motifs built by the cooperation between transcription regulations and diffusion mechanisms in early embryogenesis. References in the literature to wet-lab gene mutation experiments provide a clue for validating the identified network. The proposed spatio-temporal dynamic model can be extended to gene/protein network construction of different biological phenotypes, which depend on compartments, e.g. postnatal stem/progenitor cell differentiation. PMID:20054403

  18. The optimal number of lymph nodes removed in maximizing the survival of breast cancer patients

    NASA Astrophysics Data System (ADS)

    Peng, Lim Fong; Taib, Nur Aishah; Mohamed, Ibrahim; Daud, Noorizam

    2014-07-01

    The number of lymph nodes removed is one of the important predictors of survival in breast cancer studies. Our aim is to determine the optimal number of lymph nodes to be removed for maximizing the survival of breast cancer patients. The study population consists of 873 patients with at least one axillary node involved, among 1890 patients from the University of Malaya Medical Center (UMMC) breast cancer registry. For this study, the Chi-square test of independence is performed to determine significant associations between prognostic factors and survival status, while the Wilcoxon test is used to compare the estimates of the hazard functions of two or more groups at each observed event time. Logistic regression analysis is then conducted to identify important predictors of survival. In particular, Akaike's Information Criterion (AIC) values are calculated from the logistic regression model for all thresholds of nodes involved, as an alternative measure to the Wald statistic (χ²), in order to determine the optimal number of nodes that need to be removed to obtain the maximum differential in survival. The results from both measures are compared. It is recommended that, for this particular group, a minimum of 10 nodes should be removed to maximize survival of breast cancer patients.
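
    The threshold search described above, fitting a logistic regression of survival status on an indicator 'at least t nodes removed' for each candidate threshold t and comparing the resulting AIC values, can be sketched as follows. The data are simulated placeholders, not the UMMC registry.

```python
# Hypothetical sketch: for each threshold t, code a binary predictor
# (nodes_removed >= t), fit a logistic regression of survival status on it,
# and pick the threshold with the lowest AIC.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 873
nodes_removed = rng.integers(1, 35, n)
# simulated survival: removing ~10+ nodes associated with better survival
p = 1 / (1 + np.exp(-(-0.4 + 0.9 * (nodes_removed >= 10))))
survived = rng.binomial(1, p)

results = []
for t in range(2, 26):
    X = sm.add_constant((nodes_removed >= t).astype(float))
    fit = sm.Logit(survived, X).fit(disp=0)
    results.append((t, fit.aic))

best_t, best_aic = min(results, key=lambda r: r[1])
print(f"threshold with lowest AIC: >= {best_t} nodes (AIC = {best_aic:.1f})")
```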

  19. Demographic, Reproductive, and Dietary Determinants of Perfluorooctane Sulfonic (PFOS) and Perfluorooctanoic Acid (PFOA) Concentrations in Human Colostrum.

    PubMed

    Jusko, Todd A; Oktapodas, Marina; Palkovičová Murinová, L'ubica; Babinská, Katarina; Babjaková, Jana; Verner, Marc-André; DeWitt, Jamie C; Thevenet-Morrison, Kelly; Čonka, Kamil; Drobná, Beata; Chovancová, Jana; Thurston, Sally W; Lawrence, B Paige; Dozier, Ann M; Järvinen, Kirsi M; Patayová, Henrieta; Trnovec, Tomáš; Legler, Juliette; Hertz-Picciotto, Irva; Lamoree, Marja H

    2016-07-01

    To determine demographic, reproductive, and maternal dietary factors that predict perfluoroalkyl substance (PFAS) concentrations in breast milk, we measured perfluorooctane sulfonic (PFOS) and perfluorooctanoic acid (PFOA) concentrations, using liquid chromatography-mass spectrometry, in 184 colostrum samples collected from women participating in a cohort study in Eastern Slovakia between 2002 and 2004. During their hospital delivery stay, mothers completed a food frequency questionnaire, and demographic and reproductive data were also collected. PFOS and PFOA predictors were identified by optimizing multiple linear regression models using Akaike's information criterion (AIC). The geometric mean concentration in colostrum was 35.3 pg/mL for PFOS and 32.8 pg/mL for PFOA. In multivariable models, parous women had 40% lower PFOS (95% CI: -56 to -17%) and 40% lower PFOA (95% CI: -54 to -23%) concentrations compared with nulliparous women. Moreover, fresh/frozen fish consumption, longer birth intervals, and Slovak ethnicity were associated with higher PFOS and PFOA concentrations in colostrum. These results will help guide the design of future epidemiologic studies examining milk PFAS concentrations in relation to health end points in children. PMID:27244128

  20. Ultrafine particle concentrations in the surroundings of an urban area: comparing downwind to upwind conditions using Generalized Additive Models (GAMs).

    PubMed

    Sartini, Claudio; Zauli Sajani, Stefano; Ricciardelli, Isabella; Delgado-Saborit, Juana Mari; Scotto, Fabiana; Trentini, Arianna; Ferrari, Silvia; Poluzzi, Vanes

    2013-10-01

    The aim of this study was to investigate the influence of an urban area on ultrafine particle (UFP) concentrations in nearby surrounding areas. We assessed how downwind and upwind conditions affect the UFP concentration at a site located a few kilometres from the city border. Secondarily, we investigated the relationship between other meteorological factors, temporal variables and UFP. Data were collected for 44 days during 2008 and 2009 at a rural site about 3 kilometres from Bologna, in northern Italy. Measurements were performed using a spectrometer (FMPS TSI 3091). The average UFP number concentration was 11 776 (±7836) particles per cm³. We analysed the effect of wind direction in a multivariate Generalized Additive Model (GAM) adjusted for the principal meteorological parameters and temporal trends. An increase of about 25% in UFP levels was observed when the site was downwind of the urban area, compared with the levels observed when the wind blew from rural areas. The size distribution of particles was also affected by the wind direction, showing higher concentrations of small particles when the wind blew from the urban area. The GAM showed a good fit to the data (R² = 0.81). Model choice was made via the Akaike Information Criterion (AIC). The analysis also revealed that an approach based on meteorological data plus temporal trends improved the goodness of fit of the model. In addition, the findings contribute to evidence on the effects of exposure to ultrafine particles on a population living in city surroundings. PMID:24077061

  1. Age and growth of chub mackerel (Scomber japonicus) in the East China and Yellow Seas using sectioned otolith samples

    NASA Astrophysics Data System (ADS)

    Li, Gang; Chen, Xinjun; Feng, Bo

    2008-11-01

    Although chub mackerel (Scomber japonicus) is a primary pelagic fish species, we have only limited knowledge on its key life history processes. The present work studied the age and growth of chub mackerel in the East China and Yellow Seas. Age was determined by interpreting and counting growth rings on the sagitta otoliths of 252 adult fish caught by the Chinese commercial purse seine fleet during the period from November 2006 to January 2007 and 150 juveniles from bottom trawl surveys on the spawning ground in May 2006. The difference between the assumed birth date of 1st April and date of capture was used to adjust the age determined from counting the number of complete translucent rings. The parameters of three commonly used growth models, the von Bertalanffy, Logistic and Gompertz models, were estimated using the maximum likelihood method. Based on the Akaike Information Criterion (AIC), the von Bertalanffy growth model was found to be the most appropriate model. The size-at-age and size-at-maturity values were also found to decrease greatly compared with the results achieved in the 1950s, which was caused by heavy exploitation over the last few decades.
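
    With additive Gaussian errors, the maximum-likelihood fits and AIC comparison described above reduce to least-squares fits scored by AIC = n ln(RSS/n) + 2k. The sketch below applies that recipe to simulated length-at-age data (not the East China and Yellow Seas samples) for the three candidate growth curves.

```python
# Hypothetical sketch: fit von Bertalanffy, Gompertz and logistic growth
# curves to simulated length-at-age data and rank them by AIC.
import numpy as np
from scipy.optimize import curve_fit

def von_bertalanffy(t, Linf, K, t0):
    return Linf * (1 - np.exp(-K * (t - t0)))

def gompertz(t, Linf, K, t0):
    return Linf * np.exp(-np.exp(-K * (t - t0)))

def logistic(t, Linf, K, t0):
    return Linf / (1 + np.exp(-K * (t - t0)))

rng = np.random.default_rng(3)
age = rng.uniform(0.5, 6.0, 150)                      # simulated ages (years)
length = von_bertalanffy(age, 42.0, 0.45, -0.5) + rng.normal(scale=1.5, size=150)

def aic_least_squares(model, n_params):
    params, _ = curve_fit(model, age, length, p0=[40.0, 0.5, 0.0], maxfev=10000)
    rss = np.sum((length - model(age, *params)) ** 2)
    n = len(length)
    return n * np.log(rss / n) + 2 * (n_params + 1)   # +1 for the error variance

for name, model in [("von Bertalanffy", von_bertalanffy),
                    ("Gompertz", gompertz),
                    ("logistic", logistic)]:
    print(f"{name:16s} AIC = {aic_least_squares(model, 3):.1f}")
```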

  2. Prevalence and predictors for musculoskeletal discomfort in Malaysian office workers: Investigating explanatory factors for a developing country.

    PubMed

    Maakip, Ismail; Keegel, Tessa; Oakman, Jodi

    2016-03-01

    Musculoskeletal disorders (MSDs) are a major occupational health issue for workers in developed and developing countries, including Malaysia. Most research related to MSDs has been undertaken in developed countries; given the different regulatory and cultural practices, it is plausible that the contributions of hazard and risk factors may differ. A population of Malaysian public service office workers was surveyed (N = 417, 65.5% response rate) to determine the prevalence and associated predictors of MSD discomfort. The 6-month period prevalence of MSD discomfort was 92.8% (95%CI = 90.2-95.2%). Akaike's Information Criterion (AIC) analysis was used to compare a range of models and determine a model of best fit. Contributions associated with MSD discomfort in the final model consisted of physical demands (61%), workload (14%), gender (13%), work-home balance (9%) and psychosocial factors (3%). Factors associated with MSD discomfort were similar in developed and developing countries, but the relative contribution of factors was different, providing insight into the future development of risk management strategies. PMID:26499952

  3. Predictive occurrence models for coastal wetland plant communities: delineating hydrologic response surfaces with multinomial logistic regression

    USGS Publications Warehouse

    Snedden, Gregg A.; Steyer, Gregory D.

    2013-01-01

    Understanding plant community zonation along estuarine stress gradients is critical for effective conservation and restoration of coastal wetland ecosystems. We related the presence of plant community types to estuarine hydrology at 173 sites across coastal Louisiana. Percent relative cover by species was assessed at each site near the end of the growing season in 2008, and hourly water level and salinity were recorded at each site Oct 2007–Sep 2008. Nine plant community types were delineated with k-means clustering, and indicator species were identified for each of the community types with indicator species analysis. An inverse relation between salinity and species diversity was observed. Canonical correspondence analysis (CCA) effectively segregated the sites across ordination space by community type, and indicated that salinity and tidal amplitude were both important drivers of vegetation composition. Multinomial logistic regression (MLR) and Akaike's Information Criterion (AIC) were used to predict the probability of occurrence of the nine vegetation communities as a function of salinity and tidal amplitude, and probability surfaces obtained from the MLR model corroborated the CCA results. The weighted kappa statistic, calculated from the confusion matrix of predicted versus actual community types, was 0.7 and indicated good agreement between observed community types and model predictions. Our results suggest that models based on a few key hydrologic variables can be valuable tools for predicting vegetation community development when restoring and managing coastal wetlands.
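
    The occurrence-modelling step above, a multinomial logistic regression of community type on salinity and tidal amplitude with candidate specifications compared by AIC, can be sketched with the MNLogit model in statsmodels. The data below are simulated stand-ins with only three community classes, not the nine delineated for the 173 Louisiana sites.

```python
# Hypothetical sketch: predict vegetation community type (here 3 classes)
# from salinity and tidal amplitude with multinomial logistic regression,
# and compare candidate models by AIC.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 173
salinity = rng.uniform(0, 20, n)          # ppt
tide_amp = rng.uniform(0.05, 0.6, n)      # m
# simulated classes: fresh (0), brackish (1), saline (2)
community = np.digitize(salinity + rng.normal(scale=2, size=n), [5, 12])

models = {
    "salinity only": sm.add_constant(salinity),
    "salinity + tidal amplitude": sm.add_constant(np.column_stack([salinity, tide_amp])),
}
for name, X in models.items():
    fit = sm.MNLogit(community, X).fit(disp=0)
    print(f"{name:28s} AIC = {fit.aic:.1f}")

# predicted occurrence probabilities over a salinity gradient (tide fixed at 0.3 m)
best = sm.MNLogit(community, models["salinity + tidal amplitude"]).fit(disp=0)
grid = sm.add_constant(np.column_stack([np.linspace(0, 20, 5), np.full(5, 0.3)]))
print(best.predict(grid).round(2))
```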

  4. Mixture regression models for closed population capture-recapture data.

    PubMed

    Tounkara, Fodé; Rivest, Louis-Paul

    2015-09-01

    In capture-recapture studies, the use of individual covariates has been recommended to get stable population estimates. However, some residual heterogeneity might still exist and ignoring such heterogeneity could lead to underestimating the population size (N). In this work, we explore two new models with capture probabilities depending on both covariates and unobserved random effects, to estimate the size of a population. Inference techniques including Horvitz-Thompson estimate and confidence intervals for the population size, are derived. The selection of a particular model is carried out using the Akaike information criterion (AIC). First, we extend the random effect model of Darroch et al. (1993, Journal of American Statistical Association 88, 1137-1148) to handle unit level covariates and discuss its limitations. The second approach is a generalization of the traditional zero-truncated binomial model that includes a random effect to account for an unobserved heterogeneity. This approach provides useful tools for inference about N, since key quantities such as moments, likelihood functions and estimates of N and their standard errors have closed form expressions. Several models for the unobserved heterogeneity are available and the marginal capture probability is expressed using the Logit and the complementary Log-Log link functions. The sensitivity of the inference to the specification of a model is also investigated through simulations. A numerical example is presented. We compare the performance of the proposed estimator with that obtained under model Mh of Huggins (1989 Biometrika 76, 130-140). PMID:25963047

  5. Temporal relationship between rainfall, temperature and occurrence of dengue cases in São Luís, Maranhão, Brazil.

    PubMed

    Silva, Fabrício Drummond; dos Santos, Alcione Miranda; Corrêa, Rita da Graça Carvalhal Frazão; Caldas, Arlene de Jesus Mendes

    2016-02-01

    This study analyzed the relationship between rainfall, temperature and the occurrence of dengue cases. It is an ecological study of autochthonous dengue cases reported from 2003 to 2010 in São Luís, Maranhão. Rainfall and temperature data were collected monthly. The monthly incidence of dengue cases was calculated per year/100,000 inhabitants. In order to identify the influence of climate variables on dengue cases, different distributed lag models using the negative binomial distribution were considered. Model selection was based on the lowest AIC (Akaike Information Criterion). Thirteen thousand, four hundred forty-four cases of dengue were reported between 2003 and 2010, with peaks in 2005, 2007 and 2010. The correlation between rainfall and the occurrence of dengue cases showed an increase in cases in the first months after the rainy months. Dengue cases occurred throughout the study period. Only rainfall lagged by three months showed a positive association with the number of dengue cases. Thus, this municipality is considered an endemic and epidemic site. In addition, the relation between rainfall and dengue cases was significant with a lag of three months. These results should be useful for the future development of health policies for dengue prevention and control. PMID:26910171
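
    The lag-selection step described above, fitting negative binomial regressions of monthly dengue counts on rainfall at a series of lags and keeping the lowest-AIC lag, can be sketched as follows. The monthly series is simulated, and the NegativeBinomial model in statsmodels is used as one reasonable implementation of the distributed-lag comparison.

```python
# Hypothetical sketch: regress monthly dengue counts on rainfall lagged by
# 0..6 months (negative binomial regression) and choose the lag with the
# lowest AIC.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
months = 96                                     # simulated 2003-2010 series
rain = rng.gamma(shape=2.0, scale=80.0, size=months)
mu = np.exp(1.2 + 0.004 * np.roll(rain, 3))     # true effect at lag 3
cases = rng.negative_binomial(5, 5 / (5 + mu))  # overdispersed monthly counts

results = []
for lag in range(0, 7):
    x = np.roll(rain, lag)[lag:]                # rainfall 'lag' months earlier
    y = cases[lag:]
    fit = sm.NegativeBinomial(y, sm.add_constant(x)).fit(disp=0)
    results.append((lag, fit.aic))

for lag, aic in results:
    print(f"lag {lag} months: AIC = {aic:.1f}")
best_lag = min(results, key=lambda r: r[1])[0]
print("lag selected by AIC:", best_lag)
```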

  6. Resolution of direction of arrival and number of signal(s) in a highly noisy environment

    NASA Astrophysics Data System (ADS)

    Beyon, Jeffrey Y.; Thomopoulos, Stelios C.

    1998-07-01

    The majority of Direction-of-Arrival (DOA) estimation methods studied in the literature work effectively in relatively strong signal power environments [positive dB of Array-Signal-to-Noise-Ratio (ASNR)]. In weak signal power environments, conventional beamformer-based and subspace-based methods fail to estimate the DOA correctly. The MaxMax method makes it possible to maintain accurate estimates of the DOA even in extremely noisy environments (-10 dB of ASNR). The method is reviewed and its performance is compared with that of the Conventional Beamformer, Capon's Beamformer, MUSIC, ESPRIT, and Min-Norm methods. In contrast with the subspace-based methods, which depend entirely on a full-rank signal covariance matrix, the MaxMax method does not. Hence, the performance of the method remains superior to that of the others without adjusting the algorithm to the characteristics of the source signals, such as multipath or single-path propagation. If the signal power is so weak that its presence is almost negligible, Akaike's Information Criterion (AIC) and the Minimum Description Length (MDL) do not yield correct estimates of the number of signal paths. A new 'spatial sampling' technique and its performance are presented for estimating the number of signals when the signal power is strongly suppressed.
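
    AIC-based estimation of the number of signals (in the spirit of the Wax-Kailath criterion that the abstract's AIC/MDL comparison refers to) scores each candidate signal count k from the eigenvalues of the sample covariance matrix. The sketch below simulates a benign, high-ASNR scenario for an 8-element array; it illustrates the criterion itself, not the weak-signal regime in which the abstract reports it failing.

```python
# Sketch of AIC-based estimation of the number of signals from array data
# (Wax & Kailath style), on simulated snapshots for an 8-element array.
import numpy as np

def aic_num_sources(eigvals, n_snapshots):
    """Return AIC(k) for k = 0..p-1 given eigenvalues sorted in descending order."""
    p = len(eigvals)
    scores = []
    for k in range(p):
        tail = eigvals[k:]
        geo = np.exp(np.mean(np.log(tail)))      # geometric mean of noise eigenvalues
        arith = np.mean(tail)                     # arithmetic mean of noise eigenvalues
        scores.append(-2 * n_snapshots * (p - k) * np.log(geo / arith)
                      + 2 * k * (2 * p - k))
    return np.array(scores)

rng = np.random.default_rng(6)
p, N, d = 8, 200, 2                               # sensors, snapshots, true sources
angles = np.deg2rad([10.0, 25.0])
A = np.exp(1j * np.pi * np.outer(np.arange(p), np.sin(angles)))   # steering matrix
S = rng.normal(size=(d, N)) + 1j * rng.normal(size=(d, N))        # source signals
X = A @ S + 0.5 * (rng.normal(size=(p, N)) + 1j * rng.normal(size=(p, N)))

R = X @ X.conj().T / N                            # sample covariance matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]
print("estimated number of signals:", int(np.argmin(aic_num_sources(eigvals, N))))
```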

  7. Seismic hazard assessment in central Ionian Islands area (Greece) based on stress release models

    NASA Astrophysics Data System (ADS)

    Votsi, Irene; Tsaklidis, George; Papadimitriou, Eleftheria

    2011-08-01

    The long-term probabilistic seismic hazard of central Ionian Islands (Greece) is studied through the application of stress release models. In order to identify statistically distinct regions, the study area is divided into two subareas, namely Kefalonia and Lefkada, on the basis of seismotectonic properties. Previous results evidenced the existence of stress transfer and interaction between the Kefalonia and Lefkada fault segments. For the consideration of stress transfer and interaction, the linked stress release model is applied. A new model is proposed, where the hazard rate function in terms of X(t) has the form of the Weibull distribution. The fitted models are evaluated through residual analysis and the best of them is selected through the Akaike information criterion. Based on AIC, the results demonstrate that the simple stress release model fits the Ionian data better than the non-homogeneous Poisson and the Weibull models. Finally, the thinning simulation method is applied in order to produce simulated data and proceed to forecasting.

  8. Lung function score including a parameter of small airway disease as a highly predictive indicator of survival after allogeneic hematopoietic cell transplantation.

    PubMed

    Nakamae, Mika; Yamashita, Mariko; Koh, Hideo; Nishimoto, Mitsutaka; Hayashi, Yoshiki; Nakane, Takahiko; Nakashima, Yasuhiro; Hirose, Asao; Hino, Masayuki; Nakamae, Hirohisa

    2016-06-01

    Some studies on the predictive value of determining pulmonary function prior to allogeneic hematopoietic cell transplantation (allo-HCT) have shown a significant association between pulmonary function test (PFT) parameters and pulmonary complications and mortality. However, the percentage of patients showing abnormalities in pretransplant PFT parameters is low. We comprehensively evaluated the effect of pretransplant PFT parameters, including a marker of small airway disease (the ratio of the airflow rate at 50% vital capacity to the airflow rate at 25% vital capacity (V˙50/V˙25)), on outcomes in 206 evaluable patients who underwent allo-HCT at our institute. Notably, among the significant parameters in a univariable analysis, V˙50/V˙25 was the most powerful indicator of survival following allo-HCT (delta-Akaike information criterion [∆AIC] = 12.47, ∆χ² = 14.47; P = 0.0001). Additionally, a pretransplant lung function score (pLFS) established by applying three parameters with superior predictive values, including V˙50/V˙25, represented a better discriminating variable for the prediction of survival. Our data demonstrate that a pLFS incorporating a parameter of small airway disease, rather than parameters of central airway obstruction, may be useful for predicting patient survival following allo-HCT. PMID:27018997

  9. Informal Taxation*

    PubMed Central

    Olken, Benjamin A.; Singhal, Monica

    2011-01-01

    Informal payments are a frequently overlooked source of local public finance in developing countries. We use microdata from ten countries to establish stylized facts on the magnitude, form, and distributional implications of this “informal taxation.” Informal taxation is widespread, particularly in rural areas, with substantial in-kind labor payments. The wealthy pay more, but pay less in percentage terms, and informal taxes are more regressive than formal taxes. Failing to include informal taxation underestimates household tax burdens and revenue decentralization in developing countries. We discuss various explanations for and implications of these observed stylized facts. PMID:22199993

  10. "Information, Information Everywhere and Not..."

    ERIC Educational Resources Information Center

    Wright, Paula

    Demographic and economic materials relevant to rural economic development are the focus of this description of the types of information that are collected by the U.S. Bureau of the Census and how this information can be accessed. Information provided on demographic materials includes collection methods--the census, surveys, and administrative…

  11. Informed Consent

    PubMed Central

    Manion, F.; Hsieh, K.; Harris, M.

    2015-01-01

    Summary Background Despite efforts to provide standard definitions of terms such as “medical record”, “computer-based patient record”, “electronic medical record” and “electronic health record”, the terms are still used interchangeably. Initiatives like data and information governance, research biorepositories, and learning health systems require availability and reuse of data, as well as common understandings of the scope for specific purposes. Lacking widely shared definitions, utilization of the aforementioned terms in research informed consent documents calls into question whether all participants in the research process (patients, information technology and regulatory staff, and the investigative team) fully understand what data and information they are asking to obtain and agreeing to share. Objectives This descriptive study explored the terminology used in research informed consent documents when describing patient data and information, asking the question “Does the use of the term “medical record” in the context of a research informed consent document accurately represent the scope of the data involved?” Methods Informed consent document templates found on 17 Institutional Review Board (IRB) websites with Clinical and Translational Science Awards (CTSA) were searched for terms that appeared to be describing the data resources to be accessed. The National Library of Medicine’s (NLM) Terminology Services was searched for definitions provided by key standards groups that deposit terminologies with the NLM. Discussion The results suggest research consent documents are using outdated terms to describe patient information, health care terminology systems need to consider the context of research for use cases, and that there is significant work to be done to assure the HIPAA Omnibus Rule is applied to contemporary activities such as biorepositories and learning health systems. Conclusions “Medical record”, a term used extensively

  12. MH2c: Characterization of major histocompatibility α-helices - an information criterion approach

    NASA Astrophysics Data System (ADS)

    Hischenhuber, B.; Frommlet, F.; Schreiner, W.; Knapp, B.

    2012-07-01

    Major histocompatibility proteins share a common overall structure or peptide binding groove. Two binding groove domains, on the same chain for major histocompatibility class I or on two different chains for major histocompatibility class II, contribute to that structure that consists of two α-helices (“wall”) and a sheet of eight anti-parallel beta strands (“floor”). Apart from the peptide presented in the groove, the major histocompatibility α-helices play a central role for the interaction with the T cell receptor. This study presents a generalized mathematical approach for the characterization of these helices. We employed polynomials of degree 1 to 7 and splines with 1 to 2 nodes based on polynomials of degree 1 to 7 on the α-helices projected on their principal components. We evaluated all models with a corrected Akaike Information Criterion to determine which model represents the α-helices in the best way without overfitting the data. This method is applicable for both the stationary and the dynamic characterization of α-helices. By deriving differential geometric parameters from these models one obtains a reliable method to characterize and compare α-helices for a broad range of applications. Catalogue identifier: AELX_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AELX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 327 565 No. of bytes in distributed program, including test data, etc.: 17 433 656 Distribution format: tar.gz Programming language: Matlab Computer: Personal computer architectures Operating system: Windows, Linux, Mac (all systems on which Matlab can be installed) RAM: Depends on the trajectory size, min. 1 GB (Matlab) Classification: 2.1, 4.9, 4.14 External routines: Curve Fitting Toolbox and Statistic Toolbox of

  13. Spacetime information

    SciTech Connect

    Hartle, J.B. (Isaac Newton Institute for the Mathematical Sciences, University of Cambridge, Cambridge CB3 0EH)

    1995-02-15

    In usual quantum theory, the information available about a quantum system is defined in terms of the density matrix describing it on a spacelike surface. This definition must be generalized for extensions of quantum theory which neither require, nor always permit, a notion of state on a spacelike surface. In particular, it must be generalized for the generalized quantum theories appropriate when spacetime geometry fluctuates quantum mechanically or when geometry is fixed but not foliable by spacelike surfaces. This paper introduces a four-dimensional notion of the information available about a quantum system's boundary conditions in the various sets of decohering, coarse-grained histories it may display. This spacetime notion of information coincides with the familiar one when quantum theory is formulable in terms of states on spacelike surfaces but generalizes this notion when it cannot be so formulated. The idea of spacetime information is applied in several contexts: When spacetime geometry is fixed the information available through alternatives restricted to a fixed spacetime region is defined. The information available through histories of alternatives of general operators is compared to that obtained from the more limited coarse grainings of sum-over-histories quantum mechanics that refer only to coordinates. The definition of information is considered in generalized quantum theories. We consider as specific examples time-neutral quantum mechanics with initial and final conditions, quantum theories with nonunitary evolution, and the generalized quantum frameworks appropriate for quantum spacetime. In such theories complete information about a quantum system is not necessarily available on any spacelike surface but must be searched for throughout spacetime. The information loss commonly associated with "the evolution of pure states into mixed states" in black hole evaporation is thus not in conflict with the principles of generalized quantum mechanics.

  14. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, Kritina; Sandor, A.; Thompson, S. G.; McCann, R. S.; Kaiser, M. K.; Begault, D. R.; Adelstein, B. D.; Beutter, B. R.; Stone, L. S.

    2008-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew on flight vehicles, surface landers and habitats, and during extra-vehicular activities (EVA). Designers of displays and controls for exploration missions must be prepared to select the text formats, label styles, alarms, electronic procedure designs, and cursor control devices that provide for optimal crew performance on exploration tasks. The major areas of work, or subtasks, within the Information Presentation DRP are: 1) Controls, 2) Displays, 3) Procedures, and 4) EVA Operations.

  15. [Informed consent].

    PubMed

    Rodríguez, C R; González Parra, E; Martínez Castelao, A

    2008-01-01

    - Basic law 41/2002 on patient autonomy regulates the rights and obligations of patients, users and professionals, as well as those of public and private health care centers and services. This regulation refers to patient autonomy, the right to information and essential clinical documentation. - This law establishes the minimum requirements for the information the patient should receive and the decision making in which the patient should take part. Diagnostic tests are performed and therapeutic decisions are taken in the ACKD unit in which patient information is an essential and mandatory requirement according to this law. PMID:19018748

  16. MMA, A Computer Code for Multi-Model Analysis

    USGS Publications Warehouse

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will
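
    The default discrimination criteria that MMA reports (AIC, AICc and BIC) have simple closed forms in the maximized log-likelihood, the number of estimated parameters and the number of observations; KIC is omitted here because it also requires a Fisher-information term. The helper functions below sketch those formulas with hypothetical calibration results for three alternative models.

```python
# Standard forms of the model-discrimination criteria named in the report:
#   AIC  = -2 ln L + 2k
#   AICc = AIC + 2k(k+1)/(n - k - 1)     (second-order bias correction)
#   BIC  = -2 ln L + k ln n
# where ln L is the maximized log-likelihood, k the number of estimated
# parameters, and n the number of observations.
import math

def aic(loglik, k):
    return -2.0 * loglik + 2.0 * k

def aicc(loglik, k, n):
    return aic(loglik, k) + 2.0 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    return -2.0 * loglik + k * math.log(n)

# hypothetical calibrated alternative models of one groundwater system
models = {"3-zone K": (-412.7, 5), "5-zone K": (-405.1, 9), "geostatistical K": (-401.8, 14)}
n_obs = 120
for name, (ll, k) in models.items():
    print(f"{name:18s} AIC={aic(ll, k):7.1f}  AICc={aicc(ll, k, n_obs):7.1f}  BIC={bic(ll, k, n_obs):7.1f}")
```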

  17. CMT data inversion using a Bayesian information criterion to estimate seismogenic stress fields

    NASA Astrophysics Data System (ADS)

    Terakawa, Toshiko; Matsu'ura, Mitsuhiro

    2008-02-01

    We developed an inversion method to estimate the stress fields related to earthquake generation (seismogenic stress fields) from the centroid moment tensors (CMT) of seismic events by using Akaike's Bayesian information criterion (ABIC). On the idea that the occurrence of an earthquake releases some part of the seismogenic stress field around its hypocentre, we define the CMT of a seismic event by a weighted volume integral of the true but unknown seismogenic stress field. Representing each component of the seismogenic stress field by the superposition of a finite number of 3-D basis functions (tri-cubic B-splines), we obtain a set of linear observation equations to be solved for the expansion coefficients (model parameters). We introduce a prior constraint on the roughness of the seismogenic stress field and combine it with observed data to construct a Bayesian model with a hierarchic, highly flexible structure controlled by hyper-parameters. The optimum values of the hyper-parameters are objectively determined from observed data by using ABIC. Given the optimum values of the hyper-parameters, we can obtain the best estimates of model parameters by using a maximum likelihood algorithm. We tested the validity of the inversion method through numerical experiments on two synthetic CMT data sets, assuming the distribution of fault orientations to be aligned with the maximum shear stress plane in one case and to be random in the other case. Then we applied the inversion method to actual CMT data in northeast Japan, and obtained a pattern of the seismogenic stress field consistent with geophysical and geological observations.

  18. Information engineering

    SciTech Connect

    Hunt, D.N.

    1997-02-01

    The Information Engineering thrust area develops information technology to support the programmatic needs of Lawrence Livermore National Laboratory's Engineering Directorate. Progress in five programmatic areas is described in separate reports contained herein. These are entitled Three-dimensional Object Creation, Manipulation, and Transport; Zephyr: A Secure Internet-Based Process to Streamline Engineering Procurements; Subcarrier Multiplexing: Optical Network Demonstrations; Parallel Optical Interconnect Technology Demonstration; and Intelligent Automation Architecture.

  19. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, K.L.; Boyer, J.L.; Sandor, A.; Thompson, S.G.; McCann, R.S.; Begault, D.R.; Adelstein, B.D.; Beutter, B.R.; Stone, L.S.

    2009-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. The major areas of work, or subtasks, within this DRP are: 1) Displays, 2) Controls, 3) Electronic Procedures and Fault Management, and 4) Human Performance Modeling. This DRP is a collaborative effort between researchers at Johnson Space Center and Ames Research Center.

  20. [Information systems].

    PubMed

    Rodríguez Maniega, José Antonio; Trío Maseda, Reyes

    2005-03-01

    The arrival of victims of the terrorist attacks of 11 March at the hospital put the efficiency of its information systems to the test. To be most efficient, these systems should be simple and directed, above all, to the follow-up of victims and to providing the necessary information to patients and families. A specific and easy to use system is advisable. PMID:15771852

  1. When Information Improves Information Security

    NASA Astrophysics Data System (ADS)

    Grossklags, Jens; Johnson, Benjamin; Christin, Nicolas

    This paper presents a formal, quantitative evaluation of the impact of bounded-rational security decision-making subject to limited information and externalities. We investigate a mixed economy of an individual rational expert and several naïve near-sighted agents. We further model three canonical types of negative externalities (weakest-link, best shot and total effort), and study the impact of two information regimes on the threat level agents are facing.

  2. Yesterday's Information.

    ERIC Educational Resources Information Center

    McKay, Martin D.; Stout, J. David

    1999-01-01

    Discusses access to Internet resources in school libraries, including the importance of evaluating content and appropriate use. The following online services that provide current factual information from legitimate resources are described: SIRS (Social Issues Resource Series), InfoTrac, EBSCO Host, SearchBank, and the Electric Library. (MES)

  3. Copyright Information

    MedlinePlus

    ... page: https://www.nlm.nih.gov/medlineplus/copyright.html Copyright Information. MedlinePlus contains both copyrighted and non-copyrighted material. Restrictions may apply when linking to ...

  4. Working Information

    ERIC Educational Resources Information Center

    Lloyd, Annemaree; Somerville, Margaret

    2006-01-01

    Purpose: The purpose of this article is to explore the contribution that an information literacy approach to the empirical study of workplace learning can make to how people understand and conceptualise workplace learning. Design/methodology/approach: Three cohorts of fire-fighters working in two regional locations in NSW, Australia were…

  5. Envisioning Information.

    ERIC Educational Resources Information Center

    Tufte, Edward R.

    This book presents over 400 illustrations of complex data that show how the dimensionality and density of portrayals can be enhanced. Practical advice on how to explain complex materials by visual means is given, and examples illustrate the fundamental principles of information display. Design strategies presented are exemplified in maps, the…

  6. Information Processing.

    ERIC Educational Resources Information Center

    Jennings, Carol Ann; McDonald, Sandy

    This publication contains instructional materials for teacher and student use for a course in information processing. The materials are written in terms of student performance using measurable objectives. The course includes 10 units. Each instructional unit contains some or all of the basic components of a unit of instruction: performance…

  7. Information Service.

    ERIC Educational Resources Information Center

    Scofield, James

    Newspaper librarians discussed the public use of their newspapers' libraries. Policies run the gamut from well-staffed public information services, within or outside the newspaper library, to no service at all to those outside the staff of the paper. Problems of dealing with tax and law enforcement agencies were covered, as well as cooperative…

  8. Teaching Information Skills: Recording Information.

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2002-01-01

    Discusses how to teach students in primary and intermediate grades to record and organize information. Highlights include developing a research question; collaborative planning between teachers and library media specialists; consistency of data entry; and an example of a unit on animal migration based on an appropriate Web site. (LRW)

  9. Information services and information processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  10. Comparison of six statistical approaches in the selection of appropriate fish growth models

    NASA Astrophysics Data System (ADS)

    Zhu, Lixin; Li, Lifang; Liang, Zhenlin

    2009-09-01

    The performance of six statistical approaches, which can be used for selection of the best model to describe the growth of individual fish, was analyzed using simulated and real length-at-age data. The six approaches include the coefficient of determination (R²), the adjusted coefficient of determination (adj.-R²), the root mean squared error (RMSE), Akaike's information criterion (AIC), the bias-corrected AIC (AICc) and the Bayesian information criterion (BIC). The simulation data were generated by five growth models with different numbers of parameters. Four sets of real data were taken from the literature. The parameters in each of the five growth models were estimated using the maximum likelihood method under the assumption of an additive error structure for the data. The model best supported by the data was identified using each of the six approaches. The results show that R² and RMSE have the same properties and perform worst. The sample size has an effect on the performance of adj.-R², AIC, AICc and BIC. Adj.-R² does better in small samples than in large samples. AIC is not suitable for use in small samples and tends to select a more complex model as the sample size becomes large. AICc and BIC have the best performance in small and large sample cases, respectively. Use of AICc or BIC is recommended for the selection of fish growth models according to the size of the length-at-age data.
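
    For least-squares fits with Gaussian errors, the six statistics compared in this study can all be written in terms of the residual sum of squares. A sketch follows, using hypothetical fit results for two competing growth models; the adjusted R² is written in one common form, 1 - (1 - R²)(n - 1)/(n - k).

```python
# The six statistics compared in the paper, written for a least-squares fit
# with n observations, k estimated parameters (error variance included) and
# residual sum of squares RSS; Gaussian errors assumed throughout.
import math

def criteria(rss, tss, n, k):
    r2 = 1 - rss / tss
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k)          # one common form
    rmse = math.sqrt(rss / n)
    aic = n * math.log(rss / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    bic = n * math.log(rss / n) + k * math.log(n)
    return {"R2": r2, "adj.R2": adj_r2, "RMSE": rmse, "AIC": aic, "AICc": aicc, "BIC": bic}

# hypothetical length-at-age sample and fit results for two growth models
n, tss = 60, 900.0
fits = {"von Bertalanffy (k=4)": (140.0, 4), "Richards (k=5)": (132.0, 5)}
for name, (rss, k) in fits.items():
    stats = criteria(rss, tss, n, k)
    print(name, {s: round(v, 2) for s, v in stats.items()})
```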

  11. Information management - Assessing the demand for information

    NASA Technical Reports Server (NTRS)

    Rogers, William H.

    1991-01-01

    Information demand is defined in terms of both information content (what information) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. It is seen that the essence of this intelligence is comprehending and capturing the demand for information.

  12. Change in BMI Accurately Predicted by Social Exposure to Acquaintances

    PubMed Central

    Oloritun, Rahman O.; Ouarda, Taha B. M. J.; Moturu, Sai; Madan, Anmol; Pentland, Alex (Sandy); Khayal, Inas

    2013-01-01

    Research has mostly focused on obesity and not on processes of BMI change more generally, although these may be key factors that lead to obesity. Studies have suggested that obesity is affected by social ties. However, these studies used survey-based data collection techniques that may be biased toward selecting only close friends and relatives. In this study, mobile phone sensing techniques were used to routinely capture social interaction data in an undergraduate dorm. By automating the capture of social interaction data, the limitations of self-reported social exposure data are avoided. This study attempts to understand and develop a model that best describes the change in BMI using social interaction data. We evaluated a cohort of 42 college students in a co-located university dorm, using social interaction data automatically captured via mobile phones together with survey-based health-related information. We determined the most predictive variables for change in BMI using the least absolute shrinkage and selection operator (LASSO) method. The selected variables, together with gender, healthy diet category, and ability to manage stress, were used to build multiple linear regression models that estimate the effect of exposure and individual factors on change in BMI. We identified the best model using the Akaike Information Criterion (AIC) and R². This study found a model that explains 68% (p<0.0001) of the variation in change in BMI. The model combined social interaction data, especially from acquaintances, and personal health-related information to explain change in BMI. This is the first study to take into account both interactions across different levels of social ties and personal health-related information. Social interactions with acquaintances accounted for more than half the variation in change in BMI. This suggests the importance of not only individual health information but also the significance of social interactions with people we are exposed to, even people we may not consider as close friends. PMID
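
    The two-stage procedure described above, LASSO screening of candidate predictors followed by comparison of linear models built from the surviving variables using AIC and R², can be sketched generically. The variable names and data below are placeholders, not the dorm-study measurements.

```python
# Hypothetical sketch: LASSO screening followed by AIC comparison of
# ordinary least-squares models built from the selected predictors.
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(7)
n = 42
names = ["acquaintance_exposure", "close_friend_exposure", "healthy_diet",
         "stress_management", "gender"]
X = rng.normal(size=(n, len(names)))
dbmi = 0.8 * X[:, 0] + 0.3 * X[:, 2] + rng.normal(scale=0.5, size=n)  # simulated change in BMI

# stage 1: LASSO (cross-validated penalty) keeps predictors with nonzero weight
lasso = LassoCV(cv=5, random_state=0).fit(X, dbmi)
kept = [i for i, c in enumerate(lasso.coef_) if abs(c) > 1e-6]
print("kept by LASSO:", [names[i] for i in kept])

# stage 2: compare OLS models built from the kept set (and a smaller sub-model) by AIC
def ols_fit(cols):
    return sm.OLS(dbmi, sm.add_constant(X[:, cols])).fit()

full = ols_fit(kept)
sub = ols_fit(kept[:1]) if len(kept) > 1 else full
print(f"full model: AIC = {full.aic:.1f}, R2 = {full.rsquared:.2f}")
print(f"sub  model: AIC = {sub.aic:.1f}, R2 = {sub.rsquared:.2f}")
```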

  13. Information Environments

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia

    2003-01-01

    The objective of GRC CNIS/IE work is to build a plug-n-play infrastructure that provides the Grand Challenge Applications with a suite of tools for coupling codes together, numerical zooming between fidelity of codes and gaining deployment of these simulations onto the Information Power Grid. The GRC CNIS/IE work will streamline and improve this process by providing tighter integration of various tools through the use of object oriented design of component models and data objects and through the use of CORBA (Common Object Request Broker Architecture).

  14. Information surveillance

    NASA Astrophysics Data System (ADS)

    Seiders, Barbara; McQuerry, Dennis; Ferryman, Thomas A.; Whitney, Paul D.; Rybka, Anthony

    2002-07-01

    Biological weapons are within reach of individuals, small groups, terrorist organizations, as well as nations. With pervasive integration of civilian and military populations worldwide, the ill winds of biological warfare stand to affect military troops and civilians alike. A variety of technologies are emerging - such as pathogen detection devices, streaming internet characterization tools, information exploitation techniques, automated feature extraction, and ubiquitous wireless communication - that can help. These technologies, if taken together within an integrated analytical framework, could make possible the monitoring of diverse parameters that may indicate a change in the state of health of a given population - either the emergence of a naturally occurring disease or the outbreak of a disease as a result of hostile intent. This presentation will discuss the application of new information surveillance tools and technologies as they apply to health and disease monitoring, particularly within the context of potential terrorist or hostile nation use of biological warfare. Although discussed within the specific context of health surveillance, the tools and processes described here are generally applicable within other domains of subject matter expertise.

  15. Information surveillance

    SciTech Connect

    Seiders, Barbara AB; McQuerry, Dennis L.; Ferryman, Thomas A.; Whitney, Paul D.; Rybka, Anthony J.

    2002-07-15

    Biological weapons are within reach of individuals, small groups, terrorist organizations, as well as nations. With pervasive integration of civilian and military populations worldwide, the ill winds of biological warfare stand to affect military troops and civilians alike. A variety of technologies are emerging - such as pathogen detection devices, streaming internet characterization tools, information exploitation techniques, automated feature extraction, and ubiquitous wireless communication - that can help. These technologies, if taken together within an integrated analytical framework, could make possible the monitoring of diverse parameters that may indicate a change in the state of health of a given population - either the emergence of a naturally occurring disease or the outbreak of a disease as a result of hostile intent. This presentation will discuss the application of new information surveillance tools and technologies as they apply to health and disease monitoring, particularly within the context of potential terrorist or hostile nation use of biological warfare. Although discussed within the specific context of health surveillance, the tools and processes described here are generally applicable within other domains of subject matter expertise.

  16. Testing the consistency of wildlife data types before combining them: the case of camera traps and telemetry.

    PubMed

    Popescu, Viorel D; Valpine, Perry; Sweitzer, Rick A

    2014-04-01

    Wildlife data gathered by different monitoring techniques are often combined to estimate animal density. However, methods to check whether different types of data provide consistent information (i.e., can information from one data type be used to predict responses in the other?) before combining them are lacking. We used generalized linear models and generalized linear mixed-effects models to relate camera trap probabilities for marked animals to independent space use from telemetry relocations using 2 years of data for fishers (Pekania pennanti) as a case study. We evaluated (1) camera trap efficacy by estimating how camera detection probabilities are related to nearby telemetry relocations and (2) whether home range utilization density estimated from telemetry data adequately predicts camera detection probabilities, which would indicate consistency of the two data types. The number of telemetry relocations within 250 and 500 m from camera traps predicted detection probability well. For the same number of relocations, females were more likely to be detected during the first year. During the second year, all fishers were more likely to be detected during the fall/winter season. Models predicting camera detection probability and photo counts solely from telemetry utilization density had the best or nearly best Akaike Information Criterion (AIC), suggesting that telemetry and camera traps provide consistent information on space use. Given the same utilization density, males were more likely to be photo-captured due to larger home ranges and higher movement rates. Although methods that combine data types (spatially explicit capture-recapture) make simple assumptions about home range shapes, it is reasonable to conclude that in our case, camera trap data do reflect space use in a manner consistent with telemetry data. However, differences between the 2 years of data suggest that camera efficacy is not fully consistent across ecological conditions and make the case
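
    As an aside, the kind of AIC-based comparison described in this abstract can be sketched with a binomial GLM in Python. The data frame below is entirely hypothetical (invented counts of detection occasions, detections, and nearby telemetry relocations), not the fisher dataset used in the study, and the candidate models are illustrative.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical data: for each camera station x animal, the number of detection
    # occasions, detections, and telemetry relocations within 250 m and 500 m.
    df = pd.DataFrame({
        "detections":  [3, 0, 5, 1, 0, 7, 2, 4],
        "occasions":   [20, 20, 20, 20, 20, 20, 20, 20],
        "relocs_250m": [4, 0, 9, 2, 1, 12, 3, 6],
        "relocs_500m": [7, 1, 14, 4, 2, 18, 5, 10],
    })

    # Two-column (successes, failures) response for a binomial GLM of detection probability.
    endog = np.column_stack([df["detections"], df["occasions"] - df["detections"]])

    candidates = {
        "null":        [],
        "relocs_250m": ["relocs_250m"],
        "relocs_500m": ["relocs_500m"],
    }

    aics = {}
    for name, cols in candidates.items():
        exog = sm.add_constant(df[cols]) if cols else np.ones((len(df), 1))
        fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
        aics[name] = fit.aic

    for name, aic in sorted(aics.items(), key=lambda kv: kv[1]):
        print(f"{name:12s} AIC = {aic:7.2f}")
    ```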

  17. Information findability :

    SciTech Connect

    Stoecker, Nora Kathleen

    2014-03-01

    A Systems Analysis Group has existed at Sandia National Laboratories since at least the mid-1950s. Much of the group's work output (reports, briefing documents, and other materials) has been retained, along with large numbers of related documents. Over time the collection has grown to hundreds of thousands of unstructured documents in many formats contained in one or more of several different shared drives or SharePoint sites, with perhaps five percent of the collection still existing in print format. This presents a challenge. How can the group effectively find, manage, and build on information contained somewhere within such a large set of unstructured documents? In response, a project was initiated to identify tools that would be able to meet this challenge. This report documents the results found and recommendations made as of August 2013.

  18. Informed consent.

    PubMed

    Steevenson, Grania

    2006-08-01

    Disclosure of information prior to consent is a very complex area of medical ethics. On the surface it would seem to be quite clear cut, but on closer inspection the scope for 'grey areas' is vast. In practice, however, it could be argued that the number of cases that result in complaint or litigation is comparatively small. However, this does not mean that wrong decisions or unethical scenarios do not occur. It would seem that in clinical practice these ethical grey areas concerning patients' full knowledge of their condition or treatment are quite common. One of the barometers for how much disclosure should be given prior to consent could be the feedback obtained from the patients. Are they asking relevant questions pertinent to their condition and do they show a good understanding of the options available? This should be seen as a positive trait and should be welcomed by the healthcare professionals. Ultimately it gives patients greater autonomy and the healthcare professional can expand and build on the patient's knowledge as well as allay fears perhaps based on wrongly held information. Greater communication with the patient would help the healthcare professional pitch their explanations at the right level. Every case and scenario is different and unique and deserves to be treated as such. Studies have shown that most patients can understand their medical condition and treatment provided communication has been thorough (Gillon 1996). It is in the patients' best interests to feel comfortable with the level of disclosure offered to them. It can only foster greater trust and respect between them and the healthcare profession which has to be mutually beneficial to both parties. PMID:16939165

  19. A water quality index model using stepwise regression and neural networks models for the Piabanha River basin in Rio de Janeiro, Brazil

    NASA Astrophysics Data System (ADS)

    Villas Boas, M. D.; Olivera, F.; Azevedo, J. S.

    2013-12-01

    The evaluation of water quality through 'indexes' is widely used in environmental sciences. There are a number of methods available for calculating water quality indexes (WQI), usually based on site-specific parameters. In Brazil, WQI were initially used in the 1970s and were adapted from the methodology developed in association with the National Science Foundation (Brown et al., 1970). Specifically, the WQI 'IQA/SCQA', developed by the Institute of Water Management of Minas Gerais (IGAM), is estimated based on nine parameters: Temperature Range, Biochemical Oxygen Demand, Fecal Coliforms, Nitrate, Phosphate, Turbidity, Dissolved Oxygen, pH and Electrical Conductivity. The goal of this study was to develop a model for calculating the IQA/SCQA for the Piabanha River basin in the State of Rio de Janeiro (Brazil), using only the parameters measurable by a Multiparameter Water Quality Sonde (MWQS) available in the study area. These parameters are: Dissolved Oxygen, pH and Electrical Conductivity. The use of this model will make it possible to expand the water quality monitoring network in the basin without requiring significant additional resources. Measuring water quality with the MWQS is less expensive than the laboratory analyses required for the other parameters. The water quality data used in the study were obtained by the Geological Survey of Brazil in partnership with other public institutions (i.e. universities and environmental institutes) as part of the project "Integrated Studies in Experimental and Representative Watersheds". Two models were developed to correlate the values of the three measured parameters and the IQA/SCQA values calculated based on all nine parameters. The results were evaluated according to the following validation statistics: coefficient of determination (R2), Root Mean Square Error (RMSE), Akaike information criterion (AIC) and Final Prediction Error (FPE). The first model was a linear stepwise regression between three independent variables

  20. Selecting a distributional assumption for modelling relative densities of benthic macroinvertebrates

    USGS Publications Warehouse

    Gray, B.R.

    2005-01-01

    The selection of a distributional assumption suitable for modelling macroinvertebrate density data is typically challenging. Macroinvertebrate data often exhibit substantially larger variances than expected under a standard count assumption, that of the Poisson distribution. Such overdispersion may derive from multiple sources, including heterogeneity of habitat (historically and spatially), differing life histories for organisms collected within a single collection in space and time, and autocorrelation. Taken to extreme, heterogeneity of habitat may be argued to explain the frequent large proportions of zero observations in macroinvertebrate data. Sampling locations may consist of habitats defined qualitatively as either suitable or unsuitable. The former category may yield random or stochastic zeroes and the latter structural zeroes. Heterogeneity among counts may be accommodated by treating the count mean itself as a random variable, while extra zeroes may be accommodated using zero-modified count assumptions, including zero-inflated and two-stage (or hurdle) approaches. These and linear assumptions (following log- and square root-transformations) were evaluated using 9 years of mayfly density data from a 52 km, ninth-order reach of the Upper Mississippi River (n = 959). The data exhibited substantial overdispersion relative to that expected under a Poisson assumption (i.e. variance:mean ratio = 23 ≫ 1), and 43% of the sampling locations yielded zero mayflies. Based on the Akaike Information Criterion (AIC), count models were improved most by treating the count mean as a random variable (via a Poisson-gamma distributional assumption) and secondarily by zero modification (i.e. improvements in AIC values = 9184 units and 47-48 units, respectively). Zeroes were underestimated by the Poisson, log-transform and square root-transform models, slightly by the standard negative binomial model but not by the zero-modified models (61%, 24%, 32%, 7%, and 0%, respectively).
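
    The comparison this abstract describes (Poisson vs. a Poisson-gamma, i.e. negative binomial, vs. a zero-modified count assumption, ranked by AIC) can be sketched with statsmodels. The simulated counts below are illustrative stand-ins for the mayfly data, and only intercept-only models are fitted; a hurdle or zero-inflated negative binomial variant could be added in the same way.

    ```python
    import numpy as np
    from statsmodels.discrete.discrete_model import Poisson, NegativeBinomial
    from statsmodels.discrete.count_model import ZeroInflatedPoisson

    rng = np.random.default_rng(42)

    # Hypothetical zero-heavy, overdispersed counts: "unsuitable" locations contribute
    # structural zeroes, "suitable" ones contribute gamma-mixed Poisson counts.
    n = 500
    suitable = rng.random(n) < 0.6
    counts = np.where(suitable, rng.negative_binomial(1, 0.05, n), 0)

    exog = np.ones((n, 1))  # intercept-only models, for illustration

    models = {
        "Poisson":               Poisson(counts, exog),
        "Negative binomial":     NegativeBinomial(counts, exog),
        "Zero-inflated Poisson": ZeroInflatedPoisson(counts, exog),
    }

    for name, model in models.items():
        fit = model.fit(disp=False)
        print(f"{name:22s} AIC = {fit.aic:10.1f}")
    ```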

  1. Seasonality and Trend Forecasting of Tuberculosis Prevalence Data in Eastern Cape, South Africa, Using a Hybrid Model

    PubMed Central

    Azeez, Adeboye; Obaromi, Davies; Odeyemi, Akinwumi; Ndege, James; Muntabayi, Ruffin

    2016-01-01

    Background: Tuberculosis (TB) is a deadly infectious disease caused by Mycobacterium tuberculosis. Tuberculosis as a chronic and highly infectious disease is prevalent in almost every part of the globe. More than 95% of TB mortality occurs in low/middle income countries. In 2014, approximately 10 million people were diagnosed with active TB and two million died from the disease. In this study, our aim is to compare the predictive powers of the seasonal autoregressive integrated moving average (SARIMA) model and a hybrid SARIMA-neural network autoregression (SARIMA-NNAR) model for TB incidence, and to analyse its seasonality in South Africa. Methods: TB incidence cases data from January 2010 to December 2015 were extracted from the Eastern Cape Health facility report of the electronic Tuberculosis Register (ERT.Net). A SARIMA model and a combined SARIMA and neural network autoregression (SARIMA-NNAR) model were used to analyse and predict the TB data from 2010 to 2015. Performance measures of mean square error (MSE), root mean square error (RMSE), mean absolute error (MAE), mean percent error (MPE), mean absolute scaled error (MASE) and mean absolute percentage error (MAPE) were used to compare the predictive performance of the two models. Results: Although both models could predict TB incidence, the combined model displayed better performance. For the combined model, the Akaike information criterion (AIC), second-order AIC (AICc) and Bayesian information criterion (BIC) were 288.56, 308.31 and 299.09 respectively, lower than the corresponding SARIMA model values of 329.02, 327.20 and 341.99. The SARIMA-NNAR model forecast a slightly stronger seasonal trend in TB incidence than the single model. Conclusions: The combined model indicated a better TB incidence forecasting with a lower AICc. The model also indicates the need for resolute
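
    A minimal sketch of the SARIMA side of such a comparison, using statsmodels and a synthetic monthly series in place of the ERT.Net register data; the candidate orders are arbitrary illustrations, and the hybrid NNAR component is not reproduced here.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(0)

    # Synthetic monthly case counts with an annual cycle, standing in for the
    # 2010-2015 register series; values and orders are purely illustrative.
    months = pd.date_range("2010-01", periods=72, freq="MS")
    cases = pd.Series(50 + 20 * np.sin(2 * np.pi * months.month / 12)
                      + rng.normal(0, 5, len(months)), index=months)

    candidate_orders = [((1, 0, 0), (1, 0, 0, 12)),
                        ((1, 1, 1), (0, 1, 1, 12)),
                        ((2, 1, 0), (1, 1, 0, 12))]

    for order, seasonal in candidate_orders:
        fit = SARIMAX(cases, order=order, seasonal_order=seasonal).fit(disp=False)
        print(f"SARIMA{order}x{seasonal}: AIC={fit.aic:8.1f}  "
              f"AICc={fit.aicc:8.1f}  BIC={fit.bic:8.1f}")
    ```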

  2. Ventilation/Perfusion Positron Emission Tomography—Based Assessment of Radiation Injury to Lung

    SciTech Connect

    Siva, Shankar; Hardcastle, Nicholas; Kron, Tomas; Bressel, Mathias; Callahan, Jason; MacManus, Michael P.; Shaw, Mark; Plumridge, Nikki; Hicks, Rodney J.; Steinfort, Daniel; Ball, David L.; Hofman, Michael S.

    2015-10-01

    Purpose: To investigate ⁶⁸Ga-ventilation/perfusion (V/Q) positron emission tomography (PET)/computed tomography (CT) as a novel imaging modality for assessment of perfusion, ventilation, and lung density changes in the context of radiation therapy (RT). Methods and Materials: In a prospective clinical trial, 20 patients underwent 4-dimensional (4D)-V/Q PET/CT before, midway through, and 3 months after definitive lung RT. Eligible patients were prescribed 60 Gy in 30 fractions with or without concurrent chemotherapy. Functional images were registered to the RT planning 4D-CT, and isodose volumes were averaged into 10-Gy bins. Within each dose bin, relative loss in standardized uptake value (SUV) was recorded for ventilation and perfusion, and loss in air-filled fraction was recorded to assess RT-induced lung fibrosis. A dose-effect relationship was described using both linear and 2-parameter logistic fit models, and goodness of fit was assessed with Akaike Information Criterion (AIC). Results: A total of 179 imaging datasets were available for analysis (1 scan was unrecoverable). An almost perfectly linear negative dose-response relationship was observed for perfusion and air-filled fraction (r² = 0.99, P<.01), with ventilation strongly negatively linear (r² = 0.95, P<.01). Logistic models did not provide a better fit as evaluated by AIC. Perfusion, ventilation, and the air-filled fraction decreased 0.75 ± 0.03%, 0.71 ± 0.06%, and 0.49 ± 0.02%/Gy, respectively. Within high-dose regions, higher baseline perfusion SUV was associated with greater rate of loss. At 50 Gy and 60 Gy, the rate of loss was 1.35% (P=.07) and 1.73% (P=.05) per SUV, respectively. Of 8/20 patients with peritumoral reperfusion/reventilation during treatment, 7/8 did not sustain this effect after treatment. Conclusions: Radiation-induced regional lung functional deficits occur in a dose-dependent manner and can be estimated by simple linear models with 4D-V/Q PET
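
    The dose-response comparison described above (a linear vs. a 2-parameter logistic model judged by AIC) can be sketched from least-squares fits. The dose bins and loss values below are synthetic placeholders near the reported ~0.75 %/Gy slope, and the AIC is the standard least-squares form, not necessarily the authors' exact computation.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(1)

    # Synthetic dose-response data: fractional loss of perfusion per 10-Gy dose bin.
    dose = np.arange(5.0, 65.0, 10.0)                      # bin centres (Gy)
    loss = 0.0075 * dose + rng.normal(0, 0.02, dose.size)  # fractional loss

    def linear(d, a, b):
        return a + b * d

    def logistic2(d, d50, k):
        # 2-parameter logistic dose-effect curve
        return 1.0 / (1.0 + np.exp(-(d - d50) / k))

    def aic_least_squares(y, yhat, n_params):
        # AIC for a least-squares fit with Gaussian errors (additive constant dropped)
        n = y.size
        rss = np.sum((y - yhat) ** 2)
        return n * np.log(rss / n) + 2 * n_params

    for name, func, p0 in [("linear", linear, (0.0, 0.01)),
                           ("logistic", logistic2, (30.0, 10.0))]:
        popt, _ = curve_fit(func, dose, loss, p0=p0, maxfev=10000)
        aic = aic_least_squares(loss, func(dose, *popt), len(popt))
        print(f"{name:9s} params = {np.round(popt, 4)}  AIC = {aic:.2f}")
    ```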

  3. Projecting climate-driven increases in North American fire activity

    NASA Astrophysics Data System (ADS)

    Wang, D.; Morton, D. C.; Collatz, G. J.

    2013-12-01

    Climate regulates fire activity through controls on vegetation productivity (fuels), lightning ignitions, and conditions governing fire spread. In many regions of the world, human management also influences the timing, duration, and extent of fire activity. These coupled interactions between human and natural systems make fire a complex component of the Earth system. Satellite data provide valuable information on the spatial and temporal dynamics of recent fire activity, as active fires, burned area, and land cover information can be combined to separate wildfires from intentional burning for agriculture and forestry. Here, we combined satellite-derived burned area data with land cover and climate data to assess fire-climate relationships in North America between 2000-2012. We used the latest versions of the Global Fire Emissions Database (GFED) burned area product and Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate data to develop regional relationships between burned area and potential evaporation (PE), an integrated dryness metric. Logistic regression models were developed to link burned area with PE and individual climate variables during and preceding the fire season, and optimal models were selected based on Akaike Information Criterion (AIC). Overall, our model explained 85% of the variance in burned area since 2000 across North America. Fire-climate relationships from the era of satellite observations provide a blueprint for potential changes in fire activity under scenarios of climate change. We used that blueprint to evaluate potential changes in fire activity over the next 50 years based on twenty models from the Coupled Model Intercomparison Project Phase 5 (CMIP5). All models suggest an increase of PE under low and high emissions scenarios (Representative Concentration Pathways (RCP) 4.5 and 8.5, respectively), with largest increases in projected burned area across the western US and central Canada. Overall, near

  4. National Health Information Center

    MedlinePlus


  5. Predictive Information: Status or Alert Information?

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Bruneau, Daniel; Press, Hayes N.

    2008-01-01

    Previous research investigating the efficacy of predictive information for detecting and diagnosing aircraft system failures found that subjects like to have predictive information concerning when a parameter would reach an alert range. This research focused on where the predictive information should be located, whether the information should be more closely associated with the parameter information or with the alert information. Each subject saw 3 forms of predictive information: (1) none, (2) a predictive alert message, and (3) predictive information on the status display. Generally, subjects performed better and preferred to have predictive information available although the difference between status and alert predictive information was minimal. Overall, for detection and recalling what happened, status predictive information is best; however for diagnosis, alert predictive information holds a slight edge.

  6. Potential end-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1978-01-01

    Various communication systems were considered which are required to transmit both imaging and a typically error-sensitive class of data called general science/engineering (gse) over a Gaussian channel. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an Advanced Imaging Communication System (AICS) which exhibits the rather significant potential advantages of sophisticated data compression coupled with powerful yet practical channel coding.

  7. Detection of temporal changes in earthquake rates

    NASA Astrophysics Data System (ADS)

    Touati, S.

    2012-12-01

    Many statistical analyses of earthquake rates and time-dependent forecasting of future rates involve the detection of changes in the basic rate of events, independent of the fluctuations caused by aftershock sequences. We examine some of the statistical techniques for inferring these changes, using both real and synthetic earthquake data to check the statistical significance of these inferences. One common method is to use the Akaike Information Criterion (AIC) to choose between a single model and a double model with a changepoint; this criterion evaluates the strength of the fit and incorporates a penalty for the extra parameters. We test this method on many realisations of the ETAS model, with and without changepoints present, to see how often it chooses the correct model. A more rigorous method is to calculate the Bayesian evidence, or marginal likelihood, for each model and then compare these. The evidence is essentially the likelihood of the model integrated over the whole of the model space, giving a measure of how likely the data is for that model. It does not rely on estimation of best-fit parameters, making it a better comparator than the AIC; Occam's razor also arises naturally in this process due to the fact that more complex models tend to be able to explain a larger range of observations, and therefore the relative likelihood of any particular observations will be smaller than for a simpler model. Evidence can be calculated using Markov Chain Monte Carlo techniques. We compare these two approaches on synthetic data. We also look at the 1997-98 Colfiorito sequence in Umbria-Marche, Italy, using maximum likelihood to fit the ETAS model and then simulating the ETAS model to create synthetic versions of the catalogue for comparison. We simulate using ensembles of parameter values sampled from the posterior for each parameter, with the largest events artificially inserted, to compare the resultant event rates, inter-event time distributions and other
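
    A toy version of the AIC changepoint test described here, applied to a synthetic homogeneous Poisson catalogue rather than real or ETAS-simulated seismicity: a single-rate model (one parameter) is compared against a two-rate model with a profiled changepoint (three parameters).

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic catalogue: the background rate doubles halfway through the window.
    T = 1000.0
    times = np.sort(np.concatenate([rng.uniform(0, T / 2, 50),     # ~0.1 events/unit time
                                    rng.uniform(T / 2, T, 100)]))  # ~0.2 events/unit time

    def poisson_loglik(n_events, duration):
        # Maximised log-likelihood of a homogeneous Poisson process with rate n/duration.
        rate = n_events / duration
        return n_events * np.log(rate) - rate * duration

    # Single-rate model: one free parameter (the rate).
    aic_single = 2 * 1 - 2 * poisson_loglik(times.size, T)

    # Changepoint model: two rates plus the changepoint location (three parameters),
    # with the changepoint profiled over a grid of candidate times.
    best_loglik = -np.inf
    for tau in np.linspace(50.0, T - 50.0, 200):
        n1 = int(np.sum(times < tau))
        n2 = times.size - n1
        if n1 == 0 or n2 == 0:
            continue
        best_loglik = max(best_loglik,
                          poisson_loglik(n1, tau) + poisson_loglik(n2, T - tau))
    aic_change = 2 * 3 - 2 * best_loglik

    print(f"AIC, single rate : {aic_single:.1f}")
    print(f"AIC, changepoint : {aic_change:.1f}  (lower value is preferred)")
    ```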

  8. The number and type of food retailers surrounding schools and their association with lunchtime eating behaviours in students

    PubMed Central

    2013-01-01

    Background The primary study objective was to examine whether the presence of food retailers surrounding schools was associated with students’ lunchtime eating behaviours. The secondary objective was to determine whether measures of the food retail environment around schools captured using road network or circular buffers were more strongly related to eating behaviours while at school. Methods Grade 9 and 10 students (N=6,971) who participated in the 2009/10 Canadian Health Behaviour in School Aged Children Survey were included in this study. The outcome was determined by students’ self-reports of where they typically ate their lunch during school days. Circular and road network-based buffers were created for a 1 km distance surrounding 158 schools participating in the HBSC. The addresses of fast food restaurants, convenience stores and coffee/donut shops were mapped within the buffers. Multilevel logistic regression was used to determine whether there was a relationship between the presence of food retailers near schools and students regularly eating their lunch at a fast food restaurant, snack-bar or café. The Akaike Information Criterion (AIC) value, a measure of goodness of fit, was used to determine the optimal buffer type. Results For the 1 km circular buffers, students with 1–2 (OR=1.10, 95% CI: 0.57-2.11), 3–4 (OR=1.45, 95% CI: 0.75-2.82) and ≥5 nearby food retailers (OR=2.94, 95% CI: 1.71-5.09) were more likely to eat lunch at a food retailer compared to students with no nearby food retailers. The relationships were slightly stronger when assessed via 1 km road network buffers, with a greater likelihood of eating at a food retailer for 1–2 (OR=1.20, 95% CI: 0.74-1.95), 3–4 (OR=3.19, 95% CI: 1.66-6.13) and ≥5 nearby food retailers (OR=3.54, 95% CI: 2.08-6.02). Road network buffers appeared to provide a better measure of the food retail environment, as indicated by a lower AIC value (3332 vs. 3346). Conclusions There was a strong

  9. Human Benzene Metabolism Following Occupational and Environmental Exposures

    PubMed Central

    Rappaport, Stephen M.; Kim, Sungkyoon; Lan, Qing; Li, Guilan; Vermeulen, Roel; Waidyanatha, Suramya; Zhang, Luoping; Yin, Songnian; Smith, Martyn T.; Rothman, Nathaniel

    2011-01-01

    We previously reported evidence that humans metabolize benzene via two enzymes, including a hitherto unrecognized high-affinity enzyme that was responsible for an estimated 73 percent of total urinary metabolites [sum of phenol (PH), hydroquinone (HQ), catechol (CA), E,E-muconic acid (MA), and S-phenylmercapturic acid (SPMA)] in nonsmoking females exposed to benzene at sub-saturating (ppb) air concentrations. Here, we used the same Michaelis-Menten-like kinetic models to individually analyze urinary levels of PH, HQ, CA and MA from 263 nonsmoking Chinese women (179 benzene-exposed workers and 84 control workers) with estimated benzene air concentrations ranging from less than 0.001 ppm to 299 ppm. One model depicted benzene metabolism as a single enzymatic process (1-enzyme model) and the other as two enzymatic processes which competed for access to benzene (2-enzyme model). We evaluated model fits based upon the difference in values of Akaike’s Information Criterion (ΔAIC), and we gauged the weights of evidence favoring the two models based upon the associated Akaike weights and Evidence Ratios. For each metabolite, the 2-enzyme model provided a better fit than the 1-enzyme model with ΔAIC values decreasing in the order 9.511 for MA, 7.379 for PH, 1.417 for CA, and 0.193 for HQ. The corresponding weights of evidence favoring the 2-enzyme model (Evidence Ratios) were: 116.2:1 for MA, 40.0:1 for PH, 2.0:1 for CA and 1.1:1 for HQ. These results indicate that our earlier findings from models of total metabolites were driven largely by MA, representing the ring-opening pathway, and by PH, representing the ring-hydroxylation pathway. The predicted percentage of benzene metabolized by the putative high-affinity enzyme at an air concentration of 0.001 ppm was 88% based upon urinary MA and was 80% based upon urinary PH. As benzene concentrations increased, the respective percentages of benzene metabolized to MA and PH by the high-affinity enzyme decreased successively
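
    The ΔAIC, Akaike-weight, and evidence-ratio calculations referred to above follow directly from the AIC values of the competing models. The sketch below uses two illustrative AIC numbers chosen so that ΔAIC is about 9.5 (roughly the MA result); these are not the authors' fitted values.

    ```python
    import numpy as np

    def akaike_weights(aic_values):
        """Return (delta-AIC, Akaike weights) for a set of candidate models."""
        aic = np.asarray(aic_values, dtype=float)
        delta = aic - aic.min()
        w = np.exp(-0.5 * delta)
        return delta, w / w.sum()

    # Illustrative AIC values only (delta-AIC ~ 9.5), not the fitted values from the paper.
    aic = {"1-enzyme": 152.3, "2-enzyme": 142.8}
    delta, weights = akaike_weights(list(aic.values()))

    for (name, a), d, w in zip(aic.items(), delta, weights):
        print(f"{name}: AIC = {a:.1f}  dAIC = {d:.1f}  weight = {w:.3f}")

    # Evidence ratio of the better-supported model over the other.
    print(f"Evidence ratio (2-enzyme : 1-enzyme) = {weights[1] / weights[0]:.1f} : 1")
    ```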

  10. Determination of optimal diagnostic criteria for purulent vaginal discharge and cytological endometritis in dairy cows.

    PubMed

    Denis-Robichaud, J; Dubuc, J

    2015-10-01

    The objectives of this observational study were to identify the optimal diagnostic criteria for purulent vaginal discharge (PVD) and cytological endometritis (ENDO) using vaginal discharge, endometrial cytology, and leukocyte esterase (LE) tests, and to quantify their effect on subsequent reproductive performance. Data generated from 1,099 untreated Holstein cows (28 herds) enrolled in a randomized clinical trial were used in this study. Cows were examined at 35 (± 7) d in milk for PVD using vaginal discharge scoring and for ENDO using endometrial cytology and LE testing. Optimal combinations of diagnostic criteria were determined based on the lowest Akaike information criterion (AIC) to predict pregnancy status at first service. Once identified, these criteria were used to quantify the effect of PVD and ENDO on pregnancy risk at first service and on pregnancy hazard until 200 d in milk (survival analysis). Predicting ability of these diagnostic criteria was determined using area under the curve (AUC) values. The prevalence of PVD and ENDO was calculated as well as the agreement between endometrial cytology and LE. The optimal diagnostic criteria (lowest AIC) identified in this study were purulent vaginal discharge or worse (≥ 4), ≥ 6% polymorphonuclear leukocytes (PMNL) by endometrial cytology, and small amounts of leukocytes or worse (≥ 1) by LE testing. When using the combination of vaginal discharge and PMNL percentage as diagnostic tools (n = 1,099), the prevalences of PVD and ENDO were 17.1 and 36.2%, respectively. When using the combination of vaginal discharge and LE (n = 915), the prevalences of PVD and ENDO were 17.1 and 48.4%. The optimal strategies for predicting pregnancy status at first service were the use of LE only (AUC = 0.578) and PMNL percentage only (AUC = 0.575). Cows affected by PVD and ENDO had 0.36 and 0.32 times the odds, respectively, of being pregnant at first service when using PMNL percentage compared with that of unaffected

  11. Characterizing the relationship between temperature and mortality in tropical and subtropical cities: a distributed lag non-linear model analysis in Hue, Viet Nam, 2009–2013

    PubMed Central

    Dang, Tran Ngoc; Seposo, Xerxes T.; Duc, Nguyen Huu Chau; Thang, Tran Binh; An, Do Dang; Hang, Lai Thi Minh; Long, Tran Thanh; Loan, Bui Thi Hong; Honda, Yasushi

    2016-01-01

    Background The relationship between temperature and mortality has been found to be U-, V-, or J-shaped in developed temperate countries; however, in developing tropical/subtropical cities, it remains unclear. Objectives Our goal was to investigate the relationship between temperature and mortality in Hue, a subtropical city in Viet Nam. Design We collected daily mortality data from the Vietnamese A6 mortality reporting system for 6,214 deceased persons between 2009 and 2013. A distributed lag non-linear model was used to examine the temperature effects on all-cause and cause-specific mortality by assuming a negative binomial distribution for the count data. We developed an objective-oriented model selection procedure with four steps following the Akaike information criterion (AIC) rule (i.e. a smaller AIC value indicates a better model). Results High temperature-related mortality was more strongly associated with short lags, whereas low temperature-related mortality was more strongly associated with long lags. Low temperatures increased mortality risk across all categories more than high temperatures did. We observed elevated temperature-mortality risk in vulnerable groups: elderly people (high temperature effect, relative risk [RR]=1.42, 95% confidence interval [CI]=1.11–1.83; low temperature effect, RR=2.0, 95% CI=1.13–3.52), females (low temperature effect, RR=2.19, 95% CI=1.14–4.21), people with respiratory disease (high temperature effect, RR=2.45, 95% CI=0.91–6.63), and those with cardiovascular disease (high temperature effect, RR=1.6, 95% CI=1.15–2.22; low temperature effect, RR=1.99, 95% CI=0.92–4.28). Conclusions In Hue, the temperature significantly increased the risk of mortality, especially in vulnerable groups (i.e. elderly, female, people with respiratory and cardiovascular diseases). These findings may provide a foundation for developing adequate policies to address the effects of temperature on health in Hue City. PMID:26781954

  12. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been the topic of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study on the temporal statistics of earthquake interoccurrence times of the seismically active Kachchh peninsula (western India) from thirteen probability distributions. Those distributions are exponential, gamma, lognormal, Weibull, Levy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Frechet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull distributions. Statistical inferences of the scale and shape parameters of these distributions are discussed from the maximum likelihood estimations and the Fisher information matrices. The latter are used as a surrogate tool to appraise the parametric uncertainty in the estimation process. The results were found on the basis of two goodness-of-fit tests: the maximum likelihood criterion with its modification to Akaike information criterion (AIC) and the Kolmogorov-Smirnov (K-S) minimum distance criterion. These results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Levy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit poorly to the earthquake catalog of Kachchh and its adjacent regions. This study also analyzes the present-day seismicity in terms of the estimated recurrence interval and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications in a variety of practical applications including earthquake insurance, seismic zonation
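
    The distribution comparison described here can be sketched with scipy: fit each candidate by maximum likelihood, then rank by AIC and the Kolmogorov-Smirnov statistic. The interevent times below are synthetic (exponentially generated, mimicking the study's best-fitting model), and only a subset of the thirteen distributions is shown.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Synthetic interevent times (days), purely for illustration.
    data = rng.exponential(scale=180.0, size=300)

    candidates = {
        "exponential":      stats.expon,
        "gamma":            stats.gamma,
        "lognormal":        stats.lognorm,
        "Weibull":          stats.weibull_min,
        "inverse Gaussian": stats.invgauss,
    }

    for name, dist in candidates.items():
        params = dist.fit(data, floc=0)             # MLE with the location fixed at 0
        loglik = np.sum(dist.logpdf(data, *params))
        k = len(params) - 1                         # the fixed location is not free
        aic = 2 * k - 2 * loglik
        ks = stats.kstest(data, dist.cdf, args=params).statistic
        print(f"{name:17s} AIC = {aic:9.1f}   K-S distance = {ks:.3f}")
    ```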

  13. ActiveSeismoPick3D - automatic first arrival determination for large active seismic arrays

    NASA Astrophysics Data System (ADS)

    Paffrath, Marcel; Küperkoch, Ludger; Wehling-Benatelli, Sebastian; Friederich, Wolfgang

    2016-04-01

    We developed a tool for automatic determination of first arrivals in active seismic data based on an approach that utilises higher order statistics (HOS) and the Akaike information criterion (AIC), commonly used in seismology but not in active seismics. Automatic picking is highly desirable in active seismics, as the amount of data provided by large seismic arrays rapidly exceeds what an analyst can evaluate in a reasonable amount of time. To bring the functionality of automatic phase picking into the context of active data, the software package ActiveSeismoPick3D was developed in Python. It uses a modified algorithm for the determination of first arrivals which searches for the HOS maximum in unfiltered data. Additionally, it offers tools for manual quality control and postprocessing, e.g. various visualisation and repicking functionalities. For flexibility, the tool also includes methods for the preparation of geometry information of large seismic arrays and improved interfaces to the Fast Marching Tomography Package (FMTOMO), which can be used for the prediction of travel times and inversion for subsurface properties. Output files are generated in the VTK format, allowing the 3D visualisation of e.g. the inversion results. As a test case, a data set consisting of 9216 traces from 64 shots was gathered, recorded at 144 receivers deployed in a regular 2D array of a size of 100 x 100 m. ActiveSeismoPick3D automatically checks the determined first arrivals by a dynamic signal-to-noise ratio threshold. From the data, a 3D model of the subsurface was generated using the export functionality of the package and FMTOMO.
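
    A minimal version of the classic AIC onset picker (splitting the trace at every sample and minimising the summed log-variance term) is easy to sketch. The trace below is synthetic noise followed by a higher-amplitude arrival; the HOS preprocessing and quality checks of ActiveSeismoPick3D are not reproduced here.

    ```python
    import numpy as np

    def aic_onset_pick(trace):
        """Classic AIC picker: split the trace at every sample k and compute
        AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:])); the minimum marks
        the most likely transition from noise to signal."""
        x = np.asarray(trace, dtype=float)
        n = x.size
        aic = np.full(n, np.nan)
        for k in range(2, n - 2):
            v1, v2 = np.var(x[:k]), np.var(x[k:])
            if v1 > 0.0 and v2 > 0.0:
                aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
        return int(np.nanargmin(aic)), aic

    # Synthetic trace: Gaussian noise followed by a stronger arrival at sample 300.
    rng = np.random.default_rng(5)
    trace = np.concatenate([rng.normal(0, 1.0, 300), rng.normal(0, 8.0, 200)])

    onset, _ = aic_onset_pick(trace)
    print(f"picked onset at sample {onset} (true onset at sample 300)")
    ```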

  14. Predictability of Western Himalayan river flow: melt seasonal inflow into Bhakra Reservoir in northern India

    NASA Astrophysics Data System (ADS)

    Pal, I.; Lall, U.; Robertson, A. W.; Cane, M. A.; Bansal, R.

    2013-06-01

    Snowmelt-dominated streamflow of the Western Himalayan rivers is an important water resource during the dry pre-monsoon spring months to meet the irrigation and hydropower needs in northern India. Here we study the seasonal prediction of melt-dominated total inflow into the Bhakra Dam in northern India based on statistical relationships with meteorological variables during the preceding winter. Total inflow into the Bhakra Dam includes the Satluj River flow together with a flow diversion from its tributary, the Beas River. Both are tributaries of the Indus River that originate from the Western Himalayas, which is an under-studied region. Average measured winter snow volume at the upper-elevation stations and corresponding lower-elevation rainfall and temperature of the Satluj River basin were considered as empirical predictors. Akaike information criteria (AIC) and Bayesian information criteria (BIC) were used to select the best subset of inputs from all the possible combinations of predictors for a multiple linear regression framework. To test for potential issues arising due to multicollinearity of the predictor variables, cross-validated prediction skills of the best subset were also compared with the prediction skills of principal component regression (PCR) and partial least squares regression (PLSR) techniques, which yielded broadly similar results. As a whole, the forecasts of the melt season at the end of winter and as the melt season commences were shown to have potential skill for guiding the development of stochastic optimization models to manage the trade-off between irrigation and hydropower releases versus flood control during the annual fill cycle of the Bhakra Reservoir, a major energy and irrigation source in the region.
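
    Best-subset selection by AIC/BIC over a small predictor pool, as described above, can be sketched by exhaustive enumeration with statsmodels OLS. The predictor names and data below are invented stand-ins for the snow, rainfall, and temperature records.

    ```python
    import itertools
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(11)

    # Hypothetical winter predictors of melt-season inflow.
    n = 40
    df = pd.DataFrame({
        "snow": rng.normal(100, 20, n),
        "rain": rng.normal(300, 60, n),
        "temp": rng.normal(5, 2, n),
    })
    df["inflow"] = 2.0 * df["snow"] + 0.5 * df["rain"] + rng.normal(0, 25, n)

    predictors = ["snow", "rain", "temp"]
    best = None
    for r in range(1, len(predictors) + 1):
        for subset in itertools.combinations(predictors, r):
            X = sm.add_constant(df[list(subset)])
            fit = sm.OLS(df["inflow"], X).fit()
            print(f"{str(subset):26s} AIC = {fit.aic:7.1f}   BIC = {fit.bic:7.1f}")
            if best is None or fit.aic < best[0]:
                best = (fit.aic, subset)

    print("best subset by AIC:", best[1])
    ```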

  15. Predictability of Western Himalayan River flow: melt seasonal inflow into Bhakra Reservoir in Northern India

    NASA Astrophysics Data System (ADS)

    Pal, I.; Lall, U.; Robertson, A. W.; Cane, M. A.; Bansal, R.

    2012-07-01

    Snowmelt dominated streamflow of the Western Himalayan Rivers is an important water resource during the dry pre-monsoon spring months to meet the irrigation and hydropower needs in Northern India. Here we study the seasonal prediction of melt-dominated total inflow into the Bhakra Dam in Northern India based on statistical relationships with meteorological variables during the preceding winter. Total inflow into the Bhakra dam includes the Satluj River flow together with a flow diversion from its tributary, the Beas River. Both are tributaries of the Indus River that originate from the Western Himalayas, which is an under-studied region. Average measured winter snow volume at the upper elevation stations and corresponding lower elevation rainfall and temperature of the Satluj River basin were considered as empirical predictors. Akaike Information Criteria (AIC) and Bayesian Information Criteria (BIC) were used to select the best subset of inputs from all the possible combinations of predictors for a multiple linear regression framework. To test for potential issues arising due to multi-collinearity of the predictor variables, cross-validated prediction skills of best subset were also compared with the prediction skills of Principal Component Regression (PCR) and Partial Least Squares Regression (PLSR) techniques, which yielded broadly similar results. As a whole, the forecasts of the melt season at the end of winter and as the melt season commences were shown to have potential skill for guiding the development of stochastic optimization models to manage the trade-off between irrigation and hydropower releases versus flood control during the annual fill cycle of the Bhakra reservoir, a major energy and irrigation source in the region.

  16. How good is crude MDL for solving the bias-variance dilemma? An empirical investigation based on Bayesian networks.

    PubMed

    Cruz-Ramírez, Nicandro; Acosta-Mesa, Héctor Gabriel; Mezura-Montes, Efrén; Guerra-Hernández, Alejandro; Hoyos-Rivera, Guillermo de Jesús; Barrientos-Martínez, Rocío Erandi; Gutiérrez-Fragoso, Karina; Nava-Fernández, Luis Alonso; González-Gaspar, Patricia; Novoa-del-Toro, Elva María; Aguilera-Rueda, Vicente Josué; Ameca-Alducin, María Yaneli

    2014-01-01

    The bias-variance dilemma is a well-known and important problem in Machine Learning. It basically relates the generalization capability (goodness of fit) of a learning method to its corresponding complexity. When we have enough data at hand, it is possible to use these data in such a way as to minimize overfitting (the risk of selecting a complex model that generalizes poorly). Unfortunately, there are many situations where we simply do not have this required amount of data. Thus, we need to find methods capable of efficiently exploiting the available data while avoiding overfitting. Different metrics have been proposed to achieve this goal: the Minimum Description Length principle (MDL), Akaike's Information Criterion (AIC) and Bayesian Information Criterion (BIC), among others. In this paper, we focus on crude MDL and empirically evaluate its performance in selecting models with a good balance between goodness of fit and complexity: the so-called bias-variance dilemma, decomposition or tradeoff. Although the graphical interaction between these dimensions (bias and variance) is ubiquitous in the Machine Learning literature, few works present experimental evidence to recover such interaction. In our experiments, we argue that the resulting graphs allow us to gain insights that are difficult to unveil otherwise: that crude MDL naturally selects balanced models in terms of bias-variance, which need not necessarily be the gold-standard ones. We carry out these experiments using a specific model: a Bayesian network. In spite of these motivating results, we should also not overlook three other components that may significantly affect the final model selection: the search procedure, the noise rate and the sample size. PMID:24671204
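
    For reference, the penalized-likelihood criteria compared in this abstract are commonly written as follows, with k the number of free parameters, n the sample size, and L-hat the maximized likelihood; the crude two-part MDL expression shown is one common textbook form and may differ in detail from the authors' implementation.

    ```latex
    \mathrm{AIC} = 2k - 2\ln\hat{L}, \qquad
    \mathrm{BIC} = k\ln n - 2\ln\hat{L}, \qquad
    \mathrm{MDL}_{\text{crude}} \approx -\ln\hat{L} + \frac{k}{2}\ln n .
    ```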

  17. Morphological assessment on day 4 and its prognostic power in selecting viable embryos for transfer.

    PubMed

    Fabozzi, Gemma; Alteri, Alessandra; Rega, Emilia; Starita, Maria Flavia; Piscitelli, Claudio; Giannini, Pierluigi; Colicchia, Antonio

    2016-08-01

    The aim of this study was to describe a system for embryo morphology scoring at the morula stage and to determine the efficiency of this model in selecting viable embryos for transfer. In total, 519 embryos from 122 patients undergoing intracytoplasmic sperm injection (ICSI) were scored retrospectively on day 4 according to the grading system proposed in this article. Two separate quality scores were assigned to each embryo in relation to the grade of compaction and fragmentation, and their developmental fate was then observed on days 5 and 6. Secondly, the predictive value of this scoring system was compared with that of the traditional scoring system adopted on day 3. Morulas classified as grade A showed a significantly higher blastocyst formation rate (87.2%) compared with grades B, C and D (63.8, 41.3 and 15.0%, respectively) (P < 0.001). Furthermore, the ability to form top quality blastocysts was significantly higher for grade A morulas than for grade B and for grades C and D (37.8% vs. 22.4% vs. 11.1%) (P < 0.001). Finally, the morula scoring system showed greater predictive power than the day-3 embryo scoring system [Akaike information criterion (AIC) index 16.4 vs. 635.3 and Bayesian information criterion (BIC) index -68.8 vs. -30.0 for morulas and embryos, respectively]. In conclusion, the results demonstrated that the presented scoring system allows for the evaluation of eligible embryos for transfer, as a significant correlation between morula grade, blastulation rate and blastocyst quality was observed. Furthermore, the morula scoring system was shown to be the best predictive model when compared with the traditional scoring system performed on day 3. PMID:26350430

  18. Model selection, zero-inflated models, and predictors of primate abundance in Korup National Park, Cameroon.

    PubMed

    Linder, Joshua M; Lawler, Richard R

    2012-11-01

    Determining the ecological and anthropogenic factors that shape the abundance and distribution of wild primates is a critical component of primate conservation research. Such research is complicated, however, whenever the species under study are encountered infrequently, a characteristic of many taxa that are threatened with extinction. Typically, the resulting data sets based on surveys of such species will have a high frequency of zero counts which makes it difficult to determine the predictor variables that are associated with species abundance. In this study, we test various statistical models using survey data that was gathered on seven species of primate in Korup National Park, Cameroon. Predictor variables include hunting signs and aspects of habitat structure and floristic composition. Our statistical models include zero-inflated models that are tailored to deal with a high frequency of zero counts. First, using exploratory data analysis we found the most informative set of models as ranked by Δ-AIC (Akaike's information criterion). On the basis of this analysis, we used five predictor variables to construct several regression models including Poisson, zero-inflated Poisson, negative binomial, and zero-inflated negative binomial. Total basal area of all trees, density of secondary tree species, hunting signs, and mean basal area of all trees were significant predictors of abundance in the zero-inflated models. We discuss the statistical logic behind zero-inflated models and provide an interpretation of parameter estimates. We recommend that researchers explore a variety of models when determining the factors that correlate with primate abundance. PMID:22991216

  19. Productivity, embryo and eggshell characteristics, and contaminants in bald eagles from the Great Lakes, USA, 1986 to 2000.

    PubMed

    Best, David A; Elliott, Kyle H; Bowerman, William W; Shieldcastle, Mark; Postupalsky, Sergej; Kubiak, Timothy J; Tillitt, Donald E; Elliott, John E

    2010-07-01

    Chlorinated hydrocarbon concentrations in eggs of fish-eating birds from contaminated environments such as the Great Lakes of North America tend to be highly intercorrelated, making it difficult to elucidate mechanisms causing reproductive impairment, and to ascribe cause to specific chemicals. An information-theoretic approach was used on data from 197 salvaged bald eagle (Haliaeetus leucocephalus) eggs (159 clutches) that failed to hatch in Michigan and Ohio, USA (1986-2000). Contaminant levels declined over time while eggshell thickness increased, and by 2000 was at pre-1946 levels. The number of occupied territories and productivity increased during 1981 to 2004. For both the entire dataset and a subset of nests along the Great Lakes shoreline, polychlorinated biphenyls (ΣPCBs, fresh wet wt) were generally included in the most parsimonious models (lowest Akaike's information criterion [AIC]) describing productivity, with significant declines in productivity observed above 26 µg/g ΣPCBs (fresh wet wt). Of 73 eggs with a visible embryo, eight (11%) were abnormal, including three with skewed bills, but they were not associated with known teratogens, including ΣPCBs. Eggs with visible embryos had greater concentrations of all measured contaminants than eggs without visible embryos; the most parsimonious models describing the presence of visible embryos incorporated dieldrin equivalents and dichlorodiphenyldichloroethylene (DDE). There were significant negative correlations between eggshell thickness and all contaminants, with ΣPCBs included in the most parsimonious models. There were, however, no relationships between productivity and eggshell thickness or Ratcliffe's index. The ΣPCBs and DDE were negatively associated with nest success of bald eagles in the Great Lakes watersheds, but the mechanism does not appear to be via shell quality effects, at least at current contaminant levels, while it is not clear what other mechanisms were

  20. Selective Constraints on Amino Acids Estimated by a Mechanistic Codon Substitution Model with Multiple Nucleotide Changes

    PubMed Central

    Miyazawa, Sanzo

    2011-01-01

    Background Empirical substitution matrices represent the average tendencies of substitutions over various protein families by sacrificing gene-level resolution. We develop a codon-based model, in which mutational tendencies of codon, a genetic code, and the strength of selective constraints against amino acid replacements can be tailored to a given gene. First, selective constraints averaged over proteins are estimated by maximizing the likelihood of each 1-PAM matrix of empirical amino acid (JTT, WAG, and LG) and codon (KHG) substitution matrices. Then, selective constraints specific to given proteins are approximated as a linear function of those estimated from the empirical substitution matrices. Results Akaike information criterion (AIC) values indicate that a model allowing multiple nucleotide changes fits the empirical substitution matrices significantly better. Also, the ML estimates of transition-transversion bias obtained from these empirical matrices are not so large as previously estimated. The selective constraints are characteristic of proteins rather than species. However, their relative strengths among amino acid pairs can be approximated not to depend very much on protein families but amino acid pairs, because the present model, in which selective constraints are approximated to be a linear function of those estimated from the JTT/WAG/LG/KHG matrices, can provide a good fit to other empirical substitution matrices including cpREV for chloroplast proteins and mtREV for vertebrate mitochondrial proteins. Conclusions/Significance The present codon-based model with the ML estimates of selective constraints and with adjustable mutation rates of nucleotide would be useful as a simple substitution model in ML and Bayesian inferences of molecular phylogenetic trees, and enables us to obtain biologically meaningful information at both nucleotide and amino acid levels from codon and protein sequences. PMID:21445250

  1. Density dependence and risk of extinction in a small population of sea otters

    USGS Publications Warehouse

    Gerber, L.R.; Buenau, K.E.; VanBlaricom, G.

    2004-01-01

    Sea otters (Enhydra lutris (L.)) were hunted to extinction off the coast of Washington State early in the 20th century. A new population was established by translocations from Alaska in 1969 and 1970. The population currently numbers at least 550 animals. A major threat to the population is the ongoing risk of major oil spills in sea otter habitat. We apply population models to census and demographic data in order to evaluate the status of the population. We fit several density dependent models to test for density dependence and determine plausible values for the carrying capacity (K) by comparing model goodness of fit to an exponential model. Model fits were compared using the Akaike Information Criterion (AIC). A significant negative relationship was found between the population growth rate and population size (r² = 0.27, F = 5.57, df = 16, p < 0.05), suggesting density dependence in Washington State sea otters. Information criterion statistics suggest that the Beverton-Holt model is the most parsimonious, followed closely by the logistic model. Values of K ranged from 612 to 759, with best-fit parameter estimates for the Beverton-Holt model of 0.26 for r and 612 for K. The latest (2001) population index count (555) puts the population at 87-92% of the estimated carrying capacity, above the suggested range for optimum sustainable population (OSP). Elasticity analysis was conducted to examine the effects of proportional changes in vital rates on the population growth rate (λ). The elasticity values indicate the population is most sensitive to changes in survival rates (particularly adult survival).
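
    A sketch of the kind of comparison described above: fit exponential, logistic (Ricker-type), and Beverton-Holt growth models to successive counts by least squares and rank them with AIC. The counts below are invented for illustration and are not the Washington survey data.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Invented annual counts, purely for illustration.
    counts = np.array([210., 240., 265., 300., 330., 360., 395., 430.,
                       460., 490., 515., 540., 555.])
    N_t, N_next = counts[:-1], counts[1:]

    def exponential(N, r):
        return N * np.exp(r)

    def logistic(N, r, K):
        return N * np.exp(r * (1.0 - N / K))          # Ricker-type logistic map

    def beverton_holt(N, r, K):
        return np.exp(r) * N / (1.0 + (np.exp(r) - 1.0) * N / K)

    def aic_least_squares(y, yhat, n_params):
        n = y.size
        rss = np.sum((y - yhat) ** 2)
        return n * np.log(rss / n) + 2 * n_params

    models = [("exponential", exponential, (0.1,)),
              ("logistic", logistic, (0.3, 700.0)),
              ("Beverton-Holt", beverton_holt, (0.3, 700.0))]

    for name, f, p0 in models:
        popt, _ = curve_fit(f, N_t, N_next, p0=p0, maxfev=10000)
        aic = aic_least_squares(N_next, f(N_t, *popt), len(popt))
        print(f"{name:14s} params = {np.round(popt, 3)}  AIC = {aic:.1f}")
    ```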

  2. Broad-scale predictors of canada lynx occurrence in eastern North America

    USGS Publications Warehouse

    Hoving, C.L.; Harrison, D.J.; Krohn, W.B.; Joseph, R.A.; O'Brien, M.

    2005-01-01

    The Canada lynx (Lynx canadensis) is listed as a threatened species throughout the southern extent of its geographic range in the United States. Most research on lynx has been conducted in the western United States and Canada; little is known about the ecology of lynx in eastern North America. To fill critical knowledge gaps about this species, we modeled and mapped lynx occurrence using habitat and weather data from 7 eastern states and 3 Canadian provinces. Annual snowfall, road density, bobcat (L. rufus) harvest, deciduous forest, and coniferous forest were compared at 1,150 lynx locations and 1,288 random locations. Nineteen a priori models were developed using the information-theoretic approach, and logistic regression models were ranked using Akaike's Information Criterion (AIC) and by our ability to correctly classify reserved data (Kappa). Annual snowfall and deciduous forest predicted lynx presence and absence for a reserved dataset (n = 278) with 94% accuracy. A map of the probability of lynx occurrence throughout the region revealed that 92% of the potential habitat (i.e., >50% probability of occurrence) was concentrated in a relatively contiguous complex encompassing northern Maine, New Brunswick, and the Gaspé peninsula of Quebec. Most of the remaining potential habitat (5%) was on northern Cape Breton Island in Nova Scotia. Potential habitat in New Hampshire, Vermont, and New York was small (1,252 km²), fragmented, and isolated (>200 km) from known lynx populations. When federally listed as threatened in the contiguous United States in 2000, inadequate regulations on federal lands were cited as the primary threat to Canada lynx. However, the majority of potential lynx habitat in the eastern United States is on private lands and continuous with potential habitat in Canada. Therefore, lynx conservation in eastern North America will need to develop partnerships across national, state, and provincial boundaries as well as with private landowners.

  3. Reassessment of the 2010–2011 Haiti cholera outbreak and rainfall-driven multiseason projections

    PubMed Central

    Rinaldo, Andrea; Bertuzzo, Enrico; Mari, Lorenzo; Righetto, Lorenzo; Blokesch, Melanie; Gatto, Marino; Casagrandi, Renato; Murray, Megan; Vesenbeckh, Silvan M.; Rodriguez-Iturbe, Ignacio

    2012-01-01

    Mathematical models can provide key insights into the course of an ongoing epidemic, potentially aiding real-time emergency management in allocating health care resources and by anticipating the impact of alternative interventions. We study the ex post reliability of predictions of the 2010–2011 Haiti cholera outbreak from four independent modeling studies that appeared almost simultaneously during the unfolding epidemic. We consider the impact of different approaches to the modeling of spatial spread of Vibrio cholerae and mechanisms of cholera transmission, accounting for the dynamics of susceptible and infected individuals within different local human communities. To explain resurgences of the epidemic, we go on to include waning immunity and a mechanism explicitly accounting for rainfall as a driver of enhanced disease transmission. The formal comparative analysis is carried out via the Akaike information criterion (AIC) to measure the added information provided by each process modeled, discounting for the added parameters. A generalized model for Haitian epidemic cholera and the related uncertainty is thus proposed and applied to the year-long dataset of reported cases now available. The model allows us to draw predictions on longer-term epidemic cholera in Haiti from multiseason Monte Carlo runs, carried out up to January 2014 by using suitable rainfall fields forecasts. Lessons learned and open issues are discussed and placed in perspective. We conclude that, despite differences in methods that can be tested through model-guided field validation, mathematical modeling of large-scale outbreaks emerges as an essential component of future cholera epidemic control. PMID:22505737

  4. Data-driven input variable selection for rainfall-runoff modeling using binary-coded particle swarm optimization and Extreme Learning Machines

    NASA Astrophysics Data System (ADS)

    Taormina, Riccardo; Chau, Kwok-Wing

    2015-10-01

    Selecting an adequate set of inputs is a critical step for successful data-driven streamflow prediction. In this study, we present a novel approach for Input Variable Selection (IVS) that employs Binary-coded discrete Fully Informed Particle Swarm optimization (BFIPS) and Extreme Learning Machines (ELM) to develop fast and accurate IVS algorithms. A scheme is employed to encode the subset of selected inputs and ELM specifications into the binary particles, which are evolved using single objective and multi-objective BFIPS optimization (MBFIPS). The performances of these ELM-based methods are assessed using the evaluation criteria and the datasets included in the comprehensive IVS evaluation framework proposed by Galelli et al. (2014). From a comparison with 4 major IVS techniques used in their original study it emerges that the proposed methods compare very well in terms of selection accuracy. The best performers were found to be (1) a MBFIPS-ELM algorithm based on the concurrent minimization of an error function and the number of selected inputs, and (2) a BFIPS-ELM algorithm based on the minimization of a variant of the Akaike Information Criterion (AIC). The first technique is arguably the most accurate overall, and is able to reach an almost perfect specification of the optimal input subset for a partially synthetic rainfall-runoff experiment devised for the Kentucky River basin. In addition, MBFIPS-ELM allows for the determination of the relative importance of the selected inputs. On the other hand, the BFIPS-ELM is found to consistently reach high accuracy scores while being considerably faster. By extrapolating the results obtained on the IVS test-bed, it can be concluded that the proposed techniques are particularly suited for rainfall-runoff modeling applications characterized by high nonlinearity in the catchment dynamics.

  5. Reassessment of the 2010-2011 Haiti cholera outbreak and rainfall-driven multiseason projections.

    PubMed

    Rinaldo, Andrea; Bertuzzo, Enrico; Mari, Lorenzo; Righetto, Lorenzo; Blokesch, Melanie; Gatto, Marino; Casagrandi, Renato; Murray, Megan; Vesenbeckh, Silvan M; Rodriguez-Iturbe, Ignacio

    2012-04-24

    Mathematical models can provide key insights into the course of an ongoing epidemic, potentially aiding real-time emergency management in allocating health care resources and by anticipating the impact of alternative interventions. We study the ex post reliability of predictions of the 2010-2011 Haiti cholera outbreak from four independent modeling studies that appeared almost simultaneously during the unfolding epidemic. We consider the impact of different approaches to the modeling of spatial spread of Vibrio cholerae and mechanisms of cholera transmission, accounting for the dynamics of susceptible and infected individuals within different local human communities. To explain resurgences of the epidemic, we go on to include waning immunity and a mechanism explicitly accounting for rainfall as a driver of enhanced disease transmission. The formal comparative analysis is carried out via the Akaike information criterion (AIC) to measure the added information provided by each process modeled, discounting for the added parameters. A generalized model for Haitian epidemic cholera and the related uncertainty is thus proposed and applied to the year-long dataset of reported cases now available. The model allows us to draw predictions on longer-term epidemic cholera in Haiti from multiseason Monte Carlo runs, carried out up to January 2014 by using suitable rainfall fields forecasts. Lessons learned and open issues are discussed and placed in perspective. We conclude that, despite differences in methods that can be tested through model-guided field validation, mathematical modeling of large-scale outbreaks emerges as an essential component of future cholera epidemic control. PMID:22505737

  6. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence

    NASA Astrophysics Data System (ADS)

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-12-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.
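
    As a toy illustration of the contrast the study draws between brute-force numerical evidence and information-criterion approximations, the sketch below estimates Bayesian model evidence for a one-parameter Gaussian-mean model by Monte Carlo integration over the prior and compares it with a BIC-based approximation. The prior, data, and sample sizes are invented for the example and have no connection to the hydrological models in the paper.

```python
# Sketch: brute-force Monte Carlo BME versus a BIC approximation for a toy Gaussian model.
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=0.8, scale=1.0, size=30)
n = len(data)

def log_likelihood(mu):
    # Gaussian likelihood with known unit variance
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((data - mu) ** 2)

# Brute-force Monte Carlo over the prior N(0, 2^2)
mus = rng.normal(0.0, 2.0, size=50_000)
loglik = np.array([log_likelihood(mu) for mu in mus])
log_bme_mc = np.log(np.mean(np.exp(loglik - loglik.max()))) + loglik.max()  # stable average

# BIC approximation: log evidence ~ max log-likelihood - 0.5 * k * log(n), with k = 1
log_bme_bic = log_likelihood(data.mean()) - 0.5 * 1 * np.log(n)

print("log BME (Monte Carlo):", log_bme_mc)
print("log BME (via BIC):    ", log_bme_bic)
```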

  7. Neutrophil/Lymphocyte Ratio, Lymphocyte/Monocyte Ratio, and Absolute Lymphocyte Count/Absolute Monocyte Count Prognostic Score in Diffuse Large B-Cell Lymphoma

    PubMed Central

    Ho, Ching-Liang; Lu, Chieh-Sheng; Chen, Jia-Hong; Chen, Yu-Guang; Huang, Tzu-Chuan; Wu, Yi-Ying

    2015-01-01

    Abstract The neutrophil/lymphocyte ratio (NLR), lymphocyte/monocyte ratio (LMR), and absolute lymphocyte count/absolute monocyte count prognostic score (ALC/AMC PS) have been described as the most useful prognostic tools for patients with diffuse large B-cell lymphoma (DLBCL). We retrospectively analyzed 148 Taiwanese patients with newly diagnosed diffuse large B-cell lymphoma treated with rituximab (R)-CHOP-like regimens from January 2001 to December 2010 at the Tri-Service General Hospital and investigated the utility of these inexpensive tools in our patients. In a univariate analysis, the NLR, LMR, and ALC/AMC PS had significant prognostic value in our DLBCL patients (NLR: 5-year progression-free survival [PFS], P = 0.001; 5-year overall survival [OS], P = 0.007. LMR: PFS, P = 0.003; OS, P = 0.05. ALC/AMC PS: PFS, P < 0.001; OS, P < 0.001). In a separate multivariate analysis, the ALC/AMC PS appeared to interact less with the other clinical factors but retained statistical significance in the survival analysis (PFS, P = 0.023; OS, P = 0.017). The Akaike information criterion (AIC) analysis produced scores of 388.773 for the NLR, 387.625 for the LMR, and 372.574 for the ALC/AMC PS. The results suggested that the ALC/AMC PS appears to be more reliable than the NLR and LMR and may provide additional prognostic information when used in conjunction with the International Prognostic Index.

  8. Ordinal time series analysis for Air Quality Index (AQI) in San Bernardino County

    NASA Astrophysics Data System (ADS)

    Chitakasempornkul, Kessinee

    Ambient pollution, especially ground-level ozone that causes respiratory diseases, has been a great concern in Southern California. The U.S. Environmental Protection Agency provides the Air Quality Index (AQI) as a tool to inform the public of health risks. The AQI for ozone is currently divided into six states depending on the level of public health concern. From a statistical point of view, the AQI can be characterized as a nonstationary ordinal-valued time series. The purpose of this study is to implement statistical models for short-term forecasting of the AQI. This thesis presents a generalized linear type of modeling to handle the autocorrelated ordinal time series. The model is applied with four different link functions, identity, logit, probit, and complementary log-log, and their forecast performances are compared. Random time-varying covariates include the past AQI state, various meteorological processes, and a periodic component. Data used in this study are AQI values for ozone from five monitoring stations in San Bernardino County, CA, for 2004 to 2006. To evaluate the performance of one-day-ahead forecasts, the 2007 data from the same stations are used. The meteorological data are from the nearby city of Barstow in San Bernardino County. The portmanteau test is used to test for error autocorrelations. The partial likelihood ratio test, Akaike information criterion (AIC), and Bayesian information criterion (BIC) are used to measure goodness of fit and compare the models. The results show that the model captures the nonstationarity in the ozone process well and removes the nonstationarity in the residuals. Both the logit and probit models correctly forecast about 85% of the observed AQI states.
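
    A hedged sketch of the link-function comparison by AIC follows, simplified to a binary "unhealthy AQI" indicator with statsmodels rather than the full ordinal model used in the thesis; the covariates and data are synthetic placeholders.

```python
# Sketch: comparing logit, probit, and complementary log-log links by AIC (binary simplification).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
temp = rng.normal(30, 5, n)            # stand-ins for meteorological covariates
wind = rng.normal(10, 3, n)
eta = -8 + 0.25 * temp - 0.1 * wind
y = rng.binomial(1, 1 / (1 + np.exp(-eta)))
X = sm.add_constant(np.column_stack([temp, wind]))

links = {
    "logit":   sm.families.links.Logit(),
    "probit":  sm.families.links.Probit(),
    "cloglog": sm.families.links.CLogLog(),
}
for name, link in links.items():
    res = sm.GLM(y, X, family=sm.families.Binomial(link=link)).fit()
    print(f"{name:8s} AIC = {res.aic:.1f}")
```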

  9. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence

    PubMed Central

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-01-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible. PMID:25745272

  10. Factors associated with utilization of antenatal care services in Balochistan province of Pakistan: An analysis of the Multiple Indicator Cluster Survey (MICS) 2010

    PubMed Central

    Ghaffar, Abdul; Pongponich, Sathirakorn; Ghaffar, Najma; Mehmood, Tahir

    2015-01-01

    Objective: The study was conducted to identify factors affecting the utilization of Antenatal Care (ANC) in Balochistan Province, Pakistan. Methods: Data on ANC utilization, together with social and economic determinants, were derived from a Multiple Indicator Cluster Survey (MICS) conducted in Balochistan in 2010. The analysis included 2339 women who gave birth in the two years preceding the survey. The researchers established a logistic regression model to identify influential factors contributing to the utilization of ANC; model selection was by the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Results: Household wealth, education, health condition, age at first marriage, number of children, and justification of spousal violence were found to be significantly associated with ANC coverage. Literate mothers are 2.45 times more likely to have ANC, and women whose newborns showed symptoms of illness at birth that needed hospitalization are 0.47 times less likely to access ANC. Women with more surviving children are 1.07 times less likely to have ANC, and those who consider spousal violence socially justified are 1.36 times less likely to have ANC. The results draw attention to the need for evidence-based planning around the factors associated with utilization of ANC in Balochistan province. Conclusion: The study reveals that women with a high wealth index and with education had more chances of receiving ANC. Factors such as a younger age of the woman at first marriage, an increased number of children, symptoms of illness in the neonate at birth that needed hospitalization, and the justification of spousal violence were associated with lower chances of receiving ANC. Among the components of ANC, urine sampling and receiving tetanus toxoid (TT) in the last pregnancy increased the frequency of visits, whereas ANC from a doctor decreased the number of visits. There is a dire need to reduce disparities in wealth index, education, and urban/rural living. PMID:26870113

  11. Evaluation of the models handling heterotachy in phylogenetic inference

    PubMed Central

    Zhou, Yan; Rodrigue, Nicolas; Lartillot, Nicolas; Philippe, Hervé

    2007-01-01

    Background The evolutionary rate at a given homologous position varies across time. When sufficiently pronounced, this phenomenon – called heterotachy – may produce artefactual phylogenetic reconstructions under the commonly used models of sequence evolution. These observations have motivated the development of models that explicitly recognize heterotachy, with research directions proposed along two main axes: 1) the covarion approach, where sites switch from variable to invariable states; and 2) the mixture of branch lengths (MBL) approach, where alignment patterns are assumed to arise from one of several sets of branch lengths, under a given phylogeny. Results Here, we report the first statistical comparisons contrasting the performance of covarion and MBL modeling strategies. Using simulations under heterotachous conditions, we explore the properties of three model comparison methods: the Akaike information criterion, the Bayesian information criterion, and cross validation. Although more time consuming, cross validation appears more reliable than AIC and BIC as it directly measures the predictive power of a model on 'future' data. We also analyze three large datasets (nuclear proteins of animals, mitochondrial proteins of mammals, and plastid proteins of plants), and find the optimal number of components of the MBL model to be two for all datasets, indicating that this model is preferred over the standard homogeneous model. However, the covarion model is always favored over the optimal MBL model. Conclusion We demonstrated, using three large datasets, that the covarion model is more efficient at handling heterotachy than the MBL model. This is probably due to the fact that the MBL model requires a serious increase in the number of parameters, as compared to two supplementary parameters of the covarion approach. Further improvements of both the mixture and the covarion approaches might be obtained by modeling heterogeneous behavior both along time and

  12. An Information Policy for the Information Age.

    ERIC Educational Resources Information Center

    Blake, Virgil; Surprenant, Thomas

    1988-01-01

    Discusses recent federal information policies that pose a threat to access to information. A short-lived policy for protection of sensitive but unclassified information is criticized, and the Computer Security Act of 1987, currently under consideration in Congress, is described. Involvement by the library and information community in developing…

  13. Information Dissemination, Information Overload and Decision Quality.

    ERIC Educational Resources Information Center

    Hwang, Mark I.; Lin, Jerry W.

    1999-01-01

    A meta-analysis of 31 experiments reported in 18 empirical bankruptcy prediction studies was conducted to test the effect of two information dimensions: information diversity and information repetitiveness. Results indicated that both information dimensions have an adverse impact on decision quality: provision of either diverse or repeated…

  14. Advanced information society (12)

    NASA Astrophysics Data System (ADS)

    Komatsuzaki, Seisuke

    In this paper, the original Japanese idea of an "advanced information society" is first reviewed, examining the advancement of information/communication technology, the advancement of information/communication needs, and the tendency toward industrialization of information. Next, by comparing studies on the advanced information society in various countries, the Japanese characteristic of consensus building is reviewed. Finally, in considering the prospects and tasks for such a society, the paper discusses the advancement and convergence of information/communication technology, information/communication needs, the institutional environment for the utilization of information/communication, the matching of information/communication technology with needs, and countermeasures against information pollution.

  15. Size at the onset of maturity (SOM) revealed in length-weight relationships of brackish amphipods and isopods: An information theory approach

    NASA Astrophysics Data System (ADS)

    Longo, Emanuela; Mancinelli, Giorgio

    2014-01-01

    In amphipods and other small-sized crustaceans, allometric relationships are conventionally analysed by fitting the standard model Y = a·Xb (X and Y are, e.g., body length and weight, respectively) whose scaling exponent b is assumed to be constant. However, breakpoints in allometric relationships have long been documented in large-sized crustaceans, ultimately determined by ontogenetic, abrupt variations in the value of b. Here, the existence of breakpoints in length-weight relationships was investigated in four amphipod (i.e., Gammarus aequicauda, Gammarus insensibilis, Microdeutopus gryllotalpa, and Dexamine spinosa) and three isopod species (i.e., Lekanesphaera hookeri, Sphaeroma serratum, and Cymodoce truncata) from three Mediterranean lagoons. The power of two candidate linear models fitted to log10-transformed data - a simple model assuming a constant exponent b and a segmented model assuming b to vary after a breakpoint - was compared using a parsimonious selection strategy based on the Akaike information criterion. The segmented model with a breakpoint provided the most accurate fitting of length-weight data in the majority of the species analysed; non-conclusive results were obtained only for D. spinosa and C. truncata, of which a limited number of specimens was examined. Model parameters were consistent for amphipod and isopod species collected across the three different habitats; the generality of the results was further supported by a literature search confirming that the identified breakpoints corresponded with ontogenetic discontinuities related with sexual maturation in all the species investigated. In this study, segmented regression models were revealed to provide a statistically accurate and biologically meaningful description of length-weight relationships of common amphipod and isopod species. The methodological limitations of the approach are considered, while the practical implications for secondary production estimates are discussed.
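
    A minimal sketch of the kind of comparison described here, a simple log-log length-weight fit against a continuous segmented fit with one breakpoint scored by Gaussian AIC, is given below; the data are synthetic and the grid search over breakpoints is an illustrative assumption, not the authors' exact fitting procedure.

```python
# Sketch: simple versus segmented (one-breakpoint) log-log fit compared by AIC.
import numpy as np

def gaussian_aic(rss, n, k):
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(2)
logL = np.sort(rng.uniform(0.0, 1.2, 150))
b1, b2, brk = 2.2, 3.1, 0.7                      # true exponents and breakpoint
logW = -1.0 + b1 * logL + (b2 - b1) * np.maximum(logL - brk, 0) + rng.normal(0, 0.05, 150)
n = len(logL)

# Simple model: logW = a + b * logL  (k = 3, including the error variance)
X = np.column_stack([np.ones(n), logL])
beta, *_ = np.linalg.lstsq(X, logW, rcond=None)
rss_simple = np.sum((logW - X @ beta) ** 2)

# Segmented model: extra slope beyond a candidate breakpoint (k = 5)
best = (np.inf, None)
for c in np.linspace(0.2, 1.0, 81):
    Xc = np.column_stack([np.ones(n), logL, np.maximum(logL - c, 0)])
    b, *_ = np.linalg.lstsq(Xc, logW, rcond=None)
    rss = np.sum((logW - Xc @ b) ** 2)
    if rss < best[0]:
        best = (rss, c)

print("AIC simple:   ", gaussian_aic(rss_simple, n, 3))
print("AIC segmented:", gaussian_aic(best[0], n, 5), " breakpoint ~", round(best[1], 2))
```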

  16. In Vivo Evaluation of Blood Based and Reference Tissue Based PET Quantifications of [11C]DASB in the Canine Brain

    PubMed Central

    Polis, Ingeborgh; Neyt, Sara; Kersemans, Ken; Dobbeleir, Andre; Saunders, Jimmy; Goethals, Ingeborg; Peremans, Kathelijne; De Vos, Filip

    2016-01-01

    This first-in-dog study evaluates the use of the PET-radioligand [11C]DASB to image the density and availability of the serotonin transporter (SERT) in the canine brain. Imaging the serotonergic system could improve diagnosis and therapy of multiple canine behavioural disorders. Furthermore, as many similarities are reported between several human neuropsychiatric conditions and naturally occurring canine behavioural disorders, making this tracer available for use in dogs also provides researchers with an interesting non-primate animal model for investigating human disorders. Five adult beagles underwent a 90-minute dynamic PET scan, and arterial whole blood was sampled throughout the scan. For each ROI, the distribution volume (VT), obtained via the one- and two-tissue compartment models (1-TC, 2-TC) and the Logan plot, was calculated, and the goodness of fit was evaluated by the Akaike Information Criterion (AIC). For the preferred compartmental model, BPND values were estimated and compared with those derived from four reference tissue models: the 4-parameter RTM, SRTM2, MRTM2, and the Logan reference tissue model. The 2-TC model provided a better fit than the 1-TC model in 61% of the ROIs. The Logan plot produced almost identical VT values and can be used as an alternative. Compared with the 2-TC model, all investigated reference tissue models showed high correlations but small underestimations of the BPND parameter. The highest correlation was achieved with the Logan reference tissue model (Y = 0.9266x + 0.0257; R2 = 0.9722). Therefore, this model can be put forward as a non-invasive standard model for future PET experiments with [11C]DASB in dogs. PMID:26859850
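
    The selection between one- and two-compartment fits by AIC can be illustrated, in greatly simplified form, by comparing mono- and bi-exponential fits to a synthetic curve; this is only a sketch with scipy's curve_fit standing in for full compartmental modeling and has no connection to the canine data.

```python
# Sketch: preferring a one- or two-exponential model by Gaussian AIC on a synthetic curve.
import numpy as np
from scipy.optimize import curve_fit

def mono(t, a, k):
    return a * np.exp(-k * t)

def bi(t, a1, k1, a2, k2):
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

t = np.linspace(0.5, 90, 40)                      # minutes
rng = np.random.default_rng(3)
y = bi(t, 3.0, 0.30, 1.0, 0.01) + rng.normal(0, 0.05, t.size)

def gaussian_aic(residuals, k):
    n = residuals.size
    return n * np.log(np.sum(residuals ** 2) / n) + 2 * k

p_mono, _ = curve_fit(mono, t, y, p0=[3.0, 0.1])
p_bi, _ = curve_fit(bi, t, y, p0=[2.0, 0.2, 1.0, 0.02])

print("AIC 1-exp:", gaussian_aic(y - mono(t, *p_mono), k=3))
print("AIC 2-exp:", gaussian_aic(y - bi(t, *p_bi), k=5))
```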

  17. Possible Causes of a Harbour Porpoise Mass Stranding in Danish Waters in 2005

    PubMed Central

    Wright, Andrew J.; Maar, Marie; Mohn, Christian; Nabe-Nielsen, Jacob; Siebert, Ursula; Jensen, Lasse Fast; Baagøe, Hans J.; Teilmann, Jonas

    2013-01-01

    An unprecedented 85 harbour porpoises stranded freshly dead along approximately 100 km of Danish coastline from 7–15 April, 2005. This total is considerably above the mean weekly stranding rate for the whole of Denmark, both for any time of year, 1.23 animals/week (ranging from 0 to 20 during 2003–2008, excluding April 2005), and specifically in April, 0.65 animals/week (0 to 4, same period). Bycatch was established as the cause of death for most of the individuals through typical indications of fisheries interactions, including net markings in the skin and around the flippers, and loss of tail flukes. Local fishermen confirmed unusually large porpoise bycatch in nets set for lumpfish (Cyclopterus lumpus) and the strandings were attributed to an early lumpfish season. However, lumpfish catches for 2005 were not unusual in terms of season onset, peak or total catch, when compared to 2003–2008. Consequently, human activity was combined with environmental factors and the variation in Danish fisheries landings (determined through a principal component analysis) in a two-part statistical model to assess the correlation of these factors with both the presence of fresh strandings and the numbers of strandings on the Danish west coast. The final statistical model (which was forward selected using Akaike information criterion; AIC) indicated that naval presence is correlated with higher rates of porpoise strandings, particularly in combination with certain fisheries, although it is not correlated with the actual presence of strandings. Military vessels from various countries were confirmed in the area from the 7th April, en route to the largest naval exercise in Danish waters to date (Loyal Mariner 2005, 11–28 April). Although sonar usage cannot be confirmed, it is likely that ships were testing various equipment prior to the main exercise. Thus naval activity cannot be ruled out as a possible contributing factor. PMID:23460787

  18. Fine-Scale Mapping by Spatial Risk Distribution Modeling for Regional Malaria Endemicity and Its Implications under the Low-to-Moderate Transmission Setting in Western Cambodia

    PubMed Central

    Okami, Suguru; Kohtake, Naohiko

    2016-01-01

    The disease burden of malaria has decreased as malaria elimination efforts progress. The mapping approach that uses spatial risk distribution modeling needs some adjustment and reinvestigation in accordance with situational changes. Here we applied a mathematical modeling approach for the standardized morbidity ratio (SMR), calculated by annual parasite incidence using routinely aggregated surveillance reports, environmental data such as remote sensing data, and non-environmental anthropogenic data, to create fine-scale spatial risk distribution maps of western Cambodia. Furthermore, we incorporated a combination of containment status indicators into the model to demonstrate spatial heterogeneities of the relationship between containment status and risks. The explanatory model was fitted to estimate the SMR of each area (adjusted Pearson correlation coefficient R2 = 0.774; Akaike information criterion AIC = 149.423). A Bayesian modeling framework was applied to estimate the uncertainty of the model and cross-scale predictions. Fine-scale maps were created by the spatial interpolation of estimated SMRs at each village. Compared with geocoded case data, the corresponding predicted values showed conformity [Spearman's rank correlation r = 0.662 for the inverse distance weighted interpolation and 0.645 for ordinary kriging (95% confidence intervals of 0.414-0.827 and 0.368-0.813, respectively); Welch's t-test, not significant]. The proposed approach successfully explained regional malaria risks, and fine-scale risk maps were created under low-to-moderate malaria transmission settings where reinvestigation of existing risk modeling approaches was needed. Moreover, different representations of simulated outcomes of containment status indicators for respective areas provided useful insights for tailored interventional planning, considering regional malaria endemicity. PMID:27415623

  19. Assessment and Selection of Competing Models for Zero-Inflated Microbiome Data

    PubMed Central

    Xu, Lizhen; Paterson, Andrew D.; Turpin, Williams; Xu, Wei

    2015-01-01

    Typical data in a microbiome study consist of the operational taxonomic unit (OTU) counts that have the characteristic of excess zeros, which are often ignored by investigators. In this paper, we compare the performance of different competing methods to model data with zero inflated features through extensive simulations and application to a microbiome study. These methods include standard parametric and non-parametric models, hurdle models, and zero inflated models. We examine varying degrees of zero inflation, with or without dispersion in the count component, as well as different magnitudes and directions of the covariate effect on structural zeros and the count components. We focus on the assessment of type I error, power to detect the overall covariate effect, measures of model fit, and bias and effectiveness of parameter estimations. We also evaluate the abilities of model selection strategies using the Akaike information criterion (AIC) or the Vuong test to identify the correct model. The simulation studies show that hurdle and zero inflated models have well controlled type I errors, higher power, better goodness of fit measures, and are more accurate and efficient in the parameter estimation. In addition, the hurdle models have similar goodness of fit and parameter estimation for the count component as their corresponding zero inflated models. However, the estimation and interpretation of the parameters for the zero components differ, and hurdle models are more stable when structural zeros are absent. We then discuss the model selection strategy for zero inflated data and implement it in a gut microbiome study of > 400 independent subjects. PMID:26148172
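
    A hedged sketch of one such comparison, a plain Poisson fit against a zero-inflated Poisson fit scored by AIC with statsmodels, is shown below on simulated OTU-like counts; it is a simplified stand-in for the broader set of hurdle and zero-inflated models examined in the paper.

```python
# Sketch: Poisson versus zero-inflated Poisson compared by AIC on simulated zero-heavy counts.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

rng = np.random.default_rng(0)
n = 400
x = rng.normal(size=n)
mu = np.exp(0.5 + 0.6 * x)                       # count component
counts = rng.poisson(mu)
counts[rng.random(n) < 0.35] = 0                 # structural zeros

X = sm.add_constant(x)
poisson_res = sm.Poisson(counts, X).fit(disp=0)
zip_res = ZeroInflatedPoisson(counts, X, exog_infl=np.ones((n, 1))).fit(disp=0, maxiter=200)

print("Poisson AIC:", round(poisson_res.aic, 1))
print("ZIP AIC:    ", round(zip_res.aic, 1))
```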

  20. Possible causes of a harbour porpoise mass stranding in Danish waters in 2005.

    PubMed

    Wright, Andrew J; Maar, Marie; Mohn, Christian; Nabe-Nielsen, Jacob; Siebert, Ursula; Jensen, Lasse Fast; Baagøe, Hans J; Teilmann, Jonas

    2013-01-01

    An unprecedented 85 harbour porpoises stranded freshly dead along approximately 100 km of Danish coastline from 7-15 April, 2005. This total is considerably above the mean weekly stranding rate for the whole of Denmark, both for any time of year, 1.23 animals/week (ranging from 0 to 20 during 2003-2008, excluding April 2005), and specifically in April, 0.65 animals/week (0 to 4, same period). Bycatch was established as the cause of death for most of the individuals through typical indications of fisheries interactions, including net markings in the skin and around the flippers, and loss of tail flukes. Local fishermen confirmed unusually large porpoise bycatch in nets set for lumpfish (Cyclopterus lumpus) and the strandings were attributed to an early lumpfish season. However, lumpfish catches for 2005 were not unusual in terms of season onset, peak or total catch, when compared to 2003-2008. Consequently, human activity was combined with environmental factors and the variation in Danish fisheries landings (determined through a principal component analysis) in a two-part statistical model to assess the correlation of these factors with both the presence of fresh strandings and the numbers of strandings on the Danish west coast. The final statistical model (which was forward selected using Akaike information criterion; AIC) indicated that naval presence is correlated with higher rates of porpoise strandings, particularly in combination with certain fisheries, although it is not correlated with the actual presence of strandings. Military vessels from various countries were confirmed in the area from the 7th April, en route to the largest naval exercise in Danish waters to date (Loyal Mariner 2005, 11-28 April). Although sonar usage cannot be confirmed, it is likely that ships were testing various equipment prior to the main exercise. Thus naval activity cannot be ruled out as a possible contributing factor. PMID:23460787

  1. Visibility Modeling and Forecasting for Abu Dhabi using Time Series Analysis Method

    NASA Astrophysics Data System (ADS)

    Eibedingil, I. G.; Abula, B.; Afshari, A.; Temimi, M.

    2015-12-01

    Land-atmosphere interactions (their strength, directionality, and evolution) are one of the main sources of uncertainty in contemporary climate modeling. A particularly crucial role in sustaining and modulating land-atmosphere interaction is that of aerosols and dust. Aerosols are tiny particles suspended in the air, ranging from a few nanometers to a few hundred micrometers in diameter. Furthermore, the amount of dust and fog in the atmosphere is an important determinant of visibility, which is another dimension of land-atmosphere interactions. Visibility affects all forms of traffic: aviation, land, and sailing. Being able to predict changes in visibility in advance enables the relevant authorities to take necessary actions before disaster strikes. Time series analysis (TSA) is an emerging technique for modeling and forecasting the behavior of land-atmosphere interactions, including visibility. This research assesses the dynamics and evolution of visibility around Abu Dhabi International Airport (+24.4320 latitude, +54.6510 longitude, and 27 m elevation) using mean daily visibility and mean daily wind speed. TSA was first used to model and forecast the visibility, and then a transfer function model was applied, considering the wind speed as an exogenous variable. Using the Akaike Information Criterion (AIC) and the Mean Absolute Percentage Error (MAPE) as statistical criteria, two forecasting models, namely a univariate time series model and a transfer function model, were developed to forecast the visibility around Abu Dhabi International Airport for three weeks. The transfer function model improved the MAPE of the forecast significantly.
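
    The comparison between a univariate model and a model with an exogenous input can be sketched with statsmodels SARIMAX, comparing AIC and out-of-sample MAPE; the series, the (1, 1, 1) order, and the split below are simulated placeholders rather than the Abu Dhabi observations or the orders selected in the study.

```python
# Sketch: univariate ARIMA versus ARIMA with exogenous wind speed, compared by AIC and MAPE.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
n = 400
wind = 10 + np.cumsum(rng.normal(0, 0.3, n)) * 0.1
visibility = 10 - 0.3 * wind + np.cumsum(rng.normal(0, 0.1, n)) + rng.normal(0, 0.2, n)
train, test = slice(0, 370), slice(370, n)

def mape(actual, forecast):
    return float(np.mean(np.abs((actual - forecast) / actual))) * 100

uni = SARIMAX(visibility[train], order=(1, 1, 1)).fit(disp=False)
tf = SARIMAX(visibility[train], exog=wind[train].reshape(-1, 1), order=(1, 1, 1)).fit(disp=False)

uni_fc = uni.forecast(steps=30)
tf_fc = tf.forecast(steps=30, exog=wind[test].reshape(-1, 1))

print("univariate : AIC %.1f  MAPE %.1f%%" % (uni.aic, mape(visibility[test], uni_fc)))
print("transfer fn: AIC %.1f  MAPE %.1f%%" % (tf.aic, mape(visibility[test], tf_fc)))
```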

  2. Bait stations, hard mast, and black bear population growth in Great Smoky Mountains National Park

    USGS Publications Warehouse

    Clark, Joseph D.; van Manen, Frank T.; Pelton, Michael R.

    2005-01-01

    Bait-station surveys are used by wildlife managers as an index to American black bear (Ursus americanus) population abundance, but the relationship is not well established. Hard mast surveys are similarly used to assess annual black bear food availability, which may affect mortality and natality rates. We used data collected in Great Smoky Mountains National Park (GSMNP) from 1989 to 2003 to determine whether changes in the bait-station index (ΔBSI) were associated with estimated rates of bear population growth (λ) and whether hard mast production was related to bear visitation to baits. We also evaluated whether hard mast production from previous years was related to λ. Estimates of λ were based on analysis of capture-recapture data with the Pradel temporal symmetry estimator. Using Akaike's Information Criterion (AIC), our analysis revealed no direct relationship between ΔBSI and λ. A simulation analysis indicated that our data were adequate to detect a relationship had one existed. Model fit was marginally improved when we added total oak mast production of the previous year as an interaction term, suggesting that the BSI was confounded with environmental variables. Consequently, the utility of the bait-station survey as a population monitoring technique is questionable at the spatial and temporal scales we studied. Mast survey data, however, were valuable covariates of λ. Population growth for a given year was negatively related to oak mast production 4 and 5 years prior. That finding supported our hypothesis that mast failures can trigger reproductive synchrony, which may not be evident from the trapped sample until years later.

  3. Influence of Terrain and Land Cover on the Isotopic Composition of Seasonal Snowpack in Rocky Mountain Headwater Catchments Affected by Bark Beetle Induced Tree Mortality

    NASA Astrophysics Data System (ADS)

    Kipnis, E. L.; Murphy, M.; Klatt, A. L.; Miller, S. N.; Williams, D. G.

    2015-12-01

    Snowpack accumulation and ablation remain difficult to estimate in forested headwater catchments. How physical terrain and forest cover separately and interactively influence spatial patterns of snow accumulation and ablation largely shapes the hydrologic response to land cover disturbances. Analysis of water isotopes in snowpack provides a powerful tool for examining integrated effects of water vapor exchange, selective redistribution, and melt. Snow water equivalence (SWE), δ2H, δ18O and deuterium excess (D-excess) of snowpack were examined throughout winter 2013-2014 across two headwater catchments impacted by bark beetle induced tree mortality. A USGS 10m DEM and a derived land cover product from 1m NAIP imagery were used to examine the effects of terrain features (e.g., elevation, slope, aspect) and canopy disturbance (e.g., live, bark-beetle killed) as predictors of D-excess, an expression of kinetic isotope effects, in snowpack. A weighting of Akaike's Information Criterion (AIC) values from multiple spatially lagged regression models describing D-excess variation for peak snowpack revealed strong effects of elevation and canopy mortality, and weaker, but significant effects of aspect and slope. Snowpack D-excess was lower in beetle-killed canopy patches compared to live green canopy patches, and at lower compared to high elevation locations, suggesting that integrated isotopic effects of vapor exchange, vertical advection of melted snow, and selective accumulation and redistribution varied systematically across the two catchments. The observed patterns illustrate the potential

  4. The HII Galaxy Hubble Diagram Strongly Favors Rh = ct over ΛCDM

    NASA Astrophysics Data System (ADS)

    Wei, Jun-Jie; Wu, Xue-Feng; Melia, Fulvio

    2016-08-01

    We continue to build support for the proposal to use HII galaxies (HIIGx) and giant extragalactic HII regions (GEHR) as standard candles to construct the Hubble diagram at redshifts beyond the current reach of Type Ia supernovae. Using a sample of 25 high-redshift HIIGx, 107 local HIIGx, and 24 GEHR, we confirm that the correlation between the emission-line luminosity and ionized-gas velocity dispersion is a viable luminosity indicator, and use it to test and compare the standard model ΛCDM and the Rh = ct Universe by optimizing the parameters in each cosmology using a maximization of the likelihood function. For the flat ΛCDM model, the best fit is obtained with Ω_m = 0.40^{+0.09}_{-0.09}. However, statistical tools such as the Akaike (AIC), Kullback (KIC), and Bayes (BIC) information criteria favor Rh = ct over the standard model with a likelihood of ≈94.8%-98.8% versus only ≈1.2%-5.2%. For wCDM (the version of ΛCDM with a dark-energy equation of state w_de ≡ p_de/ρ_de rather than w_de = w_Λ = -1), a statistically acceptable fit is realized with Ω_m = 0.22^{+0.16}_{-0.14} and w_de = -0.51^{+0.15}_{-0.25}, which, however, are not fully consistent with their concordance values. In this case, wCDM has two more free parameters than Rh = ct and is penalized more heavily by these criteria. We find that Rh = ct is strongly favored over wCDM with a likelihood of ≈92.9%-99.6% versus only 0.4%-7.1%. The current HIIGx sample is already large enough for the BIC to rule out ΛCDM/wCDM in favor of Rh = ct at a confidence level approaching 3σ.

  5. Alien plant invasion in mixed-grass prairie: Effects of vegetation type and anthropogenic disturbance

    USGS Publications Warehouse

    Larson, D.L.; Anderson, P.J.; Newton, W.

    2001-01-01

    The ability of alien plant species to invade a region depends not only on attributes of the plant, but on characteristics of the habitat being invaded. Here, we examine characteristics that may influence the success of alien plant invasion in mixed-grass prairie at Theodore Roosevelt National Park, in western North Dakota, USA. The park consists of two geographically separate units with similar vegetation types and management history, which allowed us to examine the effects of native vegetation type, anthropogenic disturbance, and the separate park units on the invasion of native plant communities by alien plant species common to counties surrounding both park units. If matters of chance related to availability of propagules and transient establishment opportunities determine the success of invasion, park unit and anthropogenic disturbance should better explain the variation in alien plant frequency. If invasibility is more strongly related to biotic or physical characteristics of the native plant communities, models of alien plant occurrence should include vegetation type as an explanatory variable. We examined >1300 transects across all vegetation types in both units of the park. Akaike's Information Criterion (AIC) indicated that the fully parameterized model, including the interaction among vegetation type, disturbance, and park unit, best described the distribution of both total number of alien plants per transect and frequency of alien plants on transects where they occurred. Although all vegetation types were invaded by alien plants, mesic communities had both greater numbers and higher frequencies of alien plants than did drier communities. A strong element of stochasticity, reflected in differences in frequencies of individual species between the two park units, suggests that prediction of risk of invasion will always involve uncertainty. In addition, despite well-documented associations between anthropogenic disturbance and alien plant invasion, five of

  6. The importance of retaining a phylogenetic perspective in traits-based community analyses

    SciTech Connect

    Poteat, Monica D.; Buchwalter, David B.; Jacobus, Luke M.

    2015-04-08

    1) Many environmental stressors manifest their effects via physiological processes (traits) that can differ significantly among species and species groups. We compiled available data for three traits related to the bioconcentration of the toxic metal cadmium (Cd) from 42 aquatic insect species representing orders Ephemeroptera (mayfly), Plecoptera (stonefly), and Trichoptera (caddisfly). These traits included the propensity to take up Cd from water (uptake rate constant, ku), the ability to excrete Cd (efflux rate constant, ke), and the net result of these two processes (bioconcentration factor, BCF). 2) Ranges in these Cd bioaccumulation traits varied in magnitude across lineages (some lineages had a greater tendency to bioaccumulate Cd than others). Overlap in the ranges of trait values among different lineages was common and highlights situations where species from different lineages can share a similar trait state, but represent the high end of possible physiological values for one lineage and the low end for another. 3) Variance around the mean trait state differed widely across clades, suggesting that some groups (e.g., Ephemerellidae) are inherently more variable than others (e.g., Perlidae). Thus, trait variability/lability is at least partially a function of lineage. 4) Akaike information criterion (AIC) comparisons of statistical models were more often driven by clade than by other potential biological or ecological explanation tested. Clade-driven models generally improved with increasing taxonomic resolution. 5) Altogether, these findings suggest that lineage provides context for the analysis of species traits, and that failure to consider lineage in community-based analysis of traits may obscure important patterns of species responses to environmental change.

  7. The importance of retaining a phylogenetic perspective in traits-based community analyses

    DOE PAGESBeta

    Poteat, Monica D.; Buchwalter, David B.; Jacobus, Luke M.

    2015-04-08

    1) Many environmental stressors manifest their effects via physiological processes (traits) that can differ significantly among species and species groups. We compiled available data for three traits related to the bioconcentration of the toxic metal cadmium (Cd) from 42 aquatic insect species representing orders Ephemeroptera (mayfly), Plecoptera (stonefly), and Trichoptera (caddisfly). These traits included the propensity to take up Cd from water (uptake rate constant, ku), the ability to excrete Cd (efflux rate constant, ke), and the net result of these two processes (bioconcentration factor, BCF). 2) Ranges in these Cd bioaccumulation traits varied in magnitude across lineages (some lineages had a greater tendency to bioaccumulate Cd than others). Overlap in the ranges of trait values among different lineages was common and highlights situations where species from different lineages can share a similar trait state, but represent the high end of possible physiological values for one lineage and the low end for another. 3) Variance around the mean trait state differed widely across clades, suggesting that some groups (e.g., Ephemerellidae) are inherently more variable than others (e.g., Perlidae). Thus, trait variability/lability is at least partially a function of lineage. 4) Akaike information criterion (AIC) comparisons of statistical models were more often driven by clade than by other potential biological or ecological explanation tested. Clade-driven models generally improved with increasing taxonomic resolution. 5) Altogether, these findings suggest that lineage provides context for the analysis of species traits, and that failure to consider lineage in community-based analysis of traits may obscure important patterns of species responses to environmental change.

  8. Biogeographical Interpretation of Elevational Patterns of Genus Diversity of Seed Plants in Nepal.

    PubMed

    Li, Miao; Feng, Jianmeng

    2015-01-01

    This study tests if the biogeographical affinities of genera are relevant for explaining elevational plant diversity patterns in Nepal. We used simultaneous autoregressive (SAR) models to investigate the explanatory power of several predictors in explaining the diversity-elevation relationships shown in genera with different biogeographical affinities. The delta Akaike information criterion (ΔAIC) was used for multi-model inferences and selections. Our results showed that both the total and tropical genus diversity peaked below the mid-point of the elevational gradient, whereas that of temperate genera had a nearly symmetrical, unimodal relationship with elevation. The proportion of temperate genera increased markedly with elevation, while that of tropical genera declined. Compared to tropical genera, temperate genera had wider elevational ranges and were observed at higher elevations. Water-related variables, rather than mid-domain effects (MDE), were the most significant predictors of elevational patterns of tropical genus diversity. The temperate genus diversity was influenced by energy availability, but only in quadratic terms of the models. Though climatic factors and mid-domain effects jointly explained most of the variation in the diversity of temperate genera with elevation, the former played stronger roles. Total genus diversity was most strongly influenced by climate and the floristic overlap of tropical and temperate floras, while the influences of mid-domain effects were relatively weak. The influences of water-related and energy-related variables may vary with biogeographical affinities. The elevational patterns may be most closely related to climatic factors, while MDE may somewhat modify the patterns. Caution is needed when investigating the causal factors underlying diversity patterns for large taxonomic groups composed of taxa of different biogeographical affinities. Right-skewed diversity-elevation patterns may be produced by the differential

  9. Is First-Order Vector Autoregressive Model Optimal for fMRI Data?

    PubMed

    Ting, Chee-Ming; Seghouane, Abd-Krim; Khalid, Muhammad Usman; Salleh, Sh-Hussain

    2015-09-01

    We consider the problem of selecting the optimal orders of vector autoregressive (VAR) models for fMRI data. Many previous studies used model order of one and ignored that it may vary considerably across data sets depending on different data dimensions, subjects, tasks, and experimental designs. In addition, the classical information criteria (IC) used (e.g., the Akaike IC (AIC)) are biased and inappropriate for the high-dimensional fMRI data typically with a small sample size. We examine the mixed results on the optimal VAR orders for fMRI, especially the validity of the order-one hypothesis, by a comprehensive evaluation using different model selection criteria over three typical data types--a resting state, an event-related design, and a block design data set--with varying time series dimensions obtained from distinct functional brain networks. We use a more balanced criterion, Kullback's IC (KIC) based on Kullback's symmetric divergence combining two directed divergences. We also consider the bias-corrected versions (AICc and KICc) to improve VAR model selection in small samples. Simulation results show better small-sample selection performance of the proposed criteria over the classical ones. Both bias-corrected ICs provide more accurate and consistent model order choices than their biased counterparts, which suffer from overfitting, with KICc performing the best. Results on real data show that orders greater than one were selected by all criteria across all data sets for the small to moderate dimensions, particularly from small, specific networks such as the resting-state default mode network and the task-related motor networks, whereas low orders close to one but not necessarily one were chosen for the large dimensions of full-brain networks. PMID:26161816
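
    A hedged sketch of VAR order scanning follows, using statsmodels on simulated signals rather than fMRI data; the plain AIC is compared with a manually computed small-sample corrected AICc, while the KIC/KICc criteria discussed in the paper are not reproduced here.

```python
# Sketch: scanning VAR lag orders and comparing AIC with a small-sample corrected AICc.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
T, d = 180, 4                                    # time points, "regions"
A = 0.4 * np.eye(d) + 0.1 * rng.normal(size=(d, d))
y = np.zeros((T, d))
for t in range(2, T):                            # true order-2 process
    y[t] = y[t - 1] @ A.T + 0.2 * y[t - 2] + rng.normal(0, 0.5, d)

for p in range(1, 6):
    res = VAR(y).fit(p)
    k = res.params.size + d * (d + 1) / 2        # coefficients + error covariance terms
    n_eff = T - p
    aic = -2 * res.llf + 2 * k
    aicc = aic + 2 * k * (k + 1) / max(n_eff - k - 1, 1)
    print(f"order {p}: AIC={aic:9.1f}  AICc={aicc:9.1f}")
```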

  10. Modeling Heteroscedasticity of Wind Speed Time Series in the United Arab Emirates

    NASA Astrophysics Data System (ADS)

    Kim, H. Y.; Marpu, P. R.; Ouarda, T.

    2014-12-01

    There has been a growing interest in wind resources in the Gulf region, not only for evaluating wind energy potential, but also for understanding and forecasting changes in wind, as a regional climate variable. In particular, time varying variance—the second order moment—or heteroscedasticity in wind time series is important to investigate since high variance causes turbulence, which affects wind power potential and may lead to structural changes in wind turbines. Nevertheless, the conditional variance of wind time series has been rarely explored, especially in the Gulf region. Therefore, the seasonal autoregressive integrated moving average-generalized autoregressive conditional heteroscedasticity (SARIMA-GARCH) model is applied to observed wind data in the United Arab Emirates (UAE). This model allows considering the apparent seasonality which is present in wind time series and the heteroscedasticity in residuals indicated with the Engle test, to understand and forecast changes in the conditional variance of wind time series. In this study, the autocorrelation function of daily average wind speed time series obtained from seven stations within the UAE—Al Aradh, Al Mirfa, Al Wagan, East of Jebel Haffet, Madinat Zayed, Masdar City, Sir Bani Yas Island—is inspected to fit a SARIMA model. The best SARIMA model is selected according to the minimum Akaike Information Criterion (AIC) and based on the residuals of the model. Then, the GARCH model is applied to the remaining residuals to capture the conditional variance of the SARIMA model. Results indicate that the SARIMA-GARCH model provides a good fit to wind data in the UAE.
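
    A minimal sketch of this two-stage workflow, an AIC-selected ARMA mean model, an Engle ARCH test on its residuals, and a GARCH(1,1) fit to those residuals, is shown below on simulated daily wind speeds. It assumes the third-party `arch` package is installed, uses a small non-seasonal order grid for brevity, and is not the station-specific SARIMA specification used in the study.

```python
# Sketch: AIC-selected ARMA mean model, Engle ARCH test, then GARCH(1,1) on the residuals.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from statsmodels.stats.diagnostic import het_arch
from arch import arch_model                       # assumed available

rng = np.random.default_rng(6)
n = 730
season = 1.5 * np.sin(2 * np.pi * np.arange(n) / 365)
noise = rng.normal(0, 1, n) * (1 + 0.5 * np.abs(np.sin(np.arange(n) / 30)))  # changing variance
wind = 8 + season + noise

# Small grid search over (p, q); a seasonal order would be added in practice.
best = None
for p in range(3):
    for q in range(3):
        res = SARIMAX(wind, order=(p, 0, q), trend="c").fit(disp=False)
        if best is None or res.aic < best[0]:
            best = (res.aic, (p, q), res)
aic, order, res = best
print("selected ARMA order:", order, "AIC:", round(aic, 1))

lm_stat, lm_pvalue, _, _ = het_arch(res.resid)
print("Engle ARCH test p-value:", round(lm_pvalue, 4))

garch = arch_model(res.resid, vol="GARCH", p=1, q=1).fit(disp="off")
print(garch.params)
```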

  11. Biogeographical Interpretation of Elevational Patterns of Genus Diversity of Seed Plants in Nepal

    PubMed Central

    Li, Miao; Feng, Jianmeng

    2015-01-01

    This study tests if the biogeographical affinities of genera are relevant for explaining elevational plant diversity patterns in Nepal. We used simultaneous autoregressive (SAR) models to investigate the explanatory power of several predictors in explaining the diversity-elevation relationships shown in genera with different biogeographical affinities. The delta Akaike information criterion (ΔAIC) was used for multi-model inferences and selections. Our results showed that both the total and tropical genus diversity peaked below the mid-point of the elevational gradient, whereas that of temperate genera had a nearly symmetrical, unimodal relationship with elevation. The proportion of temperate genera increased markedly with elevation, while that of tropical genera declined. Compared to tropical genera, temperate genera had wider elevational ranges and were observed at higher elevations. Water-related variables, rather than mid-domain effects (MDE), were the most significant predictors of elevational patterns of tropical genus diversity. The temperate genus diversity was influenced by energy availability, but only in quadratic terms of the models. Though climatic factors and mid-domain effects jointly explained most of the variation in the diversity of temperate genera with elevation, the former played stronger roles. Total genus diversity was most strongly influenced by climate and the floristic overlap of tropical and temperate floras, while the influences of mid-domain effects were relatively weak. The influences of water-related and energy-related variables may vary with biogeographical affinities. The elevational patterns may be most closely related to climatic factors, while MDE may somewhat modify the patterns. Caution is needed when investigating the causal factors underlying diversity patterns for large taxonomic groups composed of taxa of different biogeographical affinities. Right-skewed diversity-elevation patterns may be produced by the differential

  12. Crucial nesting habitat for gunnison sage-grouse: A spatially explicit hierarchical approach

    USGS Publications Warehouse

    Aldridge, C.L.; Saher, D.J.; Childers, T.M.; Stahlnecker, K.E.; Bowen, Z.H.

    2012-01-01

    Gunnison sage-grouse (Centrocercus minimus) is a species of special concern and is currently considered a candidate species under the Endangered Species Act. Careful management is therefore required to ensure that suitable habitat is maintained, particularly because much of the species' current distribution is faced with exurban development pressures. We assessed hierarchical nest site selection patterns of Gunnison sage-grouse inhabiting the western portion of the Gunnison Basin, Colorado, USA, at multiple spatial scales, using logistic regression-based resource selection functions. Models were selected using the Akaike Information Criterion corrected for small sample sizes (AICc) and predictive surfaces were generated using model-averaged relative probabilities. Landscape-scale factors that had the most influence on nest site selection included the proportion of sagebrush cover >5%, mean productivity, and density of 2 wheel-drive roads. The landscape-scale predictive surface captured 97% of known Gunnison sage-grouse nests within the top 5 of 10 prediction bins, implicating 57% of the basin as crucial nesting habitat. Crucial habitat identified by the landscape model was used to define the extent for patch-scale modeling efforts. Patch-scale variables that had the greatest influence on nest site selection were the proportion of big sagebrush cover >10%, distance to residential development, distance to high volume paved roads, and mean productivity. This model accurately predicted independent nest locations. The unique hierarchical structure of our models more accurately captures the nested nature of habitat selection, and allowed for increased discrimination within larger landscapes of suitable habitat. We extrapolated the landscape-scale model to the entire Gunnison Basin because of conservation concerns for this species. We believe this predictive surface is a valuable tool which can be incorporated into land use and conservation planning as well as the assessment of
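
    The AICc ranking and model-averaged prediction step described here can be sketched with statsmodels logistic regressions; the covariates, candidate model set, and data below are synthetic placeholders in the spirit of a resource selection function analysis, not the Gunnison Basin models.

```python
# Sketch: ranking candidate logistic habitat models by AICc and model-averaging predictions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 600
sage = rng.uniform(0, 1, n)                      # proportion sagebrush cover (hypothetical)
prod = rng.normal(0, 1, n)                       # productivity index (hypothetical)
roads = rng.exponential(1, n)                    # road density (hypothetical)
eta = -1 + 3 * sage + 0.8 * prod - 0.5 * roads
used = rng.binomial(1, 1 / (1 + np.exp(-eta)))

def aicc(res, n):
    k = res.df_model + 1                         # regressors + intercept
    return res.aic + 2 * k * (k + 1) / (n - k - 1)

designs = {
    "sage":              np.column_stack([sage]),
    "sage + prod":       np.column_stack([sage, prod]),
    "sage + prod + rds": np.column_stack([sage, prod, roads]),
}
fits, scores = {}, {}
for name, Xv in designs.items():
    res = sm.Logit(used, sm.add_constant(Xv)).fit(disp=0)
    fits[name], scores[name] = res, aicc(res, n)

delta = {m: s - min(scores.values()) for m, s in scores.items()}
w = {m: np.exp(-0.5 * d) for m, d in delta.items()}
total = sum(w.values())
w = {m: v / total for m, v in w.items()}

# Model-averaged predicted relative probability of use for each location
avg_p = sum(w[m] * fits[m].predict(sm.add_constant(designs[m])) for m in designs)
for m in designs:
    print(f"{m:20s} AICc={scores[m]:7.1f}  weight={w[m]:.3f}")
print("model-averaged p (first 5):", np.round(avg_p[:5], 3))
```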

  13. Climatic patterns in the establishment of wintering areas by North American migratory birds.

    PubMed

    Pérez-Moreno, Heidi; Martínez-Meyer, Enrique; Soberón Mainero, Jorge; Rojas-Soto, Octavio

    2016-04-01

    Long-distance migration in birds is relatively well studied in nature; however, one aspect of this phenomenon that remains poorly understood is the pattern of distribution presented by species during arrival to and establishment of wintering areas. Some studies suggest that the selection of areas in winter is somehow determined by climate, given its influence on both the distribution of bird species and their resources. We analyzed whether different migrant passerine species of North America present climatic preferences during arrival to and departure from their wintering areas. We used ecological niche modeling to generate monthly potential climatic distributions for 13 migratory bird species during the winter season by combining the locations recorded per month with four environmental layers. We calculated monthly coefficients of climate variation and then compared two GLMs (generalized linear models), evaluated with the AIC (Akaike information criterion), to describe how these coefficients varied over the course of the season, as a measure of the patterns of establishment in the wintering areas. For 11 species, the sites show nonlinear patterns of variation in climatic preferences, with low coefficients of variation at the beginning and end of the season and higher values found in the intermediate months. The remaining two species analyzed showed a different climatic pattern of selective establishment of wintering areas, probably due to taxonomic discrepancy, which would affect their modeled winter distribution. Patterns of establishment of wintering areas in the species showed a climatic preference at the macroscale, suggesting that individuals of several species actively select wintering areas that meet specific climatic conditions. This probably gives them an advantage over the winter and during the return to breeding areas. As these areas become full of migrants, alternative suboptimal sites are occupied. Nonrandom winter area selection may also have

  14. Estimating rates of local extinction and colonization in colonial species and an extension to the metapopulation and community levels

    USGS Publications Warehouse

    Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.

    2003-01-01

    Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. We also believe that this methodological framework has wide application to problems in animal

  15. Does the duration and time of sleep increase the risk of allergic rhinitis? Results of the 6-year nationwide Korea youth risk behavior web-based survey.

    PubMed

    Kwon, Jeoung A; Lee, Minjee; Yoo, Ki-Bong; Park, Eun-Cheol

    2013-01-01

    Allergic rhinitis (AR) is the most common chronic disorder in the pediatric population. Although several studies have investigated the correlation between AR and sleep-related issues, the association between the duration and time of sleep and AR has not been analyzed in long-term national data. This study investigated the relationship between sleep time and duration and AR risk in middle- and high-school students (adolescents aged 12-18). We analyzed national data from the Korea Youth Risk Behavior Web-based Survey by the Korea Centers for Disease Control and Prevention from 2007-2012. The sample size was 274,480, with an average response rate of 96.2%. Multivariate logistic regression analyses were conducted to determine the relationship between sleep and AR risk. Furthermore, to determine the best-fitting model among independent variables such as sleep duration, sleep time, and the combination of sleep duration and sleep time, we used the Akaike Information Criterion (AIC) to compare models. A total of 43,337 boys and 41,665 girls reported a diagnosis of AR at baseline. The odds ratio increased with age and with higher education and economic status of the parents. Further, students in mid-sized and large cities had stronger relationships to AR than those in small cities. In both genders, AR was associated with depression and suicidal ideation. In the analysis of sleep duration and sleep time, the odds ratio increased in both genders when sleep duration was <7 hours, and when the time of sleep was later than 24:00 hours. Our results indicate an association between sleep time and duration and AR. This study is the first to focus on the relationship between sleep duration and time and AR in national survey data collected over 6 years. PMID:24015253

  16. Recovery of native treefrogs after removal of nonindigenous Cuban Treefrogs, Osteopilus septentrionalis

    USGS Publications Warehouse

    Rice, K.G.; Waddle, J.H.; Miller, M.W.; Crockett, M.E.; Mazzotti, F.J.; Percival, H.F.

    2011-01-01

    Florida is home to several introduced animal species, especially in the southern portion of the state. Most introduced species are restricted to the urban and suburban areas along the coasts, but some species, like the Cuban Treefrog (Osteopilus septentrionalis), are locally abundant in natural protected areas. Although Cuban Treefrogs are known predators of native treefrog species as both adults and larvae, no study has demonstrated a negative effect of Cuban Treefrogs on native treefrog survival, abundance, or occupancy rate. We monitored survival, capture probability, abundance, and proportion of sites occupied by Cuban Treefrogs and two native species, Green Treefrogs (Hyla cinerea) and Squirrel Treefrogs (Hyla squirella), at four sites in Everglades National Park in southern Florida with the use of capture-mark-recapture techniques. After at least 5 mo of monitoring all species at each site we began removing every Cuban Treefrog captured. We continued to estimate survival, abundance, and occupancy rates of native treefrogs for 1 yr after the commencement of Cuban Treefrog removal. Mark-recapture models that included the effect of Cuban Treefrog removal on native treefrog survival did not have considerable Akaike's Information Criterion (AIC) weight, although capture rates of native species were generally very low prior to Cuban Treefrog removal. Estimated abundance of native treefrogs did increase after commencement of Cuban Treefrog removal, but also varied with the season of the year. The best models of native treefrog occupancy included a Cuban Treefrog removal effect at sites with high initial densities of Cuban Treefrogs. This study demonstrates that an introduced predator can have population-level effects on similar native species. © 2011 The Herpetologists' League, Inc.
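
    The statement that removal-effect models lacked considerable AIC weight refers to Akaike weights, w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2), where delta_i is the difference between a model's AIC and the smallest AIC in the candidate set. A small illustrative sketch follows; the AIC values are hypothetical, not results from the study.

```python
import numpy as np

def akaike_weights(aic_values):
    """Convert a set of AIC scores into Akaike weights (relative support)."""
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical AIC scores for mark-recapture survival models with and
# without a removal effect.
aics = {"no removal effect": 512.4, "removal effect": 514.1}
for name, w in zip(aics, akaike_weights(list(aics.values()))):
    print(f"{name}: weight = {w:.2f}")
```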

  17. Development of an Adaptive Multi-Method Algorithm for Automatic Picking of First Arrival Times: Application to Near Surface Seismic Data

    NASA Astrophysics Data System (ADS)

    Khalaf, A.; Camerlynck, C. M.; Schneider, A. C.; Florsch, N.

    2015-12-01

    Accurate picking of first arrival times plays an important role in many seismic studies, particularly in seismic tomography and in the monitoring of reservoirs or aquifers. Many techniques have been developed for picking first arrivals automatically or semi-automatically, but most of them were developed for seismological purposes and do not attain the accuracy required for near surface work, owing to the complexity of near-surface structures and the usually low signal-to-noise ratio. We propose a new adaptive algorithm for near surface data based on three picking methods, combining multi-nested windows (MNW), Higher Order Statistics (HOS), and the Akaike Information Criterion (AIC). They exploit the benefits of integrating several properties that reveal the presence of first arrivals, to provide efficient and robust first-arrival picking. This strategy mimics manual first-break picking: the global trend is defined first, and the exact first breaks are then searched for in the vicinity of that trend. In a multistage algorithm, three successive phases are launched, each of which characterizes a specific signal property. Within each phase, the potential picks and their error ranges are automatically estimated and then used as a guide for picking in the following phase. The accuracy and robustness of the implemented algorithm are successfully validated on synthetic and real data which pose special challenges for automatic pickers. The comparison of the resulting P-wave arrival times with those picked manually, and with other automatic picking algorithms, demonstrated the reliable performance of the new scheme under different noise conditions. All parameters of our multi-method algorithm are auto-adaptive because the results of each sub-algorithm are integrated in series within the workflow. Hence, it is nearly a parameter-free algorithm, which is straightforward to implement and demands low computational resources.
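
    One common form of the AIC component used by such pickers treats each candidate sample k as a split point between two stationary segments of a windowed trace and minimizes AIC(k) = k*ln(var(x[1..k])) + (N-k-1)*ln(var(x[k+1..N])). The sketch below is a generic single-trace version of that idea, not the authors' multi-method implementation.

```python
import numpy as np

def aic_picker(trace):
    """Return the sample index minimizing the AIC function of a windowed trace.

    AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:])); its minimum
    approximates the onset of the first arrival.
    """
    x = np.asarray(trace, dtype=float)
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

# Synthetic trace: noise followed by a higher-amplitude arrival at sample 300.
rng = np.random.default_rng(1)
trace = np.concatenate([rng.normal(0, 0.1, 300), rng.normal(0, 1.0, 200)])
print("picked onset sample:", aic_picker(trace))
```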

  18. Time-dependent oral absorption models

    NASA Technical Reports Server (NTRS)

    Higaki, K.; Yamashita, S.; Amidon, G. L.

    2001-01-01

    The plasma concentration-time profiles following oral administration of drugs are often irregular and cannot be interpreted easily with conventional models based on first- or zero-order absorption kinetics and lag time. Six new models were developed using a time-dependent absorption rate coefficient, ka(t), wherein the time dependency was varied to account for dynamic processes such as changes in fluid absorption or secretion, in absorption surface area, and in motility with time, in the gastrointestinal tract. In the present study, the plasma concentration profiles of propranolol obtained in human subjects following oral dosing were analyzed using the newly derived models based on mass balance and compared with the conventional models. Nonlinear regression analysis indicated that the conventional compartment model including lag time (CLAG model) could not predict the rapid initial increase in plasma concentration after dosing and the predicted Cmax values were much lower than those observed. On the other hand, all models with the time-dependent absorption rate coefficient, ka(t), were superior to the CLAG model in predicting plasma concentration profiles. Based on Akaike's Information Criterion (AIC), the fluid absorption model without lag time (FA model) exhibited the best overall fit to the data. The two-phase model including lag time (TPLAG model) was also found to be a good model, judging from the sum-of-squares values. This model also described the irregular profiles of plasma concentration with time and frequently predicted Cmax values satisfactorily. A comparison of the absorption rate profiles also suggested that the TPLAG model is better at predicting irregular absorption kinetics than the FA model. In conclusion, the incorporation of a time-dependent absorption rate coefficient ka(t) allows the prediction of nonlinear absorption characteristics in a more reliable manner.
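
    When models are fitted by nonlinear least squares, as is typical for such concentration-time profiles, AIC can be computed from the residual sum of squares as AIC = n*ln(RSS/n) + 2k (additive constants dropped). A hedged sketch of that comparison follows; the residuals and parameter counts are illustrative placeholders, not the study's actual fits.

```python
import numpy as np

def aic_ls(residuals, n_params):
    """AIC for a least-squares fit (additive constants dropped):
    AIC = n*ln(RSS/n) + 2k."""
    res = np.asarray(residuals, dtype=float)
    n = res.size
    rss = np.sum(res ** 2)
    return n * np.log(rss / n) + 2 * n_params

# Hypothetical residuals from two absorption models fitted to the same
# plasma-concentration data (values are placeholders, not study results).
residuals_clag = np.array([0.8, -0.6, 0.9, -0.7, 0.5, -0.4, 0.6, -0.5])
residuals_fa   = np.array([0.2, -0.1, 0.3, -0.2, 0.1, -0.2, 0.2, -0.1])
print("CLAG model AIC:", round(aic_ls(residuals_clag, n_params=3), 2))
print("FA model AIC:  ", round(aic_ls(residuals_fa, n_params=4), 2))
```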

  19. SU-E-T-399: Determination of the Radiobiological Parameters That Describe the Dose-Response Relations of Xerostomia and Disgeusia From Head and Neck Radiotherapy

    SciTech Connect

    Mavroidis, P; Stathakis, S; Papanikolaou, N; Peixoto Xavier, C; Costa Ferreira, B; Khouri, L; Carmo Lopes, M do

    2014-06-01

    Purpose: To estimate the radiobiological parameters that describe the dose-response relations of xerostomia and disgeusia from head and neck cancer radiotherapy. To identify the organs that are best correlated with the manifestation of those clinical endpoints. Finally, to evaluate the goodness-of-fit by comparing the model predictions against the actual clinical results. Methods: In this study, 349 head and neck cancer patients were included. For each patient the dose volume histograms (DVH) of parotids (separate and combined), mandible, submandibular glands (separate and combined) and salivary glands were calculated. The follow-up of those patients was recorded at different times after the completion of the treatment (7 weeks, 3, 7, 12, 18 and 24 months). Acute and late xerostomia and acute disgeusia were the clinical endpoints examined. A maximum likelihood fitting was performed to calculate the best estimates of the parameters used by the relative seriality model. The statistical methods of the error distribution, the receiver operating characteristic (ROC) curve, Pearson's test and Akaike's information criterion were utilized to assess the goodness-of-fit and the agreement between the pattern of the radiobiological predictions and that of the clinical records. Results: The estimated values of the radiobiological parameters of salivary glands are D50 = 25.2 Gy, γ = 0.52, s = 0.001. The statistical analysis confirmed the clinical validity of those parameters (area under the ROC curve = 0.65 and AIC = 38.3). Conclusion: The analysis proved that the treatment outcome pattern of the patient material can be reproduced by the relative seriality model and the estimated radiobiological parameters. Salivary glands were found to have strong volume dependence (low relative seriality). Diminishing the biologically effective uniform dose to salivary glands below 30 Gy may significantly reduce the risk of complications to the patients irradiated for head and neck cancer.

  20. Modeling a habitat suitability index for the eastern fall cohort of Ommastrephes bartramii in the central North Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Chen, Xinjun; Tian, Siquan; Liu, Bilin; Chen, Yong

    2011-05-01

    The eastern fall cohort of the neon flying squid, Ommastrephes bartramii, has been commercially exploited by the Chinese squid jigging fleet in the central North Pacific Ocean since the late 1990s. To understand and identify their optimal habitat, we have developed a habitat suitability index (HSI) model using two potentially important environmental variables — sea surface temperature (SST) and sea surface height anomaly (SSHA) — and fishery data from the main fishing ground (165°-180°E) during June and July of 1999-2003. A geometric mean model (GMM), minimum model (MM) and arithmetic weighted model (AWM) with different weights were compared and the best HSI model was selected using Akaike's information criterion (AIC). The performance of the developed HSI model was evaluated using fishery data for 2004. This study suggests that the highest catch per unit effort (CPUE) and fishing effort are closely related to SST and SSHA. The best SST- and SSHA-based suitability index (SI) regression models were SI_SST-based = 0.7 SI_effort-SST + 0.3 SI_CPUE-SST and SI_SSHA-based = 0.5 SI_effort-SSHA + 0.5 SI_CPUE-SSHA, respectively, showing that fishing effort is more important than CPUE in the estimation of SI. The best HSI model was the AWM, defined as HSI = 0.3 SI_SST-based + 0.7 SI_SSHA-based, indicating that SSHA is more important than SST in estimating the HSI of squid. In 2004, monthly HSI values greater than 0.6 coincided with the distribution of productive fishing ground and high CPUE in June and July, suggesting that the models perform well. The proposed model provides an important tool in our efforts to develop forecasting capacity for squid spatial dynamics.

  1. In Vivo Evaluation of Blood Based and Reference Tissue Based PET Quantifications of [11C]DASB in the Canine Brain.

    PubMed

    Van Laeken, Nick; Taylor, Olivia; Polis, Ingeborgh; Neyt, Sara; Kersemans, Ken; Dobbeleir, Andre; Saunders, Jimmy; Goethals, Ingeborg; Peremans, Kathelijne; De Vos, Filip

    2016-01-01

    This first-in-dog study evaluates the use of the PET-radioligand [11C]DASB to image the density and availability of the serotonin transporter (SERT) in the canine brain. Imaging the serotonergic system could improve diagnosis and therapy of multiple canine behavioural disorders. Furthermore, as many similarities are reported between several human neuropsychiatric conditions and naturally occurring canine behavioural disorders, making this tracer available for use in dogs also provides researchers with an interesting non-primate animal model for investigating human disorders. Five adult beagles underwent a 90-minute dynamic PET scan and arterial whole blood was sampled throughout the scan. For each ROI, the distribution volume (VT), obtained via the one- and two-tissue compartment models (1-TC, 2-TC) and the Logan plot, was calculated and the goodness-of-fit was evaluated by the Akaike Information Criterion (AIC). For the preferred compartmental model, BPND values were estimated and compared with those derived by four reference tissue models: 4-parameter RTM, SRTM2, MRTM2 and the Logan reference tissue model. The 2-TC model provided a better fit than the 1-TC model in 61% of the ROIs. The Logan plot produced almost identical VT values and can be used as an alternative. Compared with the 2-TC model, all investigated reference tissue models showed high correlations but small underestimations of the BPND-parameter. The highest correlation was achieved with the Logan reference tissue model (Y = 0.9266 x + 0.0257; R2 = 0.9722). Therefore, this model can be put forward as a non-invasive standard model for future PET-experiments with [11C]DASB in dogs. PMID:26859850

  2. Influence of habitat heterogeneity on the distribution of larval Pacific lamprey (Lampetra tridentata) at two spatial scales

    USGS Publications Warehouse

    Torgersen, Christian E.; Close, David A.

    2004-01-01

    1. Spatial patterns in channel morphology and substratum composition at small (1–10 metres) and large scales (1–10 kilometres) were analysed to determine the influence of habitat heterogeneity on the distribution and abundance of larval lamprey. 2. We used a nested sampling design and multiple logistic regression to evaluate spatial heterogeneity in the abundance of larval Pacific lamprey, Lampetra tridentata, and habitat in 30 sites (each composed of twelve 1-m² quadrat samples) distributed throughout a 55-km section of the Middle Fork John Day River, OR, U.S.A. Statistical models predicting the relative abundance of larvae both among sites (large scale) and among samples (small scale) were ranked using Akaike's Information Criterion (AIC) to identify the 'best approximating' models from a set of a priori candidate models determined from the literature on larval lamprey habitat associations. 3. Stream habitat variables predicted patterns in larval abundance but played different roles at different spatial scales. The abundance of larvae at large scales was positively associated with water depth and open riparian canopy, whereas patchiness in larval occurrence at small scales was associated with low water velocity, channel-unit morphology (pool habitats), and the availability of habitat suitable for burrowing. 4. Habitat variables explained variation in larval abundance at large and small scales, but locational factors, such as longitudinal position (river km) and sample location within the channel unit, explained additional variation in the logistic regression model. The results emphasise the need for spatially explicit analysis, both in examining fish habitat relationships and in developing conservation plans for declining fish populations.

  3. Plant species invasions along the latitudinal gradient in the United States

    USGS Publications Warehouse

    Stohlgren, T.J.; Barnett, D.; Flather, C.; Kartesz, J.; Peterjohn, B.

    2005-01-01

    It has been long established that the richness of vascular plant species and many animal taxa decreases with increasing latitude, a pattern that very generally follows declines in actual and potential evapotranspiration, solar radiation, temperature, and thus, total productivity. Using county-level data on vascular plants from the United States (3000 counties in the conterminous 48 states), we used the Akaike Information Criterion (AIC) to evaluate competing models predicting native and nonnative plant species density (number of species per square kilometer in a county) from various combinations of biotic variables (e.g., native bird species density, vegetation carbon, normalized difference vegetation index), environmental/topographic variables (elevation, variation in elevation, the number of land cover classes in the county; radiation, mean precipitation, actual evapotranspiration, and potential evapotranspiration), and human variables (human population density, crop-land, and percentage of disturbed lands in a county). We found no evidence of a latitudinal gradient for the density of native plant species and a significant, slightly positive latitudinal gradient for the density of nonnative plant species. We found stronger evidence of a significant, positive productivity gradient (vegetation carbon) for the density of native plant species and nonnative plant species. We found much stronger significant relationships when biotic, environmental/topographic, and human variables were used to predict native plant species density and nonnative plant species density. Biotic variables generally had far greater influence in multivariate models than human or environmental/topographic variables. Later, we found that the best, single, positive predictor of the density of nonnative plant species in a county was the density of native plant species in a county. While further study is needed, it may be that, while humans facilitate the initial establishment invasions of nonnative

  4. Blood lead concentrations in free-ranging Nile crocodiles (Crocodylus niloticus) from South Africa.

    PubMed

    Warner, Jonathan K; Combrink, Xander; Myburgh, Jan G; Downs, Colleen T

    2016-07-01

    Generally crocodilians have received little attention with regard to the effects of lead toxicity despite their trophic status as apex, generalist predators that utilize both aquatic and terrestrial habitats, thereby exposing them to a potentially wide range of environmental contaminants. During July-October 2010 we collected whole blood from 34 sub-adult and adult free-ranging Nile crocodiles (Crocodylus niloticus) from three separate populations in northeastern South Africa in order to analyze their blood lead concentrations (BPb). Concentrations ranged from below detectability (<3 μg/dL, n = 8) to 960 μg/dL for an adult male at the Lake St Lucia Estuary. Blood lead concentrations averaged 8.15 μg/dL (SD = 7.47) for females and 98.10 μg/dL (SD = 217.42) for males. Eighteen individuals (53 %) had elevated BPbs (≥10 μg/dL). We assessed 12 general linear models using Akaike's Information Criterion (AIC) and found no significant statistical effects among the parameters of sex, crocodile size and population sampled. On average, crocodiles had higher BPbs at Lake St Lucia than at Ndumo Game Reserve or Kosi Bay, which we attribute to lead sinker ingestion during normal gastrolith acquisition. No clinical effects of lead toxicosis were observed in these crocodiles, even though the highest concentration (960 μg/dL) we report represents the most elevated BPb recorded to date for a free-ranging vertebrate. Although we suggest adult Nile crocodiles are likely tolerant of elevated Pb body burdens, experimental studies on other crocodilian species suggest the BPb levels reported here may have harmful or fatal effects to egg development and hatchling health. In light of recent Nile crocodile nesting declines in South Africa we urge further BPb monitoring and ecotoxicology research on reproductive females and embryos. PMID:27038476

  5. Geomagnetic survey and geomagnetic model research in China

    NASA Astrophysics Data System (ADS)

    Gu, Zuowen; Zhan, Zhijia; Gao, Jintian; Han, Wei; An, Zhenchang; Yao, Tongqi; Chen, Bin

    2006-06-01

    A geomagnetic survey at 135 stations in China was carried out in 2003. These stations have favourable environmental conditions and small magnetic field gradients (<5 nT/m). In the field survey, the geomagnetic declination D, the inclination I and the total intensity F were measured. An Ashtech ProMark2 differential GPS (Global Positioning System) was used to measure the azimuth, longitude, latitude and elevation at these stations. The accuracy of the azimuth is 0.1'. The geomagnetic survey data were reduced using the data at geomagnetic observatories in China. The mean standard deviations of the reduced geomagnetic values are <1.5 nT for F and <0.5' for D and I. Using the geomagnetic data, which include the data at the 135 stations and 35 observatories in China and at 38 IGRF (International Geomagnetic Reference Field) calculation points in China's adjacent regions, the Taylor polynomial model and the spherical cap harmonic model were calculated for the geomagnetic field in China. The truncation order of the Taylor polynomial model is 5, and its origin is at 36.0°N and 104.5°E. Based on the geomagnetic anomalous values and using the method of spherical cap harmonic (SCH) analysis, the SCH model of the geomagnetic anomalous field was derived. In the SCH model, the pole of the spherical cap is at 36.0°N and 104.5°E and the half-angle is 30°; the truncation order K = 8 was determined according to the mean square deviation between the model values and the observed values, the AIC (Akaike Information Criterion), and the distribution of the geomagnetic field.

  6. A functional biological network centered on XRCC3: a new possible marker of chemoradiotherapy resistance in rectal cancer patients

    PubMed Central

    Agostini, Marco; Zangrando, Andrea; Pastrello, Chiara; D'Angelo, Edoardo; Romano, Gabriele; Giovannoni, Roberto; Giordan, Marco; Maretto, Isacco; Bedin, Chiara; Zanon, Carlo; Digito, Maura; Esposito, Giovanni; Mescoli, Claudia; Lavitrano, Marialuisa; Rizzolio, Flavio; Jurisica, Igor; Giordano, Antonio; Pucciarelli, Salvatore; Nitti, Donato

    2015-01-01

    Preoperative chemoradiotherapy is widely used to improve local control of disease, sphincter preservation and survival in patients with locally advanced rectal cancer. Patients enrolled in the present study underwent preoperative chemoradiotherapy, followed by surgical excision. Response to chemoradiotherapy was evaluated according to Mandard's Tumor Regression Grade (TRG): TRG 3, 4 and 5 were considered as partial or no response, while TRG 1 and 2 were considered as complete response. Of the 84 pretherapeutic biopsies of locally advanced rectal carcinomas available for the analysis, only 42 showed at least 70% cancer cellularity. By determining gene expression profiles, responders and non-responders showed significantly different expression levels for 19 genes (P < 0.001). We fitted a logistic model selected with a stepwise procedure optimizing the Akaike Information Criterion (AIC) and then validated it by means of leave-one-out cross-validation (LOOCV, accuracy = 95%). Four genes were retained in the final model: ZNF160, XRCC3, HFM1 and ASXL2. Real-time PCR confirmed that XRCC3 is overexpressed in the responder group, and HFM1 and ASXL2 showed a positive trend. In vitro tests on colon cancer cells resistant or susceptible to chemoradiotherapy finally proved that XRCC3 deregulation is extensively involved in the chemoresistance mechanisms. Protein-protein interaction (PPI) analysis involving the predictive classifier revealed a network of 45 interacting nodes (proteins), with the TRAF6 gene playing a keystone role in the network. The present study confirmed the possibility that gene expression profiling combined with integrative computational biology is useful to predict complete responses to preoperative chemoradiotherapy in patients with advanced rectal cancer. PMID:26023803
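
    A stepwise procedure that optimizes AIC, as used above, can be sketched as a greedy forward search: starting from an intercept-only model, repeatedly add the candidate predictor that lowers AIC the most and stop when no addition helps. The sketch below uses synthetic expression values for the four genes named in the abstract; the data, the function name and the stopping rule are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def forward_aic(df, response, candidates):
    """Greedy forward selection: add the predictor that lowers AIC most,
    stop when no addition improves AIC."""
    selected = []
    best_aic = smf.logit(f"{response} ~ 1", data=df).fit(disp=0).aic
    improved = True
    while improved and len(selected) < len(candidates):
        improved = False
        trial_scores = {}
        for var in candidates:
            if var in selected:
                continue
            formula = f"{response} ~ " + " + ".join(selected + [var])
            trial_scores[var] = smf.logit(formula, data=df).fit(disp=0).aic
        best_var = min(trial_scores, key=trial_scores.get)
        if trial_scores[best_var] < best_aic:
            selected.append(best_var)
            best_aic = trial_scores[best_var]
            improved = True
    return selected, best_aic

# Synthetic expression data standing in for the gene predictors.
rng = np.random.default_rng(2)
df = pd.DataFrame(rng.normal(size=(80, 4)), columns=["ZNF160", "XRCC3", "HFM1", "ASXL2"])
df["responder"] = rng.binomial(1, 1 / (1 + np.exp(-(1.2 * df["XRCC3"] - 0.8 * df["HFM1"]))))
print(forward_aic(df, "responder", ["ZNF160", "XRCC3", "HFM1", "ASXL2"]))
```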

  7. Effects of reproductive condition, roost microclimate, and weather patterns on summer torpor use by a vespertilionid bat

    PubMed Central

    Johnson, Joseph S; Lacki, Michael J

    2014-01-01

    A growing number of mammal species are recognized as heterothermic, capable of maintaining a high-core body temperature or entering a state of metabolic suppression known as torpor. Small mammals can achieve large energetic savings when torpid, but they are also subject to ecological costs. Studying torpor use in an ecological and physiological context can help elucidate relative costs and benefits of torpor to different groups within a population. We measured skin temperatures of 46 adult Rafinesque's big-eared bats (Corynorhinus rafinesquii) to evaluate thermoregulatory strategies of a heterothermic small mammal during the reproductive season. We compared daily average and minimum skin temperatures as well as the frequency, duration, and depth of torpor bouts of sex and reproductive classes of bats inhabiting day-roosts with different thermal characteristics. We evaluated roosts with microclimates colder (caves) and warmer (buildings) than ambient air temperatures, as well as roosts with intermediate conditions (trees and rock crevices). Using Akaike's information criterion (AIC), we found that different statistical models best predicted various characteristics of torpor bouts. While the type of day-roost best predicted the average number of torpor bouts that bats used each day, current weather variables best predicted daily average and minimum skin temperatures of bats, and reproductive condition best predicted average torpor bout depth and the average amount of time spent torpid each day by bats. Finding that different models best explain varying aspects of heterothermy illustrates the importance of torpor to both reproductive and nonreproductive small mammals and emphasizes the multifaceted nature of heterothermy and the need to collect data on numerous heterothermic response variables within an ecophysiological context. PMID:24558571

  8. Nonstationarity in the occurrence rate of floods in the Tarim River basin, China, and related impacts of climate indices

    NASA Astrophysics Data System (ADS)

    Gu, Xihui; Zhang, Qiang; Singh, Vijay P.; Chen, Xi; Liu, Lin

    2016-07-01

    Amplification of floods in Xinjiang, China, has been observed, but reports on their changing properties and underlying mechanisms are not available. In this study, occurrence rates of floods in the Tarim River basin, the largest inland arid river basin in China, were analyzed using the kernel density estimation technique and bootstrap resampling method. Also analyzed were the occurrence rates of precipitation extremes using the POT (Peak over Threshold)-based sampling method. Both stationary and non-stationary models were developed using GAMLSS (Generalized Additive Models for Location, Scale and Shape) to model flood frequency with time, climate index, precipitation and temperature as major predictors. Results indicated: (1) two periods with increasing occurrence of floods, i.e., the late 1960s and the late 1990s with considerable fluctuations around 2-3 flood events during time intervals between the late 1960s and the late 1990s; (2) changes in the occurrence rates of floods were subject to nonstationarity. A persistent increase of flood frequency and magnitude was observed during the 1990s and reached a peak value; (3) AMO (Atlantic Multidecadal Oscillation) and AO (Arctic Oscillation) in winter were the key influencing climate indices impacting the occurrence rates of floods. However, NAO (North Atlantic Oscillation) and SOI (Southern Oscillation Index) are two principal factors that influence the occurrence rates of regional floods. The AIC (Akaike Information Criterion) values indicated that compared to the influence of climate indices, occurrence rates of floods seemed to be more sensitive to temperature and precipitation changes. Results of this study are important for flood management and development of mitigation measures.
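
    The stationary-versus-nonstationary comparison by AIC can be illustrated in a much-reduced form: fit the annual flood counts once with a constant Poisson rate and once with a rate that depends on time (or on a climate index), and compare the AICs. The sketch below uses synthetic counts and a plain Poisson GLM as a stand-in for the GAMLSS models used in the study.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic annual flood counts with a mild upward trend (illustration only).
rng = np.random.default_rng(3)
years = np.arange(1960, 2011)
counts = rng.poisson(lam=1.5 + 0.02 * (years - years[0]))

X_const = np.ones((len(years), 1))                    # stationary: constant rate
X_trend = sm.add_constant((years - years[0]) / 10.0)  # nonstationary: linear trend in time

for name, X in [("stationary", X_const), ("time trend", X_trend)]:
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
    print(f"{name:11s} AIC = {fit.aic:.1f}")
```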

  9. Variation in PAH-related DNA adduct levels among non-smokers: the role of multiple genetic polymorphisms and nucleotide excision repair phenotype.

    PubMed

    Etemadi, Arash; Islami, Farhad; Phillips, David H; Godschalk, Roger; Golozar, Asieh; Kamangar, Farin; Malekshah, Akbar Fazel-Tabar; Pourshams, Akram; Elahi, Seerat; Ghojaghi, Farhad; Strickland, Paul T; Taylor, Philip R; Boffetta, Paolo; Abnet, Christian C; Dawsey, Sanford M; Malekzadeh, Reza; van Schooten, Frederik J

    2013-06-15

    Polycyclic aromatic hydrocarbons (PAHs) likely play a role in many cancers even in never-smokers. We tried to find a model to explain the relationship between variation in PAH-related DNA adduct levels among people with similar exposures, multiple genetic polymorphisms in genes related to metabolic and repair pathways, and nucleotide excision repair (NER) capacity. In 111 randomly selected female never-smokers from the Golestan Cohort Study in Iran, we evaluated 21 SNPs in 14 genes related to xenobiotic metabolism and 12 SNPs in eight DNA repair genes. NER capacity was evaluated by a modified comet assay, and aromatic DNA adduct levels were measured in blood by 32P-postlabeling. Multivariable regression models were compared by Akaike's information criterion (AIC). Aromatic DNA adduct levels ranged between 1.7 and 18.6 per 10^8 nucleotides (mean: 5.8 ± 3.1). DNA adduct level was significantly lower in homozygotes for NAT2 slow alleles and ERCC5 non-risk-allele genotype, and was higher in the MPO homozygote risk-allele genotype. The sum of risk alleles in these genes significantly correlated with the log-adduct level (r = 0.4, p < 0.001). Compared with the environmental model, adding Phase I SNPs and NER capacity provided the best fit, and could explain 17% more of the variation in adduct levels. NER capacity was affected by polymorphisms in the MTHFR and ERCC1 genes. Female non-smokers in this population had PAH-related DNA adduct levels three to four times higher than smokers and occupationally-exposed groups in previous studies, with large inter-individual variation which could best be explained by a combination of Phase I genes and NER capacity. PMID:23175176

  10. Forecast of natural aquifer discharge using a data-driven, statistical approach.

    PubMed

    Boggs, Kevin G; Van Kirk, Rob; Johnson, Gary S; Fairley, Jerry P

    2014-01-01

    In the Western United States, demand for water is often out of balance with limited water supplies. This has led to extensive water rights conflict and litigation. A tool that can reliably forecast natural aquifer discharge months ahead of peak water demand could help water practitioners and managers by providing advance knowledge of potential water-right mitigation requirements. The timing and magnitude of natural aquifer discharge from the Eastern Snake Plain Aquifer (ESPA) in southern Idaho are accurately forecast 4 months ahead of the peak water demand, which occurs annually in July. An ARIMA time-series model with exogenous predictors (ARIMAX model) was used to develop the forecast. The ARIMAX model fit to a set of training data was assessed using Akaike's information criterion to select the optimal model that forecasts aquifer discharge, given the previous year's discharge and values of the predictor variables. Model performance was assessed by application of the model to a validation subset of data. The Nash-Sutcliffe efficiency for model predictions made on the validation set was 0.57. The predictor variables used in our forecast represent the major recharge and discharge components of the ESPA water budget, including variables that reflect overall water supply and important aspects of water administration and management. Coefficients of variation on the regression coefficients for streamflow and irrigation diversions were all much less than 0.5, indicating that these variables are strong predictors. The model with the highest AIC weight included streamflow, two irrigation diversion variables, and storage. PMID:24571388
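
    Selecting an ARIMAX specification by AIC, as described above, is commonly done by fitting a grid of (p, d, q) orders with the exogenous regressors included and keeping the order with the lowest AIC. Below is a hedged Python sketch with statsmodels' SARIMAX; the synthetic series and single exogenous predictor are placeholders for the ESPA discharge and water-budget variables, not the study's data.

```python
import numpy as np
import pandas as pd
from itertools import product
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly series standing in for aquifer discharge, with one
# exogenous predictor standing in for streamflow (illustration only).
rng = np.random.default_rng(4)
n = 120
exog = pd.Series(rng.normal(size=n)).rolling(3, min_periods=1).mean()
y = 10 + 0.8 * exog + pd.Series(rng.normal(scale=0.5, size=n)).cumsum() * 0.1

best = None
for p, d, q in product(range(3), range(2), range(3)):
    try:
        fit = SARIMAX(y, exog=exog, order=(p, d, q)).fit(disp=False)
    except Exception:
        continue
    if best is None or fit.aic < best[1]:
        best = ((p, d, q), fit.aic)

print("best (p, d, q) by AIC:", best[0], "AIC =", round(best[1], 1))
```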

  11. Detecting the small island effect and nestedness of herpetofauna of the West Indies.

    PubMed

    Gao, De; Perry, Gad

    2016-08-01

    To detect the small island effect (SIE) and nestedness patterns of the herpetofauna of the West Indies, we derived and updated data on the presence/absence of herpetofauna in this region from recently published reviews. We applied regression-based analyses, including linear regression and piecewise regressions with two and three segments, to detect the SIE, and used Akaike's information criterion (AIC) to select the best model. We used the NODF (a nestedness metric based on overlap and decreasing fill) to quantify nestedness and employed two null models to determine significance. Moreover, a random sampling effort was made to infer the degree of nestedness in portions of the entire community. We found that piecewise regression with three segments performed best, suggesting the species-area relationship possesses three different patterns separated by two area thresholds: a first one delimiting the SIE, and a second one delimiting evolutionary processes. We also found that taxa with lower resource requirements, higher dispersal ability, and stronger adaptation to the environment generally displayed lower threshold values, indicating that such taxonomic groups end the SIE phase earlier and begin in situ speciation at smaller island sizes. Moreover, the traditional two-segment piecewise regression method may yield poor estimates of both the slope and the threshold value of the SIE. Therefore, we suggest that previous SIE detection studies conducted with the two-segment piecewise regression method, ignoring the possibility of three segments, need to be reanalyzed. Antinestedness occurred in the entire system, whereas a high degree of nestedness could still occur in portions within the region. Nestedness may still be applicable to conservation planning for such portions even if the system is antinested at the regional scale. However, nestedness may not be applicable to conservation planning at the regional scale even if nestedness does exist
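
    Comparing linear, two-segment and three-segment species-area models by AIC can be sketched as follows: fit a continuous piecewise-linear model for each candidate set of breakpoints (grid-searched here), compute a least-squares AIC that also charges for the estimated breakpoints, and keep the best fit per segment count. The data and breakpoint grid below are synthetic illustrations, not the West Indies herpetofauna data.

```python
import numpy as np
from itertools import combinations

def fit_piecewise(x, y, breakpoints):
    """Continuous piecewise-linear least-squares fit with fixed breakpoints.
    Returns residual sum of squares and number of free parameters."""
    # Basis: intercept, x, and hinge terms max(0, x - b) for each breakpoint.
    X = np.column_stack([np.ones_like(x), x] + [np.maximum(0, x - b) for b in breakpoints])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return rss, X.shape[1] + len(breakpoints)  # coefficients + estimated breakpoints

def aic_ls(rss, n, k):
    return n * np.log(rss / n) + 2 * k

# Synthetic species-area data on log axes (illustration only).
rng = np.random.default_rng(5)
log_area = np.sort(rng.uniform(-2, 6, 120))
log_sp = np.where(log_area < 1, 0.5, 0.5 + 0.35 * (log_area - 1)) + rng.normal(0, 0.15, 120)

grid = np.linspace(-1, 5, 25)
results = {}
for n_seg in (1, 2, 3):
    best = (np.inf, ())
    for bps in combinations(grid, n_seg - 1):
        rss, k = fit_piecewise(log_area, log_sp, bps)
        best = min(best, (aic_ls(rss, len(log_sp), k), bps))
    results[n_seg] = best

for n_seg, (aic, bps) in results.items():
    print(f"{n_seg} segment(s): AIC = {aic:.1f}, breakpoints = {tuple(round(b, 2) for b in bps)}")
```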

  12. Landscape conditions predisposing grizzly bears to conflicts on private agricultural lands in the western USA

    USGS Publications Warehouse

    Wilson, S.M.; Madel, M.J.; Mattson, D.J.; Graham, J.M.; Merrill, T.

    2006-01-01

    We used multiple logistic regression to model how different landscape conditions contributed to the probability of human-grizzly bear conflicts on private agricultural ranch lands. We used locations of livestock pastures, traditional livestock carcass disposal areas (boneyards), beehives, and wetland-riparian associated vegetation to model the locations of 178 reported human-grizzly bear conflicts along the Rocky Mountain East Front, Montana, USA during 1986-2001. We surveyed 61 livestock producers in the upper Teton watershed of north-central Montana, to collect spatial and temporal data on livestock pastures, boneyards, and beehives for the same period, accounting for changes in livestock and boneyard management and beehive location and protection, for each season. We used 2032 random points to represent the null hypothesis of random location relative to potential explanatory landscape features, and used Akaike's Information Criterion (AIC/AICc) and Hosmer-Lemeshow goodness-of-fit statistics for model selection. We used a resulting "best" model to map contours of predicted probabilities of conflict, and used this map for verification with an independent dataset of conflicts to provide additional insights regarding the nature of conflicts. The presence of riparian vegetation and distances to spring, summer, and fall sheep or cattle pastures, calving and sheep lambing areas, unmanaged boneyards, and fenced and unfenced beehives were all associated with the likelihood of human-grizzly bear conflicts. Our model suggests that collections of attractants concentrated in high quality bear habitat largely explain broad patterns of human-grizzly bear conflicts on private agricultural land in our study area. © 2005 Elsevier Ltd. All rights reserved.
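
    AICc, the small-sample correction referenced above (AIC/AICc), adds a penalty that vanishes as the number of observations grows: AICc = AIC + 2k(k+1)/(n - k - 1). A short illustration follows; the parameter count of 8 is an assumption for the example, while the sample size of 2,210 roughly corresponds to the 178 conflict and 2,032 random locations mentioned above.

```python
def aicc(aic, k, n):
    """Small-sample corrected AIC: AICc = AIC + 2k(k+1)/(n - k - 1)."""
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# With 2,210 locations and, say, 8 estimated parameters the correction is
# tiny; with 40 observations it is not (numbers are illustrative).
print(round(aicc(1000.0, 8, 2210) - 1000.0, 3))  # ~0.065
print(round(aicc(100.0, 8, 40) - 100.0, 3))      # ~4.645
```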

  13. Time Series Analysis of Onchocerciasis Data from Mexico: A Trend towards Elimination

    PubMed Central

    Pérez-Rodríguez, Miguel A.; Adeleke, Monsuru A.; Orozco-Algarra, María E.; Arrendondo-Jiménez, Juan I.; Guo, Xianwu

    2013-01-01

    Background In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011 using the software R. Results A total of 15,584 cases were reported in Mexico from 1988 to 2011. The onchocerciasis case data come mainly from the main endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the trend of onchocerciasis cases two years ahead. According to the ARIMA model predictions, very low numbers of cases (below one) are expected between 2012 and 2013 in Chiapas, the last endemic region in Mexico. Conclusion The endemic regions of Mexico evolved from high onchocerciasis-endemic states to the interruption of transmission due to the strategies followed by the Mexican Secretariat of Health (MSH), based on treatment with ivermectin. The extremely low level of expected cases as predicted by ARIMA models for the next two years suggests that onchocerciasis is being eliminated in Mexico. To our knowledge, this is the first study utilizing time series to predict the case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance. PMID:23459370

  14. Long lead-time flood forecasting using data-driven modeling approaches

    NASA Astrophysics Data System (ADS)

    Bhatia, N.; He, J.; Srivastav, R. K.

    2014-12-01

    In spite of the numerous structural measures taken against floods, accurate flood forecasting is essential to considerably reduce damage in hazardous areas. The need to produce more accurate flow forecasts motivates researchers to develop advanced, innovative methods. In this study, it is proposed to develop a hybrid neural network model to exploit the strengths of artificial neural networks (ANNs). The proposed model has two components: (i) a dual-ANN model developed using river flows; and (ii) a Multiple Linear Regression (MLR) model trained on meteorological data (rainfall and snow on ground). Potential model inputs that best represent the river basin processes were selected in a stepwise manner by identifying input-output relationships using a linear approach, Partial Correlation Input Selection (PCIS), combined with the Akaike Information Criterion (AIC). The presented hybrid model was compared with three conventional methods: (i) a feed-forward artificial neural network (FF-ANN) using daily river flows; (ii) an FF-ANN applied to decomposed river flows (low flow, rising limb and falling limb of the hydrograph); and (iii) a recursive method for daily river flows with a lead time of 7 days. The applicability of the presented model is illustrated through daily river flow data of the Bow River, Canada. Data from 1912 to 1976 were used to train the models, while data from 1977 to 2006 were used to validate them. The results of the study indicate that the proposed model is robust enough to capture the non-linear nature of the hydrograph and is highly promising for forecasting peak flows (extreme values) well in advance (longer lead times).

  15. [Teacher Referral Information and Statistical Information Forms].

    ERIC Educational Resources Information Center

    Short, N. J.

    This rating information form, used to refer children to the PIC program, elicits information concerning the child's emotional, cognitive, and personality development. See TM 001 111 for details of the program in which it is used. (DLG)

  16. Information Skills for an Information Age?

    ERIC Educational Resources Information Center

    Gawith, Gwen

    1986-01-01

    Although information skills are the most basic of skills, the tendency is to teach strategies related to educational projects, erroneously assuming that these "information skills" are applicable to everyday decision-making. Educated imaginations are needed for today's variety of lifelong creative information situations. (17 references) (CJH)

  17. Information as Wealth.

    ERIC Educational Resources Information Center

    Deruchie, Douglas M.

    1992-01-01

    Discusses the value of information-based services in today's global economy. The combination of information technology with library and information services at an international accounting and auditing firm is described; the Global Information Network is explained; and the importance of the appropriate use of information is discussed. (LRW)

  18. Information-Mapped Chemistry.

    ERIC Educational Resources Information Center

    Olympia, P. L., Jr.

    1979-01-01

    This paper describes the use of information mapping in chemistry and in other related sciences. Information mapping is a way of presenting information without paragraphs and unnecessary transitional phrases. (BB)

  19. Information Practice and Malpractice.

    ERIC Educational Resources Information Center

    Mintz, Anne P.

    1985-01-01

    Discussion of extent of information malpractice highlights role of information broker, copyrights and fees, special library problems, protection against malpractice, contracts, ready reference risks, education against malpractice, continuing education, personal values, malpractice insurance, information producers, Dun and Bradstreet versus…

  20. Informed consent - adults

    MedlinePlus

    ... state). What Should Occur During the Informed Consent Process? When asking for your informed consent, your doctor ... What is Your Role in the Informed Consent Process? You are an important member of your health ...

  1. Advanced information society (1)

    NASA Astrophysics Data System (ADS)

    Ohira, Gosei

    In considering the relationship between informationization and industrial structure, this paper analyzes factors such as the information revolution, the informationization of industries, and the industrialization of information as background to the informationization of Japanese society. Next, some information indicators are introduced, such as the information coefficient of households (the share of information-related expenditure), the information coefficient of industry (the share of information-related cost in the total cost of production), and the information transmission census developed by the Ministry of Posts and Telecommunications. Then a new information indicator by the Economic Planning Agency, the electronic info-communication indicator, is presented. In this study, information activities are defined as producing messages or supplying services for the processing, storage, or sale of messages using electronic information equipment. International comparisons of the information labor force are also presented.

  2. Fireworks Information Center

    MedlinePlus

    ... Home / Safety Education / Safety Education Centers En Español Fireworks Information Center This is an information center on ... Video Put Safety First This Fourth of July Fireworks Information What are consumer fireworks and where are ...

  3. Energy information sheets

    SciTech Connect

    1995-07-01

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the public. The Energy Information Sheets was developed to provide general information on various aspects of fuel production, prices, consumption, and capability. Additional information on related subject matter can be found in other Energy Information Administration (EIA) publications as referenced at the end of each sheet.

  4. The Measurement of Information.

    ERIC Educational Resources Information Center

    Harmon, Glynn

    1984-01-01

    Views information as residual or catalytic form of energy which regulates other forms of energy in natural and artificial systems. Parallel human information processing (production systems, algorithms, heuristics) and information measurement are discussed. Suggestions for future research in area of parallel information processing include a matrix…

  5. Mission Medical Information System

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.; Joe, John C.; Follansbee, Nicole M.

    2008-01-01

    This viewgraph presentation gives an overview of the Mission Medical Information System (MMIS). The topics include: 1) What is MMIS?; 2) MMIS Goals; 3) Terrestrial Health Information Technology Vision; 4) NASA Health Information Technology Needs; 5) Mission Medical Information System Components; 6) Electronic Medical Record; 7) Longitudinal Study of Astronaut Health (LSAH); 8) Methods; and 9) Data Submission Agreement (example).

  6. Evaluating Internet Information Services.

    ERIC Educational Resources Information Center

    1996

    The following four papers focus on the topic of evaluating Internet information services: "Some Evaluation Criteria To Assess Internet Information Services" (Carmel Galvin); "The Teacher Librarian's Role as Evaluator of Internet Information Services" (Pru Mitchell); "How Students Evaluate Internet Information Services" (Ross Todd); and "Internet…

  7. Guideline 2: Informed Consent.

    ERIC Educational Resources Information Center

    American Journal on Mental Retardation, 2000

    2000-01-01

    The second in seven sets of guidelines based on the consensus of experts in the treatment of psychiatric and behavioral problems in mental retardation (MR) focuses on informed consent. Guidelines cover underlying concepts, usual components, informed consent as a process, information to include, what to provide, when to obtain informed consent, and…

  8. What Price Information.

    ERIC Educational Resources Information Center

    Hunter, Janne A.

    1984-01-01

    This essay considers problems with perceptions of the value of academic and public library information and thus with its marketing and pricing. Public perceptions of information, awareness of information services, value and cost of information, pricing details, and cooperation between libraries and providers of services are discussed. Seven…

  9. Seymour: Maryland's Information Retriever.

    ERIC Educational Resources Information Center

    Smith, Barbara G.

    1994-01-01

    Explains the development of an electronic information network in Maryland called Seymour that offers bibliographic records; full-text databases; community information databases; the ability to request information and materials; local, state, and federal information; and access to the Internet. Policy issues are addressed, including user fees and…

  10. Dialing Up Telecommunications Information.

    ERIC Educational Resources Information Center

    Bates, Mary Ellen

    1993-01-01

    Describes how to find accurate, current information about telecommunications industries, products and services, rates and tariffs, and regulatory information using electronic information resources available from the private and public sectors. A sidebar article provides contact information for producers and service providers. (KRN)

  11. Information Resource Management for Industrial Information Officers.

    ERIC Educational Resources Information Center

    Dosa, Marta

    This paper argues that the function of educational programs is to convey a sense of reality and an understanding of the open-endedness of information needs and situations; only such a reality orientation can instill the necessary flexibility in information professionals for effectively managing change. There is a growing consensus among…

  12. Intelligence, Information Technology, and Information Warfare.

    ERIC Educational Resources Information Center

    Davies, Philip H. J.

    2002-01-01

    Addresses the use of information technology for intelligence and information warfare in the context of national security and reviews the status of clandestine collection. Discusses hacking, human agent collection, signal interception, covert action, counterintelligence and security, and communications between intelligence producers and consumers…

  13. Geographic Names Information System

    USGS Publications Warehouse

    U.S. Geological Survey

    1984-01-01

    The Geographic Names Information System (GNIS) is an automated data system developed by the U.S. Geological Survey (USGS) to standardize and disseminate information on geographic names. GNIS provides primary information for all known places, features, and areas in the United States identified by a proper name. The information in the system can be manipulated to meet varied needs. You can incorporate information from GNIS into your own data base for special applications.

  14. Aquaculture information package

    SciTech Connect

    Boyd, T.; Rafferty, K.

    1998-08-01

    This package of information is intended to provide background information to developers of geothermal aquaculture projects. The material is divided into eight sections and includes information on market and price information for typical species, aquaculture water quality issues, typical species culture information, pond heat loss calculations, an aquaculture glossary, regional and university aquaculture offices and state aquaculture permit requirements. A bibliography containing 68 references is also included.

  15. Computational and human observer image quality evaluation of low dose, knowledge-based CT iterative reconstruction

    SciTech Connect

    Eck, Brendan L.; Fahmi, Rachid; Miao, Jun; Brown, Kevin M.; Zabic, Stanislav; Raihani, Nilgoun; Wilson, David L.

    2015-10-15

    Purpose: Aims in this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, P_C. Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detection in IMR was greater than FBP in all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit

  16. Evolution of biological information.

    PubMed

    Schneider, T D

    2000-07-15

    How do genetic systems gain information by evolutionary processes? Answering this question precisely requires a robust, quantitative measure of information. Fortunately, 50 years ago Claude Shannon defined information as a decrease in the uncertainty of a receiver. For molecular systems, uncertainty is closely related to entropy and hence has clear connections to the Second Law of Thermodynamics. These aspects of information theory have allowed the development of a straightforward and practical method of measuring information in genetic control systems. Here this method is used to observe information gain in the binding sites for an artificial 'protein' in a computer simulation of evolution. The simulation begins with zero information and, as in naturally occurring genetic systems, the information measured in the fully evolved binding sites is close to that needed to locate the sites in the genome. The transition is rapid, demonstrating that information gain can occur by punctuated equilibrium. PMID:10908337
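
    Measuring information as a decrease in the receiver's uncertainty can be illustrated with a toy calculation: the uncertainty (Shannon entropy) of a genome position before selection minus its uncertainty across a set of evolved binding sites. The sketch below is a schematic illustration of that bookkeeping only, not the simulation described in the paper.

```python
import math
from collections import Counter

def entropy(column):
    """Shannon uncertainty (bits) of one position in a set of aligned sites."""
    counts = Counter(column)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy aligned 'binding sites': before selection every base is equally likely
# (2 bits of uncertainty per position); after, one position is nearly fixed.
before = ["A", "C", "G", "T"] * 4
after = ["A"] * 14 + ["G"] * 2
print("information gain at this position:",
      round(entropy(before) - entropy(after), 2), "bits")
```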

  17. Energy information sheets

    SciTech Connect

    Not Available

    1993-12-02

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. Written for the general public, the EIA publication Energy Information Sheets was developed to provide information on various aspects of fuel production, prices, consumption and capability. The information contained herein pertains to energy data as of December 1991. Additional information on related subject matter can be found in other EIA publications as referenced at the end of each sheet.

  18. Types of quantum information

    SciTech Connect

    Griffiths, Robert B.

    2007-12-15

    Quantum, in contrast to classical, information theory, allows for different incompatible types (or species) of information which cannot be combined with each other. Distinguishing these incompatible types is useful in understanding the role of the two classical bits in teleportation (or one bit in one-bit teleportation), for discussing decoherence in information-theoretic terms, and for giving a proper definition, in quantum terms, of 'classical information.' Various examples (some updating earlier work) are given of theorems which relate different incompatible kinds of information, and thus have no counterparts in classical information theory.

  19. Layers of Information: Geographic Information Systems (GIS).

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.

    2003-01-01

    Describes the Geographic Information System (GIS) which is capable of storing, manipulating, and displaying data allowing students to explore complex relationships through scientific inquiry. Explains applications of GIS in middle school classrooms and includes assessment strategies. (YDS)

  20. TOXLINE (TOXICOLOGY INFORMATION ONLINE)

    EPA Science Inventory

    TOXLINE (TOXicology information onLINE) is the National Library of Medicine's extensive collection of online bibliographic information covering the pharmacological, biochemical, physiological, and toxicological effects of drugs and other chemicals. TOXLINE and TOXLINE65 together...

  1. Advanced information society (2)

    NASA Astrophysics Data System (ADS)

    Masuyama, Keiichi

    Our modern life is full of information, and information permeates our daily lives. Telecommunication networking extends to the societal, corporate, and individual levels. Although we have only just entered the advanced information society, the business world and our daily lives have already been steadily transformed by the advancement of information networks. This advancement of information strongly influences the economy and will play the main role in the expansion of domestic demand. This paper tries to sketch the image of the coming advanced information society in the development of the intelligent city, focusing on the transformation of business life and of daily life, enriched by the spread of everyday information and of visual information delivered by satellite systems.

  2. General Information about Melanoma

    MedlinePlus

    ... Screening Research Melanoma Treatment (PDQ®)–Patient Version General Information About Melanoma Go to Health Professional Version Key ... the PDQ Adult Treatment Editorial Board . Clinical Trial Information A clinical trial is a study to answer ...

  3. Information Management Using Microcomputers.

    ERIC Educational Resources Information Center

    Adams, Larry

    1985-01-01

    Commercially available software is described that can help manage information in speech and hearing clinics. Applications addressed include word processing, database management, columnar analysis, integrated programs, specialized information management programs, and networks. (CL)

  4. Closing the Information Gap.

    ERIC Educational Resources Information Center

    Landgraf, Kurt

    2003-01-01

Describes an information-dissemination program by the Educational Testing Service called "Log On, Let's Talk," aimed at informing parents, administrators, teachers, and policymakers about the role of testing in public schools. Describes pilot efforts in Sacramento, California, and Harrisburg, Pennsylvania. (PKP)

  5. Information Technology: A Bibliography.

    ERIC Educational Resources Information Center

    Wright, William F.; Hawkins, Donald T.

    1981-01-01

    This selective annotated bibliography lists 86 references on the following topics: future technology for libraries, library automation, paperless information systems; computer conferencing and electronic mail, videotext systems, videodiscs, communications technology, networks, information retrieval, cataloging, microcomputers, and minicomputers.…

  6. Federal Energy Information Systems.

    ERIC Educational Resources Information Center

    Coyne, Joseph G.; Moneyhun, Dora H.

    1979-01-01

    Describes the Energy Information Administration (EIA) and the Technical Information Center (TIC), and lists databases accessible online to the Department of Energy and its contractors through DOE/RECON. (RAA)

  7. Indiana Health Information Exchange

    Cancer.gov

The Indiana Health Information Exchange comprises various Indiana health care institutions; it was established to help improve patient safety and is recognized as a best practice for health information exchange.

  8. Keeping Public Information Public.

    ERIC Educational Resources Information Center

    Kelley, Wayne P.

    1998-01-01

    Discusses the trend toward the transfer of federal government information from the public domain to the private sector. Topics include free access, privatization, information-policy revision, accountability, copyright issues, costs, pricing, and market needs versus public needs. (LRW)

  9. National Health Information Center

    MedlinePlus

  10. Information As a Resource.

    ERIC Educational Resources Information Center

    Cleveland, Harlan

    1982-01-01

    In the emerging post-industrial society, there is little understanding of the characteristics of information, a basic, yet abstract resource. Information is expandable, compressible, substitutable, transportable, diffusive, and shareable. Implications for life, work, community, and conflict are considered. (AM)

  11. Congenital Heart Information Network

    MedlinePlus

  12. Public information guidelines

    SciTech Connect

    1986-06-01

    The purpose of these Public Information Guidelines is to provide principles for the implementation of the NWPA mandate and the Mission Plan requirements for the provision of public information. These Guidelines set forth the public information policy to be followed by all Office of Civilian Radioactive Waste Management (OCRWM) performance components. The OCRWM offices should observe these Guidelines in shaping and conducting public information activities.

  13. Quick Information Sheets. 1988.

    ERIC Educational Resources Information Center

    Wisconsin Univ., Madison. Trace Center.

    The Trace Center gathers and organizes information on communication, control, and computer access for handicapped individuals. The information is disseminated in the form of brief sheets describing print, nonprint, and organizational resources and listing addresses and telephone numbers for ordering or for additional information. This compilation…

  14. Information Resource Management.

    ERIC Educational Resources Information Center

    Rossmeier, Joseph G.

    1981-01-01

    Explains the function of information resources, predicts the increased importance of computers, and emphasizes that computer information systems are a resource to support institutional operation and management. Enumerates the components in planning for the use of information technology. Presents the Virginia Community College System's information…

  15. Mobile Student Information System

    ERIC Educational Resources Information Center

    Asif, Muhammad; Krogstie, John

    2011-01-01

    Purpose: A mobile student information system (MSIS) based on mobile computing and context-aware application concepts can provide more user-centric information services to students. The purpose of this paper is to describe a system for providing relevant information to students on a mobile platform. Design/methodology/approach: The research…

  16. Ethics of Information Supply.

    ERIC Educational Resources Information Center

    Oppenheim, Charles

    This discussion of the ethics of the information process provides a brief review of the process of information supply and flow, primarily in science and technology; looks at various points in the flow of information; and highlights particular ethical concerns. Facets of the process discussed in more detail include ways in which some scientists…

  17. Information Highway's Educator's Update.

    ERIC Educational Resources Information Center

    Cornell, Richard; And Others

    1994-01-01

    Describes construction of the information highway and some of the current players. Alternative uses of the information highway being developed by several private companies are described. The impact of the information highway on education and how it will be delivered to all grades and ages are considered. (Contains two references.) (KRN)

  18. Government Information Policy.

    ERIC Educational Resources Information Center

    Dearstyne, Bruce W.; And Others

    1991-01-01

    Six articles discuss government information policy in context of technology and electronic records; policies on information resources management from OMB (Office of Management and Budget); state information resources, including Council of State Governments (CSG); state record laws and preservation of archival records; and management of electronic…

  19. Is Information Still Relevant?

    ERIC Educational Resources Information Center

    Ma, Lia

    2013-01-01

    Introduction: The term "information" in information science does not share the characteristics of those of a nomenclature: it does not bear a generally accepted definition and it does not serve as the bases and assumptions for research studies. As the data deluge has arrived, is the concept of information still relevant for information…

  20. Information in the Economy.

    ERIC Educational Resources Information Center

    Bearman, Toni Carbo; And Others

    1988-01-01

    Presents the Glenerin Declaration, a statement that resulted from conferences among information leaders from the United States, United Kingdom, and Canada. A series of articles discusses the issues addressed by the declaration, including access to information; technology, innovation, and productivity; management of information; and national and…

  1. Evaluating Health Information

    MedlinePlus

    Millions of consumers get health information from magazines, TV or the Internet. Some of the information is reliable and up to date; some is not. How can ... the site have an editorial board? Is the information reviewed before it is posted? Be skeptical. Things ...

  2. Developing an Information Strategy

    ERIC Educational Resources Information Center

    Hanson, Terry

    2011-01-01

    The purpose of an information strategy is to highlight the extent to which a modern, complex organization depends on information, in all of its guises, and to consider how this strategic asset should be managed. This dependency has always been present and nowhere more so than in universities, whose very purpose is built around information and its…

  3. Pricing of Information.

    ERIC Educational Resources Information Center

    Furneaux, M. I. P.; Newton, J.

This essay considers the cost of information retrieval by databases and information centers, and explores the need to charge users for the information supplied. The advantages and disadvantages of three means of charging users are discussed: (1) connect hour charge, (2) print/type charge, and (3) subscription. Also addressed is the practice of…

  4. Security classification of information

    SciTech Connect

    Quist, A.S.

    1993-04-01

This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to provide added support to some of those classification principles.

  5. Information for Agricultural Development.

    ERIC Educational Resources Information Center

    Kaungamno, E. E.

    This paper describes the major international agricultural information services, sources, and systems; outlines the existing information situation in Tanzania as it relates to problems of agricultural development; and reviews the improvements in information provision resources required to support the process of agricultural development in Tanzania.…

  6. Recruitment and Information Program

    ERIC Educational Resources Information Center

    Liebergott, Harvey

    1976-01-01

    The Bureau of Education for the Handicapped's Recruitment and Information Program provides parents and other interested individuals with information on the educational needs of handicapped children through such activities as the National Information Center for the Handicapped ("Closer Look"), pamphlets on various subjects, and media campaigns that…

  7. Teaching Information Technology Law

    ERIC Educational Resources Information Center

    Taylor, M. J.; Jones, R. P.; Haggerty, J.; Gresty, D.

    2009-01-01

    In this paper we discuss an approach to the teaching of information technology law to higher education computing students that attempts to prepare them for professional computing practice. As information technology has become ubiquitous its interactions with the law have become more numerous. Information technology practitioners, and in particular…

  8. Europe and Information Science.

    ERIC Educational Resources Information Center

    Ingwersen, Peter

    1997-01-01

    Discusses recent European library and information science (LIS) events. Describes the development and use of regional and intra-European Union networks for science. Highlights three European conferences held in 1996: ACM-SIGIR on information retrieval held in Switzerland, Information Seeking in Context (ISIC) held in Finland, and Conceptions of…

  9. Teaching Information Skills: Perspective.

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2001-01-01

    Discusses "Information Literacy Standards" for school library media specialists that were included in the 1998 edition of "Information Power" and presents a lesson plan for middle school students on global warming that focuses on the standard addressing perspective, or point of view, and incorporates an information process model. (LRW)

  10. Personal, Anticipated Information Need

    ERIC Educational Resources Information Center

    Bruce, Harry

    2005-01-01

    Background: The role of personal information collections is a well known feature of personal information management. The World Wide Web has introduced to such collections ideas such as filing Web pages or noting their existence in "Bookmarks" and "Favourites". Argument: It is suggested that personal information collections are…

  11. Miami University Information Manual.

    ERIC Educational Resources Information Center

    Miami Univ., Oxford, OH.

    The 1975 information manual is designed to provide current data on policies, procedures, services, facilities, organization and governance of Miami University and, through the extensive index, quick access to this information. The manual is complementary to the university catalog and directory. Information relating to students is in the Student…

  12. Collaborative Information Synthesis.

    ERIC Educational Resources Information Center

    Blake, Catherine; Pratt, Wanda

    2002-01-01

    Discusses the growth in the quantity of scientific literature and advances in information retrieval techniques to find relevant articles. Explores user behavior and information requirements of scientists as they interact with medical literature to answer research questions and introduces METIS (Multi-user extraction and information synthesis) to…

  13. Connectionist Interaction Information Retrieval.

    ERIC Educational Resources Information Center

    Dominich, Sandor

    2003-01-01

    Discussion of connectionist views for adaptive clustering in information retrieval focuses on a connectionist clustering technique and activation spreading-based information retrieval model using the interaction information retrieval method. Presents theoretical as well as simulation results as regards computational complexity and includes…

  14. DNA as information.

    PubMed

    Wills, Peter R

    2016-03-13

    This article reviews contributions to this theme issue covering the topic 'DNA as information' in relation to the structure of DNA, the measure of its information content, the role and meaning of information in biology and the origin of genetic coding as a transition from uninformed to meaningful computational processes in physical systems. PMID:26857666

  15. Energy information directory 1995

    SciTech Connect

    1995-10-01

The National Energy Information Center provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. This Energy Information Directory is used to assist the Center staff as well as other DOE staff in directing inquiries to the proper offices.

  16. Government Information Policy Principles.

    ERIC Educational Resources Information Center

    Hernon, Peter

    1991-01-01

    Analyzes the utility of policy principles advanced by professional associations for public access to government information. The National Commission on Libraries and Information Science (NCLIS), the Information Industry Association (IIA), and the Office of Technology Assessment (OTA) urge the adoption of principles for the dissemination of public…

  17. America's Rural Information Resource.

    ERIC Educational Resources Information Center

    La Caille John, Patricia

    The Rural Information Center (RIC), a project of two agencies of the U.S. Department of Agriculture, has served rural information needs since 1988. The targeted audience for the RIC is local officials and citizens, rather than scientists and federal officials, and the thrust of its information is rural development rather than production…

  18. Medical Information Systems.

    ERIC Educational Resources Information Center

    Smith, Kent A.

    1986-01-01

    Description of information services from the National Library of Medicine (NLM) highlights a new system for retrieving information from NLM's databases (GRATEFUL MED); a formal Regional Medical Library Network; DOCLINE; the Unified Medical Language System; and Integrated Academic Information Management Systems. Research and development and the…

  19. Shopping for health information.

    PubMed

    Goldstein, M L; Mailander, N K; Danner, R A

    2000-01-01

    In this time of ongoing health care changes, consumers need to become better informed to actively participate in their health care decisions. As a result, hospital libraries are being challenged to address this need. Scottsdale Healthcare's Health Sciences Libraries have responded to this challenge by establishing a Health Information Center at the premiere shopping mall in the area. Implementing a Health Information Center at a mall is a unique way to bring medical information to the community. The purpose of this paper is to describe the planning process, the implementation, and the future vision of the Health Information Center at Scottsdale Fashion Square. PMID:11299612

  20. Is symmetry informative?

    PubMed

    Gray, J E; Vogt, A

    1997-01-01

    Is symmetry informative? The answer is both yes and no. We examine what information and symmetry are and how they are related. Our approach is primarily mathematical, not because mathematics provides the final word, but because it provides an insightful and relatively precise starting point. Information theory treats transformations that messages undergo from source to destination. Symmetries are information that leave some property of interest unchanged. In this respect the studies of information and symmetry can both be regarded as a Quest for the identity transformation. PMID:9224554

  1. 78 FR 7463 - Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... SPACE ADMINISTRATION Information Collection AGENCY: National Aeronautics and Space Administration (NASA... to take this opportunity to comment on proposed and/or continuing information collections, as... INFORMATION CONTACT: Requests for additional information or copies of the information collection...

  2. Information gain and information leak in quantum measurements

    NASA Astrophysics Data System (ADS)

    Xi, Zhengjun

    2016-05-01

We discuss the relationships among various quantities of information during the process of an efficient quantum measurement, e.g., information gain, quantum loss, Holevo information, and coherent information. In particular, we give an uncertainty-like relation between information gain and coherent information. We also investigate the information gain by local measurements and quantum correlations in bipartite quantum systems. Moreover, we discuss two cases of information leak according to whether the observer of the environment possesses extra information about the measured system.
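
    For orientation, the two central quantities named in this abstract have standard textbook definitions, reproduced here as a hedged reminder (the paper's own notation and conventions may differ). For a measurement with Kraus operators {M_i} acting on a state ρ, and for a bipartite state ρ_AB:

```latex
% Standard definitions quoted for orientation; not taken from the paper itself.
\begin{align}
  p_i &= \mathrm{Tr}\,(M_i \rho M_i^{\dagger}), \qquad
  \rho_i = \frac{M_i \rho M_i^{\dagger}}{p_i}
    && \text{(outcome probabilities and post-measurement states)}\\[4pt]
  I_{\mathrm{gain}} &= S(\rho) - \sum_i p_i\, S(\rho_i)
    && \text{(Groenewold information gain of the measurement)}\\[4pt]
  I_c(A\rangle B) &= S(\rho_B) - S(\rho_{AB})
    && \text{(coherent information of the bipartite state } \rho_{AB}\text{)}
\end{align}
```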

  3. Canonical information analysis

    NASA Astrophysics Data System (ADS)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-03-01

Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical airborne data. The simulation study shows that canonical information analysis is as accurate as and much faster than algorithms presented in previous work, especially for large sample sizes.
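
    The core computation the abstract describes — maximizing an estimate of mutual information over pairs of linear combinations — can be sketched in a few lines. The sketch below is not the authors' implementation: it uses a crude histogram plug-in estimator and a generic optimizer in place of the fast kernel density estimator the paper relies on, and the data and variable names are invented for illustration.

```python
# Minimal sketch of the idea behind canonical information analysis:
# find projection vectors a, b maximizing the mutual information
# between a'X and b'Y (here estimated from a joint 2-D histogram).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def mutual_information(u, v, bins=16):
    """Plug-in mutual information estimate (in nats) from a joint 2-D histogram."""
    joint, _, _ = np.histogram2d(u, v, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of u
    py = pxy.sum(axis=0, keepdims=True)   # marginal of v
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

def neg_mi(params, X, Y):
    """Objective: negative MI between the projections X @ a and Y @ b."""
    a, b = params[:X.shape[1]], params[X.shape[1]:]
    return -mutual_information(X @ a, Y @ b)

# Toy data with a nonlinear (quadratic) association that plain canonical
# correlation analysis would largely miss.
X = rng.normal(size=(2000, 3))
Y = np.column_stack([X[:, 0] ** 2 + 0.1 * rng.normal(size=2000),
                     rng.normal(size=2000)])

x0 = rng.normal(size=X.shape[1] + Y.shape[1])
result = minimize(neg_mi, x0, args=(X, Y), method="Nelder-Mead")
print("estimated MI of the leading canonical information pair:", -result.fun)
```

    A kernel density estimator, as used in the paper, would give a smoother objective than the histogram used here and so behave better under numerical optimization.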

  4. Advanced information society(4)

    NASA Astrophysics Data System (ADS)

    Hiratsuka, Shinji

This paper proposes that, as a countermeasure against the concentration of information activities in the Tokyo capital region, both information infrastructure and urban spaces adapted for the exchange and transmission of information need to be built in local regions. The development of information activities in local regions requires urban spaces with high amenity that promote the character of the city and allow various kinds of human activities to take place. To make this a reality, urban spaces should provide (1) a presentation function for information transmission; (2) a cultural invention function related to knowledge production; and (3) a "Mediacity" that acts as the agent of information exchange and carries out the international exchange function.

  5. Information barriers and authentication.

    SciTech Connect

    MacArthur, D. W.; Wolford, J. K.

    2001-01-01

Acceptance of nuclear materials into a monitoring regime is complicated if the materials are in classified shapes or have classified composition. An attribute measurement system with an information barrier can be employed to generate an unclassified display from classified measurements. This information barrier must meet two criteria: (1) classified information cannot be released to the monitoring party, and (2) the monitoring party must be convinced that the unclassified output accurately represents the classified input. Criterion 1 is critical to the host country to protect the classified information. Criterion 2 is critical to the monitoring party and is often termed the 'authentication problem.' Thus, the necessity for authentication of a measurement system with an information barrier stems directly from the description of a useful information barrier. Authentication issues must be continually addressed during the entire development lifecycle of the measurement system as opposed to being applied only after the system is built.

  6. Regional Health Information Systems

    PubMed Central

    Fuller, Sherrilynne

    1997-01-01

In general, there is agreement that robust integrated information systems are the foundation for building successful regional health care delivery systems. Integrated Advanced Information Management System (IAIMS) institutions that, over the years, have developed strategies for creating cohesive institutional information systems and services are finding that IAIMS strategies work well in the even more complex regional environment. The key elements of IAIMS planning are described and lessons learned are discussed in the context of regional health information systems development. The challenges of aligning the various information agencies and agendas in support of a regional health information system are complex; however, the potential rewards for health care in quality, efficacy, and cost savings are enormous. PMID:9067887

  7. Freedom of Information Act

    USGS Publications Warehouse

    Newman, D.J.

    2012-01-01

The Freedom of Information Act (FOIA), 5 U.S.C. § 552, as amended, generally provides that any person has a right to request access to Federal agency records. The USGS proactively promotes information disclosure as inherent to its mission of providing objective science to inform decisionmakers and the general public. USGS scientists disseminate up-to-date and historical scientific data that are critical to addressing national and global priorities.

  8. Value of Information spreadsheet

    SciTech Connect

    Trainor-Guitton, Whitney

    2014-05-12

This spreadsheet represents the information posteriors derived from synthetic data of magnetotellurics (MT). These were used to calculate the value of information of MT for geothermal exploration. Information posteriors describe how well MT was able to locate the "throat" of clay caps, which are indicative of hidden geothermal resources. The data are fully explained in the peer-reviewed publication: Trainor-Guitton, W., Hoversten, G. M., Ramirez, A., Roberts, J., Júlíusson, E., Key, K., Mellors, R. (Sept-Oct. 2014) The value of spatial information for determining well placement: a geothermal example, Geophysics.
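
    Value of information, as used in this record, is a standard decision-analysis quantity: the expected value of the best decision made after seeing the survey result minus the expected value of the best decision made on the prior alone. The sketch below is a generic two-state, two-action illustration with made-up probabilities and payoffs; it is not the spreadsheet's actual numbers or the MT posteriors it contains.

```python
# Hedged, generic value-of-information calculation; all figures invented.
import numpy as np

# Prior on the subsurface state and payoffs (e.g. $M) for each action.
prior = np.array([0.3, 0.7])                  # P(resource), P(no resource)
payoff = np.array([[50.0, -20.0],             # drill: value if resource / if not
                   [0.0,    0.0]])            # walk away

# Reliability of the (hypothetical) MT interpretation: P(message | true state).
likelihood = np.array([[0.8, 0.2],            # message "favourable"
                       [0.2, 0.8]])           # message "unfavourable"

# Value of the prior (uninformed) decision.
v_prior = (payoff @ prior).max()

# Value with the imperfect MT information: for each possible message,
# update the prior with Bayes' rule and take the best action.
p_msg = likelihood @ prior
posteriors = likelihood * prior / p_msg[:, None]
v_posterior = sum(p_msg[m] * (payoff @ posteriors[m]).max() for m in range(2))

print("value of the MT survey (VOI):", v_posterior - v_prior)
```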

  9. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.

  10. Information management for clinicians.

    PubMed

    Mehta, Neil B; Martin, Stephen A; Maypole, Jack; Andrews, Rebecca

    2016-08-01

    Clinicians are bombarded with information daily by social media, mainstream television news, e-mail, and print and online reports. They usually do not have much control over these information streams and thus are passive recipients, which means they get more noise than signal. Accessing, absorbing, organizing, storing, and retrieving useful medical information can improve patient care. The authors outline how to create a personalized stream of relevant information that can be scanned regularly and saved so that it is readily accessible. PMID:27505880

  11. Economics of information intermediaries

    SciTech Connect

    Sass, T.R.

    1984-01-01

For the last 20 years, economists have studied the economics of information, with emphasis on the effect costly information has on the organization of markets. In contrast, little attention has been paid to the market for information itself. Typically, buyers are assumed to produce information for themselves through search. In many markets, however, buyers and sellers don't act independently, but rather exchange information through an intermediary such as a real estate broker or employment agency. Such information middlemen are the subject of this dissertation. In particular, two important aspects are explored: the demand for information intermediaries, and the choice of contractual arrangements under which they are employed. The analysis of demand begins with individuals transacting in an environment of costly information, without the aid of a middleman. Sellers advertise their goods for sale and buyers pursue search and measurement to obtain information on goods for sale. The various functions of middlemen are examined and incorporated into the model. Finally, the gains from using a middleman and the resulting demand for their services are analyzed. Testable implications regarding the use of intermediaries are then derived.

  12. Quantum information causality.

    PubMed

    Pitalúa-García, Damián

    2013-05-24

    How much information can a transmitted physical system fundamentally communicate? We introduce the principle of quantum information causality, which states the maximum amount of quantum information that a quantum system can communicate as a function of its dimension, independently of any previously shared quantum physical resources. We present a new quantum information task, whose success probability is upper bounded by the new principle, and show that an optimal strategy to perform it combines the quantum teleportation and superdense coding protocols with a task that has classical inputs. PMID:23745844

  13. Secure Information Sharing

    2005-09-09

We are developing a peer-to-peer system to support secure, location-independent information sharing in the scientific community. Once complete, this system will allow seamless and secure sharing of information between multiple collaborators. The owners of information will be able to control how the information is stored, managed, and shared. In addition, users will have faster access to information updates within a collaboration. Groups collaborating on scientific experiments have a need to share information and data. This information and data is often represented in the form of files and database entries. In a typical scientific collaboration, there are many different locations where data would naturally be stored. This makes it difficult for collaborators to find and access the information they need. Our goal is to create a lightweight file-sharing system that makes it easy for collaborators to find and use the data they need. This system must be easy to use, easy to administer, and secure. Our information-sharing tool uses group communication, in particular the InterGroup protocols, to reliably deliver each query to all of the current participants in a scalable manner, without having to discover all of their identities. We will use the Secure Group Layer (SGL) and Akenti to provide security to the participants of our environment. SGL will provide confidentiality, integrity, authenticity, and authorization enforcement for the InterGroup protocols, and Akenti will provide access control to other resources.

  14. Advanced information society (10)

    NASA Astrophysics Data System (ADS)

    Masuyama, Keiichi

Informationalization in Japan has spread widely and deeply through various fields of industrial and social life, driven by drastic advances in technology and networking. In view of the change in industrial structure as well as international trends in information, the Japanese Government regards the role of information and communication technology as infrastructure to be important, and is putting in place various measures together with the ministries and agencies concerned. This paper describes how administrative agencies involved in information and communication, such as the Ministry of Postal Services and the Ministry of International Trade and Industry, cope with informationalization, and discusses the future direction of information policies.

  15. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  16. Health Information Systems.

    PubMed

    Sirintrapun, S Joseph; Artz, David R

    2016-03-01

    This article provides surgical pathologists an overview of health information systems (HISs): what they are, what they do, and how such systems relate to the practice of surgical pathology. Much of this article is dedicated to the electronic medical record. Information, in how it is captured, transmitted, and conveyed, drives the effectiveness of such electronic medical record functionalities. So critical is information from pathology in integrated clinical care that surgical pathologists are becoming gatekeepers of not only tissue but also information. Better understanding of HISs can empower surgical pathologists to become stakeholders who have an impact on the future direction of quality integrated clinical care. PMID:26851670

  17. Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution

    NASA Astrophysics Data System (ADS)

    Xuan, C.; Oda, H.

    2013-12-01

The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with the remanence of the continuous sample. The convolution process results in smoothed measurements and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of sensor response and improve the resolution of continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while the cube is placed at each of the 5 × 5 grid positions over a 2 × 2 cm² area on the cross section. The acquired sensor response reveals that cross terms (i.e., the response of the pick-up coil for one axis to magnetic signal along the other axes), which were often omitted in previous deconvolution practice, are clearly not negligible. Utilizing this detailed estimate of the magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution.
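
    The forward problem the abstract describes — the measured signal is the true remanence convolved with the sensor response — and the benefit of deconvolution can be shown with a toy one-dimensional example. The sketch below is emphatically not UDECON: it uses a made-up Gaussian sensor response, simple Tikhonov regularization, and a discrepancy-principle choice of the smoothing parameter in place of the ABIC minimization of Oda and Shibuya (1996).

```python
# Toy convolution / regularized-deconvolution example (illustration only).
import numpy as np

rng = np.random.default_rng(1)
n = 200
z = np.arange(n)

# Synthetic "true" remanence and a Gaussian stand-in for the sensor response.
true_signal = np.zeros(n)
true_signal[60:80] = 1.0
true_signal[120:125] = -0.5
response = np.exp(-0.5 * ((z - n // 2) / 6.0) ** 2)
response /= response.sum()

# Convolution matrix (circular boundary for simplicity) and a noisy measurement.
G = np.array([np.roll(response, k - n // 2) for k in range(n)]).T
sigma = 0.01
measured = G @ true_signal + sigma * rng.normal(size=n)

# Second-difference roughness penalty (Tikhonov regularization).
D = np.diff(np.eye(n), 2, axis=0)

def deconvolve(lam):
    """Regularized least-squares estimate of the true signal."""
    return np.linalg.solve(G.T @ G + lam * D.T @ D, G.T @ measured)

# Choose the smoothing parameter whose residual matches the noise level
# (Morozov's discrepancy principle) -- a stand-in for ABIC minimization.
lams = np.logspace(-6, 2, 40)
best = min(lams, key=lambda lam: abs(np.mean((G @ deconvolve(lam) - measured) ** 2) - sigma ** 2))
restored = deconvolve(best)
print("chosen lambda:", best, " max restored value:", restored.max())
```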

  18. EDUCATIONAL INFORMATION PROJECT.

    ERIC Educational Resources Information Center

    LINDQUIST, E.F.; AND OTHERS

    TO AID DATA COLLECTION ANALYSIS, STORAGE, AND DISSEMINATION, INSTRUMENTS AND PROCEDURES WERE DEVELOPED FOR COLLECTING INFORMATION ON ALL ASPECTS OF THE EDUCATIONAL PROGRAM FOR A LARGE POPULATION OF SCHOOLS, INCLUDING INFORMATION ON INDIVIDUAL PUPILS, SCHOOL PERSONNEL, SCHOOLS, AND SCHOOL DISTRICTS. COMPUTER PROGRAMS AND DATA-PROCESSING TECHNIQUES…

  19. Hybrid quantum information processing

    SciTech Connect

    Furusawa, Akira

    2014-12-04

I will briefly explain the definition and advantage of hybrid quantum information processing, which is the hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  20. Accessibility of Outdated Information

    ERIC Educational Resources Information Center

    O'Brien, Edward J.; Cook, Anne E.; Gueraud, Sabine

    2010-01-01

    In 2 previous studies (O'Brien, Rizzella, Albrecht, & Halleran, 1998; Zwaan & Madden, 2004), researchers have provided conflicting accounts about whether outdated information continues to influence the comprehension of subsequent text. The current set of experiments was designed to explore further the impact of outdated information on…

  1. Parallel Information Processing.

    ERIC Educational Resources Information Center

    Rasmussen, Edie M.

    1992-01-01

    Examines parallel computer architecture and the use of parallel processors for text. Topics discussed include parallel algorithms; performance evaluation; parallel information processing; parallel access methods for text; parallel and distributed information retrieval systems; parallel hardware for text; and network models for information…

  2. Addressing Information Security Risk

    ERIC Educational Resources Information Center

    Qayoumi, Mohammad H.; Woody, Carol

    2005-01-01

    Good information security does not just happen--and often does not happen at all. Resources are always in short supply, and there are always other needs that seem more pressing. Why? Because information security is hard to define, the required tasks are unclear, and the work never seems to be finished. However, the loss to the organization can be…

  3. Information about Musculoskeletal Conditions

    MedlinePlus

  4. Information across Heterogeneous Media.

    ERIC Educational Resources Information Center

    Fricke, Martin

    1996-01-01

    Proposes a framework for analyzing information portrayed in different media, for example, in text and in diagrams. The framework attaches information to propositions, then analyzes text and diagrams as being interpreted languages with the ability to refer to propositions. The approach is illustrated with Venn diagrams. (Author/LRW)

  5. Dimensions of Drug Information

    ERIC Educational Resources Information Center

    Sharp, Mark E.

    2011-01-01

    The high number, heterogeneity, and inadequate integration of drug information resources constitute barriers to many drug information usage scenarios. In the biomedical domain there is a rich legacy of knowledge representation in ontology-like structures that allows us to connect this problem both to the very mature field of library and…

  6. Mandarin Visual Speech Information

    ERIC Educational Resources Information Center

    Chen, Trevor H.

    2010-01-01

    While the auditory-only aspects of Mandarin speech are heavily-researched and well-known in the field, this dissertation addresses its lesser-known aspects: The visual and audio-visual perception of Mandarin segmental information and lexical-tone information. Chapter II of this dissertation focuses on the audiovisual perception of Mandarin…

  7. Enhanced Information Exclusion Relations

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing

    2016-07-01

    In Hall’s reformulation of the uncertainty principle, the entropic uncertainty relation occupies a core position and provides the first nontrivial bound for the information exclusion principle. Based upon recent developments on the uncertainty relation, we present new bounds for the information exclusion relation using majorization theory and combinatoric techniques, which reveal further characteristic properties of the overlap matrix between the measurements.
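
    For orientation, the two inequalities this abstract builds on are often quoted in the following standard forms (a hedged reminder, not the paper's new bounds; conventions vary by author). Here X and Z are observables with eigenbases {|x⟩}, {|z⟩} on a d-dimensional system, and c = max_{x,z} |⟨x|z⟩|² is the maximum overlap of the two bases.

```latex
% Commonly quoted forms, for orientation only; the paper derives tighter bounds.
\begin{align}
  H(X) + H(Z) &\ge \log_2 \frac{1}{c}
    && \text{(Maassen--Uffink entropic uncertainty relation)}\\[4pt]
  I(X{:}B) + I(Z{:}B) &\le \log_2\!\left(d^{2} c\right)
    && \text{(Hall's information exclusion relation, as usually stated)}
\end{align}
```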

  8. Enhanced Information Exclusion Relations.

    PubMed

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing

    2016-01-01

    In Hall's reformulation of the uncertainty principle, the entropic uncertainty relation occupies a core position and provides the first nontrivial bound for the information exclusion principle. Based upon recent developments on the uncertainty relation, we present new bounds for the information exclusion relation using majorization theory and combinatoric techniques, which reveal further characteristic properties of the overlap matrix between the measurements. PMID:27460975

  9. Rural Information Network.

    ERIC Educational Resources Information Center

    National Public Telecomputing Network, Cleveland, OH.

    This report describes the National Public Telecomputing Network's (NPTN) development of free, public-access, community computer systems throughout the United States. It also provides information on how to initiate a "Free-Net" through the Rural Information Network. Free-Nets are multi-user systems with some of the power and sophistication of…

  10. Information, Mechanism and Meaning.

    ERIC Educational Resources Information Center

    Mac Kay, Donald M.

    The author has collected in this volume his papers and talks over the past 20 years on the subject of information theory. He identifies the underlying thread of his work--the idea that there is a valid analogy between Heisenberg's "Principle of Uncertainty" and certain aspects of information theory. Three of his papers then provide introductory…

  11. The Emerging Information Society.

    ERIC Educational Resources Information Center

    Ochai, Adakole

    1984-01-01

    Focuses on role of library and agencies charged with provision of information in an environment of technological change. Predictions concerning aspects of the emerging information society (computer literacy, home computers), the death of libraries, and effects of a paperless society on libraries in developing countries are noted. Footnotes are…

  12. Institutionalizing Information Literacy

    ERIC Educational Resources Information Center

    Weiner, Sharon A.

    2012-01-01

    There is increasing recognition that information literacy is essential for individual and community empowerment, workforce readiness, and global competitiveness. However, there is a history of difficulty in integrating information literacy with the postsecondary educational process. This paper posits that a greater understanding of the…

  13. A Mine of Information.

    ERIC Educational Resources Information Center

    Williams, Lisa B.

    1986-01-01

    Business researchers and marketers find certain databases useful for finding information on investments, competitors, products, and markets. Colleges can use these same databases to get background on corporate prospects. The largest data source available, DIALOG Information Services and some other databases are described. (MLW)

  14. PESTICIDE INFORMATION NETWORK

    EPA Science Inventory

The Pesticide Information Network (PIN) is an interactive database containing information about pesticides. PIN is a free service offered by the USEPA's Office of Pesticide Programs which provides contacts on pesticide issues, has a bulletin board network for public and private us...

  15. Information extraction system

    DOEpatents

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.
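
    As a point of comparison only — this is not the patented system the record describes — open-source NLP toolkits expose the same basic capability of pulling people, organizations, and locations out of free text. A minimal sketch using spaCy, assuming the en_core_web_sm model is installed:

```python
# Minimal named-entity extraction sketch (people, organizations, locations);
# an illustration of the general technique, not the patented system.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes this model has been downloaded
text = ("The information extraction system was developed at Lawrence Livermore "
        "National Laboratory in California by Tracy Lemmond and colleagues.")

doc = nlp(text)
for ent in doc.ents:
    if ent.label_ in {"PERSON", "ORG", "GPE", "LOC"}:
        print(f"{ent.label_:>7}: {ent.text}")
```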

  16. Evaluating Health Information

    MedlinePlus

    Millions of consumers get health information from magazines, TV or the Internet. Some of the information is reliable and up to date; some is not. ... a branch of the government, a university, a health organization, a hospital or a business? Focus on ...

  17. Thinking about Museum Information.

    ERIC Educational Resources Information Center

    Reed, Patricia Ann; Sledge, Jane

    1988-01-01

    Describes work in progress at the Smithsonian Institution in developing a system to understand and articulate the information needed to support collection related functions. The discussion covers the data modeling methodology used and the advantages of this methodology in structuring museum collections information. (one reference) (CLB)

  18. Oceanography Information Sources 70.

    ERIC Educational Resources Information Center

    Vetter, Richard C.

    This booklet lists oceanography information sources in the first section under industries, laboratories and departments of oceanography, and other organizations which can provide free information and materials describing programs and activities. Publications listed in the second section include these educational materials: bibliographies, career…

  19. Encouraging Global Information Literacy

    ERIC Educational Resources Information Center

    Horton, Forest Woody, Jr.; Keiser, Barbie E.

    2008-01-01

    While much has been done to address the digital divide, awareness concerning the importance of information literacy (IL) has taken a back seat to a world that focuses on technology. This article traces the genesis of a global effort to address information literacy education and training beyond discussions taking place within the library and…

  20. The right to information.

    PubMed

    Kubiak, Rafał

    2014-01-01

    The right to self-determination, including the decision on treatment, is affirmed in modern societies. Therefore, the fundamental condition of legal procedures is informed consent of a patient or an authorised person. However, to make the consent legally effective, some conditions have to be met; of these, the provision of comprehensive medical information is of the utmost importance. Thus, a patient is entitled to necessary information provided by a physician. The correlate of this right is the obligation to disclose information which must be fulfilled by a medical practitioner. The aim of this review is to examine this obligation in terms of determining the range of subjects authorised to provide information, the scope of subject information or a set of data, and the manner and time in which it should be given. Moreover, this article discusses regulations which permit limitations of information disclosure, i.e. the patient's entitlement to renounce the right to information, and therapeutic privilege. The disquisition regards achievements of legal doctrine and judicature, from the angle of which all the legal solutions and doubts arising are presented. PMID:25078772

  1. AIDS PUBLIC INFORMATION DATABASE

    EPA Science Inventory

    The AIDS Public Information Data Set is computer software designed to run on a Microsoft Windows microcomputer, and contains information abstracted from acquired immunodeficiency syndrome (AIDS) cases reported in the United States. The data set is created by the Division of HIV/A...

  2. Marketing Information Literacy

    ERIC Educational Resources Information Center

    Seale, Maura

    2013-01-01

    In 2012, more than a decade after the original Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education (hereafter the Standards) were institutionalized as the goal of academic library instruction, the Information Literacy Competency Standards Review Task Force convened by ACRL recommended…

  3. Information System Overview.

    ERIC Educational Resources Information Center

    Burrows, J. H.

    This paper was prepared for distribution to the California Educational Administrators participating in the "Executive Information Systems" Unit of Instruction as part of the instructional program of Operation PEP (Prepare Educational Planners). The purpose of the course was to introduce some basic concepts of information systems technology to…

  4. Information Retrieval System.

    ERIC Educational Resources Information Center

    Mahle, Jack D., Jr.

    The Fort Detrick Information Retrieval System is a system of computer programs written in COBOL for a CDC 3150 to store and retrieve information about the scientific and technical reports and documents of the Fort Detrick Technical Library. The documents and reports have been abstracted and indexed. This abstract, the subject matter descriptors,…

  5. Constructor theory of information

    PubMed Central

    Deutsch, David; Marletto, Chiara

    2015-01-01

    We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible—i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems). PMID:25663803

  6. The Information Age?

    ERIC Educational Resources Information Center

    MacVicar, Margaret L. A.

    1985-01-01

    The author examines the concept of information and how it relates to our future. She discusses the various waves of information about education: the decay of technology and the raid on educational institutions for brainpower made by business and industry, foreign competition, and the quality of education. (CT)

  7. Information Design: A Bibliography.

    ERIC Educational Resources Information Center

    Albers, Michael J.; Lisberg, Beth Conney

    2000-01-01

    Presents a 17-item annotated list of essential books on information design chosen by members of the InfoDesign e-mail list. Includes a 113-item unannotated bibliography of additional works, on topics of creativity and critical thinking; visual thinking; graphic design; infographics; information design; instructional design; interface design;…

  8. Air System Information Management

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

I flew to Washington last week, a trip rich in distributed information management. Buying tickets, at the gate, in flight, landing and at the baggage claim, myriad messages about my reservation, the weather, our flight plans, gates, bags and so forth flew among a variety of travel agency, airline and Federal Aviation Administration (FAA) computers and personnel. By and large, each kind of information ran on a particular application, often specialized to its own data formats and communications network. I went to Washington to attend an FAA meeting on System-Wide Information Management (SWIM) for the National Airspace System (NAS) (http://www.nasarchitecture.faa.gov/Tutorials/NAS101.cfm). NAS (and its information infrastructure, SWIM) is an attempt to bring greater regularity, efficiency and uniformity to the collection of stovepipe applications now used to manage air traffic. Current systems hold information about flight plans, flight trajectories, current and forecast weather, air turbulence, radar summaries, hazardous condition warnings, airport and airspace capacity constraints, temporary flight restrictions, and so forth. Information moving among these stovepipe systems is usually mediated by people (for example, air traffic controllers) or single-purpose applications. People, whose intelligence is critical for difficult tasks and unusual circumstances, are not as efficient as computers for tasks that can be automated. Better information sharing can lead to higher system capacity, more efficient utilization and safer operations. Better information sharing through greater automation is possible though not necessarily easy.

  9. Medical Information Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Hipkins, K. R.; Friedman, C. A.

    1979-01-01

    On-line interactive information processing system easily and rapidly handles all aspects of data management related to patient care. General purpose system is flexible enough to be applied to other data management situations found in areas such as occupational safety data, judicial information, or personnel records.

  10. Management of Electronic Information.

    ERIC Educational Resources Information Center

    Breaks, Michael

    This paper discusses the management of library collections of electronic information resources within the classical theoretical framework of collection development and management. The first section provides an overview of electronic information resources, including bibliographic databases, electronic journals, journal aggregation services, and…

  11. State energy information networks

    SciTech Connect

    Tatar, J.; Ettinger, G.; Wrabel, M.

    1984-06-01

    In November 1983, Argonne National Laboratory (ANL) initiated a study under the sponsorship of the US Department of Energy (DOE) State Programs Branch to examine state energy information networks. Goal was to help DOE decide how best to allocate resources to assist states in acquiring information related to state energy programs and policies.

  12. Information Retrieval Systems.

    ERIC Educational Resources Information Center

    National Archives and Records Service (GSA), Washington, DC. Office of Records Management.

    Descriptions of representative nonconventional information systems in use today are given in order to provide managers, management analysts, supervisors, and others with ideas as to how they might improve the dissemination, storage, and retrieval of information in their offices. No attempt was made to evaluate the relative merits of the systems…

  13. Environmental geographic information system.

    SciTech Connect

    Peek, Dennis; Helfrich, Donald Alan; Gorman, Susan

    2010-08-01

    This document describes how the Environmental Geographic Information System (EGIS) was used, along with externally received data, to create maps for the Site-Wide Environmental Impact Statement (SWEIS) Source Document project. Data quality among the various classes of geographic information system (GIS) data is addressed. A complete listing of map layers used is provided.

  14. Faculty and Staff Information.

    ERIC Educational Resources Information Center

    Kentucky Univ., Lexington. Community Coll. System.

This booklet is intended to acquaint faculty and staff members with general information about the University of Kentucky Community College System, and to explain some of its policies affecting them. The booklet is organized into five sections. Section I contains general information about the system, gives its history, purpose, and a map of the…

  15. Information Literacy Assessment

    ERIC Educational Resources Information Center

    Warmkessel, Marjorie M.

    2007-01-01

    This article presents an annotated list of seven recent articles on the topic of information literacy assessment. They include: (1) "The Three Arenas of Information Literacy Assessment" (Bonnie Gratch Lindauer); (2) "Testing the Effectiveness of Interactive Multimedia for Library-User Education" (Karen Markey et al.); (3) "Assessing Auburn…

  16. Heroin. Specialized Information Service.

    ERIC Educational Resources Information Center

    Do It Now Foundation, Phoenix, AZ.

    The document presents a collection of articles about heroin. Article 1 provides general information on heroin identification, drug dependence, effects of abuse, cost, source of supply, and penalties for illegal heroin use. Article 2 gives statistical information on heroin-related deaths in the District of Columbia between 1971 and 1982. Article 3…

  17. Information Workers within Bureaucracies.

    ERIC Educational Resources Information Center

    Porat, Marc

    1984-01-01

    Report on the conference on information workers held as part of the White House Conference on Productivity covers statistics on the information work force and findings and recommendations from the conference on productivity within bureaucracies, e.g., productivity measures for determining wage increases and formal price systems. (EJS)

  18. Information System Plan.

    ERIC Educational Resources Information Center

    McIntyre, Chuck

    Prepared for review and discussion by the Board of Governors of the California Community Colleges (CCC), this report provides background and recommendations for the refinement, expansion, and increased use of the information system of the CCC Chancellor's Office. Following introductory material proposing an expanded scope of the information system…

  19. Taking Information Literacy Online.

    ERIC Educational Resources Information Center

    Levesque, Carla

    2003-01-01

    Explores the process of designing, teaching, and revising an online information literacy course at St. Petersburg College (SPC) (Florida). Shares methods for encouraging participation in online courses and ways of tracking students' progress. Reports that basic computer information and literacy is now a graduation requirement at SPC. Contains…

  20. Career Information Handbook.

    ERIC Educational Resources Information Center

    Texas State Technical Inst., Waco.

    The handbook is a companion volume to "High School Career Interest and Information Survey" but its use extends to high school counselors, teachers, administrators and their students as an independent reference tool for occupational information. The manual is divided into sections corresponding to the fifteen career clusters identified by the U.S.…

  1. The Rural Information Center.

    ERIC Educational Resources Information Center

    John, Patricia La Caille

    1989-01-01

    Describes the events that led to the creation of the Rural Information Center (RIC), a joint venture between the Extension Service and the National Agricultural Library to provide information to government officials involved in rural development. The databases accessed by RIC are described, and plans for a gateway system and network of all…

  2. Copyright Program Information.

    ERIC Educational Resources Information Center

    National Center for Educational Communication (DHEW/OE), Washington, DC.

    The purpose of this publication is to provide information about the U.S. Office of Education (USOE) Copyright Program. It is a supplement to the Copyright Guidelines published in the Federal Register on May 9, 1970 (available as LI 002 914) and provides information primarily for those institutions and organizations which are developing educational…

  3. Reverse Coherent Information

    NASA Astrophysics Data System (ADS)

    García-Patrón, Raúl; Pirandola, Stefano; Lloyd, Seth; Shapiro, Jeffrey H.

    2009-05-01

    In this Letter we define a family of entanglement distribution protocols assisted by feedback classical communication that gives an operational interpretation to reverse coherent information, i.e., the symmetric counterpart of the well-known coherent information. This leads to the definition of a new entanglement distribution capacity that exceeds the unassisted capacity for some interesting channels.

  4. Reverse Coherent Information

    NASA Astrophysics Data System (ADS)

    García-Patrón, Raúl; Pirandola, Stefano; Lloyd, Seth; Shapiro, Jeffrey H.

    2009-04-01

    We define a family of entanglement distribution protocols assisted by classical feedback communication that gives an operational interpretation to reverse coherent information, i.e., the symmetric counterpart of the well-known coherent information. This protocol family leads to the definition of a new entanglement distribution capacity that exceeds the unassisted entanglement distribution capacity for some interesting channels.
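
    As a brief illustration of the quantity these two records describe (standard textbook definitions, not text taken from the abstracts), the coherent information and its reverse counterpart for a bipartite state are usually written in terms of von Neumann entropies:

```latex
% Coherent information and reverse coherent information of a bipartite
% state \rho_{AB} (standard definitions, shown here only for illustration).
\[
  I(A\rangle B) = S(\rho_B) - S(\rho_{AB}), \qquad
  I_R(A\rangle B) = S(\rho_A) - S(\rho_{AB}),
\]
\[
  \text{where } S(\rho) = -\operatorname{Tr}(\rho \log_2 \rho)
  \text{ is the von Neumann entropy.}
\]
```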

  5. Distributed Information Management.

    ERIC Educational Resources Information Center

    Pottenger, William M.; Callahan, Miranda R.; Padgett, Michael A.

    2001-01-01

    Reviews the scope and effects of distributed information management. Discusses cultural and social influences, including library and Internet culture, information and knowledge, electronic libraries, and social aspects of libraries; digital collections; indexing; permanent link systems; metadata; the Open Archives initiative; digital object…

  6. What is information?

    PubMed

    Barbieri, Marcello

    2016-03-13

    Molecular biology is based on two great discoveries: the first is that genes carry hereditary information in the form of linear sequences of nucleotides; the second is that in protein synthesis a sequence of nucleotides is translated into a sequence of amino acids, a process that amounts to a transfer of information from genes to proteins. These discoveries have shown that the information of genes and proteins is the specific linear order of their sequences. This is a clear definition of information and there is no doubt that it reflects an experimental reality. What is not clear, however, is the ontological status of information, and the result is that today we have two conflicting paradigms in biology. One is the 'chemical paradigm', the idea that 'life is chemistry', or, more precisely, that 'life is an extremely complex form of chemistry'. The other is the 'information paradigm', the view that chemistry is not enough, that 'life is chemistry plus information'. This implies that there is an ontological difference between information and chemistry, a difference which is often expressed by saying that information-based processes like heredity and natural selection simply do not exist in the world of chemistry. Against this conclusion, the supporters of the chemical paradigm have argued that the concept of information is only a linguistic metaphor, a word that summarizes the result of countless underlying chemical reactions. The supporters of the information paradigm insist that information is a real and fundamental component of the living world, but have not been able to prove this point. As a result, the chemical view has not been abandoned and the two paradigms both coexist today. Here, it is shown that a solution to the ontological problem of information does exist. It comes from the idea that life is artefact-making, that genes and proteins are molecular artefacts manufactured by molecular machines and that artefacts necessarily require sequences and coding

  7. Space Station Information Systems

    NASA Technical Reports Server (NTRS)

    Pittman, Clarence W.

    1988-01-01

    The utility of the Space Station is improved, the ability to manage and integrate its development and operation enhanced, and the cost and risk of developing the software for it are minimized by three major information systems. The Space Station Information System (SSIS) provides for the transparent collection and dissemination of operational information to all users and operators. The Technical and Management Information System (TMIS) provides all the developers with timely and consistent program information and a project management 'window' to assess the project status. The Software Support Environment (SSE) provides automated tools and standards to be used by all software developers. Together, these three systems are vital to the successful execution of the program.

  8. Information Technology Resources Assessment

    SciTech Connect

    Not Available

    1993-04-01

    The Information Technology Resources Assessment (ITRA) is being published as a companion document to the Department of Energy (DOE) FY 1994--FY 1998 Information Resources Management Long-Range Plan. This document represents a collaborative effort between the Office of Information Resources Management and the Office of Energy Research that was undertaken to achieve, in part, the Technology Strategic Objective of IRM Vision 21. An integral part of this objective, technology forecasting provides an understanding of the information technology horizon and presents a perspective and focus on technologies of particular interest to DOE program activities. Specifically, this document provides site planners with an overview of the status and use of new information technology for their planning consideration.

  9. Context Oriented Information Integration

    NASA Astrophysics Data System (ADS)

    Mohania, Mukesh; Bhide, Manish; Roy, Prasan; Chakaravarthy, Venkatesan T.; Gupta, Himanshu

    Faced with growing knowledge management needs, enterprises are increasingly realizing the importance of seamlessly integrating critical business information distributed across both structured and unstructured data sources. Academicians have focused on this problem, but many obstacles to its widespread use in practice remain. One of the key problems is the absence of schema in unstructured text. In this paper we present a new paradigm for integrating information that overcomes this problem: Context Oriented Information Integration. The goal is to integrate unstructured data with the structured data present in the enterprise and use the extracted information to generate actionable insights for the enterprise. We present two techniques that enable context oriented information integration and show how they can be used to solve real-world problems.

  10. Economics of information

    NASA Astrophysics Data System (ADS)

    Noguchi, Mitsunori

    2000-06-01

    The economics of information covers a wide range of topics such as insurance, stochastic equilibria, the theory of finance (e.g. option pricing), job search, etc. In this paper, we focus on an economic model in which traders are uncertain about the true characteristics of commodities and know only the probability distributions of those characteristics. The traders acquire information on those characteristics via actual consumption in the past and are allowed to exchange the information among themselves prior to the forthcoming trade. Though optimal consumption at the preceding trade generally alters optimal consumption at the succeeding trade, it may happen that the two coincide. We call this particular type of optimal consumption an information stable equilibrium (ISE). At an ISE, the traders gain no additional information from consumption that is significant enough to revise their optimal choice at the succeeding trade.

  11. Advanced information society (5)

    NASA Astrophysics Data System (ADS)

    Tanizawa, Ippei

    Based on advances in information network technology, information and communication are forming an information-oriented society that has a significant impact on business activities and lifestyles. The information network has been supported technologically by the development of computer technology and has benefited greatly from enhanced computers and communication equipment. Information is transferred by both digital and analog methods. The paper describes the technical developments that produced multifunction modems for analog communication, and the construction of advanced information communication networks that have emerged from the joint work of computers and communications using digital techniques. Trends in institutional matters and in the standardization of electrical communication are also described, with examples of value-added networks (VANs).

  12. Energy information directory 1994

    SciTech Connect

    Not Available

    1994-03-28

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. The two principal functions related to this task are (1) operating a general access telephone line, and (2) responding to energy-related correspondence addressed to the Energy Information Administration (EIA). The Energy Information Directory was developed to assist the NEIC staff, as well as other Department of Energy (DOE) staff, in directing inquiries to the proper offices within DOE, other Federal agencies, or energy-related trade associations. The Directory is a list of most Government offices and trade associations that are involved in energy matters. It does not include those DOE offices which do not deal with the public or public information.

  13. Failures of information geometry

    NASA Astrophysics Data System (ADS)

    Skilling, John

    2015-01-01

    Information H is a unique relationship between probabilities, based on the property of independence which is central to scientific methodology. Information Geometry makes the tempting but fallacious assumption that a local metric (conventionally based on information) can be used to endow the space of probability distributions with a preferred global Riemannian metric. No such global metric can conform to H, which is "from-to" asymmetric whereas geometrical length is by definition symmetric. Accordingly, any Riemannian metric will contradict the required structure of the very distributions which are supposedly being triangulated. It follows that probabilities do not form a metric space. We give counter-examples in which alternative formulations of information, and the use of information geometry, lead to unacceptable results.
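
    To make the asymmetry argument concrete (an illustrative formula in the Kullback-Leibler form, not one quoted from the paper), information between a "from" distribution p and a "to" distribution q is direction-dependent, whereas any metric length is symmetric by definition:

```latex
% "From-to" asymmetry of information versus the symmetry of a metric length
% (illustrative Kullback-Leibler form; the paper's H may differ in detail).
\[
  H(p \to q) = \sum_i p_i \log \frac{p_i}{q_i}
  \;\neq\;
  \sum_i q_i \log \frac{q_i}{p_i} = H(q \to p)
  \quad \text{in general, whereas } d(p,q) = d(q,p) \text{ for any metric } d.
\]
```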

  14. Next generation information systems

    SciTech Connect

    Limback, Nathan P; Medina, Melanie A; Silva, Michelle E

    2010-01-01

    The Information Systems Analysis and Development (ISAD) Team of the Safeguards Systems Group at Los Alamos National Laboratory (LANL) has been developing web-based information and knowledge management systems for sixteen years. Our vision is to rapidly and cost effectively provide knowledge management solutions in the form of interactive information systems that help customers organize, archive, post and retrieve nonproliferation and safeguards knowledge and information vital to their success. The team has developed several comprehensive information systems that assist users in the betterment and growth of their organizations and programs. Through our information systems, users are able to streamline operations, increase productivity, and share and access information from diverse geographic locations. The ISAD team is also producing interactive visual models. Interactive visual models provide many benefits to customers beyond the scope of traditional full-scale modeling. We have the ability to simulate a vision that a customer may propose, without the time constraints of traditional engineering modeling tools. Our interactive visual models can be used to access specialized training areas, controlled areas, and highly radioactive areas, as well as review site-specific training for complex facilities, and asset management. Like the information systems that the ISAD team develops, these models can be shared and accessed from any location with access to the internet. The purpose of this paper is to elaborate on the capabilities of information systems and interactive visual models, as well as to consider the possibility of combining the two capabilities to provide the next generation of information systems. The collection, processing, and integration of data in new ways can contribute to the security of the nation by providing indicators and information for timely action to decrease the traditional and new nuclear threats. Modeling and simulation tied to comprehensive

  15. HS3 Information System

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Conover, H.; Ramachandran, R.; Kulkarni, A.; Mceniry, M.; Stone, B.

    2015-12-01

    The Global Hydrology Resource Center (GHRC) is developing an enterprise information system to manage and better serve data for the Hurricane and Severe Storm Sentinel (HS3), a NASA airborne field campaign. HS3 is a multiyear campaign aimed at helping scientists understand the physical processes that contribute to hurricane intensification. For in-depth analysis, HS3 encompasses not only airborne data but also a variety of in-situ, satellite, simulation, and flight report data. Thus, HS3 presents a unique challenge in information system design, one the GHRC team's experience with previous airborne campaigns prepares it to handle. Much of the supplementary information and many of the reports collected during the mission are information-rich and provide mission snapshots. In particular, flight information, instrument status, weather reports, and summary statistics offer vital knowledge about the corresponding science data, and such information helps narrow the science data of interest. Therefore, the GHRC team is building an HS3 information system that augments the current GHRC data management framework to support search and discovery of airborne science data with interactive visual exploration. Specifically, the HS3 information system includes a tool to visually play back mission flights alongside traditional search and discovery interfaces. This playback capability allows users to follow a flight in time and visualize the collected data; flight summaries and analyzed information are also presented during the playback. If the observed data are of interest, users can order them from GHRC through the interface, and can order just the data for the part of the flight they are interested in. This presentation will demonstrate the path from visual exploration to data download, along with the other components that comprise the HS3 information system.

  16. Potential for Inclusion of Information Encountering within Information Literacy Models

    ERIC Educational Resources Information Center

    Erdelez, Sanda; Basic, Josipa; Levitov, Deborah D.

    2011-01-01

    Introduction: Information encountering (finding information while searching for some other information), is a type of opportunistic discovery of information that complements purposeful approaches to finding information. The motivation for this paper was to determine if the current models of information literacy instruction refer to information…

  17. Beyond informed consent.

    PubMed Central

    Bhutta, Zulfiqar A.

    2004-01-01

    Although a relatively recent phenomenon, the role of informed consent in human research is central to its ethical regulation and conduct. However, guidelines often recommend procedures for obtaining informed consent (usually written consent) that are difficult to implement in developing countries. This paper reviews the guidelines for obtaining informed consent and also discusses prevailing views on current controversies, ambiguities and problems with these guidelines and suggests potential solutions. The emphasis in most externally sponsored research projects in developing countries is on laborious documentation of several mechanical aspects of the research process rather than on assuring true comprehension and voluntary participation. The onus for the oversight of this process is often left to overworked and ill-equipped local ethics review committees. Current guidelines and processes for obtaining informed consent should be reviewed with the specific aim of developing culturally appropriate methods of sharing information about the research project and obtaining and documenting consent that is truly informed. Further research is needed to examine the validity and user friendliness of innovations in information sharing procedures for obtaining consent in different cultural settings. PMID:15643799

  18. Ignorance, information and autonomy.

    PubMed

    Harris, J; Keywood, K

    2001-09-01

    People have a powerful interest in genetic privacy and its associated claim to ignorance, and some equally powerful desires to be shielded from disturbing information are often voiced. We argue, however, that there is no such thing as a right to remain in ignorance, where a right is understood as an entitlement that trumps competing claims. This does not of course mean that information must always be forced upon unwilling recipients, only that there is no prima facie entitlement to be protected from true or honest information about oneself. Any claims to be shielded from information about the self must compete on equal terms with claims based in the rights and interests of others. In balancing the weight and importance of rival considerations about giving or withholding information, if rights claims have any place, rights are more likely to be defensible on the side of honest communication of information rather than in defence of ignorance. The right to free speech and the right to decline to accept responsibility to take decisions for others imposed by those others seem to us more plausible candidates for fully fledged rights in this field than any purported right to ignorance. Finally, and most importantly, if the right to autonomy is invoked, a proper understanding of the distinction between claims to liberty and claims to autonomy shows that the principle of autonomy, as it is understood in contemporary social ethics and English law, supports the giving rather than the withholding of information in most circumstances. PMID:11808677

  19. Acting to gain information

    NASA Technical Reports Server (NTRS)

    Rosenchein, Stanley J.; Burns, J. Brian; Chapman, David; Kaelbling, Leslie P.; Kahn, Philip; Nishihara, H. Keith; Turk, Matthew

    1993-01-01

    This report is concerned with agents that act to gain information. In previous work, we developed agent models combining qualitative modeling with real-time control. That work, however, focused primarily on actions that affect physical states of the environment. The current study extends that work by explicitly considering problems of active information-gathering and by exploring specialized aspects of information-gathering in computational perception, learning, and language. In our theoretical investigations, we analyzed agents into their perceptual and action components and identified these with elements of a state-machine model of control. The mathematical properties of each were developed in isolation, and interactions were then studied. We considered the complexity dimension and the uncertainty dimension and related these to intelligent-agent design issues. We also explored active information gathering in visual processing. Working within the active vision paradigm, we developed a concept of 'minimal meaningful measurements' suitable for demand-driven vision. We then developed and tested an architecture for ongoing recognition and interpretation of visual information. In the area of information gathering through learning, we explored techniques for coping with combinatorial complexity. We also explored information gathering through explicit linguistic action by considering the nature of conversational rules, coordination, and situated communication behavior.

  20. European drug information centers.

    PubMed

    Markind, J E; Stachnik, J M

    1996-09-01

    Drug information is a clinical specialty throughout the United States and Europe. This professional support service not only addresses drug information requests, but also provides pharmacy (drug) and therapeutics support, newsletter publication, fee-for-service consultation, education, drug policy development, and research. Although the primary services of drug information centers (DICs) in Europe are similar to those in the United States, substantial differences have been reported. Recent surveys have compared the locations, resources, staff, and services of the DICs throughout Europe. DICs in the United States and Europe play a pivotal role in the provision of pharmaceutical care to patients as well as providing support to hospital functions. PMID:9025433

  1. Informativeness of microsatellite markers.

    PubMed

    Reyes-Valdés, M Humberto

    2013-01-01

    Simple sequence repeats (SSR) are extensively used as genetic markers for studies of diversity, genetic mapping, and cultivar discrimination. The informativeness of a given SSR locus or a loci group depends on the number of alleles, their frequency distribution, as well as the kind of application. Here I describe several methods for calculating marker informativeness, all of them suitable for SSR polymorphisms, proposed by several authors and synthesized in an Information Theory framework. Additionally, free access software resources are described as well as their application through worked examples. PMID:23546797
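
    The specific methods synthesized in the chapter are not reproduced here, but a minimal sketch of two common allele-frequency-based informativeness measures for a single SSR locus (expected heterozygosity and Shannon entropy) looks like the following; the function names and example frequencies are illustrative assumptions:

```python
import math

def expected_heterozygosity(freqs):
    """Expected heterozygosity (gene diversity) of a locus: 1 - sum(p_i^2)."""
    return 1.0 - sum(p * p for p in freqs)

def shannon_information(freqs):
    """Shannon entropy of the allele-frequency distribution, in bits."""
    return -sum(p * math.log2(p) for p in freqs if p > 0)

if __name__ == "__main__":
    # Hypothetical allele frequencies at one SSR locus (illustrative only).
    freqs = [0.40, 0.30, 0.20, 0.10]
    assert abs(sum(freqs) - 1.0) < 1e-9
    print(f"Expected heterozygosity: {expected_heterozygosity(freqs):.3f}")
    print(f"Shannon information (bits): {shannon_information(freqs):.3f}")
```

    Loci with more alleles at more even frequencies score higher on both measures, which is the intuition behind ranking SSR markers by informativeness.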

  2. ENERGY INFORMATION CLEARINGHOUSE

    SciTech Connect

    Ron Johnson

    2003-10-01

    Alaska has spent billions of dollars on various energy-related activities over the past several decades, with projects ranging from smaller utilities used to produce heat and power in rural Alaska to huge endeavors relating to exported resources. To help provide information for end users, utilities, decision makers, and the general public, the Institute of Northern Engineering at UAF established an Energy Information Clearinghouse accessible through the worldwide web in 2002. This clearinghouse contains information on energy resources, end use technologies, policies, related environmental issues, emerging technologies, efficiency, storage, demand side management, and developments in Alaska.

  3. Can randomization be informative?

    NASA Astrophysics Data System (ADS)

    Pereira, Carlos A. B.; Campos, Thiago F.; Silva, Gustavo M.; Wechsler, Sergio

    2012-10-01

    In this paper the Pair of Siblings Paradox introduced by Pereira [1] is extended by considering more than two children and more than one child observed for gender. We follow the same lines as Wechsler et al. [2], who generalize the three prisoners' dilemma introduced by Gardner [3]. This paper's conjecture is that the Pair of Siblings and the Three Prisoners dilemma are dual paradoxes. Looking at possible likelihoods, the sure (randomized) selection for the former is noninformative (informative), and the opposite holds for the latter. This situation is maintained for the generalizations. A noninformative likelihood here means that the prior and posterior are equal.

  4. Advanced information society (11)

    NASA Astrophysics Data System (ADS)

    Nawa, Kotaro

    Since the late 1980s, the information systems of Japanese corporations have been operated strategically to strengthen their competitive position in markets rather than simply to make corporate management more efficient. Information-oriented policy within corporations is therefore making remarkable progress. This policy expands intelligence activity in the corporation and also extends the market for the information industry. In this environment, the closed corporate system is transformed into an open one, and for this system networks and databases are important managerial resources.

  5. Weather Information System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    WxLink is an aviation weather system based on advanced airborne sensors, precise positioning available from the satellite-based Global Positioning System, cockpit graphics and a low-cost datalink. It is a two-way system that uplinks weather information to the aircraft and downlinks automatic pilot reports of weather conditions aloft. Manufactured by ARNAV Systems, Inc., the original technology came from Langley Research Center's cockpit weather information system, CWIN (Cockpit Weather INformation). The system creates radar maps of storms, lightning and reports of surface observations, offering improved safety, better weather monitoring and substantial fuel savings.

  6. Information Literacy in the Workplace.

    ERIC Educational Resources Information Center

    Oman, Julie N.

    2001-01-01

    Discusses the need for information literacy in the workplace in the face of information overload and problems related to end user information skills. Explains how to improve information literacy by assessing the organization's infrastructure, including available information technologies and information processes; considering demographics; and…

  7. Physiological Information Database (PID)

    EPA Science Inventory

    EPA has developed a physiological information database (created using Microsoft ACCESS) intended to be used in PBPK modeling. The database contains physiological parameter values for humans from early childhood through senescence as well as similar data for laboratory animal spec...

  8. Quantum information and computation

    SciTech Connect

    Bennett, C.H.

    1995-10-01

    A new quantum theory of communication and computation is emerging, in which the stuff transmitted or processed is not classical information, but arbitrary superpositions of quantum states. © 1995 American Institute of Physics.

  9. Tuberculosis: General Information

    MedlinePlus

    What is TB? Tuberculosis (TB) is a disease caused by germs that are spread from person ... (Division of Tuberculosis Elimination fact sheet).

  10. Zika Travel Information

    MedlinePlus

    Sections include: Citizens and Residents Living in Areas with Ongoing Zika Virus Transmission; Guidelines for Travelers Visiting Friends and Family ... For the most current information about Zika virus, please visit CDC's Zika website.

  11. Ethics and biomedical information.

    PubMed

    France, F H

    1998-03-01

    Ethical rules are similar for physicians in most countries that follow the Hippocratic oath. They have no formal legal force, but can be used as a reference to provide answers to solve individual cases. It appears erroneous to believe that privacy is about information; it is about relationship. In medicine, there is a contract between a patient and a physician, where health care personnel have to respect secrecy, while integrity and availability of information should be maintained for continuity of care. These somewhat contradictory objectives have to be applied very carefully to computerised biomedical information. Ethical principles have to be made clear to everyone, and society should take the necessary steps to organise their enforcement. Several examples are given in the delivery of health care, telediagnosis, patient follow-up, and clinical research, as well as possible breakthroughs that could jeopardise privacy using biomedical information. PMID:9723809

  12. CALIPSO Data and Information

    Atmospheric Science Data Center

    2016-06-13

    Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) was launched ... The satellite comprises three instruments: the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP Lidar), the Imaging Infrared ...

  13. Retrieving Patent Information Online

    ERIC Educational Resources Information Center

    Kaback, Stuart M.

    1978-01-01

    This paper discusses patent information retrieval from online files in terms of types of questions, file contents, coverage, timeliness, and other file variations. CLAIMS, Derwent, WPI, APIPAT and Chemical Abstracts Service are described. (KP)

  14. Information Technology for Education.

    ERIC Educational Resources Information Center

    Snyder, Cathrine E.; And Others

    1990-01-01

    Eight papers address technological, behavioral, and philosophical aspects of the application of information technology to training. Topics include instructional technology centers, intelligent training systems, distance learning, automated task analysis, training system selection, the importance of instructional methods, formative evaluation and…

  15. National Rehabilitation Information Center

    MedlinePlus

    Resources described include articles, books, and reports from the NARIC library (including international research); projects conducting research and/or development; and organizations, agencies, and ...

  16. Information Storage and Display.

    ERIC Educational Resources Information Center

    Burns, Christopher

    1981-01-01

    Reviews the state of information technologies involving teletext, viewdata, interactive graphics, videodisc, audio synthesis, and holography, and discusses the current understanding of access schemes and cognitive processing. Sixteen suggested readings and nine references are cited. (FM)

  17. Value of Information References

    SciTech Connect

    Morency, Christina

    2014-12-12

    This file contains a list of relevant references on value of information (VOI) in RIS format. VOI provides a quantitative analysis to evaluate the outcome of the combined technologies (seismology, hydrology, geodesy) used to monitor Brady's Geothermal Field.

  18. Designing Information Interoperability

    SciTech Connect

    Gorman, Bryan L.; Shankar, Mallikarjun; Resseguie, David R.

    2009-01-01

    Examples of incompatible systems are offered with a discussion of the relationship between incompatibility and innovation. Engineering practices and the role of standards are reviewed as a means of resolving issues of incompatibility, with particular attention to the issue of innovation. Loosely-coupled systems are described as a means of achieving and sustaining both interoperability and innovation in heterogeneous environments. A virtual unifying layer, in terms of a standard, a best practice, and a methodology, is proposed as a modality for designing information interoperability for enterprise applications. The Uniform Resource Identifier (URI), microformats, and Joshua Porter's AOF Method are described and presented as solutions for designing interoperable information sharing web sites. The Special Operations Force Information Access (SOFIA), a mock design, is presented as an example of information interoperability.

  19. Insect Barcode Information System

    PubMed Central

    Pratheepa, Maria; Jalali, Sushil Kumar; Arokiaraj, Robinson Silvester; Venkatesan, Thiruvengadam; Nagesh, Mandadi; Panda, Madhusmita; Pattar, Sharath

    2014-01-01

    Insect Barcode Information System, called Insect Barcode Informática (IBIn), is an online database resource developed by the National Bureau of Agriculturally Important Insects, Bangalore. This database provides acquisition, storage, analysis and publication of DNA barcode records of agriculturally important insects, for researchers specifically in India and other countries. It bridges a gap in bioinformatics by integrating molecular, morphological and distribution details of agriculturally important insects. IBIn was developed using PHP/MySQL and a relational database management concept. The database is based on the client–server architecture, where many clients can access data simultaneously. IBIn is freely available on-line and is user-friendly. IBIn allows registered users to input new information, and to search and view information related to DNA barcodes of agriculturally important insects. This paper provides the current status of insect barcoding in India and a brief introduction to the IBIn database. Availability http://www.nabg-nbaii.res.in/barcode PMID:24616562

  20. Information Spreading in Context

    NASA Astrophysics Data System (ADS)

    Wang, Dashun; Wen, Zhen; Tong, Hanghang; Lin, Ching-Yung; Song, Chaoming; Barabasi, Albert-Laszlo

    2012-02-01

    Information spreading processes are central to human interactions. Despite recent studies in online domains, little is known about factors that could affect the dissemination of a single piece of information. In this paper, we address this challenge by combining two related but distinct datasets, collected from a large scale privacy-preserving distributed social sensor system. We find that the social and organizational context significantly impacts to whom and how fast people forward information. Yet the structures within spreading processes can be well captured by a simple stochastic branching model, indicating surprising independence of context. Our results build the foundation of future predictive models of information flow and provide significant insights towards design of communication platforms.
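
    As a hedged sketch of the kind of simple stochastic branching model the abstract refers to (the structure and parameter values below are assumptions, not the authors' fitted model), a Galton-Watson-style forwarding cascade can be simulated as follows:

```python
import math
import random

def poisson(rng, lam):
    """Draw a Poisson(lam) variate using Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_cascade(mean_forwards=0.8, max_generations=20, seed=None):
    """Simulate one information cascade as a Galton-Watson branching process.

    Each recipient independently forwards the message to a Poisson-distributed
    number of new recipients; returns the number of recipients per generation.
    """
    rng = random.Random(seed)
    sizes = [1]  # generation 0: the original sender
    for _ in range(max_generations):
        parents = sizes[-1]
        if parents == 0:
            break
        sizes.append(sum(poisson(rng, mean_forwards) for _ in range(parents)))
    return sizes

if __name__ == "__main__":
    print(simulate_cascade(mean_forwards=0.8, seed=42))
```

    With a mean forwarding rate below one, most cascades die out quickly, which is the qualitative behavior such branching models are used to capture.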

  1. PREFACE: Quantum information processing

    NASA Astrophysics Data System (ADS)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  2. DSCOVR Data and Information

    Atmospheric Science Data Center

    2016-08-03

    Deep Space Climate Observatory (DSCOVR) (formerly known as Triana) was ... and tested in 2008, and the same year the Committee on Space Environmental Sensor Mitigation Options (CSESMO) determined that DSCOVR ...

  3. Alternative fuel information sources

    SciTech Connect

    Not Available

    1994-06-01

    This short document contains a list of more than 200 US sources of information (Name, address, phone number, and sometimes contact) related to the use of alternative fuels in automobiles and trucks. Electric-powered cars are also included.

  4. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  5. Financial Assistance Information

    MedlinePlus

  6. Nutrition information sources.

    PubMed

    Farrell, L

    1972-10-01

    Medical personnel and medical librarians may tend to think of nutrition in medical terms and to forget its interdisciplinary aspects. For this reason, it is desirable for medical librarians to become familiar with a variety of sources of information on the composition of foods, nutrient values, food additives, and food protection. Many of these are government publications from such agencies as the U.S. Department of Agriculture, the National Research Council, and the Food and Agriculture Organization. Less familiar sources include nutrition materials from state agricultural experiment stations and extension services and important data published in a wide range of scientific or agricultural journals, which may be located through Nutrition Abstracts and Reviews, Food Science and Technology Abstracts, and the Bibliography of Agriculture. Sources of current information on nutrition research in progress include the Department of Agriculture's Current Research Information System (CRIS) and selective listings from the Smithsonian Information Exchange. PMID:4563540

  7. Energy information directory 1998

    SciTech Connect

    1998-11-01

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. The two principal functions related to this task are: (1) operating a general access telephone line, and (2) responding to energy-related correspondence addressed to the Energy Information Administration (EIA). The Energy Information Directory was developed to assist the NEIC staff, as well as other Department of Energy (DOE) staff, in directing inquiries to the proper offices within DOE, other Federal agencies, or energy-related trade associations. The Directory lists most Government offices and trade associations that are involved in energy matters.

  8. Nutrition Information Sources

    PubMed Central

    Farrell, Lois

    1972-01-01

    Medical personnel and medical librarians may tend to think of nutrition in medical terms and to forget its interdisciplinary aspects. For this reason, it is desirable for medical librarians to become familiar with a variety of sources of information on the composition of foods, nutrient values, food additives, and food protection. Many of these are government publications from such agencies as the U.S. Department of Agriculture, the National Research Council, and the Food and Agriculture Organization. Less familiar sources include nutrition materials from state agricultural experiment stations and extension services and important data published in a wide range of scientific or agricultural journals, which may be located through Nutrition Abstracts and Reviews, Food Science and Technology Abstracts, and the Bibliography of Agriculture. Sources of current information on nutrition research in progress include the Department of Agriculture's Current Research Information System (CRIS) and selective listings from the Smithsonian Information Exchange. PMID:4563540

  9. The Information Gap.

    ERIC Educational Resources Information Center

    Sharp, Bill; Appleton, Elaine

    1993-01-01

    Addresses the misconception that the ecosystems involving plants and animals in our national parks are thoroughly monitored. Discusses research and other programs designed to inventory species throughout the national parks and to inform the parks about their ecosystems. (MDH)

  10. Information geometric nonlinear filtering

    NASA Astrophysics Data System (ADS)

    Newton, Nigel J.

    2015-06-01

    This paper develops information geometric representations for nonlinear filters in continuous time. The posterior distribution associated with an abstract nonlinear filtering problem is shown to satisfy a stochastic differential equation on a Hilbert information manifold. This supports the Fisher metric as a pseudo-Riemannian metric. Flows of Shannon information are shown to be connected with the quadratic variation of the process of posterior distributions in this metric. Apart from providing a suitable setting in which to study such information-theoretic properties, the Hilbert manifold has an appropriate topology from the point of view of multi-objective filter approximations. A general class of finite-dimensional exponential filters is shown to fit within this framework, and an intrinsic evolution equation, involving Amari's -1-covariant derivative, is developed for such filters. Three example systems, one of infinite dimension, are developed in detail.
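
    For reference, the Fisher metric the abstract builds on is conventionally defined on a parametrized family of densities p(x; θ) as follows (the textbook form, not a formula quoted from the paper):

```latex
% Fisher information metric on a statistical manifold (textbook definition).
\[
  g_{ij}(\theta) =
  \int p(x;\theta)\,
  \frac{\partial \log p(x;\theta)}{\partial \theta_i}\,
  \frac{\partial \log p(x;\theta)}{\partial \theta_j}\, dx .
\]
```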

  11. Energy Information Online

    ERIC Educational Resources Information Center

    Miller, Betty

    1978-01-01

    The need to search several files to obtain the maximum information on energy is emphasized. Energyline, APILIT, APIPAT, PIE News, TULSA, NTIS, and Chemical Abstracts Condensates files are described. (KP)

  12. A multivariate approach to filling gaps in large ecological data sets using probabilistic matrix factorization techniques

    NASA Astrophysics Data System (ADS)

    Schrodt, F. I.; Shan, H.; Kattge, J.; Reich, P.; Banerjee, A.; Reichstein, M.

    2012-12-01

    With the advent of remotely sensed data and coordinated efforts to create global databases, the ecological community has progressively become more data-intensive. However, in contrast to other disciplines, statistical ways of handling these large data sets, especially the gaps which are inherent to them, are lacking. Widely used theoretical approaches, for example model averaging based on Akaike's information criterion (AIC), are sensitive to missing values. Yet the most common way of handling sparse matrices - the deletion of cases with missing data (complete case analysis) - is known to severely reduce statistical power and to induce biased parameter estimates. In order to address these issues, we present novel approaches to gap filling in large ecological data sets using matrix factorization techniques. Factorization-based matrix completion was developed in a recommender system context and has since been widely used to impute missing data in fields outside the ecological community. Here, we evaluate the effectiveness of probabilistic matrix factorization techniques for imputing missing data in ecological matrices using two imputation techniques. Hierarchical Probabilistic Matrix Factorization (HPMF) effectively incorporates hierarchical phylogenetic information (phylogenetic group, family, genus, species and individual plant) into the trait imputation. Kernelized Probabilistic Matrix Factorization (KPMF), on the other hand, includes environmental information (climate and soils) in the matrix factorization through kernel matrices over rows and columns. We test the accuracy and effectiveness of HPMF and KPMF in filling sparse matrices, using the TRY database of plant functional traits (http://www.try-db.org). TRY is one of the largest global compilations of plant trait databases (750 traits of 1 million plants), encompassing data on morphological, anatomical, biochemical, phenological and physiological features of plants. However, despite unprecedented
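
    Neither HPMF nor KPMF is reproduced here, but the underlying idea of factorization-based gap filling (fit a low-rank product U Vᵀ to the observed entries of the sparse trait matrix, then read imputed values from U Vᵀ) can be sketched as follows; all names, dimensions, and hyperparameters are illustrative assumptions rather than the authors' settings:

```python
import numpy as np

def factorize_impute(X, rank=3, lr=0.02, reg=0.1, epochs=2000, seed=0):
    """Fill missing entries (NaN) of X by low-rank factorization X ~ U @ V.T.

    Full-batch gradient descent on the squared error over observed entries,
    with L2 regularization; a toy stand-in for probabilistic variants.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    mask = ~np.isnan(X)              # True where a value is observed
    X0 = np.where(mask, X, 0.0)      # zero-fill only for the arithmetic below
    U = 0.1 * rng.standard_normal((n, rank))
    V = 0.1 * rng.standard_normal((m, rank))
    for _ in range(epochs):
        E = mask * (X0 - U @ V.T)    # residuals on observed entries only
        U += lr * (E @ V - reg * U)
        V += lr * (E.T @ U - reg * V)
    X_hat = U @ V.T
    return np.where(mask, X, X_hat)  # keep observed values, impute the gaps

if __name__ == "__main__":
    # Tiny species-by-trait example with two missing values.
    X = np.array([[1.0, 2.0, np.nan],
                  [2.0, np.nan, 6.0],
                  [3.0, 6.0, 9.0]])
    print(np.round(factorize_impute(X, rank=1), 2))
```

    Hierarchical or kernelized variants differ mainly in how the priors on U and V encode phylogenetic or environmental structure, not in this basic reconstruction step.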

  13. Enhanced Information Exclusion Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing

    2016-01-01

    In Hall’s reformulation of the uncertainty principle, the entropic uncertainty relation occupies a core position and provides the first nontrivial bound for the information exclusion principle. Based upon recent developments on the uncertainty relation, we present new bounds for the information exclusion relation using majorization theory and combinatoric techniques, which reveal further characteristic properties of the overlap matrix between the measurements. PMID:27460975

  14. Information technology financing options.

    PubMed

    Rai, D

    1996-01-01

    Healthcare executives facing the challenges of delivering quality care and controlling costs must consider the role information technology systems can play in meeting those challenges. To make the best use of information system expenditures, organizations must carefully plan how to finance system acquisitions. Some options that should be considered are paying cash, financing, financing "soft" costs, leasing, credit warehousing and early acceptance financing, and tax-exempt and conduit financing. PMID:10154097

  15. Informed consent: II.

    PubMed

    Miller, L J

    1980-11-21

    In the second article of his four-part series on informed consent, the author discusses defenses a physician may employ against informed consent claims. Seven situations are described where failure to disclose risks might be justified by lack of materiality, standards of medical practice, or lack of proximate cause of the injury. Miller cites cases in which these arguments were recognized by the courts, but cautions against generalization of their applicability to other cases. PMID:7431561

  16. Information applications: Rapporteur summary

    SciTech Connect

    Siegel, S.

    1990-12-31

    An increased level of mathematical sophistication will be needed in the future to handle the spectrum of information as it comes from a broad array of biological systems and other sources. Classification will be an increasingly complex and difficult issue. Several of the projects discussed are being developed by the US Department of Health and Human Services (DHHS), including a directory of risk assessment projects and a directory of exposure information resources.

  17. Information systems definition architecture

    SciTech Connect

    Calapristi, A.J.

    1996-06-20

    The Tank Waste Remediation System (TWRS) Information Systems Definition architecture evaluated Information Management (IM) processes in several key organizations. The intent of the study is to identify improvements in TWRS IM processes that will enable better support to the TWRS mission and accommodate changes in the TWRS business environment. The ultimate goals of the study are to reduce IM costs, manage the configuration of TWRS IM elements, and improve IM-related process performance.

  18. Asymmetric information and economics

    NASA Astrophysics Data System (ADS)

    Frieden, B. Roy; Hawkins, Raymond J.

    2010-01-01

    We present an expression of the economic concept of asymmetric information with which it is possible to derive the dynamical laws of an economy. To illustrate the utility of this approach we show how the assumption of optimal information flow leads to a general class of investment strategies including the well-known Q theory of Tobin. Novel consequences of this formalism include a natural definition of market efficiency and an uncertainty principle relating capital stock and investment flow.

  19. Management Information System

    NASA Technical Reports Server (NTRS)

    1984-01-01

    New Automated Management Information Center (AMIC) employs innovative microcomputer techniques to create color charts, viewgraphs, or other data displays in a fraction of the time formerly required. Developed under Kennedy Space Center's contract by Boeing Services International Inc., Seattle, WA, AMIC can produce an entirely new informational chart in 30 minutes, or an updated chart in only five minutes. AMIC also has considerable potential as a management system for business firms.

  20. Cockpit weather information system

    NASA Technical Reports Server (NTRS)

    Tu, Jeffrey Chen-Yu (Inventor)

    2000-01-01

    Weather information, collected periodically from throughout a global region, is assimilated and compiled at a central source and sent via a high speed data link to a satellite communication service, such as COMSAT. That communication service converts the compiled weather information to GSDB format, and transmits the GSDB encoded information to an orbiting broadcast satellite, INMARSAT, transmitting the information at a data rate of no less than 10.5 kilobits per second. The INMARSAT satellite receives that data over its P-channel and rebroadcasts the GSDB encoded weather information, in the microwave L-band, throughout the global region at a rate of no less than 10.5 kilobits per second. The transmission is received aboard an aircraft by means of an onboard SATCOM receiver and the output is furnished to a weather information processor. A touch sensitive liquid crystal panel display allows the pilot to select the weather function by touching a predefined icon overlain on the display's surface, and in response a color graphic display of the weather is presented to the pilot.