Science.gov

Sample records for aic akaike information

  1. Improving data analysis in herpetology: Using Akaike's information criterion (AIC) to assess the strength of biological hypotheses

    USGS Publications Warehouse

    Mazerolle, M.J.

    2006-01-01

    In ecology, researchers frequently use observational studies to explain a given pattern, such as the number of individuals in a habitat patch, with a large number of explanatory (i.e., independent) variables. To elucidate such relationships, ecologists have long relied on hypothesis testing to include or exclude variables in regression models, although the conclusions often depend on the approach used (e.g., forward, backward, stepwise selection). Though better tools surfaced in the mid-1970s, they remain underutilized in certain fields, particularly in herpetology. This is the case for the Akaike information criterion (AIC), which is markedly superior for model selection (i.e., variable selection) to hypothesis-based approaches. It is simple to compute and easy to understand, but more importantly, for a given data set, it provides a measure of the strength of evidence for each model that represents a plausible biological hypothesis relative to the entire set of models considered. Using this approach, one can then compute a weighted average of the estimate and standard error for any given variable of interest across all the models considered. This procedure, termed model-averaging or multimodel inference, yields precise and robust estimates. In this paper, I illustrate the use of the AIC in model selection and inference, as well as the interpretation of results analysed in this framework with two real herpetological data sets. The AIC and measures derived from it should be routinely adopted by herpetologists. © Koninklijke Brill NV 2006.
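The model-averaging procedure the abstract describes can be sketched with Akaike weights. The AIC values and coefficient estimates below are hypothetical stand-ins, not data from the paper:

```python
import numpy as np

# Hypothetical AIC values for four candidate models of patch occupancy.
aic = np.array([210.3, 211.1, 214.8, 220.0])

# Delta-AIC relative to the best (lowest-AIC) model.
delta = aic - aic.min()

# Akaike weights: the relative likelihood of each model,
# normalized to sum to 1 across the candidate set.
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()

# Model-averaged estimate of a coefficient shared by all models:
# a weighted mean of the (hypothetical) per-model estimates.
beta_hat = np.array([0.42, 0.45, 0.38, 0.51])
beta_avg = float(np.sum(weights * beta_hat))
```

Models close to the best one (small delta-AIC) retain substantial weight, so the averaged estimate reflects model-selection uncertainty rather than a single "winning" model.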

  2. Model Selection and Psychological Theory: A Discussion of the Differences between the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC)

    ERIC Educational Resources Information Center

    Vrieze, Scott I.

    2012-01-01

    This article reviews the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) in model selection and the appraisal of psychological theory. The focus is on latent variable models, given their growing use in theory testing and construction. Theoretical statistical results in regression are discussed, and more important…

  3. Model selection and psychological theory: A discussion of the differences between the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC)

    PubMed Central

    Vrieze, Scott I.

    2012-01-01

    This article reviews the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) in model selection and the appraisal of psychological theory. The focus is on latent variable models given their growing use in theory testing and construction. We discuss theoretical statistical results in regression and illustrate more important issues with novel simulations involving latent variable models including factor analysis, latent profile analysis, and factor mixture models. Asymptotically, the BIC is consistent, in that it will select the true model if, among other assumptions, the true model is among the candidate models considered. The AIC is not consistent under these circumstances. When the true model is not in the candidate model set, the AIC is efficient, in that it will asymptotically choose whichever model minimizes the mean squared error of prediction/estimation. The BIC is not efficient under these circumstances. Unlike the BIC, the AIC also has a minimax property, in that it can minimize the maximum possible risk in finite sample sizes. In sum, the AIC and BIC have quite different properties that require different assumptions, and applied researchers and methodologists alike will benefit from improved understanding of the asymptotic and finite-sample behavior of these criteria. The ultimate decision to use AIC or BIC depends on many factors, including: the loss function employed, the study's methodological design, the substantive research question, and the notion of a true model and its applicability to the study at hand. PMID:22309957

  4. Model selection and psychological theory: a discussion of the differences between the Akaike information criterion (AIC) and the Bayesian information criterion (BIC).

    PubMed

    Vrieze, Scott I

    2012-06-01

    This article reviews the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) in model selection and the appraisal of psychological theory. The focus is on latent variable models, given their growing use in theory testing and construction. Theoretical statistical results in regression are discussed, and more important issues are illustrated with novel simulations involving latent variable models including factor analysis, latent profile analysis, and factor mixture models. Asymptotically, the BIC is consistent, in that it will select the true model if, among other assumptions, the true model is among the candidate models considered. The AIC is not consistent under these circumstances. When the true model is not in the candidate model set the AIC is efficient, in that it will asymptotically choose whichever model minimizes the mean squared error of prediction/estimation. The BIC is not efficient under these circumstances. Unlike the BIC, the AIC also has a minimax property, in that it can minimize the maximum possible risk in finite sample sizes. In sum, the AIC and BIC have quite different properties that require different assumptions, and applied researchers and methodologists alike will benefit from improved understanding of the asymptotic and finite-sample behavior of these criteria. The ultimate decision to use the AIC or BIC depends on many factors, including the loss function employed, the study's methodological design, the substantive research question, and the notion of a true model and its applicability to the study at hand.
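The AIC/BIC trade-off this abstract discusses follows directly from the two standard formulas, AIC = 2k - 2 ln L and BIC = k ln(n) - 2 ln L: BIC's per-parameter penalty ln(n) exceeds AIC's factor of 2 whenever n > e² ≈ 7.4. A minimal sketch with made-up log-likelihoods chosen so the two criteria disagree:

```python
import math

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*ln(L)."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: k*ln(n) - 2*ln(L)."""
    return k * math.log(n) - 2 * loglik

# Two hypothetical candidate models fit to n = 200 observations:
# three extra parameters buy 5 units of log-likelihood.
n = 200
ll_small, k_small = -512.0, 4
ll_large, k_large = -507.0, 7

# AIC rewards the likelihood gain; BIC's heavier penalty does not.
aic_prefers_large = aic(ll_large, k_large) < aic(ll_small, k_small)
bic_prefers_small = bic(ll_small, k_small, n) < bic(ll_large, k_large, n)
```

Because 2·Δk = 6 < 2·Δln L = 10 < Δk·ln(200) ≈ 15.9, AIC picks the larger model while BIC picks the smaller one, illustrating why the efficient and consistent criteria can point in opposite directions.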

  5. Identifying P phase arrival of weak events: The Akaike Information Criterion picking application based on the Empirical Mode Decomposition

    NASA Astrophysics Data System (ADS)

    Li, Xibing; Shang, Xueyi; Morales-Esteban, A.; Wang, Zewei

    2017-03-01

    Seismic P phase arrival picking of weak events is a difficult problem in seismology. The algorithm proposed in this research is based on Empirical Mode Decomposition (EMD) and on the Akaike Information Criterion (AIC) picker. It has been called the EMD-AIC picker. The EMD is a self-adaptive signal decomposition method that not only improves Signal to Noise Ratio (SNR) but also retains P phase arrival information. Then, P phase arrival picking has been determined by applying the AIC picker to the selected main Intrinsic Mode Functions (IMFs). The performance of the EMD-AIC picker has been evaluated on the basis of 1938 micro-seismic signals from the Yongshaba mine (China). The P phases identified by this algorithm have been compared with manual pickings. The evaluation results confirm that the EMD-AIC pickings are highly accurate for the majority of the micro-seismograms. Moreover, the pickings are independent of the kind of noise. Finally, the results obtained by this algorithm have been compared to the wavelet based Discrete Wavelet Transform (DWT)-AIC pickings. This comparison has demonstrated that the EMD-AIC picking method has a better picking accuracy than the DWT-AIC picking method, thus showing this method's reliability and potential.
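The AIC-picker stage of such a workflow can be illustrated with the common variance-based AIC picker (often attributed to Maeda). This is a generic sketch on a synthetic trace, not the authors' EMD-AIC implementation:

```python
import numpy as np

def aic_pick(x):
    """Return the sample index minimizing the variance-based AIC,
    AIC(k) = k*ln(var(x[:k])) + (N-k-1)*ln(var(x[k:])),
    which marks the most likely change point (phase onset)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    ks = np.arange(2, n - 2)  # avoid degenerate one-sample variances
    vals = np.array([
        k * np.log(np.var(x[:k]) + 1e-20)
        + (n - k - 1) * np.log(np.var(x[k:]) + 1e-20)
        for k in ks
    ])
    return int(ks[np.argmin(vals)])

# Synthetic trace: low-amplitude noise followed by an "arrival" at sample 300.
rng = np.random.default_rng(0)
trace = np.concatenate([0.1 * rng.standard_normal(300),
                        1.0 * rng.standard_normal(200)])
onset = aic_pick(trace)  # expected near sample 300
```

In the paper's scheme, the same picker is applied not to the raw trace but to the main intrinsic mode functions selected after EMD denoising, which is what makes it usable on weak events.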

  6. Model selection and model averaging in phylogenetics: advantages of akaike information criterion and bayesian approaches over likelihood ratio tests.

    PubMed

    Posada, David; Buckley, Thomas R

    2004-10-01

    Model selection is a topic of special relevance in molecular phylogenetics that affects many, if not all, stages of phylogenetic inference. Here we discuss some fundamental concepts and techniques of model selection in the context of phylogenetics. We start by reviewing different aspects of the selection of substitution models in phylogenetics from theoretical, philosophical and practical points of view, and summarize this comparison in table format. We argue that the most commonly implemented model selection approach, the hierarchical likelihood ratio test, is not the optimal strategy for model selection in phylogenetics, and that approaches like the Akaike Information Criterion (AIC) and Bayesian methods offer important advantages. In particular, the latter two methods are able to simultaneously compare multiple nested or nonnested models, assess model selection uncertainty, and allow for the estimation of phylogenies and model parameters using all available models (model-averaged inference or multimodel inference). We also describe how the relative importance of the different parameters included in substitution models can be depicted. To illustrate some of these points, we have applied AIC-based model averaging to 37 mitochondrial DNA sequences from the subgenus Ohomopterus (genus Carabus) ground beetles described by Sota and Vogler (2001).

  7. Assessing Fit and Dimensionality in Least Squares Metric Multidimensional Scaling Using Akaike's Information Criterion

    ERIC Educational Resources Information Center

    Ding, Cody S.; Davison, Mark L.

    2010-01-01

    Akaike's information criterion is suggested as a tool for evaluating fit and dimensionality in metric multidimensional scaling that uses least squares methods of estimation. This criterion combines the least squares loss function with the number of estimated parameters. Numerical examples are presented. The results from analyses of both simulation…

  8. Linear and curvilinear correlations of brain gray matter volume and density with age using voxel-based morphometry with the Akaike information criterion in 291 healthy children.

    PubMed

    Taki, Yasuyuki; Hashizume, Hiroshi; Thyreau, Benjamin; Sassa, Yuko; Takeuchi, Hikaru; Wu, Kai; Kotozaki, Yuka; Nouchi, Rui; Asano, Michiko; Asano, Kohei; Fukuda, Hiroshi; Kawashima, Ryuta

    2013-08-01

    We examined linear and curvilinear correlations of gray matter volume and density in cortical and subcortical gray matter with age using magnetic resonance images (MRI) in a large number of healthy children. We applied voxel-based morphometry (VBM) and region-of-interest (ROI) analyses with the Akaike information criterion (AIC), which was used to determine the best-fit model by selecting which predictor terms should be included. We collected data on brain structural MRI in 291 healthy children aged 5-18 years. Structural MRI data were segmented and normalized using a custom template by applying the diffeomorphic anatomical registration using exponentiated lie algebra (DARTEL) procedure. Next, we analyzed the correlations of gray matter volume and density with age in VBM with AIC by estimating linear, quadratic, and cubic polynomial functions. Several regions such as the prefrontal cortex, the precentral gyrus, and cerebellum showed significant linear or curvilinear correlations between gray matter volume and age on an increasing trajectory, and between gray matter density and age on a decreasing trajectory in VBM and ROI analyses with AIC. Because the trajectory of gray matter volume and density with age suggests the progress of brain maturation, our results may contribute to clarifying brain maturation in healthy children from the viewpoint of brain structure.

  9. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.

    PubMed

    Zanini, Andrea; Woodbury, Allan D

    2016-01-01

    The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment that consists of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The obtained results were discussed and compared to the geostatistical approach of Kitanidis (1995).

  10. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion

    NASA Astrophysics Data System (ADS)

    Zanini, Andrea; Woodbury, Allan D.

    2016-02-01

    The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment that consists of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The obtained results were discussed and compared to the geostatistical approach of Kitanidis (1995).

  11. Applying Least Absolute Shrinkage Selection Operator and Akaike Information Criterion Analysis to Find the Best Multiple Linear Regression Models between Climate Indices and Components of Cow's Milk.

    PubMed

    Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika

    2016-07-23

    This study focuses on multiple linear regression models relating six climate indices (temperature humidity THI, environmental stress ESI, equivalent temperature index ETI, heat load HLI, modified HLI (HLI new), and respiratory rate predictor RRP) with three main components of cow's milk (yield, fat, and protein) for cows in Iran. The least absolute shrinkage selection operator (LASSO) and the Akaike information criterion (AIC) techniques are applied to select the best model for milk predictands with the smallest number of climate predictors. Uncertainty estimation is employed by applying bootstrapping through resampling. Cross validation is used to avoid over-fitting. Climatic parameters are calculated from the NASA-MERRA global atmospheric reanalysis. Milk data for the months from April to September, 2002 to 2010 are used. The best linear regression models are found in spring between milk yield as the predictand and THI, ESI, ETI, HLI, and RRP as predictors, with p-value < 0.001 and R² (0.50, 0.49), respectively. In summer, milk yield with independent variables of THI, ETI, and ESI shows the strongest relationship (p-value < 0.001), with R² (0.69). For fat and protein the results are only marginal. This method is suggested for impact studies of climate variability/change in agriculture and food science when short time series or data with large uncertainty are available.
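The AIC side of such a variable-selection step can be sketched as an exhaustive search over predictor subsets, scoring each OLS fit with the Gaussian-error AIC. The data below are synthetic stand-ins for the climate indices, not the paper's data:

```python
import itertools
import numpy as np

def aic_ols(y, X):
    """AIC for an OLS fit under Gaussian errors:
    n*ln(RSS/n) + 2k, with k = number of coefficients + 1 (error variance)."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    k = X.shape[1] + 1
    return n * np.log(rss / n) + 2 * k

# Synthetic stand-ins: the response depends on the first two
# predictors only; the third is pure noise.
rng = np.random.default_rng(1)
n = 120
X = rng.standard_normal((n, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * rng.standard_normal(n)

# Score every non-empty predictor subset (plus intercept); keep the minimum AIC.
best = None
for r in range(1, 4):
    for cols in itertools.combinations(range(3), r):
        Xs = np.column_stack([np.ones(n), X[:, cols]])
        score = aic_ols(y, Xs)
        if best is None or score < best[0]:
            best = (score, cols)
```

The informative predictors are retained in the winning subset; LASSO reaches a similar end by shrinking the noise coefficient toward zero instead of searching subsets explicitly.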

  12. Multidimensional Rasch Model Information-Based Fit Index Accuracy

    ERIC Educational Resources Information Center

    Harrell-Williams, Leigh M.; Wolfe, Edward W.

    2013-01-01

    Most research on confirmatory factor analysis using information-based fit indices (Akaike information criterion [AIC], Bayesian information criteria [BIC], bias-corrected AIC [AICc], and consistent AIC [CAIC]) has used a structural equation modeling framework. Minimal research has been done concerning application of these indices to item response…

  13. Applying Least Absolute Shrinkage Selection Operator and Akaike Information Criterion Analysis to Find the Best Multiple Linear Regression Models between Climate Indices and Components of Cow’s Milk

    PubMed Central

    Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika

    2016-01-01

    This study focuses on multiple linear regression models relating six climate indices (temperature humidity THI, environmental stress ESI, equivalent temperature index ETI, heat load HLI, modified HLI (HLI new), and respiratory rate predictor RRP) with three main components of cow's milk (yield, fat, and protein) for cows in Iran. The least absolute shrinkage selection operator (LASSO) and the Akaike information criterion (AIC) techniques are applied to select the best model for milk predictands with the smallest number of climate predictors. Uncertainty estimation is employed by applying bootstrapping through resampling. Cross validation is used to avoid over-fitting. Climatic parameters are calculated from the NASA-MERRA global atmospheric reanalysis. Milk data for the months from April to September, 2002 to 2010 are used. The best linear regression models are found in spring between milk yield as the predictand and THI, ESI, ETI, HLI, and RRP as predictors, with p-value < 0.001 and R² (0.50, 0.49), respectively. In summer, milk yield with independent variables of THI, ETI, and ESI shows the strongest relationship (p-value < 0.001), with R² (0.69). For fat and protein the results are only marginal. This method is suggested for impact studies of climate variability/change in agriculture and food science when short time series or data with large uncertainty are available. PMID:28231147

  14. An Evaluation of Information Criteria Use for Correct Cross-Classified Random Effects Model Selection

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Murphy, Daniel L.

    2013-01-01

    The authors assessed correct model identification rates of Akaike's information criterion (AIC), corrected criterion (AICC), consistent AIC (CAIC), Hannan and Quinn's information criterion (HQIC), and Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…

  15. Practical application of the Average Information Content Maximization (AIC-MAX) algorithm: selection of the most important structural features for serotonin receptor ligands.

    PubMed

    Warszycki, Dawid; Śmieja, Marek; Kafel, Rafał

    2017-02-09

    The Average Information Content Maximization algorithm (AIC-MAX) based on mutual information maximization was recently introduced to select the most discriminatory features. Here, this methodology was applied to select the most significant bits from the Klekota-Roth fingerprint for serotonin receptor ligands, as well as to select the most important features for distinguishing ligands with activity for one receptor versus another. The selected bits were interpreted, and machine-learning experiments performed using the reduced representations outperformed the raw fingerprints and indicated the most important structural features of the analyzed ligands in terms of activity and selectivity. Moreover, the AIC-MAX methodology applied here for serotonin receptor ligands can also be applied to other target classes.

  16. AIC and the challenge of complexity: A case study from ecology.

    PubMed

    Moll, Remington J; Steel, Daniel; Montgomery, Robert A

    2016-12-01

    Philosophers and scientists alike have suggested that Akaike's Information Criterion (AIC), and other similar model selection methods, show that predictive accuracy justifies a preference for simplicity in model selection. This epistemic justification of simplicity is limited by an assumption of AIC which requires that the same probability distribution must generate the data used to fit the model and the data about which predictions are made. This limitation has been previously noted but appears often to go unnoticed by philosophers and scientists, and has not been analyzed in relation to complexity. If predictions are about future observations, we argue that this assumption is unlikely to hold for models of complex phenomena. That in turn creates a practical limitation for simplicity's AIC-based justification, because scientists modeling such phenomena are often interested in predicting the future. We support our argument with an ecological case study concerning the reintroduction of wolves into Yellowstone National Park, U.S.A. We suggest that AIC might still lend epistemic support for simplicity by leading to better explanations of complex phenomena.

  17. Derivation of 3-D surface deformation from an integration of InSAR and GNSS measurements based on Akaike's Bayesian Information Criterion

    NASA Astrophysics Data System (ADS)

    Luo, Haipeng; Liu, Yang; Chen, Ting; Xu, Caijun; Wen, Yangmao

    2016-01-01

    We present a new method to derive 3-D surface deformation from an integration of interferometric synthetic aperture radar (InSAR) images and Global Navigation Satellite System (GNSS) observations based on Akaike's Bayesian Information Criterion (ABIC), considering the relationship between deformations at neighbouring locations. This method avoids interpolation errors by not interpolating GNSS data to the same spatial resolution as the InSAR images, and harnesses the data sets and the prior smoothness constraints on surface deformation objectively and simultaneously by using ABIC, issues that were inherently unresolved in previous studies. In particular, we define a surface roughness measure of the degree of smoothing to evaluate the performance of the prior constraints, and deduce the formula for the covariance of the estimation errors to estimate the uncertainty of the modelled solution. We validate this method using synthetic tests and the 2008 Mw 7.9 Wenchuan earthquake. We find that the optimal weights associated with the ABIC minimum generally lie at trade-off locations that balance contributions from the InSAR and GNSS data sets and the prior constraints. We use this method to evaluate the influence of interpolation errors from the Ordinary Kriging algorithm on the derivation of surface deformation. Tests show that interpolation errors may bias the weights imposed on Kriged GNSS data toward very large values, suggesting that fixing the relative weights is required in this case. We also make a comparison with the SISTEM method, indicating that our method yields better estimates even with sparse GNSS observations. In addition, this method can be generalized to situations where some types of data sets are lacking, and can be exploited further to account for data sets such as the integration of displacements along radar lines and offsets along satellite tracks.

  18. Water-solvent partition coefficients and Delta Log P values as predictors for blood-brain distribution; application of the Akaike information criterion.

    PubMed

    Abraham, Michael H; Acree, William E; Leo, Albert J; Hoekman, David; Cavanaugh, Joseph E

    2010-05-01

    It is shown that log P values for water-alkane or water-cyclohexane partitions, and the corresponding Delta log P values, when used as descriptors for blood-brain distribution (log BB), yield equations with very poor correlation coefficients but very good standard deviations, S, from 0.25 to 0.33 log units. Using quite large data sets, we have verified that similar S-values apply to predictions of log BB. A suggested model, based on log P for water-dodecane and water-hexadecane partition coefficients, has 109 data points and a fitted S = 0.254 log units. It is essential to include in the model an indicator variable for volatile compounds, and an indicator variable for drugs that contain the carboxylic group. A similar equation based on water-chloroform partition coefficients has 83 data points and a fitted S = 0.287 log units. We can find no causal connection between these log P values and log BB in terms of correlation or in terms of chemical similarity, but conclude that the log P descriptor will yield excellent predictions of log BB provided that predictions are within the chemical space of the compounds used to set up the model. We also show that a model based on log P(octanol) and an Abraham descriptor provides a simple and easy method of predicting log BB with an error of no more than 0.31 log units. We have used the Akaike information criterion to investigate the most economical models for log BB.

  19. A new method for arrival time determination of impact signal based on HHT and AIC

    NASA Astrophysics Data System (ADS)

    Liu, Mingzhou; Yang, Jiangxin; Cao, Yanpeng; Fu, Weinan; Cao, Yanlong

    2017-03-01

    The time-difference method is commonly used to locate loose parts in nuclear power plants; the key step is estimating the arrival time of the impact signal caused by the crash of loose parts. However, the dispersion behavior of the impact signal and the noise of the nuclear power station primary circuit have a negative effect on arrival time determination. In this paper, a method of arrival time determination of impact signals based on the Hilbert-Huang Transform (HHT) and the Akaike Information Criterion (AIC) is proposed. Firstly, the impact signal is decomposed by Empirical Mode Decomposition (EMD). Then the instantaneous frequency of the first intrinsic mode function (IMF) is calculated, which characterizes the difference between the background noise and the impact signal. The arrival time is finally determined by the AIC function. The proposed method is tested through a simulation experiment using steel balls as the loose parts. The deviation between the arrival time determined by the proposed method and the real arrival time is stable under different SNRs and different sensor-to-drop-point distances, mostly within the range ±0.5 ms. The proposed method is also compared with another AIC technique and an RMS approach, both of which show a more dispersive distribution of deviations, with many values outside the range ±1 ms.

  20. Truth, models, model sets, AIC, and multimodel inference: a Bayesian perspective

    USGS Publications Warehouse

    Barker, Richard J.; Link, William A.

    2015-01-01

    Statistical inference begins with viewing data as realizations of stochastic processes. Mathematical models provide partial descriptions of these processes; inference is the process of using the data to obtain a more complete description of the stochastic processes. Wildlife and ecological scientists have become increasingly concerned with the conditional nature of model-based inference: what if the model is wrong? Over the last 2 decades, Akaike's Information Criterion (AIC) has been widely and increasingly used in wildlife statistics for 2 related purposes, first for model choice and second to quantify model uncertainty. We argue that for the second of these purposes, the Bayesian paradigm provides the natural framework for describing uncertainty associated with model choice and provides the most easily communicated basis for model weighting. Moreover, Bayesian arguments provide the sole justification for interpreting model weights (including AIC weights) as coherent (mathematically self-consistent) model probabilities. This interpretation requires treating the model as an exact description of the data-generating mechanism. We discuss the implications of this assumption, and conclude that more emphasis is needed on model checking to provide confidence in the quality of inference.

  1. Autonomic Intelligent Cyber Sensor (AICS) Version 1.0.1

    SciTech Connect

    2015-03-01

    The Autonomic Intelligent Cyber Sensor (AICS) provides cyber security and industrial network state awareness for Ethernet based control network implementations. The AICS utilizes collaborative mechanisms based on Autonomic Research and a Service Oriented Architecture (SOA) to: 1) identify anomalous network traffic; 2) discover network entity information; 3) deploy deceptive virtual hosts; and 4) implement self-configuring modules. AICS achieves these goals by dynamically reacting to the industrial human-digital ecosystem in which it resides. Information is transported internally and externally on a standards based, flexible two-level communication structure.

  2. Complexity vs. simplicity: groundwater model ranking using information criteria.

    PubMed

    Engelhardt, I; De Aguinaga, J G; Mikat, H; Schüth, C; Liedl, R

    2014-01-01

    A groundwater model characterized by a lack of field data about hydraulic model parameters and boundary conditions, combined with many observation data sets for calibration purposes, was investigated with respect to model uncertainty. Seven different conceptual models with a stepwise increase from 0 to 30 adjustable parameters were calibrated using PEST. Residuals, sensitivities, the Akaike information criterion (AIC and AICc), the Bayesian information criterion (BIC), and Kashyap's information criterion (KIC) were calculated for a set of seven inverse-calibrated models of increasing complexity. Finally, the likelihood of each model was computed. Comparing only residuals of the different conceptual models leads to overparameterization and a loss of certainty in the conceptual model approach. The model employing only uncalibrated hydraulic parameters, estimated from sedimentological information, obtained the worst AIC, BIC, and KIC values. Using only sedimentological data to derive hydraulic parameters introduces a systematic error into the simulation results and cannot be recommended for generating a valuable model. For numerical investigations with large numbers of calibration data, the BIC and KIC select a simpler model as optimal than the AIC does. The model with 15 adjusted parameters was evaluated by the AIC as the best option and obtained a likelihood of 98%. The AIC disregards the potential model structure error, and the selection of the KIC is, therefore, more appropriate. Sensitivities to piezometric heads were highest for the model with only five adjustable parameters, and sensitivity coefficients were directly influenced by the changes in extracted groundwater volumes.
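The criteria compared in this abstract (apart from KIC, which additionally needs the Fisher information matrix of the calibrated parameters) can all be computed from a model's maximized log-likelihood. The log-likelihood values below are illustrative, not taken from the paper:

```python
import math

def info_criteria(loglik, k, n):
    """Return (AIC, AICc, BIC) for a model with maximized
    log-likelihood `loglik`, k parameters, and n observations."""
    aic = 2 * k - 2 * loglik
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)  # small-sample correction
    bic = k * math.log(n) - 2 * loglik
    return aic, aicc, bic

# Illustrative calibrated models of increasing complexity:
# extra parameters raise the log-likelihood, with diminishing returns.
models = {5: -420.0, 15: -395.0, 30: -390.0}
scores = {k: info_criteria(ll, k, n=250) for k, ll in models.items()}

best_by_aic = min(scores, key=lambda k: scores[k][0])
best_by_bic = min(scores, key=lambda k: scores[k][2])
```

With these numbers, AIC selects the 15-parameter model while BIC's heavier ln(n) penalty selects the 5-parameter one, mirroring the abstract's observation that BIC and KIC favor simpler models than AIC when many calibration data are available.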

  3. Regularization Parameter Selections via Generalized Information Criterion

    PubMed Central

    Zhang, Yiyun; Li, Runze; Tsai, Chih-Ling

    2009-01-01

    We apply the nonconcave penalized likelihood approach to obtain variable selections as well as shrinkage estimators. This approach relies heavily on the choice of regularization parameter, which controls the model complexity. In this paper, we propose employing the generalized information criterion (GIC), encompassing the commonly used Akaike information criterion (AIC) and Bayesian information criterion (BIC), for selecting the regularization parameter. Our proposal makes a connection between the classical variable selection criteria and the regularization parameter selections for the nonconcave penalized likelihood approaches. We show that the BIC-type selector enables identification of the true model consistently, and the resulting estimator possesses the oracle property in the terminology of Fan and Li (2001). In contrast, however, the AIC-type selector tends to overfit with positive probability. We further show that the AIC-type selector is asymptotically loss efficient, while the BIC-type selector is not. Our simulation results confirm these theoretical findings, and an empirical example is presented. Some technical proofs are given in the online supplementary material. PMID:20676354

  4. Automatic picking based on an AR-AIC cost-function approach applied to tele-, regional- and induced seismic datasets

    NASA Astrophysics Data System (ADS)

    Olbert, Kai; Meier, Thomas; Cristiano, Luigia

    2015-04-01

    A quick picking procedure is an important tool for processing large datasets in seismology. Identifying phases and determining precise onset times at seismological stations is essential not just for localization procedures but also for seismic body-wave tomography. An automated picking procedure should be fast, robust, precise and consistent. In manual processing, speed and consistency are not guaranteed, and unreproducible errors may therefore be introduced, especially for large amounts of data. In this work an offline P- and S-phase picker based on an autoregressive-prediction approach is optimized and applied to different data sets. The onset time can be described as the sum of the event source time, the theoretical travel time according to a reference velocity model, and a deviation from the theoretical travel time due to lateral heterogeneity or errors in the source location. With this approach the onset time at each station can be found around the theoretical travel time within a time window smaller than the maximum lateral heterogeneity. Around the theoretical travel time an autoregressive prediction error is calculated from one or several components as the characteristic function (CF) of the waveform. The minimum of the Akaike Information Criterion of the characteristic function identifies the phase. As was shown by Küperkoch et al. (2012), the Akaike Information Criterion tends to pick too late. Therefore, an additional processing step for precise picking is needed. In the vicinity of the minimum of the Akaike Information Criterion a cost function is defined and used to find the optimal estimate of the arrival time. The cost function is composed of the CF and three side conditions. The idea behind the use of a cost function is to find the phase pick in the last minimum before the CF rises due to the phase onset. The final onset time is picked at the minimum of the cost function. The automatic picking procedure is applied to datasets recorded at stations of the
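The picker above builds its characteristic function from an autoregressive prediction error. A commonly used simplified variant (Maeda's formulation) applies the AIC directly to the waveform, which is enough to illustrate why the minimum of the AIC curve marks a phase onset. The synthetic trace below is invented:

```python
import math
import random

def variance(seg):
    m = sum(seg) / len(seg)
    return sum((s - m) ** 2 for s in seg) / len(seg)

def aic_curve(x):
    # Maeda-style AIC computed directly on the series, a simplified stand-in
    # for the AR prediction-error characteristic function used in the paper:
    #   AIC(k) = k * ln(var(x[:k])) + (N - k - 1) * ln(var(x[k:]))
    n = len(x)
    curve = []
    for k in range(2, n - 2):
        v1, v2 = variance(x[:k]), variance(x[k:])
        curve.append(k * math.log(max(v1, 1e-12)) + (n - k - 1) * math.log(max(v2, 1e-12)))
    return curve

# synthetic trace: quiet noise, then a higher-amplitude "phase arrival" at sample 100
random.seed(0)
x = [random.gauss(0.0, 0.1) for _ in range(100)] + [random.gauss(0.0, 1.0) for _ in range(100)]
curve = aic_curve(x)
pick = curve.index(min(curve)) + 2  # +2 compensates for the k range start
print("picked onset near sample", pick)
```

The global minimum falls where splitting the trace into "before" and "after" segments best separates the two variance regimes, i.e. at the onset.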

  5. Mission science value-cost savings from the Advanced Imaging Communication System (AICS)

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1984-01-01

    An Advanced Imaging Communication System (AICS) was proposed in the mid-1970s as an alternative to the Voyager data/communication system architecture. The AICS achieved virtually error-free communication with little loss in the downlink data rate by concatenating a powerful Reed-Solomon block code with the Voyager convolutionally coded, Viterbi decoded downlink channel. The clean channel allowed the AICS to use sophisticated adaptive data compression techniques. Both Voyager and the Galileo mission have implemented AICS components, and the concatenated channel itself is heading for international standardization. An analysis that assigns a dollar value/cost savings to AICS mission performance gains is presented. A conservative value, or savings, of $3 million for Voyager, $4.5 million for Galileo, and as much as $7 to 9.5 million per mission for future projects such as the proposed Mariner Mark II series is shown.

  6. An empirical comparison of information-theoretic selection criteria for multivariate behavior genetic models.

    PubMed

    Markon, Kristian E; Krueger, Robert F

    2004-11-01

    Information theory provides an attractive basis for statistical inference and model selection. However, little is known about the relative performance of different information-theoretic criteria in covariance structure modeling, especially in behavioral genetic contexts. To explore these issues, information-theoretic fit criteria were compared with regard to their ability to discriminate between multivariate behavioral genetic models under various model, distribution, and sample size conditions. Results indicate that performance depends on sample size, model complexity, and distributional specification. The Bayesian Information Criterion (BIC) is more robust to distributional misspecification than Akaike's Information Criterion (AIC) under certain conditions, and outperforms AIC in larger samples and when comparing more complex models. An approximation to the Minimum Description Length (MDL; Rissanen, J. (1996). IEEE Transactions on Information Theory 42:40-47, Rissanen, J. (2001). IEEE Transactions on Information Theory 47:1712-1717) criterion, involving the empirical Fisher information matrix, exhibits variable patterns of performance due to the complexity of estimating Fisher information matrices. Results indicate that a relatively new information-theoretic criterion, Draper's Information Criterion (DIC; Draper, 1995), which shares features of the Bayesian and MDL criteria, performs similarly to or better than BIC. Results emphasize the importance of further research into theory and computation of information-theoretic criteria.

  7. Test procedures, AN/AIC-27 system and component units [for space shuttle]

    NASA Technical Reports Server (NTRS)

    Reiff, F. H.

    1975-01-01

    The AN/AIC-27(V) intercommunication system is a 30-channel audio distribution system which consists of air crew station units, maintenance station units, and a central control unit. A test procedure for each of the above units, as well as a test procedure for the overall system, is presented. The intent of the tests is to provide data for use in shuttle audio subsystem design.

  8. [Preparation and characterization of poly-Si films on different topography substrates by AIC].

    PubMed

    Wang, Cheng-Long; Fan, Duo-Wang; Liu, Hong-Zhong; Zhang, Fu-Jia; Xing, Da; Liu, Song-Hao

    2009-03-01

    Polycrystalline silicon (poly-Si) thin films were made on planar and textured glass substrates by aluminum-induced crystallization (AIC) of in situ amorphous silicon (a-Si) deposited by DC-magnetron. The poly-Si films were characterized by Raman spectroscopy, X-ray diffraction (XRD) and atomic force microscopy (AFM). A narrow and symmetrical Raman peak at a wave number of about 521 cm(-1) was observed for all samples, indicating that the films were fully crystallized. XRD results show that the crystallites in the authors' AIC poly-Si films were preferentially (111) oriented. Measurement of the full width at half maximum (FWHM) of the (111) XRD peaks showed that the quality of the films was affected by the a-Si deposition temperature and the surface morphology of the glass substrates. An a-Si deposition temperature of 200 degrees C appears to be ideal for the preparation of poly-Si films by AIC.

  9. A Novel Hybrid Dimension Reduction Technique for Undersized High Dimensional Gene Expression Data Sets Using Information Complexity Criterion for Cancer Classification

    PubMed Central

    Pamukçu, Esra; Bozdogan, Hamparsum; Çalık, Sinan

    2015-01-01

    Gene expression data are typically large, complex, and highly noisy. Their dimension is high, with several thousand genes (i.e., features) but only a limited number of observations (i.e., samples). Although the classical principal component analysis (PCA) method is widely used as a first standard step in dimension reduction and in supervised and unsupervised classification, it suffers from several shortcomings for data sets involving undersized samples, since the sample covariance matrix degenerates and becomes singular. In this paper we address these limitations within the context of probabilistic PCA (PPCA) by introducing and developing a novel approach using the maximum entropy covariance matrix and its hybridized smoothed covariance estimators. To reduce the dimensionality of the data and to choose the number of probabilistic PCs (PPCs) to be retained, we further employ the celebrated Akaike information criterion (AIC), the consistent Akaike information criterion (CAIC), and the information-theoretic measure of complexity (ICOMP) criterion of Bozdogan. Six publicly available undersized benchmark data sets were analyzed to show the utility, flexibility, and versatility of our approach with hybridized smoothed covariance matrix estimators, which do not degenerate, to perform the PPCA to reduce the dimension and to carry out supervised classification of cancer groups in high dimensions. PMID:25838836

  10. AIC649 Induces a Bi-Phasic Treatment Response in the Woodchuck Model of Chronic Hepatitis B

    PubMed Central

    Paulsen, Daniela; Weber, Olaf; Ruebsamen-Schaeff, Helga; Tennant, Bud C.; Menne, Stephan

    2015-01-01

    AIC649 has been shown to directly address the antigen-presenting-cell arm of the host immune defense, leading to a regulated cytokine release and activation of T cell responses. In the present study we analyzed the antiviral efficacy of AIC649 as well as its potential to induce functional cure in animal models of chronic hepatitis B. Hepatitis B virus transgenic mice and chronically woodchuck hepatitis virus (WHV) infected woodchucks were treated with AIC649. In the mouse system AIC649 decreased the hepatitis B virus titer as effectively as the “gold standard”, Tenofovir. Interestingly, AIC649-treated chronically WHV-infected woodchucks displayed a bi-phasic pattern of response: the marker for functional cure, hepatitis B surface antigen, first increased but subsequently decreased, even after cessation of treatment, to significantly reduced levels. We hypothesize that the observed bi-phasic response pattern to AIC649 treatment reflects a physiologically “concerted”, reconstituted immune response against WHV and therefore may indicate a potential for inducing functional cure in HBV-infected patients. PMID:26656974

  11. Use of the AIC with the EM algorithm: A demonstration of a probability model selection technique

    SciTech Connect

    Glosup, J.G.; Axelrod, M.C.

    1994-11-15

    The problem of discriminating between two potential probability models, a Gaussian distribution and a mixture of Gaussian distributions, is considered. The focus of our interest is a case where the models are potentially non-nested and the parameters of the mixture model are estimated through the EM algorithm. The AIC, which is frequently used as a criterion for discriminating between non-nested models, is modified to work with the EM algorithm and is shown to provide a model selection tool for this situation. A particular problem involving an infinite mixture distribution known as Middleton's Class A model is used to demonstrate the effectiveness and limitations of this method.
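A sketch of the kind of AIC comparison described above, assuming an ordinary two-component Gaussian mixture fitted by EM in place of Middleton's Class A model; the data and settings are invented:

```python
import math
import random

def loglik_normal(x, mu, var):
    return sum(-0.5 * (math.log(2 * math.pi * var) + (xi - mu) ** 2 / var) for xi in x)

def fit_normal(x):
    # maximum-likelihood Gaussian: 2 free parameters (mean, variance)
    mu = sum(x) / len(x)
    var = sum((xi - mu) ** 2 for xi in x) / len(x)
    return loglik_normal(x, mu, var), 2

def pdf(xi, mu, v):
    return math.exp(-(xi - mu) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

def fit_mixture2(x, iters=100):
    # EM for a two-component Gaussian mixture: 5 free parameters
    mu1, mu2 = min(x), max(x)
    v0 = sum((xi - sum(x) / len(x)) ** 2 for xi in x) / len(x)
    v1 = v2 = v0
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point
        r = [w * pdf(xi, mu1, v1) / (w * pdf(xi, mu1, v1) + (1 - w) * pdf(xi, mu2, v2))
             for xi in x]
        # M-step: weighted parameter updates
        n1 = sum(r)
        n2 = len(x) - n1
        mu1 = sum(ri * xi for ri, xi in zip(r, x)) / n1
        mu2 = sum((1 - ri) * xi for ri, xi in zip(r, x)) / n2
        v1 = max(sum(ri * (xi - mu1) ** 2 for ri, xi in zip(r, x)) / n1, 1e-6)
        v2 = max(sum((1 - ri) * (xi - mu2) ** 2 for ri, xi in zip(r, x)) / n2, 1e-6)
        w = n1 / len(x)
    ll = sum(math.log(w * pdf(xi, mu1, v1) + (1 - w) * pdf(xi, mu2, v2)) for xi in x)
    return ll, 5

random.seed(1)
x = [random.gauss(-2.0, 1.0) for _ in range(150)] + [random.gauss(3.0, 1.0) for _ in range(150)]
ll_n, k_n = fit_normal(x)
ll_m, k_m = fit_mixture2(x)
aic_n = -2 * ll_n + 2 * k_n
aic_m = -2 * ll_m + 2 * k_m
print("AIC normal:", round(aic_n, 1), "AIC mixture:", round(aic_m, 1))
```

For this clearly bimodal sample the mixture's likelihood gain dwarfs its 3 extra parameters, so its AIC is lower; the paper's contribution is the modification needed to make such comparisons valid when the models are non-nested and the mixture is fitted by EM.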

  12. Joint partially linear model for longitudinal data with informative drop-outs.

    PubMed

    Kim, Sehee; Zeng, Donglin; Taylor, Jeremy M G

    2017-03-01

    In biomedical research, a steep rise or decline in longitudinal biomarkers may indicate latent disease progression, which may subsequently cause patients to drop out of the study. Ignoring the informative drop-out can cause bias in estimation of the longitudinal model. In such cases, a full parametric specification may be insufficient to capture the complicated pattern of the longitudinal biomarkers. For these types of longitudinal data with the issue of informative drop-outs, we develop a joint partially linear model, with the aim of finding the trajectory of the longitudinal biomarker. Specifically, an arbitrary function of time along with linear fixed and random covariate effects is proposed in the model for the biomarker, while a flexible semiparametric transformation model is used to describe the drop-out mechanism. Advantages of this semiparametric joint modeling approach are the following: 1) it provides an easier interpretation, compared to standard nonparametric regression models, and 2) it is a natural way to control for common (observable and unobservable) prognostic factors that may affect both the longitudinal trajectory and the drop-out process. We describe a sieve maximum likelihood estimation procedure using the EM algorithm, where the Akaike information criterion (AIC) and Bayesian information criterion (BIC) are considered to select the number of knots. We show that the proposed estimators achieve desirable asymptotic properties through empirical process theory. The proposed methods are evaluated by simulation studies and applied to prostate cancer data.

  13. A Cautionary Note on the Use of Information Fit Indexes in Covariance Structure Modeling with Means

    ERIC Educational Resources Information Center

    Wicherts, Jelte M.; Dolan, Conor V.

    2004-01-01

    Information fit indexes such as Akaike Information Criterion, Consistent Akaike Information Criterion, Bayesian Information Criterion, and the expected cross validation index can be valuable in assessing the relative fit of structural equation models that differ regarding restrictiveness. In cases in which models without mean restrictions (i.e.,…

  14. Predicting the potential distribution of invasive exotic species using GIS and information-theoretic approaches: A case of ragweed (Ambrosia artemisiifolia L.) distribution in China

    USGS Publications Warehouse

    Chen, H.; Chen, L.; Albright, Thomas P.

    2007-01-01

    Invasive exotic species pose a growing threat to the economy, public health, and ecological integrity of nations worldwide. Explaining and predicting the spatial distribution of invasive exotic species is of great importance to prevention and early-warning efforts. We are investigating the potential distribution of invasive exotic species, the environmental factors that influence these distributions, and the ability to predict them using statistical and information-theoretic approaches. For some species, detailed presence/absence occurrence data are available, allowing the use of a variety of standard statistical techniques. However, for most species, absence data are not available. Presented with the challenge of developing a model based on presence-only information, we developed an improved logistic regression approach using information theory and frequency statistics to produce a relative suitability map. In this paper we generated a variety of distributions of ragweed (Ambrosia artemisiifolia L.) from logistic regression models applied to herbarium specimen location data and a suite of GIS layers including climatic, topographic, and land cover information. Our logistic regression model was based on Akaike's Information Criterion (AIC) applied to a suite of ecologically reasonable predictor variables. Based on the results, we provide a new frequency-statistics method to compartmentalize habitat suitability in the native range. Finally, we used the model and the compartmentalization criterion developed in the native range to "project" a potential distribution onto the exotic range to build habitat-suitability maps. © Science in China Press 2007.

  15. Putting Knowledge To Work Effectively: Assessing Information Needs through Focus Groups.

    ERIC Educational Resources Information Center

    Perry, Valerie E.

    This paper describes how focus groups were used to assess the effectiveness of the University of Kentucky's Agricultural Information Center (AIC) in providing patron services. The AIC serves 1,100 undergraduate students, 370 graduate and postdoctoral students, and 1,700 faculty and staff in the College of Agriculture. In August 2000, the AIC…

  16. The role of multicollinearity in landslide susceptibility assessment by means of Binary Logistic Regression: comparison between VIF and AIC stepwise selection

    NASA Astrophysics Data System (ADS)

    Cama, Mariaelena; Cristi Nicu, Ionut; Conoscenti, Christian; Quénéhervé, Geraldine; Maerker, Michael

    2016-04-01

    Landslide susceptibility can be defined as the likelihood of a landslide occurring in a given area on the basis of local terrain conditions. In the last decades much research has focused on its evaluation by means of stochastic approaches, under the assumption that 'the past is the key to the future': if a model is able to reproduce a known landslide spatial distribution, it will be able to predict the future locations of new (i.e. unknown) slope failures. Among the various stochastic approaches, Binary Logistic Regression (BLR) is one of the most widely used because it expresses susceptibility in probabilistic terms and its results are easily interpretable from a geomorphological point of view. However, very often little importance is given to the assessment of multicollinearity, whose effect is that coefficient estimates become unstable, possibly with opposite signs, and are therefore difficult to interpret. It should therefore be evaluated every time in order to obtain a model whose results are geomorphologically sound. In this study the effects of multicollinearity on the predictive performance and robustness of landslide susceptibility models are analyzed. In particular, multicollinearity is estimated by means of the Variance Inflation Factor (VIF), which is also used as a selection criterion for the independent variables (VIF stepwise selection) and compared to the more commonly used AIC stepwise selection. The robustness of the results is evaluated through 100 replicates of the dataset. The study area selected for this analysis is the Moldavian Plateau, where landslides are among the most frequent geomorphological processes. This area has an increasing trend of urbanization and very high potential regarding cultural heritage, being the place of discovery of the largest settlement belonging to the Cucuteni Culture in Eastern Europe (which led to the development of the great Cucuteni-Trypillia complex). Therefore, identifying the areas susceptible to
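A sketch of the VIF screening described above, using the fact that VIF_j equals the j-th diagonal element of the inverse correlation matrix of the predictors. The predictor names and data are invented for illustration:

```python
import math
import random

def mat_inv(a):
    # Gauss-Jordan inverse of a small square matrix (partial pivoting)
    n = len(a)
    aug = [row[:] + [1.0 if i == j else 0.0 for j in range(n)] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        p = aug[col][col]
        aug[col] = [v / p for v in aug[col]]
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [v - f * w for v, w in zip(aug[r], aug[col])]
    return [row[n:] for row in aug]

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def vifs(columns):
    # VIF_j is the j-th diagonal element of the inverse correlation matrix
    r = [[corr(ci, cj) for cj in columns] for ci in columns]
    return [row[j] for j, row in enumerate(mat_inv(r))]

# invented predictors: "elevation" is nearly collinear with "slope"
random.seed(2)
slope = [random.random() for _ in range(50)]
elevation = [2.0 * s + random.gauss(0.0, 0.05) for s in slope]
rainfall = [random.random() for _ in range(50)]
vif_values = vifs([slope, elevation, rainfall])
print([round(v, 1) for v in vif_values])
```

In a VIF stepwise selection, predictors whose VIF exceeds a chosen threshold (often 5 or 10) are dropped one at a time; here slope and elevation flag each other while rainfall stays near the lower bound of 1.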

  17. Egg distributions and the information a solitary parasitoid has and uses for its oviposition decisions.

    PubMed

    Hemerik, Lia; van der Hoeven, Nelly; van Alphen, Jacques J M

    2002-01-01

    Approximately three decades ago, the question of whether parasitoids are able to assess the number or origin of eggs in a host was first answered for a solitary parasitoid, Leptopilina heterotoma, by fitting theoretically derived distributions to empirical ones. We extend the set of theoretically postulated distributions of eggs among hosts by combining searching modes and abilities in assessing host quality. In the models, parasitoids search either randomly (Poisson) (1) or by vibrotaxis (Negative Binomial) (2). Parasitoids are: (a) assumed to treat all hosts equally, (b) able to distinguish only between unparasitised and parasitised hosts, (c) able to distinguish hosts by the number of eggs they contain, or (d) able to recognise their own eggs. Mathematically tractable combinations of searching mode (1 and 2) and abilities (a, b, c, d) result in seven different models (M1a, M1b, M1c, M1d, M2a, M2b and M2c). These models have been simulated for a varying number of searching parasitoids and various mean numbers of eggs per host. Each resulting distribution is fitted to all theoretical models. The model with the minimum Akaike's information criterion (AIC) is chosen as the best fit for each simulated distribution. We thus investigate the power of the AIC, and for each distribution with a specified mean number of eggs per host we derive a frequency distribution for classification. Firstly, we discuss the simulations of models including random search (M1a, M1b, M1c and M1d). For M1a, M1c and M1d the simulated distributions are correctly classified in at least 70% of all cases. However, in a few cases model M1b is properly classified only for intermediate mean values of eggs per host. The models including vibrotaxis as searching behaviour (M2a, M2b and M2c) cannot be distinguished from those with random search if the mean number of eggs per host is low. Among the models incorporating vibrotaxis, the three abilities are detected analogously as in models with

  18. Teacher's Corner: Conducting Specification Searches with Amos

    ERIC Educational Resources Information Center

    Schumacker, Randall E.

    2006-01-01

    Amos 5.0 (Arbuckle, 2003) permits exploratory specification searches for the best theoretical model given an initial model using the following fit function criteria: chi-square (C), chi-square--df (C--df), Akaike Information Criteria (AIC), Browne-Cudeck criterion (BCC), Bayes Information Criterion (BIC), chi-square divided by the degrees of…

  19. Model weights and the foundations of multimodel inference

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    2006-01-01

    Statistical thinking in wildlife biology and ecology has been profoundly influenced by the introduction of AIC (Akaike's information criterion) as a tool for model selection and as a basis for model averaging. In this paper, we advocate the Bayesian paradigm as a broader framework for multimodel inference, one in which model averaging and model selection are naturally linked, and in which the performance of AIC-based tools is naturally evaluated. Prior model weights implicitly associated with the use of AIC are seen to highly favor complex models: in some cases, all but the most highly parameterized models in the model set are virtually ignored a priori. We suggest the usefulness of the weighted BIC (Bayesian information criterion) as a computationally simple alternative to AIC, based on explicit selection of prior model probabilities rather than acceptance of the default priors associated with AIC. We note, however, that both procedures are only approximations to the use of exact Bayes factors. We discuss and illustrate technical difficulties associated with Bayes factors, and suggest approaches to avoiding these difficulties in the context of model selection for a logistic regression. Our example highlights the predisposition of AIC weighting to favor complex models and suggests a need for caution in using the BIC for computing approximate posterior model weights.
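The model weights discussed above are computed the same way for AIC and BIC: rescale each criterion value relative to the best model and renormalize exp(-Δ/2) over the model set. A minimal sketch; the criterion values are hypothetical:

```python
import math

def ic_weights(ic_values):
    # w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2), delta_i = IC_i - min IC
    lo = min(ic_values)
    raw = [math.exp(-(v - lo) / 2.0) for v in ic_values]
    total = sum(raw)
    return [r / total for r in raw]

# hypothetical AIC (or BIC) values for three candidate models
weights = ic_weights([100.0, 102.0, 110.0])
print([round(w, 3) for w in weights])  # the lowest-criterion model gets the largest weight
```

Whether these weights behave like approximate posterior model probabilities depends on the implicit prior each criterion carries, which is exactly the point of the paper.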

  20. End-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1982-01-01

    The efficiency of various deep space communication systems which are required to transmit both imaging and a typically error sensitive class of data called general science and engineering (gse) are compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as two orders of magnitude increase in imaging information rate compared to a single channel uncoded, uncompressed system while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts as well as efforts to apply them are provided in support of the system analysis.

  1. Information theoretic model selection applied to supernovae data

    NASA Astrophysics Data System (ADS)

    Biesiada, Marek

    2007-02-01

    Current advances in observational cosmology suggest that our Universe is flat and dominated by dark energy. There are several different theoretical ideas invoked to explain the dark energy with relatively little guidance of which one of them might be right. Therefore the emphasis of ongoing and forthcoming research in this field shifts from estimating specific parameters of the cosmological model to the model selection. In this paper we apply an information theoretic model selection approach based on the Akaike criterion as an estimator of Kullback-Leibler entropy. Although this approach has already been used by some authors in a similar context, this paper provides a more systematic introduction to the Akaike criterion. In particular, we present the proper way of ranking the competing models on the basis of Akaike weights (in Bayesian language: posterior probabilities of the models). This important ingredient is lacking from alternative studies dealing with cosmological applications of the Akaike criterion. Of the many particular models of dark energy we focus on four: quintessence, quintessence with a time varying equation of state, the braneworld scenario and the generalized Chaplygin gas model, and test them on Riess's gold sample. As a result we obtain that the best model—in terms of the Akaike criterion—is the quintessence model. The odds suggest that although there exist differences in the support given to specific scenarios by supernova data, most of the models considered receive similar support. The only exception is the Chaplygin gas which is considerably less supported. One can also note that models similar in structure, e.g. ΛCDM, quintessence and quintessence with a variable equation of state, are closer to each other in terms of Kullback-Leibler entropy. Models having different structure, e.g. Chaplygin gas and the braneworld scenario, are more distant (in the Kullback-Leibler sense) from the best one.

  2. AICE Survey of USSR Air Pollution Literature, Volume 13: Technical Papers from the Leningrad International Symposium on the Meteorological Aspects of Atmospheric Pollution, Part 2.

    ERIC Educational Resources Information Center

    Nuttonson, M. Y., Ed.

    Twelve papers were translated from Russian: Automation of Information Processing Involved in Experimental Studies of Atmospheric Diffusion, Micrometeorological Characteristics of Atmospheric Pollution Conditions, Study of the Influence of Irregularities of the Earth's Surface on the Air Flow Characteristics in a Wind Tunnel, Use of Parameters of…

  3. On the predictive information criteria for model determination in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    Many statistical tools have been developed for evaluating, understanding, and comparing models, from both frequentist and Bayesian perspectives. In particular, the problem of model selection can be addressed according to whether the primary goal is explanation or, alternatively, prediction. In the former case, the criteria for model selection are defined over the parameter space, whose physical interpretation can be difficult; in the latter case, they are defined over the space of the observations, which has a more direct physical meaning. In the frequentist approaches, model selection is generally based on an asymptotic approximation which may be poor for small data sets (e.g. the F-test, the Kolmogorov-Smirnov test, etc.); moreover, these methods often apply under specific assumptions on models (e.g. models have to be nested in the likelihood ratio test). In the Bayesian context, among the criteria for explanation, the ratio of the observed marginal densities for two competing models, named the Bayes Factor (BF), is commonly used for both model choice and model averaging (Kass and Raftery, J. Am. Stat. Ass., 1995). But the BF does not apply to improper priors and, even when the prior is proper, it is not robust to the specification of the prior. These limitations extend to two well-known penalized likelihood methods, the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC), since both are proved to be approximations of -2 log BF. In the perspective that a model is as good as its predictions, the predictive information criteria aim at evaluating the predictive accuracy of Bayesian models or, in other words, at estimating the expected out-of-sample prediction error using a bias-correction adjustment of within-sample error (Gelman et al., Stat. Comput., 2014). In particular, the Watanabe criterion is fully Bayesian because it averages the predictive distribution over the posterior distribution of parameters rather than conditioning on a point

  4. Performance of soil particle-size distribution models for describing deposited soils adjacent to constructed dams in the China Loess Plateau

    NASA Astrophysics Data System (ADS)

    Zhao, Pei; Shao, Ming-an; Horton, Robert

    2011-02-01

    Soil particle-size distributions (PSD) have been used to estimate soil hydraulic properties. Various parametric PSD models have been proposed to describe the soil PSD from sparse experimental data. It is important to determine which PSD model best represents specific soils. Fourteen PSD models were examined in order to determine the best model for representing the deposited soils adjacent to dams in the China Loess Plateau; these were: Skaggs (S-1, S-2, and S-3), fractal (FR), Jaky (J), Lima and Silva (LS), Morgan (M), Gompertz (G), logarithm (L), exponential (E), log-exponential (LE), Weibull (W), van Genuchten type (VG), and Fredlund (F) models. Four hundred and eighty samples were obtained from soils deposited in the Liudaogou catchment. The coefficient of determination (R²), Akaike's information criterion (AIC), and the modified AIC (mAIC) were used. Based upon R² and AIC, the three- and four-parameter models were both good at describing the PSDs of the deposited soils, while the LE, FR, and E models were the poorest. However, the mAIC results, in conjunction with R² and AIC, indicated that the W model was optimal for describing the PSD of the deposited soils when the number of parameters is taken into account. This analysis is also helpful for determining which model is best. Our results are applicable to the China Loess Plateau.

  5. Model Selection for Geostatistical Models

    SciTech Connect

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.

  6. In Vitro Drug Combination Studies of Letermovir (AIC246, MK-8228) with Approved Anti-Human Cytomegalovirus (HCMV) and Anti-HIV Compounds in Inhibition of HCMV and HIV Replication

    PubMed Central

    Wildum, Steffen; Zimmermann, Holger

    2015-01-01

    Despite modern prevention and treatment strategies, human cytomegalovirus (HCMV) remains a common opportunistic pathogen associated with serious morbidity and mortality in immunocompromised individuals, such as transplant recipients and AIDS patients. All drugs currently licensed for the treatment of HCMV infection target the viral DNA polymerase and are associated with severe toxicity issues and the emergence of drug resistance. Letermovir (AIC246, MK-8228) is a new anti-HCMV agent in clinical development that acts via a novel mode of action and has demonstrated anti-HCMV activity in vitro and in vivo. For the future, drug combination therapies, including letermovir, might be indicated under special medical conditions, such as the emergence of multidrug-resistant virus strains in transplant recipients or in HCMV-HIV-coinfected patients. Accordingly, knowledge of the compatibility of letermovir with other HCMV or HIV antivirals is of medical importance. Here, we evaluated the inhibition of HCMV replication by letermovir in combination with all currently approved HCMV antivirals using cell culture checkerboard assays. In addition, the effects of letermovir on the antiviral activities of selected HIV drugs, and vice versa, were analyzed. Using two different mathematical techniques to analyze the experimental data, (i) additive effects were observed for the combination of letermovir with anti-HCMV drugs and (ii) no interaction was found between letermovir and anti-HIV drugs. Since none of the tested drug combinations significantly antagonized letermovir efficacy (or vice versa), our findings suggest that letermovir may offer the potential for combination therapy with the tested HCMV and HIV drugs. PMID:25779572

  7. Appropriate model selection methods for nonstationary generalized extreme value models

    NASA Astrophysics Data System (ADS)

    Kim, Hanbeen; Kim, Sooyoung; Shin, Hongjoon; Heo, Jun-Haeng

    2017-04-01

    Considerable evidence has accumulated that hydrologic data series are nonstationary, which has prompted many studies of nonstationary frequency analysis. Nonstationary probability distribution models involve parameters that vary over time, so conventional goodness-of-fit tests cannot be applied directly to select an appropriate nonstationary probability distribution model. Tests generally recommended for such a selection include the Akaike information criterion (AIC), the corrected Akaike information criterion (AICc), the Bayesian information criterion (BIC), and the likelihood ratio test (LRT). In this study, Monte Carlo simulation was performed to compare the performance of these four tests for nonstationary as well as stationary generalized extreme value (GEV) distributions. Proper model selection ratios and sample sizes were taken into account to evaluate the performance of all four tests. The BIC demonstrated the best performance for stationary GEV models. For nonstationary GEV models, the AIC proved better than the other three methods when relatively small sample sizes were considered; with larger sample sizes, the AIC, BIC, and LRT performed best for GEV models with nonstationary location and/or scale parameters. The simulation results were then evaluated by applying all four tests to annual maximum rainfall data from selected sites, as observed by the Korea Meteorological Administration.
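
    The three information criteria compared in this record have simple closed forms given a model's maximized log-likelihood, its number of parameters k, and the sample size n. A minimal sketch (illustrative helper functions, not the study's code; the log-likelihood values below are hypothetical):

```python
import math

def aic(loglik, k):
    """Akaike information criterion: 2k - 2 ln L."""
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    """Small-sample corrected AIC; requires n > k + 1."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    """Bayesian information criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * loglik

# Example: a stationary GEV model (3 parameters) versus a GEV model with a
# linear trend in the location parameter (4 parameters).
n = 50
print(aic(-120.0, 3), aicc(-120.0, 3, n), bic(-120.0, 3, n))
print(aic(-117.0, 4), aicc(-117.0, 4, n), bic(-117.0, 4, n))
```

    With the smaller-is-better convention, the candidate with the lowest criterion value is selected; AICc converges to AIC as n grows, which is why the criteria can disagree only at small sample sizes.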

  8. Tomography and Location Problems in China Using Regional Travel-Time Data

    DTIC Science & Technology

    2001-10-01

    AIC) as a norm (Akaike, 1973; McQuarrie and Tsai, 1998). From a practical viewpoint, the AIC is used to determine the damping constants in a damped...Upper mantle velocity structure beneath the Tibetan Plateau from Pn travel time tomography, J. Geophysical Research, 102, 493-505. McQuarrie, A

  9. Chemical shift prediction for protein structure calculation and quality assessment using an optimally parameterized force field.

    PubMed

    Nielsen, Jakob T; Eghbalnia, Hamid R; Nielsen, Niels Chr

    2012-01-01

    The exquisite sensitivity of chemical shifts as reporters of structural information, and the ability to measure them routinely and accurately, gives great import to formulations that elucidate the structure-chemical-shift relationship. Here we present a new and highly accurate, precise, and robust formulation for the prediction of NMR chemical shifts from protein structures. Our approach, shAIC (shift prediction guided by Akaike's Information Criterion), capitalizes on mathematical ideas and an information-theoretic principle to represent the functional form of the relationship between structure and chemical shift as a parsimonious sum of smooth analytical potentials, which optimally takes into account short-, medium-, and long-range parameters in a nuclei-specific manner to capture potential chemical shift perturbations caused by distant nuclei. shAIC outperforms the state-of-the-art methods that use analytical formulations. Moreover, for structures derived by NMR or structures with novel folds, shAIC delivers better overall results, even when compared to sophisticated machine learning approaches. shAIC provides a computationally lightweight implementation that is unimpeded by molecular size, making it ideal for use as a force field.

  10. Modelling road accidents: An approach using structural time series

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.

  11. A hybrid model to simulate the annual runoff of the Kaidu River in northwest China

    NASA Astrophysics Data System (ADS)

    Xu, Jianhua; Chen, Yaning; Bai, Ling; Xu, Yiwen

    2016-04-01

    Fluctuant and complicated hydrological processes introduce uncertainty into runoff forecasting, so multi-method integrated modeling approaches are needed to simulate runoff. Integrating ensemble empirical mode decomposition (EEMD), a back-propagation artificial neural network (BPANN) and a nonlinear regression equation, we put forward a hybrid model to simulate the annual runoff (AR) of the Kaidu River in northwest China. We validate the simulation using the coefficient of determination (R2) and the Akaike information criterion (AIC) based on observed data from 1960 to 2012 at the Dashankou hydrological station. The small average absolute and relative errors show the high simulation accuracy of the hybrid model, and R2 and AIC both show that the hybrid model performs much better than the single BPANN. The hybrid model and integrated approach developed in this study can be applied to simulate the annual runoff of similar rivers in northwest China.

  12. Dietary Information Improves Model Performance and Predictive Ability of a Noninvasive Type 2 Diabetes Risk Model

    PubMed Central

    Han, Tianshu; Tian, Shuang; Wang, Li; Liang, Xi; Cui, Hongli; Du, Shanshan; Na, Guanqiong; Na, Lixin; Sun, Changhao

    2016-01-01

    There is no diabetes risk model that includes dietary predictors in Asia. We sought to develop a diet-containing noninvasive diabetes risk model in Northern China and to evaluate whether dietary predictors can improve model performance and predictive ability. Cross-sectional data for 9,734 adults aged 20–74 years were used as the derivation data, and results obtained for a cohort of 4,515 adults with 4.2 years of follow-up were used as the validation data. We used a logistic regression model to develop a diet-containing noninvasive risk model. Akaike's information criterion (AIC), area under curve (AUC), integrated discrimination improvement (IDI), net reclassification improvement (NRI) and calibration statistics were calculated to explicitly assess the effect of dietary predictors on a diabetes risk model. A diet-containing type 2 diabetes risk model was developed. The significant dietary predictors, including the consumption of staple foods, livestock, eggs, potato, dairy products, fresh fruit and vegetables, were included in the risk model. Dietary predictors improved the noninvasive diabetes risk model with a significant increase in the AUC (delta AUC = 0.03, P<0.001), an increase in relative IDI (24.6%, P-value for IDI <0.001), an increase in NRI (category-free NRI = 0.155, P<0.001), a 7.3% increase in the sensitivity of the model, and a decrease in AIC (delta AIC = 199.5). The results for the validation data were similar to those for the derivation data. The calibration of the diet-containing diabetes risk model was better than that of the risk model without dietary predictors in the validation data. Dietary information improves model performance and predictive ability of a noninvasive type 2 diabetes risk model based on classic risk factors. Dietary information may be useful for developing a noninvasive diabetes risk model. PMID:27851788

  13. Rubber yield prediction by meteorological conditions using mixed models and multi-model inference techniques.

    PubMed

    Golbon, Reza; Ogutu, Joseph Ochieng; Cotter, Marc; Sauerborn, Joachim

    2015-12-01

    Linear mixed models were developed and used to predict rubber (Hevea brasiliensis) yield based on meteorological conditions to which rubber trees had been exposed for periods ranging from 1 day to 2 months prior to tapping events. Predictors included a range of moving averages of meteorological covariates spanning different windows of time before the date of the tapping events. Serial autocorrelation in the latex yield measurements was accounted for using random effects and a spatial generalization of the autoregressive error covariance structure suited to data sampled at irregular time intervals. Information-theoretic measures, specifically the Akaike information criterion (AIC), the AIC corrected for small sample size (AICc), and Akaike weights, were used to select models with the greatest strength of support in the data from a set of competing candidate models. The predictive performance of the selected best model was evaluated using both leave-one-out cross-validation (LOOCV) and an independent test set. Moving averages of precipitation, minimum and maximum temperature, and maximum relative humidity with a 30-day lead period were identified as the best yield predictors. Prediction accuracy, expressed as the percentage of predictions within a measurement error of 5 g, was above 99% for both cross-validation and the test dataset.
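
    Akaike weights, as used here for multimodel inference, convert AIC (or AICc) differences into a normalized measure of each candidate model's relative support. A minimal sketch (illustrative only; the AICc values below are hypothetical):

```python
import math

def akaike_weights(aics):
    """Convert a list of AIC (or AICc) values into Akaike weights.

    w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    where delta_i = AIC_i - min(AIC).
    """
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Three candidate models with hypothetical AICc values
weights = akaike_weights([412.3, 414.3, 420.3])
print([round(w, 3) for w in weights])
```

    A model-averaged estimate of a parameter is then the weight-weighted sum of the candidate models' estimates, which is how the multimodel inference described above is carried out.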

  14. Rubber yield prediction by meteorological conditions using mixed models and multi-model inference techniques

    NASA Astrophysics Data System (ADS)

    Golbon, Reza; Ogutu, Joseph Ochieng; Cotter, Marc; Sauerborn, Joachim

    2015-12-01

    Linear mixed models were developed and used to predict rubber (Hevea brasiliensis) yield based on meteorological conditions to which rubber trees had been exposed for periods ranging from 1 day to 2 months prior to tapping events. Predictors included a range of moving averages of meteorological covariates spanning different windows of time before the date of the tapping events. Serial autocorrelation in the latex yield measurements was accounted for using random effects and a spatial generalization of the autoregressive error covariance structure suited to data sampled at irregular time intervals. Information-theoretic measures, specifically the Akaike information criterion (AIC), the AIC corrected for small sample size (AICc), and Akaike weights, were used to select models with the greatest strength of support in the data from a set of competing candidate models. The predictive performance of the selected best model was evaluated using both leave-one-out cross-validation (LOOCV) and an independent test set. Moving averages of precipitation, minimum and maximum temperature, and maximum relative humidity with a 30-day lead period were identified as the best yield predictors. Prediction accuracy, expressed as the percentage of predictions within a measurement error of 5 g, was above 99% for both cross-validation and the test dataset.

  15. Does weather confound or modify the association of particulate air pollution with mortality? An analysis of the Philadelphia data, 1973--1980

    SciTech Connect

    Samet, J.; Zeger, S.; Kelsall, J.; Xu, J.; Kalkstein, L.

    1998-04-01

    This report considers the consequences of using alternative approaches to controlling for weather and explores modification of air pollution effects by weather, as weather patterns could plausibly alter air pollution's effect on health. The authors analyzed 1973--1980 total mortality data for Philadelphia using four weather models and compared estimates of the effects of TSP and SO2 on mortality using a Poisson regression model. Two synoptic categories developed by Kalkstein were selected--the Temporal Synoptic Index (TSI) and the Spatial Synoptic Classification (SSC)--and compared with (1) descriptive models developed by Schwartz and Dockery (S-D); and (2) LOESS, a nonparametric function of the previous day's temperature and dew point. The authors considered model fit using Akaike's Information Criterion (AIC) and changes in the estimated effects of TSP and SO2. In the full-year analysis, S-D is better than LOESS at predicting mortality, and S-D and LOESS are better than TSI, as measured by AIC. When TSP or SO2 was fit alone, the results were qualitatively similar, regardless of how weather was controlled; when TSP and SO2 were fit simultaneously, the S-D and LOESS models gave qualitatively different results than TSI, which attributes more of the pollution effect to SO2 than to TSP. Model fit is substantially poorer with TSI.

  16. [Species-abundance distribution patterns along succession series of Phyllostachys glauca forest in a limestone mountain].

    PubMed

    Shi, Jian-min; Fan, Cheng-fang; Liu, Yang; Yang, Qing-pei; Fang, Kai; Fan, Fang-li; Yang, Guang-yao

    2015-12-01

    To detect the ecological process of the succession series of Phyllostachys glauca forest in a limestone mountain, five niche models, i.e., broken stick model (BSM), niche preemption model (NPM), dominance preemption model (DPM), random assortment model (RAM) and overlapping niche model (ONM), were employed to describe the species-abundance distribution patterns (SDPs) of 15 samples. The χ² test and the Akaike information criterion (AIC) were used to test the fitting effects of the five models. The results showed that the optimal SDP models for P. glauca forest, bamboo-broadleaved mixed forest and broadleaved forest were DPM (χ² = 35.86, AIC = -69.77), NPM (χ² = 1.60, AIC = -94.68) and NPM (χ² = 0.35, AIC = -364.61), respectively. BSM also fitted the SDP of bamboo-broadleaved mixed forest and broadleaved forest well, while it was unsuitable for describing the SDP of P. glauca forest. The fittings of RAM and ONM in the three forest types were all rejected by the χ² test and AIC. With the development of community succession from P. glauca forest to broadleaved forest, the species richness and evenness increased, and the optimal SDP model changed from DPM to NPM. It was inferred that the change of ecological process from habitat filtration to interspecific competition was the main driving force of the forest succession. The results also indicated that the application of multiple SDP models and test methods would be beneficial for selecting the best model and deeply understanding the ecological process of community succession.
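
    Two of the niche models fitted in this record have simple closed forms for the expected abundance of the i-th ranked species. A sketch under the standard formulations of the broken stick and niche preemption (geometric series) models (the preemption parameter c below is illustrative, not a fitted value):

```python
def broken_stick(S, N):
    """Expected abundances under the broken stick model:
    E[A_i] = (N / S) * sum_{k=i}^{S} 1/k, for ranks i = 1..S."""
    return [N / S * sum(1.0 / k for k in range(i, S + 1)) for i in range(1, S + 1)]

def niche_preemption(S, N, c):
    """Expected abundances under the niche preemption model:
    each successive species preempts a fraction c of the remaining niche,
    giving a geometric series that is normalized to total abundance N."""
    raw = [c * (1.0 - c) ** (i - 1) for i in range(1, S + 1)]
    total = sum(raw)
    return [N * r / total for r in raw]

print([round(a, 1) for a in broken_stick(5, 100)])
print([round(a, 1) for a in niche_preemption(5, 100, 0.4)])
```

    Comparing the observed rank-abundance curve against these expectations via χ² or AIC is what distinguishes the more even broken-stick pattern from the steeper preemption pattern reported above.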

  17. Online writer identification using alphabetic information clustering

    NASA Astrophysics Data System (ADS)

    Tan, Guo Xian; Viard-Gaudin, Christian; Kot, Alex C.

    2009-01-01

    Writer identification is a topic of much renewed interest today because of its importance in applications such as writer adaptation, routing of documents and forensic document analysis. Various algorithms have been proposed to handle such tasks. Of particular interest are the approaches that use allographic features [1-3] to perform a comparison of the documents in question. The allographic features are used to define prototypes that model the unique handwriting styles of individual writers. This paper investigates a novel perspective that takes alphabetic information into consideration when the allographic features are clustered into prototypes at the character level. We hypothesize that alphabetic information provides additional clues that help in the clustering of allographic prototypes. An alphabet information coefficient (AIC) is introduced in our study and the effect of this coefficient is presented. Our experiments showed an increase in writer identification accuracy from 66.0% to 87.0% when alphabetic information was used in conjunction with allographic features on a database of 200 reference writers.

  18. Model selection for geostatistical models.

    PubMed

    Hoeting, Jennifer A; Davis, Richard A; Merton, Andrew A; Thompson, Sandra E

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is often ignored in the selection of explanatory variables, and this can influence model selection results. For example, the importance of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often-used traditional approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also apply the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored. R software to implement the geostatistical model selection methods described in this paper is available in the Supplement.

  19. Refinement of arrival-time picks using a cross-correlation based workflow

    NASA Astrophysics Data System (ADS)

    Akram, Jubran; Eaton, David W.

    2016-12-01

    We propose a new iterative workflow based on cross-correlation for improved arrival-time picking on microseismic data. In this workflow, signal-to-noise ratio (S/N) and polarity-weighted stacking are used to minimize the effect of S/N and polarity fluctuations on the pilot waveform computation. We use an exhaustive search technique for polarity estimation through stack-power maximization. We use pseudo-synthetic and real microseismic data from western Canada to demonstrate the effectiveness of the proposed workflow relative to the Akaike information criterion (AIC) and a previously published cross-correlation based method. The pseudo-synthetic microseismic waveforms are obtained by introducing Gaussian noise and polarity fluctuations into waveforms from a high-S/N microseismic event. We find that the cross-correlation based approaches yield more accurate arrival-time picks than AIC for low-S/N waveforms. AIC is not affected by waveform polarities because it operates on individual receiver levels, whereas the accuracy of the existing cross-correlation method decreases despite using envelope correlation. We show that our proposed workflow yields better and more consistent arrival-time picks regardless of waveform amplitude and polarity variations within the receiver array. After refinement, the initial arrival-time picks lie closer to the best manual pick estimates.
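
    The AIC picker used as a baseline here locates an arrival as the sample index that minimizes a two-window AIC over the trace. A common variance-based formulation (a sketch on synthetic data, not the authors' implementation):

```python
import numpy as np

def aic_picker(trace):
    """Two-window AIC picker: the index k minimizing
    AIC(k) = k*ln(var(trace[:k])) + (n-k-1)*ln(var(trace[k:]))
    marks the transition from noise to signal."""
    n = len(trace)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1, v2 = np.var(trace[:k]), np.var(trace[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

# Synthetic trace: low-amplitude noise, then a higher-amplitude "arrival" at index 300
rng = np.random.default_rng(2)
trace = np.concatenate([rng.normal(0, 0.1, 300), rng.normal(0, 1.0, 300)])
pick = aic_picker(trace)
print(pick)
```

    Because this picker works trace by trace, it is insensitive to polarity flips across receivers, which is the property the record contrasts with cross-correlation based picking.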

  20. Information criteria for efficient quantum state estimation

    SciTech Connect

    Yin, J. O. S.; Enk, S. J. van

    2011-06-15

    Recently several more efficient versions of quantum state tomography have been proposed, with the purpose of making tomography feasible even for many-qubit states. The number of state parameters to be estimated is reduced by tentatively introducing certain simplifying assumptions on the form of the quantum state, and subsequently using the data to rigorously verify these assumptions. The simplifying assumptions considered so far were (i) the state can be well approximated to be of low rank, or (ii) the state can be well approximated as a matrix product state, or (iii) only the permutationally invariant part of the density matrix is determined. We add one more method in that same spirit: We allow in principle any model for the state, using any (small) number of parameters (which can, e.g., be chosen to have a clear physical meaning), and the data are used to verify the model. The proof that this method is valid cannot be as strict as in the above-mentioned cases, but is based on well-established statistical methods that go under the name of "information criteria." We exploit here, in particular, the Akaike information criterion. We illustrate the method by simulating experiments on (noisy) Dicke states.

  1. Power-law ansatz in complex systems: Excessive loss of information

    NASA Astrophysics Data System (ADS)

    Tsai, Sun-Ting; Chang, Chin-De; Chang, Ching-Hao; Tsai, Meng-Xue; Hsu, Nan-Jung; Hong, Tzay-Ming

    2015-12-01

    The ubiquity of power-law relations in empirical data displays physicists' love of simple laws and uncovering common causes among seemingly unrelated phenomena. However, many reported power laws lack statistical support and mechanistic backings, not to mention discrepancies with real data are often explained away as corrections due to finite size or other variables. We propose a simple experiment and rigorous statistical procedures to look into these issues. Making use of the fact that the occurrence rate and pulse intensity of crumple sound obey a power law with an exponent that varies with material, we simulate a complex system with two driving mechanisms by crumpling two different sheets together. The probability function of the crumple sound is found to transit from two power-law terms to a bona fide power law as compaction increases. In addition to showing the vicinity of these two distributions in the phase space, this observation nicely demonstrates the effect of interactions to bring about a subtle change in macroscopic behavior and more information may be retrieved if the data are subject to sorting. Our analyses are based on the Akaike information criterion that is a direct measurement of information loss and emphasizes the need to strike a balance between model simplicity and goodness of fit. As a show of force, the Akaike information criterion also found the Gutenberg-Richter law for earthquakes and the scale-free model for a brain functional network, a two-dimensional sandpile, and solar flare intensity to suffer an excessive loss of information. They resemble more the crumpled-together ball at low compactions in that there appear to be two driving mechanisms that take turns occurring.

  2. Power-law ansatz in complex systems: Excessive loss of information.

    PubMed

    Tsai, Sun-Ting; Chang, Chin-De; Chang, Ching-Hao; Tsai, Meng-Xue; Hsu, Nan-Jung; Hong, Tzay-Ming

    2015-12-01

    The ubiquity of power-law relations in empirical data displays physicists' love of simple laws and uncovering common causes among seemingly unrelated phenomena. However, many reported power laws lack statistical support and mechanistic backings, not to mention discrepancies with real data are often explained away as corrections due to finite size or other variables. We propose a simple experiment and rigorous statistical procedures to look into these issues. Making use of the fact that the occurrence rate and pulse intensity of crumple sound obey a power law with an exponent that varies with material, we simulate a complex system with two driving mechanisms by crumpling two different sheets together. The probability function of the crumple sound is found to transit from two power-law terms to a bona fide power law as compaction increases. In addition to showing the vicinity of these two distributions in the phase space, this observation nicely demonstrates the effect of interactions to bring about a subtle change in macroscopic behavior and more information may be retrieved if the data are subject to sorting. Our analyses are based on the Akaike information criterion that is a direct measurement of information loss and emphasizes the need to strike a balance between model simplicity and goodness of fit. As a show of force, the Akaike information criterion also found the Gutenberg-Richter law for earthquakes and the scale-free model for a brain functional network, a two-dimensional sandpile, and solar flare intensity to suffer an excessive loss of information. They resemble more the crumpled-together ball at low compactions in that there appear to be two driving mechanisms that take turns occurring.
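
    The kind of comparison performed in this record can be illustrated by pitting a pure power-law fit against an exponential alternative on synthetic data and scoring each by AIC, which directly penalizes the extra information loss of the worse model. A sketch (synthetic data and closed-form maximum-likelihood fits; the paper's actual models and datasets differ):

```python
import numpy as np

rng = np.random.default_rng(1)
xmin = 1.0
# Synthetic power-law sample via inverse-CDF sampling: p(x) ~ x^-2.5 for x >= xmin
u = rng.random(2000)
x = xmin * u ** (-1.0 / 1.5)   # true exponent alpha = 2.5

n, s = x.size, np.sum(np.log(x / xmin))
# Power-law MLE (Hill estimator) and its log-likelihood
alpha = 1.0 + n / s
ll_pl = n * np.log((alpha - 1.0) / xmin) - alpha * s
# Exponential alternative on the same support, also fit by MLE
lam = 1.0 / np.mean(x - xmin)
ll_exp = n * np.log(lam) - lam * np.sum(x - xmin)
# Each model has one free parameter (xmin held fixed)
aic_pl, aic_exp = 2 * 1 - 2 * ll_pl, 2 * 1 - 2 * ll_exp
print(round(alpha, 2), aic_pl < aic_exp)
```

    On genuinely power-law data the power-law AIC is far lower; the record's point is that for many empirical datasets the comparison goes the other way, or favors a two-mechanism mixture.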

  3. Hardware-Algorithms Co-Design and Implementation of an Analog-to-Information Converter for Biosignals Based on Compressed Sensing.

    PubMed

    Pareschi, Fabio; Albertini, Pierluigi; Frattini, Giovanni; Mangia, Mauro; Rovatti, Riccardo; Setti, Gianluca

    2016-02-01

    We report the design and implementation of an Analog-to-Information Converter (AIC) based on Compressed Sensing (CS). The system is realized in a CMOS 180 nm technology and targets the acquisition of bio-signals with Nyquist frequency up to 100 kHz. To maximize performance and reduce hardware complexity, we co-design hardware together with acquisition and reconstruction algorithms. The resulting AIC outperforms previously proposed solutions mainly thanks to two key features. First, we adopt a novel method to deal with saturations in the computation of CS measurements. This allows no loss in performance even when 60% of measurements saturate. Second, the system is able to adapt itself to the energy distribution of the input by exploiting the so-called rakeness to maximize the amount of information contained in the measurements. With this approach, the 16 measurement channels integrated into a single device are expected to allow the acquisition and the correct reconstruction of most biomedical signals. As a case study, measurements on real electrocardiograms (ECGs) and electromyograms (EMGs) show that these signals can be reconstructed without any noticeable degradation at compression rates of 8 and 10, respectively.

  4. Scaling clearance in paediatric pharmacokinetics: All models are wrong, which are useful?

    PubMed Central

    Germovsek, Eva; Barker, Charlotte I. S.; Sharland, Mike

    2016-01-01

    Linked Articles: This article is commented on in the editorial by Holford NHG and Anderson BJ. Why standards are useful for predicting doses. Br J Clin Pharmacol 2017; 83: 685–7. doi: 10.1111/bcp.13230. Aim: When different models for weight and age are used in paediatric pharmacokinetic studies it is difficult to compare parameters between studies or perform model-based meta-analyses. This study aimed to compare published models with the proposed standard model (allometric weight^0.75 and a sigmoidal maturation function). Methods: A systematic literature search was undertaken to identify published clearance (CL) reports for gentamicin and midazolam and all published models for scaling clearance in children. Each model was fitted to the CL values for gentamicin and midazolam, and the results were compared with the standard model (allometric weight exponent of 0.75, along with a sigmoidal maturation function estimating the time in weeks of postmenstrual age to reach half the mature value and a shape parameter). For comparison, we also looked at allometric size models with no age effect, the influence of estimating the allometric exponent in the standard model and, for gentamicin, using a fixed allometric exponent of 0.632 as per a study on glomerular filtration rate maturation. The Akaike information criterion (AIC) and visual predictive checks were used for evaluation. Results: No model gave an improved AIC in all age groups, but one model for gentamicin and three models for midazolam gave slightly improved global AIC fits, albeit using more parameters: AIC drop (number of parameters) of –4.1 (5), –9.2 (4), –10.8 (5) and –10.1 (5), respectively. The 95% confidence intervals of estimated CL for all top-performing models overlapped. Conclusion: No evidence to reject the standard model was found; given the benefits of standardised parameterisation, its use should therefore be recommended. PMID:27767204
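
    The standard model referred to above combines allometric weight scaling (fixed exponent 0.75) with a sigmoidal (Hill-type) maturation function of postmenstrual age. A sketch of that parameterization with illustrative values (CL_std, TM50 and the Hill coefficient below are placeholders, not estimates from this paper):

```python
def clearance(wt_kg, pma_weeks, cl_std, tm50_weeks, hill):
    """Standard paediatric clearance model:
    CL = CL_std * (WT / 70)^0.75 * PMA^h / (TM50^h + PMA^h),
    where TM50 is the postmenstrual age at half the mature clearance."""
    size = (wt_kg / 70.0) ** 0.75
    fmat = pma_weeks ** hill / (tm50_weeks ** hill + pma_weeks ** hill)
    return cl_std * size * fmat

# Illustrative parameter values only
adult = clearance(70.0, 2000.0, cl_std=10.0, tm50_weeks=47.7, hill=3.4)
neonate = clearance(3.5, 40.0, cl_std=10.0, tm50_weeks=47.7, hill=3.4)
print(round(adult, 2), round(neonate, 2))
```

    For a 70 kg adult the size and maturation terms both approach 1, so CL approaches CL_std; a term neonate is scaled down by both terms, which is the behavior the standardized parameterization is designed to capture.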

  5. Spot counting on fluorescence in situ hybridization in suspension images using Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Liu, Sijia; Sa, Ruhan; Maguire, Orla; Minderman, Hans; Chaudhary, Vipin

    2015-03-01

    Cytogenetic abnormalities are important diagnostic and prognostic criteria for acute myeloid leukemia (AML). A flow cytometry-based imaging approach for FISH in suspension (FISH-IS) was established that enables the automated analysis of a several-log-magnitude higher number of cells compared to microscopy-based approaches. Rotational positioning of cells can occur, leading to overlapping spots and discordant spot counts. To address counting errors caused by overlapping spots, a Gaussian mixture model (GMM) based classification method is proposed in this study. The Akaike information criterion (AIC) and Bayesian information criterion (BIC) of the GMM are used as global image features in this classification method. Using a random forest classifier, the results show that the proposed method is able to detect closely overlapping spots that cannot be separated by existing image-segmentation-based spot detection methods. The experimental results show that the proposed method yields a significant improvement in spot counting accuracy.
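
    Using the AIC and BIC of a Gaussian mixture model to decide how many components underlie overlapping spots can be sketched in one dimension with a small EM fit (synthetic intensities, not FISH-IS data; the parameter counts are for a 1-D two-component mixture):

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D "spot intensity" data: two well-separated clusters
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(8.0, 1.0, 200)])
n = x.size

def gauss_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

# One-component model: closed-form Gaussian MLE, k = 2 parameters
ll1 = np.sum(np.log(gauss_pdf(x, x.mean(), x.var())))
aic1, bic1 = 2 * 2 - 2 * ll1, 2 * np.log(n) - 2 * ll1

# Two-component model: a short EM run, k = 5 parameters (2 means, 2 variances, 1 weight)
w = np.array([0.5, 0.5])
mu = np.array([x.min(), x.max()])
var = np.array([x.var(), x.var()])
for _ in range(100):
    dens = w * np.stack([gauss_pdf(x, m, v) for m, v in zip(mu, var)], axis=1)
    resp = dens / dens.sum(axis=1, keepdims=True)   # E-step: responsibilities
    nk = resp.sum(axis=0)                           # M-step: update parameters
    w = nk / n
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
ll2 = np.sum(np.log((w * np.stack([gauss_pdf(x, m, v)
                                   for m, v in zip(mu, var)], axis=1)).sum(axis=1)))
aic2, bic2 = 2 * 5 - 2 * ll2, 5 * np.log(n) - 2 * ll2
print(aic1 > aic2, bic1 > bic2)   # both criteria favor two components here
```

    When two spots genuinely overlap, the two-component fit lowers AIC and BIC enough to offset its extra parameters, which is the signal the classifier exploits.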

  6. Predicting road accidents: Structural time series approach

    NASA Astrophysics Data System (ADS)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and with this model the number of road accidents was predicted using the structural time series approach. The models were developed using a stepwise method, and the residuals of each step were analyzed. The accuracy of the model was assessed using the mean absolute percentage error (MAPE), and the best model was chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model, which allows the level and slope components to vary over time, best represents the road accidents. In addition, this approach also provides useful information for improving on conventional time series methods.

  7. Information-criterion based selection of models for community noise annoyance.

    PubMed

    Keith Wilson, D; Valente, Dan; Nykaza, Edward T; Pettit, Chris L

    2013-03-01

    Statistical evidence for various models relating day-night sound level (DNL) to community noise annoyance is assessed with the Akaike information criterion. In particular, community-specific adjustments such as the community tolerance level (CTL, the DNL at which 50% of survey respondents are highly annoyed) and community tolerance spread (CTS, the difference between the DNL at which 90% and 10% are highly annoyed) are considered. The results strongly support models characterizing annoyance on a community-by-community basis, rather than with complete pooling and analysis of all available surveys. The most likely model was found to be a 2-parameter logistic model, with CTL and CTS fit independently to survey data from each community.
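
    The community-specific parameters described above map directly onto a 2-parameter logistic curve: CTL fixes the midpoint and CTS fixes the slope. A sketch of that parameterization (illustrative, not the fitted survey model):

```python
import math

def frac_highly_annoyed(dnl, ctl, cts):
    """2-parameter logistic annoyance curve.

    ctl: DNL (dB) at which 50% of respondents are highly annoyed.
    cts: spread (dB) between the 10% and 90% highly-annoyed points;
         the slope k is chosen so that P(ctl + cts/2) = 0.9.
    """
    k = 2.0 * math.log(9.0) / cts
    return 1.0 / (1.0 + math.exp(-k * (dnl - ctl)))

# Example: a community with CTL = 72 dB and CTS = 20 dB
print(round(frac_highly_annoyed(72, 72, 20), 2))   # 0.5
print(round(frac_highly_annoyed(82, 72, 20), 2))   # 0.9
```

    Fitting ctl and cts separately for each surveyed community, rather than pooling all surveys into one curve, is the model structure the AIC comparison favors in this record.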

  8. EFFECT OF DIET QUALITY ON NUTRIENT ALLOCATION TO THE TEST AND ARISTOTLE'S LANTERN IN THE SEA URCHIN LYTECHINUS VARIEGATUS (LAMARCK, 1816).

    PubMed

    Heflin, Laura Elizabeth; Gibbs, Victoria K; Powell, Mickie L; Makowsky, Robert; Lawrence, Addison L; Lawrence, John M

    2012-08-01

    Small adult (19.50 ± 2.01 g wet weight) Lytechinus variegatus were fed eight formulated diets with different protein (12 to 36% dry weight as fed) and carbohydrate (21 to 39% dry weight) levels. Each sea urchin (n = 8 per treatment) was fed a daily ration of 1.5% of the average body weight of all individuals for 9 weeks. Akaike information criterion scores were used to compare six different dietary composition hypotheses for eight growth measurements. For each physical growth response, different mathematical models representing a priori hypotheses were compared using the Akaike Information Criterion (AIC) score. The AIC is one of many information-theoretic approaches that allows for direct comparison of non-nested models with varying numbers of parameters. Dietary protein level and protein:energy ratio were the best models for prediction of test diameter increase. Dietary protein level was the best model of test with spines wet weight gain and test with spines dry matter production. When the Aristotle's lantern was corrected for size of the test, there was an inverse relationship with dietary protein level. Log-transformed lantern to test with spines index was also best associated with the dietary protein model. Dietary carbohydrate level was a poor predictor for growth parameters. However, the protein × carbohydrate interaction model was the best model of organic content (% dry weight) of the test without spines. These data suggest that there is a differential allocation of resources when dietary protein is limiting and the test with spines, but not the Aristotle's lantern, is affected by availability of dietary nutrients.

  9. EFFECT OF DIET QUALITY ON NUTRIENT ALLOCATION TO THE TEST AND ARISTOTLE’S LANTERN IN THE SEA URCHIN LYTECHINUS VARIEGATUS (LAMARCK, 1816)

    PubMed Central

    Heflin, Laura Elizabeth; Gibbs, Victoria K; Powell, Mickie L; Makowsky, Robert; Lawrence, Addison L; Lawrence, John M

    2014-01-01

    Small adult (19.50 ± 2.01 g wet weight) Lytechinus variegatus were fed eight formulated diets with different protein (12 to 36% dry weight as fed) and carbohydrate (21 to 39% dry weight) levels. Each sea urchin (n = 8 per treatment) was fed a daily ration of 1.5% of the average body weight of all individuals for 9 weeks. For each of eight growth measurements, mathematical models representing six a priori dietary composition hypotheses were compared using Akaike Information Criterion (AIC) scores. The AIC is one of many information-theoretic approaches that allow direct comparison of non-nested models with varying numbers of parameters. Dietary protein level and protein:energy ratio were the best models for prediction of test diameter increase. Dietary protein level was the best model of test with spines wet weight gain and test with spines dry matter production. When the Aristotle’s lantern was corrected for size of the test, there was an inverse relationship with dietary protein level. The log-transformed lantern to test with spines index was also best associated with the dietary protein model. Dietary carbohydrate level was a poor predictor of growth parameters. However, the protein × carbohydrate interaction model was the best model of organic content (% dry weight) of the test without spines. These data suggest that there is differential allocation of resources when dietary protein is limiting, and that the test with spines, but not the Aristotle’s lantern, is affected by the availability of dietary nutrients. PMID:25431520

  10. Thermal Signature Identification System (TheSIS)

    NASA Technical Reports Server (NTRS)

    Merritt, Scott; Bean, Brian

    2015-01-01

    We characterize both nonlinear and high order linear responses of fiber-optic and optoelectronic components using spread spectrum temperature cycling methods. This Thermal Signature Identification System (TheSIS) provides much more detail than conventional narrowband or quasi-static temperature profiling methods. This detail allows us to match components more thoroughly, detect subtle reversible shifts in performance, and investigate the cause of instabilities or irreversible changes. In particular, we create parameterized models of athermal fiber Bragg gratings (FBGs), delay line interferometers (DLIs), and distributed feedback (DFB) lasers, then subject the alternative models to selection via the Akaike Information Criterion (AIC). Detailed pairing of components, e.g. FBGs, is accomplished by means of weighted distance metrics or norms, rather than on the basis of a single parameter, such as center wavelength.

  11. MMI: Multimodel inference or models with management implications?

    USGS Publications Warehouse

    Fieberg, J.; Johnson, Douglas H.

    2015-01-01

    We consider a variety of regression modeling strategies for analyzing observational data associated with typical wildlife studies, including all subsets and stepwise regression, a single full model, and Akaike's Information Criterion (AIC)-based multimodel inference. Although there are advantages and disadvantages to each approach, we suggest that there is no unique best way to analyze data. Further, we argue that, although multimodel inference can be useful in natural resource management, the importance of considering causality and accurately estimating effect sizes is greater than simply considering a variety of models. Determining causation is far more valuable than simply indicating how the response variable and explanatory variables covaried within a data set, especially when the data set did not arise from a controlled experiment. Understanding the causal mechanism will provide much better predictions beyond the range of data observed. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
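    The AIC-based multimodel inference weighed in this record rests on Akaike weights, the relative likelihood of each candidate model given the data, which are then used for model averaging. A minimal sketch (the AIC scores below are hypothetical):

```python
import math

def akaike_weights(aics):
    """Convert a list of AIC scores into Akaike weights: exp(-delta/2)
    for each model's AIC difference from the best, normalized to sum to 1."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Three candidate models with hypothetical AIC scores
weights = akaike_weights([100.0, 102.0, 110.0])
print([round(w, 3) for w in weights])   # → [0.727, 0.268, 0.005]
```

    A model-averaged estimate of a parameter is then the weight-weighted sum of the per-model estimates, which is the procedure the multimodel-inference literature discussed here refers to.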

  12. A K-BKZ Formulation for Soft-Tissue Viscoelasticity

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.; Diethelm, Kai

    2005-01-01

    A viscoelastic model of the K-BKZ (Kaye 1962; Bernstein et al. 1963) type is developed for isotropic biological tissues, and applied to the fat pad of the human heel. To facilitate this pursuit, a class of elastic solids is introduced through a novel strain-energy function whose elements possess strong ellipticity, and therefore lead to stable material models. The standard fractional-order viscoelastic (FOV) solid is used to arrive at the overall elastic/viscoelastic structure of the model, while the elastic potential via the K-BKZ hypothesis is used to arrive at the tensorial structure of the model. Candidate sets of functions are proposed for the elastic and viscoelastic material functions present in the model, including a regularized fractional derivative that was determined to be the best. The Akaike information criterion (AIC) is advocated for performing multi-model inference, enabling an objective selection of the best material function from within a candidate set.

  13. Particle-size distribution models for the conversion of Chinese data to FAO/USDA system.

    PubMed

    Shangguan, Wei; Dai, YongJiu; García-Gutiérrez, Carlos; Yuan, Hua

    2014-01-01

    We investigated eleven particle-size distribution (PSD) models to determine the appropriate models for describing the PSDs of 16349 Chinese soil samples. These data are based on three soil texture classification schemes, including one ISSS (International Society of Soil Science) scheme with four data points and two Katschinski's schemes with five and six data points, respectively. The adjusted coefficient of determination (r²), Akaike's information criterion (AIC), and geometric mean error ratio (GMER) were used to evaluate the model performance. The soil data were converted to the USDA (United States Department of Agriculture) standard using PSD models and the fractal concept. The performance of PSD models was affected by soil texture and the fraction classification scheme. The performance of PSD models also varied with the clay content of soils. The Anderson, Fredlund, modified logistic growth, Skaggs, and Weibull models were the best.

  14. MMA, A Computer Code for Multi-Model Analysis

    SciTech Connect

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
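    For models calibrated by nonlinear regression with Gaussian errors, the AIC, AICc, and BIC discrimination criteria that MMA reports can be written directly in terms of the residual sum of squares. The sketch below gives the standard least-squares forms (KIC is omitted because it requires the Fisher information matrix; the sample numbers are illustrative):

```python
import math

def aic(n, k, rss):
    """Akaike Information Criterion for a least-squares model with
    n observations, k estimated parameters, and residual sum of squares rss."""
    return n * math.log(rss / n) + 2 * k

def aicc(n, k, rss):
    """Second-order bias-corrected AIC; converges to AIC as n grows."""
    return aic(n, k, rss) + 2.0 * k * (k + 1) / (n - k - 1)

def bic(n, k, rss):
    """Bayesian Information Criterion; penalizes each parameter by ln(n)
    rather than 2, so it is stricter than AIC once n exceeds about 7."""
    return n * math.log(rss / n) + k * math.log(n)

# A more complex model must lower the RSS enough to pay its parameter penalty
n = 50
simple = aicc(n, 2, 120.0)
complex_ = aicc(n, 6, 100.0)
print(simple < complex_)   # → True: the RSS gain does not justify 4 extra parameters
```

    Ranking alternative calibrated models by any one of these scores, then converting the scores to posterior model probabilities, is the workflow the report describes.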

  15. Comparing seismic tomographic images from automatically- and manually-detected arrival times

    NASA Astrophysics Data System (ADS)

    Spallarossa, Daniele; Scafidi, Davide; Turino, Chiara; Ferretti, Gabriele; Viganò, Alfio

    2013-04-01

    In this work we compare local earthquake tomographic images obtained using arrival times detected by an automatic picking procedure and by an expert seismologist. For this purpose we select a reference dataset composed of 476 earthquakes that occurred in the Trentino region (north-eastern Italy) in the period 1994-2007. Local magnitudes range between 0.8 and 5.3. Original recordings are mainly from the Provincia Autonoma di Trento (PAT), and from other networks operating in the surrounding areas (Istituto Nazionale di Oceanografia e di Geofisica Sperimentale - INOGS; Istituto Nazionale di Geofisica e Vulcanologia - INGV; others available via the European Integrated Data Archive). The automatic picking of P and S phases is performed through a picker engine based on the Akaike information criterion (AIC). In particular, the proposed automatic phase picker includes: (i) envelope calculation, (ii) band-pass filtering, (iii) an Akaike information criterion (AIC) detector for both P- and S-arrivals, (iv) checking for impulsive arrivals, (v) evaluation of the expected S onset on the basis of a preliminary location derived from the P-arrival times, and (vi) quality assessment. Simultaneously, careful manual inspection by expert seismologists is applied to the same waveform dataset, to obtain manually-repicked phase readings. Both automatic and manual procedures generate a comparable amount of readings (about 6000 P- and 5000 S-phases). These data are used for the determination of two similar 3-D propagation models for the Trentino region, applying the SIMULPS code. In order to quantitatively estimate the difference between these two models we measure their discrepancies in terms of velocity at all grid points. The small differences observed among the tomographic results allow us to demonstrate that the automatic picking engine adopted in this test can be used for reprocessing large amounts of seismic recordings with the aim of performing a local tomographic study with an accuracy
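    An AIC onset detector of the kind used in step (iii) is commonly implemented directly on the waveform: the sample that minimizes a two-segment variance AIC is taken as the phase arrival. The sketch below is a generic illustration on a synthetic trace, not the authors' picker engine:

```python
import math
import random

def aic_onset(trace):
    """Return the index k minimizing the two-segment AIC
    AIC(k) = k*ln(var(x[:k])) + (N-k)*ln(var(x[k:])),
    i.e. the most likely change point in variance (the phase onset)."""
    n = len(trace)
    def var(seg):
        m = sum(seg) / len(seg)
        return sum((x - m) ** 2 for x in seg) / len(seg)
    best_k, best_aic = None, None
    for k in range(2, n - 2):
        a = (k * math.log(var(trace[:k]) + 1e-12)
             + (n - k) * math.log(var(trace[k:]) + 1e-12))
        if best_aic is None or a < best_aic:
            best_k, best_aic = k, a
    return best_k

# Synthetic trace: quiet background noise, then a higher-amplitude
# arrival starting at sample 100
random.seed(1)
trace = ([random.gauss(0, 0.1) for _ in range(100)]
         + [random.gauss(0, 1.0) for _ in range(100)])
print(aic_onset(trace))   # close to the true onset at index 100
```

    In practice the detector is run on a short window around a preliminary trigger rather than the whole record, which keeps the O(n²) cost of this naive version negligible.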

  16. Predicting CCHF incidence and its related factors using time-series analysis in the southeast of Iran: comparison of SARIMA and Markov switching models.

    PubMed

    Ansari, H; Mansournia, M A; Izadi, S; Zeinali, M; Mahmoodi, M; Holakouie-Naieni, K

    2015-03-01

    Crimean-Congo haemorrhagic fever (CCHF) is endemic in the southeast of Iran. This study aimed to predict the incidence of CCHF and its related factors and explore the possibility of developing an empirical forecast system using time-series analysis of 13 years' data. Data from 2000 to 2012 were obtained from the Health Centre of Zahedan University of Medical Sciences, Climate Organization and the Veterinary Organization in the southeast of Iran. Seasonal autoregressive integrated moving average (SARIMA) and Markov switching models (MSM) were performed to examine the potential related factors of CCHF outbreaks. These models showed that the mean temperature (°C), accumulated rainfall (mm), maximum relative humidity (%) and legal livestock importation from Pakistan (LIP) were significantly correlated with monthly incidence of CCHF at different lags (P < 0·05). The modelling fitness was checked with data from 2013. Model assessments indicated that the MSM had better predictive ability than the SARIMA model [MSM: root mean square error (RMSE) 0·625, Akaike's Information Criterion (AIC) 266·33; SARIMA: RMSE 0·725, AIC 278·8]. This study shows the potential of climate indicators and LIP as predictive factors in modelling the occurrence of CCHF. Our results suggest that MSM provides more information on outbreak detection and can be a better predictive model compared to a SARIMA model for evaluation of the relationship between explanatory variables and the incidence of CCHF.

  17. Assessment of non-Gaussian diffusion with singly and doubly stretched biexponential models of diffusion-weighted MRI (DWI) signal attenuation in prostate tissue.

    PubMed

    Hall, Matt G; Bongers, Andre; Sved, Paul; Watson, Geoffrey; Bourne, Roger M

    2015-04-01

    Non-Gaussian diffusion dynamics was investigated in the two distinct water populations identified by a biexponential model of diffusion in prostate tissue. Diffusion-weighted MRI (DWI) signal attenuation was measured ex vivo in two formalin-fixed prostates at 9.4 T with diffusion times Δ = 10, 20 and 40 ms, and b values in the range 0.017-8.2 ms/µm². A conventional biexponential model was compared with models in which either the lower diffusivity component or both of the components of the biexponential were stretched. Models were compared using Akaike's Information Criterion (AIC) and a leave-one-out (LOO) test of model prediction accuracy. The doubly stretched (SS) model had the highest LOO prediction accuracy and lowest AIC (highest information content) in the majority of voxels at Δ = 10 and 20 ms. The lower diffusivity stretching factor (α2) of the SS model was consistently lower (range ~0.3-0.9) than the higher diffusivity stretching factor (α1, range ~0.7-1.1), indicating a high degree of diffusion heterogeneity in the lower diffusivity environment, and nearly Gaussian diffusion in the higher diffusivity environment. Stretched biexponential models demonstrate that, in prostate tissue, the two distinct water populations identified by the simple biexponential model individually exhibit non-Gaussian diffusion dynamics.

  18. The evaluation of different forest structural indices to predict the stand aboveground biomass of even-aged Scotch pine (Pinus sylvestris L.) forests in Kunduz, Northern Turkey.

    PubMed

    Ercanli, İlker; Kahriman, Aydın

    2015-03-01

    We assessed the effect of stand structural diversity, including the Shannon, improved Shannon, Simpson, McIntosh, Margalef, and Berger-Parker indices, on stand aboveground biomass (AGB) and developed statistical prediction models for the stand AGB values, including stand structural diversity indices and some stand attributes. The AGB prediction model, including only stand attributes, accounted for 85% of the total variance in AGB (R²) with an Akaike's information criterion (AIC) of 807.2407, Bayesian information criterion (BIC) of 809.5397, Schwarz Bayesian criterion (SBC) of 818.0426, and root mean square error (RMSE) of 38.529 Mg. After inclusion of the stand structural diversity into the model structure, considerable improvement was observed in statistical accuracy, with the model explaining 97.5% of the total variance in AGB, with an AIC of 614.1819, BIC of 617.1242, SBC of 633.0853, and RMSE of 15.8153 Mg. The predictive fitting results indicate that some indices describing the stand structural diversity can be employed as significant independent variables to predict the AGB production of the Scotch pine stand. Further, including the stand diversity indices in the AGB prediction model with the stand attributes provided important predictive contributions in estimating the total variance in AGB.

  19. Comparison of alternatives to amplitude thresholding for onset detection of acoustic emission signals

    NASA Astrophysics Data System (ADS)

    Bai, F.; Gagar, D.; Foote, P.; Zhao, Y.

    2017-02-01

    Acoustic Emission (AE) monitoring can be used to detect the presence of damage as well as determine its location in Structural Health Monitoring (SHM) applications. Information on the time difference of the signal generated by the damage event arriving at different sensors in an array is essential in performing localisation. Currently, this is determined using a fixed threshold, which is particularly prone to errors when not set to optimal values. This paper presents three new methods for determining the onset of AE signals without the need for a predetermined threshold. The performance of the techniques is evaluated using AE signals generated during fatigue crack growth and compared to the established Akaike Information Criterion (AIC) and fixed threshold methods. It was found that the 1D location accuracy of the new methods was within the range of <1-7.1% of the monitored region, compared with 2.7% for the AIC method and a range of 1.8-9.4% for the conventional Fixed Threshold method at different threshold levels.

  20. Model selection for the extraction of movement primitives

    PubMed Central

    Endres, Dominik M.; Chiovetto, Enrico; Giese, Martin A.

    2013-01-01

    A wide range of blind source separation methods have been used in motor control research for the extraction of movement primitives from EMG and kinematic data. Popular examples are principal component analysis (PCA), independent component analysis (ICA), anechoic demixing, and the time-varying synergy model (d'Avella and Tresch, 2002). However, choosing the parameters of these models, or indeed choosing the type of model, is often done in a heuristic fashion, driven by result expectations as much as by the data. We propose an objective criterion which allows selection of the model type, the number of primitives, and the temporal smoothness prior. Our approach is based on a Laplace approximation to the posterior distribution of the parameters of a given blind source separation model, re-formulated as a Bayesian generative model. We first validate our criterion on ground truth data, showing that it performs at least as well as traditional model selection criteria [the Bayesian information criterion, BIC (Schwarz, 1978), and the Akaike Information Criterion, AIC (Akaike, 1974)]. Then, we analyze human gait data, finding that an anechoic mixture model with a temporal smoothness constraint on the sources can best account for the data. PMID:24391580

  1. Theorizing Information for Information Science.

    ERIC Educational Resources Information Center

    Cornelius, Ian

    2002-01-01

    Considers whether information science has a theory of information. Highlights include guides to information and its theory; constructivism; information outside information science; process theories; cognitive views of information; measuring information; meaning; and misinformation. (Contains 89 references.) (LRW)

  2. Modern methods of image reconstruction.

    NASA Astrophysics Data System (ADS)

    Puetter, R. C.

    The author reviews the image restoration or reconstruction problem in its general setting. He first discusses linear methods for solving the problem of image deconvolution, i.e. the case in which the data are a convolution of a point-spread function and an underlying unblurred image. Next, non-linear methods are introduced in the context of Bayesian estimation, including maximum likelihood and maximum entropy methods. Then, the author discusses the role of language and information theory concepts for data compression and solving the inverse problem. The concept of algorithmic information content (AIC) is introduced and is shown to be crucial to achieving optimal data compression and optimized Bayesian priors for image reconstruction. The dependence of the AIC on the selection of language then suggests how efficient coordinate systems for the inverse problem may be selected. The author also introduces pixon-based image restoration and reconstruction methods. The relation between image AIC and the Bayesian incarnation of Occam's Razor is discussed, as well as the relation of multiresolution pixon languages and image fractal dimension. Also discussed is the relation of pixons to the role played by the Heisenberg uncertainty principle in statistical physics and how pixon-based image reconstruction provides a natural extension to the Akaike information criterion for maximum likelihood. The author presents practical applications of pixon-based Bayesian estimation to the restoration of astronomical images. He discusses the effects of noise, effects of finite sampling on resolution, and special problems associated with spatially correlated noise introduced by mosaicing. Comparisons to other methods demonstrate the significant improvements afforded by pixon-based methods and illustrate the science that such performance improvements allow.

  3. Modelling the growth of Japanese eel Anguilla japonica in the lower reach of the Kao-Ping River, southern Taiwan: an information theory approach.

    PubMed

    Lin, Y J; Tzeng, W N

    2009-07-01

    Information theory was applied to select the best model fitting total length (L(T))-at-age data and calculate the averaged model for Japanese eel Anguilla japonica compiled from published literature, and the differences in growth between sexes were examined. Five candidate growth models were the von Bertalanffy, generalized von Bertalanffy, Gompertz, logistic and power models. The von Bertalanffy growth model with sex-specific coefficients was best supported by the data and nearly overlapped the averaged growth model based on Akaike weights, indicating a similar fit to the data. The Gompertz, generalized von Bertalanffy and power growth models were also substantially supported by the data. The L(T) at age of A. japonica were larger in females than in males according to the averaged growth model, suggesting a sexual dimorphism in growth. Model inferences based on information theory, which deal with uncertainty in model selection and yield robust parameter estimates, are recommended for modelling the growth of A. japonica.

  4. Evaluation of viral load thresholds for predicting new WHO Stage 3 and 4 events in HIV-infected children receiving highly active antiretroviral therapy

    PubMed Central

    Siberry, George K; Harris, D. Robert; Oliveira, Ricardo Hugo; Krauss, Margot R.; Hofer, Cristina B.; Tiraboschi, Adriana Aparecida; Marques, Heloisa; Succi, Regina C.; Abreu, Thalita; Negra, Marinella Della; Mofenson, Lynne M.; Hazra, Rohan

    2012-01-01

    Background This study evaluated a wide range of viral load (VL) thresholds to identify a cut-point that best predicts new clinical events in children on stable highly-active antiretroviral therapy (HAART). Methods Cox proportional hazards modeling was used to assess the adjusted risk of World Health Organization stage 3 or 4 clinical events (WHO events) as a function of time-varying CD4, VL, and hemoglobin values in a cohort study of Latin American children on HAART ≥ 6 months. Models were fit using different VL cut-points between 400 and 50,000 copies/mL, with model fit evaluated on the basis of the minimum Akaike Information Criterion (AIC) value, a standard model fit statistic. Results Models were based on 67 subjects with WHO events out of 550 subjects on study. The VL cut-points of > 2600 copies/mL and > 32,000 copies/mL corresponded to the lowest AIC values and were associated with the highest hazard ratios [2.0 (p = 0.015) and 2.1 (p = 0.0058), respectively] for WHO events. Conclusions In HIV-infected Latin American children on stable HAART, two distinct VL thresholds (> 2,600 copies/mL and > 32,000 copies/mL) were identified for predicting children at significantly increased risk of HIV-related clinical illness, after accounting for CD4 level, hemoglobin level, and other significant factors. PMID:22343177
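    The cut-point search described here can be illustrated with a much simpler stand-in for the Cox model: a two-parameter Bernoulli model with one event rate below the candidate viral-load threshold and one at or above it, scored by AIC. The cohort below is fabricated for illustration only and does not reproduce the study's data:

```python
import math

def bernoulli_loglik(events):
    """Log-likelihood of 0/1 outcomes under their group's MLE event rate."""
    n, k = len(events), sum(events)
    if n == 0 or k == 0 or k == n:
        return 0.0
    p = k / n
    return k * math.log(p) + (n - k) * math.log(1.0 - p)

def aic_for_cutpoint(vls, events, cut):
    """AIC of a 2-parameter model: one event rate below the viral-load
    cut-point and another at or above it (k = 2, so penalty = 4)."""
    below = [e for v, e in zip(vls, events) if v < cut]
    above = [e for v, e in zip(vls, events) if v >= cut]
    ll = bernoulli_loglik(below) + bernoulli_loglik(above)
    return -2.0 * ll + 2 * 2

# Fabricated cohort: 10% event rate below ~3000 copies/mL, 40% above
vls, events = [], []
for v in (500, 1200, 2000, 2500):        # VLs below the true break
    vls += [v] * 20
    events += [1] * 2 + [0] * 18
for v in (5000, 20000, 40000, 90000):    # VLs above the true break
    vls += [v] * 20
    events += [1] * 8 + [0] * 12
best = min([400, 1000, 2600, 10000, 32000],
           key=lambda c: aic_for_cutpoint(vls, events, c))
print(best)   # → 2600, the candidate closest to the true change point
```

    Scanning candidate thresholds and keeping the one with the minimum AIC is the same selection logic as in the study, with the Cox partial likelihood replaced by a Bernoulli likelihood for brevity.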

  5. An Investigation of State-Space Model Fidelity for SSME Data

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2008-01-01

    In previous studies, a variety of unsupervised anomaly detection techniques were applied to SSME (Space Shuttle Main Engine) data. The observed results indicated that the identification of certain anomalies was specific to the algorithmic method under consideration. This is the reason why one of the follow-on goals of these previous investigations was to build an architecture to support the best capabilities of all algorithms. We appeal to that goal here by investigating a cascade, serial architecture for the best performing and most suitable candidates from previous studies. As a precursor to a formal ROC (Receiver Operating Characteristic) curve analysis for validation of resulting anomaly detection algorithms, our primary focus here is to investigate the model fidelity as measured by variants of the AIC (Akaike Information Criterion) for state-space based models. We show that placing constraints on a state-space model during or after the training of the model introduces a modest level of suboptimality. Furthermore, we compare the fidelity of all candidate models including those embodying the cascade, serial architecture. We make recommendations on the most suitable candidates for application to subsequent anomaly detection studies as measured by AIC-based criteria.

  6. The Hyper-Envelope Modeling Interface (HEMI): A Novel Approach Illustrated Through Predicting Tamarisk ( Tamarix spp.) Habitat in the Western USA

    NASA Astrophysics Data System (ADS)

    Graham, Jim; Young, Nick; Jarnevich, Catherine S.; Newman, Greg; Evangelista, Paul; Stohlgren, Thomas J.

    2013-10-01

    Habitat suitability maps are commonly created by modeling a species' environmental niche from occurrences and environmental characteristics. Here, we introduce the hyper-envelope modeling interface (HEMI), providing a new method for creating habitat suitability models using Bezier surfaces to model a species niche in environmental space. HEMI allows modeled surfaces to be visualized and edited in environmental space based on expert knowledge and does not require absence points for model development. The modeled surfaces require relatively few parameters compared to similar modeling approaches and may produce models that better match ecological niche theory. As a case study, we modeled the invasive species tamarisk (Tamarix spp.) in the western USA. We compare results from HEMI with those from existing similar modeling approaches (including BioClim, BioMapper, and Maxent). We used synthetic surfaces to create visualizations of the various models in environmental space and used the modified area under the curve (AUC) statistic and the Akaike information criterion (AIC) as measures of model performance. We show that HEMI produced slightly better AUC values, except for Maxent, and better AIC values overall. HEMI created a model with only ten parameters, while Maxent produced a model with over 100 and BioClim used only eight. Additionally, HEMI allowed visualization and editing of the model in environmental space to develop alternative potential habitat scenarios. The use of Bezier surfaces can provide simple models that match our expectations of biological niche models and, at least in some cases, out-perform more complex approaches.

  7. The hyper-envelope modeling interface (HEMI): a novel approach illustrated through predicting tamarisk (Tamarix spp.) habitat in the Western USA.

    PubMed

    Graham, Jim; Young, Nick; Jarnevich, Catherine S; Newman, Greg; Evangelista, Paul; Stohlgren, Thomas J

    2013-10-01

    Habitat suitability maps are commonly created by modeling a species’ environmental niche from occurrences and environmental characteristics. Here, we introduce the hyper-envelope modeling interface (HEMI), providing a new method for creating habitat suitability models using Bezier surfaces to model a species niche in environmental space. HEMI allows modeled surfaces to be visualized and edited in environmental space based on expert knowledge and does not require absence points for model development. The modeled surfaces require relatively few parameters compared to similar modeling approaches and may produce models that better match ecological niche theory. As a case study, we modeled the invasive species tamarisk (Tamarix spp.) in the western USA. We compare results from HEMI with those from existing similar modeling approaches (including BioClim, BioMapper, and Maxent). We used synthetic surfaces to create visualizations of the various models in environmental space and used the modified area under the curve (AUC) statistic and the Akaike information criterion (AIC) as measures of model performance. We show that HEMI produced slightly better AUC values, except for Maxent, and better AIC values overall. HEMI created a model with only ten parameters, while Maxent produced a model with over 100 and BioClim used only eight. Additionally, HEMI allowed visualization and editing of the model in environmental space to develop alternative potential habitat scenarios. The use of Bezier surfaces can provide simple models that match our expectations of biological niche models and, at least in some cases, out-perform more complex approaches.

  8. The Hyper-Envelope Modeling Interface (HEMI): A Novel Approach Illustrated Through Predicting Tamarisk (Tamarix spp.) Habitat in the Western USA

    USGS Publications Warehouse

    Graham, Jim; Young, Nick; Jarnevich, Catherine S.; Newman, Greg; Evangelista, Paul; Stohlgren, Thomas J.

    2013-01-01

    Habitat suitability maps are commonly created by modeling a species’ environmental niche from occurrences and environmental characteristics. Here, we introduce the hyper-envelope modeling interface (HEMI), providing a new method for creating habitat suitability models using Bezier surfaces to model a species niche in environmental space. HEMI allows modeled surfaces to be visualized and edited in environmental space based on expert knowledge and does not require absence points for model development. The modeled surfaces require relatively few parameters compared to similar modeling approaches and may produce models that better match ecological niche theory. As a case study, we modeled the invasive species tamarisk (Tamarix spp.) in the western USA. We compare results from HEMI with those from existing similar modeling approaches (including BioClim, BioMapper, and Maxent). We used synthetic surfaces to create visualizations of the various models in environmental space and used the modified area under the curve (AUC) statistic and the Akaike information criterion (AIC) as measures of model performance. We show that HEMI produced slightly better AUC values, except for Maxent, and better AIC values overall. HEMI created a model with only ten parameters, while Maxent produced a model with over 100 and BioClim used only eight. Additionally, HEMI allowed visualization and editing of the model in environmental space to develop alternative potential habitat scenarios. The use of Bezier surfaces can provide simple models that match our expectations of biological niche models and, at least in some cases, out-perform more complex approaches.

  9. Assessing bimodality to detect the presence of a dual cognitive process.

    PubMed

    Freeman, Jonathan B; Dale, Rick

    2013-03-01

    Researchers have long sought to distinguish between single-process and dual-process cognitive phenomena, using responses such as reaction times and, more recently, hand movements. Analysis of a response distribution's modality has been crucial in detecting the presence of dual processes, because they tend to introduce bimodal features. Rarely, however, have bimodality measures been systematically evaluated. We carried out tests of readily available bimodality measures that any researcher may easily employ: the bimodality coefficient (BC), Hartigan's dip statistic (HDS), and the difference in Akaike's information criterion between one-component and two-component distribution models (AICdiff). We simulated distributions containing two response populations and examined the influences of (1) the distances between populations, (2) proportions of responses, (3) the amount of positive skew present, and (4) sample size. Distance always had a stronger effect than did proportion, and the effects of proportion greatly differed across the measures. Skew biased the measures by increasing bimodality detection, in some cases leading to anomalous interactive effects. BC and HDS were generally convergent, but a number of important discrepancies were found. AICdiff was extremely sensitive to bimodality and identified nearly all distributions as bimodal. However, all measures served to detect the presence of bimodality in comparison to unimodal simulations. We provide a validation with experimental data, discuss methodological and theoretical implications, and make recommendations regarding the choice of analysis.
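    Of the three measures evaluated, the bimodality coefficient is the simplest to reproduce. The sketch below uses plain (uncorrected) moment estimators for skewness and excess kurtosis together with the usual small-sample correction term and the conventional BC > 5/9 benchmark; the two data sets are illustrative:

```python
import math

def bimodality_coefficient(xs):
    """BC = (skew^2 + 1) / (excess kurtosis + small-sample correction).
    Values above ~5/9 (0.555) are conventionally taken to suggest
    bimodality; well-separated modes drive kurtosis down and BC up."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    skew = m3 / m2 ** 1.5
    ex_kurt = m4 / m2 ** 2 - 3.0
    correction = 3.0 * (n - 1) ** 2 / ((n - 2) * (n - 3))
    return (skew ** 2 + 1.0) / (ex_kurt + correction)

# Two well-separated response populations vs. a single bell-shaped one
bimodal = [-2.0] * 50 + [2.0] * 50
weights = [1, 6, 15, 20, 15, 6, 1]            # binomial(6, 0.5) shape
unimodal = [float(v - 3) for v, w in enumerate(weights) for _ in range(w)]
print(bimodality_coefficient(bimodal) > 5 / 9)    # → True: flagged bimodal
print(bimodality_coefficient(unimodal) > 5 / 9)   # → False: unimodal
```

    As the record notes, BC is sensitive to skew, so in practice it is best read alongside HDS or a mixture-model AIC comparison rather than on its own.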

  10. Double point source W-phase inversion: Real-time implementation and automated model selection

    USGS Publications Warehouse

    Nealy, Jennifer; Hayes, Gavin

    2015-01-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique, which can be efficiently implemented in real-time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion with centroid locations fixed at the single source solution location can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is most appropriately described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed with extended analyses of the September 29, 2009 magnitude 8.1 Samoa earthquake and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquake. The AIC test is shown to be able to accurately select the most appropriate model and the selected W-phase inversion is shown to yield reliable solutions that match published analyses of the same events.
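The AIC test used to choose between single- and double-source solutions can be sketched as follows. For a least-squares inversion with Gaussian residuals, AIC reduces (up to a constant) to n·ln(RSS/n) + 2k; the residual sums of squares and parameter counts below are hypothetical, not values from the W-phase inversions.

```python
import math

def aic_ls(rss, n, k):
    # AIC for a least-squares fit with Gaussian residuals:
    # AIC = n * ln(RSS / n) + 2k (additive constants dropped).
    return n * math.log(rss / n) + 2 * k

# Hypothetical misfits for single- and double-source inversions of the
# same n waveform samples; the double source has twice the parameters.
n = 500
rss_single, k_single = 12.0, 6
rss_double, k_double = 7.5, 12

aic_single = aic_ls(rss_single, n, k_single)
aic_double = aic_ls(rss_double, n, k_double)
best = "double" if aic_double < aic_single else "single"
```

The extra parameters of the double source are only accepted when the misfit reduction outweighs the 2k penalty, which is how the test avoids over-fitting simple events.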

  11. Double Point Source W-phase Inversion: Real-time Implementation and Automated Model Selection

    NASA Astrophysics Data System (ADS)

    Nealy, J. L.; Hayes, G. P.

    2015-12-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever-evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique, which can be efficiently implemented in real-time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion with centroid locations fixed at the single source solution location can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is most appropriately described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed with extended analyses of the September 29, 2009 magnitude 8.1 Samoa and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquakes. The AIC test is shown to be able to accurately select the most appropriate model and the selected W-phase inversion is shown to yield reliable solutions that match previously published analyses of the same events.

  12. Double point source W-phase inversion: Real-time implementation and automated model selection

    NASA Astrophysics Data System (ADS)

    Nealy, Jennifer L.; Hayes, Gavin P.

    2015-12-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique, which can be efficiently implemented in real-time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion with centroid locations fixed at the single source solution location can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is most appropriately described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed with extended analyses of the September 29, 2009 magnitude 8.1 Samoa earthquake and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquake. The AIC test is shown to be able to accurately select the most appropriate model and the selected W-phase inversion is shown to yield reliable solutions that match published analyses of the same events.

  13. Metastatic lymph node ratio demonstrates better prognostic stratification than pN staging in patients with esophageal squamous cell carcinoma after esophagectomy

    PubMed Central

    Zhang, Hongdian; Liang, Huagang; Gao, Yongyin; Shang, Xiaobin; Gong, Lei; Ma, Zhao; Sun, Ke; Tang, Peng; Yu, Zhentao

    2016-01-01

    This study aimed to evaluate the prognostic significance of lymph node ratio (LNR) by establishing a hypothetical tumor-ratio-metastasis (TRM) staging system in patients with esophageal squamous cell carcinoma (ESCC). The records of 387 ESCC patients receiving curative esophagectomy were retrospectively investigated. The optimal cut-point for LNR was assessed via the best cut-off approach. Potential prognostic parameters were identified through univariate and multivariate analyses. A novel LNR-based TRM stage was proposed. The prognostic discriminatory ability and prediction accuracy of each system were determined using hazard ratio (HR), Akaike information criterion (AIC), concordance index (C-index), and area under the receiver operating characteristic curve (AUC). The optimal cut-points divided LNR into four categories: 0, 0–0.2, 0.2–0.4, and 0.4–1.0. Multivariate Cox analysis indicated that the LNR category was an independent risk factor of overall survival (P < 0.001). The calibration curves for the probability of 3- and 5-year survival showed good consistency between nomogram prediction and actual observation. The LNR category and TRM stage yielded a larger HR, a smaller AIC, a larger C-index, and a larger AUC than the N category and TNM stage did. In summary, the proposed LNR category was superior to the conventional N category in predicting the prognosis of ESCC patients. PMID:27941828

  14. Genetic evaluation using random regression models with different covariance functions for test-day milk yield in an admixture population of Thailand goats.

    PubMed

    Thepparat, Mongkol; Boonkum, Wuttigrai; Duangjinda, Monchai; Tumwasorn, Sornthep; Nakavisut, Sansak; Thongchumroon, Thumrong

    2015-07-01

    The objectives of this study were to compare covariance functions (CF) and estimate the heritability of milk yield from test-day records among exotic (Saanen, Anglo-Nubian, Toggenburg and Alpine) and crossbred goats (Thai native and exotic breed), using a random regression model. A total of 1472 records of test-day milk yield were used, collected from 112 does between 2003 and 2006. The CF compared were the Wilmink function, second- and third-order Legendre polynomials, and linear splines with 4 knots located at 5, 25, 90 and 155 days in milk (SP25-90) or at 5, 35, 95 and 155 days in milk (SP35-95). Variance components were estimated by the restricted maximum likelihood (REML) method. Goodness of fit was compared across models using the Akaike information criterion (AIC), percentage of squared bias (PSB), mean square error (MSE), and empirical correlation (RHO) between observed and predicted values. The results showed that CF had an impact on (co)variance estimation in random regression models (RRM). The RRM with spline knots at 5, 25, 90 and 155 days in milk (SP25-90) had the lowest AIC, PSB and MSE, and the highest RHO. Heritability estimates throughout lactation obtained with this model ranged from 0.13 to 0.23.
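As an illustration of the model-comparison statistics named above, the following computes MSE, PSB, and RHO for observed versus predicted test-day yields. The PSB formula follows one common definition (100·Σ(y−ŷ)²/Σy²); the paper's exact formula may differ, and the toy data are invented.

```python
import math

def fit_statistics(observed, predicted):
    # Mean square error, percentage of squared bias, and empirical
    # (Pearson) correlation between observed and predicted values.
    n = len(observed)
    sq_err = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    mse = sq_err / n
    psb = 100 * sq_err / sum(o ** 2 for o in observed)
    mo = sum(observed) / n
    mp = sum(predicted) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(observed, predicted))
    rho = cov / math.sqrt(sum((o - mo) ** 2 for o in observed)
                          * sum((p - mp) ** 2 for p in predicted))
    return mse, psb, rho

# Toy observed vs. predicted milk yields.
mse, psb, rho = fit_statistics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8])
```

Lower MSE and PSB and higher RHO indicate the better-fitting covariance function, complementing the AIC ranking.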

  15. Age and Growth of the Round Stingray Urotrygon rogersi, a Particularly Fast-Growing and Short-Lived Elasmobranch

    PubMed Central

    Mejía-Falla, Paola A.; Cortés, Enric; Navia, Andrés F.; Zapata, Fernando A.

    2014-01-01

    We examined the age and growth of Urotrygon rogersi on the Colombian coast of the Eastern Tropical Pacific Ocean by directly estimating age using vertebral centra. We verified annual deposition of growth increments with marginal increment analysis. Eight growth curves were fitted to four data sets defined on the basis of the reproductive cycle (unadjusted or adjusted for age at first band) and size variables (disc width or total length). Model performance was evaluated using Akaike's Information Criterion (AIC), AIC weights and multi-model inference criteria. A two-phase growth function with adjusted age provided the best description of growth for females (based on five parameters, DW∞ = 20.1 cm, k = 0.22 yr⁻¹) and males (based on four and five parameters, DW∞ = 15.5 cm, k = 0.65 yr⁻¹). Median maturity of female and male U. rogersi is reached very fast (mean ± SE = 1.0 ± 0.1 year). This is the first age and growth study for a species of the genus Urotrygon and results indicate that U. rogersi attains a smaller maximum size and has a shorter lifespan and lower median age at maturity than species of closely related genera. These life history traits are in contrast with those typically reported for other elasmobranchs. PMID:24776963

  16. Testing signal-detection models of yes/no and two-alternative forced-choice recognition memory.

    PubMed

    Jang, Yoonhee; Wixted, John T; Huber, David E

    2009-05-01

    The current study compared 3 models of recognition memory in their ability to generalize across yes/no and 2-alternative forced-choice (2AFC) testing. The unequal-variance signal-detection model assumes a continuous memory strength process. The dual-process signal-detection model adds a thresholdlike recollection process to a continuous familiarity process. The mixture signal-detection model assumes a continuous memory strength process, but the old item distribution consists of a mixture of 2 distributions with different means. Prior efforts comparing the ability of the models to characterize data from both test formats did not consider the role of parameter reliability, which can be critical when comparing models that differ in flexibility. Parametric bootstrap simulations revealed that parameter regressions based on separate fits of each test type only served to identify the least flexible model. However, simultaneous fits of receiver-operating characteristic data from both test types with goodness-of-fit adjusted with Akaike's information criterion (AIC) successfully recovered the true model that generated the data. With AIC and simultaneous fits to real data, the unequal-variance signal-detection model was found to provide the best account across yes/no and 2AFC testing.

  17. Age and growth of the round stingray Urotrygon rogersi, a particularly fast-growing and short-lived elasmobranch.

    PubMed

    Mejía-Falla, Paola A; Cortés, Enric; Navia, Andrés F; Zapata, Fernando A

    2014-01-01

    We examined the age and growth of Urotrygon rogersi on the Colombian coast of the Eastern Tropical Pacific Ocean by directly estimating age using vertebral centra. We verified annual deposition of growth increments with marginal increment analysis. Eight growth curves were fitted to four data sets defined on the basis of the reproductive cycle (unadjusted or adjusted for age at first band) and size variables (disc width or total length). Model performance was evaluated using Akaike's Information Criterion (AIC), AIC weights and multi-model inference criteria. A two-phase growth function with adjusted age provided the best description of growth for females (based on five parameters, DW∞ = 20.1 cm, k = 0.22 yr⁻¹) and males (based on four and five parameters, DW∞ = 15.5 cm, k = 0.65 yr⁻¹). Median maturity of female and male U. rogersi is reached very fast (mean ± SE = 1.0 ± 0.1 year). This is the first age and growth study for a species of the genus Urotrygon and results indicate that U. rogersi attains a smaller maximum size and has a shorter lifespan and lower median age at maturity than species of closely related genera. These life history traits are in contrast with those typically reported for other elasmobranchs.

  18. Difference image analysis: automatic kernel design using information criteria

    NASA Astrophysics Data System (ADS)

    Bramich, D. M.; Horne, Keith; Alsubai, K. A.; Bachelet, E.; Mislis, D.; Parley, N.

    2016-03-01

    We present a selection of methods for automatically constructing an optimal kernel model for difference image analysis which require very few external parameters to control the kernel design. Each method consists of two components; namely, a kernel design algorithm to generate a set of candidate kernel models, and a model selection criterion to select the simplest kernel model from the candidate models that provides a sufficiently good fit to the target image. We restricted our attention to the case of solving for a spatially invariant convolution kernel composed of delta basis functions, and we considered 19 different kernel solution methods including six employing kernel regularization. We tested these kernel solution methods by performing a comprehensive set of image simulations and investigating how their performance in terms of model error, fit quality, and photometric accuracy depends on the properties of the reference and target images. We find that the irregular kernel design algorithm employing unregularized delta basis functions, combined with either the Akaike or Takeuchi information criterion, is the best kernel solution method in terms of photometric accuracy. Our results are validated by tests performed on two independent sets of real data. Finally, we provide some important recommendations for software implementations of difference image analysis.

  19. Spatial Distribution of Black Bear Incident Reports in Michigan.

    PubMed

    McFadden-Hiller, Jamie E; Beyer, Dean E; Belant, Jerrold L

    2016-01-01

    Interactions between humans and carnivores have existed for centuries due to competition for food and space. American black bears are increasing in abundance and populations are expanding geographically in many portions of their range, including areas that are also increasing in human density, often resulting in associated increases in human-bear conflict (hereafter, bear incidents). We used public reports of bear incidents in Michigan, USA, from 2003-2011 to assess the relative contributions of ecological and anthropogenic variables in explaining the spatial distribution of bear incidents and estimated the potential risk of bear incidents. We used weighted Normalized Difference Vegetation Index mean as an index of primary productivity, region (i.e., Upper Peninsula or Lower Peninsula), primary and secondary road densities, and percentage land cover type within 6.5-km2 circular buffers around bear incidents and random points. We developed 22 a priori models and used generalized linear models and Akaike's Information Criterion (AIC) to rank models. The global model was the best compromise between model complexity and model fit (w = 0.99), with a ΔAIC of 8.99 units relative to the second-best performing model. We found that as deciduous forest cover increased, the probability of bear incident occurrence increased. Among the measured anthropogenic variables, cultivated crops and primary roads were the most important in our AIC-best model and were both positively related to the probability of bear incident occurrence. The spatial distribution of relative bear incident risk varied markedly throughout Michigan. Forest cover fragmented with agriculture and other anthropogenic activities presents an environment that likely facilitates bear incidents. Our map can help wildlife managers identify areas of bear incident occurrence, which in turn can be used to help develop strategies aimed at reducing incidents. Researchers and wildlife managers can use similar mapping techniques to
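The ΔAIC and Akaike-weight (w) values reported above follow directly from a candidate set's AIC values. A minimal sketch (the AIC values below are hypothetical, chosen only to reproduce a ΔAIC of about 9 to the runner-up):

```python
import math

def akaike_weights(aic_values):
    # Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    # where delta_i = AIC_i - min(AIC). Weights sum to 1 and estimate
    # the relative support for each model in the set.
    best = min(aic_values)
    deltas = [a - best for a in aic_values]
    rel = [math.exp(-d / 2) for d in deltas]
    total = sum(rel)
    return deltas, [r / total for r in rel]

# Hypothetical AIC values for a three-model candidate set.
deltas, weights = akaike_weights([1210.4, 1219.4, 1225.0])
```

With a ΔAIC near 9, the best model carries almost all of the weight, matching the pattern reported in the study above.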

  20. Comparison of Temperature Indexes for the Impact Assessment of Heat Stress on Heat-Related Mortality

    PubMed Central

    Kim, Young-Min; Cheong, Hae-Kwan; Kim, Eun-Hye

    2011-01-01

    Objectives In order to evaluate which temperature index is the best predictor for the health impact assessment of heat stress in Korea, several indexes were compared. Methods We adopted temperature, perceived temperature (PT), and apparent temperature (AT) as heat stress indexes, and changes in the risk of death for Seoul and Daegu were estimated with 1℃ increases in those temperature indexes using a generalized additive model (GAM) adjusted for non-temperature factors: time trends, seasonality, and air pollution. The excess mortality estimated for increases of each temperature index to its 75th percentile in the summers from 2001 to 2008, together with Akaike's Information Criterion (AIC), was compared to identify the best predictor. Results For Seoul, all-cause mortality presented the highest percent increase (2.99% [95% CI, 2.43 to 3.54%]) in maximum temperature, while AIC showed the lowest value when the all-cause daily death counts were fitted with the maximum PT for the 75th percentile of summer. For Daegu, all-cause mortality presented the greatest percent increase (3.52% [95% CI, 2.23 to 4.80%]) in minimum temperature and AIC showed the lowest value in maximum temperature. No lag effect was found in the association between temperature and mortality for Seoul, whereas for Daegu a one-day lag effect was noted. Conclusions No single temperature measure was superior to the others in summer. To adopt an appropriate temperature index, regional meteorological characteristics and the disease status of the population should be considered. PMID:22125770
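Percent increases in mortality per 1℃, as reported above, are typically derived from a log-link model coefficient as (exp(β) − 1) × 100. A small sketch with a hypothetical coefficient (β is an assumption, not a value from the study):

```python
import math

def percent_increase(beta):
    # For a log-link model (e.g. the GAM above), a temperature
    # coefficient beta corresponds to a (exp(beta) - 1) * 100 percent
    # change in expected daily deaths per 1-degree increase.
    return (math.exp(beta) - 1) * 100

pct = percent_increase(0.0295)  # hypothetical coefficient
```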

  1. Prediction of Vigilant Attention and Cognitive Performance Using Self-Reported Alertness, Circadian Phase, Hours since Awakening, and Accumulated Sleep Loss

    PubMed Central

    Bermudez, Eduardo B.; Klerman, Elizabeth B.; Czeisler, Charles A.; Cohen, Daniel A.; Wyatt, James K.; Phillips, Andrew J. K.

    2016-01-01

    Sleep restriction causes impaired cognitive performance that can result in adverse consequences in many occupational settings. Individuals may rely on self-perceived alertness to decide if they are able to adequately perform a task. It is therefore important to determine the relationship between an individual’s self-assessed alertness and their objective performance, and how this relationship depends on circadian phase, hours since awakening, and cumulative lost hours of sleep. Healthy young adults (aged 18–34) completed an inpatient schedule that included forced desynchrony of sleep/wake and circadian rhythms with twelve 42.85-hour “days” and either a 1:2 (n = 8) or 1:3.3 (n = 9) ratio of sleep-opportunity:enforced-wakefulness. We investigated whether subjective alertness (visual analog scale), circadian phase (melatonin), hours since awakening, and cumulative sleep loss could predict objective performance on the Psychomotor Vigilance Task (PVT), an Addition/Calculation Test (ADD) and the Digit Symbol Substitution Test (DSST). Mathematical models that allowed nonlinear interactions between explanatory variables were evaluated using the Akaike Information Criterion (AIC). Subjective alertness was the single best predictor of PVT, ADD, and DSST performance. Subjective alertness alone, however, was not an accurate predictor of PVT performance. The best AIC scores for PVT and DSST were achieved when all explanatory variables were included in the model. The best AIC score for ADD was achieved with circadian phase and subjective alertness variables. We conclude that subjective alertness alone is a weak predictor of objective vigilant or cognitive performance. Predictions can, however, be improved by knowing an individual’s circadian phase, current wake duration, and cumulative sleep loss. PMID:27019198

  2. The counterintuitive effect of multiple injuries in severity scoring: a simple variable improves the predictive ability of NISS

    PubMed Central

    2011-01-01

    Background Injury scoring is important to formulate prognoses for trauma patients. Although scores based on empirical estimation allow for better prediction, those based on expert consensus, e.g. the New Injury Severity Score (NISS) are widely used. We describe how the addition of a variable quantifying the number of injuries improves the ability of NISS to predict mortality. Methods We analyzed 2488 injury cases included into the trauma registry of the Italian region Emilia-Romagna in 2006-2008 and assessed the ability of NISS alone, NISS plus number of injuries, and the maximum Abbreviated Injury Scale (AIS) to predict in-hospital mortality. Hierarchical logistic regression was used. We measured discrimination through the C statistics, and calibration through Hosmer-Lemeshow statistics, Akaike's information criterion (AIC) and calibration curves. Results The best discrimination and calibration resulted from the model with NISS plus number of injuries, followed by NISS alone and then by the maximum AIS (C statistics 0.775, 0.755, and 0.729, respectively; AIC 1602, 1635, and 1712, respectively). The predictive ability of all the models improved after inclusion of age, gender, mechanism of injury, and the motor component of Glasgow Coma Scale (C statistics 0.889, 0.898, and 0.901; AIC 1234, 1174, and 1167). The model with NISS plus number of injuries still showed the best performances, this time with borderline statistical significance. Conclusions In NISS, the same weight is assigned to the three worst injuries, although the contribution of the second and third to the probability of death is smaller than that of the worst one. An improvement of the predictive ability of NISS can be obtained adjusting for the number of injuries. PMID:21504567
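The C statistic used above to measure discrimination can be computed directly from predicted risks and observed outcomes as the proportion of concordant case/non-case pairs (ties count one half). A minimal sketch with toy data, not the registry data:

```python
def c_statistic(scores, outcomes):
    # Probability that a randomly chosen patient who died received a
    # higher predicted risk than a randomly chosen survivor.
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    concordant = sum(1.0 if p > q else 0.5 if p == q else 0.0
                     for p in pos for q in neg)
    return concordant / (len(pos) * len(neg))

# Toy predicted death probabilities and observed in-hospital outcomes.
scores = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]
deaths = [1, 1, 0, 1, 0, 0]
cstat = c_statistic(scores, deaths)
```

A C statistic of 0.5 is chance discrimination and 1.0 is perfect; values around 0.77-0.90, as in the study, indicate good to excellent discrimination.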

  3. Anterior Insular Cortex and Emotional Awareness

    PubMed Central

    Gu, Xiaosi; Hof, Patrick R.; Friston, Karl J.; Fan, Jin

    2014-01-01

    This paper reviews the foundation for a role of the human anterior insular cortex (AIC) in emotional awareness, defined as the conscious experience of emotions. We first introduce the neuroanatomical features of AIC and existing findings on emotional awareness. Using empathy, the awareness and understanding of other people’s emotional states, as a test case, we then present evidence to demonstrate: 1) AIC and anterior cingulate cortex (ACC) are commonly coactivated as revealed by a meta-analysis, 2) AIC is functionally dissociable from ACC, 3) AIC integrates stimulus-driven and top-down information, and 4) AIC is necessary for emotional awareness. We propose a model in which AIC serves two major functions: integrating bottom-up interoceptive signals with top-down predictions to generate a current awareness state and providing descending predictions to visceral systems that provide a point of reference for autonomic reflexes. We argue that AIC is critical and necessary for emotional awareness. PMID:23749500

  4. Information management

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell; Corker, Kevin

    1990-01-01

    Primary Flight Display (PFD) information management and cockpit display of information management research is presented in viewgraph form. The information management problem in the cockpit, information management burdens, the key characteristics of an information manager, the interface management system handling the flow of information and the dialogs between the system and the pilot, and overall system architecture are covered.

  5. Average Information Content Maximization—A New Approach for Fingerprint Hybridization and Reduction

    PubMed Central

    Śmieja, Marek; Warszycki, Dawid

    2016-01-01

    Fingerprints, bit representations of compound chemical structure, have been widely used in cheminformatics for many years. Although fingerprints with the highest resolution display satisfactory performance in virtual screening campaigns, the presence of a relatively high number of irrelevant bits introduces noise into data and makes their application more time-consuming. In this study, we present a new method of hybrid reduced fingerprint construction, the Average Information Content Maximization algorithm (AIC-Max algorithm), which selects the most informative bits from a collection of fingerprints. This methodology, applied to the ligands of five cognate serotonin receptors (5-HT2A, 5-HT2B, 5-HT2C, 5-HT5A, 5-HT6), proved that 100 bits selected from four non-hashed fingerprints reflect almost all structural information required for a successful in silico discrimination test. A classification experiment indicated that a reduced representation is able to achieve even slightly better performance than the state-of-the-art 10-times-longer fingerprints and in a significantly shorter time. PMID:26784447
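A simple proxy for the bit-informativeness idea behind AIC-Max is the Shannon entropy of each bit across the compound set: bits set in roughly half the compounds carry the most information, while constant bits carry none. This sketch ranks and selects bits by entropy; it is an illustration of the principle, not the actual AIC-Max algorithm.

```python
import math

def bit_entropy(column):
    # Shannon entropy (in bits) of one fingerprint position across
    # compounds; maximal when the bit is set in half the compounds.
    p = sum(column) / len(column)
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def select_informative_bits(fingerprints, k):
    # Rank bit positions by entropy and keep the k most informative.
    n_bits = len(fingerprints[0])
    ranked = sorted(range(n_bits),
                    key=lambda j: bit_entropy([fp[j] for fp in fingerprints]),
                    reverse=True)
    return ranked[:k]

# Toy fingerprints: bits 0 and 1 are constant, bits 2 and 3 vary.
fps = [[1, 0, 1, 1], [1, 0, 0, 1], [1, 0, 1, 0], [1, 0, 0, 0]]
keep = select_informative_bits(fps, 2)
```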

  6. Asbestos Information

    MedlinePlus


  7. Information Technology Division Technical Paper Abstracts

    DTIC Science & Technology

    1993-06-11

    decisions under dependent evidences. 15 NATURAL LANGUAGE Title: Eucalyptus: An Integrated Spoken Language/Graphical Interface for Human-Computer Dialog... AIC-92-012 Eucalyptus: An Integrated Spoken Language/Graphical Interface for Human-Computer Dialog, Kenneth Wauchope... MAlr-92-013 Inductive Biases... algorithm which is an extension of the 2400-b/s LPC. In essence, the 800-b/s voice algorithm is a 2400-b/s LPC with modified parameter quantifiers with

  8. 16. Information.

    PubMed

    2014-05-01

    Effective disaster management requires systems for data acquisition and information management that enable responders to rapidly collect, process, interpret, distribute, and access the data and information required for disaster management. Effective information sharing depends on the types of users, the type of damage, alterations of the functional status of the affected society, and how the information is structured. Those in need of information should be provided with the information necessary for their tasks and not be overloaded with unnecessary information that could serve as a distraction. Such information systems must be designed and exercised. To disseminate and share data with the relevant users, all disaster responses must include effective and reliable information systems. This information includes that acquired from repeated assessments in terms of available and needed human and material resources, which resources no longer are needed, and the status of the relief and recovery workers. It is through this information system that vital decisions are made that are congruent with the overall picture as perceived by the most relevant coordination and control centre. It is essential that information systems be designed and tested regularly as part of preparedness. Such systems must have the capacity to acquire, classify, and present information in an organised and useful manner.

  9. Some novel growth functions and their application with reference to growth in ostrich.

    PubMed

    Faridi, A; López, S; Ammar, H; Salwa, K S; Golian, A; Thornley, J H M; France, J

    2015-06-01

    Four novel growth functions, namely, Pareto, extreme value distribution (EVD), Lomolino, and cumulative β-P distribution (CBP), are derived, and their ability to describe ostrich growth curves is evaluated. The functions were compared with standard growth equations, namely, the monomolecular, Michaelis-Menten (MM), Gompertz, Richards, and generalized MM (gMM). For this purpose, 2 separate comparisons were conducted. In the first, all the functions were fitted to 40 individual growth curves (5 males and 35 females) of ostriches using nonlinear regression. In the second, performance of the functions was assessed when data from 71 individuals were composited (570 data points). This comparison was undertaken using nonlinear mixed models and considering 3 approaches: 1) models with no random effect, 2) random effect incorporated as the intercept, and 3) random effect incorporated into the asymptotic weight parameter (Wf). The results from the first comparison showed that the functions generally gave acceptable values of R2 and residual variance. On the basis of the Akaike information criterion (AIC), CBP gave the best fit, whereas the Gompertz and Lomolino equations were the preferred functions on the basis of corrected AIC (AICc). Bias, accuracy factor, the Durbin-Watson statistic, and the number of runs of sign were used to analyze the residuals. CBP gave the best distribution of residuals but also produced more residual autocorrelation (significant Durbin-Watson statistic). The functions were applied to sample data for a more conventional farm species (2 breeds of cattle) to verify the results of the comparison of fit among functions and their applicability across species. In the second comparison, analysis of mixed models showed that incorporation of a random effect into Wf gave the best fit, resulting in smaller AIC and AICc values compared with those in the other 2 approaches. On the basis of AICc, best fit was achieved with CBP, followed by gMM, Lomolino, and
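The divergence above between AIC and AICc rankings comes from the small-sample correction term, which penalizes extra parameters more heavily when n/k is small. A worked sketch with hypothetical log-likelihoods shows how a 5-parameter function can win on AIC while a 3-parameter one wins on AICc at small n:

```python
def aic_and_aicc(loglik, k, n):
    # AIC = 2k - 2 ln L; AICc = AIC + 2k(k + 1) / (n - k - 1),
    # the standard small-sample correction.
    aic = 2 * k - 2 * loglik
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    return aic, aicc

# Hypothetical fits to a short growth curve (n = 15 weighings):
# a 5-parameter function (e.g. CBP) vs. a 3-parameter one (e.g. Gompertz).
aic5, aicc5 = aic_and_aicc(loglik=-19.5, k=5, n=15)
aic3, aicc3 = aic_and_aicc(loglik=-22.0, k=3, n=15)
```

Here the 5-parameter model has the lower AIC, but the correction reverses the ranking under AICc, mirroring the pattern reported above.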

  10. The Association between Environmental Factors and Scarlet Fever Incidence in Beijing Region: Using GIS and Spatial Regression Models

    PubMed Central

    Mahara, Gehendra; Wang, Chao; Yang, Kun; Chen, Sipeng; Guo, Jin; Gao, Qi; Wang, Wei; Wang, Quanyi; Guo, Xiuhua

    2016-01-01

    (1) Background: Evidence regarding the relationship of scarlet fever with meteorological factors, including air pollution, is scarce. This study aimed to examine the relationship between ambient air pollutants and meteorological factors with scarlet fever occurrence in Beijing, China. (2) Methods: A retrospective ecological study was carried out to describe the epidemic characteristics of scarlet fever incidence in Beijing districts from 2013 to 2014. Daily incidence and corresponding air pollutant and meteorological data were used to develop the model. Global Moran’s I statistic and Anselin’s local Moran’s I (LISA) were applied to detect the spatial autocorrelation (spatial dependency) and clusters of scarlet fever incidence. The spatial lag model (SLM) and spatial error model (SEM) including ordinary least squares (OLS) models were then applied to probe the association between scarlet fever incidence and meteorological including air pollution factors. (3) Results: Among the 5491 cases, more than half (62%) were male, and more than one-third (37.8%) were female, with an annual average incidence rate of 14.64 per 100,000 population. Spatial autocorrelation analysis exhibited the existence of spatial dependence; therefore, we applied spatial regression models. Comparing the R-squared, log-likelihood and Akaike information criterion (AIC) values among the three models, the OLS model (R2 = 0.0741, log likelihood = −1819.69, AIC = 3665.38), SLM (R2 = 0.0786, log likelihood = −1819.04, AIC = 3665.08) and SEM (R2 = 0.0743, log likelihood = −1819.67, AIC = 3665.36), we identified the spatial lag model (SLM) as providing the best fit. There was a positive significant association between nitrogen oxide (p = 0.027), rainfall (p = 0.036) and sunshine hours (p = 0.048), whereas relative humidity (p = 0.034) was negatively associated with scarlet fever incidence in the SLM. (4) Conclusions: Our findings indicated that
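Global Moran's I, used above to detect spatial dependence, can be computed directly from incidence values and a spatial weights matrix. A toy sketch with four districts on a line (the incidence values and binary adjacency weights are invented):

```python
def morans_i(values, weights):
    # Global Moran's I: values[i] is the incidence in area i and
    # weights[i][j] the spatial weight (here 1 for adjacent areas,
    # 0 otherwise). Positive I indicates spatial clustering.
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    return (n / w_sum) * (num / den)

# Four districts on a line; high incidence clusters at one end.
vals = [10.0, 9.0, 2.0, 1.0]
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
I = morans_i(vals, w)  # positive: neighbouring districts are similar
```

A significantly positive I, as in the study, justifies moving from OLS to spatial regression models such as the SLM and SEM.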

  11. Information Technology.

    ERIC Educational Resources Information Center

    Reynolds, Roger

    1983-01-01

    Describes important information-handling products, predicting future devices in light of convergence and greater flexibility offered through use of microchip technology. Contends that information technology's impact on privacy depends on how information systems are used, arguing that the privacy issue deals more with moral/physiological…

  12. Information Dominance

    DTIC Science & Technology

    1997-11-01

    Information dominance may be defined as superiority in the generation, manipulation, and use of information sufficient to afford its possessors... information dominance at the strategic level: knowing oneself and one’s enemy; and, at best, inducing them to see things as one does.

  13. Information "Literacies"

    ERIC Educational Resources Information Center

    Anderson, Byron

    2007-01-01

    As communication technologies change, so do libraries. Library instruction programs are now focused on teaching information literacy, a term that may just as well be referred to as information "literacies." The new media age involves information in a wide variety of mediums. Educators everywhere are realizing media's power to communicate and…

  14. Information Integrity

    ERIC Educational Resources Information Center

    Graves, Eric

    2013-01-01

    This dissertation introduces the concept of Information Integrity, which is the detection and possible correction of information manipulation by any intermediary node in a communication system. As networks continue to grow in complexity, information theoretic security has failed to keep pace. As a result, many parties who want to communicate,…

  15. Informal Taxation*

    PubMed Central

    Olken, Benjamin A.; Singhal, Monica

    2011-01-01

    Informal payments are a frequently overlooked source of local public finance in developing countries. We use microdata from ten countries to establish stylized facts on the magnitude, form, and distributional implications of this “informal taxation.” Informal taxation is widespread, particularly in rural areas, with substantial in-kind labor payments. The wealthy pay more, but pay less in percentage terms, and informal taxes are more regressive than formal taxes. Failing to include informal taxation underestimates household tax burdens and revenue decentralization in developing countries. We discuss various explanations for and implications of these observed stylized facts. PMID:22199993

  16. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, Kritina L.; Thompson, Shelby G.; Sandor, Aniko; McCann, Robert S.; Kaiser, Mary K.; Adelstein, Barnard D.; Begault, Durand R.; Beutter, Brent R.; Stone, Leland S.; Godfroy, Martine

    2009-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. In addition to addressing display design issues associated with information formatting, style, layout, and interaction, the Information Presentation DRP is also working toward understanding the effects of extreme environments encountered in space travel on information processing. Work is also in progress to refine human factors-based design tools, such as human performance modeling, that will supplement traditional design techniques and help ensure that optimal information design is accomplished in the most cost-efficient manner. The major areas of work, or subtasks, within the Information Presentation DRP for FY10 are: 1) Displays, 2) Controls, 3) Procedures and Fault Management, and 4) Human Performance Modeling. The poster will highlight completed and planned work for each subtask.

  17. Identification of sorption processes and parameters for radionuclide transport in fractured rock

    NASA Astrophysics Data System (ADS)

    Dai, Zhenxue; Wolfsberg, Andrew; Reimus, Paul; Deng, Hailin; Kwicklis, Edward; Ding, Mei; Ware, Doug; Ye, Ming

    2012-01-01

    Identification of chemical reaction processes in subsurface environments is a key issue for reactive transport modeling because simulating different processes requires developing different chemical-mathematical models. In this paper, two sorption processes (equilibrium and kinetics) are considered for modeling neptunium and uranium sorption in fractured rock. Based on different conceptualizations of the two processes occurring in fracture and/or matrix media, seven dual-porosity, multi-component reactive transport models are developed. The process models are identified with a stepwise strategy by using multi-tracer concentration data obtained from a series of transport experiments. In the first step, breakthrough data of a conservative tracer (tritium) obtained from four experiments are used to estimate the flow and non-reactive transport parameters (i.e., mean fluid residence time in fracture, fracture aperture, and matrix tortuosity) common to all the reactive transport models. In the second and third steps, by fixing the common non-reactive flow and transport parameters, the sorption parameters (retardation factor, sorption coefficient, and kinetic rate constant) of each model are estimated using the breakthrough data of the reactive tracers, neptunium and uranium, respectively. Based on the inverse modeling results, the seven sorption-process models are discriminated using four model discrimination (or selection) criteria: the Akaike information criterion (AIC), the modified Akaike information criterion (AICc), the Bayesian information criterion (BIC), and the Kashyap information criterion (KIC). These criteria suggest the kinetic sorption process for modeling reactive transport of neptunium and uranium in both fracture and matrix. This conclusion is confirmed by two chemical criteria, the half-reaction time and the Damköhler number criterion.
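    The first three of the criteria named in this record have simple closed forms in the maximized log-likelihood L, parameter count k, and sample size n (KIC additionally involves the Fisher information matrix and is omitted here). A sketch with hypothetical numbers, not values from the paper:

    ```python
    import math

    def aic(logL, k):
        """Akaike information criterion: 2k - 2 ln L."""
        return 2 * k - 2 * logL

    def aicc(logL, k, n):
        """Modified (small-sample corrected) AIC; requires n > k + 1."""
        return aic(logL, k) + 2 * k * (k + 1) / (n - k - 1)

    def bic(logL, k, n):
        """Bayesian information criterion: k ln n - 2 ln L."""
        return k * math.log(n) - 2 * logL

    # Hypothetical fits of an equilibrium (3-parameter) and a kinetic
    # (4-parameter) sorption model to the same n = 50 breakthrough observations.
    n = 50
    fits = {"equilibrium": (-120.3, 3), "kinetic": (-112.8, 4)}
    for name, (logL, k) in fits.items():
        print(name, round(aic(logL, k), 2), round(aicc(logL, k, n), 2),
              round(bic(logL, k, n), 2))
    ```

    The model with the lowest value of a given criterion is preferred; BIC and KIC penalize extra parameters more heavily than AIC as n grows.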

  18. Effects of error covariance structure on estimation of model averaging weights and predictive performance

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Meyer, Philip D.; Curtis, Gary P.; Shi, Xiaoqing; Niu, Xu-Feng; Yabusaki, Steve B.

    2013-01-01

    When conducting model averaging for assessing groundwater conceptual model uncertainty, the averaging weights are often evaluated using model selection criteria such as AIC, AICc, BIC, and KIC (Akaike Information Criterion, Corrected Akaike Information Criterion, Bayesian Information Criterion, and Kashyap Information Criterion, respectively). However, this method often leads to an unrealistic situation in which the best model receives an overwhelmingly large averaging weight (close to 100%), which cannot be justified by available data and knowledge. It was found in this study that this problem was caused by using the covariance matrix, CE, of measurement errors for estimating the negative log likelihood function common to all the model selection criteria. This problem can be resolved by using the covariance matrix, Cek, of total errors (including model errors and measurement errors) to account for the correlation between the total errors. An iterative two-stage method was developed in the context of maximum likelihood inverse modeling to iteratively infer the unknown Cek from the residuals during model calibration. The inferred Cek was then used in the evaluation of model selection criteria and model averaging weights. While this method was limited to serial data using time series techniques in this study, it can be extended to spatial data using geostatistical techniques. The method was first evaluated in a synthetic study and then applied to an experimental study, in which alternative surface complexation models were developed to simulate column experiments of uranium reactive transport. It was found that the total errors of the alternative models were temporally correlated due to the model errors. The iterative two-stage method using Cek resolved the problem that the best model receives 100% model averaging weight, and the resulting model averaging weights were supported by the calibration results and physical understanding of the alternative models. Using Cek
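    The averaging weights discussed in this record are conventionally obtained from differences in the criterion values. A minimal sketch (the criterion values are hypothetical, chosen to illustrate how even a modest spread drives the best model's weight toward 100%):

    ```python
    import math

    def averaging_weights(criterion_values):
        """Model-averaging weights from information-criterion values
        (AIC, AICc, BIC, or KIC): w_i = exp(-d_i / 2) / sum_j exp(-d_j / 2),
        where d_i is model i's criterion value minus the minimum over models."""
        best = min(criterion_values)
        raw = [math.exp(-(v - best) / 2) for v in criterion_values]
        total = sum(raw)
        return [r / total for r in raw]

    # Three hypothetical conceptual models: a 20-point criterion gap already
    # concentrates essentially all weight on the best model.
    weights = averaging_weights([210.0, 230.0, 245.0])
    print([round(w, 5) for w in weights])
    ```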

  19. Selecting the Number of Principal Components in Functional Data

    PubMed Central

    Li, Yehua; Wang, Naisyin; Carroll, Raymond J.

    2013-01-01

    Functional principal component analysis (FPCA) has become the most widely used dimension reduction tool for functional data analysis. We consider functional data measured at random, subject-specific time points, contaminated with measurement error, allowing for both sparse and dense functional data, and propose novel information criteria to select the number of principal components in such data. We propose a Bayesian information criterion based on marginal modeling that can consistently select the number of principal components for both sparse and dense functional data. For dense functional data, we also develop an Akaike information criterion (AIC) based on the expected Kullback-Leibler information under a Gaussian assumption. In connection with factor analysis in multivariate time series data, we also consider the information criteria of Bai & Ng (2002) and show that they are still consistent for dense functional data, if a prescribed undersmoothing scheme is undertaken in the FPCA algorithm. We perform intensive simulation studies and show that the proposed information criteria vastly outperform existing methods for this type of data. Surprisingly, our empirical evidence shows that our information criteria proposed for dense functional data also perform well for sparse functional data. An empirical example using colon carcinogenesis data is also provided to illustrate the results. PMID:24376287

  20. Selecting the Number of Principal Components in Functional Data.

    PubMed

    Li, Yehua; Wang, Naisyin; Carroll, Raymond J

    2013-12-19

    Functional principal component analysis (FPCA) has become the most widely used dimension reduction tool for functional data analysis. We consider functional data measured at random, subject-specific time points, contaminated with measurement error, allowing for both sparse and dense functional data, and propose novel information criteria to select the number of principal components in such data. We propose a Bayesian information criterion based on marginal modeling that can consistently select the number of principal components for both sparse and dense functional data. For dense functional data, we also develop an Akaike information criterion (AIC) based on the expected Kullback-Leibler information under a Gaussian assumption. In connection with factor analysis in multivariate time series data, we also consider the information criteria of Bai & Ng (2002) and show that they are still consistent for dense functional data, if a prescribed undersmoothing scheme is undertaken in the FPCA algorithm. We perform intensive simulation studies and show that the proposed information criteria vastly outperform existing methods for this type of data. Surprisingly, our empirical evidence shows that our information criteria proposed for dense functional data also perform well for sparse functional data. An empirical example using colon carcinogenesis data is also provided to illustrate the results.

  1. Modeling mercury biomagnification (South River, Virginia, USA) to inform river management decision making.

    PubMed

    Tom, Kyle R; Newman, Michael C; Schmerfeld, John

    2010-04-01

    Mercury trophic transfer in the South River (VA, USA) was modeled to guide river remediation decision making. Sixteen different biota types were collected at six sites within 23 river miles. Mercury biomagnification was modeled using a general biomagnification model based on δ15N and distance from the historic mercury release. Methylmercury trophic transfer was clearer than that for total Hg and, therefore, was used to build the predictive model (r2(prediction) = 0.76). The methylmercury biomagnification factors were similar among sites, but the model intercept increased with distance down river. Minimum Akaike's Information Criterion Estimation (MAICE) justified the incorporation of distance in the model. A model with a biomagnification factor very similar to that of the South River (95% confidence interval [CI] = 0.38-0.52) was produced for a second contaminated Virginia river, the North Fork Holston River (95% CI = 0.41-0.55). The percentage of total Hg present as methylmercury increased monotonically with trophic position. Trophic models based on δ15N were adequate for predicting changes in mercury concentrations in edible fish under different remediation scenarios.

  2. MH(2)c: Characterization of major histocompatibility α-helices - an information criterion approach.

    PubMed

    Hischenhuber, B; Frommlet, F; Schreiner, W; Knapp, B

    2012-07-01

    Major histocompatibility proteins share a common overall structure, the peptide binding groove. Two binding groove domains, located on the same chain for major histocompatibility class I or on two different chains for major histocompatibility class II, contribute to that structure, which consists of two α-helices ("wall") and a sheet of eight anti-parallel beta strands ("floor"). Apart from the peptide presented in the groove, the major histocompatibility α-helices play a central role in the interaction with the T cell receptor. This study presents a generalized mathematical approach for the characterization of these helices. We employed polynomials of degree 1 to 7, and splines with 1 to 2 nodes based on polynomials of degree 1 to 7, fitted to the α-helices projected onto their principal components. We evaluated all models with a corrected Akaike Information Criterion to determine which model represents the α-helices best without overfitting the data. This method is applicable to both the stationary and the dynamic characterization of α-helices. By deriving differential geometric parameters from these models, one obtains a reliable method to characterize and compare α-helices for a broad range of applications.
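    Degree selection with a corrected AIC, as used in this record, can be sketched as follows. This is not the paper's exact procedure: the data below are a synthetic stand-in for a projected helix coordinate, and the score assumes i.i.d. Gaussian residuals (AIC computed up to an additive constant):

    ```python
    import numpy as np

    def aicc_poly(x, y, degree):
        """Corrected AIC of a least-squares polynomial fit, assuming
        i.i.d. Gaussian residuals (constant terms dropped)."""
        n = len(x)
        coeffs = np.polyfit(x, y, degree)
        rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
        k = degree + 2  # polynomial coefficients plus the error variance
        aic = n * np.log(rss / n) + 2 * k
        return aic + 2 * k * (k + 1) / (n - k - 1)  # small-sample correction

    # Synthetic curve: a gentle quadratic plus noise, mimicking one
    # principal-component projection of a helix axis.
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 60)
    y = 0.5 * x**2 - 0.3 * x + rng.normal(0.0, 0.02, x.size)

    # Score polynomial degrees 1..7 and keep the degree with the lowest AICc.
    best_degree = min(range(1, 8), key=lambda d: aicc_poly(x, y, d))
    print("degree selected by AICc:", best_degree)
    ```

    The correction term matters here because the helices contribute relatively few points per fit, which is exactly the small-sample regime AICc is designed for.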

  3. Statistical Tests of the PTHA Poisson Assumption for Submarine Landslides

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Chaytor, J. D.; Parsons, T.; Ten Brink, U. S.

    2012-12-01

    We demonstrate that a sequence of dated mass transport deposits (MTDs) can provide information to statistically test whether or not submarine landslides associated with these deposits conform to a Poisson model of occurrence. Probabilistic tsunami hazard analysis (PTHA) most often assumes Poissonian occurrence for all sources, with an exponential distribution of return times. Using dates that define the bounds of individual MTDs, we first describe likelihood and Monte Carlo methods of parameter estimation for a suite of candidate occurrence models (Poisson, lognormal, gamma, Brownian Passage Time). In addition to age-dating uncertainty, both methods incorporate uncertainty caused by the open time intervals: i.e., before the first event and from the last event to the present. Accounting for these open intervals is critical when there are a small number of observed events. The optimal occurrence model is selected according to both the Akaike Information Criterion (AIC) and Akaike's Bayesian Information Criterion (ABIC). In addition, the likelihood ratio test can be performed on occurrence models from the same family: e.g., the gamma model relative to the exponential model of return time distribution. Parameter estimation, model selection, and hypothesis testing are performed on data from two IODP holes in the northern Gulf of Mexico that penetrated a total of 14 MTDs, some of which are correlated between the two holes. Each of these events has been assigned an age based on microfossil zonations and magnetostratigraphic datums. Results from these sites indicate that the Poisson assumption is likely valid. However, parameter estimation results using the likelihood method for one of the sites suggest that the events may have occurred quasi-periodically. Methods developed in this study provide tools with which one can determine both the rate of occurrence and the statistical validity of the Poisson assumption when submarine landslides are included in PTHA.
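    Criterion-based comparison of candidate return-time distributions, as described in this record, can be sketched with the two closed-form maximum-likelihood cases. The inter-event times below are hypothetical, and the gamma and Brownian Passage Time fits (which need numerical optimization), the age-dating uncertainty, and the open-interval handling are all omitted:

    ```python
    import math

    def exp_aic(times):
        """AIC of an exponential (Poisson-process) model of inter-event times;
        the MLE rate is n / sum(times)."""
        n = len(times)
        rate = n / sum(times)
        logL = n * math.log(rate) - rate * sum(times)  # = n*ln(rate) - n
        return 2 * 1 - 2 * logL

    def lognorm_aic(times):
        """AIC of a lognormal model; the MLE is a normal fit to log times."""
        n = len(times)
        logs = [math.log(t) for t in times]
        mu = sum(logs) / n
        var = sum((v - mu) ** 2 for v in logs) / n  # MLE (biased) variance
        logL = sum(
            -v - 0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
            for v in logs
        )
        return 2 * 2 - 2 * logL

    # Hypothetical inter-event times (kyr) between successive dated MTDs.
    intervals = [12.0, 35.0, 8.0, 51.0, 22.0, 19.0, 41.0, 15.0, 28.0, 9.0]
    print("exponential AIC:", round(exp_aic(intervals), 2))
    print("lognormal AIC:", round(lognorm_aic(intervals), 2))
    ```

    A lower AIC for the exponential model would support the Poisson assumption; here the extra lognormal parameter is charged 2 AIC points, so it wins only if it improves the log-likelihood by more than 1.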

  4. Incidence and Description of Autoimmune Cytopenias During Treatment with Ibrutinib for Chronic Lymphocytic Leukemia Autoimmune Cytopenias During Ibrutinib Treatment

    PubMed Central

    Rogers, Kerry A.; Ruppert, Amy S.; Bingman, Anissa; Andritsos, Leslie A.; Awan, Farrukh T.; Blum, Kristie A.; Flynn, Joseph M.; Jaglowski, Samantha M.; Lozanski, Gerard; Maddocks, Kami J.; Byrd, John C.; Woyach, Jennifer A.; Jones, Jeffrey A.

    2016-01-01

    Chronic lymphocytic leukemia (CLL) is frequently complicated by secondary autoimmune cytopenias (AIC). Ibrutinib is an irreversible inhibitor of Bruton’s tyrosine kinase approved for treatment of relapsed CLL and CLL with del(17p). The effect of ibrutinib treatment on the incidence of AIC is currently unknown. We reviewed medical records of 301 patients treated with ibrutinib as participants in therapeutic clinical trials at the Ohio State University Comprehensive Cancer Center between July 2010 and July 2014. Subjects were reviewed with respect to past history of AIC, and treatment-emergent AIC cases were identified. Prior to starting ibrutinib treatment, 26% of patients had experienced AIC. Information was available for a total of 468 patient-years of ibrutinib exposure, during which there were six cases of treatment-emergent AIC. This corresponds to an estimated incidence rate of 13 episodes for every 1,000 patient-years of ibrutinib treatment. We further identified 22 patients receiving therapy for AIC at the time ibrutinib was started. Of these 22 patients, 19 were able to discontinue AIC therapy. We found that ibrutinib treatment is associated with a low rate of treatment-emergent AIC. Patients with an existing AIC have been successfully treated with ibrutinib and subsequently discontinued AIC therapy. PMID:26442611

  5. Informed Consent

    MedlinePlus

    ... ask during informed consent? Can I change my mind after I’ve signed the consent? What if I don’t want the treatment that’s being offered? How is informed consent for a clinical trial or research study different from consent for standard treatment? How is ...

  6. Spacetime information

    SciTech Connect

    Hartle, J.B. (Isaac Newton Institute for the Mathematical Sciences, University of Cambridge, Cambridge CB3 0EH)

    1995-02-15

    In usual quantum theory, the information available about a quantum system is defined in terms of the density matrix describing it on a spacelike surface. This definition must be generalized for extensions of quantum theory which neither require, nor always permit, a notion of state on a spacelike surface. In particular, it must be generalized for the generalized quantum theories appropriate when spacetime geometry fluctuates quantum mechanically or when geometry is fixed but not foliable by spacelike surfaces. This paper introduces a four-dimensional notion of the information available about a quantum system's boundary conditions in the various sets of decohering, coarse-grained histories it may display. This spacetime notion of information coincides with the familiar one when quantum theory is formulable in terms of states on spacelike surfaces but generalizes this notion when it cannot be so formulated. The idea of spacetime information is applied in several contexts: When spacetime geometry is fixed, the information available through alternatives restricted to a fixed spacetime region is defined. The information available through histories of alternatives of general operators is compared to that obtained from the more limited coarse grainings of sum-over-histories quantum mechanics that refer only to coordinates. The definition of information is considered in generalized quantum theories. We consider as specific examples time-neutral quantum mechanics with initial and final conditions, quantum theories with nonunitary evolution, and the generalized quantum frameworks appropriate for quantum spacetime. In such theories complete information about a quantum system is not necessarily available on any spacelike surface but must be searched for throughout spacetime. The information loss commonly associated with the "evolution of pure states into mixed states" in black hole evaporation is thus not in conflict with the principles of generalized quantum mechanics.

  7. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, Kritina; Sandor, A.; Thompson, S. G.; McCann, R. S.; Kaiser, M. K.; Begault, D. R.; Adelstein, B. D.; Beutter, B. R.; Stone, L. S.

    2008-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew on flight vehicles, surface landers and habitats, and during extra-vehicular activities (EVA). Designers of displays and controls for exploration missions must be prepared to select the text formats, label styles, alarms, electronic procedure designs, and cursor control devices that provide for optimal crew performance on exploration tasks. The major areas of work, or subtasks, within the Information Presentation DRP are: 1) Controls, 2) Displays, 3) Procedures, and 4) EVA Operations.

  8. A free-knot spline modeling framework for piecewise linear logistic regression in complex samples with body mass index and mortality as an example.

    PubMed

    Keith, Scott W; Allison, David B

    2014-09-29

    This paper details the design, evaluation, and implementation of a framework for detecting and modeling nonlinearity between a binary outcome and a continuous predictor variable adjusted for covariates in complex samples. The framework provides familiar-looking parameterizations of output in terms of linear slope coefficients and odds ratios. Estimation methods focus on maximum likelihood optimization of piecewise linear free-knot splines formulated as B-splines. Correctly specifying the optimal number and positions of the knots improves the model, but is marked by computational intensity and numerical instability. Our inference methods utilize both parametric and nonparametric bootstrapping. Unlike other nonlinear modeling packages, this framework is designed to incorporate multistage survey sample designs common to nationally representative datasets. We illustrate the approach and evaluate its performance in specifying the correct number of knots under various conditions with an example using body mass index (BMI; kg/m(2)) and the complex multistage sampling design from the Third National Health and Nutrition Examination Survey to simulate binary mortality outcomes data having realistic nonlinear sample-weighted risk associations with BMI. BMI and mortality data provide a particularly apt example and area of application since BMI is commonly recorded in large health surveys with complex designs, often categorized for modeling, and nonlinearly related to mortality. When complex sample design considerations were ignored, our method was generally similar to or more accurate than two common model selection procedures, Schwarz's Bayesian Information Criterion (BIC) and Akaike's Information Criterion (AIC), in terms of selecting the correct number of knots. Our approach provided accurate knot selections when complex sampling weights were incorporated, while AIC and BIC were not effective under these conditions.

  9. VLP Source Inversion and Evaluation of Error Analysis Techniques at Fuego Volcano, Guatemala

    NASA Astrophysics Data System (ADS)

    Brill, K. A.; Waite, G. P.

    2015-12-01

    In January of 2012, our team occupied 10 sites around Fuego volcano with broadband seismometers, two of which were collocated with infrasound microphone arrays and tilt-meters (see Figure 1 for full deployment details). Our radial coverage around Fuego during the 2012 campaign satisfies conditions outlined by Dawson et al. [2011] for good network coverage. Very-long-period (VLP) events that accompany small-scale explosions were classified by waveform and eruption style. We located these VLP event families, which have been persistent at Fuego since at least 2008, through inversion in the same manner employed by Lyons and Waite [2011], with improved radial coverage in our network. We compare results for source inversions performed with independent tilt data against inversions incorporating tilt data extracted from the broadband records. The current best-practice method for choosing an optimum solution for inversion results is based on each solution's residual error, the relevance of free parameters used in the model, and the physical significance of the source mechanism. Error analysis was performed through bootstrapping to explore the source location uncertainty and the significance of components of the moment tensor. The significance of the number of free parameters has mostly been evaluated by calculating Akaike's Information Criterion (AIC), but little has been done to evaluate the sensitivity of AIC or other criteria (e.g., the Bayesian Information Criterion) to the number of model parameters. We compare solutions as chosen by these alternate methods with more standard techniques for our real data set, as well as through the use of synthetic data, and make recommendations as to best practices. Figure 1: a) Map of the 2012 station network; stations highlighted in red were collocated with infrasound arrays. b) Location of Fuego within Guatemala and view of the complex from the west with different eruptive centers labeled. c) Operational times for each of the stations and cameras.

  10. Estimating Dbh of Trees Employing Multiple Linear Regression of the best Lidar-Derived Parameter Combination Automated in Python in a Natural Broadleaf Forest in the Philippines

    NASA Astrophysics Data System (ADS)

    Ibanez, C. A. G.; Carcellar, B. G., III; Paringit, E. C.; Argamosa, R. J. L.; Faelga, R. A. G.; Posilero, M. A. V.; Zaragosa, G. P.; Dimayacyac, N. A.

    2016-06-01

    Diameter-at-breast-height (DBH) estimation is a prerequisite in various allometric equations estimating important forestry indices like stem volume, basal area, biomass and carbon stock. LiDAR technology offers a means of directly obtaining different forest parameters, except DBH, from the behavior and characteristics of the point cloud, which are unique to different forest classes. An extensive tree inventory was done on a two-hectare established sample plot in Mt. Makiling, Laguna for a natural growth forest. Coordinates, height, and canopy cover were measured, and species were identified, for comparison to LiDAR derivatives. Multiple linear regression was used to obtain LiDAR-derived DBH by integrating field-derived DBH and 27 LiDAR-derived parameters at 20 m, 10 m, and 5 m grid resolutions. To determine the best combination of parameters for DBH estimation, all possible combinations of parameters were generated and evaluated automatically using Python scripts, with additional regression-related libraries such as NumPy, SciPy, and scikit-learn. The combination that yields the highest r-squared (coefficient of determination) and the lowest AIC (Akaike's Information Criterion) and BIC (Bayesian Information Criterion) was determined to be the best equation. The best equation uses 11 parameters at a 10 m grid size, with an r-squared of 0.604, an AIC of 154.04, and a BIC of 175.08. The best combination of parameters may differ among forest classes, warranting further studies. Additional statistical tests, such as the Kaiser-Meyer-Olkin (KMO) coefficient and Bartlett's Test of Sphericity (BTS), can be supplemented to help determine the correlation among parameters.
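    The exhaustive search over parameter combinations that this record describes automating in Python can be sketched with itertools. The predictor matrix below is a toy stand-in for the 27 LiDAR metrics (the names, sizes, and coefficients are assumptions), and the AIC/BIC forms assume Gaussian errors:

    ```python
    import itertools
    import numpy as np

    def fit_metrics(X, y):
        """Ordinary least squares fit; returns (r_squared, aic, bic),
        with AIC/BIC computed up to a constant under Gaussian errors."""
        n = len(y)
        Xd = np.column_stack([np.ones(n), X])  # prepend an intercept column
        beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
        resid = y - Xd @ beta
        rss = float(resid @ resid)
        tss = float(((y - y.mean()) ** 2).sum())
        k = Xd.shape[1] + 1  # coefficients plus the error variance
        aic = n * np.log(rss / n) + 2 * k
        bic = n * np.log(rss / n) + k * np.log(n)
        return 1 - rss / tss, aic, bic

    # Toy data: 5 hypothetical LiDAR metrics; field DBH depends on metrics 0 and 2.
    rng = np.random.default_rng(1)
    n_trees, n_metrics = 80, 5
    X = rng.normal(size=(n_trees, n_metrics))
    dbh = 30 + 4 * X[:, 0] - 2 * X[:, 2] + rng.normal(0.0, 1.5, n_trees)

    # Enumerate every non-empty subset of metrics and rank by AIC (lower = better).
    subsets = (
        combo
        for r in range(1, n_metrics + 1)
        for combo in itertools.combinations(range(n_metrics), r)
    )
    best = min(subsets, key=lambda combo: fit_metrics(X[:, list(combo)], dbh)[1])
    print("best predictor subset by AIC:", best)
    ```

    With 27 metrics this enumeration covers 2^27 − 1 subsets, which is why the authors automated it; a branch-and-bound or stepwise search would be the usual shortcut at that scale.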

  11. Using Post-Traumatic Amnesia To Predict Outcome after Traumatic Brain Injury.

    PubMed

    Ponsford, Jennie L; Spitz, Gershon; McKenzie, Dean

    2016-06-01

    Duration of post-traumatic amnesia (PTA) has emerged as a strong measure of injury severity after traumatic brain injury (TBI). Despite the growing international adoption of this measure, there remains a lack of consistency in the way in which PTA duration is used to classify severity of injury. This study aimed to establish the classification of PTA that would best predict functional or productivity outcomes. We conducted a cohort study of 1041 persons recruited from inpatient admissions to a TBI rehabilitation center between 1985 and 2013. Participants had a primary diagnosis of TBI, emerged from PTA before discharge from inpatient hospital, and engaged in productive activities before injury. Eight models that classify duration of PTA were evaluated: six based on the literature and two statistically driven. Models were assessed using the area under the receiver operating characteristic curve (AUC) as well as model-based Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) statistics. All categorization models showed longer PTA to be associated with a greater likelihood of being nonproductive at 1 year after TBI. Classification systems with a greater number of categories performed better than two-category systems. The dimensional (continuous) form of PTA resulted in the greatest AUC and the lowest AIC and BIC of the classification systems examined. This finding indicates that the greatest accuracy in prognosis is likely to be achieved using PTA as a continuous variable, which enables the probability of productive outcomes to be estimated with far greater precision than is possible using a classification system. Categorizing PTA to classify severity of injury may therefore reduce the precision with which clinicians can plan the treatment of patients after TBI.

  12. Informed consent: information or knowledge?

    PubMed

    Berger, Ken

    2003-01-01

    A fiduciary relationship should be nurtured between patient and physician. This requires effective communication throughout all aspects of care, especially pertaining to treatment decisions. Illness as experienced by the patient presents a unique set of circumstances, and communication in the context of illness is fraught with problems. The patient is vulnerable and the situation may be overwhelming. Voluminous amounts of information are available to patients from a host of health care providers, family members, support groups, advocacy centers, books, journals, and the internet. Often conflicting and confusing, frequently complex, this information may be of greater burden than benefit. Some information is of high validity and reliability, while other information is of dubious reliability. The emotional freight of bad news may further inhibit understanding. An overload of information may pose an obstacle to decision-making. To facilitate the transformation of information into knowledge, the health care provider must act on some occasions as a filter, on other occasions as a conduit, and on still other occasions simply as a reservoir. The evolution of patient rights to receive or refuse treatment, and the right to know or not to know, calls for a change in the processing of overwhelming information in our modern era. In this paper we discuss the difference between information and knowledge. How can health care providers ensure they have given their patients all necessary and sufficient information to make an autonomous decision? How can they facilitate the transformation of information into knowledge? Grounding consent in knowledge rather than information alone allows a more focused, relevant and modern approach to choice in health care.

  13. Copyright Information

    Atmospheric Science Data Center

    2013-03-25

    ... not copyrighted. You may use NASA imagery, video and audio material for educational or informational purposes, including photo ...   NASA should be acknowledged as the source of the material except in cases of advertising. See  NASA Advertising Guidelines . ...

  14. Information engineering

    SciTech Connect

    Hunt, D.N.

    1997-02-01

    The Information Engineering thrust area develops information technology to support the programmatic needs of Lawrence Livermore National Laboratory's Engineering Directorate. Progress in five programmatic areas is described in separate reports contained herein. These are entitled Three-dimensional Object Creation, Manipulation, and Transport; Zephyr: A Secure Internet-Based Process to Streamline Engineering Procurements; Subcarrier Multiplexing: Optical Network Demonstrations; Parallel Optical Interconnect Technology Demonstration; and Intelligent Automation Architecture.

  15. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, K.L.; Boyer, J.L.; Sandor, A.; Thompson, S.G.; McCann, R.S.; Begault, D.R.; Adelstein, B.D.; Beutter, B.R.; Stone, L.S.

    2009-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. The major areas of work, or subtasks, within this DRP are: 1) Displays, 2) Controls, 3) Electronic Procedures and Fault Management, and 4) Human Performance Modeling. This DRP is a collaborative effort between researchers at Johnson Space Center and Ames Research Center.

  16. Informal Communication of Scientific Information

    ERIC Educational Resources Information Center

    Korfhage, Robert R.

    1974-01-01

    A re-examination of data used in a study of informal communication among sleep researchers raises some interesting questions in the definition of communication and its relationship to research productivity. (JB)

  17. Informed consent.

    PubMed

    Sacchini, D; Pennacchini, M

    2010-01-01

    Informed consent (IC) in clinical experimentation is a process by which a subject voluntarily and freely confirms his/her willingness to participate in a trial, after having been informed of all aspects involved. IC is a relatively recent concept within the medical tradition. Unquestionably, the Nuremberg trials (1945-1947) influenced thinking about consent in medicine. As the idea of IC evolved, discussion of appropriate guidelines moved increasingly beyond a narrow focus on the physician's/researcher's obligation. IC shall be obtained in writing and documented before a subject is enrolled into a clinical investigation. Particularly in the case of medical devices, adequate information must be given to the patient on possible incidents occurring after placement of the device.

  18. [Informed consent].

    PubMed

    Medina Castellano, Carmen Delia

    2009-10-01

    At present, numerous complaints claiming defects at some point in the process of obtaining informed consent are filed in courts of justice; underlying these complaints is a common element, namely the role that health professionals play in these processes. In obtaining consent, one can often see it more as a means of securing judicial protection for professional practice than as an exercise respectful of the dignity and freedom of health service patients. This article reflects on two basic rights related to informed consent: obtaining this consent adequately, and protecting those people who lack, either partially or totally, the capacity to make the decision by themselves. The author then makes some considerations about the necessity of obtaining informed consent for nursing practices and treatment.

  19. Characterizing the relationship between temperature and mortality in tropical and subtropical cities: a distributed lag non-linear model analysis in Hue, Viet Nam, 2009-2013.

    PubMed

    Dang, Tran Ngoc; Seposo, Xerxes T; Duc, Nguyen Huu Chau; Thang, Tran Binh; An, Do Dang; Hang, Lai Thi Minh; Long, Tran Thanh; Loan, Bui Thi Hong; Honda, Yasushi

    2016-01-01

    Background The relationship between temperature and mortality has been found to be U-, V-, or J-shaped in developed temperate countries; however, in developing tropical/subtropical cities, it remains unclear. Objectives Our goal was to investigate the relationship between temperature and mortality in Hue, a subtropical city in Viet Nam. Design We collected daily mortality data from the Vietnamese A6 mortality reporting system for 6,214 deceased persons between 2009 and 2013. A distributed lag non-linear model was used to examine the temperature effects on all-cause and cause-specific mortality, assuming a negative binomial distribution for the count data. We developed a four-step, objective-oriented model selection procedure following the Akaike information criterion (AIC) rule (i.e. a smaller AIC value indicates a better model). Results High temperature-related mortality was more strongly associated with short lags, whereas low temperature-related mortality was more strongly associated with long lags. Low temperatures increased the risk of all-cause mortality more than high temperatures. We observed elevated temperature-mortality risk in vulnerable groups: elderly people (high temperature effect, relative risk [RR]=1.42, 95% confidence interval [CI]=1.11-1.83; low temperature effect, RR=2.0, 95% CI=1.13-3.52), females (low temperature effect, RR=2.19, 95% CI=1.14-4.21), people with respiratory disease (high temperature effect, RR=2.45, 95% CI=0.91-6.63), and those with cardiovascular disease (high temperature effect, RR=1.6, 95% CI=1.15-2.22; low temperature effect, RR=1.99, 95% CI=0.92-4.28). Conclusions In Hue, temperature significantly increased the risk of mortality, especially in vulnerable groups (i.e. the elderly, females, and people with respiratory and cardiovascular diseases). These findings may provide a foundation for developing adequate policies to address the effects of temperature on health in Hue City.
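    The AIC rule referenced in this abstract (a smaller AIC value indicates a better model) can be sketched in a few lines; the log-likelihoods and parameter counts below are illustrative placeholders, not values from the Hue study:

```python
import math

def aic(log_likelihood, n_params):
    # AIC = 2k - 2*ln(L); smaller values indicate a better
    # trade-off between goodness of fit and model complexity
    return 2 * n_params - 2 * log_likelihood

# Hypothetical candidate models (log-likelihoods and parameter
# counts are illustrative, not taken from the study)
candidates = {
    "temp_only":       aic(-1520.3, 4),
    "temp_plus_lag":   aic(-1498.7, 9),
    "temp_lag_season": aic(-1490.2, 15),
}

# The AIC rule: keep the candidate with the smallest AIC
best = min(candidates, key=candidates.get)
```

    Here the richer model wins despite its extra parameters, because the likelihood gain outweighs the 2-per-parameter penalty.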

  20. Regression with Empirical Variable Selection: Description of a New Method and Application to Ecological Datasets

    PubMed Central

    Goodenough, Anne E.; Hart, Adam G.; Stafford, Richard

    2012-01-01

    Despite recent papers on problems associated with full-model and stepwise regression, their use is still common throughout ecological and environmental disciplines. Alternative approaches, including generating multiple models and comparing them post-hoc using techniques such as Akaike's Information Criterion (AIC), are becoming more popular. However, these are problematic when there are numerous independent variables and interpretation is often difficult when competing models contain many different variables and combinations of variables. Here, we detail a new approach, REVS (Regression with Empirical Variable Selection), which uses all-subsets regression to quantify empirical support for every independent variable. A series of models is created; the first containing the variable with most empirical support, the second containing the first variable and the next most-supported, and so on. The comparatively small number of resultant models (n = the number of predictor variables) means that post-hoc comparison is comparatively quick and easy. When tested on a real dataset – habitat and offspring quality in the great tit (Parus major) – the optimal REVS model explained more variance (higher R2), was more parsimonious (lower AIC), and had greater significance (lower P values), than full, stepwise or all-subsets models; it also had higher predictive accuracy based on split-sample validation. Testing REVS on ten further datasets suggested that this is typical, with R2 values being higher than full or stepwise models (mean improvement = 31% and 7%, respectively). Results are ecologically intuitive as even when there are several competing models, they share a set of “core” variables and differ only in presence/absence of one or two additional variables. We conclude that REVS is useful for analysing complex datasets, including those in ecology and environmental disciplines. PMID:22479605

  1. A Comparison of Dose-Response Models for the Parotid Gland in a Large Group of Head-and-Neck Cancer Patients

    SciTech Connect

    Houweling, Antonetta C.; Philippens, Marielle E.P.; Dijkema, Tim; Roesink, Judith M.; Terhaard, Chris H.J.; Schilstra, Cornelis; Ten Haken, Randall K.; Eisbruch, Avraham; Raaijmakers, Cornelis P.J.

    2010-03-15

    Purpose: The dose-response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that describe the parotid gland dose response 1 year after radiotherapy. Methods and Materials: A total of 347 patients with head-and-neck tumors were included in this prospective parotid gland dose-response study. The patients were treated with either conventional radiotherapy or intensity-modulated radiotherapy. Dose-volume histograms for the parotid glands were derived from three-dimensional dose calculations using computed tomography scans. Stimulated salivary flow rates were measured before and 1 year after radiotherapy. A threshold of 25% of the pretreatment flow rate was used to define a complication. The evaluated models included the Lyman-Kutcher-Burman model, the mean dose model, the relative seriality model, the critical volume model, the parallel functional subunit model, and the dose-threshold model. The goodness of fit (GOF) was determined by the deviance and a Monte Carlo hypothesis test. Ranking of the models was based on Akaike's information criterion (AIC). Results: None of the models was rejected based on the evaluation of the GOF. The mean dose model was ranked as the best model based on the AIC. The TD50 in these models was approximately 39 Gy. Conclusions: The mean dose model was preferred for describing the dose-response relationship of the parotid gland.

  2. Regression with empirical variable selection: description of a new method and application to ecological datasets.

    PubMed

    Goodenough, Anne E; Hart, Adam G; Stafford, Richard

    2012-01-01

    Despite recent papers on problems associated with full-model and stepwise regression, their use is still common throughout ecological and environmental disciplines. Alternative approaches, including generating multiple models and comparing them post-hoc using techniques such as Akaike's Information Criterion (AIC), are becoming more popular. However, these are problematic when there are numerous independent variables and interpretation is often difficult when competing models contain many different variables and combinations of variables. Here, we detail a new approach, REVS (Regression with Empirical Variable Selection), which uses all-subsets regression to quantify empirical support for every independent variable. A series of models is created; the first containing the variable with most empirical support, the second containing the first variable and the next most-supported, and so on. The comparatively small number of resultant models (n = the number of predictor variables) means that post-hoc comparison is comparatively quick and easy. When tested on a real dataset--habitat and offspring quality in the great tit (Parus major)--the optimal REVS model explained more variance (higher R(2)), was more parsimonious (lower AIC), and had greater significance (lower P values), than full, stepwise or all-subsets models; it also had higher predictive accuracy based on split-sample validation. Testing REVS on ten further datasets suggested that this is typical, with R(2) values being higher than full or stepwise models (mean improvement = 31% and 7%, respectively). Results are ecologically intuitive as even when there are several competing models, they share a set of "core" variables and differ only in presence/absence of one or two additional variables. We conclude that REVS is useful for analysing complex datasets, including those in ecology and environmental disciplines.

  3. Assessing the wildlife habitat value of New England salt marshes: II. Model testing and validation.

    PubMed

    McKinney, Richard A; Charpentier, Michael A; Wigand, Cathleen

    2009-07-01

    We tested a previously described model to assess the wildlife habitat value of New England salt marshes by comparing modeled habitat values and scores with bird abundance and species richness at sixteen salt marshes in Narragansett Bay, Rhode Island USA. As a group, wildlife habitat value assessment scores for the marshes ranged from 307-509, or 31-67% of the maximum attainable score. We recorded 6 species of wading birds (Ardeidae; herons, egrets, and bitterns) at the sites during biweekly surveys. Species richness (r2=0.24, F=4.53, p=0.05) and abundance (r2=0.26, F=5.00, p=0.04) of wading birds significantly increased with increasing assessment score. We optimized our assessment model for wading birds by using the Akaike information criterion (AIC) to compare a series of models comprising the specific components and categories of our model that best reflect their habitat use. The model incorporating pre-classification, wading bird habitat categories, and natural land surrounding the sites was substantially supported by AIC analysis as the best model. The abundance of wading birds significantly increased with increasing assessment scores generated with the optimized model (r2=0.48, F=12.5, p=0.003), demonstrating that optimizing models can help improve the accuracy of the assessment for a given species or species assemblage. In addition to validating the assessment model, our results show that, in spite of their urban setting, our study marshes provide substantial wildlife habitat value. This suggests that even small wetlands in highly urbanized coastal settings can provide important wildlife habitat value if key habitat attributes (e.g., natural buffers, habitat heterogeneity) are present.

  4. A matrix-calibrated species-area model for predicting biodiversity losses due to land-use change.

    PubMed

    Koh, Lian Pin; Ghazoul, Jaboury

    2010-08-01

    Application of island biogeography theory to prediction of species extinctions resulting from habitat loss is based on the assumption that the transformed landscape matrix is completely inhospitable to the taxa considered, despite evidence demonstrating the nontrivial influence of matrix on populations within habitat remnants. The island biogeography paradigm therefore needs refining to account for specific responses of taxa to the area of habitat "islands" and to the quality of the surrounding matrix. We incorporated matrix effects into island theory by partitioning the slope (z value) of species-area relationships into two components: gamma, a constant, and sigma, a measure of taxon-specific responses to each component of a heterogeneous matrix. We used our matrix-calibrated model to predict extinction and endangerment of bird species resulting from land-use change in 20 biodiversity hotspots and compared these predictions with observed numbers of extinct and threatened bird species. We repeated this analysis with the conventional species-area model and the countryside species-area model, considering alternative z values of 0.35 (island) or 0.22 (continental). We evaluated the relative strength of support for each of the five candidate models with Akaike's information criterion (AIC). The matrix-calibrated model had the highest AIC weight (wi = 89.21%), meaning that the weight of evidence identified it as the optimal model within the candidate set, given the data. In addition to being a valuable heuristic tool for assessing extinction risk, our matrix-calibrated model also allows quantitative assessment of biodiversity benefits (and trade-offs) of land-management options in human-dominated landscapes. Given that processes of secondary regeneration have become more widespread across tropical regions and are predicted to increase, our matrix-calibrated model will be increasingly appropriate for practical conservation in tropical landscapes.
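    The AIC weight wi reported in this abstract is conventionally computed from the AIC differences across the candidate set; a minimal sketch with illustrative AIC values (not the five candidate models from this study):

```python
import math

def akaike_weights(aics):
    # w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    # where delta_i = AIC_i - min(AIC); weights sum to 1 and
    # quantify relative evidence for each candidate model
    best = min(aics)
    rel = [math.exp(-(a - best) / 2) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Illustrative AIC values for five candidate models
weights = akaike_weights([100.0, 104.2, 107.5, 110.0, 112.3])
```

    A gap of only a few AIC units already concentrates most of the weight on the best model, which is how a single candidate can carry ~89% of the evidence.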

  5. [Age identification and growth characteristics of Katsuwonus pelamis in western and central Pacific Ocean].

    PubMed

    Wang, Xue-Fang; Xu, Liu-Xiong; Zhu, Guo-Ping; Wang, Chun-Lei

    2010-03-01

    Fish age and growth are important biological parameters for the assessment of fishery resources. With the help of purse seiners, 262 individuals of skipjack tuna (Katsuwonus pelamis) were sampled from the western and central Pacific Ocean in October 2007 - January 2008. In situ measurements showed that the fork length of the samples ranged from 278 to 746 mm, and their body mass ranged from 345 to 9905 g. The first dorsal spine of each individual was collected for age identification and growth parameter estimation. The relationship between fork length (L, mm) and body mass (M, g) was expressed as M = 3.612 × 10^-6 × L^3.278 (R2 = 0.9782), and no significant difference was found between males and females (F = 2.002, P > 0.05). A comparison with the Akaike information criterion (AIC) suggested that, among the power, linear, and exponential regression equations, the linear regression equation was most suitable for describing the relationship between fork length and spine radius (AIC = 2257.4). The mean back-calculated fork lengths of K. pelamis at ages 1-5, estimated by Fraser-Lee's method, were 398.4, 494.2, 555.4, 636.8, and 728.8 mm, respectively. Residual analyses indicated that there was no significant difference in the growth of male and female K. pelamis (F = 0.670; df = 182; P > 0.05). The sex-combined von Bertalanffy growth equation of K. pelamis was L(t) = 706.51(1 - e^(-0.64(t + 0.037))).
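    The reported sex-combined von Bertalanffy equation can be evaluated directly; the parameters below are taken from the abstract, and the rounding of the predictions is mine:

```python
import math

# Sex-combined von Bertalanffy growth equation reported for
# K. pelamis: L(t) = 706.51 * (1 - exp(-0.64 * (t + 0.037))),
# with fork length in mm and age t in years
L_INF, K, T0 = 706.51, 0.64, 0.037

def fork_length(age_years):
    return L_INF * (1 - math.exp(-K * (age_years + T0)))

# Predicted fork length (mm) at ages 1-5
predicted = {age: round(fork_length(age), 1) for age in range(1, 6)}
```

    Predicted lengths rise monotonically toward the asymptote L_INF = 706.51 mm; they differ from the back-calculated means in the abstract, which were obtained from spine radii by Fraser-Lee's method rather than from this fitted curve.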

  6. Clinicopathologic implication of meticulous pathologic examination of regional lymph nodes in gastric cancer patients

    PubMed Central

    Koh, Jiwon; Lee, Hee Eun; Kim, Woo Ho

    2017-01-01

    Background We aimed to investigate the effect of an increased number of examined lymph nodes (LNs) on the pN category, and to compare various N categories in gastric cancer: American Joint Committee on Cancer (AJCC) 7th edition, metastatic LN ratio (MLR), and log odds of positive LNs (LODDS). Methods Four cohorts with a total of 2,309 gastric cancer patients were enrolled. For cohorts 1 and 2, the prognostic significance of each method for disease-specific survival was analyzed using the Akaike and Bayesian information criteria (AIC and BIC). Results The total LNs in the four cohorts differed significantly [median (range), 28 (6–97) in cohort 1, 37 (8–120) in cohort 2, 48 (7–122) in cohort 3, and 54 (4–221) in cohort 4; p<0.001]. The number of negative LNs increased with the total LN count (p<0.001), but the number of metastatic LNs did not increase from cohort 1 to 4. MLR and LODDS in the four cohorts tended to decrease with increasing total LNs in each of the pT3 and pT4 categories (p<0.001), while the number of metastatic LNs did not differ significantly in any pT category (p>0.05). The AIC and BIC varied according to the cut-off values chosen for MLR; the model with cut-offs of 0.2 and 0.5 was better for cohort 1, while that with cut-offs of 0.1 and 0.25 was better for cohort 2. Conclusion Our study showed that the number of metastatic LNs did not increase with maximal pathologic examination of regional LNs. The AJCC 7th system is suggested as the simplest method with a single set of cut-off values, but the prognostic significance of MLR may be influenced by the choice of cut-offs. PMID:28362845
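    The MLR and LODDS measures compared in this abstract have standard definitions; a minimal sketch, assuming the common 0.5 continuity correction for LODDS (the node counts below are hypothetical):

```python
import math

def mlr(positive, examined):
    # Metastatic lymph node ratio: positive nodes / examined nodes
    return positive / examined

def lodds(positive, examined):
    # Log odds of positive nodes; the 0.5 correction avoids
    # division by zero and log(0) when counts are 0
    negative = examined - positive
    return math.log((positive + 0.5) / (negative + 0.5))

# Same metastatic burden, different extent of LN examination:
# MLR falls as more (negative) nodes are examined, echoing the
# abstract's observation within the pT3/pT4 categories
few  = mlr(4, 20)
many = mlr(4, 50)
```

    This illustrates why ratio-based categories are sensitive to how thoroughly nodes are retrieved, whereas a raw metastatic count is not.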

  7. Pharmacokinetic Modeling of Intranasal Scopolamine in Plasma Saliva and Urine

    NASA Technical Reports Server (NTRS)

    Wu, L.; Chow, D. S. L.; Tam, V.; Putcha, L.

    2014-01-01

    An intranasal gel formulation of scopolamine (INSCOP) was developed for the treatment of Space Motion Sickness. The bioavailability and pharmacokinetics (PK) were evaluated under the Food and Drug Administration guidelines for clinical trials for an Investigative New Drug (IND). The aim of this project was to develop a PK model that can predict the relationship between plasma, saliva and urinary scopolamine concentrations using data collected from the IND clinical trial with INSCOP. METHODS: Twelve healthy human subjects were administered three dose levels (0.1, 0.2 and 0.4 mg) of INSCOP. Serial blood, saliva and urine samples were collected between 5 min and 24 h after dosing, and scopolamine concentrations were measured using a validated LC-MS-MS assay. Compartmental PK models, using actual dosing and sampling times, were built using Phoenix (version 1.2). Model discrimination was performed by minimizing the Akaike Information Criterion (AIC), maximizing the coefficient of determination (r²), and comparing the quality-of-fit plots. RESULTS: The best structural model to describe scopolamine disposition after INSCOP administration (minimal AIC = 907.2) consisted of one compartment each for plasma, saliva and urine, inter-connected with different rate constants. The estimated values of the PK parameters were compiled in Table 1. The model fitting exercises revealed nonlinear PK for scopolamine between the plasma and saliva compartments for K21, Vmax and Km. CONCLUSION: A PK model for INSCOP was developed and, for the first time, it satisfactorily predicted the PK of scopolamine in plasma, saliva and urine after INSCOP administration. Using non-linear PK yielded the best structural model to describe scopolamine disposition between the plasma and saliva compartments, and its inclusion resulted in significantly improved model fitting. The model can be utilized to predict scopolamine plasma concentration using saliva and/or urine data that

  8. Genome-Wide Heterogeneity of Nucleotide Substitution Model Fit

    PubMed Central

    Arbiza, Leonardo; Patricio, Mateus; Dopazo, Hernán; Posada, David

    2011-01-01

    At a genomic scale, the patterns that have shaped molecular evolution are believed to be largely heterogeneous. Consequently, comparative analyses should use appropriate probabilistic substitution models that capture the main features under which different genomic regions have evolved. While efforts have concentrated on the development and understanding of model selection techniques, no descriptions of overall relative substitution model fit at the genome level have been reported. Here, we provide a characterization of best-fit substitution models across three genomic data sets including coding regions from mammals, vertebrates, and Drosophila (24,000 alignments). According to the Akaike Information Criterion (AIC), 82 of the 88 models considered were selected as best-fit models on at least one occasion, although with very different frequencies. Most parameter estimates also varied broadly among genes. Patterns found for vertebrates and Drosophila were quite similar and often more complex than those found in mammals. Phylogenetic trees derived from models in the 95% confidence interval set showed much less variance and were significantly closer to the tree estimated under the best-fit model than trees derived from models outside this interval. Although alternative criteria selected simpler models than the AIC, they suggested similar patterns. Altogether, our results show that at a genomic scale, different gene alignments for the same set of taxa are best explained by a large variety of different substitution models and that model choice has implications for different parameter estimates, including the inferred phylogenetic trees. After taking into account the differences related to sample size, our results suggest a noticeable diversity in the underlying evolutionary process. Altogether, we conclude that the use of model selection techniques is important to obtain consistent phylogenetic estimates from real data at a genomic scale. PMID:21824869

  9. Effects of human recreation on the incubation behavior of American Oystercatchers

    USGS Publications Warehouse

    McGowan, C.P.; Simons, T.R.

    2006-01-01

    Human recreational disturbance and its effects on wildlife demographics and behavior is an increasingly important area of research. We monitored the nesting success of American Oystercatchers (Haematopus palliatus) in coastal North Carolina in 2002 and 2003. We also used video monitoring at nests to measure the response of incubating birds to human recreation. We counted the number of trips per hour made by adult birds to and from the nest, and we calculated the percent time that adults spent incubating. We asked whether human recreational activities (truck, all-terrain vehicle [ATV], and pedestrian traffic) were correlated with parental behavioral patterns. Eleven a priori models of nest survival and behavioral covariates were evaluated using Akaike's Information Criterion (AIC) to see whether incubation behavior influenced nest survival. Factors associated with birds leaving their nests (n = 548) included ATV traffic (25%), truck traffic (17%), pedestrian traffic (4%), aggression with neighboring oystercatchers or paired birds exchanging incubation duties (26%), airplane traffic (1%) and unknown factors (29%). ATV traffic was positively associated with the rate of trips to and away from the nest (β1 = 0.749, P < 0.001) and negatively correlated with percent time spent incubating (β1 = -0.037, P = 0.025). Other forms of human recreation apparently had little effect on incubation behaviors. Nest survival models incorporating the frequency of trips by adults to and from the nest, and the percentage of time adults spent incubating, were somewhat supported in the AIC analyses. A low frequency of trips to and from the nest and, counter to expectations, low percent time spent incubating were associated with higher daily nest survival rates. These data suggest that changes in incubation behavior might be one mechanism by which human recreation affects the reproductive success of American Oystercatchers.

  10. Does transport time help explain the high trauma mortality rates in rural areas? New and traditional predictors assessed by new and traditional statistical methods

    PubMed Central

    Røislien, Jo; Lossius, Hans Morten; Kristiansen, Thomas

    2015-01-01

    Background Trauma is a leading global cause of death. Trauma mortality rates are higher in rural areas, constituting a challenge for quality and equality in trauma care. The aim of the study was to explore population density and transport time to hospital care as possible predictors of geographical differences in mortality rates, and to what extent choice of statistical method might affect the analytical results and accompanying clinical conclusions. Methods Using data from the Norwegian Cause of Death registry, deaths from external causes 1998–2007 were analysed. Norway consists of 434 municipalities, and municipality population density and travel time to hospital care were entered as predictors of municipality mortality rates in univariate and multiple regression models of increasing model complexity. We fitted linear regression models with continuous and categorised predictors, as well as piecewise linear and generalised additive models (GAMs). Models were compared using Akaike's information criterion (AIC). Results Population density was an independent predictor of trauma mortality rates, while the contribution of transport time to hospital care was highly dependent on choice of statistical model. A multiple GAM or piecewise linear model was superior, and similar, in terms of AIC. However, while transport time was statistically significant in multiple models with piecewise linear or categorised predictors, it was not in GAM or standard linear regression. Conclusions Population density is an independent predictor of trauma mortality rates. The added explanatory value of transport time to hospital care is marginal and model-dependent, highlighting the importance of exploring several statistical models when studying complex associations in observational data. PMID:25972600

  11. An automated process for building reliable and optimal in vitro/in vivo correlation models based on Monte Carlo simulations.

    PubMed

    Sutton, Steven C; Hu, Mingxiu

    2006-05-05

    Many mathematical models have been proposed for establishing an in vitro/in vivo correlation (IVIVC). The traditional IVIVC model building process consists of 5 steps: deconvolution, model fitting, convolution, prediction error evaluation, and cross-validation. This is a time-consuming process, and typically only a few models at most are tested for any given data set. The objectives of this work were to (1) propose a statistical tool to screen models for further development of an IVIVC, (2) evaluate the performance of each model under different circumstances, and (3) investigate the effectiveness of common statistical model selection criteria for choosing IVIVC models. A computer program was developed to explore which model(s) would be most likely to work well with a random variation from the original formulation. The process used Monte Carlo simulation techniques to build IVIVC models. Data-based model selection criteria (Akaike Information Criterion [AIC], R2) and the probability of passing the Food and Drug Administration "prediction error" requirement were calculated. To illustrate this approach, several real data sets representing a broad range of release profiles are used to demonstrate the process and the advantages of this automated approach over the traditional one. The Hixson-Crowell and Weibull models were often preferred over the linear model. When evaluating whether a Level A IVIVC model was possible, the AIC generally selected the best model. We believe that the approach we propose may be a rapid tool to determine which IVIVC model (if any) is the most applicable.

  12. Comparison of non-linear models to describe the lactation curves for milk yield and composition in buffaloes (Bubalus bubalis).

    PubMed

    Ghavi Hossein-Zadeh, N

    2016-02-01

    In order to describe the lactation curves of milk yield (MY) and composition in buffaloes, seven non-linear mathematical equations (Wood, Dhanoa, Sikka, Nelder, Brody, Dijkstra and Rook) were used. Data were 116,117 test-day records for MY, fat (FP) and protein (PP) percentages of milk from the first three lactations of buffaloes, collected from 893 herds in the period from 1992 to 2012 by the Animal Breeding Center of Iran. Each model was fitted to monthly production records of dairy buffaloes using the NLIN and MODEL procedures in SAS and the parameters were estimated. The models were tested for goodness of fit using the adjusted coefficient of determination (R2adj), root mean square error (RMSE), Durbin-Watson statistic and Akaike's information criterion (AIC). The Dijkstra model provided the best fit of MY and PP of milk for the first three parities of buffaloes due to lower values of RMSE and AIC than the other models. For first-parity buffaloes, the Sikka and Brody models provided the best fit of FP, but for second- and third-parity buffaloes, the Sikka model and the Brody equation, respectively, provided the best fit. The results of this study showed that the Wood and Dhanoa equations were able to estimate the time to peak MY more accurately than the other equations. In addition, the Nelder and Dijkstra equations were able to estimate the peak time at second and third parities, respectively, more accurately than the other equations. The Brody function provided more accurate predictions of peak MY over the first three parities of buffaloes. There was generally a positive relationship between 305-day MY and persistency measures, and also between peak yield and 305-day MY, calculated by the different models within each lactation in the current study. Overall, evaluation of the different equations used in the current study indicated the potential of non-linear models for fitting monthly productive records of buffaloes.
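    Wood's equation, the first of the lactation-curve models listed, has a closed-form peak; a sketch with illustrative parameters (not fitted to the buffalo data in this study):

```python
import math

def wood(t, a, b, c):
    # Wood's incomplete gamma lactation curve:
    # y(t) = a * t^b * exp(-c * t), t = time in milk
    return a * t**b * math.exp(-c * t)

# Illustrative parameters, not estimates from the buffalo records
a, b, c = 10.0, 0.25, 0.04

# Setting dy/dt = 0 gives the time to peak yield: t = b / c,
# and the peak yield itself is a * (b/c)^b * exp(-b)
peak_time = b / c
peak_yield = wood(peak_time, a, b, c)
```

    The closed-form peak is what makes Wood-type fits convenient for the time-to-peak and persistency comparisons reported in the abstract.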

  13. Evaluating a coupled discrete wavelet transform and support vector regression for daily and monthly streamflow forecasting

    NASA Astrophysics Data System (ADS)

    Liu, Zhiyong; Zhou, Ping; Chen, Gang; Guo, Ledong

    2014-11-01

    This study investigated the performance and potential of a hybrid model that combined the discrete wavelet transform and support vector regression (the DWT-SVR model) for daily and monthly streamflow forecasting. Three key factors of the wavelet decomposition phase (mother wavelet, decomposition level, and edge effect) were proposed to consider for improving the accuracy of the DWT-SVR model. The performance of DWT-SVR models with different combinations of these three factors was compared with the regular SVR model. The effectiveness of these models was evaluated using the root-mean-squared error (RMSE) and Nash-Sutcliffe model efficiency coefficient (NSE). Daily and monthly streamflow data observed at two stations in Indiana, United States, were used to test the forecasting skill of these models. The results demonstrated that the different hybrid models did not always outperform the SVR model for 1-day and 1-month lead time streamflow forecasting. This suggests that it is crucial to consider and compare the three key factors when using the DWT-SVR model (or other machine learning methods coupled with the wavelet transform), rather than choosing them based on personal preferences. We then combined forecasts from multiple candidate DWT-SVR models using a model averaging technique based upon Akaike's information criterion (AIC). This ensemble prediction was superior to the single best DWT-SVR model and regular SVR model for both 1-day and 1-month ahead predictions. With respect to longer lead times (i.e., 2- and 3-day and 2-month), the ensemble predictions using the AIC averaging technique were consistently better than the best DWT-SVR model and SVR model. Therefore, integrating model averaging techniques with the hybrid DWT-SVR model would be a promising approach for daily and monthly streamflow forecasting. Additionally, we strongly recommend considering these three key factors when using wavelet-based SVR models (or other wavelet-based forecasting models).
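    The AIC-based model averaging described in this abstract weights each candidate model's forecast by its Akaike weight; a minimal sketch with hypothetical forecasts and AIC values (not the DWT-SVR results from the study):

```python
import math

def aic_average(forecasts, aics):
    # Combine candidate forecasts using Akaike weights:
    # w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2)
    best = min(aics)
    w = [math.exp(-(a - best) / 2) for a in aics]
    total = sum(w)
    w = [x / total for x in w]
    # Weighted average of the candidate forecasts
    return sum(wi * f for wi, f in zip(w, forecasts))

# Illustrative 1-day-ahead streamflow forecasts (m^3/s) from three
# hypothetical candidate models, with hypothetical AIC values
ensemble = aic_average([120.0, 118.0, 125.0], [310.0, 311.5, 316.0])
```

    The ensemble forecast is pulled toward the lowest-AIC candidate while still drawing on the others, which is why such averages can beat the single best model.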

  14. Canada lynx Lynx canadensis habitat and forest succession in northern Maine, USA

    USGS Publications Warehouse

    Hoving, C.L.; Harrison, D.J.; Krohn, W.B.; Jakubas, W.J.; McCollough, M.A.

    2004-01-01

    The contiguous United States population of Canada lynx Lynx canadensis was listed as threatened in 2000. The long-term viability of lynx populations at the southern edge of their geographic range has been hypothesized to be dependent on old growth forests; however, lynx are a specialist predator on snowshoe hare Lepus americanus, a species associated with early-successional forests. To quantify the effects of succession and forest management on landscape-scale (100 km2) patterns of habitat occupancy by lynx, we compared landscape attributes in northern Maine, USA, where lynx had been detected on snow track surveys to landscape attributes where surveys had been conducted, but lynx tracks had not been detected. Models were constructed a priori and compared using logistic regression and Akaike's Information Criterion (AIC), which quantitatively balances data fit and parsimony. In the models with the lowest (i.e. best) AIC, lynx were more likely to occur in landscapes with much regenerating forest, and less likely to occur in landscapes with much recent clearcut, partial harvest and forested wetland. Lynx were not associated positively or negatively with mature coniferous forest. A probabilistic map of the model indicated a patchy distribution of lynx habitat in northern Maine. According to an additional survey of the study area for lynx tracks during the winter of 2003, the model correctly classified 63.5% of the lynx occurrences and absences. Lynx were more closely associated with young forests than mature forests; however, old-growth forests were functionally absent from the landscape. Lynx habitat could be reduced in northern Maine, given recent trends in forest management practices. Harvest strategies have shifted from clearcutting to partial harvesting. If this trend continues, future landscapes will shift away from extensive regenerating forests and toward landscapes dominated by pole-sized and larger stands. Because Maine presently supports the only verified

  15. Comparison of clinical outcomes and prognostic utility of risk stratification tools in patients with therapy-related vs de novo myelodysplastic syndromes: a report on behalf of the MDS Clinical Research Consortium.

    PubMed

    Zeidan, A M; Al Ali, N; Barnard, J; Padron, E; Lancet, J E; Sekeres, M A; Steensma, D P; DeZern, A; Roboz, G; Jabbour, E; Garcia-Manero, G; List, A; Komrokji, R

    2017-02-24

    While therapy-related (t)-myelodysplastic syndromes (MDS) have worse outcomes than de novo MDS (d-MDS), some t-MDS patients have an indolent course. Most MDS prognostic models excluded t-MDS patients during development. The performances of the International Prognostic Scoring System (IPSS), revised IPSS (IPSS-R), MD Anderson Global Prognostic System (MPSS), WHO Prognostic Scoring System (WPSS) and t-MDS Prognostic System (TPSS) were compared among patients with t-MDS. The Akaike information criterion (AIC) assessed the relative goodness of fit of the models. We identified 370 t-MDS patients (19%) among 1950 MDS patients. Prior therapy included chemotherapy alone (48%), chemoradiation (31%), and radiation alone (21%). Median survival for t-MDS patients was significantly shorter than for d-MDS (19 vs 46 months, P<0.005). All models discriminated survival in t-MDS (P<0.005 for each model). Patients with t-MDS had a significantly higher hazard of death relative to d-MDS in every risk model, and had inferior survival compared to patients with d-MDS within all risk group categories. AIC scores (lower is better) were 2316 (MPSS), 2343 (TPSS), 2343 (IPSS-R), 2361 (WPSS) and 2364 (IPSS). In conclusion, subsets of t-MDS patients with varying clinical outcomes can be identified using conventional risk stratification models. The MPSS, TPSS and IPSS-R provide the best predictive power. Leukemia advance online publication, 24 February 2017; doi:10.1038/leu.2017.33.

  16. Markov Mixed Effects Modeling Using Electronic Adherence Monitoring Records Identifies Influential Covariates to HIV Preexposure Prophylaxis.

    PubMed

    Madrasi, Kumpal; Chaturvedula, Ayyappa; Haberer, Jessica E; Sale, Mark; Fossler, Michael J; Bangsberg, David; Baeten, Jared M; Celum, Connie; Hendrix, Craig W

    2016-12-06

    Adherence is a major factor in the effectiveness of preexposure prophylaxis (PrEP) for HIV prevention. Modeling patterns of adherence helps to identify influential covariates of different types of adherence as well as to enable clinical trial simulation so that appropriate interventions can be developed. We developed a Markov mixed-effects model to understand the covariates influencing adherence patterns to daily oral PrEP. Electronic adherence records (date and time of medication bottle cap opening) from the Partners PrEP ancillary adherence study with a total of 1147 subjects were used. This study included once-daily dosing regimens of placebo, oral tenofovir disoproxil fumarate (TDF), and TDF in combination with emtricitabine (FTC), administered to HIV-uninfected members of serodiscordant couples. One-coin and first- to third-order Markov models were fit to the data using NONMEM 7.2. Model selection criteria included objective function value (OFV), Akaike information criterion (AIC), visual predictive checks, and posterior predictive checks. Covariates were included based on forward addition (α = 0.05) and backward elimination (α = 0.001). Markov models better described the data than one-coin models. A third-order Markov model gave the lowest OFV and AIC, but the simpler first-order model was used for covariate model building because no additional benefit on prediction of target measures was observed for higher-order models. Female sex and older age had a positive impact on adherence, whereas Sundays, sexual abstinence, and sex with a partner other than the study partner had a negative impact on adherence. Our findings suggest adherence interventions should consider the role of these factors.
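    The core comparison here, a first-order Markov model (today's dose depends on yesterday's) versus a "one-coin" model (every day is an independent Bernoulli trial), can be sketched outside NONMEM with maximum-likelihood counts and AIC. The adherence sequence below is invented for illustration; clustered misses are exactly the pattern a Markov model captures and a coin model cannot.

```python
import math

def bernoulli_loglik(seq):
    """Log-likelihood of a 'one-coin' model: each day is an independent coin flip."""
    n, k = len(seq), sum(seq)
    p = k / n
    return k * math.log(p) + (n - k) * math.log(1 - p)

def markov1_loglik(seq):
    """Log-likelihood of a first-order Markov model fit by transition counts."""
    trans = {0: {0: 0, 1: 0}, 1: {0: 0, 1: 0}}   # trans[prev][curr]
    for prev, curr in zip(seq, seq[1:]):
        trans[prev][curr] += 1
    ll = 0.0
    for prev in (0, 1):
        row_total = sum(trans[prev].values())
        for curr in (0, 1):
            c = trans[prev][curr]
            if c:
                ll += c * math.log(c / row_total)
    return ll

def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

# Invented adherence record: 1 = dose taken, 0 = missed (misses come in runs)
seq = [1] * 10 + [0] * 4 + [1] * 12 + [0] * 3 + [1] * 11
aic_coin = aic(bernoulli_loglik(seq), 1)     # one parameter: p(take)
aic_markov = aic(markov1_loglik(seq), 2)     # two: p(take|took), p(take|missed)
```

    On such run-structured data the Markov model's AIC comes out lower, mirroring the study's finding that Markov models better described the records than one-coin models.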

  17. Monthly streamflow prediction in the Volta Basin of West Africa: A SISO NARMAX polynomial modelling

    NASA Astrophysics Data System (ADS)

    Amisigo, B. A.; van de Giesen, N.; Rogers, C.; Andah, W. E. I.; Friesen, J.

    Single-input-single-output (SISO) non-linear system identification techniques were employed to model monthly catchment runoff at selected gauging sites in the Volta Basin of West Africa. NARMAX (Non-linear Autoregressive Moving Average with eXogenous Input) polynomial models were fitted to basin monthly rainfall and gauging station runoff data for each of the selected sites and used to predict monthly runoff at the sites. An error reduction ratio (ERR) algorithm was used to order regressors for various combinations of input, output and noise lags (various model structures), and the significant regressors for each model were selected by applying an Akaike Information Criterion (AIC) to independent rainfall-runoff validation series. Model parameters were estimated with the Matlab REGRESS function (an orthogonal least squares method). In each case, the sub-model without noise terms was fitted first, followed by a fitting of the noise model. The coefficient of determination (R-squared), the Nash-Sutcliffe Efficiency criterion (NSE) and the F statistic for the estimation (training) series were used to evaluate the significance of fit of each model to this series, while model selection from the range of models fitted for each gauging site was done by examining the NSEs and the AICs of the validation series. Monthly runoff predictions from the selected models were very good, and the polynomial models appeared to have captured a good part of the rainfall-runoff non-linearity. The results indicate that the NARMAX modelling framework is suitable for monthly river runoff prediction in the Volta Basin. The several good models made available by the NARMAX modelling framework could be useful in the selection of model structures that also provide insights into the physical behaviour of the catchment rainfall-runoff system.

  18. Yesterday's Information.

    ERIC Educational Resources Information Center

    McKay, Martin D.; Stout, J. David

    1999-01-01

    Discusses access to Internet resources in school libraries, including the importance of evaluating content and appropriate use. The following online services that provide current factual information from legitimate resources are described: SIRS (Social Issues Resource Series), InfoTrac, EBSCO Host, SearchBank, and the Electric Library. (MES)

  19. Envisioning Information.

    ERIC Educational Resources Information Center

    Tufte, Edward R.

    This book presents over 400 illustrations of complex data that show how the dimensionality and density of portrayals can be enhanced. Practical advice on how to explain complex materials by visual means is given, and examples illustrate the fundamental principles of information display. Design strategies presented are exemplified in maps, the…

  20. Information Technology

    DTIC Science & Technology

    2003-01-01

    Information Economy 2002. January 2003. www.oecd.org/sti/measuring-infoeconomy Pappalardo, Denise and Martin, Michael. Lobbying Group Outlines Big...Pipe Dream. Network World. January 21, 2002. Pappalardo, Denise and Mears, Jennifer. What is Broadband? Network World. November 18, 2002. Page... Pappalardo, Denise. FCC Plans 3G Spectrum Auction. Network World. November 18, 2002. Patrick, Steven. Malaysia Steps Up Piracy Fight

  1. Inferential statistics from Black Hispanic breast cancer survival data.

    PubMed

    Khan, Hafiz M R; Saxena, Anshul; Ross, Elizabeth; Ramamoorthy, Venkataraghavan; Sheehan, Diana

    2014-01-01

    In this paper we test statistical probability models for breast cancer survival data by race and ethnicity. Data were collected from breast cancer patients diagnosed in the United States during the years 1973-2009. We selected a stratified random sample of Black Hispanic female patients from the Surveillance, Epidemiology, and End Results (SEER) database to derive the statistical probability models. We used three common model building criteria, the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and Deviance Information Criterion (DIC), to assess goodness of fit, and found that the Black Hispanic female patients' survival data better fit the exponentiated exponential probability model. A novel Bayesian method was used to derive the posterior density function for the model parameters as well as to derive the predictive inference for future response. We specifically focused on the Black Hispanic race. The Markov chain Monte Carlo (MCMC) method was used for obtaining the summary results of the posterior parameters. Additionally, we reported predictive intervals for future survival times. These findings would be of great significance in treatment planning and healthcare resource allocation.
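    As a rough illustration of ranking candidate survival distributions by AIC and BIC: the exponentiated exponential fit used in the paper requires numerical optimization, so this sketch instead compares two distributions with closed-form maximum-likelihood fits (exponential and log-normal) on synthetic survival times. All data below are simulated, not SEER data.

```python
import math
import random

def exp_aic_bic(times):
    """MLE fit of an exponential model; returns (AIC, BIC)."""
    n = len(times)
    rate = n / sum(times)                    # MLE: 1 / sample mean
    loglik = sum(math.log(rate) - rate * t for t in times)
    k = 1
    return 2 * k - 2 * loglik, k * math.log(n) - 2 * loglik

def lognorm_aic_bic(times):
    """MLE fit of a log-normal model; returns (AIC, BIC)."""
    n = len(times)
    logs = [math.log(t) for t in times]
    mu = sum(logs) / n
    sigma2 = sum((x - mu) ** 2 for x in logs) / n
    loglik = sum(-math.log(t * math.sqrt(2 * math.pi * sigma2))
                 - (math.log(t) - mu) ** 2 / (2 * sigma2) for t in times)
    k = 2
    return 2 * k - 2 * loglik, k * math.log(n) - 2 * loglik

random.seed(42)
# Synthetic survival times (months), drawn from a log-normal for illustration
times = [random.lognormvariate(3.0, 0.5) for _ in range(200)]
aic_exp, bic_exp = exp_aic_bic(times)
aic_ln, bic_ln = lognorm_aic_bic(times)
```

    Since the data are generated log-normal, both criteria favor the log-normal fit despite its extra parameter; the same bookkeeping (maximized log-likelihood plus a parameter penalty) underlies the paper's AIC/BIC comparison.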

  2. Model averaging and muddled multimodel inferences

    USGS Publications Warehouse

    Cade, Brian S.

    2015-01-01

    Three flawed practices associated with model averaging coefficients for predictor variables in regression models commonly occur when making multimodel inferences in analyses of ecological data. Model-averaged regression coefficients based on Akaike information criterion (AIC) weights have been recommended for addressing model uncertainty but they are not valid, interpretable estimates of partial effects for individual predictors when there is multicollinearity among the predictor variables. Multicollinearity implies that the scaling of units in the denominators of the regression coefficients may change across models such that neither the parameters nor their estimates have common scales, therefore averaging them makes no sense. The associated sums of AIC model weights recommended to assess relative importance of individual predictors are really a measure of relative importance of models, with little information about contributions by individual predictors compared to other measures of relative importance based on effects size or variance reduction. Sometimes the model-averaged regression coefficients for predictor variables are incorrectly used to make model-averaged predictions of the response variable when the models are not linear in the parameters. I demonstrate the issues with the first two practices using the college grade point average example extensively analyzed by Burnham and Anderson. I show how partial standard deviations of the predictor variables can be used to detect changing scales of their estimates with multicollinearity. Standardizing estimates based on partial standard deviations for their variables can be used to make the scaling of the estimates commensurate across models, a necessary but not sufficient condition for model averaging of the estimates to be sensible. A unimodal distribution of estimates and valid interpretation of individual parameters are additional requisite conditions. The standardized estimates or equivalently the
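    The remedy this abstract points toward, for models that are linear in the parameters, is to average each model's predictions with Akaike weights rather than averaging the coefficients themselves. A minimal sketch, with two hypothetical fitted models and illustrative AIC values:

```python
import math

def akaike_weights(aics):
    """Normalized Akaike weights from raw AIC values."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2.0) for a in aics]
    s = sum(rel)
    return [r / s for r in rel]

# Two hypothetical fitted regression models for the same response.
# With collinear predictors their coefficients sit on different effective
# scales, so we average predictions, not coefficients.
def model1(x1, x2):    # illustrative fit: y ~ x1
    return 1.0 + 2.0 * x1

def model2(x1, x2):    # illustrative fit: y ~ x1 + x2
    return 0.5 + 1.2 * x1 + 0.9 * x2

w = akaike_weights([210.4, 211.0])    # illustrative AIC values

def averaged_prediction(x1, x2):
    return w[0] * model1(x1, x2) + w[1] * model2(x1, x2)

yhat = averaged_prediction(1.5, 2.0)
```

    The averaged prediction always lies between the member predictions, whereas a naive average of the x1 coefficients (2.0 and 1.2) would mix estimates whose units of scaling differ across models.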

  3. Information services and information processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  4. Teaching Information Skills: Recording Information.

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2002-01-01

    Discusses how to teach students in primary and intermediate grades to record and organize information. Highlights include developing a research question; collaborative planning between teachers and library media specialists; consistency of data entry; and an example of a unit on animal migration based on an appropriate Web site. (LRW)

  5. Protecting Information

    NASA Astrophysics Data System (ADS)

    Loepp, Susan; Wootters, William K.

    2006-09-01

    For many everyday transmissions, it is essential to protect digital information from noise or eavesdropping. This undergraduate introduction to error correction and cryptography is unique in devoting several chapters to quantum cryptography and quantum computing, thus providing a context in which ideas from mathematics and physics meet. By covering such topics as Shor's quantum factoring algorithm, this text informs the reader about current thinking in quantum information theory and encourages an appreciation of the connections between mathematics and science. Of particular interest are the potential impacts of quantum physics: (i) a quantum computer, if built, could crack our currently used public-key cryptosystems; and (ii) quantum cryptography promises to provide an alternative to these cryptosystems, basing its security on the laws of nature rather than on computational complexity. No prior knowledge of quantum mechanics is assumed, but students should have a basic knowledge of complex numbers, vectors, and matrices. Accessible to readers familiar with matrix algebra, vector spaces, and complex numbers, this is the first undergraduate text to cover cryptography, error correction, and quantum computation together; it features exercises designed to enhance understanding, including a number of computational problems, available from www.cambridge.org/9780521534765

  6. Joint Planck and WMAP assessment of low CMB multipoles

    NASA Astrophysics Data System (ADS)

    Iqbal, Asif; Prasad, Jayanti; Souradeep, Tarun; Malik, Manzoor A.

    2015-06-01

    The remarkable progress in cosmic microwave background (CMB) studies over the past decade has led to an era of precision cosmology in striking agreement with the ΛCDM model. However, the lack of power in the CMB temperature anisotropies at large angular scales (low l), as has been confirmed by the recent Planck data (up to l = 4), is still an open problem, although statistically it is not very strong (less than 3σ). One can avoid seeking an explanation for this problem by attributing the lack of power to cosmic variance, or one can look for explanations, e.g., different inflationary potentials or initial conditions for inflation, non-trivial topology, the ISW effect, etc. Features in the primordial power spectrum (PPS) motivated by early-universe physics have been the most common solution to this problem. In the present work we also follow this approach and consider a set of PPS models which have features, constraining their parameters with WMAP 9-year and Planck data employing Markov chain Monte Carlo (MCMC) analysis. The prominent feature of all the PPS models that we consider is an infrared cut-off which leads to suppression of power at large angular scales. We consider models of PPS with at most three extra parameters and use the Akaike information criterion (AIC) and Bayesian information criterion (BIC) of model selection to compare the models. For most models we find good constraints on the cut-off scale kc; however, for the other parameters our constraints are not as good. We find that the sharp cut-off model gives the best likelihood value for the WMAP 9-year data, but is only as good as the power-law model according to AIC. For the joint WMAP 9 + Planck data set, the Starobinsky model is slightly preferred by AIC and is also able to produce CMB power suppression up to l <= 3 to some extent. However, using the BIC criterion, the model with the fewest parameters (the power-law model) is always preferred.
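    The AIC/BIC disagreement reported here comes down to penalty size: AIC charges 2 per parameter, while BIC charges ln(n), which is much larger for big data sets. A toy computation (illustrative log-likelihoods and data size, not the paper's values) shows how a three-parameter cut-off model can match or beat a two-parameter power law under AIC yet lose under BIC:

```python
import math

def aic(loglik, k):
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    return k * math.log(n) - 2 * loglik

# Illustrative numbers: a 2-parameter power-law PPS versus a 3-parameter
# cut-off model that improves the log-likelihood slightly.
n = 2500                                  # number of data points (illustrative)
ll_powerlaw, k_powerlaw = -1200.0, 2
ll_cutoff, k_cutoff = -1198.2, 3

d_aic = aic(ll_cutoff, k_cutoff) - aic(ll_powerlaw, k_powerlaw)
d_bic = bic(ll_cutoff, k_cutoff, n) - bic(ll_powerlaw, k_powerlaw, n)
# AIC's extra-parameter cost is 2; BIC's is ln(2500) ≈ 7.8, so the same
# likelihood gain that wins under AIC can lose under BIC.
```

    Here d_aic is negative (cut-off model preferred) while d_bic is positive (power law preferred), the same qualitative split the abstract describes between the two criteria.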

  7. Information Environments

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia

    2003-01-01

    The objective of GRC CNIS/IE work is to build a plug-n-play infrastructure that provides the Grand Challenge Applications with a suite of tools for coupling codes together, numerical zooming between fidelity of codes and gaining deployment of these simulations onto the Information Power Grid. The GRC CNIS/IE work will streamline and improve this process by providing tighter integration of various tools through the use of object oriented design of component models and data objects and through the use of CORBA (Common Object Request Broker Architecture).

  8. Information management - Assessing the demand for information

    NASA Technical Reports Server (NTRS)

    Rogers, William H.

    1991-01-01

    Information demand is defined in terms of both information content (what information) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. It is seen that the essence of this intelligence is comprehending and capturing the demand for information.

  9. Modelling lactation curve for milk fat to protein ratio in Iranian buffaloes (Bubalus bubalis) using non-linear mixed models.

    PubMed

    Hossein-Zadeh, Navid Ghavi

    2016-08-01

    The aim of this study was to compare seven non-linear mathematical models (Brody, Wood, Dhanoa, Sikka, Nelder, Rook and Dijkstra) to examine their efficiency in describing the lactation curves for milk fat to protein ratio (FPR) in Iranian buffaloes. Data were 43 818 test-day records for FPR from the first three lactations of Iranian buffaloes which were collected on 523 dairy herds in the period from 1996 to 2012 by the Animal Breeding Center of Iran. Each model was fitted to monthly FPR records of buffaloes using the non-linear mixed model procedure (PROC NLMIXED) in SAS and the parameters were estimated. The models were tested for goodness of fit using Akaike's information criterion (AIC), Bayesian information criterion (BIC) and log maximum likelihood (-2 Log L). The Nelder and Sikka mixed models provided the best fit of lactation curve for FPR in the first and second lactations of Iranian buffaloes, respectively. However, Wood, Dhanoa and Sikka mixed models provided the best fit of lactation curve for FPR in the third parity buffaloes. Evaluation of first, second and third lactation features showed that all models, except for Dijkstra model in the third lactation, under-predicted test time at which daily FPR was minimum. On the other hand, minimum FPR was over-predicted by all equations. Evaluation of the different models used in this study indicated that non-linear mixed models were sufficient for fitting test-day FPR records of Iranian buffaloes.

  10. Multimodel Predictive System for Carbon Dioxide Solubility in Saline Formation Waters

    SciTech Connect

    Wang, Zan; Small, Mitchell J; Karamalidis, Athanasios K

    2013-02-05

    The prediction of carbon dioxide solubility in brine at conditions relevant to carbon sequestration (i.e., high temperature, pressure, and salt concentration (T-P-X)) is crucial when this technology is applied. Eleven mathematical models for predicting CO2 solubility in brine are compared and considered for inclusion in a multimodel predictive system. Model goodness of fit is evaluated over the temperature range 304–433 K, pressure range 74–500 bar, and salt concentration range 0–7 m (NaCl equivalent), using 173 published CO2 solubility measurements, particularly selected for those conditions. The performance of each model is assessed using various statistical methods, including the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Different models emerge as best fits for different subranges of the input conditions. A classification tree is generated using machine learning methods to predict the best-performing model under different T-P-X subranges, allowing development of a multimodel predictive system (MMoPS) that selects and applies the model expected to yield the most accurate CO2 solubility prediction. Statistical analysis of the MMoPS predictions, including a stratified 5-fold cross validation, shows that MMoPS outperforms each individual model and increases the overall accuracy of CO2 solubility prediction across the range of T-P-X conditions likely to be encountered in carbon sequestration applications.

  11. The kinetics of fluoride sorption by zeolite: Effects of cadmium, barium and manganese

    NASA Astrophysics Data System (ADS)

    Cai, Qianqian; Turner, Brett D.; Sheng, Daichao; Sloan, Scott

    2015-06-01

    Industrial wastewaters often consist of a complex chemical cocktail, with treatment of target contaminants complicated by adverse chemical reactions. The impact of metal ions (Cd2+, Ba2+ and Mn2+) on the kinetics of fluoride removal from solution by natural zeolite was investigated. In order to better understand the kinetics, the pseudo-second order (PSO), Hill (Hill 4 and Hill 5) and intra-particle diffusion (IPD) models were applied. Model fitting was compared using the Akaike Information Criterion (AIC) and the Schwarz Bayesian Information Criterion (BIC). The Hill models (Hill 4 and Hill 5) were found to be superior in describing the fluoride removal processes due to the sigmoidal nature of the kinetics. Results indicate that the presence of Mn (100 mg L-1) and Cd (100 mg L-1) increases the rate of fluoride sorption by a factor of 28.3 and 10.9, respectively, while the maximum sorption capacity is increased by a factor of 2.2 and 1.7. The presence of Ba (100 mg L-1) initially inhibited fluoride removal, and very poor fits were obtained for all models. Fitting was best described with a biphasic sigmoidal model, with the degree of inhibition decreasing with increasing temperature, suggesting that at least two processes are involved in fluoride sorption onto natural zeolite in the presence of Ba.

  12. Flexible and fixed mathematical models describing growth patterns of chukar partridges

    NASA Astrophysics Data System (ADS)

    Aygün, Ali; Narinç, Doǧan

    2016-04-01

    In animal science, the nonlinear regression models used for the analysis of growth patterns are separated into two groups, called fixed and flexible, according to their point of inflection. The aims of this study were to compare fixed and flexible growth functions and to determine the best-fit model for the growth data of chukar partridges. With this aim, the growth data of partridges were modeled with widely used models such as Gompertz, Logistic and Von Bertalanffy, as well as the flexible functions Richards, Janoschek and Levakovich. To evaluate the growth functions, the R2 (coefficient of determination), adjusted R2 (adjusted coefficient of determination), MSE (mean square error), AIC (Akaike's information criterion) and BIC (Bayesian information criterion) goodness-of-fit criteria were used. According to these criteria, the best-fit model for the chukar partridge growth data is the Janoschek function, which has a flexible structure. The Janoschek model is important not only because it has a higher number of parameters with biological meaning than the other functions (the mature weight and initial weight parameters), but also because it had not previously been used in modeling chukar partridge growth.

  13. Multimodel predictive system for carbon dioxide solubility in saline formation waters.

    PubMed

    Wang, Zan; Small, Mitchell J; Karamalidis, Athanasios K

    2013-02-05

    The prediction of carbon dioxide solubility in brine at conditions relevant to carbon sequestration (i.e., high temperature, pressure, and salt concentration (T-P-X)) is crucial when this technology is applied. Eleven mathematical models for predicting CO(2) solubility in brine are compared and considered for inclusion in a multimodel predictive system. Model goodness of fit is evaluated over the temperature range 304-433 K, pressure range 74-500 bar, and salt concentration range 0-7 m (NaCl equivalent), using 173 published CO(2) solubility measurements, particularly selected for those conditions. The performance of each model is assessed using various statistical methods, including the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Different models emerge as best fits for different subranges of the input conditions. A classification tree is generated using machine learning methods to predict the best-performing model under different T-P-X subranges, allowing development of a multimodel predictive system (MMoPS) that selects and applies the model expected to yield the most accurate CO(2) solubility prediction. Statistical analysis of the MMoPS predictions, including a stratified 5-fold cross validation, shows that MMoPS outperforms each individual model and increases the overall accuracy of CO(2) solubility prediction across the range of T-P-X conditions likely to be encountered in carbon sequestration applications.

  14. A novel electrocardiogram parameterization algorithm and its application in myocardial infarction detection.

    PubMed

    Liu, Bin; Liu, Jikui; Wang, Guoqing; Huang, Kun; Li, Fan; Zheng, Yang; Luo, Youxi; Zhou, Fengfeng

    2015-06-01

    The electrocardiogram (ECG) is a biophysical electric signal generated by the heart muscle, and is one of the major measurements of how well a heart functions. Automatic ECG analysis algorithms usually extract the geometric or frequency-domain features of the ECG signals and have already significantly facilitated automatic ECG-based cardiac disease diagnosis. We propose a novel ECG feature by fitting a given ECG signal with a 20th order polynomial function, defined as PolyECG-S. The PolyECG-S feature is almost identical to the fitted ECG curve, measured by the Akaike information criterion (AIC), and achieved a 94.4% accuracy in detecting the Myocardial Infarction (MI) on the test dataset. Currently ST segment elongation is one of the major ways to detect MI (ST-elevation myocardial infarction, STEMI). However, many ECG signals have weak or even undetectable ST segments. Since PolyECG-S does not rely on the information of ST waves, it can be used as a complementary MI detection algorithm with the STEMI strategy. Overall, our results suggest that the PolyECG-S feature may satisfactorily reconstruct the fitted ECG curve, and is complementary to the existing ECG features for automatic cardiac function analysis.

  15. Comparison of Two Gas Selection Methodologies: An Application of Bayesian Model Averaging

    SciTech Connect

    Renholds, Andrea S.; Thompson, Sandra E.; Anderson, Kevin K.; Chilton, Lawrence K.

    2006-03-31

    One goal of hyperspectral imagery analysis is the detection and characterization of plumes. Characterization includes identifying the gases in the plumes, which is a model selection problem. Two gas selection methods compared in this report are Bayesian model averaging (BMA) and minimum Akaike information criterion (AIC) stepwise regression (SR). Simulated spectral data from a three-layer radiance transfer model were used to compare the two methods. Test gases were chosen to span the types of spectra observed, which exhibit peaks ranging from broad to sharp. The size and complexity of the search libraries were varied. Background materials were chosen to either replicate a remote area of eastern Washington or feature many common background materials. For many cases, BMA and SR performed the detection task comparably in terms of the receiver operating characteristic curves. For some gases, BMA performed better than SR when the size and complexity of the search library increased. This is encouraging because we expect improved BMA performance upon incorporation of prior information on background materials and gases.

  16. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations of aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, the variety of wind speed values and wind power curtailment.

  17. Time Series Decomposition into Oscillation Components and Phase Estimation.

    PubMed

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-02-01

    Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished by this model like the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes' method, the amplitudes and the frequencies of oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and detecting the phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.
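    The idea of choosing the number of oscillation components with AIC can be illustrated in a simplified setting (ordinary least-squares sinusoid fits rather than the paper's state-space models): generate a series with two oscillations plus noise, fit candidate component sets, and compare AICs. The series, frequencies, and candidate sets below are all invented for illustration.

```python
import math
import random

def harmonic_aic(y, freqs):
    """AIC of a least-squares fit of mean + sinusoids at the given integer
    frequencies (closed form on a regular grid, where sinusoids are orthogonal)."""
    n = len(y)
    fitted = [sum(y) / n] * n
    for j in freqs:
        a = 2.0 / n * sum(y[t] * math.cos(2 * math.pi * j * t / n) for t in range(n))
        b = 2.0 / n * sum(y[t] * math.sin(2 * math.pi * j * t / n) for t in range(n))
        fitted = [fitted[t]
                  + a * math.cos(2 * math.pi * j * t / n)
                  + b * math.sin(2 * math.pi * j * t / n) for t in range(n)]
    rss = sum((yt - ft) ** 2 for yt, ft in zip(y, fitted))
    k = 1 + 2 * len(freqs)              # mean + (a_j, b_j) per component
    return n * math.log(rss / n) + 2 * k

random.seed(0)
n = 128
# Synthetic series: two oscillation components plus Gaussian noise
y = [math.sin(2 * math.pi * 3 * t / n) + 0.5 * math.sin(2 * math.pi * 7 * t / n)
     + random.gauss(0, 0.2) for t in range(n)]

# Candidate component sets of increasing size; AIC balances fit against size
candidates = {0: [], 1: [3], 2: [3, 7], 3: [3, 7, 11]}
aics = {k: harmonic_aic(y, f) for k, f in candidates.items()}
```

    Adding the two true components drops the AIC sharply, while the penalty discourages spurious extra components; the paper's method applies the same trade-off with state-space oscillator models fit by empirical Bayes.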

  18. The kinetics of fluoride sorption by zeolite: Effects of cadmium, barium and manganese.

    PubMed

    Cai, Qianqian; Turner, Brett D; Sheng, Daichao; Sloan, Scott

    2015-01-01

    Industrial wastewaters often consist of a complex chemical cocktail, with treatment of target contaminants complicated by adverse chemical reactions. The impact of metal ions (Cd(2+), Ba(2+) and Mn(2+)) on the kinetics of fluoride removal from solution by natural zeolite was investigated. In order to better understand the kinetics, the pseudo-second order (PSO), Hill (Hill 4 and Hill 5) and intra-particle diffusion (IPD) models were applied. Model fitting was compared using the Akaike Information Criterion (AIC) and the Schwarz Bayesian Information Criterion (BIC). The Hill models (Hill 4 and Hill 5) were found to be superior in describing the fluoride removal processes due to the sigmoidal nature of the kinetics. Results indicate that the presence of Mn (100 mg L(-1)) and Cd (100 mg L(-1)) increases the rate of fluoride sorption by factors of ~28.3 and ~10.9, respectively, while the maximum sorption capacity is increased by factors of ~2.2 and ~1.7. The presence of Ba (100 mg L(-1)) initially inhibited fluoride removal, and very poor fits were obtained for all models. Fitting was best described with a biphasic sigmoidal model, with the degree of inhibition decreasing with increasing temperature, suggesting that at least two processes are involved in fluoride sorption onto natural zeolite in the presence of Ba.
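For kinetic models fitted by least squares, as the PSO, Hill, and IPD models above are, AIC can be computed directly from the residual sum of squares under the usual Gaussian-error assumption. A minimal sketch; the RSS values are hypothetical, with Hill 4 and Hill 5 taken to have 4 and 5 fitted parameters versus 2 for the PSO model:

```python
import math

def aic_rss(n_obs, rss, n_params):
    # For least-squares fits with Gaussian errors,
    # AIC = n ln(RSS/n) + 2k (additive constants cancel between models).
    return n_obs * math.log(rss / n_obs) + 2 * n_params

# Hypothetical fits of three kinetic models to the same n = 30 sorption points.
n = 30
models = {
    "PSO":   (4.8, 2),  # (residual sum of squares, parameter count)
    "Hill4": (1.2, 4),
    "Hill5": (1.1, 5),
}

ranked = sorted(models, key=lambda m: aic_rss(n, *models[m]))
# The sigmoidal Hill models fit much better here, and the AIC penalty for
# their extra parameters does not overturn that advantage, so PSO ranks last.
```

The same ranking logic applies to any set of models fitted to the same data; only the relative AIC values matter, not their absolute magnitudes.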

  19. Prediction and extension of curves of distillation of vacuum residue using probability functions

    NASA Astrophysics Data System (ADS)

    León, A. Y.; Riaño, P. A.; Laverde, D.

    2016-02-01

    The use of probability functions for the prediction of crude distillation curves has been implemented in different characterization studies for refining processes. Four probability functions (Weibull extreme, Weibull, Kumaraswamy and Riazi) were analyzed in this work for fitting distillation curves of vacuum residues. After analysing the experimental data, the Weibull extreme function was selected as the best prediction function; its fitting capability was validated using the AIC (Akaike Information Criterion), BIC (Bayesian Information Criterion), and the correlation coefficient R2 as estimation criteria. To cover a wide range of compositions, fifty-five (55) vacuum residues derived from different hydrocarbon mixtures were selected. The parameters of the Weibull extreme probability function were adjusted from simply measured properties such as Conradson Carbon Residue (CCR) and compositional SARA analysis (saturates, aromatics, resins and asphaltenes). The proposed method is an appropriate tool to describe the tendency of distillation curves and offers a practical approach for the classification of vacuum residues.

  20. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    SciTech Connect

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are firstly classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, variety of wind speed values and wind power curtailment.
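The AIC/BIC comparison for choosing the number of mixture components, as in the abstract above, reduces to two simple formulas once each candidate mixture has been fit by maximum likelihood. A minimal sketch; the log-likelihood values are hypothetical, and an m-component Weibull mixture has 3m − 1 free parameters (m − 1 independent weights plus m scale and m shape parameters):

```python
import math

def aic(log_lik, n_params):
    # AIC = 2k - 2 ln(L)
    return 2 * n_params - 2 * log_lik

def bic(log_lik, n_params, n_obs):
    # BIC = k ln(n) - 2 ln(L); penalizes parameters more heavily for large n
    return n_params * math.log(n_obs) - 2 * log_lik

# Hypothetical maximized log-likelihoods for 1-, 2- and 3-component
# Weibull mixtures fit to the same n = 1000 wind power observations.
fits = {1: (-1520.3, 2), 2: (-1488.1, 5), 3: (-1486.9, 8)}  # m: (lnL, k)
n = 1000

best_aic = min(fits, key=lambda m: aic(*fits[m]))
best_bic = min(fits, key=lambda m: bic(*fits[m], n))
# With these values both criteria favour the 2-component mixture: the third
# component improves the likelihood too little to justify 3 more parameters.
```

Because BIC's penalty grows with ln(n), it generally selects the same number of components as AIC or fewer, never more.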

  1. Effect of ultrasound pre-treatment on the drying kinetics of brown seaweed Ascophyllum nodosum.

    PubMed

    Kadam, Shekhar U; Tiwari, Brijesh K; O'Donnell, Colm P

    2015-03-01

    The effect of ultrasound pre-treatment on the drying kinetics of the brown seaweed Ascophyllum nodosum under hot-air convective drying was investigated. Pre-treatments were carried out at ultrasound intensity levels ranging from 7.00 to 75.78 Wcm(-2) for 10 min using an ultrasonic probe system. It was observed that ultrasound pre-treatments reduced the drying time required. The shortest drying times were obtained for samples pre-treated at 75.78 Wcm(-2). The fit quality of 6 thin-layer drying models was also evaluated using the coefficient of determination (R(2)), root mean square error (RMSE), AIC (Akaike information criterion) and BIC (Bayesian information criterion). Drying kinetics were modelled using the Newton, Henderson and Pabis, Page, Wang and Singh, Midilli et al. and Weibull models. The Newton, Wang and Singh, and Midilli et al. models showed the best fit to the experimental drying data. The color of ultrasound pre-treated dried seaweed samples was lighter compared to control samples. It was concluded that ultrasound pre-treatment can be effectively used to reduce the energy cost and drying time for drying of A. nodosum.

  2. Informed consent.

    PubMed

    Steevenson, Grania

    2006-08-01

    Disclosure of information prior to consent is a very complex area of medical ethics. On the surface it would seem to be quite clear cut, but on closer inspection the scope for 'grey areas' is vast. In practice, however, it could be argued that the number of cases that result in complaint or litigation is comparatively small. However, this does not mean that wrong decisions or unethical scenarios do not occur. It would seem that in clinical practice these ethical grey areas concerning patients' full knowledge of their condition or treatment are quite common. One of the barometers for how much disclosure should be given prior to consent could be the feedback obtained from the patients. Are they asking relevant questions pertinent to their condition and do they show a good understanding of the options available? This should be seen as a positive trait and should be welcomed by the healthcare professionals. Ultimately it gives patients greater autonomy and the healthcare professional can expand and build on the patient's knowledge as well as allay fears perhaps based on wrongly held information. Greater communication with the patient would help the healthcare professional pitch their explanations at the right level. Every case and scenario is different and unique and deserves to be treated as such. Studies have shown that most patients can understand their medical condition and treatment provided communication has been thorough (Gillon 1996). It is in the patients' best interests to feel comfortable with the level of disclosure offered to them. It can only foster greater trust and respect between them and the healthcare profession which has to be mutually beneficial to both parties.

  3. Extracting tectonic information from cosmogenic exposure ages along bedrock scarps using synthetic and natural data

    NASA Astrophysics Data System (ADS)

    Cowie, Patience; Walker, Matthew; Roberts, Gerald; Phillips, Richard; Dunai, Tibor; Zijerveld, Leo; McCaffrey, Ken

    2010-05-01

    Cosmogenic surface exposure dating is a powerful tool for reconstructing long term slip histories on active faults and extracting earthquake recurrence intervals (e.g. Benedetti et al., GRL, v.29, 2002). Extensional faults are particularly amenable to this type of study because they commonly produce a striated bedrock scarp, exhumed by faulting, that can be directly dated. Bedrock scarps in limestone can be sampled to obtain the concentration of cosmogenic 36Cl, produced primarily through interactions of cosmic ray secondary neutrons and muons with Ca within the scarp limestone. To first order, the production rate decreases exponentially with depth beneath the ground surface. Because each normal-faulting earthquake uplifts a new portion of the scarp above the surface, the 36Cl concentration along the scarp is the sum of that 36Cl produced below the surface prior to the earthquake and that accumulated above the surface after the earthquake. For a scarp being seismically exhumed, the characteristic profile is therefore a series of exponentials with discontinuities marking the timing of each earthquake. The number of events, their timing and the magnitude of the associated slip strongly influence the shape of the 36Cl profile. Existing methods for extracting paleo-earthquakes from these data are based on a forward modelling approach and have shown that slip events ≥0.5m (≥ Magnitude 6.0) are well resolved by fully sampling the height of the exposed bedrock scarp. A forward model for 36Cl accumulation generates 36Cl concentration versus fault height for different potential fault slip histories, which is then compared with the measured 36Cl concentrations. The best fit scenario(s) are then ranked using the Akaike Information Criterion (AIC), which is sensitive to the goodness of fit as well as the number of parameters included in the model. A key feature of published results using this approach is that slip events of several meters have, in several cases, been inferred

  4. Geographic Information Office

    USGS Publications Warehouse

    ,

    2004-01-01

    The Geographic Information Office (GIO) is the principal information office for the U.S. Geological Survey (USGS), focused on: Information Policy and Services, Information Technology, Science Information, Information Security, and the Federal Geographic Data Committee/Geospatial One Stop.

  5. Effects of floods on fish assemblages in an intermittent prairie stream

    USGS Publications Warehouse

    Franssen, N.R.; Gido, K.B.; Guy, C.S.; Tripe, J.A.; Shrank, S.J.; Strakosh, T.R.; Bertrand, K.N.; Franssen, C.M.; Pitts, K.L.; Paukert, C.P.

    2006-01-01

    1. Floods are major disturbances to stream ecosystems that can kill or displace organisms and modify habitats. Many studies have reported changes in fish assemblages after a single flood, but few studies have evaluated the importance of timing and intensity of floods on long-term fish assemblage dynamics. 2. We used a 10-year dataset to evaluate the effects of floods on fishes in Kings Creek, an intermittent prairie stream in north-eastern Kansas, U.S.A. Samples were collected seasonally at two perennial headwater sites (1995-2005) and one perennial downstream flowing site (1997-2005), allowing us to evaluate the effects of floods at different locations within a watershed. In addition, four surveys during 2003 and 2004 sampled 3-5 km of stream between the long-term study sites to evaluate the use of intermittent reaches of this stream. 3. Because of higher discharge and bed scouring at the downstream site, we predicted that the fish assemblage would have lowered species richness and abundance following floods. In contrast, we expected increased species richness and abundance at headwater sites because floods increase stream connectivity and create the potential for colonisation from downstream reaches. 4. The Akaike Information Criterion (AIC) was used to select among candidate regression models that predicted species richness and abundance based on Julian date, time since floods, season and physical habitat at each site. At the downstream site, AIC weightings suggested Julian date was the best predictor of fish assemblage structure, but no model explained >16% of the variation in species richness or community structure. Variation explained by Julian date was primarily attributed to a long-term pattern of declining abundance of common species. At the headwater sites, there was not a single candidate model selected to predict total species abundance and assemblage structure.
AIC weightings suggested variation in assemblage structure was associated with either Julian date

  6. Double-input compartmental modeling and spectral analysis for the quantification of positron emission tomography data in oncology

    NASA Astrophysics Data System (ADS)

    Tomasi, G.; Kimberley, S.; Rosso, L.; Aboagye, E.; Turkheimer, F.

    2012-04-01

    In positron emission tomography (PET) studies involving organs different from the brain, ignoring the metabolite contribution to the tissue time-activity curves (TAC), as in the standard single-input (SI) models, may compromise the accuracy of the estimated parameters. We employed here double-input (DI) compartmental modeling (CM), previously used for [11C]thymidine, and a novel DI spectral analysis (SA) approach on the tracers 5-[18F]fluorouracil (5-[18F]FU) and [18F]fluorothymidine ([18F]FLT). CM and SA were performed initially with a SI approach using the parent plasma TAC as an input function. These methods were then employed using a DI approach with the metabolite plasma TAC as an additional input function. Regions of interest (ROIs) corresponding to healthy liver, kidneys and liver metastases for 5-[18F]FU and to tumor, vertebra and liver for [18F]FLT were analyzed. For 5-[18F]FU, the improvement of the fit quality with the DI approaches was remarkable; in CM, the Akaike information criterion (AIC) always selected the DI over the SI model. Volume of distribution estimates obtained with DI CM and DI SA were in excellent agreement, for both parent 5-[18F]FU (R2 = 0.91) and metabolite [18F]FBAL (R2 = 0.99). For [18F]FLT, the DI methods provided notable improvements but less substantial than for 5-[18F]FU due to the lower rate of metabolism of [18F]FLT. On the basis of the AIC values, agreement between [18F]FLT Ki estimated with the SI and DI models was good (R2 = 0.75) for the ROIs where the metabolite contribution was negligible, indicating that the additional input did not bias the parent tracer only-related estimates. When the AIC suggested a substantial contribution of the metabolite [18F]FLT-glucuronide, on the other hand, the change in the parent tracer only-related parameters was significant (R2 = 0.33 for Ki). 
Our results indicated that improvements of DI over SI approaches can range from moderate to substantial and are more significant for tracers with

  7. Forecast of natural aquifer discharge using a data-driven, statistical approach.

    PubMed

    Boggs, Kevin G; Van Kirk, Rob; Johnson, Gary S; Fairley, Jerry P

    2014-01-01

    In the Western United States, demand for water is often out of balance with limited water supplies. This has led to extensive water rights conflict and litigation. A tool that can reliably forecast natural aquifer discharge months ahead of peak water demand could help water practitioners and managers by providing advanced knowledge of potential water-right mitigation requirements. The timing and magnitude of natural aquifer discharge from the Eastern Snake Plain Aquifer (ESPA) in southern Idaho is accurately forecast 4 months ahead of the peak water demand, which occurs annually in July. An ARIMA time-series model with exogenous predictors (ARIMAX model) was used to develop the forecast. The ARIMAX model fit to a set of training data was assessed using Akaike's information criterion to select the optimal model that forecasts aquifer discharge, given the previous year's discharge and values of the predictor variables. Model performance was assessed by application of the model to a validation subset of data. The Nash-Sutcliffe efficiency for model predictions made on the validation set was 0.57. The predictor variables used in our forecast represent the major recharge and discharge components of the ESPA water budget, including variables that reflect overall water supply and important aspects of water administration and management. Coefficients of variation on the regression coefficients for streamflow and irrigation diversions were all much less than 0.5, indicating that these variables are strong predictors. The model with the highest AIC weight included streamflow, two irrigation diversion variables, and storage.
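The "AIC weight" used above to rank candidate ARIMAX models can be computed from raw AIC values as Akaike weights. A minimal sketch with hypothetical AIC values for three candidate models:

```python
import math

def akaike_weights(aics):
    # w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    # where delta_i = AIC_i - min(AIC). Subtracting the minimum first
    # also avoids overflow in the exponentials.
    a_min = min(aics)
    rel = [math.exp(-(a - a_min) / 2) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC values for three candidate forecast models.
w = akaike_weights([100.0, 102.0, 110.0])
# The weights sum to 1 and can be read as relative support for each model;
# a 10-unit AIC gap leaves the third model essentially no support.
```

Weights computed this way also feed directly into model-averaged estimates, as described in the herpetology abstract at the top of this page.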

  8. Ultrafine particle concentrations in the surroundings of an urban area: comparing downwind to upwind conditions using Generalized Additive Models (GAMs).

    PubMed

    Sartini, Claudio; Zauli Sajani, Stefano; Ricciardelli, Isabella; Delgado-Saborit, Juana Mari; Scotto, Fabiana; Trentini, Arianna; Ferrari, Silvia; Poluzzi, Vanes

    2013-10-01

    The aim of this study was to investigate the influence of an urban area on ultrafine particle (UFP) concentration in nearby surrounding areas. We assessed how downwind and upwind conditions affect the UFP concentration at a site placed a few kilometres from the city border. Secondly, we investigated the relationship among other meteorological factors, temporal variables and UFP. Data were collected for 44 days during 2008 and 2009 at a rural site placed about 3 kilometres from Bologna, in northern Italy. Measurements were performed using a spectrometer (FMPS TSI 3091). The average UFP number concentration was 11 776 (±7836) particles per cm(3). We analysed the effect of wind direction in a multivariate Generalized Additive Model (GAM) adjusted for the principal meteorological parameters and temporal trends. An increase of about 25% in UFP levels was observed when the site was downwind of the urban area, compared with the levels observed when wind blew from rural areas. The size distribution of particles was also affected by the wind direction, showing higher concentrations of small particles when the wind blew from the urban area. The GAM showed a good fit to the data (R(2) = 0.81). Model choice was made using the Akaike Information Criterion (AIC). The analysis also revealed that an approach based on meteorological data plus temporal trends improved the goodness of fit of the model. In addition, the findings contribute to evidence on the effects of exposure to ultrafine particles on a population living in city surroundings.

  9. Towards a Model Selection Rule for Quantum State Tomography

    NASA Astrophysics Data System (ADS)

    Scholten, Travis; Blume-Kohout, Robin

    Quantum tomography on large and/or complex systems will rely heavily on model selection techniques, which permit on-the-fly selection of small efficient statistical models (e.g. small Hilbert spaces) that accurately fit the data. Many model selection tools, such as hypothesis testing or Akaike's AIC, rely implicitly or explicitly on the Wilks Theorem, which predicts the behavior of the loglikelihood ratio statistic (LLRS) used to choose between models. We used Monte Carlo simulations to study the behavior of the LLRS in quantum state tomography, and found that it disagrees dramatically with Wilks' prediction. We propose a simple explanation for this behavior; namely, that boundaries (in state space and between models) play a significant role in determining the distribution of the LLRS. The resulting distribution is very complex, depending strongly both on the true state and the nature of the data. We consider a simplified model that neglects anisotropy in the Fisher information, derive an almost analytic prediction for the mean value of the LLRS, and compare it to numerical experiments. While our simplified model outperforms the Wilks Theorem, it still does not predict the LLRS accurately, implying that alternative methods may be necessary for tomographic model selection. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE.
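For contrast with the tomography result above, the Wilks prediction can be checked by Monte Carlo in a simple setting with no boundary effects, where it does hold: for nested Gaussian mean models with known unit variance, the statistic −2 log Λ between "mean = 0" and "mean free" reduces to n·x̄² and follows a χ²(1) distribution with mean 1. This is an illustrative sketch only, not the tomographic setting:

```python
import random

random.seed(0)

n, trials = 100, 2000
llrs = []
for _ in range(trials):
    # Data generated under the null hypothesis (mean 0, variance 1).
    xbar = sum(random.gauss(0.0, 1.0) for _ in range(n)) / n
    llrs.append(n * xbar * xbar)  # -2 log Lambda for known unit variance

mean_llrs = sum(llrs) / trials
# Wilks: the LLRS is chi^2(1) under the null, so its mean should be near 1.
# Boundaries in state space (as in tomography) are what break this picture.
```

When the true parameter sits on a boundary, the limiting distribution is instead a mixture of χ² distributions, which is one way to see why the tomographic LLRS departs from the plain Wilks prediction.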

  10. Land-use and land-cover change in Western Ghats of India.

    PubMed

    Kale, Manish P; Chavan, Manoj; Pardeshi, Satish; Joshi, Chitiz; Verma, Prabhakar A; Roy, P S; Srivastav, S K; Srivastava, V K; Jha, A K; Chaudhari, Swapnil; Giri, Yogesh; Krishna Murthy, Y V N

    2016-07-01

    The Western Ghats (WG) of India, one of the hottest biodiversity hotspots in the world, has witnessed major land-use and land-cover (LULC) change in recent times. The present research was aimed at studying the patterns of LULC change in WG during 1985-1995-2005, understanding the major drivers that caused such change, and projecting the future (2025) spatial distribution of forest using coupled logistic regression and Markov model. The International Geosphere Biosphere Program (IGBP) classification scheme was mainly followed in LULC characterization and change analysis. The single-step Markov model was used to project the forest demand. The spatial allocation of such forest demand was based on the predicted probabilities derived through logistic regression model. The R statistical package was used to set the allocation rules. The projection model was selected based on Akaike information criterion (AIC) and area under receiver operating characteristic (ROC) curve. The actual and projected areas of forest in 2005 were compared before making projection for 2025. It was observed that forest degradation has reduced from 1985-1995 to 1995-2005. The study obtained important insights about the drivers and their impacts on LULC simulations. To the best of our knowledge, this is the first attempt where projection of future state of forest in entire WG is made based on decadal LULC and socio-economic datasets at the Taluka (sub-district) level.

  11. The optimal number of lymph nodes removed in maximizing the survival of breast cancer patients

    NASA Astrophysics Data System (ADS)

    Peng, Lim Fong; Taib, Nur Aishah; Mohamed, Ibrahim; Daud, Noorizam

    2014-07-01

    The number of lymph nodes removed is one of the important predictors of survival in breast cancer studies. Our aim is to determine the optimal number of lymph nodes to be removed for maximizing the survival of breast cancer patients. The study population consists of 873 patients with at least one axillary node involved, among 1890 patients from the University of Malaya Medical Center (UMMC) breast cancer registry. For this study, the Chi-square test of independence is performed to determine significant associations between prognostic factors and survival status, while the Wilcoxon test is used to compare the estimates of the hazard functions of two or more groups at each observed event time. Logistic regression analysis is then conducted to identify important predictors of survival. In particular, Akaike's Information Criterion (AIC) is calculated from the logistic regression model for all thresholds of nodes involved, as an alternative measure to the Wald statistic (χ2), in order to determine the optimal number of nodes that need to be removed to obtain the maximum differential in survival. The results from both measurements are compared. It is recommended that, for this particular group, a minimum of 10 nodes should be removed to maximize the survival of breast cancer patients.

  12. Determining Individual Variation in Growth and Its Implication for Life-History and Population Processes Using the Empirical Bayes Method

    PubMed Central

    Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J.; Munch, Stephan; Skaug, Hans J.

    2014-01-01

    The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish. PMID:25211603

  13. Extensions to minimum relative entropy inversion for noisy data

    NASA Astrophysics Data System (ADS)

    Ulrych, Tadeusz J.; Woodbury, Allan D.

    2003-12-01

    Minimum relative entropy (MRE) and Tikhonov regularization (TR) were compared by Neupauer et al. [Water Resour. Res. 36 (2000) 2469] on the basis of an example plume source reconstruction problem originally proposed by Skaggs and Kabala [Water Resour. Res. 30 (1994) 71] and a boxcar-like function. Although Neupauer et al. [Water Resour. Res. 36 (2000) 2469] were careful in their conclusions to note the basis of these comparisons, we show that TR does not perform well on problems in which delta-like sources are convolved with diffuse-groundwater contamination response functions, particularly in the presence of noise. We also show that it is relatively easy to estimate an appropriate value for ɛ, the hyperparameter needed in the minimum relative entropy solution for the inverse problem in the presence of noise. This can be estimated in a variety of ways, including estimation from the data themselves, analysis of data residuals, and a rigorous approach using the real cepstrum and the Akaike Information Criterion (AIC). Regardless of the approach chosen, for the sample problem reported herein, excellent resolution of multiple delta-like spikes is produced from MRE from noisy, diffuse data. The usefulness of MRE for noisy inverse problems has been demonstrated.

  14. Extensions to minimum relative entropy inversion for noisy data.

    PubMed

    Ulrych, Tadeusz J; Woodbury, Allan D

    2003-12-01

    Minimum relative entropy (MRE) and Tikhonov regularization (TR) were compared by Neupauer et al. [Water Resour. Res. 36 (2000) 2469] on the basis of an example plume source reconstruction problem originally proposed by Skaggs and Kabala [Water Resour. Res. 30 (1994) 71] and a boxcar-like function. Although Neupauer et al. [Water Resour. Res. 36 (2000) 2469] were careful in their conclusions to note the basis of these comparisons, we show that TR does not perform well on problems in which delta-like sources are convolved with diffuse-groundwater contamination response functions, particularly in the presence of noise. We also show that it is relatively easy to estimate an appropriate value for epsilon, the hyperparameter needed in the minimum relative entropy solution for the inverse problem in the presence of noise. This can be estimated in a variety of ways, including estimation from the data themselves, analysis of data residuals, and a rigorous approach using the real cepstrum and the Akaike Information Criterion (AIC). Regardless of the approach chosen, for the sample problem reported herein, excellent resolution of multiple delta-like spikes is produced from MRE from noisy, diffuse data. The usefulness of MRE for noisy inverse problems has been demonstrated.

  15. Predictive occurrence models for coastal wetland plant communities: delineating hydrologic response surfaces with multinomial logistic regression

    USGS Publications Warehouse

    Snedden, Gregg A.; Steyer, Gregory D.

    2013-01-01

    Understanding plant community zonation along estuarine stress gradients is critical for effective conservation and restoration of coastal wetland ecosystems. We related the presence of plant community types to estuarine hydrology at 173 sites across coastal Louisiana. Percent relative cover by species was assessed at each site near the end of the growing season in 2008, and hourly water level and salinity were recorded at each site Oct 2007–Sep 2008. Nine plant community types were delineated with k-means clustering, and indicator species were identified for each of the community types with indicator species analysis. An inverse relation between salinity and species diversity was observed. Canonical correspondence analysis (CCA) effectively segregated the sites across ordination space by community type, and indicated that salinity and tidal amplitude were both important drivers of vegetation composition. Multinomial logistic regression (MLR) and Akaike's Information Criterion (AIC) were used to predict the probability of occurrence of the nine vegetation communities as a function of salinity and tidal amplitude, and probability surfaces obtained from the MLR model corroborated the CCA results. The weighted kappa statistic, calculated from the confusion matrix of predicted versus actual community types, was 0.7 and indicated good agreement between observed community types and model predictions. Our results suggest that models based on a few key hydrologic variables can be valuable tools for predicting vegetation community development when restoring and managing coastal wetlands.

  16. A generic model for a single strain mosquito-transmitted disease with memory on the host and the vector.

    PubMed

    Sardar, Tridip; Rana, Sourav; Bhattacharya, Sabyasachi; Al-Khaled, Kamel; Chattopadhyay, Joydev

    2015-05-01

    In the present investigation, three mathematical models of a common single-strain mosquito-transmitted disease are considered. The first one is based on ordinary differential equations, and the other two models are based on fractional order differential equations. The proposed models are validated using published monthly dengue incidence data from two provinces of Venezuela during the period 1999-2002. We estimate several parameters of these models from the data, such as the order of the fractional derivatives (in the case of the two fractional order systems), the biting rate of the mosquito, two probabilities of infection, and mosquito recruitment and mortality rates. The basic reproduction number, R0, for the ODE system is estimated using the data. For the two fractional order systems, an upper bound for R0 is derived and its value is obtained using the published data. The force of infection and the effective reproduction number, R(t), for the three models are estimated using the data. Sensitivity analysis of the mosquito memory parameter with some important responses is worked out. We use the Akaike Information Criterion (AIC) to identify the best model among the three proposed models. It is observed that the model with memory in both the host and the vector population provides a better agreement with the epidemic data. Finally, we provide a control strategy for the vector-borne disease dengue using the memory of the host and the vector.

  17. Spectral recovery of outdoor illumination by an extension of the Bayesian inverse approach to the Gaussian mixture model.

    PubMed

    Peyvandi, Shahram; Amirshahi, Seyed Hossein; Hernández-Andrés, Javier; Nieves, Juan Luis; Romero, Javier

    2012-10-01

    The Bayesian inference approach to the inverse problem of spectral signal recovery has been extended to mixtures of Gaussian probability distributions of a training dataset in order to increase the efficiency of estimating the spectral signal from the response of a transformation system. Bayesian (BIC) and Akaike (AIC) information criteria were assessed in order to provide the Gaussian mixture model (GMM) with the optimum number of clusters within the spectral space. The spectra of 2600 solar illuminations measured in Granada (Spain) were recovered over the range of 360-830 nm from their corresponding tristimulus values using a linear model of basis functions, the Wiener inverse (WI) method, and the Bayesian inverse approach extended to the GMM (BGMM). A model of Gaussian mixtures for solar irradiance was deemed to be more appropriate than a single Gaussian distribution for representing the probability distribution of the solar spectral data. The results showed that the estimation performance of the BGMM method was better than either the linear model or the WI method for the spectral approximation of daylight from the three-dimensional tristimulus values.

  18. Determination of Original Infection Source of H7N9 Avian Influenza by Dynamical Model

    NASA Astrophysics Data System (ADS)

    Zhang, Juan; Jin, Zhen; Sun, Gui-Quan; Sun, Xiang-Dong; Wang, You-Ming; Huang, Baoxu

    2014-05-01

    H7N9, a newly emerging virus in China, circulates among poultry and humans. Although H7N9 has not caused massive outbreaks, its recurrence in the second half of 2013 makes it essential to control its spread. It is believed that the most effective control measure is to locate the original infection source and cut it off from humans. However, the original infection source and the internal transmission mechanism of the new virus are not fully understood. In order to determine the original infection source of H7N9, we establish a dynamical model with migratory bird, resident bird, domestic poultry, and human populations, and treat migratory birds, resident birds, and domestic poultry in turn as the original infection source to fit the observed dynamics during the 2013 pandemic. By comparing the data-fitting results and the corresponding Akaike Information Criterion (AIC) values, we conclude that migratory birds are most likely the original infection source. In addition, we obtain the basic reproduction number in poultry and carry out a sensitivity analysis of some parameters.

  19. Determining individual variation in growth and its implication for life-history and population processes using the empirical Bayes method.

    PubMed

    Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J; Munch, Stephan; Skaug, Hans J

    2014-09-01

    The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish.
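    The von Bertalanffy growth function used here, L(t) = L_inf * (1 - exp(-k*(t - t0))), can be fit by nonlinear least squares; a minimal sketch on synthetic trout-like data (all parameter values are illustrative, not from the study, and the random-effects structure is omitted):

```python
import numpy as np
from scipy.optimize import curve_fit

def von_bertalanffy(t, L_inf, k, t0):
    """von Bertalanffy growth: L(t) = L_inf * (1 - exp(-k*(t - t0)))."""
    return L_inf * (1.0 - np.exp(-k * (t - t0)))

rng = np.random.default_rng(0)
ages = np.arange(1, 11, dtype=float)        # ages 1..10 years
true = dict(L_inf=300.0, k=0.35, t0=-0.5)   # hypothetical values (mm, 1/yr, yr)
lengths = von_bertalanffy(ages, **true) + rng.normal(0, 3.0, ages.size)

params, _ = curve_fit(von_bertalanffy, ages, lengths, p0=(250.0, 0.2, 0.0))
L_inf_hat, k_hat, t0_hat = params
```

    In the paper's setting, k and L_inf additionally carry cohort, density, and individual random effects; this sketch shows only the underlying growth curve.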

  20. Spatial analysis of Tuberculosis in Rio de Janeiro in the period from 2005 to 2008 and associated socioeconomic factors using micro data and global spatial regression models.

    PubMed

    Magalhães, Monica de Avelar Figueiredo Mafra; Medronho, Roberto de Andrade

    2017-03-01

    The present study analyses the spatial pattern of tuberculosis (TB) from 2005 to 2008 by identifying relevant socioeconomic variables for the occurrence of the disease through spatial statistical models. This ecological study was performed in Rio de Janeiro using new cases. The census sector was used as the unit of analysis. Incidence rates were calculated, and the Local Empirical Bayesian method was used. The spatial autocorrelation was verified with Moran's Index and local indicators of spatial association (LISA). Using Spearman's test, variables with significant correlation at 5% were used in the models. In the classic multivariate regression model, the variables that fitted better to the model were proportion of head of family with an income between 1 and 2 minimum wages, proportion of illiterate people, proportion of households with people living alone and mean income of the head of family. These variables were inserted in the Spatial Lag and Spatial Error models, and the results were compared. The former exhibited the best parameters: R2 = 0.3215, Log-Likelihood = -9228, Akaike Information Criterion (AIC) = 18,468 and Schwarz Bayesian Criterion (SBC) = 18,512. The statistical methods were effective in the identification of spatial patterns and in the definition of determinants of the disease providing a view of the heterogeneity in space, allowing actions aimed more at specific populations.
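    The reported criteria follow the standard definitions AIC = -2 ln L + 2k and SBC = -2 ln L + k ln n. As a check, assuming k = 6 estimated parameters (our assumption; the abstract does not state k), these formulas reproduce the reported AIC from the reported log-likelihood:

```python
import math

def aic(log_lik, k):
    """AIC = -2*ln(L) + 2k."""
    return -2.0 * log_lik + 2.0 * k

def sbc(log_lik, k, n):
    """Schwarz Bayesian Criterion (BIC): -2*ln(L) + k*ln(n)."""
    return -2.0 * log_lik + k * math.log(n)

# With the reported log-likelihood of -9228 and an assumed k = 6,
# the AIC matches the reported 18,468.
reported_aic = aic(-9228, 6)
```

    SBC penalizes parameters more heavily than AIC whenever ln n > 2, which is why the reported SBC (18,512) exceeds the reported AIC.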

  1. Impact of Large-scale Circulation Patterns on Surface Ozone Variability in Houston-Galveston-Brazoria

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Jia, B.; Xie, Y.

    2015-12-01

    The Bermuda High (BH) is a key driver of large-scale circulation patterns for southeastern Texas and other Gulf coast states in summer, with an expected influence on surface ozone through its modulation of marine air inflow, with lower ozone background, from the Gulf of Mexico. We develop a statistical relationship through multiple linear regression (MLR) to quantify the impact of BH variations on surface ozone variability during the ozone season in the Houston-Galveston-Brazoria (HGB) area, a major ozone nonattainment region on the Gulf Coast. We find that the variability in BH location, represented by a longitude index of the BH west edge (BH-Lon) in the MLR, explains 50-60% of the year-to-year variability in monthly mean ozone over HGB for June and July during 1998-2013; the corresponding figure for August and September is 20%. An additional 30-40% of the ozone variability for August and September can be explained by the variability in BH strength, represented by two BH intensity indices (BHI) in the MLR, but its contribution is only 5% for June and not significant for July. Through stepwise regression based on the Akaike Information Criterion (AIC), the MLR model captures 58-72% of monthly ozone variability during June-September with a cross-validation R2 of 0.5. This observation-derived statistical relationship will be valuable for constraining model simulations of ozone variability attributable to large-scale circulation patterns.
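    Stepwise regression by AIC, as used here, repeatedly removes the predictor whose deletion lowers AIC the most, stopping when no deletion helps. A minimal backward-elimination sketch on synthetic data (the three predictors are hypothetical stand-ins, not the actual BH indices):

```python
import numpy as np

def gaussian_aic(y, X):
    """AIC of an OLS fit with Gaussian errors: n*ln(RSS/n) + 2k."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(1)
n = 200
# Two informative predictors (stand-ins for, e.g., BH-Lon and a BHI)
# plus one pure-noise column.
X = rng.normal(size=(n, 3))
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(0, 1.0, n)

cols = [0, 1, 2]
improved = True
while improved and len(cols) > 1:          # backward elimination by AIC
    current = gaussian_aic(y, X[:, cols])
    trials = {c: gaussian_aic(y, X[:, [d for d in cols if d != c]])
              for c in cols}
    drop = min(trials, key=trials.get)
    if trials[drop] < current:
        cols.remove(drop)
    else:
        improved = False
```

    The informative columns survive because removing them inflates the residual sum of squares far more than the 2-per-parameter AIC penalty saves.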

  2. Molecular Detection of Hematozoa Infections in Tundra Swans Relative to Migration Patterns and Ecological Conditions at Breeding Grounds

    PubMed Central

    Ramey, Andrew M.; Ely, Craig R.; Schmutz, Joel A.; Pearce, John M.; Heard, Darryl J.

    2012-01-01

    Tundra swans (Cygnus columbianus) are broadly distributed in North America, use a wide variety of habitats, and exhibit diverse migration strategies. We investigated patterns of hematozoa infection in three populations of tundra swans that breed in Alaska using satellite tracking to infer host movement and molecular techniques to assess the prevalence and genetic diversity of parasites. We evaluated whether migratory patterns and environmental conditions at breeding areas explain the prevalence of blood parasites in migratory birds by contrasting the fit of competing models formulated in an occupancy modeling framework and calculating the detection probability of the top model using Akaike Information Criterion (AIC). We described genetic diversity of blood parasites in each population of swans by calculating the number of unique parasite haplotypes observed. Blood parasite infection was significantly different between populations of Alaska tundra swans, with the highest estimated prevalence occurring among birds occupying breeding areas with lower mean daily wind speeds and higher daily summer temperatures. Models including covariates of wind speed and temperature during summer months at breeding grounds better predicted hematozoa prevalence than those that included annual migration distance or duration. Genetic diversity of blood parasites in populations of tundra swans appeared to be related to hematozoa prevalence. Our results suggest ecological conditions at breeding grounds may explain differences of hematozoa infection among populations of tundra swans that breed in Alaska. PMID:23049862

  3. Spatiotemporal analysis of aquifers salinization in coastal area of Yunlin, Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, P.-C.; Tan, Y.-C.

    2012-04-01

    In the past, temporal and spatial characteristics were often discussed separately. This study adopts regionalized variables theory and describes water quality in terms of its structure in both time and space to assess the situation in Yunlin. The study applied the Quantum Bayesian Maximum Entropy Toolbox (QtBME), a spatiotemporal statistics extension for the Quantum GIS (QGIS) platform that can estimate and map a non-stationary, non-homogeneous spatiotemporal process. Assuming that such a process can be separated into high- and low-frequency components, a kernel smoothing method is used to divide the original process into a deterministic trend and a stationary, homogeneous spatiotemporal process. The covariance model of the high-frequency process is selected objectively by the particle swarm optimization (PSO) method and Akaike's information criterion (AIC). The Bayesian maximum entropy method is then applied to spatiotemporal mapping of the variable of interest. In this study, QtBME estimated the state of aquifer salinization in the Yunlin coastal area from 1992 to 2010. Finally, the impact of rainfall on aquifer salinization was investigated.

  4. Persistent disturbance by commercial navigation alters the relative abundance of channel-dwelling fishes in a large river

    USGS Publications Warehouse

    Gutreuter, S.; Vallazza, J.M.; Knights, B.C.

    2006-01-01

    We provide the first evidence for chronic effects of disturbance by commercial vessels on the spatial distribution and abundance of fishes in the channels of a large river. Most of the world's large rivers are intensively managed to satisfy increasing demands for commercial shipping, but little research has been conducted to identify and alleviate any adverse consequences of commercial navigation. We used a combination of a gradient sampling design incorporating quasi-control areas with Akaike's information criterion (AIC)-weighted model averaging to estimate effects of disturbances by commercial vessels on fishes in the upper Mississippi River. Species density, which mainly measured species evenness, decreased with increasing disturbance frequency. The most abundant species - gizzard shad (Dorosoma cepedianum) and freshwater drum (Aplodinotus grunniens) - and the less abundant shovelnose sturgeon (Scaphirhynchus platorhynchus) and flathead catfish (Pylodictis olivaris) were seemingly unaffected by traffic disturbance. In contrast, the relative abundance of the toothed herrings (Hiodon spp.), redhorses (Moxostoma spp.), buffaloes (Ictiobus spp.), channel catfish (Ictalurus punctatus), sauger (Sander canadensis), and white bass (Morone chrysops) decreased with increasing traffic in the navigation channel. We hypothesized that the combination of alteration of hydraulic features within navigation channels and rehabilitation of secondary channels might benefit channel-dependent species. © 2006 NRC.

  5. Age and growth of chub mackerel (Scomber japonicus) in the East China and Yellow Seas using sectioned otolith samples

    NASA Astrophysics Data System (ADS)

    Li, Gang; Chen, Xinjun; Feng, Bo

    2008-11-01

    Although chub mackerel (Scomber japonicus) is a primary pelagic fish species, we have only limited knowledge on its key life history processes. The present work studied the age and growth of chub mackerel in the East China and Yellow Seas. Age was determined by interpreting and counting growth rings on the sagitta otoliths of 252 adult fish caught by the Chinese commercial purse seine fleet during the period from November 2006 to January 2007 and 150 juveniles from bottom trawl surveys on the spawning ground in May 2006. The difference between the assumed birth date of 1st April and date of capture was used to adjust the age determined from counting the number of complete translucent rings. The parameters of three commonly used growth models, the von Bertalanffy, Logistic and Gompertz models, were estimated using the maximum likelihood method. Based on the Akaike Information Criterion (AIC), the von Bertalanffy growth model was found to be the most appropriate model. The size-at-age and size-at-maturity values were also found to decrease greatly compared with the results achieved in the 1950s, which was caused by heavy exploitation over the last few decades.

  6. Degree-day accumulation influences annual variability in growth of age-0 walleye

    USGS Publications Warehouse

    Uphoff, Christopher S.; Schoenebeck, Casey W.; Hoback, W. Wyatt; Koupal, Keith D.; Pope, Kevin L.

    2013-01-01

    The growth of age-0 fishes influences survival, especially in temperate regions where size-dependent over-winter mortality can be substantial. Additional benefits of earlier maturation and greater fecundity may exist for faster growing individuals. This study correlated prey densities, growing-degree days, water-surface elevation, turbidity, and chlorophyll a with age-0 walleye Sander vitreus growth in a south-central Nebraska irrigation reservoir. Growth of age-0 walleye was variable between 2003 and 2011, with mean lengths ranging from 128 to 231 mm by fall (September 30th–October 15th). A set of a priori candidate models were used to assess the relative support of explanatory variables using Akaike's information criterion (AIC). A temperature model using the growing degree-days metric was the best supported model, describing 65% of the variability in annual mean lengths of age-0 walleye. The second and third best supported models included the variables chlorophyll a (r2 = 0.49) and larval freshwater drum density (r2 = 0.45), respectively. There have been mixed results concerning the importance of temperature effects on growth of age-0 walleye. This study supports the hypothesis that temperature is the most important predictor of age-0 walleye growth near the southwestern limits of its natural range.
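    Support across a set of a priori candidate models is usually summarized with Akaike weights, w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2), where delta_i is each model's AIC minus the minimum. A sketch with made-up AIC values for three candidate models (not the study's actual values):

```python
import math

def akaike_weights(aics):
    """Akaike weights: relative likelihood of each model, normalized to 1."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2.0) for a in aics]
    s = sum(rel)
    return [r / s for r in rel]

# Hypothetical AIC values for, say, degree-day, chlorophyll-a, and
# larval-drum-density models of age-0 walleye growth (values made up).
weights = akaike_weights([100.0, 104.0, 106.0])
```

    A weight near 1 for the best model indicates that the remaining candidates carry little support, exactly the kind of evidence ratio the abstract's "best supported model" language refers to.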

  7. Accretion Timescales from Kepler AGN

    NASA Astrophysics Data System (ADS)

    Kasliwal, Vishal P.; Vogeley, Michael S.; Richards, Gordon T.

    2015-01-01

    We constrain AGN accretion disk variability mechanisms using the optical light curves of AGN observed by Kepler. AGN optical fluxes are known to exhibit stochastic variations on timescales of hours, days, months and years. The excellent sampling properties of the original Kepler mission - high S/N ratio (~10^5), short sampling interval (30 minutes), and long sampling duration (~3.5 years) - allow for a detailed examination of the differences between the variability processes present in various sub-types of AGN such as Type I and II Seyferts, QSOs, and Blazars. We model the flux data using the Auto-Regressive Moving Average (ARMA) representation from the field of time series analysis. We use the Kalman filter to determine optimal model parameters and use the Akaike Information Criterion (AIC) to select the optimal model. We find that optical light curves from Kepler AGN cannot be fit by low order statistical models such as the popular AR(1) process or damped random walk. Kepler light curves exhibit complicated power spectra and are better modeled by higher order ARMA processes. We find that Kepler AGN typically exhibit power spectra that change from a bending power law (PSD ∝ 1/f^α) to a flat power spectrum on timescales in the range of ~5-100 days, consistent with the orbital and thermal timescales of a typical 10^7 solar-mass black hole.
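    Order selection by AIC for autoregressive models can be sketched with a plain least-squares AR(p) fit, a simplification of the Kalman-filter ARMA machinery described here (residual sample sizes differ slightly across orders, which is acceptable for a sketch):

```python
import numpy as np

def ar_aic(x, p):
    """Fit AR(p) by least squares on lagged values; return Gaussian AIC."""
    if p == 0:
        resid = x - x.mean()
    else:
        X = np.column_stack(
            [x[p - i - 1:len(x) - i - 1] for i in range(p)])  # lags 1..p
        y = x[p:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
    n = len(resid)
    sigma2 = resid @ resid / n
    return n * np.log(sigma2) + 2 * (p + 1)   # p coefficients + variance

rng = np.random.default_rng(2)
n = 2000
x = np.zeros(n)
for t in range(2, n):                         # simulate a strong AR(2) process
    x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + rng.normal()

aics = {p: ar_aic(x, p) for p in range(6)}
best_p = min(aics, key=aics.get)
```

    AIC rejects the underparameterized AR(0) and AR(1) fits decisively, mirroring the paper's finding that AR(1)-type models underfit the Kepler light curves.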

  8. Molecular detection of hematozoa infections in tundra swans relative to migration patterns and ecological conditions at breeding grounds.

    PubMed

    Ramey, Andrew M; Ely, Craig R; Schmutz, Joel A; Pearce, John M; Heard, Darryl J

    2012-01-01

    Tundra swans (Cygnus columbianus) are broadly distributed in North America, use a wide variety of habitats, and exhibit diverse migration strategies. We investigated patterns of hematozoa infection in three populations of tundra swans that breed in Alaska using satellite tracking to infer host movement and molecular techniques to assess the prevalence and genetic diversity of parasites. We evaluated whether migratory patterns and environmental conditions at breeding areas explain the prevalence of blood parasites in migratory birds by contrasting the fit of competing models formulated in an occupancy modeling framework and calculating the detection probability of the top model using Akaike Information Criterion (AIC). We described genetic diversity of blood parasites in each population of swans by calculating the number of unique parasite haplotypes observed. Blood parasite infection was significantly different between populations of Alaska tundra swans, with the highest estimated prevalence occurring among birds occupying breeding areas with lower mean daily wind speeds and higher daily summer temperatures. Models including covariates of wind speed and temperature during summer months at breeding grounds better predicted hematozoa prevalence than those that included annual migration distance or duration. Genetic diversity of blood parasites in populations of tundra swans appeared to be related to hematozoa prevalence. Our results suggest ecological conditions at breeding grounds may explain differences of hematozoa infection among populations of tundra swans that breed in Alaska.

  9. Lee-Carter state space modeling: Application to the Malaysia mortality data

    NASA Astrophysics Data System (ADS)

    Zakiyatussariroh, W. H. Wan; Said, Z. Mohammad; Norazan, M. R.

    2014-06-01

    This article presents an approach that formalizes the Lee-Carter (LC) model as a state space model. Maximum likelihood via the Expectation-Maximization (EM) algorithm was used to estimate the model. The methodology is applied to Malaysia's total population mortality data, modeled using age-specific death rates (ASDR) from 1971-2009. The fitted ASDR are compared to the actual observed values. The comparison shows that the fitted values from the LC-SS model and the original LC model are quite close. In addition, there is little difference between the root mean squared error (RMSE) and Akaike information criterion (AIC) values of the two models. The LC-SS model estimated in this study can be extended for forecasting ASDR in Malaysia; the accuracy of the LC-SS model relative to the original LC model can then be further examined by verifying forecasting power in an out-of-sample comparison.
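    The classical (non-state-space) Lee-Carter estimation, ln m(x,t) = a_x + b_x k_t, takes a_x as the row means of the log rates and b_x, k_t from the leading singular vectors of the centered matrix. A sketch on synthetic data with the usual normalization sum(b_x) = 1 (the age/year dimensions and parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n_ages, n_years = 5, 40
# Synthetic log death rates built from a known LC structure.
a = np.linspace(-6.0, -2.0, n_ages)            # age pattern a_x
b = np.array([0.3, 0.25, 0.2, 0.15, 0.1])      # age sensitivities b_x (sum 1)
kt = np.linspace(10.0, -10.0, n_years)         # declining mortality index k_t
log_m = (a[:, None] + b[:, None] * kt[None, :]
         + rng.normal(0, 0.01, (n_ages, n_years)))

# Classical SVD estimation of the Lee-Carter parameters.
a_hat = log_m.mean(axis=1)
centered = log_m - a_hat[:, None]
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
sign = np.sign(U[:, 0].sum())                  # resolve SVD sign ambiguity
b_hat = U[:, 0] * sign
k_hat = s[0] * Vt[0] * sign
b_hat, k_hat = b_hat / b_hat.sum(), k_hat * b_hat.sum()  # enforce sum(b)=1
```

    The state-space/EM formulation in the paper replaces this two-stage SVD fit with joint likelihood-based estimation, but both target the same a_x, b_x, k_t decomposition.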

  10. Determination of original infection source of H7N9 avian influenza by dynamical model.

    PubMed

    Zhang, Juan; Jin, Zhen; Sun, Gui-Quan; Sun, Xiang-Dong; Wang, You-Ming; Huang, Baoxu

    2014-05-02

    H7N9, a newly emerging virus in China, circulates among poultry and humans. Although H7N9 has not caused massive outbreaks, its recurrence in the second half of 2013 makes it essential to control its spread. It is believed that the most effective control measure is to locate the original infection source and cut it off from humans. However, the original infection source and the internal transmission mechanism of the new virus are not fully understood. In order to determine the original infection source of H7N9, we establish a dynamical model with migratory bird, resident bird, domestic poultry, and human populations, and treat migratory birds, resident birds, and domestic poultry in turn as the original infection source to fit the observed dynamics during the 2013 pandemic. By comparing the data-fitting results and the corresponding Akaike Information Criterion (AIC) values, we conclude that migratory birds are most likely the original infection source. In addition, we obtain the basic reproduction number in poultry and carry out a sensitivity analysis of some parameters.

  11. Molecular detection of hematozoa infections in tundra swans relative to migration patterns and ecological conditions at breeding grounds

    USGS Publications Warehouse

    Ramey, Andrew M.; Ely, Craig R.; Schmutz, Joel A.; Pearce, John M.; Heard, Darryl J.

    2012-01-01

    Tundra swans (Cygnus columbianus) are broadly distributed in North America, use a wide variety of habitats, and exhibit diverse migration strategies. We investigated patterns of hematozoa infection in three populations of tundra swans that breed in Alaska using satellite tracking to infer host movement and molecular techniques to assess the prevalence and genetic diversity of parasites. We evaluated whether migratory patterns and environmental conditions at breeding areas explain the prevalence of blood parasites in migratory birds by contrasting the fit of competing models formulated in an occupancy modeling framework and calculating the detection probability of the top model using Akaike Information Criterion (AIC). We described genetic diversity of blood parasites in each population of swans by calculating the number of unique parasite haplotypes observed. Blood parasite infection was significantly different between populations of Alaska tundra swans, with the highest estimated prevalence occurring among birds occupying breeding areas with lower mean daily wind speeds and higher daily summer temperatures. Models including covariates of wind speed and temperature during summer months at breeding grounds better predicted hematozoa prevalence than those that included annual migration distance or duration. Genetic diversity of blood parasites in populations of tundra swans appeared to be related to hematozoa prevalence. Our results suggest ecological conditions at breeding grounds may explain differences of hematozoa infection among populations of tundra swans that breed in Alaska.

  12. IDF relationships using bivariate copula for storm events in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Ariff, N. M.; Jemain, A. A.; Ibrahim, K.; Wan Zin, W. Z.

    2012-11-01

    Intensity-duration-frequency (IDF) curves are used in many hydrologic designs for water management and flood prevention. The IDF curves available in Malaysia are obtained from a univariate analysis approach that considers only the intensity of rainfall at fixed time intervals. Because several rainfall variables are correlated with each other, such as intensity and duration, this paper aims to derive IDF points for storm events in Peninsular Malaysia by means of bivariate frequency analysis. This is achieved by modeling the relationship between storm intensity and duration using the copula method. Four types of copulas, namely the Ali-Mikhail-Haq (AMH), Frank, Gaussian and Farlie-Gumbel-Morgenstern (FGM) copulas, are considered because the correlation between storm intensity, I, and duration, D, is negative and these copulas are appropriate for negatively dependent variables. The correlations are obtained by means of Kendall's τ estimation. The analysis was performed on twenty rainfall stations with hourly data across Peninsular Malaysia. Using Akaike's information criterion (AIC) for testing goodness of fit, both the Frank and Gaussian copulas are found to be suitable for representing the relationship between I and D. The IDF points found by the copula method are compared to the IDF curves yielded by the typical empirical IDF formula of the univariate approach. This study indicates that storm intensities obtained from both methods are in agreement with each other for any given storm duration and for various return periods.
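    Estimating Kendall's τ and fitting a one-parameter copula by maximum likelihood (then scoring it with AIC) can be sketched as follows. The data here are simulated from a Gaussian copula purely for illustration, and the FGM density c(u,v) = 1 + θ(1-2u)(1-2v) is fit by a grid search over θ ∈ [-1, 1]:

```python
import numpy as np
from scipy.stats import kendalltau, norm

rng = np.random.default_rng(4)
# Hypothetical storm (intensity, duration) ranks with negative dependence,
# generated from a Gaussian copula with rho = -0.5 (illustrative only).
n, rho = 500, -0.5
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u, v = norm.cdf(z[:, 0]), norm.cdf(z[:, 1])

tau, _ = kendalltau(u, v)                  # negative, as for I vs D

def fgm_loglik(theta, u, v):
    """Log-likelihood of the FGM copula density 1 + theta*(1-2u)*(1-2v)."""
    return np.sum(np.log(1.0 + theta * (1 - 2 * u) * (1 - 2 * v)))

grid = np.linspace(-1, 1, 201)
theta_hat = grid[np.argmax([fgm_loglik(t, u, v) for t in grid])]
aic_fgm = 2 * 1 - 2 * fgm_loglik(theta_hat, u, v)   # one copula parameter
```

    The same AIC can be computed for AMH, Frank, and Gaussian copula likelihoods, and the lowest value picks the dependence model, which is the selection step the paper describes.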

  13. Butyltins, trace metals and morphological variables in surf scoter (Melanitta perspicillata) wintering on the south coast of British Columbia, Canada.

    PubMed

    Elliott, J E; Harris, M L; Wilson, L K; Smith, B D; Batchelor, S P; Maguire, J

    2007-09-01

    From 1998 to 2001 we examined spatial and temporal variation in uptake of contaminants by surf scoters (Melanitta perspicillata) in the Georgia Basin region of the Pacific coast of Canada. Samples were collected during late fall and early spring at industrialized and reference locations, carcasses examined, and tissues collected for histology, biomarkers, and contaminant analyses. Scoters from both Vancouver and Victoria harbours had significantly higher hepatic concentrations of Σbutyltins than birds from a reference site. In adult male surf scoters, hepatic Σbutyltins increased over the winter at two sites (p=0.02, n=26), while mercury increased (p=0.03, n=15) and selenium decreased at one site (p=0.001, n=15). Body condition decreased over the winter at both the treatment site, Howe Sound (p<0.0001, n=12), and the reference site, Baynes Sound (p=0.02, n=15). Multiple regression analysis using Akaike's information criterion (AICc) showed an association between hepatic butyltin concentrations and overall body condition (p=0.06, r=-0.237).

  14. Application of Parametric Models to a Survival Analysis of Hemodialysis Patients

    PubMed Central

    Montaseri, Maryam; Charati, Jamshid Yazdani; Espahbodi, Fateme

    2016-01-01

    Background Hemodialysis is the most common renal replacement therapy in patients with end stage renal disease (ESRD). Objectives The present study compared the performance of various parametric models in a survival analysis of hemodialysis patients. Methods This study consisted of 270 hemodialysis patients who were referred to Imam Khomeini and Fatima Zahra hospitals between November 2007 and November 2012. The Akaike information criterion (AIC) and residuals review were used to compare the performance of the parametric models. The computations were done using STATA Software, with significance accepted at a level of 0.05. Results The results of a multivariate analysis of the variables in the parametric models showed that the mean serum albumin and the clinic attended were the most important predictors in the survival of the hemodialysis patients (P < 0.05). Among the parametric models tested, the results indicated that the performance of the Weibull model was the highest. Conclusions Parametric models may provide complementary data for clinicians and researchers about how risks vary over time. The Weibull model seemed to show the best fit among the parametric models of the survival of hemodialysis patients. PMID:27896235
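    Comparing parametric survival models by AIC can be sketched as below. The example uses simulated (uncensored) Weibull survival times; a real hemodialysis analysis must additionally handle censoring, which this sketch ignores:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
# Hypothetical survival times in months: Weibull with shape 1.8 (illustrative).
times = stats.weibull_min.rvs(1.8, scale=40.0, size=300, random_state=rng)

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

# Weibull fit (location fixed at 0, as usual for survival times).
c_hat, loc, scale_hat = stats.weibull_min.fit(times, floc=0)
ll_weib = np.sum(stats.weibull_min.logpdf(times, c_hat, loc, scale_hat))
aic_weib = aic(ll_weib, 2)                 # shape + scale

# Exponential fit for comparison (a Weibull with shape fixed at 1).
loc_e, scale_e = stats.expon.fit(times, floc=0)
ll_exp = np.sum(stats.expon.logpdf(times, loc_e, scale_e))
aic_exp = aic(ll_exp, 1)                   # scale only
```

    Because the exponential is nested in the Weibull, the Weibull always has at least as high a likelihood; AIC asks whether the gain justifies the extra shape parameter, which is how the paper ranked its candidate parametric models.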

  15. Negative binomial models for abundance estimation of multiple closed populations

    USGS Publications Warehouse

    Boyce, Mark S.; MacKenzie, Darry I.; Manly, Bryan F.J.; Haroldson, Mark A.; Moody, David W.

    2001-01-01

    Counts of uniquely identified individuals in a population offer opportunities to estimate abundance. However, for various reasons such counts may be burdened by heterogeneity in the probability of being detected. Theoretical arguments and empirical evidence demonstrate that the negative binomial distribution (NBD) is a useful characterization for counts from biological populations with heterogeneity. We propose a method that focuses on estimating multiple populations by simultaneously using a suite of models derived from the NBD. We used this approach to estimate the number of female grizzly bears (Ursus arctos) with cubs-of-the-year in the Yellowstone ecosystem, for each year, 1986-1998. Akaike's Information Criteria (AIC) indicated that a negative binomial model with a constant level of heterogeneity across all years was best for characterizing the sighting frequencies of female grizzly bears. A lack-of-fit test indicated the model adequately described the collected data. Bootstrap techniques were used to estimate standard errors and 95% confidence intervals. We provide a Monte Carlo technique, which confirms that the Yellowstone ecosystem grizzly bear population increased during the period 1986-1998.
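    Preferring a negative binomial over a Poisson model for heterogeneous counts can itself be checked with AIC; a sketch on simulated overdispersed sighting counts (all parameters are illustrative):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(6)
# Hypothetical sighting counts: NB with r=3, p=0.3 (mean 7, variance ~23).
counts = stats.nbinom.rvs(3, 0.3, size=500, random_state=rng)

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

# Poisson: the MLE of the rate is the sample mean.
lam = counts.mean()
aic_pois = aic(np.sum(stats.poisson.logpmf(counts, lam)), 1)

# Negative binomial: maximize the log-likelihood over (r, p) numerically.
def nb_negloglik(params):
    r, p = params
    return -np.sum(stats.nbinom.logpmf(counts, r, p))

res = optimize.minimize(nb_negloglik, x0=(1.0, 0.5),
                        bounds=[(1e-6, None), (1e-6, 1 - 1e-6)])
aic_nb = aic(-res.fun, 2)
```

    When counts are genuinely overdispersed (variance exceeding the mean), the NB's extra heterogeneity parameter pays for itself, which is the pattern the grizzly bear sighting frequencies showed.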

  16. High expression of galectin-7 associates with poor overall survival in patients with non-metastatic clear-cell renal cell carcinoma

    PubMed Central

    Xu, Zhiying; Zhang, Guodong; Liu, Zheng; Fu, Hangcheng; Wang, Zewei; Liu, Haiou; Xu, Jiejie

    2016-01-01

    Background Galectin-7 plays a controversial role in tumor progression: it can either suppress tumor growth or induce chemoresistance, depending on tumor histology type. The aim was to appraise the effect of galectin-7 expression on the overall survival (OS) of patients with non-metastatic clear cell renal cell carcinoma (ccRCC) following surgery. Results High galectin-7 expression was specifically correlated with necrosis (P = 0.015). Multivariate analysis confirmed galectin-7 as an independent prognosticator for OS (P = 0.005). High galectin-7 expression suggested poor OS (P < 0.001), particularly within the UISS intermediate- and high-score groups. Notably, the predictive accuracy of the traditional prognostic scores was improved when combined with galectin-7 expression. Materials and Methods We retrospectively enrolled 416 patients who underwent nephrectomy at a single institute between 2008 and 2009 and measured their intratumoral galectin-7 expression by immunohistochemistry. The Kaplan-Meier method was used to plot survival curves, and multivariate Cox regression analysis identified potential independent prognostic factors for OS. A nomogram was constructed, with the concordance index (C-index) and Akaike's information criterion (AIC) used to appraise the prognostic accuracy of the different models. Conclusions High galectin-7 expression is an independent adverse predictor of survival. Evaluation of galectin-7 could help guide postsurgical management of non-metastatic ccRCC patients. PMID:27259255

  17. Estimating annual survival and movement rates of adults within a metapopulation of roseate terns

    USGS Publications Warehouse

    Spendelow, J.A.; Nichols, J.D.; Nisbet, I.C.T.; Hays, H.; Cormons, G.D.; Burger, J.; Safina, C.; Hines, J.E.; Gochfeld, M.

    1995-01-01

    Several multistratum capture-recapture models were used to test various hypotheses about possible geographic and temporal variation in survival, movement, and recapture/resighting probabilities of 2399 adult Roseate Terns (Sterna dougallii) color-banded from 1988 to 1992 at the sites of the four largest breeding colonies of this species in the northeastern USA. Linear-logistic ultrastructural models also were developed to investigate possible correlates of geographic variation in movement probabilities. Based on goodness-of-fit tests and comparisons of Akaike's Information Criterion (AIC) values, the fully parameterized model (Model A) with time- and location-specific survival, movement, and capture probabilities, was selected as the most appropriate model for this metapopulation structure. With almost all movement accounted for, on average >90% of the surviving adults from each colony site returned to the same site the following year. Variations in movement probabilities were more closely associated with the identity of the destination colony site than with either the identity of the colony site of origin or the distance between colony sites. The average annual survival estimates (0.74-0.84) of terns from all four sites indicate a high rate of annual mortality relative to that of other species of marine birds.
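    Record 17's ranking of candidate models by AIC rests on simple arithmetic: compute AIC = 2k - 2 ln(L) for each fitted model, then convert AIC differences into Akaike weights that measure the relative strength of evidence. A minimal sketch (the model names, log-likelihoods, and parameter counts below are hypothetical, not values from the tern study):

```python
import math

def aic(log_lik, k):
    """Akaike's Information Criterion: AIC = 2k - 2*ln(L)."""
    return 2 * k - 2 * log_lik

# Hypothetical candidates: (model name, maximized log-likelihood, parameter count)
models = [
    ("A: time- and site-specific", -480.0, 40),
    ("B: site-specific only",      -510.0, 12),
    ("C: constant",                -540.0,  3),
]

scores = {name: aic(ll, k) for name, ll, k in models}
best = min(scores.values())
# Akaike weights: strength of evidence for each model, summing to 1
raw = {name: math.exp(-0.5 * (s - best)) for name, s in scores.items()}
total = sum(raw.values())
weights = {name: w / total for name, w in raw.items()}
```

    The model with the lowest AIC receives the largest weight; the weights can then be used directly for model averaging.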

  18. Determination of Original Infection Source of H7N9 Avian Influenza by Dynamical Model

    PubMed Central

    Zhang, Juan; Jin, Zhen; Sun, Gui-Quan; Sun, Xiang-Dong; Wang, You-Ming; Huang, Baoxu

    2014-01-01

    H7N9, a newly emerging virus in China, circulates among poultry and humans. Although H7N9 has not caused massive outbreaks, its recurrence in the second half of 2013 makes it essential to control its spread. It is believed that the most effective control measure is to locate the original infection source and cut off the source of infection from humans. However, the original infection source and the internal transmission mechanism of the new virus are not fully understood. In order to determine the original infection source of H7N9, we establish a dynamical model with migratory bird, resident bird, domestic poultry and human populations, and treat migratory birds, resident birds, and domestic poultry as the original infection source, respectively, to fit the true dynamics during the 2013 pandemic. By comparing the data fitting results and corresponding Akaike Information Criterion (AIC) values, we conclude that migratory birds are most likely the original infection source. In addition, we obtain the basic reproduction number in poultry and carry out sensitivity analysis of some parameters. PMID:24786135
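    When candidate models are fit by least squares, as in record 18's comparison of infection-source scenarios, AIC can be computed from the residual sum of squares under a Gaussian-error assumption. A sketch with invented numbers (the scenario names, RSS values, and parameter counts are illustrative, not the study's fits):

```python
import math

def aic_from_rss(rss, n, k):
    """AIC under i.i.d. Gaussian errors, up to an additive constant:
    AIC = n*ln(RSS/n) + 2k."""
    return n * math.log(rss / n) + 2 * k

n_obs = 120  # hypothetical number of observed case counts
# Hypothetical fits, one per candidate infection source:
# (scenario, residual sum of squares, fitted parameter count)
fits = [
    ("migratory birds",  85.0, 6),
    ("resident birds",  140.0, 6),
    ("domestic poultry", 130.0, 7),
]
ranked = sorted((aic_from_rss(rss, n_obs, k), name) for name, rss, k in fits)
best_source = ranked[0][1]  # lowest AIC is the best-supported scenario
```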

  19. Predictive Information: Status or Alert Information?

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Bruneau, Daniel; Press, Hayes N.

    2008-01-01

    Previous research investigating the efficacy of predictive information for detecting and diagnosing aircraft system failures found that subjects like to have predictive information concerning when a parameter will reach an alert range. This research focused on where the predictive information should be located: whether the information should be more closely associated with the parameter information or with the alert information. Each subject saw 3 forms of predictive information: (1) none, (2) a predictive alert message, and (3) predictive information on the status display. Generally, subjects performed better and preferred to have predictive information available, although the difference between status and alert predictive information was minimal. Overall, for detection and recalling what happened, status predictive information is best; however, for diagnosis, alert predictive information holds a slight edge.

  20. Situation Awareness Information Dominance & Information Warfare.

    DTIC Science & Technology

    1997-02-01

    Information warfare and its primary objective of achieving information dominance over enemy forces have arisen as a major area of emphasis for future military actions. The concept of information dominance and the issues involved in attaining it are explored through a model of situation awareness, so that directions for the development of systems to support the goal of information dominance can be established.

  1. Information for Industrial Development,

    DTIC Science & Technology

    Information processing, Technology transfer, Industries, Information transfer, Industrial engineering, Planning, Research management, Investments, Operation, Industrial production, Data bases, Information systems, User needs, Symposia

  2. MMA, A Computer Code for Multi-Model Analysis

    USGS Publications Warehouse

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. 
Many applications of MMA will …
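    The four default discrimination criteria in MMA differ mainly in their penalty terms. A sketch of AIC, AICc, and BIC, plus the conversion of criterion scores into posterior model probabilities (the log-likelihoods and sample size are hypothetical; KIC is omitted because it additionally requires the Fisher information matrix):

```python
import math

def aic(log_lik, k):
    """AIC = 2k - 2*ln(L)."""
    return 2 * k - 2 * log_lik

def aicc(log_lik, k, n):
    """Second-order-bias-corrected AIC for small samples."""
    return aic(log_lik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(log_lik, k, n):
    """BIC = k*ln(n) - 2*ln(L); penalizes parameters more heavily than AIC."""
    return k * math.log(n) - 2 * log_lik

def model_probs(scores):
    """Turn information-criterion scores into posterior model probabilities."""
    best = min(scores)
    raw = [math.exp(-0.5 * (s - best)) for s in scores]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical calibrated models: (maximized log-likelihood, parameter count)
candidates = [(-210.0, 4), (-205.0, 7), (-204.0, 11)]
n = 60
aicc_scores = [aicc(ll, k, n) for ll, k in candidates]
bic_scores = [bic(ll, k, n) for ll, k in candidates]
```

    With these hypothetical numbers, AICc favors the 7-parameter model while BIC favors the 4-parameter one, which illustrates why MMA reports several criteria side by side rather than a single answer.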

  3. [Teacher Referral Information and Statistical Information Forms.

    ERIC Educational Resources Information Center

    Short, N. J.

    This rating information form, used to refer children to the PIC program, elicits information concerning the child's emotional, cognitive, and personality development. See TM 001 111 for details of the program in which it is used. (DLG)

  4. Quantify information system benefits

    SciTech Connect

    Koppel, L.B.

    1995-06-01

    What are information systems and how do they relate to control systems? How do information systems produce benefits in hydrocarbon processing? What are some examples of benefit-generating information system applications? Information System Benefits (ISBEN) is a structured methodology for estimating information system benefits in hydrocarbon processing. The paper discusses information and control systems, information system benefits and applications, objectives, strategies and measures of ISBEN, ISBEN business drivers, ISBEN database, ISBEN methodology, and implementation.

  5. A Career in Information.

    ERIC Educational Resources Information Center

    Debons, Anthony; And Others

    The best sources of information about educational requirements for careers in information sciences are the institutions that offer training programs in such careers. The American Society for Information Science maintains a file of information on institutions offering training programs in information science. This pamphlet is intended for general…

  6. Informed consent - adults

    MedlinePlus

    ... state) What Should Occur During the Informed Consent Process? When asking for your informed consent, your doctor ... What is Your Role in the Informed Consent Process? You are an important member of your health ...

  7. Advanced information society (1)

    NASA Astrophysics Data System (ADS)

    Ohira, Gosei

    In considering the relationship between informationization and industrial structure, this paper analyzes factors such as the information revolution, the informationization of industries, and the industrialization of information as background to the informationization of Japanese society. Next, some information indicators are introduced: the information coefficient of households, which is the share of information-related expenditure; the information coefficient of industry, which is the share of information-related cost in the total cost of production; and the information transmission census developed by the Ministry of Posts and Telecommunications. Then a new information indicator by the Economic Planning Agency, the electronic info-communication indicator, is presented. In this study, information activities are defined as producing messages or supplying services for the processing, storage, or sale of messages using electronic information equipment. International comparisons of the information labor force are also presented.

  8. Energy information sheets

    SciTech Connect

    1995-07-01

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the public. The Energy Information Sheets was developed to provide general information on various aspects of fuel production, prices, consumption, and capability. Additional information on related subject matter can be found in other Energy Information Administration (EIA) publications as referenced at the end of each sheet.

  9. Information about Musculoskeletal Conditions

    MedlinePlus


  10. Change in BMI Accurately Predicted by Social Exposure to Acquaintances

    PubMed Central

    Oloritun, Rahman O.; Ouarda, Taha B. M. J.; Moturu, Sai; Madan, Anmol; Pentland, Alex (Sandy); Khayal, Inas

    2013-01-01

    Research has mostly focused on obesity and not on processes of BMI change more generally, although these may be key factors that lead to obesity. Studies have suggested that obesity is affected by social ties. However, these studies used survey-based data collection techniques that may be biased toward selecting only close friends and relatives. In this study, mobile phone sensing techniques were used to routinely capture social interaction data in an undergraduate dorm. By automating the capture of social interaction data, the limitations of self-reported social exposure data are avoided. This study attempts to understand and develop a model that best describes change in BMI using social interaction data. We evaluated a cohort of 42 college students in a co-located university dorm, using social interaction data automatically captured via mobile phones together with survey-based health-related information. We determined the most predictive variables for change in BMI using the least absolute shrinkage and selection operator (LASSO) method. The selected variables, together with gender, healthy diet category, and ability to manage stress, were used to build multiple linear regression models that estimate the effect of exposure and individual factors on change in BMI. We identified the best model using the Akaike Information Criterion (AIC) and R2. This study found a model that explains 68% (p<0.0001) of the variation in change in BMI. The model combined social interaction data, especially from acquaintances, and personal health-related information to explain change in BMI. This is the first study taking into account both interactions at different levels of social closeness and personal health-related information. Social interactions with acquaintances accounted for more than half the variation in change in BMI. This suggests the importance not only of individual health information but also of social interactions with the people we are exposed to, even people we may not consider close friends.
PMID
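    The workflow in record 10 (select predictors, fit nested linear regressions, compare them by AIC and R²) can be sketched with NumPy alone. Everything below is simulated, not the study's data or code; the predictor names are hypothetical labels, and AIC is computed from the residual sum of squares under a Gaussian-error assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 42  # cohort size matching the study; the data themselves are simulated
# Hypothetical columns: acquaintance exposure, gender, diet score, stress management
X_full = rng.normal(size=(n, 4))
y = 0.9 * X_full[:, 0] + 0.3 * X_full[:, 1] + rng.normal(scale=0.5, size=n)

def fit_stats(X, y):
    """Ordinary least squares; returns (R^2, Gaussian-error AIC)."""
    Xd = np.column_stack([np.ones(len(y)), X])   # prepend intercept column
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    rss = float(resid @ resid)
    tss = float(((y - y.mean()) ** 2).sum())
    k = Xd.shape[1] + 1                          # coefficients + error variance
    return 1 - rss / tss, len(y) * np.log(rss / len(y)) + 2 * k

r2_small, aic_small = fit_stats(X_full[:, :1], y)  # exposure only
r2_full, aic_full = fit_stats(X_full, y)           # all candidate predictors
```

    R² never decreases when predictors are added to a nested OLS model, which is why an AIC-style penalty is needed to compare models of different sizes fairly.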

  11. The Information Age and Information Development.

    ERIC Educational Resources Information Center

    Hughes, Graeme C.; And Others

    1991-01-01

    This theme issue includes eight articles that discuss the information age, the impact of information technology, and the role of libraries. Highlights include libraries in Brazil, Indonesia, and Nigeria; the Universal Availability of Publications (UAP) program; community literacy; database development in Malawi; and the Regional Energy Resources…

  12. Information Resource Management for Industrial Information Officers.

    ERIC Educational Resources Information Center

    Dosa, Marta

    This paper argues that the function of educational programs is to convey a sense of reality and an understanding of the open-endedness of information needs and situations; only such a reality orientation can instill the necessary flexibility in information professionals for effectively managing change. There is a growing consensus among…

  13. Intelligence, Information Technology, and Information Warfare.

    ERIC Educational Resources Information Center

    Davies, Philip H. J.

    2002-01-01

    Addresses the use of information technology for intelligence and information warfare in the context of national security and reviews the status of clandestine collection. Discusses hacking, human agent collection, signal interception, covert action, counterintelligence and security, and communications between intelligence producers and consumers…

  14. The Information Highway.

    ERIC Educational Resources Information Center

    Gore, Al

    1994-01-01

    The new information marketplace is based on a network of wide, two-way highways comprised of private owners and developers, makers of information appliances (televisions, telephones, computers, and combinations of all three), information providers (local broadcasters, digital libraries, information service providers, and entrepreneurs), and…

  15. Reinventing Information Services.

    ERIC Educational Resources Information Center

    Farkas-Conn, Irene; And Others

    1996-01-01

    This special section includes seven articles that discuss reinventing information services. Highlights include linking information services to business strategies; meeting client initiatives; information services at the Ottawa laboratory of Bell-Northern Research (BNR); product service strategies; information management and transition economies;…

  16. Information Services. Miscellaneous Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations, The Hague (Netherlands).

    Papers on audiovisual information resources, the history of technical libraries, online legal information, and information technology for schoolchildren, which were presented at the 1983 International Federation of Library Associations (IFLA) conference, include: (1) "Continuing Issues in the Provision of Audiovisual Information Resources -…

  17. Seymour: Maryland's Information Retriever.

    ERIC Educational Resources Information Center

    Smith, Barbara G.

    1994-01-01

    Explains the development of an electronic information network in Maryland called Seymour that offers bibliographic records; full-text databases; community information databases; the ability to request information and materials; local, state, and federal information; and access to the Internet. Policy issues are addressed, including user fees and…

  18. Mission Medical Information System

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.; Joe, John C.; Follansbee, Nicole M.

    2008-01-01

    This viewgraph presentation gives an overview of the Mission Medical Information System (MMIS). The topics include: 1) What is MMIS?; 2) MMIS Goals; 3) Terrestrial Health Information Technology Vision; 4) NASA Health Information Technology Needs; 5) Mission Medical Information System Components; 6) Electronic Medical Record; 7) Longitudinal Study of Astronaut Health (LSAH); 8) Methods; and 9) Data Submission Agreement (example).

  19. Effects of ADC Nonlinearity on the Spurious Dynamic Range Performance of Compressed Sensing

    PubMed Central

    Tian, Pengwu; Yu, Hongyi

    2014-01-01

    Analog-to-information converters (AICs) play an important role in compressed sensing systems; they have the potential to significantly extend the capabilities of conventional analog-to-digital converters. This paper evaluates the impact of AIC nonlinearity on the dynamic performance of a practical compressed sensing system, including the nonlinearity introduced by quantization as well as circuit non-idealities. It presents intuitive yet quantitative insights into the harmonics of the quantization output of an AIC, and the effect of other AIC nonlinearities on spurious-free dynamic range (SFDR) performance is also analyzed. The analysis and simulation results demonstrate that, compared with a conventional ADC-based system, the measurement process decorrelates the input signal and the quantization error and alleviates the effect of other AIC nonlinearities, which results in a dramatic increase in spurious-free dynamic range (SFDR). PMID:24895645

  20. CAREERS IN INFORMATION SCIENCE,

    DTIC Science & Technology

    Information Science. Sets forth that Information Science is concerned with the properties, behavior, and flow of information...Describes how it is used, both by individuals and in large systems. Discusses the opportunities in Information Science and outlines three relatively...for participation in these career areas. Concludes that Information Science is a new but rapidly growing field pushing the frontiers of human knowledge and, thus, contributing to human well-being and progress.

  1. Aquaculture information package

    SciTech Connect

    Boyd, T.; Rafferty, K.

    1998-08-01

    This package of information is intended to provide background information to developers of geothermal aquaculture projects. The material is divided into eight sections and includes information on market and price information for typical species, aquaculture water quality issues, typical species culture information, pond heat loss calculations, an aquaculture glossary, regional and university aquaculture offices and state aquaculture permit requirements. A bibliography containing 68 references is also included.

  2. Testing the consistency of wildlife data types before combining them: the case of camera traps and telemetry.

    PubMed

    Popescu, Viorel D; Valpine, Perry; Sweitzer, Rick A

    2014-04-01

    Wildlife data gathered by different monitoring techniques are often combined to estimate animal density. However, methods to check whether different types of data provide consistent information (i.e., can information from one data type be used to predict responses in the other?) before combining them are lacking. We used generalized linear models and generalized linear mixed-effects models to relate camera trap probabilities for marked animals to independent space use from telemetry relocations using 2 years of data for fishers (Pekania pennanti) as a case study. We evaluated (1) camera trap efficacy by estimating how camera detection probabilities are related to nearby telemetry relocations and (2) whether home range utilization density estimated from telemetry data adequately predicts camera detection probabilities, which would indicate consistency of the two data types. The number of telemetry relocations within 250 and 500 m from camera traps predicted detection probability well. For the same number of relocations, females were more likely to be detected during the first year. During the second year, all fishers were more likely to be detected during the fall/winter season. Models predicting camera detection probability and photo counts solely from telemetry utilization density had the best or nearly best Akaike Information Criterion (AIC), suggesting that telemetry and camera traps provide consistent information on space use. Given the same utilization density, males were more likely to be photo-captured due to larger home ranges and higher movement rates. Although methods that combine data types (spatially explicit capture-recapture) make simple assumptions about home range shapes, it is reasonable to conclude that in our case, camera trap data do reflect space use in a manner consistent with telemetry data. However, differences between the 2 years of data suggest that camera efficacy is not fully consistent across ecological conditions and make the case …

  3. Selecting a distributional assumption for modelling relative densities of benthic macroinvertebrates

    USGS Publications Warehouse

    Gray, B.R.

    2005-01-01

    The selection of a distributional assumption suitable for modelling macroinvertebrate density data is typically challenging. Macroinvertebrate data often exhibit substantially larger variances than expected under a standard count assumption, that of the Poisson distribution. Such overdispersion may derive from multiple sources, including heterogeneity of habitat (historically and spatially), differing life histories for organisms collected within a single collection in space and time, and autocorrelation. Taken to extreme, heterogeneity of habitat may be argued to explain the frequent large proportions of zero observations in macroinvertebrate data. Sampling locations may consist of habitats defined qualitatively as either suitable or unsuitable. The former category may yield random or stochastic zeroes and the latter structural zeroes. Heterogeneity among counts may be accommodated by treating the count mean itself as a random variable, while extra zeroes may be accommodated using zero-modified count assumptions, including zero-inflated and two-stage (or hurdle) approaches. These and linear assumptions (following log- and square root-transformations) were evaluated using 9 years of mayfly density data from a 52 km, ninth-order reach of the Upper Mississippi River (n = 959). The data exhibited substantial overdispersion relative to that expected under a Poisson assumption (i.e. variance:mean ratio = 23 ≫ 1), and 43% of the sampling locations yielded zero mayflies. Based on the Akaike Information Criterion (AIC), count models were improved most by treating the count mean as a random variable (via a Poisson-gamma distributional assumption) and secondarily by zero modification (i.e. improvements in AIC values = 9184 units and 47-48 units, respectively). 
Zeroes were underestimated by the Poisson, log-transform and square root-transform models, slightly by the standard negative binomial model, but not by the zero-modified models (61%, 24%, 32%, 7%, and 0%, respectively).
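    The overdispersion diagnostics behind record 3 are quick to compute: compare the variance:mean ratio to the Poisson value of 1, and the observed zero fraction to the zero probability a Poisson with the same mean would imply. A sketch on simulated zero-inflated counts (illustrative only, not the Upper Mississippi data; the mixture parameters are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 959  # sample size matching the study; the counts below are simulated
suitable = rng.random(n) > 0.43                   # ~43% structural zeros
# Gamma-mixed (negative binomial) counts at suitable sites, zero elsewhere
counts = np.where(suitable, rng.negative_binomial(0.5, 0.1, size=n), 0)

vm_ratio = counts.var() / counts.mean()           # Poisson predicts ~1
zero_frac = (counts == 0).mean()                  # observed zero fraction
poisson_zero = np.exp(-counts.mean())             # Poisson-implied zero fraction
```

    A variance:mean ratio far above 1, together with many more zeros than `poisson_zero` predicts, is exactly the signature that motivates Poisson-gamma and zero-modified models.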

  4. A water quality index model using stepwise regression and neural networks models for the Piabanha River basin in Rio de Janeiro, Brazil

    NASA Astrophysics Data System (ADS)

    Villas Boas, M. D.; Olivera, F.; Azevedo, J. S.

    2013-12-01

    The evaluation of water quality through 'indexes' is widely used in environmental sciences. There are a number of methods available for calculating water quality indexes (WQI), usually based on site-specific parameters. In Brazil, WQI were initially used in the 1970s and were adapted from the methodology developed in association with the National Science Foundation (Brown et al., 1970). Specifically, the WQI 'IQA/SCQA', developed by the Institute of Water Management of Minas Gerais (IGAM), is estimated based on nine parameters: Temperature Range, Biochemical Oxygen Demand, Fecal Coliforms, Nitrate, Phosphate, Turbidity, Dissolved Oxygen, pH and Electrical Conductivity. The goal of this study was to develop a model for calculating the IQA/SCQA for the Piabanha River basin in the State of Rio de Janeiro (Brazil), using only the parameters measurable by a Multiparameter Water Quality Sonde (MWQS) available in the study area. These parameters are: Dissolved Oxygen, pH and Electrical Conductivity. The use of this model will allow the water quality monitoring network in the basin to be extended without requiring significant additional resources, since water quality measurement with a MWQS is less expensive than the laboratory analysis required for the other parameters. The water quality data used in the study were obtained by the Geological Survey of Brazil in partnership with other public institutions (i.e. universities and environmental institutes) as part of the project "Integrated Studies in Experimental and Representative Watersheds". Two models were developed to correlate the values of the three measured parameters and the IQA/SCQA values calculated based on all nine parameters. The results were evaluated according to the following validation statistics: coefficient of determination (R2), Root Mean Square Error (RMSE), Akaike information criterion (AIC) and Final Prediction Error (FPE). 
The first model was a linear stepwise regression between three independent variables …

  5. Avoiding health information.

    PubMed

    Barbour, Joshua B; Rintamaki, Lance S; Ramsey, Jason A; Brashers, Dale E

    2012-01-01

    This study investigated why and how individuals avoid health information to support the development of models of uncertainty and information management and offer insights for those dealing with the information and uncertainty inherent to health and illness. Participants from student (n = 507) and community (n = 418) samples reported that they avoided health information to (a) maintain hope or deniability, (b) resist overexposure, (c) accept limits of action, (d) manage flawed information, (e) maintain boundaries, and (f) continue with life/activities. They also reported strategies for avoiding information, including removing or ignoring stimuli (e.g., avoiding people who might provide health advice) and controlling conversations (e.g., withholding information, changing the subject). Results suggest a link between previous experience with serious illness and health information avoidance. Building on uncertainty management theory, this study demonstrated that health information avoidance is situational, relatively common, not necessarily unhealthy, and may be used to accomplish multiple communication goals.

  6. Layers of Information: Geographic Information Systems (GIS).

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.

    2003-01-01

    Describes the Geographic Information System (GIS) which is capable of storing, manipulating, and displaying data allowing students to explore complex relationships through scientific inquiry. Explains applications of GIS in middle school classrooms and includes assessment strategies. (YDS)

  7. Health Information on the Web: Finding Reliable Information

    MedlinePlus


  8. Energy information sheets

    SciTech Connect

    Not Available

    1993-12-02

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. Written for the general public, the EIA publication Energy Information Sheets was developed to provide information on various aspects of fuel production, prices, consumption and capability. The information contained herein pertains to energy data as of December 1991. Additional information on related subject matter can be found in other EIA publications as referenced at the end of each sheet.

  9. Types of quantum information

    SciTech Connect

    Griffiths, Robert B.

    2007-12-15

    Quantum, in contrast to classical, information theory, allows for different incompatible types (or species) of information which cannot be combined with each other. Distinguishing these incompatible types is useful in understanding the role of the two classical bits in teleportation (or one bit in one-bit teleportation), for discussing decoherence in information-theoretic terms, and for giving a proper definition, in quantum terms, of 'classical information.' Various examples (some updating earlier work) are given of theorems which relate different incompatible kinds of information, and thus have no counterparts in classical information theory.

  10. Ventilation/Perfusion Positron Emission Tomography—Based Assessment of Radiation Injury to Lung

    SciTech Connect

    Siva, Shankar; Hardcastle, Nicholas; Kron, Tomas; Bressel, Mathias; Callahan, Jason; MacManus, Michael P.; Shaw, Mark; Plumridge, Nikki; Hicks, Rodney J.; Steinfort, Daniel; Ball, David L.; Hofman, Michael S.

    2015-10-01

    Purpose: To investigate ⁶⁸Ga-ventilation/perfusion (V/Q) positron emission tomography (PET)/computed tomography (CT) as a novel imaging modality for assessment of perfusion, ventilation, and lung density changes in the context of radiation therapy (RT). Methods and Materials: In a prospective clinical trial, 20 patients underwent 4-dimensional (4D)-V/Q PET/CT before, midway through, and 3 months after definitive lung RT. Eligible patients were prescribed 60 Gy in 30 fractions with or without concurrent chemotherapy. Functional images were registered to the RT planning 4D-CT, and isodose volumes were averaged into 10-Gy bins. Within each dose bin, relative loss in standardized uptake value (SUV) was recorded for ventilation and perfusion, and loss in air-filled fraction was recorded to assess RT-induced lung fibrosis. A dose-effect relationship was described using both linear and 2-parameter logistic fit models, and goodness of fit was assessed with the Akaike Information Criterion (AIC). Results: A total of 179 imaging datasets were available for analysis (1 scan was unrecoverable). An almost perfectly linear negative dose-response relationship was observed for perfusion and air-filled fraction (r² = 0.99, P<.01), with ventilation strongly negatively linear (r² = 0.95, P<.01). Logistic models did not provide a better fit as evaluated by AIC. Perfusion, ventilation, and the air-filled fraction decreased 0.75 ± 0.03%, 0.71 ± 0.06%, and 0.49 ± 0.02%/Gy, respectively. Within high-dose regions, higher baseline perfusion SUV was associated with greater rate of loss. At 50 Gy and 60 Gy, the rate of loss was 1.35% (P=.07) and 1.73% (P=.05) per SUV, respectively. Of 8/20 patients with peritumoral reperfusion/reventilation during treatment, 7/8 did not sustain this effect after treatment. Conclusions: Radiation-induced regional lung functional deficits occur in a dose-dependent manner and can be estimated by simple linear models with 4D-V/Q PET.
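    Record 10's comparison of a linear against a 2-parameter logistic dose-response fit can be mimicked with a small grid search, scoring each fit by a Gaussian-error AIC. The dose-response numbers below are fabricated to be roughly linear, so the linear model should win, as it did in the study; the logistic form and its parameter grid are assumptions for illustration:

```python
import numpy as np

dose = np.linspace(0.0, 60.0, 7)  # Gy, hypothetical 10-Gy dose bins
# Fabricated, roughly linear perfusion loss (%) with small noise
loss = 0.75 * dose + np.array([0.5, -0.3, 0.2, -0.4, 0.1, 0.3, -0.2])

def aic_gauss(resid, k):
    """Gaussian-error AIC from residuals, up to an additive constant."""
    rss = float(resid @ resid)
    return len(resid) * np.log(rss / len(resid)) + 2 * k

# Linear fit (2 parameters)
slope, intercept = np.polyfit(dose, loss, 1)
aic_lin = aic_gauss(loss - (slope * dose + intercept), 2)

# 2-parameter logistic fit via a brute-force grid search over (d50, rate)
def logistic(d, d50, rate):
    return 100.0 / (1.0 + np.exp(-rate * (d - d50)))

rss_log = min(
    float(((loss - logistic(dose, d50, rate)) ** 2).sum())
    for d50 in np.linspace(10.0, 150.0, 71)
    for rate in np.linspace(0.005, 0.5, 100)
)
aic_log = len(dose) * np.log(rss_log / len(dose)) + 2 * 2
```

    Because both models spend 2 parameters, the AIC comparison here reduces to comparing residual sums of squares; with genuinely linear data the logistic curve cannot match the straight-line fit.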

  11. Seasonality and Trend Forecasting of Tuberculosis Prevalence Data in Eastern Cape, South Africa, Using a Hybrid Model

    PubMed Central

    Azeez, Adeboye; Obaromi, Davies; Odeyemi, Akinwumi; Ndege, James; Muntabayi, Ruffin

    2016-01-01

    Background: Tuberculosis (TB) is a deadly infectious disease caused by Mycobacterium tuberculosis. Tuberculosis, a chronic and highly infectious disease, is prevalent in almost every part of the globe. More than 95% of TB mortality occurs in low/middle income countries. In 2014, approximately 10 million people were diagnosed with active TB and two million died from the disease. In this study, our aim is to compare the predictive power of the seasonal autoregressive integrated moving average (SARIMA) model with that of a hybrid SARIMA-neural network auto-regression (SARIMA-NNAR) model for TB incidence, and to analyse its seasonality in South Africa. Methods: TB incidence case data from January 2010 to December 2015 were extracted from the Eastern Cape Health facility report of the electronic Tuberculosis Register (ERT.Net). A SARIMA model and a combined SARIMA and neural network auto-regression (SARIMA-NNAR) model were used in analysing and predicting the TB data from 2010 to 2015. The performance measures mean square error (MSE), root mean square error (RMSE), mean absolute error (MAE), mean percent error (MPE), mean absolute scaled error (MASE) and mean absolute percentage error (MAPE) were applied to assess which model predicted better. Results: Though practically both models could predict TB incidence, the combined model displayed better performance. For the combined model, the Akaike information criterion (AIC), second-order AIC (AICc) and Bayesian information criterion (BIC) were 288.56, 308.31 and 299.09, respectively, lower than the SARIMA model's corresponding values of 329.02, 327.20 and 341.99. The SARIMA-NNAR model forecast a slightly more pronounced seasonal trend in TB incidence than the single model. Conclusions: The combined model indicated better TB incidence forecasting with a lower AICc. The model also indicates the need for resolute …
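    The forecast-accuracy measures used in record 11 to compare the single and hybrid models are straightforward to compute from actual and predicted series. A sketch with invented monthly incidence numbers (not the Eastern Cape data); MASE and MPE are omitted for brevity:

```python
import numpy as np

def forecast_metrics(actual, pred):
    """Return (RMSE, MAE, MAPE%) for a forecast against observed values."""
    actual = np.asarray(actual, dtype=float)
    pred = np.asarray(pred, dtype=float)
    err = actual - pred
    rmse = float(np.sqrt((err ** 2).mean()))
    mae = float(np.abs(err).mean())
    mape = float(np.abs(err / actual).mean() * 100)  # assumes no zero actuals
    return rmse, mae, mape

# Hypothetical monthly TB incidence and two competing forecasts
actual = [120, 135, 150, 140, 160, 155]
single = [110, 140, 145, 150, 150, 165]  # stand-in for a SARIMA forecast
hybrid = [118, 134, 152, 142, 158, 156]  # stand-in for a SARIMA-NNAR forecast
```

    A forecast that beats another on all three measures, as the hybrid does here by construction, is the pattern the study reports for SARIMA-NNAR over SARIMA.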

  12. Family-Joining: A Fast Distance-Based Method for Constructing Generally Labeled Trees

    PubMed Central

    Kalaghatgi, Prabhav; Pfeifer, Nico; Lengauer, Thomas

    2016-01-01

    The widely used model for evolutionary relationships is a bifurcating tree with all taxa/observations placed at the leaves. This is not appropriate if the taxa have been densely sampled across evolutionary time and may be in a direct ancestral relationship, or if there is not enough information to fully resolve all the branching points in the evolutionary tree. In this article, we present a fast distance-based agglomeration method called family-joining (FJ) for constructing so-called generally labeled trees in which taxa may be placed at internal vertices and the tree may contain polytomies. FJ constructs such trees on the basis of pairwise distances and a distance threshold. We tested three methods for threshold selection, FJ-AIC, FJ-BIC, and FJ-CV, which minimize Akaike information criterion, Bayesian information criterion, and cross-validation error, respectively. When compared with related methods on simulated data, FJ-BIC was among the best at reconstructing the correct tree across a wide range of simulation scenarios. FJ-BIC was applied to HIV sequences sampled from individuals involved in a known transmission chain. The FJ-BIC tree was found to be compatible with almost all transmission events. On average, internal branches in the FJ-BIC tree have higher bootstrap support than branches in the leaf-labeled bifurcating tree constructed using RAxML. 36% and 25% of the internal branches in the FJ-BIC tree and RAxML tree, respectively, have bootstrap support greater than 70%. To the best of our knowledge the method presented here is the first attempt at modeling evolutionary relationships using generally labeled trees. PMID:27436007

  13. Projecting climate-driven increases in North American fire activity

    NASA Astrophysics Data System (ADS)

    Wang, D.; Morton, D. C.; Collatz, G. J.

    2013-12-01

    Climate regulates fire activity through controls on vegetation productivity (fuels), lightning ignitions, and conditions governing fire spread. In many regions of the world, human management also influences the timing, duration, and extent of fire activity. These coupled interactions between human and natural systems make fire a complex component of the Earth system. Satellite data provide valuable information on the spatial and temporal dynamics of recent fire activity, as active fires, burned area, and land cover information can be combined to separate wildfires from intentional burning for agriculture and forestry. Here, we combined satellite-derived burned area data with land cover and climate data to assess fire-climate relationships in North America between 2000 and 2012. We used the latest versions of the Global Fire Emissions Database (GFED) burned area product and Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate data to develop regional relationships between burned area and potential evaporation (PE), an integrated dryness metric. Logistic regression models were developed to link burned area with PE and individual climate variables during and preceding the fire season, and optimal models were selected based on the Akaike Information Criterion (AIC). Overall, our model explained 85% of the variance in burned area since 2000 across North America. Fire-climate relationships from the era of satellite observations provide a blueprint for potential changes in fire activity under scenarios of climate change. We used that blueprint to evaluate potential changes in fire activity over the next 50 years based on twenty models from the Coupled Model Intercomparison Project Phase 5 (CMIP5). All models suggest an increase in PE under low and high emissions scenarios (Representative Concentration Pathways (RCP) 4.5 and 8.5, respectively), with the largest increases in projected burned area across the western US and central Canada. Overall, near
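    Ranking candidate regression models by AIC, as described above, reduces to comparing deviance plus a parameter penalty (for a fitted GLM, AIC = deviance + 2k up to a constant). A minimal sketch; the predictor sets, deviances, and parameter counts are made-up illustrative numbers, not values from the study:

```python
# Each hypothetical candidate links burned-area occurrence to a different
# predictor set; "deviance" is the fitted model's residual deviance and
# "k" its number of estimated parameters (including the intercept).
candidates = [
    {"name": "PE only",                  "deviance": 512.3, "k": 2},
    {"name": "PE + prior-season precip", "deviance": 498.1, "k": 3},
    {"name": "PE + precip + temp",       "deviance": 497.6, "k": 4},
]

for m in candidates:
    m["aic"] = m["deviance"] + 2 * m["k"]

best = min(candidates, key=lambda m: m["aic"])
print(best["name"])
```

Here the three-parameter model wins: the richest model lowers the deviance by only 0.5, less than the 2-unit cost of its extra parameter.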

  14. Advanced information society(2)

    NASA Astrophysics Data System (ADS)

    Masuyama, Keiichi

    Our modern life is full of information, and information infiltrates our daily life. Networking of telecommunications has extended to the society, company, and individual levels. Although we have just entered the advanced information society, the business world and our daily life have already been steadily transformed by the advancement of the information network. This advancement of information has a large influence on the economy and will play the main role in the expansion of domestic demand. This paper considers the image of the coming advanced information society, focusing on the transformation of the businessman's life and of our daily life, which has been enriched by the spread of daily-life information and of visual information via satellite systems, in the development of the intelligent city.

  15. Federal Energy Information Systems.

    ERIC Educational Resources Information Center

    Coyne, Joseph G.; Moneyhun, Dora H.

    1979-01-01

    Describes the Energy Information Administration (EIA) and the Technical Information Center (TIC), and lists databases accessible online to the Department of Energy and its contractors through DOE/RECON. (RAA)

  16. Indiana Health Information Exchange

    Cancer.gov

    The Indiana Health Information Exchange comprises various Indiana health care institutions; it was established to help improve patient safety and is recognized as a best practice for health information exchange.

  17. Keeping Public Information Public.

    ERIC Educational Resources Information Center

    Kelley, Wayne P.

    1998-01-01

    Discusses the trend toward the transfer of federal government information from the public domain to the private sector. Topics include free access, privatization, information-policy revision, accountability, copyright issues, costs, pricing, and market needs versus public needs. (LRW)

  18. Public information guidelines

    SciTech Connect

    1986-06-01

    The purpose of these Public Information Guidelines is to provide principles for the implementation of the NWPA mandate and the Mission Plan requirements for the provision of public information. These Guidelines set forth the public information policy to be followed by all Office of Civilian Radioactive Waste Management (OCRWM) performance components. The OCRWM offices should observe these Guidelines in shaping and conducting public information activities.

  19. Information Technology Strategic Plan

    DTIC Science & Technology

    1998-06-01

    The members of the Information Technology Steering Group (ITSG) have been meeting since January 1998 to support the Space and Naval Warfare (SPAWAR...for corporate information technology (IT). The IT Strategic Plan documents the role that corporate information technology plays in achieving SSC San...Diego’s mission, vision and goals. This plan defines a vision for SSC San Diego’s information technology environment that will enhance the quality of

  20. Chaplain Personnel Information Guide

    DTIC Science & Technology

    1991-04-15

    This document may not be released for open publication until it has been cleared by the appropriate military service or government agency. Chaplain Personnel Information Guide. Personal author: Chaplain (LTC) Jerry W. Black. A personnel information guide called the "Red Book." This guide contains information papers that are updated annually on subjects frequently discussed among the

  1. Information for Agricultural Development.

    ERIC Educational Resources Information Center

    Kaungamno, E. E.

    This paper describes the major international agricultural information services, sources, and systems; outlines the existing information situation in Tanzania as it relates to problems of agricultural development; and reviews the improvements in information provision resources required to support the process of agricultural development in Tanzania.…

  2. Europe and Information Science.

    ERIC Educational Resources Information Center

    Ingwersen, Peter

    1997-01-01

    Discusses recent European library and information science (LIS) events. Describes the development and use of regional and intra-European Union networks for science. Highlights three European conferences held in 1996: ACM-SIGIR on information retrieval held in Switzerland, Information Seeking in Context (ISIC) held in Finland, and Conceptions of…

  3. Developing an Information Strategy

    ERIC Educational Resources Information Center

    Hanson, Terry

    2011-01-01

    The purpose of an information strategy is to highlight the extent to which a modern, complex organization depends on information, in all of its guises, and to consider how this strategic asset should be managed. This dependency has always been present and nowhere more so than in universities, whose very purpose is built around information and its…

  4. Personal, Anticipated Information Need

    ERIC Educational Resources Information Center

    Bruce, Harry

    2005-01-01

    Background: The role of personal information collections is a well known feature of personal information management. The World Wide Web has introduced to such collections ideas such as filing Web pages or noting their existence in "Bookmarks" and "Favourites". Argument: It is suggested that personal information collections are…

  5. Quantum Information Science

    DTIC Science & Technology

    2012-02-01

    for constructing quantum gates. In [Miller11b] we detailed the use of multiplexing to simulate quantum teleportation. One alternative to multiplexing... (Quantum Information Science, final technical report, Air Force Research Laboratory Information Directorate, Rome, NY, February 2012; dates covered: October 2009 - September 2011.)

  6. Information Technology Initiative (Videorecording),

    DTIC Science & Technology

    Physical description: 1 VHS video; col.; sd.; mono.; standard playback sp.; 35:40 mins.; 1/2 in. In this video, Dr. Kurt Fisher, Deputy Director for Information Technology, introduces the Corporate Information Management (CIM) program and explains the following major technical initiatives: reuse/repositories; I-case; data administration; information technology architecture; software process improvement; standards.

  7. Is Information Still Relevant?

    ERIC Educational Resources Information Center

    Ma, Lia

    2013-01-01

    Introduction: The term "information" in information science does not share the characteristics of those of a nomenclature: it does not bear a generally accepted definition and it does not serve as the bases and assumptions for research studies. As the data deluge has arrived, is the concept of information still relevant for information…

  8. Information Resources Management.

    ERIC Educational Resources Information Center

    Bergeron, Pierrette

    1996-01-01

    Information, like other organizational resources, needs to be managed to help organizations improve productivity, competitiveness, and overall performance. Reviews developments (1986-96) in Information Resources Management (IRM). Examines the concept of IRM; IRM from information technology and integrative perspectives; IRM practices; IRM in the…

  9. America's Rural Information Resource.

    ERIC Educational Resources Information Center

    La Caille John, Patricia

    The Rural Information Center (RIC), a project of two agencies of the U.S. Department of Agriculture, has served rural information needs since 1988. The targeted audience for the RIC is local officials and citizens, rather than scientists and federal officials, and the thrust of its information is rural development rather than production…

  10. Shifting Boundaries in Information.

    ERIC Educational Resources Information Center

    Schiller, Anita

    1981-01-01

    Social interest in information as a shared resource is diminishing, while proprietary interest in information as a profitable resource increases. Technological advancement in information processing and services, considerations of cost recovery by public agencies, and opportunities for commercial profit are blurring distinctions between the public…

  11. MILITARY INFORMATION SYSTEMS,

    DTIC Science & Technology

    upward are usually indications of how effectively the system is developing or operating. The use of computers in information systems tends to increase...computers into information systems must always begin at the lowest level of aggregation in the job hierarchy. Only those information-processing jobs

  12. Futures Information Interchange.

    ERIC Educational Resources Information Center

    Massachusetts Univ., Amherst. School of Education.

    This newsletter is an information exchange effort on the part of the Futures Information Center being established at the University of Massachusetts. Typical issues will contain information on innovative lesson plans, ideas, materials, project descriptions, or other facets which are being implemented at various levels and schools on the topic of…

  13. Teaching Information Technology Law

    ERIC Educational Resources Information Center

    Taylor, M. J.; Jones, R. P.; Haggerty, J.; Gresty, D.

    2009-01-01

    In this paper we discuss an approach to the teaching of information technology law to higher education computing students that attempts to prepare them for professional computing practice. As information technology has become ubiquitous its interactions with the law have become more numerous. Information technology practitioners, and in particular…

  14. Quick Information Sheets. 1988.

    ERIC Educational Resources Information Center

    Wisconsin Univ., Madison. Trace Center.

    The Trace Center gathers and organizes information on communication, control, and computer access for handicapped individuals. The information is disseminated in the form of brief sheets describing print, nonprint, and organizational resources and listing addresses and telephone numbers for ordering or for additional information. This compilation…

  15. What Is Information Design?

    ERIC Educational Resources Information Center

    Redish, Janice C. (Ginny)

    2000-01-01

    Defines two meanings of information design: the overall process of developing a successful document; and the way the information is presented on the screen (layout, typography, color, and so forth). Discusses the future importance of both of these meanings of information design, in terms of design for the web and single-sources (planning…

  16. A Global Information Utility.

    ERIC Educational Resources Information Center

    Block, Robert S.

    1984-01-01

    High-powered satellites, along with other existing technologies, make possible a world information utility that could distribute virtually limitless information to every point on earth. The utility could distribute information for business, government, education, and entertainment. How the utility would work is discussed. (RM)

  17. Energy information directory 1995

    SciTech Connect

    1995-10-01

    The National Energy Information Center provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. This Energy Information Directory is used to assist the Center staff as well as other DOE staff in directing inquiries to the proper offices.

  18. Security classification of information

    SciTech Connect

    Quist, A.S.

    1993-04-01

    This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to provide added support to some of those classification principles.

  19. Community Information Systems.

    ERIC Educational Resources Information Center

    Freeman, Andrew

    Information is provided on technological and social trends as background for a workshop designed to heighten the consciousness of workers in community information systems. Initially, the basic terminology is considered in its implications for an integrated perspective of community information systems, with particular attention given to the meaning…

  20. Connectionist Interaction Information Retrieval.

    ERIC Educational Resources Information Center

    Dominich, Sandor

    2003-01-01

    Discussion of connectionist views for adaptive clustering in information retrieval focuses on a connectionist clustering technique and activation spreading-based information retrieval model using the interaction information retrieval method. Presents theoretical as well as simulation results as regards computational complexity and includes…

  1. Information network architectures

    NASA Technical Reports Server (NTRS)

    Murray, N. D.

    1985-01-01

    Graphs, charts, diagrams and outlines of information relative to information network architectures for advanced aerospace missions, such as the Space Station, are presented. Local area information networks are considered a likely technology solution. The principal needs for the network are listed.

  2. Pricing of Information.

    ERIC Educational Resources Information Center

    Furneaux, M. I. P.; Newton, J.

    This essay considers the cost of information retrieval by databases and information centers, and explores the need to charge users for the information supplied. The advantages and disadvantages of three means of charging users are discussed: (1) connect-hour charge, (2) print/type charge, and (3) subscription. Also addressed is the practice of…

  3. Mobile Student Information System

    ERIC Educational Resources Information Center

    Asif, Muhammad; Krogstie, John

    2011-01-01

    Purpose: A mobile student information system (MSIS) based on mobile computing and context-aware application concepts can provide more user-centric information services to students. The purpose of this paper is to describe a system for providing relevant information to students on a mobile platform. Design/methodology/approach: The research…

  4. Planning Community Information Utilities.

    ERIC Educational Resources Information Center

    Sackman, Harold, Ed.; Boehm, Barry W., Ed.

    Massive social changes are bound to occur with the extension of mass information utilities: the fundamental question is how shall this massive reconstruction of social information power be designed for the best interest of people. This book grew out of an American Federation of Information Processing Societies (AFIPS) conference, and is organized…

  5. Size at the onset of maturity (SOM) revealed in length-weight relationships of brackish amphipods and isopods: An information theory approach

    NASA Astrophysics Data System (ADS)

    Longo, Emanuela; Mancinelli, Giorgio

    2014-01-01

    In amphipods and other small-sized crustaceans, allometric relationships are conventionally analysed by fitting the standard model Y = a·X^b (X and Y are, e.g., body length and weight, respectively), whose scaling exponent b is assumed to be constant. However, breakpoints in allometric relationships have long been documented in large-sized crustaceans, ultimately determined by abrupt ontogenetic variations in the value of b. Here, the existence of breakpoints in length-weight relationships was investigated in four amphipod (i.e., Gammarus aequicauda, Gammarus insensibilis, Microdeutopus gryllotalpa, and Dexamine spinosa) and three isopod species (i.e., Lekanesphaera hookeri, Sphaeroma serratum, and Cymodoce truncata) from three Mediterranean lagoons. The performance of two candidate linear models fitted to log10-transformed data - a simple model assuming a constant exponent b and a segmented model assuming b to vary after a breakpoint - was compared using a parsimonious selection strategy based on the Akaike information criterion. The segmented model with a breakpoint provided the most accurate fitting of length-weight data in the majority of the species analysed; non-conclusive results were obtained only for D. spinosa and C. truncata, for which only a limited number of specimens was examined. Model parameters were consistent for amphipod and isopod species collected across the three different habitats; the generality of the results was further supported by a literature search confirming that the identified breakpoints corresponded with ontogenetic discontinuities related to sexual maturation in all the species investigated. In this study, segmented regression models proved to provide a statistically accurate and biologically meaningful description of length-weight relationships of common amphipod and isopod species. The methodological limitations of the approach are considered, while the practical implications for secondary production estimates are discussed.
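    The selection strategy described above can be sketched with numpy: fit a simple straight line and a segmented (broken-stick) line to simulated log-transformed length-weight data with a known breakpoint, and compare them via a Gaussian AIC computed from the residual sum of squares. The simulated data, slopes, breakpoint location, and parameter counts are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated log10(length) vs log10(weight) with a true breakpoint at x = 1.0:
# the slope b shifts from 2.0 to 3.0 (a hypothetical ontogenetic shift).
n = 200
x = np.sort(rng.uniform(0.0, 2.0, n))
y = np.where(x < 1.0, 2.0 * x, 2.0 + 3.0 * (x - 1.0)) + rng.normal(0, 0.05, n)

def aic_from_rss(rss, n, k):
    # Gaussian-likelihood AIC up to an additive constant.
    return n * np.log(rss / n) + 2 * k

# Simple model: one straight line (intercept, slope, plus error variance).
rss_simple = np.sum((y - np.polyval(np.polyfit(x, y, 1), x)) ** 2)
aic_simple = aic_from_rss(rss_simple, n, 3)

# Segmented model: grid-search the breakpoint c, fitting a continuous
# broken stick via the basis (1, x, max(0, x - c));
# parameters: intercept, slope, slope change, breakpoint, error variance.
best = (np.inf, None)
for c in np.linspace(0.2, 1.8, 81):
    X = np.column_stack([np.ones(n), x, np.maximum(0.0, x - c)])
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(rss[0]) if rss.size else float(np.sum((y - X @ beta) ** 2))
    best = min(best, (aic_from_rss(rss, n, 5), c))
aic_segmented, breakpoint = best

print(aic_segmented < aic_simple, round(breakpoint, 2))
```

With a structural break this pronounced, the segmented model's lower AIC survives its two-parameter penalty, and the estimated breakpoint lands near the true value of 1.0.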

  6. Information-limiting correlations

    PubMed Central

    Moreno-Bote, Rubén; Beck, Jeffrey; Kanitscheider, Ingmar; Pitkow, Xaq; Latham, Peter; Pouget, Alexandre

    2015-01-01

    Computational strategies used by the brain strongly depend on the amount of information that can be stored in population activity, which in turn strongly depends on the pattern of noise correlations. In vivo, noise correlations tend to be positive and proportional to the similarity in tuning properties. Such correlations are thought to limit information, which has led to the suggestion that decorrelation increases information. In contrast, we found, analytically and numerically, that decorrelation does not imply an increase in information. Instead, the only information-limiting correlations are what we refer to as differential correlations: correlations proportional to the product of the derivatives of the tuning curves. Unfortunately, differential correlations are likely to be very small and buried under correlations that do not limit information, making them particularly difficult to detect. We found, however, that the effect of differential correlations on information can be detected with relatively simple decoders. PMID:25195105
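    The saturating effect described above can be sketched numerically with the standard linear Fisher information J = f'ᵀ C⁻¹ f', where f' holds the tuning-curve derivatives and C the noise covariance. The ε value and Gaussian tuning slopes below are assumed for illustration: with independent noise, information grows with population size n, but adding differential correlations ε·f'f'ᵀ caps it at 1/ε.

```python
import numpy as np

def linear_fisher_info(fprime, cov):
    """Linear Fisher information J = f'^T C^{-1} f'."""
    return float(fprime @ np.linalg.solve(cov, fprime))

rng = np.random.default_rng(1)
eps = 0.01  # assumed strength of the differential correlations

for n in (50, 500):
    fprime = rng.normal(0.0, 1.0, n)   # hypothetical tuning-curve derivatives
    base_cov = np.eye(n)               # independent noise: info grows with n
    diff_cov = base_cov + eps * np.outer(fprime, fprime)
    j0 = linear_fisher_info(fprime, base_cov)
    j = linear_fisher_info(fprime, diff_cov)
    # Sherman-Morrison gives J = J0 / (1 + eps * J0), which is <= 1/eps.
    print(n, round(j0, 1), round(j, 1))
```

Tenfold more neurons raise the uncorrelated information J0 roughly tenfold, yet the information with differential correlations stays below 1/ε = 100, matching the analytical saturation J = J0/(1 + εJ0).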

  7. Earth Science Information Center

    USGS Publications Warehouse

    ,

    1991-01-01

    An ESIC? An Earth Science Information Center. Don't spell it. Say it. ESIC. It rhymes with seasick. You can find information in an information center, of course, and you'll find earth science information in an ESIC. That means information about the land that is the Earth, the land that is below the Earth, and in some instances, the space surrounding the Earth. The U.S. Geological Survey (USGS) operates a network of Earth Science Information Centers that sell earth science products and data. There are more than 75 ESIC's. Some are operated by the USGS, but most are in other State or Federal agencies. Each ESIC responds to requests for information received by telephone, letter, or personal visit. Your personal visit.

  8. Human Benzene Metabolism Following Occupational and Environmental Exposures

    PubMed Central

    Rappaport, Stephen M.; Kim, Sungkyoon; Lan, Qing; Li, Guilan; Vermeulen, Roel; Waidyanatha, Suramya; Zhang, Luoping; Yin, Songnian; Smith, Martyn T.; Rothman, Nathaniel

    2011-01-01

    We previously reported evidence that humans metabolize benzene via two enzymes, including a hitherto unrecognized high-affinity enzyme that was responsible for an estimated 73 percent of total urinary metabolites [sum of phenol (PH), hydroquinone (HQ), catechol (CA), E,E-muconic acid (MA), and S-phenylmercapturic acid (SPMA)] in nonsmoking females exposed to benzene at sub-saturating (ppb) air concentrations. Here, we used the same Michaelis-Menten-like kinetic models to individually analyze urinary levels of PH, HQ, CA and MA from 263 nonsmoking Chinese women (179 benzene-exposed workers and 84 control workers) with estimated benzene air concentrations ranging from less than 0.001 ppm to 299 ppm. One model depicted benzene metabolism as a single enzymatic process (1-enzyme model) and the other as two enzymatic processes which competed for access to benzene (2-enzyme model). We evaluated model fits based upon the difference in values of Akaike’s Information Criterion (ΔAIC), and we gauged the weights of evidence favoring the two models based upon the associated Akaike weights and Evidence Ratios. For each metabolite, the 2-enzyme model provided a better fit than the 1-enzyme model with ΔAIC values decreasing in the order 9.511 for MA, 7.379 for PH, 1.417 for CA, and 0.193 for HQ. The corresponding weights of evidence favoring the 2-enzyme model (Evidence Ratios) were: 116.2:1 for MA, 40.0:1 for PH, 2.0:1 for CA and 1.1:1 for HQ. These results indicate that our earlier findings from models of total metabolites were driven largely by MA, representing the ring-opening pathway, and by PH, representing the ring-hydroxylation pathway. The predicted percentage of benzene metabolized by the putative high-affinity enzyme at an air concentration of 0.001 ppm was 88% based upon urinary MA and was 80% based upon urinary PH. As benzene concentrations increased, the respective percentages of benzene metabolized to MA and PH by the high-affinity enzyme decreased successively
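    Akaike weights and Evidence Ratios as used above follow mechanically from ΔAIC: each model's weight is proportional to exp(−Δᵢ/2). A minimal sketch using the ΔAIC of 9.511 reported for MA; the absolute AIC values are arbitrary placeholders, since only the difference matters:

```python
import math

def akaike_weights(aics):
    """Akaike weights: the relative likelihood of each model, normalized."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Two models separated by the Delta-AIC of 9.511 reported for MA above;
# 100.0 is an arbitrary baseline AIC (only the difference is meaningful).
aics = [100.0, 100.0 + 9.511]
w = akaike_weights(aics)
evidence_ratio = w[0] / w[1]
print(round(evidence_ratio, 1))  # → 116.2
```

This reproduces the 116.2:1 Evidence Ratio quoted in the abstract: a ΔAIC near 10 leaves essentially no support for the weaker model, while the ΔAIC of 0.193 for HQ yields weights close to 50:50.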

  9. Characterizing the relationship between temperature and mortality in tropical and subtropical cities: a distributed lag non-linear model analysis in Hue, Viet Nam, 2009–2013

    PubMed Central

    Dang, Tran Ngoc; Seposo, Xerxes T.; Duc, Nguyen Huu Chau; Thang, Tran Binh; An, Do Dang; Hang, Lai Thi Minh; Long, Tran Thanh; Loan, Bui Thi Hong; Honda, Yasushi

    2016-01-01

    Background The relationship between temperature and mortality has been found to be U-, V-, or J-shaped in developed temperate countries; however, in developing tropical/subtropical cities, it remains unclear. Objectives Our goal was to investigate the relationship between temperature and mortality in Hue, a subtropical city in Viet Nam. Design We collected daily mortality data from the Vietnamese A6 mortality reporting system for 6,214 deceased persons between 2009 and 2013. A distributed lag non-linear model was used to examine the temperature effects on all-cause and cause-specific mortality by assuming negative binomial distribution for count data. We developed an objective-oriented model selection with four steps following the Akaike information criterion (AIC) rule (i.e. a smaller AIC value indicates a better model). Results High temperature-related mortality was more strongly associated with short lags, whereas low temperature-related mortality was more strongly associated with long lags. The low temperatures increased risk in all-category mortality compared to high temperatures. We observed elevated temperature-mortality risk in vulnerable groups: elderly people (high temperature effect, relative risk [RR]=1.42, 95% confidence interval [CI]=1.11–1.83; low temperature effect, RR=2.0, 95% CI=1.13–3.52), females (low temperature effect, RR=2.19, 95% CI=1.14–4.21), people with respiratory disease (high temperature effect, RR=2.45, 95% CI=0.91–6.63), and those with cardiovascular disease (high temperature effect, RR=1.6, 95% CI=1.15–2.22; low temperature effect, RR=1.99, 95% CI=0.92–4.28). Conclusions In Hue, the temperature significantly increased the risk of mortality, especially in vulnerable groups (i.e. elderly, female, people with respiratory and cardiovascular diseases). These findings may provide a foundation for developing adequate policies to address the effects of temperature on health in Hue City. PMID:26781954

  10. A modified hypoxia-based TCP model to investigate the clinical outcome of stereotactic hypofractionated regimes for early stage non-small-cell lung cancer (NSCLC)

    SciTech Connect

    Strigari, L.; Benassi, M.; Sarnelli, A.; Polico, R.; D'Andrea, M.

    2012-07-15

    Purpose: Stereotactic body radiotherapy (SBRT) has been applied to lung tumors at different stages and sizes with good local tumor control (LC) rates. The linear quadratic model (LQM), in its basic formulation, does not seem appropriate to describe the response to radiotherapy in clinical trials based on a few fractions. Thus, the main aim of this work was to develop a model that takes into account the hypoxic cells and their reoxygenation. Methods: A parameter named B has been introduced into modified tumor control probability (TCP) formulations from the LQM and the linear-quadratic-linear model (LQLM); it represents the fraction of hypoxic cells that survive and become oxygenated after each irradiation. Based on published trials evaluating LC at 3 yr (LC3), values of B were obtained by maximum likelihood minimization between predicted TCP and clinical LC3. Two oxygen enhancement ratio (OER) parameter sets (1 and 2) from the literature were adopted to calculate the B-factors. Initial hypoxic cell fractions (η_h) from 0.05 to 0.50 were assumed. Log-likelihood (L) and the Akaike information criterion (AIC) were determined on an independent clinical validation dataset. Results: The B-values of the modified TCPs spanned the whole interval from 0 to 1, depending on the fractionation scheme (number of fractions and dose/fraction), showing a maximum (close to 1) at doses/fraction of 8-12 Gy. The B-values calculated using OER parameter set 1 exhibited a smoother falloff than set 2. An analytical expression was derived to describe the B-value's dependence on the fractionation scheme. The adjusted R² values varied from 0.63 to 0.67 for the LQ model with OER set 1 and from 0.75 to 0.78 for the LQ model with OER set 2. Lower adjusted R² values were found for the LQLM with both OER sets.
    L and AIC, calculated using a fraction of η_h = 0.15 and the B-value from the authors' analytical expression, were higher than for other η_h-values, irrespective of model or OER

  11. The number and type of food retailers surrounding schools and their association with lunchtime eating behaviours in students

    PubMed Central

    2013-01-01

    Background The primary study objective was to examine whether the presence of food retailers surrounding schools was associated with students’ lunchtime eating behaviours. The secondary objective was to determine whether measures of the food retail environment around schools captured using road network or circular buffers were more strongly related to eating behaviours while at school. Methods Grade 9 and 10 students (N=6,971) who participated in the 2009/10 Canadian Health Behaviour in School Aged Children Survey were included in this study. The outcome was determined by students’ self-reports of where they typically ate their lunch during school days. Circular and road network-based buffers were created for a 1 km distance surrounding 158 schools participating in the HBSC. The addresses of fast food restaurants, convenience stores and coffee/donut shops were mapped within the buffers. Multilevel logistic regression was used to determine whether there was a relationship between the presence of food retailers near schools and students regularly eating their lunch at a fast food restaurant, snack-bar or café. The Akaike Information Criteria (AIC) value, a measure of goodness-of-fit, was used to determine the optimal buffer type. Results For the 1 km circular buffers, students with 1–2 (OR=1.10, 95% CI: 0.57-2.11), 3–4 (OR=1.45, 95% CI: 0.75-2.82) and ≥5 nearby food retailers (OR=2.94, 95% CI: 1.71-5.09) were more likely to eat lunch at a food retailer compared to students with no nearby food retailers. The relationships were slightly stronger when assessed via 1 km road network buffers, with a greater likelihood of eating at a food retailer for 1–2 (OR=1.20, 95% CI: 0.74-1.95), 3–4 (OR=3.19, 95% CI: 1.66-6.13) and ≥5 nearby food retailers (OR=3.54, 95% CI: 2.08-6.02). Road network buffers appeared to provide a better measure of the food retail environment, as indicated by a lower AIC value (3332 vs. 3346). Conclusions There was a strong

  12. Short-term effects of air quality and thermal stress on non-accidental morbidity-a multivariate meta-analysis comparing indices to single measures.

    PubMed

    Lokys, Hanna Leona; Junk, Jürgen; Krein, Andreas

    2017-02-28

    Air quality and thermal stress lead to increased morbidity and mortality. Studies on morbidity and the combined impact of air pollution and thermal stress are still rare. To analyse the correlations between air quality, thermal stress and morbidity, we used a two-stage meta-analysis approach, consisting of a Poisson regression model combined with distributed lag non-linear models (DLNMs) and a meta-analysis investigating whether latitude or the number of inhabitants significantly influence the correlations. We used air pollution, meteorological and hospital admission data from 28 administrative districts along a north-south gradient in western Germany from 2001 to 2011. We compared the performance of the single measure particulate matter (PM10) and air temperature to air quality indices (MPI and CAQI) and the biometeorological index UTCI. Based on the Akaike information criterion (AIC), it can be shown that using air quality indices instead of single measures increases the model strength. However, using the UTCI in the model does not give additional information compared to mean air temperature. Interaction between the 3-day average of air quality (max PM10, max CAQI and max MPI) and meteorology (mean air temperature and mean UTCI) did not improve the models. Using the mean air temperature, we found immediate effects of heat stress (RR 1.0013, 95% CI: 0.9983-1.0043) and by 3 days delayed effects of cold stress (RR: 1.0184, 95% CI: 1.0117-1.0252). The results for air quality differ between both air quality indices and PM10. CAQI and MPI show a delayed impact on morbidity with a maximum RR after 2 days (MPI 1.0058, 95% CI: 1.0013-1.0102; CAQI 1.0068, 95% CI: 1.0030-1.0107). Latitude was identified as a significant meta-variable, whereas the number of inhabitants was not significant in the model.

  13. ActiveSeismoPick3D - automatic first arrival determination for large active seismic arrays

    NASA Astrophysics Data System (ADS)

    Paffrath, Marcel; Küperkoch, Ludger; Wehling-Benatelli, Sebastian; Friederich, Wolfgang

    2016-04-01

    We developed a tool for automatic determination of first arrivals in active seismic data based on an approach that utilises higher order statistics (HOS) and the Akaike information criterion (AIC), commonly used in seismology, but not in active seismics. Automatic picking is highly desirable in active seismics as the number of data provided by large seismic arrays rapidly exceeds what an analyst can evaluate in a reasonable amount of time. To bring the functionality of automatic phase picking into the context of active data, the software package ActiveSeismoPick3D was developed in Python. It uses a modified algorithm for the determination of first arrivals which searches for the HOS maximum in unfiltered data. Additionally, it offers tools for manual quality control and postprocessing, e.g. various visualisation and repicking functionalities. For flexibility, the tool also includes methods for the preparation of geometry information of large seismic arrays and improved interfaces to the Fast Marching Tomography Package (FMTOMO), which can be used for the prediction of travel times and inversion for subsurface properties. Output files are generated in the VTK format, allowing the 3D visualization of e.g. the inversion results. As a test case, a data set consisting of 9216 traces from 64 shots was gathered, recorded at 144 receivers deployed in a regular 2D array covering an area of 100 x 100 m. ActiveSeismoPick3D automatically checks the determined first arrivals by a dynamic signal-to-noise ratio threshold. From the data, a 3D model of the subsurface was generated using the export functionality of the package and FMTOMO.
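
For illustration, a minimal AIC-based onset picker in the spirit of the approach described. This sketch uses Maeda's waveform formulation on synthetic data; it is not the ActiveSeismoPick3D algorithm itself, which first searches for the HOS maximum:

```python
import numpy as np

def aic_pick(trace):
    """Maeda's AIC picker applied directly to the waveform:
    AIC(k) = k*log(var(x[:k])) + (n-k-1)*log(var(x[k:]));
    the global minimum marks the most likely onset sample."""
    n = len(trace)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1, v2 = np.var(trace[:k]), np.var(trace[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

# Synthetic trace: low-amplitude noise, then a tenfold amplitude increase at sample 300
rng = np.random.default_rng(42)
trace = rng.normal(0.0, 0.1, 1000)
trace[300:] += rng.normal(0.0, 1.0, 700)
onset = aic_pick(trace)  # should land close to sample 300
```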

  14. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence

    NASA Astrophysics Data System (ADS)

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-12-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.
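
The contradictory rankings mentioned for ICs are easy to reproduce: because BIC penalizes complexity by k·ln(n) rather than AIC's 2k, the two criteria can disagree on the same pair of models. A sketch with hypothetical log-likelihoods:

```python
import math

def aic(ll, k):
    """AIC = 2k - 2*ln(L)."""
    return 2 * k - 2 * ll

def bic(ll, k, n):
    """BIC = k*ln(n) - 2*ln(L)."""
    return k * math.log(n) - 2 * ll

n = 1000  # hypothetical sample size
# Model A: better fit, more parameters; Model B: worse fit, fewer parameters
ll_a, k_a = -100.0, 10
ll_b, k_b = -109.0, 2

# AIC prefers the better-fitting model A; BIC's stronger complexity penalty
# (k*ln(1000) ~ 6.9k vs. 2k) prefers the simpler model B: the criteria disagree.
```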

  15. Short-term effects of air quality and thermal stress on non-accidental morbidity—a multivariate meta-analysis comparing indices to single measures

    NASA Astrophysics Data System (ADS)

    Lokys, Hanna Leona; Junk, Jürgen; Krein, Andreas

    2017-02-01

    Air quality and thermal stress lead to increased morbidity and mortality. Studies on morbidity and the combined impact of air pollution and thermal stress are still rare. To analyse the correlations between air quality, thermal stress and morbidity, we used a two-stage meta-analysis approach, consisting of a Poisson regression model combined with distributed lag non-linear models (DLNMs) and a meta-analysis investigating whether latitude or the number of inhabitants significantly influence the correlations. We used air pollution, meteorological and hospital admission data from 28 administrative districts along a north-south gradient in western Germany from 2001 to 2011. We compared the performance of the single measure particulate matter (PM10) and air temperature to air quality indices (MPI and CAQI) and the biometeorological index UTCI. Based on the Akaike information criterion (AIC), it can be shown that using air quality indices instead of single measures increases the model strength. However, using the UTCI in the model does not give additional information compared to mean air temperature. Interaction between the 3-day average of air quality (max PM10, max CAQI and max MPI) and meteorology (mean air temperature and mean UTCI) did not improve the models. Using the mean air temperature, we found immediate effects of heat stress (RR 1.0013, 95% CI: 0.9983-1.0043) and by 3 days delayed effects of cold stress (RR: 1.0184, 95% CI: 1.0117-1.0252). The results for air quality differ between both air quality indices and PM10. CAQI and MPI show a delayed impact on morbidity with a maximum RR after 2 days (MPI 1.0058, 95% CI: 1.0013-1.0102; CAQI 1.0068, 95% CI: 1.0030-1.0107). Latitude was identified as a significant meta-variable, whereas the number of inhabitants was not significant in the model.

  16. Productivity, embryo and eggshell characteristics, and contaminants in bald eagles from the Great Lakes, USA, 1986 to 2000.

    PubMed

    Best, David A; Elliott, Kyle H; Bowerman, William W; Shieldcastle, Mark; Postupalsky, Sergej; Kubiak, Timothy J; Tillitt, Donald E; Elliott, John E

    2010-07-01

    Chlorinated hydrocarbon concentrations in eggs of fish-eating birds from contaminated environments such as the Great Lakes of North America tend to be highly intercorrelated, making it difficult to elucidate mechanisms causing reproductive impairment, and to ascribe cause to specific chemicals. An information-theoretic approach was used on data from 197 salvaged bald eagle (Haliaeetus leucocephalus) eggs (159 clutches) that failed to hatch in Michigan and Ohio, USA (1986-2000). Contaminant levels declined over time while eggshell thickness increased, and by 2000 was at pre-1946 levels. The number of occupied territories and productivity increased during 1981 to 2004. For both the entire dataset and a subset of nests along the Great Lakes shoreline, polychlorinated biphenyls (ΣPCBs, fresh wet wt) were generally included in the most parsimonious models (lowest Akaike's information criterion [AIC]) describing productivity, with significant declines in productivity observed above 26 µg/g ΣPCBs (fresh wet wt). Of 73 eggs with a visible embryo, eight (11%) were abnormal, including three with skewed bills, but they were not associated with known teratogens, including ΣPCBs. Eggs with visible embryos had greater concentrations of all measured contaminants than eggs without visible embryos; the most parsimonious models describing the presence of visible embryos incorporated dieldrin equivalents and dichlorodiphenyldichloroethylene (DDE). There were significant negative correlations between eggshell thickness and all contaminants, with ΣPCBs included in the most parsimonious models. There were, however, no relationships between productivity and eggshell thickness or Ratcliffe's index. The ΣPCBs and DDE were negatively associated with nest success of bald eagles in the Great Lakes watersheds, but the mechanism does not appear to be via shell quality effects, at least at current contaminant levels, while it is not clear what other mechanisms were

  17. Density dependence and risk of extinction in a small population of sea otters

    USGS Publications Warehouse

    Gerber, L.R.; Buenau, K.E.; VanBlaricom, G.

    2004-01-01

    Sea otters (Enhydra lutris (L.)) were hunted to extinction off the coast of Washington State early in the 20th century. A new population was established by translocations from Alaska in 1969 and 1970. The population currently numbers at least 550 animals. A major threat to the population is the ongoing risk of major oil spills in sea otter habitat. We apply population models to census and demographic data in order to evaluate the status of the population. We fit several density-dependent models to test for density dependence and determine plausible values for the carrying capacity (K) by comparing model goodness of fit to an exponential model. Model fits were compared using the Akaike Information Criterion (AIC). A significant negative relationship was found between the population growth rate and population size (r²=0.27, F=5.57, df=16, p<0.05), suggesting density dependence in Washington State sea otters. Information criterion statistics suggest that the logistic model is the most parsimonious, followed closely by the Beverton-Holt model. Values of K ranged from 612 to 759, with best-fit parameter estimates for the Beverton-Holt model including 0.26 for r and 612 for K. The latest (2001) population index count (555) puts the population at 87-92% of the estimated carrying capacity, above the suggested range for optimum sustainable population (OSP). Elasticity analysis was conducted to examine the effects of proportional changes in vital rates on the population growth rate (λ). The elasticity values indicate the population is most sensitive to changes in survival rates (particularly adult survival).
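
The model comparison described (density-independent vs. density-dependent growth, ranked by AIC) can be sketched as follows on synthetic data; the numbers mimic the reported r and K but are not the study's counts:

```python
import numpy as np

rng = np.random.default_rng(7)
K_true, r_true = 600.0, 0.26
N = np.linspace(100, 550, 30)  # hypothetical index counts
# Logistic-type density dependence: growth rate declines linearly with N
r_obs = r_true * (1 - N / K_true) + rng.normal(0, 0.01, N.size)

def rss_aic(y, yhat, k):
    """AIC for least-squares fits: n*ln(RSS/n) + 2k (additive constants dropped)."""
    n = y.size
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * k

# Density-independent (exponential) model: constant growth rate, 1 parameter
aic_exp = rss_aic(r_obs, np.full_like(r_obs, r_obs.mean()), 1)

# Density-dependent model: linear decline of growth rate with N, 2 parameters
b, a = np.polyfit(N, r_obs, 1)
aic_logistic = rss_aic(r_obs, a + b * N, 2)
# A negative slope b together with a lower AIC supports density dependence
```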

  18. Empirical evaluation of scoring functions for Bayesian network model selection.

    PubMed

    Liu, Zhifa; Malone, Brandon; Yuan, Changhe

    2012-01-01

    In this work, we empirically evaluate the capability of various scoring functions of Bayesian networks for recovering true underlying structures. Similar investigations have been carried out before, but they typically relied on approximate learning algorithms to learn the network structures. The suboptimal structures found by the approximation methods have unknown quality and may affect the reliability of their conclusions. Our study uses an optimal algorithm to learn Bayesian network structures from datasets generated from a set of gold standard Bayesian networks. Because all optimal algorithms always learn equivalent networks, this ensures that only the choice of scoring function affects the learned networks. Another shortcoming of the previous studies stems from their use of random synthetic networks as test cases. There is no guarantee that these networks reflect real-world data. We use real-world data to generate our gold-standard structures, so our experimental design more closely approximates real-world situations. A major finding of our study suggests that, in contrast to results reported by several prior works, the Minimum Description Length (MDL) (or equivalently, Bayesian information criterion (BIC)) consistently outperforms other scoring functions such as Akaike's information criterion (AIC), Bayesian Dirichlet equivalence score (BDeu), and factorized normalized maximum likelihood (fNML) in recovering the underlying Bayesian network structures. We believe this finding is a result of using both datasets generated from real-world applications rather than from random processes used in previous studies and learning algorithms to select high-scoring structures rather than selecting random models. Other findings of our study support existing work, e.g., large sample sizes result in learning structures closer to the true underlying structure; the BDeu score is sensitive to the parameter settings; and the fNML performs pretty well on small datasets. We also

  19. Diversity of benthic biofilms along a land use gradient in tropical headwater streams, Puerto Rico.

    PubMed

    Burgos-Caraballo, Sofía; Cantrell, Sharon A; Ramírez, Alonso

    2014-07-01

    The properties of freshwater ecosystems can be altered, directly or indirectly, by different land uses (e.g., urbanization and agriculture). Streams heavily influenced by high nutrient concentrations associated with agriculture or urbanization may present conditions that can be intolerable for many aquatic species such as macroinvertebrates and fishes. However, information with respect to how benthic microbial communities may respond to changes in stream ecosystem properties in relation to agricultural or urban land uses is limited, in particular for tropical ecosystems. In this study, diversity of benthic biofilms was evaluated in 16 streams along a gradient of land use at the Turabo watershed in Puerto Rico using terminal restriction fragment length polymorphism. Diversity indices and community structure descriptors (species richness, Shannon diversity, dominance and evenness) were calculated for both bacteria and eukaryotes for each stream. Diversity of both groups, bacteria and eukaryotes, did not show a consistent pattern with land use, since it could be high or low at streams dominated by different land uses. This suggests that diversity of biofilms may be more related to site-specific conditions than to watershed-scale factors. To assess this contention, the relationship between biofilm diversity and reach-scale parameters (i.e., nutrient concentrations, canopy cover, conductivity, and dissolved oxygen) was determined using the Akaike Information Criterion corrected for small sample sizes (AICc). Results indicated that nitrate was the variable that best explained variations in biofilm diversity. Since nitrate concentrations tend to increase with urban land use, our results suggest that urbanization may indeed increase microbial diversity indirectly by increasing nutrients in stream water.

  20. Critical thresholds associated with habitat loss: a review of the concepts, evidence, and applications.

    PubMed

    Swift, Trisha L; Hannon, Susan J

    2010-02-01

    A major conservation concern is whether population size and other ecological variables change linearly with habitat loss, or whether they suddenly decline more rapidly below a "critical threshold" level of habitat. The most commonly discussed explanations for critical threshold responses to habitat loss focus on habitat configuration. As habitat loss progresses, the remaining habitat is increasingly fragmented or the fragments are increasingly isolated, which may compound the effects of habitat loss. In this review we also explore other possible explanations for apparently nonlinear relationships between habitat loss and ecological responses, including Allee effects and time lags, and point out that some ecological variables will inherently respond nonlinearly to habitat loss even in the absence of compounding factors. In the literature, both linear and nonlinear ecological responses to habitat loss are evident among simulation and empirical studies, although the presence and value of critical thresholds are influenced by characteristics of the species (e.g. dispersal, reproduction, area/edge sensitivity) and landscape (e.g. fragmentation, matrix quality, rate of change). With enough empirical support, such trends could be useful for making important predictions about species' responses to habitat loss, to guide future research on the underlying causes of critical thresholds, and to make better informed management decisions. Some have seen critical thresholds as a means of identifying conservation targets for habitat retention. We argue that in many cases this may be misguided, and that the meaning (and utility) of a critical threshold must be interpreted carefully and in relation to the response variable and management goal. Despite recent interest in critical threshold responses to habitat loss, most studies have not used any formal statistical methods to identify their presence or value. Methods that have been used include model comparisons using Akaike
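
A formal statistical test of the kind the review calls for can be sketched as an AIC comparison between a simple linear response and a broken-stick (threshold) model, here on synthetic data with a known breakpoint:

```python
import numpy as np

rng = np.random.default_rng(3)
habitat = np.linspace(0, 1, 60)  # proportion of habitat remaining
true_thresh = 0.3
# Response is flat above the threshold and declines sharply below it
response = np.where(habitat < true_thresh,
                    habitat / true_thresh, 1.0) + rng.normal(0, 0.05, 60)

def rss_aic(y, yhat, k):
    """AIC for least-squares fits: n*ln(RSS/n) + 2k (additive constants dropped)."""
    n = y.size
    return n * np.log(np.sum((y - yhat) ** 2) / n) + 2 * k

# Simple linear model (2 parameters)
b, a = np.polyfit(habitat, response, 1)
aic_linear = rss_aic(response, a + b * habitat, 2)

# Broken-stick model: grid search over candidate breakpoints
# (5 parameters: the breakpoint plus a slope/intercept per segment)
aic_thresh = np.inf
for t in np.linspace(0.1, 0.9, 33):
    lo, hi = habitat < t, habitat >= t
    if lo.sum() < 3 or hi.sum() < 3:
        continue
    yhat = np.empty_like(response)
    for mask in (lo, hi):
        s, c = np.polyfit(habitat[mask], response[mask], 1)
        yhat[mask] = c + s * habitat[mask]
    aic_thresh = min(aic_thresh, rss_aic(response, yhat, 5))
# A lower AIC for the broken-stick model is evidence for a critical threshold
```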

  1. Broad-scale predictors of canada lynx occurrence in eastern North America

    USGS Publications Warehouse

    Hoving, C.L.; Harrison, D.J.; Krohn, W.B.; Joseph, R.A.; O'Brien, M.

    2005-01-01

    The Canada lynx (Lynx canadensis) is listed as a threatened species throughout the southern extent of its geographic range in the United States. Most research on lynx has been conducted in the western United States and Canada; little is known about the ecology of lynx in eastern North America. To fill critical knowledge gaps about this species, we modeled and mapped lynx occurrence using habitat and weather data from 7 eastern states and 3 Canadian provinces. Annual snowfall, road density, bobcat (L. rufus) harvest, deciduous forest, and coniferous forest were compared at 1,150 lynx locations and 1,288 random locations. Nineteen a priori models were developed using the information-theoretic approach, and logistic regression models were ranked using Akaike's Information Criterion (AIC) and by our ability to correctly classify reserved data (Kappa). Annual snowfall and deciduous forest predicted lynx presence and absence for a reserved dataset (n = 278) with 94% accuracy. A map of the probability of lynx occurrence throughout the region revealed that 92% of the potential habitat (i.e., >50% probability of occurrence) was concentrated in a relatively contiguous complex encompassing northern Maine, New Brunswick, and the Gaspe?? peninsula of Quebec. Most of the remaining potential habitat (5%) was on northern Cape Breton Island in Nova Scotia. Potential habitat in New Hampshire, Vermont, and New York was small (1,252 km2), fragmented, and isolated (>200 km) from known lynx populations. When federally listed as threatened in the contiguous United States in 2000, inadequate regulations on federal lands were cited as the primary threat to Canada lynx. However, the majority of potential lynx habitat in the eastern United States is on private lands and continuous with potential habitat in Canada. Therefore, lynx conservation in eastern North America will need to develop partnerships across national, state, and provincial boundaries as well as with private landowners.
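
Cohen's kappa, used above to score classification of the reserved data, corrects raw accuracy for chance agreement. A sketch with a hypothetical confusion matrix (not the study's 94%-accuracy results):

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa: agreement corrected for chance, (p_o - p_e) / (1 - p_e)."""
    n = tp + fp + fn + tn
    p_o = (tp + tn) / n  # observed agreement (raw accuracy)
    # Chance agreement from the marginal frequencies of each class
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)
    p_no = ((fn + tn) / n) * ((fp + tn) / n)
    p_e = p_yes + p_no
    return (p_o - p_e) / (1 - p_e)

# Hypothetical presence/absence confusion matrix for a reserved dataset
kappa = cohens_kappa(tp=40, fp=5, fn=10, tn=45)  # raw accuracy 0.85, kappa 0.7
```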

  2. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been the topic of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study on the temporal statistics of earthquake interoccurrence times of the seismically active Kachchh peninsula (western India) from thirteen probability distributions. Those distributions are exponential, gamma, lognormal, Weibull, Levy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Frechet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull distributions. Statistical inferences of the scale and shape parameters of these distributions are discussed from the maximum likelihood estimations and the Fisher information matrices. The latter are used as a surrogate tool to appraise the parametric uncertainty in the estimation process. Results were obtained on the basis of two goodness-of-fit criteria: the maximum likelihood criterion in its modified form, the Akaike information criterion (AIC), and the Kolmogorov-Smirnov (K-S) minimum distance criterion. These results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Levy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit poorly to the earthquake catalog of Kachchh and its adjacent regions. This study also analyzes the present-day seismicity in terms of the estimated recurrence interval and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications in a variety of practical applications including earthquake insurance, seismic zonation
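
The fitting-and-ranking procedure described (MLE fits scored by AIC and the K-S statistic) can be sketched with scipy for two of the thirteen candidate distributions, using synthetic interevent times rather than the Kachchh catalog:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic interevent times drawn from a gamma distribution (shape 4)
times = stats.gamma.rvs(4.0, scale=1.0, size=500, random_state=rng)

def fit_aic(dist, data, k, **fit_kwargs):
    """MLE fit, then AIC = 2k - 2*ln(L) and the K-S statistic for that fit."""
    params = dist.fit(data, **fit_kwargs)
    ll = dist.logpdf(data, *params).sum()
    ks = stats.kstest(data, dist.cdf, args=params).statistic
    return 2 * k - 2 * ll, ks

aic_exp, ks_exp = fit_aic(stats.expon, times, k=1, floc=0)
aic_gamma, ks_gamma = fit_aic(stats.gamma, times, k=2, floc=0)
# For these data the gamma model should win on both criteria
```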

  3. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence

    PubMed Central

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-01-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible. PMID:25745272

  4. Specific count model for investigating the related factors of cost of GERD and functional dyspepsia

    PubMed Central

    Abadi, Alireza; Chaibakhsh, Samira; Safaee, Azadeh; Moghimi-Dehkordi, Bijan

    2013-01-01

    Aim The purpose of this study is to analyze the cost of GERD and functional dyspepsia and to investigate its related factors. Background Gastro-oesophageal reflux disease (GERD) and dyspepsia are the most common symptoms of gastrointestinal disorders. Recent studies showed that the high prevalence and varied clinical presentation of these two symptoms impose an enormous economic burden on society. Cost data related to economic burden have specific characteristics, so this kind of data requires specific models. Poisson regression (PR) and negative binomial regression (NB) are the models that were used for analyzing cost data in this paper. Patients and methods This study was designed as a cross-sectional household survey from May 2006 to December 2007 on a random sample of individuals in the Tehran province, Iran, to find the prevalence of gastrointestinal symptoms and disorders and their related factors. The cost of each item was counted. PR and NB models were fitted to the data. The likelihood ratio test was performed for comparison between the models. Log-likelihood, the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) were also used to compare the performance of the models. Results According to the likelihood ratio test and all three criteria used to compare the performance of the models, NB was the best model for analyzing these cost data. Sex, age and insurance status were significant. Conclusion PR and NB models were fitted to the data, and the improved fit of the NB model over PR clearly indicates that over-dispersion is involved due to unobserved heterogeneity and/or clustering. For cost data, the NB model provides a more appropriate fit than PR. PMID:24834282
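
The Poisson-vs-NB comparison can be sketched without any specialized package: fit both by maximum likelihood on overdispersed synthetic counts (not the survey's cost data) and compare AICs:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
# Overdispersed counts (variance >> mean), as is typical for cost/utilisation data
y = stats.nbinom.rvs(2, 0.2, size=500, random_state=rng)

# Poisson MLE is the sample mean; one free parameter
lam = y.mean()
ll_pois = stats.poisson.logpmf(y, lam).sum()
aic_pois = 2 * 1 - 2 * ll_pois

# Negative binomial MLE over (size, prob); two free parameters
def nll(params):
    n, p = params
    if n <= 0 or not (0 < p < 1):
        return np.inf
    return -stats.nbinom.logpmf(y, n, p).sum()

res = optimize.minimize(nll, x0=[1.0, 0.5], method="Nelder-Mead")
aic_nb = 2 * 2 + 2 * res.fun
# Over-dispersion makes the NB model win despite its extra parameter
```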

  5. How good is crude MDL for solving the bias-variance dilemma? An empirical investigation based on Bayesian networks.

    PubMed

    Cruz-Ramírez, Nicandro; Acosta-Mesa, Héctor Gabriel; Mezura-Montes, Efrén; Guerra-Hernández, Alejandro; Hoyos-Rivera, Guillermo de Jesús; Barrientos-Martínez, Rocío Erandi; Gutiérrez-Fragoso, Karina; Nava-Fernández, Luis Alonso; González-Gaspar, Patricia; Novoa-del-Toro, Elva María; Aguilera-Rueda, Vicente Josué; Ameca-Alducin, María Yaneli

    2014-01-01

    The bias-variance dilemma is a well-known and important problem in Machine Learning. It basically relates the generalization capability (goodness of fit) of a learning method to its corresponding complexity. When we have enough data at hand, it is possible to use these data in such a way as to minimize overfitting (the risk of selecting a complex model that generalizes poorly). Unfortunately, there are many situations where we simply do not have this required amount of data. Thus, we need to find methods capable of efficiently exploiting the available data while avoiding overfitting. Different metrics have been proposed to achieve this goal: the Minimum Description Length principle (MDL), Akaike's Information Criterion (AIC) and Bayesian Information Criterion (BIC), among others. In this paper, we focus on crude MDL and empirically evaluate its performance in selecting models with a good balance between goodness of fit and complexity: the so-called bias-variance dilemma, decomposition or tradeoff. Although the graphical interaction between these dimensions (bias and variance) is ubiquitous in the Machine Learning literature, few works present experimental evidence to recover such interaction. In our experiments, we argue that the resulting graphs allow us to gain insights that are difficult to unveil otherwise: that crude MDL naturally selects balanced models in terms of bias-variance, which need not be the gold-standard ones. We carry out these experiments using a specific model: a Bayesian network. In spite of these motivating results, we should also not overlook three other components that may significantly affect the final model selection: the search procedure, the noise rate and the sample size.

  6. Factors associated with utilization of antenatal care services in Balochistan province of Pakistan: An analysis of the Multiple Indicator Cluster Survey (MICS) 2010

    PubMed Central

    Ghaffar, Abdul; Pongponich, Sathirakorn; Ghaffar, Najma; Mehmood, Tahir

    2015-01-01

    Objective: The study was conducted to identify factors affecting the utilization of Antenatal Care (ANC) in Balochistan Province, Pakistan. Methods: Data on ANC utilization, together with social and economic determinants, were derived from a Multiple Indicator Cluster Survey (MICS) conducted in Balochistan in 2010. The analysis was conducted including 2339 women who gave birth in last two years preceding the survey. The researchers established a model to identify influential factors contributing to the utilization of ANC by logistic regression; model selection was by Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). Results: Household wealth, education, health condition, age at first marriage, number of children and spouse violence justification were found to be significantly associated with ANC coverage. Literate mothers are 2.45 times more likely to have ANC, and women whose newborns showed symptoms of illness at birth that needed hospitalization are 0.47 times less likely to access ANC. Women with an increase in the number of surviving children are 1.07 times less likely to have ANC, and those who think their spouse violence is socially justified are 1.36 times less likely to have ANC. The results draw attention towards evidence based planning of factors associated with utilization of ANC in the Balochistan province. Conclusion: The study reveals that women from high wealth index and having education had more chances to get ANC. Factors like younger age of the women at first marriage, increased number of children, symptoms of any illness to neonates at birth that need hospitalization and women who justify spouse violence had less chances to get ANC. Among components of ANC urine sampling and having tetanus toxoid (TT) in the last pregnancy increased the frequency of visits. ANC from a doctor decreased the number of visits. There is dire need to reduce disparities for wealth index, education and urban/rural living. PMID:26870113

  7. Empirical evaluation of scoring functions for Bayesian network model selection

    PubMed Central

    2012-01-01

    In this work, we empirically evaluate the capability of various scoring functions of Bayesian networks for recovering true underlying structures. Similar investigations have been carried out before, but they typically relied on approximate learning algorithms to learn the network structures. The suboptimal structures found by the approximation methods have unknown quality and may affect the reliability of their conclusions. Our study uses an optimal algorithm to learn Bayesian network structures from datasets generated from a set of gold standard Bayesian networks. Because all optimal algorithms always learn equivalent networks, this ensures that only the choice of scoring function affects the learned networks. Another shortcoming of the previous studies stems from their use of random synthetic networks as test cases. There is no guarantee that these networks reflect real-world data. We use real-world data to generate our gold-standard structures, so our experimental design more closely approximates real-world situations. A major finding of our study suggests that, in contrast to results reported by several prior works, the Minimum Description Length (MDL) (or equivalently, Bayesian information criterion (BIC)) consistently outperforms other scoring functions such as Akaike's information criterion (AIC), Bayesian Dirichlet equivalence score (BDeu), and factorized normalized maximum likelihood (fNML) in recovering the underlying Bayesian network structures. We believe this finding is a result of using both datasets generated from real-world applications rather than from random processes used in previous studies and learning algorithms to select high-scoring structures rather than selecting random models. Other findings of our study support existing work, e.g., large sample sizes result in learning structures closer to the true underlying structure; the BDeu score is sensitive to the parameter settings; and the fNML performs pretty well on small datasets. We also

  8. Canonical information analysis

    NASA Astrophysics Data System (ADS)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-03-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis, introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical airborne data. The simulation study shows that canonical information analysis is as accurate as and much faster than algorithms presented in previous work, especially for large sample sizes.
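    The paper's fast kernel density estimator is not reproduced here; as a toy illustration of the core idea (replace linear correlation with mutual information between linear combinations of the two variable sets), a brute-force sketch using a simple histogram estimator of mutual information, with purely synthetic data:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Plug-in (histogram) estimate of mutual information, in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
n = 2000
s = rng.normal(size=n)
# Two bivariate sets sharing a *nonlinear* dependence (s vs. s**2):
# linear canonical correlation scores it near zero, mutual
# information does not.
X = np.c_[s, rng.normal(size=n)]
Y = np.c_[s**2, rng.normal(size=n)]

# Brute-force search over projection angles for the pair of linear
# combinations with maximal estimated mutual information.
angles = np.linspace(0.0, np.pi, 32, endpoint=False)
best = max(
    ((a, b, mutual_information(X @ [np.cos(a), np.sin(a)],
                               Y @ [np.cos(b), np.sin(b)]))
     for a in angles for b in angles),
    key=lambda t: t[2],
)
```

    The grid search stands in for the optimization used in the paper; the point is only that maximizing mutual information, unlike correlation, recovers nonlinear associations between the projected variables.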

  9. Entropy and information optics

    NASA Astrophysics Data System (ADS)

    Yu, Francis T. S.

    2000-03-01

    In this paper we shall begin our discussion with the relationship between optics and humans, in which we see that light has indeed provided us with a very valuable source of information. A general optical communication concept is discussed, in which we see that a picture is indeed worth more than a thousand words. Based on Shannon's information theory, one can show that entropy and information can be simply traded. One of the most intriguing laws of thermodynamics must be the second law, in which we have found that there exists a profound relationship between physical entropy and information. Without this relationship, information theory would be totally useless in physical science. By applying this relationship, Maxwell's demon and the diffraction-limited demon are discussed. And finally, samples of information optics are provided.
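    The abstract does not give the quantitative form of the entropy-information trade; a standard quantification (Landauer's bound, offered here as an assumed illustration rather than necessarily the form the author uses) is that erasing one bit of information costs at least k_B T ln 2 of energy:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def min_erasure_energy_joules(bits, temperature_k):
    """Minimum thermodynamic cost of erasing `bits` of information at
    temperature T (Landauer's bound): E >= k_B * T * ln(2) per bit."""
    return bits * K_B * temperature_k * math.log(2)

# Erasing one bit at room temperature (300 K) costs on the order of 1e-21 J.
e_one_bit = min_erasure_energy_joules(1, 300.0)
```

    The tiny magnitude of this bound is why the entropy cost of information was negligible in classical optics, yet it is exactly the link that rescues the second law from Maxwell's demon.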

  10. Regional Health Information Systems

    PubMed Central

    Fuller, Sherrilynne

    1997-01-01

    In general, there is agreement that robust integrated information systems are the foundation for building successful regional health care delivery systems. Integrated Advanced Information Management System (IAIMS) institutions that, over the years, have developed strategies for creating cohesive institutional information systems and services are finding that IAIMS strategies work well in the even more complex regional environment. The key elements of IAIMS planning are described, and lessons learned are discussed in the context of regional health information system development. The challenges of aligning the various information agencies and agendas in support of a regional health information system are complex; however, the potential rewards for health care in quality, efficacy, and cost savings are enormous. PMID:9067887

  11. ECONOMICS OF INFORMATION SYSTEMS

    DTIC Science & Technology

    The paper presents a study of the rational choice-making of an individual from among available information systems, or available components of such...components, of information systems. The available set depends on the choices made by suppliers. Joint choices by demanders and suppliers would...determine which information systems are in fact produced and used under given external conditions. These conditions include the technological knowledge of those concerned.

  12. Management Information Systems Research.

    DTIC Science & Technology

    Research on management information systems is elusive in many respects. Part of the basic research problem in MIS stems from the absence of standard...definitions and the lack of a unified body of theory. Organizations continue to develop large and often very efficient information systems, but...decision making. But the transition from these results to the realization of 'satisfactory' management information systems remains difficult indeed. The

  13. Knowledge and information modeling.

    PubMed

    Madsen, Maria

    2010-01-01

    This chapter gives an educational overview of:
    * commonly used modelling methods and what they represent
    * the importance of selecting the tools and methods suited to the health information system being designed
    * how the quality of the information or knowledge model is determined by the quality of the system requirements specification
    * differentiating between the purpose of information models and knowledge models
    * the benefits of the openEHR approach for health care data modeling.

  14. Freedom of Information Act

    USGS Publications Warehouse

    Newman, D.J.

    2012-01-01

    The Freedom of Information Act (FOIA), 5 U.S.C. § 552, as amended, generally provides that any person has a right to request access to Federal agency records. The USGS proactively promotes information disclosure as inherent to its mission of providing objective science to inform decisionmakers and the general public. USGS scientists disseminate up-to-date and historical scientific data that are critical to addressing national and global priorities.

  15. Instant Random Information

    NASA Astrophysics Data System (ADS)

    Abramson, Nils H.

    2010-12-01

    Information is carried by matter or by energy, and thus Einstein stated that "no information can travel faster than light." He was also very critical of the "spooky action at a distance" described in quantum physics. However, many verified experiments have proven that the "spooky actions" not only work at a distance but also travel at a velocity faster than light, probably at infinite velocity. Examples are Young's fringes at low light levels, or entanglement. My explanation is that this information is without energy. In the following I will refer to this spooky information as exformation, where "ex-" refers to existence; the information is not transported in any way, it simply exists. Thus Einstein might have been wrong when he stated that no information can travel faster than light. But he was right in that no detectable information can travel faster than light. Phenomena connected to entanglement appear at first to be exceptions, but in those cases the information cannot be reconstructed until energy is later sent in the form of correlation, using ordinary information at the velocity of light. In entanglement we see that even if the exformation cannot be detected directly because of its lack of energy, it can still influence what happens at random, because in quantum physics there is by definition no energy difference between two states that happen randomly.

  16. CLAMS Data and Information

    Atmospheric Science Data Center

    2016-06-14

    The Chesapeake Lighthouse and Aircraft Measurements for Satellites (CLAMS) field campaign was conducted from NASA Wallops Flight Facility, covering the middle Atlantic eastern seaboard from July 10 - ...

  17. OLEM Calendar Information

    EPA Pesticide Factsheets

    This asset includes the Office of Land and Emergency Management (OLEM) Calendar Information, which comprises three OLEM Calendars: the OLEM Calendar, the OLEM Meetings and Conference Calls Calendar and the OLEM Training and Development Calendar. --The OLEM Calendar is used as a means of sharing information about OLEM activities, due dates, meetings, conferences, audit followups, and other relevant internal information. Specific OLEM personnel have access to add and edit information. --The OLEM Meetings and Conference Calls Calendar contains national meetings and conference calls with Regions and other relevant personnel. --The OLEM Training and Development Calendar tracks OLEM training opportunities.

  18. Quantum information causality.

    PubMed

    Pitalúa-García, Damián

    2013-05-24

    How much information can a transmitted physical system fundamentally communicate? We introduce the principle of quantum information causality, which states the maximum amount of quantum information that a quantum system can communicate as a function of its dimension, independently of any previously shared quantum physical resources. We present a new quantum information task, whose success probability is upper bounded by the new principle, and show that an optimal strategy to perform it combines the quantum teleportation and superdense coding protocols with a task that has classical inputs.

  19. Neurobiology as Information Physics.

    PubMed

    Street, Sterling

    2016-01-01

    This article reviews thermodynamic relationships in the brain in an attempt to consolidate current research in systems neuroscience. The present synthesis supports proposals that thermodynamic information in the brain can be quantified to an appreciable degree of objectivity, that many qualitative properties of information in systems of the brain can be inferred by observing changes in thermodynamic quantities, and that many features of the brain's anatomy and architecture illustrate relatively simple information-energy relationships. The brain may provide a unique window into the relationship between energy and information.

  20. Neurobiology as Information Physics

    PubMed Central

    Street, Sterling

    2016-01-01

    This article reviews thermodynamic relationships in the brain in an attempt to consolidate current research in systems neuroscience. The present synthesis supports proposals that thermodynamic information in the brain can be quantified to an appreciable degree of objectivity, that many qualitative properties of information in systems of the brain can be inferred by observing changes in thermodynamic quantities, and that many features of the brain’s anatomy and architecture illustrate relatively simple information-energy relationships. The brain may provide a unique window into the relationship between energy and information. PMID:27895560

  1. Regulatory Information By Sector

    EPA Pesticide Factsheets

    Find environmental regulatory, compliance, & enforcement information for various business, industry and government sectors, listed by NAICS code. Sectors include agriculture, automotive, petroleum manufacturing, oil & gas extraction & other manufacturing

  2. Climate Change: Basic Information

    MedlinePlus

    EPA, United States Environmental Protection Agency. Climate Change: Basic Information. Climate change is ...

  3. Value of Information spreadsheet

    DOE Data Explorer

    Trainor-Guitton, Whitney

    2014-05-12

    This spreadsheet represents the information posteriors derived from synthetic magnetotelluric (MT) data. These were used to calculate the value of information of MT for geothermal exploration. Information posteriors describe how well MT was able to locate the "throat" of clay caps, which are indicative of hidden geothermal resources. These data are fully explained in the peer-reviewed publication: Trainor-Guitton, W., Hoversten, G. M., Ramirez, A., Roberts, J., Júlíusson, E., Key, K., and Mellors, R. (Sept.-Oct. 2014), The value of spatial information for determining well placement: a geothermal example, Geophysics.

  4. Information entropy in cosmology.

    PubMed

    Hosoya, Akio; Buchert, Thomas; Morita, Masaaki

    2004-04-09

    The effective evolution of an inhomogeneous cosmological model may be described in terms of spatially averaged variables. We point out that in this context, quite naturally, a measure arises which is identical to a fluid model of the Kullback-Leibler relative information entropy, expressing the distinguishability of the local inhomogeneous mass density field from its spatial average on arbitrary compact domains. We discuss the time evolution of "effective information" and explore some implications. We conjecture that the information content of the Universe, measured by the relative information entropy of a cosmological model containing dust matter, is increasing.
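    A minimal numerical sketch of the kind of measure described, the Kullback-Leibler distinguishability of a positive density field from its spatial average; the function name and the toy fields are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def kl_distinguishability(rho):
    """Discrete analogue of the relative-entropy measure: the mean of
    rho * ln(rho / <rho>) over the domain.  It is non-negative (by
    Jensen's inequality) and zero exactly when the density field
    equals its spatial average everywhere."""
    rho = np.asarray(rho, dtype=float)
    return float(np.mean(rho * np.log(rho / rho.mean())))

homogeneous = np.ones(64)                     # indistinguishable from its average
clumpy = np.full(64, 0.2)                     # mostly empty domain...
clumpy[:8] = 7.4                              # ...with mass concentrated in a few cells
```

    On these toy fields the homogeneous case gives exactly zero, while the clumpy case gives a strictly positive value that grows with the inhomogeneity, mirroring the conjecture that structure formation increases the relative information entropy.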

  5. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.

  6. State Demolition Information

    EPA Pesticide Factsheets

    Contact information and guidance for each state's and selected territories' environmental agencies and programs relevant to large-scale residential demolition, including asbestos, lead, and open burning.

  7. Health Information Systems.

    PubMed

    Sirintrapun, S Joseph; Artz, David R

    2015-06-01

    This article provides surgical pathologists an overview of health information systems (HISs): what they are, what they do, and how such systems relate to the practice of surgical pathology. Much of this article is dedicated to the electronic medical record. Information, in how it is captured, transmitted, and conveyed, drives the effectiveness of such electronic medical record functionalities. So critical is information from pathology in integrated clinical care that surgical pathologists are becoming gatekeepers of not only tissue but also information. Better understanding of HISs can empower surgical pathologists to become stakeholders who have an impact on the future direction of quality integrated clinical care.

  8. Health Information Systems.

    PubMed

    Sirintrapun, S Joseph; Artz, David R

    2016-03-01

    This article provides surgical pathologists an overview of health information systems (HISs): what they are, what they do, and how such systems relate to the practice of surgical pathology. Much of this article is dedicated to the electronic medical record. Information, in how it is captured, transmitted, and conveyed, drives the effectiveness of such electronic medical record functionalities. So critical is information from pathology in integrated clinical care that surgical pathologists are becoming gatekeepers of not only tissue but also information. Better understanding of HISs can empower surgical pathologists to become stakeholders who have an impact on the future direction of quality integrated clinical care.

  9. Continuous information flow fluctuations

    NASA Astrophysics Data System (ADS)

    Rosinberg, Martin Luc; Horowitz, Jordan M.

    2016-10-01

    Information plays a pivotal role in the thermodynamics of nonequilibrium processes with feedback. However, much remains to be learned about the nature of information fluctuations in small-scale devices and their relation with fluctuations in other thermodynamics quantities, like heat and work. Here we derive a series of fluctuation theorems for information flow and partial entropy production in a Brownian particle model of feedback cooling and extend them to arbitrary driven diffusion processes. We then analyze the long-time behavior of the feedback-cooling model in detail. Our results provide insights into the structure and origin of large deviations of information and thermodynamic quantities in autonomous Maxwell's demons.

  10. Informal medicine: ethical analysis

    PubMed Central

    Leavitt, F; Peleg, R; Peleg, A

    2005-01-01

    Context: Doctors have been known to treat or give consultation to patients informally, with none of the usual record keeping or follow up. They may wish to know whether this practice is ethical. Objective: To determine whether this practice meets criteria of medical ethics. Design: Informal medicine is analysed according to standard ethical principles: autonomy, beneficence and non-maleficence, distributive and procedural justice, and caring. Setting: Hospital, medical school, and other settings where patients may turn to physicians for informal help. Conclusion: No generalisation can be made to the effect that informal medicine is or is not ethical. Each request for informal consultation must be considered on its own merits. Guidelines: Informal medicine may be ethical if no payment is involved, and when the patient is fully aware of the benefits and risks of a lack of record keeping. When an informal consultation does not entail any danger to the patient or others, the physician may agree to the request. If, however, any danger to the patient or others is foreseen, then the physician must insist on professional autonomy, and consider refusing the request and persuading the patient to accept formal consultation. If a reportable infectious disease, or other serious danger to the community, is involved, the physician should refuse informal consultation or treatment, or at least make a proper report even if the consultation was informal. If agreeing to the request will result in an unfair drain on the physician's time or energy, he or she should refuse politely. PMID:16319228

  11. Addressing Information Security Risk

    ERIC Educational Resources Information Center

    Qayoumi, Mohammad H.; Woody, Carol

    2005-01-01

    Good information security does not just happen--and often does not happen at all. Resources are always in short supply, and there are always other needs that seem more pressing. Why? Because information security is hard to define, the required tasks are unclear, and the work never seems to be finished. However, the loss to the organization can be…

  12. Information Retrieval System.

    ERIC Educational Resources Information Center

    Mahle, Jack D., Jr.

    The Fort Detrick Information Retrieval System is a system of computer programs written in COBOL for a CDC 3150 to store and retrieve information about the scientific and technical reports and documents of the Fort Detrick Technical Library. The documents and reports have been abstracted and indexed. This abstract, the subject matter descriptors,…

  13. Management of Electronic Information.

    ERIC Educational Resources Information Center

    Breaks, Michael

    This paper discusses the management of library collections of electronic information resources within the classical theoretical framework of collection development and management. The first section provides an overview of electronic information resources, including bibliographic databases, electronic journals, journal aggregation services, and…

  14. Mandarin Visual Speech Information

    ERIC Educational Resources Information Center

    Chen, Trevor H.

    2010-01-01

    While the auditory-only aspects of Mandarin speech are heavily-researched and well-known in the field, this dissertation addresses its lesser-known aspects: The visual and audio-visual perception of Mandarin segmental information and lexical-tone information. Chapter II of this dissertation focuses on the audiovisual perception of Mandarin…

  15. Environmental geographic information system.

    SciTech Connect

    Peek, Dennis W; Helfrich, Donald Alan; Gorman, Susan

    2010-08-01

    This document describes how the Environmental Geographic Information System (EGIS) was used, along with externally received data, to create maps for the Site-Wide Environmental Impact Statement (SWEIS) Source Document project. Data quality among the various classes of geographic information system (GIS) data is addressed. A complete listing of map layers used is provided.

  16. Constructor theory of information

    PubMed Central

    Deutsch, David; Marletto, Chiara

    2015-01-01

    We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible—i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems). PMID:25663803

  17. The right to information.

    PubMed

    Kubiak, Rafał

    2014-01-01

    The right to self-determination, including the decision on treatment, is affirmed in modern societies. Therefore, the fundamental condition of legal procedures is informed consent of a patient or an authorised person. However, to make the consent legally effective, some conditions have to be met; of these, the provision of comprehensive medical information is of the utmost importance. Thus, a patient is entitled to necessary information provided by a physician. The correlate of this right is the obligation to disclose information which must be fulfilled by a medical practitioner. The aim of this review is to examine this obligation in terms of determining the range of subjects authorised to provide information, the scope of subject information or a set of data, and the manner and time in which it should be given. Moreover, this article discusses regulations which permit limitations of information disclosure, i.e. the patient's entitlement to renounce the right to information, and therapeutic privilege. The disquisition regards achievements of legal doctrine and judicature, from the angle of which all the legal solutions and doubts arising are presented.

  18. Dimensions of Drug Information

    ERIC Educational Resources Information Center

    Sharp, Mark E.

    2011-01-01

    The high number, heterogeneity, and inadequate integration of drug information resources constitute barriers to many drug information usage scenarios. In the biomedical domain there is a rich legacy of knowledge representation in ontology-like structures that allows us to connect this problem both to the very mature field of library and…

  19. Electronic Information Retrieval.

    ERIC Educational Resources Information Center

    Holzberg, Carol S.

    1989-01-01

    Discusses ways in which online searching promotes the development of research, computer, and critical thinking skills, and reviews two databases that offer access to information on elementary school science. Lesson activities for grades four through six are suggested, and information on the vendor, system requirements, and cost of each database is…

  20. A Mine of Information.

    ERIC Educational Resources Information Center

    Williams, Lisa B.

    1986-01-01

    Business researchers and marketers find certain databases useful for finding information on investments, competitors, products, and markets. Colleges can use these same databases to get background on corporate prospects. The largest data source available, DIALOG Information Services and some other databases are described. (MLW)

  1. Targeted Information Dissemination

    DTIC Science & Technology

    2008-03-01

    independent articles such as blogs (short for web logs), wikis, and topic web portals. Such digital data sources offer information of varying... Biosurveillance Information System (NBIS) • Public Health Informatics Network (PHIN) • Pockets of Excellence – Integration? Solutions? Figure 16

  2. Microfilm and Information Retrieval.

    ERIC Educational Resources Information Center

    Teplitz, Arthur

    This paper was prepared to provide a frame of reference to the role of microfilm within the information retrieval world and to provide an opportunity for evaluation of the use of microforms for active retrieval applications. The paper discusses the principles of information retrieval, considers subject and classification indexing, and describes…

  3. Consumer Information. Final Report.

    ERIC Educational Resources Information Center

    CEMREL, Inc., St. Ann, MO.

    One of three projects reported by the Central Midwestern Regional Educational Laboratory included analysis of 178 existing consumer information products. Steps in the analytical scheme were preparation of an annotated bibliography and development of a plan for providing objective, comparative information on such products. These were found in the…

  4. Archival Information Management System.

    DTIC Science & Technology

    1995-02-01

    management system named Archival Information Management System (AIMS), designed to meet the audit trail requirement for studies completed under the...are to be archived to the extent that future reproducibility and interrogation of results will exist. This report presents a prototype information

  5. Interaction Information Retrieval.

    ERIC Educational Resources Information Center

    Dominich, Sandor

    1994-01-01

    Discussion of information retrieval focuses on an Interaction Information Retrieval model in which documents are interconnected; queries and documents are treated in the same way; and retrieval is the result of the interconnection between query and documents. A theoretical mathematical formulation of this type of retrieval is given. (Contains 31…

  6. The Information Explosion.

    ERIC Educational Resources Information Center

    Kuhns, William

    Three facets of the media--events, myths, and sales pitches--constitute the most important lines of force taken by the information bombardment which all of us encounter and are influenced by every day. The focus of this book is on the changes created and hastened by this information explosion of the media bombardment: how we can live with them,…

  7. Information Design: A Bibliography.

    ERIC Educational Resources Information Center

    Albers, Michael J.; Lisberg, Beth Conney

    2000-01-01

    Presents a 17-item annotated list of essential books on information design chosen by members of the InfoDesign e-mail list. Includes a 113-item unannotated bibliography of additional works, on topics of creativity and critical thinking; visual thinking; graphic design; infographics; information design; instructional design; interface design;…

  8. Information Retrieval in Physics.

    ERIC Educational Resources Information Center

    Herschman, Arthur

    Discussed in this paper are the information problems in physics and the current program of the American Institute of Physics (AIP) being conducted in an attempt to develop an information retrieval system. The seriousness of the need is described by means of graphs indicating the exponential rise in the number of physics publications in the last…

  9. EDUCATIONAL INFORMATION PROJECT.

    ERIC Educational Resources Information Center

    LINDQUIST, E.F.; AND OTHERS

    To aid data collection, analysis, storage, and dissemination, instruments and procedures were developed for collecting information on all aspects of the educational program for a large population of schools, including information on individual pupils, school personnel, schools, and school districts. Computer programs and data-processing techniques…

  10. Hybrid quantum information processing

    SciTech Connect

    Furusawa, Akira

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  11. Medical Information Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Hipkins, K. R.; Friedman, C. A.

    1979-01-01

    On-line interactive information processing system easily and rapidly handles all aspects of data management related to patient care. General purpose system is flexible enough to be applied to other data management situations found in areas such as occupational safety data, judicial information, or personnel records.

  12. Conceptualizing an Information Commons.

    ERIC Educational Resources Information Center

    Beagle, Donald

    1999-01-01

    Concepts from Strategic Alignment, a technology-management theory, are used to discuss the Information Commons as a new service-delivery model in academic libraries. The Information Commons, as a conceptual, physical, and instructional space, involves an organizational realignment from print to the digital environment. (Author)

  13. Normalized medical information visualization.

    PubMed

    Sánchez-de-Madariaga, Ricardo; Muñoz, Adolfo; Somolinos, Roberto; Castro, Antonio; Velázquez, Iker; Moreno, Oscar; García-Pacheco, José L; Pascual, Mario; Salvador, Carlos H

    2015-01-01

    A new mark-up programming language is introduced in order to facilitate and improve the visualization of ISO/EN 13606 dual model-based normalized medical information. This is the first time that visualization of normalized medical information is addressed and the programming language is intended to be used by medical non-IT professionals.

  14. Taking Information Literacy Online.

    ERIC Educational Resources Information Center

    Levesque, Carla

    2003-01-01

    Explores the process of designing, teaching, and revising an online information literacy course at St. Petersburg College (SPC) (Florida). Shares methods for encouraging participation in online courses and ways of tracking students' progress. Reports that basic computer information and literacy is now a graduation requirement at SBC. Contains…

  15. Evaluating Health Information

    MedlinePlus

    Millions of consumers get health information from magazines, TV or the Internet. Some of the information is reliable and up to date; some is not. ... a branch of the government, a university, a health organization, a hospital or a business? Focus on ...

  16. Career Information Handbook.

    ERIC Educational Resources Information Center

    Texas State Technical Inst., Waco.

    The handbook is a companion volume to "High School Career Interest and Information Survey" but its use extends to high school counselors, teachers, administrators and their students as an independent reference tool for occupational information. The manual is divided into sections corresponding to the fifteen career clusters identified by the U.S.…

  17. Distributed Information Management.

    ERIC Educational Resources Information Center

    Pottenger, William M.; Callahan, Miranda R.; Padgett, Michael A.

    2001-01-01

    Reviews the scope and effects of distributed information management. Discusses cultural and social influences, including library and Internet culture, information and knowledge, electronic libraries, and social aspects of libraries; digital collections; indexing; permanent link systems; metadata; the Open Archives initiative; digital object…

  18. Marketing Information Literacy

    ERIC Educational Resources Information Center

    Seale, Maura

    2013-01-01

    In 2012, more than a decade after the original Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education (hereafter the Standards) were institutionalized as the goal of academic library instruction, the Information Literacy Competency Standards Review Task Force convened by ACRL recommended…

  19. Structuring the Information Gap.

    ERIC Educational Resources Information Center

    Edge, Julian

    1984-01-01

    Describes an information gap procedure to teach a new structure which requires students to look for and exchange information in order to complete a task in an English as a second language class. Illustrates the method with a set of materials and suggests ways for teachers to produce similar materials. (SED)

  20. Information and Productivity.

    ERIC Educational Resources Information Center

    Bearman, Toni Carbo; And Others

    1985-01-01

    Addresses the role of information technology and effective management of information resources in reversing declining rate of growth of productivity in United States. Discussion covers U.S. productivity, reasons for decline in its growth, productivity measurement, improving productivity, and related activities of National Commission on Libraries…

  1. Copyright Program Information.

    ERIC Educational Resources Information Center

    National Center for Educational Communication (DHEW/OE), Washington, DC.

    The purpose of this publication is to provide information about the U.S. Office of Education (USOE) Copyright Program. It is a supplement to the Copyright Guidelines published in the Federal Register on May 9, 1970 (available as LI 002 914) and provides information primarily for those institutions and organizations which are developing educational…

  2. Information Literacy Assessment

    ERIC Educational Resources Information Center

    Warmkessel, Marjorie M.

    2007-01-01

    This article presents an annotated list of seven recent articles on the topic of information literacy assessment. They include: (1) "The Three Arenas of Information Literacy Assessment" (Bonnie Gratch Lindauer); (2) "Testing the Effectiveness of Interactive Multimedia for Library-User Education" (Karen Markey et al.); (3)…

  3. Accessibility of Outdated Information

    ERIC Educational Resources Information Center

    O'Brien, Edward J.; Cook, Anne E.; Gueraud, Sabine

    2010-01-01

    In 2 previous studies (O'Brien, Rizzella, Albrecht, & Halleran, 1998; Zwaan & Madden, 2004), researchers have provided conflicting accounts about whether outdated information continues to influence the comprehension of subsequent text. The current set of experiments was designed to explore further the impact of outdated information on…

  4. What is quantum information?

    NASA Astrophysics Data System (ADS)

    Lombardi, Olimpia; Holik, Federico; Vanni, Leonardo

    2016-11-01

    In the present article we address the question 'What is quantum information?' from a conceptual viewpoint. In particular, we argue that there seem to be no sufficiently good reasons to accept that quantum information is qualitatively different from classical information. The view that, in the communicational context, there is only one kind of information, physically neutral, which can be encoded by means of classical or quantum states has, in turn, interesting conceptual advantages. First, it dissolves the widely discussed puzzles of teleportation without the need to assume a particular interpretation of information. Second, and from a more general viewpoint, it frees the attempts to reconstruct quantum mechanics on the basis of informational constraints from any risk of circularity; furthermore, it endows them with strong conceptual appeal and, derivatively, opens the way to the possibility of a non-reductive unification of physics. Finally, in the light of the idea of the physical neutrality of information, the wide field of research about classical models for quantum information acquires a particular conceptual and philosophical interest.

  5. Constructor theory of information.

    PubMed

    Deutsch, David; Marletto, Chiara

    2015-02-08

    We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible-i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems).

  6. Heroin. Specialized Information Service.

    ERIC Educational Resources Information Center

    Do It Now Foundation, Phoenix, AZ.

    The document presents a collection of articles about heroin. Article 1 provides general information on heroin identification, drug dependence, effects of abuse, cost, source of supply, and penalties for illegal heroin use. Article 2 gives statistical information on heroin-related deaths in the District of Columbia between 1971 and 1982. Article 3…

  7. Information Assurance Study

    DTIC Science & Technology

    1998-01-01

    be a giant step forward. We are aware that the DOD already sponsors 13 Information Analysis Centers (IAC) and the Information Assurance Technology...Norton Anti-Virus (Symantec Corporation, http://www.symantec.com); OfficeScan (Trend Micro, http://www.antivirus.com); Panda Antivirus (Panda Software, http

  8. Air System Information Management

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

    I flew to Washington last week, a trip rich in distributed information management. Buying tickets, at the gate, in flight, landing and at the baggage claim, myriad messages about my reservation, the weather, our flight plans, gates, bags and so forth flew among a variety of travel agency, airline and Federal Aviation Administration (FAA) computers and personnel. By and large, each kind of information ran on a particular application, often specialized to its own data formats and communications network. I went to Washington to attend an FAA meeting on System-Wide Information Management (SWIM) for the National Airspace System (NAS) (http://www.nasarchitecture.faa.gov/Tutorials/NAS101.cfm). NAS (and its information infrastructure, SWIM) is an attempt to bring greater regularity, efficiency and uniformity to the collection of stovepipe applications now used to manage air traffic. Current systems hold information about flight plans, flight trajectories, air turbulence, current and forecast weather, radar summaries, hazardous condition warnings, airport and airspace capacity constraints, temporary flight restrictions, and so forth. Information moving among these stovepipe systems is usually mediated by people (for example, air traffic controllers) or single-purpose applications. People, whose intelligence is critical for difficult tasks and unusual circumstances, are not as efficient as computers for tasks that can be automated. Better information sharing can lead to higher system capacity, more efficient utilization and safer operations. Better information sharing through greater automation is possible though not necessarily easy.

  9. The nature of information

    NASA Astrophysics Data System (ADS)

    Tpudlik

    2015-11-01

    In reply to Mark Buchanan's review of César Hidalgo's book How Information Grows (“The wealth of nations”, October pp40-41, http://ow.ly/SlqN3), in which Hidalgo argues that we need an “information-centric” view of economic growth.

  10. Information extraction system

    DOEpatents

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities (people, organizations, and locations) as well as relationships and events from text documents is described herein.
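
    As a toy illustration of named-entity extraction of people, organizations, and locations (a simple gazetteer-lookup sketch, not the patented meta-extraction system; all names and labels below are hypothetical):

```python
import re

# Toy gazetteers -- a stand-in for the statistical extractors a real
# meta-extraction system would combine (entries are illustrative only).
GAZETTEERS = {
    "PERSON": {"Ada Lovelace", "Alan Turing"},
    "ORGANIZATION": {"Los Alamos National Laboratory", "NASA"},
    "LOCATION": {"New Mexico", "Washington"},
}

def extract_entities(text):
    """Return sorted (entity, label) pairs found by whole-word lookup."""
    found = []
    for label, names in GAZETTEERS.items():
        for name in names:
            if re.search(r"\b" + re.escape(name) + r"\b", text):
                found.append((name, label))
    return sorted(found)

doc = "Alan Turing never visited Los Alamos National Laboratory in New Mexico."
print(extract_entities(doc))
```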

  11. Information Technology Resources Assessment

    SciTech Connect

    Not Available

    1993-04-01

    The Information Technology Resources Assessment (ITRA) is being published as a companion document to the Department of Energy (DOE) FY 1994--FY 1998 Information Resources Management Long-Range Plan. This document represents a collaborative effort between the Office of Information Resources Management and the Office of Energy Research that was undertaken to achieve, in part, the Technology Strategic Objective of IRM Vision 21. An integral part of this objective, technology forecasting provides an understanding of the information technology horizon and presents a perspective and focus on technologies of particular interest to DOE program activities. Specifically, this document provides site planners with an overview of the status and use of new information technology for their planning consideration.

  12. Economics of information

    NASA Astrophysics Data System (ADS)

    Noguchi, Mitsunori

    2000-06-01

    The economics of information covers a wide range of topics such as insurance, stochastic equilibria, the theory of finance (e.g. option pricing), job search, etc. In this paper, we focus on an economic model in which traders are uncertain about the true characteristics of commodities and know only the probability distributions of those characteristics. The traders acquire information on those characteristics via the actual consumption in the past and are allowed to exchange the information among themselves prior to the forthcoming trade. Though optimal consumption at the preceding trade generally alters optimal consumption at the succeeding trade, it may happen that they both coincide. We call this particular type of optimal consumption an information stable equilibrium (ISE). At an ISE, the traders gain no additional information from consumption, which is significant enough to revise their optimal choice at the succeeding trade.

  13. Patients' preferences for information

    PubMed Central

    Kindelan, K.; Kent, G.

    1986-01-01

    In a study of patients' views of the type of information they would like to receive from the doctor 265 patients from four general practices were given a list of five areas of information — diagnosis, prognosis, treatment, aetiology and social effects of their illness — and asked to rank these in order of importance for that visit. In general, information on diagnosis and prognosis was the most highly valued, while the ways the illness would affect daily activities was the least preferred. Although information on treatment was rarely selected as the first preference it was often the second or third preference. Conversely, diagnosis was the first choice of the largest proportion of patients and the least valued information for 26%. PMID:3440990

  14. Advanced information society (5)

    NASA Astrophysics Data System (ADS)

    Tanizawa, Ippei

    Based on advances in information network technology, information communication is forming an information-oriented society that has a significant impact on business activities and lifestyles. The information network has been supported technologically by the development of computer technology and owes much to enhanced computers and communication equipment. Information is transferred by digital and analog methods. This article describes the technical developments that produced multifunction modems for analog communication, and the construction of the advanced information communication networks that have emerged from the joint work of computing and communication under digital techniques. Trends in the institutional arrangements and standardization of electrical communication are also described, with some examples of value-added networks (VANs).

  15. Energy information directory 1994

    SciTech Connect

    Not Available

    1994-03-28

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. The two principal functions related to this task are (1) operating a general access telephone line, and (2) responding to energy-related correspondence addressed to the Energy Information Administration (EIA). The Energy Information Directory was developed to assist the NEIC staff, as well as other Department of Energy (DOE) staff, in directing inquiries to the proper offices within DOE, other Federal agencies, or energy-related trade associations. The Directory is a list of most Government offices and trade associations that are involved in energy matters. It does not include those DOE offices which do not deal with the public or public information.

  16. Informed consent in gerontology.

    PubMed

    Glock, Rosana Soibelmann; Goldim, Jose Roberto

    2003-01-01

    The aim of this study was to assess the use and adequacy of informed consent in research involving the elderly in Brazil. Using a reading index, we observed that in 83% of informed consent forms the text was considered difficult and demanded a higher schooling level than that presented by the subjects. Whereas 100% of the investigators considered the text in informed consent forms accessible, 75% of the subjects considered it hard to understand. This difference was statistically significant. Ninety-four percent of the elderly participating in research protocols made the decision to participate in the study before reading the consent form. More attention should be given both to the writing of informed consent forms and to the entire informed consent process, which, in gerontology research, should be reviewed at each encounter with study participants.

  17. Physics as Information Processing

    NASA Astrophysics Data System (ADS)

    D'Ariano, Giacomo Mauro

    2011-03-01

    I review some recent advances in foundational research at Pavia QUIT group. The general idea is that there is only Quantum Theory without quantization rules, and the whole Physics—including space-time and relativity—is emergent from the quantum-information processing. And since Quantum Theory itself is axiomatized solely on informational principles, the whole Physics must be reformulated in information-theoretical terms: this is the It from bit of J. A. Wheeler. The review is divided into four parts: a) the informational axiomatization of Quantum Theory; b) how space-time and relativistic covariance emerge from quantum computation; c) what is the information-theoretical meaning of inertial mass and of ℏ, and how the quantum field emerges; d) an observational consequence of the new quantum field theory: a mass-dependent refraction index of vacuum. I will conclude with the research lines that will follow in the immediate future.

  18. Counterspace Operations for Information Dominance

    DTIC Science & Technology

    1999-03-01

    INTERNET DOCUMENT INFORMATION FORM A. Report Title: Counterspace Operations for Information Dominance B. DATE Report Downloaded From the Internet 3/10...Representative for resolution. Counterspace Operations for Information Dominance by James G. Lee INTRODUCTION The Problem The launch of the Soviet...information gap between friendly and enemy forces. This positive information gap has been referred to as information dominance. Information Dominance The

  19. HS3 Information System

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Conover, H.; Ramachandran, R.; Kulkarni, A.; Mceniry, M.; Stone, B.

    2015-12-01

    The Global Hydrology Resource Center (GHRC) is developing an enterprise information system to manage and better serve data for Hurricane and Severe Storm Sentinel (HS3), a NASA airborne field campaign. HS3 is a multiyear campaign aimed at helping scientists understand the physical processes that contribute to hurricane intensification. For in-depth analysis, HS3 encompasses not only airborne data but also a variety of in-situ, satellite, simulation, and flight-report data. Thus, HS3 poses a unique challenge in information system design, one the GHRC team's experience with previous airborne campaigns prepares it to meet. The supplementary information and reports collected during the mission provide information-rich snapshots of the mission: in particular, flight information, instrument status, weather reports, and summary statistics offer vital knowledge about the corresponding science data and help narrow the science data of interest. Therefore, the GHRC team is building an HS3 information system that augments the current GHRC data management framework to support search and discovery of airborne science data with interactive visual exploration. Specifically, the HS3 information system includes a tool to visually play back mission flights alongside more traditional search and discovery interfaces. This playback capability allows users to follow a flight in time and visualize the collected data; flight summaries and analyzed information are also presented during playback. If the observed data are of interest, users can order them from GHRC through the interface, restricting the order to just the part of the flight they are interested in. This presentation will demonstrate the workflow from visual exploration to data download, along with the other components that comprise the HS3 information system.

  20. Next generation information systems

    SciTech Connect

    Limback, Nathan P; Medina, Melanie A; Silva, Michelle E

    2010-01-01

    The Information Systems Analysis and Development (ISAD) Team of the Safeguards Systems Group at Los Alamos National Laboratory (LANL) has been developing web based information and knowledge management systems for sixteen years. Our vision is to rapidly and cost effectively provide knowledge management solutions in the form of interactive information systems that help customers organize, archive, post and retrieve nonproliferation and safeguards knowledge and information vital to their success. The team has developed several comprehensive information systems that assist users in the betterment and growth of their organizations and programs. Through our information systems, users are able to streamline operations, increase productivity, and share and access information from diverse geographic locations. The ISAD team is also producing interactive visual models. Interactive visual models provide many benefits to customers beyond the scope of traditional full-scale modeling. We have the ability to simulate a vision that a customer may propose, without the time constraints of traditional engineering modeling tools. Our interactive visual models can be used to access specialized training areas, controlled areas, and highly radioactive areas, as well as review site-specific training for complex facilities, and asset management. Like the information systems that the ISAD team develops, these models can be shared and accessed from any location with access to the internet. The purpose of this paper is to elaborate on the capabilities of information systems and interactive visual models as well as consider the possibility of combining the two capabilities to provide the next generation of information systems. The collection, processing, and integration of data in new ways can contribute to the security of the nation by providing indicators and information for timely action to decrease the traditional and new nuclear threats. Modeling and simulation tied to comprehensive

  1. Information Technology - Information Overload for Strategic Leaders

    DTIC Science & Technology

    2007-11-02

    Report documentation page only; recoverable fields: author, Anthony Cotton; performing organization, U.S. Army War College, Carlisle Barracks, Carlisle, PA 17013-5050.

  2. 78 FR 73819 - Information Collection; Financial Information Security Request Form

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-09

    ... Forest Service Information Collection; Financial Information Security Request Form AGENCY: Forest Service... extension with revision of a currently approved information collection, Financial Information Security...: Comments concerning this notice should be addressed to Financial Policy, Mail Stop 1149, USDA,...

  3. Beyond informed consent.

    PubMed Central

    Bhutta, Zulfiqar A.

    2004-01-01

    Although a relatively recent phenomenon, the role of informed consent in human research is central to its ethical regulation and conduct. However, guidelines often recommend procedures for obtaining informed consent (usually written consent) that are difficult to implement in developing countries. This paper reviews the guidelines for obtaining informed consent and also discusses prevailing views on current controversies, ambiguities and problems with these guidelines and suggests potential solutions. The emphasis in most externally sponsored research projects in developing countries is on laborious documentation of several mechanical aspects of the research process rather than on assuring true comprehension and voluntary participation. The onus for the oversight of this process is often left to overworked and ill-equipped local ethics review committees. Current guidelines and processes for obtaining informed consent should be reviewed with the specific aim of developing culturally appropriate methods of sharing information about the research project and obtaining and documenting consent that is truly informed. Further research is needed to examine the validity and user friendliness of innovations in information sharing procedures for obtaining consent in different cultural settings. PMID:15643799

  4. Information of Open Systems

    NASA Astrophysics Data System (ADS)

    Klimontovich, Yuri L.

    In the theory of communication two definitions of the concept "information" are known. One of them coincides in form with the Boltzmann entropy. The second defines information as the difference between the unconditional and conditional entropies. In the present work the latter is used to define the information about states of open systems at various values of the control parameter. Two kinds of open systems are considered. The first class comprises systems that are in an equilibrium state when the control parameter is zero. The information about an equilibrium state is equal to zero. During self-organization, as the system departs from the equilibrium state, the information increases. For open systems of this class, a conservation law for the sum of the information and the entropy is proved for all values of the control parameter. In open systems of the second class an equilibrium state is impossible. For them the concept of a "norm of chaoticity" is introduced, which allows one to consider two kinds of self-organization processes and to give the corresponding definitions of information. The treatment is illustrated with a number of (classical and quantum) examples of physical systems; a medico-biological system is also considered.
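
    The second definition mentioned above, information as the difference between unconditional and conditional entropies, is the mutual information I(X;Y) = H(X) - H(X|Y). A minimal sketch for discrete distributions (the example distributions are illustrative, not from the paper):

```python
from collections import defaultdict
from math import log2

def entropy(dist):
    """Shannon entropy H(X) of a {outcome: probability} distribution, in bits."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) - H(X|Y), from a joint distribution {(x, y): probability}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    # H(X|Y) = sum_y p(y) * H(X | Y=y)
    h_x_given_y = 0.0
    for y0, py0 in py.items():
        cond = {x: p / py0 for (x, y), p in joint.items() if y == y0}
        h_x_given_y += py0 * entropy(cond)
    return entropy(px) - h_x_given_y

# Perfectly correlated bits: observing Y removes all uncertainty about X.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```

For independent variables the conditional entropy equals the unconditional one, so the information is zero, matching the paper's "information on an equilibrium state is equal to zero" intuition.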

  5. Ignorance, information and autonomy.

    PubMed

    Harris, J; Keywood, K

    2001-09-01

    People have a powerful interest in genetic privacy and its associated claim to ignorance, and some equally powerful desires to be shielded from disturbing information are often voiced. We argue, however, that there is no such thing as a right to remain in ignorance, where a right is understood as an entitlement that trumps competing claims. This does not of course mean that information must always be forced upon unwilling recipients, only that there is no prima facie entitlement to be protected from true or honest information about oneself. Any claims to be shielded from information about the self must compete on equal terms with claims based in the rights and interests of others. In balancing the weight and importance of rival considerations about giving or withholding information, if rights claims have any place, rights are more likely to be defensible on the side of honest communication of information rather than in defence of ignorance. The right to free speech and the right to decline to accept responsibility to take decisions for others imposed by those others seem to us more plausible candidates for fully fledged rights in this field than any purported right to ignorance. Finally, and most importantly, if the right to autonomy is invoked, a proper understanding of the distinction between claims to liberty and claims to autonomy show that the principle of autonomy, as it is understood in contemporary social ethics and English law, supports the giving rather than the withholding of information in most circumstances.

  6. Acting to gain information

    NASA Technical Reports Server (NTRS)

    Rosenchein, Stanley J.; Burns, J. Brian; Chapman, David; Kaelbling, Leslie P.; Kahn, Philip; Nishihara, H. Keith; Turk, Matthew

    1993-01-01

    This report is concerned with agents that act to gain information. In previous work, we developed agent models combining qualitative modeling with real-time control. That work, however, focused primarily on actions that affect physical states of the environment. The current study extends that work by explicitly considering problems of active information-gathering and by exploring specialized aspects of information-gathering in computational perception, learning, and language. In our theoretical investigations, we decomposed agents into their perceptual and action components and identified these with elements of a state-machine model of control. The mathematical properties of each were developed in isolation, and interactions were then studied. We considered the complexity dimension and the uncertainty dimension and related these to intelligent-agent design issues. We also explored active information gathering in visual processing. Working within the active vision paradigm, we developed a concept of 'minimal meaningful measurements' suitable for demand-driven vision. We then developed and tested an architecture for ongoing recognition and interpretation of visual information. In the area of information gathering through learning, we explored techniques for coping with combinatorial complexity. We also explored information gathering through explicit linguistic action by considering the nature of conversational rules, coordination, and situated communication behavior.

  7. Part I. The Delivery of Information Services within a Changing Information Environment. The Changing Information Environment.

    ERIC Educational Resources Information Center

    Collins, Rosann; Straub, Detmar W.

    1991-01-01

    Outlines the impact of current divisions in the information infrastructure of the information environment on the ability of knowledge workers to integrate information across media and sources of information. Discussion covers information "seams," inaccessible information, information overload/impoverishment, and bottlenecks. The…

  8. Advanced information society (11)

    NASA Astrophysics Data System (ADS)

    Nawa, Kotaro

    Since the late 1980s, the information systems of Japanese corporations have been operated strategically to strengthen their competitive position in markets rather than merely to make corporate management efficient. Information-oriented policy within corporations is therefore making remarkable progress; it expands intelligence activity in the corporation and also extends the market for the information industry. In this environment, the closed corporate system is being transformed into an open one, for which networks and databases are important managerial resources.

  9. Data Rich, Information Poor

    SciTech Connect

    Kaplan, P.G.; Rautman, C.A.

    1998-11-09

    Surviving in a data-rich environment means understanding the difference between data and information. This paper reviews an environmental case study that illustrates that understanding and shows its importance. In this study, a decision problem was stated in terms of an economic objective function. The function contains a term that defines the stochastic relationship between the decision and the information obtained during field characterization for an environmental contaminant. Data is defined as samples drawn, or experimental realizations, of a random function. Information is defined as the quantitative change in the value of the objective function as a result of the sample.
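
    The data/information distinction above can be sketched with a toy decision problem: a noisy field sample is the data, and the information it carries is the expected change in the value of the economic objective function. All costs and test accuracies below are hypothetical, chosen only to make the arithmetic visible:

```python
def best_expected_cost(p, costs):
    """Choose the cheaper action given belief p that the site is contaminated."""
    remediate = costs["remediate"]        # fixed cleanup cost
    do_nothing = p * costs["penalty"]     # expected cost of leaving it alone
    return min(remediate, do_nothing)

def value_of_sample(prior, sensitivity, specificity, costs):
    """Expected drop in the objective function from one imperfect sample.
    The sample outcome is the data; this change in the objective's value
    is the information, in the sense used in the abstract above."""
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    p_neg = 1 - p_pos
    post_pos = sensitivity * prior / p_pos            # Bayes' rule
    post_neg = (1 - sensitivity) * prior / p_neg
    with_sample = (p_pos * best_expected_cost(post_pos, costs)
                   + p_neg * best_expected_cost(post_neg, costs))
    return best_expected_cost(prior, costs) - with_sample

costs = {"remediate": 40.0, "penalty": 100.0}
# ~15.0: the most one should be willing to pay for this sample
print(value_of_sample(0.5, 0.9, 0.9, costs))
```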

  10. ENERGY INFORMATION CLEARINGHOUSE

    SciTech Connect

    Ron Johnson

    2003-10-01

    Alaska has spent billions of dollars on various energy-related activities over the past several decades, with projects ranging from smaller utilities used to produce heat and power in rural Alaska to huge endeavors relating to exported resources. To help provide information for end users, utilities, decision makers, and the general public, the Institute of Northern Engineering at UAF established an Energy Information Clearinghouse accessible through the worldwide web in 2002. This clearinghouse contains information on energy resources, end use technologies, policies, related environmental issues, emerging technologies, efficiency, storage, demand side management, and developments in Alaska.

  11. Visibility Modeling and Forecasting for Abu Dhabi using Time Series Analysis Method

    NASA Astrophysics Data System (ADS)

    Eibedingil, I. G.; Abula, B.; Afshari, A.; Temimi, M.

    2015-12-01

    Land-atmosphere interactions, in their strength, directionality and evolution, are one of the main sources of uncertainty in contemporary climate modeling. A particularly crucial role in sustaining and modulating land-atmosphere interaction is played by aerosols and dust. Aerosols are tiny particles suspended in the air, ranging from a few nanometers to a few hundred micrometers in diameter. Furthermore, the amount of dust and fog in the atmosphere is an important determinant of visibility, another dimension of land-atmosphere interactions. Visibility affects all forms of traffic: aviation, land transport and sailing. Being able to predict changes in visibility in advance enables the relevant authorities to take necessary actions before disaster strikes. Time Series Analysis (TSA) is an emerging technique for modeling and forecasting the behavior of land-atmosphere interactions, including visibility. This research assesses the dynamics and evolution of visibility around Abu Dhabi International Airport (+24.4320 latitude, +54.6510 longitude, 27 m elevation) using mean daily visibility and mean daily wind speed. TSA was first used to model and forecast the visibility, and then a Transfer Function Model was applied, with wind speed as an exogenous variable. Using the Akaike Information Criterion (AIC) and the Mean Absolute Percentage Error (MAPE) as statistical criteria, two forecasting models, a univariate time series model and a transfer function model, were developed to forecast the visibility around Abu Dhabi International Airport for three weeks. The transfer function model improved the MAPE of the forecast significantly.
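
    The two criteria used above, MAPE and a Gaussian-residual AIC, can be written compactly. A minimal numpy sketch with illustrative numbers (not the Abu Dhabi data); the forecast arrays are hypothetical:

    ```python
    import numpy as np

    def mape(actual, forecast):
        # Mean Absolute Percentage Error, in percent
        actual = np.asarray(actual, float)
        forecast = np.asarray(forecast, float)
        return 100.0 * np.mean(np.abs((actual - forecast) / actual))

    def gaussian_aic(residuals, k):
        # AIC for a least-squares fit with k parameters, Gaussian errors assumed
        r = np.asarray(residuals, float)
        n = len(r)
        return n * np.log(np.sum(r ** 2) / n) + 2 * k

    # Illustrative daily-visibility observations and two competing forecasts
    obs   = np.array([8.0, 6.5, 7.2, 9.1, 5.8])
    f_uni = np.array([7.4, 6.9, 7.8, 8.2, 6.6])   # univariate model
    f_tf  = np.array([7.9, 6.4, 7.1, 8.9, 6.0])   # with wind speed as exogenous input
    ```

    Here the transfer-function forecast has the lower MAPE by construction; on real data the two criteria are computed from each model's residuals in the same way.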

  12. Time-dependent oral absorption models

    NASA Technical Reports Server (NTRS)

    Higaki, K.; Yamashita, S.; Amidon, G. L.

    2001-01-01

    The plasma concentration-time profiles following oral administration of drugs are often irregular and cannot be interpreted easily with conventional models based on first- or zero-order absorption kinetics and lag time. Six new models were developed using a time-dependent absorption rate coefficient, ka(t), wherein the time dependency was varied to account for dynamic processes in the gastrointestinal tract, such as changes with time in fluid absorption or secretion, in absorption surface area, and in motility. In the present study, the plasma concentration profiles of propranolol obtained in human subjects following oral dosing were analyzed using the newly derived models based on mass balance and compared with the conventional models. Nonlinear regression analysis indicated that the conventional compartment model including lag time (CLAG model) could not predict the rapid initial increase in plasma concentration after dosing, and the predicted Cmax values were much lower than those observed. In contrast, all models with the time-dependent absorption rate coefficient, ka(t), were superior to the CLAG model in predicting plasma concentration profiles. Based on Akaike's Information Criterion (AIC), the fluid absorption model without lag time (FA model) exhibited the best overall fit to the data. The two-phase model including lag time (TPLAG model) was also found to be a good model, judging from the sum-of-squares values. This model also described the irregular profiles of plasma concentration with time and frequently predicted Cmax values satisfactorily. A comparison of the absorption rate profiles also suggested that the TPLAG model is better at predicting irregular absorption kinetics than the FA model. In conclusion, the incorporation of a time-dependent absorption rate coefficient ka(t) allows the prediction of nonlinear absorption characteristics in a more reliable manner.

  13. Crucial nesting habitat for gunnison sage-grouse: A spatially explicit hierarchical approach

    USGS Publications Warehouse

    Aldridge, C.L.; Saher, D.J.; Childers, T.M.; Stahlnecker, K.E.; Bowen, Z.H.

    2012-01-01

    The Gunnison sage-grouse (Centrocercus minimus) is a species of special concern and is currently considered a candidate species under the Endangered Species Act. Careful management is therefore required to ensure that suitable habitat is maintained, particularly because much of the species' current distribution faces exurban development pressures. We assessed hierarchical nest site selection patterns of Gunnison sage-grouse inhabiting the western portion of the Gunnison Basin, Colorado, USA, at multiple spatial scales, using logistic regression-based resource selection functions. Models were selected using the Akaike Information Criterion corrected for small sample sizes (AICc), and predictive surfaces were generated using model-averaged relative probabilities. Landscape-scale factors that had the most influence on nest site selection included the proportion of sagebrush cover >5%, mean productivity, and the density of 2-wheel-drive roads. The landscape-scale predictive surface captured 97% of known Gunnison sage-grouse nests within the top 5 of 10 prediction bins, implicating 57% of the basin as crucial nesting habitat. Crucial habitat identified by the landscape model was used to define the extent for patch-scale modeling efforts. Patch-scale variables that had the greatest influence on nest site selection were the proportion of big sagebrush cover >10%, distance to residential development, distance to high-volume paved roads, and mean productivity. This model accurately predicted independent nest locations. The unique hierarchical structure of our models more accurately captures the nested nature of habitat selection and allowed for increased discrimination within larger landscapes of suitable habitat. We extrapolated the landscape-scale model to the entire Gunnison Basin because of conservation concerns for this species. We believe this predictive surface is a valuable tool which can be incorporated into land use and conservation planning as well as the assessment of
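
    AICc selection and Akaike-weight model averaging, as used in the record above, can be sketched in a few lines. The RSS values, parameter counts and per-model estimates below are hypothetical, not the sage-grouse data:

    ```python
    import numpy as np

    def aicc(n, rss, k):
        # Gaussian AIC from residual sum of squares, with small-sample correction
        aic = n * np.log(rss / n) + 2 * k
        return aic + 2 * k * (k + 1) / (n - k - 1)

    def akaike_weights(scores):
        d = scores - scores.min()        # delta-AICc relative to the best model
        w = np.exp(-0.5 * d)
        return w / w.sum()               # weights sum to 1

    # Hypothetical candidate models: (residual sum of squares, number of parameters)
    n = 40
    models = [(12.0, 2), (9.5, 3), (9.4, 5)]
    scores = np.array([aicc(n, rss, k) for rss, k in models])
    w = akaike_weights(scores)
    best = int(np.argmin(scores))

    # Model-averaged estimate: weight each model's estimate by its Akaike weight
    estimates = np.array([1.10, 1.25, 1.30])   # hypothetical per-model estimates
    averaged = float(np.sum(w * estimates))
    ```

    Note how the 5-parameter model is penalized despite its marginally lower RSS; the averaged estimate always lies between the per-model extremes.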

  14. A functional biological network centered on XRCC3: a new possible marker of chemoradiotherapy resistance in rectal cancer patients

    PubMed Central

    Agostini, Marco; Zangrando, Andrea; Pastrello, Chiara; D'Angelo, Edoardo; Romano, Gabriele; Giovannoni, Roberto; Giordan, Marco; Maretto, Isacco; Bedin, Chiara; Zanon, Carlo; Digito, Maura; Esposito, Giovanni; Mescoli, Claudia; Lavitrano, Marialuisa; Rizzolio, Flavio; Jurisica, Igor; Giordano, Antonio; Pucciarelli, Salvatore; Nitti, Donato

    2015-01-01

    Preoperative chemoradiotherapy is widely used in patients with locally advanced rectal cancer to improve local control of disease and sphincter preservation and to improve survival. Patients enrolled in the present study underwent preoperative chemoradiotherapy, followed by surgical excision. Response to chemoradiotherapy was evaluated according to Mandard's Tumor Regression Grade (TRG): TRG 3, 4 and 5 were considered partial or no response, while TRG 1 and 2 were considered complete response. Of the pretherapeutic biopsies of 84 locally advanced rectal carcinomas available for analysis, only 42 showed at least 70% cancer cellularity. By determining gene expression profiles, responders and non-responders showed significantly different expression levels for 19 genes (P < 0.001). We fitted a logistic model selected with a stepwise procedure optimizing the Akaike Information Criterion (AIC) and then validated it by means of leave-one-out cross-validation (LOOCV, accuracy = 95%). Four genes were retained in the resulting model: ZNF160, XRCC3, HFM1 and ASXL2. Real-time PCR confirmed that XRCC3 is overexpressed in the responders group, and HFM1 and ASXL2 showed a positive trend. In vitro tests on colon cancer cells resistant or susceptible to chemoradiotherapy finally proved that XRCC3 deregulation is extensively involved in the chemoresistance mechanisms. Protein-protein interaction (PPI) analysis involving the predictive classifier revealed a network of 45 interacting nodes (proteins), with the TRAF6 gene playing a keystone role in the network. The present study confirmed the possibility that gene expression profiling combined with integrative computational biology is useful to predict complete responses to preoperative chemoradiotherapy in patients with advanced rectal cancer. PMID:26023803
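
    A sketch of forward stepwise logistic selection optimizing AIC, as in the record above. The Newton-Raphson fit and the synthetic "expression" matrix are illustrative assumptions, not the study's genes or fitting software:

    ```python
    import numpy as np

    def fit_logistic(X, y, iters=25):
        # Newton-Raphson ML fit; X must include an intercept column
        beta = np.zeros(X.shape[1])
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            W = p * (1 - p)
            H = X.T @ (X * W[:, None]) + 1e-6 * np.eye(X.shape[1])
            beta += np.linalg.solve(H, X.T @ (y - p))
        p = np.clip(1.0 / (1.0 + np.exp(-X @ beta)), 1e-12, 1 - 1e-12)
        ll = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
        return beta, ll

    def forward_stepwise_aic(X, y):
        # Greedily add the feature (column) that lowers AIC the most
        n = len(y)
        intercept = np.ones((n, 1))
        _, ll0 = fit_logistic(intercept, y)
        chosen, current = [], 2 * 1 - 2 * ll0
        improved = True
        while improved:
            improved = False
            best_j, best_aic = None, current
            for j in range(X.shape[1]):
                if j in chosen:
                    continue
                cols = np.hstack([intercept, X[:, chosen + [j]]])
                _, ll = fit_logistic(cols, y)
                aic = 2 * cols.shape[1] - 2 * ll
                if aic < best_aic:
                    best_j, best_aic = j, aic
            if best_j is not None:
                chosen.append(best_j)
                current = best_aic
                improved = True
        return chosen, current

    # Synthetic data: only column 0 is truly predictive of the class label
    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 5))
    y = (X[:, 0] + 0.3 * rng.normal(size=120) > 0).astype(float)
    chosen, aic = forward_stepwise_aic(X, y)
    ```

    The AIC penalty of 2 per parameter stops the greedy search once further features no longer improve the likelihood enough.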

  15. Analysis of the return period and correlation between the reservoir-induced seismic frequency and the water level based on a copula: A case study of the Three Gorges reservoir in China

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofei; Zhang, Qiuwen

    2016-11-01

    Studies have considered the many factors involved in the mechanism of reservoir-induced seismicity. Focusing on the correlation between reservoir-induced seismicity and the water level, this study proposes to use copula theory to build a correlation model, analyze their relationship and perform a risk analysis. The sequence of reservoir-induced seismic events from 2003 to 2011 in the Three Gorges reservoir in China is used as a case study to test this new methodology. We construct four correlation models based on the Gumbel, Clayton, Frank and M-copula functions and employ four goodness-of-fit tests: Q-Q plots, the Kolmogorov-Smirnov (K-S) test, the minimum distance (MD) test and the Akaike Information Criterion (AIC) test. A comparison of the four models shows that the M-copula model fits the sample better than the other three. Based on the M-copula model, we find that for a sudden drawdown of the water level the probability of a decrease in seismic frequency rises markedly, whereas for a sudden rise of the water level the probability of an increase in seismic frequency rises markedly, with the former effect being greater than the latter. The seismic frequency is mainly distributed in the low-frequency region (Y ≤ 20) for the low water level and in the middle-frequency region (20 < Y ≤ 80) for both the medium and high water levels; the high-frequency region (Y > 80) is the least likely. For the conditional return period, the period of high-frequency seismicity is much longer than those of normal- and medium-frequency seismicity, and a high water level shortens the periods.
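
    Comparing copula families by AIC, as in the record above, amounts to maximizing each family's log-likelihood on rank-transformed data and penalizing fitted parameters. A sketch for the Clayton family against the independence copula; the synthetic dependent pair is an illustrative stand-in for the seismicity/water-level series:

    ```python
    import numpy as np

    def clayton_loglik(u, v, theta):
        # Log-likelihood of the Clayton copula density, theta > 0
        s = u ** (-theta) + v ** (-theta) - 1.0
        return np.sum(np.log(1 + theta)
                      - (1 + theta) * (np.log(u) + np.log(v))
                      - (2.0 + 1.0 / theta) * np.log(s))

    def pseudo_obs(x):
        # Rank-transform a sample to (0, 1) pseudo-observations
        n = len(x)
        return (np.argsort(np.argsort(x)) + 1) / (n + 1.0)

    # Synthetic positively dependent pair
    rng = np.random.default_rng(1)
    z = rng.normal(size=500)
    x = z + 0.5 * rng.normal(size=500)
    y = z + 0.5 * rng.normal(size=500)
    u, v = pseudo_obs(x), pseudo_obs(y)

    # Grid-search MLE of theta, then AIC with one fitted parameter
    thetas = np.linspace(0.05, 8.0, 160)
    lls = np.array([clayton_loglik(u, v, th) for th in thetas])
    aic_clayton = 2 * 1 - 2 * lls.max()
    aic_indep = 0.0   # independence copula: density 1, no parameters
    ```

    On genuinely dependent data the Clayton AIC falls well below the independence benchmark; the same comparison across Gumbel, Frank and mixture copulas picks the family with minimum AIC.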

  16. Parsimony and goodness-of-fit in multi-dimensional NMR inversion

    NASA Astrophysics Data System (ADS)

    Babak, Petro; Kryuchkov, Sergey; Kantzas, Apostolos

    2017-01-01

    Multi-dimensional nuclear magnetic resonance (NMR) experiments are often used to study the molecular structure and dynamics of matter in core analysis and reservoir evaluation. Industrial applications of multi-dimensional NMR involve high-dimensional measurement datasets with complicated correlation structure and require rapid and stable inversion algorithms from the time domain to the relaxation rate and/or diffusion domains. In practice, applying existing inversion algorithms with a large number of parameter values leads to an infinite number of solutions with a reasonable fit to the NMR data. Interpreting such variability among multiple solutions and selecting the most appropriate one can be a very complex problem. In most cases the characteristics of materials have sparse signatures, and investigators would like to distinguish the most significant relaxation and diffusion values of the materials. To produce an easy-to-interpret and unique NMR distribution with a finite number of principal parameter values, we introduce a new method for NMR inversion. The method is constructed as a trade-off between the conventional goodness-of-fit approach to multivariate data and the principle of parsimony, guaranteeing inversion with the least number of parameter values. We suggest performing the inversion of NMR data using a forward stepwise regression selection algorithm. To account for the trade-off between goodness-of-fit and parsimony, the objective function is based on the Akaike Information Criterion (AIC). The performance of the developed multi-dimensional NMR inversion method, and its comparison with conventional methods, is illustrated using real data for samples containing bitumen, water and clay.
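
    A toy one-dimensional version of this parsimony/goodness-of-fit trade-off: forward stepwise selection of exponential relaxation components with an AIC objective on a synthetic two-component decay. The T2 grid, noise level and stopping rule are assumptions, not the paper's algorithm:

    ```python
    import numpy as np

    def gaussian_aic(rss, n, k):
        return n * np.log(rss / n) + 2 * k

    def rss_for(t, y, T2s):
        # Least-squares amplitude fit for a set of candidate relaxation times
        A = np.exp(-np.outer(t, 1.0 / np.asarray(T2s, float)))
        amps, *_ = np.linalg.lstsq(A, y, rcond=None)
        return np.sum((y - A @ amps) ** 2)

    # Synthetic two-component decay (stand-in for an NMR relaxation measurement)
    rng = np.random.default_rng(2)
    t = np.linspace(0.005, 2.0, 150)
    y = 0.7 * np.exp(-t / 0.05) + 0.3 * np.exp(-t / 0.5) + rng.normal(0, 0.01, t.size)

    grid = np.logspace(-2, 0.7, 15)      # candidate T2 values
    chosen = []
    best_aic = gaussian_aic(np.sum(y ** 2), t.size, 0)
    while True:
        trials = [(gaussian_aic(rss_for(t, y, chosen + [T2]), t.size, len(chosen) + 1), T2)
                  for T2 in grid if T2 not in chosen]
        if not trials:
            break
        aic, T2 = min(trials)
        if aic >= best_aic:
            break                         # parsimony: stop when AIC no longer improves
        chosen.append(T2)
        best_aic = aic
    ```

    The AIC penalty keeps the selected distribution sparse, i.e. a few principal T2 values rather than a dense, non-unique spectrum.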

  17. Modelling the association of dengue fever cases with temperature and relative humidity in Jeddah, Saudi Arabia-A generalised linear model with break-point analysis.

    PubMed

    Alkhaldy, Ibrahim

    2017-04-01

    The aim of this study was to examine the role of environmental factors in the temporal distribution of dengue fever in Jeddah, Saudi Arabia. The relationship between dengue fever cases and climatic factors such as relative humidity and temperature was investigated during 2006-2009 to determine whether there is any relationship between dengue fever cases and climatic parameters in Jeddah City, Saudi Arabia. A generalised linear model (GLM) with a break-point was used to determine how different levels of temperature and relative humidity affected the distribution of the number of dengue fever cases. Break-point analysis was performed to model the effect before and after a break-point (change point) in the explanatory parameters under various scenarios. The Akaike information criterion (AIC) and cross-validation (CV) were used to assess the performance of the models. The results showed that maximum temperature and mean relative humidity are probably the best predictors of the number of dengue fever cases in Jeddah. In this study three scenarios were modelled: no time lag, a 1-week lag and a 2-week lag. Among these, the 1-week-lag model using mean relative humidity as an explanatory variable showed the best performance. This study showed a clear relationship between the meteorological variables and the number of dengue fever cases in Jeddah. The results also demonstrated that meteorological variables can be successfully used to estimate the number of dengue fever cases for a given period of time. Break-point analysis provides further insight into the association between meteorological parameters and dengue fever cases by dividing the meteorological parameters at certain break-points.
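
    Break-point selection by AIC can be sketched with a hinge (piecewise-linear) regression: grid-search the threshold, count it as a fitted parameter, and compare against a straight line. The synthetic data and Gaussian errors are assumptions; the study's actual GLM may use a different link:

    ```python
    import numpy as np

    def fit_rss(X, y):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return np.sum((y - X @ beta) ** 2)

    def aic(rss, n, k):
        return n * np.log(rss / n) + 2 * k

    # Illustrative series: response rises faster past an (unknown) threshold
    rng = np.random.default_rng(3)
    x = np.linspace(20, 40, 80)                    # e.g. weekly max temperature
    y = 5 + 0.2 * x + 1.5 * np.maximum(0, x - 32) + rng.normal(0, 0.8, x.size)

    n = x.size
    ones = np.ones(n)
    aic_line = aic(fit_rss(np.column_stack([ones, x]), y), n, 2)

    # Grid search over candidate break-points; k = 4 counts the break-point itself
    cands = np.linspace(24, 38, 57)
    aic_break, c_hat = min(
        (aic(fit_rss(np.column_stack([ones, x, np.maximum(0, x - c)]), y), n, 4), c)
        for c in cands
    )
    ```

    The break-point model wins on AIC only when the slope change is large enough to justify the two extra parameters, which is exactly the trade-off the study exploits.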

  18. Nonstationarity in the occurrence rate of floods in the Tarim River basin, China, and related impacts of climate indices

    NASA Astrophysics Data System (ADS)

    Gu, Xihui; Zhang, Qiang; Singh, Vijay P.; Chen, Xi; Liu, Lin

    2016-07-01

    Amplification of floods in Xinjiang, China, has been observed, but reports on their changing properties and underlying mechanisms are not available. In this study, occurrence rates of floods in the Tarim River basin, the largest inland arid river basin in China, were analyzed using the kernel density estimation technique and the bootstrap resampling method. Also analyzed were the occurrence rates of precipitation extremes, using the POT (Peak Over Threshold) sampling method. Both stationary and non-stationary models were developed using GAMLSS (Generalized Additive Models for Location, Scale and Shape) to model flood frequency with time, climate indices, precipitation and temperature as major predictors. Results indicated: (1) two periods with increasing occurrence of floods, the late 1960s and the late 1990s, with considerable fluctuations around 2-3 flood events in the interval between them; (2) changes in the occurrence rates of floods were subject to nonstationarity, and a persistent increase in flood frequency and magnitude was observed during the 1990s, reaching a peak value; (3) AMO (Atlantic Multidecadal Oscillation) and AO (Arctic Oscillation) in winter were the key climate indices impacting the occurrence rates of floods, while NAO (North Atlantic Oscillation) and SOI (Southern Oscillation Index) were the two principal factors influencing the occurrence rates of regional floods. The AIC (Akaike Information Criterion) values indicated that, compared with the influence of climate indices, occurrence rates of floods seemed to be more sensitive to temperature and precipitation changes. The results of this study are important for flood management and the development of mitigation measures.
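
    The stationary-versus-nonstationary comparison can be illustrated in miniature with Poisson flood counts: a constant rate against a time-varying rate, compared by AIC. The Newton fit and the simulated counts are assumptions (GAMLSS models are far more general); the -log(y!) constant is dropped since it cancels between models:

    ```python
    import numpy as np

    def fit_poisson(X, y, iters=40):
        # Newton-Raphson ML fit of a log-linear Poisson model
        beta = np.zeros(X.shape[1])
        for _ in range(iters):
            mu = np.exp(X @ beta)
            H = X.T @ (X * mu[:, None]) + 1e-9 * np.eye(X.shape[1])
            beta += np.linalg.solve(H, X.T @ (y - mu))
        mu = np.exp(X @ beta)
        # log-likelihood up to the model-independent -log(y!) constant
        return beta, np.sum(y * np.log(mu) - mu)

    # Illustrative annual flood counts with an upward trend
    rng = np.random.default_rng(4)
    t = np.linspace(-1, 1, 50)
    y = rng.poisson(np.exp(1.0 + 0.8 * t)).astype(float)

    ones = np.ones_like(t)
    _, ll0 = fit_poisson(ones[:, None], y)                 # stationary rate
    _, ll1 = fit_poisson(np.column_stack([ones, t]), y)    # time-varying rate
    aic0, aic1 = 2 * 1 - 2 * ll0, 2 * 2 - 2 * ll1
    ```

    When the trend is real, the nonstationary model's extra parameter is repaid by the likelihood gain and its AIC is lower, mirroring the study's comparison of predictors.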

  19. Blood lead concentrations in free-ranging Nile crocodiles (Crocodylus niloticus) from South Africa.

    PubMed

    Warner, Jonathan K; Combrink, Xander; Myburgh, Jan G; Downs, Colleen T

    2016-07-01

    Generally, crocodilians have received little attention with regard to the effects of lead toxicity, despite their trophic status as apex, generalist predators that use both aquatic and terrestrial habitats, exposing them to a potentially wide range of environmental contaminants. During July-October 2010 we collected whole blood from 34 sub-adult and adult free-ranging Nile crocodiles (Crocodylus niloticus) from three separate populations in northeastern South Africa in order to analyze their blood lead concentrations (BPb). Concentrations ranged from below detectability (<3 μg/dL, n = 8) to 960 μg/dL for an adult male at the Lake St Lucia Estuary. Blood lead concentrations averaged 8.15 μg/dL (SD = 7.47) for females and 98.10 μg/dL (SD = 217.42) for males. Eighteen individuals (53%) had elevated BPb (≥10 μg/dL). We assessed 12 general linear models using Akaike's Information Criterion (AIC) and found no significant statistical effects among the parameters of sex, crocodile size and population sampled. On average, crocodiles had higher BPb at Lake St Lucia than at Ndumo Game Reserve or Kosi Bay, which we attribute to lead sinker ingestion during normal gastrolith acquisition. No clinical effects of lead toxicosis were observed in these crocodiles, even though the highest concentration (960 μg/dL) we report represents the most elevated BPb recorded to date for a free-ranging vertebrate. Although we suggest adult Nile crocodiles are likely tolerant of elevated Pb body burdens, experimental studies on other crocodilian species suggest the BPb levels reported here may have harmful or fatal effects on egg development and hatchling health. In light of recent Nile crocodile nesting declines in South Africa, we urge further BPb monitoring and ecotoxicology research on reproductive females and embryos.

  20. Artificial Neural Networks for the Diagnosis of Aggressive Periodontitis Trained by Immunologic Parameters

    PubMed Central

    Papantonopoulos, Georgios; Takahashi, Keiso; Bountis, Tasos; Loos, Bruno G.

    2014-01-01

    There is neither a single clinical, microbiological, histopathological or genetic test, nor any combination of them, to discriminate aggressive periodontitis (AgP) from chronic periodontitis (CP) patients. We aimed to estimate probability density functions of clinical and immunologic datasets derived from periodontitis patients and to construct artificial neural networks (ANNs) to correctly classify patients into the AgP or CP class. The fit of probability distributions to the datasets was tested by the Akaike information criterion (AIC). ANNs were trained on cross-entropy (CE) values estimated between the probabilities of showing certain levels of immunologic parameters and a reference mode probability proposed by kernel density estimation (KDE). The weight decay regularization parameter of the ANNs was determined by 10-fold cross-validation. Possible evidence for 2 clusters of patients on cross-sectional and longitudinal bone loss measurements was revealed by KDE. Two to 7 clusters were shown in datasets of the CD4/CD8 ratio; CD3, monocyte, eosinophil, neutrophil and lymphocyte counts; IL-1, IL-2, IL-4, IFN-γ and TNF-α levels from monocytes; and antibody levels against A. actinomycetemcomitans (A.a.) and P. gingivalis (P.g.). ANNs gave 90%–98% accuracy in classifying patients into either AgP or CP. The best overall prediction was given by an ANN with the CE of monocyte, eosinophil and neutrophil counts and the CD4/CD8 ratio as inputs. ANNs can be powerful in classifying periodontitis patients into AgP or CP when fed CE values based on KDE. Therefore ANNs can be employed for accurate diagnosis of AgP or CP using relatively simple and conveniently obtained parameters, like leukocyte counts in peripheral blood. This will allow clinicians to better adapt specific treatment protocols for their AgP and CP patients. PMID:24603408
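
    Testing the fit of candidate probability distributions by AIC, as mentioned above, can be sketched with closed-form maximum-likelihood fits. The right-skewed synthetic sample below is an illustrative stand-in for an immunologic parameter:

    ```python
    import numpy as np

    def aic_normal(x):
        # ML Gaussian fit: k = 2 parameters (mean, variance)
        n = x.size
        var = np.var(x)                   # MLE variance (divides by n)
        ll = -0.5 * n * (np.log(2 * np.pi * var) + 1)
        return 2 * 2 - 2 * ll

    def aic_lognormal(x):
        # Lognormal ll = Gaussian ll of log(x) minus the Jacobian term sum(log x),
        # so its AIC is the Gaussian AIC on log(x) plus 2 * sum(log x)
        return aic_normal(np.log(x)) + 2 * np.sum(np.log(x))

    # Right-skewed synthetic sample (e.g. a count-like immunologic marker)
    rng = np.random.default_rng(5)
    x = rng.lognormal(mean=1.0, sigma=0.8, size=300)
    ```

    Because both families have two parameters, the comparison here reduces to the log-likelihoods; on skewed data the lognormal fit wins on AIC.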

  1. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    NASA Astrophysics Data System (ADS)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded over a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach lies in expressing the data as curves. In the present work, the focus is on daily forecasting, and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting, with the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and on Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied, and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and checked on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed

  2. The H II galaxy Hubble diagram strongly favours Rh = ct over ΛCDM

    NASA Astrophysics Data System (ADS)

    Wei, Jun-Jie; Wu, Xue-Feng; Melia, Fulvio

    2016-12-01

    We continue to build support for the proposal to use H II galaxies (HIIGx) and giant extragalactic H II regions (GEHR) as standard candles to construct the Hubble diagram at redshifts beyond the current reach of Type Ia supernovae. Using a sample of 25 high-redshift HIIGx, 107 local HIIGx, and 24 GEHR, we confirm that the correlation between the emission-line luminosity and ionized-gas velocity dispersion is a viable luminosity indicator, and use it to test and compare the standard model ΛCDM and the Rh = ct universe by optimizing the parameters in each cosmology using a maximization of the likelihood function. For the flat ΛCDM model, the best fit is obtained with Ω _m= 0.40_{-0.09}^{+0.09}. However, statistical tools, such as the Akaike (AIC), Kullback (KIC) and Bayes (BIC) Information Criteria favour Rh = ct over the standard model with a likelihood of ≈94.8-98.8 per cent versus only ≈1.2-5.2 per cent. For wCDM (the version of ΛCDM with a dark-energy equation of state wde ≡ pde/ρde rather than wde = wΛ = -1), a statistically acceptable fit is realized with Ω _m=0.22_{-0.14}^{+0.16} and w_de= -0.51_{-0.25}^{+0.15} which, however, are not fully consistent with their concordance values. In this case, wCDM has two more free parameters than Rh = ct, and is penalized more heavily by these criteria. We find that Rh = ct is strongly favoured over wCDM with a likelihood of ≈92.9-99.6 per cent versus only 0.4-7.1 per cent. The current HIIGx sample is already large enough for the BIC to rule out ΛCDM/wCDM in favour of Rh = ct at a confidence level approaching 3σ.

  3. Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.

    PubMed

    Durdu, Omer Faruk

    2010-10-01

    In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict boron content in the Büyük Menderes catchment. Initially, Box-Whisker plots and Kendall's tau test are used to identify trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves three steps: model identification, parameter estimation and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) of the boron data series, different ARIMA models are identified. The model giving the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models, and the results indicate that the residuals are independent, normally distributed and homoscedastic. For model validation purposes, the predicted results using the best ARIMA models are compared to the observed data. The predicted data show reasonably good agreement with the actual data. The comparison of the mean and variance of the 3-year (2002-2004) observed data versus the predicted data from the selected best models shows that the boron model from the ARIMA modeling approaches could be used in a safe manner, since the predicted values from these models preserve the basic
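
    The identification step, choosing the order that minimizes AIC, can be illustrated with a pure-AR toy version fit by conditional least squares (the simulated series is not the boron data, and full ARIMA fitting also estimates MA and differencing terms):

    ```python
    import numpy as np

    def ar_aic(x, p):
        # Conditional least-squares AR(p) fit; each fit conditions on its first p values
        y = x[p:]
        X = np.column_stack([x[p - 1 - i: len(x) - 1 - i] for i in range(p)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ coef) ** 2)
        n = len(y)
        return n * np.log(rss / n) + 2 * (p + 1)   # +1 for the noise variance

    # Simulate a stationary AR(2) process and let minimum AIC recover the order
    rng = np.random.default_rng(6)
    n = 500
    x = np.zeros(n)
    for i in range(2, n):
        x[i] = 1.2 * x[i - 1] - 0.5 * x[i - 2] + rng.normal()

    orders = range(1, 8)
    best_p = min(orders, key=lambda p: ar_aic(x, p))
    ```

    Underfitting (p = 1) inflates the residual variance sharply, so minimum AIC reliably lands at or slightly above the true order; in practice the candidate set comes from the ACF/PACF inspection described above.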

  4. Development of an Adaptive Multi-Method Algorithm for Automatic Picking of First Arrival Times: Application to Near Surface Seismic Data

    NASA Astrophysics Data System (ADS)

    Khalaf, A.; Camerlynck, C. M.; Schneider, A. C.; Florsch, N.

    2015-12-01

    Accurate picking of first arrival times plays an important role in many seismic studies, particularly in seismic tomography and in the monitoring of reservoirs or aquifers. Many techniques have been developed for picking first arrivals automatically or semi-automatically, but most of them were developed for seismological purposes and do not attain the required accuracy, owing to the complexity of near-surface structures and the usually low signal-to-noise ratio. We propose a new adaptive algorithm for near-surface data based on three picking methods, combining multi-nested windows (MNW), Higher Order Statistics (HOS) and the Akaike Information Criterion (AIC). They exploit the benefits of integrating several properties that reveal the presence of first arrivals, providing efficient and robust first-arrival picking. This strategy mimics human first-break picking: the global trend is defined first, and the exact first breaks are then sought in the vicinity of that trend. In a multistage algorithm, three successive phases are launched, each characterizing a specific signal property. Within each phase, the potential picks and their error ranges are automatically estimated and then used sequentially as a guide in the following picking phase. The accuracy and robustness of the implemented algorithm are successfully validated on synthetic and real data that pose special challenges for automatic pickers. A comparison of the resulting P-wave arrival times with those picked manually, and with other automatic picking algorithms, demonstrated the reliable performance of the new scheme under different noise conditions. All parameters of our multi-method algorithm are auto-adaptive thanks to the serial integration of each sub-algorithm's results in the flow. Hence, it is nearly a parameter-free algorithm that is straightforward to implement and demands low computational resources.
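
    The AIC ingredient of such pickers is commonly implemented as a variance-partition statistic: the trace is split at every candidate sample, and the split minimizing the summed log-variances of the two segments marks the onset. A sketch on a synthetic trace (the margin and noise levels are assumptions):

    ```python
    import numpy as np

    def aic_pick(x, margin=10):
        # AIC picker: choose the split point k that minimizes
        # k*log(var(x[:k])) + (N-k-1)*log(var(x[k:]))
        x = np.asarray(x, float)
        N = len(x)
        ks = np.arange(margin, N - margin)
        aic = np.array([k * np.log(np.var(x[:k]) + 1e-12)
                        + (N - k - 1) * np.log(np.var(x[k:]) + 1e-12) for k in ks])
        return int(ks[np.argmin(aic)])

    # Synthetic trace: low-amplitude noise, then a higher-amplitude arrival at sample 300
    rng = np.random.default_rng(7)
    trace = np.concatenate([rng.normal(0, 0.1, 300), rng.normal(0, 1.0, 200)])
    onset = aic_pick(trace)
    ```

    In a multistage scheme like the one above, this picker would be applied only within the window proposed by the coarser trend-finding phases.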

  5. Modeling a habitat suitability index for the eastern fall cohort of Ommastrephes bartramii in the central North Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Chen, Xinjun; Tian, Siquan; Liu, Bilin; Chen, Yong

    2011-05-01

    The eastern fall cohort of the neon flying squid, Ommastrephes bartramii, has been commercially exploited by the Chinese squid jigging fleet in the central North Pacific Ocean since the late 1990s. To understand and identify their optimal habitat, we have developed a habitat suitability index (HSI) model using two potentially important environmental variables — sea surface temperature (SST) and sea surface height anomaly (SSHA) — and fishery data from the main fishing ground (165°-180°E) during June and July of 1999-2003. A geometric mean model (GMM), minimum model (MM) and arithmetic weighted model (AWM) with different weights were compared and the best HSI model was selected using Akaike's information criterion (AIC). The performance of the developed HSI model was evaluated using fishery data for 2004. This study suggests that the highest catch per unit effort (CPUE) and fishing effort are closely related to SST and SSHA. The best SST- and SSHA-based suitability index (SI) regression models were SISST-based = 0.7SIeffort-SST + 0.3SICPUE-SST and SISSHA-based = 0.5SIeffort-SSHA + 0.5SICPUE-SSHA, respectively, showing that fishing effort is more important than CPUE in the estimation of SI. The best HSI model was the AWM, defined as HSI = 0.3SISST-based + 0.7SISSHA-based, indicating that SSHA is more important than SST in estimating the HSI of squid. In 2004, monthly HSI values greater than 0.6 coincided with the distribution of productive fishing ground and high CPUE in June and July, suggesting that the models perform well. The proposed model provides an important tool in our efforts to develop forecasting capacity of squid spatial dynamics.

  6. Effects of reproductive condition, roost microclimate, and weather patterns on summer torpor use by a vespertilionid bat

    PubMed Central

    Johnson, Joseph S; Lacki, Michael J

    2014-01-01

    A growing number of mammal species are recognized as heterothermic, capable of maintaining a high-core body temperature or entering a state of metabolic suppression known as torpor. Small mammals can achieve large energetic savings when torpid, but they are also subject to ecological costs. Studying torpor use in an ecological and physiological context can help elucidate relative costs and benefits of torpor to different groups within a population. We measured skin temperatures of 46 adult Rafinesque's big-eared bats (Corynorhinus rafinesquii) to evaluate thermoregulatory strategies of a heterothermic small mammal during the reproductive season. We compared daily average and minimum skin temperatures as well as the frequency, duration, and depth of torpor bouts of sex and reproductive classes of bats inhabiting day-roosts with different thermal characteristics. We evaluated roosts with microclimates colder (caves) and warmer (buildings) than ambient air temperatures, as well as roosts with intermediate conditions (trees and rock crevices). Using Akaike's information criterion (AIC), we found that different statistical models best predicted various characteristics of torpor bouts. While the type of day-roost best predicted the average number of torpor bouts that bats used each day, current weather variables best predicted daily average and minimum skin temperatures of bats, and reproductive condition best predicted average torpor bout depth and the average amount of time spent torpid each day by bats. Finding that different models best explain varying aspects of heterothermy illustrates the importance of torpor to both reproductive and nonreproductive small mammals and emphasizes the multifaceted nature of heterothermy and the need to collect data on numerous heterothermic response variables within an ecophysiological context. PMID:24558571

  7. Fine-Scale Mapping by Spatial Risk Distribution Modeling for Regional Malaria Endemicity and Its Implications under the Low-to-Moderate Transmission Setting in Western Cambodia

    PubMed Central

    Okami, Suguru; Kohtake, Naohiko

    2016-01-01

The disease burden of malaria has decreased as malaria elimination efforts progress. The mapping approach that uses spatial risk distribution modeling needs some adjustment and reinvestigation in accordance with situational changes. Here we applied a mathematical modeling approach for standardized morbidity ratio (SMR) calculated by annual parasite incidence using routinely aggregated surveillance reports, environmental data such as remote sensing data, and non-environmental anthropogenic data to create fine-scale spatial risk distribution maps of western Cambodia. Furthermore, we incorporated a combination of containment status indicators into the model to demonstrate spatial heterogeneities of the relationship between containment status and risks. The explanatory model was fitted to estimate the SMR of each area (adjusted Pearson correlation coefficient R2 = 0.774; Akaike information criterion AIC = 149.423). A Bayesian modeling framework was applied to estimate the uncertainty of the model and cross-scale predictions. Fine-scale maps were created by the spatial interpolation of estimated SMRs at each village. Compared with geocoded case data, corresponding predicted values showed conformity [Spearman’s rank correlation r = 0.662 in the inverse distance weighted interpolation and 0.645 in ordinary kriging (95% confidence intervals of 0.414–0.827 and 0.368–0.813, respectively), Welch’s t-test, not significant]. The proposed approach successfully explained regional malaria risks and fine-scale risk maps were created under low-to-moderate malaria transmission settings where reinvestigations of existing risk modeling approaches were needed. Moreover, different representations of simulated outcomes of containment status indicators for respective areas provided useful insights for tailored interventional planning, considering regional malaria endemicity. PMID:27415623
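The inverse distance weighted interpolation used for the fine-scale maps has a very small core: each unsampled location receives a weighted average of nearby sample values, with weights decaying as a power of distance. A toy sketch with hypothetical village coordinates and SMR values (not the study's data):

```python
def idw(points, target, power=2):
    """Inverse-distance-weighted estimate at `target` from (x, y, value) samples."""
    num = den = 0.0
    for x, y, v in points:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return v  # target coincides with a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Hypothetical village SMR estimates as (x, y, SMR) tuples
villages = [(0, 0, 1.2), (1, 0, 0.8), (0, 1, 1.0), (1, 1, 0.6)]
est = idw(villages, (0.5, 0.5))  # equidistant from all four samples
```

At a point equidistant from all samples the estimate reduces to a simple average; closer to any one village, that village's SMR dominates.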

  8. Assessment and Selection of Competing Models for Zero-Inflated Microbiome Data

    PubMed Central

    Xu, Lizhen; Paterson, Andrew D.; Turpin, Williams; Xu, Wei

    2015-01-01

Typical data in a microbiome study consist of the operational taxonomic unit (OTU) counts that have the characteristic of excess zeros, which are often ignored by investigators. In this paper, we compare the performance of different competing methods to model data with zero inflated features through extensive simulations and application to a microbiome study. These methods include standard parametric and non-parametric models, hurdle models, and zero inflated models. We examine varying degrees of zero inflation, with or without dispersion in the count component, as well as different magnitude and direction of the covariate effect on structural zeros and the count components. We focus on the assessment of type I error, power to detect the overall covariate effect, measures of model fit, and bias and effectiveness of parameter estimations. We also evaluate the abilities of model selection strategies using Akaike information criterion (AIC) or Vuong test to identify the correct model. The simulation studies show that hurdle and zero inflated models have well controlled type I errors, higher power, better goodness of fit measures, and are more accurate and efficient in the parameter estimation. In addition, hurdle models have similar goodness of fit and parameter estimation for the count component as their corresponding zero inflated models. However, the estimation and interpretation of the parameters for the zero components differ, and hurdle models are more stable when structural zeros are absent. We then discuss the model selection strategy for zero inflated data and implement it in a gut microbiome study of > 400 independent subjects. PMID:26148172
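Comparing a standard count model against a zero-inflated alternative by AIC can be sketched with a plain Poisson versus a zero-inflated Poisson (ZIP) likelihood. The toy counts and the crude grid-search fitter below are assumptions for illustration, not the paper's simulation design:

```python
import math

def poisson_loglik(counts, lam):
    """Log-likelihood of counts under Poisson(lam)."""
    return sum(c * math.log(lam) - lam - math.lgamma(c + 1) for c in counts)

def zip_loglik(counts, pi, lam):
    """Log-likelihood under a zero-inflated Poisson: structural zeros with probability pi."""
    ll = 0.0
    for c in counts:
        if c == 0:
            ll += math.log(pi + (1 - pi) * math.exp(-lam))
        else:
            ll += math.log(1 - pi) + c * math.log(lam) - lam - math.lgamma(c + 1)
    return ll

# Toy OTU-like counts with excess zeros (illustrative only)
counts = [0] * 60 + [1] * 10 + [2] * 15 + [3] * 10 + [4] * 5

lam_hat = sum(counts) / len(counts)            # Poisson MLE is the sample mean
aic_pois = 2 * 1 - 2 * poisson_loglik(counts, lam_hat)

# Crude grid search over (pi, lam) for the ZIP fit (a sketch, not a production fitter)
best_ll = max(zip_loglik(counts, p / 100, l / 10)
              for p in range(1, 99, 2) for l in range(2, 80, 2))
aic_zip = 2 * 2 - 2 * best_ll
```

On zero-heavy counts like these, the ZIP likelihood gains far more than the one extra parameter costs, so its AIC comes out lower, mirroring the paper's finding that zero inflated and hurdle models fit such data better.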

  9. Alien plant invasion in mixed-grass prairie: Effects of vegetation type and anthropogenic disturbance

    USGS Publications Warehouse

    Larson, D.L.; Anderson, P.J.; Newton, W.

    2001-01-01

    The ability of alien plant species to invade a region depends not only on attributes of the plant, but on characteristics of the habitat being invaded. Here, we examine characteristics that may influence the success of alien plant invasion in mixed-grass prairie at Theodore Roosevelt National Park, in western North Dakota, USA. The park consists of two geographically separate units with similar vegetation types and management history, which allowed us to examine the effects of native vegetation type, anthropogenic disturbance, and the separate park units on the invasion of native plant communities by alien plant species common to counties surrounding both park units. If matters of chance related to availability of propagules and transient establishment opportunities determine the success of invasion, park unit and anthropogenic disturbance should better explain the variation in alien plant frequency. If invasibility is more strongly related to biotic or physical characteristics of the native plant communities, models of alien plant occurrence should include vegetation type as an explanatory variable. We examined >1300 transects across all vegetation types in both units of the park. Akaike's Information Criterion (AIC) indicated that the fully parameterized model, including the interaction among vegetation type, disturbance, and park unit, best described the distribution of both total number of alien plants per transect and frequency of alien plants on transects where they occurred. Although all vegetation types were invaded by alien plants, mesic communities had both greater numbers and higher frequencies of alien plants than did drier communities. A strong element of stochasticity, reflected in differences in frequencies of individual species between the two park units, suggests that prediction of risk of invasion will always involve uncertainty. 
In addition, despite well-documented associations between anthropogenic disturbance and alien plant invasion, five of

  10. Alien plant invasion in mixed-grass prairie: effects of vegetation type, stochasticity, and anthropogenic disturbance in two park units

    USGS Publications Warehouse

    Larson, Diane L.; Anderson, Patrick J.; Newton, Wesley E.

    2001-01-01

    The ability of alien plant species to invade a region depends not only on attributes of the plant, but on characteristics of the habitat being invaded. Here, we examine characteristics that may influence the success of alien plant invasion in mixed-grass prairie at Theodore Roosevelt National Park, in western North Dakota, USA. The park consists of two geographically separate units with similar vegetation types and management history, which allowed us to examine the effects of native vegetation type, anthropogenic disturbance, and the separate park units on the invasion of native plant communities by alien plant species common to counties surrounding both park units. If matters of chance related to availability of propagules and transient establishment opportunities determine the success of invasion, park unit and anthropogenic disturbance should better explain the variation in alien plant frequency. If invasibility is more strongly related to biotic or physical characteristics of the native plant communities, models of alien plant occurrence should include vegetation type as an explanatory variable. We examined >1300 transects across all vegetation types in both units of the park. Akaike's Information Criterion (AIC) indicated that the fully parameterized model, including the interaction among vegetation type, disturbance, and park unit, best described the distribution of both total number of alien plants per transect and frequency of alien plants on transects where they occurred. Although all vegetation types were invaded by alien plants, mesic communities had both greater numbers and higher frequencies of alien plants than did drier communities. A strong element of stochasticity, reflected in differences in frequencies of individual species between the two park units, suggests that prediction of risk of invasion will always involve uncertainty. 
In addition, despite well-documented associations between anthropogenic disturbance and alien plant invasion, five of

  11. Time Series Analysis of Onchocerciasis Data from Mexico: A Trend towards Elimination

    PubMed Central

    Pérez-Rodríguez, Miguel A.; Adeleke, Monsuru A.; Orozco-Algarra, María E.; Arrendondo-Jiménez, Juan I.; Guo, Xianwu

    2013-01-01

    Background In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011 using the software R. Results A total of 15,584 cases were reported in Mexico from 1988 to 2011. The data of onchocerciasis cases are mainly from the main endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the tendency of onchocerciasis cases for two years ahead. According to the ARIMA model predictions, very low case numbers (below 1) are expected between 2012 and 2013 in Chiapas, the last endemic region in Mexico. Conclusion The endemic regions of Mexico evolved from high onchocerciasis-endemic states to the interruption of transmission due to the strategies followed by the MSH, based on treatment with ivermectin. The extremely low level of expected cases as predicted by ARIMA models for the next two years suggests that onchocerciasis is being eliminated in Mexico. To our knowledge, it is the first study utilizing time series for predicting case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance. PMID:23459370
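Selecting a time-series model by minimum AIC, as done here for ARIMA, can be illustrated with plain autoregressive models fit by conditional least squares. This is a simplification of full ARIMA selection, and the simulated series below is an assumption standing in for the surveillance data:

```python
import numpy as np

def ar_aic(series, p):
    """Gaussian AIC of an AR(p) model fit by conditional least squares."""
    y = series[p:]
    lags = [series[p - i - 1 : len(series) - i - 1] for i in range(p)]
    X = np.column_stack(lags + [np.ones(len(y))])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = (resid ** 2).mean()
    n = len(y)
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = p + 2  # AR coefficients, intercept, and noise variance
    return 2 * k - 2 * log_lik

# Simulated AR(1) series standing in for a monthly case series
rng = np.random.default_rng(0)
x = np.empty(200)
x[0] = 0.0
for t in range(1, 200):
    x[t] = 0.7 * x[t - 1] + rng.normal()

best_p = min(range(1, 5), key=lambda p: ar_aic(x, p))
```

The same "fit each candidate order, keep the smallest AIC" loop is what an automated ARIMA search performs over combinations of AR, differencing, and MA orders.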

  12. The importance of retaining a phylogenetic perspective in traits-based community analyses

    DOE PAGES

    Poteat, Monica D.; Buchwalter, David B.; Jacobus, Luke M.

    2015-04-08

    1) Many environmental stressors manifest their effects via physiological processes (traits) that can differ significantly among species and species groups. We compiled available data for three traits related to the bioconcentration of the toxic metal cadmium (Cd) from 42 aquatic insect species representing orders Ephemeroptera (mayfly), Plecoptera (stonefly), and Trichoptera (caddisfly). These traits included the propensity to take up Cd from water (uptake rate constant, ku), the ability to excrete Cd (efflux rate constant, ke), and the net result of these two processes (bioconcentration factor, BCF). 2) Ranges in these Cd bioaccumulation traits varied in magnitude across lineages (some lineages had a greater tendency to bioaccumulate Cd than others). Overlap in the ranges of trait values among different lineages was common and highlights situations where species from different lineages can share a similar trait state, but represent the high end of possible physiological values for one lineage and the low end for another. 3) Variance around the mean trait state differed widely across clades, suggesting that some groups (e.g., Ephemerellidae) are inherently more variable than others (e.g., Perlidae). Thus, trait variability/lability is at least partially a function of lineage. 4) Akaike information criterion (AIC) comparisons of statistical models were more often driven by clade than by other potential biological or ecological explanation tested. Clade-driven models generally improved with increasing taxonomic resolution. 5) Altogether, these findings suggest that lineage provides context for the analysis of species traits, and that failure to consider lineage in community-based analysis of traits may obscure important patterns of species responses to environmental change.

  13. The importance of retaining a phylogenetic perspective in traits-based community analyses

    SciTech Connect

    Poteat, Monica D.; Buchwalter, David B.; Jacobus, Luke M.

    2015-04-08

    1) Many environmental stressors manifest their effects via physiological processes (traits) that can differ significantly among species and species groups. We compiled available data for three traits related to the bioconcentration of the toxic metal cadmium (Cd) from 42 aquatic insect species representing orders Ephemeroptera (mayfly), Plecoptera (stonefly), and Trichoptera (caddisfly). These traits included the propensity to take up Cd from water (uptake rate constant, ku), the ability to excrete Cd (efflux rate constant, ke), and the net result of these two processes (bioconcentration factor, BCF). 2) Ranges in these Cd bioaccumulation traits varied in magnitude across lineages (some lineages had a greater tendency to bioaccumulate Cd than others). Overlap in the ranges of trait values among different lineages was common and highlights situations where species from different lineages can share a similar trait state, but represent the high end of possible physiological values for one lineage and the low end for another. 3) Variance around the mean trait state differed widely across clades, suggesting that some groups (e.g., Ephemerellidae) are inherently more variable than others (e.g., Perlidae). Thus, trait variability/lability is at least partially a function of lineage. 4) Akaike information criterion (AIC) comparisons of statistical models were more often driven by clade than by other potential biological or ecological explanation tested. Clade-driven models generally improved with increasing taxonomic resolution. 5) Altogether, these findings suggest that lineage provides context for the analysis of species traits, and that failure to consider lineage in community-based analysis of traits may obscure important patterns of species responses to environmental change.

  14. Influence of Terrain and Land Cover on the Isotopic Composition of Seasonal Snowpack in Rocky Mountain Headwater Catchments Affected by Bark Beetle Induced Tree Mortality

    NASA Astrophysics Data System (ADS)

    Kipnis, E. L.; Murphy, M.; Klatt, A. L.; Miller, S. N.; Williams, D. G.

    2015-12-01

    Session H103: The Hydrology-Vegetation-Climate Nexus: Identifying Process Interactions and Environmental Shifts in Mountain Catchments. Snowpack accumulation and ablation remain difficult to estimate in forested headwater catchments. How physical terrain and forest cover separately and interactively influence spatial patterns of snow accumulation and ablation largely shapes the hydrologic response to land cover disturbances. Analysis of water isotopes in snowpack provides a powerful tool for examining integrated effects of water vapor exchange, selective redistribution, and melt. Snow water equivalence (SWE), δ2H, δ18O and deuterium excess (D-excess) of snowpack were examined throughout winter 2013-2014 across two headwater catchments impacted by bark beetle induced tree mortality. A USGS 10m DEM and a derived land cover product from 1m NAIP imagery were used to examine the effects of terrain features (e.g., elevation, slope, aspect) and canopy disturbance (e.g., live, bark-beetle killed) as predictors of D-excess, an expression of kinetic isotope effects, in snowpack. A weighting of Akaike's Information Criterion (AIC) values from multiple spatially lagged regression models describing D-excess variation for peak snowpack revealed strong effects of elevation and canopy mortality, and weaker, but significant effects of aspect and slope. Snowpack D-excess was lower in beetle-killed canopy patches compared to live green canopy patches, and at lower compared to high elevation locations, suggesting that integrated isotopic effects of vapor exchange, vertical advection of melted snow, and selective accumulation and redistribution varied systematically across the two catchments. The observed patterns illustrate the potential

  15. Recovery of native treefrogs after removal of nonindigenous Cuban Treefrogs, Osteopilus septentrionalis

    USGS Publications Warehouse

    Rice, K.G.; Waddle, J.H.; Miller, M.W.; Crockett, M.E.; Mazzotti, F.J.; Percival, H.F.

    2011-01-01

    Florida is home to several introduced animal species, especially in the southern portion of the state. Most introduced species are restricted to the urban and suburban areas along the coasts, but some species, like the Cuban Treefrog (Osteopilus septentrionalis), are locally abundant in natural protected areas. Although Cuban Treefrogs are known predators of native treefrog species as both adults and larvae, no study has demonstrated a negative effect of Cuban Treefrogs on native treefrog survival, abundance, or occupancy rate. We monitored survival, capture probability, abundance, and proportion of sites occupied by Cuban Treefrogs and two native species, Green Treefrogs (Hyla cinerea) and Squirrel Treefrogs (Hyla squirella), at four sites in Everglades National Park in southern Florida with the use of capture–mark–recapture techniques. After at least 5 mo of monitoring all species at each site we began removing every Cuban Treefrog captured. We continued to estimate survival, abundance, and occupancy rates of native treefrogs for 1 yr after the commencement of Cuban Treefrog removal. Mark–recapture models that included the effect of Cuban Treefrog removal on native treefrog survival did not have considerable Akaike's Information Criterion (AIC) weight, although capture rates of native species were generally very low prior to Cuban Treefrog removal. Estimated abundance of native treefrogs did increase after commencement of Cuban Treefrog removal, but also varied with the season of the year. The best models of native treefrog occupancy included a Cuban Treefrog removal effect at sites with high initial densities of Cuban Treefrogs. This study demonstrates that an introduced predator can have population-level effects on similar native species.

  16. Mapping the mean monthly precipitation of a small island using kriging with external drifts

    NASA Astrophysics Data System (ADS)

    Cantet, Philippe

    2017-01-01

    This study focuses on the spatial distribution of mean annual and monthly precipitation in a small island (1128 km2) named Martinique, located in the Lesser Antilles. Only 35 meteorological stations are available on the territory, which has a complex topography. With a digital elevation model (DEM), 17 covariates that are likely to explain precipitation were built. Several interpolation methods, such as regression-kriging (MLRK, PCRK, and PLSK) and external drift kriging (EDK) were tested using a cross-validation procedure. For the regression methods, predictors were chosen by established techniques whereas a new approach is proposed to select external drifts in a kriging which is based on a stepwise model selection by the Akaike Information Criterion (AIC). The prediction accuracy was assessed at validation sites with three different skill scores. Results show that using methods with no predictors such as inverse distance weighting (IDW) or universal kriging (UK) is inappropriate in such a territory. EDK appears to outperform regression methods for any criteria, and selecting predictors by our approach improves the prediction of mean annual precipitation compared to kriging with only elevation as drift. Finally, the predicting performance was also studied by varying the size of the training set, leading to less conclusive results for EDK. Nevertheless, the proposed method seems to be a good way to improve the mapping of climatic variables in a small island.
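Stepwise selection of drifts by AIC, of the kind proposed here, can be sketched as a greedy forward search over candidate covariates: at each step, add the covariate that lowers the AIC of an ordinary least squares fit the most, and stop when no addition helps. The covariate names and simulated data below are hypothetical, not the Martinique covariates:

```python
import numpy as np

def ols_aic(X, y):
    """Gaussian AIC of an ordinary least squares fit (X already holds the intercept)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n = len(y)
    sigma2 = (resid ** 2).mean()
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return 2 * (X.shape[1] + 1) - 2 * log_lik   # +1 counts the noise variance

def forward_select(covariates, y):
    """Greedy forward selection: repeatedly add the covariate that lowers AIC most."""
    chosen, X = [], np.ones((len(y), 1))
    best_aic = ols_aic(X, y)
    improved = True
    while improved:
        improved = False
        for name, col in covariates.items():
            if name in chosen:
                continue
            trial = np.column_stack([X, col])
            a = ols_aic(trial, y)
            if a < best_aic:
                best_aic, best_name, best_X = a, name, trial
                improved = True
        if improved:
            chosen.append(best_name)
            X = best_X
    return chosen, best_aic

# Hypothetical DEM-derived covariates: two real drivers plus an irrelevant candidate
rng = np.random.default_rng(1)
n = 300
elevation, slope, noise = (rng.normal(size=n) for _ in range(3))
rainfall = 3.0 * elevation + 1.0 * slope + rng.normal(scale=0.5, size=n)
chosen, _ = forward_select({"elevation": elevation, "slope": slope, "noise": noise}, rainfall)
```

The covariates retained this way then serve as the external drifts of the kriging model, which is how the approach can improve on kriging with elevation alone.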

  17. Possible Causes of a Harbour Porpoise Mass Stranding in Danish Waters in 2005

    PubMed Central

    Wright, Andrew J.; Maar, Marie; Mohn, Christian; Nabe-Nielsen, Jacob; Siebert, Ursula; Jensen, Lasse Fast; Baagøe, Hans J.; Teilmann, Jonas

    2013-01-01

    An unprecedented 85 harbour porpoises stranded freshly dead along approximately 100 km of Danish coastline from 7–15 April, 2005. This total is considerably above the mean weekly stranding rate for the whole of Denmark, both for any time of year, 1.23 animals/week (ranging from 0 to 20 during 2003–2008, excluding April 2005), and specifically in April, 0.65 animals/week (0 to 4, same period). Bycatch was established as the cause of death for most of the individuals through typical indications of fisheries interactions, including net markings in the skin and around the flippers, and loss of tail flukes. Local fishermen confirmed unusually large porpoise bycatch in nets set for lumpfish (Cyclopterus lumpus) and the strandings were attributed to an early lumpfish season. However, lumpfish catches for 2005 were not unusual in terms of season onset, peak or total catch, when compared to 2003–2008. Consequently, human activity was combined with environmental factors and the variation in Danish fisheries landings (determined through a principal component analysis) in a two-part statistical model to assess the correlation of these factors with both the presence of fresh strandings and the numbers of strandings on the Danish west coast. The final statistical model (which was forward selected using Akaike information criterion; AIC) indicated that naval presence is correlated with higher rates of porpoise strandings, particularly in combination with certain fisheries, although it is not correlated with the actual presence of strandings. Military vessels from various countries were confirmed in the area from the 7th April, en route to the largest naval exercise in Danish waters to date (Loyal Mariner 2005, 11–28 April). Although sonar usage cannot be confirmed, it is likely that ships were testing various equipment prior to the main exercise. Thus naval activity cannot be ruled out as a possible contributing factor. PMID:23460787

  18. Effects of reproductive condition, roost microclimate, and weather patterns on summer torpor use by a vespertilionid bat.

    PubMed

    Johnson, Joseph S; Lacki, Michael J

    2014-01-01

    A growing number of mammal species are recognized as heterothermic, capable of maintaining a high core body temperature or entering a state of metabolic suppression known as torpor. Small mammals can achieve large energetic savings when torpid, but they are also subject to ecological costs. Studying torpor use in an ecological and physiological context can help elucidate relative costs and benefits of torpor to different groups within a population. We measured skin temperatures of 46 adult Rafinesque's big-eared bats (Corynorhinus rafinesquii) to evaluate thermoregulatory strategies of a heterothermic small mammal during the reproductive season. We compared daily average and minimum skin temperatures as well as the frequency, duration, and depth of torpor bouts of sex and reproductive classes of bats inhabiting day-roosts with different thermal characteristics. We evaluated roosts with microclimates colder (caves) and warmer (buildings) than ambient air temperatures, as well as roosts with intermediate conditions (trees and rock crevices). Using Akaike's information criterion (AIC), we found that different statistical models best predicted various characteristics of torpor bouts. While the type of day-roost best predicted the average number of torpor bouts that bats used each day, current weather variables best predicted daily average and minimum skin temperatures of bats, and reproductive condition best predicted average torpor bout depth and the average amount of time spent torpid each day by bats. Finding that different models best explain varying aspects of heterothermy illustrates the importance of torpor to both reproductive and nonreproductive small mammals and emphasizes the multifaceted nature of heterothermy and the need to collect data on numerous heterothermic response variables within an ecophysiological context.

  19. Modeling fecal bacteria transport and retention in agricultural and urban soils under saturated and unsaturated flow conditions.

    PubMed

    Balkhair, Khaled S

    2017-03-01

    Pathogenic bacteria, that enter surface water bodies and groundwater systems through unmanaged wastewater land application, pose a great risk to human health. In this study, six soil column experiments were conducted to simulate the vulnerability of agricultural and urban field soils for fecal bacteria transport and retention under saturated and unsaturated flow conditions. HYDRUS-1D kinetic attachment and kinetic attachment-detachment models were used to simulate the breakthrough curves of the experimental data by fitting model parameters. Results indicated significant differences in the retention and drainage of bacteria between saturated and unsaturated flow condition in the two studied soils. Flow under unsaturated condition retained more bacteria than the saturated flow case. The high bacteria retention in the urban soil compared to agricultural soil is ascribed not only to the dynamic attachment and sorption mechanisms but also to the greater surface area of fine particles and low flow rate. All models simulated experimental data satisfactorily under saturated flow conditions; however, under variably saturated flow, the peak concentrations were overestimated by the attachment-detachment model and underestimated by the attachment model with blocking. The good match between observed data and simulated concentrations by the attachment model which was supported by the Akaike information criterion (AIC) for model selection indicates that the first-order attachment coefficient was sufficient to represent the quantitative and temporal distribution of bacteria in the soil column. On the other hand, the total mass balance of the drained and retained bacteria in all transport experiments was in the range of values commonly found in the literature. Regardless of flow conditions and soil texture, most of the bacteria were retained in the top 12 cm of the soil column. The approaches and the models used in this study have proven to be a good tool for simulating fecal

  20. Biogeographical Interpretation of Elevational Patterns of Genus Diversity of Seed Plants in Nepal.

    PubMed

    Li, Miao; Feng, Jianmeng

    2015-01-01

    This study tests if the biogeographical affinities of genera are relevant for explaining elevational plant diversity patterns in Nepal. We used simultaneous autoregressive (SAR) models to investigate the explanatory power of several predictors in explaining the diversity-elevation relationships shown in genera with different biogeographical affinities. The delta Akaike information criterion (ΔAIC) was used for multi-model inferences and selections. Our results showed that both the total and tropical genus diversity peaked below the mid-point of the elevational gradient, whereas that of temperate genera had a nearly symmetrical, unimodal relationship with elevation. The proportion of temperate genera increased markedly with elevation, while that of tropical genera declined. Compared to tropical genera, temperate genera had wider elevational ranges and were observed at higher elevations. Water-related variables, rather than mid-domain effects (MDE), were the most significant predictors of elevational patterns of tropical genus diversity. The temperate genus diversity was influenced by energy availability, but only in quadratic terms of the models. Though climatic factors and mid-domain effects jointly explained most of the variation in the diversity of temperate genera with elevation, the former played stronger roles. Total genus diversity was most strongly influenced by climate and the floristic overlap of tropical and temperate floras, while the influences of mid-domain effects were relatively weak. The influences of water-related and energy-related variables may vary with biogeographical affinities. The elevational patterns may be most closely related to climatic factors, while MDE may somewhat modify the patterns. Caution is needed when investigating the causal factors underlying diversity patterns for large taxonomic groups composed of taxa of different biogeographical affinities. Right-skewed diversity-elevation patterns may be produced by the differential

  1. Possible causes of a harbour porpoise mass stranding in Danish waters in 2005.

    PubMed

    Wright, Andrew J; Maar, Marie; Mohn, Christian; Nabe-Nielsen, Jacob; Siebert, Ursula; Jensen, Lasse Fast; Baagøe, Hans J; Teilmann, Jonas

    2013-01-01

    An unprecedented 85 harbour porpoises stranded freshly dead along approximately 100 km of Danish coastline from 7-15 April, 2005. This total is considerably above the mean weekly stranding rate for the whole of Denmark, both for any time of year, 1.23 animals/week (ranging from 0 to 20 during 2003-2008, excluding April 2005), and specifically in April, 0.65 animals/week (0 to 4, same period). Bycatch was established as the cause of death for most of the individuals through typical indications of fisheries interactions, including net markings in the skin and around the flippers, and loss of tail flukes. Local fishermen confirmed unusually large porpoise bycatch in nets set for lumpfish (Cyclopterus lumpus) and the strandings were attributed to an early lumpfish season. However, lumpfish catches for 2005 were not unusual in terms of season onset, peak or total catch, when compared to 2003-2008. Consequently, human activity was combined with environmental factors and the variation in Danish fisheries landings (determined through a principal component analysis) in a two-part statistical model to assess the correlation of these factors with both the presence of fresh strandings and the numbers of strandings on the Danish west coast. The final statistical model (which was forward selected using Akaike information criterion; AIC) indicated that naval presence is correlated with higher rates of porpoise strandings, particularly in combination with certain fisheries, although it is not correlated with the actual presence of strandings. Military vessels from various countries were confirmed in the area from the 7th April, en route to the largest naval exercise in Danish waters to date (Loyal Mariner 2005, 11-28 April). Although sonar usage cannot be confirmed, it is likely that ships were testing various equipment prior to the main exercise. Thus naval activity cannot be ruled out as a possible contributing factor.

  2. Landscape conditions predisposing grizzly bears to conflicts on private agricultural lands in the western USA

    USGS Publications Warehouse

    Wilson, S.M.; Madel, M.J.; Mattson, D.J.; Graham, J.M.; Merrill, T.

    2006-01-01

    We used multiple logistic regression to model how different landscape conditions contributed to the probability of human-grizzly bear conflicts on private agricultural ranch lands. We used locations of livestock pastures, traditional livestock carcass disposal areas (boneyards), beehives, and wetland-riparian associated vegetation to model the locations of 178 reported human-grizzly bear conflicts along the Rocky Mountain East Front, Montana, USA during 1986-2001. We surveyed 61 livestock producers in the upper Teton watershed of north-central Montana, to collect spatial and temporal data on livestock pastures, boneyards, and beehives for the same period, accounting for changes in livestock and boneyard management and beehive location and protection, for each season. We used 2032 random points to represent the null hypothesis of random location relative to potential explanatory landscape features, and used Akaike's Information Criterion (AIC/AICc) and Hosmer-Lemeshow goodness-of-fit statistics for model selection. We used a resulting "best" model to map contours of predicted probabilities of conflict, and used this map for verification with an independent dataset of conflicts to provide additional insights regarding the nature of conflicts. The presence of riparian vegetation and distances to spring, summer, and fall sheep or cattle pastures, calving and sheep lambing areas, unmanaged boneyards, and fenced and unfenced beehives were all associated with the likelihood of human-grizzly bear conflicts. Our model suggests that collections of attractants concentrated in high quality bear habitat largely explain broad patterns of human-grizzly bear conflicts on private agricultural land in our study area. ?? 2005 Elsevier Ltd. All rights reserved.

  3. Modeling Heteroscedasticity of Wind Speed Time Series in the United Arab Emirates

    NASA Astrophysics Data System (ADS)

    Kim, H. Y.; Marpu, P. R.; Ouarda, T.

    2014-12-01

There has been a growing interest in wind resources in the Gulf region, not only for evaluating wind energy potential, but also for understanding and forecasting changes in wind as a regional climate variable. In particular, time-varying variance—the second-order moment—or heteroscedasticity in wind time series is important to investigate, since high variance causes turbulence, which affects wind power potential and may lead to structural changes in wind turbines. Nevertheless, the conditional variance of wind time series has rarely been explored, especially in the Gulf region. Therefore, the seasonal autoregressive integrated moving average-generalized autoregressive conditional heteroscedasticity (SARIMA-GARCH) model is applied to observed wind data in the United Arab Emirates (UAE). This model accounts for the apparent seasonality present in wind time series and for the heteroscedasticity in the residuals indicated by the Engle test, in order to understand and forecast changes in the conditional variance of wind time series. In this study, the autocorrelation function of daily average wind speed time series obtained from seven stations within the UAE—Al Aradh, Al Mirfa, Al Wagan, East of Jebel Haffet, Madinat Zayed, Masdar City, Sir Bani Yas Island—is inspected to fit a SARIMA model. The best SARIMA model is selected according to the minimum Akaike Information Criterion (AIC) and based on the residuals of the model. Then, the GARCH model is applied to the remaining residuals to capture the conditional variance of the SARIMA model. Results indicate that the SARIMA-GARCH model provides a good fit to wind data in the UAE.
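The minimum-AIC order selection described here can be illustrated in miniature: fit competing autoregressive orders to a simulated series and keep the order with the lowest AIC. A pure-Python sketch (illustrative only; the seasonality and GARCH steps of the actual study are omitted, and the simulated series stands in for real wind data):

```python
import math
import random

random.seed(42)

# Simulate an AR(2) series as a stand-in for a daily wind-speed series:
# x_t = 0.6*x_{t-1} - 0.3*x_{t-2} + e_t
n = 500
x = [0.0, 0.0]
for _ in range(n - 2):
    x.append(0.6 * x[-1] - 0.3 * x[-2] + random.gauss(0.0, 1.0))

def gaussian_aic(rss, m, k):
    """AIC of a Gaussian least-squares fit with k parameters: m*ln(RSS/m) + 2k."""
    return m * math.log(rss / m) + 2 * k

# AR(1) by least squares.
a1 = sum(x[t - 1] * x[t] for t in range(1, n)) / sum(x[t - 1] ** 2 for t in range(1, n))
rss1 = sum((x[t] - a1 * x[t - 1]) ** 2 for t in range(1, n))
aic1 = gaussian_aic(rss1, n - 1, 2)  # one coefficient + error variance

# AR(2) by least squares (2x2 normal equations solved with Cramer's rule).
s11 = sum(x[t - 1] ** 2 for t in range(2, n))
s22 = sum(x[t - 2] ** 2 for t in range(2, n))
s12 = sum(x[t - 1] * x[t - 2] for t in range(2, n))
r1 = sum(x[t - 1] * x[t] for t in range(2, n))
r2 = sum(x[t - 2] * x[t] for t in range(2, n))
det = s11 * s22 - s12 * s12
a, b = (r1 * s22 - r2 * s12) / det, (s11 * r2 - s12 * r1) / det
rss2 = sum((x[t] - a * x[t - 1] - b * x[t - 2]) ** 2 for t in range(2, n))
aic2 = gaussian_aic(rss2, n - 2, 3)  # two coefficients + error variance

best_order = 1 if aic1 < aic2 else 2  # the data were generated as AR(2)
```

AIC trades the larger model's improved fit against its extra parameter; a SARIMA order search applies the same logic over a larger candidate set.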

  4. Estimating rates of local extinction and colonization in colonial species and an extension to the metapopulation and community levels

    USGS Publications Warehouse

    Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.

    2003-01-01

    Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. We also believe that this methodological framework has wide application to problems in animal
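The first approach above (all colonies detected) can be sketched by counting presence-absence transitions, taking the maximum-likelihood extinction and colonization probabilities, and comparing a constrained model against the general one by AIC. The colony histories below are hypothetical, not the heron data:

```python
import math

# Hypothetical presence (1) / absence (0) histories for 6 colony sites
# over 8 seasons; rows are sites.
histories = [
    [1, 1, 1, 0, 0, 1, 1, 1],
    [1, 0, 0, 0, 1, 1, 0, 0],
    [0, 0, 1, 1, 1, 1, 1, 1],
    [1, 1, 0, 1, 1, 0, 0, 1],
    [0, 1, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1, 0],
]

# Count the four transition types across all sites and seasons.
n11 = n10 = n01 = n00 = 0
for h in histories:
    for a, b in zip(h, h[1:]):
        if a == 1 and b == 1: n11 += 1
        elif a == 1 and b == 0: n10 += 1
        elif a == 0 and b == 1: n01 += 1
        else: n00 += 1

def binom_ll(k, n):
    """Maximized log-likelihood of k successes in n Bernoulli trials."""
    if k == 0 or k == n:
        return 0.0
    p = k / n
    return k * math.log(p) + (n - k) * math.log(1 - p)

# General model (2 parameters): extinction prob = n10/(n11+n10),
# colonization prob = n01/(n01+n00), estimated separately.
ll_general = binom_ll(n10, n11 + n10) + binom_ll(n01, n01 + n00)
aic_general = 2 * 2 - 2 * ll_general

# Constrained model (1 parameter): one common turnover probability.
ll_constr = binom_ll(n10 + n01, n11 + n10 + n01 + n00)
aic_constr = 2 * 1 - 2 * ll_constr
```

The general model always fits at least as well, but AIC can still prefer the constrained model when the extra parameter buys too little likelihood; with detection probabilities below one, the same comparison requires the robust-design estimators described in the abstract.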

  5. Bait stations, hard mast, and black bear population growth in Great Smoky Mountains National Park

    USGS Publications Warehouse

    Clark, Joseph D.; van Manen, Frank T.; Pelton, Michael R.

    2005-01-01

Bait-station surveys are used by wildlife managers as an index to American black bear (Ursus americanus) population abundance, but the relationship is not well established. Hard mast surveys are similarly used to assess annual black bear food availability, which may affect mortality and natality rates. We used data collected in Great Smoky Mountains National Park (GSMNP) from 1989 to 2003 to determine whether changes in the bait-station index (ΔBSI) were associated with estimated rates of bear population growth (λ) and whether hard mast production was related to bear visitation to baits. We also evaluated whether hard mast production from previous years was related to λ. Estimates of λ were based on analysis of capture-recapture data with the Pradel temporal symmetry estimator. Using Akaike's Information Criterion (AIC), our analysis revealed no direct relationship between ΔBSI and λ. A simulation analysis indicated that our data were adequate to detect a relationship had one existed. Model fit was marginally improved when we added total oak mast production of the previous year as an interaction term, suggesting that the BSI was confounded with environmental variables. Consequently, the utility of the bait-station survey as a population monitoring technique is questionable at the spatial and temporal scales we studied. Mast survey data, however, were valuable covariates of λ. Population growth for a given year was negatively related to oak mast production 4 and 5 years prior. That finding supported our hypothesis that mast failures can trigger reproductive synchrony, which may not be evident from the trapped sample until years later.

  6. Temporal and spatial characteristics of extreme precipitation events in the Midwest of Jilin Province based on multifractal detrended fluctuation analysis method and copula functions

    NASA Astrophysics Data System (ADS)

    Guo, Enliang; Zhang, Jiquan; Si, Ha; Dong, Zhenhua; Cao, Tiehua; Lan, Wu

    2016-08-01

Environmental changes have brought significant changes and challenges to water resources and their management worldwide, including increasing climate variability, land use change, intensive agriculture, rapid urbanization and industrial development, and especially more frequent extreme precipitation events, all of which greatly affect water resources and socioeconomic development. In this study, we take extreme precipitation events in the Midwest of Jilin Province as an example, using daily precipitation data during 1960-2014. The threshold of extreme precipitation events is defined by the multifractal detrended fluctuation analysis (MF-DFA) method. Extreme precipitation (EP), extreme precipitation ratio (EPR), and intensity of extreme precipitation (EPI) are selected as the extreme precipitation indicators, and the Kolmogorov-Smirnov (K-S) test is employed to determine the optimal probability distribution function for each indicator. On this basis, a nonparametric copula estimation method and the Akaike Information Criterion (AIC) are adopted to determine the bivariate copula function. Finally, we analyze the single-variable extremes and the bivariate joint probability distribution of the extreme precipitation events. The results show that the threshold of extreme precipitation events in semi-arid areas is far lower than that in subhumid areas. Extreme precipitation frequency shows a significant decline while extreme precipitation intensity shows a growing trend; there are significant spatiotemporal differences in extreme precipitation events. The joint return period becomes shorter from west to east, while the spatial distribution of the co-occurrence return period shows the opposite pattern and is longer than the joint return period.
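The copula-selection step (fit candidate copulas, keep the one with minimum AIC) can be sketched in pure Python by comparing a one-parameter Clayton copula against the independence copula on a hypothetical dependent sample. This illustrates the criterion only, not the study's estimation code:

```python
import math
import random

random.seed(1)
theta_true = 2.0

# Sample (u, v) from a Clayton copula via its conditional inverse.
sample = []
for _ in range(300):
    u = random.random()
    w = random.random()
    v = (u ** -theta_true * (w ** (-theta_true / (1 + theta_true)) - 1) + 1) ** (-1 / theta_true)
    sample.append((u, v))

def clayton_loglik(theta):
    """Log-likelihood of the Clayton copula density for the sample."""
    ll = 0.0
    for u, v in sample:
        ll += (math.log(1 + theta)
               - (1 + theta) * (math.log(u) + math.log(v))
               - (2 + 1 / theta) * math.log(u ** -theta + v ** -theta - 1))
    return ll

# Crude grid-search MLE for the Clayton parameter.
grid = [0.1 * i for i in range(1, 81)]
theta_hat = max(grid, key=clayton_loglik)

aic_clayton = 2 * 1 - 2 * clayton_loglik(theta_hat)  # one fitted parameter
aic_indep = 0.0  # independence copula: density 1, zero parameters
```

With genuinely dependent margins the Clayton AIC comes out far below the independence AIC, so the dependent copula is retained; in practice one would search over several copula families the same way.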

  7. SU-E-T-399: Determination of the Radiobiological Parameters That Describe the Dose-Response Relations of Xerostomia and Disgeusia From Head and Neck Radiotherapy

    SciTech Connect

    Mavroidis, P; Stathakis, S; Papanikolaou, N; Peixoto Xavier, C; Costa Ferreira, B; Khouri, L; Carmo Lopes, M do

    2014-06-01

Purpose: To estimate the radiobiological parameters that describe the dose-response relations of xerostomia and disgeusia from head and neck cancer radiotherapy. To identify the organs that are best correlated with the manifestation of those clinical endpoints. Finally, to evaluate the goodness-of-fit by comparing the model predictions against the actual clinical results. Methods: In this study, 349 head and neck cancer patients were included. For each patient the dose volume histograms (DVH) of parotids (separate and combined), mandible, submandibular glands (separate and combined) and salivary glands were calculated. The follow-up of those patients was recorded at different times after the completion of the treatment (7 weeks, 3, 7, 12, 18 and 24 months). Acute and late xerostomia and acute disgeusia were the clinical endpoints examined. A maximum likelihood fitting was performed to calculate the best estimates of the parameters used by the relative seriality model. The statistical methods of the error distribution, the receiver operating characteristic (ROC) curve, Pearson's test and the Akaike information criterion were utilized to assess the goodness-of-fit and the agreement between the pattern of the radiobiological predictions and that of the clinical records. Results: The estimated values of the radiobiological parameters of salivary glands are D50 = 25.2 Gy, γ = 0.52, s = 0.001. The statistical analysis confirmed the clinical validity of those parameters (area under the ROC curve = 0.65 and AIC = 38.3). Conclusion: The analysis proved that the treatment outcome pattern of the patient material can be reproduced by the relative seriality model and the estimated radiobiological parameters. Salivary glands were found to have strong volume dependence (low relative seriality).
Diminishing the biologically effective uniform dose to salivary glands below 30 Gy may significantly reduce the risk of complications for patients irradiated for head and neck cancer.
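The relative seriality model referenced here has a standard published form (Källman et al.). A minimal sketch, assuming the usual Poisson-based subvolume response and plugging in the abstract's fitted salivary-gland parameters with a hypothetical dose-volume histogram (the DVH bins below are invented for illustration):

```python
import math

def response_prob(d, d50, gamma):
    """Poisson-model subvolume response: 2**(-exp(e*gamma*(1 - d/d50)))."""
    return 2.0 ** (-math.exp(math.e * gamma * (1.0 - d / d50)))

def ntcp_relative_seriality(dvh, d50, gamma, s):
    """Relative seriality NTCP; dvh is a list of (dose_Gy, fractional_volume)
    bins whose fractional volumes sum to 1."""
    prod = 1.0
    for d, dv in dvh:
        p = response_prob(d, d50, gamma)
        prod *= (1.0 - p ** s) ** dv
    return (1.0 - prod) ** (1.0 / s)

# Abstract's fitted salivary-gland parameters and a hypothetical DVH.
D50, GAMMA, S = 25.2, 0.52, 0.001
dvh = [(10.0, 0.25), (22.0, 0.35), (30.0, 0.25), (40.0, 0.15)]
ntcp = ntcp_relative_seriality(dvh, D50, GAMMA, S)
```

By construction, a uniform dose equal to D50 yields a complication probability of 0.5, and the very low seriality s = 0.001 makes the organ behave almost purely in parallel, matching the "strong volume dependence" conclusion.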

  8. Plant species invasions along the latitudinal gradient in the United States

    USGS Publications Warehouse

    Stohlgren, T.J.; Barnett, D.; Flather, C.; Kartesz, J.; Peterjohn, B.

    2005-01-01

It has long been established that the richness of vascular plant species and many animal taxa decreases with increasing latitude, a pattern that very generally follows declines in actual and potential evapotranspiration, solar radiation, temperature, and thus, total productivity. Using county-level data on vascular plants from the United States (3000 counties in the conterminous 48 states), we used the Akaike Information Criterion (AIC) to evaluate competing models predicting native and nonnative plant species density (number of species per square kilometer in a county) from various combinations of biotic variables (e.g., native bird species density, vegetation carbon, normalized difference vegetation index), environmental/topographic variables (elevation, variation in elevation, the number of land cover classes in the county, radiation, mean precipitation, actual evapotranspiration, and potential evapotranspiration), and human variables (human population density, cropland, and percentage of disturbed lands in a county). We found no evidence of a latitudinal gradient for the density of native plant species and a significant, slightly positive latitudinal gradient for the density of nonnative plant species. We found stronger evidence of a significant, positive productivity gradient (vegetation carbon) for the density of native plant species and nonnative plant species. We found much stronger significant relationships when biotic, environmental/topographic, and human variables were used to predict native plant species density and nonnative plant species density. Biotic variables generally had far greater influence in multivariate models than human or environmental/topographic variables. Finally, we found that the best, single, positive predictor of the density of nonnative plant species in a county was the density of native plant species in a county.
While further study is needed, it may be that, while humans facilitate the initial establishment invasions of nonnative
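Comparisons of competing AIC-scored models like those above are usually summarized with ΔAIC values and Akaike weights, which convert the scores into relative support for each model. A short sketch; the model names and AIC values are hypothetical, not from the paper:

```python
import math

# Hypothetical AIC scores for competing predictor sets (illustrative only).
aic = {
    "biotic": 1210.4,
    "environment": 1225.9,
    "human": 1232.1,
    "biotic+human": 1208.7,
}

best = min(aic.values())
delta = {m: a - best for m, a in aic.items()}          # ΔAIC per model
rel_lik = {m: math.exp(-d / 2) for m, d in delta.items()}
total = sum(rel_lik.values())
weights = {m: r / total for m, r in rel_lik.items()}   # Akaike weights
```

The weights sum to one and can be read as the probability that each candidate is the best approximating model in the set; they are also what model-averaged (multimodel) estimates are weighted by.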

  9. A functional biological network centered on XRCC3: a new possible marker of chemoradiotherapy resistance in rectal cancer patients.

    PubMed

    Agostini, Marco; Zangrando, Andrea; Pastrello, Chiara; D'Angelo, Edoardo; Romano, Gabriele; Giovannoni, Roberto; Giordan, Marco; Maretto, Isacco; Bedin, Chiara; Zanon, Carlo; Digito, Maura; Esposito, Giovanni; Mescoli, Claudia; Lavitrano, Marialuisa; Rizzolio, Flavio; Jurisica, Igor; Giordano, Antonio; Pucciarelli, Salvatore; Nitti, Donato

    2015-01-01

Preoperative chemoradiotherapy is widely used to improve local control of disease, sphincter preservation and to improve survival in patients with locally advanced rectal cancer. Patients enrolled in the present study underwent preoperative chemoradiotherapy, followed by surgical excision. Response to chemoradiotherapy was evaluated according to Mandard's Tumor Regression Grade (TRG). TRG 3, 4 and 5 were considered as partial or no response while TRG 1 and 2 as complete response. Of the pretherapeutic biopsies of 84 locally advanced rectal carcinomas available for the analysis, only 42 showed at least 70% cancer cellularity. By determining gene expression profiles, responders and non-responders showed significantly different expression levels for 19 genes (P < 0.001). We fitted a logistic model selected with a stepwise procedure optimizing the Akaike Information Criterion (AIC) and then validated by means of leave-one-out cross validation (LOOCV, accuracy = 95%). Four genes were retained in the achieved model: ZNF160, XRCC3, HFM1 and ASXL2. Real-time PCR confirmed that XRCC3 is overexpressed in the responders group, and HFM1 and ASXL2 showed a positive trend. In vitro tests on colon cancer cells resistant or susceptible to chemoradiotherapy finally proved that XRCC3 deregulation is extensively involved in the chemoresistance mechanisms. Protein-protein interaction (PPI) analysis involving the predictive classifier revealed a network of 45 interacting nodes (proteins) with the TRAF6 gene playing a keystone role in the network. The present study confirmed the possibility that gene expression profiling combined with integrative computational biology is useful to predict complete responses to preoperative chemoradiotherapy in patients with advanced rectal cancer.
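The stepwise procedure described above scores each candidate logistic model by AIC = 2k - 2*logL. A self-contained sketch on hypothetical data, fitting a one-predictor logistic model by gradient ascent and comparing it against the intercept-only model (this is not the study's four-gene classifier; the expression values are invented):

```python
import math

# Hypothetical expression level (x) and response status (y: 1 = responder).
xs = [-1.5, -1.1, -0.8, -0.3, -0.2, 0.1, 0.4, 0.8, 1.2, 1.6]
ys = [0, 0, 0, 0, 1, 0, 1, 1, 1, 1]

def loglik(b0, b1):
    """Bernoulli log-likelihood of a logistic model with given coefficients."""
    ll = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
        ll += y * math.log(p) + (1 - y) * math.log(1 - p)
    return ll

# Fit (b0, b1) by plain gradient ascent on the concave log-likelihood.
b0 = b1 = 0.0
for _ in range(5000):
    g0 = g1 = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
        g0 += y - p
        g1 += (y - p) * x
    b0 += 0.05 * g0
    b1 += 0.05 * g1

aic_full = 2 * 2 - 2 * loglik(b0, b1)  # intercept + slope

# Intercept-only null model: fitted p is the response rate, one parameter.
p_null = sum(ys) / len(ys)
ll_null = sum(y * math.log(p_null) + (1 - y) * math.log(1 - p_null) for y in ys)
aic_null = 2 * 1 - 2 * ll_null
```

A stepwise search simply repeats this comparison, adding or dropping predictors while the AIC keeps decreasing; LOOCV then checks the selected model on held-out cases.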

  10. Climatic patterns in the establishment of wintering areas by North American migratory birds.

    PubMed

    Pérez-Moreno, Heidi; Martínez-Meyer, Enrique; Soberón Mainero, Jorge; Rojas-Soto, Octavio

    2016-04-01

Long-distance migration in birds is relatively well studied in nature; however, one aspect of this phenomenon that remains poorly understood is the pattern of distribution presented by species during arrival to and establishment of wintering areas. Some studies suggest that the selection of areas in winter is somehow determined by climate, given its influence on both the distribution of bird species and their resources. We analyzed whether different migrant passerine species of North America present climatic preferences during arrival to and departure from their wintering areas. We used ecological niche modeling to generate monthly potential climatic distributions for 13 migratory bird species during the winter season by combining the locations recorded per month with four environmental layers. We calculated monthly coefficients of climate variation and then compared two GLMs (generalized linear models), evaluated with the AIC (Akaike information criterion), to describe how these coefficients varied over the course of the season, as a measure of the patterns of establishment in the wintering areas. For 11 species, the sites show nonlinear patterns of variation in climatic preferences, with low coefficients of variation at the beginning and end of the season and higher values found in the intermediate months. The remaining two species analyzed showed a different climatic pattern of selective establishment of wintering areas, probably due to taxonomic discrepancy, which would affect their modeled winter distribution. Patterns of establishment of wintering areas in the species showed a climatic preference at the macroscale, suggesting that individuals of several species actively select wintering areas that meet specific climatic conditions. This probably gives them an advantage over the winter and during the return to breeding areas. As these areas become full of migrants, alternative suboptimal sites are occupied. Nonrandom winter area selection may also have

  11. Biogeographical Interpretation of Elevational Patterns of Genus Diversity of Seed Plants in Nepal

    PubMed Central

    Li, Miao; Feng, Jianmeng

    2015-01-01

This study tests if the biogeographical affinities of genera are relevant for explaining elevational plant diversity patterns in Nepal. We used simultaneous autoregressive (SAR) models to investigate the explanatory power of several predictors in explaining the diversity-elevation relationships shown in genera with different biogeographical affinities. The delta Akaike information criterion (ΔAIC) was used for multi-model inferences and selections. Our results showed that both the total and tropical genus diversity peaked below the mid-point of the elevational gradient, whereas that of temperate genera had a nearly symmetrical, unimodal relationship with elevation. The proportion of temperate genera increased markedly with elevation, while that of tropical genera declined. Compared to tropical genera, temperate genera had wider elevational ranges and were observed at higher elevations. Water-related variables, rather than mid-domain effects (MDE), were the most significant predictors of elevational patterns of tropical genus diversity. The temperate genus diversity was influenced by energy availability, but only in quadratic terms of the models. Though climatic factors and mid-domain effects jointly explained most of the variation in the diversity of temperate genera with elevation, the former played stronger roles. Total genus diversity was most strongly influenced by climate and the floristic overlap of tropical and temperate floras, while the influences of mid-domain effects were relatively weak. The influences of water-related and energy-related variables may vary with biogeographical affinities. The elevational patterns may be most closely related to climatic factors, while MDE may somewhat modify the patterns. Caution is needed when investigating the causal factors underlying diversity patterns for large taxonomic groups composed of taxa of different biogeographical affinities. Right-skewed diversity-elevation patterns may be produced by the differential

  12. Potential for Inclusion of Information Encountering within Information Literacy Models

    ERIC Educational Resources Information Center

    Erdelez, Sanda; Basic, Josipa; Levitov, Deborah D.

    2011-01-01

    Introduction: Information encountering (finding information while searching for some other information), is a type of opportunistic discovery of information that complements purposeful approaches to finding information. The motivation for this paper was to determine if the current models of information literacy instruction refer to information…

  13. General Information about Melanoma

    MedlinePlus

    ... Treatment for more information.) Unusual moles, exposure to sunlight, and health history can affect the risk of ... Red or blond hair. Being exposed to natural sunlight or artificial sunlight (such as from tanning beds) ...

  14. Alternative fuel information sources

    SciTech Connect

    Not Available

    1994-06-01

    This short document contains a list of more than 200 US sources of information (Name, address, phone number, and sometimes contact) related to the use of alternative fuels in automobiles and trucks. Electric-powered cars are also included.

  15. Physiological Information Database (PID)

    EPA Science Inventory

    EPA has developed a physiological information database (created using Microsoft ACCESS) intended to be used in PBPK modeling. The database contains physiological parameter values for humans from early childhood through senescence as well as similar data for laboratory animal spec...

  16. Online patient information.

    PubMed

    Oakley, Amanda

    2002-09-01

    Information appropriate for patients with skin diseases is readily available on the Internet. Patients primarily seek disease-related information, but may also search for a dermatologist, shop for skin care products, or look to a consumer organization for support. Authoritative educational material is supplied by academic dermatologic associations and institutions and distributed by independent websites, large health portals, and search directories. Interactive opportunities include bulletin boards, ask-an-expert forums, and live chat. Although it is easy to find excellent dermatological information, the Internet is dynamic and unmoderated and patients can be misled or exploited by inaccurate or fraudulent websites. Health on the Net and other organizations have developed ethical principles to aid consumers in evaluating the quality of health-related information.

  17. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  18. Information for Adults

    MedlinePlus

    ... Foundation has shared over 7,000 Gund Teddy Bears with repaired cleft lips with children and families ... call the Cleftline for more information about our bears. If you are interested in helping us continue ...

  19. Evaluating Informal Support.

    ERIC Educational Resources Information Center

    Litwin, Howard; Auslander, Gail K.

    1990-01-01

    Dilemmas inherent in the attempt to measure and evaluate informal supports available to individuals in need of social care are illustrated through a study of 400 elderly persons in Jerusalem. Practical guidelines for evaluation are presented. (SLD)

  20. PREFACE: Quantum information processing

    NASA Astrophysics Data System (ADS)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  1. Energy Information Online

    ERIC Educational Resources Information Center

    Miller, Betty

    1978-01-01

    The need to search several files to obtain the maximum information on energy is emphasized. Energyline, APILIT, APIPAT, PIE News, TULSA, NTIS, and Chemical Abstracts Condensates files are described. (KP)

  2. The Information Gap.

    ERIC Educational Resources Information Center

    Sharp, Bill; Appleton, Elaine

    1993-01-01

Addresses the misconception that the ecosystems involving plants and animals in our national parks are thoroughly monitored. Discusses research and other programs designed to inventory species throughout the national parks and to inform the national parks concerning their ecosystems. (MDH)

  3. Zika Travel Information

    MedlinePlus

    ... GeoSentinel Global TravEpiNet Mobile Apps RSS Feeds Zika Travel Information Recommend on Facebook Tweet Share Compartir Language: ... Map of Areas with Risk of Zika Zika Travel Notices Zika Virus in Cape Verde Zika Virus ...

  4. Value of Information References

    SciTech Connect

    Morency, Christina

    2014-12-12

    This file contains a list of relevant references on value of information (VOI) in RIS format. VOI provides a quantitative analysis to evaluate the outcome of the combined technologies (seismology, hydrology, geodesy) used to monitor Brady's Geothermal Field.

  6. Designing Information Interoperability

    SciTech Connect

    Gorman, Bryan L.; Shankar, Mallikarjun; Resseguie, David R.

    2009-01-01

Examples of incompatible systems are offered with a discussion of the relationship between incompatibility and innovation. Engineering practices and the role of standards are reviewed as a means of resolving issues of incompatibility, with particular attention to the issue of innovation. Loosely-coupled systems are described as a means of achieving and sustaining both interoperability and innovation in heterogeneous environments. A virtual unifying layer, in terms of a standard, a best practice, and a methodology, is proposed as a modality for designing information interoperability for enterprise applications. The Uniform Resource Identifier (URI), microformats, and Joshua Porter's AOF Method are described and presented as solutions for designing interoperable information-sharing web sites. The Special Operations Force Information Access (SOFIA), a mock design, is presented as an example of information interoperability.

  7. Microcephaly Information Page

    MedlinePlus


  8. Remote Sensing Information Gateway

    EPA Pesticide Factsheets

Remote Sensing Information Gateway, a tool that allows scientists, researchers and decision makers to access a variety of multi-terabyte environmental datasets and to subset the data and obtain only needed variables, greatly improving the download time.

  9. Library and Information Work

    ERIC Educational Resources Information Center

    Bird, J.

    1973-01-01

    Describes the necessity of information processing and dissemination in the present development of scientific knowledge. Indicates that training programs are available at Sheffield and City Universities and Leeds Polytechnic. (CC)

  10. Energy information directory 1998

    SciTech Connect

    1998-11-01

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. The two principal functions related to this task are: (1) operating a general access telephone line, and (2) responding to energy-related correspondence addressed to the Energy Information Administration (EIA). The Energy Information Directory was developed to assist the NEIC staff, as well as other Department of Energy (DOE) staff, in directing inquiries to the proper offices within DOE, other Federal agencies, or energy-related trade associations. The Directory lists most Government offices and trade associations that are involved in energy matters.

  11. Tuberculosis: General Information

    MedlinePlus

    TB Elimination Tuberculosis: General Information What is TB? Tuberculosis (TB) is a disease caused by germs that are spread from person ... Viral Hepatitis, STD, and TB Prevention Division of Tuberculosis Elimination CS227840_A What Does a Positive Test ...

  12. SSE Data and Information

    Atmospheric Science Data Center

    2013-01-31

    Surface meteorology and Solar Energy (SSE) Data and Information   The Release 6.0 Surface meteorology ... Collaboration Benefits International Priorities of Energy Management" features SSE data and the RETScreen renewable energy tool. ( Read ...

  13. NARSTO Data and Information

    Atmospheric Science Data Center

    2016-02-16

    NARSTO Data and Information NARSTO  (formerly North American ... effective strategies for local and regional air-pollution management. Data products from local, regional, and international monitoring and research ...

  14. CADDIS Basic Information

    EPA Pesticide Factsheets

    The Causal Analysis/Diagnosis Decision Information System, or CADDIS, is a website developed to help scientists and engineers in the Regions, States, and Tribes conduct causal assessments in aquatic systems.

  15. EPA Grants Information

    EPA Pesticide Factsheets

    The Integrated Grants Management System (IGMS) is a web-based system that contains information on the recipient of the grant, fellowship, cooperative agreement and interagency agreement, including the name of the entity accepting the award.

  16. Retrieving Patent Information Online

    ERIC Educational Resources Information Center

    Kaback, Stuart M.

    1978-01-01

    This paper discusses patent information retrieval from online files in terms of types of questions, file contents, coverage, timeliness, and other file variations. CLAIMS, Derwent, WPI, APIPAT and Chemical Abstracts Service are described. (KP)

  17. Information Technology for Education.

    ERIC Educational Resources Information Center

    Snyder, Cathrine E.; And Others

    1990-01-01

    Eight papers address technological, behavioral, and philosophical aspects of the application of information technology to training. Topics include instructional technology centers, intelligent training systems, distance learning, automated task analysis, training system selection, the importance of instructional methods, formative evaluation and…

  18. Pituitary Tumors: Condition Information

    MedlinePlus

    ... FOIA Jobs at NICHD Meetings, Conferences & Events Partnering & Donating to the NICHD Staff Directory Overview Condition Information ... gland causes the release of hormones in the body that control growth, metabolism, response to stress, and ...

  19. Congenital Heart Information Network

    MedlinePlus


  20. Regulatory Information By Topic

    EPA Pesticide Factsheets

    EPA develops and enforces regulations that span many environmental topics, from acid rain reduction to wetlands restoration. Each topic listed below may include related laws and regulations, compliance enforcement information, policies guidance

  1. Profiling phylogenetic informativeness.

    PubMed

    Townsend, Jeffrey P

    2007-04-01

    The resolution of four controversial topics in phylogenetic experimental design hinges upon the informativeness of characters about the historical relationships among taxa. These controversies regard the power of different classes of phylogenetic character, the relative utility of increased taxonomic versus character sampling, the differentiation between lack of phylogenetic signal and a historical rapid radiation, and the design of taxonomically broad phylogenetic studies optimized by taxonomically sparse genome-scale data. Quantification of the informativeness of characters for resolution of phylogenetic hypotheses during specified historical epochs is key to the resolution of these controversies. Here, such a measure of phylogenetic informativeness is formulated. The optimal rate of evolution of a character to resolve a dated four-taxon polytomy is derived. By scaling the asymptotic informativeness of a character evolving at a nonoptimal rate by the derived asymptotic optimum, and by normalizing so that net phylogenetic informativeness is equivalent for all rates when integrated across all of history, an informativeness profile across history is derived. Calculation of the informativeness per base pair allows estimation of the cost-effectiveness of character sampling. Calculation of the informativeness per million years allows comparison across historical radiations of the utility of a gene for the inference of rapid adaptive radiation. The theory is applied to profile the phylogenetic informativeness of the genes BRCA1, RAG1, GHR, and c-myc from a muroid rodent sequence data set. Bounded integrations of the phylogenetic profile of these genes over four epochs comprising the diversifications of the muroid rodents, the mammals, the lobe-limbed vertebrates, and the early metazoans demonstrate the differential power of these genes to resolve the branching order among ancestral lineages. This measure of phylogenetic informativeness yields a new kind of information…
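
    The shape described above (informativeness rising to a peak at an intermediate evolutionary rate, then decaying as fast-evolving characters saturate) can be sketched numerically. The functional form below is a hypothetical stand-in with the same qualitative behavior, not the exact expression derived by Townsend (2007):

```python
import numpy as np

# Schematic stand-in for an informativeness profile: too-slow characters
# rarely change on the internode of interest, too-fast ones saturate, so
# utility peaks at an intermediate substitution rate. (Illustrative only;
# see Townsend 2007 for the derived expression.)
def profile(rate, epoch_age):
    return rate**2 * np.exp(-rate * epoch_age)

epoch_age = 50.0                       # age of the polytomy, arbitrary units
rates = np.linspace(0.001, 0.5, 5000)  # candidate substitution rates
best_rate = rates[np.argmax(profile(rates, epoch_age))]
# For this stand-in the optimum is analytic: the derivative vanishes at
# rate = 2 / epoch_age, so best_rate lands near 0.04 here.
```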

  2. Information Operations Primer

    DTIC Science & Technology

    2008-11-01

    Information Environment Beliefs Opinions Perceptions a. This use of information is frequently referred to as "soft-power" or "non-kinetic" as contrasted ...conditions for follow-on operations, provides commanders with an additional set of options to achieve desired end-states. c. Products from the IO planning...perspectives must be avoided. The preparation of IO products and an evaluation of their potential effectiveness must be done from the perspective of

  3. Information matters: Beyond OASIS

    SciTech Connect

    Ramesh, V.C.

    1997-03-01

    Congestion on the Internet and the overwhelming volume of information are two important issues to consider as energy markets move towards a common real-time information infrastructure that permits trading of both electricity and natural gas. The rush to comply with the Phase I OASIS mandate should not cloud the vision needed to design the Energy Real-time Information System (ERIS) of the near future. Federal Energy Regulatory Commission Order 889 has mandated the establishment of Open Access Same-time Information Systems (OASIS) using the Internet infrastructure. Each Transmission Provider (TP) is required to establish an OASIS node, either alone or in conjunction with other TPs. A Responsible Party (RP), such as an Independent System Operator (ISO), can manage a node on behalf of many TPs. It is anticipated that such OASIS nodes would be regional in nature, with about 10 to 15 nodes nationwide. Transmission Customers (TCs) can access an OASIS node using a Web browser and request firm/non-firm transmission reservations. A TP is required to provide on the OASIS frequently updated information on the Available Transmission Capability (ATC) along certain "paths" in its system. This article points out that the twin problems of Internet congestion and information overload can cause problems for TCs that rely on the "standard" access mode enabled by the S&CP document. These problems will likely become more acute as the electricity industry moves towards Phase II implementation and beyond. The convergence of the information needs of the electricity and natural gas industries will likely result in a large-scale common infrastructure. The Energy Real-time Information System (ERIS) of the near future will require a sophisticated infrastructure based on emerging Internet technologies.

  4. Informative Essence Of Chaos

    NASA Astrophysics Data System (ADS)

    Timashev, S. F.; Vstovsky, G. V.; Solovieva, A. B.

    2005-11-01

    Flicker-Noise Spectroscopy (FNS) as a new phenomenological approach for extracting information hidden in chaotic signals is presented. According to FNS, the information is stored in sequences of distinct types of irregularities — spikes, jumps, and discontinuities of derivatives of different orders of chaotic signals. FNS application to analysis of Parkinsonian tremor dynamics studied by using laser systems is demonstrated. FNS could be considered as a new instrument to early diagnostics of patient states.

  5. The Complex Information Process

    NASA Astrophysics Data System (ADS)

    Taborsky, Edwina

    2000-09-01

    This paper examines the semiosic development of energy to information within a dyadic reality that operates within the contradictions of both classical and quantum physics. These two realities are examined within the three Peircean modal categories of Firstness, Secondness and Thirdness. The paper concludes that our world cannot operate within either of the two physical realities but instead filiates the two to permit a semiosis or information-generation of complex systems.

  6. Information Theoretic Causal Coordination

    DTIC Science & Technology

    2013-09-12

    In his 1969 paper, Clive Granger, British economist and Nobel laureate, proposed a statistical definition of causality between stochastic processes. It...showed that the directed information, an information theoretic quantity, quantifies Granger causality. We also explored a more pessimistic setup...Final Technical Report Project Title: Information Theoretic Causal Coordination AFOSR Award Number: AF FA9550-10-1-0345 Reporting Period: July 15

  7. Asymmetric information and economics

    NASA Astrophysics Data System (ADS)

    Frieden, B. Roy; Hawkins, Raymond J.

    2010-01-01

    We present an expression of the economic concept of asymmetric information with which it is possible to derive the dynamical laws of an economy. To illustrate the utility of this approach we show how the assumption of optimal information flow leads to a general class of investment strategies including the well-known Q theory of Tobin. Novel consequences of this formalism include a natural definition of market efficiency and an uncertainty principle relating capital stock and investment flow.

  8. Information applications: Rapporteur summary

    SciTech Connect

    Siegel, S.

    1990-12-31

    An increased level of mathematical sophistication will be needed in the future to be able to handle the spectrum of information as it comes from a broad array of biological systems and other sources. Classification will be an increasingly complex and difficult issue. Several projects that are discussed are being developed by the US Department of Health and Human Services (DHHS), including a directory of risk assessment projects and a directory of exposure information resources.

  9. Management Information System

    NASA Technical Reports Server (NTRS)

    1984-01-01

    New Automated Management Information Center (AMIC) employs innovative microcomputer techniques to create color charts, viewgraphs, or other data displays in a fraction of the time formerly required. Developed under Kennedy Space Center's contract by Boeing Services International Inc., Seattle, WA, AMIC can produce an entirely new informational chart in 30 minutes, or an updated chart in only five minutes. AMIC also has considerable potential as a management system for business firms.

  10. Information Systems Plan.

    DTIC Science & Technology

    1985-04-01

    calculation FREQUENCY THAT DATA SET IS USED: Variable - several times, several purposes CURRENT METHOD OF INFORMATION MANAGEMENT: Harris 500, stored on paper...INFORMATION MANAGEMENT: 0.2 FTE's $9,400 A-85 29 CONDUCT PUBLIC AFFAIRS PROGRAM Conduct public affairs program by advising DE and staff on potential...Compensation Program) VINTAGE REQUIREMENT OF DATA SET: Variable DECISIONS OR PRODUCTS DATA SET SUPPORTS: FREQUENCY THAT DATA SET IS USED: Variable CURRENT

  11. OLEM Performance Assessment Information

    EPA Pesticide Factsheets

    This asset includes a variety of data sets that measure the performance of Office of Land and Emergency Management (OLEM) programs in support of the Office of the Chief Financial Officer's Annual Commitment System (ACS) and Performance Evaluation Reporting System (PERS). Information is drawn from OLEM's ACRES, RCRAInfo, CERCLIS/SEMS, ICIS, and LUST4 systems, as well as input manually by authorized individuals in OLEM's program offices. Information is reviewed by OLEM program staff prior to being pushed to ACS and entered into PERS. This data asset also pulls in certain performance information input directly by Regional Office staff into ACS. Information is managed by the Performance Assessment Tool (PAT) and displayed in the PAT Dashboard.Information in this asset include:--Government Performance and Results Act (GPRA) of 1993: Measures reported for Innovations, Partnerships and Communications Office (IPCO), the Office of Brownfields and Land Revitalization (OBLR), the Office of Emergency Management (OEM), the Office of Resource Conservation and Recovery (ORCR), the Office of Superfund Remediation and Technology Innovation (OSRTI), and the Office of Underground Storage Tanks (OUST).-- Performance and Environmental Results System (PERS): Includes OLEM's information on performance results and baselines for the EPA Annual Plan and Budget.--Key Performance Indicators: OLEM has identified five KPIs that are tracked annually.--Integrated Cleanup Initiative: A pilot pe

  12. Information technology resources assessment

    SciTech Connect

    Loken, S.C.

    1993-01-01

    The emphasis in Information Technology (IT) development has shifted from technology management to information management, and the tools of information management are increasingly at the disposal of end-users, people who deal with information. Moreover, the interactive capabilities of technologies such as hypertext, scientific visualization, virtual reality, video conferencing, and even database management systems have placed in the hands of users a significant amount of discretion over how these resources will be used. The emergence of high-performance networks, as well as network operating systems, improved interoperability, and platform independence of applications will eliminate technical barriers to the use of data, increase the power and range of resources that can be used cooperatively, and open up a wealth of possibilities for new applications. The very scope of these prospects for the immediate future is a problem for the IT planner or administrator. Technology procurement and implementation, integration of new technologies into the existing infrastructure, cost recovery and usage of networks and networked resources, training issues, and security concerns such as data protection and access to experiments are just some of the issues that need to be considered in the emerging IT environment. As managers we must use technology to improve competitiveness. When procuring new systems, we must take advantage of scalable resources. New resources such as distributed file systems can improve access to and efficiency of existing operating systems. In addition, we must assess opportunities to improve information worker productivity and information management through technologies such as distributed computational visualization and teleseminar applications.

  13. Avoiding Cancer Risk Information

    PubMed Central

    Emanuel, Amber S.; Kiviniemi, Marc T.; Howell, Jennifer L.; Hay, Jennifer L.; Waters, Erika A.; Orom, Heather; Shepperd, James A.

    2015-01-01

    RATIONALE Perceived risk for health problems such as cancer is a central construct in many models of health decision making and a target for behavior change interventions. However, some portion of the population actively avoids cancer risk information. The prevalence of, explanations for, and consequences of such avoidance are not well understood. OBJECTIVE We examined the prevalence and demographic and psychosocial correlates of cancer risk information avoidance preference in a nationally representative sample. We also examined whether avoidance of cancer risk information corresponds with avoidance of cancer screening. RESULTS Based on our representative sample, 39% of the population indicated that they agreed or strongly agreed that they would “rather not know [their] chance of getting cancer.” This preference was stronger among older participants, female participants, and participants with lower levels of education. Preferring to avoid cancer risk information was stronger among participants who agreed with the beliefs that everything causes cancer, that there’s not much one can do to prevent cancer, and that there are too many recommendations to follow. Finally, the preference to avoid cancer risk information was associated with lower levels of screening for colon cancer. CONCLUSION These findings suggest that cancer risk information avoidance is a multi-determined phenomenon that is associated with demographic characteristics and psychosocial individual differences and also relates to engagement in cancer screening. PMID:26560410

  14. Photovoltaics information user study

    SciTech Connect

    Belew, W.W.; Wood, B.L.; Marie, T.L.; Reinhardt, C.L.

    1980-10-01

    The results of a series of telephone interviews with groups of users of information on photovoltaics (PV) are described. These results, part of a larger study on many different solar technologies, identify types of information each group needed and the best ways to get information to each group. The report is 1 of 10 discussing study results. The overall study provides baseline data about information needs in the solar community. It covers these technological areas: photovoltaics, passive solar heating and cooling, active solar heating and cooling, biomass energy, solar thermal electric power, solar industrial and agricultural process heat, wind energy, ocean energy, and advanced energy storage. An earlier study identified the information user groups in the solar community and the priority (to accelerate solar energy commercialization) of getting information to each group. In the current study only high-priority groups were examined. Results from seven PV groups respondents are analyzed in this report: DOE-Funded Researchers, Non-DOE-Funded Researchers, Researchers Working for Manufacturers, Representatives of Other Manufacturers, Representatives of Utilities, Electric Power Engineers, and Educators.

  15. Cockpit weather information system

    NASA Technical Reports Server (NTRS)

    Tu, Jeffrey Chen-Yu (Inventor)

    2000-01-01

    Weather information, periodically collected from throughout a global region, is periodically assimilated and compiled at a central source and sent via a high speed data link to a satellite communication service, such as COMSAT. That communication service converts the compiled weather information to GSDB format, and transmits the GSDB encoded information to an orbiting broadcast satellite, INMARSAT, transmitting the information at a data rate of no less than 10.5 kilobits per second. The INMARSAT satellite receives that data over its P-channel and rebroadcasts the GSDB encoded weather information, in the microwave L-band, throughout the global region at a rate of no less than 10.5 KB/S. The transmission is received aboard an aircraft by means of an onboard SATCOM receiver and the output is furnished to a weather information processor. A touch sensitive liquid crystal panel display allows the pilot to select the weather function by touching a predefined icon overlain on the display's surface and in response a color graphic display of the weather is displayed for the pilot.

  16. Information in statistical physics

    NASA Astrophysics Data System (ADS)

    Balian, Roger

    We review with a tutorial scope the information theory foundations of quantum statistical physics. Only a small proportion of the variables that characterize a system at the microscopic scale can be controlled, for both practical and theoretical reasons, and a probabilistic description involving the observers is required. The criterion of maximum von Neumann entropy is then used for making reasonable inferences. It means that no spurious information is introduced besides the known data. Its outcomes can be given a direct justification based on the principle of indifference of Laplace. We introduce the concept of relevant entropy associated with some set of relevant variables; it characterizes the information that is missing at the microscopic level when only these variables are known. For equilibrium problems, the relevant variables are the conserved ones, and the Second Law is recovered as a second step of the inference process. For non-equilibrium problems, the increase of the relevant entropy expresses an irretrievable loss of information from the relevant variables towards the irrelevant ones. Two examples illustrate the flexibility of the choice of relevant variables and the multiplicity of the associated entropies: the thermodynamic entropy (satisfying the Clausius-Duhem inequality) and the Boltzmann entropy (satisfying the H -theorem). The identification of entropy with missing information is also supported by the paradox of Maxwell's demon. Spin-echo experiments show that irreversibility itself is not an absolute concept: use of hidden information may overcome the arrow of time.

  17. Information Entropy of Fullerenes.

    PubMed

    Sabirov, Denis Sh; Ōsawa, Eiji

    2015-08-24

    The reasons for the formation of the highly symmetric C60 molecule under nonequilibrium conditions are widely discussed as it dominates over numerous similar fullerene structures. In such conditions, evolution of structure rather than energy defines the processes. We have first studied the diversity of fullerenes in terms of information entropy. Sorting 2079 structures from An Atlas of Fullerenes [Fowler, P. W.; Manolopoulos, D. E. An Atlas of Fullerenes; Oxford: Clarendon, 1995], we have found that the information entropies of only 14 fullerenes (<1% of the studied structures) lie between the values of C60 and C70, the two most abundant fullerenes. Interestingly, buckminsterfullerene is the only fullerene with zero information entropy, i.e., an exclusive compound among the other members of the fullerene family. Such an efficient sorting demonstrates possible relevance of information entropy to chemical processes. For this reason, we have introduced an algorithm for calculating changes in information entropy at chemical transformations. The preliminary calculations of changes in information entropy at the selected fullerene reactions show good agreement with thermochemical data.
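
    The entropy measure used in this record can be reproduced in a few lines: partition the atoms into classes of symmetry-equivalent positions and apply the Shannon formula to the class sizes. A minimal sketch follows; the C70 class sizes below are taken from its five 13C NMR lines and are an assumption of this example, not data quoted from the paper:

```python
from math import log2

def info_entropy(orbit_sizes):
    """Information entropy (bits) of a molecule from the sizes of its
    classes of symmetry-equivalent atoms: h = -sum((n_i/N) * log2(n_i/N))."""
    n = sum(orbit_sizes)
    h = -sum((k / n) * log2(k / n) for k in orbit_sizes)
    return h + 0.0  # normalize IEEE -0.0 to 0.0 for the single-class case

# C60 (Ih symmetry): all 60 atoms equivalent, one class -> h = 0, matching
# the paper's observation that buckminsterfullerene is unique in this regard.
print(info_entropy([60]))                  # 0.0
# C70 (D5h): five atom classes (its five 13C NMR lines), roughly 2.24 bits.
print(info_entropy([10, 10, 20, 20, 10]))
```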

  18. Universal Payload Information Management

    NASA Technical Reports Server (NTRS)

    Elmore, Ralph B.

    2003-01-01

    As the overall manager and integrator of International Space Station (ISS) science payloads, the Payload Operations Integration Center (POIC) at Marshall Space Flight Center has a critical need to provide an information management system for exchange and control of ISS payload files as well as to coordinate ISS payload related operational changes. The POIC's information management system has a fundamental requirement to provide secure operational access not only to users physically located at the POIC, but also to remote experimenters and International Partners physically located in different parts of the world. The Payload Information Management System (PIMS) is a ground-based electronic document configuration management and collaborative workflow system that was built to service the POIC's information management needs. This paper discusses the application components that comprise the PIMS system, the challenges that influenced its design and architecture, and the selected technologies it employs. This paper will also touch on the advantages of the architecture, details of the user interface, and lessons learned along the way to a successful deployment. With PIMS, a sophisticated software solution has been built that is not only universally accessible for POIC customers' information management needs, but also universally adaptable in implementation and application as a generalized information management system.

  19. Information Literacy in the Workplace.

    ERIC Educational Resources Information Center

    Oman, Julie N.

    2001-01-01

    Discusses the need for information literacy in the workplace in the face of information overload and problems related to end user information skills. Explains how to improve information literacy by assessing the organization's infrastructure, including available information technologies and information processes; considering demographics; and…

  20. Information Management and Educational Pluralism.

    ERIC Educational Resources Information Center

    Broadbent, Marianne

    1984-01-01

    Maps area of information studies as basis for discussion and course development with other university departments. Diversity versus disparity, meaning of information, information professions, information as commodity and process, information management programs and course models, information science and technology, and relationships of…

  1. Entrepreneurship, Information, and Growth

    PubMed Central

    Bunten, Devin; Weiler, Stephan; Zahran, Sammy

    2016-01-01

    We examine the contribution to economic growth of entrepreneurial “marketplace information” within a regional endogenous growth framework. Entrepreneurs are posited to provide an input to economic growth through the information revealed by their successes and failures. We empirically identify this information source with the regional variation in establishment births and deaths, which create geographic information asymmetries that influence subsequent entrepreneurial activity and economic growth. We find that local establishment birth and death rates are significantly and positively correlated with subsequent entrepreneurship for US counties. To account for the potential endogeneity caused by forward-looking entrepreneurs, we utilize instruments based on historic mining activity. We find that the information spillover component of local establishment birth and death rates have significant positive effects on subsequent entrepreneurship and employment growth for US counties and metropolitan areas. With the help of these instruments, we show that establishment births have a positive and significant effect on future employment growth within all counties, and that in line with the information hypothesis, local establishment death rates have a similar positive effect within metropolitan counties. PMID:27516625
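
    The identification strategy in this record (instrumenting an endogenous regressor to recover a causal effect) can be illustrated with a self-contained two-stage least squares sketch. The variable names echo the study, but all numbers are synthetic and invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
mining = rng.normal(size=n)       # instrument: historic mining activity
u = rng.normal(size=n)            # unobserved confounder
births = 0.8 * mining + 0.5 * u + rng.normal(size=n)  # endogenous regressor
growth = 1.5 * births + 0.5 * u + rng.normal(size=n)  # true causal effect: 1.5

def ols_slope(y, x):
    """Slope from an ordinary least squares fit with an intercept."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def tsls_slope(y, x, z):
    """Two-stage least squares: project x on the instrument z, then
    regress y on the fitted values."""
    Z = np.column_stack([np.ones_like(z), z])
    x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    return ols_slope(y, x_hat)

# OLS is biased upward by the shared confounder u; 2SLS recovers ~1.5.
print(ols_slope(growth, births), tsls_slope(growth, births, mining))
```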

  2. Energy information directory 1997

    SciTech Connect

    1997-09-01

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, state, and local governments, the academic community, business and industrial organizations, and the general public. The two principal functions related to this task are: (1) operating a general access telephone line, and (2) responding to energy-related correspondence addressed to the Energy Information Administration (EIA). The Energy Information Directory was developed to assist the NEIC staff, as well as other Department of Energy (DOE) staff, in directing inquiries to the proper offices within DOE, other Federal agencies, or energy-related trade associations. The Directory lists some of the Government offices and trade associations that are involved in energy matters. It includes those DOE offices which deal with the public or public information. For the purposes of this publication, each entry has been given a numeric identification symbol. The index found in the back of this publication uses these identification numbers to refer the reader to relevant entries.

  3. Information in Our World: Conceptions of Information and Problems of Method in Information Science

    ERIC Educational Resources Information Center

    Ma, Lai

    2012-01-01

    Many concepts of information have been proposed and discussed in library and information science. These concepts of information can be broadly categorized as empirical and situational information. Unlike nomenclatures in many sciences, however, the concept of information in library and information science does not bear a generally accepted…

  4. Classical Information Theory

    NASA Astrophysics Data System (ADS)

    Suhov, Y.

    We begin with the definition of information gained by knowing that an event A has occurred: ι(A) = -log₂ P(A). (A dual point of view is also useful (although more evasive), where ι(A) is the amount of information needed to specify event A.) Here and below P stands for the underlying probability distribution. So the rarer an event A, the more information we gain if we know it has occurred. (More broadly, the rarer an event A, the more impact it will have. For example, the unlikely event that occurred in 1938 when fishermen caught a coelacanth - a prehistoric fish believed to be extinct - required a significant change to beliefs about evolution and biology. On the other hand, the likely event of catching a herring or a tuna would hardly imply any change in theories.)
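
    The defining formula is easy to exercise directly; the values below follow from it exactly:

```python
from math import log2

def iota(p):
    """Information (in bits) gained by learning that an event of
    probability p has occurred: iota(A) = -log2 P(A)."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return 0.0 - log2(p)

print(iota(0.5))    # 1.0 bit: a fair coin landed heads
print(iota(1 / 8))  # 3.0 bits: one of eight equally likely outcomes
print(iota(1.0))    # 0.0 bits: a certain event carries no information
```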

  5. Materials management information systems.

    PubMed

    1996-01-01

    The hospital materials management function--ensuring that goods and services get from a source to an end user--encompasses many areas of the hospital and can significantly affect hospital costs. Performing this function in a manner that will keep costs down and ensure adequate cash flow requires effective management of a large amount of information from a variety of sources. To effectively coordinate such information, most hospitals have implemented some form of materials management information system (MMIS). These systems can be used to automate or facilitate functions such as purchasing, accounting, inventory management, and patient supply charges. In this study, we evaluated seven MMISs from seven vendors, focusing on the functional capabilities of each system and the quality of the service and support provided by the vendor. This Evaluation is intended to (1) assist hospitals purchasing an MMIS by educating materials managers about the capabilities, benefits, and limitations of MMISs and (2) educate clinical engineers and information system managers about the scope of materials management within a healthcare facility. Because software products cannot be evaluated in the same manner as most devices typically included in Health Devices Evaluations, our standard Evaluation protocol was not applicable for this technology. Instead, we based our ratings on our observations (e.g., during site visits), interviews we conducted with current users of each system, and information provided by the vendor (e.g., in response to a request for information [RFI]). We divided the Evaluation into the following sections: Section 1. Responsibilities and Information Requirements of Materials Management: Provides an overview of typical materials management functions and describes the capabilities, benefits, and limitations of MMISs. Also includes the supplementary article, "Inventory Cost and Reimbursement Issues" and the glossary, "Materials Management Terminology." Section 2. The

  6. Information sciences experiment system

    NASA Technical Reports Server (NTRS)

    Katzberg, Stephen J.; Murray, Nicholas D.; Benz, Harry F.; Bowker, David E.; Hendricks, Herbert D.

    1990-01-01

    The rapid expansion of remote sensing capability over the last two decades will take another major leap forward with the advent of the Earth Observing System (Eos). An approach is presented that will permit experiments and demonstrations in onboard information extraction. The approach is a non-intrusive, eavesdropping mode in which a small amount of spacecraft real estate is allocated to an onboard computation resource. How such an approach allows the evaluation of advanced technology in the space environment, advanced techniques in information extraction for both Earth science and information science studies, direct to user data products, and real-time response to events, all without affecting other on-board instrumentation is discussed.

  7. Information and strategic voting.

    PubMed

    Tyszler, Marcelo; Schram, Arthur

    2016-01-01

    We theoretically and experimentally study voter behavior in a setting characterized by plurality rule and mandatory voting. Voters choose from three options. We are interested in the occurrence of strategic voting in an environment where Condorcet cycles may occur and focus on how information about the preference distribution affects strategic behavior. We also vary the relative importance of the second preferred option. Quantal response equilibrium analysis is used to analyze the game and derive predictions. Our results indeed show that strategic voting arises. Its extent depends on (i) information availability; (ii) the relative importance of the intermediate candidate; (iii) the electorate's relative support for one's preferred candidate; (iv) the relative position of the plurality-supported candidate in one's preference ordering. Our results show that information serves as a coordination device where strategic voting does not harm the plurality-preferred candidate's chances of winning.

  8. Information Interaction: Providing a Framework for Information Architecture.

    ERIC Educational Resources Information Center

    Toms, Elaine G.

    2002-01-01

    Discussion of information architecture focuses on a model of information interaction that bridges the gap between human and computer and between information behavior and information retrieval. Illustrates how the process of information interaction is affected by the user, the system, and the content. (Contains 93 references.) (LRW)

  9. 75 FR 28777 - Information Collection; Financial Information Security Request Form

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-24

    ...; ] DEPARTMENT OF AGRICULTURE Forest Service Information Collection; Financial Information Security Request Form... Information Security Request Form. DATES: Comments must be received in writing on or before July 23, 2010 to... INFORMATION: Title: Financial Information Security Request Form. OMB Number: 0596-0204. Expiration Date...

  10. Information Requirements for a Procurement Management Information System.

    DTIC Science & Technology

    1975-08-01

    Management Information System is...described and some justification for this type of procurement management information system is presented. A literature search was made to determine...information systems. If information requirements are correctly identified and satisfied by a procurement management information system , contract administration and procurement management can be

  11. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R and D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.
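
    One simple instance of the kind of combination described above is a conjugate Bayesian update, where expert judgment is encoded as a Beta prior on reliability and sparse test data update it. This is an illustrative sketch of that general idea, not the PREDICT methodology itself, and all names and numbers below are assumptions:

```python
def update_reliability(prior_a, prior_b, successes, trials):
    """Combine a Beta(prior_a, prior_b) prior on reliability
    (e.g. elicited from expert judgment) with binomial test data.
    Returns the posterior Beta parameters and the posterior mean."""
    post_a = prior_a + successes
    post_b = prior_b + (trials - successes)
    return post_a, post_b, post_a / (post_a + post_b)

# Expert judgment encoded as Beta(9, 1) (prior mean reliability 0.9),
# updated with 4 successes observed in 5 tests.
a, b, mean = update_reliability(9, 1, 4, 5)
```

    The appeal of this structure for systems under change is that the posterior after one design phase can serve directly as the prior for the next.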

  12. Social Dynamics of Information

    DTIC Science & Technology

    2013-07-01

    social media and relate them to network structure. The findings challenged a widely-held view that information spreads like a pathogen and showed that the differences between the spread of disease and information stem from human cognitive limitations. While highly connected people amplify pathogenic contagion, in social contagion they are cognitively overloaded with messages their friends produce and are less likely to see and spread a particular message. Accounting for cognitive constraints significantly simplifies social contagion, and leads to new ways to measure
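
    The cognitive-constraint effect described above can be illustrated with a toy model, an assumption-laden sketch rather than the study's actual analysis: give each user a fixed attention budget, so the chance of noticing any one message falls as the incoming message load grows with connectivity.

```python
def notice_probability(messages_received, attention_budget=20):
    """Chance a user sees any one incoming message, under a fixed
    attention budget spread across everything their friends post.
    (The budget value and linear form are illustrative assumptions.)"""
    return min(1.0, attention_budget / max(1, messages_received))

# A weakly connected user sees nearly everything; a highly connected
# "hub" is overloaded and sees only a small fraction -- so hubs need
# not amplify social contagion the way they amplify a pathogen.
low_degree = notice_probability(10)    # sees everything
hub = notice_probability(1000)         # sees 1 message in 50
```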

  13. Information science team

    NASA Technical Reports Server (NTRS)

    Billingsley, F.

    1982-01-01

    Concerns are expressed about the data handling aspects of system design and about enabling technology for data handling and data analysis. The status, contributing factors, critical issues, and recommendations for investigations are listed for data handling, rectification and registration, and information extraction. Potential support for individual P.I. research tasks, systematic data system design, and system operation is noted. The need for an airborne spectrometer class instrument for fundamental research in high spectral and spatial resolution is indicated. Geographic information system formatting and labelling techniques, very large scale integration, and methods for providing multitype data sets must also be developed.

  14. Informed consent and research

    PubMed Central

    Mandal, Jharna; Parija, Subhash Chandra

    2014-01-01

    Informed consent is the central doctrine of any research based on the principles of autonomy and self-determination. For it to be genuine and effective, it should be in a simple regional language catering to the cultural, psychological, and social requisites of the participant. The information in the consent form must be true, should cover all the relevant aspects, and no fact should be hidden, however important or unimportant it may seem. Every research volunteer puts his or her health and life at risk for the sake of science, and this must be respected at all times during the research. PMID:25250226

  15. Topological forms of information

    SciTech Connect

    Baudot, Pierre; Bennequin, Daniel

    2015-01-13

    We propose that entropy is a universal co-homological class in a theory associated to a family of observable quantities and a family of probability distributions. Three cases are presented: 1) classical probabilities and random variables; 2) quantum probabilities and observable operators; 3) dynamic probabilities and observation trees. This gives rise to a new kind of topology for information processes. We discuss briefly its application to complex data, in particular to the structures of information flows in biological systems. This short note summarizes results obtained during the last years by the authors. The proofs are not included, but the definitions and theorems are stated with precision.
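
    A concrete entry point to the claim that entropy is a cohomological class is the chain rule of Shannon entropy, which can be read as a 1-cocycle equation. The notation below is a standard sketch of this observation, not the authors' full construction:

```latex
% Chain rule of entropy, written as a cocycle condition for the
% joint observable X \wedge Y:
H(X \wedge Y) = H(X) + X.H(Y),
\qquad
X.H(Y) \;=\; \sum_{x} p(x)\, H(Y \mid X = x).
% This is H(X,Y) = H(X) + H(Y|X) in disguise: entropy behaves as a
% 1-cocycle for the action of observables on functions of
% probability laws, which is what makes a cohomological reading possible.
```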

  16. Geographic information systems

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.

    1982-01-01

    Information and activities are provided to: (1) enhance the ability to distinguish between a Geographic Information System (GIS) and a data management system; (2) develop understanding of spatial data handling by conventional methods versus the automated approach; (3) promote awareness of GIS design and capabilities; (4) foster understanding of the concepts and problems of data base development and management; (5) facilitate recognition of how a computerized GIS can model conditions in the present "real world" to project conditions in the future; and (6) appreciate the utility of integrating LANDSAT and other remotely sensed data into the GIS.

  17. Neural Analog Information Processing

    NASA Astrophysics Data System (ADS)

    Hecht-Nielsen, Robert

    1982-07-01

    Neural Analog Information Processing (NAIP) is an effort to develop general purpose pattern classification architectures based upon biological information processing principles. This paper gives an overview of NAIP and its relationship to the previous work in neural modeling from which its fundamental principles are derived. It also presents a theorem concerning the stability of response of a slab (a two dimensional array of identical simple processing units) to time-invariant (spatial) patterns. An experiment (via computer emulation) demonstrating classification of a spatial pattern by a simple, but complete NAIP architecture is described. A concept for hardware implementation of NAIP architectures is briefly discussed.

  18. Advanced information society(7)

    NASA Astrophysics Data System (ADS)

    Chiba, Toshihiro

    Various threats are hiding in the advanced information society. Just as motorization brought car accident problems, bright aspects are necessarily accompanied by shady ones. Under the changing circumstances of advanced informationalization, the added value of information has become much higher. This causes computer crime, hackers, and computer viruses to come to the surface. In addition, infringement of intellectual property and of privacy are threats brought by advanced informationalization. Against these threats, legal, institutional, and insurance measures have progressed, and a new security industry has been established. However, they are not adequate individually or as a whole. The future vision should be clarified, and countermeasures according to that vision have to be considered.

  19. [Acupuncture: an information therapy?].

    PubMed

    Nissel, H

    1998-01-01

    Even though modern medicine continues to be governed by the morphological point of view, cybernetics and systems theory are beginning to gain in importance. The concept of "Infomedicine" serves as the basis for a discussion of regulation and the information mechanisms necessary for this to occur. Some of the new insights being made in physics, such as the theory of relativity, quantum physics, and chaos theory provide many valuable explanations. Acupuncture represents a regulation and information therapy, and many parallels can be drawn between traditional Chinese medicine and the discoveries being made in today's physics.

  20. Information and communication technology

    NASA Technical Reports Server (NTRS)

    Edelson, Burton I.; Pelton, Joseph N.; Bostian, Charles W.; Brandon, William T.; Chan, Vincent W. S.; Hager, E. Paul; Helm, Neil R.; Jennings, Raymond D.; Kwan, Robert K.; Mahle, Christoph E.

    1994-01-01

    NASA and the National Science Foundation (NSF) commissioned a panel of U.S. experts to study the international status of satellite communications systems and technology. The study covers emerging systems concepts, applications, services, and the attendant technologies. The panel members traveled to Europe, Japan, and Russia to gather information firsthand. They visited 17 sites in Europe, 20 in Japan, and 4 in Russia. These included major manufacturers, government organizations, service providers, and associated research and development facilities. The panel's report was reviewed by the sites visited, by the panel, and by representatives of U.S. industry. The report details the information collected and compares it to U.S. activities.