Science.gov

Sample records for aic akaike information

  1. Model Selection and Akaike's Information Criterion (AIC): The General Theory and Its Analytical Extensions.

    ERIC Educational Resources Information Center

    Bozdogan, Hamparsum

    1987-01-01

    This paper studies the general theory of Akaike's Information Criterion (AIC) and provides two analytical extensions. The extensions make AIC asymptotically consistent and penalize overparameterization more stringently, so that the simpler of two competing models is selected. The criteria are applied in two Monte Carlo experiments. (Author/GDC)

  2. Improving data analysis in herpetology: Using Akaike's information criterion (AIC) to assess the strength of biological hypotheses

    USGS Publications Warehouse

    Mazerolle, M.J.

    2006-01-01

    In ecology, researchers frequently use observational studies to explain a given pattern, such as the number of individuals in a habitat patch, with a large number of explanatory (i.e., independent) variables. To elucidate such relationships, ecologists have long relied on hypothesis testing to include or exclude variables in regression models, although the conclusions often depend on the approach used (e.g., forward, backward, stepwise selection). Although better tools have been available since the mid-1970s, they remain underutilized in certain fields, particularly in herpetology. This is the case for the Akaike information criterion (AIC), which is markedly superior to hypothesis-based approaches for model selection (i.e., variable selection). It is simple to compute and easy to understand, but more importantly, for a given data set, it provides a measure of the strength of evidence for each model that represents a plausible biological hypothesis relative to the entire set of models considered. Using this approach, one can then compute a weighted average of the estimate and standard error for any given variable of interest across all the models considered. This procedure, termed model-averaging or multimodel inference, yields precise and robust estimates. In this paper, I illustrate the use of the AIC in model selection and inference, as well as the interpretation of results analysed in this framework, with two real herpetological data sets. The AIC and measures derived from it should be routinely adopted by herpetologists. © Koninklijke Brill NV 2006.
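The model-averaging procedure described in this abstract can be sketched numerically. In the hypothetical Python example below, the log-likelihoods, parameter counts, and per-model coefficient estimates are invented for illustration (they are not from the paper); AIC values are converted to Akaike weights and then used to model-average an estimate:

```python
import math

# Hypothetical candidate models for, e.g., anuran abundance in habitat
# patches; log-likelihoods and parameter counts are invented.
models = {
    "habitat":           {"loglik": -142.3, "k": 3},
    "habitat+area":      {"loglik": -139.9, "k": 4},
    "habitat+area+edge": {"loglik": -139.6, "k": 5},
}

# AIC = -2 log L + 2k
for m in models.values():
    m["aic"] = -2.0 * m["loglik"] + 2.0 * m["k"]

# Akaike weights: w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2),
# where delta_i = AIC_i - AIC_min.
best = min(m["aic"] for m in models.values())
raw = {name: math.exp(-(m["aic"] - best) / 2.0) for name, m in models.items()}
total = sum(raw.values())
weights = {name: r / total for name, r in raw.items()}

# Model-averaged estimate of a coefficient shared by all models
# (hypothetical per-model estimates of, say, the habitat effect).
est = {"habitat": 0.80, "habitat+area": 0.72, "habitat+area+edge": 0.70}
avg = sum(weights[name] * est[name] for name in models)
```

The weights sum to one and quantify the relative support for each model; the averaged estimate is pulled toward the best-supported model.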

  3. Factor Analysis and AIC.

    ERIC Educational Resources Information Center

    Akaike, Hirotugu

    1987-01-01

    The Akaike Information Criterion (AIC) was introduced to extend the method of maximum likelihood to the multimodel situation. Use of the AIC in factor analysis is interesting when it is viewed as the choice of a Bayesian model; thus, wider applications of AIC are possible. (Author/GDC)

  4. Assessing Fit and Dimensionality in Least Squares Metric Multidimensional Scaling Using Akaike's Information Criterion

    ERIC Educational Resources Information Center

    Ding, Cody S.; Davison, Mark L.

    2010-01-01

    Akaike's information criterion is suggested as a tool for evaluating fit and dimensionality in metric multidimensional scaling that uses least squares methods of estimation. This criterion combines the least squares loss function with the number of estimated parameters. Numerical examples are presented. The results from analyses of both simulation…
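A minimal sketch of the idea of combining a least-squares loss with a parameter-count penalty. The paper's exact criterion is not reproduced here; this uses the common least-squares form AIC = n·log(SSE/n) + 2k, with invented stress values and parameter counts:

```python
import math

def aic_least_squares(sse, n, k):
    """AIC for a least-squares fit: n*log(SSE/n) + 2k
    (standard form, dropping additive constants)."""
    return n * math.log(sse / n) + 2 * k

# Hypothetical residual sums of squares from MDS solutions in 1..4
# dimensions for n = 45 dissimilarities among 10 stimuli; the number
# of estimated coordinates k grows with dimensionality.
n = 45
fits = {1: (12.0, 10), 2: (4.0, 20), 3: (3.2, 30), 4: (3.0, 40)}
aics = {d: aic_least_squares(sse, n, k) for d, (sse, k) in fits.items()}
best_dim = min(aics, key=aics.get)
```

Beyond two dimensions, the small improvement in fit no longer offsets the penalty for extra coordinates, so the criterion selects the two-dimensional solution.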

  5. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion

    NASA Astrophysics Data System (ADS)

    Zanini, Andrea; Woodbury, Allan D.

    2016-02-01

    The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a groundwater source starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The results were discussed and compared to the geostatistical approach of Kitanidis (1995).

  6. Contaminant release history reconstruction through empirical Bayes and Akaike's Bayesian Information Criterion

    NASA Astrophysics Data System (ADS)

    Zanini, A.; Woodbury, A. D.

    2015-12-01

    Contaminant release history identification has received considerable attention in the literature over the past several decades. Our review of this subject suggests that improvements are needed in terms of a reliable procedure, one that is easy to implement, has only a few hyperparameters to estimate, and is able to evaluate confidence intervals. The purpose of this work is to propose an empirical Bayesian approach combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history starting from concentration observations in time or space. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. We successfully test the method on three cases: the classic Skaggs and Kabala (1994) source, a "midnight dump" example that consists of three delta-like sources, and lastly a laboratory dataset consisting of two measurement points in space with synoptic observations. This experiment reproduces the response of a 2-D unconfined aquifer. The performance of the inverse method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. Results show an excellent recovery of all sources used in the examples. Lastly, the results were discussed and compared to the geostatistical approach of Kitanidis (1995).

  7. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.

    PubMed

    Zanini, Andrea; Woodbury, Allan D

    2016-01-01

    The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a groundwater source starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The results were discussed and compared to the geostatistical approach of Kitanidis (1995).

  8. Model Selection Information Criteria for Non-Nested Latent Class Models.

    ERIC Educational Resources Information Center

    Lin, Ting Hsiang; Dayton, C. Mitchell

    1997-01-01

    The use of three model selection information criteria for nonnested latent class models was studied: (1) Akaike's information criterion (AIC; H. Akaike, 1973); (2) the Schwarz information criterion (SIC; G. Schwarz, 1978); and (3) the Bozdogan version of the AIC (CAIC; H. Bozdogan, 1987). Situations in which each is preferable…

  9. Multidimensional Rasch Model Information-Based Fit Index Accuracy

    ERIC Educational Resources Information Center

    Harrell-Williams, Leigh M.; Wolfe, Edward W.

    2013-01-01

    Most research on confirmatory factor analysis using information-based fit indices (Akaike information criterion [AIC], Bayesian information criteria [BIC], bias-corrected AIC [AICc], and consistent AIC [CAIC]) has used a structural equation modeling framework. Minimal research has been done concerning application of these indices to item response…

  10. Investigating the performance of AIC in selecting phylogenetic models.

    PubMed

    Jhwueng, Dwueng-Chwuan; Huzurbazar, Snehalata; O'Meara, Brian C; Liu, Liang

    2014-08-01

    The popular likelihood-based model selection criterion, Akaike's Information Criterion (AIC), is a breakthrough mathematical result derived from information theory. AIC is an approximation to Kullback-Leibler (KL) divergence with the derivation relying on the assumption that the likelihood function has finite second derivatives. However, for phylogenetic estimation, given that tree space is discrete with respect to tree topology, the assumption of a continuous likelihood function with finite second derivatives is violated. In this paper, we investigate the relationship between the expected log likelihood of a candidate model, and the expected KL divergence in the context of phylogenetic tree estimation. We find that given the tree topology, AIC is an unbiased estimator of the expected KL divergence. However, when the tree topology is unknown, AIC tends to underestimate the expected KL divergence for phylogenetic models. Simulation results suggest that the degree of underestimation varies across phylogenetic models so that even for large sample sizes, the bias of AIC can result in selecting a wrong model. As the choice of phylogenetic models is essential for statistical phylogenetic inference, it is important to improve the accuracy of model selection criteria in the context of phylogenetics. PMID:24867284

  11. An Evaluation of Information Criteria Use for Correct Cross-Classified Random Effects Model Selection

    ERIC Educational Resources Information Center

    Beretvas, S. Natasha; Murphy, Daniel L.

    2013-01-01

    The authors assessed correct model identification rates of Akaike's information criterion (AIC), the corrected criterion (AICC), the consistent AIC (CAIC), Hannan and Quinn's information criterion (HQIC), and the Bayesian information criterion (BIC) for selecting among cross-classified random effects models. Performance of default values for the 5…
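The five criteria compared in this study have standard textbook definitions, sketched below; the paper's exact variants may differ in detail, and the log-likelihood, parameter count, and sample size here are hypothetical:

```python
import math

def criteria(loglik, k, n):
    """Common information criteria in their usual textbook forms,
    for a model with log-likelihood `loglik`, k parameters, n observations."""
    aic = -2 * loglik + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # small-sample correction
    bic = -2 * loglik + k * math.log(n)
    caic = -2 * loglik + k * (math.log(n) + 1)    # consistent AIC (Bozdogan)
    hqic = -2 * loglik + 2 * k * math.log(math.log(n))
    return {"AIC": aic, "AICC": aicc, "BIC": bic, "CAIC": caic, "HQIC": hqic}

# Hypothetical fitted model: logL = -250, 6 parameters, 120 observations.
vals = criteria(loglik=-250.0, k=6, n=120)
```

Note the ordering of the penalties: for realistic n, CAIC penalizes parameters hardest, followed by BIC, while AIC is the most permissive; AICC converges to AIC as n grows.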

  12. Water-solvent partition coefficients and Delta Log P values as predictors for blood-brain distribution; application of the Akaike information criterion.

    PubMed

    Abraham, Michael H; Acree, William E; Leo, Albert J; Hoekman, David; Cavanaugh, Joseph E

    2010-05-01

    It is shown that log P values for water-alkane or water-cyclohexane partitions, and the corresponding Delta log P values when used as descriptors for blood-brain distribution, as log BB, yield equations with very poor correlation coefficients but very good standard deviations, S from 0.25 to 0.33 log units. Using quite large data sets, we have verified that similar S-values apply to predictions of log BB. A suggested model, based on log P for water-dodecane and water-hexadecane partition coefficients, has 109 data points and a fitted S = 0.254 log units. It is essential to include in the model an indicator variable for volatile compounds, and an indicator variable for drugs that contain the carboxylic group. A similar equation based on water-chloroform partition coefficients has 83 data points and a fitted S = 0.287 log units. We can find no causal connection between these log P values and log BB in terms of correlation or in terms of chemical similarity, but conclude that the log P descriptor will yield excellent predictions of log BB provided that predictions are within the chemical space of the compounds used to set up the model. We also show that model based on log P(octanol) and an Abraham descriptor provides a simple and easy method of predicting log BB with an error of no more than 0.31 log units. We have used the Akaike information criterion to investigate the most economic models for log BB.

  13. Derivation of 3-D surface deformation from an integration of InSAR and GNSS measurements based on Akaike's Bayesian Information Criterion

    NASA Astrophysics Data System (ADS)

    Luo, Haipeng; Liu, Yang; Chen, Ting; Xu, Caijun; Wen, Yangmao

    2016-01-01

    We present a new method to derive 3-D surface deformation from an integration of interferometric synthetic aperture radar (InSAR) images and Global Navigation Satellite System (GNSS) observations based on Akaike's Bayesian Information Criterion (ABIC), considering the relationship between deformations of neighbouring locations. This method avoids interpolation errors by not interpolating GNSS to the same spatial resolution as the InSAR images, and it harnesses the data sets and the prior smoothness constraints on surface deformation objectively and simultaneously by using ABIC, issues that were inherently unresolved in previous studies. In particular, we define a surface roughness measure of smoothing degree to evaluate the performance of the prior constraints, and we deduce the formula of the covariance of the estimation errors to estimate the uncertainty of the modelled solution. We validate this method using synthetic tests and the 2008 Mw 7.9 Wenchuan earthquake. We find that the optimal weights associated with the ABIC minimum are generally at trade-off locations that balance contributions from the InSAR and GNSS data sets and the prior constraints. We use this method to evaluate the influence of interpolation errors from the Ordinary Kriging algorithm on the derivation of surface deformation. Tests show that the interpolation errors may bias the weights imposed on Kriged GNSS data toward very large values, suggesting that fixing the relative weights is required in this case. We also make a comparison with the SISTEM method, indicating that our method obtains better estimates even with sparse GNSS observations. In addition, this method can be generalized to situations where some types of data sets are lacking, and can be exploited further to account for data sets such as the integration of displacements along radar lines of sight and offsets along satellite tracks.

  14. Truth, models, model sets, AIC, and multimodel inference: a Bayesian perspective

    USGS Publications Warehouse

    Barker, Richard J.; Link, William A.

    2015-01-01

    Statistical inference begins with viewing data as realizations of stochastic processes. Mathematical models provide partial descriptions of these processes; inference is the process of using the data to obtain a more complete description of the stochastic processes. Wildlife and ecological scientists have become increasingly concerned with the conditional nature of model-based inference: what if the model is wrong? Over the last 2 decades, Akaike's Information Criterion (AIC) has been widely and increasingly used in wildlife statistics for 2 related purposes, first for model choice and second to quantify model uncertainty. We argue that for the second of these purposes, the Bayesian paradigm provides the natural framework for describing uncertainty associated with model choice and provides the most easily communicated basis for model weighting. Moreover, Bayesian arguments provide the sole justification for interpreting model weights (including AIC weights) as coherent (mathematically self consistent) model probabilities. This interpretation requires treating the model as an exact description of the data-generating mechanism. We discuss the implications of this assumption, and conclude that more emphasis is needed on model checking to provide confidence in the quality of inference.
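The contrast this paper draws between AIC weights and coherent model probabilities can be illustrated with BIC, whose weights approximate posterior model probabilities under equal prior odds. The log-likelihoods, parameter counts, and sample size below are hypothetical; note how BIC's heavier penalty can shift the preferred model:

```python
import math

# Hypothetical candidate models: (log-likelihood, number of parameters),
# fitted to n = 200 observations. Values are invented for illustration.
n = 200
cand = {"M1": (-310.0, 2), "M2": (-306.5, 4), "M3": (-306.0, 6)}

def weights(scores):
    """Turn criterion scores into normalized weights via exp(-delta/2)."""
    best = min(scores.values())
    raw = {m: math.exp(-(s - best) / 2) for m, s in scores.items()}
    z = sum(raw.values())
    return {m: r / z for m, r in raw.items()}

# Akaike weights vs. BIC-based approximate posterior model probabilities.
aic_w = weights({m: -2 * ll + 2 * k for m, (ll, k) in cand.items()})
bic_w = weights({m: -2 * ll + k * math.log(n) for m, (ll, k) in cand.items()})
```

With these numbers the AIC weights favour the middle model M2, while the BIC weights favour the simplest model M1, illustrating how the interpretation of weights as model probabilities depends on the criterion and its implicit assumptions.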

  15. Autonomic Intelligent Cyber Sensor (AICS) Version 1.0.1

    SciTech Connect

    2015-03-01

    The Autonomic Intelligent Cyber Sensor (AICS) provides cyber security and industrial network state awareness for Ethernet based control network implementations. The AICS utilizes collaborative mechanisms based on Autonomic Research and a Service Oriented Architecture (SOA) to: 1) identify anomalous network traffic; 2) discover network entity information; 3) deploy deceptive virtual hosts; and 4) implement self-configuring modules. AICS achieves these goals by dynamically reacting to the industrial human-digital ecosystem in which it resides. Information is transported internally and externally on a standards based, flexible two-level communication structure.

  17. How Well Can We Detect Lineage-Specific Diversification-Rate Shifts? A Simulation Study of Sequential AIC Methods

    PubMed Central

    May, Michael R.; Moore, Brian R.

    2016-01-01

    Evolutionary biologists have long been fascinated by the extreme differences in species numbers across branches of the Tree of Life. This has motivated the development of statistical methods for detecting shifts in the rate of lineage diversification across the branches of phylogenetic trees. One of the most frequently used methods, MEDUSA, explores a set of diversification-rate models, where each model assigns branches of the phylogeny to a set of diversification-rate categories. Each model is first fit to the data, and the Akaike information criterion (AIC) is then used to identify the optimal diversification model. Surprisingly, the statistical behavior of this popular method is uncharacterized, which is a concern in light of: (1) the poor performance of the AIC as a means of choosing among models in other phylogenetic contexts; (2) the ad hoc algorithm used to visit diversification models; and (3) errors that we reveal in the likelihood function used to fit diversification models to the phylogenetic data. Here, we perform an extensive simulation study demonstrating that MEDUSA (1) has a high false-discovery rate (on average, spurious diversification-rate shifts are identified ≈30% of the time), and (2) provides biased estimates of diversification-rate parameters. Understanding the statistical behavior of MEDUSA is critical both to empirical researchers—in order to clarify whether these methods can make reliable inferences from empirical datasets—and to theoretical biologists—in order to clarify the specific problems that need to be solved in order to develop more reliable approaches for detecting shifts in the rate of lineage diversification. [Akaike information criterion; extinction; lineage-specific diversification rates; phylogenetic model selection; speciation.] PMID:27037081

  18. Dynamic microphones M-87/AIC and M-101/AIC and earphone H-143/AIC. [for space shuttle

    NASA Technical Reports Server (NTRS)

    Reiff, F. H.

    1975-01-01

    The electrical characteristics of the M-87/AIC and M-101/AIC dynamic microphones and H-143/AIC earphones were tested to establish the relative performance levels of units supplied by four vendors. The microphones and earphones were tested for frequency response, sensitivity, linearity, impedance, and noise cancellation. Test results are presented and discussed.

  19. Application of "AIC" to Wald and Lagrange Multiplier Tests in Covariance Structure Analysis.

    ERIC Educational Resources Information Center

    Chou, Chih-Ping; Bentler, P. M.

    1996-01-01

    Some efficient procedures are proposed for using the Akaike Information Criterion (H. Akaike, 1987), an alternative to the conventional chi-square goodness of fit test, in covariance structure analysis based on backward search through the Wald test to impose constraints and forward search through the Lagrange test to release constraints. (SLD)

  20. Information criteria and selection of vibration models.

    PubMed

    Ruzek, Michal; Guyader, Jean-Louis; Pézerat, Charles

    2014-12-01

    This paper presents a method of determining an appropriate equation of motion for two-dimensional plane structures such as membranes and plates from vibration response measurements. The local steady-state vibration field is used as input for the inverse problem that approximately determines the dispersion curve of the structure. This dispersion curve is then statistically treated with the Akaike information criterion (AIC), which compares the experimentally measured curve to several candidate models (equations of motion). The model with the lowest AIC value is then chosen, and the utility of the other models can also be assessed. This method is applied to three experimental case studies: a red cedar wood plate for musical instruments, a thick paper subjected to unknown membrane tension, and a thick composite sandwich panel. These three cases present three different model selection situations.

  1. Mission science value-cost savings from the Advanced Imaging Communication System (AICS)

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1984-01-01

    An Advanced Imaging Communication System (AICS) was proposed in the mid-1970s as an alternative to the Voyager data/communication system architecture. The AICS achieved virtually error-free communication with little loss in the downlink data rate by concatenating a powerful Reed-Solomon block code with the Voyager convolutionally coded, Viterbi decoded downlink channel. The clean channel allowed AICS to apply sophisticated adaptive data compression techniques. Both Voyager and the Galileo mission have implemented AICS components, and the concatenated channel itself is heading for international standardization. An analysis that assigns a dollar value/cost savings to AICS mission performance gains is presented. A conservative value or savings of $3 million for Voyager, $4.5 million for Galileo, and as much as $7 to 9.5 million per mission for future projects such as the proposed Mariner Mark 2 series is shown.

  2. AIC Computations Using Navier-Stokes Equations on Single Image Supercomputers For Design Optimization

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru

    2004-01-01

    A procedure to accurately generate AIC using the Navier-Stokes solver, including grid deformation, is presented. Preliminary results show good comparisons between experimental and computed flutter boundaries for a rectangular wing. A full wing body configuration of an orbital space plane is selected for demonstration on a large number of processors. In the final paper, the AIC of the full wing body configuration will be computed, and the scalability of the procedure on a supercomputer will be demonstrated.

  3. A novel hybrid dimension reduction technique for undersized high dimensional gene expression data sets using information complexity criterion for cancer classification.

    PubMed

    Pamukçu, Esra; Bozdogan, Hamparsum; Çalık, Sinan

    2015-01-01

    Gene expression data typically are large, complex, and highly noisy. Their dimension is high, with several thousand genes (i.e., features) but only a limited number of observations (i.e., samples). Although the classical principal component analysis (PCA) method is widely used as a first standard step in dimension reduction and in supervised and unsupervised classification, it suffers from several shortcomings in the case of data sets involving undersized samples, since the sample covariance matrix degenerates and becomes singular. In this paper we address these limitations within the context of probabilistic PCA (PPCA) by introducing and developing a novel approach using a maximum entropy covariance matrix and its hybridized smoothed covariance estimators. To reduce the dimensionality of the data and to choose the number of probabilistic PCs (PPCs) to be retained, we further develop the celebrated Akaike information criterion (AIC), the consistent Akaike information criterion (CAIC), and the information-theoretic measure of complexity (ICOMP) criterion of Bozdogan. Six publicly available undersized benchmark data sets were analyzed to show the utility, flexibility, and versatility of our approach with hybridized smoothed covariance matrix estimators, which do not degenerate, to perform the PPCA to reduce the dimension and to carry out supervised classification of cancer groups in high dimensions.

  4. AIC649 Induces a Bi-Phasic Treatment Response in the Woodchuck Model of Chronic Hepatitis B.

    PubMed

    Paulsen, Daniela; Weber, Olaf; Ruebsamen-Schaeff, Helga; Tennant, Bud C; Menne, Stephan

    2015-01-01

    AIC649 has been shown to directly address the antigen-presenting-cell arm of the host immune defense, leading to a regulated cytokine release and activation of T cell responses. In the present study we analyzed the antiviral efficacy of AIC649 as well as its potential to induce functional cure in animal models of chronic hepatitis B. Hepatitis B virus transgenic mice and woodchucks chronically infected with woodchuck hepatitis virus (WHV) were treated with AIC649. In the mouse system, AIC649 decreased the hepatitis B virus titer as effectively as the "gold standard", Tenofovir. Interestingly, AIC649-treated chronically WHV-infected woodchucks displayed a bi-phasic pattern of response: the marker for functional cure, hepatitis surface antigen, first increased but subsequently decreased, even after cessation of treatment, to significantly reduced levels. We hypothesize that the observed bi-phasic response pattern to AIC649 treatment reflects a physiologically "concerted", reconstituted immune response against WHV and therefore may indicate a potential for inducing functional cure in HBV-infected patients. PMID:26656974

  5. Use of the AIC with the EM algorithm: A demonstration of a probability model selection technique

    SciTech Connect

    Glosup, J.G.; Axelrod M.C.

    1994-11-15

    The problem of discriminating between two potential probability models, a Gaussian distribution and a mixture of Gaussian distributions, is considered. The focus of our interest is a case where the models are potentially non-nested and the parameters of the mixture model are estimated through the EM algorithm. The AIC, which is frequently used as a criterion for discriminating between non-nested models, is modified to work with the EM algorithm and is shown to provide a model selection tool for this situation. A particular problem involving an infinite mixture distribution known as Middleton's Class A model is used to demonstrate the effectiveness and limitations of this method.
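A rough sketch of the comparison described here, not the authors' Middleton Class A setting: fit a single Gaussian and a two-component Gaussian mixture (the latter via a basic EM loop) to synthetic data, then compare AIC values. All data and settings are invented for illustration:

```python
import math
import random

random.seed(0)
# Synthetic 1-D data from two well-separated Gaussians, standing in for
# the single-Gaussian vs. Gaussian-mixture comparison in the abstract.
data = [random.gauss(0.0, 1.0) for _ in range(150)] + \
       [random.gauss(5.0, 1.0) for _ in range(150)]
n = len(data)

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Model 1: single Gaussian, k = 2 parameters, closed-form MLE.
mu = sum(data) / n
sd = math.sqrt(sum((x - mu) ** 2 for x in data) / n)
ll1 = sum(math.log(norm_pdf(x, mu, sd)) for x in data)

# Model 2: two-component mixture, k = 5 parameters, fit by a basic EM loop.
w, m1, m2, s1, s2 = 0.5, min(data), max(data), 1.0, 1.0
for _ in range(100):
    # E-step: responsibility of component 1 for each point.
    r = [w * norm_pdf(x, m1, s1) /
         (w * norm_pdf(x, m1, s1) + (1 - w) * norm_pdf(x, m2, s2))
         for x in data]
    # M-step: update mixing weight, means, and standard deviations.
    r1 = sum(r)
    r2 = n - r1
    w = r1 / n
    m1 = sum(ri * x for ri, x in zip(r, data)) / r1
    m2 = sum((1 - ri) * x for ri, x in zip(r, data)) / r2
    s1 = math.sqrt(sum(ri * (x - m1) ** 2 for ri, x in zip(r, data)) / r1)
    s2 = math.sqrt(sum((1 - ri) * (x - m2) ** 2 for ri, x in zip(r, data)) / r2)
ll2 = sum(math.log(w * norm_pdf(x, m1, s1) + (1 - w) * norm_pdf(x, m2, s2))
          for x in data)

aic1 = -2 * ll1 + 2 * 2   # single Gaussian
aic2 = -2 * ll2 + 2 * 5   # two-component mixture
```

On clearly bimodal data the mixture's likelihood gain dwarfs its extra-parameter penalty, so AIC selects the mixture; on unimodal data the comparison would tip the other way.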

  6. Perturbation of energy metabolism by fatty-acid derivative AIC-47 and imatinib in BCR-ABL-harboring leukemic cells.

    PubMed

    Shinohara, Haruka; Kumazaki, Minami; Minami, Yosuke; Ito, Yuko; Sugito, Nobuhiko; Kuranaga, Yuki; Taniguchi, Kohei; Yamada, Nami; Otsuki, Yoshinori; Naoe, Tomoki; Akao, Yukihiro

    2016-02-01

    In Ph-positive leukemia, imatinib brought marked clinical improvement; however, further improvement is needed to prevent relapse. Cancer cells efficiently use limited energy sources, and drugs targeting cellular metabolism improve the efficacy of therapy. In this study, we characterized the effects of the novel anti-cancer fatty-acid derivative AIC-47 and imatinib, focusing on cancer-specific energy metabolism in chronic myeloid leukemia cells. AIC-47 and imatinib in combination exhibited significant synergistic cytotoxicity. Imatinib inhibited only the phosphorylation of BCR-ABL, whereas AIC-47 suppressed the expression of the protein itself. Both AIC-47 and imatinib modulated the expression of pyruvate kinase M (PKM) isoforms from PKM2 to PKM1 through the down-regulation of polypyrimidine tract-binding protein 1 (PTBP1). PTBP1 functions as an alternative splicing repressor of PKM1, resulting in expression of PKM2, which is an inactive form of pyruvate kinase for the last step of glycolysis. Although inactivation of BCR-ABL by imatinib strongly suppressed glycolysis, compensatory fatty-acid oxidation (FAO) activation supported glucose-independent cell survival by up-regulating CPT1C, the rate-limiting FAO enzyme. In contrast, AIC-47 inhibited the expression of CPT1C and directly perturbed fatty-acid metabolism. These findings were also observed in the CD34(+) fraction of Ph-positive acute lymphoblastic leukemia cells. These results suggest that AIC-47 in combination with imatinib strengthened the attack on cancer energy metabolism, in terms of both glycolysis and the compensatory activation of FAO.

  7. Tightening the Noose on LMXB Formation of MSPs: Need for AIC ?

    NASA Astrophysics Data System (ADS)

    Grindlay, J. E.; Yi, I.

    1997-12-01

    The origin of millisecond pulsars (MSPs) remains an outstanding problem despite the early and considerable evidence that they are the descendants of neutron stars spun up by accretion in low mass x-ray binaries (LMXBs). The route to MSPs from LMXBs may pass through the high luminosity Z-source LMXBs but is severely constrained by the very limited population (and apparent birth rate) of Z-sources available. The more numerous x-ray bursters, the Atoll sources, are likely still too few in number and birth rate, and are now also found to be likely inefficient in the spin-up torques they can provide: the accretion in these relatively low accretion rate systems is likely dominated by an advection dominated flow, in which matter accretes onto the NS via sub-Keplerian flows that transfer correspondingly less angular momentum to the NS. We investigate the implications of the possible ADAF flows in low luminosity NS-LMXBs and find it is unlikely they can produce MSPs. The standard model can still be allowed if most NS-LMXBs are quiescent and undergo transient-like outbursts similar to the soft x-ray transients (which mostly contain black holes). However, apart from Cen X-4 and Aql X-1, few such systems have been found, and the SXTs appear instead to be significantly deficient in NS systems. Direct production of MSPs by the accretion induced collapse (AIC) of white dwarfs has been previously suggested to solve the MSP vs. LMXB birth rate problem. We re-examine AIC models in light of the new constraints on direct LMXB production and the additional difficulty imposed by ADAF flows and constraints on SXT populations, and derive constraints on the progenitor WD spin and magnetic fields.

  8. AN/AIC-22(V) Intercommunications Set (ICS) fiber optic link engineering analysis report

    NASA Astrophysics Data System (ADS)

    Minter, Richard; Blocksom, Roland; Ling, Christopher

    1990-08-01

    Electromagnetic interference (EMI) problems constitute a serious threat to operational Navy aircraft systems. The application of fiber optic technology is a potential solution to these problems. Reported EMI problems in the P-3 patrol aircraft AN/AIC-22(V) Intercommunications System (ICS) were selected from an EMI problem database for investigation and possible application of fiber optic technology. A proof-of-concept experiment was performed to demonstrate the level of EMI immunity of fiber optics when used in an ICS. A full duplex, single-channel fiber optic audio link was designed and assembled from modified government furnished equipment (GFE) previously used in another Navy fiber optic application. The link was taken to the Naval Air Test Center (NATC) Patuxent River, Maryland, and temporarily installed in a Naval Research Laboratory (NRL) P-3A aircraft for a side-by-side comparison test with the installed ICS. With regard to noise reduction, the fiber optic link provided a qualitative improvement over the conventional ICS. In an effort to obtain a quantitative measure of comparison, measurements were made across the audio frequency range both with and without operation of the aircraft VHF and UHF radio transmitters.

  9. Predicting the potential distribution of invasive exotic species using GIS and information-theoretic approaches: A case of ragweed (Ambrosia artemisiifolia L.) distribution in China

    USGS Publications Warehouse

    Chen, H.; Chen, L.; Albright, T.P.

    2007-01-01

    Invasive exotic species pose a growing threat to the economy, public health, and ecological integrity of nations worldwide. Explaining and predicting the spatial distribution of invasive exotic species is of great importance to prevention and early warning efforts. We are investigating the potential distribution of invasive exotic species, the environmental factors that influence these distributions, and the ability to predict them using statistical and information-theoretic approaches. For some species, detailed presence/absence occurrence data are available, allowing the use of a variety of standard statistical techniques. However, for most species, absence data are not available. Presented with the challenge of developing a model based on presence-only information, we developed an improved logistic regression approach using information theory and frequency statistics to produce a relative suitability map. In this paper, we generated a variety of distributions of ragweed (Ambrosia artemisiifolia L.) from logistic regression models applied to herbarium specimen location data and a suite of GIS layers including climatic, topographic, and land cover information. Our logistic regression model was selected using Akaike's Information Criterion (AIC) from a suite of ecologically reasonable predictor variables. Based on the results, we provided a new frequency-statistical method to compartmentalize habitat suitability in the native range. Finally, we used the model and the compartmentalized criterion developed in native ranges to "project" a potential distribution onto the exotic ranges to build habitat-suitability maps. © Science in China Press 2007.
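
    At its core, the AIC-based selection described above ranks candidate models by AIC = 2k − 2 ln L and keeps the minimum. A minimal sketch in Python; the model names and log-likelihood values are illustrative placeholders, not values from the paper:

    ```python
    import math

    def aic(log_lik, k):
        """Akaike's Information Criterion: AIC = 2k - 2*ln(L)."""
        return 2 * k - 2 * log_lik

    # Hypothetical fits of ragweed presence: (log-likelihood, number of parameters).
    candidates = {
        "climate only":           (-412.7, 4),
        "climate + topography":   (-401.2, 7),
        "climate + topo + cover": (-399.8, 12),
    }

    scores = {name: aic(ll, k) for name, (ll, k) in candidates.items()}
    best = min(scores, key=scores.get)
    # The middle model wins: its fit gain justifies 3 extra parameters,
    # while the 12-parameter model's small further gain does not justify 5 more.
    ```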

  10. Perceived challenges and attitudes to regimen and product selection from Italian haemophilia treaters: the 2013 AICE survey.

    PubMed

    Franchini, M; Coppola, A; Rocino, A; Zanon, E; Morfini, M; Accorsi, Arianna; Aru, Anna Brigida; Biasoli, Chiara; Cantori, Isabella; Castaman, Giancarlo; Cesaro, Simone; Ciabatta, Carlo; De Cristofaro, Raimondo; Delios, Grazia; Di Minno, Giovanni; D'Incà, Marco; Dragani, Alfredo; Ettorre, Cosimo Pietro; Gagliano, Fabio; Gamba, Gabriella; Gandini, Giorgio; Giordano, Paola; Giuffrida, Gaetano; Gresele, Paolo; Latella, Caterina; Luciani, Matteo; Margaglione, Maurizio; Marietta, Marco; Mazzucconi, Maria Gabriella; Messina, Maria; Molinari, Angelo Claudio; Notarangelo, Lucia Dora; Oliovecchio, Emily; Peyvandi, Flora; Piseddu, Gavino; Rossetti, Gina; Rossi, Vincenza; Santagostino, Elena; Schiavoni, Mario; Schinco, Piercarla; Serino, Maria Luisa; Tagliaferri, Annarita; Testa, Sophie

    2014-03-01

    Despite great advances in haemophilia care in the last 20 years, a number of questions on haemophilia therapy remain unanswered. These debated issues primarily involve the choice of the product type (plasma-derived vs. recombinant) for patients with different characteristics: specifically, whether they have been infected by blood-borne viruses, and whether they bear a high or low risk of inhibitor development. In addition, choosing the most appropriate treatment regimen for non-inhibitor and inhibitor patients compels physicians operating at the haemophilia treatment centres (HTCs) to take important therapeutic decisions, which are often based on their personal clinical experience rather than on evidence-based recommendations from the published literature. To ascertain the opinions of expert Italian physicians, who are responsible for common clinical practice and therapeutic decisions, on the most controversial aspects of haemophilia care, we conducted a survey among the Directors of HTCs affiliated to the Italian Association of Haemophilia Centres (AICE). A questionnaire, consisting of 19 questions covering the most important topics related to haemophilia treatment, was sent to the Directors of all 52 Italian HTCs. Forty Directors out of 52 (76.9%) responded, accounting for the large majority of HTCs affiliated to the AICE throughout Italy. The results of this survey provide for the first time a picture of the attitudes towards clotting factor concentrate use and product selection among clinicians working at Italian HTCs.

  11. Teacher's Corner: Conducting Specification Searches with Amos

    ERIC Educational Resources Information Center

    Schumacker, Randall E.

    2006-01-01

    Amos 5.0 (Arbuckle, 2003) permits exploratory specification searches for the best theoretical model given an initial model using the following fit function criteria: chi-square (C), chi-square--df (C--df), Akaike Information Criterion (AIC), Browne-Cudeck criterion (BCC), Bayes Information Criterion (BIC), chi-square divided by the degrees of…

  12. Egg distributions and the information a solitary parasitoid has and uses for its oviposition decisions.

    PubMed

    Hemerik, Lia; van der Hoeven, Nelly; van Alphen, Jacques J M

    2002-01-01

    Approximately three decades ago, the question of whether parasitoids are able to assess the number or origin of eggs in a host was first answered for a solitary parasitoid, Leptopilina heterotoma, by fitting theoretically derived distributions to empirical ones. We extend the set of theoretically postulated distributions of eggs among hosts by combining searching modes and abilities in assessing host quality. In the models, parasitoids search either randomly (Poisson) (1) or by vibrotaxis (Negative Binomial) (2). Parasitoids are: (a) assumed to treat all hosts equally, (b) able to distinguish only between unparasitised and parasitised hosts, (c) able to distinguish hosts by the number of eggs they contain, or (d) able to recognise their own eggs. Mathematically tractable combinations of searching mode (1 and 2) and abilities (a, b, c, d) result in seven different models (M1a, M1b, M1c, M1d, M2a, M2b and M2c). These models have been simulated for a varying number of searching parasitoids and various mean numbers of eggs per host. Each resulting distribution is fitted to all theoretical models. The model with the minimum Akaike's information criterion (AIC) is chosen as the best-fitting one for each simulated distribution. We thus investigate the power of the AIC and, for each distribution with a specified mean number of eggs per host, derive a frequency distribution for classification. Firstly, we discuss the simulations of models including random search (M1a, M1b, M1c and M1d). For M1a, M1c and M1d the simulated distributions are correctly classified in at least 70% of all cases. However, model M1b is properly classified only in a few cases, at intermediate mean numbers of eggs per host. The models including vibrotaxis as searching behaviour (M2a, M2b and M2c) cannot be distinguished from those with random search if the mean number of eggs per host is low. Among the models incorporating vibrotaxis the three abilities are detected analogously as in models with

  13. The role of multicollinearity in landslide susceptibility assessment by means of Binary Logistic Regression: comparison between VIF and AIC stepwise selection

    NASA Astrophysics Data System (ADS)

    Cama, Mariaelena; Cristi Nicu, Ionut; Conoscenti, Christian; Quénéhervé, Geraldine; Maerker, Michael

    2016-04-01

    Landslide susceptibility can be defined as the likelihood of a landslide occurring in a given area on the basis of local terrain conditions. In recent decades, much research has focused on its evaluation by means of stochastic approaches, under the assumption that 'the past is the key to the future': if a model is able to reproduce a known landslide spatial distribution, it will be able to predict the future locations of new (i.e. unknown) slope failures. Among the various stochastic approaches, Binary Logistic Regression (BLR) is one of the most used because it calculates the susceptibility in probabilistic terms and its results are easily interpretable from a geomorphological point of view. However, very often little importance is given to multicollinearity assessment, even though its effect is that the coefficient estimates become unstable, with potentially reversed signs, and therefore difficult to interpret. It should therefore be evaluated in every analysis to ensure that the model's results are geomorphologically sound. In this study the effects of multicollinearity on the predictive performance and robustness of landslide susceptibility models are analyzed. In particular, multicollinearity is estimated by means of the Variance Inflation Factor (VIF), which is also used as a selection criterion for the independent variables (VIF stepwise selection) and compared to the more commonly used AIC stepwise selection. The robustness of the results is evaluated through 100 replicates of the dataset. The study area selected for this analysis is the Moldavian Plateau, where landslides are among the most frequent geomorphological processes. This area has an increasing trend of urbanization and very high cultural heritage potential, being the place of discovery of the largest settlement belonging to the Cucuteni Culture in Eastern Europe (which led to the development of the great Cucuteni-Trypillia complex). Therefore, identifying the areas susceptible to
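
    The VIF used above for screening predictors is computed by regressing each predictor on all the others: VIF_j = 1/(1 − R_j²), with values far above 1 flagging collinearity. A sketch under hypothetical terrain predictors (the predictor names and simulated data are illustrative, not from the study):

    ```python
    import numpy as np

    def vif(X):
        """Variance Inflation Factor for each column of predictor matrix X.
        VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j
        on the remaining columns (with an intercept)."""
        X = np.asarray(X, dtype=float)
        n, p = X.shape
        out = []
        for j in range(p):
            y = X[:, j]
            others = np.delete(X, j, axis=1)
            A = np.column_stack([np.ones(n), others])   # intercept + other predictors
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ beta
            r2 = 1 - resid.var() / y.var()
            out.append(1.0 / (1.0 - r2))
        return out

    # Hypothetical predictors: slope, aspect, and a near-duplicate of slope.
    rng = np.random.default_rng(0)
    slope = rng.normal(size=200)
    aspect = rng.normal(size=200)                  # independent of slope
    slope2 = slope + 0.01 * rng.normal(size=200)   # nearly collinear with slope
    vifs = vif(np.column_stack([slope, aspect, slope2]))
    # vifs[0] and vifs[2] are very large (collinear pair); vifs[1] is near 1
    ```

    In a VIF stepwise selection, the predictor with the largest VIF above a chosen threshold (commonly 5 or 10) would be dropped and the VIFs recomputed.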

  14. Model weights and the foundations of multimodel inference

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    2006-01-01

    Statistical thinking in wildlife biology and ecology has been profoundly influenced by the introduction of AIC (Akaike's information criterion) as a tool for model selection and as a basis for model averaging. In this paper, we advocate the Bayesian paradigm as a broader framework for multimodel inference, one in which model averaging and model selection are naturally linked, and in which the performance of AIC-based tools is naturally evaluated. Prior model weights implicitly associated with the use of AIC are seen to highly favor complex models: in some cases, all but the most highly parameterized models in the model set are virtually ignored a priori. We suggest the usefulness of the weighted BIC (Bayesian information criterion) as a computationally simple alternative to AIC, based on explicit selection of prior model probabilities rather than acceptance of default priors associated with AIC. We note, however, that both procedures only approximate the use of exact Bayes factors. We discuss and illustrate technical difficulties associated with Bayes factors, and suggest approaches to avoiding these difficulties in the context of model selection for a logistic regression. Our example highlights the predisposition of AIC weighting to favor complex models and suggests a need for caution in using the BIC for computing approximate posterior model weights.
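
    The AIC-based model weights discussed above are the usual Akaike weights, w_i = exp(−Δ_i/2) / Σ_j exp(−Δ_j/2) with Δ_i = AIC_i − AIC_min. A minimal sketch (the AIC values are illustrative):

    ```python
    import math

    def akaike_weights(aics):
        """Akaike weights: rescale AIC differences into model probabilities."""
        best = min(aics)
        rel = [math.exp(-0.5 * (a - best)) for a in aics]   # exp(-delta_i / 2)
        total = sum(rel)
        return [r / total for r in rel]

    weights = akaike_weights([100.0, 102.0, 110.0])
    # deltas 0, 2, 10 -> weights approx. 0.727, 0.268, 0.005
    ```

    A difference of 2 AIC units still leaves the runner-up with appreciable weight, while a difference of 10 makes the third model essentially negligible.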

  15. Information-theoretic model selection and model averaging for closed-population capture-recapture studies

    USGS Publications Warehouse

    Stanley, T.R.; Burnham, K.P.

    1998-01-01

    Specification of an appropriate model is critical to valid statistical inference. Given that the "true model" for the data is unknown, the goal of model selection is to select a plausible approximating model that balances model bias and sampling variance. Model selection based on information criteria such as AIC or its variant AICc, or criteria like CAIC, has proven useful in a variety of contexts including the analysis of open-population capture-recapture data. These criteria have not been intensively evaluated for closed-population capture-recapture models, which are integer parameter models used to estimate population size (N), and there is concern that they will not perform well. To address this concern, we evaluated AIC, AICc, and CAIC model selection for closed-population capture-recapture models by empirically assessing the quality of inference for the population size parameter N. We found that AIC-, AICc-, and CAIC-selected models had smaller relative mean squared errors than randomly selected models, but that confidence interval coverage on N was poor unless unconditional variance estimates (which incorporate model uncertainty) were used to compute confidence intervals. Overall, AIC and AICc outperformed CAIC, and are preferred to CAIC for selection among the closed-population capture-recapture models we investigated. A model averaging approach to estimation, using AIC, AICc, or CAIC to estimate weights, was also investigated and proved superior to estimation using AIC-, AICc-, or CAIC-selected models. Our results suggest that, for model averaging, AIC or AICc should be favored over CAIC for estimating weights.
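
    The three criteria compared above share the −2 ln L term and differ only in their penalties: AIC adds 2k, AICc adds a small-sample correction, and CAIC penalizes by k(ln n + 1). A sketch with illustrative inputs (the log-likelihood, parameter count k, and sample size n are made up, not from the study):

    ```python
    import math

    def aic(ll, k):
        """AIC = -2*ln(L) + 2k."""
        return -2 * ll + 2 * k

    def aicc(ll, k, n):
        """AICc = AIC + 2k(k+1)/(n-k-1), a correction for small samples."""
        return aic(ll, k) + 2 * k * (k + 1) / (n - k - 1)

    def caic(ll, k, n):
        """CAIC = -2*ln(L) + k*(ln(n) + 1), a stiffer, sample-size-aware penalty."""
        return -2 * ll + k * (math.log(n) + 1)

    # Illustrative fit: log-likelihood -120.5, k = 3 parameters, n = 40 captures.
    print(aic(-120.5, 3))       # 247.0
    print(aicc(-120.5, 3, 40))  # ~247.67
    print(caic(-120.5, 3, 40))  # ~255.07
    ```

    For fixed k, the ordering AIC < AICc < CAIC illustrates why CAIC tends to pick simpler models than AIC or AICc.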

  16. End-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1982-01-01

    The efficiency of various deep space communication systems which are required to transmit both imaging and a typically error sensitive class of data called general science and engineering (gse) are compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as two orders of magnitude increase in imaging information rate compared to a single channel uncoded, uncompressed system while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts as well as efforts to apply them are provided in support of the system analysis.

  17. Time series ARIMA models for daily price of palm oil

    NASA Astrophysics Data System (ADS)

    Ariff, Noratiqah Mohd; Zamhawari, Nor Hashimah; Bakar, Mohd Aftar Abu

    2015-02-01

    Palm oil is deemed one of the most important commodities forming the economic backbone of Malaysia. Modeling and forecasting the daily price of palm oil is of great interest for Malaysia's economic growth. In this study, time series ARIMA models are used to fit the daily price of palm oil. The Akaike Information Criterion (AIC), the Akaike Information Criterion with a correction for finite sample sizes (AICc) and the Bayesian Information Criterion (BIC) are used to compare the different ARIMA models considered. It is found that the ARIMA(1,2,1) model is suitable for the daily price of crude palm oil in Malaysia for the years 2010 to 2012.
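
    Order selection by information criteria, as described above, can be sketched for the purely autoregressive case: fit AR(p) for several p and keep the lowest AIC. The sketch below uses simulated data as a stand-in for the palm-oil series, AR models instead of a full ARIMA fit, and a conditional-least-squares Gaussian likelihood; all of these are assumptions of the illustration, not of the paper:

    ```python
    import numpy as np

    def ar_aic(y, p):
        """AIC of an AR(p) model fitted by conditional least squares.
        Gaussian log-likelihood; k = p + 1 parameters (coefficients + variance)."""
        y = np.asarray(y, dtype=float)
        X = np.column_stack([y[p - j:len(y) - j] for j in range(1, p + 1)])  # lags 1..p
        target = y[p:]
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        resid = target - X @ beta
        n = len(target)
        sigma2 = resid @ resid / n
        loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
        return 2 * (p + 1) - 2 * loglik

    # Simulate an AR(2) series as a stand-in for the price data.
    rng = np.random.default_rng(0)
    y = np.zeros(500)
    for t in range(2, 500):
        y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()

    scores = {p: ar_aic(y, p) for p in (1, 2, 3, 4)}
    best = min(scores, key=scores.get)   # the AR(2) generator is usually recovered
    ```

    In practice a library ARIMA fitter would report AIC/AICc/BIC directly for each (p, d, q) candidate; the loop over candidate orders is the same idea.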

  18. An extended cure model and model selection.

    PubMed

    Peng, Yingwei; Xu, Jianfeng

    2012-04-01

    We propose a novel interpretation for a recently proposed Box-Cox transformation cure model, which leads to a natural extension of the cure model. Based on the extended model, we consider an important issue of model selection between the mixture cure model and the bounded cumulative hazard cure model via the likelihood ratio test, score test and Akaike's Information Criterion (AIC). Our empirical study shows that AIC is informative and both the score test and the likelihood ratio test have adequate power to differentiate between the mixture cure model and the bounded cumulative hazard cure model when the sample size is large. We apply the tests and AIC methods to leukemia and colon cancer data to examine the appropriateness of the cure models considered for them in the literature.

  19. Performance of soil particle-size distribution models for describing deposited soils adjacent to constructed dams in the China Loess Plateau

    NASA Astrophysics Data System (ADS)

    Zhao, Pei; Shao, Ming-an; Horton, Robert

    2011-02-01

    Soil particle-size distributions (PSD) have been used to estimate soil hydraulic properties. Various parametric PSD models have been proposed to describe the soil PSD from sparse experimental data. It is important to determine which PSD model best represents specific soils. Fourteen PSD models were examined in order to determine the best model for representing the deposited soils adjacent to dams in the China Loess Plateau; these were: Skaggs (S-1, S-2, and S-3), fractal (FR), Jaky (J), Lima and Silva (LS), Morgan (M), Gompertz (G), logarithm (L), exponential (E), log-exponential (LE), Weibull (W), van Genuchten type (VG) as well as Fredlund (F) models. Four hundred and eighty samples were obtained from soils deposited in the Liudaogou catchment. The coefficient of determination (R²), Akaike's information criterion (AIC), and the modified AIC (mAIC) were used. Based upon R² and AIC, the three- and four-parameter models were both good at describing the PSDs of deposited soils, and the LE, FR, and E models were the poorest. However, when the number of parameters was emphasized, the mAIC in conjunction with the R² and AIC results indicated that the W model was optimal for describing the PSD of the deposited soils. This analysis was also helpful for identifying the best model. Our results are applicable to the China Loess Plateau.

  20. Model Selection for Geostatistical Models

    SciTech Connect

    Hoeting, Jennifer A.; Davis, Richard A.; Merton, Andrew A.; Thompson, Sandra E.

    2006-02-01

    We consider the problem of model selection for geospatial data. Spatial correlation is typically ignored in the selection of explanatory variables and this can influence model selection results. For example, the inclusion or exclusion of particular explanatory variables may not be apparent when spatial correlation is ignored. To address this problem, we consider the Akaike Information Criterion (AIC) as applied to a geostatistical model. We offer a heuristic derivation of the AIC in this context and provide simulation results that show that using AIC for a geostatistical model is superior to the often used approach of ignoring spatial correlation in the selection of explanatory variables. These ideas are further demonstrated via a model for lizard abundance. We also employ the principle of minimum description length (MDL) to variable selection for the geostatistical model. The effect of sampling design on the selection of explanatory covariates is also explored.

  1. On the predictive information criteria for model determination in seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Varini, Elisa; Rotondi, Renata

    2016-04-01

    Many statistical tools have been developed for evaluating, understanding, and comparing models, from both frequentist and Bayesian perspectives. In particular, the problem of model selection can be addressed according to whether the primary goal is explanation or, alternatively, prediction. In the former case, the criteria for model selection are defined over the parameter space, whose physical interpretation can be difficult; in the latter case, they are defined over the space of the observations, which has a more direct physical meaning. In the frequentist approaches, model selection is generally based on an asymptotic approximation which may be poor for small data sets (e.g. the F-test, the Kolmogorov-Smirnov test, etc.); moreover, these methods often apply only under specific assumptions on the models (e.g. models have to be nested in the likelihood ratio test). In the Bayesian context, among the criteria for explanation, the ratio of the observed marginal densities for two competing models, named the Bayes Factor (BF), is commonly used for both model choice and model averaging (Kass and Raftery, J. Am. Stat. Ass., 1995). But the BF does not apply to improper priors and, even when the prior is proper, it is not robust to the specification of the prior. These limitations extend to two famous penalized likelihood methods, the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC), since both are proved to be approximations of −2 log BF. In the perspective that a model is as good as its predictions, the predictive information criteria aim at evaluating the predictive accuracy of Bayesian models or, in other words, at estimating the expected out-of-sample prediction error using a bias-correction adjustment of within-sample error (Gelman et al., Stat. Comput., 2014). In particular, the Watanabe criterion is fully Bayesian because it averages the predictive distribution over the posterior distribution of parameters rather than conditioning on a point

  3. AICE Survey of USSR Air Pollution Literature, Volume 13: Technical Papers from the Leningrad International Symposium on the Meteorological Aspects of Atmospheric Pollution, Part 2.

    ERIC Educational Resources Information Center

    Nuttonson, M. Y., Ed.

    Twelve papers were translated from Russian: Automation of Information Processing Involved in Experimental Studies of Atmospheric Diffusion, Micrometeorological Characteristics of Atmospheric Pollution Conditions, Study of the Influence of Irregularities of the Earth's Surface on the Air Flow Characteristics in a Wind Tunnel, Use of Parameters of…

  4. Polynomial order selection in random regression models via penalizing adaptively the likelihood.

    PubMed

    Corrales, J D; Munilla, S; Cantet, R J C

    2015-08-01

    Orthogonal Legendre polynomials (LP) are used to model the shape of additive genetic and permanent environmental effects in random regression models (RRM). Frequently, the Akaike (AIC) and the Bayesian (BIC) information criteria are employed to select the LP order. However, it has been theoretically shown that neither AIC nor BIC is simultaneously optimal in terms of consistency and efficiency. Thus, the goal was to introduce a method, 'penalizing adaptively the likelihood' (PAL), as a criterion to select LP order in RRM. Four simulated data sets and real data (60,513 records, 6675 Colombian Holstein cows) were employed. Nested models were fitted to the data, and AIC, BIC and PAL were calculated for all of them. Results showed that PAL and BIC identified the true LP order for the additive genetic and permanent environmental effects with probability one, but AIC tended to favour overparameterized models. Conversely, when the true model was unknown, PAL selected the best model with higher probability than AIC. In the latter case, BIC never favoured the best model. To summarize, PAL selected a correct model order regardless of whether the 'true' model was within the set of candidates.

  5. A hybrid model to simulate the annual runoff of the Kaidu River in northwest China

    NASA Astrophysics Data System (ADS)

    Xu, Jianhua; Chen, Yaning; Bai, Ling; Xu, Yiwen

    2016-04-01

    Fluctuating and complicated hydrological processes can result in uncertainty in runoff forecasting. Thus, it is necessary to apply multi-method integrated modeling approaches to simulate runoff. Integrating the ensemble empirical mode decomposition (EEMD), the back-propagation artificial neural network (BPANN) and a nonlinear regression equation, we put forward a hybrid model to simulate the annual runoff (AR) of the Kaidu River in northwest China. We validate the simulation results using the coefficient of determination (R²) and the Akaike information criterion (AIC), based on the observed data from 1960 to 2012 at the Dashankou hydrological station. The average absolute and relative errors show the high simulation accuracy of the hybrid model. R² and AIC both illustrate that the hybrid model performs much better than the single BPANN. The hybrid model and the integrated approach elicited by this study can be applied to simulate the annual runoff of similar rivers in northwest China.
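
    When two Gaussian-error models are compared on the same series, AIC can be computed straight from each model's residual sum of squares: AIC = n ln(RSS/n) + 2k, with additive constants dropped. The RSS values and parameter counts below are illustrative placeholders, not the paper's results:

    ```python
    import math

    def aic_from_rss(rss, n, k):
        """AIC for a Gaussian regression model via its residual sum of squares:
        AIC = n*ln(RSS/n) + 2k (constants common to all models dropped)."""
        return n * math.log(rss / n) + 2 * k

    # Hypothetical fit summaries for two competing runoff models.
    n = 53                                   # annual observations, 1960-2012
    hybrid = aic_from_rss(rss=10.4, n=n, k=6)
    bpann = aic_from_rss(rss=15.9, n=n, k=5)
    # The hybrid model attains the lower AIC despite its extra parameter,
    # because its residual sum of squares is sufficiently smaller.
    ```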

  6. Thermal signature identification system (TheSIS): a spread spectrum temperature cycling method

    NASA Astrophysics Data System (ADS)

    Merritt, Scott

    2015-03-01

    NASA GSFC's Thermal Signature Identification System (TheSIS) 1) measures the high-order dynamic responses of optoelectronic components to direct-sequence spread-spectrum temperature cycling, 2) estimates the parameters of multiple autoregressive moving average (ARMA) or other models of the responses, and 3) selects the most appropriate model using the Akaike Information Criterion (AIC). Using the AIC-tested model and parameter vectors from TheSIS, one can 1) select high-performing components on a multivariate basis, i.e., with multivariate Figures of Merit (FOMs), 2) detect subtle reversible shifts in performance, and 3) investigate irreversible changes in component or subsystem performance, e.g. aging. We show examples of the TheSIS methodology for passive and active components and systems, e.g. fiber Bragg gratings (FBGs) and DFB lasers with coupled temperature control loops, respectively.

  7. Clinical-dosimetric relationship between lacrimal gland dose and ocular toxicity after intensity-modulated radiotherapy for sinonasal tumours

    PubMed Central

    Batth, S S; Sreeraman, R; Dienes, E; Beckett, L A; Daly, M E; Cui, J; Mathai, M; Purdy, J A

    2013-01-01

    Objective: To characterise the relationship between lacrimal gland dose and ocular toxicity among patients treated by intensity-modulated radiotherapy (IMRT) for sinonasal tumours. Methods: 40 patients with cancers involving the nasal cavity and paranasal sinuses were treated with IMRT to a median dose of 66.0 Gy. Toxicity was scored using the Radiation Therapy Oncology Group morbidity criteria based on conjunctivitis, corneal ulceration and keratitis. The paired lacrimal glands were contoured as organs at risk, and the mean dose, maximum dose, V10, V20 and V30 were determined. Statistical analysis was performed using logistic regression and the Akaike information criterion (AIC). Results: The maximum and mean dose to the ipsilateral lacrimal gland were 19.2 Gy (range, 1.4–75.4 Gy) and 14.5 Gy (range, 11.1–67.8 Gy), respectively. The mean V10, V20 and V30 values were 50%, 25% and 17%, respectively. The incidence of acute and late Grade 3+ toxicities was 23% and 19%, respectively. Based on logistic regression and AIC, the maximum dose to the ipsilateral lacrimal gland was identified as a more significant predictor of acute toxicity (AIC, 53.89) and late toxicity (AIC, 32.94) than the mean dose (AIC, 56.13 and 33.83, respectively). The V20 was identified as the most significant predictor of late toxicity (AIC, 26.81). Conclusion: A dose–response relationship between maximum dose to the lacrimal gland and ocular toxicity was established. Our data suggesting a threshold relationship may be useful in establishing dosimetric guidelines for IMRT planning that may decrease the risk of acute and late lacrimal toxicities in the future. Advances in knowledge: A threshold relationship between radiation dose to the lacrimal gland and ocular toxicity was demonstrated, which may aid in treatment planning and reducing the morbidity of radiotherapy for sinonasal tumours. PMID:24167183

  8. Rubber yield prediction by meteorological conditions using mixed models and multi-model inference techniques

    NASA Astrophysics Data System (ADS)

    Golbon, Reza; Ogutu, Joseph Ochieng; Cotter, Marc; Sauerborn, Joachim

    2015-12-01

    Linear mixed models were developed and used to predict rubber (Hevea brasiliensis) yield based on meteorological conditions to which rubber trees had been exposed for periods ranging from 1 day to 2 months prior to tapping events. Predictors included a range of moving averages of meteorological covariates spanning different windows of time before the date of the tapping events. Serial autocorrelation in the latex yield measurements was accounted for using random effects and a spatial generalization of the autoregressive error covariance structure suited to data sampled at irregular time intervals. Information-theoretic measures, specifically the Akaike information criterion (AIC), AIC corrected for small sample size (AICc), and Akaike weights, were used to select the models with the greatest strength of support in the data from a set of competing candidate models. The predictive performance of the selected best model was evaluated using both leave-one-out cross-validation (LOOCV) and an independent test set. Moving averages of precipitation, minimum and maximum temperature, and maximum relative humidity with a 30-day lead period were identified as the best yield predictors. Prediction accuracy, expressed as the percentage of predictions within a measurement error of 5 g, was above 99% for both cross-validation and the test dataset.

  9. Does weather confound or modify the association of particulate air pollution with mortality? An analysis of the Philadelphia data, 1973–1980

    SciTech Connect

    Samet, J.; Zeger, S.; Kelsall, J.; Xu, J.; Kalkstein, L.

    1998-04-01

    This report considers the consequences of using alternative approaches to controlling for weather and explores modification of air pollution effects by weather, as weather patterns could plausibly alter air pollution's effect on health. The authors analyzed 1973–1980 total mortality data for Philadelphia using four weather models and compared estimates of the effects of TSP and SO₂ on mortality using a Poisson regression model. Two synoptic categories developed by Kalkstein, the Temporal Synoptic Index (TSI) and the Spatial Synoptic Classification (SSC), were selected and compared with (1) descriptive models developed by Schwartz and Dockery (S-D); and (2) LOESS, a nonparametric function of the previous day's temperature and dew point. The authors considered model fit using Akaike's Information Criterion (AIC) and changes in the estimated effects of TSP and SO₂. In the full-year analysis, S-D is better than LOESS at predicting mortality, and S-D and LOESS are better than TSI, as measured by AIC. When TSP or SO₂ was fit alone, the results were qualitatively similar regardless of how weather was controlled; when TSP and SO₂ were fit simultaneously, the S-D and LOESS models gave qualitatively different results than TSI, which attributed more of the pollution effect to SO₂ than to TSP. Model fit was substantially poorer with TSI.

  10. Algorithm for systematic peak extraction from atomic pair distribution functions.

    PubMed

    Granlund, L; Billinge, S J L; Duxbury, P M

    2015-07-01

    The study presents an algorithm, ParSCAPE, for model-independent extraction of peak positions and intensities from atomic pair distribution functions (PDFs). It provides a statistically motivated method for determining the parsimony of extracted peak models, using the information-theoretic Akaike information criterion (AIC) applied to plausible models generated within an iterative framework of clustering and chi-square fitting. All parameters the algorithm uses are in principle known or estimable from experiment, though careful judgment must be applied when estimating the PDF baseline of nanostructured materials. ParSCAPE has been implemented in the Python program SrMise. Algorithm performance is examined on synchrotron X-ray PDFs of 16 bulk crystals and two nanoparticles using AIC-based multimodeling techniques, with particular attention to the impact of experimental uncertainties on the extracted models. The algorithm is quite resistant to misidentification of spurious peaks coming from noise and termination effects, even in the absence of a constraining structural model. Structure solution from automatically extracted peaks using the Liga algorithm is demonstrated for 14 crystals and for C60. Special attention is given to the information content of the PDF, the theory and practice of the AIC, and the algorithm's limitations. PMID:26131896

  11. [Species-abundance distribution patterns along succession series of Phyllostachys glauca forest in a limestone mountain].

    PubMed

    Shi, Jian-min; Fan, Cheng-fang; Liu, Yang; Yang, Qing-pei; Fang, Kai; Fan, Fang-li; Yang, Guang-yao

    2015-12-01

    To detect the ecological process of the succession series of Phyllostachys glauca forest in a limestone mountain, five niche models, i.e., broken stick model (BSM), niche preemption model (NPM), dominance preemption model (DPM), random assortment model (RAM) and overlapping niche model (ONM), were employed to describe the species-abundance distribution patterns (SDPs) of 15 samples. The χ² test and Akaike information criterion (AIC) were used to test the fitting effects of the five models. The results showed that the optimal SDP models for P. glauca forest, bamboo-broadleaved mixed forest and broadleaved forest were DPM (χ² = 35.86, AIC = -69.77), NPM (χ² = 1.60, AIC = -94.68) and NPM (χ² = 0.35, AIC = -364.61), respectively. BSM also fitted the SDPs of the bamboo-broadleaved mixed forest and broadleaved forest well, while it was unsuitable for describing the SDP of P. glauca forest. The fittings of RAM and ONM in the three forest types were all rejected by the χ² test and AIC. With the development of community succession from P. glauca forest to broadleaved forest, the species richness and evenness increased, and the optimal SDP model changed from DPM to NPM. It was inferred that the change of ecological process from habitat filtration to interspecific competition was the main driving force of the forest succession. The results also indicated that the application of multiple SDP models and test methods is beneficial for selecting the best model and deeply understanding the ecological process of community succession. PMID:27111994

  13. Measure the Semantic Similarity of GO Terms Using Aggregate Information Content.

    PubMed

    Song, Xuebo; Li, Lin; Srimani, Pradip K; Yu, Philip S; Wang, James Z

    2014-01-01

    The rapid development of gene ontology (GO) and huge amount of biomedical data annotated by GO terms necessitate computation of semantic similarity of GO terms and, in turn, measurement of functional similarity of genes based on their annotations. In this paper we propose a novel and efficient method to measure the semantic similarity of GO terms. The proposed method addresses the limitations in existing GO term similarity measurement techniques; it computes the semantic content of a GO term by considering the information content of all of its ancestor terms in the graph. The aggregate information content (AIC) of all ancestor terms of a GO term implicitly reflects the GO term's location in the GO graph and also represents how human beings use this GO term and all its ancestor terms to annotate genes. We show that semantic similarity of GO terms obtained by our method closely matches the human perception. Extensive experimental studies show that this novel method also outperforms all existing methods in terms of the correlation with gene expression data. We have developed web services for measuring semantic similarity of GO terms and functional similarity of genes using the proposed AIC method and other popular methods. These web services are available at http://bioinformatics.clemson.edu/G-SESAME.
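
    The aggregation idea above can be sketched with a toy ontology; the DAG, the annotation counts, and the shared-over-total similarity form below are illustrative stand-ins, and the exact semantic-weight formula of the published AIC measure of Song et al. differs in detail:

```python
import math

# Toy GO-like DAG: child -> parents (all names hypothetical).
parents = {
    "root": [],
    "A": ["root"],
    "B": ["root"],
    "C": ["A"],
    "D": ["A", "B"],
}
# Hypothetical annotation counts used to derive information content.
freq = {"root": 100, "A": 40, "B": 30, "C": 10, "D": 5}

def ancestors(term):
    """Return the term itself plus all of its ancestors in the DAG."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        if t not in seen:
            seen.add(t)
            stack.extend(parents[t])
    return seen

def ic(term):
    """Information content IC(t) = -ln p(t), with p(t) from annotation counts."""
    return -math.log(freq[term] / freq["root"])

def aic_similarity(t1, t2):
    """Shared aggregate IC of ancestors over total aggregate IC of both terms."""
    a1, a2 = ancestors(t1), ancestors(t2)
    shared = sum(ic(t) for t in a1 & a2)
    total = sum(ic(t) for t in a1) + sum(ic(t) for t in a2)
    return 2 * shared / total if total else 1.0
```

    Because each term carries the information content of its whole ancestor set, terms deep in the graph that share a specific ancestor score as more similar than terms related only through the root.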

  14. Measure the Semantic Similarity of GO terms Using Aggregate Information Content.

    PubMed

    Song, Xuebo; Li, Lin; Srimani, Pradip K; Yu, Philip S; Wang, James Z

    2013-12-11

    The rapid development of Gene Ontology (GO) and huge amount of biomedical data annotated by GO terms necessitate computation of semantic similarity of GO terms and, in turn, measurement of functional similarity of genes based on their annotations. In this paper we propose a novel and efficient method to measure the semantic similarity of GO terms. The proposed method addresses the limitations in existing GO term similarity measurement techniques; it computes the semantic content of a GO term by considering the information content of all of its ancestor terms in the graph. The aggregate information content (AIC) of all ancestor terms of a GO term implicitly reflects the GO term's location in the GO graph and also represents how human beings use this GO term and all its ancestor terms to annotate genes. We show that semantic similarity of GO terms obtained by our method closely matches the human perception. Extensive experimental studies show that this novel method also outperforms all existing methods in terms of the correlation with gene expression data. We have developed Web services for measuring semantic similarity of GO terms and functional similarity of genes using the proposed AIC method and other popular methods. These Web services are available at http://bioinformatics.clemson.edu/G-SESAME.

  16. Relating body condition to inorganic contaminant concentrations of diving ducks wintering in coastal California.

    PubMed

    Takekawa, J Y; Wainwright-De La Cruz, S E; Hothem, R L; Yee, J

    2002-01-01

    In wild waterfowl, poor winter body condition may negatively affect migration, survival, and reproduction. Environmental contaminants have been shown to adversely affect the body condition of captive birds, but few field studies have examined body condition and contaminants in wild birds during the winter. We assessed the body condition of carcasses from a collection of canvasbacks (Aythya valisineria) and lesser (A. affinis) and greater scaup (A. marila) wintering in coastal California. We used the Akaike information criterion (AIC) to select the model with the best balance of parsimony and goodness of fit that related indices of body condition with concentrations of Cd, Cu, Hg, Se, and Zn. Total ash-free protein in canvasbacks decreased with increasing Se concentrations, and pancreas mass decreased with increasing Hg. We combined the closely related lesser and greater scaup in analyses and found that total carcass fat, pancreas mass, and carcass mass decreased with increasing Zn concentrations, and pancreas mass decreased with increasing Hg. Our AIC analysis indicated that some indices of body condition in diving ducks were inversely related to some environmental contaminants in this collection, but additional AIC analyses should be conducted across a wider range of contaminant concentrations to corroborate our findings. PMID:11706369

  17. Relating body condition to inorganic contaminant concentrations of diving ducks wintering in coastal California

    USGS Publications Warehouse

    Takekawa, J.Y.; Wainwright-De La Cruz, S.E.; Hothem, R.L.; Yee, J.

    2002-01-01

    In wild waterfowl, poor winter body condition may negatively affect migration, survival, and reproduction. Environmental contaminants have been shown to adversely affect the body condition of captive birds, but few field studies have examined body condition and contaminants in wild birds during the winter. We assessed the body condition of carcasses from a collection of canvasbacks (Aythya valisineria) and lesser (A. affinis) and greater scaup (A. marila) wintering in coastal California. We used the Akaike information criterion (AIC) to select the model with the best balance of parsimony and goodness of fit that related indices of body condition with concentrations of Cd, Cu, Hg, Se, and Zn. Total ash-free protein in canvasbacks decreased with increasing Se concentrations, and pancreas mass decreased with increasing Hg. We combined the closely related lesser and greater scaup in analyses and found that total carcass fat, pancreas mass, and carcass mass decreased with increasing Zn concentrations, and pancreas mass decreased with increasing Hg. Our AIC analysis indicated that some indices of body condition in diving ducks were inversely related to some environmental contaminants in this collection, but additional AIC analyses should be conducted across a wider range of contaminant concentrations to corroborate our findings.

  18. Microbial functional diversity enhances predictive models linking environmental parameters to ecosystem properties.

    PubMed

    Powell, Jeff R; Welsh, Allana; Hallin, Sara

    2015-07-01

    Microorganisms drive biogeochemical processes, but linking these processes to real changes in microbial communities under field conditions is not trivial. Here, we present a model-based approach to estimate independent contributions of microbial community shifts to ecosystem properties. The approach was tested empirically, using denitrification potential as our model process, in a spatial survey of arable land encompassing a range of edaphic conditions and two agricultural production systems. Soil nitrate was the most important single predictor of denitrification potential (the change in Akaike's information criterion corrected for sample size, ΔAICc = 20.29); however, the inclusion of biotic variables (particularly the evenness and size of denitrifier communities [ΔAICc = 12.02], and the abundance of one denitrifier genotype [ΔAICc = 18.04]) had a substantial effect on model precision, comparable to the inclusion of abiotic variables (biotic R² = 0.28, abiotic R² = 0.50, biotic + abiotic R² = 0.76). This approach provides a valuable tool for explicitly linking microbial communities to ecosystem functioning. By making this link, we have demonstrated that including aspects of microbial community structure and diversity in biogeochemical models can improve predictions of nutrient cycling in ecosystems and enhance our understanding of ecosystem functionality.

  19. Comparing Smoothing Techniques for Fitting the Nonlinear Effect of Covariate in Cox Models

    PubMed Central

    Roshani, Daem; Ghaderi, Ebrahim

    2016-01-01

    Background and Objective: The Cox model is a popular model in survival analysis that assumes the covariate acts linearly on the log hazard function. However, continuous covariates can affect the hazard through more complicated nonlinear functional forms, so Cox models with continuous covariates are prone to misspecification when the correct functional form is not fitted. In this study, a smooth nonlinear covariate effect was approximated by different spline functions. Material and Methods: We applied three flexible nonparametric smoothing techniques for nonlinear covariate effects in Cox models: penalized splines, restricted cubic splines and natural splines. The Akaike information criterion (AIC) and degrees of freedom were used for smoothing parameter selection in the penalized splines model. The ability of the nonparametric methods to recover the true functional form (linear, quadratic and nonlinear) was evaluated using different simulated sample sizes. Data analysis was carried out using R 2.11.0 software and the significance level was set at 0.05. Results: With AIC used for smoothing parameter selection, the penalized spline method had consistently lower mean square error than the other methods. The same result was obtained with real data. Conclusion: Penalized spline smoothing, with AIC for smoothing parameter selection, was more accurate than the other methods in evaluating the relation between a covariate and the log hazard function. PMID:27041809

  20. The optimum order of a Markov chain model for daily rainfall in Nigeria

    NASA Astrophysics Data System (ADS)

    Jimoh, O. D.; Webster, P.

    1996-11-01

    Markov type models are often used to describe the occurrence of daily rainfall. Although models of order 1 have been successfully employed, there remains uncertainty concerning the optimum order for such models. This paper is concerned with estimation of the optimum order of Markov chains and, in particular, the use of the objective Akaike and Bayesian Information Criteria (AIC and BIC, respectively). Using daily rainfall series for five stations in Nigeria, it has been found that the AIC and BIC estimates vary with month as well as with the value of the rainfall threshold used to define a wet day. There is no apparent system to this variation, although AIC estimates are consistently greater than or equal to BIC estimates, with values of the latter limited to zero or unity. The optimum order is also investigated through generation of synthetic sequences of wet and dry days using the transition matrices of zero-, first- and second-order Markov chains. It was found that the first-order model is superior to the zero-order model in representing the characteristics of the historical sequence as judged using frequency duration curves. There was no discernible difference between the performance of the first- and second-order models. There was no seasonal variation in the model performance, which contrasts with the optimum models identified using AIC and BIC estimates. It is concluded that caution is needed with the use of objective criteria for determining the optimum order of the Markov model and that the use of frequency duration curves can provide a robust alternative method of model identification. Comments are also made on the importance of record length and non-stationarity for model identification.
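
    The order-selection procedure described above, AIC and BIC computed for Markov chains of increasing order fitted to a binary wet/dry sequence, can be sketched on synthetic data (the series and transition probabilities below are invented for illustration):

```python
import math, random

def fit_markov(seq, order):
    """MLE log-likelihood and free-parameter count of a binary Markov chain."""
    counts = {}
    for i in range(order, len(seq)):
        state = tuple(seq[i - order:i])
        counts.setdefault(state, [0, 0])[seq[i]] += 1
    ll = 0.0
    for n0, n1 in counts.values():
        total = n0 + n1
        for c in (n0, n1):
            if c:
                ll += c * math.log(c / total)
    return ll, 2 ** order  # one free wet-probability per history state

random.seed(1)
# Synthetic wet/dry (1/0) series with genuine first-order persistence.
seq, prev = [], 0
for _ in range(2000):
    prev = 1 if random.random() < (0.7 if prev else 0.2) else 0
    seq.append(prev)

results = {}
for order in (0, 1, 2):
    ll, k = fit_markov(seq, order)
    n = len(seq) - order
    results[order] = (2 * k - 2 * ll, k * math.log(n) - 2 * ll)  # (AIC, BIC)
    print(f"order {order}: AIC = {results[order][0]:.1f}, BIC = {results[order][1]:.1f}")
```

    On persistent data like this, the order-0 model is penalized heavily by its poor fit, while BIC's stronger complexity penalty discourages the unnecessary second-order model, mirroring the paper's observation that BIC estimates stay at zero or unity.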

  1. Power-law ansatz in complex systems: Excessive loss of information.

    PubMed

    Tsai, Sun-Ting; Chang, Chin-De; Chang, Ching-Hao; Tsai, Meng-Xue; Hsu, Nan-Jung; Hong, Tzay-Ming

    2015-12-01

    The ubiquity of power-law relations in empirical data reflects physicists' love of simple laws and of uncovering common causes among seemingly unrelated phenomena. However, many reported power laws lack statistical support and mechanistic backing, and discrepancies with real data are often explained away as corrections due to finite size or other variables. We propose a simple experiment and rigorous statistical procedures to look into these issues. Making use of the fact that the occurrence rate and pulse intensity of crumple sound obey a power law with an exponent that varies with material, we simulate a complex system with two driving mechanisms by crumpling two different sheets together. The probability function of the crumple sound is found to transit from two power-law terms to a bona fide power law as compaction increases. In addition to showing the vicinity of these two distributions in the phase space, this observation nicely demonstrates the effect of interactions in bringing about a subtle change in macroscopic behavior, and shows that more information may be retrieved if the data are subject to sorting. Our analyses are based on the Akaike information criterion, which is a direct measurement of information loss and emphasizes the need to strike a balance between model simplicity and goodness of fit. As a show of force, the Akaike information criterion also found the Gutenberg-Richter law for earthquakes and the scale-free model for a brain functional network, a two-dimensional sandpile, and solar flare intensity to suffer an excessive loss of information. They resemble more the crumpled-together ball at low compactions in that there appear to be two driving mechanisms that take turns occurring. PMID:26764792
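
    The kind of comparison the authors describe, judging a power-law ansatz against an alternative by information loss, can be illustrated by fitting two one-parameter candidate distributions by maximum likelihood and comparing their AIC; the densities and synthetic data below are illustrative, not the crumpling data of the paper:

```python
import math, random

def powerlaw_loglik(xs, xmin):
    """MLE fit of p(x) = ((a-1)/xmin) (x/xmin)^(-a) for x >= xmin."""
    n = len(xs)
    s = sum(math.log(x / xmin) for x in xs)
    alpha = 1 + n / s
    return n * math.log((alpha - 1) / xmin) - alpha * s, alpha

def exponential_loglik(xs, xmin):
    """MLE fit of a shifted exponential p(x) = lam * exp(-lam (x - xmin))."""
    n = len(xs)
    lam = n / sum(x - xmin for x in xs)
    return n * math.log(lam) - n, lam

random.seed(7)
xmin, alpha_true = 1.0, 2.5
# Draw power-law samples by inverse-transform sampling.
xs = [xmin * random.random() ** (-1 / (alpha_true - 1)) for _ in range(5000)]

(ll_pl, alpha), (ll_exp, lam) = powerlaw_loglik(xs, xmin), exponential_loglik(xs, xmin)
aic_pl = 2 * 1 - 2 * ll_pl    # one fitted parameter each
aic_exp = 2 * 1 - 2 * ll_exp
print(f"power law: alpha = {alpha:.2f}, AIC = {aic_pl:.1f}")
print(f"exponential: AIC = {aic_exp:.1f}  (dAIC = {aic_exp - aic_pl:.1f})")
```

    With genuinely heavy-tailed data the exponential candidate loses badly; on real data the same comparison, extended to mixtures of two power-law terms, is what reveals the "excessive loss of information" the abstract refers to.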

  3. Power-law ansatz in complex systems: Excessive loss of information

    NASA Astrophysics Data System (ADS)

    Tsai, Sun-Ting; Chang, Chin-De; Chang, Ching-Hao; Tsai, Meng-Xue; Hsu, Nan-Jung; Hong, Tzay-Ming

    2015-12-01

    The ubiquity of power-law relations in empirical data reflects physicists' love of simple laws and of uncovering common causes among seemingly unrelated phenomena. However, many reported power laws lack statistical support and mechanistic backing, and discrepancies with real data are often explained away as corrections due to finite size or other variables. We propose a simple experiment and rigorous statistical procedures to look into these issues. Making use of the fact that the occurrence rate and pulse intensity of crumple sound obey a power law with an exponent that varies with material, we simulate a complex system with two driving mechanisms by crumpling two different sheets together. The probability function of the crumple sound is found to transit from two power-law terms to a bona fide power law as compaction increases. In addition to showing the vicinity of these two distributions in the phase space, this observation nicely demonstrates the effect of interactions in bringing about a subtle change in macroscopic behavior, and shows that more information may be retrieved if the data are subject to sorting. Our analyses are based on the Akaike information criterion, which is a direct measurement of information loss and emphasizes the need to strike a balance between model simplicity and goodness of fit. As a show of force, the Akaike information criterion also found the Gutenberg-Richter law for earthquakes and the scale-free model for a brain functional network, a two-dimensional sandpile, and solar flare intensity to suffer an excessive loss of information. They resemble more the crumpled-together ball at low compactions in that there appear to be two driving mechanisms that take turns occurring.

  4. Spot counting on fluorescence in situ hybridization in suspension images using Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Liu, Sijia; Sa, Ruhan; Maguire, Orla; Minderman, Hans; Chaudhary, Vipin

    2015-03-01

    Cytogenetic abnormalities are important diagnostic and prognostic criteria for acute myeloid leukemia (AML). A flow cytometry-based imaging approach for FISH in suspension (FISH-IS) was established that enables the automated analysis of a several-log-magnitude higher number of cells than microscopy-based approaches. Rotational positioning of cells can occur, leading to discordance in spot counts. To address counting errors from overlapping spots, this study proposes a classification method based on a Gaussian mixture model (GMM). The Akaike information criterion (AIC) and Bayesian information criterion (BIC) of the GMM are used as global image features in this classification method. Using a random forest classifier, the results show that the proposed method is able to detect closely overlapping spots that cannot be separated by existing image-segmentation-based spot detection methods. The experimental results show that the proposed method yields a significant improvement in spot counting accuracy.
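
    The abstract above scores Gaussian mixture fits with AIC and BIC. A self-contained sketch of that selection step on synthetic one-dimensional data, using a plain EM fit in place of the paper's imaging pipeline (the data, initialization, and iteration count are all illustrative assumptions):

```python
import math, random

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def fit_gmm(xs, k, iters=100):
    """Plain EM for a 1-D Gaussian mixture; returns the final log-likelihood."""
    lo, hi = min(xs), max(xs)
    mus = [lo + (i + 0.5) * (hi - lo) / k for i in range(k)]
    vars_ = [1.0] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in xs:
            ps = [w * gaussian_pdf(x, m, v) for w, m, v in zip(weights, mus, vars_)]
            total = sum(ps)
            resp.append([p / total for p in ps])
        # M-step: re-estimate weights, means and variances.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / len(xs)
            mus[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            vars_[j] = max(1e-6, sum(r[j] * (x - mus[j]) ** 2
                                     for r, x in zip(resp, xs)) / nj)
    return sum(math.log(sum(w * gaussian_pdf(x, m, v)
                            for w, m, v in zip(weights, mus, vars_))) for x in xs)

random.seed(3)
# Synthetic 1-D "spot intensity" data: two overlapping clusters.
xs = ([random.gauss(0.0, 1.0) for _ in range(300)]
      + [random.gauss(4.0, 1.0) for _ in range(300)])

scores = {}
for k in (1, 2, 3):
    ll = fit_gmm(xs, k)
    n_params = 3 * k - 1          # k means, k variances, k - 1 free weights
    scores[k] = 2 * n_params - 2 * ll
print("AIC per component count:", scores)
```

    In the paper these AIC and BIC values are not the end result but features handed to a downstream classifier; this sketch only shows where they come from.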

  5. Pharmacokinetic analysis of tissue microcirculation using nested models: multimodel inference and parameter identifiability.

    PubMed

    Brix, Gunnar; Zwick, Stefan; Kiessling, Fabian; Griebel, Jürgen

    2009-07-01

    The purpose of this study is to evaluate the identifiability of physiological tissue parameters by pharmacokinetic modeling of concentration-time curves derived under conditions that are realistic for dynamic-contrast-enhanced (DCE) imaging and to assess the information-theoretic approach of multimodel inference using nested models. Tissue curves with a realistic noise level were simulated by means of an axially distributed multipath reference model using typical values reported in literature on plasma flow, permeability-surface area product, and volume fractions of the intravascular and interstitial space. The simulated curves were subsequently analyzed by a two-compartment model containing these physiological quantities as fit parameters as well as by two reduced models with only three and two parameters formulated for the case of a permeability-limited and a flow-limited scenario, respectively. The competing models were ranked according to Akaike's information criterion (AIC), balancing the bias versus variance trade-off. To utilize the information available from all three models, model-averaged parameters were estimated using Akaike weights that quantify the relative strength of evidence in favor of each model. As compared to the full model, the reduced models yielded equivalent or even superior AIC values for scenarios where the structural information in the tissue curves on either the plasma flow or the capillary permeability was limited. Multimodel inference took effect to a considerable extent in half of the curves and improved the precision of the estimated tissue parameters. As theoretically expected, the plasma flow was subject to a systematic (but largely correctable) overestimation, whereas the other three physiological tissue parameters could be determined in a numerically robust and almost unbiased manner. The presented concept of pharmacokinetic analysis of noisy DCE data using three nested models under an information-theoretic paradigm offers
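
    The multimodel-inference step described above, Akaike weights used to average a parameter estimate across nested models, can be sketched as follows; the AIC values and plasma-flow estimates are hypothetical, not taken from the study:

```python
import math

def akaike_weights(aic_scores):
    """Relative likelihoods of the models, normalized to sum to one."""
    best = min(aic_scores)
    rel = [math.exp(-(a - best) / 2) for a in aic_scores]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical (AIC, estimated plasma flow) pairs for three nested
# pharmacokinetic models fitted to one tissue curve; values are invented
# for illustration only.
fits = [(141.2, 32.0), (139.8, 36.5), (143.0, 30.1)]
weights = akaike_weights([a for a, _ in fits])
flow_avg = sum(w * f for w, (_, f) in zip(weights, fits))
print(f"model-averaged plasma flow: {flow_avg:.1f}")
```

    The averaged estimate always lies between the extremes of the individual fits and leans toward the models with the strongest support, which is how the study improves the precision of the tissue parameters.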

  6. Hardware-Algorithms Co-Design and Implementation of an Analog-to-Information Converter for Biosignals Based on Compressed Sensing.

    PubMed

    Pareschi, Fabio; Albertini, Pierluigi; Frattini, Giovanni; Mangia, Mauro; Rovatti, Riccardo; Setti, Gianluca

    2016-02-01

    We report the design and implementation of an Analog-to-Information Converter (AIC) based on Compressed Sensing (CS). The system is realized in a CMOS 180 nm technology and targets the acquisition of biosignals with Nyquist frequency up to 100 kHz. To maximize performance and reduce hardware complexity, we co-design the hardware together with the acquisition and reconstruction algorithms. The resulting AIC outperforms previously proposed solutions mainly thanks to two key features. First, we adopt a novel method to deal with saturations in the computation of CS measurements. This allows no loss in performance even when 60% of the measurements saturate. Second, the system is able to adapt itself to the energy distribution of the input by exploiting the so-called rakeness to maximize the amount of information contained in the measurements. With this approach, the 16 measurement channels integrated into a single device are expected to allow the acquisition and correct reconstruction of most biomedical signals. As a case study, measurements on real electrocardiograms (ECGs) and electromyograms (EMGs) show that these signals can be reconstructed without any noticeable degradation at compression rates of 8 and 10, respectively.

  7. EFFECT OF DIET QUALITY ON NUTRIENT ALLOCATION TO THE TEST AND ARISTOTLE'S LANTERN IN THE SEA URCHIN LYTECHINUS VARIEGATUS (LAMARCK, 1816).

    PubMed

    Heflin, Laura Elizabeth; Gibbs, Victoria K; Powell, Mickie L; Makowsky, Robert; Lawrence, Addison L; Lawrence, John M

    2012-08-01

    Small adult (19.50 ± 2.01 g wet weight) Lytechinus variegatus were fed eight formulated diets with different protein (12 to 36% dry weight as fed) and carbohydrate (21 to 39% dry weight) levels. Each sea urchin (n = 8 per treatment) was fed a daily ration of 1.5% of the average body weight of all individuals for 9 weeks. For each of eight growth measurements, mathematical models representing six a priori dietary composition hypotheses were compared using Akaike information criterion (AIC) scores. The AIC is one of many information-theoretic approaches that allow direct comparison of non-nested models with varying numbers of parameters. Dietary protein level and the protein:energy ratio were the best models for prediction of test diameter increase. Dietary protein level was the best model of test-with-spines wet weight gain and test-with-spines dry matter production. When the Aristotle's lantern was corrected for size of the test, there was an inverse relationship with dietary protein level. The log-transformed lantern to test-with-spines index was also best associated with the dietary protein model. Dietary carbohydrate level was a poor predictor of the growth parameters. However, the protein × carbohydrate interaction model was the best model of the organic content (% dry weight) of the test without spines. These data suggest that there is a differential allocation of resources when dietary protein is limiting and that the test with spines, but not the Aristotle's lantern, is affected by the availability of dietary nutrients.

  8. Method for identifying electromagnetically induced transparency in a tunable circuit quantum electrodynamics system

    NASA Astrophysics Data System (ADS)

    Liu, Qi-Chun; Li, Tie-Fu; Luo, Xiao-Qing; Zhao, Hu; Xiong, Wei; Zhang, Ying-Shan; Chen, Zhen; Liu, J. S.; Chen, Wei; Nori, Franco; Tsai, J. S.; You, J. Q.

    2016-05-01

    Electromagnetically induced transparency (EIT) has been realized in atomic systems, but fulfilling the EIT conditions for artificial atoms made from superconducting circuits is a more difficult task. Here we report an experimental observation of the EIT in a tunable three-dimensional transmon by probing the cavity transmission. To fulfill the EIT conditions, we tune the transmon to adjust its damping rates by utilizing the effect of the cavity on the transmon states. From the experimental observations, we clearly identify the EIT and Autler-Townes splitting (ATS) regimes as well as the transition regime in between. Also, the experimental data demonstrate that the threshold ΩAIC determined by the Akaike information criterion can describe the EIT-ATS transition better than the threshold ΩEIT given by the EIT theory.

  9. Thermal Signature Identification System (TheSIS)

    NASA Technical Reports Server (NTRS)

    Merritt, Scott; Bean, Brian

    2015-01-01

    We characterize both nonlinear and high order linear responses of fiber-optic and optoelectronic components using spread spectrum temperature cycling methods. This Thermal Signature Identification System (TheSIS) provides much more detail than conventional narrowband or quasi-static temperature profiling methods. This detail allows us to match components more thoroughly, detect subtle reversible shifts in performance, and investigate the cause of instabilities or irreversible changes. In particular, we create parameterized models of athermal fiber Bragg gratings (FBGs), delay line interferometers (DLIs), and distributed feedback (DFB) lasers, then subject the alternative models to selection via the Akaike Information Criterion (AIC). Detailed pairing of components, e.g. FBGs, is accomplished by means of weighted distance metrics or norms, rather than on the basis of a single parameter, such as center wavelength.

  10. Particle-Size Distribution Models for the Conversion of Chinese Data to FAO/USDA System

    PubMed Central

    Dai, YongJiu; García-Gutiérrez, Carlos; Yuan, Hua

    2014-01-01

    We investigated eleven particle-size distribution (PSD) models to determine the appropriate models for describing the PSDs of 16349 Chinese soil samples. These data are based on three soil texture classification schemes, including one ISSS (International Society of Soil Science) scheme with four data points and two Katschinski's schemes with five and six data points, respectively. The adjusted coefficient of determination r², Akaike's information criterion (AIC), and geometric mean error ratio (GMER) were used to evaluate model performance. The soil data were converted to the USDA (United States Department of Agriculture) standard using PSD models and the fractal concept. The performance of the PSD models was affected by soil texture and by the fraction classification scheme, and also varied with the clay content of the soils. The Anderson, Fredlund, modified logistic growth, Skaggs, and Weibull models were the best. PMID:25121108
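
The GMER used above is easy to compute; a small sketch with made-up measured/modeled pairs (a ratio of 1.0 indicates no systematic bias):

```python
import math

def gmer(measured, modeled):
    # Geometric mean error ratio: exp of the mean log(measured/modeled).
    # GMER > 1 means the model under-predicts on average; < 1, over-predicts.
    logs = [math.log(m, math.e) - math.log(p, math.e)
            for m, p in zip(measured, modeled)]
    return math.exp(sum(logs) / len(logs))

# Toy values: a model that halves every measurement has GMER = 2
biased = gmer([2.0, 4.0, 8.0], [1.0, 2.0, 4.0])
```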

  11. Bivariate copula in fitting rainfall data

    NASA Astrophysics Data System (ADS)

    Yee, Kong Ching; Suhaila, Jamaludin; Yusof, Fadhilah; Mean, Foo Hui

    2014-07-01

    The use of copulas to determine the joint distribution between two variables is widespread in various areas. The joint distribution of rainfall characteristics obtained using a copula model is preferable to standard bivariate modelling, since copulas are believed to overcome some of its limitations. Six copula models are applied to obtain the most suitable bivariate distribution between two rain gauge stations: Ali-Mikhail-Haq (AMH), Clayton, Frank, Galambos, Gumbel-Hougaard (GH) and Plackett. The rainfall data used in the study were selected from rain gauge stations located in the southern part of Peninsular Malaysia, over the period from 1980 to 2011. The goodness-of-fit test in this study is based on the Akaike information criterion (AIC).

  12. A K-BKZ Formulation for Soft-Tissue Viscoelasticity

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.; Diethelm, Kai

    2005-01-01

    A viscoelastic model of the K-BKZ (Kaye 1962; Bernstein et al. 1963) type is developed for isotropic biological tissues, and applied to the fat pad of the human heel. To facilitate this pursuit, a class of elastic solids is introduced through a novel strain-energy function whose elements possess strong ellipticity, and therefore lead to stable material models. The standard fractional-order viscoelastic (FOV) solid is used to arrive at the overall elastic/viscoelastic structure of the model, while the elastic potential via the K-BKZ hypothesis is used to arrive at the tensorial structure of the model. Candidate sets of functions are proposed for the elastic and viscoelastic material functions present in the model, including a regularized fractional derivative that was determined to be the best. The Akaike information criterion (AIC) is advocated for performing multi-model inference, enabling an objective selection of the best material function from within a candidate set.
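
Multi-model inference with the AIC, as advocated above, usually proceeds by converting scores into Akaike weights, which quantify the relative support for each candidate; a minimal sketch with illustrative scores:

```python
import math

def akaike_weights(aic_scores):
    # w_i = exp(-d_i/2) / sum_j exp(-d_j/2), with d_i = AIC_i - min(AIC).
    # Subtracting the minimum first keeps the exponentials well-conditioned.
    best = min(aic_scores)
    rel = [math.exp(-(a - best) / 2.0) for a in aic_scores]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three candidate material functions
weights = akaike_weights([100.0, 102.0, 110.0])
```

The weight of a model can be read as the probability that it is the best approximating model in the candidate set, which is what makes an objective selection possible.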

  13. Cluster Analysis and Gaussian Mixture Estimation of Correlated Time-Series by Means of Multi-dimensional Scaling

    NASA Astrophysics Data System (ADS)

    Ibuki, Takero; Suzuki, Sei; Inoue, Jun-ichi

    We investigate cross-correlations between typical Japanese stocks collected through the Yahoo!Japan website ( http://finance.yahoo.co.jp/ ). Applying multi-dimensional scaling (MDS) to the cross-correlation matrices, we draw two-dimensional scatter plots in which each point corresponds to a stock. To cluster these data points, we fit the data set with a mixture of Gaussian densities. By minimizing the so-called Akaike Information Criterion (AIC) with respect to the parameters of the mixture, we attempt to specify the best possible mixture of Gaussians. It might naturally be assumed that all the two-dimensional data points of stocks shrink into a single small region when an economic crisis takes place. The justification of this assumption is numerically checked against empirical Japanese stock data, for instance, those around 11 March 2011.

  14. Scaling cosmology with variable dark-energy equation of state

    SciTech Connect

    Castro, David R.; Velten, Hermano; Zimdahl, Winfried E-mail: velten@physik.uni-bielefeld.de

    2012-06-01

    Interactions between dark matter and dark energy which result in a power-law behavior (with respect to the cosmic scale factor) of the ratio between the energy densities of the dark components (thus generalizing the ΛCDM model) have been considered as an attempt to alleviate the cosmic coincidence problem phenomenologically. We generalize this approach by allowing for a variable equation of state for the dark energy within the CPL-parametrization. Based on analytic solutions for the Hubble rate and using the Constitution and Union2 SNIa sets, we present a statistical analysis and classify different interacting and non-interacting models according to the Akaike (AIC) and the Bayesian (BIC) information criteria. We do not find noticeable evidence for an alleviation of the coincidence problem with the mentioned type of interaction.

  15. MMI: Multimodel inference or models with management implications?

    USGS Publications Warehouse

    Fieberg, J.; Johnson, Douglas H.

    2015-01-01

    We consider a variety of regression modeling strategies for analyzing observational data associated with typical wildlife studies, including all subsets and stepwise regression, a single full model, and Akaike's Information Criterion (AIC)-based multimodel inference. Although there are advantages and disadvantages to each approach, we suggest that there is no unique best way to analyze data. Further, we argue that, although multimodel inference can be useful in natural resource management, the importance of considering causality and accurately estimating effect sizes is greater than simply considering a variety of models. Determining causation is far more valuable than simply indicating how the response variable and explanatory variables covaried within a data set, especially when the data set did not arise from a controlled experiment. Understanding the causal mechanism will provide much better predictions beyond the range of data observed. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.

  16. Kinetics of Methane Production from Swine Manure and Buffalo Manure.

    PubMed

    Sun, Chen; Cao, Weixing; Liu, Ronghou

    2015-10-01

    The degradation kinetics of swine and buffalo manure for methane production was investigated. Six kinetic models were employed to describe the corresponding experimental data. These models were evaluated by two statistical measurements: the root mean square prediction error (RMSPE) and Akaike's information criterion (AIC). The results showed that the logistic and Fitzhugh models predicted the experimental data very well for the digestion of swine and buffalo manure, respectively. The predicted methane yield potential for swine and buffalo manure was 487.9 and 340.4 mL CH4/g volatile solid (VS), respectively, which was close to the experimental values when the digestion temperature was 36 ± 1 °C in the biochemical methane potential assays. Moreover, the rate constants revealed that swine manure had a much faster methane production rate than buffalo manure.
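
For least-squares fits with Gaussian errors, both measurements reduce to short formulas; a sketch, with the AIC expressed in terms of the residual sum of squares (an assumption about the authors' exact formulation):

```python
import math

def rmspe(observed, predicted):
    # Root mean square prediction error
    n = len(observed)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

def aic_from_rss(rss, n, k):
    # AIC for a least-squares fit with Gaussian errors (constants dropped):
    # n*ln(RSS/n) + 2k. Only differences between models matter.
    return n * math.log(rss / n) + 2 * k
```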

  17. MMA, A Computer Code for Multi-Model Analysis

    SciTech Connect

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and the system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression, and calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on which model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that, as more data become available, they tend to favor more complicated models than the other methods do, which makes sense in many situations.
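
Three of the four default criteria named above share a common form; a sketch of the standard AIC, AICc, and BIC formulas (KIC omitted), taking the maximized log-likelihood, the parameter count k, and the sample size n:

```python
import math

def aic(loglik, k):
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    # Second-order bias correction; matters when n/k is small,
    # and converges to plain AIC as n grows
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    # Penalizes parameters by ln(n) instead of 2, so it prefers
    # simpler models than AIC once n > e^2 ~ 7.4
    return k * math.log(n) - 2 * loglik
```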

  18. Bayesian Decision Tree for the Classification of the Mode of Motion in Single-Molecule Trajectories

    PubMed Central

    Türkcan, Silvan; Masson, Jean-Baptiste

    2013-01-01

    Membrane proteins move in heterogeneous environments with spatially (and sometimes temporally) varying friction and with biochemical interactions with various partners. It is important to reliably distinguish different modes of motion to improve our knowledge of the membrane architecture and to understand the nature of interactions between membrane proteins and their environments. Here, we present an analysis technique for single-molecule tracking (SMT) trajectories that can determine the model of motion that best matches observed trajectories. The method is based on Bayesian inference to calculate the posterior probability of an observed trajectory under a given model. Information theory criteria, such as the Bayesian information criterion (BIC), the Akaike information criterion (AIC), and modified AIC (AICc), are used to select the preferred model. The considered group of models includes free Brownian motion and confined motion in 2nd- or 4th-order potentials. We determined the best information criteria for classifying trajectories, tested their limits through simulations matching large sets of experimental conditions, and built a decision tree. This decision tree first uses the BIC to distinguish between free Brownian motion and confined motion. In a second step, it classifies the confining potential further using the AIC. We apply the method to experimental Clostridium perfringens toxin (CPT) receptor trajectories to show that these receptors are confined by a spring-like potential. An adaptation of this technique was applied on a sliding window in the temporal dimension along the trajectory. We applied this adaptation to experimental CPT trajectories that lose confinement due to disaggregation of confining domains. This new technique adds another dimension to the discussion of SMT data. The mode of motion of a receptor might hold more biologically relevant information than the diffusion coefficient or domain size and may be a better tool to

  19. Empirical extensions of the lasso penalty to reduce the false discovery rate in high-dimensional Cox regression models.

    PubMed

    Ternès, Nils; Rotolo, Federico; Michiels, Stefan

    2016-07-10

    Correct selection of prognostic biomarkers among multiple candidates is becoming increasingly challenging as the dimensionality of biological data becomes higher. Therefore, minimizing the false discovery rate (FDR) is of primary importance, while a low false negative rate (FNR) is a complementary measure. The lasso is a popular selection method in Cox regression, but its results depend heavily on the penalty parameter λ. Usually, λ is chosen using the maximum cross-validated log-likelihood (max-cvl). However, this method often has a very high FDR. We review methods for a more conservative choice of λ. We propose an empirical extension of the cvl by adding a penalization term, which trades off between the goodness-of-fit and the parsimony of the model, leading to the selection of fewer biomarkers and, as we show, to a reduction of the FDR without a large increase in FNR. We conducted a simulation study considering null and moderately sparse alternative scenarios and compared our approach with the standard lasso and 10 other competitors: Akaike information criterion (AIC), corrected AIC, Bayesian information criterion (BIC), extended BIC, Hannan and Quinn information criterion (HQIC), risk information criterion (RIC), one-standard-error rule, adaptive lasso, stability selection, and percentile lasso. Our extension achieved the best compromise across all the scenarios between a reduction of the FDR and a limited rise in the FNR, followed by the AIC, the RIC, and the adaptive lasso, which performed well in some settings. We illustrate the methods using gene expression data of 523 breast cancer patients. In conclusion, we propose to apply our extension to the lasso whenever a stringent FDR with a limited FNR is targeted. Copyright © 2016 John Wiley & Sons, Ltd.
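
The idea of penalizing the cross-validated log-likelihood can be caricatured as follows. This sketch uses a simple linear penalty on the number of selected biomarkers, which is an illustrative stand-in, not the paper's actual penalization term; all values are hypothetical:

```python
# Hypothetical cross-validated log-likelihoods (cvl, higher is better) and
# numbers of selected biomarkers over a grid of lambda values
lambdas = [0.05, 0.10, 0.20, 0.40]
cvl     = [-210.0, -211.5, -214.0, -220.0]
n_sel   = [40, 18, 7, 2]

def penalized_cvl(cvl, n_sel, c=1.0):
    # Trade goodness-of-fit against parsimony; c*k is an illustrative penalty
    return [v - c * k for v, k in zip(cvl, n_sel)]

scores = penalized_cvl(cvl, n_sel)
best_lambda = lambdas[max(range(len(scores)), key=scores.__getitem__)]
```

Plain max-cvl would pick the smallest lambda (40 biomarkers, highest FDR risk), while the penalized criterion moves the choice toward a sparser model.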

  20. Comparative Study of Staging Systems for Hepatocellular Carcinoma in 428 Patients Treated with Radioembolization

    PubMed Central

    Memon, Khairuddin; Kulik, Laura M.; Lewandowski, Robert J.; Wang, Edward; Wang, Jonathan; Ryu, Robert K.; Hickey, Ryan; Vouche, Michael; Baker, Talia; Ganger, Daniel; Gates, Vanessa L; Habib, Ali; Mulcahy, Mary F.; Salem, Riad

    2014-01-01

    Purpose: To compare the utility of different staging systems and to analyze independent predictors of survival in patients with hepatocellular carcinoma (HCC) treated with 90Y radioembolization. Materials and Methods: 428 HCC patients were treated with 90Y from 2004-2011. All patients were staged prospectively by Child-Turcotte-Pugh [CTP], United Network for Organ Sharing, Barcelona Clinic Liver Cancer [BCLC], Okuda classification, Cancer of the Liver Italian Program [CLIP], Groupe d'Etude et de Traitement du Carcinome Hepatocellulaire, Chinese University Prognostic Index and the Japan Integrated System; their ability to predict survival was assessed. Staging systems were compared using the Cox regression model, linear trend test, Akaike information criterion (AIC) and concordance index (C-index). Univariate and multivariate analyses were employed to assess independent predictors of survival. Results: When tested independently, all staging systems provided significant ability to discriminate early (long survival) from advanced disease (worse survival). CLIP provided the most accurate information in predicting survival outcomes (AIC=2993, C-index=0.8503); CTP was least informative (AIC=3074, C-index=0.6445). Independent predictors of survival included ECOG 0 (HR:0.56, CI:0.34-0.93); non-infiltrative tumors (HR:0.62, CI:0.44-0.89); absence of portal venous thrombosis (HR:0.60, CI:0.40-0.89); absence of ascites (HR:0.56, CI:0.40-0.76); albumin ≥2.8 g/dL (HR:0.72, CI:0.55-0.94); alkaline phosphatase ≤200 U/L (HR:0.68, CI:0.50-0.92); and AFP ≤200 ng/mL (HR:0.67, CI:0.51-0.86). Conclusion: CLIP was most accurate in predicting HCC survival. Given that not all patients receive the recommended BCLC treatment strategy, this information is relevant for clinical trial design and for predicting long-term outcomes following 90Y. PMID:24613269

  1. Assessment of non-Gaussian diffusion with singly and doubly stretched biexponential models of diffusion-weighted MRI (DWI) signal attenuation in prostate tissue.

    PubMed

    Hall, Matt G; Bongers, Andre; Sved, Paul; Watson, Geoffrey; Bourne, Roger M

    2015-04-01

    Non-Gaussian diffusion dynamics was investigated in the two distinct water populations identified by a biexponential model of diffusion in prostate tissue. Diffusion-weighted MRI (DWI) signal attenuation was measured ex vivo in two formalin-fixed prostates at 9.4 T with diffusion times Δ = 10, 20 and 40 ms, and b values in the range 0.017-8.2 ms/µm². A conventional biexponential model was compared with models in which either the lower diffusivity component or both of the components of the biexponential were stretched. Models were compared using Akaike's Information Criterion (AIC) and a leave-one-out (LOO) test of model prediction accuracy. The doubly stretched (SS) model had the highest LOO prediction accuracy and lowest AIC (highest information content) in the majority of voxels at Δ = 10 and 20 ms. The lower diffusivity stretching factor (α2) of the SS model was consistently lower (range ~0.3-0.9) than the higher diffusivity stretching factor (α1, range ~0.7-1.1), indicating a high degree of diffusion heterogeneity in the lower diffusivity environment, and nearly Gaussian diffusion in the higher diffusivity environment. Stretched biexponential models demonstrate that, in prostate tissue, the two distinct water populations identified by the simple biexponential model individually exhibit non-Gaussian diffusion dynamics.

  2. The cross-validated AUC for MCP-logistic regression with high-dimensional data.

    PubMed

    Jiang, Dingfeng; Huang, Jian; Zhang, Ying

    2013-10-01

    We propose a cross-validated area under the receiver operating characteristic (ROC) curve (CV-AUC) criterion for tuning parameter selection for penalized methods in sparse, high-dimensional logistic regression models. We use this criterion in combination with the minimax concave penalty (MCP) method for variable selection. The CV-AUC criterion is specifically designed for optimizing the classification performance for binary outcome data. To implement the proposed approach, we derive an efficient coordinate descent algorithm to compute the MCP-logistic regression solution surface. Simulation studies are conducted to evaluate the finite sample performance of the proposed method and to compare it with existing methods, including the Akaike information criterion (AIC), Bayesian information criterion (BIC) and extended BIC (EBIC). The model selected based on the CV-AUC criterion tends to have a larger predictive AUC and smaller classification error than those with tuning parameters selected using the AIC, BIC or EBIC. We illustrate the application of the MCP-logistic regression with the CV-AUC criterion on three microarray datasets from studies that attempt to identify genes related to cancers. Our simulation studies and data examples demonstrate that the CV-AUC is an attractive method for tuning parameter selection for penalized methods in high-dimensional logistic regression models.
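
The AUC at the heart of the CV-AUC criterion equals the normalized Mann-Whitney U statistic: the probability that a randomly chosen positive case gets a higher score than a randomly chosen negative one. A dependency-free sketch:

```python
def auc(labels, scores):
    # Probability that a random positive outranks a random negative;
    # tied scores contribute half a win
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

In the cross-validated variant, scores for each observation come from the fold in which that observation was held out, and the criterion picks the tuning parameter maximizing this quantity.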

  3. The evaluation of different forest structural indices to predict the stand aboveground biomass of even-aged Scotch pine (Pinus sylvestris L.) forests in Kunduz, Northern Turkey.

    PubMed

    Ercanli, İlker; Kahriman, Aydın

    2015-03-01

    We assessed the effect of stand structural diversity, including the Shannon, improved Shannon, Simpson, McIntosh, Margelef, and Berger-Parker indices, on stand aboveground biomass (AGB) and developed statistical prediction models for the stand AGB values, including stand structural diversity indices and some stand attributes. The AGB prediction model, including only stand attributes, accounted for 85% of the total variance in AGB (R²) with an Akaike's information criterion (AIC) of 807.2407, Bayesian information criterion (BIC) of 809.5397, Schwarz Bayesian criterion (SBC) of 818.0426, and root mean square error (RMSE) of 38.529 Mg. After inclusion of the stand structural diversity into the model structure, considerable improvement was observed in statistical accuracy, including 97.5% of the total variance in AGB, with an AIC of 614.1819, BIC of 617.1242, SBC of 633.0853, and RMSE of 15.8153 Mg. The predictive fitting results indicate that some indices describing the stand structural diversity can be employed as significant independent variables to predict the AGB production of the Scotch pine stand. Further, including the stand diversity indices in the AGB prediction model with the stand attributes provided important predictive contributions in estimating the total variance in AGB.

  4. Nested Sampling for Bayesian Model Comparison in the Context of Salmonella Disease Dynamics

    PubMed Central

    Dybowski, Richard; McKinley, Trevelyan J.; Mastroeni, Pietro; Restif, Olivier

    2013-01-01

    Understanding the mechanisms underlying the observed dynamics of complex biological systems requires the statistical assessment and comparison of multiple alternative models. Although this has traditionally been done using maximum likelihood-based methods such as Akaike's Information Criterion (AIC), Bayesian methods have gained in popularity because they provide more informative output in the form of posterior probability distributions. However, comparison between multiple models in a Bayesian framework is made difficult by the computational cost of numerical integration over large parameter spaces. A new, efficient method for the computation of posterior probabilities has recently been proposed and applied to complex problems from the physical sciences. Here we demonstrate how nested sampling can be used for inference and model comparison in biological sciences. We present a reanalysis of data from experimental infection of mice with Salmonella enterica showing the distribution of bacteria in liver cells. In addition to confirming the main finding of the original analysis, which relied on AIC, our approach provides: (a) integration across the parameter space, (b) estimation of the posterior parameter distributions (with visualisations of parameter correlations), and (c) estimation of the posterior predictive distributions for goodness-of-fit assessments of the models. The goodness-of-fit results suggest that alternative mechanistic models and a relaxation of the quasi-stationary assumption should be considered. PMID:24376528

  5. [Applying multi-model inference to estimate growth parameters of greater lizard fish Saurida tumbil in Beibu Gulf, South China Sea].

    PubMed

    Hou, Gang; Liu, Jin-Dian; Feng, Bo; Yan, Yun-Rong; Lu, Huo-Sheng

    2014-03-01

    Age and growth parameters are key parameters in fish stock assessment and management strategies, thus it is crucial to choose an appropriate growth model for a target species. In this study, five growth models were fitted to the length-age data of greater lizard fish Saurida tumbil (n = 2046) collected monthly from December 2006 to July 2009 in the Beibu Gulf, South China Sea. The parameters for each model were estimated using the maximum likelihood method under the assumption of an additive error structure. The adjusted coefficient of determination (R²adj), root mean squared error (RMSE), Akaike's information criterion (AIC), and Bayesian information criterion (BIC) were calculated for each model for fitness selection. The results indicated that the four statistical approaches were consistent in selecting the best growth model. The MMI approach indicated that the generalized VBGF was strongly supported, making up 95.9% of the AIC weight, indicating that this function fitted the length-age data of the greater lizard fish well. The fitted growth function was Lt = 578.49[1 - e^(-0.05(t - 0.14))]^0.36.
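
Assuming the generalized VBGF takes its usual form L_t = L∞[1 - e^(-K(t - t0))]^b, the fitted curve can be evaluated directly from the reported parameter values (the exponent is read here as 0.36 from the abstract's partially garbled formula):

```python
import math

def vbgf_general(t, L_inf, K, t0, b):
    # Generalized von Bertalanffy growth function:
    # L_t = L_inf * (1 - exp(-K * (t - t0)))**b
    return L_inf * (1.0 - math.exp(-K * (t - t0))) ** b

# Parameter values as reported for S. tumbil
params = dict(L_inf=578.49, K=0.05, t0=0.14, b=0.36)
length_age_5 = vbgf_general(5.0, **params)
```

Predicted length increases monotonically with age and approaches the asymptotic length L∞ = 578.49 from below, as the model requires.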

  6. How many parameters in the cosmological models with dark energy? [rapid communication

    NASA Astrophysics Data System (ADS)

    Godłowski, Włodzimierz; Szydłowski, Marek

    2005-09-01

    In cosmology many dramatically different scenarios in the past (big bang versus bounce) and in the future (de Sitter versus big rip) are compatible with the present day observations. These difficulties are called the degeneracy problem. We use the Akaike (AIC) and Bayesian (BIC) information criteria of model selection to avoid this degeneracy and to determine the model, with its set of parameters, that gives the most preferred fit to the data. We consider seven representative scenarios, namely: the ΛCDM model, the CDM model with topological defect, the phantom CDM model, the bouncing ΛCDM model, the bouncing phantom CDM model, the brane ΛCDM model, and the model with the dynamical equation of state parameter linearized around the present epoch. Applying the information criteria to the currently available SNIa data we show that AIC indicates the flat phantom model, while BIC indicates both the flat phantom CDM and flat ΛCDM models. Finally we conclude that the number of essential parameters in the dark energy models compared with the SNIa data is two.

  7. The growth response of ostrich (Struthio camelus var. domesticus) chicks fed on diets with three different dietary protein and amino acid concentrations.

    PubMed

    Carstens, P D; Sharifi, A R; Brand, T S; Hoffman, L C

    2014-01-01

    1. Feeding costs are the largest expense in an ostrich production system, and protein is one of the more expensive components of the diet. This study evaluated the growth response of ostrich chicks on diets containing different concentrations of protein (amino acids). The diets were formulated to contain three concentrations of protein (one diet with 20% less protein than the conventional concentration, L; one diet with the conventional concentration of protein, M and one diet with 20% more protein than the conventional concentration, H) for each of the phase diets. The phase diets were pre-starter, starter, grower and finisher. 2. This study includes the analysis of ostrich body weight (BW) by modelling growth with linear polynomial and non-linear functions for all the data not separated for treatments. In total, 3378 BW recordings of 90 animals were collected weekly from hatch (d 0) to 287 d (41 weeks) of age. 3. Seven non-linear growth models and three linear polynomial models were fitted to the data. The growth functions were compared by using Akaike's information criterion (AIC). For the non-linear models, the Bridges and Janoschek models had the lowest AIC values for the H treatment, while the Richards curve had the lowest value for M and the von Bertalanffy for the L treatment. 4. For the linear polynomial models, the linear polynomial of the third degree had the lowest AIC values for all three treatments, thus making it the most suitable model for the data; therefore, the predictions of this model were used to interpret the growth data. Significant differences were found between treatments for growth data. 5. The results from this study can aid in describing the growth of ostriches subjected to optimum feeding conditions. This information can also be used in research when modelling the nutrient requirements of growing birds.

  9. Modeling the influence of Peromyscus leucopus body mass, sex, and habitat on immature Dermacentor variabilis burden.

    PubMed

    Dallas, Tad A; Foré, Stephanie A; Kim, Hyun-Joo

    2012-12-01

    Immature (larval and nymphal) tick burden on rodents is an important determinant of adult tick population size and is key to understanding infectious disease dynamics. The objective of this research was to build a descriptive model for immature Dermacentor variabilis burden on Peromyscus leucopus. Mice were live-trapped on two permanent grids in an old field and an early successional forest every other month between April and October, 2006-2009. Negative binomial regression was used to examine the association between immature D. variabilis burden and the host-related variables of habitat, body mass, and sex. The model containing all three variables had the lowest Akaike's Information Criterion (AIC) and corrected AIC (AICc), and the greatest AICc weight. Immature D. variabilis burden was higher on mice with greater body mass, on males, and on mice captured in the field habitat. These data are consistent with studies from other tick-rodent systems and suggest that single-factor models do not adequately describe host burden. Variables other than those related to the host may also be important in describing tick burden on rodents. The next step is to examine variables that affect tick development rate and questing behavior. PMID:23181857
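    The model ranking described here rests on AICc and Akaike weights, both of which follow mechanically from each model's maximized log-likelihood. A minimal sketch in Python, with hypothetical log-likelihoods and sample size standing in for the published fits:

```python
import math

def aic(log_lik, k):
    """Akaike's information criterion: AIC = -2*logL + 2k."""
    return -2.0 * log_lik + 2.0 * k

def aicc(log_lik, k, n):
    """Small-sample corrected AIC."""
    return aic(log_lik, k) + (2.0 * k * (k + 1)) / (n - k - 1)

def akaike_weights(criteria):
    """Relative support for each model given its AIC(c) value."""
    best = min(criteria)
    rel = [math.exp(-0.5 * (c - best)) for c in criteria]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical fits: three single-variable models and the full model,
# each with its maximized log-likelihood and parameter count, n = 120 mice.
models = {"habitat": (-310.2, 3), "mass": (-308.9, 3),
          "sex": (-311.5, 3), "habitat+mass+sex": (-301.4, 5)}
n = 120
scores = {name: aicc(ll, k, n) for name, (ll, k) in models.items()}
weights = akaike_weights(list(scores.values()))
```

With these invented numbers the full model has the lowest AICc, mirroring the qualitative result reported in the abstract.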

  10. Comparison of the 7th and proposed 8th editions of the AJCC/UICC TNM staging system for non-small cell lung cancer undergoing radical surgery

    PubMed Central

    Jin, Ying; Chen, Ming; Yu, Xinmin

    2016-01-01

    The present study aims to compare the 7th and the proposed 8th editions of the AJCC/UICC TNM staging system for NSCLC in a cohort of patients from a single institution. A total of 408 patients with NSCLC who underwent radical surgery were analyzed retrospectively. Survival curves were estimated using the Kaplan-Meier method and compared using the log-rank test. Multivariate analysis was performed with the Cox proportional hazards model. The Akaike information criterion (AIC) and C-index were applied to compare the two prognostic systems with different numbers of stages. The 7th AJCC T categories, the proposed 8th AJCC T categories, N categories, visceral pleural invasion, and vessel invasion were found to have statistically significant associations with disease-free survival (DFS) on univariate analysis. In the 7th edition staging system as well as in the proposed 8th edition, T categories, N categories, and pleural invasion were independent factors for DFS on multivariate analysis. The AIC value was smaller, and the C-index value larger, for the 8th edition than for the 7th edition staging system. Based on the data from our single center, the proposed 8th AJCC T classification seems to be superior to the 7th AJCC T classification in terms of DFS for patients with NSCLC who underwent radical surgery. PMID:27641932

  11. The Hyper-Envelope Modeling Interface (HEMI): A Novel Approach Illustrated Through Predicting Tamarisk (Tamarix spp.) Habitat in the Western USA

    USGS Publications Warehouse

    Graham, Jim; Young, Nick; Jarnevich, Catherine S.; Newman, Greg; Evangelista, Paul; Stohlgren, Thomas J.

    2013-01-01

    Habitat suitability maps are commonly created by modeling a species' environmental niche from occurrences and environmental characteristics. Here, we introduce the hyper-envelope modeling interface (HEMI), providing a new method for creating habitat suitability models that uses Bezier surfaces to model a species' niche in environmental space. HEMI allows modeled surfaces to be visualized and edited in environmental space based on expert knowledge and does not require absence points for model development. The modeled surfaces require relatively few parameters compared to similar modeling approaches and may produce models that better match ecological niche theory. As a case study, we modeled the invasive species tamarisk (Tamarix spp.) in the western USA. We compare results from HEMI with those from existing similar modeling approaches (including BioClim, BioMapper, and Maxent). We used synthetic surfaces to create visualizations of the various models in environmental space, and used a modified area under the curve (AUC) statistic and the Akaike information criterion (AIC) as measures of model performance. We show that HEMI produced slightly better AUC values than all approaches except Maxent, and better AIC values overall. HEMI created a model with only ten parameters, while Maxent produced a model with over 100 and BioClim used only eight. Additionally, HEMI allowed visualization and editing of the model in environmental space to develop alternative potential habitat scenarios. The use of Bezier surfaces can provide simple models that match our expectations of biological niche models and, at least in some cases, out-perform more complex approaches.
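    AIC's parameter penalty is what drives the HEMI-versus-Maxent comparison above: a model carrying over 100 parameters must buy a correspondingly large log-likelihood improvement to win. A toy illustration in Python, with invented log-likelihoods (only the parameter counts come from the abstract):

```python
# Hypothetical maximized log-likelihoods; parameter counts as reported
# (BioClim 8, HEMI 10, Maxent > 100 -- 105 is an assumed stand-in).
candidates = {"BioClim": (-520.0, 8), "HEMI": (-515.0, 10), "Maxent": (-505.0, 105)}

# AIC = -2*logL + 2k: the better raw fit of the 105-parameter model
# is swamped by its penalty term.
aic = {name: -2.0 * ll + 2.0 * k for name, (ll, k) in candidates.items()}
best = min(aic, key=aic.get)
```

Under these invented likelihoods the ten-parameter model ranks first, the same qualitative outcome the study reports.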

  12. Double point source W-phase inversion: Real-time implementation and automated model selection

    NASA Astrophysics Data System (ADS)

    Nealy, Jennifer L.; Hayes, Gavin P.

    2015-12-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever-evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique that can be efficiently implemented in real time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion, with centroid locations fixed at the single source solution location, can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is better described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed, with extended analyses of the September 29, 2009 magnitude 8.1 Samoa earthquake and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquake. The AIC test is shown to accurately select the most appropriate model, and the selected W-phase inversion is shown to yield reliable solutions that match published analyses of the same events.
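    The single-versus-double source decision described here is a standard AIC comparison: the double-source model is retained only when its improved waveform fit outweighs its roughly doubled parameter count. A sketch under Gaussian-error assumptions, where AIC can be written from the residual sum of squares; the parameter counts below are illustrative, not the ones used operationally:

```python
import math

def aic_from_rss(rss, n, k):
    """Gaussian-error AIC (up to an additive constant): n*ln(RSS/n) + 2k."""
    return n * math.log(rss / n) + 2 * k

def prefer_double(rss_single, rss_double, n, k_single=6, k_double=12):
    """Keep the double point source model only if its AIC is lower,
    i.e. the fit improvement pays for the extra parameters.
    k_single/k_double are assumed counts for illustration."""
    return aic_from_rss(rss_double, n, k_double) < aic_from_rss(rss_single, n, k_single)
```

A halved residual easily justifies the second source, while a marginal improvement does not.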

  13. Breeding bird diversity in relation to environmental gradients in China

    NASA Astrophysics Data System (ADS)

    Qian, Hong; Wang, Silong; Li, Yuanliang; Wang, Xihua

    2009-11-01

    Geographic variation in species richness has been explained by different theories, including energy, productivity, energy-water balance, habitat heterogeneity, and freezing tolerance. This study determines which of these theories best accounts for gradients of breeding bird richness in China. In addition, we develop a best-fit model for the relationship between breeding bird richness and environment in China. Breeding bird species richness in 207 localities (3271 km² per locality on average) from across China was related to thirteen environmental variables after accounting for sampling area. Akaike's information criterion (AIC) was used to evaluate model performance. We used Moran's I to determine the magnitude of spatial autocorrelation in model residuals, and used a simultaneous autoregressive model to determine coefficients of determination and AIC of explanatory variables after accounting for residual spatial autocorrelation. Of all the environmental variables examined, the normalized difference vegetation index, a measure of plant productivity, best explains the variance in breeding bird richness. We found that species richness of breeding birds at the scale examined is best predicted by a combination of plant productivity, elevation range, seasonal variation in potential evapotranspiration, and mean annual temperature. These variables explained 47.3% of the variance in breeding bird richness after accounting for sampling area; most of the explained variance in richness is attributable to the first two of the four variables.

  14. Double point source W-phase inversion: Real-time implementation and automated model selection

    USGS Publications Warehouse

    Nealy, Jennifer; Hayes, Gavin

    2015-01-01

    Rapid and accurate characterization of an earthquake source is an extremely important and ever-evolving field of research. Within this field, source inversion of the W-phase has recently been shown to be an effective technique that can be efficiently implemented in real time. An extension to the W-phase source inversion is presented in which two point sources are derived to better characterize complex earthquakes. A single source inversion followed by a double point source inversion, with centroid locations fixed at the single source solution location, can be efficiently run as part of earthquake monitoring network operational procedures. In order to determine the most appropriate solution, i.e., whether an earthquake is better described by a single source or a double source, an Akaike information criterion (AIC) test is performed. Analyses of all earthquakes of magnitude 7.5 and greater occurring since January 2000 were performed, with extended analyses of the September 29, 2009 magnitude 8.1 Samoa earthquake and the April 19, 2014 magnitude 7.5 Papua New Guinea earthquake. The AIC test is shown to accurately select the most appropriate model, and the selected W-phase inversion is shown to yield reliable solutions that match published analyses of the same events.

  15. Modeling the influence of Peromyscus leucopus body mass, sex, and habitat on immature Dermacentor variabilis burden.

    PubMed

    Dallas, Tad A; Foré, Stephanie A; Kim, Hyun-Joo

    2012-12-01

    Immature (larval and nymphal) tick burden on rodents is an important determinant of adult tick population size and is key to understanding infectious disease dynamics. The objective of this research was to build a descriptive model for immature Dermacentor variabilis burden on Peromyscus leucopus. Mice were live-trapped on two permanent grids in an old field and an early successional forest every other month between April and October, 2006-2009. Negative binomial regression was used to examine the association between immature D. variabilis burden and the host-related variables of habitat, body mass, and sex. The model containing all three variables had the lowest Akaike's Information Criterion (AIC) and corrected AIC (AICc), and the greatest AICc weight. Immature D. variabilis burden was higher on mice with greater body mass, on males, and on mice captured in the field habitat. These data are consistent with studies from other tick-rodent systems and suggest that single-factor models do not adequately describe host burden. Variables other than those related to the host may also be important in describing tick burden on rodents. The next step is to examine variables that affect tick development rate and questing behavior.

  16. An Investigation of State-Space Model Fidelity for SSME Data

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2008-01-01

    In previous studies, a variety of unsupervised anomaly detection techniques were applied to SSME (Space Shuttle Main Engine) data. The observed results indicated that the identification of certain anomalies was specific to the algorithmic method under consideration. For this reason, one of the follow-on goals of those investigations was to build an architecture that supports the best capabilities of all the algorithms. We address that goal here by investigating a cascaded, serial architecture for the best-performing and most suitable candidates from previous studies. As a precursor to a formal ROC (Receiver Operating Characteristic) curve analysis for validation of the resulting anomaly detection algorithms, our primary focus here is to investigate model fidelity as measured by variants of the AIC (Akaike Information Criterion) for state-space based models. We show that placing constraints on a state-space model during or after training introduces a modest level of suboptimality. Furthermore, we compare the fidelity of all candidate models, including those embodying the cascaded, serial architecture. We make recommendations on the most suitable candidates for subsequent anomaly detection studies as measured by AIC-based criteria.

  17. Marginal Likelihood Estimate Comparisons to Obtain Optimal Species Delimitations in Silene sect. Cryptoneurae (Caryophyllaceae)

    PubMed Central

    Aydin, Zeynep; Marcussen, Thomas; Ertekin, Alaattin Selcuk; Oxelman, Bengt

    2014-01-01

    Coalescent-based inference of phylogenetic relationships among species takes into account gene tree incongruence due to incomplete lineage sorting, but for such methods to make sense, species have to be correctly delimited. Because alternative assignments of individuals to species result in different parametric models, model selection methods can be applied to optimise the species classification model. In a Bayesian framework, Bayes factors (BF), based on marginal likelihood estimates, can be used to test a range of possible classifications for the group under study. Here, we explore BF and the Akaike Information Criterion (AIC) to discriminate between different species classifications in the flowering plant lineage Silene sect. Cryptoneurae (Caryophyllaceae). We estimated marginal likelihoods for different species classification models via the Path Sampling (PS), Stepping Stone sampling (SS), and Harmonic Mean Estimator (HME) methods implemented in BEAST. To select among alternative species classification models, a posterior simulation-based analog of the AIC computed through Markov chain Monte Carlo analysis (AICM) was also used. The results are compared to outcomes from the software BP&P. Our results agree with another recent study in finding that marginal likelihood estimates from the PS and SS methods are useful for comparing different species classifications, and they strongly support the recognition of the newly described species S. ertekinii. PMID:25216034

  18. Age and growth of the round stingray Urotrygon rogersi, a particularly fast-growing and short-lived elasmobranch.

    PubMed

    Mejía-Falla, Paola A; Cortés, Enric; Navia, Andrés F; Zapata, Fernando A

    2014-01-01

    We examined the age and growth of Urotrygon rogersi on the Colombian coast of the Eastern Tropical Pacific Ocean by directly estimating age from vertebral centra. We verified annual deposition of growth increments with marginal increment analysis. Eight growth curves were fitted to four data sets defined on the basis of the reproductive cycle (unadjusted or adjusted for age at first band) and size variable (disc width or total length). Model performance was evaluated using Akaike's Information Criterion (AIC), AIC weights and multi-model inference criteria. A two-phase growth function with adjusted age provided the best description of growth for females (based on five parameters, DW∞ = 20.1 cm, k = 0.22 yr⁻¹) and males (based on four and five parameters, DW∞ = 15.5 cm, k = 0.65 yr⁻¹). Median maturity of female and male U. rogersi is reached very early (mean ± SE = 1.0 ± 0.1 years). This is the first age and growth study for a species of the genus Urotrygon, and the results indicate that U. rogersi attains a smaller maximum size and has a shorter lifespan and lower median age at maturity than species of closely related genera. These life history traits contrast with those typically reported for other elasmobranchs. PMID:24776963
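    Multi-model inference as used here averages a parameter such as DW∞ across candidate curves, weighting each by its Akaike weight, rather than trusting a single "best" model. A minimal sketch with invented AIC scores and estimates (three models standing in for the eight fitted curves):

```python
import math

# Hypothetical AIC scores and DW-infinity estimates (cm) from three
# candidate growth curves; these numbers are for illustration only.
aics = [310.2, 312.0, 315.4]
dw_inf = [20.1, 20.5, 19.8]

# Akaike weights: relative support for each model in the candidate set.
deltas = [a - min(aics) for a in aics]
rel_lik = [math.exp(-0.5 * d) for d in deltas]
weights = [r / sum(rel_lik) for r in rel_lik]

# Model-averaged estimate: each model contributes according to its weight.
dw_avg = sum(w * dw for w, dw in zip(weights, dw_inf))
```

The averaged estimate sits between the individual estimates, pulled toward the best-supported model.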

  19. Evaluation of viral load thresholds for predicting new WHO Stage 3 and 4 events in HIV-infected children receiving highly active antiretroviral therapy

    PubMed Central

    Siberry, George K; Harris, D. Robert; Oliveira, Ricardo Hugo; Krauss, Margot R.; Hofer, Cristina B.; Tiraboschi, Adriana Aparecida; Marques, Heloisa; Succi, Regina C.; Abreu, Thalita; Negra, Marinella Della; Mofenson, Lynne M.; Hazra, Rohan

    2012-01-01

    Background: This study evaluated a wide range of viral load (VL) thresholds to identify the cut-point that best predicts new clinical events in children on stable highly active antiretroviral therapy (HAART). Methods: Cox proportional hazards modeling was used to assess the adjusted risk of World Health Organization stage 3 or 4 clinical events (WHO events) as a function of time-varying CD4, VL, and hemoglobin values in a cohort study of Latin American children on HAART for ≥ 6 months. Models were fit using different VL cut-points between 400 and 50,000 copies/mL, with model fit evaluated on the basis of the minimum Akaike Information Criterion (AIC) value, a standard model fit statistic. Results: Models were based on 67 subjects with WHO events out of 550 subjects on study. The VL cut-points of > 2,600 copies/mL and > 32,000 copies/mL corresponded to the lowest AIC values and were associated with the highest hazard ratios [2.0 (p = 0.015) and 2.1 (p = 0.0058), respectively] for WHO events. Conclusions: In HIV-infected Latin American children on stable HAART, two distinct VL thresholds (> 2,600 copies/mL and > 32,000 copies/mL) were identified for predicting children at significantly increased risk of HIV-related clinical illness, after accounting for CD4 level, hemoglobin level, and other significant factors. PMID:22343177
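    The cut-point search amounts to refitting the model at each candidate threshold and keeping the threshold with the lowest AIC. A simplified stand-in (a two-rate Bernoulli event model on synthetic data, instead of the time-varying Cox model) illustrates the mechanics:

```python
import math

def bernoulli_loglik(events, total):
    """Log-likelihood of a constant event rate within one VL stratum."""
    if events in (0, total):
        return 0.0
    p = events / total
    return events * math.log(p) + (total - events) * math.log(1 - p)

def threshold_aic(data, cut):
    """data: list of (viral_load, had_event). Fit one event rate below
    the cut and one above it; AIC = -2*logL + 2k with k = 2 rates."""
    lo = [e for v, e in data if v <= cut]
    hi = [e for v, e in data if v > cut]
    ll = bernoulli_loglik(sum(lo), len(lo)) + bernoulli_loglik(sum(hi), len(hi))
    return -2.0 * ll + 2 * 2

# Synthetic cohort built so events become common above ~2,600 copies/mL.
data = ([(300, 1)] * 3 + [(300, 0)] * 37 + [(1000, 1)] * 1 + [(1000, 0)] * 9
        + [(5000, 1)] * 8 + [(5000, 0)] * 12)
cuts = [400, 2600, 32000]
best = min(cuts, key=lambda c: threshold_aic(data, c))
```

On this synthetic data the 2,600 copies/mL cut minimizes the AIC; the study performed the analogous scan with the full Cox model.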

  20. Comparison of the 7th and proposed 8th editions of the AJCC/UICC TNM staging system for non-small cell lung cancer undergoing radical surgery.

    PubMed

    Jin, Ying; Chen, Ming; Yu, Xinmin

    2016-01-01

    The present study aims to compare the 7th and the proposed 8th editions of the AJCC/UICC TNM staging system for NSCLC in a cohort of patients from a single institution. A total of 408 patients with NSCLC who underwent radical surgery were analyzed retrospectively. Survival curves were estimated using the Kaplan-Meier method and compared using the log-rank test. Multivariate analysis was performed with the Cox proportional hazards model. The Akaike information criterion (AIC) and C-index were applied to compare the two prognostic systems with different numbers of stages. The 7th AJCC T categories, the proposed 8th AJCC T categories, N categories, visceral pleural invasion, and vessel invasion were found to have statistically significant associations with disease-free survival (DFS) on univariate analysis. In the 7th edition staging system as well as in the proposed 8th edition, T categories, N categories, and pleural invasion were independent factors for DFS on multivariate analysis. The AIC value was smaller, and the C-index value larger, for the 8th edition than for the 7th edition staging system. Based on the data from our single center, the proposed 8th AJCC T classification seems to be superior to the 7th AJCC T classification in terms of DFS for patients with NSCLC who underwent radical surgery. PMID:27641932

  1. Age and Growth of the Round Stingray Urotrygon rogersi, a Particularly Fast-Growing and Short-Lived Elasmobranch

    PubMed Central

    Mejía-Falla, Paola A.; Cortés, Enric; Navia, Andrés F.; Zapata, Fernando A.

    2014-01-01

    We examined the age and growth of Urotrygon rogersi on the Colombian coast of the Eastern Tropical Pacific Ocean by directly estimating age from vertebral centra. We verified annual deposition of growth increments with marginal increment analysis. Eight growth curves were fitted to four data sets defined on the basis of the reproductive cycle (unadjusted or adjusted for age at first band) and size variable (disc width or total length). Model performance was evaluated using Akaike's Information Criterion (AIC), AIC weights and multi-model inference criteria. A two-phase growth function with adjusted age provided the best description of growth for females (based on five parameters, DW∞ = 20.1 cm, k = 0.22 yr⁻¹) and males (based on four and five parameters, DW∞ = 15.5 cm, k = 0.65 yr⁻¹). Median maturity of female and male U. rogersi is reached very early (mean ± SE = 1.0 ± 0.1 years). This is the first age and growth study for a species of the genus Urotrygon, and the results indicate that U. rogersi attains a smaller maximum size and has a shorter lifespan and lower median age at maturity than species of closely related genera. These life history traits contrast with those typically reported for other elasmobranchs. PMID:24776963

  2. Usefulness of information criteria for the selection of calibration curves.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Hubert, Ph

    2013-07-01

    The reliability of analytical results obtained with quantitative analytical methods depends strongly on the selection of an adequate model for the calibration curve. The most widely used and best-known criterion for selecting the response function or model is the coefficient of determination R². However, it is well known to suffer from many drawbacks, such as a tendency to overfit the data. One proposed solution is the adjusted determination coefficient R²adj, which aims to reduce this problem. Another family of criteria also exists for selecting an adequate model: the information criteria AIC, AICc, and BIC. These criteria have rarely been used in analytical chemistry to select calibration curves. This work assesses the performance of the statistical information criteria, as well as R² and R²adj, for the selection of an adequate calibration curve. They are applied to several analytical methods, covering liquid chromatographic as well as electrophoretic methods involved in the analysis of active substances in biological fluids or aimed at quantifying impurities in drug substances. In addition, Monte Carlo simulations are performed to assess the efficacy of these statistical criteria in selecting the adequate calibration curve.
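    The contrast between R², R²adj, and the information criteria can be made concrete for a least-squares calibration fit: R² always favors the larger model, while AICc and BIC charge for every extra coefficient. A sketch with hypothetical residual sums of squares for an 8-point calibration:

```python
import math

def criteria(rss, n, k):
    """AIC, small-sample AICc, and BIC for a least-squares fit with
    k estimated coefficients (Gaussian-error form, up to a constant)."""
    aic = n * math.log(rss / n) + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)
    bic = n * math.log(rss / n) + k * math.log(n)
    return aic, aicc, bic

def r2_pair(rss, tss, n, k):
    """Coefficient of determination and its adjusted counterpart."""
    r2 = 1 - rss / tss
    return r2, 1 - (1 - r2) * (n - 1) / (n - k)

# Hypothetical 8-standard calibration: the quadratic term shaves the
# residual sum of squares only slightly, so the criteria keep the line.
n, tss = 8, 100.0
lin = criteria(2.0, n, 2)   # straight-line model, k = 2
quad = criteria(1.7, n, 3)  # quadratic model, k = 3
```

Here R² necessarily rises for the quadratic, yet AICc and BIC both prefer the straight line: the small calibration sample makes the per-parameter penalty decisive.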

  3. A procedure for seiche analysis with Bayesian information criterion

    NASA Astrophysics Data System (ADS)

    Aichi, Masaatsu

    2016-04-01

    A seiche is a standing wave in an enclosed or semi-enclosed water body. Its amplitude changes irregularly in time due to weather conditions and other factors, so extracting the seiche signal is not easy with the usual methods for time series analysis, such as the fast Fourier transform (FFT). In this study, a new method for time series analysis based on a Bayesian information criterion was developed to decompose seiche, tide, long-term trend, and residual components from the time series data of tide stations. The method is based on maximum marginal likelihood estimation of the tide amplitudes, seiche amplitude, and trend components. The seiche amplitude and trend components were assumed to change gradually, i.e., their second derivatives in time were close to zero; these assumptions were incorporated as prior distributions. The variances of the prior distributions were estimated by minimizing the Akaike Bayesian information criterion (ABIC). The frequency of the seiche was determined by Newton's method, with an initial guess obtained by FFT. The accuracy of the proposed method was checked by analyzing synthetic time series data composed of known components; the original components were reproduced quite well. The proposed method was also applied to actual time series data of the sea level observed by a tide station and of the strain of coastal rock masses observed by a fiber Bragg grating sensor in Aburatsubo Bay, Japan. The seiche in the bay and the corresponding response of the rock masses were successfully extracted.

  4. Prediction of Vigilant Attention and Cognitive Performance Using Self-Reported Alertness, Circadian Phase, Hours since Awakening, and Accumulated Sleep Loss.

    PubMed

    Bermudez, Eduardo B; Klerman, Elizabeth B; Czeisler, Charles A; Cohen, Daniel A; Wyatt, James K; Phillips, Andrew J K

    2016-01-01

    Sleep restriction causes impaired cognitive performance that can result in adverse consequences in many occupational settings. Individuals may rely on self-perceived alertness to decide if they are able to adequately perform a task. It is therefore important to determine the relationship between an individual's self-assessed alertness and their objective performance, and how this relationship depends on circadian phase, hours since awakening, and cumulative lost hours of sleep. Healthy young adults (aged 18-34) completed an inpatient schedule that included forced desynchrony of sleep/wake and circadian rhythms with twelve 42.85-hour "days" and either a 1:2 (n = 8) or 1:3.3 (n = 9) ratio of sleep-opportunity:enforced-wakefulness. We investigated whether subjective alertness (visual analog scale), circadian phase (melatonin), hours since awakening, and cumulative sleep loss could predict objective performance on the Psychomotor Vigilance Task (PVT), an Addition/Calculation Test (ADD) and the Digit Symbol Substitution Test (DSST). Mathematical models that allowed nonlinear interactions between explanatory variables were evaluated using the Akaike Information Criterion (AIC). Subjective alertness was the single best predictor of PVT, ADD, and DSST performance. Subjective alertness alone, however, was not an accurate predictor of PVT performance. The best AIC scores for PVT and DSST were achieved when all explanatory variables were included in the model. The best AIC score for ADD was achieved with circadian phase and subjective alertness variables. We conclude that subjective alertness alone is a weak predictor of objective vigilant or cognitive performance. Predictions can, however, be improved by knowing an individual's circadian phase, current wake duration, and cumulative sleep loss. PMID:27019198

  5. Prediction of Vigilant Attention and Cognitive Performance Using Self-Reported Alertness, Circadian Phase, Hours since Awakening, and Accumulated Sleep Loss

    PubMed Central

    Bermudez, Eduardo B.; Klerman, Elizabeth B.; Czeisler, Charles A.; Cohen, Daniel A.; Wyatt, James K.; Phillips, Andrew J. K.

    2016-01-01

    Sleep restriction causes impaired cognitive performance that can result in adverse consequences in many occupational settings. Individuals may rely on self-perceived alertness to decide if they are able to adequately perform a task. It is therefore important to determine the relationship between an individual’s self-assessed alertness and their objective performance, and how this relationship depends on circadian phase, hours since awakening, and cumulative lost hours of sleep. Healthy young adults (aged 18–34) completed an inpatient schedule that included forced desynchrony of sleep/wake and circadian rhythms with twelve 42.85-hour “days” and either a 1:2 (n = 8) or 1:3.3 (n = 9) ratio of sleep-opportunity:enforced-wakefulness. We investigated whether subjective alertness (visual analog scale), circadian phase (melatonin), hours since awakening, and cumulative sleep loss could predict objective performance on the Psychomotor Vigilance Task (PVT), an Addition/Calculation Test (ADD) and the Digit Symbol Substitution Test (DSST). Mathematical models that allowed nonlinear interactions between explanatory variables were evaluated using the Akaike Information Criterion (AIC). Subjective alertness was the single best predictor of PVT, ADD, and DSST performance. Subjective alertness alone, however, was not an accurate predictor of PVT performance. The best AIC scores for PVT and DSST were achieved when all explanatory variables were included in the model. The best AIC score for ADD was achieved with circadian phase and subjective alertness variables. We conclude that subjective alertness alone is a weak predictor of objective vigilant or cognitive performance. Predictions can, however, be improved by knowing an individual’s circadian phase, current wake duration, and cumulative sleep loss. PMID:27019198

  6. Comparison of Regression Methods to Compute Atmospheric Pressure and Earth Tidal Coefficients in Water Level Associated with Wenchuan Earthquake of 12 May 2008

    NASA Astrophysics Data System (ADS)

    He, Anhua; Singh, Ramesh P.; Sun, Zhaohua; Ye, Qing; Zhao, Gang

    2016-07-01

    Earth tides, atmospheric pressure, precipitation, and earthquakes all influence water well levels; earthquakes in particular have a large impact, and anomalous co-seismic changes in ground water levels have been observed. In this paper, we have used four different models, simple linear regression (SLR), multiple linear regression (MLR), principal component analysis (PCA) and partial least squares (PLS), to compute the atmospheric pressure and earth tidal effects on water level. Furthermore, we have used the Akaike information criterion (AIC) to study the performance of the various models. Based on the lowest AIC and sum of squared errors, the best estimate of the effects of atmospheric pressure and earth tide on water level is obtained with the MLR model. However, the MLR model does not address multicollinearity among the inputs, and as a result its atmospheric pressure and earth tidal response coefficients fail to reflect the mechanisms associated with the groundwater level fluctuations. Among the models that resolve the serious multicollinearity of the inputs, the PLS model shows the minimum AIC value, and its atmospheric pressure and earth tidal response coefficients agree closely with the observations. The atmospheric pressure and earth tidal response coefficients are found to be sensitive to the stress-strain state, based on the observed data for the period 1 April to 8 June 2008 from the Chuan 03# well. The transient enhancement of the porosity of the rock mass around the Chuan 03# well associated with the Wenchuan earthquake (Mw = 7.9, 12 May 2008), which returned to its original pre-seismic level after 13 days, indicates that the co-seismic sharp rise of the water level could have been induced by static stress change rather than by the development of new fractures.

  7. A Test of the DSM-5 Severity Scale for Alcohol Use Disorder

    PubMed Central

    Fazzino, Tera L.; Rose, Gail L.; Burt, Keith B.; Helzer, John E.

    2014-01-01

    BACKGROUND: For the DSM-5-defined alcohol use disorder (AUD) diagnosis, a tri-categorized scale that designates mild, moderate, and severe AUD was selected over a fully dimensional scale to represent AUD severity. The purpose of this study was to test whether the DSM-5-defined AUD severity measure predicts alcohol use following a brief intervention as proficiently as a fully dimensional scale. METHODS: Heavy-drinking primary care patients (N=246) received a physician-delivered brief intervention (BI) and then reported daily alcohol consumption for six months using an Interactive Voice Response (IVR) system. The dimensional AUD measure we constructed was a summation of all AUD criteria met at baseline (mean = 6.5; SD = 2.5). A multi-model inference technique was used to determine whether the DSM-5 tri-categorized severity measure or the dimensional approach would provide a more precise prediction of change in weekly alcohol consumption following a BI. RESULTS: The Akaike information criterion (AIC) for the dimensional AUD model (AIC=7623.88) was four points lower than that for the tri-categorized model (AIC=7627.88), and weight-of-evidence calculations indicated an 88% likelihood that the dimensional model was the better approximating model. The dimensional model significantly predicted change in alcohol consumption (p = .04), whereas the DSM-5 tri-categorized model did not. CONCLUSION: The dimensional AUD measure was superior, detecting treatment effects that were not apparent with the tri-categorized severity model as defined by the DSM-5. We recommend using a dimensional measure for determining AUD severity. PMID:24893979
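    The reported 88% likelihood follows directly from the two AIC values via Akaike weights, a quick check in Python:

```python
import math

# AIC values reported for the dimensional and tri-categorized AUD models.
aic_dim, aic_cat = 7623.88, 7627.88

# Akaike weights: exp(-delta/2) for each model, normalized to sum to 1.
best = min(aic_dim, aic_cat)
rel_dim = math.exp(-0.5 * (aic_dim - best))
rel_cat = math.exp(-0.5 * (aic_cat - best))
w_dim = rel_dim / (rel_dim + rel_cat)
w_cat = rel_cat / (rel_dim + rel_cat)
```

A four-point AIC difference always yields about an 88%/12% split, regardless of the absolute AIC magnitudes.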

  8. Spatial Distribution of Black Bear Incident Reports in Michigan.

    PubMed

    McFadden-Hiller, Jamie E; Beyer, Dean E; Belant, Jerrold L

    2016-01-01

    Interactions between humans and carnivores have existed for centuries due to competition for food and space. American black bears are increasing in abundance and populations are expanding geographically in many portions of their range, including areas that are also increasing in human density, often resulting in associated increases in human-bear conflict (hereafter, bear incidents). We used public reports of bear incidents in Michigan, USA, from 2003-2011 to assess the relative contributions of ecological and anthropogenic variables in explaining the spatial distribution of bear incidents and estimated the potential risk of bear incidents. We used weighted Normalized Difference Vegetation Index mean as an index of primary productivity, region (i.e., Upper Peninsula or Lower Peninsula), primary and secondary road densities, and percentage land cover type within 6.5-km2 circular buffers around bear incidents and random points. We developed 22 a priori models and used generalized linear models and Akaike's Information Criterion (AIC) to rank models. The global model was the best compromise between model complexity and model fit (w = 0.99), with a ΔAIC of 8.99 units from the second best performing model. We found that as deciduous forest cover increased, the probability of bear incident occurrence increased. Among the measured anthropogenic variables, cultivated crops and primary roads were the most important in our AIC-best model and were both positively related to the probability of bear incident occurrence. The spatial distribution of relative bear incident risk varied markedly throughout Michigan. Forest cover fragmented with agriculture and other anthropogenic activities presents an environment that likely facilitates bear incidents. Our map can help wildlife managers identify areas of bear incident occurrence, which in turn can be used to help develop strategies aimed at reducing incidents. Researchers and wildlife managers can use similar mapping techniques to

  10. Anterior insular cortex and emotional awareness.

    PubMed

    Gu, Xiaosi; Hof, Patrick R; Friston, Karl J; Fan, Jin

    2013-10-15

    This paper reviews the foundation for a role of the human anterior insular cortex (AIC) in emotional awareness, defined as the conscious experience of emotions. We first introduce the neuroanatomical features of AIC and existing findings on emotional awareness. Using empathy, the awareness and understanding of other people's emotional states, as a test case, we then present evidence to demonstrate: 1) AIC and anterior cingulate cortex (ACC) are commonly coactivated as revealed by a meta-analysis, 2) AIC is functionally dissociable from ACC, 3) AIC integrates stimulus-driven and top-down information, and 4) AIC is necessary for emotional awareness. We propose a model in which AIC serves two major functions: integrating bottom-up interoceptive signals with top-down predictions to generate a current awareness state and providing descending predictions to visceral systems that provide a point of reference for autonomic reflexes. We argue that AIC is critical and necessary for emotional awareness.

  11. Anterior Insular Cortex and Emotional Awareness

    PubMed Central

    Gu, Xiaosi; Hof, Patrick R.; Friston, Karl J.; Fan, Jin

    2014-01-01

    This paper reviews the foundation for a role of the human anterior insular cortex (AIC) in emotional awareness, defined as the conscious experience of emotions. We first introduce the neuroanatomical features of AIC and existing findings on emotional awareness. Using empathy, the awareness and understanding of other people’s emotional states, as a test case, we then present evidence to demonstrate: 1) AIC and anterior cingulate cortex (ACC) are commonly coactivated as revealed by a meta-analysis, 2) AIC is functionally dissociable from ACC, 3) AIC integrates stimulus-driven and top-down information, and 4) AIC is necessary for emotional awareness. We propose a model in which AIC serves two major functions: integrating bottom-up interoceptive signals with top-down predictions to generate a current awareness state and providing descending predictions to visceral systems that provide a point of reference for autonomic reflexes. We argue that AIC is critical and necessary for emotional awareness. PMID:23749500

  12. Information Economics: Valuing Information.

    ERIC Educational Resources Information Center

    Brinberg, Herbert R.

    1989-01-01

    Addresses the question of why previous articles and studies on the value of information have failed to provide meaningful techniques for measuring that value. The discussion covers four principal causes for confusion surrounding the valuation of information and draws conclusions about the value added model of information. (seven references) (CLB)

  13. Theorizing Information for Information Science.

    ERIC Educational Resources Information Center

    Cornelius, Ian

    2002-01-01

    Considers whether information science has a theory of information. Highlights include guides to information and its theory; constructivism; information outside information science; process theories; cognitive views of information; measuring information; meaning; and misinformation. (Contains 89 references.) (LRW)

  14. Average Information Content Maximization—A New Approach for Fingerprint Hybridization and Reduction

    PubMed Central

    Śmieja, Marek; Warszycki, Dawid

    2016-01-01

    Fingerprints, bit representations of compound chemical structure, have been widely used in cheminformatics for many years. Although fingerprints with the highest resolution display satisfactory performance in virtual screening campaigns, the presence of a relatively high number of irrelevant bits introduces noise into data and makes their application more time-consuming. In this study, we present a new method of hybrid reduced fingerprint construction, the Average Information Content Maximization algorithm (AIC-Max algorithm), which selects the most informative bits from a collection of fingerprints. This methodology, applied to the ligands of five cognate serotonin receptors (5-HT2A, 5-HT2B, 5-HT2C, 5-HT5A, 5-HT6), proved that 100 bits selected from four non-hashed fingerprints reflect almost all structural information required for a successful in silico discrimination test. A classification experiment indicated that a reduced representation is able to achieve even slightly better performance than the state-of-the-art 10-times-longer fingerprints and in a significantly shorter time. PMID:26784447

  15. Some novel growth functions and their application with reference to growth in ostrich.

    PubMed

    Faridi, A; López, S; Ammar, H; Salwa, K S; Golian, A; Thornley, J H M; France, J

    2015-06-01

    Four novel growth functions, namely, Pareto, extreme value distribution (EVD), Lomolino, and cumulative β-P distribution (CBP), are derived, and their ability to describe ostrich growth curves is evaluated. The functions were compared with standard growth equations, namely, the monomolecular, Michaelis-Menten (MM), Gompertz, Richards, and generalized MM (gMM). For this purpose, 2 separate comparisons were conducted. In the first, all the functions were fitted to 40 individual growth curves (5 males and 35 females) of ostriches using nonlinear regression. In the second, performance of the functions was assessed when data from 71 individuals were composited (570 data points). This comparison was undertaken using nonlinear mixed models and considering 3 approaches: 1) models with no random effect, 2) random effect incorporated as the intercept, and 3) random effect incorporated into the asymptotic weight parameter (Wf). The results from the first comparison showed that the functions generally gave acceptable values of R2 and residual variance. On the basis of the Akaike information criterion (AIC), CBP gave the best fit, whereas the Gompertz and Lomolino equations were the preferred functions on the basis of corrected AIC (AICc). Bias, accuracy factor, the Durbin-Watson statistic, and the number of runs of sign were used to analyze the residuals. CBP gave the best distribution of residuals but also produced more residual autocorrelation (significant Durbin-Watson statistic). The functions were applied to sample data for a more conventional farm species (2 breeds of cattle) to verify the results of the comparison of fit among functions and their applicability across species. In the second comparison, analysis of mixed models showed that incorporation of a random effect into Wf gave the best fit, resulting in smaller AIC and AICc values compared with those in the other 2 approaches. On the basis of AICc, best fit was achieved with CBP, followed by gMM, Lomolino, and

  16. Information Inquiry.

    ERIC Educational Resources Information Center

    Callison, Daniel

    2002-01-01

    Discusses the components of information inquiry that are necessary to meet basic information and media literacy skills. Highlights include questioning; exploration; assimilation; inference; reflection; information environments, including school, workplace, and personal; information needs; information problems; and literacy and fluency. (LRW)

  17. Identification of sorption processes and parameters for radionuclide transport in fractured rock

    NASA Astrophysics Data System (ADS)

    Dai, Zhenxue; Wolfsberg, Andrew; Reimus, Paul; Deng, Hailin; Kwicklis, Edward; Ding, Mei; Ware, Doug; Ye, Ming

    2012-01-01

    Summary: Identification of chemical reaction processes in subsurface environments is a key issue for reactive transport modeling because simulating different processes requires developing different chemical-mathematical models. In this paper, two sorption processes (equilibrium and kinetics) are considered for modeling neptunium and uranium sorption in fractured rock. Based on different conceptualizations of the two processes occurring in fracture and/or matrix media, seven dual-porosity, multi-component reactive transport models are developed. The process models are identified with a stepwise strategy by using multi-tracer concentration data obtained from a series of transport experiments. In the first step, breakthrough data of a conservative tracer (tritium) obtained from four experiments are used to estimate the flow and non-reactive transport parameters (i.e., mean fluid residence time in fracture, fracture aperture, and matrix tortuosity) common to all the reactive transport models. In the second and third steps, by fixing the common non-reactive flow and transport parameters, the sorption parameters (retardation factor, sorption coefficient, and kinetic rate constant) of each model are estimated using the breakthrough data of reactive tracers, neptunium and uranium, respectively. Based on the inverse modeling results, the seven sorption-process models are discriminated using four model discrimination (or selection) criteria: the Akaike information criterion (AIC), modified Akaike information criterion (AICc), Bayesian information criterion (BIC) and Kashyap information criterion (KIC). These criteria suggest the kinetic sorption process for modeling reactive transport of neptunium and uranium in both fracture and matrix. This conclusion is confirmed by two chemical criteria, the half reaction time and the Damköhler number criterion.
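
Three of the four criteria named above can be computed directly from the maximized log-likelihood, the number of parameters k, and the sample size n; KIC additionally requires the Fisher information matrix and is omitted here. The log-likelihood values below are hypothetical, chosen only to illustrate the comparison.

```python
import math

def aic(loglik, k):
    """Akaike information criterion."""
    return -2 * loglik + 2 * k

def aicc(loglik, k, n):
    """Small-sample corrected AIC; requires n > k + 1."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    """Bayesian information criterion."""
    return -2 * loglik + k * math.log(n)

# Hypothetical fits for two competing sorption-process models.
n = 60
crit_equilibrium = aicc(loglik=-85.2, k=3, n=n)
crit_kinetic = aicc(loglik=-78.9, k=4, n=n)
# Lower is better: here the kinetic model is preferred despite its extra parameter.
```

In the study, all four criteria pointed to the kinetic sorption model; with hypothetical numbers like these, the extra parameter is justified because the likelihood gain outweighs the penalty.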

  18. Effects of error covariance structure on estimation of model averaging weights and predictive performance

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Meyer, Philip D.; Curtis, Gary P.; Shi, Xiaoqing; Niu, Xu-Feng; Yabusaki, Steve B.

    2013-01-01

    When conducting model averaging for assessing groundwater conceptual model uncertainty, the averaging weights are often evaluated using model selection criteria such as AIC, AICc, BIC, and KIC (Akaike Information Criterion, Corrected Akaike Information Criterion, Bayesian Information Criterion, and Kashyap Information Criterion, respectively). However, this method often leads to an unrealistic situation in which the best model receives overwhelmingly large averaging weight (close to 100%), which cannot be justified by available data and knowledge. It was found in this study that this problem was caused by using the covariance matrix, CE, of measurement errors for estimating the negative log likelihood function common to all the model selection criteria. This problem can be resolved by using the covariance matrix, Cek, of total errors (including model errors and measurement errors) to account for the correlation between the total errors. An iterative two-stage method was developed in the context of maximum likelihood inverse modeling to iteratively infer the unknown Cek from the residuals during model calibration. The inferred Cek was then used in the evaluation of model selection criteria and model averaging weights. While this method was limited to serial data using time series techniques in this study, it can be extended to spatial data using geostatistical techniques. The method was first evaluated in a synthetic study and then applied to an experimental study, in which alternative surface complexation models were developed to simulate column experiments of uranium reactive transport. It was found that the total errors of the alternative models were temporally correlated due to the model errors. The iterative two-stage method using Cek resolved the problem that the best model receives 100% model averaging weight, and the resulting model averaging weights were supported by the calibration results and physical understanding of the alternative models. Using Cek

  19. Automatic analysis of composite physical signals using non-negative factorization and information criterion.

    PubMed

    Watanabe, Kenji; Hidaka, Akinori; Otsu, Nobuyuki; Kurita, Takio

    2012-01-01

    In time-resolved spectroscopy, composite signal sequences representing energy transfer in fluorescence materials are measured, and the physical characteristics of the materials are analyzed. Each signal sequence is represented by a sum of non-negative signal components, which are expressed by model functions. For analyzing the physical characteristics of a measured signal sequence, the parameters of the model functions are estimated. Furthermore, in order to quantitatively analyze real measurement data and to reduce the risk of improper decisions, it is necessary to obtain the statistical characteristics from several sequences rather than just a single sequence. In the present paper, we propose an automatic method by which to analyze composite signals using non-negative factorization and an information criterion. The proposed method decomposes the composite signal sequences using non-negative factorization subjected to parametric base functions. The number of components (i.e., rank) is also estimated using Akaike's information criterion. Experiments using simulated and real data reveal that the proposed method automatically estimates the acceptable ranks and parameters.

  20. Automatic Analysis of Composite Physical Signals Using Non-Negative Factorization and Information Criterion

    PubMed Central

    Watanabe, Kenji; Hidaka, Akinori; Otsu, Nobuyuki; Kurita, Takio

    2012-01-01

    In time-resolved spectroscopy, composite signal sequences representing energy transfer in fluorescence materials are measured, and the physical characteristics of the materials are analyzed. Each signal sequence is represented by a sum of non-negative signal components, which are expressed by model functions. For analyzing the physical characteristics of a measured signal sequence, the parameters of the model functions are estimated. Furthermore, in order to quantitatively analyze real measurement data and to reduce the risk of improper decisions, it is necessary to obtain the statistical characteristics from several sequences rather than just a single sequence. In the present paper, we propose an automatic method by which to analyze composite signals using non-negative factorization and an information criterion. The proposed method decomposes the composite signal sequences using non-negative factorization subjected to parametric base functions. The number of components (i.e., rank) is also estimated using Akaike's information criterion. Experiments using simulated and real data reveal that the proposed method automatically estimates the acceptable ranks and parameters. PMID:22396759

  1. Validity of methods for model selection, weighting for model uncertainty, and small sample adjustment in capture-recapture estimation.

    PubMed

    Hook, E B; Regal, R R

    1997-06-15

    In log-linear capture-recapture approaches to population size, the method of model selection may have a major effect upon the estimate. In addition, the estimate may also be very sensitive if certain cells are null or very sparse, even with the use of multiple sources. The authors evaluated 1) various approaches to the issue of model uncertainty and 2) a small sample correction for three or more sources recently proposed by Hook and Regal. The authors compared the estimates derived using 1) three different information criteria that included Akaike's Information Criterion (AIC) and two alternative formulations of the Bayesian Information Criterion (BIC), one proposed by Draper ("two pi") and one by Schwarz ("not two pi"); 2) two related methods of weighting estimates associated with models; 3) the independent model; and 4) the saturated model, with the known totals in 20 different populations studied by five separate groups of investigators. For each method, the authors also compared the estimate derived with or without the proposed small sample correction. At least in these data sets, the use of AIC appeared on balance to be preferable. The BIC formulation suggested by Draper appeared slightly preferable to that suggested by Schwarz. Adjustment for model uncertainty appears to improve results slightly. The proposed small sample correction appeared to diminish relative log bias but only when sparse cells were present. Otherwise, its use tended to increase relative log bias. Use of the saturated model (with or without the small sample correction) appears to be optimal if the associated interval is not uselessly large, and if one can plausibly exclude an all-source interaction. All other approaches led to an estimate that was too low by about one standard deviation.

  2. Combining Models is More Likely to Give Better Predictions than Single Models.

    PubMed

    Hu, Xiaoping; Madden, Laurence V; Edwards, Simon; Xu, Xiangming

    2015-09-01

    In agricultural research, it is often difficult to construct a single "best" predictive model based on data collected under field conditions. We studied the relative prediction performance of combining empirical linear models over the single best model in relation to number of models to be combined, number of variates in the models, magnitude of residual errors, and weighting schemes. Two scenarios were simulated: the modeler did or did not know the relative performance of the models to be combined. For the former case, model averaging is achieved either through weights based on the Akaike Information Criterion (AIC) statistic or with arithmetic averaging; for the latter case, only arithmetic averaging is possible (because the relative model predictive performance is not known for a common dataset). In addition to two experimental datasets on oat mycotoxins in relation to environmental variables, two datasets were generated assuming a consistent correlation structure among explanatory variates with two magnitudes of residual errors. For the majority of cases, model averaging resulted in improved prediction performance over the single-model predictions, especially when a modeler does not have the information of relative model performance. The fewer the variates in the models to be combined, the greater the improvement of model averaging over the single-model predictions. Combining models led to very little improvement over individual models when there were many variates in individual models. Overall, simple arithmetic averaging resulted in slightly better performance than the AIC-based weighted averaging. The advantage in model averaging is also noticeable for larger residual errors. This study suggests that model averaging generally performs better than single-model predictions, especially when a modeler does not have information on the relative performance of the candidate models.
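
A minimal sketch of the two averaging schemes compared above, AIC-weighted versus arithmetic, using hypothetical predictions and AIC values for three candidate models:

```python
import math

def akaike_weights(aics):
    """Normalized relative likelihoods exp(-delta/2) for a set of AIC values."""
    best = min(aics)
    rel = [math.exp(-0.5 * (a - best)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical predictions from three candidate models for one new case,
# with their AIC values from a common training set.
preds = [4.1, 3.6, 5.0]
aics = [102.3, 104.0, 108.7]

w = akaike_weights(aics)
aic_averaged = sum(wi * p for wi, p in zip(w, preds))
arithmetic = sum(preds) / len(preds)
# The study above found that simple arithmetic averaging often performs
# comparably to, or slightly better than, AIC-weighted averaging.
```

Both averages are convex combinations of the individual predictions; they differ only in how much the better-fitting models dominate.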

  3. Information management

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell; Corker, Kevin

    1990-01-01

    Primary Flight Display (PFD) information management and cockpit display of information management research is presented in viewgraph form. The information management problem in the cockpit, information management burdens, the key characteristics of an information manager, the interface management system handling the flow of information and the dialogs between the system and the pilot, and overall system architecture are covered.

  4. Distinguishing between invasions and habitat changes as drivers of diversity loss among California's freshwater fishes.

    PubMed

    Light, Theo; Marchetti, Michael P

    2007-04-01

    Many of California's native populations of freshwater fish are in serious decline, as are freshwater faunas worldwide. Habitat loss and alteration, hydrologic modification, water pollution, and invasions have been identified as major drivers of these losses. Because these potential causes of decline are frequently correlated, it is difficult to separate direct from indirect effects of each factor and to appropriately rank their importance for conservation action. Recently a few authors have questioned the conservation significance of invasions, suggesting that they are "passengers" rather than "drivers" of ecological change. We compiled an extensive, watershed-level data set of fish presence and conservation status, land uses, and hydrologic modifications in California and used an information theoretic approach (Akaike's information criterion, AIC) and path analysis to evaluate competing models of native fish declines. Hydrologic modification (impoundments and diversions), invasions, and proportion of developed land were all predictive of the number of extinct and at-risk native fishes in California watersheds in the AIC analysis. Although nonindigenous fish richness was the best single predictor (after native richness) of fishes of conservation concern, the combined ranking of models containing hydrologic modification variables was slightly higher than that of models containing nonindigenous richness. Nevertheless, the path analysis indicated that the effects of both hydrologic modification and development on fishes of conservation concern were largely indirect, through their positive effects on nonindigenous fish richness. The best-fitting path model was the driver model, which included no direct effects of abiotic disturbance on native fish declines. 
Our results suggest that, for California freshwater fishes, invasions are the primary direct driver of extinctions and population declines, whereas the most damaging effect of habitat alteration is the tendency of

  5. Estimating Dbh of Trees Employing Multiple Linear Regression of the best Lidar-Derived Parameter Combination Automated in Python in a Natural Broadleaf Forest in the Philippines

    NASA Astrophysics Data System (ADS)

    Ibanez, C. A. G.; Carcellar, B. G., III; Paringit, E. C.; Argamosa, R. J. L.; Faelga, R. A. G.; Posilero, M. A. V.; Zaragosa, G. P.; Dimayacyac, N. A.

    2016-06-01

    Diameter-at-Breast-Height (DBH) estimation is a prerequisite in various allometric equations estimating important forestry indices like stem volume, basal area, biomass and carbon stock. LiDAR technology provides a means of directly obtaining different forest parameters, except DBH, from the behavior and characteristics of point clouds, which are unique to different forest classes. Extensive tree inventory was done on a two-hectare established sample plot in Mt. Makiling, Laguna, for a natural growth forest. Coordinates, height, and canopy cover were measured, and types of species were identified for comparison to LiDAR derivatives. Multiple linear regression was used to get LiDAR-derived DBH by integrating field-derived DBH and 27 LiDAR-derived parameters at 20 m, 10 m, and 5 m grid resolutions. To find the best combination of parameters for DBH estimation, all possible combinations of parameters were generated, with the process automated using Python scripts; additional regression-related libraries such as NumPy, SciPy, and scikit-learn were used. The combination that yields the highest r-squared (coefficient of determination) and the lowest AIC (Akaike's Information Criterion) and BIC (Bayesian Information Criterion) was determined to be the best equation. The best equation uses 11 parameters at 10 m grid size, with an r-squared of 0.604, an AIC of 154.04 and a BIC of 175.08. The best combination of parameters may differ among forest classes, which warrants further study. Additional statistical tests can be supplemented to help determine the correlation among parameters, such as the Kaiser-Meyer-Olkin (KMO) coefficient and Bartlett's Test for Sphericity (BTS).
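
The all-subsets search described above can be sketched with itertools and NumPy. The parameter names and data below are hypothetical stand-ins for the LiDAR-derived metrics, and model scoring here uses a Gaussian-likelihood AIC rather than the study's exact criteria.

```python
import itertools
import numpy as np

def ols_aic(X, y):
    """Fit OLS with intercept; return a Gaussian-likelihood AIC."""
    n = len(y)
    X1 = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    sse = float(np.sum((y - X1 @ beta) ** 2))
    k = X1.shape[1] + 1  # coefficients plus the error variance
    return n * np.log(sse / n) + 2 * k

# Hypothetical stand-ins for field DBH and a few LiDAR-derived metrics.
rng = np.random.default_rng(1)
n = 120
params = {name: rng.normal(size=n)
          for name in ["p_height", "p_cover", "p_density", "p_intensity"]}
dbh = (2.0 * params["p_height"] + 1.0 * params["p_cover"]
       + rng.normal(scale=0.5, size=n))

# Enumerate every non-empty parameter combination and keep the lowest AIC.
best = None
for r in range(1, len(params) + 1):
    for combo in itertools.combinations(params, r):
        X = np.column_stack([params[c] for c in combo])
        score = ols_aic(X, dbh)
        if best is None or score < best[0]:
            best = (score, combo)
# best[1] should recover the truly predictive parameters.
```

With p predictors this search fits 2^p - 1 models, which is why the authors automated it; for larger p, the study's BIC/AIC ranking can be slotted into the same loop.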

  6. VLP Source Inversion and Evaluation of Error Analysis Techniques at Fuego Volcano, Guatemala

    NASA Astrophysics Data System (ADS)

    Brill, K. A.; Waite, G. P.

    2015-12-01

    In January of 2012, our team occupied 10 sites around Fuego volcano with broadband seismometers, two of which were collocated with infrasound microphone arrays and tilt-meters (see Figure 1 for full deployment details). Our radial coverage around Fuego during the 2012 campaign satisfies conditions outlined by Dawson et al. [2011] for good network coverage. Very-long-period (VLP) events that accompany small-scale explosions were classified by waveform and eruption style. We located these VLP event families, which have been persistent at Fuego since at least 2008, through inversion in the same manner employed by Lyons and Waite [2011], with improved radial coverage in our network. We compare results for source inversions performed with independent tilt data against inversions incorporating tilt data extracted from the broadband. The current best-practice method for choosing an optimum solution for inversion results is based on each solution's residual error, the relevance of free parameters used in the model, and the physical significance of the source mechanism. Error analysis was performed through bootstrapping in order to explore the source location uncertainty and the significance of components of the moment tensor. The significance of the number of free parameters has mostly been evaluated by calculating Akaike's Information Criterion (AIC), but little has been done to evaluate the sensitivity of AIC or other criteria (i.e. Bayesian Information Criterion) to the number of model parameters. We compare solutions as chosen by these alternate methods with more standard techniques for our real data set, as well as through the use of synthetic data, and make recommendations as to best practices. Figure 1: a) Map of 2012 station network: stations highlighted in red were collocated with infrasound arrays. b) Location of Fuego within Guatemala and view of the complex from the west with different eruptive centers labeled.
c) Operational times for each of the stations and cameras.

  7. An Antiproton Ion Collider (AIC) for Measuring Neutron and Proton Distributions in Stable and Radioactive Nuclei

    SciTech Connect

    Kienle, Paul

    2005-10-19

    An antiproton-ion collider is proposed to independently determine mean square radii for protons and neutrons in stable and short-lived nuclei by means of antiproton absorption at medium energies. The experiment makes use of the electron ion collider complex (ELISE) of the GSI FAIR project with appropriate modifications of the electron ring to store, cool and collide antiprotons of 30 MeV energy with 740A MeV energy ions. The total absorption cross-section of antiprotons by the stored ions will be measured by detecting their loss by means of the Schottky noise spectroscopy method. Cross sections for the absorption on protons and neutrons, respectively, will be studied by detection of residual nuclei with A-1 either by the Schottky method or by analysing them in recoil detectors after the first dipole stage of the NESR following the interaction zone. With a measurement of the A-1 fragment momentum distribution, one can test the momentum wave functions of the annihilated neutron and proton, respectively. Furthermore, by changing the incident ion energy, the tails of the neutron and proton distributions can be measured. At asymptotic energies, the absorption cross section is in leading order proportional to the mean square radius of the nucleus. Predicted cross sections and luminosities show that the method is applicable to nuclei with production rates of about 10^5 s^-1 or lower, depending on the lifetime of the ions in the NESR, and for half-lives down to 1 second.

  8. Information Ethics.

    ERIC Educational Resources Information Center

    Smith, Martha Montague

    1997-01-01

    Focuses on information ethics in scholarly and professional literature. Computer ethics, cyberethics, and the philosophies of information and information technology are also discussed. The recent use of the term global information ethics, suggesting the unification of many concerns common to information ethics, computer ethics, and cyberethics, is…

  9. Regression with Empirical Variable Selection: Description of a New Method and Application to Ecological Datasets

    PubMed Central

    Goodenough, Anne E.; Hart, Adam G.; Stafford, Richard

    2012-01-01

    Despite recent papers on problems associated with full-model and stepwise regression, their use is still common throughout ecological and environmental disciplines. Alternative approaches, including generating multiple models and comparing them post-hoc using techniques such as Akaike's Information Criterion (AIC), are becoming more popular. However, these are problematic when there are numerous independent variables and interpretation is often difficult when competing models contain many different variables and combinations of variables. Here, we detail a new approach, REVS (Regression with Empirical Variable Selection), which uses all-subsets regression to quantify empirical support for every independent variable. A series of models is created; the first containing the variable with most empirical support, the second containing the first variable and the next most-supported, and so on. The comparatively small number of resultant models (n = the number of predictor variables) means that post-hoc comparison is quick and easy. When tested on a real dataset – habitat and offspring quality in the great tit (Parus major) – the optimal REVS model explained more variance (higher R2), was more parsimonious (lower AIC), and had greater significance (lower P values), than full, stepwise or all-subsets models; it also had higher predictive accuracy based on split-sample validation. Testing REVS on ten further datasets suggested that this is typical, with R2 values being higher than full or stepwise models (mean improvement = 31% and 7%, respectively). Results are ecologically intuitive as even when there are several competing models, they share a set of “core” variables and differ only in presence/absence of one or two additional variables. We conclude that REVS is useful for analysing complex datasets, including those in ecology and environmental disciplines. PMID:22479605
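    The core idea behind REVS can be sketched in code: fit every subset of predictors, convert each subset's AIC into an Akaike weight, and sum those weights per variable as a measure of empirical support. The sketch below is a simplified, hypothetical implementation (the published REVS procedure may differ in detail); `ols_aic` and `revs_order` are invented names, and the data are simulated.

```python
# Hypothetical sketch of the REVS idea: rank predictors by summed Akaike
# weights over all-subsets OLS fits, then build the nested model sequence.
from itertools import combinations
import numpy as np

def ols_aic(X, y):
    """AIC of an OLS fit with Gaussian errors (k = coefficients + intercept + sigma)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    rss = np.sum((y - Xd @ beta) ** 2)
    k = Xd.shape[1] + 1                       # +1 for the residual variance
    return n * np.log(rss / n) + 2 * k

def revs_order(X, y):
    """Order predictors by empirical support (summed Akaike weights)."""
    p = X.shape[1]
    subsets = [s for r in range(1, p + 1) for s in combinations(range(p), r)]
    aics = np.array([ols_aic(X[:, list(s)], y) for s in subsets])
    w = np.exp(-(aics - aics.min()) / 2)
    w /= w.sum()
    support = np.zeros(p)
    for s, wi in zip(subsets, w):
        for j in s:
            support[j] += wi
    return np.argsort(-support)               # most-supported variable first

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 3 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(size=200)   # x2, x3 are pure noise
order = revs_order(X, y)
nested = [ols_aic(X[:, order[:i + 1]], y) for i in range(4)]
print(order, nested)
```

With only p nested models to compare afterwards, the post-hoc step stays small even when the all-subsets phase fits 2^p - 1 candidate models.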

  10. Effects of isolation and environmental variables on fish community structure in the Brazilian Amazon Madeira-Purus interfluve.

    PubMed

    Barros, D F; Albernaz, A L M; Zuanon, J; Espírito Santo, H M V; Mendonça, F P; Galuch, A V

    2013-08-01

    Due to the existence of terrestrial barriers to freshwater fish dispersal, it is believed that fish distributions are strongly associated with historical factors related to the formation of the habitats they occupy. On the other hand, some studies reveal the influence of abiotic conditions (such as the size of water bodies, pH and conductivity) on the composition of the fish fauna occurring in small streams. This study aimed to investigate whether drainage basins (because catchment boundaries are potential barriers to fish dispersal) or the physical structure and physico-chemical characteristics of the water have a greater influence on fish community structure in small streams. We sampled 22 streams belonging to five drainage basins in the Madeira-Purus interfluve. Fish were caught with dip nets and a small trawl, and data were simultaneously obtained on the structural characteristics of the streams and the physico-chemical characteristics of the water. Community composition was analyzed using Non-Metric Multidimensional Scaling (NMDS), and variables related to structural and physico-chemical characteristics were summarized by Principal Component Analysis (PCA). Two explanatory models relating faunal composition to environmental factors were constructed: the first using only continuous variables and the second including the drainage basin as a categorical variable. The Akaike Information Criterion (AIC) and AIC weights were used to select the best model. Although structural and physico-chemical variables significantly contributed to explaining faunal composition, the model including the drainage basin was clearly the better of the two (more than 90% support in the data). The importance of drainage basins in structuring fish communities in streams may have significant consequences for conservation planning in these environments.
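    The AIC and AIC-weight comparison used above is straightforward to reproduce. A minimal sketch with invented log-likelihoods and parameter counts (not the study's actual fits):

```python
# AIC = 2k - 2 ln L; Akaike weights give each model's relative support.
import math

def aic(log_lik, k):
    """Akaike information criterion for a fit with log-likelihood and k parameters."""
    return 2 * k - 2 * log_lik

def akaike_weights(aics):
    """w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2), delta_i = AIC_i - min AIC."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical fits: continuous variables only vs. adding the drainage basin.
models = {"continuous only": aic(-120.4, 5), "with drainage basin": aic(-112.1, 9)}
weights = akaike_weights(list(models.values()))
print(dict(zip(models, [round(w, 3) for w in weights])))
```

With these invented numbers the categorical-basin model carries most of the weight, mirroring the ">90% support" reported in the abstract.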

  11. Impact of Schedule Duration on Head and Neck Radiotherapy: Accelerated Tumor Repopulation Versus Compensatory Mucosal Proliferation

    SciTech Connect

    Fenwick, John D.; Pardo-Montero, Juan; Nahum, Alan E.; Malik, Zafar I.

    2012-02-01

    Purpose: To determine how modelled maximum tumor control rates achievable without exceeding mucositis tolerance (tcp_max-early) vary with schedule duration for head and neck squamous cell carcinoma (HNSCC). Methods and materials: Using maximum-likelihood techniques, we have fitted a range of tcp models to two HNSCC datasets (Withers' and British Institute of Radiology [BIR]), characterizing the dependence of tcp on duration and equivalent dose in 2 Gy fractions (EQD2). Models likely to best describe future data have been selected using the Akaike information criterion (AIC) and its quasi-AIC extension to overdispersed data. Setting EQD2s in the selected tcp models to levels just tolerable for mucositis, we have plotted tcp_max-early against schedule duration. Results: While BIR dataset tcp fits describe dose levels isoeffective for tumor control as rising significantly with schedule protraction, indicative of accelerated tumor repopulation, repopulation terms in fits to Withers' dataset do not reach significance after accounting for overdispersion of the data. The tcp_max-early curves calculated from tcp fits to the overall Withers' and BIR datasets rise by 8% and 0-4%, respectively, between 20 and 50 days duration; likewise, tcp_max-early curves calculated for stage-specific cohorts also generally rise slowly with increasing duration. However, none of the increases in tcp_max-early calculated from the overall or stage-specific fits reach significance. Conclusions: Local control rates modeled for treatments which lie just within mucosal tolerance rise slowly but insignificantly with increasing schedule length. This finding suggests that whereas useful gains may be made by accelerating unnecessarily slow schedules until they approach early-reaction tolerance, little is achieved by shortening schedules further while reducing doses to remain within mucosal tolerance, an approach that may slightly worsen outcomes.

  12. Modeling the Growth of Infrarenal Abdominal Aortic Aneurysms

    PubMed Central

    Bailey, Marc A.; Baxter, Paul D.; Jiang, Tao; Charnell, Aimee M.; Griffin, Kathryn J.; Johnson, Anne B.; Bridge, Katherine I.; Sohrabi, Soroush; Scott, D. Julian A.

    2013-01-01

    Background: Abdominal aortic aneurysm (AAA) growth is a complex process that is incompletely understood. Significant heterogeneity in growth trajectories between patients has led to difficulties in accurately modeling aneurysm growth across cohorts of patients. We set out to compare four models of aneurysm growth commonly used in the literature and confirm which best fits the patient data of our AAA cohort. Methods: Patients with AAA were included in the study if they had two or more abdominal ultrasound scans greater than 3 months apart. Patients were censored from analysis once their AAA exceeded 5.5 cm. Four models were applied using the R environment for statistical computing. Growth estimates and goodness of fit (using the Akaike Information Criterion, AIC) were compared, with p-values based on likelihood ratio testing. Results: Of 510 enrolled patients, 264 met the inclusion criteria, yielding a total of 1861 imaging studies during 932 cumulative years of surveillance. Overall, growth rates were: (1) 0.35 (0.31,0.39) cm/yr in the growth/time calculation, (2) 0.056 (0.042,0.068) cm/yr in the linear regression model, (3) 0.19 (0.17,0.21) cm/yr in the linear multilevel model, and (4) 0.21 (0.18,0.24) cm/yr in the quadratic multilevel model at time 0, slowing to 0.15 (0.12,0.17) cm/yr at 10 years. AIC was lowest in the quadratic multilevel model (1508) compared to other models (P < 0.0001). Conclusion: AAA growth was heterogeneous between patients; the nested nature of the data is most appropriately modeled by multilevel modeling techniques. PMID:26798704
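    The central AIC comparison above, a linear versus a quadratic growth curve, can be sketched on synthetic data (the cohort's actual measurements are not reproduced here; `gaussian_aic` and the growth coefficients are assumptions for illustration only):

```python
# Compare a linear and a quadratic growth model by AIC on simulated
# aneurysm-diameter data whose growth rate slows over time.
import numpy as np

def gaussian_aic(y, yhat, n_params):
    """AIC of a least-squares fit assuming Gaussian residuals (+1 param for sigma)."""
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * (n_params + 1)

rng = np.random.default_rng(1)
t = rng.uniform(0, 10, 300)                                  # years of surveillance
diam = 3.0 + 0.21 * t - 0.003 * t**2 + rng.normal(0, 0.05, 300)  # cm, decelerating

lin = np.polyval(np.polyfit(t, diam, 1), t)
quad = np.polyval(np.polyfit(t, diam, 2), t)
aic_lin = gaussian_aic(diam, lin, 2)
aic_quad = gaussian_aic(diam, quad, 3)
print(aic_lin, aic_quad)   # the lower AIC identifies the preferred model
```

This single-level sketch omits the multilevel (per-patient) structure that the study found essential; it only illustrates how the AIC trades fit against the extra quadratic parameter.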

  13. Canada lynx Lynx canadensis habitat and forest succession in northern Maine, USA

    USGS Publications Warehouse

    Hoving, C.L.; Harrison, D.J.; Krohn, W.B.; Jakubas, W.J.; McCollough, M.A.

    2004-01-01

    The contiguous United States population of Canada lynx Lynx canadensis was listed as threatened in 2000. The long-term viability of lynx populations at the southern edge of their geographic range has been hypothesized to be dependent on old growth forests; however, lynx are a specialist predator on snowshoe hare Lepus americanus, a species associated with early-successional forests. To quantify the effects of succession and forest management on landscape-scale (100 km2) patterns of habitat occupancy by lynx, we compared landscape attributes in northern Maine, USA, where lynx had been detected on snow track surveys to landscape attributes where surveys had been conducted, but lynx tracks had not been detected. Models were constructed a priori and compared using logistic regression and Akaike's Information Criterion (AIC), which quantitatively balances data fit and parsimony. In the models with the lowest (i.e. best) AIC, lynx were more likely to occur in landscapes with much regenerating forest, and less likely to occur in landscapes with much recent clearcut, partial harvest and forested wetland. Lynx were not associated positively or negatively with mature coniferous forest. A probabilistic map of the model indicated a patchy distribution of lynx habitat in northern Maine. According to an additional survey of the study area for lynx tracks during the winter of 2003, the model correctly classified 63.5% of the lynx occurrences and absences. Lynx were more closely associated with young forests than mature forests; however, old-growth forests were functionally absent from the landscape. Lynx habitat could be reduced in northern Maine, given recent trends in forest management practices. Harvest strategies have shifted from clearcutting to partial harvesting. If this trend continues, future landscapes will shift away from extensive regenerating forests and toward landscapes dominated by pole-sized and larger stands. Because Maine presently supports the only verified

  14. A Comparison of Dose-Response Models for the Parotid Gland in a Large Group of Head-and-Neck Cancer Patients

    SciTech Connect

    Houweling, Antonetta C.; Philippens, Marielle E.P.; Dijkema, Tim; Roesink, Judith M.; Terhaard, Chris H.J.; Schilstra, Cornelis; Ten Haken, Randall K.; Eisbruch, Avraham; Raaijmakers, Cornelis P.J.

    2010-03-15

    Purpose: The dose-response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that describe the parotid gland dose response 1 year after radiotherapy. Methods and Materials: A total of 347 patients with head-and-neck tumors were included in this prospective parotid gland dose-response study. The patients were treated with either conventional radiotherapy or intensity-modulated radiotherapy. Dose-volume histograms for the parotid glands were derived from three-dimensional dose calculations using computed tomography scans. Stimulated salivary flow rates were measured before and 1 year after radiotherapy. A threshold of 25% of the pretreatment flow rate was used to define a complication. The evaluated models included the Lyman-Kutcher-Burman model, the mean dose model, the relative seriality model, the critical volume model, the parallel functional subunit model, and the dose-threshold model. The goodness of fit (GOF) was determined by the deviance and a Monte Carlo hypothesis test. Ranking of the models was based on Akaike's information criterion (AIC). Results: None of the models was rejected based on the evaluation of the GOF. The mean dose model was ranked as the best model based on the AIC. The TD50 in these models was approximately 39 Gy. Conclusions: The mean dose model was preferred for describing the dose-response relationship of the parotid gland.
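    When goodness of fit is reported as a deviance, as above, AIC ranking reduces to adding a parameter penalty, since the deviance equals -2 ln L up to a model-independent constant. An illustrative sketch with invented deviances and parameter counts (not the study's fitted values):

```python
# AIC from a reported deviance: AIC = deviance + 2k (constants cancel in ranking).
def aic_from_deviance(deviance, n_params):
    return deviance + 2 * n_params

# Hypothetical NTCP fits: a 2-parameter mean-dose model vs. a 3-parameter LKB model.
fits = {"mean dose": (412.6, 2), "LKB": (410.9, 3)}
ranked = sorted(fits, key=lambda m: aic_from_deviance(*fits[m]))
print(ranked)   # a simpler model can rank first despite a slightly worse raw fit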

  15. Monthly streamflow prediction in the Volta Basin of West Africa: A SISO NARMAX polynomial modelling

    NASA Astrophysics Data System (ADS)

    Amisigo, B. A.; van de Giesen, N.; Rogers, C.; Andah, W. E. I.; Friesen, J.

    Single-input-single-output (SISO) non-linear system identification techniques were employed to model monthly catchment runoff at selected gauging sites in the Volta Basin of West Africa. NARMAX (Non-linear Autoregressive Moving Average with eXogenous Input) polynomial models were fitted to basin monthly rainfall and gauging station runoff data for each of the selected sites and used to predict monthly runoff at the sites. An error reduction ratio (ERR) algorithm was used to order regressors for various combinations of input, output and noise lags (various model structures), and the significant regressors for each model were selected by applying the Akaike Information Criterion (AIC) to independent rainfall-runoff validation series. Model parameters were estimated using the Matlab REGRESS function (an orthogonal least squares method). In each case, the sub-model without noise terms was fitted first, followed by a fitting of the noise model. The coefficient of determination (R-squared), the Nash-Sutcliffe Efficiency criterion (NSE) and the F statistic for the estimation (training) series were used to evaluate the significance of fit of each model to this series, while model selection from the range of models fitted for each gauging site was done by examining the NSEs and the AICs of the validation series. Monthly runoff predictions from the selected models were very good, and the polynomial models appeared to have captured a good part of the rainfall-runoff non-linearity. The results indicate that the NARMAX modelling framework is suitable for monthly river runoff prediction in the Volta Basin. The several good models made available by the NARMAX modelling framework could be useful in the selection of model structures that also provide insights into the physical behaviour of the catchment rainfall-runoff system.

  16. Assessing the wildlife habitat value of New England salt marshes: II. Model testing and validation.

    PubMed

    McKinney, Richard A; Charpentier, Michael A; Wigand, Cathleen

    2009-07-01

    We tested a previously described model to assess the wildlife habitat value of New England salt marshes by comparing modeled habitat values and scores with bird abundance and species richness at sixteen salt marshes in Narragansett Bay, Rhode Island, USA. As a group, wildlife habitat value assessment scores for the marshes ranged from 307 to 509, or 31-67% of the maximum attainable score. We recorded 6 species of wading birds (Ardeidae; herons, egrets, and bitterns) at the sites during biweekly surveys. Species richness (r² = 0.24, F = 4.53, p = 0.05) and abundance (r² = 0.26, F = 5.00, p = 0.04) of wading birds significantly increased with increasing assessment score. We optimized our assessment model for wading birds by using the Akaike information criterion (AIC) to compare a series of models comprised of specific components and categories of our model that best reflect their habitat use. The model incorporating pre-classification, wading bird habitat categories, and natural land surrounding the sites was substantially supported by AIC analysis as the best model. The abundance of wading birds significantly increased with increasing assessment scores generated with the optimized model (r² = 0.48, F = 12.5, p = 0.003), demonstrating that optimizing models can be helpful in improving the accuracy of the assessment for a given species or species assemblage. In addition to validating the assessment model, our results show that in spite of their urban setting, our study marshes provide substantial wildlife habitat value. This suggests that even small wetlands in highly urbanized coastal settings can provide important wildlife habitat value if key habitat attributes (e.g., natural buffers, habitat heterogeneity) are present.

  17. (C-11)-thymidine PET imaging as a measure of DNA synthesis rate: A preliminary quantitative study of human brain glioblastoma

    SciTech Connect

    Wong, C.Y.O.; Yung, B.C.Y.; Conti, P.

    1994-05-01

    (C-11)-Thymidine (TdR) PET imaging can potentially be used to measure tumor proliferation in vivo and monitor treatment. Twenty-four stereotactic brain biopsies (SBB) following in-vivo bromodeoxyuridine (BUDR) labelling under MRI guidance were obtained to correlate with TdR PET imaging of primary glioblastoma in human brain. Following data acquisition, standard 4 by 4 pixel (2 mm/pixel) regions of interest (ROIs) were placed over the tumor site based on SBB and over the corresponding homologous region of the contralateral normal cortex. After correcting the input function for major metabolites and subtracting TdR activity in the normal side from the tumor side of the brain, 2- and 3-compartmental analysis was performed for all the ROIs. The Akaike (AIC) and Bayes (BIC) information criteria were calculated to compare these 2 kinetic models for differentiating pure blood pool effects from TdR incorporation into DNA. Of 24 SBB regions, 20 non-overlapping and corresponding ROIs in PET were identified and quantified. Eight ROIs were selected based on the AIC, BIC and root-mean-square errors (RMSE < 0.1); 4 could not be modelled and 8 most likely represented blood flow effects. The percentage of BUDR labelling per high-power field area (%BUDR/hpfa) was determined for each region. The parameter k3, the forward phosphorylation rate (hence an index of DNA synthesis), was categorized into 2 groups according to a threshold value of %BUDR/hpfa = 5%. The tumor regions with low proliferative index (%BUDR/hpfa < 5%) had significantly lower k3 than those with high proliferative index (p < 0.005). We also found that k4 is at least an order of magnitude less than k3, suggesting minimal effects of dephosphorylation and efflux of metabolites. We conclude that 3-compartmental, 4-parameter modeling is adequate for TdR PET studies and that k3 correlates with DNA synthesis rate.
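    The AIC/BIC comparison of the 2- and 3-compartment fits can be illustrated with invented log-likelihoods and an assumed number of time points; BIC penalizes the extra parameters more heavily as the number of data points grows:

```python
# AIC = 2k - 2 ln L; BIC = k ln(n) - 2 ln L. Values below are illustrative only.
import math

def aic(log_lik, k):
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    return k * math.log(n) - 2 * log_lik

n = 40                      # time points in one ROI's activity curve (assumed)
ll_2c, k_2c = -55.0, 3      # hypothetical 2-compartment fit
ll_3c, k_3c = -48.0, 5      # hypothetical 3-compartment, 4-parameter fit
print(aic(ll_2c, k_2c), aic(ll_3c, k_3c))
print(bic(ll_2c, k_2c, n), bic(ll_3c, k_3c, n))
```

With these numbers both criteria favor the 3-compartment model, but the BIC margin is narrower; for small likelihood gains the two criteria can disagree, which is why the study consulted both alongside the RMSE.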

  18. Effects of human recreation on the incubation behavior of American Oystercatchers

    USGS Publications Warehouse

    McGowan, C.P.; Simons, T.R.

    2006-01-01

    Human recreational disturbance and its effects on wildlife demographics and behavior is an increasingly important area of research. We monitored the nesting success of American Oystercatchers (Haematopus palliatus) in coastal North Carolina in 2002 and 2003. We also used video monitoring at nests to measure the response of incubating birds to human recreation. We counted the number of trips per hour made by adult birds to and from the nest, and we calculated the percent time that adults spent incubating. We asked whether human recreational activities (truck, all-terrain vehicle [ATV], and pedestrian traffic) were correlated with parental behavioral patterns. Eleven a priori models of nest survival and behavioral covariates were evaluated using Akaike's Information Criterion (AIC) to see whether incubation behavior influenced nest survival. Factors associated with birds leaving their nests (n = 548) included ATV traffic (25%), truck traffic (17%), pedestrian traffic (4%), aggression with neighboring oystercatchers or paired birds exchanging incubation duties (26%), airplane traffic (1%) and unknown factors (29%). ATV traffic was positively associated with the rate of trips to and away from the nest (β1 = 0.749, P < 0.001) and negatively correlated with percent time spent incubating (β1 = -0.037, P = 0.025). Other forms of human recreation apparently had little effect on incubation behaviors. Nest survival models incorporating the frequency of trips by adults to and from the nest, and the percentage of time adults spent incubating, were somewhat supported in the AIC analyses. A low frequency of trips to and from the nest and, counter to expectations, low percent time spent incubating were associated with higher daily nest survival rates. These data suggest that changes in incubation behavior might be one mechanism by which human recreation affects the reproductive success of American Oystercatchers.

  19. Pharmacokinetic Modeling of Intranasal Scopolamine in Plasma Saliva and Urine

    NASA Technical Reports Server (NTRS)

    Wu, L.; Chow, D. S. L.; Tam, V.; Putcha, L.

    2014-01-01

    An intranasal gel formulation of scopolamine (INSCOP) was developed for the treatment of Space Motion Sickness. The bioavailability and pharmacokinetics (PK) were evaluated under the Food and Drug Administration guidelines for clinical trials for an Investigative New Drug (IND). The aim of this project was to develop a PK model that can predict the relationship between plasma, saliva and urinary scopolamine concentrations using data collected from the IND clinical trial with INSCOP. METHODS: Twelve healthy human subjects were administered three dose levels (0.1, 0.2 and 0.4 mg) of INSCOP. Serial blood, saliva and urine samples were collected between 5 min and 24 h after dosing, and scopolamine concentrations were measured using a validated LC-MS-MS assay. Pharmacokinetic compartmental models, using actual dosing and sampling times, were built using Phoenix (version 1.2). Model discrimination was performed by minimizing the Akaike Information Criterion (AIC), maximizing the coefficient of determination (r²), and comparing the quality-of-fit plots. RESULTS: The best structural model to describe scopolamine disposition after INSCOP administration (minimal AIC = 907.2) consisted of one compartment each for plasma, saliva and urine, inter-connected with different rate constants. The estimated values of PK parameters were compiled in Table 1. The model fitting exercises revealed nonlinear PK for scopolamine between the plasma and saliva compartments for K21, Vmax and Km. CONCLUSION: A PK model for INSCOP was developed that, for the first time, satisfactorily predicted the PK of scopolamine in plasma, saliva and urine after INSCOP administration. Using non-linear PK yielded the best structural model to describe scopolamine disposition between the plasma and saliva compartments, and inclusion of non-linear PK resulted in significantly improved model fitting.
The model can be utilized to predict scopolamine plasma concentration using saliva and/or urine data that

  20. Preliminary analysis using multi-atlas labeling algorithms for tracing longitudinal change

    PubMed Central

    Kim, Regina E. Y.; Lourens, Spencer; Long, Jeffrey D.; Paulsen, Jane S.; Johnson, Hans J.

    2015-01-01

    Multicenter longitudinal neuroimaging has great potential to provide efficient and consistent biomarkers for research of neurodegenerative diseases and aging. In rare disease studies it is of primary importance to have a reliable tool that performs consistently for data from many different collection sites to increase study power. A multi-atlas labeling algorithm is a powerful brain image segmentation approach that is becoming increasingly popular in image processing. The present study examined the performance of multi-atlas labeling tools for subcortical identification using two types of in-vivo image databases: Traveling Human Phantom (THP) and PREDICT-HD. We compared the accuracy (Dice Similarity Coefficient, DSC, and intraclass correlation, ICC), multicenter reliability (Coefficient of Variation, CV), and longitudinal reliability (volume trajectory smoothness and Akaike Information Criterion, AIC) of three automated segmentation approaches: two multi-atlas labeling tools, MABMIS and MALF, and a machine-learning-based tool, BRAINSCut. In general, MALF showed the best performance (higher DSC, ICC, lower CV, AIC, and smoother trajectory), with a couple of exceptions. First, for the accumbens, where BRAINSCut showed higher reliability, it was premature to draw conclusions about reliability since the validity of the segmentations is still in doubt (DSC < 0.7, ICC < 0.7). For the caudate, BRAINSCut presented slightly better accuracy while MALF showed a significantly smoother longitudinal trajectory. We discuss advantages and limitations of these performance variations and conclude that improved segmentation quality can be achieved using multi-atlas labeling methods. While multi-atlas labeling methods are likely to help improve overall segmentation quality, caution has to be taken when one chooses an approach, as our results suggest that segmentation outcome can vary depending on research interest. PMID:26236182

  1. Application of Cox and Parametric Survival Models to Assess Social Determinants of Health Affecting Three-Year Survival of Breast Cancer Patients.

    PubMed

    Mohseny, Maryam; Amanpour, Farzaneh; Mosavi-Jarrahi, Alireza; Jafari, Hossein; Moradi-Joo, Mohammad; Davoudi Monfared, Esmat

    2016-01-01

    Breast cancer is one of the most common causes of cancer mortality in Iran. Social determinants of health are among the key factors affecting the pathogenesis of diseases. This cross-sectional study aimed to determine the social determinants of breast cancer survival time with parametric and semi-parametric regression models. It was conducted on male and female patients diagnosed with breast cancer presenting to the Cancer Research Center of Shohada-E-Tajrish Hospital from 2006 to 2010. The Cox proportional hazard model and parametric models including the Weibull, log-normal and log-logistic models were applied to determine the social determinants of survival time of breast cancer patients. The Akaike information criterion (AIC) was used to assess the best fit. Statistical analysis was performed with STATA (version 11) software. This study was performed on 797 breast cancer patients, aged 25-93 years with a mean age of 54.7 (±11.9) years. In both semi-parametric and parametric models, the three-year survival was related to level of education and municipal district of residence (P<0.05). The AIC suggested that the log-normal distribution was the best fit for the three-year survival time of breast cancer patients. Social determinants of health such as level of education and municipal district of residence affect the survival of breast cancer cases. Future studies must focus on the effect of childhood social class on the survival times of cancers, an effect that has hitherto received only limited attention. PMID:27165244
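    The parametric-model comparison above can be sketched on synthetic, fully observed survival times (the study itself analysed real, potentially censored data, which requires a censoring-aware likelihood). Both the exponential and log-normal MLEs have closed forms, so the AIC comparison needs no iterative fitting:

```python
# Compare exponential and log-normal survival models by AIC on simulated,
# uncensored survival times. Data-generating parameters are invented.
import math
import random

random.seed(7)
times = [random.lognormvariate(3.0, 0.6) for _ in range(400)]  # months (simulated)
n = len(times)

# Exponential MLE: rate = n / sum(t); ln L = n ln(rate) - rate * sum(t)
rate = n / sum(times)
ll_exp = n * math.log(rate) - rate * sum(times)
aic_exp = 2 * 1 - 2 * ll_exp

# Log-normal MLE: mu, sigma are the mean and SD of the log-times
logs = [math.log(t) for t in times]
mu = sum(logs) / n
sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / n)
ll_ln = sum(-math.log(t * sigma * math.sqrt(2 * math.pi))
            - (math.log(t) - mu) ** 2 / (2 * sigma ** 2) for t in times)
aic_ln = 2 * 2 - 2 * ll_ln
print(aic_exp, aic_ln)   # the log-normal model should fit these data far better
```

Since the simulated times are log-normal, the AIC correctly prefers the log-normal model despite its extra parameter, which is the same kind of verdict the study reports for its cohort.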

  2. Evaluating a coupled discrete wavelet transform and support vector regression for daily and monthly streamflow forecasting

    NASA Astrophysics Data System (ADS)

    Liu, Zhiyong; Zhou, Ping; Chen, Gang; Guo, Ledong

    2014-11-01

    This study investigated the performance and potential of a hybrid model that combined the discrete wavelet transform and support vector regression (the DWT-SVR model) for daily and monthly streamflow forecasting. Three key factors of the wavelet decomposition phase (mother wavelet, decomposition level, and edge effect) were proposed to consider for improving the accuracy of the DWT-SVR model. The performance of DWT-SVR models with different combinations of these three factors was compared with the regular SVR model. The effectiveness of these models was evaluated using the root-mean-squared error (RMSE) and Nash-Sutcliffe model efficiency coefficient (NSE). Daily and monthly streamflow data observed at two stations in Indiana, United States, were used to test the forecasting skill of these models. The results demonstrated that the different hybrid models did not always outperform the SVR model for 1-day and 1-month lead time streamflow forecasting. This suggests that it is crucial to consider and compare the three key factors when using the DWT-SVR model (or other machine learning methods coupled with the wavelet transform), rather than choosing them based on personal preferences. We then combined forecasts from multiple candidate DWT-SVR models using a model averaging technique based upon Akaike's information criterion (AIC). This ensemble prediction was superior to the single best DWT-SVR model and regular SVR model for both 1-day and 1-month ahead predictions. With respect to longer lead times (i.e., 2- and 3-day and 2-month), the ensemble predictions using the AIC averaging technique were consistently better than the best DWT-SVR model and SVR model. Therefore, integrating model averaging techniques with the hybrid DWT-SVR model would be a promising approach for daily and monthly streamflow forecasting. Additionally, we strongly recommend considering these three key factors when using wavelet-based SVR models (or other wavelet-based forecasting models).
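    The AIC-based model averaging of forecasts described above can be sketched as follows; the forecasts and AIC values are invented, not taken from the study:

```python
# Combine candidate models' forecasts with Akaike weights: models with lower
# AIC contribute more to the ensemble prediction.
import numpy as np

def akaike_weights(aics):
    aics = np.asarray(aics, dtype=float)
    rel = np.exp(-(aics - aics.min()) / 2)
    return rel / rel.sum()

# Hypothetical streamflow forecasts (m^3/s) from three DWT-SVR variants;
# rows are models, columns are lead days.
forecasts = np.array([[102.0, 98.5, 110.2],
                      [104.3, 97.1, 108.8],
                      [ 99.7, 96.0, 112.5]])
aics = [251.2, 249.8, 255.0]          # invented in-sample AICs
w = akaike_weights(aics)
ensemble = w @ forecasts               # AIC-weighted average per lead day
print(np.round(ensemble, 1))
```

The ensemble forecast always lies between the candidate forecasts, so the averaging hedges against committing to a single "best" model, which is the benefit the study reports at longer lead times.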

  3. Predicting crappie recruitment in Ohio reservoirs with spawning stock size, larval density, and chlorophyll concentrations

    USGS Publications Warehouse

    Bunnell, David B.; Hale, R. Scott; Vanni, Michael J.; Stein, Roy A.

    2006-01-01

    Stock-recruit models typically use only spawning stock size as a predictor of recruitment to a fishery. In this paper, however, we used spawning stock size as well as larval density and key environmental variables to predict recruitment of white crappies Pomoxis annularis and black crappies P. nigromaculatus, a genus notorious for variable recruitment. We sampled adults and recruits from 11 Ohio reservoirs and larvae from 9 reservoirs during 1998-2001. We sampled chlorophyll as an index of reservoir productivity and obtained daily estimates of water elevation to determine the impact of hydrology on recruitment. Akaike's information criterion (AIC) revealed that Ricker and Beverton-Holt stock-recruit models that included chlorophyll best explained the variation in larval density and age-2 recruits. Specifically, spawning stock catch per effort (CPE) and chlorophyll explained 63-64% of the variation in larval density. In turn, larval density and chlorophyll explained 43-49% of the variation in age-2 recruit CPE. Finally, spawning stock CPE and chlorophyll were the best predictors of recruit CPE (i.e., 74-86%). Although larval density and recruitment increased with chlorophyll, neither was related to seasonal water elevation. Also, the AIC generally did not distinguish between Ricker and Beverton-Holt models. From these relationships, we concluded that crappie recruitment can be limited by spawning stock CPE and larval production when spawning stock sizes are low (i.e., CPE < 5 crappies/net-night). At higher levels of spawning stock size, spawning stock CPE and recruitment were less clearly related. To predict recruitment in Ohio reservoirs, managers should assess spawning stock CPE with trap nets and estimate chlorophyll concentrations. To increase crappie recruitment in reservoirs where recruitment is consistently poor, managers should use regulations to increase spawning stock size, which, in turn, should increase larval production and recruits to the fishery.
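    Both stock-recruit curves compared above can be fitted by linearization and then ranked by AIC. A hedged sketch on simulated, Ricker-generated data (the parameter values and `gaussian_aic` helper are assumptions, not the study's fits):

```python
# Fit Ricker (R = a*S*exp(-b*S), so ln(R/S) = ln a - b*S) and Beverton-Holt
# (R = a*S/(1 + b*S), so S/R = 1/a + (b/a)*S) by linear least squares,
# then compare them by AIC on the recruit scale.
import numpy as np

def gaussian_aic(obs, pred, n_params):
    n = len(obs)
    rss = np.sum((obs - pred) ** 2)
    return n * np.log(rss / n) + 2 * (n_params + 1)

rng = np.random.default_rng(42)
S = rng.uniform(1, 20, 120)                                       # spawning stock CPE
R = 2.5 * S * np.exp(-0.12 * S) * rng.lognormal(0, 0.1, 120)      # Ricker truth

# Ricker fit: regress ln(R/S) on S
slope_r, loga = np.polyfit(S, np.log(R / S), 1)
ricker = np.exp(loga) * S * np.exp(slope_r * S)

# Beverton-Holt fit: regress S/R on S
slope_bh, intercept_bh = np.polyfit(S, S / R, 1)
bh = S / (intercept_bh + slope_bh * S)

aic_ricker = gaussian_aic(R, ricker, 2)
aic_bh = gaussian_aic(R, bh, 2)
print(aic_ricker, aic_bh)
```

With dome-shaped (Ricker) data the AIC separates the two curves cleanly; on real data with the same parameter count, the abstract notes, the criterion often cannot distinguish them, since both curves are near-identical over the observed stock range.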

  4. Water availability determines the richness and density of fig trees within Brazilian semideciduous forest landscapes

    NASA Astrophysics Data System (ADS)

    Coelho, Luís Francisco Mello; Ribeiro, Milton Cezar; Pereira, Rodrigo Augusto Santinelo

    2014-05-01

    The success of fig trees in tropical ecosystems is evidenced by the great diversity (more than 750 species) and wide geographic distribution of the genus. We assessed the contribution of environmental variables to the species richness and density of fig trees in fragments of seasonal semideciduous forest (SSF) in Brazil. We surveyed 20 forest fragments in three regions of Sao Paulo State, Brazil. Fig tree richness and density were estimated in rectangular plots, comprising 31.4 ha sampled. Both richness and fig tree density were linearly modeled as functions of variables representing (1) fragment metrics, (2) forest structure, and (3) landscape metrics expressing water drainage in the fragments. Model selection was performed by comparing the AIC values (Akaike Information Criterion) and the relative weight of each model (wAIC). Both species richness and fig tree density were better explained by the water availability in the fragment (meters of streams/ha): wAIC(richness) = 0.45, wAIC(density) = 0.96. The remaining variables, related to anthropic perturbation and forest structure, carried little weight in the models. The rainfall seasonality in SSF seems to select for both establishment strategies and morphological adaptations in the hemiepiphytic fig tree species. In the studied SSF, hemiepiphytes established at lower heights in their host trees than reported for fig trees in evergreen rainforests. Some hemiepiphytic fig species evolved superficial roots extending up to 100 m from their trunks, resulting in hectare-scale root zones that allow them to efficiently forage for water and soil nutrients. The community of fig trees was robust to variation in forest structure and conservation level of SSF fragments, making this group of plants an important element for the functioning of seasonal tropical forests.

  5. Prognosis Evaluation in Patients with Hepatocellular Carcinoma after Hepatectomy: Comparison of BCLC, TNM and Hangzhou Criteria Staging Systems

    PubMed Central

    Lu, Wu-sheng; Yan, Lu-nan; Xiao, Guang-qin; Jiang, Li; Yang, Jian; Yang, Jia-yin

    2014-01-01

    Purpose This study aims to evaluate the Hangzhou criteria (HC) for patients with HCC undergoing surgical resection and to identify whether this staging system is superior to other staging systems in predicting the survival of resectable HCC. Method 774 HCC patients who underwent surgical resection between 2007 and 2009 at West China Hospital were enrolled retrospectively. Predictors of survival were identified using the Kaplan–Meier method and the Cox model. The disease state was staged by the HC, as well as by the TNM and BCLC staging systems. Prognostic powers were quantified using a linear trend χ2 test, the c-index, and the likelihood ratio (LHR) χ2 test, and correlated using Cox regression adjusted by the Akaike information criterion (AIC). Results Serum AFP level (P = 0.02), tumor size (P<0.001), tumor number (P<0.001), portal vein invasion (P<0.001), hepatic vein invasion (P<0.001), tumor differentiation (P<0.001), and distant organ (P = 0.016) and lymph node (P<0.001) metastasis were identified as independent risk factors of survival after resection by multivariate analysis. The comparison of the different staging systems showed that BCLC had the best homogeneity (likelihood ratio χ2 test 151.119, P<0.001), the TNM system had the best monotonicity of gradients (linear trend χ2 test 137.523, P<0.001), and discriminatory ability was highest for the BCLC (the AUC for 1-year mortality was 0.759) and TNM staging systems (the AUCs for 3- and 5-year mortality were 0.738 and 0.731, respectively). However, based on the c-index and AIC, the HC was the most informative staging system in predicting survival (c-index 0.6866, AIC 5924.4729). Conclusions The HC can provide important prognostic information after surgery. The HC were shown to be a promising survival predictor in a Chinese cohort of patients with resectable HCC. PMID:25133493

  6. Model averaging and muddled multimodel inferences

    USGS Publications Warehouse

    Cade, Brian S.

    2015-01-01

    Three flawed practices associated with model averaging coefficients for predictor variables in regression models commonly occur when making multimodel inferences in analyses of ecological data. Model-averaged regression coefficients based on Akaike information criterion (AIC) weights have been recommended for addressing model uncertainty, but they are not valid, interpretable estimates of partial effects for individual predictors when there is multicollinearity among the predictor variables. Multicollinearity implies that the scaling of units in the denominators of the regression coefficients may change across models such that neither the parameters nor their estimates have common scales; therefore, averaging them makes no sense. The associated sums of AIC model weights recommended to assess relative importance of individual predictors are really a measure of relative importance of models, with little information about contributions by individual predictors compared to other measures of relative importance based on effect size or variance reduction. Sometimes the model-averaged regression coefficients for predictor variables are incorrectly used to make model-averaged predictions of the response variable when the models are not linear in the parameters. I demonstrate the issues with the first two practices using the college grade point average example extensively analyzed by Burnham and Anderson. I show how partial standard deviations of the predictor variables can be used to detect changing scales of their estimates with multicollinearity. Standardizing estimates based on partial standard deviations for their variables can be used to make the scaling of the estimates commensurate across models, a necessary but not sufficient condition for model averaging of the estimates to be sensible. A unimodal distribution of estimates and valid interpretation of individual parameters are additional requisite conditions. The standardized estimates or equivalently the
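
    The partial standard deviations discussed above can be sketched as follows (a minimal illustration, not the author's code; it assumes the common VIF-based definition, s*_j = s_j * sqrt(1/VIF_j) * sqrt((n-1)/(n-p)), where VIF_j is the variance inflation factor of predictor j):

```python
import numpy as np

def partial_sd(X):
    """Partial standard deviations of the columns of X (n rows, p predictors).

    For each predictor j, VIF_j is computed by regressing column j on the
    remaining columns (with intercept); the partial SD shrinks the sample
    SD by sqrt(1/VIF_j), with a small-sample correction sqrt((n-1)/(n-p)).
    """
    n, p = X.shape
    out = np.empty(p)
    for j in range(p):
        xj = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        coef, *_ = np.linalg.lstsq(A, xj, rcond=None)
        resid = xj - A @ coef
        r2 = 1.0 - resid.var() / xj.var()
        vif = 1.0 / (1.0 - r2)
        out[j] = xj.std(ddof=1) * np.sqrt(1.0 / vif) * np.sqrt((n - 1) / (n - p))
    return out

# Standardized estimates commensurate across models (beta: raw coefficients):
# beta_std = beta * partial_sd(X)
```

    With multicollinearity, VIF_j grows and the partial SD shrinks relative to the ordinary sample SD, which is exactly the changing-scale effect the abstract describes.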

  7. Model averaging and muddled multimodel inferences.

    PubMed

    Cade, Brian S

    2015-09-01

    Three flawed practices associated with model averaging coefficients for predictor variables in regression models commonly occur when making multimodel inferences in analyses of ecological data. Model-averaged regression coefficients based on Akaike information criterion (AIC) weights have been recommended for addressing model uncertainty, but they are not valid, interpretable estimates of partial effects for individual predictors when there is multicollinearity among the predictor variables. Multicollinearity implies that the scaling of units in the denominators of the regression coefficients may change across models such that neither the parameters nor their estimates have common scales; therefore, averaging them makes no sense. The associated sums of AIC model weights recommended to assess relative importance of individual predictors are really a measure of relative importance of models, with little information about contributions by individual predictors compared to other measures of relative importance based on effect size or variance reduction. Sometimes the model-averaged regression coefficients for predictor variables are incorrectly used to make model-averaged predictions of the response variable when the models are not linear in the parameters. I demonstrate the issues with the first two practices using the college grade point average example extensively analyzed by Burnham and Anderson. I show how partial standard deviations of the predictor variables can be used to detect changing scales of their estimates with multicollinearity. Standardizing estimates based on partial standard deviations for their variables can be used to make the scaling of the estimates commensurate across models, a necessary but not sufficient condition for model averaging of the estimates to be sensible. A unimodal distribution of estimates and valid interpretation of individual parameters are additional requisite conditions. The standardized estimates or equivalently the t

  8. The kinetics of fluoride sorption by zeolite: Effects of cadmium, barium and manganese.

    PubMed

    Cai, Qianqian; Turner, Brett D; Sheng, Daichao; Sloan, Scott

    2015-01-01

    Industrial wastewaters often consist of a complex chemical cocktail with treatment of target contaminants complicated by adverse chemical reactions. The impact of metal ions (Cd(2+), Ba(2+) and Mn(2+)) on the kinetics of fluoride removal from solution by natural zeolite was investigated. In order to better understand the kinetics, the pseudo-second order (PSO), Hill (Hill 4 and Hill 5) and intra-particle diffusion (IPD) models were applied. Model fitting was compared using the Akaike Information Criterion (AIC) and the Schwarz Bayesian Information Criterion (BIC). The Hill models (Hill 4 and Hill 5) were found to be superior in describing the fluoride removal processes due to the sigmoidal nature of the kinetics. Results indicate that the presence of Mn (100 mg L(-1)) and Cd (100 mg L(-1)) increases the rate of fluoride sorption by factors of ~28.3 and ~10.9, respectively, while the maximum sorption capacity increases by factors of ~2.2 and ~1.7. The presence of Ba (100 mg L(-1)) initially inhibited fluoride removal and very poor fits were obtained for all models. Fitting was best described with a biphasic sigmoidal model, with the degree of inhibition decreasing with increasing temperature, suggesting that at least two processes are involved in fluoride sorption onto natural zeolite in the presence of Ba.
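
    Comparing kinetic models such as PSO and Hill by AIC and BIC typically starts from each fit's residual sum of squares (RSS); a minimal sketch of the least-squares forms of both criteria (the RSS values and parameter counts below are hypothetical, not the paper's):

```python
import math

def aic_rss(n, rss, k):
    """AIC for a least-squares fit: n*ln(rss/n) + 2k, with n data points
    and k estimated parameters (including the error variance)."""
    return n * math.log(rss / n) + 2 * k

def bic_rss(n, rss, k):
    """BIC analogue: n*ln(rss/n) + k*ln(n)."""
    return n * math.log(rss / n) + k * math.log(n)

# Hypothetical fits of fluoride-uptake kinetics (illustrative values):
n = 30
print(aic_rss(n, 0.042, 3))   # e.g. a PSO fit (2 parameters + variance)
print(aic_rss(n, 0.015, 5))   # e.g. a Hill 4 fit (4 parameters + variance)
```

    BIC penalizes extra parameters more heavily than AIC whenever ln(n) > 2, which is why the two criteria can disagree on the more complex Hill models.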

  9. Evaluating Key Watershed Components of Low Flow Regimes in New England Streams.

    PubMed

    Morrison, Alisa C; Gold, Arthur J; Pelletier, Marguerite C

    2016-05-01

    Water resource managers seeking to optimize stream ecosystem services and abstractions of water from watersheds need an understanding of the importance of land use, physical and climatic characteristics, and hydrography on different low flow components of stream hydrographs. Within 33 USGS gaged watersheds of southern New England, we assessed relationships between watershed variables and a set of low flow parameters by using an information-theoretic approach. The key variables identified by the Akaike Information Criterion (AIC) weighting factors as generating positive relationships with low flow events included percent stratified drift, mean elevation, drainage area, and mean August precipitation. The extent of wetlands in the watershed was negatively related to low flow magnitudes. Of the various land use variables, the percentage of developed land had the highest importance and a negative relationship with low flow magnitudes, but was less important than wetlands and physical and climatic features. Our results suggest that management practices aimed at sustaining low flows in fluvial systems can benefit from attention to specific watershed features. We draw attention to the finding that streams located in watersheds with high proportions of wetlands may require more stringent approaches to withdrawals to sustain fluvial ecosystems during drought periods, particularly in watersheds with extensive development and limited deposits of stratified drift.

  10. A novel electrocardiogram parameterization algorithm and its application in myocardial infarction detection.

    PubMed

    Liu, Bin; Liu, Jikui; Wang, Guoqing; Huang, Kun; Li, Fan; Zheng, Yang; Luo, Youxi; Zhou, Fengfeng

    2015-06-01

    The electrocardiogram (ECG) is a biophysical electric signal generated by the heart muscle, and is one of the major measurements of how well a heart functions. Automatic ECG analysis algorithms usually extract the geometric or frequency-domain features of the ECG signals and have already significantly facilitated automatic ECG-based cardiac disease diagnosis. We propose a novel ECG feature by fitting a given ECG signal with a 20th order polynomial function, defined as PolyECG-S. The PolyECG-S feature is almost identical to the fitted ECG curve, measured by the Akaike information criterion (AIC), and achieved a 94.4% accuracy in detecting myocardial infarction (MI) on the test dataset. Currently, ST segment elevation is one of the major ways to detect MI (ST-elevation myocardial infarction, STEMI). However, many ECG signals have weak or even undetectable ST segments. Since PolyECG-S does not rely on the information of ST waves, it can be used as a complementary MI detection algorithm with the STEMI strategy. Overall, our results suggest that the PolyECG-S feature may satisfactorily reconstruct the fitted ECG curve, and is complementary to the existing ECG features for automatic cardiac function analysis.
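
    The PolyECG-S idea of representing a beat by the coefficients of a high-order polynomial fit can be sketched with NumPy on a synthetic waveform (illustrative only; this is not the paper's data or algorithm, and the waveform is a crude stand-in for an ECG beat):

```python
import numpy as np

# Synthetic "ECG-like" waveform: a slow oscillation plus a narrow spike
t = np.linspace(0.0, 1.0, 200)
signal = np.sin(2 * np.pi * t) + 0.6 * np.exp(-((t - 0.3) ** 2) / 0.002)

# Fit a 20th-order polynomial; its 21 coefficients play the role of
# the PolyECG-S feature vector (Polynomial.fit scales the domain for
# numerical stability)
poly = np.polynomial.Polynomial.fit(t, signal, deg=20)
fitted = poly(t)

# Goodness of fit of the reconstruction, scored by AIC from the RSS
rss = float(np.sum((signal - fitted) ** 2))
n, k = len(t), len(poly.coef) + 1      # +1 for the error variance
aic = n * np.log(rss / n) + 2 * k
```
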

  11. Joint inversion of T1-T2 spectrum combining the iterative truncated singular value decomposition and the parallel particle swarm optimization algorithms

    NASA Astrophysics Data System (ADS)

    Ge, Xinmin; Wang, Hua; Fan, Yiren; Cao, Yingchang; Chen, Hua; Huang, Rui

    2016-01-01

    With more information than the conventional one dimensional (1D) longitudinal relaxation time (T1) and transversal relaxation time (T2) spectra, a two dimensional (2D) T1-T2 spectrum in low field nuclear magnetic resonance (NMR) is developed to discriminate the relaxation components of fluids such as water, oil and gas in porous rock. However, the accuracy and efficiency of the T1-T2 spectrum are limited by the existing inversion algorithms and data acquisition schemes. We introduce a joint method to invert the T1-T2 spectrum, which combines iterative truncated singular value decomposition (TSVD) and a parallel particle swarm optimization (PSO) algorithm to obtain fast computation and stable solutions. We reorganize the Fredholm integral equation of the first kind with two kernels into a nonlinear optimization problem with non-negative constraints, and then solve the ill-conditioned problem by the iterative TSVD. Truncating positions of the two diagonal matrices are obtained by the Akaike information criterion (AIC). With the initial values obtained by TSVD, we use a PSO with parallel structure to get the global optimal solutions with a high computational speed. We use synthetic data with different signal to noise ratios (SNR) to test the performance of the proposed method. The result shows that the new inversion algorithm can achieve favorable solutions for signals with SNR larger than 10, and the inversion precision increases as the number of components in the porous rock decreases.

  12. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed using two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, variety of wind speed values and wind power curtailment.
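
    Selecting the number of Weibull mixture components via AIC and BIC reduces to a penalized comparison of the maximized log-likelihoods; a minimal sketch with hypothetical log-likelihood values (assuming, as is standard, that an m-component mixture has 3m - 1 free parameters: shape and scale per component plus m - 1 weights):

```python
import math

def pick_k(loglik, n):
    """Pick the number of Weibull mixture components minimizing AIC and BIC.

    loglik: dict mapping m (number of components) -> maximized log-likelihood
    n: number of observations used in the fit
    """
    def aic(m):
        return 2 * (3 * m - 1) - 2 * loglik[m]

    def bic(m):
        return (3 * m - 1) * math.log(n) - 2 * loglik[m]

    return min(loglik, key=aic), min(loglik, key=bic)

# Hypothetical maximized log-likelihoods for 1-, 2- and 3-component mixtures
best_aic, best_bic = pick_k({1: -1200.0, 2: -1150.0, 3: -1148.0}, n=500)
```

    Because BIC's penalty grows with ln(n), it tends to favor fewer components than AIC on large wind power datasets.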

  13. Multimodel predictive system for carbon dioxide solubility in saline formation waters.

    PubMed

    Wang, Zan; Small, Mitchell J; Karamalidis, Athanasios K

    2013-02-01

    The prediction of carbon dioxide solubility in brine at conditions relevant to carbon sequestration (i.e., high temperature, pressure, and salt concentration (T-P-X)) is crucial when this technology is applied. Eleven mathematical models for predicting CO(2) solubility in brine are compared and considered for inclusion in a multimodel predictive system. Model goodness of fit is evaluated over the temperature range 304-433 K, pressure range 74-500 bar, and salt concentration range 0-7 m (NaCl equivalent), using 173 published CO(2) solubility measurements, particularly selected for those conditions. The performance of each model is assessed using various statistical methods, including the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Different models emerge as best fits for different subranges of the input conditions. A classification tree is generated using machine learning methods to predict the best-performing model under different T-P-X subranges, allowing development of a multimodel predictive system (MMoPS) that selects and applies the model expected to yield the most accurate CO(2) solubility prediction. Statistical analysis of the MMoPS predictions, including a stratified 5-fold cross validation, shows that MMoPS outperforms each individual model and increases the overall accuracy of CO(2) solubility prediction across the range of T-P-X conditions likely to be encountered in carbon sequestration applications.

  14. Multimodel Predictive System for Carbon Dioxide Solubility in Saline Formation Waters

    SciTech Connect

    Wang, Zan; Small, Mitchell J; Karamalidis, Athanasios K

    2013-02-05

    The prediction of carbon dioxide solubility in brine at conditions relevant to carbon sequestration (i.e., high temperature, pressure, and salt concentration (T-P-X)) is crucial when this technology is applied. Eleven mathematical models for predicting CO2 solubility in brine are compared and considered for inclusion in a multimodel predictive system. Model goodness of fit is evaluated over the temperature range 304–433 K, pressure range 74–500 bar, and salt concentration range 0–7 m (NaCl equivalent), using 173 published CO2 solubility measurements, particularly selected for those conditions. The performance of each model is assessed using various statistical methods, including the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). Different models emerge as best fits for different subranges of the input conditions. A classification tree is generated using machine learning methods to predict the best-performing model under different T-P-X subranges, allowing development of a multimodel predictive system (MMoPS) that selects and applies the model expected to yield the most accurate CO2 solubility prediction. Statistical analysis of the MMoPS predictions, including a stratified 5-fold cross validation, shows that MMoPS outperforms each individual model and increases the overall accuracy of CO2 solubility prediction across the range of T-P-X conditions likely to be encountered in carbon sequestration applications.

  15. Effect of ultrasound pre-treatment on the drying kinetics of brown seaweed Ascophyllum nodosum.

    PubMed

    Kadam, Shekhar U; Tiwari, Brijesh K; O'Donnell, Colm P

    2015-03-01

    The effect of ultrasound pre-treatment on the drying kinetics of brown seaweed Ascophyllum nodosum under hot-air convective drying was investigated. Pretreatments were carried out at ultrasound intensity levels ranging from 7.00 to 75.78 Wcm(-2) for 10 min using an ultrasonic probe system. It was observed that ultrasound pre-treatments reduced the drying time required. The shortest drying times were obtained from samples pre-treated at 75.78 Wcm(-2). The fit quality of 6 thin-layer drying models was also evaluated using the coefficient of determination (R(2)), root mean square error (RMSE), AIC (Akaike information criterion) and BIC (Bayesian information criterion). Drying kinetics were modelled using the Newton, Henderson and Pabis, Page, Wang and Singh, Midilli et al. and Weibull models. The Newton, Wang and Singh, and Midilli et al. models showed the best fit to the experimental drying data. The color of ultrasound-pretreated dried seaweed samples was lighter compared to control samples. It was concluded that ultrasound pretreatment can be effectively used to reduce the energy cost and drying time for A. nodosum. PMID:25454823
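
    Fitting a thin-layer model such as the Page equation and scoring it with R2, AIC and BIC can be sketched as follows (the moisture-ratio data below are hypothetical, used only to illustrate the workflow; the Page model is fitted here by linearization rather than nonlinear least squares):

```python
import numpy as np

# Hypothetical drying data: time (min) and moisture ratio MR
t = np.array([10.0, 20.0, 40.0, 60.0, 90.0, 120.0])
mr = np.array([0.82, 0.64, 0.40, 0.26, 0.14, 0.08])

# Page model MR = exp(-k t^n), linearized: ln(-ln MR) = ln k + n ln t
y = np.log(-np.log(mr))
slope, intercept = np.polyfit(np.log(t), y, 1)
n_par, k_par = slope, np.exp(intercept)

pred = np.exp(-k_par * t ** n_par)
rss = float(np.sum((mr - pred) ** 2))
r2 = 1.0 - rss / float(np.sum((mr - mr.mean()) ** 2))
npts, k = len(t), 3                      # 2 model parameters + error variance
aic = npts * np.log(rss / npts) + 2 * k
bic = npts * np.log(rss / npts) + k * np.log(npts)
```
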

  16. Prediction and extension of curves of distillation of vacuum residue using probability functions

    NASA Astrophysics Data System (ADS)

    León, A. Y.; Riaño, P. A.; Laverde, D.

    2016-02-01

    Probability functions have been used to predict crude distillation curves in various characterization studies for refining processes. In this work, four probability functions (Weibull extreme, Weibull, Kumaraswamy and Riazi) were analyzed for fitting distillation curves of vacuum residue. After analysing the experimental data, the Weibull extreme function was selected as the best prediction function; its fitting capability was validated using the AIC (Akaike Information Criterion), BIC (Bayesian Information Criterion) and correlation coefficient R2 as estimation criteria. To cover a wide range of compositions, fifty-five (55) vacuum residues derived from different hydrocarbon mixtures were selected. The parameters of the Weibull extreme probability function were adjusted from simply measured properties such as Conradson Carbon Residue (CCR) and SARA compositional analysis (saturates, aromatics, resins and asphaltenes). The proposed method is an appropriate tool to describe the tendency of distillation curves and offers a practical approach for the classification of vacuum residues.

  17. Modelling lactation curve for milk fat to protein ratio in Iranian buffaloes (Bubalus bubalis) using non-linear mixed models.

    PubMed

    Hossein-Zadeh, Navid Ghavi

    2016-08-01

    The aim of this study was to compare seven non-linear mathematical models (Brody, Wood, Dhanoa, Sikka, Nelder, Rook and Dijkstra) to examine their efficiency in describing the lactation curves for milk fat to protein ratio (FPR) in Iranian buffaloes. Data were 43 818 test-day records for FPR from the first three lactations of Iranian buffaloes which were collected on 523 dairy herds in the period from 1996 to 2012 by the Animal Breeding Center of Iran. Each model was fitted to monthly FPR records of buffaloes using the non-linear mixed model procedure (PROC NLMIXED) in SAS and the parameters were estimated. The models were tested for goodness of fit using Akaike's information criterion (AIC), Bayesian information criterion (BIC) and log maximum likelihood (-2 Log L). The Nelder and Sikka mixed models provided the best fit of lactation curve for FPR in the first and second lactations of Iranian buffaloes, respectively. However, Wood, Dhanoa and Sikka mixed models provided the best fit of lactation curve for FPR in the third parity buffaloes. Evaluation of first, second and third lactation features showed that all models, except for Dijkstra model in the third lactation, under-predicted test time at which daily FPR was minimum. On the other hand, minimum FPR was over-predicted by all equations. Evaluation of the different models used in this study indicated that non-linear mixed models were sufficient for fitting test-day FPR records of Iranian buffaloes. PMID:27600968

  18. The kinetics of fluoride sorption by zeolite: Effects of cadmium, barium and manganese.

    PubMed

    Cai, Qianqian; Turner, Brett D; Sheng, Daichao; Sloan, Scott

    2015-01-01

    Industrial wastewaters often consist of a complex chemical cocktail with treatment of target contaminants complicated by adverse chemical reactions. The impact of metal ions (Cd(2+), Ba(2+) and Mn(2+)) on the kinetics of fluoride removal from solution by natural zeolite was investigated. In order to better understand the kinetics, the pseudo-second order (PSO), Hill (Hill 4 and Hill 5) and intra-particle diffusion (IPD) models were applied. Model fitting was compared using the Akaike Information Criterion (AIC) and the Schwarz Bayesian Information Criterion (BIC). The Hill models (Hill 4 and Hill 5) were found to be superior in describing the fluoride removal processes due to the sigmoidal nature of the kinetics. Results indicate that the presence of Mn (100 mg L(-1)) and Cd (100 mg L(-1)) increases the rate of fluoride sorption by factors of ~28.3 and ~10.9, respectively, while the maximum sorption capacity increases by factors of ~2.2 and ~1.7. The presence of Ba (100 mg L(-1)) initially inhibited fluoride removal and very poor fits were obtained for all models. Fitting was best described with a biphasic sigmoidal model, with the degree of inhibition decreasing with increasing temperature, suggesting that at least two processes are involved in fluoride sorption onto natural zeolite in the presence of Ba. PMID:25909159

  19. Comparison of Two Gas Selection Methodologies: An Application of Bayesian Model Averaging

    SciTech Connect

    Renholds, Andrea S.; Thompson, Sandra E.; Anderson, Kevin K.; Chilton, Lawrence K.

    2006-03-31

    One goal of hyperspectral imagery analysis is the detection and characterization of plumes. Characterization includes identifying the gases in the plumes, which is a model selection problem. Two gas selection methods compared in this report are Bayesian model averaging (BMA) and minimum Akaike information criterion (AIC) stepwise regression (SR). Simulated spectral data from a three-layer radiance transfer model were used to compare the two methods. Test gases were chosen to span the types of spectra observed, which exhibit peaks ranging from broad to sharp. The size and complexity of the search libraries were varied. Background materials were chosen to either replicate a remote area of eastern Washington or feature many common background materials. For many cases, BMA and SR performed the detection task comparably in terms of the receiver operating characteristic curves. For some gases, BMA performed better than SR when the size and complexity of the search library increased. This is encouraging because we expect improved BMA performance upon incorporation of prior information on background materials and gases.

  20. Effect of ultrasound pre-treatment on the drying kinetics of brown seaweed Ascophyllum nodosum.

    PubMed

    Kadam, Shekhar U; Tiwari, Brijesh K; O'Donnell, Colm P

    2015-03-01

    The effect of ultrasound pre-treatment on the drying kinetics of brown seaweed Ascophyllum nodosum under hot-air convective drying was investigated. Pretreatments were carried out at ultrasound intensity levels ranging from 7.00 to 75.78 Wcm(-2) for 10 min using an ultrasonic probe system. It was observed that ultrasound pre-treatments reduced the drying time required. The shortest drying times were obtained from samples pre-treated at 75.78 Wcm(-2). The fit quality of 6 thin-layer drying models was also evaluated using the coefficient of determination (R(2)), root mean square error (RMSE), AIC (Akaike information criterion) and BIC (Bayesian information criterion). Drying kinetics were modelled using the Newton, Henderson and Pabis, Page, Wang and Singh, Midilli et al. and Weibull models. The Newton, Wang and Singh, and Midilli et al. models showed the best fit to the experimental drying data. The color of ultrasound-pretreated dried seaweed samples was lighter compared to control samples. It was concluded that ultrasound pretreatment can be effectively used to reduce the energy cost and drying time for A. nodosum.

  1. Modelling lactation curve for milk fat to protein ratio in Iranian buffaloes (Bubalus bubalis) using non-linear mixed models.

    PubMed

    Hossein-Zadeh, Navid Ghavi

    2016-08-01

    The aim of this study was to compare seven non-linear mathematical models (Brody, Wood, Dhanoa, Sikka, Nelder, Rook and Dijkstra) to examine their efficiency in describing the lactation curves for milk fat to protein ratio (FPR) in Iranian buffaloes. Data were 43 818 test-day records for FPR from the first three lactations of Iranian buffaloes which were collected on 523 dairy herds in the period from 1996 to 2012 by the Animal Breeding Center of Iran. Each model was fitted to monthly FPR records of buffaloes using the non-linear mixed model procedure (PROC NLMIXED) in SAS and the parameters were estimated. The models were tested for goodness of fit using Akaike's information criterion (AIC), Bayesian information criterion (BIC) and log maximum likelihood (-2 Log L). The Nelder and Sikka mixed models provided the best fit of lactation curve for FPR in the first and second lactations of Iranian buffaloes, respectively. However, Wood, Dhanoa and Sikka mixed models provided the best fit of lactation curve for FPR in the third parity buffaloes. Evaluation of first, second and third lactation features showed that all models, except for Dijkstra model in the third lactation, under-predicted test time at which daily FPR was minimum. On the other hand, minimum FPR was over-predicted by all equations. Evaluation of the different models used in this study indicated that non-linear mixed models were sufficient for fitting test-day FPR records of Iranian buffaloes.

  2. Flexible and fixed mathematical models describing growth patterns of chukar partridges

    NASA Astrophysics Data System (ADS)

    Aygün, Ali; Narinç, Doǧan

    2016-04-01

    In animal science, the nonlinear regression models used to analyze growth patterns are separated into two groups, called fixed and flexible according to their point of inflection. The aims of this study were to compare fixed and flexible growth functions and to determine the best fit model for the growth data of chukar partridges. With this aim, the growth data of partridges were modeled with widely used models, such as Gompertz, Logistic and Von Bertalanffy, as well as flexible functions, such as Richards, Janoschek and Levakovich. To evaluate the growth functions, the R2 (coefficient of determination), adjusted R2 (adjusted coefficient of determination), MSE (mean square error), AIC (Akaike's information criterion) and BIC (Bayesian information criterion) goodness of fit criteria were used. According to these goodness of fit criteria, the best fitting model for the chukar partridge growth data was the Janoschek function, which has a flexible structure. The Janoschek model is important not only because it has a higher number of parameters with biological meaning than the other functions (the mature weight and initial weight parameters), but also because it had not previously been used in modeling chukar partridge growth.
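
    Fitting a growth function such as Gompertz and scoring it with AIC can be sketched as follows (the weekly body weights and the fixed mature-weight guess are hypothetical; real analyses estimate all parameters jointly by nonlinear least squares rather than by this linearization):

```python
import numpy as np

# Hypothetical weekly body weights of a growing bird (illustrative values)
weeks = np.arange(1, 13, dtype=float)
w = np.array([18, 31, 50, 75, 104, 135, 165, 192, 215, 233, 247, 258.0])

# Gompertz: W(t) = a*exp(-b*exp(-k t)). With the mature weight a fixed
# (here guessed a bit above the last observation), the model linearizes:
# ln(-ln(W/a)) = ln b - k t
a = 300.0
y = np.log(-np.log(w / a))
slope, intercept = np.polyfit(weeks, y, 1)
k_rate, b = -slope, np.exp(intercept)

pred = a * np.exp(-b * np.exp(-k_rate * weeks))
rss = float(np.sum((w - pred) ** 2))
n, p = len(w), 4                       # a, b, k + error variance
aic = n * np.log(rss / n) + 2 * p      # compare across Gompertz, Logistic, ...
```

    Refitting the same data with each candidate function and comparing the resulting AIC (and BIC) values is the model-selection step the abstract describes.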

  3. Information Integrity

    ERIC Educational Resources Information Center

    Graves, Eric

    2013-01-01

    This dissertation introduces the concept of Information Integrity, which is the detection and possible correction of information manipulation by any intermediary node in a communication system. As networks continue to grow in complexity, information theoretic security has failed to keep pace. As a result many parties who want to communicate,…

  4. Information Gatekeepers.

    ERIC Educational Resources Information Center

    Metoyer-Duran, Cheryl

    1993-01-01

    Discusses the role of human information gatekeepers. Topics addressed include gatekeeper research in health sciences, education, science and technology, communication studies, journalism, and library and information science; cultural context; information needs and use models; cross-cultural research; ethnolinguistic gatekeepers; and research…

  5. Information "Literacies"

    ERIC Educational Resources Information Center

    Anderson, Byron

    2007-01-01

    As communication technologies change, so do libraries. Library instruction programs are now focused on teaching information literacy, a term that may just as well be referred to as information "literacies." The new media age involves information in a wide variety of mediums. Educators everywhere are realizing media's power to communicate and…

  6. Information Technology.

    ERIC Educational Resources Information Center

    Reynolds, Roger

    1983-01-01

    Describes important information-handling products, predicting future devices in light of convergence and greater flexibility offered through use of microchip technology. Contends that information technology and its impact of privacy depends on how information systems are used, arguing that the privacy issue deals more with moral/physiological…

  7. Informal Taxation*

    PubMed Central

    Olken, Benjamin A.; Singhal, Monica

    2011-01-01

    Informal payments are a frequently overlooked source of local public finance in developing countries. We use microdata from ten countries to establish stylized facts on the magnitude, form, and distributional implications of this “informal taxation.” Informal taxation is widespread, particularly in rural areas, with substantial in-kind labor payments. The wealthy pay more, but pay less in percentage terms, and informal taxes are more regressive than formal taxes. Failing to include informal taxation underestimates household tax burdens and revenue decentralization in developing countries. We discuss various explanations for and implications of these observed stylized facts. PMID:22199993

  8. Informal Taxation.

    PubMed

    Olken, Benjamin A; Singhal, Monica

    2011-10-01

    Informal payments are a frequently overlooked source of local public finance in developing countries. We use microdata from ten countries to establish stylized facts on the magnitude, form, and distributional implications of this "informal taxation." Informal taxation is widespread, particularly in rural areas, with substantial in-kind labor payments. The wealthy pay more, but pay less in percentage terms, and informal taxes are more regressive than formal taxes. Failing to include informal taxation underestimates household tax burdens and revenue decentralization in developing countries. We discuss various explanations for and implications of these observed stylized facts.

  9. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, Kritina L.; Thompson, Shelby G.; Sandor, Aniko; McCann, Robert S.; Kaiser, Mary K.; Adelstein, Barnard D.; Begault, Durand R.; Beutter, Brent R.; Stone, Leland S.; Godfroy, Martine

    2009-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. In addition to addressing display design issues associated with information formatting, style, layout, and interaction, the Information Presentation DRP is also working toward understanding the effects of extreme environments encountered in space travel on information processing. Work is also in progress to refine human factors-based design tools, such as human performance modeling, that will supplement traditional design techniques and help ensure that optimal information design is accomplished in the most cost-efficient manner. The major areas of work, or subtasks, within the Information Presentation DRP for FY10 are: 1) Displays, 2) Controls, 3) Procedures and Fault Management, and 4) Human Performance Modeling. The poster will highlight completed and planned work for each subtask.

  10. Double-input compartmental modeling and spectral analysis for the quantification of positron emission tomography data in oncology

    NASA Astrophysics Data System (ADS)

    Tomasi, G.; Kimberley, S.; Rosso, L.; Aboagye, E.; Turkheimer, F.

    2012-04-01

    In positron emission tomography (PET) studies involving organs different from the brain, ignoring the metabolite contribution to the tissue time-activity curves (TAC), as in the standard single-input (SI) models, may compromise the accuracy of the estimated parameters. We employed here double-input (DI) compartmental modeling (CM), previously used for [11C]thymidine, and a novel DI spectral analysis (SA) approach on the tracers 5-[18F]fluorouracil (5-[18F]FU) and [18F]fluorothymidine ([18F]FLT). CM and SA were performed initially with a SI approach using the parent plasma TAC as an input function. These methods were then employed using a DI approach with the metabolite plasma TAC as an additional input function. Regions of interest (ROIs) corresponding to healthy liver, kidneys and liver metastases for 5-[18F]FU and to tumor, vertebra and liver for [18F]FLT were analyzed. For 5-[18F]FU, the improvement of the fit quality with the DI approaches was remarkable; in CM, the Akaike information criterion (AIC) always selected the DI over the SI model. Volume of distribution estimates obtained with DI CM and DI SA were in excellent agreement, for both parent 5-[18F]FU (R2 = 0.91) and metabolite [18F]FBAL (R2 = 0.99). For [18F]FLT, the DI methods provided notable improvements but less substantial than for 5-[18F]FU due to the lower rate of metabolism of [18F]FLT. On the basis of the AIC values, agreement between [18F]FLT Ki estimated with the SI and DI models was good (R2 = 0.75) for the ROIs where the metabolite contribution was negligible, indicating that the additional input did not bias the parent tracer only-related estimates. When the AIC suggested a substantial contribution of the metabolite [18F]FLT-glucuronide, on the other hand, the change in the parent tracer only-related parameters was significant (R2 = 0.33 for Ki). 
Our results indicated that improvements of DI over SI approaches can range from moderate to substantial and are more significant for tracers with…
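
    The model comparison used here reduces to a one-line formula: AIC = 2k − 2 ln L, and the model with the lower AIC is preferred. A minimal sketch in Python, where the log-likelihoods and parameter counts for the single-input (SI) and double-input (DI) models are invented for illustration, not taken from the study:

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike information criterion: AIC = 2k - 2*ln(L)."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fit results: an SI model with 4 parameters and a DI
# model with 6 parameters; the DI model fits markedly better.
aic_si = aic(log_likelihood=-120.0, n_params=4)   # 248.0
aic_di = aic(log_likelihood=-105.0, n_params=6)   # 222.0

# The lower-AIC model wins despite its extra parameters.
best = "DI" if aic_di < aic_si else "SI"
print(best)  # DI
```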

  11. Effects of floods on fish assemblages in an intermittent prairie stream

    USGS Publications Warehouse

    Franssen, N.R.; Gido, K.B.; Guy, C.S.; Tripe, J.A.; Shrank, S.J.; Strakosh, T.R.; Bertrand, K.N.; Franssen, C.M.; Pitts, K.L.; Paukert, C.P.

    2006-01-01

    1. Floods are major disturbances to stream ecosystems that can kill or displace organisms and modify habitats. Many studies have reported changes in fish assemblages after a single flood, but few studies have evaluated the importance of timing and intensity of floods on long-term fish assemblage dynamics. 2. We used a 10-year dataset to evaluate the effects of floods on fishes in Kings Creek, an intermittent prairie stream in north-eastern Kansas, U.S.A. Samples were collected seasonally at two perennial headwater sites (1995-2005) and one perennial downstream flowing site (1997-2005) allowing us to evaluate the effects of floods at different locations within a watershed. In addition, four surveys during 2003 and 2004 sampled 3-5 km of stream between the long-term study sites to evaluate the use of intermittent reaches of this stream. 3. Because of higher discharge and bed scouring at the downstream site, we predicted that the fish assemblage would have lowered species richness and abundance following floods. In contrast, we expected increased species richness and abundance at headwater sites because floods increase stream connectivity and create the potential for colonisation from downstream reaches. 4. The Akaike Information Criterion (AIC) was used to select among candidate regression models that predicted species richness and abundance based on Julian date, time since floods, season and physical habitat at each site. At the downstream site, AIC weightings suggested Julian date was the best predictor of fish assemblage structure, but no model explained >16% of the variation in species richness or community structure. Variation explained by Julian date was primarily attributed to a long-term pattern of declining abundance of common species. At the headwater sites, no single candidate model was selected to predict total species abundance and assemblage structure. 
AIC weightings suggested variation in assemblage structure was associated with either Julian date…
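
    The AIC weightings referred to above are Akaike weights: each candidate model's AIC is rescaled against the best model and normalized into a relative probability. A minimal sketch (the three AIC values are hypothetical):

```python
import math

def akaike_weights(aics):
    """Akaike weights: w_i = exp(-0.5*delta_i) / sum_j exp(-0.5*delta_j),
    where delta_i = AIC_i - min(AIC)."""
    amin = min(aics)
    rel = [math.exp(-0.5 * (a - amin)) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC values for three candidate regression models.
weights = akaike_weights([100.0, 102.0, 110.0])
print([round(w, 3) for w in weights])
```

    The weights sum to one, so they can be read as the relative support for each model within the candidate set.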

  12. Estimating the risk of cardiovascular disease using an obese-years metric

    PubMed Central

    Abdullah, Asnawi; Amin, Fauzi Ali; Stoelwinder, Johannes; Tanamas, Stephanie K; Wolfe, Rory; Barendregt, Jan; Peeters, Anna

    2014-01-01

    Objective To examine the association between obese-years and the risk of cardiovascular disease (CVD). Study design Prospective cohort study. Setting Boston, USA. Participants 5036 participants of the Framingham Heart Study were examined. Methods Obese-years was calculated by multiplying, for each participant, the number of body mass index (BMI) units above 29 kg/m² by the number of years lived at that BMI during approximately 50 years of follow-up. The association between obese-years and CVD was analysed using time-dependent Cox regression adjusted for potential confounders and compared with other models using the Akaike information criterion (AIC); the lowest AIC indicated the better fit. Primary outcome CVD. Results The median cumulative obese-years was 24 (range 2–556 obese-years). During 138 918 person-years of follow-up, 2753 (55%) participants were diagnosed with CVD. The incidence rates and adjusted HR (AHR) for CVD increased with an increase in the number of obese-years. AHRs for the categories 1–24.9, 25–49.9, 50–74.9 and ≥75 obese-years were, respectively, 1.31 (95% CI 1.15 to 1.48), 1.37 (95% CI 1.14 to 1.65), 1.62 (95% CI 1.32 to 1.99) and 1.80 (95% CI 1.54 to 2.10) compared with those who were never obese (ie, had zero obese-years). The effect of obese-years was stronger in males than in females. For every 10 unit increase in obese-years, the AHR of CVD increased by 6% (95% CI 4% to 8%) for males and 3% (95% CI 2% to 4%) for females. The AIC was lowest for the model containing obese-years compared with models containing either the level of BMI or the duration of obesity alone. Conclusions This study demonstrates that the obese-years metric conceptually captures the cumulative damage of obesity on body systems, and provides slightly more precise estimation of the risk of CVD than the level or duration of obesity alone. PMID:25231490
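
    The obese-years metric itself is simple to compute from a BMI history. A minimal sketch, assuming the 29 kg/m² threshold stated in the abstract; the participant's BMI history below is invented:

```python
def obese_years(bmi_history, threshold=29.0):
    """Cumulative obese-years: for each period of years lived at a given
    BMI, add (BMI units above the threshold) * (years at that BMI).

    bmi_history: list of (bmi, years_at_that_bmi) pairs.
    """
    return sum(max(bmi - threshold, 0.0) * years
               for bmi, years in bmi_history)

# Hypothetical participant: 5 years at BMI 32, then 10 years at BMI 35.
print(obese_years([(32.0, 5), (35.0, 10)]))  # (3*5) + (6*10) = 75.0
```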

  13. "Information, Information Everywhere and Not..."

    ERIC Educational Resources Information Center

    Wright, Paula

    Demographic and economic materials relevant to rural economic development are the focus of this description of the types of information that are collected by the U.S. Bureau of the Census and how this information can be accessed. Information provided on demographic materials includes collection methods--the census, surveys, and administrative…

  14. Forecast of natural aquifer discharge using a data-driven, statistical approach.

    PubMed

    Boggs, Kevin G; Van Kirk, Rob; Johnson, Gary S; Fairley, Jerry P

    2014-01-01

    In the Western United States, demand for water is often out of balance with limited water supplies. This has led to extensive water rights conflict and litigation. A tool that can reliably forecast natural aquifer discharge months ahead of peak water demand could help water practitioners and managers by providing advanced knowledge of potential water-right mitigation requirements. The timing and magnitude of natural aquifer discharge from the Eastern Snake Plain Aquifer (ESPA) in southern Idaho is accurately forecast 4 months ahead of the peak water demand, which occurs annually in July. An ARIMA time-series model with exogenous predictors (ARIMAX model) was used to develop the forecast. The ARIMAX model fit to a set of training data was assessed using Akaike's information criterion to select the optimal model that forecasts aquifer discharge, given the previous year's discharge and values of the predictor variables. Model performance was assessed by application of the model to a validation subset of data. The Nash-Sutcliffe efficiency for model predictions made on the validation set was 0.57. The predictor variables used in our forecast represent the major recharge and discharge components of the ESPA water budget, including variables that reflect overall water supply and important aspects of water administration and management. Coefficients of variation on the regression coefficients for streamflow and irrigation diversions were all much less than 0.5, indicating that these variables are strong predictors. The model with the highest AIC weight included streamflow, two irrigation diversion variables, and storage.

  15. Butyltins, trace metals and morphological variables in surf scoter (Melanitta perspicillata) wintering on the south coast of British Columbia, Canada.

    PubMed

    Elliott, J E; Harris, M L; Wilson, L K; Smith, B D; Batchelor, S P; Maguire, J

    2007-09-01

    From 1998 to 2001 we examined spatial and temporal variation in uptake of contaminants by surf scoters (Melanitta perspicillata) in the Georgia Basin region of the Pacific coast of Canada. Samples were collected during late fall and early spring at industrialized and reference locations, carcasses examined, and tissues collected for histology, biomarkers, and contaminant analyses. Scoters from both Vancouver and Victoria harbours had significantly higher hepatic concentrations of Σbutyltins than birds from a reference site. In adult male surf scoters, hepatic Σbutyltins increased over the winter at two sites (p=0.02, n=26), while mercury increased (p=0.03, n=15) and selenium decreased at one site (p=0.001, n=15). Body condition decreased over the winter at both the treatment site, Howe Sound (p<0.0001, n=12), and the reference site, Baynes Sound (p=0.02, n=15). Multiple regression analysis using Akaike's Information Criterion (AICc) showed an association between hepatic butyltin concentrations and overall body condition (p=0.06, r=-0.237).
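
    AICc, the small-sample correction used in the regression analysis above, adds an extra penalty term to the ordinary AIC that grows as the number of parameters approaches the sample size. A minimal sketch (the fit values are hypothetical):

```python
def aicc(log_likelihood: float, k: int, n: int) -> float:
    """Small-sample corrected AIC:
    AICc = AIC + 2k(k+1)/(n - k - 1), with AIC = 2k - 2*ln(L)."""
    aic = 2 * k - 2 * log_likelihood
    return aic + (2 * k * (k + 1)) / (n - k - 1)

# Hypothetical regression fit: 3 parameters, 26 observations.
print(round(aicc(log_likelihood=-40.0, k=3, n=26), 3))
```

    As n grows the correction vanishes and AICc converges to AIC, which is why AICc is the usual recommendation for small samples like the n=26 group here.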

  16. Dynamically tunable plasmonically induced transparency in sinusoidally curved and planar graphene layers.

    PubMed

    Xia, Sheng-Xuan; Zhai, Xiang; Wang, Ling-Ling; Sun, Bin; Liu, Jian-Qiang; Wen, Shuang-Chun

    2016-08-01

    To achieve plasmonically induced transparency (PIT), general near-field plasmonic systems based on couplings between localized plasmon resonances of nanostructures rely heavily on well-designed interantenna separations. However, the implementation of such devices and techniques encounters great difficulties, mainly due to the very small dimensions of the nanostructures and of the gaps between them. Here, we propose and numerically demonstrate that PIT can be achieved by using two graphene layers, composed of an upper sinusoidally curved layer and a lower planar layer, avoiding any patterning of the graphene sheets. Both analytical fitting and the Akaike Information Criterion (AIC) method are employed to distinguish the induced window, which is found to be more likely caused by Autler-Townes splitting (ATS) than by electromagnetically induced transparency (EIT). Besides, our results show that the resonant modes can be tuned dramatically not only by geometrically changing the grating amplitude and the interlayer spacing, but also by dynamically varying the Fermi energy of the graphene sheets. Potential applications of the proposed system could be expected in various photonic functional devices, including optical switches and plasmonic sensors. PMID:27505756

  17. ToPS: a framework to manipulate probabilistic models of sequence data.

    PubMed

    Kashiwabara, André Yoshiaki; Bonadio, Igor; Onuchic, Vitor; Amado, Felipe; Mathias, Rafael; Durham, Alan Mitchell

    2013-01-01

    Discrete Markovian models can be used to characterize patterns in sequences of values and have many applications in biological sequence analysis, including gene prediction, CpG island detection, alignment, and protein profiling. We present ToPS, a computational framework that can be used to implement different applications in bioinformatics analysis by combining eight kinds of models: (i) independent and identically distributed process; (ii) variable-length Markov chain; (iii) inhomogeneous Markov chain; (iv) hidden Markov model; (v) profile hidden Markov model; (vi) pair hidden Markov model; (vii) generalized hidden Markov model; and (viii) similarity based sequence weighting. The framework includes functionality for training, simulation and decoding of the models. Additionally, it provides two methods to help parameter setting: Akaike and Bayesian information criteria (AIC and BIC). The models can be used stand-alone, combined in Bayesian classifiers, or included in more complex, multi-model, probabilistic architectures using GHMMs. In particular the framework provides a novel, flexible, implementation of decoding in GHMMs that detects when the architecture can be traversed efficiently. PMID:24098098
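
    The two criteria ToPS exposes differ only in their complexity penalty: AIC charges 2 per free parameter, while BIC charges ln(n), so BIC leans harder toward smaller models as the sequence grows. A sketch with hypothetical fits of a smaller and a larger Markov model (the log-likelihoods and parameter counts are invented):

```python
import math

def aic(log_l: float, k: int) -> float:
    """AIC = 2k - 2*ln(L)."""
    return 2 * k - 2 * log_l

def bic(log_l: float, k: int, n: int) -> float:
    """BIC = k*ln(n) - 2*ln(L)."""
    return k * math.log(n) - 2 * log_l

# Hypothetical fits: a first-order and a second-order Markov chain
# trained on a sequence of n = 500 symbols.
n = 500
aic_small, aic_large = aic(-700.0, 15), aic(-690.0, 63)
bic_small, bic_large = bic(-700.0, 15, n), bic(-690.0, 63, n)

# AIC penalizes each parameter by 2; BIC by ln(500) ~ 6.2, so BIC
# favours the smaller model more strongly.
print(aic_small, aic_large)   # 1430.0 1506.0
print(round(bic_small, 1), round(bic_large, 1))
```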

  18. The optimal number of lymph nodes removed in maximizing the survival of breast cancer patients

    NASA Astrophysics Data System (ADS)

    Peng, Lim Fong; Taib, Nur Aishah; Mohamed, Ibrahim; Daud, Noorizam

    2014-07-01

    The number of lymph nodes removed is one of the important predictors of survival in breast cancer studies. Our aim is to determine the optimal number of lymph nodes to be removed for maximizing the survival of breast cancer patients. The study population consists of 873 patients with at least one axillary node involved, among 1890 patients from the University of Malaya Medical Center (UMMC) breast cancer registry. For this study, the Chi-square test of independence is performed to determine the significant association between prognostic factors and survival status, while the Wilcoxon test is used to compare the estimates of the hazard functions of two or more groups at each observed event time. Logistic regression analysis is then conducted to identify important predictors of survival. In particular, Akaike's Information Criterion (AIC) is calculated from the logistic regression model for all thresholds of nodes involved, as an alternative measure to the Wald statistic (χ2), in order to determine the optimal number of nodes that need to be removed to obtain the maximum differential in survival. The results from both measures are compared. It is recommended that, for this particular group, a minimum of 10 nodes should be removed to maximize survival of breast cancer patients.

  19. Ultrafine particle concentrations in the surroundings of an urban area: comparing downwind to upwind conditions using Generalized Additive Models (GAMs).

    PubMed

    Sartini, Claudio; Zauli Sajani, Stefano; Ricciardelli, Isabella; Delgado-Saborit, Juana Mari; Scotto, Fabiana; Trentini, Arianna; Ferrari, Silvia; Poluzzi, Vanes

    2013-10-01

    The aim of this study was to investigate the influence of an urban area on ultrafine particle (UFP) concentration in nearby surrounding areas. We assessed how downwind and upwind conditions affect the UFP concentration at a site located a few kilometres from the city border. Secondarily, we investigated the relationship among other meteorological factors, temporal variables and UFP. Data were collected for 44 days during 2008 and 2009 at a rural site located about 3 kilometres from Bologna, in northern Italy. Measurements were performed using a spectrometer (FMPS TSI 3091). The average UFP number concentration was 11 776 (±7836) particles per cm³. We analysed the effect of wind direction in a multivariate Generalized Additive Model (GAM) adjusted for the principal meteorological parameters and temporal trends. An increase of about 25% in UFP levels was observed when the site was downwind of the urban area, compared with the levels observed when wind blew from rural areas. The size distribution of particles was also affected by the wind direction, showing a higher concentration of small particles when the wind blew from the urban area. The GAM showed a good fit to the data (R² = 0.81). Model choice was via the Akaike Information Criterion (AIC). The analysis also revealed that an approach based on meteorological data plus temporal trends improved the goodness of fit of the model. In addition, the findings contribute to evidence on the effects of exposure to ultrafine particles on populations living in city surroundings. PMID:24077061

  20. Age and growth of chub mackerel ( Scomber japonicus) in the East China and Yellow Seas using sectioned otolith samples

    NASA Astrophysics Data System (ADS)

    Li, Gang; Chen, Xinjun; Feng, Bo

    2008-11-01

    Although chub mackerel ( Scomber japonicus) is a primary pelagic fish species, we have only limited knowledge on its key life history processes. The present work studied the age and growth of chub mackerel in the East China and Yellow Seas. Age was determined by interpreting and counting growth rings on the sagitta otoliths of 252 adult fish caught by the Chinese commercial purse seine fleet during the period from November 2006 to January 2007 and 150 juveniles from bottom trawl surveys on the spawning ground in May 2006. The difference between the assumed birth date of 1st April and date of capture was used to adjust the age determined from counting the number of complete translucent rings. The parameters of three commonly used growth models, the von Bertalanffy, Logistic and Gompertz models, were estimated using the maximum likelihood method. Based on the Akaike Information Criterion ( AIC), the von Bertalanffy growth model was found to be the most appropriate model. The size-at-age and size-at-maturity values were also found to decrease greatly compared with the results achieved in the 1950s, which was caused by heavy exploitation over the last few decades.
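
    For least-squares fits like these growth curves, AIC can be computed directly from the residual sum of squares under the usual Gaussian-error assumption. A minimal sketch, assuming the von Bertalanffy form L(t) = L∞(1 − e^(−k(t−t0))); the RSS values for the three candidate models are invented, not from the study:

```python
import math

def von_bertalanffy(t, l_inf, k, t0):
    """von Bertalanffy length-at-age: L(t) = L_inf * (1 - exp(-k*(t - t0)))."""
    return l_inf * (1.0 - math.exp(-k * (t - t0)))

def aic_from_rss(rss, n, k_params):
    """AIC for a least-squares fit under Gaussian errors (constant dropped):
    AIC = n * ln(RSS/n) + 2 * k."""
    return n * math.log(rss / n) + 2 * k_params

# Hypothetical residual sums of squares for the three candidate growth
# models, each with 3 parameters, fitted to n = 252 length-at-age points.
n = 252
for name, rss in [("von Bertalanffy", 1800.0),
                  ("Logistic", 2100.0),
                  ("Gompertz", 1950.0)]:
    print(name, round(aic_from_rss(rss, n, 3), 1))
```

    With equal parameter counts, the model with the lowest RSS wins, as the von Bertalanffy model did in the study.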

  1. Negative binomial models for abundance estimation of multiple closed populations

    USGS Publications Warehouse

    Boyce, Mark S.; MacKenzie, Darry I.; Manly, Bryan F.J.; Haroldson, Mark A.; Moody, David W.

    2001-01-01

    Counts of uniquely identified individuals in a population offer opportunities to estimate abundance. However, for various reasons such counts may be burdened by heterogeneity in the probability of being detected. Theoretical arguments and empirical evidence demonstrate that the negative binomial distribution (NBD) is a useful characterization for counts from biological populations with heterogeneity. We propose a method that focuses on estimating multiple populations by simultaneously using a suite of models derived from the NBD. We used this approach to estimate the number of female grizzly bears (Ursus arctos) with cubs-of-the-year in the Yellowstone ecosystem, for each year, 1986-1998. Akaike's Information Criteria (AIC) indicated that a negative binomial model with a constant level of heterogeneity across all years was best for characterizing the sighting frequencies of female grizzly bears. A lack-of-fit test indicated the model adequately described the collected data. Bootstrap techniques were used to estimate standard errors and 95% confidence intervals. We provide a Monte Carlo technique, which confirms that the Yellowstone ecosystem grizzly bear population increased during the period 1986-1998.
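
    The negative binomial log-likelihood underlying such models is easy to evaluate directly. A minimal sketch, parameterizing the NBD by size r and success probability p (the sighting counts and parameter values below are invented for illustration):

```python
import math

def nb_log_pmf(x, r, p):
    """Log-pmf of the negative binomial distribution with size r and
    success probability p, for counts x = 0, 1, 2, ..."""
    return (math.lgamma(x + r) - math.lgamma(r) - math.lgamma(x + 1)
            + r * math.log(p) + x * math.log(1.0 - p))

def nb_aic(counts, r, p):
    """AIC of an NBD fit with 2 free parameters (r, p)."""
    log_l = sum(nb_log_pmf(x, r, p) for x in counts)
    return 2 * 2 - 2 * log_l

# Hypothetical sighting frequencies of individually identified females.
counts = [0, 1, 1, 2, 3, 0, 4, 2, 1, 0]
print(round(nb_aic(counts, r=1.5, p=0.5), 2))
```

    In practice (r, p) would be chosen by maximizing the log-likelihood, and AIC would compare, say, a constant-heterogeneity model against one with year-specific parameters.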

  2. Extensions to minimum relative entropy inversion for noisy data

    NASA Astrophysics Data System (ADS)

    Ulrych, Tadeusz J.; Woodbury, Allan D.

    2003-12-01

    Minimum relative entropy (MRE) and Tikhonov regularization (TR) were compared by Neupauer et al. [Water Resour. Res. 36 (2000) 2469] on the basis of an example plume source reconstruction problem originally proposed by Skaggs and Kabala [Water Resour. Res. 30 (1994) 71] and a boxcar-like function. Although Neupauer et al. [Water Resour. Res. 36 (2000) 2469] were careful in their conclusions to note the basis of these comparisons, we show that TR does not perform well on problems in which delta-like sources are convolved with diffuse-groundwater contamination response functions, particularly in the presence of noise. We also show that it is relatively easy to estimate an appropriate value for ɛ, the hyperparameter needed in the minimum relative entropy solution for the inverse problem in the presence of noise. This can be estimated in a variety of ways, including estimation from the data themselves, analysis of data residuals, and a rigorous approach using the real cepstrum and the Akaike Information Criterion (AIC). Regardless of the approach chosen, for the sample problem reported herein, excellent resolution of multiple delta-like spikes is produced from MRE from noisy, diffuse data. The usefulness of MRE for noisy inverse problems has been demonstrated.

  3. Extensions to minimum relative entropy inversion for noisy data.

    PubMed

    Ulrych, Tadeusz J; Woodbury, Allan D

    2003-12-01

    Minimum relative entropy (MRE) and Tikhonov regularization (TR) were compared by Neupauer et al. [Water Resour. Res. 36 (2000) 2469] on the basis of an example plume source reconstruction problem originally proposed by Skaggs and Kabala [Water Resour. Res. 30 (1994) 71] and a boxcar-like function. Although Neupauer et al. [Water Resour. Res. 36 (2000) 2469] were careful in their conclusions to note the basis of these comparisons, we show that TR does not perform well on problems in which delta-like sources are convolved with diffuse-groundwater contamination response functions, particularly in the presence of noise. We also show that it is relatively easy to estimate an appropriate value for epsilon, the hyperparameter needed in the minimum relative entropy solution for the inverse problem in the presence of noise. This can be estimated in a variety of ways, including estimation from the data themselves, analysis of data residuals, and a rigorous approach using the real cepstrum and the Akaike Information Criterion (AIC). Regardless of the approach chosen, for the sample problem reported herein, excellent resolution of multiple delta-like spikes is produced from MRE from noisy, diffuse data. The usefulness of MRE for noisy inverse problems has been demonstrated.

  4. Analysis of mathematical models of Pseudomonas spp. growth in pallet-package pork stored at different temperatures.

    PubMed

    Li, Miaoyun; Niu, Huimin; Zhao, Gaiming; Tian, Lu; Huang, Xianqing; Zhang, Jianwei; Tian, Wei; Zhang, Qiuhui

    2013-04-01

    Pseudomonas growth on pallet-packaged raw pork stored at 0, 5, 10, 15, 20 and 25°C was studied in this paper. The modified Gompertz, Baranyi and Huang models were used for data fitting. Statistical criteria such as the residual sum of squares (RSS), mean square error (MSE), Akaike's information criterion (AIC), and pseudo-R² were used to evaluate model performance. Results showed that there was an apparent decline in Pseudomonas growth in the initial-storage phase at low temperatures. The modified Gompertz model outperformed the others at 5, 15, and 20°C, while the Baranyi model was appropriate for 0 and 25°C. The Huang model was optimal at 10°C. No single model gave a consistently preferable goodness-of-fit for all growth data. The Gompertz model, with the smallest average values of RSS, AIC and MSE and the largest pseudo-R² at all temperatures, is the most appropriate model to describe the growth of Pseudomonas on raw pork under pallet packaging.
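
    The comparison criteria used here can all be computed from observed and fitted values. A minimal sketch using Zwietering's modified Gompertz form (one common "modified Gompertz"; the observations and the hand-picked, unfitted parameter values below are invented):

```python
import math

def gompertz_modified(t, a, mu_m, lam):
    """Zwietering's modified Gompertz curve:
    y(t) = A * exp(-exp(mu_m * e / A * (lam - t) + 1))."""
    return a * math.exp(-math.exp(mu_m * math.e / a * (lam - t) + 1.0))

def fit_stats(obs, pred, k_params):
    """RSS, MSE, Gaussian AIC (constant dropped) and pseudo-R^2."""
    n = len(obs)
    rss = sum((o - p) ** 2 for o, p in zip(obs, pred))
    mean = sum(obs) / n
    tss = sum((o - mean) ** 2 for o in obs)
    return {"RSS": rss,
            "MSE": rss / n,
            "AIC": n * math.log(rss / n) + 2 * k_params,
            "pseudo_R2": 1.0 - rss / tss}

# Invented log-count observations at six storage times.
times = [0, 10, 20, 30, 40, 50]
obs = [0.1, 0.4, 1.9, 3.6, 4.4, 4.8]
pred = [gompertz_modified(t, a=5.0, mu_m=0.15, lam=12.0) for t in times]
stats = fit_stats(obs, pred, k_params=3)
print(round(stats["pseudo_R2"], 3))
```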

  5. Temporal relationship between rainfall, temperature and occurrence of dengue cases in São Luís, Maranhão, Brazil.

    PubMed

    Silva, Fabrício Drummond; dos Santos, Alcione Miranda; Corrêa, Rita da Graça Carvalhal Frazão; Caldas, Arlene de Jesus Mendes

    2016-02-01

    This study analyzed the relationship between rainfall, temperature and the occurrence of dengue cases. Ecological study performed with autochthonous dengue cases reported from 2003 to 2010 in São Luís, Maranhão. Data on rainfall and temperature were collected monthly. The monthly incidence of dengue cases was calculated per year/100,000 inhabitants. To identify the influence of climate variables on dengue cases, different distributed lag models using a negative binomial distribution were considered. Model selection was based on the lowest AIC (Akaike Information Criterion). Thirteen thousand, four hundred forty-four cases of dengue were reported between 2003 and 2010, with peaks in 2005, 2007 and 2010. The correlation between rainfall and the occurrence of dengue cases showed an increase in the first months after the rainy months. Occurrence of dengue cases was observed during the whole study period. Only rainfall lagged by three months showed a positive association with the number of dengue cases. Thus, this municipality is considered an endemic and epidemic site. In addition, the relation between rainfall and dengue cases was significant with a lag of three months. These results should be useful for the future development of health policies for dengue prevention and control.

  6. Impact of Large-scale Circulation Patterns on Surface Ozone Variability in Houston-Galveston-Brazoria

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Jia, B.; Xie, Y.

    2015-12-01

    The Bermuda High (BH) is a key driver of large-scale circulation patterns for Southeastern Texas and other Gulf coast states in summer, with an expected influence on surface ozone through its modulation of marine air inflow, with lower ozone background, from the Gulf of Mexico. We develop a statistical relationship through multiple linear regression (MLR) to quantify the impact of BH variations on surface ozone variability during the ozone season in the Houston-Galveston-Brazoria (HGB) area, a major ozone nonattainment region on the Gulf Coast. We find that the variability in BH location, represented by a longitude index of the BH west edge (BH-Lon) in the MLR, explains 50-60% of the year-to-year variability in monthly mean ozone over HGB for June and July during 1998-2013; the corresponding figure for August and September is 20%. An additional 30-40% of the ozone variability for August and September can be explained by the variability in BH strength, represented by two BH intensity indices (BHI) in the MLR, but its contribution is only 5% for June and not significant for July. Through stepwise regression based on the Akaike Information Criterion (AIC), the MLR model captures 58-72% of monthly ozone variability during June-September with a cross-validation R² of 0.5. This observation-derived statistical relationship will be valuable for constraining model simulations of ozone variability attributable to large-scale circulation patterns.
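
    AIC-based stepwise regression of the kind mentioned above can be sketched as a greedy forward search: starting from an intercept-only model, repeatedly add the predictor that lowers AIC the most. A self-contained sketch on synthetic data (the variable names x1-x3 and all data are invented; this is not the study's code):

```python
import math
import random

def ols_rss(X, y):
    """Residual sum of squares of an OLS fit with intercept, solving the
    normal equations (X'X) beta = X'y by Gaussian elimination."""
    n = len(y)
    Xd = [[1.0] + list(row) for row in X]
    p = len(Xd[0])
    A = [[sum(Xd[i][a] * Xd[i][b] for i in range(n)) for b in range(p)]
         for a in range(p)]
    rhs = [sum(Xd[i][a] * y[i] for i in range(n)) for a in range(p)]
    for c in range(p):                       # forward elimination, pivoting
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        rhs[c], rhs[piv] = rhs[piv], rhs[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for cc in range(c, p):
                A[r][cc] -= f * A[c][cc]
            rhs[r] -= f * rhs[c]
    beta = [0.0] * p
    for c in reversed(range(p)):             # back substitution
        beta[c] = (rhs[c] - sum(A[c][j] * beta[j]
                                for j in range(c + 1, p))) / A[c][c]
    return sum((y[i] - sum(Xd[i][j] * beta[j] for j in range(p))) ** 2
               for i in range(n))

def aic_ls(rss, n, k):
    """Gaussian least-squares AIC (constant dropped)."""
    return n * math.log(rss / n) + 2 * k

def forward_stepwise(data, y, candidates):
    """Greedy forward selection: add the predictor that lowers AIC the
    most; stop when no addition improves AIC."""
    n = len(y)
    mean_y = sum(y) / n
    best_aic = aic_ls(sum((v - mean_y) ** 2 for v in y), n, 1)  # intercept only
    chosen = []
    improved = True
    while improved and len(chosen) < len(candidates):
        improved = False
        for c in candidates:
            if c in chosen:
                continue
            cols = chosen + [c]
            X = [[data[v][i] for v in cols] for i in range(n)]
            cand_aic = aic_ls(ols_rss(X, y), n, len(cols) + 1)
            if cand_aic < best_aic - 1e-9:
                best_aic, best_c, improved = cand_aic, c, True
        if improved:
            chosen.append(best_c)
    return chosen

# Synthetic data: only x1 actually drives y.
random.seed(1)
n = 60
data = {v: [random.gauss(0, 1) for _ in range(n)] for v in ("x1", "x2", "x3")}
y = [2.0 * data["x1"][i] + 0.3 * random.gauss(0, 1) for i in range(n)]
print(forward_stepwise(data, y, ["x1", "x2", "x3"]))
```

    The 2-per-parameter AIC penalty is what stops the search from absorbing noise predictors that reduce RSS only slightly.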

  7. Degree-day accumulation influences annual variability in growth of age-0 walleye

    USGS Publications Warehouse

    Uphoff, Christopher S.; Schoenebeck, Casey W.; Hoback, W. Wyatt; Koupal, Keith D.; Pope, Kevin L.

    2013-01-01

    The growth of age-0 fishes influences survival, especially in temperate regions where size-dependent over-winter mortality can be substantial. Additional benefits of earlier maturation and greater fecundity may exist for faster growing individuals. This study correlated prey densities, growing-degree days, water-surface elevation, turbidity, and chlorophyll a with age-0 walleye Sander vitreus growth in a south-central Nebraska irrigation reservoir. Growth of age-0 walleye was variable between 2003 and 2011, with mean lengths ranging from 128 to 231 mm by fall (September 30th–October 15th). A set of a priori candidate models were used to assess the relative support of explanatory variables using Akaike's information criterion (AIC). A temperature model using the growing degree-days metric was the best supported model, describing 65% of the variability in annual mean lengths of age-0 walleye. The second and third best supported models included the variables chlorophyll a (r2 = 0.49) and larval freshwater drum density (r2 = 0.45), respectively. There have been mixed results concerning the importance of temperature effects on growth of age-0 walleye. This study supports the hypothesis that temperature is the most important predictor of age-0 walleye growth near the southwestern limits of its natural range.

  8. Analysis of mathematical models of Pseudomonas spp. growth in pallet-package pork stored at different temperatures.

    PubMed

    Li, Miaoyun; Niu, Huimin; Zhao, Gaiming; Tian, Lu; Huang, Xianqing; Zhang, Jianwei; Tian, Wei; Zhang, Qiuhui

    2013-04-01

    Pseudomonas growth on pallet-packaged raw pork stored at 0, 5, 10, 15, 20 and 25°C was studied in this paper. The modified Gompertz, Baranyi and Huang models were used for data fitting. Statistical criteria such as the residual sum of squares (RSS), mean square error (MSE), Akaike's information criterion (AIC), and pseudo-R² were used to evaluate model performance. Results showed that there was an apparent decline in Pseudomonas growth in the initial-storage phase at low temperatures. The modified Gompertz model outperformed the others at 5, 15, and 20°C, while the Baranyi model was appropriate for 0 and 25°C. The Huang model was optimal at 10°C. No single model gave a consistently preferable goodness-of-fit for all growth data. The Gompertz model, with the smallest average values of RSS, AIC and MSE and the largest pseudo-R² at all temperatures, is the most appropriate model to describe the growth of Pseudomonas on raw pork under pallet packaging. PMID:23313972

  9. Determining Individual Variation in Growth and Its Implication for Life-History and Population Processes Using the Empirical Bayes Method

    PubMed Central

    Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J.; Munch, Stephan; Skaug, Hans J.

    2014-01-01

    The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish. PMID:25211603

  10. Classification of microarray data with penalized logistic regression

    NASA Astrophysics Data System (ADS)

    Eilers, Paul H. C.; Boer, Judith M.; van Ommen, Gert-Jan; van Houwelingen, Hans C.

    2001-06-01

    Classification of microarray data needs a firm statistical basis. In principle, logistic regression can provide it, modeling the probability of membership of a class with (transforms of) linear combinations of explanatory variables. However, classical logistic regression does not work for microarrays, because generally there will be far more variables than observations. One problem is multicollinearity: estimating equations become singular and have no unique and stable solution. A second problem is over-fitting: a model may fit a data set well but perform badly when used to classify new data. We propose penalized likelihood as a solution to both problems. The values of the regression coefficients are constrained in a similar way as in ridge regression. All variables play an equal role; there is no ad hoc selection of the most relevant or most expressed genes. The dimension of the resulting system of equations is equal to the number of variables, and generally will be too large for most computers, but it can be reduced dramatically with the singular value decomposition of some matrices. The penalty is optimized with AIC (Akaike's Information Criterion), which essentially is a measure of prediction performance. We find that penalized logistic regression performs well on a public data set (the MIT ALL/AML data).
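A minimal sketch of the approach this record describes: ridge-penalized logistic regression fitted by IRLS, with the penalty weight chosen by an AIC that counts parameters through the effective dimension (trace of the hat operator). The function name, synthetic data, and penalty grid are illustrative assumptions, not the authors' code, which additionally exploits the SVD reduction mentioned in the abstract.

```python
import numpy as np

def ridge_logistic_aic(X, y, lam, n_iter=30):
    """Ridge-penalized logistic regression fitted by IRLS (Newton steps).

    Returns (beta, aic) with AIC = deviance + 2 * effective_dim, where
    effective_dim is the trace of the IRLS "hat" operator -- one common way
    to count parameters under a ridge penalty (a sketch, not the paper's
    exact formulation).
    """
    n, p = X.shape
    beta = np.zeros(p)
    I = np.eye(p)
    for _ in range(n_iter):
        eta = np.clip(X @ beta, -30, 30)      # clip to avoid exp overflow
        mu = 1.0 / (1.0 + np.exp(-eta))
        w = mu * (1.0 - mu)
        XtWX = X.T @ (X * w[:, None])
        grad = X.T @ (y - mu) - lam * beta
        beta = beta + np.linalg.solve(XtWX + lam * I, grad)
    eta = np.clip(X @ beta, -30, 30)
    mu = 1.0 / (1.0 + np.exp(-eta))
    eps = 1e-12
    deviance = -2.0 * np.sum(y * np.log(mu + eps) + (1 - y) * np.log(1 - mu + eps))
    w = mu * (1.0 - mu)
    XtWX = X.T @ (X * w[:, None])
    eff_dim = np.trace(np.linalg.solve(XtWX + lam * I, XtWX))
    return beta, deviance + 2.0 * eff_dim

# Choose the penalty weight by minimizing AIC over a small grid
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 20))   # more variables than a classical fit would like
y = (X[:, 0] - X[:, 1] + rng.normal(0, 0.5, 60) > 0).astype(float)
aics = {lam: ridge_logistic_aic(X, y, lam)[1] for lam in (0.1, 1.0, 10.0)}
best_lam = min(aics, key=aics.get)
```

Because the ridge penalty shrinks all coefficients jointly, the effective dimension is a fraction of p, which is what makes AIC usable even when p approaches or exceeds n.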

  11. Land-use and land-cover change in Western Ghats of India.

    PubMed

    Kale, Manish P; Chavan, Manoj; Pardeshi, Satish; Joshi, Chitiz; Verma, Prabhakar A; Roy, P S; Srivastav, S K; Srivastava, V K; Jha, A K; Chaudhari, Swapnil; Giri, Yogesh; Krishna Murthy, Y V N

    2016-07-01

    The Western Ghats (WG) of India, one of the hottest biodiversity hotspots in the world, has witnessed major land-use and land-cover (LULC) change in recent times. The present research was aimed at studying the patterns of LULC change in WG during 1985-1995-2005, understanding the major drivers that caused such change, and projecting the future (2025) spatial distribution of forest using a coupled logistic regression and Markov model. The International Geosphere Biosphere Program (IGBP) classification scheme was mainly followed in LULC characterization and change analysis. The single-step Markov model was used to project the forest demand. The spatial allocation of such forest demand was based on the predicted probabilities derived through the logistic regression model. The R statistical package was used to set the allocation rules. The projection model was selected based on the Akaike information criterion (AIC) and the area under the receiver operating characteristic (ROC) curve. The actual and projected areas of forest in 2005 were compared before making the projection for 2025. It was observed that forest degradation declined from 1985-1995 to 1995-2005. The study obtained important insights about the drivers and their impacts on LULC simulations. To the best of our knowledge, this is the first attempt where projection of the future state of forest in the entire WG is made based on decadal LULC and socio-economic datasets at the Taluka (sub-district) level. PMID:27256392

  12. Demographic, Reproductive, and Dietary Determinants of Perfluorooctane Sulfonic (PFOS) and Perfluorooctanoic Acid (PFOA) Concentrations in Human Colostrum.

    PubMed

    Jusko, Todd A; Oktapodas, Marina; Palkovičová Murinová, L'ubica; Babinská, Katarina; Babjaková, Jana; Verner, Marc-André; DeWitt, Jamie C; Thevenet-Morrison, Kelly; Čonka, Kamil; Drobná, Beata; Chovancová, Jana; Thurston, Sally W; Lawrence, B Paige; Dozier, Ann M; Järvinen, Kirsi M; Patayová, Henrieta; Trnovec, Tomáš; Legler, Juliette; Hertz-Picciotto, Irva; Lamoree, Marja H

    2016-07-01

    To determine demographic, reproductive, and maternal dietary factors that predict perfluoroalkyl substance (PFAS) concentrations in breast milk, we measured perfluorooctane sulfonic (PFOS) and perfluorooctanoic acid (PFOA) concentrations, using liquid chromatography-mass spectrometry, in 184 colostrum samples collected from women participating in a cohort study in Eastern Slovakia between 2002 and 2004. During their hospital delivery stay, mothers completed a food frequency questionnaire, and demographic and reproductive data were also collected. PFOS and PFOA predictors were identified by optimizing multiple linear regression models using Akaike's information criterion (AIC). The geometric mean concentration in colostrum was 35.3 pg/mL for PFOS and 32.8 pg/mL for PFOA. In multivariable models, parous women had 40% lower PFOS (95% CI: -56 to -17%) and 40% lower PFOA (95% CI: -54 to -23%) concentrations compared with nulliparous women. Moreover, fresh/frozen fish consumption, longer birth intervals, and Slovak ethnicity were associated with higher PFOS and PFOA concentrations in colostrum. These results will help guide the design of future epidemiologic studies examining milk PFAS concentrations in relation to health end points in children.

  13. Construction of a cancer-perturbed protein-protein interaction network for discovery of apoptosis drug targets

    PubMed Central

    Chu, Liang-Hui; Chen, Bor-Sen

    2008-01-01

    Background Cancer is caused by genetic abnormalities, such as mutations of oncogenes or tumor suppressor genes, which alter downstream signal transduction pathways and protein-protein interactions. Comparisons of the interactions of proteins in cancerous and normal cells can shed light on the mechanisms of carcinogenesis. Results We constructed initial networks of protein-protein interactions involved in the apoptosis of cancerous and normal cells by use of two human yeast two-hybrid data sets and four online databases. Next, we applied a nonlinear stochastic model, maximum likelihood parameter estimation, and the Akaike Information Criterion (AIC) to eliminate false-positive protein-protein interactions in our initial protein interaction networks by use of microarray data. Comparisons of the networks of apoptosis in HeLa (human cervical carcinoma) cells and in normal primary lung fibroblasts provided insight into the mechanism of apoptosis and allowed identification of potential drug targets. The potential targets include BCL2, caspase-3 and TP53. Our comparison of cancerous and normal cells also allowed derivation of several party hubs and date hubs in the human protein-protein interaction networks involved in caspase activation. Conclusion Our method allows identification of cancer-perturbed protein-protein interactions involved in apoptosis and identification of potential molecular targets for development of anti-cancer drugs. PMID:18590547

  14. Molecular detection of hematozoa infections in tundra swans relative to migration patterns and ecological conditions at breeding grounds.

    PubMed

    Ramey, Andrew M; Ely, Craig R; Schmutz, Joel A; Pearce, John M; Heard, Darryl J

    2012-01-01

    Tundra swans (Cygnus columbianus) are broadly distributed in North America, use a wide variety of habitats, and exhibit diverse migration strategies. We investigated patterns of hematozoa infection in three populations of tundra swans that breed in Alaska using satellite tracking to infer host movement and molecular techniques to assess the prevalence and genetic diversity of parasites. We evaluated whether migratory patterns and environmental conditions at breeding areas explain the prevalence of blood parasites in migratory birds by contrasting the fit of competing models formulated in an occupancy modeling framework and calculating the detection probability of the top model using Akaike Information Criterion (AIC). We described genetic diversity of blood parasites in each population of swans by calculating the number of unique parasite haplotypes observed. Blood parasite infection differed significantly between populations of Alaska tundra swans, with the highest estimated prevalence occurring among birds occupying breeding areas with lower mean daily wind speeds and higher daily summer temperatures. Models including covariates of wind speed and temperature during summer months at breeding grounds better predicted hematozoa prevalence than those that included annual migration distance or duration. Genetic diversity of blood parasites in populations of tundra swans appeared to be related to hematozoa prevalence. Our results suggest ecological conditions at breeding grounds may explain differences in hematozoa infection among populations of tundra swans that breed in Alaska.

  15. Molecular detection of hematozoa infections in tundra swans relative to migration patterns and ecological conditions at breeding grounds

    USGS Publications Warehouse

    Ramey, Andrew M.; Ely, Craig R.; Schmutz, Joel A.; Pearce, John M.; Heard, Darryl J.

    2012-01-01

    Tundra swans (Cygnus columbianus) are broadly distributed in North America, use a wide variety of habitats, and exhibit diverse migration strategies. We investigated patterns of hematozoa infection in three populations of tundra swans that breed in Alaska using satellite tracking to infer host movement and molecular techniques to assess the prevalence and genetic diversity of parasites. We evaluated whether migratory patterns and environmental conditions at breeding areas explain the prevalence of blood parasites in migratory birds by contrasting the fit of competing models formulated in an occupancy modeling framework and calculating the detection probability of the top model using Akaike Information Criterion (AIC). We described genetic diversity of blood parasites in each population of swans by calculating the number of unique parasite haplotypes observed. Blood parasite infection differed significantly between populations of Alaska tundra swans, with the highest estimated prevalence occurring among birds occupying breeding areas with lower mean daily wind speeds and higher daily summer temperatures. Models including covariates of wind speed and temperature during summer months at breeding grounds better predicted hematozoa prevalence than those that included annual migration distance or duration. Genetic diversity of blood parasites in populations of tundra swans appeared to be related to hematozoa prevalence. Our results suggest ecological conditions at breeding grounds may explain differences in hematozoa infection among populations of tundra swans that breed in Alaska.

  16. Copulation patterns in captive hamadryas baboons: a quantitative analysis.

    PubMed

    Nitsch, Florian; Stueckle, Sabine; Stahl, Daniel; Zinner, Dietmar

    2011-10-01

    For primates, as for many other vertebrates, copulation that results in ejaculation is a prerequisite for reproduction. The probability of ejaculation is affected by various physiological and social factors, for example, the reproductive state of male and female and the operational sex ratio. In this paper, we present quantitative and qualitative data on patterns of sexual behaviour in a captive group of hamadryas baboons (Papio hamadryas), a species with a polygynous-monandric mating system. We observed more than 700 copulations and analysed factors that can affect the probability of ejaculation. Multilevel logistic regression analysis and Akaike's information criterion (AIC) model selection procedures revealed that the probability of successful copulation increased as the size of female sexual swellings increased, indicating increased probability of ovulation, and as the number of females per one-male unit (OMU) decreased. In contrast, occurrence of female copulation calls, sex of the copulation initiator, and previous male aggression toward females did not affect the probability of ejaculation. Synchrony of oestrus cycles also had no effect (most likely because the sample size was too small). We also observed 29 extra-group copulations by two non-adult males. Our results indicate that male hamadryas baboons copulated more successfully around the time of ovulation and that males in large OMUs with many females may be confronted with time- or energy-allocation problems.

  17. Prevalence and predictors for musculoskeletal discomfort in Malaysian office workers: Investigating explanatory factors for a developing country.

    PubMed

    Maakip, Ismail; Keegel, Tessa; Oakman, Jodi

    2016-03-01

    Musculoskeletal disorders (MSDs) are a major occupational health issue for workers in developed and developing countries, including Malaysia. Most research related to MSDs has been undertaken in developed countries; given the different regulatory and cultural practices it is plausible that contributions of hazard and risk factors may be different. A population of Malaysian public service office workers was surveyed (N = 417, 65.5% response rate) to determine prevalence and associated predictors of MSD discomfort. The 6-month period prevalence of MSD discomfort was 92.8% (95%CI = 90.2-95.2%). Akaike's Information Criterion (AIC) analysis was used to compare a range of models and determine a model of best fit. Contributions associated with MSD discomfort in the final model consisted of physical demands (61%), workload (14%), gender (13%), work-home balance (9%) and psychosocial factors (3%). Factors associated with MSD discomfort were similar in developed and developing countries but the relative contribution of factors was different, providing insight into future development of risk management strategies. PMID:26499952

  18. Lee-Carter state space modeling: Application to the Malaysia mortality data

    NASA Astrophysics Data System (ADS)

    Zakiyatussariroh, W. H. Wan; Said, Z. Mohammad; Norazan, M. R.

    2014-06-01

    This article presents an approach that formalizes the Lee-Carter (LC) model as a state space model. Maximum likelihood estimation via the Expectation-Maximization (EM) algorithm was used to fit the model. The methodology is applied to Malaysia's total population mortality data, modeled on age-specific death rates (ASDR) from 1971-2009. The fitted ASDR are compared to the actual observed values. Comparison of the fitted and actual values shows that the fitted values from the LC-SS model and the original LC model are quite close. In addition, there is little difference between the root mean squared error (RMSE) and Akaike information criterion (AIC) values of the two models. The LC-SS model estimated in this study can be extended to forecasting ASDR in Malaysia, and the accuracy of the LC-SS model relative to the original LC model can be further examined by verifying forecasting power using an out-of-sample comparison.

  19. Estimation of exposure to toxic releases using spatial interaction modeling

    PubMed Central

    2011-01-01

    Background The United States Environmental Protection Agency's Toxic Release Inventory (TRI) data are frequently used to estimate a community's exposure to pollution. However, this estimation process often uses underdeveloped geographic theory. Spatial interaction modeling provides a more realistic approach to this estimation process. This paper uses four sets of data: lung cancer age-adjusted mortality rates from the years 1990 through 2006 inclusive from the National Cancer Institute's Surveillance Epidemiology and End Results (SEER) database, TRI releases of carcinogens from 1987 to 1996, covariates associated with lung cancer, and the EPA's Risk-Screening Environmental Indicators (RSEI) model. Results The impact of the volume of carcinogenic TRI releases on each county's lung cancer mortality rates was calculated using six spatial interaction functions (containment, buffer, power decay, exponential decay, quadratic decay, and RSEI estimates) and evaluated with four multivariate regression methods (linear, generalized linear, spatial lag, and spatial error). Akaike Information Criterion values and P values of spatial interaction terms were computed. The impacts calculated from the interaction models were also mapped. Buffer and quadratic interaction functions had the lowest AIC values (22298 and 22525 respectively), although the gains from including the spatial interaction terms were diminished with spatial error and spatial lag regression. Conclusions The use of different methods for estimating the spatial risk posed by pollution from TRI sites can give different results about the impact of those sites on health outcomes. The most reliable estimates did not always come from the most complex methods. PMID:21418644

  20. Mapping the mean monthly precipitation of a small island using kriging with external drifts

    NASA Astrophysics Data System (ADS)

    Cantet, Philippe

    2015-09-01

    This study focuses on the spatial distribution of mean annual and monthly precipitation on a small island (1128 km²) named Martinique, located in the Lesser Antilles. Only 35 meteorological stations are available on the territory, which has a complex topography. With a digital elevation model (DEM), 17 covariates likely to explain precipitation were built. Several interpolation methods, such as regression-kriging (MLRK, PCRK, and PLSK) and external drift kriging (EDK), were tested using a cross-validation procedure. For the regression methods, predictors were chosen by established techniques, whereas a new approach is proposed to select external drifts in kriging, based on stepwise model selection by the Akaike Information Criterion (AIC). The prediction accuracy was assessed at validation sites with three different skill scores. Results show that methods with no predictors, such as inverse distance weighting (IDW) or universal kriging (UK), are inappropriate in such a territory. EDK appears to outperform the regression methods on every criterion, and selecting predictors by our approach improves the prediction of mean annual precipitation compared to kriging with only elevation as a drift. Finally, predictive performance was also studied by varying the size of the training set, with less conclusive results for EDK. Nevertheless, the proposed method seems to be a good way to improve the mapping of climatic variables on a small island.
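Stepwise model selection by AIC, used here to pick external drifts, can be sketched in an ordinary-regression setting. The greedy forward search below is an illustrative stand-in (function names and synthetic covariates are assumptions; the actual drift selection operates inside a kriging model, not plain OLS).

```python
import numpy as np

def ols_aic(X, y):
    # Least-squares AIC for a design matrix X: n*ln(RSS/n) + 2k
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + 2 * k

def forward_stepwise(cands, y):
    """Greedy forward selection over candidate covariate columns by AIC."""
    n = y.size
    chosen, cols = [], [np.ones(n)]          # start from an intercept-only model
    best_aic = ols_aic(np.column_stack(cols), y)
    improved = True
    while improved:
        improved = False
        for j in range(cands.shape[1]):
            if j in chosen:
                continue
            trial = ols_aic(np.column_stack(cols + [cands[:, j]]), y)
            if trial < best_aic - 1e-9:      # keep the best improving covariate
                best_aic, best_j, improved = trial, j, True
        if improved:
            chosen.append(best_j)
            cols.append(cols_col := cands[:, best_j])
    return chosen, best_aic

# Synthetic example: the response depends on candidate covariates 0 and 2 only
rng = np.random.default_rng(1)
cands = rng.normal(size=(80, 6))
y = 2.0 * cands[:, 0] - 3.0 * cands[:, 2] + rng.normal(0, 0.3, 80)
chosen, aic = forward_stepwise(cands, y)
```

The search stops as soon as no remaining covariate lowers the AIC, so the 2k penalty guards against piling up weakly informative drifts.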

  1. Population demographics of two local South Carolina mourning dove populations

    USGS Publications Warehouse

    McGowan, D.P.; Otis, D.L.

    1998-01-01

    The mourning dove (Zenaida macroura) call-count index had a significant (P 2,300 doves and examined >6,000 individuals during harvest bag checks. An age-specific band recovery model with time- and area-specific recovery rates, and constant survival rates, was chosen for estimation via Akaike's Information Criterion (AIC), likelihood ratio, and goodness-of-fit criteria. After-hatching-year (AHY) annual survival rate was 0.359 (SE = 0.056), and hatching-year (HY) annual survival rate was 0.118 (SE = 0.042). Average estimated recruitment per adult female into the prehunting season population was 3.40 (SE = 1.25) and 2.32 (SE = 0.46) for the 2 study areas. Our movement data support earlier hypotheses of nonmigratory breeding and harvested populations in South Carolina. Low survival rates and estimated population growth rate in the study areas may be representative only of small-scale areas that are heavily managed for dove hunting. Source-sink theory was used to develop a model of region-wide populations that is composed of source areas with positive growth rates and sink areas of declining growth. We suggest management of mourning doves in the Southeast might benefit from improved understanding of local population dynamics, as opposed to regional-scale population demographics.

  2. Estimating annual survival and movement rates of adults within a metapopulation of roseate terns

    USGS Publications Warehouse

    Spendelow, J.A.; Nichols, J.D.; Nisbet, I.C.T.; Hays, H.; Cormons, G.D.; Burger, J.; Safina, C.; Hines, J.E.; Gochfeld, M.

    1995-01-01

    Several multistratum capture-recapture models were used to test various hypotheses about possible geographic and temporal variation in survival, movement, and recapture/resighting probabilities of 2399 adult Roseate Terns (Sterna dougallii) color-banded from 1988 to 1992 at the sites of the four largest breeding colonies of this species in the northeastern USA. Linear-logistic ultrastructural models also were developed to investigate possible correlates of geographic variation in movement probabilities. Based on goodness-of-fit tests and comparisons of Akaike's Information Criterion (AIC) values, the fully parameterized model (Model A) with time- and location-specific survival, movement, and capture probabilities, was selected as the most appropriate model for this metapopulation structure. With almost all movement accounted for, on average >90% of the surviving adults from each colony site returned to the same site the following year. Variations in movement probabilities were more closely associated with the identity of the destination colony site than with either the identity of the colony site of origin or the distance between colony sites. The average annual survival estimates (0.74–0.84) of terns from all four sites indicate a high rate of annual mortality relative to that of other species of marine birds.

  3. Determining individual variation in growth and its implication for life-history and population processes using the empirical Bayes method.

    PubMed

    Vincenzi, Simone; Mangel, Marc; Crivelli, Alain J; Munch, Stephan; Skaug, Hans J

    2014-09-01

    The differences in demographic and life-history processes between organisms living in the same population have important consequences for ecological and evolutionary dynamics. Modern statistical and computational methods allow the investigation of individual and shared (among homogeneous groups) determinants of the observed variation in growth. We use an Empirical Bayes approach to estimate individual and shared variation in somatic growth using a von Bertalanffy growth model with random effects. To illustrate the power and generality of the method, we consider two populations of marble trout Salmo marmoratus living in Slovenian streams, where individually tagged fish have been sampled for more than 15 years. We use year-of-birth cohort, population density during the first year of life, and individual random effects as potential predictors of the von Bertalanffy growth function's parameters k (rate of growth) and L∞ (asymptotic size). Our results showed that size ranks were largely maintained throughout marble trout lifetime in both populations. According to the Akaike Information Criterion (AIC), the best models showed different growth patterns for year-of-birth cohorts as well as the existence of substantial individual variation in growth trajectories after accounting for the cohort effect. For both populations, models including density during the first year of life showed that growth tended to decrease with increasing population density early in life. Model validation showed that predictions of individual growth trajectories using the random-effects model were more accurate than predictions based on mean size-at-age of fish.
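The fixed-effects core of this analysis can be sketched by fitting the von Bertalanffy growth function L(t) = L∞(1 − e^(−k(t − t0))) by least squares and comparing it with a simpler model via AIC. The data below are synthetic, and the paper's actual model adds cohort and individual random effects, which this sketch omits.

```python
import numpy as np
from scipy.optimize import curve_fit

def vbgf(t, Linf, k, t0):
    # von Bertalanffy growth function: expected length at age t
    return Linf * (1.0 - np.exp(-k * (t - t0)))

def aic_ls(rss, n, k):
    # Least-squares AIC (additive constants dropped)
    return n * np.log(rss / n) + 2 * k

# Synthetic length-at-age data (ages 1-7, ten fish per age)
rng = np.random.default_rng(2)
age = np.tile(np.arange(1, 8), 10).astype(float)
length = vbgf(age, 300.0, 0.35, -0.5) + rng.normal(0, 8.0, age.size)

# Candidate 1: von Bertalanffy (3 parameters)
popt, _ = curve_fit(vbgf, age, length, p0=[350.0, 0.3, 0.0])
rss_vb = np.sum((length - vbgf(age, *popt)) ** 2)

# Candidate 2: straight-line growth (2 parameters)
slope, intercept = np.polyfit(age, length, 1)
rss_lin = np.sum((length - (slope * age + intercept)) ** 2)

aic_vb = aic_ls(rss_vb, age.size, 3)
aic_lin = aic_ls(rss_lin, age.size, 2)
```

With clearly asymptotic growth, the VBGF earns its extra parameter and wins on AIC; in the paper the same comparison logic ranks models with and without cohort, density, and individual effects.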

  4. Determination of original infection source of H7N9 avian influenza by dynamical model.

    PubMed

    Zhang, Juan; Jin, Zhen; Sun, Gui-Quan; Sun, Xiang-Dong; Wang, You-Ming; Huang, Baoxu

    2014-01-01

    H7N9, a newly emerging virus in China, circulates among poultry and humans. Although H7N9 has not caused massive outbreaks, its recurrence in the second half of 2013 makes it essential to control the spread. It is believed that the most effective control measure is to locate the original infection source and cut off the source of infection from humans. However, the original infection source and the internal transmission mechanism of the new virus are not totally clear. In order to determine the original infection source of H7N9, we establish a dynamical model with migratory bird, resident bird, domestic poultry and human populations, and view migratory birds, resident birds, and domestic poultry as the original infection source respectively to fit the true dynamics during the 2013 pandemic. By comparing the data-fitting results and corresponding Akaike Information Criterion (AIC) values, we conclude that migratory birds are most likely the original infection source. In addition, we obtain the basic reproduction number in poultry and carry out sensitivity analysis of some parameters. PMID:24786135

  5. Towards a Model Selection Rule for Quantum State Tomography

    NASA Astrophysics Data System (ADS)

    Scholten, Travis; Blume-Kohout, Robin

    Quantum tomography on large and/or complex systems will rely heavily on model selection techniques, which permit on-the-fly selection of small efficient statistical models (e.g. small Hilbert spaces) that accurately fit the data. Many model selection tools, such as hypothesis testing or Akaike's AIC, rely implicitly or explicitly on the Wilks Theorem, which predicts the behavior of the loglikelihood ratio statistic (LLRS) used to choose between models. We used Monte Carlo simulations to study the behavior of the LLRS in quantum state tomography, and found that it disagrees dramatically with Wilks' prediction. We propose a simple explanation for this behavior; namely, that boundaries (in state space and between models) play a significant role in determining the distribution of the LLRS. The resulting distribution is very complex, depending strongly both on the true state and the nature of the data. We consider a simplified model that neglects anisotropy in the Fisher information, derive an almost analytic prediction for the mean value of the LLRS, and compare it to numerical experiments. While our simplified model outperforms the Wilks Theorem, it still does not predict the LLRS accurately, implying that alternative methods may be necessary for tomographic model selection. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE.
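For contrast with the tomography result above, the classical Wilks prediction is easy to verify by Monte Carlo in a boundary-free nested problem; the sketch below uses a simple Gaussian pair of models and synthetic data (an illustration of what Wilks predicts, not the tomographic setting where it fails).

```python
import numpy as np

# Monte Carlo check of the Wilks theorem for a simple nested pair:
#   H0: y_i ~ N(0, 1)   vs   H1: y_i ~ N(mu, 1), mu free (1 extra parameter).
# Under H0 the LLRS 2*(ll1 - ll0) reduces to n * ybar^2, which should follow
# a chi^2_1 distribution, so its mean should be close to 1.
rng = np.random.default_rng(0)
n, reps = 50, 20000
ybar = rng.normal(0.0, 1.0, size=(reps, n)).mean(axis=1)
llrs = n * ybar ** 2
print(llrs.mean())   # ≈ 1, the chi^2_1 mean predicted by Wilks
```

In state tomography the free parameter often sits on a boundary (e.g. positivity of the density matrix), which is exactly the situation where this chi-square picture breaks down.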

  6. Demographic, Reproductive, and Dietary Determinants of Perfluorooctane Sulfonic (PFOS) and Perfluorooctanoic Acid (PFOA) Concentrations in Human Colostrum.

    PubMed

    Jusko, Todd A; Oktapodas, Marina; Palkovičová Murinová, L'ubica; Babinská, Katarina; Babjaková, Jana; Verner, Marc-André; DeWitt, Jamie C; Thevenet-Morrison, Kelly; Čonka, Kamil; Drobná, Beata; Chovancová, Jana; Thurston, Sally W; Lawrence, B Paige; Dozier, Ann M; Järvinen, Kirsi M; Patayová, Henrieta; Trnovec, Tomáš; Legler, Juliette; Hertz-Picciotto, Irva; Lamoree, Marja H

    2016-07-01

    To determine demographic, reproductive, and maternal dietary factors that predict perfluoroalkyl substance (PFAS) concentrations in breast milk, we measured perfluorooctane sulfonic (PFOS) and perfluorooctanoic acid (PFOA) concentrations, using liquid chromatography-mass spectrometry, in 184 colostrum samples collected from women participating in a cohort study in Eastern Slovakia between 2002 and 2004. During their hospital delivery stay, mothers completed a food frequency questionnaire, and demographic and reproductive data were also collected. PFOS and PFOA predictors were identified by optimizing multiple linear regression models using Akaike's information criterion (AIC). The geometric mean concentration in colostrum was 35.3 pg/mL for PFOS and 32.8 pg/mL for PFOA. In multivariable models, parous women had 40% lower PFOS (95% CI: -56 to -17%) and 40% lower PFOA (95% CI: -54 to -23%) concentrations compared with nulliparous women. Moreover, fresh/frozen fish consumption, longer birth intervals, and Slovak ethnicity were associated with higher PFOS and PFOA concentrations in colostrum. These results will help guide the design of future epidemiologic studies examining milk PFAS concentrations in relation to health end points in children. PMID:27244128

  7. Determination of Original Infection Source of H7N9 Avian Influenza by Dynamical Model

    NASA Astrophysics Data System (ADS)

    Zhang, Juan; Jin, Zhen; Sun, Gui-Quan; Sun, Xiang-Dong; Wang, You-Ming; Huang, Baoxu

    2014-05-01

    H7N9, a newly emerging virus in China, circulates among poultry and humans. Although H7N9 has not caused massive outbreaks, its recurrence in the second half of 2013 makes it essential to control the spread. It is believed that the most effective control measure is to locate the original infection source and cut off the source of infection from humans. However, the original infection source and the internal transmission mechanism of the new virus are not totally clear. In order to determine the original infection source of H7N9, we establish a dynamical model with migratory bird, resident bird, domestic poultry and human populations, and view migratory birds, resident birds, and domestic poultry as the original infection source respectively to fit the true dynamics during the 2013 pandemic. By comparing the data-fitting results and corresponding Akaike Information Criterion (AIC) values, we conclude that migratory birds are most likely the original infection source. In addition, we obtain the basic reproduction number in poultry and carry out sensitivity analysis of some parameters.

  8. IDF relationships using bivariate copula for storm events in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Ariff, N. M.; Jemain, A. A.; Ibrahim, K.; Wan Zin, W. Z.

    2012-11-01

    Intensity-duration-frequency (IDF) curves are used in many hydrologic designs for the purposes of water management and flood prevention. The IDF curves available in Malaysia are those obtained from the univariate analysis approach, which only considers the intensity of rainfall at fixed time intervals. As several rainfall variables are correlated with each other, such as intensity and duration, this paper aims to derive IDF points for storm events in Peninsular Malaysia by means of bivariate frequency analysis. This is achieved by utilizing the relationship between storm intensities and durations using the copula method. Four types of copulas, namely the Ali-Mikhail-Haq (AMH), Frank, Gaussian and Farlie-Gumbel-Morgenstern (FGM) copulas, are considered because the correlation between storm intensity, I, and duration, D, is negative and these copulas are appropriate when the relationship between the variables is negative. The correlations are attained by means of Kendall's τ estimation. The analysis was performed on twenty rainfall stations with hourly data across Peninsular Malaysia. Using Akaike's Information Criterion (AIC) for testing goodness-of-fit, both the Frank and Gaussian copulas are found to be suitable to represent the relationship between I and D. The IDF points found by the copula method are compared to the IDF curves yielded by the typical IDF empirical formula of the univariate approach. This study indicates that storm intensities obtained from both methods are in agreement with each other for any given storm duration and for various return periods.
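A sketch of copula fitting scored by AIC, using the Gaussian copula (one of the four families considered) on rank-based pseudo-observations. The data and the negative dependence below are synthetic stand-ins for storm intensity I and duration D, and grid profiling of the single parameter replaces a proper optimizer.

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussian_copula_loglik(u, v, rho):
    # Closed-form log-likelihood of the bivariate Gaussian copula density
    z, w = norm.ppf(u), norm.ppf(v)
    r2 = rho ** 2
    return np.sum(-0.5 * np.log(1.0 - r2)
                  - (r2 * (z ** 2 + w ** 2) - 2.0 * rho * z * w)
                  / (2.0 * (1.0 - r2)))

# Synthetic negatively dependent pair standing in for intensity I and duration D
rng = np.random.default_rng(3)
x = rng.normal(size=200)
y = -0.6 * x + rng.normal(0.0, 0.8, 200)

# Rank-based pseudo-observations on (0, 1)
u = rankdata(x) / (x.size + 1)
v = rankdata(y) / (y.size + 1)

# Profile rho on a grid; AIC = -2*ll + 2 for the one-parameter copula
grid = np.linspace(-0.95, 0.95, 191)
lls = np.array([gaussian_copula_loglik(u, v, r) for r in grid])
rho_hat = grid[int(np.argmax(lls))]
aic = -2.0 * lls.max() + 2.0
```

Fitting each candidate family (AMH, Frank, FGM, Gaussian) this way and comparing AICs reproduces the paper's goodness-of-fit ranking step.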

  9. Prevalence and predictors for musculoskeletal discomfort in Malaysian office workers: Investigating explanatory factors for a developing country.

    PubMed

    Maakip, Ismail; Keegel, Tessa; Oakman, Jodi

    2016-03-01

    Musculoskeletal disorders (MSDs) are a major occupational health issue for workers in developed and developing countries, including Malaysia. Most research related to MSDs has been undertaken in developed countries; given the different regulatory and cultural practices it is plausible that contributions of hazard and risk factors may be different. A population of Malaysian public service office workers were surveyed (N = 417, 65.5% response rate) to determine prevalence and associated predictors of MSD discomfort. The 6-month period prevalence of MSD discomfort was 92.8% (95%CI = 90.2-95.2%). Akaike's Information Criterion (AIC) analysis was used to compare a range of models and determine a model of best fit. Contributions associated with MSD discomfort in the final model consisted of physical demands (61%), workload (14%), gender (13%), work-home balance (9%) and psychosocial factors (3%). Factors associated with MSD discomfort were similar in developed and developing countries but the relative contribution of factors was different, providing insight into future development of risk management strategies.

  10. Persistent disturbance by commercial navigation alters the relative abundance of channel-dwelling fishes in a large river

    USGS Publications Warehouse

    Gutreuter, S.; Vallazza, J.M.; Knights, B.C.

    2006-01-01

    We provide the first evidence for chronic effects of disturbance by commercial vessels on the spatial distribution and abundance of fishes in the channels of a large river. Most of the world's large rivers are intensively managed to satisfy increasing demands for commercial shipping, but little research has been conducted to identify and alleviate any adverse consequences of commercial navigation. We used a combination of a gradient sampling design incorporating quasicontrol areas with Akaike's information criterion (AIC)-weighted model averaging to estimate effects of disturbances by commercial vessels on fishes in the upper Mississippi River. Species density, which mainly measured species evenness, decreased with increasing disturbance frequency. The most abundant species - gizzard shad (Dorosoma cepedianum) and freshwater drum (Aplodinotus grunniens) - and the less abundant shovelnose sturgeon (Scaphirhynchus platorhynchus) and flathead catfish (Pylodictis olivaris) were seemingly unaffected by traffic disturbance. In contrast, the relative abundance of the toothed herrings (Hiodon spp.), redhorses (Moxostoma spp.), buffaloes (Ictiobus spp.), channel catfish (Ictalurus punctatus), sauger (Sander canadensis), and white bass (Morone chrysops) decreased with increasing traffic in the navigation channel. We hypothesized that the combination of alteration of hydraulic features within navigation channels and rehabilitation of secondary channels might benefit channel-dependent species. © 2006 NRC.
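AIC-weighted model averaging of the kind used above rests on Akaike weights, which convert AIC differences into relative model support. A minimal sketch with hypothetical AIC values (not those of the study):

```python
import math

def akaike_weights(aic_values):
    """Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
    where delta_i is each model's AIC difference from the best (lowest) AIC."""
    best = min(aic_values)
    rel = [math.exp(-(a - best) / 2) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AICs for three candidate traffic-disturbance models:
weights = akaike_weights([100.0, 102.0, 110.0])
```

The weights sum to one and can then multiply each model's parameter estimates to produce the weighted averages used in the study's inference.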

  11. Land-use and land-cover change in Western Ghats of India.

    PubMed

    Kale, Manish P; Chavan, Manoj; Pardeshi, Satish; Joshi, Chitiz; Verma, Prabhakar A; Roy, P S; Srivastav, S K; Srivastava, V K; Jha, A K; Chaudhari, Swapnil; Giri, Yogesh; Krishna Murthy, Y V N

    2016-07-01

    The Western Ghats (WG) of India, one of the hottest biodiversity hotspots in the world, has witnessed major land-use and land-cover (LULC) change in recent times. The present research was aimed at studying the patterns of LULC change in WG during 1985-1995-2005, understanding the major drivers that caused such change, and projecting the future (2025) spatial distribution of forest using coupled logistic regression and Markov model. The International Geosphere Biosphere Program (IGBP) classification scheme was mainly followed in LULC characterization and change analysis. The single-step Markov model was used to project the forest demand. The spatial allocation of such forest demand was based on the predicted probabilities derived through logistic regression model. The R statistical package was used to set the allocation rules. The projection model was selected based on Akaike information criterion (AIC) and area under receiver operating characteristic (ROC) curve. The actual and projected areas of forest in 2005 were compared before making projection for 2025. It was observed that forest degradation has reduced from 1985-1995 to 1995-2005. The study obtained important insights about the drivers and their impacts on LULC simulations. To the best of our knowledge, this is the first attempt where projection of future state of forest in entire WG is made based on decadal LULC and socio-economic datasets at the Taluka (sub-district) level.

  12. Accretion Timescales from Kepler AGN

    NASA Astrophysics Data System (ADS)

    Kasliwal, Vishal P.; Vogeley, Michael S.; Richards, Gordon T.

    2015-01-01

    We constrain AGN accretion disk variability mechanisms using the optical light curves of AGN observed by Kepler. AGN optical fluxes are known to exhibit stochastic variations on timescales of hours, days, months and years. The excellent sampling properties of the original Kepler mission - high S/N ratio (~10^5), short sampling interval (30 minutes), and long sampling duration (~3.5 years) - allow for a detailed examination of the differences between the variability processes present in various sub-types of AGN such as Type I and II Seyferts, QSOs, and Blazars. We model the flux data using the Auto-Regressive Moving Average (ARMA) representation from the field of time series analysis. We use the Kalman filter to determine optimal model parameters and use the Akaike Information Criterion (AIC) to select the optimal model. We find that optical light curves from Kepler AGN cannot be fit by low order statistical models such as the popular AR(1) process or damped random walk. Kepler light curves exhibit complicated power spectra and are better modeled by higher order ARMA processes. We find that Kepler AGN typically exhibit power spectra that change from a bending power law (PSD ~ 1/f^α) to a flat power spectrum on timescales in the range of ~5-100 days, consistent with the orbital and thermal timescales of a typical 10^7 solar-mass black hole.
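AIC-based order selection of the kind described above can be illustrated on simulated data. A self-contained sketch using least-squares autoregressive fits (the simulated process and its coefficients are arbitrary, and this stands in for the study's Kalman-filter ARMA machinery):

```python
import math
import random

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ar_aic(series, p):
    """Fit AR(p) by least squares; AIC = n * ln(SSE/n) + 2 * (p + 1)."""
    y = series[p:]
    X = [[series[t - j - 1] for j in range(p)] for t in range(p, len(series))]
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(p)] for i in range(p)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(p)]
    beta = solve(XtX, Xty)
    sse = sum((yi - sum(b * xi for b, xi in zip(beta, row))) ** 2
              for yi, row in zip(y, X))
    n = len(y)
    return n * math.log(sse / n) + 2 * (p + 1)

# Simulate an AR(2) process, then compare candidate orders by AIC:
random.seed(0)
x = [0.0, 0.0]
for _ in range(400):
    x.append(0.75 * x[-1] - 0.5 * x[-2] + random.gauss(0, 1))
aics = {p: ar_aic(x, p) for p in (1, 2, 3)}
```

AIC penalizes the extra lag of AR(3) while rewarding the genuine second lag, so the AR(2) fit scores better than the underfit AR(1), mirroring how higher-order ARMA models win over AR(1) for Kepler light curves.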

  13. Predictive occurrence models for coastal wetland plant communities: delineating hydrologic response surfaces with multinomial logistic regression

    USGS Publications Warehouse

    Snedden, Gregg A.; Steyer, Gregory D.

    2013-01-01

    Understanding plant community zonation along estuarine stress gradients is critical for effective conservation and restoration of coastal wetland ecosystems. We related the presence of plant community types to estuarine hydrology at 173 sites across coastal Louisiana. Percent relative cover by species was assessed at each site near the end of the growing season in 2008, and hourly water level and salinity were recorded at each site Oct 2007–Sep 2008. Nine plant community types were delineated with k-means clustering, and indicator species were identified for each of the community types with indicator species analysis. An inverse relation between salinity and species diversity was observed. Canonical correspondence analysis (CCA) effectively segregated the sites across ordination space by community type, and indicated that salinity and tidal amplitude were both important drivers of vegetation composition. Multinomial logistic regression (MLR) and Akaike's Information Criterion (AIC) were used to predict the probability of occurrence of the nine vegetation communities as a function of salinity and tidal amplitude, and probability surfaces obtained from the MLR model corroborated the CCA results. The weighted kappa statistic, calculated from the confusion matrix of predicted versus actual community types, was 0.7 and indicated good agreement between observed community types and model predictions. Our results suggest that models based on a few key hydrologic variables can be valuable tools for predicting vegetation community development when restoring and managing coastal wetlands.
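The weighted kappa statistic reported above is computed from the confusion matrix of predicted versus actual community types. A minimal linear-weighted sketch with hypothetical counts (not the study's matrix, which covered nine communities):

```python
def weighted_kappa(conf):
    """Linear-weighted kappa from a k x k confusion matrix
    (rows = observed class, columns = predicted class)."""
    k = len(conf)
    n = sum(sum(row) for row in conf)
    row_tot = [sum(row) for row in conf]
    col_tot = [sum(conf[i][j] for i in range(k)) for j in range(k)]
    # Linear disagreement weights: 0 on the diagonal, growing with distance.
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    observed = sum(w[i][j] * conf[i][j] / n for i in range(k) for j in range(k))
    expected = sum(w[i][j] * row_tot[i] * col_tot[j] / n ** 2
                   for i in range(k) for j in range(k))
    return 1 - observed / expected

# Toy 3-community confusion matrix (hypothetical counts):
conf = [[40, 5, 1],
        [6, 30, 4],
        [1, 5, 28]]
kappa = weighted_kappa(conf)
```

Values near 0 indicate chance-level agreement and values near 1 near-perfect agreement; the 0.7 reported in the study falls in the conventional "good agreement" range.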

  14. Informed Consent

    PubMed Central

    Manion, F.; Hsieh, K.; Harris, M.

    2015-01-01

    Summary Background Despite efforts to provide standard definitions of terms such as “medical record”, “computer-based patient record”, “electronic medical record” and “electronic health record”, the terms are still used interchangeably. Initiatives like data and information governance, research biorepositories, and learning health systems require availability and reuse of data, as well as common understandings of the scope for specific purposes. Lacking widely shared definitions, utilization of the aforementioned terms in research informed consent documents calls into question whether all participants in the research process — patients, information technology and regulatory staff, and the investigative team — fully understand what data and information they are asking to obtain and agreeing to share. Objectives This descriptive study explored the terminology used in research informed consent documents when describing patient data and information, asking the question “Does the use of the term ‘medical record’ in the context of a research informed consent document accurately represent the scope of the data involved?” Methods Informed consent document templates found on 17 Institutional Review Board (IRB) websites with Clinical and Translational Science Awards (CTSA) were searched for terms that appeared to be describing the data resources to be accessed. The National Library of Medicine’s (NLM) Terminology Services was searched for definitions provided by key standards groups that deposit terminologies with the NLM. Discussion The results suggest research consent documents are using outdated terms to describe patient information, that health care terminology systems need to consider the context of research for use cases, and that there is significant work to be done to assure the HIPAA Omnibus Rule is applied to contemporary activities such as biorepositories and learning health systems. Conclusions “Medical record”, a term used extensively

  15. Prediction of thoracic injury severity in frontal impacts by selected anatomical morphomic variables through model-averaged logistic regression approach.

    PubMed

    Zhang, Peng; Parenteau, Chantal; Wang, Lu; Holcombe, Sven; Kohoyda-Inglis, Carla; Sullivan, June; Wang, Stewart

    2013-11-01

    This study resulted in a model-averaging methodology that predicts crash injury risk using vehicle, demographic, and morphomic variables and assesses the importance of individual predictors. The effectiveness of this methodology was illustrated through analysis of occupant chest injuries in frontal vehicle crashes. The crash data were obtained from the International Center for Automotive Medicine (ICAM) database for calendar years 1996 to 2012. The morphomic data are quantitative measurements of variations in human body 3-dimensional anatomy. Morphomics are obtained from imaging records. In this study, morphomics were obtained from chest, abdomen, and spine CT using novel patented algorithms. A NASS-trained crash investigator with over thirty years of experience collected the in-depth crash data. There were 226 cases available with occupants involved in frontal crashes and morphomic measurements. Only cases with complete recorded data were retained for statistical analysis. Logistic regression models were fitted using all possible configurations of vehicle, demographic, and morphomic variables. Different models were ranked by the Akaike Information Criterion (AIC). An averaged logistic regression model approach was used due to the limited sample size relative to the number of variables. This approach is helpful when addressing variable selection, building prediction models, and assessing the importance of individual variables. The final predictive results were developed using this approach, based on the top 100 models in the AIC ranking. Model-averaging minimized model uncertainty, decreased the overall prediction variance, and provided an approach to evaluating the importance of individual variables. There were 17 variables investigated: four vehicle, four demographic, and nine morphomic. More than 130,000 logistic models were investigated in total. The models were characterized into four scenarios to assess individual variable contribution to injury risk.
Scenario
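Averaging across top AIC-ranked models, as described above, weights each model's coefficient estimate by its Akaike weight, with a coefficient of zero in models that omit the variable. A minimal sketch with hypothetical models, AICs and coefficients (not from the study):

```python
import math

def model_averaged_estimate(models):
    """Akaike-weight average of one coefficient across candidate models.
    `models` maps name -> (AIC, coefficient estimate); a model that omits
    the variable contributes an estimate of 0."""
    best = min(aic for aic, _ in models.values())
    weights = {m: math.exp(-(aic - best) / 2) for m, (aic, _) in models.items()}
    z = sum(weights.values())
    return sum(weights[m] / z * est for m, (_, est) in models.items())

# Hypothetical injury-risk models and their coefficient for one morphomic
# variable (all names and numbers are made up for illustration):
models = {
    "vehicle_only":       (210.4, 0.0),   # variable absent -> coefficient 0
    "vehicle_morphomic":  (198.1, 0.42),
    "full":               (199.0, 0.38),
}
beta_avg = model_averaged_estimate(models)
```

The poorly supported model contributes almost nothing, so the averaged estimate lands between the two well-supported models' coefficients.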

  17. Spacetime information

    SciTech Connect

    Hartle, J.B. (Isaac Newton Institute for the Mathematical Sciences, University of Cambridge, Cambridge CB3 0EH)

    1995-02-15

    In usual quantum theory, the information available about a quantum system is defined in terms of the density matrix describing it on a spacelike surface. This definition must be generalized for extensions of quantum theory which neither require, nor always permit, a notion of state on a spacelike surface. In particular, it must be generalized for the generalized quantum theories appropriate when spacetime geometry fluctuates quantum mechanically or when geometry is fixed but not foliable by spacelike surfaces. This paper introduces a four-dimensional notion of the information available about a quantum system's boundary conditions in the various sets of decohering, coarse-grained histories it may display. This spacetime notion of information coincides with the familiar one when quantum theory is formulable in terms of states on spacelike surfaces but generalizes this notion when it cannot be so formulated. The idea of spacetime information is applied in several contexts: When spacetime geometry is fixed the information available through alternatives restricted to a fixed spacetime region is defined. The information available through histories of alternatives of general operators is compared to that obtained from the more limited coarse grainings of sum-over-histories quantum mechanics that refer only to coordinates. The definition of information is considered in generalized quantum theories. We consider as specific examples time-neutral quantum mechanics with initial and final conditions, quantum theories with nonunitary evolution, and the generalized quantum frameworks appropriate for quantum spacetime. In such theories complete information about a quantum system is not necessarily available on any spacelike surface but must be searched for throughout spacetime. The information loss commonly associated with the “evolution of pure states into mixed states” in black hole evaporation is thus not in conflict with the principles of generalized quantum mechanics.

  18. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, Kritina; Sandor, A.; Thompson, S. G.; McCann, R. S.; Kaiser, M. K.; Begault, D. R.; Adelstein, B. D.; Beutter, B. R.; Stone, L. S.

    2008-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew on flight vehicles, surface landers and habitats, and during extra-vehicular activities (EVA). Designers of displays and controls for exploration missions must be prepared to select the text formats, label styles, alarms, electronic procedure designs, and cursor control devices that provide for optimal crew performance on exploration tasks. The major areas of work, or subtasks, within the Information Presentation DRP are: 1) Controls, 2) Displays, 3) Procedures, and 4) EVA Operations.

  19. [Informed consent].

    PubMed

    Rodríguez, C R; González Parra, E; Martínez Castelao, A

    2008-01-01

    - Basic law 41/2002 on patient autonomy regulates the rights and obligations of patients, users and professionals, as well as those of public and private health care centers and services. This regulation refers to patient autonomy, the right to information and essential clinical documentation. - This law establishes the minimum requirements for the information the patient should receive and the decision making in which the patient should take part. Diagnostic tests are performed and therapeutic decisions are taken in the ACKD unit, in which patient information is an essential and mandatory requirement according to this law. PMID:19018748

  20. Agricultural Information and Scientific Information.

    ERIC Educational Resources Information Center

    Parker, J. Stephen, Ed.

    1991-01-01

    Six articles discuss the need for increased access to information for agricultural and scientific research in the countries of Zambia, Kenya, Ghana, Turkey, India, and Nigeria. Discussions focus on each country's current scientific and agricultural development and the demand for scientific materials and greater information dissemination. (MAB)

  1. Copyright Information

    Atmospheric Science Data Center

    2013-03-25

    ... not copyrighted. You may use NASA imagery, video and audio material for educational or informational purposes, including photo ...   NASA should be acknowledged as the source of the material except in cases of advertising. See  NASA Advertising Guidelines . ...

  2. Information Presentation

    NASA Technical Reports Server (NTRS)

    Holden, K.L.; Boyer, J.L.; Sandor, A.; Thompson, S.G.; McCann, R.S.; Begault, D.R.; Adelstein, B.D.; Beutter, B.R.; Stone, L.S.

    2009-01-01

    The goal of the Information Presentation Directed Research Project (DRP) is to address design questions related to the presentation of information to the crew. The major areas of work, or subtasks, within this DRP are: 1) Displays, 2) Controls, 3) Electronic Procedures and Fault Management, and 4) Human Performance Modeling. This DRP is a collaborative effort between researchers at Johnson Space Center and Ames Research Center.

  3. [Information systems].

    PubMed

    Rodríguez Maniega, José Antonio; Trío Maseda, Reyes

    2005-03-01

    The arrival of victims of the terrorist attacks of 11 March at the hospital put the efficiency of its information systems to the test. To be most efficient, these systems should be simple and directed, above all, to the follow-up of victims and to providing the necessary information to patients and families. A specific and easy to use system is advisable. PMID:15771852

  4. Information engineering

    SciTech Connect

    Hunt, D.N.

    1997-02-01

    The Information Engineering thrust area develops information technology to support the programmatic needs of Lawrence Livermore National Laboratory's Engineering Directorate. Progress in five programmatic areas is described in separate reports contained herein. These are entitled Three-dimensional Object Creation, Manipulation, and Transport; Zephyr: A Secure Internet-Based Process to Streamline Engineering Procurements; Subcarrier Multiplexing: Optical Network Demonstrations; Parallel Optical Interconnect Technology Demonstration; and Intelligent Automation Architecture.

  5. MH2c: Characterization of major histocompatibility α-helices - an information criterion approach

    NASA Astrophysics Data System (ADS)

    Hischenhuber, B.; Frommlet, F.; Schreiner, W.; Knapp, B.

    2012-07-01

    Major histocompatibility proteins share a common overall structure of the peptide binding groove. Two binding groove domains, on the same chain for major histocompatibility class I or on two different chains for major histocompatibility class II, contribute to that structure, which consists of two α-helices (“wall”) and a sheet of eight anti-parallel beta strands (“floor”). Apart from the peptide presented in the groove, the major histocompatibility α-helices play a central role for the interaction with the T cell receptor. This study presents a generalized mathematical approach for the characterization of these helices. We employed polynomials of degree 1 to 7 and splines with 1 to 2 nodes based on polynomials of degree 1 to 7 on the α-helices projected on their principal components. We evaluated all models with a corrected Akaike Information Criterion to determine which model represents the α-helices in the best way without overfitting the data. This method is applicable for both the stationary and the dynamic characterization of α-helices. By deriving differential geometric parameters from these models one obtains a reliable method to characterize and compare α-helices for a broad range of applications. Catalogue identifier: AELX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 327 565 No. of bytes in distributed program, including test data, etc.: 17 433 656 Distribution format: tar.gz Programming language: Matlab Computer: Personal computer architectures Operating system: Windows, Linux, Mac (all systems on which Matlab can be installed) RAM: Depends on the trajectory size, min. 1 GB (Matlab) Classification: 2.1, 4.9, 4.14 External routines: Curve Fitting Toolbox and Statistics Toolbox of

  6. MMA, A Computer Code for Multi-Model Analysis

    USGS Publications Warehouse

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. 
Many applications of MMA will
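The default discrimination criteria that MMA reports can be written down compactly from a calibrated model's maximized log-likelihood. A sketch of AIC, AICc and BIC with hypothetical log-likelihoods and parameter counts (KIC is omitted here, as it also needs the Fisher information term):

```python
import math

def aic(loglik, k):
    """Akaike Information Criterion: -2 ln L + 2k."""
    return -2 * loglik + 2 * k

def aicc(loglik, k, n):
    """Second-order-bias-corrected AIC; requires n > k + 1."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    """Bayesian Information Criterion: -2 ln L + k ln n."""
    return -2 * loglik + k * math.log(n)

# Two hypothetical calibrated models: (max log-likelihood, parameter count).
n = 30  # number of observations
models = {"simple": (-45.2, 3), "complex": (-41.0, 8)}
ranks = {name: (aic(ll, k), aicc(ll, k, n), bic(ll, k, n))
         for name, (ll, k) in models.items()}
```

With this small sample, the better raw fit of the complex model does not overcome its parameter penalty, and the AICc correction penalizes the complex model harder still, which is the small-sample behavior AICc is designed for.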

  7. When Information Improves Information Security

    NASA Astrophysics Data System (ADS)

    Grossklags, Jens; Johnson, Benjamin; Christin, Nicolas

    This paper presents a formal, quantitative evaluation of the impact of bounded-rational security decision-making subject to limited information and externalities. We investigate a mixed economy of an individual rational expert and several naïve near-sighted agents. We further model three canonical types of negative externalities (weakest-link, best shot and total effort), and study the impact of two information regimes on the threat level agents are facing.

  8. Selecting the right statistical model for analysis of insect count data by using information theoretic measures.

    PubMed

    Sileshi, G

    2006-10-01

    Researchers and regulatory agencies often make statistical inferences from insect count data using modelling approaches that assume homogeneous variance. Such models do not allow for formal appraisal of variability, which, in its different forms, is the subject of interest in ecology. Therefore, the objectives of this paper were to (i) compare models suitable for handling variance heterogeneity and (ii) select optimal models to ensure valid statistical inferences from insect count data. The log-normal, standard Poisson, Poisson corrected for overdispersion, zero-inflated Poisson, negative binomial and zero-inflated negative binomial models were compared using six count datasets on foliage-dwelling insects and five families of soil-dwelling insects. Akaike's and Schwarz's Bayesian information criteria were used for comparing the various models. Over 50% of the counts were zeros, even in locally abundant species such as Ootheca bennigseni Weise, Mesoplatys ochroptera Stål and Diaecoderus spp. The Poisson model corrected for overdispersion and the standard negative binomial model provided a better description of the probability distribution of seven of the 11 insects than the log-normal, standard Poisson, zero-inflated Poisson or zero-inflated negative binomial models. It is concluded that excess zeros and variance heterogeneity are common in insect counts. If not properly modelled, these properties can invalidate normal-distribution assumptions, resulting in biased estimation of ecological effects and jeopardizing the integrity of the scientific inferences. Therefore, it is recommended that statistical models appropriate for handling these data properties be selected using objective criteria to ensure efficient statistical inference.
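A comparison like the one above — Poisson versus negative binomial on zero-heavy counts — can be sketched with hand-rolled log-likelihoods. The counts below are made up, and the negative binomial parameters come from method-of-moments rather than the full maximum-likelihood fits a real analysis would use:

```python
import math

def poisson_loglik(xs, lam):
    """Poisson log-likelihood at rate lam (the MLE is the sample mean)."""
    return sum(x * math.log(lam) - lam - math.lgamma(x + 1) for x in xs)

def negbin_loglik(xs, r, p):
    """Negative binomial log-likelihood:
    P(X = x) = Gamma(x + r) / (Gamma(r) x!) * p^r * (1 - p)^x."""
    return sum(math.lgamma(x + r) - math.lgamma(r) - math.lgamma(x + 1)
               + r * math.log(p) + x * math.log(1 - p) for x in xs)

# Zero-heavy, overdispersed toy counts like those in the insect datasets:
xs = [0] * 30 + [1] * 5 + [2] * 3 + [8, 12, 15, 20]
m = sum(xs) / len(xs)
v = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
r = m * m / (v - m)  # method-of-moments size parameter (valid since v > m)
p = r / (r + m)
aic_pois = -2 * poisson_loglik(xs, m) + 2 * 1   # one parameter: lam
aic_nb = -2 * negbin_loglik(xs, r, p) + 2 * 2   # two parameters: r, p
```

Because the sample variance far exceeds the mean, the equal-mean-variance Poisson model is heavily penalized in likelihood, and the negative binomial wins on AIC despite its extra parameter.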

  9. [Informed consent].

    PubMed

    Medina Castellano, Carmen Delia

    2009-10-01

    At present, numerous complaints claiming defects at some point in the process of obtaining informed consent are filed in courts of justice; these complaints share an underlying common element: the role that health professionals play in these processes. Obtaining consent can be seen more as a means of securing judicial protection for professional practice than as an exercise of respect for the dignity and freedom of health service patients. This article reflects on two basic rights related to informed consent: adequately obtaining this consent and the need to protect those people who lack, either partially or totally, the capacity to make this decision by themselves. The author then makes some considerations about the necessity of obtaining informed consent for nursing practices and treatment.

  10. Information Service.

    ERIC Educational Resources Information Center

    Scofield, James

    Newspaper librarians discussed the public use of their newspapers' libraries. Policies run the gamut from well-staffed public information services, within or outside the newspaper library, to no service at all to those outside the staff of the paper. Problems of dealing with tax and law enforcement agencies were covered, as well as cooperative…

  11. Working Information

    ERIC Educational Resources Information Center

    Lloyd, Annemaree; Somerville, Margaret

    2006-01-01

    Purpose: The purpose of this article is to explore the contribution that an information literacy approach to the empirical study of workplace learning can make to how people understand and conceptualise workplace learning. Design/methodology/approach: Three cohorts of fire-fighters working in two regional locations in NSW, Australia were…

  12. Yesterday's Information.

    ERIC Educational Resources Information Center

    McKay, Martin D.; Stout, J. David

    1999-01-01

    Discusses access to Internet resources in school libraries, including the importance of evaluating content and appropriate use. The following online services that provide current factual information from legitimate resources are described: SIRS (Social Issues Resource Series), InfoTrac, EBSCO Host, SearchBank, and the Electric Library. (MES)

  13. Envisioning Information.

    ERIC Educational Resources Information Center

    Tufte, Edward R.

    This book presents over 400 illustrations of complex data that show how the dimensionality and density of portrayals can be enhanced. Practical advice on how to explain complex materials by visual means is given, and examples illustrate the fundamental principles of information display. Design strategies presented are exemplified in maps, the…

  14. Information, Please.

    ERIC Educational Resources Information Center

    Hardy, Lawrence

    2003-01-01

    Requirements of the No Child Left Behind Act present school districts with a massive lesson in data-driven decision-making. Technology companies offer data-management tools that organize student information from state tests. Offers districts advice in choosing a technology provider. (MLF)

  15. Information Processing.

    ERIC Educational Resources Information Center

    Jennings, Carol Ann; McDonald, Sandy

    This publication contains instructional materials for teacher and student use for a course in information processing. The materials are written in terms of student performance using measurable objectives. The course includes 10 units. Each instructional unit contains some or all of the basic components of a unit of instruction: performance…

  16. Teaching Information Skills: Recording Information.

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2002-01-01

    Discusses how to teach students in primary and intermediate grades to record and organize information. Highlights include developing a research question; collaborative planning between teachers and library media specialists; consistency of data entry; and an example of a unit on animal migration based on an appropriate Web site. (LRW)

  17. Information management - Assessing the demand for information

    NASA Technical Reports Server (NTRS)

    Rogers, William H.

    1991-01-01

    Information demand is defined in terms of both information content (what information) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. It is seen that the essence of this intelligence is comprehending and capturing the demand for information.

  18. Information Environments

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J.; Naiman, Cynthia

    2003-01-01

    The objective of GRC CNIS/IE work is to build a plug-n-play infrastructure that provides the Grand Challenge Applications with a suite of tools for coupling codes together, numerical zooming between fidelity of codes and gaining deployment of these simulations onto the Information Power Grid. The GRC CNIS/IE work will streamline and improve this process by providing tighter integration of various tools through the use of object oriented design of component models and data objects and through the use of CORBA (Common Object Request Broker Architecture).

  19. Geographic Information Office

    USGS Publications Warehouse

    2004-01-01

    The Geographic Information Office (GIO) is the principal information office for U.S. Geological Survey (USGS), focused on: Information Policy and Services, Information Technology, Science Information, Information Security, and the Federal Geographic Data Committee/Geospatial One Stop.

  20. Change in BMI accurately predicted by social exposure to acquaintances.

    PubMed

    Oloritun, Rahman O; Ouarda, Taha B M J; Moturu, Sai; Madan, Anmol; Pentland, Alex Sandy; Khayal, Inas

    2013-01-01

    Research has mostly focused on obesity and not on processes of BMI change more generally, although these may be key factors that lead to obesity. Studies have suggested that obesity is affected by social ties. However, these studies used survey-based data collection techniques that may be biased toward selecting only close friends and relatives. In this study, mobile phone sensing techniques were used to routinely capture social interaction data in an undergraduate dorm. By automating the capture of social interaction data, the limitations of self-reported social exposure data are avoided. This study attempts to understand and develop a model that best describes the change in BMI using social interaction data. We evaluated a cohort of 42 college students in a co-located university dorm, with social interactions captured automatically via mobile phones alongside survey-based health-related information. We determined the most predictive variables for change in BMI using the least absolute shrinkage and selection operator (LASSO) method. The selected variables, together with gender, healthy diet category, and ability to manage stress, were used to build multiple linear regression models that estimate the effect of exposure and individual factors on change in BMI. We identified the best model using the Akaike Information Criterion (AIC) and R². This study found a model that explains 68% (p<0.0001) of the variation in change in BMI. The model combined social interaction data, especially from acquaintances, and personal health-related information to explain change in BMI. This is the first study to take into account both interactions at different levels of social exposure and personal health-related information. Social interactions with acquaintances accounted for more than half the variation in change in BMI. This suggests the importance of not only individual health information but also the significance of social interactions with people we are exposed to, even people we may not consider as close friends.
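    AIC-based selection among competing regression models, as used in the study above, can be illustrated with a small sketch. Everything here is hypothetical (simulated data, two candidate models, made-up true slope); for least-squares fits the Gaussian AIC reduces, up to an additive constant, to n·ln(RSS/n) + 2k, where k counts the estimated parameters:

```python
import math
import random

def gaussian_aic(rss, n, k):
    """AIC for a least-squares fit, up to an additive constant:
    n*ln(RSS/n) + 2k, where k counts estimated parameters."""
    return n * math.log(rss / n) + 2 * k

random.seed(0)
n = 50
x = [random.uniform(0, 10) for _ in range(n)]
y = [1.5 * xi + random.gauss(0, 1) for xi in x]  # true slope 1.5 (illustrative)

# Model 0: intercept only (k = 2: mean + error variance)
mean_y = sum(y) / n
rss0 = sum((yi - mean_y) ** 2 for yi in y)

# Model 1: simple linear regression (k = 3: intercept, slope, variance)
mean_x = sum(x) / n
sxx = sum((xi - mean_x) ** 2 for xi in x)
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
slope = sxy / sxx
intercept = mean_y - slope * mean_x
rss1 = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))

aic0 = gaussian_aic(rss0, n, 2)
aic1 = gaussian_aic(rss1, n, 3)
best = "slope model" if aic1 < aic0 else "intercept-only"
print(best)  # the predictor carries real signal, so AIC favours the slope model
```

    The model with the lower AIC is preferred; ΔAIC values near zero would indicate essentially equivalent support for the two candidates.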

  1. Informed consent.

    PubMed

    Steevenson, Grania

    2006-08-01

    Disclosure of information prior to consent is a very complex area of medical ethics. On the surface it would seem to be quite clear cut, but on closer inspection the scope for 'grey areas' is vast. In practice, however, it could be argued that the number of cases that result in complaint or litigation is comparatively small. However, this does not mean that wrong decisions or unethical scenarios do not occur. It would seem that in clinical practice these ethical grey areas concerning patients' full knowledge of their condition or treatment are quite common. One of the barometers for how much disclosure should be given prior to consent could be the feedback obtained from the patients. Are they asking relevant questions pertinent to their condition and do they show a good understanding of the options available? This should be seen as a positive trait and should be welcomed by the healthcare professionals. Ultimately it gives patients greater autonomy and the healthcare professional can expand and build on the patient's knowledge as well as allay fears perhaps based on wrongly held information. Greater communication with the patient would help the healthcare professional pitch their explanations at the right level. Every case and scenario is different and unique and deserves to be treated as such. Studies have shown that most patients can understand their medical condition and treatment provided communication has been thorough (Gillon 1996). It is in the patients' best interests to feel comfortable with the level of disclosure offered to them. It can only foster greater trust and respect between them and the healthcare profession which has to be mutually beneficial to both parties.

  3. Effects of ADC Nonlinearity on the Spurious Dynamic Range Performance of Compressed Sensing

    PubMed Central

    Tian, Pengwu; Yu, Hongyi

    2014-01-01

    Analog-to-information converters (AICs) play an important role in compressed sensing systems and have the potential to significantly extend the capabilities of conventional analog-to-digital converters. This paper evaluates the impact of AIC nonlinearity on the dynamic performance of a practical compressed sensing system, including the nonlinearity introduced by quantization as well as circuit non-idealities. It presents intuitive yet quantitative insights into the harmonics of the AIC's quantization output, and the effect of other AIC nonlinearities on spurious-free dynamic range (SFDR) performance is also analyzed. The analysis and simulation results demonstrate that, compared with a conventional ADC-based system, the measurement process decorrelates the input signal from the quantization error and alleviates the effects of other AIC nonlinearities, which results in a dramatic increase in SFDR. PMID:24895645

  4. Testing the consistency of wildlife data types before combining them: the case of camera traps and telemetry.

    PubMed

    Popescu, Viorel D; Valpine, Perry; Sweitzer, Rick A

    2014-04-01

    Wildlife data gathered by different monitoring techniques are often combined to estimate animal density. However, methods to check whether different types of data provide consistent information (i.e., can information from one data type be used to predict responses in the other?) before combining them are lacking. We used generalized linear models and generalized linear mixed-effects models to relate camera trap probabilities for marked animals to independent space use from telemetry relocations using 2 years of data for fishers (Pekania pennanti) as a case study. We evaluated (1) camera trap efficacy by estimating how camera detection probabilities are related to nearby telemetry relocations and (2) whether home range utilization density estimated from telemetry data adequately predicts camera detection probabilities, which would indicate consistency of the two data types. The number of telemetry relocations within 250 and 500 m from camera traps predicted detection probability well. For the same number of relocations, females were more likely to be detected during the first year. During the second year, all fishers were more likely to be detected during the fall/winter season. Models predicting camera detection probability and photo counts solely from telemetry utilization density had the best or nearly best Akaike Information Criterion (AIC), suggesting that telemetry and camera traps provide consistent information on space use. Given the same utilization density, males were more likely to be photo-captured due to larger home ranges and higher movement rates. Although methods that combine data types (spatially explicit capture-recapture) make simple assumptions about home range shapes, it is reasonable to conclude that in our case, camera trap data do reflect space use in a manner consistent with telemetry data. However, differences between the 2 years of data suggest that camera efficacy is not fully consistent across ecological conditions and make the case

  6. Selecting a distributional assumption for modelling relative densities of benthic macroinvertebrates

    USGS Publications Warehouse

    Gray, B.R.

    2005-01-01

    The selection of a distributional assumption suitable for modelling macroinvertebrate density data is typically challenging. Macroinvertebrate data often exhibit substantially larger variances than expected under a standard count assumption, that of the Poisson distribution. Such overdispersion may derive from multiple sources, including heterogeneity of habitat (historically and spatially), differing life histories for organisms collected within a single collection in space and time, and autocorrelation. Taken to extreme, heterogeneity of habitat may be argued to explain the frequent large proportions of zero observations in macroinvertebrate data. Sampling locations may consist of habitats defined qualitatively as either suitable or unsuitable. The former category may yield random or stochastic zeroes and the latter structural zeroes. Heterogeneity among counts may be accommodated by treating the count mean itself as a random variable, while extra zeroes may be accommodated using zero-modified count assumptions, including zero-inflated and two-stage (or hurdle) approaches. These and linear assumptions (following log- and square root-transformations) were evaluated using 9 years of mayfly density data from a 52 km, ninth-order reach of the Upper Mississippi River (n = 959). The data exhibited substantial overdispersion relative to that expected under a Poisson assumption (i.e. variance:mean ratio = 23 ≫ 1), and 43% of the sampling locations yielded zero mayflies. Based on the Akaike Information Criterion (AIC), count models were improved most by treating the count mean as a random variable (via a Poisson-gamma distributional assumption) and secondarily by zero modification (i.e. improvements in AIC values = 9184 units and 47-48 units, respectively). Zeroes were underestimated by the Poisson, log-transform and square root-transform models, slightly by the standard negative binomial model, but not by the zero-modified models (61%, 24%, 32%, 7%, and 0%, respectively).
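    The improvement from zero modification described above can be sketched with a toy comparison of a plain Poisson fit against a zero-inflated Poisson (ZIP) fit, each scored by AIC = 2k − 2 ln L. The simulated data, grid-search fitting, and parameter values are illustrative assumptions, not the study's:

```python
import math
import random

def poisson_draw(lam):
    # Knuth's algorithm for sampling a Poisson variate
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def poisson_loglik(counts, lam):
    return sum(-lam + c * math.log(lam) - math.lgamma(c + 1) for c in counts)

def zip_loglik(counts, pi, lam):
    """Zero-inflated Poisson: structural zero with probability pi,
    otherwise a Poisson(lam) draw (which may itself be zero)."""
    ll = 0.0
    for c in counts:
        if c == 0:
            ll += math.log(pi + (1.0 - pi) * math.exp(-lam))
        else:
            ll += math.log(1.0 - pi) - lam + c * math.log(lam) - math.lgamma(c + 1)
    return ll

random.seed(42)
# toy zero-heavy counts: ~40% structural zeros, else Poisson with mean 5
counts = [0 if random.random() < 0.4 else poisson_draw(5.0) for _ in range(500)]

# Poisson MLE: lambda-hat is the sample mean (k = 1 parameter)
lam_hat = sum(counts) / len(counts)
aic_pois = 2 * 1 - 2 * poisson_loglik(counts, lam_hat)

# ZIP fit by a coarse grid search over (pi, lambda) (k = 2 parameters)
best_ll = max(zip_loglik(counts, pi / 100.0, lam / 10.0)
              for pi in range(1, 96, 2) for lam in range(5, 101, 5))
aic_zip = 2 * 2 - 2 * best_ll

print(f"Poisson AIC: {aic_pois:.1f}  ZIP AIC: {aic_zip:.1f}")
```

    With this many excess zeros the ZIP model wins by a wide AIC margin, mirroring the qualitative result reported for the mayfly data.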

  7. Predictive Information: Status or Alert Information?

    NASA Technical Reports Server (NTRS)

    Trujillo, Anna C.; Bruneau, Daniel; Press, Hayes N.

    2008-01-01

    Previous research investigating the efficacy of predictive information for detecting and diagnosing aircraft system failures found that subjects like to have predictive information concerning when a parameter would reach an alert range. This research focused on where the predictive information should be located, whether the information should be more closely associated with the parameter information or with the alert information. Each subject saw 3 forms of predictive information: (1) none, (2) a predictive alert message, and (3) predictive information on the status display. Generally, subjects performed better and preferred to have predictive information available although the difference between status and alert predictive information was minimal. Overall, for detection and recalling what happened, status predictive information is best; however for diagnosis, alert predictive information holds a slight edge.

  8. Seasonality and Trend Forecasting of Tuberculosis Prevalence Data in Eastern Cape, South Africa, Using a Hybrid Model

    PubMed Central

    Azeez, Adeboye; Obaromi, Davies; Odeyemi, Akinwumi; Ndege, James; Muntabayi, Ruffin

    2016-01-01

    Background: Tuberculosis (TB) is a deadly infectious disease caused by Mycobacterium tuberculosis. Tuberculosis, as a chronic and highly infectious disease, is prevalent in almost every part of the globe. More than 95% of TB mortality occurs in low/middle-income countries. In 2014, approximately 10 million people were diagnosed with active TB and two million died from the disease. In this study, our aim is to compare the predictive powers of the seasonal autoregressive integrated moving average (SARIMA) and hybrid SARIMA-neural network auto-regression (SARIMA-NNAR) models of TB incidence and to analyse its seasonality in South Africa. Methods: TB incidence case data from January 2010 to December 2015 were extracted from the Eastern Cape Health facility report of the electronic Tuberculosis Register (ERT.Net). A SARIMA model and a combined SARIMA and neural network auto-regression (SARIMA-NNAR) model were used in analysing and predicting the TB data from 2010 to 2015. Simulation performance measures of mean square error (MSE), root mean square error (RMSE), mean absolute error (MAE), mean percent error (MPE), mean absolute scaled error (MASE) and mean absolute percentage error (MAPE) were applied to assess which model predicted better. Results: Although in practice both models could predict TB incidence, the combined model displayed better performance. For the combined model, the Akaike information criterion (AIC), second-order AIC (AICc) and Bayesian information criterion (BIC) were 288.56, 308.31 and 299.09, respectively, lower than the SARIMA model's corresponding values of 329.02, 327.20 and 341.99. The SARIMA-NNAR model forecast a slightly more pronounced increasing seasonal trend in TB incidence than the single model. Conclusions: The combined model indicated a better TB incidence forecast with a lower AICc. The model also indicates the need for resolute
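    The three criteria used in the comparison above have simple closed forms given a model's maximised log-likelihood ln L, parameter count k, and sample size n. A minimal sketch follows; the log-likelihoods and parameter counts are made-up illustrations, not the values from the study:

```python
import math

def aic(loglik, k):
    """AIC = 2k - 2 ln L."""
    return 2 * k - 2 * loglik

def aicc(loglik, k, n):
    """Second-order (small-sample) correction; requires n > k + 1."""
    return aic(loglik, k) + 2 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    """BIC penalises extra parameters more heavily as n grows."""
    return k * math.log(n) - 2 * loglik

# hypothetical fits to n = 72 monthly observations (6 years of data)
n = 72
ll_sarima, k_sarima = -160.0, 5   # illustrative log-likelihoods and
ll_hybrid, k_hybrid = -140.0, 9   # parameter counts, not the study's

print(round(aicc(ll_sarima, k_sarima, n), 1))  # → 330.9
print(round(aicc(ll_hybrid, k_hybrid, n), 1))  # → 300.9
```

    Here the hybrid model's better fit outweighs its extra parameters, so it wins on AICc, the same direction of result the study reports.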

  9. Ventilation/Perfusion Positron Emission Tomography—Based Assessment of Radiation Injury to Lung

    SciTech Connect

    Siva, Shankar; Hardcastle, Nicholas; Kron, Tomas; Bressel, Mathias; Callahan, Jason; MacManus, Michael P.; Shaw, Mark; Plumridge, Nikki; Hicks, Rodney J.; Steinfort, Daniel; Ball, David L.; Hofman, Michael S.

    2015-10-01

    Purpose: To investigate ⁶⁸Ga-ventilation/perfusion (V/Q) positron emission tomography (PET)/computed tomography (CT) as a novel imaging modality for assessment of perfusion, ventilation, and lung density changes in the context of radiation therapy (RT). Methods and Materials: In a prospective clinical trial, 20 patients underwent 4-dimensional (4D)-V/Q PET/CT before, midway through, and 3 months after definitive lung RT. Eligible patients were prescribed 60 Gy in 30 fractions with or without concurrent chemotherapy. Functional images were registered to the RT planning 4D-CT, and isodose volumes were averaged into 10-Gy bins. Within each dose bin, relative loss in standardized uptake value (SUV) was recorded for ventilation and perfusion, and loss in air-filled fraction was recorded to assess RT-induced lung fibrosis. A dose-effect relationship was described using both linear and 2-parameter logistic fit models, and goodness of fit was assessed with the Akaike Information Criterion (AIC). Results: A total of 179 imaging datasets were available for analysis (1 scan was unrecoverable). An almost perfectly linear negative dose-response relationship was observed for perfusion and air-filled fraction (r² = 0.99, P<.01), with ventilation strongly negatively linear (r² = 0.95, P<.01). Logistic models did not provide a better fit as evaluated by AIC. Perfusion, ventilation, and the air-filled fraction decreased 0.75 ± 0.03%, 0.71 ± 0.06%, and 0.49 ± 0.02%/Gy, respectively. Within high-dose regions, higher baseline perfusion SUV was associated with greater rate of loss. At 50 Gy and 60 Gy, the rate of loss was 1.35% (P=.07) and 1.73% (P=.05) per SUV, respectively. Of 8/20 patients with peritumoral reperfusion/reventilation during treatment, 7/8 did not sustain this effect after treatment. Conclusions: Radiation-induced regional lung functional deficits occur in a dose-dependent manner and can be estimated by simple linear models with 4D-V/Q PET

  10. Family-Joining: A Fast Distance-Based Method for Constructing Generally Labeled Trees

    PubMed Central

    Kalaghatgi, Prabhav; Pfeifer, Nico; Lengauer, Thomas

    2016-01-01

    The widely used model for evolutionary relationships is a bifurcating tree with all taxa/observations placed at the leaves. This is not appropriate if the taxa have been densely sampled across evolutionary time and may be in a direct ancestral relationship, or if there is not enough information to fully resolve all the branching points in the evolutionary tree. In this article, we present a fast distance-based agglomeration method called family-joining (FJ) for constructing so-called generally labeled trees in which taxa may be placed at internal vertices and the tree may contain polytomies. FJ constructs such trees on the basis of pairwise distances and a distance threshold. We tested three methods for threshold selection, FJ-AIC, FJ-BIC, and FJ-CV, which minimize Akaike information criterion, Bayesian information criterion, and cross-validation error, respectively. When compared with related methods on simulated data, FJ-BIC was among the best at reconstructing the correct tree across a wide range of simulation scenarios. FJ-BIC was applied to HIV sequences sampled from individuals involved in a known transmission chain. The FJ-BIC tree was found to be compatible with almost all transmission events. On average, internal branches in the FJ-BIC tree have higher bootstrap support than branches in the leaf-labeled bifurcating tree constructed using RAxML. 36% and 25% of the internal branches in the FJ-BIC tree and RAxML tree, respectively, have bootstrap support greater than 70%. To the best of our knowledge the method presented here is the first attempt at modeling evolutionary relationships using generally labeled trees. PMID:27436007

  11. Endogenous and exogenous factors controlling temporal abundance patterns of tropical mosquitoes.

    PubMed

    Yang, Guo-Jing; Brook, Barry W; Whelan, Peter I; Cleland, Sam; Bradshaw, Corey J A

    2008-12-01

    The growing demand for efficient and effective mosquito control requires a better understanding of vector population dynamics and how these are modified by endogenous and exogenous factors. A long-term (11-year) monitoring data set describing the relative abundance of the saltmarsh mosquito (Aedes vigilax) in the greater Darwin region, northern Australia, was examined in a suite of Gompertz-logistic (GL) models with and without hypothesized environmental correlates (high tide frequency, rainfall, and relative humidity). High tide frequency and humidity were hypothesized to influence saltmarsh mosquito abundance positively, and rainfall was hypothesized to correlate negatively by reducing the availability of suitable habitats (moist substrata) required by ovipositing adult female mosquitoes. We also examined whether environmental correlates explained the variance in seasonal carrying capacity (K) because environmental stochasticity is hypothesized to modify population growth rate (r), carrying capacity, or both. Current and lagged-time effects were tested by comparing alternative population dynamics models using three different information criteria (Akaike's Information Criterion [AICc], Bayesian Information Criterion [BIC], and cross-validation [C-V]). The GL model with a two-month lag without environmental effects explained 31% of the deviance in population growth rate. This increased to > 70% under various model combinations of high tide frequency, rainfall, and relative humidity, of which high tide frequency and rainfall had the highest contributions. Temporal variation in K was explained weakly by high tide frequency, and there was some evidence that the filling of depressions to reduce standing water availability has reduced Aedes vigilax carrying capacity over the study period. This study underscores the need to consider simultaneously both types of drivers (endogenous and exogenous) when predicting mosquito abundance and population growth

  12. Projecting climate-driven increases in North American fire activity

    NASA Astrophysics Data System (ADS)

    Wang, D.; Morton, D. C.; Collatz, G. J.

    2013-12-01

    Climate regulates fire activity through controls on vegetation productivity (fuels), lightning ignitions, and conditions governing fire spread. In many regions of the world, human management also influences the timing, duration, and extent of fire activity. These coupled interactions between human and natural systems make fire a complex component of the Earth system. Satellite data provide valuable information on the spatial and temporal dynamics of recent fire activity, as active fires, burned area, and land cover information can be combined to separate wildfires from intentional burning for agriculture and forestry. Here, we combined satellite-derived burned area data with land cover and climate data to assess fire-climate relationships in North America between 2000-2012. We used the latest versions of the Global Fire Emissions Database (GFED) burned area product and Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate data to develop regional relationships between burned area and potential evaporation (PE), an integrated dryness metric. Logistic regression models were developed to link burned area with PE and individual climate variables during and preceding the fire season, and optimal models were selected based on Akaike Information Criterion (AIC). Overall, our model explained 85% of the variance in burned area since 2000 across North America. Fire-climate relationships from the era of satellite observations provide a blueprint for potential changes in fire activity under scenarios of climate change. We used that blueprint to evaluate potential changes in fire activity over the next 50 years based on twenty models from the Coupled Model Intercomparison Project Phase 5 (CMIP5). All models suggest an increase of PE under low and high emissions scenarios (Representative Concentration Pathways (RCP) 4.5 and 8.5, respectively), with largest increases in projected burned area across the western US and central Canada. Overall, near

  13. Comparison of different models for genetic evaluation of egg weight in Mazandaran fowl.

    PubMed

    Zamani, P; Jasouri, M; Moradi, M R

    2015-01-01

    1. The aim of the present study was to compare different models to estimate variance components for egg weight (EW) in laying hens. 2. The data set included 67 542 EW records of 18 245 Mazandaran hens at 24, 28, 30, 32 and 84 weeks of age, during 19 consecutive generations. Variance components were estimated using multi-trait, repeatability, fixed regression and random regression models (MTM, RM, FRM and RRM, respectively) by the Average Information-Restricted Maximum Likelihood algorithm (AI-REML). The models were compared based on the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). 3. The MTM was the best model, followed by the Legendre RRMs. A RRM with 2nd degree of fit for the fixed regression and 3rd and 2nd degrees of fit for the random regressions of direct additive genetic and permanent environmental effects, respectively, was the best RRM. The FRM and RM were not proper models to fit the data. However, nesting curves within contemporary groups improved the fit of the FRM. 4. Heritability estimates for EW by MTM (0.06-0.41) were close to the estimates obtained by the best RRM (0.09-0.45). In both MTM and RRM, positive genetic correlations were estimated for EW records at different ages, with higher correlations for adjacent records. 5. The results suggest that MTM is the best model for EW data, at least when the records are taken at relatively few age points. Though selection based on EW at higher ages might be more precise, 30 or 32 weeks of age could be considered as the most appropriate time points for selection on EW to maximise genetic improvement per time unit.

  14. Teaching Information Skills.

    ERIC Educational Resources Information Center

    Hawkins, Nancye, Ed.

    This booklet provides a framework within which information skills may be taught. Four broad categories of information skills--identifying and locating information sources, information intake, organizing information, and communicating information--are described. The development of an information skills policy which includes a sequential list of…

  15. An Information Policy for the Information Age.

    ERIC Educational Resources Information Center

    Blake, Virgil; Surprenant, Thomas

    1988-01-01

    Discusses recent federal information policies that pose a threat to access to information. A short-lived policy for protection of sensitive but unclassified information is criticized, and the Computer Security Act of 1987, currently under consideration in Congress, is described. Involvement by the library and information community in developing…

  16. Agricultural Libraries and Information.

    ERIC Educational Resources Information Center

    Russell, Keith W., Ed.; Pisa, Maria G., Ed.

    1990-01-01

    Eleven articles address issues relating to agricultural libraries and information, including background on agricultural libraries and information, trend management, document delivery, reference services, user needs and library services, collection development, technologies for international information management, information sources,…

  17. The number and type of food retailers surrounding schools and their association with lunchtime eating behaviours in students

    PubMed Central

    2013-01-01

    Background The primary study objective was to examine whether the presence of food retailers surrounding schools was associated with students’ lunchtime eating behaviours. The secondary objective was to determine whether measures of the food retail environment around schools captured using road network or circular buffers were more strongly related to eating behaviours while at school. Methods Grade 9 and 10 students (N=6,971) who participated in the 2009/10 Canadian Health Behaviour in School Aged Children Survey were included in this study. The outcome was determined by students’ self-reports of where they typically ate their lunch during school days. Circular and road network-based buffers were created for a 1 km distance surrounding 158 schools participating in the HBSC. The addresses of fast food restaurants, convenience stores and coffee/donut shops were mapped within the buffers. Multilevel logistic regression was used to determine whether there was a relationship between the presence of food retailers near schools and students regularly eating their lunch at a fast food restaurant, snack-bar or café. The Akaike Information Criterion (AIC) value, a measure of goodness-of-fit, was used to determine the optimal buffer type. Results For the 1 km circular buffers, students with 1–2 (OR=1.10, 95% CI: 0.57-2.11), 3–4 (OR=1.45, 95% CI: 0.75-2.82) and ≥5 nearby food retailers (OR=2.94, 95% CI: 1.71-5.09) were more likely to eat lunch at a food retailer compared to students with no nearby food retailers. The relationships were slightly stronger when assessed via 1 km road network buffers, with a greater likelihood of eating at a food retailer for 1–2 (OR=1.20, 95% CI: 0.74-1.95), 3–4 (OR=3.19, 95% CI: 1.66-6.13) and ≥5 nearby food retailers (OR=3.54, 95% CI: 2.08-6.02). Road network buffers appeared to provide a better measure of the food retail environment, as indicated by a lower AIC value (3332 vs. 3346). Conclusions There was a strong
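    The reported AIC difference (3332 vs. 3346) can be translated into Akaike weights, which express the relative support for each candidate model on a 0-1 scale. A small sketch using only the two AIC values quoted above:

```python
import math

def akaike_weights(aic_values):
    """Relative likelihood of each model, exp(-ΔAIC/2), normalised to sum to 1."""
    best = min(aic_values)
    rel = [math.exp(-0.5 * (a - best)) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

# AIC values reported for the road-network and circular buffer models
w_road, w_circular = akaike_weights([3332.0, 3346.0])
print(round(w_road, 4))  # → 0.9991
```

    A ΔAIC of 14 places essentially all of the weight on the road-network buffer model, which is why the study treats it as the better measure of the food retail environment.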

  18. Determination of optimal diagnostic criteria for purulent vaginal discharge and cytological endometritis in dairy cows.

    PubMed

    Denis-Robichaud, J; Dubuc, J

    2015-10-01

    The objectives of this observational study were to identify the optimal diagnostic criteria for purulent vaginal discharge (PVD) and cytological endometritis (ENDO) using vaginal discharge, endometrial cytology, and leukocyte esterase (LE) tests, and to quantify their effect on subsequent reproductive performance. Data generated from 1,099 untreated Holstein cows (28 herds) enrolled in a randomized clinical trial were used in this study. Cows were examined at 35 (± 7) d in milk for PVD using vaginal discharge scoring and for ENDO using endometrial cytology and LE testing. Optimal combinations of diagnostic criteria were determined based on the lowest Akaike information criterion (AIC) to predict pregnancy status at first service. Once identified, these criteria were used to quantify the effect of PVD and ENDO on pregnancy risk at first service and on pregnancy hazard until 200 d in milk (survival analysis). Predicting ability of these diagnostic criteria was determined using area under the curve (AUC) values. The prevalence of PVD and ENDO was calculated as well as the agreement between endometrial cytology and LE. The optimal diagnostic criteria (lowest AIC) identified in this study were purulent vaginal discharge or worse (≥ 4), ≥ 6% polymorphonuclear leukocytes (PMNL) by endometrial cytology, and small amounts of leukocytes or worse (≥ 1) by LE testing. When using the combination of vaginal discharge and PMNL percentage as diagnostic tools (n = 1,099), the prevalences of PVD and ENDO were 17.1 and 36.2%, respectively. When using the combination of vaginal discharge and LE (n = 915), the prevalences of PVD and ENDO were 17.1 and 48.4%. The optimal strategies for predicting pregnancy status at first service were the use of LE only (AUC = 0.578) and PMNL percentage only (AUC = 0.575). Cows affected by PVD and ENDO had 0.36 and 0.32 times the odds, respectively, of being pregnant at first service when using PMNL percentage compared with that of unaffected
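
Selecting a diagnostic cutoff by the lowest AIC, as done above, can be sketched as a grid search over candidate thresholds. The scores, outcomes, and the simple two-group Bernoulli model below are hypothetical stand-ins for the study's actual data and logistic models:

```python
import math

def bernoulli_loglik(outcomes):
    """Log-likelihood of a Bernoulli sample at its MLE (the group mean)."""
    n, s = len(outcomes), sum(outcomes)
    ll = 0.0
    for count in (s, n - s):
        if count:
            ll += count * math.log(count / n)
    return ll

def aic_for_cutoff(scores, outcomes, cutoff):
    """Dichotomise the diagnostic score at `cutoff`, model pregnancy risk
    separately in the two groups (k = 2 parameters), and return the AIC."""
    pos = [y for x, y in zip(scores, outcomes) if x >= cutoff]
    neg = [y for x, y in zip(scores, outcomes) if x < cutoff]
    ll = bernoulli_loglik(pos) + bernoulli_loglik(neg)
    return 2 * 2 - 2 * ll

# Hypothetical PMNL percentages and first-service pregnancy outcomes.
scores =   [2, 4, 5, 6, 7, 8, 10, 12, 3, 1, 6, 9, 2, 5, 11, 4]
pregnant = [1, 1, 1, 0, 0, 0, 0,  0,  1, 1, 0, 0, 1, 1, 0,  1]

best = min(range(2, 12), key=lambda c: aic_for_cutoff(scores, pregnant, c))
print("optimal cutoff:", best)
```

The cutoff minimising the AIC of the pregnancy-prediction model plays the role of the ≥ 6% PMNL and ≥ 4 discharge-score criteria identified in the study.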

  19. Characterizing the relationship between temperature and mortality in tropical and subtropical cities: a distributed lag non-linear model analysis in Hue, Viet Nam, 2009–2013

    PubMed Central

    Dang, Tran Ngoc; Seposo, Xerxes T.; Duc, Nguyen Huu Chau; Thang, Tran Binh; An, Do Dang; Hang, Lai Thi Minh; Long, Tran Thanh; Loan, Bui Thi Hong; Honda, Yasushi

    2016-01-01

    Background The relationship between temperature and mortality has been found to be U-, V-, or J-shaped in developed temperate countries; however, in developing tropical/subtropical cities, it remains unclear. Objectives Our goal was to investigate the relationship between temperature and mortality in Hue, a subtropical city in Viet Nam. Design We collected daily mortality data from the Vietnamese A6 mortality reporting system for 6,214 deceased persons between 2009 and 2013. A distributed lag non-linear model was used to examine the temperature effects on all-cause and cause-specific mortality by assuming negative binomial distribution for count data. We developed an objective-oriented model selection with four steps following the Akaike information criterion (AIC) rule (i.e. a smaller AIC value indicates a better model). Results High temperature-related mortality was more strongly associated with short lags, whereas low temperature-related mortality was more strongly associated with long lags. The low temperatures increased risk in all-category mortality compared to high temperatures. We observed elevated temperature-mortality risk in vulnerable groups: elderly people (high temperature effect, relative risk [RR]=1.42, 95% confidence interval [CI]=1.11–1.83; low temperature effect, RR=2.0, 95% CI=1.13–3.52), females (low temperature effect, RR=2.19, 95% CI=1.14–4.21), people with respiratory disease (high temperature effect, RR=2.45, 95% CI=0.91–6.63), and those with cardiovascular disease (high temperature effect, RR=1.6, 95% CI=1.15–2.22; low temperature effect, RR=1.99, 95% CI=0.92–4.28). Conclusions In Hue, the temperature significantly increased the risk of mortality, especially in vulnerable groups (i.e. elderly, female, people with respiratory and cardiovascular diseases). These findings may provide a foundation for developing adequate policies to address the effects of temperature on health in Hue City. PMID:26781954

  20. Human Benzene Metabolism Following Occupational and Environmental Exposures

    PubMed Central

    Rappaport, Stephen M.; Kim, Sungkyoon; Lan, Qing; Li, Guilan; Vermeulen, Roel; Waidyanatha, Suramya; Zhang, Luoping; Yin, Songnian; Smith, Martyn T.; Rothman, Nathaniel

    2011-01-01

    We previously reported evidence that humans metabolize benzene via two enzymes, including a hitherto unrecognized high-affinity enzyme that was responsible for an estimated 73 percent of total urinary metabolites [sum of phenol (PH), hydroquinone (HQ), catechol (CA), E,E-muconic acid (MA), and S-phenylmercapturic acid (SPMA)] in nonsmoking females exposed to benzene at sub-saturating (ppb) air concentrations. Here, we used the same Michaelis-Menten-like kinetic models to individually analyze urinary levels of PH, HQ, CA and MA from 263 nonsmoking Chinese women (179 benzene-exposed workers and 84 control workers) with estimated benzene air concentrations ranging from less than 0.001 ppm to 299 ppm. One model depicted benzene metabolism as a single enzymatic process (1-enzyme model) and the other as two enzymatic processes which competed for access to benzene (2-enzyme model). We evaluated model fits based upon the difference in values of Akaike’s Information Criterion (ΔAIC), and we gauged the weights of evidence favoring the two models based upon the associated Akaike weights and Evidence Ratios. For each metabolite, the 2-enzyme model provided a better fit than the 1-enzyme model with ΔAIC values decreasing in the order 9.511 for MA, 7.379 for PH, 1.417 for CA, and 0.193 for HQ. The corresponding weights of evidence favoring the 2-enzyme model (Evidence Ratios) were: 116.2:1 for MA, 40.0:1 for PH, 2.0:1 for CA and 1.1:1 for HQ. These results indicate that our earlier findings from models of total metabolites were driven largely by MA, representing the ring-opening pathway, and by PH, representing the ring-hydroxylation pathway. The predicted percentage of benzene metabolized by the putative high-affinity enzyme at an air concentration of 0.001 ppm was 88% based upon urinary MA and was 80% based upon urinary PH. As benzene concentrations increased, the respective percentages of benzene metabolized to MA and PH by the high-affinity enzyme decreased successively
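
The Akaike weights and evidence ratios used above follow directly from the ΔAIC values. A minimal sketch, reusing the per-metabolite ΔAIC values reported in the abstract:

```python
import math

def akaike_weights(aics):
    """Akaike weights from a list of AIC values: w_i ∝ exp(-ΔAIC_i / 2)."""
    best = min(aics)
    rel = [math.exp(-(a - best) / 2) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Delta-AIC values favouring the 2-enzyme model, as reported per metabolite.
for metabolite, delta in [("MA", 9.511), ("PH", 7.379),
                          ("CA", 1.417), ("HQ", 0.193)]:
    # The 2-enzyme model has the lower AIC, so assign it AIC = 0
    # and the 1-enzyme model AIC = delta.
    w2, w1 = akaike_weights([0.0, delta])
    print(f"{metabolite}: evidence ratio {w2 / w1:.1f}:1")
```

Running this reproduces the ratios quoted in the abstract (116.2:1 for MA, 40.0:1 for PH, 2.0:1 for CA, 1.1:1 for HQ), since the evidence ratio is simply exp(ΔAIC/2).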

  1. Morphological assessment on day 4 and its prognostic power in selecting viable embryos for transfer.

    PubMed

    Fabozzi, Gemma; Alteri, Alessandra; Rega, Emilia; Starita, Maria Flavia; Piscitelli, Claudio; Giannini, Pierluigi; Colicchia, Antonio

    2016-08-01

    The aim of this study was to describe a system for embryo morphology scoring at the morula stage and to determine the efficiency of this model in selecting viable embryos for transfer. In total, 519 embryos from 122 patients undergoing intracytoplasmic sperm injection (ICSI) were scored retrospectively on day 4 according to the grading system proposed in this article. Two separate quality scores were assigned to each embryo in relation to the grade of compaction and fragmentation, and their developmental fate was then observed on days 5 and 6. Secondly, the prediction value of this scoring system was compared with the prediction value of the traditional scoring system adopted on day 3. Morulas classified as grade A showed a significantly higher blastocyst formation rate (87.2%) compared with grades B, C and D (63.8, 41.3 and 15.0%, respectively) (P < 0.001). Furthermore, the ability to form top quality blastocysts was significantly higher for grade A morulas with respect to grade B and to grades C and D (37.8% vs. 22.4% vs. 11.1%) (P < 0.001). Finally, the morula scoring system showed greater predictive power than the traditional day 3 embryo scoring system [Akaike information criterion (AIC) index 16.4 vs. 635.3 and Bayesian information criterion (BIC) index -68.8 vs. -30.0 for morulas and embryos, respectively]. In conclusion, results demonstrated that the presented scoring system allows for the evaluation of eligible embryos for transfer, as a significant correlation between the grade of morula, blastulation rate and blastocyst quality was observed. Furthermore, the morula scoring system was shown to be the best predictive model when compared with the traditional scoring system performed on day 3.

  2. Predictability of Western Himalayan River flow: melt seasonal inflow into Bhakra Reservoir in Northern India

    NASA Astrophysics Data System (ADS)

    Pal, I.; Lall, U.; Robertson, A. W.; Cane, M. A.; Bansal, R.

    2012-07-01

    Snowmelt-dominated streamflow of the Western Himalayan Rivers is an important water resource during the dry pre-monsoon spring months to meet the irrigation and hydropower needs in Northern India. Here we study the seasonal prediction of melt-dominated total inflow into the Bhakra Dam in Northern India based on statistical relationships with meteorological variables during the preceding winter. Total inflow into the Bhakra dam includes the Satluj River flow together with a flow diversion from its tributary, the Beas River. Both are tributaries of the Indus River that originate from the Western Himalayas, which is an under-studied region. Average measured winter snow volume at the upper elevation stations and corresponding lower elevation rainfall and temperature of the Satluj River basin were considered as empirical predictors. The Akaike information criterion (AIC) and the Bayesian information criterion (BIC) were used to select the best subset of inputs from all the possible combinations of predictors for a multiple linear regression framework. To test for potential issues arising due to multi-collinearity of the predictor variables, cross-validated prediction skills of the best subset were also compared with the prediction skills of Principal Component Regression (PCR) and Partial Least Squares Regression (PLSR) techniques, which yielded broadly similar results. As a whole, the forecasts of the melt season at the end of winter and as the melt season commences were shown to have potential skill for guiding the development of stochastic optimization models to manage the trade-off between irrigation and hydropower releases versus flood control during the annual fill cycle of the Bhakra reservoir, a major energy and irrigation source in the region.
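
Best-subset selection with AIC, as applied above, can be sketched by enumerating predictor subsets and scoring each ordinary-least-squares fit. The predictor and inflow values below are hypothetical illustrations, not the Bhakra data:

```python
import itertools
import math

def ols_rss(X, y):
    """Residual sum of squares of an OLS fit (intercept included),
    solving the normal equations by Gaussian elimination."""
    rows = [[1.0] + list(x) for x in X]
    p = len(rows[0])
    # Build X'X and X'y.
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # Forward elimination with partial pivoting.
    for i in range(p):
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for c in range(i, p):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    # Back substitution.
    beta = [0.0] * p
    for i in reversed(range(p)):
        beta[i] = (b[i] - sum(A[i][c] * beta[c] for c in range(i + 1, p))) / A[i][i]
    return sum((yi - sum(be * xi for be, xi in zip(beta, row))) ** 2
               for row, yi in zip(rows, y))

def aic_gaussian(rss, n, k):
    """AIC for a Gaussian regression: n*ln(RSS/n) + 2k (constants dropped)."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical winter predictors: snow volume, rainfall, temperature.
snow = [12.0, 15.0, 9.0, 20.0, 14.0, 11.0, 18.0, 16.0]
rain = [2.0, 3.0, 1.0, 4.0, 2.0, 1.0, 3.0, 4.0]
temp = [1.5, 0.5, 2.5, -0.5, 1.0, 2.0, 0.0, -1.0]
# Inflow driven mainly by snow, with small noise.
inflow = [25.2, 30.8, 19.2, 40.8, 28.8, 23.2, 36.8, 33.2]

names = ["snow", "rain", "temp"]
cols = [snow, rain, temp]
n = len(inflow)
best = None
for r in range(1, len(cols) + 1):
    for subset in itertools.combinations(range(len(cols)), r):
        X = list(zip(*(cols[j] for j in subset)))
        # k counts slopes, intercept, and the error variance.
        a = aic_gaussian(ols_rss(X, inflow), n, k=r + 2)
        if best is None or a < best[0]:
            best = (a, [names[j] for j in subset])
print("best subset by AIC:", best[1])
```

In practice one would use a regression library rather than hand-rolled normal equations; the point is only that every candidate subset gets an AIC and the lowest wins.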

  3. Diversity of benthic biofilms along a land use gradient in tropical headwater streams, Puerto Rico.

    PubMed

    Burgos-Caraballo, Sofía; Cantrell, Sharon A; Ramírez, Alonso

    2014-07-01

    The properties of freshwater ecosystems can be altered, directly or indirectly, by different land uses (e.g., urbanization and agriculture). Streams heavily influenced by high nutrient concentrations associated with agriculture or urbanization may present conditions that can be intolerable for many aquatic species such as macroinvertebrates and fishes. However, information with respect to how benthic microbial communities may respond to changes in stream ecosystem properties in relation to agricultural or urban land uses is limited, in particular for tropical ecosystems. In this study, diversity of benthic biofilms was evaluated in 16 streams along a gradient of land use at the Turabo watershed in Puerto Rico using terminal restriction fragment length polymorphism. Diversity indices and community structure descriptors (species richness, Shannon diversity, dominance and evenness) were calculated for both bacteria and eukaryotes for each stream. Diversity of both groups, bacteria and eukaryotes, did not show a consistent pattern with land use, since it could be high or low at streams dominated by different land uses. This suggests that diversity of biofilms may be more related to site-specific conditions rather than watershed scale factors. To assess this contention, the relationship between biofilm diversity and reach-scale parameters (i.e., nutrient concentrations, canopy cover, conductivity, and dissolved oxygen) was determined using the Akaike information criterion corrected for small sample sizes (AICc). Results indicated that nitrate was the variable that best explained variations in biofilm diversity. Since nitrate concentrations tend to increase with urban land use, our results suggest that urbanization may indeed increase microbial diversity indirectly by increasing nutrients in stream water.
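
The small-sample correction used above (AICc) adds a penalty term that matters when, as here, only 16 sites are available. A minimal sketch (the log-likelihood and parameter count are hypothetical):

```python
import math

def aicc(log_likelihood, k, n):
    """Small-sample corrected AIC:
    AICc = AIC + 2k(k+1) / (n - k - 1)."""
    aic = 2 * k - 2 * log_likelihood
    return aic + 2 * k * (k + 1) / (n - k - 1)

# With n = 16 streams and k = 4 parameters, the correction term
# 2*4*5 / (16 - 4 - 1) = 40/11 ≈ 3.6 is far from negligible.
print(aicc(-20.0, 4, 16))
```

As n grows with k fixed, the correction vanishes and AICc converges to the plain AIC, which is why the correction is recommended whenever n/k is small.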

  4. Productivity, embryo and eggshell characteristics, and contaminants in bald eagles from the Great Lakes, USA, 1986 to 2000.

    PubMed

    Best, David A; Elliott, Kyle H; Bowerman, William W; Shieldcastle, Mark; Postupalsky, Sergej; Kubiak, Timothy J; Tillitt, Donald E; Elliott, John E

    2010-07-01

    Chlorinated hydrocarbon concentrations in eggs of fish-eating birds from contaminated environments such as the Great Lakes of North America tend to be highly intercorrelated, making it difficult to elucidate mechanisms causing reproductive impairment, and to ascribe cause to specific chemicals. An information-theoretic approach was used on data from 197 salvaged bald eagle (Haliaeetus leucocephalus) eggs (159 clutches) that failed to hatch in Michigan and Ohio, USA (1986-2000). Contaminant levels declined over time while eggshell thickness increased, and by 2000 was at pre-1946 levels. The number of occupied territories and productivity increased during 1981 to 2004. For both the entire dataset and a subset of nests along the Great Lakes shoreline, polychlorinated biphenyls (SigmaPCBs, fresh wet wt) were generally included in the most parsimonious models (lowest Akaike's information criterion [AIC]) describing productivity, with significant declines in productivity observed above 26 microg/g SigmaPCBs (fresh wet wt). Of 73 eggs with a visible embryo, eight (11%) were abnormal, including three with skewed bills, but they were not associated with known teratogens, including SigmaPCBs. Eggs with visible embryos had greater concentrations of all measured contaminants than eggs without visible embryos; the most parsimonious models describing the presence of visible embryos incorporated dieldrin equivalents and dichlorodiphenyldichloroethylene (DDE). There were significant negative correlations between eggshell thickness and all contaminants, with SigmaPCBs included in the most parsimonious models. There were, however, no relationships between productivity and eggshell thickness or Ratcliffe's index. The SigmaPCBs and DDE were negatively associated with nest success of bald eagles in the Great Lakes watersheds, but the mechanism does not appear to be via shell quality effects, at least at current contaminant levels, while it is not clear what other mechanisms were

  5. Data-driven input variable selection for rainfall-runoff modeling using binary-coded particle swarm optimization and Extreme Learning Machines

    NASA Astrophysics Data System (ADS)

    Taormina, Riccardo; Chau, Kwok-Wing

    2015-10-01

    Selecting an adequate set of inputs is a critical step for successful data-driven streamflow prediction. In this study, we present a novel approach for Input Variable Selection (IVS) that employs Binary-coded discrete Fully Informed Particle Swarm optimization (BFIPS) and Extreme Learning Machines (ELM) to develop fast and accurate IVS algorithms. A scheme is employed to encode the subset of selected inputs and ELM specifications into the binary particles, which are evolved using single objective and multi-objective BFIPS optimization (MBFIPS). The performances of these ELM-based methods are assessed using the evaluation criteria and the datasets included in the comprehensive IVS evaluation framework proposed by Galelli et al. (2014). From a comparison with 4 major IVS techniques used in their original study it emerges that the proposed methods compare very well in terms of selection accuracy. The best performers were found to be (1) a MBFIPS-ELM algorithm based on the concurrent minimization of an error function and the number of selected inputs, and (2) a BFIPS-ELM algorithm based on the minimization of a variant of the Akaike Information Criterion (AIC). The first technique is arguably the most accurate overall, and is able to reach an almost perfect specification of the optimal input subset for a partially synthetic rainfall-runoff experiment devised for the Kentucky River basin. In addition, MBFIPS-ELM allows for the determination of the relative importance of the selected inputs. On the other hand, the BFIPS-ELM is found to consistently reach high accuracy scores while being considerably faster. By extrapolating the results obtained on the IVS test-bed, it can be concluded that the proposed techniques are particularly suited for rainfall-runoff modeling applications characterized by high nonlinearity in the catchment dynamics.

  6. Density dependence and risk of extinction in a small population of sea otters

    USGS Publications Warehouse

    Gerber, L.R.; Buenau, K.E.; VanBlaricom, G.

    2004-01-01

    Sea otters (Enhydra lutris (L.)) were hunted to extinction off the coast of Washington State early in the 20th century. A new population was established by translocations from Alaska in 1969 and 1970; it currently numbers at least 550 animals. A major threat to the population is the ongoing risk of major oil spills in sea otter habitat. We apply population models to census and demographic data in order to evaluate the status of the population. We fit several density-dependent models to test for density dependence and determine plausible values for the carrying capacity (K) by comparing model goodness of fit to an exponential model. Model fits were compared using the Akaike information criterion (AIC). A significant negative relationship was found between the population growth rate and population size (r2=0.27, F=5.57, df=16, p<0.05), suggesting density dependence in Washington State sea otters. Information criterion statistics suggest that the model is the most parsimonious, followed closely by the logistic Beverton-Holt model. Values of K ranged from 612 to 759 with best-fit parameter estimates for the Beverton-Holt model including 0.26 for r and 612 for K. The latest (2001) population index count (555) puts the population at 87-92% of the estimated carrying capacity, above the suggested range for optimum sustainable population (OSP). Elasticity analysis was conducted to examine the effects of proportional changes in vital rates on the population growth rate (λ). The elasticity values indicate the population is most sensitive to changes in survival rates (particularly adult survival).
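
The model comparison above (exponential versus density-dependent fits) rests on AIC computed from least-squares residuals. A minimal sketch with hypothetical residual sums of squares, not the actual sea otter fits:

```python
import math

def aic_from_rss(rss, n, k):
    """AIC for least-squares fits: AIC = n*ln(RSS/n) + 2k (constants dropped).
    k counts the model parameters plus the error variance."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical residual sums of squares from fitting n = 18 annual counts.
n = 18
aic_exponential = aic_from_rss(5000.0, n, k=2)  # growth rate + variance
aic_logistic = aic_from_rss(3000.0, n, k=3)     # growth rate, K, variance
# Density dependence is supported when the logistic model attains the lower
# AIC despite paying the penalty for its extra carrying-capacity parameter.
print(aic_logistic < aic_exponential)  # → True
```

The same device lets a carrying-capacity model "earn" its extra parameter only if it reduces the residual variance enough, which is exactly the parsimony comparison the abstract describes.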

  8. Phylogenetic systematics and biogeography of hummingbirds: Bayesian and maximum likelihood analyses of partitioned data and selection of an appropriate partitioning strategy.

    PubMed

    McGuire, Jimmy A; Witt, Christopher C; Altshuler, Douglas L; Remsen, J V

    2007-10-01

    Hummingbirds are an important model system in avian biology, but to date the group has been the subject of remarkably few phylogenetic investigations. Here we present partitioned Bayesian and maximum likelihood phylogenetic analyses for 151 of approximately 330 species of hummingbirds and 12 outgroup taxa based on two protein-coding mitochondrial genes (ND2 and ND4), flanking tRNAs, and two nuclear introns (AK1 and BFib). We analyzed these data under several partitioning strategies ranging between unpartitioned and a maximum of nine partitions. In order to select a statistically justified partitioning strategy following partitioned Bayesian analysis, we considered four alternative criteria including Bayes factors, modified versions of the Akaike information criterion for small sample sizes (AICc), the Bayesian information criterion (BIC), and a decision-theoretic methodology (DT). Following partitioned maximum likelihood analyses, we selected a best-fitting strategy using hierarchical likelihood ratio tests (hLRTs), the conventional AICc, BIC, and DT, concluding that the most stringent criterion, the performance-based DT, was the most appropriate methodology for selecting amongst partitioning strategies. In the context of our well-resolved and well-supported phylogenetic estimate, we consider the historical biogeography of hummingbirds using ancestral state reconstructions of (1) primary geographic region of occurrence (i.e., South America, Central America, North America, Greater Antilles, Lesser Antilles), (2) Andean or non-Andean geographic distribution, and (3) minimum elevational occurrence. These analyses indicate that the basal hummingbird assemblages originated in the lowlands of South America, that most of the principal clades of hummingbirds (all but Mountain Gems and possibly Bees) originated on this continent, and that there have been many (at least 30) independent invasions of other primary landmasses, especially Central America.

  9. How good is crude MDL for solving the bias-variance dilemma? An empirical investigation based on Bayesian networks.

    PubMed

    Cruz-Ramírez, Nicandro; Acosta-Mesa, Héctor Gabriel; Mezura-Montes, Efrén; Guerra-Hernández, Alejandro; Hoyos-Rivera, Guillermo de Jesús; Barrientos-Martínez, Rocío Erandi; Gutiérrez-Fragoso, Karina; Nava-Fernández, Luis Alonso; González-Gaspar, Patricia; Novoa-del-Toro, Elva María; Aguilera-Rueda, Vicente Josué; Ameca-Alducin, María Yaneli

    2014-01-01

    The bias-variance dilemma is a well-known and important problem in Machine Learning. It basically relates the generalization capability (goodness of fit) of a learning method to its corresponding complexity. When we have enough data at hand, it is possible to use these data in such a way so as to minimize overfitting (the risk of selecting a complex model that generalizes poorly). Unfortunately, there are many situations where we simply do not have this required amount of data. Thus, we need to find methods capable of efficiently exploiting the available data while avoiding overfitting. Different metrics have been proposed to achieve this goal: the Minimum Description Length principle (MDL), Akaike's Information Criterion (AIC) and Bayesian Information Criterion (BIC), among others. In this paper, we focus on crude MDL and empirically evaluate its performance in selecting models with a good balance between goodness of fit and complexity: the so-called bias-variance dilemma, decomposition or tradeoff. Although the graphical interaction between these dimensions (bias and variance) is ubiquitous in the Machine Learning literature, few works present experimental evidence to recover such interaction. In our experiments, we argue that the resulting graphs allow us to gain insights that are difficult to unveil otherwise: that crude MDL naturally selects balanced models in terms of bias-variance, which are not necessarily the gold-standard ones. We carry out these experiments using a specific model: a Bayesian network. In spite of these motivating results, we should also not overlook three other components that may significantly affect the final model selection: the search procedure, the noise rate and the sample size. PMID:24671204

  10. Critical thresholds associated with habitat loss: a review of the concepts, evidence, and applications.

    PubMed

    Swift, Trisha L; Hannon, Susan J

    2010-02-01

    A major conservation concern is whether population size and other ecological variables change linearly with habitat loss, or whether they suddenly decline more rapidly below a "critical threshold" level of habitat. The most commonly discussed explanations for critical threshold responses to habitat loss focus on habitat configuration. As habitat loss progresses, the remaining habitat is increasingly fragmented or the fragments are increasingly isolated, which may compound the effects of habitat loss. In this review we also explore other possible explanations for apparently nonlinear relationships between habitat loss and ecological responses, including Allee effects and time lags, and point out that some ecological variables will inherently respond nonlinearly to habitat loss even in the absence of compounding factors. In the literature, both linear and nonlinear ecological responses to habitat loss are evident among simulation and empirical studies, although the presence and value of critical thresholds are influenced by characteristics of the species (e.g. dispersal, reproduction, area/edge sensitivity) and landscape (e.g. fragmentation, matrix quality, rate of change). With enough empirical support, such trends could be useful for making important predictions about species' responses to habitat loss, to guide future research on the underlying causes of critical thresholds, and to make better informed management decisions. Some have seen critical thresholds as a means of identifying conservation targets for habitat retention. We argue that in many cases this may be misguided, and that the meaning (and utility) of a critical threshold must be interpreted carefully and in relation to the response variable and management goal. Despite recent interest in critical threshold responses to habitat loss, most studies have not used any formal statistical methods to identify their presence or value. Methods that have been used include model comparisons using Akaike

  11. Earthquake interevent time distribution in Kachchh, Northwestern India

    NASA Astrophysics Data System (ADS)

    Pasari, Sumanta; Dikshit, Onkar

    2015-08-01

    Statistical properties of earthquake interevent times have long been the topic of interest to seismologists and earthquake professionals, mainly for hazard-related concerns. In this paper, we present a comprehensive study on the temporal statistics of earthquake interoccurrence times of the seismically active Kachchh peninsula (western India) from thirteen probability distributions. Those distributions are exponential, gamma, lognormal, Weibull, Levy, Maxwell, Pareto, Rayleigh, inverse Gaussian (Brownian passage time), inverse Weibull (Frechet), exponentiated exponential, exponentiated Rayleigh (Burr type X), and exponentiated Weibull distributions. Statistical inferences of the scale and shape parameters of these distributions are discussed from the maximum likelihood estimations and the Fisher information matrices. The latter are used as a surrogate tool to appraise the parametric uncertainty in the estimation process. The results were found on the basis of two goodness-of-fit tests: the maximum likelihood criterion with its modification to Akaike information criterion (AIC) and the Kolmogorov-Smirnov (K-S) minimum distance criterion. These results reveal that (i) the exponential model provides the best fit, (ii) the gamma, lognormal, Weibull, inverse Gaussian, exponentiated exponential, exponentiated Rayleigh, and exponentiated Weibull models provide an intermediate fit, and (iii) the rest, namely Levy, Maxwell, Pareto, Rayleigh, and inverse Weibull, fit poorly to the earthquake catalog of Kachchh and its adjacent regions. This study also analyzes the present-day seismicity in terms of the estimated recurrence interval and conditional probability curves (hazard curves). The estimated cumulative probability and the conditional probability of a magnitude 5.0 or higher event reach 0.8-0.9 by 2027-2036 and 2034-2043, respectively. These values have significant implications in a variety of practical applications including earthquake insurance, seismic zonation
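
Scoring a candidate distribution by maximum likelihood and AIC, as done above for thirteen distributions, is simplest for the exponential model, whose MLE is available in closed form. The interevent times below are hypothetical:

```python
import math

def exp_fit_aic(times):
    """MLE fit of an exponential distribution to interevent times, plus AIC.
    lambda_hat = n / sum(t); logL = n*ln(lambda) - lambda*sum(t); k = 1."""
    n = len(times)
    total = sum(times)
    lam = n / total
    loglik = n * math.log(lam) - lam * total
    return lam, 2 * 1 - 2 * loglik

# Hypothetical interevent times (in days) between M >= 5 events.
times = [120.0, 340.0, 80.0, 510.0, 230.0, 160.0, 400.0, 95.0]
lam, aic = exp_fit_aic(times)
print(f"rate = {lam:.5f} per day, AIC = {aic:.1f}")
```

Repeating the same fit-and-score step for gamma, Weibull, lognormal, and the other candidates, then ranking the AIC values, is the comparison the abstract describes (together with the Kolmogorov-Smirnov distance as a second criterion).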

  13. ActiveSeismoPick3D - automatic first arrival determination for large active seismic arrays

    NASA Astrophysics Data System (ADS)

    Paffrath, Marcel; Küperkoch, Ludger; Wehling-Benatelli, Sebastian; Friederich, Wolfgang

    2016-04-01

    We developed a tool for automatic determination of first arrivals in active seismic data based on an approach that utilises higher order statistics (HOS) and the Akaike information criterion (AIC), commonly used in seismology but not in active seismics. Automatic picking is highly desirable in active seismics, as the number of data provided by large seismic arrays rapidly exceeds what an analyst can evaluate in a reasonable amount of time. To bring the functionality of automatic phase picking into the context of active data, the software package ActiveSeismoPick3D was developed in Python. It uses a modified algorithm for the determination of first arrivals which searches for the HOS maximum in unfiltered data. Additionally, it offers tools for manual quality control and postprocessing, e.g. various visualisation and repicking functionalities. For flexibility, the tool also includes methods for the preparation of geometry information of large seismic arrays and improved interfaces to the Fast Marching Tomography Package (FMTOMO), which can be used for the prediction of travel times and inversion for subsurface properties. Output files are generated in the VTK format, allowing 3D visualization of, e.g., the inversion results. As a test case, a data set consisting of 9216 traces from 64 shots was gathered, recorded at 144 receivers deployed in a regular 2D array of 100 x 100 m. ActiveSeismoPick3D automatically checks the determined first arrivals by a dynamic signal-to-noise ratio threshold. From the data, a 3D model of the subsurface was generated using the export functionality of the package and FMTOMO.
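    The abstract does not reproduce the picking algorithm itself; below is a rough sketch of the classic AIC onset picker from seismology (a standard formulation, not necessarily the modified HOS/AIC algorithm of ActiveSeismoPick3D; the trace is synthetic):

```python
import math
import random

def aic_onset(trace):
    """Maeda-style AIC picker: for each candidate split k,
    AIC(k) = k*ln(var(x[:k])) + (N-k-1)*ln(var(x[k:]));
    the global minimum marks the most likely first-arrival sample."""
    n = len(trace)
    def var(seg):
        m = sum(seg) / len(seg)
        return max(sum((s - m) ** 2 for s in seg) / len(seg), 1e-12)
    aic = [k * math.log(var(trace[:k])) + (n - k - 1) * math.log(var(trace[k:]))
           for k in range(2, n - 1)]
    return aic.index(min(aic)) + 2  # offset for the skipped edge samples

# Synthetic trace: low-amplitude noise, then a strong arrival at sample 50.
random.seed(0)
trace = ([random.gauss(0.0, 0.05) for _ in range(50)]
         + [random.gauss(0.0, 1.0) for _ in range(50)])
pick = aic_onset(trace)
print(pick)  # expected close to sample 50
```

    The variance contrast between pre- and post-onset windows is what drives the minimum; a dynamic signal-to-noise threshold, as described above, would then accept or reject each pick.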

  14. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence

    PubMed Central

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-01-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible. PMID:25745272
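    The brute-force Monte Carlo integration used as a reference can be sketched as follows: draw parameters from the prior and average the likelihood. This is a toy one-parameter Gaussian example (invented data, known noise, standard normal prior), not one of the study's hydrological models:

```python
import math
import random

random.seed(1)

# Hypothetical data: Gaussian with unknown mean (sigma = 1); prior on the mean is N(0, 1).
data = [0.8, 1.2, 0.9, 1.1]

def log_likelihood(mu):
    return sum(-0.5 * math.log(2 * math.pi) - 0.5 * (d - mu) ** 2 for d in data)

# BME = integral of likelihood * prior = prior-weighted average of the likelihood.
draws = 200_000
acc = 0.0
for _ in range(draws):
    mu = random.gauss(0.0, 1.0)          # sample from the prior
    acc += math.exp(log_likelihood(mu))  # accumulate the (non-log) likelihood
bme_mc = acc / draws
print(bme_mc)
```

    For this conjugate setup the integral also has a closed form (≈0.0072), so the estimate can be checked; with expensive models, every likelihood evaluation is a full model run, which is why the abstract calls numerical evaluation unfeasible for them.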

  15. Predictability of Western Himalayan river flow: melt seasonal inflow into Bhakra Reservoir in northern India

    NASA Astrophysics Data System (ADS)

    Pal, I.; Lall, U.; Robertson, A. W.; Cane, M. A.; Bansal, R.

    2013-06-01

    Snowmelt-dominated streamflow of the Western Himalayan rivers is an important water resource during the dry pre-monsoon spring months to meet the irrigation and hydropower needs in northern India. Here we study the seasonal prediction of melt-dominated total inflow into the Bhakra Dam in northern India based on statistical relationships with meteorological variables during the preceding winter. Total inflow into the Bhakra Dam includes the Satluj River flow together with a flow diversion from its tributary, the Beas River. Both are tributaries of the Indus River that originate from the Western Himalayas, which is an under-studied region. Average measured winter snow volume at the upper-elevation stations and corresponding lower-elevation rainfall and temperature of the Satluj River basin were considered as empirical predictors. The Akaike information criterion (AIC) and Bayesian information criterion (BIC) were used to select the best subset of inputs from all the possible combinations of predictors for a multiple linear regression framework. To test for potential issues arising due to multicollinearity of the predictor variables, cross-validated prediction skills of the best subset were also compared with the prediction skills of principal component regression (PCR) and partial least squares regression (PLSR) techniques, which yielded broadly similar results. As a whole, the forecasts of the melt season at the end of winter and as the melt season commences were shown to have potential skill for guiding the development of stochastic optimization models to manage the trade-off between irrigation and hydropower releases versus flood control during the annual fill cycle of the Bhakra Reservoir, a major energy and irrigation source in the region.
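    The best-subset search over all predictor combinations can be sketched as below (hypothetical predictors x1 and x2 standing in for the snow, rainfall and temperature series; AIC in its Gaussian least-squares form):

```python
import itertools
import math

def ols_rss(X, y):
    """Residual sum of squares of an OLS fit via the normal equations.
    X: list of rows, each starting with 1.0 for the intercept."""
    p = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    for i in range(p):                     # forward elimination
        for j in range(i + 1, p):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    beta = [0.0] * p
    for i in range(p - 1, -1, -1):         # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, p))) / A[i][i]
    return sum((yi - sum(bi * xi for bi, xi in zip(beta, r))) ** 2
               for r, yi in zip(X, y))

# Hypothetical data: x1 drives y, x2 is irrelevant noise.
x1 = [1, 2, 3, 4, 5, 6, 7, 8]
x2 = [3, 1, 4, 1, 5, 9, 2, 6]
y = [2.1, 3.9, 6.2, 8.0, 9.8, 12.1, 14.0, 16.2]
preds = {"x1": x1, "x2": x2}

n = len(y)
best = None
for r in range(len(preds) + 1):
    for subset in itertools.combinations(sorted(preds), r):
        X = [[1.0] + [preds[name][i] for name in subset] for i in range(n)]
        rss = ols_rss(X, y)
        k = len(subset) + 2                # intercept + slopes + variance
        aic = n * math.log(rss / n) + 2 * k
        if best is None or aic < best[0]:
            best = (aic, subset)
print(best[1])  # AIC should retain only the informative predictor
```

    Enumerating all subsets is feasible only for a handful of candidate predictors, as in the study; BIC would simply replace the 2k penalty with k·ln(n).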

  16. Factors associated with utilization of antenatal care services in Balochistan province of Pakistan: An analysis of the Multiple Indicator Cluster Survey (MICS) 2010

    PubMed Central

    Ghaffar, Abdul; Pongponich, Sathirakorn; Ghaffar, Najma; Mehmood, Tahir

    2015-01-01

    Objective: The study was conducted to identify factors affecting the utilization of Antenatal Care (ANC) in Balochistan Province, Pakistan. Methods: Data on ANC utilization, together with social and economic determinants, were derived from a Multiple Indicator Cluster Survey (MICS) conducted in Balochistan in 2010. The analysis included 2339 women who gave birth in the two years preceding the survey. The researchers established a logistic regression model to identify influential factors contributing to the utilization of ANC; model selection was by the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). Results: Household wealth, education, health condition, age at first marriage, number of children and justification of spousal violence were found to be significantly associated with ANC coverage. Literate mothers were 2.45 times more likely to receive ANC, and women whose newborns showed symptoms of illness at birth that needed hospitalization were 0.47 times less likely to access ANC. Women with more surviving children were 1.07 times less likely to receive ANC, and those who considered spousal violence socially justified were 1.36 times less likely to receive ANC. The results draw attention towards evidence-based planning around the factors associated with utilization of ANC in Balochistan province. Conclusion: The study reveals that women with a high wealth index and an education were more likely to receive ANC. Factors such as younger age of the woman at first marriage, a greater number of children, symptoms of illness in the neonate at birth requiring hospitalization, and justification of spousal violence were associated with lower chances of receiving ANC. Among components of ANC, urine sampling and having tetanus toxoid (TT) in the last pregnancy increased the frequency of visits; ANC from a doctor decreased the number of visits. There is a dire need to reduce disparities in wealth index, education and urban/rural living. PMID:26870113

  17. Model selection on solid ground: Rigorous comparison of nine ways to evaluate Bayesian model evidence

    NASA Astrophysics Data System (ADS)

    Schöniger, Anneli; Wöhling, Thomas; Samaniego, Luis; Nowak, Wolfgang

    2014-12-01

    Bayesian model selection or averaging objectively ranks a number of plausible, competing conceptual models based on Bayes' theorem. It implicitly performs an optimal trade-off between performance in fitting available data and minimum model complexity. The procedure requires determining Bayesian model evidence (BME), which is the likelihood of the observed data integrated over each model's parameter space. The computation of this integral is highly challenging because it is as high-dimensional as the number of model parameters. Three classes of techniques to compute BME are available, each with its own challenges and limitations: (1) Exact and fast analytical solutions are limited by strong assumptions. (2) Numerical evaluation quickly becomes unfeasible for expensive models. (3) Approximations known as information criteria (ICs) such as the AIC, BIC, or KIC (Akaike, Bayesian, or Kashyap information criterion, respectively) yield contradicting results with regard to model ranking. Our study features a theory-based intercomparison of these techniques. We further assess their accuracy in a simplistic synthetic example where for some scenarios an exact analytical solution exists. In more challenging scenarios, we use a brute-force Monte Carlo integration method as reference. We continue this analysis with a real-world application of hydrological model selection. This is a first-time benchmarking of the various methods for BME evaluation against true solutions. Results show that BME values from ICs are often heavily biased and that the choice of approximation method substantially influences the accuracy of model ranking. For reliable model selection, bias-free numerical methods should be preferred over ICs whenever computationally feasible.

  18. Broad-scale predictors of canada lynx occurrence in eastern North America

    USGS Publications Warehouse

    Hoving, C.L.; Harrison, D.J.; Krohn, W.B.; Joseph, R.A.; O'Brien, M.

    2005-01-01

    The Canada lynx (Lynx canadensis) is listed as a threatened species throughout the southern extent of its geographic range in the United States. Most research on lynx has been conducted in the western United States and Canada; little is known about the ecology of lynx in eastern North America. To fill critical knowledge gaps about this species, we modeled and mapped lynx occurrence using habitat and weather data from 7 eastern states and 3 Canadian provinces. Annual snowfall, road density, bobcat (L. rufus) harvest, deciduous forest, and coniferous forest were compared at 1,150 lynx locations and 1,288 random locations. Nineteen a priori models were developed using the information-theoretic approach, and logistic regression models were ranked using Akaike's Information Criterion (AIC) and by our ability to correctly classify reserved data (Kappa). Annual snowfall and deciduous forest predicted lynx presence and absence for a reserved dataset (n = 278) with 94% accuracy. A map of the probability of lynx occurrence throughout the region revealed that 92% of the potential habitat (i.e., >50% probability of occurrence) was concentrated in a relatively contiguous complex encompassing northern Maine, New Brunswick, and the Gaspé Peninsula of Quebec. Most of the remaining potential habitat (5%) was on northern Cape Breton Island in Nova Scotia. Potential habitat in New Hampshire, Vermont, and New York was small (1,252 km2), fragmented, and isolated (>200 km) from known lynx populations. When federally listed as threatened in the contiguous United States in 2000, inadequate regulations on federal lands were cited as the primary threat to Canada lynx. However, the majority of potential lynx habitat in the eastern United States is on private lands and continuous with potential habitat in Canada. Therefore, lynx conservation in eastern North America will need to develop partnerships across national, state, and provincial boundaries as well as with private landowners.

  19. Organising and presenting information.

    PubMed

    Kankanady, Raghavendra; Wells, Marilyn

    2013-01-01

    Information management can be a daunting process for clinicians, health care providers and policy makers within the health care industry. This chapter discusses the importance of information classification and information architecture in the information economy and specific challenges faced within the health care industry. The healthcare sector has industry specific requirements for information management, standards and specifications for information presentation. Classification of information based on information criticality and the value in the health care industry is discussed in this paper. Presentation of information with reference to eHealth standards and specifications for healthcare information systems and their key requirements are also discussed, as are information architecture for eHealth implementation in Australia. This chapter also touches on information management and clinical governance since the importance of information governance is discussed by various researchers and how this is becoming of value to healthcare information management.

  20. [Teacher Referral Information and Statistical Information Forms].

    ERIC Educational Resources Information Center

    Short, N. J.

    This rating information form, used to refer children to the PIC program, elicits information concerning the child's emotional, cognitive, and personality development. See TM 001 111 for details of the program in which it is used. (DLG)

  1. Information Skills for an Information Age?

    ERIC Educational Resources Information Center

    Gawith, Gwen

    1986-01-01

    Although information skills are the most basic of skills, the tendency is to teach strategies related to educational projects, erroneously assuming that these "information skills" are applicable to everyday decision-making. Educated imaginations are needed for today's variety of lifelong creative information situations. (17 references) (CJH)

  2. Size at the onset of maturity (SOM) revealed in length-weight relationships of brackish amphipods and isopods: An information theory approach

    NASA Astrophysics Data System (ADS)

    Longo, Emanuela; Mancinelli, Giorgio

    2014-01-01

    In amphipods and other small-sized crustaceans, allometric relationships are conventionally analysed by fitting the standard model Y = a·X^b (X and Y are, e.g., body length and weight, respectively) whose scaling exponent b is assumed to be constant. However, breakpoints in allometric relationships have long been documented in large-sized crustaceans, ultimately determined by ontogenetic, abrupt variations in the value of b. Here, the existence of breakpoints in length-weight relationships was investigated in four amphipod (i.e., Gammarus aequicauda, Gammarus insensibilis, Microdeutopus gryllotalpa, and Dexamine spinosa) and three isopod species (i.e., Lekanesphaera hookeri, Sphaeroma serratum, and Cymodoce truncata) from three Mediterranean lagoons. The power of two candidate linear models fitted to log10-transformed data - a simple model assuming a constant exponent b and a segmented model assuming b to vary after a breakpoint - was compared using a parsimonious selection strategy based on the Akaike information criterion. The segmented model with a breakpoint provided the most accurate fitting of length-weight data in the majority of the species analysed; non-conclusive results were obtained only for D. spinosa and C. truncata, for which only a limited number of specimens were examined. Model parameters were consistent for amphipod and isopod species collected across the three different habitats; the generality of the results was further supported by a literature search confirming that the identified breakpoints corresponded with ontogenetic discontinuities related to sexual maturation in all the species investigated. In this study, segmented regression models were shown to provide a statistically accurate and biologically meaningful description of length-weight relationships of common amphipod and isopod species. The methodological limitations of the approach are considered, while the practical implications for secondary production estimates are discussed.
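    The comparison between a constant-exponent model and a segmented model with a breakpoint can be outlined as follows (synthetic log10 data with an artificial slope change; the breakpoint is grid-searched, which is only one of several ways to fit segmented regressions):

```python
import math

def line_rss(xs, ys):
    """RSS of an OLS line y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))

def gaussian_aic(rss, n, k):
    """AIC = n*ln(RSS/n) + 2k for least squares with Gaussian errors."""
    return n * math.log(rss / n) + 2 * k

# Hypothetical log10(length) vs log10(weight) data: the slope b jumps
# from 2 to 3 at logL = 1.0, plus small fixed "noise".
logL = [i / 10 for i in range(5, 16)]   # 0.5 .. 1.5
noise = [0.02, -0.015, 0.01, -0.02, 0.015, 0.0, -0.01, 0.02, -0.015, 0.01, -0.02]
logW = [(2.0 * l if l <= 1.0 else 2.0 + 3.0 * (l - 1.0)) + e
        for l, e in zip(logL, noise)]

n = len(logL)
aic_simple = gaussian_aic(line_rss(logL, logW), n, 3)   # a, b, variance

# Segmented model: grid-search the breakpoint, fit a line on each side.
best_rss = min(line_rss(logL[:i], logW[:i]) + line_rss(logL[i:], logW[i:])
               for i in range(2, n - 1))
aic_seg = gaussian_aic(best_rss, n, 6)  # 2 lines + breakpoint + variance
print(aic_seg < aic_simple)
```

    With the breakpoint, the residuals shrink to the noise level, so the segmented model wins despite its extra parameters (counted here, loosely, as two lines plus the breakpoint plus the error variance).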

  3. The Concept of Information.

    ERIC Educational Resources Information Center

    Capurro, Rafael; Hjorland, Birger

    2003-01-01

    Reviews the status of the concept of information in information science, with reference to interdisciplinary trends. Highlights include defining scientific terms; studies and sources of the word information; the concept of information in the natural sciences, and in the humanities and social sciences; librarianship; information retrieval; and the…

  4. Rethinking Information Literacy.

    ERIC Educational Resources Information Center

    Marcum, James W.

    2002-01-01

    Critiques the model of information literacy as a central purpose of librarianship. Reviews the appropriateness of the "learning methodology" of the information literacy model. Outlines the challenge of relating information literacy to workplace competencies. Proposes that information literacy be refocused away from information toward learning, and…

  5. Poisons information in Singapore.

    PubMed

    Chao, T C; Tay, M K; Bloodworth, B C; Lim, K H

    1993-03-01

    The Poisons Information Centre (PIC) provides vital and timely information to prevent and manage poisoning episodes. Comprehensive information on household, agricultural and industrial chemicals, natural toxins, pharmaceuticals, local antidote stocks and local poisons experts is retrieved from the Centre's computerised information system and printed literature. Public subscribers can obtain poisons information through Teleview.

  6. Plant species invasions along the latitudinal gradient in the United States

    USGS Publications Warehouse

    Stohlgren, T.J.; Barnett, D.; Flather, C.; Kartesz, J.; Peterjohn, B.

    2005-01-01

    It has long been established that the richness of vascular plant species and many animal taxa decreases with increasing latitude, a pattern that very generally follows declines in actual and potential evapotranspiration, solar radiation, temperature, and thus, total productivity. Using county-level data on vascular plants from the United States (3000 counties in the conterminous 48 states), we used the Akaike Information Criterion (AIC) to evaluate competing models predicting native and nonnative plant species density (number of species per square kilometer in a county) from various combinations of biotic variables (e.g., native bird species density, vegetation carbon, normalized difference vegetation index), environmental/topographic variables (elevation, variation in elevation, the number of land cover classes in the county; radiation, mean precipitation, actual evapotranspiration, and potential evapotranspiration), and human variables (human population density, crop-land, and percentage of disturbed lands in a county). We found no evidence of a latitudinal gradient for the density of native plant species and a significant, slightly positive latitudinal gradient for the density of nonnative plant species. We found stronger evidence of a significant, positive productivity gradient (vegetation carbon) for the density of native plant species and nonnative plant species. We found much stronger significant relationships when biotic, environmental/topographic, and human variables were used to predict native plant species density and nonnative plant species density. Biotic variables generally had far greater influence in multivariate models than human or environmental/topographic variables. Finally, we found that the best, single, positive predictor of the density of nonnative plant species in a county was the density of native plant species in a county.
While further study is needed, it may be that, while humans facilitate the initial establishment invasions of nonnative

  7. In Vivo Evaluation of Blood Based and Reference Tissue Based PET Quantifications of [11C]DASB in the Canine Brain

    PubMed Central

    Polis, Ingeborgh; Neyt, Sara; Kersemans, Ken; Dobbeleir, Andre; Saunders, Jimmy; Goethals, Ingeborg; Peremans, Kathelijne; De Vos, Filip

    2016-01-01

    This first-in-dog study evaluates the use of the PET radioligand [11C]DASB to image the density and availability of the serotonin transporter (SERT) in the canine brain. Imaging the serotonergic system could improve diagnosis and therapy of multiple canine behavioural disorders. Furthermore, as many similarities are reported between several human neuropsychiatric conditions and naturally occurring canine behavioural disorders, making this tracer available for use in dogs also provides researchers with an interesting non-primate animal model to investigate human disorders. Five adult beagles underwent a 90-minute dynamic PET scan, and arterial whole blood was sampled throughout the scan. For each ROI, the distribution volume (VT), obtained via the one- and two-tissue compartment models (1-TC, 2-TC) and the Logan plot, was calculated, and the goodness of fit was evaluated by the Akaike Information Criterion (AIC). For the preferred compartmental model, BPND values were estimated and compared with those derived by four reference tissue models: the 4-parameter RTM, SRTM2, MRTM2 and the Logan reference tissue model. The 2-TC model provided a better fit than the 1-TC model in 61% of the ROIs. The Logan plot produced almost identical VT values and can be used as an alternative. Compared with the 2-TC model, all investigated reference tissue models showed high correlations but small underestimations of the BPND parameter. The highest correlation was achieved with the Logan reference tissue model (Y = 0.9266x + 0.0257; R2 = 0.9722). Therefore, this model can be put forward as a non-invasive standard model for future PET experiments with [11C]DASB in dogs. PMID:26859850

  8. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    NASA Astrophysics Data System (ADS)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed
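    The error indices mentioned have standard textbook definitions; this small sketch uses hypothetical observed/forecast values and normalises the "relative" versions by the mean observation (one common convention):

```python
import math

def error_indices(obs, sim):
    """RMSE, BIAS and their relative counterparts (normalised here by the
    mean observation, which is one common convention)."""
    n = len(obs)
    mean_obs = sum(obs) / n
    bias = sum(s - o for o, s in zip(obs, sim)) / n
    rmse = math.sqrt(sum((s - o) ** 2 for o, s in zip(obs, sim)) / n)
    return {"RMSE": rmse, "rRMSE": rmse / mean_obs,
            "BIAS": bias, "rBIAS": bias / mean_obs}

# Hypothetical observed vs forecast air temperatures (°C).
obs = [30.0, 32.0, 31.0, 29.0]
sim = [31.0, 31.0, 32.0, 29.0]
idx = error_indices(obs, sim)
print(round(idx["RMSE"], 3), round(idx["BIAS"], 3))
```

    BIAS captures systematic over- or under-forecasting, while RMSE also penalises scatter, which is why both are reported together.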

  9. Influence of habitat heterogeneity on the distribution of larval Pacific lamprey (Lampetra tridentata) at two spatial scales

    USGS Publications Warehouse

    Torgersen, Christian E.; Close, David A.

    2004-01-01

    1. Spatial patterns in channel morphology and substratum composition at small (1–10 metres) and large scales (1–10 kilometres) were analysed to determine the influence of habitat heterogeneity on the distribution and abundance of larval lamprey. 2. We used a nested sampling design and multiple logistic regression to evaluate spatial heterogeneity in the abundance of larval Pacific lamprey, Lampetra tridentata, and habitat in 30 sites (each composed of twelve 1-m2 quadrat samples) distributed throughout a 55-km section of the Middle Fork John Day River, OR, U.S.A. Statistical models predicting the relative abundance of larvae both among sites (large scale) and among samples (small scale) were ranked using Akaike's Information Criterion (AIC) to identify the 'best approximating' models from a set of a priori candidate models determined from the literature on larval lamprey habitat associations. 3. Stream habitat variables predicted patterns in larval abundance but played different roles at different spatial scales. The abundance of larvae at large scales was positively associated with water depth and open riparian canopy, whereas patchiness in larval occurrence at small scales was associated with low water velocity, channel-unit morphology (pool habitats), and the availability of habitat suitable for burrowing. 4. Habitat variables explained variation in larval abundance at large and small scales, but locational factors, such as longitudinal position (river km) and sample location within the channel unit, explained additional variation in the logistic regression model. The results emphasise the need for spatially explicit analysis, both in examining fish habitat relationships and in developing conservation plans for declining fish populations.
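    AIC ranking of a candidate model set, as used here, is commonly reported via ΔAIC values and Akaike weights; a minimal sketch with invented AIC scores:

```python
import math

def akaike_weights(aic_values):
    """Convert raw AIC scores into Akaike weights:
    Δi = AICi − AICmin;  wi = exp(−Δi/2) / Σj exp(−Δj/2).
    Each weight is the relative likelihood of that candidate model."""
    amin = min(aic_values)
    rel = [math.exp(-(a - amin) / 2) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for three a priori candidate models.
aics = [100.0, 102.0, 110.0]
w = akaike_weights(aics)
print([round(x, 3) for x in w])
```

    A ΔAIC of 2 already cuts a model's weight by e^(−1), and a ΔAIC of 10 makes it negligible, which is the usual rule-of-thumb reading of such tables.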

  10. The HII Galaxy Hubble Diagram Strongly Favors Rh = ct over ΛCDM

    NASA Astrophysics Data System (ADS)

    Wei, Jun-Jie; Wu, Xue-Feng; Melia, Fulvio

    2016-08-01

    We continue to build support for the proposal to use HII galaxies (HIIGx) and giant extragalactic HII regions (GEHR) as standard candles to construct the Hubble diagram at redshifts beyond the current reach of Type Ia supernovae. Using a sample of 25 high-redshift HIIGx, 107 local HIIGx, and 24 GEHR, we confirm that the correlation between the emission-line luminosity and ionized-gas velocity dispersion is a viable luminosity indicator, and use it to test and compare the standard model ΛCDM and the Rh = ct Universe by optimizing the parameters in each cosmology using a maximization of the likelihood function. For the flat ΛCDM model, the best fit is obtained with Ω_m = 0.40^{+0.09}_{-0.09}. However, statistical tools, such as the Akaike (AIC), Kullback (KIC) and Bayes (BIC) Information Criteria favor Rh = ct over the standard model with a likelihood of ≈94.8% - 98.8% versus only ≈1.2% - 5.2%. For wCDM (the version of ΛCDM with a dark-energy equation of state w_de ≡ p_de/ρ_de rather than w_de = w_Λ = -1), a statistically acceptable fit is realized with Ω_m = 0.22^{+0.16}_{-0.14} and w_de = -0.51^{+0.15}_{-0.25} which, however, are not fully consistent with their concordance values. In this case, wCDM has two more free parameters than Rh = ct, and is penalized more heavily by these criteria. We find that Rh = ct is strongly favored over wCDM with a likelihood of ≈92.9% - 99.6% versus only 0.4% - 7.1%. The current HIIGx sample is already large enough for the BIC to rule out ΛCDM/wCDM in favor of Rh = ct at a confidence level approaching 3σ.

  11. Alien plant invasion in mixed-grass prairie: Effects of vegetation type and anthropogenic disturbance

    USGS Publications Warehouse

    Larson, D.L.; Anderson, P.J.; Newton, W.

    2001-01-01

    The ability of alien plant species to invade a region depends not only on attributes of the plant, but on characteristics of the habitat being invaded. Here, we examine characteristics that may influence the success of alien plant invasion in mixed-grass prairie at Theodore Roosevelt National Park, in western North Dakota, USA. The park consists of two geographically separate units with similar vegetation types and management history, which allowed us to examine the effects of native vegetation type, anthropogenic disturbance, and the separate park units on the invasion of native plant communities by alien plant species common to counties surrounding both park units. If matters of chance related to availability of propagules and transient establishment opportunities determine the success of invasion, park unit and anthropogenic disturbance should better explain the variation in alien plant frequency. If invasibility is more strongly related to biotic or physical characteristics of the native plant communities, models of alien plant occurrence should include vegetation type as an explanatory variable. We examined >1300 transects across all vegetation types in both units of the park. Akaike's Information Criterion (AIC) indicated that the fully parameterized model, including the interaction among vegetation type, disturbance, and park unit, best described the distribution of both total number of alien plants per transect and frequency of alien plants on transects where they occurred. Although all vegetation types were invaded by alien plants, mesic communities had both greater numbers and higher frequencies of alien plants than did drier communities. A strong element of stochasticity, reflected in differences in frequencies of individual species between the two park units, suggests that prediction of risk of invasion will always involve uncertainty. 
In addition, despite well-documented associations between anthropogenic disturbance and alien plant invasion, five of

  12. Biogeographical Interpretation of Elevational Patterns of Genus Diversity of Seed Plants in Nepal

    PubMed Central

    Li, Miao; Feng, Jianmeng

    2015-01-01

    This study tests if the biogeographical affinities of genera are relevant for explaining elevational plant diversity patterns in Nepal. We used simultaneous autoregressive (SAR) models to investigate the explanatory power of several predictors in explaining the diversity-elevation relationships shown in genera with different biogeographical affinities. The delta Akaike information criterion (ΔAIC) was used for multi-model inference and selection. Our results showed that both the total and tropical genus diversity peaked below the mid-point of the elevational gradient, whereas that of temperate genera had a nearly symmetrical, unimodal relationship with elevation. The proportion of temperate genera increased markedly with elevation, while that of tropical genera declined. Compared to tropical genera, temperate genera had wider elevational ranges and were observed at higher elevations. Water-related variables, rather than mid-domain effects (MDE), were the most significant predictors of elevational patterns of tropical genus diversity. The temperate genus diversity was influenced by energy availability, but only in quadratic terms of the models. Though climatic factors and mid-domain effects jointly explained most of the variation in the diversity of temperate genera with elevation, the former played stronger roles. Total genus diversity was most strongly influenced by climate and the floristic overlap of tropical and temperate floras, while the influences of mid-domain effects were relatively weak. The influences of water-related and energy-related variables may vary with biogeographical affinities. The elevational patterns may be most closely related to climatic factors, while MDE may somewhat modify the patterns. Caution is needed when investigating the causal factors underlying diversity patterns for large taxonomic groups composed of taxa of different biogeographical affinities. Right-skewed diversity-elevation patterns may be produced by the differential

  13. Recovery of native treefrogs after removal of nonindigenous Cuban Treefrogs, Osteopilus septentrionalis

    USGS Publications Warehouse

    Rice, K.G.; Waddle, J.H.; Miller, M.W.; Crockett, M.E.; Mazzotti, F.J.; Percival, H.F.

    2011-01-01

    Florida is home to several introduced animal species, especially in the southern portion of the state. Most introduced species are restricted to the urban and suburban areas along the coasts, but some species, like the Cuban Treefrog (Osteopilus septentrionalis), are locally abundant in natural protected areas. Although Cuban Treefrogs are known predators of native treefrog species as both adults and larvae, no study has demonstrated a negative effect of Cuban Treefrogs on native treefrog survival, abundance, or occupancy rate. We monitored survival, capture probability, abundance, and proportion of sites occupied by Cuban Treefrogs and two native species, Green Treefrogs (Hyla cinerea) and Squirrel Treefrogs (Hyla squirella), at four sites in Everglades National Park in southern Florida with the use of capture–mark–recapture techniques. After at least 5 mo of monitoring all species at each site we began removing every Cuban Treefrog captured. We continued to estimate survival, abundance, and occupancy rates of native treefrogs for 1 yr after the commencement of Cuban Treefrog removal. Mark–recapture models that included the effect of Cuban Treefrog removal on native treefrog survival did not have considerable Akaike's Information Criterion (AIC) weight, although capture rates of native species were generally very low prior to Cuban Treefrog removal. Estimated abundance of native treefrogs did increase after commencement of Cuban Treefrog removal, but also varied with the season of the year. The best models of native treefrog occupancy included a Cuban Treefrog removal effect at sites with high initial densities of Cuban Treefrogs. This study demonstrates that an introduced predator can have population-level effects on similar native species.
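
The AIC-weight comparison used in records like this one is simple to sketch: with Δi = AICi − AICmin, each model's Akaike weight is wi = exp(−Δi/2) normalized over the candidate set. The AIC values below are hypothetical, not taken from the study:

```python
import math

def akaike_weights(aics):
    """Convert a list of AIC values into Akaike weights.

    Each weight is the relative likelihood exp(-dAIC/2) of a model,
    normalized so the weights sum to 1 over the candidate set.
    """
    best = min(aics)
    rel = [math.exp(-(a - best) / 2.0) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC values for three candidate mark-recapture models
aics = [210.4, 212.1, 218.9]
weights = akaike_weights(aics)
```

A model whose AIC sits several units above the best model receives a near-zero weight, which is the sense in which a removal-effect model "did not have considerable AIC weight".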

  14. Analysis of the return period and correlation between the reservoir-induced seismic frequency and the water level based on a copula: A case study of the Three Gorges reservoir in China

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofei; Zhang, Qiuwen

    2016-11-01

    Studies have considered the many factors involved in the mechanism of reservoir seismicity. Focusing on the correlation between reservoir-induced seismicity and the water level, this study proposes to utilize copula theory to build a correlation model to analyze their relationships and to perform risk analysis. The sequences of reservoir-induced seismicity events from 2003 to 2011 in the Three Gorges reservoir in China are used as a case study to test this new methodology. Next, we construct four correlation models based on the Gumbel, Clayton, Frank and M-copula functions and employ four methods to test the goodness of fit: Q-Q plots, the Kolmogorov-Smirnov (K-S) test, the minimum distance (MD) test and the Akaike Information Criterion (AIC) test. Through a comparison of the four models, the M-copula model fits the sample better than the other three models. Based on the M-copula model, we find that, for the case of a sudden drawdown of the water level, the probability of a marked decrease in seismic frequency increases, whereas for the case of a sudden rise of the water level, the probability of a marked increase in seismic frequency increases, with the former being greater than the latter. The seismic frequency is mainly distributed in the low-frequency region (Y ≤ 20) for the low water level and in the middle-frequency region (20 < Y ≤ 80) for both the medium and high water levels; the seismic frequency in the high-frequency region (Y > 80) is the least likely. For the conditional return period, the period of high-frequency seismicity is much longer than those of the low- and medium-frequency seismicity, and the high water level shortens the periods.

  15. Crucial nesting habitat for gunnison sage-grouse: A spatially explicit hierarchical approach

    USGS Publications Warehouse

    Aldridge, C.L.; Saher, D.J.; Childers, T.M.; Stahlnecker, K.E.; Bowen, Z.H.

    2012-01-01

    Gunnison sage-grouse (Centrocercus minimus) is a species of special concern and is currently considered a candidate species under the Endangered Species Act. Careful management is therefore required to ensure that suitable habitat is maintained, particularly because much of the species' current distribution is faced with exurban development pressures. We assessed hierarchical nest site selection patterns of Gunnison sage-grouse inhabiting the western portion of the Gunnison Basin, Colorado, USA, at multiple spatial scales, using logistic regression-based resource selection functions. Models were selected using Akaike Information Criterion corrected for small sample sizes (AICc) and predictive surfaces were generated using model averaged relative probabilities. Landscape-scale factors that had the most influence on nest site selection included the proportion of sagebrush cover >5%, mean productivity, and density of 2-wheel-drive roads. The landscape-scale predictive surface captured 97% of known Gunnison sage-grouse nests within the top 5 of 10 prediction bins, identifying 57% of the basin as crucial nesting habitat. Crucial habitat identified by the landscape model was used to define the extent for patch-scale modeling efforts. Patch-scale variables that had the greatest influence on nest site selection were the proportion of big sagebrush cover >10%, distance to residential development, distance to high volume paved roads, and mean productivity. This model accurately predicted independent nest locations. The unique hierarchical structure of our models more accurately captures the nested nature of habitat selection, and allowed for increased discrimination within larger landscapes of suitable habitat. We extrapolated the landscape-scale model to the entire Gunnison Basin because of conservation concerns for this species. We believe this predictive surface is a valuable tool which can be incorporated into land use and conservation planning as well as the assessment of
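
The small-sample correction used above is AICc = AIC + 2k(k+1)/(n − k − 1), a penalty that vanishes as the sample size n grows relative to the number of parameters k. A minimal sketch:

```python
def aicc(aic, k, n):
    """Small-sample corrected AIC: AICc = AIC + 2k(k+1)/(n - k - 1).

    k: number of estimated parameters; n: sample size (requires n > k + 1).
    """
    if n <= k + 1:
        raise ValueError("AICc requires n > k + 1")
    return aic + (2.0 * k * (k + 1)) / (n - k - 1)

# With few observations the correction is substantial...
small = aicc(100.0, k=5, n=20)
# ...and negligible for large n, where AICc converges to AIC
large = aicc(100.0, k=5, n=2000)
```

Because the extra penalty grows with k, AICc guards against overfitting exactly in the nest-survey situation described here, where candidate models can be rich relative to the number of monitored nests.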

  16. The importance of retaining a phylogenetic perspective in traits-based community analyses

    SciTech Connect

    Poteat, Monica D.; Buchwalter, David B.; Jacobus, Luke M.

    2015-04-08

    1) Many environmental stressors manifest their effects via physiological processes (traits) that can differ significantly among species and species groups. We compiled available data for three traits related to the bioconcentration of the toxic metal cadmium (Cd) from 42 aquatic insect species representing orders Ephemeroptera (mayfly), Plecoptera (stonefly), and Trichoptera (caddisfly). These traits included the propensity to take up Cd from water (uptake rate constant, ku), the ability to excrete Cd (efflux rate constant, ke), and the net result of these two processes (bioconcentration factor, BCF). 2) Ranges in these Cd bioaccumulation traits varied in magnitude across lineages (some lineages had a greater tendency to bioaccumulate Cd than others). Overlap in the ranges of trait values among different lineages was common and highlights situations where species from different lineages can share a similar trait state, but represent the high end of possible physiological values for one lineage and the low end for another. 3) Variance around the mean trait state differed widely across clades, suggesting that some groups (e.g., Ephemerellidae) are inherently more variable than others (e.g., Perlidae). Thus, trait variability/lability is at least partially a function of lineage. 4) Akaike information criterion (AIC) comparisons of statistical models were more often driven by clade than by the other potential biological or ecological explanations tested. Clade-driven models generally improved with increasing taxonomic resolution. 5) Altogether, these findings suggest that lineage provides context for the analysis of species traits, and that failure to consider lineage in community-based analysis of traits may obscure important patterns of species responses to environmental change.

  17. The importance of retaining a phylogenetic perspective in traits-based community analyses

    DOE PAGES

    Poteat, Monica D.; Buchwalter, David B.; Jacobus, Luke M.

    2015-04-08

    1) Many environmental stressors manifest their effects via physiological processes (traits) that can differ significantly among species and species groups. We compiled available data for three traits related to the bioconcentration of the toxic metal cadmium (Cd) from 42 aquatic insect species representing orders Ephemeroptera (mayfly), Plecoptera (stonefly), and Trichoptera (caddisfly). These traits included the propensity to take up Cd from water (uptake rate constant, ku), the ability to excrete Cd (efflux rate constant, ke), and the net result of these two processes (bioconcentration factor, BCF). 2) Ranges in these Cd bioaccumulation traits varied in magnitude across lineages (some lineages had a greater tendency to bioaccumulate Cd than others). Overlap in the ranges of trait values among different lineages was common and highlights situations where species from different lineages can share a similar trait state, but represent the high end of possible physiological values for one lineage and the low end for another. 3) Variance around the mean trait state differed widely across clades, suggesting that some groups (e.g., Ephemerellidae) are inherently more variable than others (e.g., Perlidae). Thus, trait variability/lability is at least partially a function of lineage. 4) Akaike information criterion (AIC) comparisons of statistical models were more often driven by clade than by the other potential biological or ecological explanations tested. Clade-driven models generally improved with increasing taxonomic resolution. 5) Altogether, these findings suggest that lineage provides context for the analysis of species traits, and that failure to consider lineage in community-based analysis of traits may obscure important patterns of species responses to environmental change.

  18. Time-dependent oral absorption models

    NASA Technical Reports Server (NTRS)

    Higaki, K.; Yamashita, S.; Amidon, G. L.

    2001-01-01

    The plasma concentration-time profiles following oral administration of drugs are often irregular and cannot be interpreted easily with conventional models based on first- or zero-order absorption kinetics and lag time. Six new models were developed using a time-dependent absorption rate coefficient, ka(t), wherein the time dependency was varied to account for dynamic processes in the gastrointestinal tract, such as changes with time in fluid absorption or secretion, in absorption surface area, and in motility. In the present study, the plasma concentration profiles of propranolol obtained in human subjects following oral dosing were analyzed using the newly derived models based on mass balance and compared with the conventional models. Nonlinear regression analysis indicated that the conventional compartment model including lag time (CLAG model) could not predict the rapid initial increase in plasma concentration after dosing, and the predicted Cmax values were much lower than those observed. On the other hand, all models with the time-dependent absorption rate coefficient, ka(t), were superior to the CLAG model in predicting plasma concentration profiles. Based on Akaike's Information Criterion (AIC), the fluid absorption model without lag time (FA model) exhibited the best overall fit to the data. The two-phase model including lag time (TPLAG model) was also found to be a good model judging from the sums of squares. This model also described the irregular profiles of plasma concentration with time and frequently predicted Cmax values satisfactorily. A comparison of the absorption rate profiles also suggested that the TPLAG model better predicts irregular absorption kinetics than the FA model. In conclusion, the incorporation of a time-dependent absorption rate coefficient ka(t) allows the prediction of nonlinear absorption characteristics in a more reliable manner.
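
A one-compartment oral model with a time-dependent absorption rate coefficient can be sketched as below; the exponentially decaying form of ka(t) and all parameter values are illustrative assumptions, not the six models or fitted estimates from the study:

```python
import math

def simulate(ka0=2.0, decay=0.5, ke=0.3, V=40.0, dose=80.0,
             t_end=12.0, dt=0.001):
    """Euler integration of a one-compartment oral model where the
    absorption rate coefficient decays with time: ka(t) = ka0*exp(-decay*t).

    dA/dt = -ka(t) * A             (drug remaining at the absorption site)
    dC/dt = ka(t) * A / V - ke * C (plasma concentration)
    """
    A, C, t = dose, 0.0, 0.0
    conc = []
    while t < t_end:
        ka = ka0 * math.exp(-decay * t)
        dA = -ka * A
        dC = ka * A / V - ke * C
        A += dA * dt
        C += dC * dt
        t += dt
        conc.append(C)
    return conc

profile = simulate()
cmax = max(profile)
```

Letting ka(t) decay (or follow another time course) reshapes the early part of the curve, which is how such models capture the rapid initial rise that a constant-ka model with lag time misses.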

  19. Blood lead concentrations in free-ranging Nile crocodiles (Crocodylus niloticus) from South Africa.

    PubMed

    Warner, Jonathan K; Combrink, Xander; Myburgh, Jan G; Downs, Colleen T

    2016-07-01

    Crocodilians have generally received little attention with regard to the effects of lead toxicity, despite their trophic status as apex, generalist predators that utilize both aquatic and terrestrial habitats, thereby exposing them to a potentially wide range of environmental contaminants. During July-October 2010 we collected whole blood from 34 sub-adult and adult free-ranging Nile crocodiles (Crocodylus niloticus) from three separate populations in northeastern South Africa in order to analyze their blood lead concentrations (BPb). Concentrations ranged from below detectability (<3 μg/dL, n = 8) to 960 μg/dL for an adult male at the Lake St Lucia Estuary. Blood lead concentrations averaged 8.15 μg/dL (SD = 7.47) for females and 98.10 μg/dL (SD = 217.42) for males. Eighteen individuals (53%) had elevated BPbs (≥10 μg/dL). We assessed 12 general linear models using Akaike's Information Criterion (AIC) and found no significant statistical effects among the parameters of sex, crocodile size and population sampled. On average, crocodiles had higher BPbs at Lake St Lucia than at Ndumo Game Reserve or Kosi Bay, which we attribute to lead sinker ingestion during normal gastrolith acquisition. No clinical effects of lead toxicosis were observed in these crocodiles, even though the highest concentration (960 μg/dL) we report represents the most elevated BPb recorded to date for a free-ranging vertebrate. Although we suggest adult Nile crocodiles are likely tolerant of elevated Pb body burdens, experimental studies on other crocodilian species suggest the BPb levels reported here may have harmful or fatal effects on egg development and hatchling health. In light of recent Nile crocodile nesting declines in South Africa we urge further BPb monitoring and ecotoxicology research on reproductive females and embryos. PMID:27038476

  20. A functional biological network centered on XRCC3: a new possible marker of chemoradiotherapy resistance in rectal cancer patients.

    PubMed

    Agostini, Marco; Zangrando, Andrea; Pastrello, Chiara; D'Angelo, Edoardo; Romano, Gabriele; Giovannoni, Roberto; Giordan, Marco; Maretto, Isacco; Bedin, Chiara; Zanon, Carlo; Digito, Maura; Esposito, Giovanni; Mescoli, Claudia; Lavitrano, Marialuisa; Rizzolio, Flavio; Jurisica, Igor; Giordano, Antonio; Pucciarelli, Salvatore; Nitti, Donato

    2015-01-01

    Preoperative chemoradiotherapy is widely used to improve local control of disease, sphincter preservation and to improve survival in patients with locally advanced rectal cancer. Patients enrolled in the present study underwent preoperative chemoradiotherapy, followed by surgical excision. Response to chemoradiotherapy was evaluated according to Mandard's Tumor Regression Grade (TRG). TRG 3, 4 and 5 were considered as partial or no response while TRG 1 and 2 as complete response. From pretherapeutic biopsies of 84 locally advanced rectal carcinomas available for the analysis, only 42 showed at least 70% cancer cellularity. By determining gene expression profiles, responders and non-responders showed significantly different expression levels for 19 genes (P < 0.001). We fitted a logistic model selected with a stepwise procedure optimizing the Akaike Information Criterion (AIC) and then validated it by means of leave-one-out cross-validation (LOOCV, accuracy = 95%). Four genes were retained in the achieved model: ZNF160, XRCC3, HFM1 and ASXL2. Real-time PCR confirmed that XRCC3 is overexpressed in the responders group, and HFM1 and ASXL2 showed a positive trend. In vitro tests on chemoradiotherapy-resistant and -susceptible colon cancer cells finally proved that XRCC3 deregulation is extensively involved in chemoresistance mechanisms. Protein-protein interaction (PPI) analysis involving the predictive classifier revealed a network of 45 interacting nodes (proteins) with the TRAF6 gene playing a keystone role in the network. The present study confirmed the possibility that gene expression profiling combined with integrative computational biology is useful to predict complete responses to preoperative chemoradiotherapy in patients with advanced rectal cancer.

  1. Climatic patterns in the establishment of wintering areas by North American migratory birds.

    PubMed

    Pérez-Moreno, Heidi; Martínez-Meyer, Enrique; Soberón Mainero, Jorge; Rojas-Soto, Octavio

    2016-04-01

    Long-distance migration in birds is relatively well studied in nature; however, one aspect of this phenomenon that remains poorly understood is the pattern of distribution presented by species during arrival to and establishment of wintering areas. Some studies suggest that the selection of areas in winter is somehow determined by climate, given its influence on both the distribution of bird species and their resources. We analyzed whether different migrant passerine species of North America present climatic preferences during arrival to and departure from their wintering areas. We used ecological niche modeling to generate monthly potential climatic distributions for 13 migratory bird species during the winter season by combining the locations recorded per month with four environmental layers. We calculated monthly coefficients of climate variation and then compared two GLMs (generalized linear models), evaluated with the AIC (Akaike information criterion), to describe how these coefficients varied over the course of the season, as a measure of the patterns of establishment in the wintering areas. For 11 species, the sites show nonlinear patterns of variation in climatic preferences, with low coefficients of variation at the beginning and end of the season and higher values found in the intermediate months. The remaining two species analyzed showed a different climatic pattern of selective establishment of wintering areas, probably due to taxonomic discrepancy, which would affect their modeled winter distribution. Patterns of establishment of wintering areas in the species showed a climatic preference at the macroscale, suggesting that individuals of several species actively select wintering areas that meet specific climatic conditions. This probably gives them an advantage over the winter and during the return to breeding areas. As these areas become full of migrants, alternative suboptimal sites are occupied. Nonrandom winter area selection may also have

  2. Bait stations, hard mast, and black bear population growth in Great Smoky Mountains National Park

    USGS Publications Warehouse

    Clark, Joseph D.; van Manen, Frank T.; Pelton, Michael R.

    2005-01-01

    Bait-station surveys are used by wildlife managers as an index to American black bear (Ursus americanus) population abundance, but the relationship is not well established. Hard mast surveys are similarly used to assess annual black bear food availability, which may affect mortality and natality rates. We used data collected in Great Smoky Mountains National Park (GSMNP) from 1989 to 2003 to determine whether changes in the bait-station index (ΔBSI) were associated with estimated rates of bear population growth (λ) and whether hard mast production was related to bear visitation to baits. We also evaluated whether hard mast production from previous years was related to λ. Estimates of λ were based on analysis of capture-recapture data with the Pradel temporal symmetry estimator. Using Akaike's Information Criterion (AIC), our analysis revealed no direct relationship between ΔBSI and λ. A simulation analysis indicated that our data were adequate to detect a relationship had one existed. Model fit was marginally improved when we added total oak mast production of the previous year as an interaction term, suggesting that the BSI was confounded with environmental variables. Consequently, the utility of the bait-station survey as a population monitoring technique is questionable at the spatial and temporal scales we studied. Mast survey data, however, were valuable covariates of λ. Population growth for a given year was negatively related to oak mast production 4 and 5 years prior. That finding supported our hypothesis that mast failures can trigger reproductive synchrony, which may not be evident from the trapped sample until years later.

  3. Estimating rates of local extinction and colonization in colonial species and an extension to the metapopulation and community levels

    USGS Publications Warehouse

    Barbraud, C.; Nichols, J.D.; Hines, J.E.; Hafner, H.

    2003-01-01

    Coloniality has mainly been studied from an evolutionary perspective, but relatively few studies have developed methods for modelling colony dynamics. Changes in number of colonies over time provide a useful tool for predicting and evaluating the responses of colonial species to management and to environmental disturbance. Probabilistic Markov process models have been recently used to estimate colony site dynamics using presence-absence data when all colonies are detected in sampling efforts. Here, we define and develop two general approaches for the modelling and analysis of colony dynamics for sampling situations in which all colonies are, and are not, detected. For both approaches, we develop a general probabilistic model for the data and then constrain model parameters based on various hypotheses about colony dynamics. We use Akaike's Information Criterion (AIC) to assess the adequacy of the constrained models. The models are parameterised with conditional probabilities of local colony site extinction and colonization. Presence-absence data arising from Pollock's robust capture-recapture design provide the basis for obtaining unbiased estimates of extinction, colonization, and detection probabilities when not all colonies are detected. This second approach should be particularly useful in situations where detection probabilities are heterogeneous among colony sites. The general methodology is illustrated using presence-absence data on two species of herons (Purple Heron, Ardea purpurea and Grey Heron, Ardea cinerea). Estimates of the extinction and colonization rates showed interspecific differences and strong temporal and spatial variations. We were also able to test specific predictions about colony dynamics based on ideas about habitat change and metapopulation dynamics. We recommend estimators based on probabilistic modelling for future work on colony dynamics. We also believe that this methodological framework has wide application to problems in animal

  4. Fine-Scale Mapping by Spatial Risk Distribution Modeling for Regional Malaria Endemicity and Its Implications under the Low-to-Moderate Transmission Setting in Western Cambodia

    PubMed Central

    Okami, Suguru; Kohtake, Naohiko

    2016-01-01

    The disease burden of malaria has decreased as malaria elimination efforts progress. The mapping approach that uses spatial risk distribution modeling needs some adjustment and reinvestigation in accordance with situational changes. Here we applied a mathematical modeling approach for standardized morbidity ratio (SMR) calculated by annual parasite incidence using routinely aggregated surveillance reports, environmental data such as remote sensing data, and non-environmental anthropogenic data to create fine-scale spatial risk distribution maps of western Cambodia. Furthermore, we incorporated a combination of containment status indicators into the model to demonstrate spatial heterogeneities of the relationship between containment status and risks. The explanatory model was fitted to estimate the SMR of each area (adjusted Pearson correlation coefficient R2 = 0.774; Akaike information criterion AIC = 149.423). A Bayesian modeling framework was applied to estimate the uncertainty of the model and cross-scale predictions. Fine-scale maps were created by the spatial interpolation of estimated SMRs at each village. Compared with geocoded case data, corresponding predicted values showed conformity [Spearman’s rank correlation r = 0.662 in the inverse distance weighted interpolation and 0.645 in ordinal kriging (95% confidence intervals of 0.414–0.827 and 0.368–0.813, respectively), Welch’s t-test: not significant]. The proposed approach successfully explained regional malaria risks and fine-scale risk maps were created under low-to-moderate malaria transmission settings where reinvestigations of existing risk modeling approaches were needed. Moreover, different representations of simulated outcomes of containment status indicators for respective areas provided useful insights for tailored interventional planning, considering regional malaria endemicity. PMID:27415623
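
The inverse distance weighted interpolation step used to build such fine-scale maps can be sketched in a few lines; the sample coordinates and SMR values below are toy data, not the study's:

```python
import math

def idw(x, y, points, power=2.0):
    """Inverse distance weighted interpolation at (x, y).

    points: list of (px, py, value) samples. A sample at zero distance
    returns its value exactly; otherwise weights are 1/d**power.
    """
    num, den = 0.0, 0.0
    for px, py, v in points:
        d = math.hypot(x - px, y - py)
        if d == 0.0:
            return v
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Toy village locations with estimated SMR values
samples = [(0.0, 0.0, 1.2), (1.0, 0.0, 0.8), (0.0, 1.0, 2.0)]
est = idw(0.2, 0.2, samples)
```

Because IDW is a convex combination of the sample values, interpolated estimates always stay within the range of the observed SMRs, unlike kriging, which can extrapolate beyond it.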

  5. Development of an Adaptive Multi-Method Algorithm for Automatic Picking of First Arrival Times: Application to Near Surface Seismic Data

    NASA Astrophysics Data System (ADS)

    Khalaf, A.; Camerlynck, C. M.; Schneider, A. C.; Florsch, N.

    2015-12-01

    Accurate picking of first arrival times plays an important role in many seismic studies, particularly in seismic tomography and in the monitoring of reservoirs or aquifers. Many techniques have been developed for picking first arrivals automatically or semi-automatically, but most were developed for seismological purposes and do not attain the required accuracy, owing to the complexity of near-surface structures and the usually low signal-to-noise ratio. We propose a new adaptive algorithm for near-surface data based on three picking methods, combining multi-nested windows (MNW), Higher Order Statistics (HOS), and the Akaike Information Criterion (AIC). It exploits the benefits of integrating several properties that reveal the presence of first arrivals to provide efficient and robust first-arrival picking. This strategy mimics manual first-break picking: the global trend is defined first, and the exact first breaks are then searched for in the vicinity of that trend. In a multistage algorithm, three successive phases are launched, each characterizing a specific signal property. Within each phase, the potential picks and their error ranges are automatically estimated and then used sequentially as the leader in the following phase's picking. The accuracy and robustness of the implemented algorithm are successfully validated on synthetic and real data that pose special challenges for automatic pickers. Comparison of the resulting P-wave arrival times with those picked manually, and with other automatic picking algorithms, demonstrated the reliable performance of the new scheme under different noise conditions. All parameters of our multi-method algorithm are auto-adaptive thanks to the sequential integration of each sub-algorithm's results into the flow. Hence, it is nearly a parameter-free algorithm that is straightforward to implement and demands low computational resources.
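
The AIC component of such pickers is commonly implemented directly on the trace samples: the AIC of splitting the window at index k is computed from the variances of the two segments, and the onset is taken at the minimum. The sketch below is one standard variance-based form (a single-method simplification of the multi-method scheme described, run on a synthetic trace):

```python
import numpy as np

def aic_picker(trace):
    """Variance-based AIC function for first-break picking.

    AIC(k) = k*log(var(x[:k])) + (n-k-1)*log(var(x[k:]));
    the onset estimate is the index minimizing AIC(k).
    """
    x = np.asarray(trace, dtype=float)
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(2, n - 1):
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

# Synthetic trace: low-amplitude noise, then a stronger arrival at sample 200
rng = np.random.default_rng(0)
trace = np.concatenate([0.05 * rng.standard_normal(200),
                        1.0 * rng.standard_normal(100)])
pick = aic_picker(trace)
```

The minimum is sharp when the pre- and post-arrival variances differ strongly, which is why AIC pickers degrade at low signal-to-noise ratio and benefit from the windowing and statistical stages described above.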

  6. Influence of Terrain and Land Cover on the Isotopic Composition of Seasonal Snowpack in Rocky Mountain Headwater Catchments Affected by Bark Beetle Induced Tree Mortality

    NASA Astrophysics Data System (ADS)

    Kipnis, E. L.; Murphy, M.; Klatt, A. L.; Miller, S. N.; Williams, D. G.

    2015-12-01

    Session H103: The Hydrology-Vegetation-Climate Nexus: Identifying Process Interactions and Environmental Shifts in Mountain Catchments. Snowpack accumulation and ablation remain difficult to estimate in forested headwater catchments. How physical terrain and forest cover separately and interactively influence spatial patterns of snow accumulation and ablation largely shapes the hydrologic response to land cover disturbances. Analysis of water isotopes in snowpack provides a powerful tool for examining integrated effects of water vapor exchange, selective redistribution, and melt. Snow water equivalence (SWE), δ2H, δ18O and deuterium excess (D-excess) of snowpack were examined throughout winter 2013-2014 across two headwater catchments impacted by bark beetle induced tree mortality. A USGS 10m DEM and a derived land cover product from 1m NAIP imagery were used to examine the effects of terrain features (e.g., elevation, slope, aspect) and canopy disturbance (e.g., live, bark-beetle killed) as predictors of D-excess, an expression of kinetic isotope effects, in snowpack. A weighting of Akaike's Information Criterion (AIC) values from multiple spatially lagged regression models describing D-excess variation for peak snowpack revealed strong effects of elevation and canopy mortality, and weaker, but significant effects of aspect and slope. Snowpack D-excess was lower in beetle-killed canopy patches compared to live green canopy patches, and at lower compared to high elevation locations, suggesting that integrated isotopic effects of vapor exchange, vertical advection of melted snow, and selective accumulation and redistribution varied systematically across the two catchments. The observed patterns illustrate the potential

  7. Is First-Order Vector Autoregressive Model Optimal for fMRI Data?

    PubMed

    Ting, Chee-Ming; Seghouane, Abd-Krim; Khalid, Muhammad Usman; Salleh, Sh-Hussain

    2015-09-01

    We consider the problem of selecting the optimal orders of vector autoregressive (VAR) models for fMRI data. Many previous studies used a model order of one and ignored that it may vary considerably across data sets depending on different data dimensions, subjects, tasks, and experimental designs. In addition, the classical information criteria (IC) used (e.g., the Akaike IC (AIC)) are biased and inappropriate for the high-dimensional fMRI data typically with a small sample size. We examine the mixed results on the optimal VAR orders for fMRI, especially the validity of the order-one hypothesis, by a comprehensive evaluation using different model selection criteria over three typical data types--a resting state, an event-related design, and a block design data set--with varying time series dimensions obtained from distinct functional brain networks. We use a more balanced criterion, Kullback's IC (KIC) based on Kullback's symmetric divergence combining two directed divergences. We also consider the bias-corrected versions (AICc and KICc) to improve VAR model selection in small samples. Simulation results show better small-sample selection performance of the proposed criteria over the classical ones. Both bias-corrected ICs provide more accurate and consistent model order choices than their biased counterparts, which suffer from overfitting, with KICc performing the best. Results on real data show that orders greater than one were selected by all criteria across all data sets for the small to moderate dimensions, particularly from small, specific networks such as the resting-state default mode network and the task-related motor networks, whereas low orders close to one but not necessarily one were chosen for the large dimensions of full-brain networks.
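
Order selection by information criteria can be illustrated on a scalar AR process; this pure-numpy sketch (a simplified, univariate stand-in for the multivariate VAR criteria discussed) fits AR(p) by least squares and scores each order with a Gaussian AIC:

```python
import numpy as np

def fit_ar_aic(x, p):
    """Least-squares AR(p) fit; returns Gaussian AIC = n*log(RSS/n) + 2p."""
    x = np.asarray(x, dtype=float)
    n = len(x) - p  # effective sample after conditioning on p initial values
    X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ coef) ** 2))
    return n * np.log(rss / n) + 2 * p

def select_order(x, max_p=6):
    """Return the order p in 1..max_p minimizing the AIC."""
    aics = {p: fit_ar_aic(x, p) for p in range(1, max_p + 1)}
    return min(aics, key=aics.get)

# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + noise
rng = np.random.default_rng(1)
x = np.zeros(2000)
for t in range(2, len(x)):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()
best = select_order(x)
```

With ample data the criterion rejects order one here; the small-sample bias motivating AICc and KICc appears when n shrinks toward the number of coefficients, which for a d-dimensional VAR(p) is d²p per equation set.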

  8. Possible Causes of a Harbour Porpoise Mass Stranding in Danish Waters in 2005

    PubMed Central

    Wright, Andrew J.; Maar, Marie; Mohn, Christian; Nabe-Nielsen, Jacob; Siebert, Ursula; Jensen, Lasse Fast; Baagøe, Hans J.; Teilmann, Jonas

    2013-01-01

    An unprecedented 85 harbour porpoises stranded freshly dead along approximately 100 km of Danish coastline from 7–15 April, 2005. This total is considerably above the mean weekly stranding rate for the whole of Denmark, both for any time of year, 1.23 animals/week (ranging from 0 to 20 during 2003–2008, excluding April 2005), and specifically in April, 0.65 animals/week (0 to 4, same period). Bycatch was established as the cause of death for most of the individuals through typical indications of fisheries interactions, including net markings in the skin and around the flippers, and loss of tail flukes. Local fishermen confirmed unusually large porpoise bycatch in nets set for lumpfish (Cyclopterus lumpus) and the strandings were attributed to an early lumpfish season. However, lumpfish catches for 2005 were not unusual in terms of season onset, peak or total catch, when compared to 2003–2008. Consequently, human activity was combined with environmental factors and the variation in Danish fisheries landings (determined through a principal component analysis) in a two-part statistical model to assess the correlation of these factors with both the presence of fresh strandings and the numbers of strandings on the Danish west coast. The final statistical model (which was forward selected using Akaike information criterion; AIC) indicated that naval presence is correlated with higher rates of porpoise strandings, particularly in combination with certain fisheries, although it is not correlated with the actual presence of strandings. Military vessels from various countries were confirmed in the area from the 7th April, en route to the largest naval exercise in Danish waters to date (Loyal Mariner 2005, 11–28 April). Although sonar usage cannot be confirmed, it is likely that ships were testing various equipment prior to the main exercise. Thus naval activity cannot be ruled out as a possible contributing factor. PMID:23460787

  9. SU-E-T-399: Determination of the Radiobiological Parameters That Describe the Dose-Response Relations of Xerostomia and Disgeusia From Head and Neck Radiotherapy

    SciTech Connect

    Mavroidis, P; Stathakis, S; Papanikolaou, N; Peixoto Xavier, C; Costa Ferreira, B; Khouri, L; Carmo Lopes, M do

    2014-06-01

    Purpose: To estimate the radiobiological parameters that describe the dose-response relations of xerostomia and disgeusia from head and neck cancer radiotherapy. To identify the organs that are best correlated with the manifestation of those clinical endpoints. Finally, to evaluate the goodness-of-fit by comparing the model predictions against the actual clinical results. Methods: In this study, 349 head and neck cancer patients were included. For each patient the dose volume histograms (DVH) of parotids (separate and combined), mandible, submandibular glands (separate and combined) and salivary glands were calculated. The follow-up of those patients was recorded at different times after the completion of the treatment (7 weeks, 3, 7, 12, 18 and 24 months). Acute and late xerostomia and acute disgeusia were the clinical endpoints examined. A maximum likelihood fitting was performed to calculate the best estimates of the parameters used by the relative seriality model. The statistical methods of the error distribution, the receiver operating characteristic (ROC) curve, the Pearson's test and the Akaike's information criterion were utilized to assess the goodness-of-fit and the agreement between the pattern of the radiobiological predictions and that of the clinical records. Results: The estimated values of the radiobiological parameters of salivary glands are D50 = 25.2 Gy, γ = 0.52, s = 0.001. The statistical analysis confirmed the clinical validity of those parameters (area under the ROC curve = 0.65 and AIC = 38.3). Conclusion: The analysis proved that the treatment outcome pattern of the patient material can be reproduced by the relative seriality model and the estimated radiobiological parameters. Salivary glands were found to have strong volume dependence (low relative seriality). Diminishing the biologically effective uniform dose to salivary glands below 30 Gy may significantly reduce the risk of complications to the patients irradiated for head and neck cancer.
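
The area under the ROC curve reported above can be computed directly from predicted risks and observed outcomes via the rank-sum (Mann-Whitney) identity; the sketch below uses hypothetical values, not data from the study:

```python
import numpy as np

def auc_rank(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    labels = np.asarray(labels)
    scores = np.asarray(scores, float)
    ranks = scores.argsort().argsort() + 1.0   # ranks starting at 1 (no tie handling)
    n_pos = labels.sum()
    n_neg = labels.size - n_pos
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical NTCP predictions vs. observed complication status (1 = complication)
observed = [0, 0, 1, 0, 1, 1, 0, 1]
predicted_risk = [0.10, 0.20, 0.15, 0.30, 0.40, 0.55, 0.25, 0.35]
print(auc_rank(observed, predicted_risk))  # → 0.8125
```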

  10. Nonstationarity in the occurrence rate of floods in the Tarim River basin, China, and related impacts of climate indices

    NASA Astrophysics Data System (ADS)

    Gu, Xihui; Zhang, Qiang; Singh, Vijay P.; Chen, Xi; Liu, Lin

    2016-07-01

    Amplification of floods in Xinjiang, China, has been observed, but reports on their changing properties and underlying mechanisms are not available. In this study, occurrence rates of floods in the Tarim River basin, the largest inland arid river basin in China, were analyzed using the kernel density estimation technique and the bootstrap resampling method. Also analyzed were the occurrence rates of precipitation extremes using the POT (Peak over Threshold)-based sampling method. Both stationary and non-stationary models were developed using GAMLSS (Generalized Additive Models for Location, Scale and Shape) to model flood frequency with time, climate index, precipitation and temperature as major predictors. Results indicated: (1) two periods with increasing occurrence of floods, i.e., the late 1960s and the late 1990s, with considerable fluctuations around 2-3 flood events during time intervals between the late 1960s and the late 1990s; (2) changes in the occurrence rates of floods were subject to nonstationarity. A persistent increase of flood frequency and magnitude was observed during the 1990s, reaching a peak value; (3) AMO (Atlantic Multidecadal Oscillation) and AO (Arctic Oscillation) in winter were the key climate indices influencing the occurrence rates of floods. However, NAO (North Atlantic Oscillation) and SOI (Southern Oscillation Index) are the two principal factors that influence the occurrence rates of regional floods. The AIC (Akaike Information Criterion) values indicated that, compared to the influence of climate indices, occurrence rates of floods seemed to be more sensitive to temperature and precipitation changes. Results of this study are important for flood management and development of mitigation measures.

  11. Landscape conditions predisposing grizzly bears to conflicts on private agricultural lands in the western USA

    USGS Publications Warehouse

    Wilson, S.M.; Madel, M.J.; Mattson, D.J.; Graham, J.M.; Merrill, T.

    2006-01-01

    We used multiple logistic regression to model how different landscape conditions contributed to the probability of human-grizzly bear conflicts on private agricultural ranch lands. We used locations of livestock pastures, traditional livestock carcass disposal areas (boneyards), beehives, and wetland-riparian associated vegetation to model the locations of 178 reported human-grizzly bear conflicts along the Rocky Mountain East Front, Montana, USA during 1986-2001. We surveyed 61 livestock producers in the upper Teton watershed of north-central Montana, to collect spatial and temporal data on livestock pastures, boneyards, and beehives for the same period, accounting for changes in livestock and boneyard management and beehive location and protection, for each season. We used 2032 random points to represent the null hypothesis of random location relative to potential explanatory landscape features, and used Akaike's Information Criterion (AIC/AICc) and Hosmer-Lemeshow goodness-of-fit statistics for model selection. We used a resulting "best" model to map contours of predicted probabilities of conflict, and used this map for verification with an independent dataset of conflicts to provide additional insights regarding the nature of conflicts. The presence of riparian vegetation and distances to spring, summer, and fall sheep or cattle pastures, calving and sheep lambing areas, unmanaged boneyards, and fenced and unfenced beehives were all associated with the likelihood of human-grizzly bear conflicts. Our model suggests that collections of attractants concentrated in high quality bear habitat largely explain broad patterns of human-grizzly bear conflicts on private agricultural land in our study area. © 2005 Elsevier Ltd. All rights reserved.

  12. Visibility Modeling and Forecasting for Abu Dhabi using Time Series Analysis Method

    NASA Astrophysics Data System (ADS)

    Eibedingil, I. G.; Abula, B.; Afshari, A.; Temimi, M.

    2015-12-01

    Land-atmosphere interactions (their strength, directionality, and evolution) are one of the main sources of uncertainty in contemporary climate modeling. Aerosols and dust play a particularly crucial role in sustaining and modulating land-atmosphere interactions. Aerosols are tiny particles suspended in the air ranging from a few nanometers to a few hundred micrometers in diameter. Furthermore, the amount of dust and fog in the atmosphere is an important determinant of visibility, which is another dimension of land-atmosphere interactions. Visibility affects all forms of traffic: aviation, land transport, and sailing. Being able to predict changes in visibility in advance enables the relevant authorities to take necessary actions before disaster strikes. The Time Series Analysis (TSA) method is an emerging technique for modeling and forecasting the behavior of land-atmosphere interactions, including visibility. This research assesses the dynamics and evolution of visibility around Abu Dhabi International Airport (+24.4320 latitude, +54.6510 longitude, and 27 m elevation) using mean daily visibility and mean daily wind speed. TSA was first used to model and forecast the visibility, and then the transfer function model was applied, considering the wind speed as an exogenous variable. Using the Akaike Information Criterion (AIC) and the Mean Absolute Percentage Error (MAPE) as statistical criteria, two forecasting models, namely a univariate time series model and a transfer function model, were developed to forecast the visibility around Abu Dhabi International Airport for three weeks. The transfer function model improved the MAPE of the forecast significantly.
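
The MAPE criterion used above to compare the two forecasting models is straightforward to compute; a small sketch with hypothetical visibility values (not data from the study):

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent (assumes no zero actuals)."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Hypothetical 5-day visibility forecasts (km) from two competing models
observed = [8.0, 6.0, 9.0, 7.0, 10.0]
univariate = [7.0, 7.5, 8.0, 6.0, 9.0]
transfer_fn = [7.8, 6.2, 8.8, 7.1, 9.7]

print(mape(observed, univariate), mape(observed, transfer_fn))
```

The model with the lower MAPE (here the hypothetical transfer-function forecast) would be preferred.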

  13. Fine-Scale Mapping by Spatial Risk Distribution Modeling for Regional Malaria Endemicity and Its Implications under the Low-to-Moderate Transmission Setting in Western Cambodia.

    PubMed

    Okami, Suguru; Kohtake, Naohiko

    2016-01-01

    The disease burden of malaria has decreased as malaria elimination efforts progress. The mapping approach that uses spatial risk distribution modeling needs some adjustment and reinvestigation in accordance with situational changes. Here we applied a mathematical modeling approach for standardized morbidity ratio (SMR) calculated by annual parasite incidence using routinely aggregated surveillance reports, environmental data such as remote sensing data, and non-environmental anthropogenic data to create fine-scale spatial risk distribution maps of western Cambodia. Furthermore, we incorporated a combination of containment status indicators into the model to demonstrate spatial heterogeneities of the relationship between containment status and risks. The explanatory model was fitted to estimate the SMR of each area (adjusted Pearson correlation coefficient R2 = 0.774; Akaike information criterion AIC = 149.423). A Bayesian modeling framework was applied to estimate the uncertainty of the model and cross-scale predictions. Fine-scale maps were created by the spatial interpolation of estimated SMRs at each village. Compared with geocoded case data, corresponding predicted values showed conformity [Spearman's rank correlation r = 0.662 in the inverse distance weighted interpolation and 0.645 in ordinary kriging (95% confidence intervals of 0.414-0.827 and 0.368-0.813, respectively), Welch's t-test, not significant]. The proposed approach successfully explained regional malaria risks, and fine-scale risk maps were created under low-to-moderate malaria transmission settings where reinvestigations of existing risk modeling approaches were needed. Moreover, different representations of simulated outcomes of containment status indicators for respective areas provided useful insights for tailored interventional planning, considering regional malaria endemicity. PMID:27415623
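
Spatial interpolation of village-level estimates by inverse distance weighting, one of the two methods used above, can be sketched as follows; the coordinates and SMR values are hypothetical:

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse distance weighted interpolation at query points.
    xy_known: (n, 2) coordinates; values: (n,); xy_query: (m, 2)."""
    xy_known = np.asarray(xy_known, float)
    values = np.asarray(values, float)
    out = []
    for q in np.asarray(xy_query, float):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:                    # query coincides with a sample point
            out.append(values[d.argmin()])
            continue
        w = 1.0 / d ** power
        out.append(np.sum(w * values) / np.sum(w))
    return np.array(out)

# Hypothetical village-level SMR estimates at three sites
sites = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
smr = [1.0, 3.0, 3.0]
print(idw(sites, smr, [(0.5, 0.0), (0.0, 0.0)]))
```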

  14. Statistical assessment of bi-exponential diffusion weighted imaging signal characteristics induced by intravoxel incoherent motion in malignant breast tumors

    PubMed Central

    Wong, Oi Lei; Lo, Gladys G.; Chan, Helen H. L.; Wong, Ting Ting; Cheung, Polly S. Y.

    2016-01-01

    Background The purpose of this study is to statistically assess whether the bi-exponential intravoxel incoherent motion (IVIM) model better characterizes the diffusion weighted imaging (DWI) signal of malignant breast tumors than the mono-exponential Gaussian diffusion model. Methods 3 T DWI data of 29 malignant breast tumors were retrospectively included. Linear least-square mono-exponential fitting and segmented least-square bi-exponential fitting were used for apparent diffusion coefficient (ADC) and IVIM parameter quantification, respectively. The F-test and Akaike Information Criterion (AIC) were used to statistically assess the preference of the mono-exponential and bi-exponential models using region-of-interest (ROI)-averaged and voxel-wise analysis. Results For ROI-averaged analysis, 15 tumors were significantly better fitted by a bi-exponential function and 14 tumors exhibited mono-exponential behavior. The calculated ADC, D (true diffusion coefficient) and f (pseudo-diffusion fraction) showed no significant differences between mono-exponential and bi-exponential preferable tumors. Voxel-wise analysis revealed that 27 tumors contained more voxels exhibiting mono-exponential DWI decay while only 2 tumors presented more bi-exponential decay voxels. ADC was consistently and significantly larger than D for both ROI-averaged and voxel-wise analysis. Conclusions Although the presence of the IVIM effect in malignant breast tumors could be suggested, statistical assessment shows that bi-exponential fitting does not necessarily better represent the DWI signal decay in breast cancer under a clinically typical acquisition protocol and signal-to-noise ratio (SNR). Our study indicates the importance of statistically examining the breast cancer DWI signal characteristics in practice. PMID:27709078
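
The mono- vs. bi-exponential comparison by AIC can be illustrated with a least-squares sketch on simulated IVIM-like data; the parameter values and b-value sampling are hypothetical, and AIC is computed here from the residual sum of squares rather than a full likelihood:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
b = np.array([0, 50, 100, 200, 400, 600, 800, 1000], float)  # b-values, s/mm^2

def mono(b, S0, D):
    return S0 * np.exp(-b * D)

def biexp(b, S0, f, Dstar, D):
    return S0 * (f * np.exp(-b * Dstar) + (1 - f) * np.exp(-b * D))

# Simulated IVIM-like signal (hypothetical parameters), with mild noise
S = biexp(b, 100.0, 0.15, 0.02, 0.001) + rng.normal(0, 0.5, b.size)

def aic_ls(residuals, k):
    """AIC for least-squares fits: n*ln(RSS/n) + 2k."""
    n = residuals.size
    return n * np.log(np.sum(residuals**2) / n) + 2 * k

p_mono, _ = curve_fit(mono, b, S, p0=[100, 0.001])
p_bi, _ = curve_fit(biexp, b, S, p0=[100, 0.1, 0.01, 0.001],
                    bounds=([0, 0, 0, 0], [np.inf, 1, 1, 0.01]))

aic_mono = aic_ls(S - mono(b, *p_mono), 2)
aic_bi = aic_ls(S - biexp(b, *p_bi), 4)
print(aic_mono, aic_bi)   # the lower AIC identifies the preferred model
```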

  15. Improving the Accuracy of Demographic and Molecular Clock Model Comparison While Accommodating Phylogenetic Uncertainty

    PubMed Central

    Baele, Guy; Lemey, Philippe; Bedford, Trevor; Rambaut, Andrew; Suchard, Marc A.; Alekseyenko, Alexander V.

    2012-01-01

    Recent developments in marginal likelihood estimation for model selection in the field of Bayesian phylogenetics and molecular evolution have emphasized the poor performance of the harmonic mean estimator (HME). Although these studies have shown the merits of new approaches applied to standard normally distributed examples and small real-world data sets, not much is currently known concerning the performance and computational issues of these methods when fitting complex evolutionary and population genetic models to empirical real-world data sets. Further, these approaches have not yet seen widespread application in the field due to the lack of implementations of these computationally demanding techniques in commonly used phylogenetic packages. We here investigate the performance of some of these new marginal likelihood estimators, specifically, path sampling (PS) and stepping-stone (SS) sampling for comparing models of demographic change and relaxed molecular clocks, using synthetic data and real-world examples for which unexpected inferences were made using the HME. Given the drastically increased computational demands of PS and SS sampling, we also investigate a posterior simulation-based analogue of Akaike's information criterion (AIC) through Markov chain Monte Carlo (MCMC), termed the AICM, a model comparison approach that shares with the HME the appealing feature of having a low computational overhead over the original MCMC analysis. We confirm that the HME systematically overestimates the marginal likelihood and fails to yield reliable model classification and show that the AICM performs better and may be a useful initial evaluation of model choice but that it is also, to a lesser degree, unreliable. We show that PS and SS sampling substantially outperform these estimators and adjust the conclusions made concerning previous analyses for the three real-world data sets that we reanalyzed. The methods used in this article are now available in BEAST, a powerful user

  16. Detecting the small island effect and nestedness of herpetofauna of the West Indies.

    PubMed

    Gao, De; Perry, Gad

    2016-08-01

    To detect the small island effect (SIE) and nestedness patterns of herpetofauna of the West Indies, we derived and updated data on the presence/absence of herpetofauna in this region from recently published reviews. We applied regression-based analyses, including linear regression and piecewise regressions with two and three segments, to detect the SIE, and then used Akaike's information criterion (AIC) to select the best model. We used the NODF (a nestedness metric based on overlap and decreasing fill) to quantify nestedness and employed two null models to determine significance. Moreover, a random sampling effort was made to infer the degree of nestedness at portions of the entire community. We found that piecewise regression with three segments performed best, suggesting the species-area relationships possess three different patterns that result from two area thresholds: a first one, delimiting the SIE, and a second one, delimiting evolutionary processes. We also found that taxa with lower resource requirements, higher dispersal ability, and stronger adaptation to the environment generally displayed lower corresponding threshold values, indicating that such taxonomic groups end the SIE period earlier and begin in situ speciation sooner as island size increases. Moreover, the traditional two-segment piecewise regression method may yield poor estimates of both the slope and the threshold value of the SIE. Therefore, we suggest that previous SIE detection work conducted with the two-segment piecewise regression method, ignoring the possibility of three segments, needs to be reanalyzed. Antinestedness occurred in the entire system, whereas a high degree of nestedness could still occur in portions within the region. Nestedness may still be applicable to conservation planning at portions even if it is antinested at the regional scale. However, nestedness may not be applicable to conservation planning at the regional scale even if nestedness does exist
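
A two-segment continuous piecewise regression with an AIC comparison against a simple linear fit, as described above, can be sketched by grid-searching the breakpoint. The data are simulated, and the parameter counts used in the AIC are one reasonable convention among several:

```python
import numpy as np

rng = np.random.default_rng(1)
log_area = np.sort(rng.uniform(0, 6, 120))
# Simulated SIE-like pattern: flat richness below a threshold, rising above it
true_break = 2.0
richness = 5 + 4 * np.clip(log_area - true_break, 0, None) + rng.normal(0, 1, 120)

def aic_ls(rss, n, k):
    return n * np.log(rss / n) + 2 * k

n = log_area.size
# Simple linear fit (k = intercept, slope, error variance)
X1 = np.column_stack([np.ones(n), log_area])
_, rss1 = np.linalg.lstsq(X1, richness, rcond=None)[:2]
aic_linear = aic_ls(rss1[0], n, 3)

# Two-segment continuous piecewise fit: grid search over the breakpoint
best = (np.inf, None)
for c in np.linspace(0.5, 5.5, 101):
    X2 = np.column_stack([np.ones(n), log_area, np.clip(log_area - c, 0, None)])
    _, rss2 = np.linalg.lstsq(X2, richness, rcond=None)[:2]
    if rss2.size == 0:                       # rank-deficient candidate, skip
        continue
    cand = aic_ls(rss2[0], n, 5)             # + extra slope and breakpoint
    if cand < best[0]:
        best = (cand, c)

aic_piecewise, bp = best
print(aic_linear, aic_piecewise, bp)
```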

  17. Climate Change-Related Hydrologic Variation Affects Dissolved Organic Carbon Export to the Gulf of Maine

    NASA Astrophysics Data System (ADS)

    Huntington, T. G.; Balch, W. M.; Aiken, G.; Butler, K. D.; Billmire, M.; Roesler, C. S.; Camill, P.; Bourakovsky, A.

    2014-12-01

    Ongoing climate change is affecting the timing and amount of dissolved organic carbon (DOC) exported to the Gulf of Maine (GoM) through effects on hydrologic conditions. Climate warming in the northeast United States has resulted in decreases in snowfall amount and increases in the proportion of annual precipitation that falls as rain compared with snow. Warming has resulted in an increase in runoff during winter and earlier snowmelt and associated high spring flow. Increases in annual precipitation have resulted in increases in annual runoff. Increases in flashiness in some rivers have resulted in higher variability in daily runoff. DOC fluxes were estimated for water years 1950 through 2012 in eight rivers draining to the GoM that had long-term discharge data and data for DOC during all months of the year. These estimates used LOADEST to fit a seasonally-adjusted concentration-discharge relation. The adjusted maximum likelihood estimation (AMLE) method was used to estimate loads. One of several predefined regression models evaluated in LOADEST was selected based on the Akaike information criterion (AIC) for each river. This analysis assumed stationarity in the concentration-discharge relations. The proportion of total annual DOC exported during winter has increased. The proportion of DOC exported during March and April has also increased and the proportion exported during May has decreased in association with earlier snowmelt runoff and earlier recession to summer low flow. The total annual DOC exported by these rivers increased significantly from 1950 to 2012. The increase in flashiness has increased daily variability in DOC export in some rivers. Changes in the timing and amount of DOC exported to the near coastal ocean may influence marine biogeochemistry including the development of nuisance and harmful algal blooms, carbon sequestration, and the interpretation of satellite-derived ocean color. Terrestrially derived DOC exported to the marine environment

  18. Modeling Heteroscedasticity of Wind Speed Time Series in the United Arab Emirates

    NASA Astrophysics Data System (ADS)

    Kim, H. Y.; Marpu, P. R.; Ouarda, T.

    2014-12-01

    There has been a growing interest in wind resources in the Gulf region, not only for evaluating wind energy potential, but also for understanding and forecasting changes in wind as a regional climate variable. In particular, time-varying variance (the second-order moment), or heteroscedasticity, in wind time series is important to investigate, since high variance causes turbulence, which affects wind power potential and may lead to structural changes in wind turbines. Nevertheless, the conditional variance of wind time series has rarely been explored, especially in the Gulf region. Therefore, the seasonal autoregressive integrated moving average-generalized autoregressive conditional heteroscedasticity (SARIMA-GARCH) model is applied to observed wind data in the United Arab Emirates (UAE). This model accounts for the apparent seasonality present in wind time series and for the heteroscedasticity in the residuals indicated by the Engle test, in order to understand and forecast changes in the conditional variance of wind time series. In this study, the autocorrelation function of daily average wind speed time series obtained from seven stations within the UAE (Al Aradh, Al Mirfa, Al Wagan, East of Jebel Haffet, Madinat Zayed, Masdar City, Sir Bani Yas Island) is inspected to fit a SARIMA model. The best SARIMA model is selected according to the minimum Akaike Information Criterion (AIC) and based on the residuals of the model. Then, the GARCH model is applied to the remaining residuals to capture the conditional variance of the SARIMA model. Results indicate that the SARIMA-GARCH model provides a good fit to wind data in the UAE.
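
Engle's ARCH test for heteroscedastic residuals, mentioned above, is available in statsmodels; a sketch on simulated ARCH(1) residuals follows (the `nlags` keyword assumes a recent statsmodels version):

```python
import numpy as np
from statsmodels.stats.diagnostic import het_arch

rng = np.random.default_rng(7)
n = 1000
e = np.zeros(n)
# Simulated ARCH(1) residuals: today's variance depends on yesterday's shock
for t in range(1, n):
    sigma2 = 0.2 + 0.7 * e[t - 1] ** 2
    e[t] = np.sqrt(sigma2) * rng.standard_normal()

lm_stat, lm_pvalue, _, _ = het_arch(e, nlags=5)
print(lm_pvalue)   # a tiny p-value rejects homoscedasticity, motivating a GARCH step
```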

  19. Characterizing the profile of muscle deoxygenation during ramp incremental exercise in young men.

    PubMed

    Spencer, Matthew D; Murias, Juan M; Paterson, Donald H

    2012-09-01

    This study characterized the profile of near-infrared spectroscopy (NIRS)-derived muscle deoxygenation (Δ[HHb]) and the tissue oxygenation index (TOI) as a function of absolute (PO(ABS)) and normalized power output (%PO) or oxygen consumption (%VO(2)) during incremental cycling exercise. Eight men (24 ± 5 years) each performed two fatigue-limited ramp incremental cycling tests (20 W min(-1)), during which pulmonary VO(2), Δ[HHb] and TOI were measured continuously. Responses from the two tests were averaged, and the TOI (%) and normalized Δ[HHb] (%Δ[HHb]) were plotted against %VO(2), %PO and PO(ABS). The overall responses were modelled using a sigmoid regression (y = f0 + A/(1 + e^(-(-c + dx)))) and a piecewise 'double-linear' function of the predominant adjustment of %Δ[HHb] or TOI observed throughout the middle portion of exercise and the 'plateau' that followed. In ~85% of cases, the corrected Akaike Information Criterion (AICc) was smaller (suggesting one model favoured) for the 'double-linear' compared with the sigmoid regression for both %Δ[HHb] and TOI. Furthermore, the f0 and A estimates from the sigmoid regressions of %Δ[HHb] yielded unrealistically large projected peak (f0 + A) values (%VO(2p) 114.3 ± 17.5; %PO 113.3 ± 9.5; PO(ABS) 113.5 ± 9.8), suggesting that the sigmoid model does not accurately describe the underlying physiological responses in all subjects and thus may not be appropriate for comparative purposes. Alternatively, the present study proposes that the profile of %Δ[HHb] and TOI during ramp incremental exercise may be more accurately described as consisting of three distinct phases, in which there is little adjustment early in the ramp, the predominant increase in %Δ[HHb] (decrease in TOI) is approximately linear, and an approximately linear 'plateau' follows.
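
The corrected AIC used above penalizes extra parameters more heavily at small n; a minimal sketch with hypothetical fit results for two competing models (the RSS values and parameter counts are illustrative, not from the study):

```python
import numpy as np

def aicc(n, k, rss):
    """Small-sample corrected AIC for a least-squares model:
    AICc = n*ln(RSS/n) + 2k + 2k(k+1)/(n - k - 1)."""
    aic = n * np.log(rss / n) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)

# Hypothetical fits to the same n = 40 data points:
# sigmoid (4 parameters) vs. 'double-linear' (5 parameters)
print(aicc(40, 4, rss=12.0), aicc(40, 5, rss=9.0))
```

Here the five-parameter model earns its extra parameter: its AICc is lower despite the stiffer correction term.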

  20. Climatic patterns in the establishment of wintering areas by North American migratory birds.

    PubMed

    Pérez-Moreno, Heidi; Martínez-Meyer, Enrique; Soberón Mainero, Jorge; Rojas-Soto, Octavio

    2016-04-01

    Long-distance migration in birds is relatively well studied in nature; however, one aspect of this phenomenon that remains poorly understood is the pattern of distribution presented by species during arrival to and establishment of wintering areas. Some studies suggest that the selection of areas in winter is somehow determined by climate, given its influence on both the distribution of bird species and their resources. We analyzed whether different migrant passerine species of North America present climatic preferences during arrival to and departure from their wintering areas. We used ecological niche modeling to generate monthly potential climatic distributions for 13 migratory bird species during the winter season by combining the locations recorded per month with four environmental layers. We calculated monthly coefficients of climate variation and then compared two GLMs (generalized linear models), evaluated with the AIC (Akaike information criterion), to describe how these coefficients varied over the course of the season, as a measure of the patterns of establishment in the wintering areas. For 11 species, the sites show nonlinear patterns of variation in climatic preferences, with low coefficients of variation at the beginning and end of the season and higher values found in the intermediate months. The remaining two species analyzed showed a different climatic pattern of selective establishment of wintering areas, probably due to taxonomic discrepancy, which would affect their modeled winter distribution. Patterns of establishment of wintering areas in the species showed a climatic preference at the macroscale, suggesting that individuals of several species actively select wintering areas that meet specific climatic conditions. This probably gives them an advantage over the winter and during the return to breeding areas. As these areas become full of migrants, alternative suboptimal sites are occupied. Nonrandom winter area selection may also have

  1. Temporal and spatial characteristics of extreme precipitation events in the Midwest of Jilin Province based on multifractal detrended fluctuation analysis method and copula functions

    NASA Astrophysics Data System (ADS)

    Guo, Enliang; Zhang, Jiquan; Si, Ha; Dong, Zhenhua; Cao, Tiehua; Lan, Wu

    2016-08-01

    Environmental changes have brought about significant changes and challenges to water resources and management around the world; these include increasing climate variability, land use change, intensive agriculture, rapid urbanization and industrial development, and especially much more frequent extreme precipitation events, all of which greatly affect water resources and socioeconomic development. In this study, we take extreme precipitation events in the Midwest of Jilin Province as an example; daily precipitation data during 1960-2014 are used. The threshold of extreme precipitation events is defined by the multifractal detrended fluctuation analysis (MF-DFA) method. Extreme precipitation (EP), extreme precipitation ratio (EPR), and intensity of extreme precipitation (EPI) are selected as the extreme precipitation indicators, and then the Kolmogorov-Smirnov (K-S) test is employed to determine the optimal probability distribution function of each extreme precipitation indicator. On this basis, a nonparametric estimation method for copulas and the Akaike Information Criterion (AIC) method are adopted to determine the bivariate copula function. Finally, we analyze the characteristics of the single-variable extremes and the bivariate joint probability distribution of the extreme precipitation events. The results show that the threshold of extreme precipitation events in semi-arid areas is far less than that in subhumid areas. The extreme precipitation frequency shows a significant decline while the extreme precipitation intensity shows a trend of growth; there are significant differences in the spatiotemporal distribution of extreme precipitation events. The joint return period gets shorter from west to east. The spatial distribution of the co-occurrence return period shows the opposite trend, and it is longer than the joint return period.
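
The K-S-based choice of a marginal distribution for an extreme precipitation indicator can be sketched with scipy; the sample below is simulated, and note that K-S p-values computed with parameters fitted from the same sample are optimistic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical extreme precipitation sample (mm), simulated from a gamma law
sample = stats.gamma.rvs(a=3.0, scale=20.0, size=200, random_state=rng)

candidates = {
    "gamma": stats.gamma,
    "lognorm": stats.lognorm,
    "weibull_min": stats.weibull_min,
}
results = {}
for name, dist in candidates.items():
    params = dist.fit(sample)                      # maximum-likelihood fit
    results[name] = stats.kstest(sample, dist.name, args=params).pvalue

print(results)   # larger p-value = less evidence against that candidate
```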

  3. Effects of reproductive condition, roost microclimate, and weather patterns on summer torpor use by a vespertilionid bat

    PubMed Central

    Johnson, Joseph S; Lacki, Michael J

    2014-01-01

    A growing number of mammal species are recognized as heterothermic, capable of maintaining a high core body temperature or entering a state of metabolic suppression known as torpor. Small mammals can achieve large energetic savings when torpid, but they are also subject to ecological costs. Studying torpor use in an ecological and physiological context can help elucidate relative costs and benefits of torpor to different groups within a population. We measured skin temperatures of 46 adult Rafinesque's big-eared bats (Corynorhinus rafinesquii) to evaluate thermoregulatory strategies of a heterothermic small mammal during the reproductive season. We compared daily average and minimum skin temperatures as well as the frequency, duration, and depth of torpor bouts of sex and reproductive classes of bats inhabiting day-roosts with different thermal characteristics. We evaluated roosts with microclimates colder (caves) and warmer (buildings) than ambient air temperatures, as well as roosts with intermediate conditions (trees and rock crevices). Using Akaike's information criterion (AIC), we found that different statistical models best predicted various characteristics of torpor bouts. While the type of day-roost best predicted the average number of torpor bouts that bats used each day, current weather variables best predicted daily average and minimum skin temperatures of bats, and reproductive condition best predicted average torpor bout depth and the average amount of time spent torpid each day by bats. Finding that different models best explain varying aspects of heterothermy illustrates the importance of torpor to both reproductive and nonreproductive small mammals and emphasizes the multifaceted nature of heterothermy and the need to collect data on numerous heterothermic response variables within an ecophysiological context. PMID:24558571
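
When several candidate models are compared by AIC, as above, Akaike weights express the relative support for each; a sketch with hypothetical AIC scores and model names (not values from the study):

```python
import numpy as np

def akaike_weights(aic_values):
    """Delta-AIC and Akaike weights: w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2)."""
    aic = np.asarray(aic_values, float)
    delta = aic - aic.min()
    w = np.exp(-0.5 * delta)
    return delta, w / w.sum()

# Hypothetical AIC scores for three candidate torpor models
models = ["roost type", "weather", "reproductive condition"]
delta, weights = akaike_weights([212.4, 210.1, 215.9])
for m, d, w in zip(models, delta, weights):
    print(f"{m}: dAIC={d:.1f}, w={w:.3f}")
```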

  4. Assessment and Selection of Competing Models for Zero-Inflated Microbiome Data

    PubMed Central

    Xu, Lizhen; Paterson, Andrew D.; Turpin, Williams; Xu, Wei

    2015-01-01

    Typical data in a microbiome study consist of the operational taxonomic unit (OTU) counts that have the characteristic of excess zeros, which are often ignored by investigators. In this paper, we compare the performance of different competing methods to model data with zero inflated features through extensive simulations and application to a microbiome study. These methods include standard parametric and non-parametric models, hurdle models, and zero inflated models. We examine varying degrees of zero inflation, with or without dispersion in the count component, as well as different magnitude and direction of the covariate effect on structural zeros and the count components. We focus on the assessment of type I error, power to detect the overall covariate effect, measures of model fit, and bias and effectiveness of parameter estimations. We also evaluate the abilities of model selection strategies using the Akaike information criterion (AIC) or the Vuong test to identify the correct model. The simulation studies show that hurdle and zero inflated models have well controlled type I errors, higher power, better goodness of fit measures, and are more accurate and efficient in the parameter estimation. In addition, the hurdle models have goodness of fit and parameter estimation for the count component similar to their corresponding zero inflated models. However, the estimation and interpretation of the parameters for the zero components differ, and hurdle models are more stable when structural zeros are absent. We then discuss the model selection strategy for zero inflated data and implement it in a gut microbiome study of > 400 independent subjects. PMID:26148172
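
A Poisson vs. zero-inflated Poisson comparison by AIC can be sketched by writing out both likelihoods and fitting them with scipy; the counts are simulated, and intercept-only models are used for brevity:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

rng = np.random.default_rng(5)
n = 500
# Simulated zero-inflated counts: 40% structural zeros, Poisson(3) otherwise
structural = rng.random(n) < 0.4
y = np.where(structural, 0, rng.poisson(3.0, n))

def nll_poisson(theta):
    lam = np.exp(theta[0])
    return np.sum(lam - y * np.log(lam) + gammaln(y + 1))

def nll_zip(theta):
    lam, p = np.exp(theta[0]), expit(theta[1])   # p = structural-zero probability
    ll_zero = np.log(p + (1 - p) * np.exp(-lam))
    ll_pos = np.log(1 - p) + y * np.log(lam) - lam - gammaln(y + 1)
    return -np.sum(np.where(y == 0, ll_zero, ll_pos))

fit_p = minimize(nll_poisson, [0.0], method="Nelder-Mead")
fit_z = minimize(nll_zip, [0.0, 0.0], method="Nelder-Mead")
aic_p = 2 * 1 + 2 * fit_p.fun        # AIC = 2k + 2*NLL
aic_z = 2 * 2 + 2 * fit_z.fun
print(aic_p, aic_z)
```

With a genuine excess of zeros the ZIP model should show the markedly lower AIC.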

  5. Stochastic approaches for time series forecasting of boron: a case study of Western Turkey.

    PubMed

    Durdu, Omer Faruk

    2010-10-01

    In the present study, seasonal and non-seasonal predictions of boron concentration time series data for the period 1996-2004 from the Büyük Menderes river in western Turkey are addressed by means of linear stochastic models. The methodology presented here is to develop adequate linear stochastic models, known as autoregressive integrated moving average (ARIMA) and multiplicative seasonal autoregressive integrated moving average (SARIMA) models, to predict boron content in the Büyük Menderes catchment. Initially, box-whisker plots and Kendall's tau test are used to identify trends during the study period. The measurement locations do not show a significant overall trend in boron concentrations, though marginal increasing and decreasing trends are observed for certain periods at some locations. The ARIMA modeling approach involves three steps: model identification, parameter estimation, and diagnostic checking. In the model identification step, considering the autocorrelation function (ACF) and partial autocorrelation function (PACF) of the boron data series, different ARIMA models are identified. The model giving the minimum Akaike information criterion (AIC) is selected as the best-fit model. The parameter estimation step indicates that the estimated model parameters are significantly different from zero. The diagnostic check step is applied to the residuals of the selected ARIMA models, and the results indicate that the residuals are independent, normally distributed, and homoscedastic. For model validation purposes, the predicted results from the best ARIMA models are compared to the observed data. The predicted data show reasonably good agreement with the actual data. The comparison of the mean and variance of 3-year (2002-2004) observed versus predicted data from the selected best models shows that the boron models from the ARIMA modeling approach could be used in a safe manner, since the predicted values from these models preserve the basic
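
    The sample ACF used in the identification step is simple to compute; the following is a minimal sketch (the function name and example series are ours, not from the paper):

```python
def acf(x, max_lag):
    # Sample autocorrelation function: r_k = sum of lagged deviation products
    # over the total squared deviation. Spikes in the ACF/PACF at particular
    # lags guide the choice of ARIMA orders in the identification step.
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    denom = sum(d * d for d in dev)
    return [sum(dev[t] * dev[t + k] for t in range(n - k)) / denom
            for k in range(max_lag + 1)]

# A strongly alternating series shows a large negative lag-1 autocorrelation.
print(acf([1.0, -1.0] * 50, 2))
```

    Lag 0 is always 1 by construction; for the alternating example, lag 1 is close to -1 and lag 2 close to +1.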

  6. Forecast of natural aquifer discharge using a data-driven, statistical approach.

    PubMed

    Boggs, Kevin G; Van Kirk, Rob; Johnson, Gary S; Fairley, Jerry P

    2014-01-01

    In the Western United States, demand for water is often out of balance with limited water supplies. This has led to extensive water rights conflict and litigation. A tool that can reliably forecast natural aquifer discharge months ahead of peak water demand could help water practitioners and managers by providing advanced knowledge of potential water-right mitigation requirements. The timing and magnitude of natural aquifer discharge from the Eastern Snake Plain Aquifer (ESPA) in southern Idaho is accurately forecast 4 months ahead of the peak water demand, which occurs annually in July. An ARIMA time-series model with exogenous predictors (ARIMAX model) was used to develop the forecast. The ARIMAX model fit to a set of training data was assessed using Akaike's information criterion to select the optimal model that forecasts aquifer discharge, given the previous year's discharge and values of the predictor variables. Model performance was assessed by application of the model to a validation subset of data. The Nash-Sutcliffe efficiency for model predictions made on the validation set was 0.57. The predictor variables used in our forecast represent the major recharge and discharge components of the ESPA water budget, including variables that reflect overall water supply and important aspects of water administration and management. Coefficients of variation on the regression coefficients for streamflow and irrigation diversions were all much less than 0.5, indicating that these variables are strong predictors. The model with the highest AIC weight included streamflow, two irrigation diversion variables, and storage. PMID:24571388
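
    The Nash-Sutcliffe efficiency used above to score the validation forecasts (0.57) is straightforward to compute; a minimal sketch (function name and example values ours), where NSE = 1 - SSE / total squared deviation from the observed mean:

```python
def nash_sutcliffe(observed, predicted):
    # NSE = 1 - sum((o - p)^2) / sum((o - mean(o))^2).
    # 1.0 is a perfect forecast; 0.0 means the model does no better than
    # always predicting the observed mean; negative values are worse still.
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_mean = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_mean

obs = [120.0, 150.0, 180.0, 160.0]    # observed discharge (illustrative units)
pred = [118.0, 155.0, 170.0, 158.0]   # model forecast
print(nash_sutcliffe(obs, pred))
```

    A value near 1 indicates the forecast tracks the observations much better than the climatological mean would.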

  7. Improving the accuracy of demographic and molecular clock model comparison while accommodating phylogenetic uncertainty.

    PubMed

    Baele, Guy; Lemey, Philippe; Bedford, Trevor; Rambaut, Andrew; Suchard, Marc A; Alekseyenko, Alexander V

    2012-09-01

    Recent developments in marginal likelihood estimation for model selection in the field of Bayesian phylogenetics and molecular evolution have emphasized the poor performance of the harmonic mean estimator (HME). Although these studies have shown the merits of new approaches applied to standard normally distributed examples and small real-world data sets, not much is currently known concerning the performance and computational issues of these methods when fitting complex evolutionary and population genetic models to empirical real-world data sets. Further, these approaches have not yet seen widespread application in the field due to the lack of implementations of these computationally demanding techniques in commonly used phylogenetic packages. We here investigate the performance of some of these new marginal likelihood estimators, specifically, path sampling (PS) and stepping-stone (SS) sampling for comparing models of demographic change and relaxed molecular clocks, using synthetic data and real-world examples for which unexpected inferences were made using the HME. Given the drastically increased computational demands of PS and SS sampling, we also investigate a posterior simulation-based analogue of Akaike's information criterion (AIC) through Markov chain Monte Carlo (MCMC), a model comparison approach that shares with the HME the appealing feature of having a low computational overhead over the original MCMC analysis. We confirm that the HME systematically overestimates the marginal likelihood and fails to yield reliable model classification and show that the AICM performs better and may be a useful initial evaluation of model choice but that it is also, to a lesser degree, unreliable. We show that PS and SS sampling substantially outperform these estimators and adjust the conclusions made concerning previous analyses for the three real-world data sets that we reanalyzed. The methods used in this article are now available in BEAST, a powerful user

  8. In Vivo Evaluation of Blood Based and Reference Tissue Based PET Quantifications of [11C]DASB in the Canine Brain.

    PubMed

    Van Laeken, Nick; Taylor, Olivia; Polis, Ingeborgh; Neyt, Sara; Kersemans, Ken; Dobbeleir, Andre; Saunders, Jimmy; Goethals, Ingeborg; Peremans, Kathelijne; De Vos, Filip

    2016-01-01

    This first-in-dog study evaluates the use of the PET radioligand [11C]DASB to image the density and availability of the serotonin transporter (SERT) in the canine brain. Imaging the serotonergic system could improve the diagnosis and therapy of multiple canine behavioural disorders. Furthermore, as many similarities are reported between several human neuropsychiatric conditions and naturally occurring canine behavioural disorders, making this tracer available for use in dogs also provides researchers with an interesting non-primate animal model for investigating human disorders. Five adult beagles underwent a 90-minute dynamic PET scan, and arterial whole blood was sampled throughout the scan. For each ROI, the distribution volume (VT), obtained via the one- and two-tissue compartment models (1-TC, 2-TC) and the Logan plot, was calculated, and the goodness of fit was evaluated by the Akaike information criterion (AIC). For the preferred compartmental model, BPND values were estimated and compared with those derived from four reference tissue models: the 4-parameter RTM, SRTM2, MRTM2, and the Logan reference tissue model. The 2-TC model indicated a better fit than the 1-TC model in 61% of the ROIs. The Logan plot produced almost identical VT values and can be used as an alternative. Compared with the 2-TC model, all investigated reference tissue models showed high correlations but small underestimations of the BPND parameter. The highest correlation was achieved with the Logan reference tissue model (Y = 0.9266x + 0.0257; R2 = 0.9722). Therefore, this model can be put forward as a non-invasive standard model for future PET experiments with [11C]DASB in dogs. PMID:26859850

  9. Long lead-time flood forecasting using data-driven modeling approaches

    NASA Astrophysics Data System (ADS)

    Bhatia, N.; He, J.; Srivastav, R. K.

    2014-12-01

    In spite of the numerous structural measures taken against floods, accurate flood forecasting is essential to considerably reduce damages in hazardous areas. The need to produce more accurate flow forecasts motivates researchers to develop advanced, innovative methods. In this study, we propose a hybrid neural network model that exploits the strengths of artificial neural networks (ANNs). The proposed model has two components: i) a dual-ANN model developed using river flows; and ii) a multiple linear regression (MLR) model trained on meteorological data (rainfall and snow on ground). Potential model inputs that best represent the processes of the river basin were selected in a stepwise manner by identifying input-output relationships using a linear approach, partial correlation input selection (PCIS), combined with the Akaike information criterion (AIC). The presented hybrid model was compared with three conventional methods: i) a feed-forward artificial neural network (FF-ANN) using daily river flows; ii) an FF-ANN applied to decomposed river flows (low flow, rising limb, and falling limb of the hydrograph); and iii) a recursive method for daily river flows with a lead time of 7 days. The applicability of the presented model is illustrated with daily river flow data of the Bow River, Canada. Data from 1912 to 1976 were used to train the models, while data from 1977 to 2006 were used to validate them. The results of the study indicate that the proposed model is robust enough to capture the non-linear nature of the hydrograph and proves highly promising for forecasting peak flows (extreme values) well in advance (at higher lead times).

  10. Modeling a habitat suitability index for the eastern fall cohort of Ommastrephes bartramii in the central North Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Chen, Xinjun; Tian, Siquan; Liu, Bilin; Chen, Yong

    2011-05-01

    The eastern fall cohort of the neon flying squid, Ommastrephes bartramii, has been commercially exploited by the Chinese squid jigging fleet in the central North Pacific Ocean since the late 1990s. To understand and identify their optimal habitat, we have developed a habitat suitability index (HSI) model using two potentially important environmental variables, sea surface temperature (SST) and sea surface height anomaly (SSHA), and fishery data from the main fishing ground (165°-180°E) during June and July of 1999-2003. A geometric mean model (GMM), minimum model (MM), and arithmetic weighted model (AWM) with different weights were compared, and the best HSI model was selected using Akaike's information criterion (AIC). The performance of the developed HSI model was evaluated using fishery data for 2004. This study suggests that the highest catch per unit effort (CPUE) and fishing effort are closely related to SST and SSHA. The best SST- and SSHA-based suitability index (SI) regression models were SI_SST-based = 0.7 SI_effort-SST + 0.3 SI_CPUE-SST and SI_SSHA-based = 0.5 SI_effort-SSHA + 0.5 SI_CPUE-SSHA, respectively, showing that fishing effort is more important than CPUE in the estimation of SI. The best HSI model was the AWM, defined as HSI = 0.3 SI_SST-based + 0.7 SI_SSHA-based, indicating that SSHA is more important than SST in estimating the HSI of squid. In 2004, monthly HSI values greater than 0.6 coincided with the distribution of productive fishing grounds and high CPUE in June and July, suggesting that the models perform well. The proposed model provides an important tool in our efforts to develop forecasting capacity for squid spatial dynamics.
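
    The selected AWM composition is a simple weighted average; the sketch below wires together the weights reported in the abstract (function names and the example SI values are our own, for illustration only):

```python
def suitability_index(si_effort, si_cpue, w_effort):
    # SI regression form from the abstract: SI = w*SI_effort + (1 - w)*SI_CPUE.
    return w_effort * si_effort + (1 - w_effort) * si_cpue

def hsi_awm(si_sst_based, si_ssha_based):
    # Best arithmetic weighted model: HSI = 0.3 SI_SST-based + 0.7 SI_SSHA-based.
    return 0.3 * si_sst_based + 0.7 * si_ssha_based

# Example: a grid cell with favourable SSHA but mediocre SST conditions.
si_sst = suitability_index(0.4, 0.5, 0.7)   # SST-based weights: 0.7 / 0.3
si_ssha = suitability_index(0.9, 0.8, 0.5)  # SSHA-based weights: 0.5 / 0.5
print(hsi_awm(si_sst, si_ssha))             # above the 0.6 "productive" cutoff
```

    Because all weights sum to 1, the HSI stays in [0, 1] whenever the input SI values do, which is what makes the 0.6 threshold above interpretable.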

  11. Information Practice and Malpractice.

    ERIC Educational Resources Information Center

    Mintz, Anne P.

    1985-01-01

    Discussion of extent of information malpractice highlights role of information broker, copyrights and fees, special library problems, protection against malpractice, contracts, ready reference risks, education against malpractice, continuing education, personal values, malpractice insurance, information producers, Dun and Bradstreet versus…

  12. Fireworks Information Center

    MedlinePlus

    ... Home / Safety Education / Safety Education Centers En Español Fireworks Information Center This is an information center on ... Video Put Safety First This Fourth of July Fireworks Information What are consumer fireworks and where are ...

  13. Informed consent - adults

    MedlinePlus

    ... state). What Should Occur During the Informed Consent Process? When asking for your informed consent, your doctor ... What is Your Role in the Informed Consent Process? You are an important member of your health ...

  14. Energy information sheets

    SciTech Connect

    1995-07-01

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the public. The Energy Information Sheets was developed to provide general information on various aspects of fuel production, prices, consumption, and capability. Additional information on related subject matter can be found in other Energy Information Administration (EIA) publications as referenced at the end of each sheet.

  15. What Are "National Information Policies?"

    ERIC Educational Resources Information Center

    Horton, Forest Woody, Jr.

    1998-01-01

    Examines basic concepts; policy instruments, language, and processes; information, basic-information, and sector-specific information policies; form, format, and content; information policies related to information resources; policy conflicts; national and government information policies; national information infrastructures; classification of…

  16. Information for disasters, information disasters, and disastrous information.

    PubMed

    McDonnell, Sharon M; Perry, Helen N; McLaughlin, Brooke; McCurdy, Bronwen; Parrish, R Gibson

    2007-01-01

    Information is needed to support humanitarian responses in every phase of a disaster. Participants of a multilateral working group convened to examine how best to meet these information needs. Although information systems based on routine reporting of diseases are desirable because they have the potential to identify trends, these systems usually do not deliver on their promise due to inadequate organization and management to support them. To identify organizational and management characteristics likely to be associated with successful information systems in disaster settings, evaluations of the Integrated Disease Surveillance and Response (IDSR) programs in 12 participating countries were reviewed. Characteristics that were mentioned repeatedly in the evaluations as associated with success were grouped into nine categories: (1) human resources management and supervision; (2) political support; (3) strengthened laboratory capacity; (4) communication and feedback (through many mechanisms); (5) infrastructure and resources; (6) system design and capacity; (7) coordination and partnerships with stakeholders; (8) community input; and (9) evaluation. Selected characteristics and issues within each category are discussed. Based on the review of the IDSR evaluations and selected articles in the published literature, recommendations are provided for improving the short- and long-term organization and management of information systems in humanitarian responses associated with disasters. It is suggested that information systems that follow these recommendations are more likely to yield quality information and be sustainable even in disaster settings.

  17. Mutual Information, Fisher Information, and Efficient Coding.

    PubMed

    Wei, Xue-Xin; Stocker, Alan A

    2016-02-01

    Fisher information is generally believed to represent a lower bound on mutual information (Brunel & Nadal, 1998), a result that is frequently used in the assessment of neural coding efficiency. However, we demonstrate that the relation between these two quantities is more nuanced than previously thought. For example, we find that in the small noise regime, Fisher information actually provides an upper bound on mutual information. Generally our results show that it is more appropriate to consider Fisher information as an approximation rather than a bound on mutual information. We analytically derive the correspondence between the two quantities and the conditions under which the approximation is good. Our results have implications for neural coding theories and the link between neural population coding and psychophysically measurable behavior. Specifically, they allow us to formulate the efficient coding problem of maximizing mutual information between a stimulus variable and the response of a neural population in terms of Fisher information. We derive a signature of efficient coding expressed as the correspondence between the population Fisher information and the distribution of the stimulus variable. The signature is more general than previously proposed solutions that rely on specific assumptions about the neural tuning characteristics. We demonstrate that it can explain measured tuning characteristics of cortical neural populations that do not agree with previous models of efficient coding.

  18. The Measurement of Information.

    ERIC Educational Resources Information Center

    Harmon, Glynn

    1984-01-01

    Views information as residual or catalytic form of energy which regulates other forms of energy in natural and artificial systems. Parallel human information processing (production systems, algorithms, heuristics) and information measurement are discussed. Suggestions for future research in area of parallel information processing include a matrix…

  19. Dialing Up Telecommunications Information.

    ERIC Educational Resources Information Center

    Bates, Mary Ellen

    1993-01-01

    Describes how to find accurate, current information about telecommunications industries, products and services, rates and tariffs, and regulatory information using electronic information resources available from the private and public sectors. A sidebar article provides contact information for producers and service providers. (KRN)

  20. Guideline 2: Informed Consent.

    ERIC Educational Resources Information Center

    American Journal on Mental Retardation, 2000

    2000-01-01

    The second in seven sets of guidelines based on the consensus of experts in the treatment of psychiatric and behavioral problems in mental retardation (MR) focuses on informed consent. Guidelines cover underlying concepts, usual components, informed consent as a process, information to include, what to provide, when to obtain informed consent, and…

  1. Information Services. Miscellaneous Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations, The Hague (Netherlands).

    Papers on audiovisual information resources, the history of technical libraries, online legal information, and information technology for schoolchildren, which were presented at the 1983 International Federation of Library Associations (IFLA) conference, include: (1) "Continuing Issues in the Provision of Audiovisual Information Resources - An…

  2. Seymour: Maryland's Information Retriever.

    ERIC Educational Resources Information Center

    Smith, Barbara G.

    1994-01-01

    Explains the development of an electronic information network in Maryland called Seymour that offers bibliographic records; full-text databases; community information databases; the ability to request information and materials; local, state, and federal information; and access to the Internet. Policy issues are addressed, including user fees and…

  3. Information and the Economy.

    ERIC Educational Resources Information Center

    Zurkowski, Paul G.

    1979-01-01

    Advocates an American Declaration for the information age to guide development of the information industry. The information revolution is discussed in terms of technological advancement, copyright issues, international policy, and telecommunications. Information is compared to money as a universal commodity. (SW)

  4. Mission Medical Information System

    NASA Technical Reports Server (NTRS)

    Johnson-Throop, Kathy A.; Joe, John C.; Follansbee, Nicole M.

    2008-01-01

    This viewgraph presentation gives an overview of the Mission Medical Information System (MMIS). The topics include: 1) What is MMIS?; 2) MMIS Goals; 3) Terrestrial Health Information Technology Vision; 4) NASA Health Information Technology Needs; 5) Mission Medical Information System Components; 6) Electronic Medical Record; 7) Longitudinal Study of Astronaut Health (LSAH); 8) Methods; and 9) Data Submission Agreement (example).

  5. Information: Public or Private?

    ERIC Educational Resources Information Center

    Smith, Jean

    1984-01-01

    Examines policies concerning government-generated information and trend toward privatization of information, i.e., contracting out of government information functions to private sector. The impact these policies may have on public's access to government documents and reports and implications for information professionals are analyzed. A 42-item…

  6. The International Information Revolution.

    ERIC Educational Resources Information Center

    Cleveland, Harlan

    Certain characteristics of information make it a crucial resource in today's world. Unlike material resources such as coal and steel, information is expandable, easily transportable, diffusive, and shareable. Because of these properties of information, the new "information age" has already begun to challenge some of mankind's most comfortable…

  7. Intelligence, Information Technology, and Information Warfare.

    ERIC Educational Resources Information Center

    Davies, Philip H. J.

    2002-01-01

    Addresses the use of information technology for intelligence and information warfare in the context of national security and reviews the status of clandestine collection. Discusses hacking, human agent collection, signal interception, covert action, counterintelligence and security, and communications between intelligence producers and consumers…

  8. INFORM: Library Information at Your Fingertips.

    ERIC Educational Resources Information Center

    Urbanek, Val

    1982-01-01

    User-friendly computer terminals with touch sensitive screens are used in new instruction and information dissemination system operating at Providence Public Library and Rockefeller Library at Brown University. This microcomputer-based turnkey system offers databases built by librarians providing information such as use of library, book reviews,…

  9. Information Resource Management for Industrial Information Officers.

    ERIC Educational Resources Information Center

    Dosa, Marta

    This paper argues that the function of educational programs is to convey a sense of reality and an understanding of the open-endedness of information needs and situations; only such a reality orientation can instill the necessary flexibility in information professionals for effectively managing change. There is a growing consensus among…

  10. Health information seeking in the information society.

    PubMed

    Mukherjee, Abir; Bawden, David

    2012-09-01

    This article is the second student contribution to the Dissertations into Practice feature. It reports on a study that investigated the everyday health information-seeking practices of a small group of the 'general public' and the implications for information-seeking theory and health information provision. The first student article, about the implementation of radio frequency identification (RFID) in a hospital library, was very different, and the two articles illustrate the broad spectrum of possible subjects for the Dissertations into Practice feature. This study was conducted in summer 2011 by Abir Mukherjee for his MSc dissertation in the Library and Information Sciences programme at City University London. Further information and copies of the full dissertation may be obtained from Abir Mukherjee or David Bawden. AM. PMID:22925387

  12. Aquaculture information package

    SciTech Connect

    Boyd, T.; Rafferty, K.

    1998-08-01

    This package of information is intended to provide background information to developers of geothermal aquaculture projects. The material is divided into eight sections and includes information on market and price information for typical species, aquaculture water quality issues, typical species culture information, pond heat loss calculations, an aquaculture glossary, regional and university aquaculture offices and state aquaculture permit requirements. A bibliography containing 68 references is also included.

  13. Geographic Names Information System

    USGS Publications Warehouse

    U.S. Geological Survey

    1984-01-01

    The Geographic Names Information System (GNIS) is an automated data system developed by the U.S. Geological Survey (USGS) to standardize and disseminate information on geographic names. GNIS provides primary information for all known places, features, and areas in the United States identified by a proper name. The information in the system can be manipulated to meet varied needs. You can incorporate information from GNIS into your own data base for special applications.

  14. Quantum information does exist

    NASA Astrophysics Data System (ADS)

    Duwell, Armond

    2008-01-01

    This paper advocates a concept of quantum information whose origins can be traced to Schumacher [1995. Quantum coding. Physical Review A 51, 2738-2747]. The concept of quantum information advocated is elaborated using an analogy to Shannon's theory provided by Schumacher coding. In particular, this paper extends Timpson's [2004. Quantum information theory and the foundations of quantum mechanics. Ph.D. dissertation, University of Oxford. Preprint, quant-ph/0412063] framework for interpreting Shannon information theory to the quantum context. Entanglement fidelity is advocated as the appropriate success criterion for the reproduction of quantum information. The relationship between the Shannon theory and quantum information theory is discussed.

  15. Energy information sheets

    SciTech Connect

    Not Available

    1993-12-02

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. Written for the general public, the EIA publication Energy Information Sheets was developed to provide information on various aspects of fuel production, prices, consumption and capability. The information contained herein pertains to energy data as of December 1991. Additional information on related subject matter can be found in other EIA publications as referenced at the end of each sheet.

  16. Types of quantum information

    SciTech Connect

    Griffiths, Robert B.

    2007-12-15

    Quantum, in contrast to classical, information theory, allows for different incompatible types (or species) of information which cannot be combined with each other. Distinguishing these incompatible types is useful in understanding the role of the two classical bits in teleportation (or one bit in one-bit teleportation), for discussing decoherence in information-theoretic terms, and for giving a proper definition, in quantum terms, of 'classical information.' Various examples (some updating earlier work) are given of theorems which relate different incompatible kinds of information, and thus have no counterparts in classical information theory.

  17. Layers of Information: Geographic Information Systems (GIS).

    ERIC Educational Resources Information Center

    Lucking, Robert A.; Christmann, Edwin P.

    2003-01-01

    Describes the Geographic Information System (GIS) which is capable of storing, manipulating, and displaying data allowing students to explore complex relationships through scientific inquiry. Explains applications of GIS in middle school classrooms and includes assessment strategies. (YDS)

  18. Some statistical features of the aftershock temporal behavior after the M7.4 Izmit earthquake of august 17, 1999 in Turkey

    NASA Astrophysics Data System (ADS)

    Gospodinov, D.; Fajtin, H.; Rangelov, B.; Marekova, E.

    2009-04-01

    An earthquake of magnitude Mw=7.4 struck 8 km southeast of Izmit, Turkey at 3:02 AM local time on August 17, 1999. The earthquake occurred on one of the world's longest and best studied strike-slip (horizontal motion) faults, the east-west trending North Anatolian fault. Seismologists are not able to predict the timing and sizes of individual aftershocks, but stochastic modeling allows determination of probabilities for aftershocks and larger mainshocks during intervals following the mainshock. The most widely applied stochastic model to depict aftershock temporal distribution is the non-homogeneous Poisson process with a decaying intensity, which follows the Modified Omori Formula (MOF) (Utsu, 1961). A more complex model, considering the triggering potential of each aftershock, was developed by Ogata (1988) and named the Epidemic Type Aftershock Sequence (ETAS) model. Gospodinov and Rotondi (2006) elaborated a Restricted Epidemic Type Aftershock Sequence (RETAS) model. The latter follows the general idea that only aftershocks stronger than some cut-off magnitude possess the capability to induce secondary aftershock activity. In this work we consider the RETAS model, for which the conditional intensity function turns out to be λ(t|Ht) = Σ_{ti < t, Mi ≥ Mth} K0 e^{α(Mi - M0)} / (t - ti + c)^p (1), where the summation runs over all aftershocks with magnitude greater than or equal to Mth which took place before time t. Letting Mth take all possible values, one can examine all RETAS model versions between the MOF and the ETAS model on the basis of the Akaike Information Criterion AIC (Akaike, 1974), AIC = -2 max log L + 2k (2), where k is the number of parameters used in the model and log L is the logarithm of the likelihood function. The model with the smallest AIC value is then chosen as providing the best fit. The purpose of this paper is to verify versions of the RETAS model (including the MOF and the
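
    A minimal numerical sketch of Eq. (1) and Eq. (2) follows; this is our own illustrative code, and the parameter values and event list are made up, not fitted values from the study:

```python
import math

def retas_intensity(t, events, K0, alpha, c, p, M0, Mth):
    # Eq. (1): the conditional intensity sums the Omori-type contribution
    # of every aftershock with magnitude Mi >= Mth that occurred before t.
    return sum(K0 * math.exp(alpha * (Mi - M0)) / (t - ti + c) ** p
               for ti, Mi in events if ti < t and Mi >= Mth)

def aic(max_log_lik, k):
    # Eq. (2): AIC = -2 max log L + 2k; the version with the smallest AIC
    # (MOF, RETAS variants, or full ETAS) is preferred.
    return -2.0 * max_log_lik + 2 * k

# One aftershock of magnitude 5.0 at t = 0; illustrative parameter values.
events = [(0.0, 5.0)]
print(retas_intensity(1.0, events, K0=1.0, alpha=0.5, c=1.0, p=1.1,
                      M0=5.0, Mth=4.0))
```

    Raising Mth shrinks the set of triggering events, sliding the model from the full ETAS end of the spectrum toward the single-sequence MOF end.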

  19. Keeping Public Information Public.

    ERIC Educational Resources Information Center

    Kelley, Wayne P.

    1998-01-01

    Discusses the trend toward the transfer of federal government information from the public domain to the private sector. Topics include free access, privatization, information-policy revision, accountability, copyright issues, costs, pricing, and market needs versus public needs. (LRW)

  20. General Information about Melanoma

    MedlinePlus

    ... Screening Research Melanoma Treatment (PDQ®)–Patient Version General Information About Melanoma Go to Health Professional Version Key ... the PDQ Adult Treatment Editorial Board . Clinical Trial Information A clinical trial is a study to answer ...

  1. Evaluating Health Information

    MedlinePlus

    ... information reviewed before it is posted? Be skeptical. Things that sound too good to be true often are. You want current, unbiased information based on research. NIH: National Library of Medicine

  2. Informed consent and routinisation.

    PubMed

    Ploug, Thomas; Holm, Soren

    2013-04-01

    This article introduces the notion of 'routinisation' into discussions of informed consent. It is argued that the routinisation of informed consent poses a threat to the protection of the personal autonomy of a patient through the negotiation of informed consent. On the basis of a large survey, we provide evidence of the routinisation of informed consent in various types of interaction on the internet; among these, the routinisation of consent to the exchange of health related information. We also provide evidence that the extent of the routinisation of informed consent is dependent on the character of the information exchanged, and we uncover a range of causes of routinisation. Finally, the article discusses possible ways of countering the problem of routinisation of informed consent.

  3. DSCOVR Data and Information

    Atmospheric Science Data Center

    2016-08-03

    ... DSCOVR Data and Information Deep Space Climate Observatory ( DSCOVR ) (formerly known as Triana) was originally ... and natural phenomena. This information can be used for climate science applications.   The Earth Polychromatic Imaging ...

  4. Advanced information society(2)

    NASA Astrophysics Data System (ADS)

    Masuyama, Keiichi

    Our modern life is full of information, and information permeates our daily life. Telecommunication networking now extends to the societal, corporate, and individual levels. Although we have only just entered the advanced information society, the business world and our daily lives have been steadily transformed by the advancement of information networks. This advancement strongly influences the economy and will play the main role in the expansion of domestic demand. This paper envisions the coming advanced information society, focusing on the transformation of business life and of daily life, enriched by the spread of everyday information and satellite-delivered visual information, in the development of the intelligent city.

  5. National Health Information Center

    MedlinePlus

    ... About ODPHP Dietary Guidelines Physical Activity Guidelines Health Literacy and Communication Health Care Quality and Patient Safety Healthy People healthfinder health.gov About ODPHP National Health Information Center National Health Information Center The National Health ...

  6. Indiana Health Information Exchange

    Cancer.gov

    The Indiana Health Information Exchange comprises various Indiana health care institutions, was established to help improve patient safety, and is recognized as a best practice for health information exchange.

  7. Public information guidelines

    SciTech Connect

    1986-06-01

    The purpose of these Public Information Guidelines is to provide principles for the implementation of the NWPA mandate and the Mission Plan requirements for the provision of public information. These Guidelines set forth the public information policy to be followed by all Office of Civilian Radioactive Waste Management (OCRWM) performance components. The OCRWM offices should observe these Guidelines in shaping and conducting public information activities.

  8. Computational and human observer image quality evaluation of low dose, knowledge-based CT iterative reconstruction

    SciTech Connect

    Eck, Brendan L.; Fahmi, Rachid; Miao, Jun; Brown, Kevin M.; Zabic, Stanislav; Raihani, Nilgoun; Wilson, David L.

    2015-10-15

    Purpose: The aims of this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo simulation and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, P_C. The Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detectability in IMR was greater than in FBP under all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit
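    A minimal sketch of a Laguerre–Gauss channelized Hotelling observer of the kind described above, run on simulated white-noise images. The channel width, image size, signal shape, and noise level are illustrative assumptions, and the study's internal-noise models are omitted:

```python
import numpy as np

def laguerre_gauss_channels(n, a, num=5):
    """Laguerre-Gauss channel templates on an n x n grid, one column per channel."""
    y, x = np.mgrid[:n, :n] - (n - 1) / 2.0
    r2 = 2.0 * np.pi * (x**2 + y**2) / a**2
    chans = []
    Lp, Lprev = np.ones_like(r2), np.zeros_like(r2)
    for p in range(num):
        u = np.exp(-r2 / 2.0) * Lp
        chans.append((u / np.linalg.norm(u)).ravel())
        # Laguerre polynomial recurrence: (p+1) L_{p+1} = (2p+1-x) L_p - p L_{p-1}
        Lp, Lprev = ((2 * p + 1 - r2) * Lp - p * Lprev) / (p + 1), Lp
    return np.array(chans).T  # shape (n*n, num)

def cho_dprime(present_imgs, absent_imgs, U):
    """Detectability d' from the channelized responses of the two image classes."""
    vs, vn = present_imgs @ U, absent_imgs @ U
    dv = vs.mean(axis=0) - vn.mean(axis=0)
    S = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(S, dv)))

rng = np.random.default_rng(1)
n, trials = 32, 400
y, x = np.mgrid[:n, :n] - (n - 1) / 2.0
signal = 0.5 * np.exp(-(x**2 + y**2) / (2 * 3.0**2))  # low-contrast Gaussian "pin"
noise = rng.normal(size=(2 * trials, n * n))          # white-noise backgrounds
absent = noise[:trials]
present = noise[trials:] + signal.ravel()

U = laguerre_gauss_channels(n, a=15.0)
dp = cho_dprime(present, absent, U)
print(f"d' = {dp:.2f}")
```

    Real CT noise is correlated, so in practice the channel outputs and their covariance would come from reconstructed images rather than white noise.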

  9. Is Information Still Relevant?

    ERIC Educational Resources Information Center

    Ma, Lia

    2013-01-01

    Introduction: The term "information" in information science does not share the characteristics of those of a nomenclature: it does not bear a generally accepted definition and it does not serve as the bases and assumptions for research studies. As the data deluge has arrived, is the concept of information still relevant for information…

  10. Medical Information Systems.

    ERIC Educational Resources Information Center

    Smith, Kent A.

    1986-01-01

    Description of information services from the National Library of Medicine (NLM) highlights a new system for retrieving information from NLM's databases (GRATEFUL MED); a formal Regional Medical Library Network; DOCLINE; the Unified Medical Language System; and Integrated Academic Information Management Systems. Research and development and the…

  11. Europe and Information Science.

    ERIC Educational Resources Information Center

    Ingwersen, Peter

    1997-01-01

    Discusses recent European library and information science (LIS) events. Describes the development and use of regional and intra-European Union networks for science. Highlights three European conferences held in 1996: ACM-SIGIR on information retrieval held in Switzerland, Information Seeking in Context (ISIC) held in Finland, and Conceptions of…

  12. Security classification of information

    SciTech Connect

    Quist, A.S.

    1993-04-01

    This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to provide added support to some of those classification principles.

  13. Developing an Information Strategy

    ERIC Educational Resources Information Center

    Hanson, Terry

    2011-01-01

    The purpose of an information strategy is to highlight the extent to which a modern, complex organization depends on information, in all of its guises, and to consider how this strategic asset should be managed. This dependency has always been present and nowhere more so than in universities, whose very purpose is built around information and its…

  14. Information Science Education.

    ERIC Educational Resources Information Center

    Parker, Edwin B.

    The possible objectives of information science education can be categorized into three general types: professional training intended to equip people to operate existing communication institutions; training of information technologists who will be the inventors or engineers expected to develop and test new information systems; and training of…

  15. Information Literacy. ERIC Digest.

    ERIC Educational Resources Information Center

    Plotnick, Eric

    Although alternate definitions for information literacy have been developed by educational institutions, professional organizations and individuals, they are likely to stem from the definition offered in the Final Report of the American Library Association (ALA) Presidential Committee on Information Literacy: "to be information literate, a person…

  16. Teaching Information Technology Law

    ERIC Educational Resources Information Center

    Taylor, M. J.; Jones, R. P.; Haggerty, J.; Gresty, D.

    2009-01-01

    In this paper we discuss an approach to the teaching of information technology law to higher education computing students that attempts to prepare them for professional computing practice. As information technology has become ubiquitous its interactions with the law have become more numerous. Information technology practitioners, and in particular…

  17. Recruitment and Information Program

    ERIC Educational Resources Information Center

    Liebergott, Harvey

    1976-01-01

    The Bureau of Education for the Handicapped's Recruitment and Information Program provides parents and other interested individuals with information on the educational needs of handicapped children through such activities as the National Information Center for the Handicapped ("Closer Look"), pamphlets on various subjects, and media campaigns that…

  18. Quick Information Sheets. 1988.

    ERIC Educational Resources Information Center

    Wisconsin Univ., Madison. Trace Center.

    The Trace Center gathers and organizes information on communication, control, and computer access for handicapped individuals. The information is disseminated in the form of brief sheets describing print, nonprint, and organizational resources and listing addresses and telephone numbers for ordering or for additional information. This compilation…

  19. The Campus Information Game

    ERIC Educational Resources Information Center

    Martin, Andrew

    2006-01-01

    The primary purpose of THE CAMPUS INFORMATION GAME is to help to induct new students into their unfamiliar study environment. Typically it forms an early element of an overall induction program for their course of study. THE CAMPUS INFORMATION GAME has a key secondary theme of information quality that is particularly appropriate for students of…

  20. Issues in Information Policy.

    ERIC Educational Resources Information Center

    Yurow, Jane H.; And Others

    In response to the growing importance of information issues in regional, national, and international forums, these papers on information policy were prepared by National Telecommunications and Information Administration consultants and staff members during 1979 and 1980 to provide a foundation for review, public analysis, and debate. Two types of…

  1. Kuhlthau's Information Search Process.

    ERIC Educational Resources Information Center

    Shannon, Donna

    2002-01-01

    Explains Kuhlthau's Information Search Process (ISP) model which is based on a constructivist view of learning and provides a framework for school library media specialists for the design of information services and instruction. Highlights include a shift from library skills to information skills; attitudes; process approach; and an interview with…

  2. The Information Search

    ERIC Educational Resources Information Center

    Doraiswamy, Uma

    2011-01-01

    This paper in the form of story discusses a college student's information search process. In this story we see Kuhlthau's information search process: initiation, selection, exploration, formulation, collection, and presentation. Katie is a student who goes in search of information for her class research paper. Katie's class readings, her interest…

  3. Information Law and Copyright.

    ERIC Educational Resources Information Center

    Marx, Peter A.

    1986-01-01

    Because of information law's inability to keep up with rapid changes in information technology and impreciseness of the law, copyrighting of databases poses unique problems. Interpretation of fair use doctrine, privately owned computer "downloading," impact of federal electronic filing, and questions concerning information businesses need to be…

  4. Information for Agricultural Development.

    ERIC Educational Resources Information Center

    Kaungamno, E. E.

    This paper describes the major international agricultural information services, sources, and systems; outlines the existing information situation in Tanzania as it relates to problems of agricultural development; and reviews the improvements in information provision resources required to support the process of agricultural development in Tanzania.…

  5. Pricing of Information.

    ERIC Educational Resources Information Center

    Furneaux, M. I. P.; Newton, J.

    This essay considers the cost of information retrieval by databases and information centers, and explores the need to charge users for the information supplied. The advantages and disadvantages of three means of charging users are discussed: (1) connnect hour charge, (2) print/type charge, and (3) subscription. Also addressed is the practice of…

  6. Government Information Policy.

    ERIC Educational Resources Information Center

    Dearstyne, Bruce W.; And Others

    1991-01-01

    Six articles discuss government information policy in context of technology and electronic records; policies on information resources management from OMB (Office of Management and Budget); state information resources, including Council of State Governments (CSG); state record laws and preservation of archival records; and management of electronic…

  7. What Is Information Design?

    ERIC Educational Resources Information Center

    Redish, Janice C. (Ginny)

    2000-01-01

    Defines two meanings of information design: the overall process of developing a successful document; and the way the information is presented on the screen (layout, typography, color, and so forth). Discusses the future importance of both of these meanings of information design, in terms of design for the web and single-sources (planning…

  8. Energy information directory 1995

    SciTech Connect

    1995-10-01

    The National Energy Information Center provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. This Energy Information Directory is used to assist the Center staff as well as other DOE staff in directing inquires to the proper offices.

  9. A Global Information Utility.

    ERIC Educational Resources Information Center

    Block, Robert S.

    1984-01-01

    High-powered satellites, along with other existing technologies, make possible a world information utility that could distribute virtually limitless information to every point on earth. The utility could distribute information for business, government, education, and entertainment. How the utility would work is discussed. (RM)

  10. Information Needs of Anthropologists.

    ERIC Educational Resources Information Center

    Hartmann, Jonathan

    1995-01-01

    Anthropologists at seven universities were surveyed to determine their methods of information retrieval, choice of information sources, perception of the adequacy of library service, and information needs. Results show that journals, field data, and maps are important sources; interlibrary loan is often used; and the majority of needs are met by…

  11. Connectionist Interaction Information Retrieval.

    ERIC Educational Resources Information Center

    Dominich, Sandor

    2003-01-01

    Discussion of connectionist views for adaptive clustering in information retrieval focuses on a connectionist clustering technique and activation spreading-based information retrieval model using the interaction information retrieval method. Presents theoretical as well as simulation results as regards computational complexity and includes…

  12. Shifting Boundaries in Information.

    ERIC Educational Resources Information Center

    Schiller, Anita

    1981-01-01

    Social interest in information as a shared resource is diminishing, while proprietary interest in information as a profitable resource increases. Technological advancement in information processing and services, considerations of cost recovery by public agencies, and opportunities for commercial profit are blurring distinctions between the public…

  13. The Information Experience.

    ERIC Educational Resources Information Center

    Senese, Diane

    1997-01-01

    The author argues that, in a team-based business culture where good service is a minimum and professionalism is a given, information experts need to move beyond the service era and distinguish themselves by creating an "information experience." Describes how corporate information experts can adapt to what John Naisbitt has called the "experience…

  14. Mobile Student Information System

    ERIC Educational Resources Information Center

    Asif, Muhammad; Krogstie, John

    2011-01-01

    Purpose: A mobile student information system (MSIS) based on mobile computing and context-aware application concepts can provide more user-centric information services to students. The purpose of this paper is to describe a system for providing relevant information to students on a mobile platform. Design/methodology/approach: The research…

  15. Earth Science Information Center

    USGS Publications Warehouse

    ,

    1991-01-01

    An ESIC? An Earth Science Information Center. Don't spell it. Say it. ESIC. It rhymes with seasick. You can find information in an information center, of course, and you'll find earth science information in an ESIC. That means information about the land that is the Earth, the land that is below the Earth, and in some instances, the space surrounding the Earth. The U.S. Geological Survey (USGS) operates a network of Earth Science Information Centers that sell earth science products and data. There are more than 75 ESIC's. Some are operated by the USGS, but most are in other State or Federal agencies. Each ESIC responds to requests for information received by telephone, letter, or personal visit. Your personal visit.

  16. 78 FR 7463 - Information Collection

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... SPACE ADMINISTRATION Information Collection AGENCY: National Aeronautics and Space Administration (NASA... to take this opportunity to comment on proposed and/or continuing information collections, as... INFORMATION CONTACT: Requests for additional information or copies of the information collection...

  17. Hydrologic Information Science (Invited)

    NASA Astrophysics Data System (ADS)

    Maidment, D. R.

    2009-12-01

    The CUAHSI Hydrologic Information System is intended to advance hydrologic science through better capacity to access and organize hydrologic information, as described by Tarboton et al. (2009) in this session. This development may help to create a new branch of hydrologic science, namely hydrologic information science: the branch of hydrologic science that deals with the organization, analysis, and synthesis of hydrologic information. There are several parts of this body of information: time series data on water observations at point locations that describe the flow, level, and quality of water; GIS data that describe the watersheds, aquifers, streams, waterbodies, wells, and other water features of the landscape; remote sensing data that measure distributed properties such as rainfall intensity and land surface temperature; climate grids that describe current and predicted climate conditions; and information from hydrologic simulation models. Taken together, these various forms of information can be considered as a description of a set of hydrologic fields: groups of variables distributed over a domain of time and space. The fundamental principles of hydrologic information science need to be formulated around the representation of hydrologic fields and the interaction of one form of field with another. In particular, what is needed are insights as to how to define transformations of hydrologic fields which link information at different spatial scales and which support interpolation of information simultaneously in space and time.

  18. Accident management information needs

    SciTech Connect

    Hanson, D.J.; Ward, L.W.; Nelson, W.R.; Meyer, O.R. )

    1990-04-01

    The tables contained in this Appendix A describe the information needs for a pressurized water reactor (PWR) with a large, dry containment. To identify these information needs, the branch points in the safety objective trees were examined to decide what information is necessary to (a) determine the status of the safety functions in the plant, i.e., whether the safety functions are being adequately maintained within predetermined limits, (b) identify plant behavior (mechanisms) or precursors to this behavior which indicate that a challenge to plant safety is occurring or is imminent, and (c) select strategies that will prevent or mitigate this plant behavior and monitor the implementation and effectiveness of these strategies. The information needs for the challenges to the safety functions are not examined since the summation of the information needs for all mechanisms associated with a challenge comprise the information needs for the challenge itself.

  19. Advanced information society(4)

    NASA Astrophysics Data System (ADS)

    Hiratsuka, Shinji

    This paper proposes that, as a countermeasure against the centralization of information activities in the capital Tokyo region, information infrastructure and urban spaces adapted for the exchange and transmission of information must be constructed in the local regions. The development of information activities in the local regions requires urban spaces with high amenity that promote the characteristics of the city and allow various kinds of human activities to take place. To make this a reality, the urban spaces should provide (1) the presentation function of information transmission; (2) the cultural invention function related to knowledge production; and (3) the construction of a "Mediacity" which acts as the agent of information exchange and carries out the international exchange function.

  20. Canonical information analysis

    NASA Astrophysics Data System (ADS)

    Vestergaard, Jacob Schack; Nielsen, Allan Aasbjerg

    2015-03-01

    Canonical correlation analysis is an established multivariate statistical method in which correlation between linear combinations of multivariate sets of variables is maximized. In canonical information analysis introduced here, linear correlation as a measure of association between variables is replaced by the information theoretical, entropy based measure mutual information, which is a much more general measure of association. We make canonical information analysis feasible for large sample problems, including for example multispectral images, due to the use of a fast kernel density estimator for entropy estimation. Canonical information analysis is applied successfully to (1) simple simulated data to illustrate the basic idea and evaluate performance, (2) fusion of weather radar and optical geostationary satellite data in a situation with heavy precipitation, and (3) change detection in optical airborne data. The simulation study shows that canonical information analysis is as accurate as and much faster than algorithms presented in previous work, especially for large sample sizes. URL:
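    A toy version of canonical information analysis can be sketched with a histogram-based mutual information estimator and a brute-force grid search over 1-D projection directions. The paper's fast kernel density estimator and its optimization scheme are replaced here by these illustrative stand-ins, and the simulated data are an assumption:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based estimate of mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, bins)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
# Two 2-D variable sets sharing a latent signal in their first coordinate.
latent = rng.normal(size=1000)
X = np.c_[latent + 0.1 * rng.normal(size=1000), rng.normal(size=1000)]
Y = np.c_[latent + 0.1 * rng.normal(size=1000), rng.normal(size=1000)]

# Grid search over unit projection vectors: find angles (ta, tb) maximizing
# the mutual information between the 1-D projections X @ a and Y @ b.
angles = np.linspace(0, np.pi, 36, endpoint=False)
best = max(
    ((ta, tb) for ta in angles for tb in angles),
    key=lambda ab: mutual_information(
        X @ np.array([np.cos(ab[0]), np.sin(ab[0])]),
        Y @ np.array([np.cos(ab[1]), np.sin(ab[1])]),
    ),
)
print("best projection angles (radians):", best)
```

    Unlike canonical correlation, the mutual information objective would also pick up purely nonlinear dependence between the projections.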

  1. [INFORMATION ON DRUGS].

    PubMed

    Foucher, Jean-pierre

    2015-03-01

    Because drugs are not just products like any other, prescribers, providers, and patients must have access to information on the characteristics of such medicinal products. This information must be complete, objective, and scientifically rigorous. It must be adapted to the use of the drug and be fully understandable. It should help in prescribing, expedite dispensing, and help the patient adhere to treatment. Thus, the information will differ according to the recipient. It is the role of the pharmacist and the physician to use it for patient education. The information given must be objective. Medication guidelines published by the HAS (Haute Autorité de Santé/National Health Agency) and the inserts provided with drugs should be considered the most reliable. Information can also be found in major scientific journals, in independent papers produced by groups of doctors and pharmacists, or in treatment guidelines. One must be very cautious about information found on certain Internet sites. PMID:26606769

  2. Freedom of Information Act

    USGS Publications Warehouse

    Newman, D.J.

    2012-01-01

    The Freedom of Information Act (FOIA), 5 U.S.C. § 552, as amended, generally provides that any person has a right to request access to Federal agency records. The USGS proactively promotes information disclosure as inherent to its mission of providing objective science to inform decisionmakers and the general public. USGS scientists disseminate up-to-date and historical scientific data that are critical to addressing national and global priorities.

  3. National Cartographic Information Center

    USGS Publications Warehouse

    ,

    1984-01-01

    The National Cartographic Information Center (NCIC) exists to help you find maps of all kinds and much of the data and materials used to compile and to print them. NCIC collects, sorts and describes all types of cartographic information from Federal, State and local government agencies and, where possible, from private companies in the mapping business. It is the public's primary source for cartographic information. (See partial list of Federal agencies and their map and other cartographic products.)

  4. HOPE information system review

    NASA Astrophysics Data System (ADS)

    Suzuki, Yoshiaki; Nishiyama, Kenji; Ono, Shuuji; Fukuda, Kouin

    1992-08-01

    An overview of the review conducted on the H-2 Orbiting Plane (HOPE) is presented. A prototype model was constructed by inputting various technical information proposed by related laboratories. In particular, an operation flow was constructed that clarifies the correlations among the various analysis items, judgment criteria, technical data, and interfaces. Technical information database and retrieval systems were studied. A Macintosh personal computer was selected for information shaping because of its excellent functionality, performance, operability, and software completeness.

  5. Secure Information Sharing

    2005-09-09

    We are developing a peer-to-peer system to support secure, location-independent information sharing in the scientific community. Once complete, this system will allow seamless and secure sharing of information between multiple collaborators. The owners of information will be able to control how the information is stored, managed, and shared. In addition, users will have faster access to information updates within a collaboration. Groups collaborating on scientific experiments need to share information and data. This information and data is often represented in the form of files and database entries. In a typical scientific collaboration, there are many different locations where data would naturally be stored. This makes it difficult for collaborators to find and access the information they need. Our goal is to create a lightweight file-sharing system that makes it easy for collaborators to find and use the data they need. This system must be easy to use, easy to administer, and secure. Our information-sharing tool uses group communication, in particular the InterGroup protocols, to reliably deliver each query to all of the current participants in a scalable manner, without having to discover all of their identities. We will use the Secure Group Layer (SGL) and Akenti to provide security to the participants of our environment. SGL will provide confidentiality, integrity, authenticity, and authorization enforcement for the InterGroup protocols, and Akenti will provide access control to other resources.

  6. Security classification of information

    SciTech Connect

    Quist, A.S.

    1989-09-01

    Certain governmental information must be classified for national security reasons. However, the national security benefits from classifying information are usually accompanied by significant costs -- those due to a citizenry not fully informed on governmental activities, the extra costs of operating classified programs and procuring classified materials (e.g., weapons), the losses to our nation when advances made in classified programs cannot be utilized in unclassified programs. The goal of a classification system should be to clearly identify that information which must be protected for national security reasons and to ensure that information not needing such protection is not classified. This document was prepared to help attain that goal. This document is the first of a planned four-volume work that comprehensively discusses the security classification of information. Volume 1 broadly describes the need for classification, the basis for classification, and the history of classification in the United States from colonial times until World War 2. Classification of information since World War 2, under Executive Orders and the Atomic Energy Acts of 1946 and 1954, is discussed in more detail, with particular emphasis on the classification of atomic energy information. Adverse impacts of classification are also described. Subsequent volumes will discuss classification principles, classification management, and the control of certain unclassified scientific and technical information. 340 refs., 6 tabs.

  7. Health Information Systems.

    PubMed

    Sirintrapun, S Joseph; Artz, David R

    2015-06-01

    This article provides surgical pathologists an overview of health information systems (HISs): what they are, what they do, and how such systems relate to the practice of surgical pathology. Much of this article is dedicated to the electronic medical record. Information, in how it is captured, transmitted, and conveyed, drives the effectiveness of such electronic medical record functionalities. So critical is information from pathology in integrated clinical care that surgical pathologists are becoming gatekeepers of not only tissue but also information. Better understanding of HISs can empower surgical pathologists to become stakeholders who have an impact on the future direction of quality integrated clinical care.

  8. New Information Transfer Therapies

    ERIC Educational Resources Information Center

    Lorenzi, Nancy M.; Young, K. Penny

    1974-01-01

    Specific examples of dial access systems, information centers, computer assisted instruction for students, use of media communication, resource centers, and library consortia in the biomedical field. (LS)

  9. Revitalizing executive information systems.

    PubMed

    Crockett, F

    1992-01-01

    As the saying goes, "garbage in, garbage out"--and this is as true for executive information systems as for any other computer system. Crockett presents a methodology he has used with clients to help them develop more useful systems that produce higher quality information. The key is to develop performance measures based on critical success factors and stakeholder expectations and then to link them cross functionally to show how progress is being made on strategic goals. Feedback from the executive information system then informs strategy formulation, business plan development, and operational activities.

  10. Interoperability and information discovery

    USGS Publications Warehouse

    Christian, E.

    2001-01-01

    In the context of information systems, there is interoperability when the distinctions between separate information systems are not a barrier to accomplishing a task that spans those systems. Interoperability so defined implies that there are commonalities among the systems involved and that one can exploit such commonalities to achieve interoperability. The challenge of a particular interoperability task is to identify relevant commonalities among the systems involved and to devise mechanisms that exploit those commonalities. The present paper focuses on the particular interoperability task of information discovery. The Global Information Locator Service (GILS) is described as a policy, standards, and technology framework for addressing interoperable information discovery on a global and long-term basis. While there are many mechanisms for people to discover and use all manner of data and information resources, GILS initiatives exploit certain key commonalities that seem to be sufficient to realize useful information discovery interoperability at a global, long-term scale. This paper describes ten of the specific commonalities that are key to GILS initiatives. It presents some of the practical implications for organizations in various roles: content provider, system engineer, intermediary, and searcher. The paper also provides examples of interoperable information discovery as deployed using GILS in four types of information communities: bibliographic, geographic, environmental, and government.

  11. Maintenance information: value added?

    SciTech Connect

    Tomlingson, P.D.

    2005-11-01

    A study of how well the mining industry uses information management systems for maintenance suggests there's plenty of room for improvement. The article presents results of a study of 13 mining and mineral processing operations over a four-year period, combined with a questionnaire completed by 91 attendees at seminars. None of the organizations had a well-defined, documented maintenance program and consequently were not able to use information effectively. Packaged information system providers did not recognize that industrial maintenance organizations had different information needs.

  12. Quantum information causality.

    PubMed

    Pitalúa-García, Damián

    2013-05-24

    How much information can a transmitted physical system fundamentally communicate? We introduce the principle of quantum information causality, which states the maximum amount of quantum information that a quantum system can communicate as a function of its dimension, independently of any previously shared quantum physical resources. We present a new quantum information task, whose success probability is upper bounded by the new principle, and show that an optimal strategy to perform it combines the quantum teleportation and superdense coding protocols with a task that has classical inputs. PMID:23745844

  13. Information management for clinicians.

    PubMed

    Mehta, Neil B; Martin, Stephen A; Maypole, Jack; Andrews, Rebecca

    2016-08-01

    Clinicians are bombarded with information daily by social media, mainstream television news, e-mail, and print and online reports. They usually do not have much control over these information streams and thus are passive recipients, which means they get more noise than signal. Accessing, absorbing, organizing, storing, and retrieving useful medical information can improve patient care. The authors outline how to create a personalized stream of relevant information that can be scanned regularly and saved so that it is readily accessible. PMID:27505880

  14. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  15. Information Security Management (ISM)

    NASA Astrophysics Data System (ADS)

    Šalgovičová, Jarmila; Prajová, Vanessa

    2012-12-01

    Currently, all organizations have to tackle the issue of information security. The paper deals with various aspects of Information Security Management (ISM), including procedures, processes, organizational structures, policies and control processes. Introduction of Information Security Management should be a strategic decision. The concept and implementation of Information Security Management in an organization are determined by the corporate needs and objectives, security requirements, the processes deployed as well as the size and structure of the organization. The implementation of ISM should be carried out to the extent consistent with the needs of the organization.

  16. Value of Information spreadsheet

    DOE Data Explorer

    Trainor-Guitton, Whitney

    2014-05-12

    This spreadsheet represents the information posteriors derived from synthetic data of magnetotellurics (MT). These were used to calculate the value of information of MT for geothermal exploration. Information posteriors describe how well MT was able to locate the "throat" of clay caps, which are indicative of hidden geothermal resources. These data are fully explained in the peer-reviewed publication: Trainor-Guitton, W., Hoversten, G. M., Ramirez, A., Roberts, J., Júlíusson, E., Key, K., Mellors, R. (Sept-Oct. 2014) The value of spatial information for determining well placement: a geothermal example, Geophysics.

  17. CLAMS Data and Information

    Atmospheric Science Data Center

    2016-06-14

    The Chesapeake Lighthouse and Aircraft Measurements for Satellites (CLAMS) field campaign was conducted ... Related resources: CLAMS Home Page; An Overview of the CLAMS Experiment; Readme ...

  18. Information Technology and Personal Responsibility.

    ERIC Educational Resources Information Center

    Klempner, Irving M.

    1981-01-01

    Discusses the societal implications of advanced information technology, focusing on ethical questions and the personal responsibility of the information professional in designing information systems. (FM)

  19. Information in Health Care.

    ERIC Educational Resources Information Center

    Mayeda, Tadashi A.

    The report stresses the fact that while there is unity in the continuum of medicine, information in health care is markedly different from information in medical education and research. This difference is described as an anomaly in that it appears to deviate in excess of normal variation from needs common to research and education. In substance,…

  20. Dimensions of Drug Information

    ERIC Educational Resources Information Center

    Sharp, Mark E.

    2011-01-01

    The high number, heterogeneity, and inadequate integration of drug information resources constitute barriers to many drug information usage scenarios. In the biomedical domain there is a rich legacy of knowledge representation in ontology-like structures that allows us to connect this problem both to the very mature field of library and…

  1. Oceanography Information Sources 70.

    ERIC Educational Resources Information Center

    Vetter, Richard C.

    This booklet lists oceanography information sources in the first section under industries, laboratories and departments of oceanography, and other organizations which can provide free information and materials describing programs and activities. Publications listed in the second section include these educational materials: bibliographies, career…

  2. Constructor theory of information

    PubMed Central

    Deutsch, David; Marletto, Chiara

    2015-01-01

    We propose a theory of information expressed solely in terms of which transformations of physical systems are possible and which are impossible—i.e. in constructor-theoretic terms. It includes conjectured, exact laws of physics expressing the regularities that allow information to be physically instantiated. Although these laws are directly about information, independently of the details of particular physical instantiations, information is not regarded as an a priori mathematical or logical concept, but as something whose nature and properties are determined by the laws of physics alone. This theory solves a problem at the foundations of existing information theory, namely that information and distinguishability are each defined in terms of the other. It also explains the relationship between classical and quantum information, and reveals the single, constructor-theoretic property underlying the most distinctive phenomena associated with the latter, including the lack of in-principle distinguishability of some states, the impossibility of cloning, the existence of pairs of variables that cannot simultaneously have sharp values, the fact that measurement processes can be both deterministic and unpredictable, the irreducible perturbation caused by measurement, and locally inaccessible information (as in entangled systems). PMID:25663803

  3. Air System Information Management

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.

    2004-01-01

    I flew to Washington last week, a trip rich in distributed information management. Buying tickets, at the gate, in flight, landing and at the baggage claim, myriad messages about my reservation, the weather, our flight plans, gates, bags and so forth flew among a variety of travel agency, airline and Federal Aviation Administration (FAA) computers and personnel. By and large, each kind of information ran on a particular application, often specialized to its own data formats and communications network. I went to Washington to attend an FAA meeting on System-Wide Information Management (SWIM) for the National Airspace System (NAS) (http://www.nasarchitecture.faa.gov/Tutorials/NAS101.cfm). NAS (and its information infrastructure, SWIM) is an attempt to bring greater regularity, efficiency and uniformity to the collection of stovepipe applications now used to manage air traffic. Current systems hold information about flight plans, flight trajectories, air turbulence, current and forecast weather, radar summaries, hazardous condition warnings, airport and airspace capacity constraints, temporary flight restrictions, and so forth. Information moving among these stovepipe systems is usually mediated by people (for example, air traffic controllers) or single-purpose applications. People, whose intelligence is critical for difficult tasks and unusual circumstances, are not as efficient as computers for tasks that can be automated. Better information sharing can lead to higher system capacity, more efficient utilization and safer operations. Better information sharing through greater automation is possible though not necessarily easy.

  4. Reverse Coherent Information

    NASA Astrophysics Data System (ADS)

    García-Patrón, Raúl; Pirandola, Stefano; Lloyd, Seth; Shapiro, Jeffrey H.

    2009-04-01

    We define a family of entanglement distribution protocols assisted by classical feedback communication that gives an operational interpretation to reverse coherent information, i.e., the symmetric counterpart of the well-known coherent information. This protocol family leads to the definition of a new entanglement distribution capacity that exceeds the unassisted entanglement distribution capacity for some interesting channels.

  5. Reverse Coherent Information

    NASA Astrophysics Data System (ADS)

    García-Patrón, Raúl; Pirandola, Stefano; Lloyd, Seth; Shapiro, Jeffrey H.

    2009-05-01

    In this Letter we define a family of entanglement distribution protocols assisted by feedback classical communication that gives an operational interpretation to reverse coherent information, i.e., the symmetric counterpart of the well-known coherent information. This leads to the definition of a new entanglement distribution capacity that exceeds the unassisted capacity for some interesting channels.

  6. Hybrid quantum information processing

    SciTech Connect

    Furusawa, Akira

    2014-12-04

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  7. Medical Information Management System

    NASA Technical Reports Server (NTRS)

    Alterescu, S.; Hipkins, K. R.; Friedman, C. A.

    1979-01-01

    On-line interactive information processing system easily and rapidly handles all aspects of data management related to patient care. General purpose system is flexible enough to be applied to other data management situations found in areas such as occupational safety data, judicial information, or personnel records.

  8. Information Literacy Assessment

    ERIC Educational Resources Information Center

    Warmkessel, Marjorie M.

    2007-01-01

    This article presents an annotated list of seven recent articles on the topic of information literacy assessment. They include: (1) "The Three Arenas of Information Literacy Assessment" (Bonnie Gratch Lindauer); (2) "Testing the Effectiveness of Interactive Multimedia for Library-User Education" (Karen Markey et al.); (3) "Assessing Auburn…

  9. Enhanced Information Exclusion Relations

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing

    2016-07-01

    In Hall’s reformulation of the uncertainty principle, the entropic uncertainty relation occupies a core position and provides the first nontrivial bound for the information exclusion principle. Based upon recent developments on the uncertainty relation, we present new bounds for the information exclusion relation using majorization theory and combinatoric techniques, which reveal further characteristic properties of the overlap matrix between the measurements.

  10. EDUCATIONAL INFORMATION PROJECT.

    ERIC Educational Resources Information Center

    LINDQUIST, E.F.; AND OTHERS

    To aid data collection, analysis, storage, and dissemination, instruments and procedures were developed for collecting information on all aspects of the educational program for a large population of schools, including information on individual pupils, school personnel, schools, and school districts. Computer programs and data-processing techniques…

  11. Information extraction system

    DOEpatents

    Lemmond, Tracy D; Hanley, William G; Guensche, Joseph Wendell; Perry, Nathan C; Nitao, John J; Kidwell, Paul Brandon; Boakye, Kofi Agyeman; Glaser, Ron E; Prenger, Ryan James

    2014-05-13

    An information extraction system and methods of operating the system are provided. In particular, an information extraction system for performing meta-extraction of named entities of people, organizations, and locations, as well as relationships and events, from text documents is described herein.

  12. Craft Information Sources.

    ERIC Educational Resources Information Center

    Hujsak, Mary Dodge

    1994-01-01

    Provides a brief history of the craft movement and the American Craft Council. Information available from the American Craft Information Center is described, including bibliographic sources, reference tools, periodicals and indices, exhibition catalogs, craft registry and database, clipping file, education, business, funding, and appraising…

  13. Mandarin Visual Speech Information

    ERIC Educational Resources Information Center

    Chen, Trevor H.

    2010-01-01

    While the auditory-only aspects of Mandarin speech are heavily-researched and well-known in the field, this dissertation addresses its lesser-known aspects: The visual and audio-visual perception of Mandarin segmental information and lexical-tone information. Chapter II of this dissertation focuses on the audiovisual perception of Mandarin…

  14. Faculty and Staff Information.

    ERIC Educational Resources Information Center

    Kentucky Univ., Lexington. Community Coll. System.

    This booklet is intended to acquaint faculty and staff members with general information about the University of Kentucky community College System, and to explain some of its policies affecting them. The booklet is organized into five sections. Section I contains general information about the system, gives its history, purpose, and a map of the…

  15. Parallel Information Processing.

    ERIC Educational Resources Information Center

    Rasmussen, Edie M.

    1992-01-01

    Examines parallel computer architecture and the use of parallel processors for text. Topics discussed include parallel algorithms; performance evaluation; parallel information processing; parallel access methods for text; parallel and distributed information retrieval systems; parallel hardware for text; and network models for information…

  16. Conceptualizing an Information Commons.

    ERIC Educational Resources Information Center

    Beagle, Donald

    1999-01-01

    Concepts from Strategic Alignment, a technology-management theory, are used to discuss the Information Commons as a new service-delivery model in academic libraries. The Information Commons, as a conceptual, physical, and instructional space, involves an organizational realignment from print to the digital environment. (Author)

  17. Information about Musculoskeletal Conditions

    MedlinePlus


  18. Ethics and Information Science.

    ERIC Educational Resources Information Center

    Kochen, Manfred

    1987-01-01

    Discussion of the debate in the information science profession over whether a code of ethics would be useful presents sample issues and places them in historical and philosophical frameworks for considering the tension between knowledge and power. Practical guidelines are offered to help information professionals act out of wisdom. (Author/EM)

  19. Management of Electronic Information.

    ERIC Educational Resources Information Center

    Breaks, Michael

    This paper discusses the management of library collections of electronic information resources within the classical theoretical framework of collection development and management. The first section provides an overview of electronic information resources, including bibliographic databases, electronic journals, journal aggregation services, and…

  20. [Outdoor Ethics Information Packet.

    ERIC Educational Resources Information Center

    Izaak Walton League of America, Arlington, VA.

    This document contains information about outdoor ethics issues. The information was compiled by the Izaak Walton League of America, established in 1922 as a national nonprofit organization whose members educate the public about emerging natural resource threats and promote citizen involvement in environmental protection efforts. The league…

  1. Information Design: A Bibliography.

    ERIC Educational Resources Information Center

    Albers, Michael J.; Lisberg, Beth Conney

    2000-01-01

    Presents a 17-item annotated list of essential books on information design chosen by members of the InfoDesign e-mail list. Includes a 113-item unannotated bibliography of additional works, on topics of creativity and critical thinking; visual thinking; graphic design; infographics; information design; instructional design; interface design;…

  2. Enhanced Information Exclusion Relations.

    PubMed

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing

    2016-01-01

    In Hall's reformulation of the uncertainty principle, the entropic uncertainty relation occupies a core position and provides the first nontrivial bound for the information exclusion principle. Based upon recent developments on the uncertainty relation, we present new bounds for the information exclusion relation using majorization theory and combinatoric techniques, which reveal further characteristic properties of the overlap matrix between the measurements. PMID:27460975

  3. Hybrid quantum information processing

    NASA Astrophysics Data System (ADS)

    Furusawa, Akira

    2014-12-01

    I will briefly explain the definition and advantage of hybrid quantum information processing, which is hybridization of qubit and continuous-variable technologies. The final goal would be realization of universal gate sets both for qubit and continuous-variable quantum information processing with the hybrid technologies. For that purpose, qubit teleportation with a continuous-variable teleporter is one of the most important ingredients.

  4. Heroin. Specialized Information Service.

    ERIC Educational Resources Information Center

    Do It Now Foundation, Phoenix, AZ.

    The document presents a collection of articles about heroin. Article 1 provides general information on heroin identification, drug dependence, effects of abuse, cost, source of supply, and penalties for illegal heroin use. Article 2 gives statistical information on heroin-related deaths in the District of Columbia between 1971 and 1982. Article 3…

  5. Physics as Information Theory

    SciTech Connect

    D'Ariano, Giacomo Mauro

    2010-10-20

    The experience from Quantum Information of the last twenty years has led theorists to look at Quantum Theory and the whole of Physics from a different angle. A new information-theoretic paradigm is emerging, prophesied long ago by John Archibald Wheeler with his popular coinage 'It from bit'. Theoretical groups are now addressing the problem of deriving Quantum Theory from informational principles, and similar lines are investigated in new approaches to Quantum Gravity. In my talk I will review some recent advances along these lines. The general idea synthesizing the new paradigm is that there is only Quantum Theory (without quantization rules): the whole of Physics--including space-time and relativity--is emergent from quantum-information processing. And, since Quantum Theory itself is made with purely informational principles, the whole of Physics must be reformulated in information-theoretical terms. The review is divided into the following parts: (a) the informational axiomatization of Quantum Theory; (b) how space-time and relativistic covariance emerge from the quantum computation; (c) what is the information-theoretical meaning of inertial mass and Planck constant, and how the quantum field emerges; (d) observational consequences: mass-dependent refraction index of vacuum. I then conclude with some possible future research lines.

  6. Navigating the Information Society.

    ERIC Educational Resources Information Center

    Kirk, Joyce

    This paper explores the idea of an information society from different perspectives, raises issues that are relevant to university libraries, and offers a way forward to some future developments. The first section provides a sketch of the information society in Australia and presents statistics on readiness, intensity, and impacts from reports…

  7. Institutionalizing Information Literacy

    ERIC Educational Resources Information Center

    Weiner, Sharon A.

    2012-01-01

    There is increasing recognition that information literacy is essential for individual and community empowerment, workforce readiness, and global competitiveness. However, there is a history of difficulty in integrating information literacy with the postsecondary educational process. This paper posits that a greater understanding of the…

  8. Marketing Information Literacy

    ERIC Educational Resources Information Center

    Seale, Maura

    2013-01-01

    In 2012, more than a decade after the original Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education (hereafter the Standards) were institutionalized as the goal of academic library instruction, the Information Literacy Competency Standards Review Task Force convened by ACRL recommended…

  9. Information: The Ultimate Frontier.

    ERIC Educational Resources Information Center

    Branscomb, Lewis M.

    1979-01-01

    A 100-year scenario of the future of information technology. To achieve inexpensive, high-speed, and small computers, new techniques are likely to replace silicon technology. The ultimate computer might be biological and patterned on DNA. Future computers will require information rather than store it. Light wave communication will broaden…

  10. The Politics of Information.

    ERIC Educational Resources Information Center

    Gell, Marilyn K.

    1979-01-01

    Introduces the political issues to be discussed at the first White House Conference on Library and Information Services which will focus on citizen control of the information flow in view of developing demands and technology. A desired political result of the conference is the creation of a citizens' lobby. (SW)

  11. A Mine of Information.

    ERIC Educational Resources Information Center

    Williams, Lisa B.

    1986-01-01

    Business researchers and marketers find certain databases useful for finding information on investments, competitors, products, and markets. Colleges can use these same databases to get background on corporate prospects. The largest data source available, DIALOG Information Services and some other databases are described. (MLW)

  12. Information System Overview.

    ERIC Educational Resources Information Center

    Burrows, J. H.

    This paper was prepared for distribution to the California Educational Administrators participating in the "Executive Information Systems" Unit of Instruction as part of the instructional program of Operation PEP (Prepare Educational Planners). The purpose of the course was to introduce some basic concepts of information systems technology to…

  13. Addressing Information Security Risk

    ERIC Educational Resources Information Center

    Qayoumi, Mohammad H.; Woody, Carol

    2005-01-01

    Good information security does not just happen--and often does not happen at all. Resources are always in short supply, and there are always other needs that seem more pressing. Why? Because information security is hard to define, the required tasks are unclear, and the work never seems to be finished. However, the loss to the organization can be…

  14. Encouraging Global Information Literacy

    ERIC Educational Resources Information Center

    Horton, Forest Woody, Jr.; Keiser, Barbie E.

    2008-01-01

    While much has been done to address the digital divide, awareness concerning the importance of information literacy (IL) has taken a back seat to a world that focuses on technology. This article traces the genesis of a global effort to address information literacy education and training beyond discussions taking place within the library and…

  15. Environmental geographic information system.

    SciTech Connect

    Peek, Dennis W; Helfrich, Donald Alan; Gorman, Susan

    2010-08-01

    This document describes how the Environmental Geographic Information System (EGIS) was used, along with externally received data, to create maps for the Site-Wide Environmental Impact Statement (SWEIS) Source Document project. Data quality among the various classes of geographic information system (GIS) data is addressed. A complete listing of map layers used is provided.

  16. Taking Information Literacy Online.

    ERIC Educational Resources Information Center

    Levesque, Carla

    2003-01-01

    Explores the process of designing, teaching, and revising an online information literacy course at St. Petersburg College (SPC) (Florida). Shares methods for encouraging participation in online courses and ways of tracking students' progress. Reports that basic computer information and literacy is now a graduation requirement at SBC. Contains…

  17. Reverse coherent information.

    PubMed

    García-Patrón, Raúl; Pirandola, Stefano; Lloyd, Seth; Shapiro, Jeffrey H

    2009-05-29

    In this Letter we define a family of entanglement distribution protocols assisted by feedback classical communication that gives an operational interpretation to reverse coherent information, i.e., the symmetric counterpart of the well-known coherent information. This leads to the definition of a new entanglement distribution capacity that exceeds the unassisted capacity for some interesting channels.

  18. What is information?

    PubMed

    Barbieri, Marcello

    2016-03-13

    Molecular biology is based on two great discoveries: the first is that genes carry hereditary information in the form of linear sequences of nucleotides; the second is that in protein synthesis a sequence of nucleotides is translated into a sequence of amino acids, a process that amounts to a transfer of information from genes to proteins. These discoveries have shown that the information of genes and proteins is the specific linear order of their sequences. This is a clear definition of information and there is no doubt that it reflects an experimental reality. What is not clear, however, is the ontological status of information, and the result is that today we have two conflicting paradigms in biology. One is the 'chemical paradigm', the idea that 'life is chemistry', or, more precisely, that 'life is an extremely complex form of chemistry'. The other is the 'information paradigm', the view that chemistry is not enough, that 'life is chemistry plus information'. This implies that there is an ontological difference between information and chemistry, a difference which is often expressed by saying that information-based processes like heredity and natural selection simply do not exist in the world of chemistry. Against this conclusion, the supporters of the chemical paradigm have argued that the concept of information is only a linguistic metaphor, a word that summarizes the result of countless underlying chemical reactions. The supporters of the information paradigm insist that information is a real and fundamental component of the living world, but have not been able to prove this point. As a result, the chemical view has not been abandoned and the two paradigms both coexist today. Here, it is shown that a solution to the ontological problem of information does exist. It comes from the idea that life is artefact-making, that genes and proteins are molecular artefacts manufactured by molecular machines and that artefacts necessarily require sequences and coding

  19. Economics of information

    NASA Astrophysics Data System (ADS)

    Noguchi, Mitsunori

    2000-06-01

The economics of information covers a wide range of topics such as insurance, stochastic equilibria, the theory of finance (e.g. option pricing), job search, etc. In this paper, we focus on an economic model in which traders are uncertain about the true characteristics of commodities and know only the probability distributions of those characteristics. The traders acquire information on those characteristics via actual consumption in the past and are allowed to exchange the information among themselves prior to the forthcoming trade. Though optimal consumption at the preceding trade generally alters optimal consumption at the succeeding trade, it may happen that the two coincide. We call this particular type of optimal consumption an information stable equilibrium (ISE). At an ISE, the traders gain no additional information from consumption that is significant enough to revise their optimal choice at the succeeding trade.

  20. Energy information directory 1994

    SciTech Connect

    Not Available

    1994-03-28

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. The two principal functions related to this task are (1) operating a general access telephone line, and (2) responding to energy-related correspondence addressed to the Energy Information Administration (EIA). The Energy Information Directory was developed to assist the NEIC staff, as well as other Department of Energy (DOE) staff, in directing inquiries to the proper offices within DOE, other Federal agencies, or energy-related trade associations. The Directory is a list of most Government offices and trade associations that are involved in energy matters. It does not include those DOE offices which do not deal with the public or public information.

  1. Information Technology Resources Assessment

    SciTech Connect

    Not Available

    1993-04-01

    The Information Technology Resources Assessment (ITRA) is being published as a companion document to the Department of Energy (DOE) FY 1994--FY 1998 Information Resources Management Long-Range Plan. This document represents a collaborative effort between the Office of Information Resources Management and the Office of Energy Research that was undertaken to achieve, in part, the Technology Strategic Objective of IRM Vision 21. An integral part of this objective, technology forecasting provides an understanding of the information technology horizon and presents a perspective and focus on technologies of particular interest to DOE program activities. Specifically, this document provides site planners with an overview of the status and use of new information technology for their planning consideration.

  2. Next generation information systems

    SciTech Connect

    Limback, Nathan P; Medina, Melanie A; Silva, Michelle E

    2010-01-01

The Information Systems Analysis and Development (ISAD) Team of the Safeguards Systems Group at Los Alamos National Laboratory (LANL) has been developing web based information and knowledge management systems for sixteen years. Our vision is to rapidly and cost effectively provide knowledge management solutions in the form of interactive information systems that help customers organize, archive, post and retrieve nonproliferation and safeguards knowledge and information vital to their success. The team has developed several comprehensive information systems that assist users in the betterment and growth of their organizations and programs. Through our information systems, users are able to streamline operations, increase productivity, and share and access information from diverse geographic locations. The ISAD team is also producing interactive visual models. Interactive visual models provide many benefits to customers beyond the scope of traditional full-scale modeling. We have the ability to simulate a vision that a customer may propose, without the time constraints of traditional engineering modeling tools. Our interactive visual models can be used to access specialized training areas, controlled areas, and highly radioactive areas, as well as review site-specific training for complex facilities, and asset management. Like the information systems that the ISAD team develops, these models can be shared and accessed from any location with access to the internet. The purpose of this paper is to elaborate on the capabilities of information systems and interactive visual models as well as consider the possibility of combining the two capabilities to provide the next generation of information systems. The collection, processing, and integration of data in new ways can contribute to the security of the nation by providing indicators and information for timely action to decrease the traditional and new nuclear threats. Modeling and simulation tied to comprehensive

  3. Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution

    NASA Astrophysics Data System (ADS)

    Xuan, C.; Oda, H.

    2013-12-01

The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with remanence of the continuous sample. The convolution process results in smoothed measurement and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of sensor response and improve the resolution for continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty in accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm2 area on the cross section. The acquired sensor response reveals that cross terms (i.e. response of pick-up coil for one axis to magnetic signal along other axes) that were often omitted in previous deconvolution practices are clearly not negligible. Utilizing the detailed estimate of magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows the user to view, compare, and save data before and after deconvolution. Optimised deconvolution
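
The regularized inversion that such a tool automates can be sketched in a few lines. Everything below is an invented toy (a Gaussian response kernel, a synthetic signal, and a fixed smoothness weight) rather than the paper's actual ABIC-minimizing procedure; it only illustrates why deconvolving the sensor response sharpens a pass-through measurement.

```python
import numpy as np

# Toy sketch of pass-through deconvolution: a measurement m = S @ x + noise,
# where S is a smoothing sensor-response matrix. All shapes, kernels, and the
# regularization weight are invented for illustration; the real UDECON tool
# selects the smoothness trade-off by ABIC minimization (Oda and Shibuya, 1996).

def gaussian_response(n, width):
    """Single-axis sensor response modeled as a row-normalized Gaussian kernel."""
    idx = np.arange(n)
    S = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / width) ** 2)
    return S / S.sum(axis=1, keepdims=True)

def deconvolve(m, S, lam):
    """Tikhonov-regularized least squares with a second-difference smoothness penalty."""
    n = len(m)
    D = np.diff(np.eye(n), n=2, axis=0)          # (n-2, n) second-difference operator
    return np.linalg.solve(S.T @ S + lam * D.T @ D, S.T @ m)

rng = np.random.default_rng(0)
n = 200
x_true = np.sin(np.linspace(0, 6 * np.pi, n))    # synthetic remanence record
S = gaussian_response(n, width=8.0)              # broad response smears the signal
m = S @ x_true + 0.005 * rng.standard_normal(n)  # smoothed, slightly noisy measurement

x_hat = deconvolve(m, S, lam=1e-2)               # fixed weight stands in for ABIC's choice
print(np.linalg.norm(m - x_true) > np.linalg.norm(x_hat - x_true))
```

With the invented numbers above, the deconvolved estimate lies closer to the true signal than the raw smoothed measurement does, which is the point of the optimisation the abstract describes.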

  4. Potential for Inclusion of Information Encountering within Information Literacy Models

    ERIC Educational Resources Information Center

    Erdelez, Sanda; Basic, Josipa; Levitov, Deborah D.

    2011-01-01

Introduction: Information encountering (finding information while searching for some other information) is a type of opportunistic discovery of information that complements purposeful approaches to finding information. The motivation for this paper was to determine if the current models of information literacy instruction refer to information…

  5. Beyond informed consent.

    PubMed Central

    Bhutta, Zulfiqar A.

    2004-01-01

    Although a relatively recent phenomenon, the role of informed consent in human research is central to its ethical regulation and conduct. However, guidelines often recommend procedures for obtaining informed consent (usually written consent) that are difficult to implement in developing countries. This paper reviews the guidelines for obtaining informed consent and also discusses prevailing views on current controversies, ambiguities and problems with these guidelines and suggests potential solutions. The emphasis in most externally sponsored research projects in developing countries is on laborious documentation of several mechanical aspects of the research process rather than on assuring true comprehension and voluntary participation. The onus for the oversight of this process is often left to overworked and ill-equipped local ethics review committees. Current guidelines and processes for obtaining informed consent should be reviewed with the specific aim of developing culturally appropriate methods of sharing information about the research project and obtaining and documenting consent that is truly informed. Further research is needed to examine the validity and user friendliness of innovations in information sharing procedures for obtaining consent in different cultural settings. PMID:15643799

  6. Ignorance, information and autonomy.

    PubMed

    Harris, J; Keywood, K

    2001-09-01

People have a powerful interest in genetic privacy and its associated claim to ignorance, and some equally powerful desires to be shielded from disturbing information are often voiced. We argue, however, that there is no such thing as a right to remain in ignorance, where a right is understood as an entitlement that trumps competing claims. This does not of course mean that information must always be forced upon unwilling recipients, only that there is no prima facie entitlement to be protected from true or honest information about oneself. Any claims to be shielded from information about the self must compete on equal terms with claims based in the rights and interests of others. In balancing the weight and importance of rival considerations about giving or withholding information, if rights claims have any place, rights are more likely to be defensible on the side of honest communication of information rather than in defence of ignorance. The right to free speech and the right to decline to accept responsibility to take decisions for others imposed by those others seem to us more plausible candidates for fully fledged rights in this field than any purported right to ignorance. Finally, and most importantly, if the right to autonomy is invoked, a proper understanding of the distinction between claims to liberty and claims to autonomy show that the principle of autonomy, as it is understood in contemporary social ethics and English law, supports the giving rather than the withholding of information in most circumstances. PMID:11808677

  7. Acting to gain information

    NASA Technical Reports Server (NTRS)

    Rosenchein, Stanley J.; Burns, J. Brian; Chapman, David; Kaelbling, Leslie P.; Kahn, Philip; Nishihara, H. Keith; Turk, Matthew

    1993-01-01

This report is concerned with agents that act to gain information. In previous work, we developed agent models combining qualitative modeling with real-time control. That work, however, focused primarily on actions that affect physical states of the environment. The current study extends that work by explicitly considering problems of active information-gathering and by exploring specialized aspects of information-gathering in computational perception, learning, and language. In our theoretical investigations, we analyzed agents into their perceptual and action components and identified these with elements of a state-machine model of control. The mathematical properties of each were developed in isolation and interactions were then studied. We considered the complexity dimension and the uncertainty dimension and related these to intelligent-agent design issues. We also explored active information gathering in visual processing. Working within the active vision paradigm, we developed a concept of 'minimal meaningful measurements' suitable for demand-driven vision. We then developed and tested an architecture for ongoing recognition and interpretation of visual information. In the area of information gathering through learning, we explored techniques for coping with combinatorial complexity. We also explored information gathering through explicit linguistic action by considering the nature of conversational rules, coordination, and situated communication behavior.

  8. Towards an information ecology

    NASA Astrophysics Data System (ADS)

    Truszkowski, Walt; Moore, Mike

    1993-08-01

This paper presents a unified approach to science information systems that is explicitly designed to support change in the system, data, and user community. The approach is unified in that it supports the features found in current systems like NASA's ADS and ESA's ESIS through a single construct, the information agent. The approach is novel in that it uses the concepts and mechanisms from theoretical ecology to understand and implement the information agents, their interactions, and their ability to adapt to and exploit change to better support users.

  9. Weather Information System

    NASA Technical Reports Server (NTRS)

    1995-01-01

    WxLink is an aviation weather system based on advanced airborne sensors, precise positioning available from the satellite-based Global Positioning System, cockpit graphics and a low-cost datalink. It is a two-way system that uplinks weather information to the aircraft and downlinks automatic pilot reports of weather conditions aloft. Manufactured by ARNAV Systems, Inc., the original technology came from Langley Research Center's cockpit weather information system, CWIN (Cockpit Weather INformation). The system creates radar maps of storms, lightning and reports of surface observations, offering improved safety, better weather monitoring and substantial fuel savings.

  10. Gravity from quantum information

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Weon; Kim, Hyeong-Chan; Lee, Jungjai

    2013-09-01

We suggest that the Einstein equation can be derived from Landauer's principle applied to an information erasing process at a local Rindler horizon and Jacobson's idea linking the Einstein equation with thermodynamics. When matter crosses the horizon, information on the matter disappears, and the horizon entanglement entropy increases to compensate for the entropy reduction. The Einstein equation describes an information-energy relation during this process, which implies that entropic gravity is related to the quantum entanglement of the vacuum and has a quantum-information-theoretic origin.

  11. Advanced information society (11)

    NASA Astrophysics Data System (ADS)

    Nawa, Kotaro

Since the late 1980s, the information systems of Japanese corporations have been operated strategically to strengthen their competitive position in markets rather than simply to make corporate management efficient. Corporate information-oriented policy is therefore making remarkable progress. This policy expands intelligence activity within the corporation and also extends the market for the information industry. In this environment, the closed corporate system is transformed into an open one, for which networks and databases are important managerial resources.

  12. ENERGY INFORMATION CLEARINGHOUSE

    SciTech Connect

    Ron Johnson

    2003-10-01

Alaska has spent billions of dollars on various energy-related activities over the past several decades, with projects ranging from smaller utilities used to produce heat and power in rural Alaska to huge endeavors relating to exported resources. To help provide information for end users, utilities, decision makers, and the general public, the Institute of Northern Engineering at UAF established an Energy Information Clearinghouse accessible through the World Wide Web in 2002. This clearinghouse contains information on energy resources, end use technologies, policies, related environmental issues, emerging technologies, efficiency, storage, demand side management, and developments in Alaska.

  13. Can randomization be informative?

    NASA Astrophysics Data System (ADS)

    Pereira, Carlos A. B.; Campos, Thiago F.; Silva, Gustavo M.; Wechsler, Sergio

    2012-10-01

In this paper the Pair of Siblings Paradox introduced by Pereira [1] is extended by considering more than two children and more than one child observed for gender. We follow the same lines of Wechsler et al. [2] that generalizes the three prisoners' dilemma, introduced by Gardner [3]. This paper's conjecture is that the Pair of Siblings and the Three Prisoners dilemma are dual paradoxes. Looking at possible likelihoods, the sure (randomized) selection for the former is non-informative (informative), while the opposite holds for the latter. This situation is maintained for the generalizations. A non-informative likelihood here means that the prior and posterior are equal.

  14. Health information services technologies.

    PubMed

    McCracken, S B

    1996-01-01

    Increasing demands for provider profiling have led to the growth of health information services units within payers and health plans. An important decision faced by these groups is whether to buy or build the information infrastructure necessary to support the activities of the department. The article offers an overview of a system that was collaboratively designed and built by Blue Cross and Blue Shield of Iowa and the Dartmouth Medical School. A case study illustrating the flexibility of the information system in adapting ambulatory care groups to the fee-for-service payer industry is reviewed. PMID:10154373

  15. Data Rich, Information Poor

    SciTech Connect

    Kaplan, P.G.; Rautman, C.A.

    1998-11-09

Surviving in a data-rich environment means understanding the difference between data and information. This paper reviews an environmental case study that illustrates that understanding and shows its importance. In this study, a decision problem was stated in terms of an economic objective function. The function contains a term that defines the stochastic relationship between the decision and the information obtained during field characterization for an environmental contaminant. Data are defined as samples drawn, or experimental realizations, of a random function. Information is defined as the quantitative change in the value of the objective function as a result of the sample.
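
The data-versus-information distinction described above can be made concrete with a toy value-of-information calculation. All numbers below (the prior probability and the costs) are invented for illustration; the point is only that "information" is measured as the change in the optimal expected objective once a sample becomes available.

```python
# Invented numbers: a remediation decision under uncertainty about contamination.
# "Information" is the change in the expected value of the economic objective
# produced by the sample, in the sense defined in the abstract above.
p_dirty = 0.3          # prior probability the site is contaminated
remediate = -100.0     # cost of remediating (safe in either state)
ignore_clean = 0.0     # cost of doing nothing at a clean site
ignore_dirty = -500.0  # cost of doing nothing at a contaminated site

# Best expected objective with no sampling: commit to one action up front.
ev_ignore = (1 - p_dirty) * ignore_clean + p_dirty * ignore_dirty
ev_no_sample = max(remediate, ev_ignore)

# With a (perfectly informative) sample, act on the revealed state:
# ignore when clean, remediate when contaminated.
ev_sample = (1 - p_dirty) * ignore_clean + p_dirty * remediate

value_of_information = ev_sample - ev_no_sample
print(value_of_information)
```

Here sampling raises the expected objective from -100 to -30, so the sample is worth 70 cost units, even though the sample itself is just one more piece of data.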

  16. Information Literacy in the Workplace.

    ERIC Educational Resources Information Center

    Oman, Julie N.

    2001-01-01

    Discusses the need for information literacy in the workplace in the face of information overload and problems related to end user information skills. Explains how to improve information literacy by assessing the organization's infrastructure, including available information technologies and information processes; considering demographics; and…

  17. Physiological Information Database (PID)

    EPA Science Inventory

    EPA has developed a physiological information database (created using Microsoft ACCESS) intended to be used in PBPK modeling. The database contains physiological parameter values for humans from early childhood through senescence as well as similar data for laboratory animal spec...

  18. The Information Gap.

    ERIC Educational Resources Information Center

    Sharp, Bill; Appleton, Elaine

    1993-01-01

    Addresses the misconception that the ecosystems involving plants and animals in our national parks are thoroughly monitored. Discusses research and other programs designed to inventory species throughout the national parks' and to inform the national parks concerning its ecosystems. (MDH)

  19. Financial Assistance Information

    MedlinePlus


  20. Value of Information References

    DOE Data Explorer

    Morency, Christina

    2014-12-12

    This file contains a list of relevant references on value of information (VOI) in RIS format. VOI provides a quantitative analysis to evaluate the outcome of the combined technologies (seismology, hydrology, geodesy) used to monitor Brady's Geothermal Field.

  1. PREFACE: Quantum information processing

    NASA Astrophysics Data System (ADS)

    Briggs, Andrew; Ferry, David; Stoneham, Marshall

    2006-05-01

    Microelectronics and the classical information technologies transformed the physics of semiconductors. Photonics has given optical materials a new direction. Quantum information technologies, we believe, will have immense impact on condensed matter physics. The novel systems of quantum information processing need to be designed and made. Their behaviours must be manipulated in ways that are intrinsically quantal and generally nanoscale. Both in this special issue and in previous issues (see e.g., Spiller T P and Munro W J 2006 J. Phys.: Condens. Matter 18 V1-10) we see the emergence of new ideas that link the fundamentals of science to the pragmatism of market-led industry. We hope these papers will be followed by many others on quantum information processing in the Journal of Physics: Condensed Matter.

  2. Zika Travel Information

    MedlinePlus

Guidelines for citizens and residents living in areas with ongoing Zika virus transmission, and for travelers visiting friends and family. For the most current information about Zika virus, please visit CDC’s Zika website.

  3. Retrieving Patent Information Online

    ERIC Educational Resources Information Center

    Kaback, Stuart M.

    1978-01-01

    This paper discusses patent information retrieval from online files in terms of types of questions, file contents, coverage, timeliness, and other file variations. CLAIMS, Derwent, WPI, APIPAT and Chemical Abstracts Service are described. (KP)

  4. Information Technology for Education.

    ERIC Educational Resources Information Center

    Snyder, Cathrine E.; And Others

    1990-01-01

    Eight papers address technological, behavioral, and philosophical aspects of the application of information technology to training. Topics include instructional technology centers, intelligent training systems, distance learning, automated task analysis, training system selection, the importance of instructional methods, formative evaluation and…

  5. The atomization of information.

    PubMed Central

    Aveney, B; Conneen, S

    1986-01-01

    Electronic information may be transmitted over communications channels or distributed in electronic packages. New means of distribution lead to new forms of intellectual organization. Some implications for researchers, scholars, publishers, and libraries are discussed. PMID:3511991

  6. Energy information directory 1998

    SciTech Connect

    1998-11-01

    The National Energy Information Center (NEIC), as part of its mission, provides energy information and referral assistance to Federal, State, and local governments, the academic community, business and industrial organizations, and the general public. The two principal functions related to this task are: (1) operating a general access telephone line, and (2) responding to energy-related correspondence addressed to the Energy Information Administration (EIA). The Energy Information Directory was developed to assist the NEIC staff, as well as other Department of Energy (DOE) staff, in directing inquiries to the proper offices within DOE, other Federal agencies, or energy-related trade associations. The Directory lists most Government offices and trade associations that are involved in energy matters.

  7. TES Data and Information

    Atmospheric Science Data Center

    2016-09-07

The Tropospheric Emission Spectrometer (TES) launched into sun-synchronous orbit aboard Aura. TES is a high-resolution imaging infrared Fourier-transform spectrometer that operates in both nadir and limb-sounding modes.

  8. Quantum information and computation

    SciTech Connect

    Bennett, C.H.

    1995-10-01

A new quantum theory of communication and computation is emerging, in which the stuff transmitted or processed is not classical information, but arbitrary superpositions of quantum states. © 1995 American Institute of Physics.

  9. Tuberculosis: General Information

    MedlinePlus

What is TB? Tuberculosis (TB) is a disease caused by germs that are spread from person ...

  10. Information retrieval system

    NASA Technical Reports Server (NTRS)

    Berg, R. F.; Holcomb, J. E.; Kelroy, E. A.; Levine, D. A.; Mee, C., III

    1970-01-01

    Generalized information storage and retrieval system capable of generating and maintaining a file, gathering statistics, sorting output, and generating final reports for output is reviewed. File generation and file maintenance programs written for the system are general purpose routines.

  11. Alternative fuel information sources

    SciTech Connect

    Not Available

    1994-06-01

This short document contains a list of more than 200 US sources of information (name, address, phone number, and sometimes a contact) related to the use of alternative fuels in automobiles and trucks. Electric-powered cars are also included.

  12. Carbon Monoxide Information Center

    MedlinePlus

Carbon monoxide, also known as CO, is called the "Invisible Killer" because it's a colorless, odorless, poisonous gas.

  13. Information and Learning Technology.

    ERIC Educational Resources Information Center

    Clarke, Alan

    1997-01-01

    Assessment of the cost effectiveness of information technologies should pay attention to such benefits as learner motivation, individualization, efficiency, self-assessment, retention, consistent quality, and distributive power. (SK)

  14. Energy Information Online

    ERIC Educational Resources Information Center

    Miller, Betty

    1978-01-01

    The need to search several files to obtain the maximum information on energy is emphasized. Energyline, APILIT, APIPAT, PIE News, TULSA, NTIS, and Chemical Abstracts Condensates files are described. (KP)

  15. NARSTO Data and Information

    Atmospheric Science Data Center

    2016-02-16

NARSTO (formerly North American Research ... effective strategies for local and regional air-pollution management. Data products from local, regional, and international monitoring ...

  16. Global Resources Information System

    NASA Technical Reports Server (NTRS)

    Estes, J. E.; Star, J. L. (Principal Investigator); Cosentino, M. J.; Mann, L. J.

    1984-01-01

The basic design criteria and operating characteristics of a Global Resources Information System (GRIS) are defined. Researchers are compiling background material and aiding JPL personnel in this project definition phase of GRIS. A bibliography of past studies and current work on large scale information systems is compiled. The material in this bibliography will be continuously updated throughout the lifetime of this grant. Project management, systems architecture, and user applications are also discussed.

  17. Information Activities in Australia

    NASA Astrophysics Data System (ADS)

    Hanada, Takeyoshi

The last few years have seen an explosive growth in database and computer networking activities in Australia. At present there are six major information networks in Australia, which carry more than 400 locally produced databases and many others from overseas. AUSINET databases are exemplified. MIDAS (Multi-mode International Data Acquisition System) provides lower cost access to overseas databases than before. The paper also gives a brief outline of various bodies which relate to information and library policy in Australia and regional cooperative activities.

  18. Asymmetric information and economics

    NASA Astrophysics Data System (ADS)

    Frieden, B. Roy; Hawkins, Raymond J.

    2010-01-01

    We present an expression of the economic concept of asymmetric information with which it is possible to derive the dynamical laws of an economy. To illustrate the utility of this approach we show how the assumption of optimal information flow leads to a general class of investment strategies including the well-known Q theory of Tobin. Novel consequences of this formalism include a natural definition of market efficiency and an uncertainty principle relating capital stock and investment flow.

  19. Information systems definition architecture

    SciTech Connect

    Calapristi, A.J.

    1996-06-20

The Tank Waste Remediation System (TWRS) Information Systems Definition architecture evaluated information management (IM) processes in several key organizations. The intent of the study is to identify improvements in TWRS IM processes that will enable better support to the TWRS mission and accommodate changes in the TWRS business environment. The ultimate goals of the study are to reduce IM costs, manage the configuration of TWRS IM elements, and improve IM-related process performance.

  20. Information matters: Beyond OASIS

    SciTech Connect

    Ramesh, V.C.

    1997-03-01

Congestion on the Internet and the overwhelming volume of information are two important issues to consider as energy markets move towards a common real-time information infrastructure that permits trading of both electricity and natural gas. The rush to comply with the Phase I OASIS mandate should not cloud the vision needed to design the Energy Real-time Information System (ERIS) of the near future. Federal Energy Regulatory Commission Order 889 has mandated the establishment of Open Access Same-time Information Systems (OASIS) using the Internet infrastructure. Each Transmission Provider (TP) is required to establish an OASIS node, either alone or in conjunction with other TPs. A Responsible Party (RP), such as an Independent System Operator (ISO), can manage a node on behalf of many TPs. It is anticipated that such OASIS nodes would be regional in nature, with about 10 to 15 nodes nationwide. Transmission Customers (TCs) can access an OASIS node using a Web browser and request firm/non-firm transmission reservations. A TP is required to provide on the OASIS frequently updated information on the Available Transmission Capability (ATC) along certain "paths" in its system. This article points out that the twin problems of Internet congestion and information overload can cause problems for TCs that rely on the "standard" access mode enabled by the S&CP document. These problems will likely become more acute as the electricity industry moves towards Phase II implementation and beyond. The convergence of the information needs of the electricity and natural gas industries will likely result in a large-scale common infrastructure. The Energy Real-time Information System (ERIS) of the near future will require a sophisticated infrastructure based on emerging Internet technologies.