Sample records for covariance libraries

  1. Covariance Applications in Criticality Safety, Light Water Reactor Analysis, and Spent Fuel Characterization

    DOE PAGES

    Williams, M. L.; Wiarda, D.; Ilas, G.; ...

    2014-06-15

    Recently, we processed a new covariance data library based on ENDF/B-VII.1 for the SCALE nuclear analysis code system. The multigroup covariance data are discussed here, along with testing and application results for critical benchmark experiments. Moreover, the cross section covariance library, along with covariances for fission product yields and decay data, is used to compute uncertainties in the decay heat produced by a burned reactor fuel assembly.

  2. AFCI-2.0 Library of Neutron Cross Section Covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman, M.; Oblozinsky, P.

    2011-06-26

    A neutron cross section covariance library has been under development through a BNL-LANL collaborative effort over the last three years. The primary purpose of the library is to provide covariances for the Advanced Fuel Cycle Initiative (AFCI) data adjustment project, which focuses on the needs of fast advanced burner reactors. The covariances refer to central values given in the 2006 release of the U.S. neutron evaluated library ENDF/B-VII. The preliminary version (AFCI-2.0beta) was completed in October 2010 and made available to users for comments. In the final 2.0 release, covariances for a few materials were updated; in particular, new LANL evaluations for 238,240Pu and 241Am were adopted. BNL was responsible for covariances for structural materials and fission products, management of the library and coordination of the work, while LANL was in charge of covariances for light nuclei and for actinides.

  3. Fission yield covariances for JEFF: A Bayesian Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Leray, Olivier; Rochman, Dimitri; Fleming, Michael; Sublet, Jean-Christophe; Koning, Arjan; Vasiliev, Alexander; Ferroukhi, Hakim

    2017-09-01

    The JEFF library does not contain fission yield covariances, but simply best estimates and uncertainties. This situation is not unique as all libraries are facing this deficiency, firstly due to the lack of a defined format. An alternative approach is to provide a set of random fission yields, themselves reflecting covariance information. In this work, these random files are obtained combining the information from the JEFF library (fission yields and uncertainties) and the theoretical knowledge from the GEF code. Examples of this method are presented for the main actinides together with their impacts on simple burn-up and decay heat calculations.

  4. On the treatment of ill-conditioned cases in the Monte Carlo library least-squares approach for inverse radiation analyzers

    NASA Astrophysics Data System (ADS)

    Meric, Ilker; Johansen, Geir A.; Holstad, Marie B.; Mattingly, John; Gardner, Robin P.

    2012-05-01

    Prompt gamma-ray neutron activation analysis (PGNAA) has been and still is one of the major methods of choice for the elemental analysis of various bulk samples. This is mostly due to the fact that PGNAA offers a rapid, non-destructive and on-line means of sample interrogation. The quantitative analysis of the prompt gamma-ray data could, on the other hand, be performed either through the single peak analysis or the so-called Monte Carlo library least-squares (MCLLS) approach, of which the latter has been shown to be more sensitive and more accurate than the former. The MCLLS approach is based on the assumption that the total prompt gamma-ray spectrum of any sample is a linear combination of the contributions from the individual constituents or libraries. This assumption leads to, through the minimization of the chi-square value, a set of linear equations which has to be solved to obtain the library multipliers, a process that involves the inversion of the covariance matrix. The least-squares solution may be extremely uncertain due to the ill-conditioning of the covariance matrix. The covariance matrix will become ill-conditioned whenever, in the subsequent calculations, two or more libraries are highly correlated. The ill-conditioning will also be unavoidable whenever the sample contains trace amounts of certain elements or elements with significantly low thermal neutron capture cross-sections. In this work, a new iterative approach, which can handle the ill-conditioning of the covariance matrix, is proposed and applied to a hydrocarbon multiphase flow problem in which the parameters of interest are the separate amounts of the oil, gas, water and salt phases. The results of the proposed method are also compared with the results obtained through the implementation of a well-known regularization method, the truncated singular value decomposition. Final calculations indicate that the proposed approach would be able to treat ill-conditioned cases appropriately.
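
    The record above compares its new iterative scheme against truncated singular value decomposition (TSVD), a standard remedy for an ill-conditioned least-squares inversion. Below is a minimal, hedged sketch of TSVD regularization for a library least-squares problem; the library matrix, noise levels, and cutoff are illustrative stand-ins, not the paper's data or method.

    ```python
    # Sketch: truncated SVD for an ill-conditioned linear least-squares fit.
    # A is a (channels x libraries) matrix, y a measured spectrum; two nearly
    # collinear columns mimic highly correlated libraries. All values are toys.
    import numpy as np

    def tsvd_solve(A, y, rcond=1e-4):
        """Least-squares solution keeping only singular values > rcond * s_max."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        keep = s > rcond * s[0]               # truncate the tiny singular values
        s_inv = np.where(keep, 1.0 / s, 0.0)  # pseudo-inverse of kept spectrum
        return Vt.T @ (s_inv * (U.T @ y))     # x = V S^+ U^T y

    rng = np.random.default_rng(0)
    base = rng.random(200)
    # Libraries 0 and 1 are almost identical, so the normal equations are
    # nearly singular; TSVD splits their combined weight in minimal-norm form.
    A = np.column_stack([base, base + 1e-6 * rng.random(200), rng.random(200)])
    y = A @ np.array([2.0, 1.0, 0.5]) + 1e-3 * rng.normal(size=200)

    print("condition number:", np.linalg.cond(A))
    print("TSVD multipliers:", tsvd_solve(A, y))
    ```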

  5. Calculations of Nuclear Astrophysics and Californium Fission Neutron Spectrum Averaged Cross Section Uncertainties Using ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0 and Low-fidelity Covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritychenko, B., E-mail: pritychenko@bnl.gov

    Nuclear astrophysics and californium fission neutron spectrum averaged cross sections and their uncertainties for ENDF materials have been calculated. Absolute values were deduced with Maxwellian and Mannhart spectra, while uncertainties are based on ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0 and Low-Fidelity covariances. These quantities are compared with available data, independent benchmarks and the EXFOR library, and are analyzed for a wide range of cases. Recommendations for neutron cross section covariances are given and implications are discussed.
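
    For readers unfamiliar with the "Maxwellian spectrum averaged" quantity in the record above, here is a small, hedged sketch of the average by direct quadrature. The 1/v toy cross section and the 30 keV temperature are assumptions for demonstration only, not values from the paper.

    ```python
    # Sketch: Maxwellian-averaged cross section (MACS),
    #   <sigma> = (2/sqrt(pi)) * Int sigma(E) E exp(-E/kT) dE / (kT)^2.
    import numpy as np

    def macs(E, sigma, kT=30e3):
        """MACS from tabulated sigma(E); E and kT in eV, sigma in barns."""
        f = sigma * E * np.exp(-E / kT)
        integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(E))  # trapezoid rule
        return (2.0 / np.sqrt(np.pi)) * integral / kT**2

    E = np.geomspace(1e-2, 1e6, 20000)      # neutron energy grid (eV)
    sigma = 5.0 * np.sqrt(0.0253 / E)       # toy 1/v capture cross section (b)
    # For a pure 1/v law, MACS = sigma(0.0253 eV) * sqrt(0.0253 eV / kT).
    print(f"MACS at kT = 30 keV: {macs(E, sigma):.5f} b")
    ```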

  6. Monte Carlo Calculation of Thermal Neutron Inelastic Scattering Cross Section Uncertainties by Sampling Perturbed Phonon Spectra

    NASA Astrophysics Data System (ADS)

    Holmes, Jesse Curtis

    Nuclear data libraries provide fundamental reaction information required by nuclear system simulation codes. The inclusion of data covariances in these libraries allows the user to assess uncertainties in system response parameters as a function of uncertainties in the nuclear data. Formats and procedures are currently established for representing covariances for various types of reaction data in ENDF libraries. This covariance data is typically generated utilizing experimental measurements and empirical models, consistent with the method of parent data production. However, ENDF File 7 thermal neutron scattering library data is, by convention, produced theoretically through fundamental scattering physics model calculations. Currently, there is no published covariance data for ENDF File 7 thermal libraries. Furthermore, no accepted methodology exists for quantifying or representing uncertainty information associated with this thermal library data. The quality of thermal neutron inelastic scattering cross section data can be of high importance in reactor analysis and criticality safety applications. These cross sections depend on the material's structure and dynamics. The double-differential scattering law, S(alpha, beta), tabulated in ENDF File 7 libraries contains this information. For crystalline solids, S(alpha, beta) is primarily a function of the material's phonon density of states (DOS). Published ENDF File 7 libraries are commonly produced by calculation and processing codes, such as the LEAPR module of NJOY, which utilize the phonon DOS as the fundamental input for inelastic scattering calculations to directly output an S(alpha, beta) matrix. To determine covariances for the S(alpha, beta) data generated by this process, information about uncertainties in the DOS is required. The phonon DOS may be viewed as a probability density function of atomic vibrational energy states that exist in a material. Probable variation in the shape of this spectrum may be established that depends on uncertainties in the physics models and methodology employed to produce the DOS. Through Monte Carlo sampling of perturbations from the reference phonon spectrum, an S(alpha, beta) covariance matrix may be generated. In this work, density functional theory and lattice dynamics in the harmonic approximation are used to calculate the phonon DOS for hexagonal crystalline graphite. This form of graphite is used as an example material for the purpose of demonstrating procedures for analyzing, calculating and processing thermal neutron inelastic scattering uncertainty information. Several sources of uncertainty in thermal neutron inelastic scattering calculations are examined, including sources which cannot be directly characterized through a description of the phonon DOS uncertainty, and their impacts are evaluated. Covariances for hexagonal crystalline graphite S(alpha, beta) data are quantified by coupling the standard methodology of LEAPR with a Monte Carlo sampling process. The mechanics of efficiently representing and processing this covariance information is also examined. Finally, with appropriate sensitivity information, it is shown that an S(alpha, beta) covariance matrix can be propagated to generate covariance data for integrated cross sections, secondary energy distributions, and coupled energy-angle distributions. This approach enables a complete description of thermal neutron inelastic scattering cross section uncertainties which may be employed to improve the simulation of nuclear systems.
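
    The generic step behind the dissertation above, building a covariance matrix from Monte Carlo samples of a perturbed spectrum, can be sketched in a few lines. The Gaussian, short-correlation-length perturbation model below is an assumption for illustration, not the dissertation's phonon physics.

    ```python
    # Sketch: covariance of a spectrum from Monte Carlo perturbation sampling.
    import numpy as np

    rng = np.random.default_rng(42)
    n_bins, n_samples = 50, 1000
    reference = np.exp(-np.linspace(0.0, 5.0, n_bins))  # stand-in spectrum

    # Correlated multiplicative perturbations (correlation length ~5 bins).
    x = np.arange(n_bins)
    corr = np.exp(-np.abs(x[:, None] - x[None, :]) / 5.0)
    L = np.linalg.cholesky(corr + 1e-10 * np.eye(n_bins))
    samples = reference * (1.0 + 0.05 * (rng.normal(size=(n_samples, n_bins)) @ L.T))

    cov = np.cov(samples, rowvar=False)          # (n_bins x n_bins) covariance
    rel_std = np.sqrt(np.diag(cov)) / reference  # recovered relative std dev
    print("relative std dev, first 5 bins:", np.round(rel_std[:5], 4))
    ```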

  7. Development and Testing of Neutron Cross Section Covariance Data for SCALE 6.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William B. J.; Williams, Mark L.; Wiarda, Dorothea

    2015-01-01

    Neutron cross-section covariance data are essential for many sensitivity/uncertainty and uncertainty quantification assessments performed both within the TSUNAMI suite and more broadly throughout the SCALE code system. The release of ENDF/B-VII.1 included a more complete set of neutron cross-section covariance data: these data form the basis for a new cross-section covariance library to be released in SCALE 6.2. A range of testing is conducted to investigate the properties of these covariance data and ensure that the data are reasonable. These tests include examination of the uncertainty in critical experiment benchmark model keff values due to nuclear data uncertainties, as well as similarity assessments of irradiated pressurized water reactor (PWR) and boiling water reactor (BWR) fuel with suites of critical experiments. The contents of the new covariance library, the testing performed, and the behavior of the new covariance data are described in this paper. The neutron cross-section covariances can be combined with a sensitivity data file generated using the TSUNAMI suite of codes within SCALE to determine the uncertainty in system keff caused by nuclear data uncertainties. The Verified, Archived Library of Inputs and Data (VALID) maintained at Oak Ridge National Laboratory (ORNL) contains over 400 critical experiment benchmark models, and sensitivity data are generated for each of these models. The nuclear data uncertainty in keff is generated for each experiment, and the resulting uncertainties are tabulated and compared to the differences in measured and calculated results. The magnitude of the uncertainty for categories of nuclides (such as actinides, fission products, and structural materials) is calculated for irradiated PWR and BWR fuel to quantify the effect of covariance library changes between the SCALE 6.1 and 6.2 libraries. One of the primary applications of sensitivity/uncertainty methods within SCALE is the assessment of similarities between benchmark experiments and safety applications. This is described by a ck value for each experiment with each application. Several studies have analyzed typical ck values for a range of critical experiments compared with hypothetical irradiated fuel applications. The ck value is sensitive to the cross-section covariance data because the contribution of each nuclide is influenced by its uncertainty; large uncertainties indicate more likely bias sources and are thus given more weight. Changes in ck values resulting from different covariance data can be used to examine and assess underlying data changes. These comparisons are performed for PWR and BWR fuel in storage and transportation systems.
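
    Two quantities named in the record above have compact formulas: the "sandwich rule" uncertainty u^2 = S C S^T and the similarity coefficient ck = S_a C S_e^T / (u_a u_e) between an application and an experiment. The sketch below shows both under assumed random stand-ins for the sensitivity vectors and covariance matrix; it is not SCALE/TSUNAMI output.

    ```python
    # Sketch: sandwich-rule uncertainty and ck similarity coefficient.
    import numpy as np

    def sandwich_uncertainty(S, C):
        """Relative keff uncertainty from sensitivities S and covariances C."""
        return float(np.sqrt(S @ C @ S))

    def ck(S_app, S_exp, C):
        """Covariance-weighted similarity of an application and an experiment."""
        return float(S_app @ C @ S_exp) / (
            sandwich_uncertainty(S_app, C) * sandwich_uncertainty(S_exp, C))

    rng = np.random.default_rng(1)
    n = 30                                   # nuclide-reaction-group parameters
    A = rng.normal(size=(n, n))
    C = A @ A.T / n                          # symmetric positive semi-definite
    S_app = 0.01 * rng.normal(size=n)
    S_exp = S_app + 0.005 * rng.normal(size=n)   # a "similar" experiment

    print(f"uncertainty in keff: {sandwich_uncertainty(S_app, C):.4%}")
    print(f"ck similarity:       {ck(S_app, S_exp, C):.3f}")
    ```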

  8. Random sampling and validation of covariance matrices of resonance parameters

    NASA Astrophysics Data System (ADS)

    Plevnik, Lucijan; Zerovnik, Gašper

    2017-09-01

    Analytically exact methods for random sampling of arbitrarily correlated parameters are presented. Emphasis is given, on the one hand, to possible inconsistencies in the covariance data, concentrating on positive semi-definiteness and on consistent sampling of correlated, inherently positive parameters, and, on the other hand, to optimization of the implementation of the methods themselves. The methods have been applied in the program ENDSAM, written in Fortran, which, starting from a nuclear data library file for a chosen isotope in ENDF-6 format, produces an arbitrary number of new ENDF-6 files containing random samples of the resonance parameters (in accordance with the corresponding covariance matrices) in place of the original values. The source code for the program ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: it reads resonance parameters and their covariance data from the nuclear data library, checks whether the covariance data are consistent, and produces random samples of the resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries. A list of inconsistencies observed in the covariance data of resonance parameters in ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now the work has been limited to resonance parameters; however, the methods presented are general and can in principle be extended to the sampling and validation of any nuclear data.
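
    The two checks the abstract above emphasizes, positive semi-definiteness and consistent sampling of inherently positive parameters, can be sketched as follows. Sampling in log-space is one common way to keep positive parameters positive; this mirrors the idea only and is not ENDSAM's exact algorithm, and the example covariance values are invented.

    ```python
    # Sketch: PSD check plus correlated lognormal sampling of positive parameters.
    import numpy as np

    def is_positive_semidefinite(C, tol=1e-10):
        return bool(np.min(np.linalg.eigvalsh(C)) >= -tol)

    def sample_positive(mean, C_rel, n_samples, rng):
        """Correlated lognormal samples with given means and relative covariance."""
        cov_log = np.log1p(C_rel)                       # exact lognormal relation
        mu_log = np.log(mean) - 0.5 * np.diag(cov_log)  # preserves the means
        L = np.linalg.cholesky(cov_log)
        z = rng.normal(size=(n_samples, len(mean)))
        return np.exp(mu_log + z @ L.T)

    rng = np.random.default_rng(7)
    mean = np.array([1.0, 2.0, 0.5])                    # e.g. resonance parameters
    C_rel = np.array([[0.04, 0.01, 0.00],
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.16]])              # relative covariance matrix
    assert is_positive_semidefinite(C_rel)
    samples = sample_positive(mean, C_rel, 10000, rng)
    print("sample means:", np.round(samples.mean(axis=0), 3))  # ~ input means
    ```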

  9. Implementation of the NMR CHEmical Shift Covariance Analysis (CHESCA): A Chemical Biologist's Approach to Allostery.

    PubMed

    Boulton, Stephen; Selvaratnam, Rajeevan; Ahmed, Rashik; Melacini, Giuseppe

    2018-01-01

    Mapping allosteric sites is emerging as one of the central challenges in physiology, pathology, and pharmacology. Nuclear Magnetic Resonance (NMR) spectroscopy is ideally suited to map allosteric sites, given its ability to sense at atomic resolution the dynamics underlying allostery. Here, we focus specifically on the NMR CHEmical Shift Covariance Analysis (CHESCA), in which allosteric systems are interrogated through a targeted library of perturbations (e.g., mutations and/or analogs of the allosteric effector ligand). The atomic resolution readout for the response to such a perturbation library is provided by NMR chemical shifts. These are then subjected to statistical correlation and covariance analyses, resulting in clusters of allosterically coupled residues that exhibit concerted responses to the common set of perturbations. This chapter provides a description of how each step in the CHESCA is implemented, starting from the selection of the perturbation library and ending with an overview of different clustering options.
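
    The core computation described above, correlating per-residue chemical shifts across a perturbation library and clustering residues with concerted responses, can be sketched with synthetic data. The shift model, cluster sizes, and clustering threshold below are all assumptions; real CHESCA protocols differ in detail.

    ```python
    # Sketch: correlation matrix of residue chemical shifts over a perturbation
    # library, followed by hierarchical clustering of coupled residues.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(3)
    n_residues, n_states = 40, 12        # states = apo, analogs, mutants, ...
    response = rng.normal(size=n_states) # common allosteric response pattern
    shifts = np.empty((n_residues, n_states))
    for i in range(n_residues):
        if i < 15:                       # residues 0-14: an allosteric cluster
            shifts[i] = response + 0.1 * rng.normal(size=n_states)
        else:                            # remaining residues: uncorrelated
            shifts[i] = rng.normal(size=n_states)

    corr = np.corrcoef(shifts)           # residue-by-residue correlations
    dist = 1.0 - np.abs(corr)
    np.fill_diagonal(dist, 0.0)
    Z = linkage(squareform(dist, checks=False), method="average")
    labels = fcluster(Z, t=0.3, criterion="distance")
    print("residues clustered with residue 0:", np.where(labels == labels[0])[0])
    ```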

  10. Statistical uncertainty analysis applied to the DRAGONv4 code lattice calculations and based on JENDL-4 covariance data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hernandez-Solis, A.; Demaziere, C.; Ekberg, C.

    2012-07-01

    In this paper, multi-group microscopic cross-section uncertainty is propagated through the DRAGON (Version 4) lattice code, in order to perform uncertainty analysis on k∞ and 2-group homogenized macroscopic cross-section predictions. A statistical methodology is employed for such purposes, where cross-sections of certain isotopes of various elements belonging to the 172-group DRAGLIB library format are considered as normal random variables. This library is based on JENDL-4 data, because JENDL-4 contains the largest number of isotopic covariance matrices among the major nuclear data libraries. The aim is to propagate multi-group nuclide uncertainty by running the DRAGONv4 code 500 times, and to assess the output uncertainty of a test case corresponding to a 17 x 17 PWR fuel assembly segment without poison. The chosen sampling strategy for the current study is Latin Hypercube Sampling (LHS). The quasi-random LHS allows a much better coverage of the input uncertainties than simple random sampling (SRS) because it densely stratifies across the range of each input probability distribution. Output uncertainty assessment is based on the tolerance limits concept, where the sample formed by the code calculations is inferred to cover 95% of the output population with at least 95% confidence. This analysis is the first attempt to propagate parameter uncertainties of modern multi-group libraries, which are used to feed advanced lattice codes that perform state-of-the-art resonant self-shielding calculations such as DRAGONv4. (authors)
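
    Two ingredients of the record above, Latin hypercube sampling and the 95%/95% tolerance-limit bookkeeping, are easy to sketch. The toy "lattice code" below is a placeholder function standing in for a DRAGONv4 run, and the order-statistic rule shown is the standard non-parametric one; none of this reproduces the paper's actual model.

    ```python
    # Sketch: LHS input sampling and a one-sided 95%/95% tolerance limit.
    import numpy as np
    from scipy import stats

    def upper_tolerance_index(n, content=0.95, confidence=0.95):
        """Ascending order statistic r with P(Binom(n, content) <= r-1) >= conf."""
        r = int(stats.binom.ppf(confidence, n, content)) + 1
        if r > n:
            raise ValueError("too few runs for the requested tolerance limit")
        return r

    rng = np.random.default_rng(11)
    n_runs, n_params = 500, 8
    sampler = stats.qmc.LatinHypercube(d=n_params, seed=rng)
    u = sampler.random(n_runs)             # stratified uniforms in [0, 1)
    z = stats.norm.ppf(u)                  # map to standard normal inputs

    def toy_lattice_code(z_row):           # placeholder, NOT DRAGONv4
        return 1.30 + 0.005 * z_row.sum() / np.sqrt(len(z_row))

    k_inf = np.sort(np.apply_along_axis(toy_lattice_code, 1, z))
    r = upper_tolerance_index(n_runs)
    print(f"mean k-inf = {k_inf.mean():.5f}, std = {k_inf.std(ddof=1):.5f}")
    print(f"95%/95% upper tolerance limit = k_inf[{r - 1}] = {k_inf[r - 1]:.5f}")
    ```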

  11. Improvement of Modeling HTGR Neutron Physics by Uncertainty Analysis with the Use of Cross-Section Covariance Information

    NASA Astrophysics Data System (ADS)

    Boyarinov, V. F.; Grol, A. V.; Fomichenko, P. A.; Ternovykh, M. Yu

    2017-01-01

    This work is aimed at improving HTGR neutron physics design calculations by application of uncertainty analysis with the use of cross-section covariance information. Methodology and codes for the preparation of multigroup libraries of covariance information for individual isotopes from the basic 44-group library of the SCALE-6 code system were developed. A 69-group library of covariance information in a special format for the main isotopes and elements typical of high temperature gas cooled reactors (HTGR) was generated. This library can be used for estimation of uncertainties associated with nuclear data in analyses of HTGR neutron physics with design codes. As an example, calculations of one-group cross-section uncertainties for fission and capture reactions for the main isotopes of the MHTGR-350 benchmark, as well as uncertainties of the multiplication factor (k∞) for the MHTGR-350 fuel compact cell model and fuel block model, were performed. These uncertainties were estimated by the developed technology with the use of the WIMS-D code and modules of the SCALE-6 code system, namely TSUNAMI, KENO-VI and SAMS. The eight most important reactions on isotopes for the MHTGR-350 benchmark were identified, namely: 10B(capt), 238U(n,γ), ν5, 235U(n,γ), 238U(el), natC(el), 235U(fiss)-235U(n,γ), 235U(fiss).

  12. Nuclear Data Uncertainties for Typical LWR Fuel Assemblies and a Simple Reactor Core

    NASA Astrophysics Data System (ADS)

    Rochman, D.; Leray, O.; Hursin, M.; Ferroukhi, H.; Vasiliev, A.; Aures, A.; Bostelmann, F.; Zwermann, W.; Cabellos, O.; Diez, C. J.; Dyrda, J.; Garcia-Herranz, N.; Castro, E.; van der Marck, S.; Sjöstrand, H.; Hernandez, A.; Fleming, M.; Sublet, J.-Ch.; Fiorito, L.

    2017-01-01

    The impact of the covariances in current nuclear data libraries such as ENDF/B-VII.1, JEFF-3.2, JENDL-4.0, SCALE and TENDL on relevant current reactors is presented in this work. The uncertainties due to nuclear data are calculated for existing PWR and BWR fuel assemblies (with burn-up up to 40 GWd/tHM, followed by 10 years of cooling time) and for a simplified PWR full core model (without burn-up) for quantities such as k∞, macroscopic cross sections, pin power or isotope inventory. In this work, the method of propagation of uncertainties is based on random sampling of nuclear data, either from covariance files or directly from basic parameters. Additionally, possible biases on calculated quantities, such as the self-shielding treatment, are investigated. Different calculation schemes are used, based on CASMO, SCALE, DRAGON, MCNP or FISPACT-II, thus simulating real-life assignments for technical-support organizations. The outcome of such a study is a comparison of uncertainties with two consequences. One: although this study is not expected to lead to similar results between the involved calculation schemes, it provides insight into what can happen when calculating uncertainties and gives some perspective on the range of validity of these uncertainties. Two: it allows us to draw a picture of the current state of knowledge, using existing nuclear data library covariances and current methods.

  13. Development of High Throughput Process for Constructing 454 Titanium and Illumina Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deshpande, Shweta; Hack, Christopher; Tang, Eric

    2010-05-28

    We have developed two processes with the Biomek FX robot to construct 454 Titanium and Illumina libraries in order to meet the increasing library demands. All modifications in the library construction steps were made to enable the adaptation of the entire processes to the 96-well plate format. The key modifications include the shearing of DNA with the Covaris E210 and the enzymatic reaction clean-up and fragment size selection with SPRI beads and magnetic plate holders. The construction of 96 Titanium libraries takes about 8 hours from sheared DNA to ssDNA recovery. The processing of 96 Illumina libraries takes less time than the Titanium library process. Although both processes still require manual transfer of plates from the robot to other work stations such as thermocyclers, these robotic processes represent a 12- to 24-fold increase in library capacity compared to the manual processes. To enable the sequencing of many libraries in parallel, we have also developed sets of molecular barcodes for both library types. The requirements for the 454 library barcodes include 10 bases, 40-60% GC, no consecutive identical bases, and no fewer than 3 bases difference between barcodes. We have used 96 of the resulting 270 barcodes to construct libraries and pooled them to test the ability to accurately assign reads to the right samples. When allowing 1 base error in the 10-base barcodes, we could assign 99.6% of the total reads, and 100% of them were uniquely assigned. As for the Illumina barcodes, the requirements include 4 bases, balanced GC, and at least 2 bases difference between barcodes. We have begun to assess the ability to assign reads after pooling different numbers of libraries. We will discuss the progress and the challenges of these scale-up processes.
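
    The barcode rules stated above (10 bases, 40-60% GC, no consecutive identical bases, pairwise difference of at least 3 bases) translate directly into a small generator. The greedy rejection-sampling sketch below is a toy re-implementation of the idea, not the pipeline actually used for the 270 barcodes.

    ```python
    # Sketch: greedy generation of 96 barcodes under the stated 454 rules.
    import random

    BASES = "ACGT"

    def valid(bc):
        gc = sum(b in "GC" for b in bc) / len(bc)
        no_runs = all(a != b for a, b in zip(bc, bc[1:]))
        return 0.4 <= gc <= 0.6 and no_runs

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    random.seed(0)
    barcodes = []
    while len(barcodes) < 96:
        cand = "".join(random.choice(BASES) for _ in range(10))
        # Keep a candidate only if it passes the composition rules and differs
        # from every accepted barcode in at least 3 positions.
        if valid(cand) and all(hamming(cand, bc) >= 3 for bc in barcodes):
            barcodes.append(cand)

    print(len(barcodes), "barcodes; first three:", barcodes[:3])
    ```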

  14. Inventory Uncertainty Quantification using TENDL Covariance Data in Fispact-II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eastwood, J.W.; Morgan, J.G.; Sublet, J.-Ch., E-mail: jean-christophe.sublet@ccfe.ac.uk

    2015-01-15

    The new inventory code Fispact-II provides predictions of inventory, radiological quantities and their uncertainties using nuclear data covariance information. Central to the method is a novel fast pathways search algorithm using directed graphs. The pathways output provides (1) an aid to identifying important reactions, (2) fast estimates of uncertainties, (3) reduced models that retain important nuclides and reactions for use in the code's Monte Carlo sensitivity analysis module. Described are the methods that are being implemented for improving uncertainty predictions, quantification and propagation using the covariance data that the recent nuclear data libraries contain. In the TENDL library, above the upper energy of the resolved resonance range, a Monte Carlo method in which the covariance data come from uncertainties of the nuclear model calculations is used. The nuclear data files are read directly by FISPACT-II without any further intermediate processing. Variance and covariance data are processed and used by FISPACT-II to compute uncertainties in collapsed cross sections, and these are in turn used to predict uncertainties in inventories and all derived radiological data.

  15. CONSTRUCTING A FLEXIBLE LIKELIHOOD FUNCTION FOR SPECTROSCOPIC INFERENCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.

    2015-10-20

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.
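
    The central object in the record above is a Gaussian likelihood whose covariance matrix is structured rather than diagonal. A minimal sketch of such a log-likelihood follows, using a squared-exponential global kernel as one simple choice; Starfish's actual kernels and hyperparameters differ, and the grid and numbers here are invented.

    ```python
    # Sketch: GP-style log-likelihood of a residual spectrum under
    # N(0, C) with C = diag(sigma^2) + a stationary global kernel.
    import numpy as np

    def gp_loglike(resid, wl, sigma=0.01, amp=0.005, ell=0.5):
        dw = wl[:, None] - wl[None, :]
        C = sigma**2 * np.eye(len(wl)) + amp**2 * np.exp(-0.5 * (dw / ell) ** 2)
        L = np.linalg.cholesky(C)                       # C = L L^T
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, resid))  # C^-1 resid
        logdet = 2.0 * np.log(np.diag(L)).sum()
        return -0.5 * (resid @ alpha + logdet + len(wl) * np.log(2.0 * np.pi))

    wl = np.linspace(5100.0, 5105.0, 400)       # wavelength grid (Angstrom)
    rng = np.random.default_rng(5)
    resid = 0.01 * rng.normal(size=wl.size)     # data minus model spectrum
    print("ln-likelihood:", gp_loglike(resid, wl))
    ```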

  16. Uncertainty in the delayed neutron fraction in fuel assembly depletion calculations

    NASA Astrophysics Data System (ADS)

    Aures, Alexander; Bostelmann, Friederike; Kodeli, Ivan A.; Velkov, Kiril; Zwermann, Winfried

    2017-09-01

    This study presents uncertainty and sensitivity analyses of the delayed neutron fraction of light water reactor and sodium-cooled fast reactor fuel assemblies. For these analyses, the sampling-based XSUSA methodology is used to propagate cross section uncertainties in neutron transport and depletion calculations. Cross section data is varied according to the SCALE 6.1 covariance library. Since this library includes nu-bar uncertainties only for the total values, it has been supplemented by delayed nu-bar uncertainties from the covariance data of the JENDL-4.0 nuclear data library. The neutron transport and depletion calculations are performed with the TRITON/NEWT sequence of the SCALE 6.1 package. The evolution of the delayed neutron fraction uncertainty over burn-up is analysed without and with the consideration of delayed nu-bar uncertainties. Moreover, the main contributors to the result uncertainty are determined. In all cases, the delayed nu-bar uncertainties increase the delayed neutron fraction uncertainty. Depending on the fuel composition, the delayed nu-bar values of uranium and plutonium in fact give the main contributions to the delayed neutron fraction uncertainty for the LWR fuel assemblies. For the SFR case, the uncertainty of the scattering cross section of U-238 is the main contributor.

  17. Use and Impact of Covariance Data in the Japanese Latest Adjusted Library ADJ2010 Based on JENDL-4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yokoyama, K., E-mail: yokoyama.kenji09@jaea.go.jp; Ishikawa, M.

    2015-01-15

    The current status of covariance applications to fast reactor analysis and design in Japan is summarized. In Japan, the covariance data are mainly used for three purposes: (1) to quantify the uncertainty of nuclear core parameters, (2) to identify important nuclides, reactions and energy ranges which dominate the uncertainty of core parameters, and (3) to improve the accuracy of core design values by adopting integral data such as critical experiments and power reactor operation data. For the last purpose, cross section adjustment based on the Bayesian theorem is used. After the release of JENDL-4.0, a development project for the new adjusted group-constant set ADJ2010 was started in 2010 and completed in 2013. In the present paper, the final results of ADJ2010 are briefly summarized. In addition, the adjustment results of ADJ2010 are discussed from the viewpoint of the use and impact of nuclear data covariances, focusing on 239Pu capture cross section alterations. For this purpose three kinds of indices, called “degree of mobility,” “adjustment motive force,” and “adjustment potential,” are proposed.

  18. A Dynamic Time Warping based covariance function for Gaussian Processes signature identification

    NASA Astrophysics Data System (ADS)

    Silversides, Katherine L.; Melkumyan, Arman

    2016-11-01

    Modelling stratiform deposits requires a detailed knowledge of the stratigraphic boundaries. In Banded Iron Formation (BIF) hosted ores of the Hamersley Group in Western Australia these boundaries are often identified using marker shales. Both Gaussian Processes (GP) and Dynamic Time Warping (DTW) have been previously proposed as methods to automatically identify marker shales in natural gamma logs. However, each method has different advantages and disadvantages. We propose a DTW based covariance function for the GP that combines the flexibility of the DTW with the probabilistic framework of the GP. The three methods are tested and compared on their ability to identify two natural gamma signatures from a Marra Mamba type iron ore deposit. These tests show that while all three methods can identify boundaries, the GP with the DTW covariance function combines and balances the strengths and weaknesses of the individual methods. This method identifies more positive signatures than the GP with the standard covariance function, and has a higher accuracy for identified signatures than the DTW. The combined method can handle larger variations in the signature without requiring multiple libraries, has a probabilistic output and does not require manual cut-off selections.
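
    The combination described above, a dynamic-time-warping distance used inside a Gaussian-process-style covariance, can be sketched as k(x, y) = s^2 exp(-DTW(x, y) / l). Note that such kernels are not guaranteed positive semi-definite for every dataset, which is one reason the paper's construction is non-trivial; the code below shows only the basic idea on toy signals, not the authors' formulation.

    ```python
    # Sketch: DTW distance and a DTW-based covariance matrix for short signals.
    import numpy as np

    def dtw(a, b):
        """Classic O(len(a) * len(b)) dynamic time warping distance."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def dtw_kernel(signals, s=1.0, ell=5.0):
        n = len(signals)
        K = np.empty((n, n))
        for i in range(n):
            for j in range(i, n):
                K[i, j] = K[j, i] = s**2 * np.exp(-dtw(signals[i], signals[j]) / ell)
        return K

    t = np.linspace(0.0, 1.0, 60)
    logs = [np.sin(6 * t), np.sin(6 * t**1.1), np.cos(6 * t)]  # toy gamma logs
    print(np.round(dtw_kernel(logs), 3))
    ```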

  19. Sensitivity analysis of Monju using ERANOS with JENDL-4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.

    2012-07-01

    This paper deals with sensitivity analysis of the Monju reactor using JENDL-4.0 nuclear data. In 2010 the Japan Atomic Energy Agency - JAEA - released a new set of nuclear data: JENDL-4.0. This new evaluation is expected to contain improved data on actinides and covariance matrices. Covariance matrices are a key point in the quantification of uncertainties due to basic nuclear data. For sensitivity analysis, the well-established ERANOS [1] code was chosen because of its integrated modules that allow users to perform a sensitivity analysis of complex reactor geometries. A JENDL-4.0 cross-section library is not available for ERANOS. Therefore a cross-section library had to be made from the original nuclear data set, available as ENDF formatted files. This is achieved by using the following codes: NJOY, CALENDF, MERGE and GECCO, in order to create a library for the ECCO cell code (part of ERANOS). To verify the accuracy of the new ECCO library, two benchmark experiments have been analyzed: the MZA and MZB cores of the MOZART program measured at the ZEBRA facility in the UK. These were chosen due to their similarity to the Monju core. Using the JENDL-4.0 ECCO library we have analyzed the criticality of Monju during the restart in 2010. We have obtained good agreement with the measured criticality. Perturbation calculations have been performed between JENDL-3.3 and JENDL-4.0 based models. The isotopes 239Pu, 238U, 241Am and 241Pu account for a major part of the observed differences. (authors)

  20. A simple method for semi-random DNA amplicon fragmentation using the methylation-dependent restriction enzyme MspJI.

    PubMed

    Shinozuka, Hiroshi; Cogan, Noel O I; Shinozuka, Maiko; Marshall, Alexis; Kay, Pippa; Lin, Yi-Han; Spangenberg, German C; Forster, John W

    2015-04-11

    Fragmentation at random nucleotide locations is an essential process for the preparation of DNA libraries to be used on massively parallel short-read DNA sequencing platforms. Although instruments for physical shearing, such as the Covaris S2 focused-ultrasonicator system, and products for enzymatic shearing, such as the Nextera technology and the NEBNext dsDNA Fragmentase kit, are commercially available, a simple and inexpensive method is desirable for high-throughput sequencing library preparation. MspJI is a recently characterised restriction enzyme which recognises the sequence motif CNNR (where R = G or A) when the first base is modified to 5-methylcytosine or 5-hydroxymethylcytosine. A semi-random enzymatic DNA amplicon fragmentation method was developed based on the unique cleavage properties of MspJI. In this method, random incorporation of 5-methyl-2'-deoxycytidine-5'-triphosphate is achieved through DNA amplification with DNA polymerase, followed by DNA digestion with MspJI. Due to the recognition sequence of the enzyme, DNA amplicons are fragmented in a relatively sequence-independent manner. The size range of the resulting fragments could be controlled through optimisation of the 5-methyl-2'-deoxycytidine-5'-triphosphate concentration in the reaction mixture. A library suitable for sequencing on the Illumina MiSeq platform was prepared and processed using the proposed method. Alignment of the generated short reads to a reference sequence demonstrated a relatively high level of random fragmentation. The proposed method may be performed with standard laboratory equipment. Although the uniformity of coverage was slightly inferior to that of the Covaris physical shearing procedure, due to efficiencies of cost and labour the method may be more suitable than existing approaches for implementation in large-scale sequencing activities, such as bacterial artificial chromosome (BAC)-based genome sequence assembly, pan-genomic studies and locus-targeted genotyping-by-sequencing.
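
    Why the 5-methyl-dCTP concentration controls fragment size in the method above: if only a fraction f of incorporated cytosines are methylated and only methylated CNNR sites are cut, the cut density scales with f, so the mean fragment length grows roughly as 1/(f * site density). The simulation below is a pure toy model (random sequence, independent methylation per site), not the paper's experimental calibration.

    ```python
    # Sketch: mean fragment size vs. methylated-cytosine fraction for cuts at
    # methylated CNNR sites (C at i, purine at i+3).
    import random

    random.seed(1)
    genome = "".join(random.choice("ACGT") for _ in range(200_000))

    # Candidate MspJI sites: C at position i with R (G or A) at position i + 3.
    sites = [i for i in range(len(genome) - 3)
             if genome[i] == "C" and genome[i + 3] in "GA"]

    for f in (0.02, 0.1, 0.5):
        cuts = sorted(i for i in sites if random.random() < f)
        frags = [b - a for a, b in zip(cuts, cuts[1:])]
        mean = sum(frags) / len(frags)
        print(f"methylated fraction {f:>4}: mean fragment ~ {mean:7.1f} bp")
    ```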

  1. Validation of tungsten cross sections in the neutron energy region up to 100 keV

    NASA Astrophysics Data System (ADS)

    Pigni, Marco T.; Žerovnik, Gašper; Leal, Luiz C.; Trkov, Andrej

    2017-09-01

    Following a series of recent cross section evaluations of tungsten isotopes performed at Oak Ridge National Laboratory (ORNL), this paper presents the validation work carried out to test the performance of the evaluated cross sections against lead-slowing-down (LSD) benchmarks conducted in Grenoble. ORNL completed the resonance parameter evaluation of four tungsten isotopes - 182,183,184,186W - in August 2014 and submitted it as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The evaluations were performed with support from the US Nuclear Criticality Safety Program in an effort to provide improved tungsten cross section and covariance data for criticality safety sensitivity analyses. The validation analysis based on the LSD benchmarks showed improved agreement with the experimental response when the ORNL tungsten evaluations were included in the ENDF/B-VII.1 library. Comparisons with the results obtained with the JEFF-3.2 nuclear data library are also discussed.

  2. Application of the JENDL-4.0 nuclear data set for uncertainty analysis of the prototype FBR Monju

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.

    2012-07-01

    This paper deals with uncertainty analysis of the Monju reactor using JENDL-4.0 and the ERANOS code [1]. In 2010 the Japan Atomic Energy Agency - JAEA - released the JENDL-4.0 nuclear data set. This new evaluation contains improved values of cross-sections and emphasizes accurate covariance matrices. Also in 2010, JAEA restarted the sodium-cooled fast reactor prototype Monju after about 15 years of shutdown. The long shutdown time resulted in a build-up of 241Am by natural decay from the initially loaded Pu. As well as improved covariance matrices, JENDL-4.0 is announced to contain improved data for minor actinides [2]. The choice of the Monju reactor as an application of the new evaluation seems then even more relevant. The uncertainty analysis requires the determination of sensitivity coefficients. The well-established ERANOS code was chosen because of its integrated modules that allow users to perform sensitivity and uncertainty analysis. A JENDL-4.0 cross-section library is not available for ERANOS. Therefore a cross-section library had to be made from the original ENDF files for the ECCO cell code (part of ERANOS). For confirmation of the newly made library, calculations of a benchmark core were performed. These calculations used the MZA and MZB benchmarks and showed results consistent with other libraries. Calculations for the Monju reactor were performed using hexagonal 3D geometry and PN transport theory. However, the ERANOS sensitivity modules cannot use the resulting fluxes, as these modules require finite-difference based fluxes, obtained from RZ SN-transport or 3D diffusion calculations. The corresponding geometrical models have been made and the results verified against Monju restart experimental data [4]. Uncertainty analysis was performed using the RZ model. The JENDL-4.0 uncertainty analysis showed a significant reduction of the uncertainty related to the fission cross-section of Pu, along with an increase of the uncertainty related to the capture cross-section of 238U, compared with the previous JENDL-3.3 version. Covariance data recently added in JENDL-4.0 for 241Am appear to have a non-negligible contribution. (authors)

  3. Uncertainty quantification in (α,n) neutron source calculations for an oxide matrix

    DOE PAGES

    Pigni, M. T.; Croft, S.; Gauld, I. C.

    2016-04-25

    Here we present a methodology to propagate nuclear data covariance information in neutron source calculations from (α,n) reactions. The approach is applied to estimate the uncertainty in the neutron generation rates for uranium oxide fuel types due to uncertainties in (1) 17,18O(α,n) reaction cross sections and (2) uranium and oxygen stopping power cross sections. The procedure to generate reaction cross section covariance information is based on the Bayesian fitting method implemented in the R-matrix SAMMY code. The evaluation methodology uses the Reich-Moore approximation to fit the 17,18O(α,n) reaction cross sections in order to derive a set of resonance parameters and a related covariance matrix that is then used to calculate the energy-dependent cross section covariance matrix. The stopping power cross sections and related covariance information for uranium and oxygen were obtained by fitting stopping power data in the energy range of 1 keV up to 12 MeV. Cross section perturbation factors based on the covariance information for the evaluated 17,18O(α,n) reaction cross sections, as well as the uranium and oxygen stopping power cross sections, were used to generate a varied set of nuclear data libraries used in SOURCES4C and ORIGEN for inventory and source term calculations. The set of randomly perturbed (α,n) source responses provides the mean values and standard deviations of the calculated responses, reflecting the uncertainties in the nuclear data used in the calculations. Lastly, the results and related uncertainties are compared with experimental thick-target (α,n) yields for uranium oxide.
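
    The thick-target integral that links the two ingredients above is Y(E0) = Int_0^E0 sigma(E) / eps(E) dE, the neutrons per alpha, where sigma is the (α,n) cross section and eps the stopping cross section per atom. The sketch below perturbs both curves within assumed uncertainties and collects the yield spread; both curve shapes and the fully correlated perturbation model are made-up illustrations, not the paper's evaluated data.

    ```python
    # Sketch: thick-target (alpha,n) yield with Monte Carlo perturbation of the
    # cross section and stopping power. All numerical values are toys.
    import numpy as np

    rng = np.random.default_rng(2)
    E = np.linspace(0.5, 5.5, 500)                    # alpha energy (MeV)
    sigma = 1e-3 * np.clip(E - 1.0, 0.0, None) ** 2   # toy effective sigma (barn)
    eps = 1e-19 / np.sqrt(E)                          # toy stopping power (MeV cm^2)

    def yield_per_alpha(sig_b, stop):
        integrand = sig_b * 1e-24 / stop              # barn -> cm^2
        return np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(E))

    # Crude propagation: scale each curve by a single correlated factor drawn
    # from its assumed relative uncertainty (5% on sigma, 2% on stopping power).
    samples = np.array([
        yield_per_alpha(sigma * (1 + 0.05 * rng.normal()),
                        eps * (1 + 0.02 * rng.normal()))
        for _ in range(1000)])
    print(f"neutrons per alpha: {samples.mean():.3e} +/- {samples.std():.3e}")
    ```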

  4. Determining the nuclear data uncertainty on MONK10 and WIMS10 criticality calculations

    NASA Astrophysics Data System (ADS)

    Ware, Tim; Dobson, Geoff; Hanlon, David; Hiles, Richard; Mason, Robert; Perry, Ray

    2017-09-01

    The ANSWERS Software Service is developing a number of techniques to better understand and quantify uncertainty on calculations of the neutron multiplication factor, k-effective, in nuclear fuel and other systems containing fissile material. The uncertainty on the calculated k-effective arises from a number of sources, including nuclear data uncertainties, manufacturing tolerances, modelling approximations and, for Monte Carlo simulation, stochastic uncertainty. For determining the uncertainties due to nuclear data, a set of application libraries have been generated for use with the MONK10 Monte Carlo and the WIMS10 deterministic criticality and reactor physics codes. This paper overviews the generation of these nuclear data libraries by Latin hypercube sampling of JEFF-3.1.2 evaluated data based upon a library of covariance data taken from JEFF, ENDF/B, JENDL and TENDL evaluations. Criticality calculations have been performed with MONK10 and WIMS10 using these sampled libraries for a number of benchmark models of fissile systems. Results are presented which show the uncertainty on k-effective for these systems arising from the uncertainty on the input nuclear data.

  5. Covariance generation and uncertainty propagation for thermal and fast neutron induced fission yields

    NASA Astrophysics Data System (ADS)

    Terranova, Nicholas; Serot, Olivier; Archier, Pascal; De Saint Jean, Cyrille; Sumini, Marco

    2017-09-01

    Fission product yields (FY) are fundamental nuclear data for several applications, including decay heat, shielding, dosimetry and burn-up calculations. To be safe and sustainable, modern and future nuclear systems require accurate knowledge of reactor parameters, with reduced margins of uncertainty. Present nuclear data libraries for FY do not provide consistent and complete uncertainty information, which is limited, in many cases, to variances only. In the present work we propose a methodology to evaluate covariance matrices for thermal and fast neutron induced fission yields. The semi-empirical models adopted to evaluate the JEFF-3.1.1 FY library have been used in the Generalized Least Squares Method available in CONRAD (COde for Nuclear Reaction Analysis and Data assimilation) to generate covariance matrices for several fissioning systems, such as the thermal fission of U235, Pu239 and Pu241 and the fast fission of U238, Pu239 and Pu240. The impact of such covariances on nuclear applications has been estimated using deterministic and Monte Carlo uncertainty propagation techniques. We studied the effects on decay heat and reactivity loss uncertainty estimation for simplified test case geometries, such as PWR and SFR pin-cells. The impact on existing nuclear reactors, such as the Jules Horowitz Reactor under construction at CEA-Cadarache, has also been considered.
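
    The generalized least squares update at the heart of the approach above has a compact closed form: given prior x with covariance C, measurements y with covariance V, and sensitivities G = dy/dx, the posterior is x' = x + K(y - Gx) and C' = C - KGC with gain K = C G^T (G C G^T + V)^(-1). The two-yield example below is invented to show how a constraint (a measured sum) induces correlations; it is not CONRAD's fission-yield model.

    ```python
    # Sketch: generalized least squares (Bayesian) update generating covariance.
    import numpy as np

    def gls_update(x, C, y, V, G):
        K = C @ G.T @ np.linalg.inv(G @ C @ G.T + V)   # gain matrix
        return x + K @ (y - G @ x), C - K @ G @ C

    x = np.array([0.030, 0.061])                   # prior yields (toy values)
    C = np.diag([0.003, 0.006]) ** 2               # prior covariance (diagonal)
    G = np.array([[1.0, 1.0]])                     # we measure the sum of both
    y = np.array([0.089])
    V = np.array([[0.002 ** 2]])

    x_post, C_post = gls_update(x, C, y, V, G)
    rho = C_post[0, 1] / np.sqrt(C_post[0, 0] * C_post[1, 1])
    print("posterior yields:", np.round(x_post, 5))
    print("posterior correlation:", round(rho, 3))  # constraint -> anticorrelation
    ```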

  6. Visualizing DOM super-spectrum covariance in van Krevelen space

    NASA Astrophysics Data System (ADS)

    Fatland, D. R.; Kalawe, J.; Stubbins, A.; Spencer, R. G.; Sleighter, R. L.; Abdulla, H. A.; Dittmar, T.

    2011-12-01

    We investigate the fate of terrigenous dissolved organic matter (DOM) exported to the coastal marine environment. Many methods (fluorescence, FT-ICR-MS, NMR, 13C, lignin, etc.) help characterize this DOM. We define a 'super spectrum' as an amalgamation of analyses into a data stack, and we search for physically significant patterns therein, beginning with covariance across 31 samples from six circum-Arctic rivers: the Ob, Kolyma, Mackenzie, Yukon, Lena, and Yenisey, each sampled five times throughout the year. A van Krevelen diagram is convenient for viewing distributions of molecules provided by Fourier Transform Ion Cyclotron Resonance Mass Spectrometry (FT-ICR-MS). We augment this distribution space in the vertical dimension, for example to show peak height, molecular mass, principal component weighting or covariance. We use Worldwide Telescope, a virtual globe with strong data support from Microsoft Research, to explore covariance results along 3+ dimensions (adding brightness, color and a parameter slider). The results show interesting covariance, e.g. between molecules and PARAFAC peaks, a step towards fluorophore and cohort identification in the terrigenous DOM spectrum. Given the geoscience explosion in data volume and data complexity, we feel these results should survive beyond the end point of a journal article. We are building a cloud-based Library on the Microsoft Azure platform to support this and subsequent analyses, to enable data and methods to carry over and benefit other research groups and objectives.

  7. ILIAD Testing; and a Kalman Filter for 3-D Pose Estimation

    NASA Technical Reports Server (NTRS)

    Richardson, A. O.

    1996-01-01

    This report presents the results of a two-part project. The first part presents results of performance assessment tests on an Internet Library Information Assembly Data Base (ILIAD). It was found that ILIAD performed best when queries were short (one to three keywords) and were made up of rare, unambiguous words. In such cases as many as 64% of the typically 25 returned documents were found to be relevant. It was also found that a query format that was not so rigid with respect to spelling errors and punctuation marks would be more user-friendly. The second part of the report shows the design of a Kalman Filter for estimating motion parameters of a three-dimensional object from sequences of noisy data derived from two-dimensional pictures. Given six measured deviation values representing X, Y, Z, pitch, yaw, and roll, twelve parameters were estimated, comprising the six deviations and their time rates of change. Values for the state transition matrix, the observation matrix, the system noise covariance matrix, and the observation noise covariance matrix were determined. A simple way of initializing the error covariance matrix was pointed out.
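
    A minimal sketch of the 12-state filter described above: six pose deviations plus their rates under constant-velocity dynamics, with only the six deviations measured. The specific matrix values, noise levels, and simulated measurements below are illustrative stand-ins, not the tuned covariances from the report.

    ```python
    # Sketch: Kalman filter for a 12-state pose model (6 deviations + 6 rates).
    import numpy as np

    dt = 0.1
    F = np.eye(12)
    F[:6, 6:] = dt * np.eye(6)                    # position += rate * dt
    H = np.hstack([np.eye(6), np.zeros((6, 6))])  # observe the six deviations
    Q = 1e-4 * np.eye(12)                         # system noise covariance
    R = 1e-2 * np.eye(6)                          # observation noise covariance
    x = np.zeros(12)
    P = np.eye(12)                                # simple initial error covariance

    rng = np.random.default_rng(8)
    for _ in range(50):                           # filter a stream of noisy poses
        z = rng.normal(scale=0.1, size=6)         # stand-in measured deviations
        x, P = F @ x, F @ P @ F.T + Q             # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        x = x + K @ (z - H @ x)                   # update state estimate
        P = (np.eye(12) - K @ H) @ P              # update error covariance

    print("estimated deviations:", np.round(x[:6], 3))
    ```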

  8. Release of the ENDF/B-VII.1 Evaluated Nuclear Data File

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, David

    2012-06-30

    The Cross Section Evaluation Working Group (CSEWG) released the ENDF/B-VII.1 library on December 22, 2011. The ENDF/B-VII.1 library is CSEWG's latest recommended evaluated nuclear data file for use in nuclear science and technology applications, and incorporates advances made in the five years since the release of ENDF/B-VII.0, including: many new evaluations in the neutron sublibrary (423 in all, over 190 of which contain covariances), new fission product yields and a greatly improved decay data sublibrary. This summary barely touches on the five years' worth of advances present in the ENDF/B-VII.1 library. We expect that these changes will lead to improved integral performance in reactors and other applications. Furthermore, the expansion of covariance data in this release will allow for better uncertainty quantification, reducing design margins and costs. The ENDF library is an ongoing and evolving effort. Currently, the ENDF data community is embarking on several parallel efforts to improve library management: (1) the adoption of a continuous integration system to provide evaluators 'instant' feedback on the quality of their evaluations and to provide data users with working 'beta' quality libraries in between major releases; (2) the transition to a new hierarchical data format, the Generalized Nuclear Data (GND) format, which we expect to enable new kinds of evaluated data which cannot be accommodated in the legacy ENDF format; and (3) the development of data assimilation and uncertainty propagation techniques to enable the consistent use of integral experimental data in the evaluation process.

  9. Ligation Bias in Illumina Next-Generation DNA Libraries: Implications for Sequencing Ancient Genomes

    PubMed Central

    Seguin-Orlando, Andaine; Schubert, Mikkel; Clary, Joel; Stagegaard, Julia; Alberdi, Maria T.; Prado, José Luis; Prieto, Alfredo; Willerslev, Eske; Orlando, Ludovic

    2013-01-01

    Ancient DNA extracts consist of a mixture of endogenous molecules and contaminant DNA templates, often originating from environmental microbes. These two populations of templates exhibit different chemical characteristics, with the former showing depurination and cytosine deamination by-products, resulting from post-mortem DNA damage. Such chemical modifications can interfere with the molecular tools used for building second-generation DNA libraries, and limit our ability to fully characterize the true complexity of ancient DNA extracts. In this study, we first use fresh DNA extracts to demonstrate that library preparation based on adapter ligation at AT-overhangs are biased against DNA templates starting with thymine residues, contrarily to blunt-end adapter ligation. We observe the same bias on fresh DNA extracts sheared on Bioruptor, Covaris and nebulizers. This contradicts previous reports suggesting that this bias could originate from the methods used for shearing DNA. This also suggests that AT-overhang adapter ligation efficiency is affected in a sequence-dependent manner and results in an uneven representation of different genomic contexts. We then show how this bias could affect the base composition of ancient DNA libraries prepared following AT-overhang ligation, mainly by limiting the ability to ligate DNA templates starting with thymines and therefore deaminated cytosines. This results in particular nucleotide misincorporation damage patterns, deviating from the signature generally expected for authenticating ancient sequence data. Consequently, we show that models adequate for estimating post-mortem DNA damage levels must be robust to the molecular tools used for building ancient DNA libraries. PMID:24205269

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordborg, C.

    A new improved version of the OECD Nuclear Energy Agency (NEA) co-ordinated Joint Evaluated Fission and Fusion (JEFF) data library, JEFF-3.1, was released in May 2005. It comprises a general purpose library and the following five special purpose libraries: activation; thermal scattering law; radioactive decay; fission yield; and proton library. The objective of the previous version of the library (JEFF-2.2) was to achieve improved performance for existing reactors and fuel cycles. In addition to this objective, the JEFF-3.1 library aims to provide users with data for a wider range of applications. These include innovative reactor concepts, transmutation of radioactive waste, fusion, and various other energy and non-energy related industrial applications. Initial benchmark testing has confirmed the expected very good performance of the JEFF-3.1 library. Additional benchmarking of the libraries is underway, both for the general purpose and for the special purpose libraries. A new three-year mandate to continue developing the JEFF library was recently granted by the NEA. For the next version of the library, JEFF-3.2, it is foreseen to put more effort into fission product and minor actinide evaluations, as well as the inclusion of more covariance data. (authors)

  11. ENDF/B-VII.0: Next Generation Evaluated Nuclear Data Library for Nuclear Science and Technology

    NASA Astrophysics Data System (ADS)

    Chadwick, M. B.; Obložinský, P.; Herman, M.; Greene, N. M.; McKnight, R. D.; Smith, D. L.; Young, P. G.; MacFarlane, R. E.; Hale, G. M.; Frankle, S. C.; Kahler, A. C.; Kawano, T.; Little, R. C.; Madland, D. G.; Moller, P.; Mosteller, R. D.; Page, P. R.; Talou, P.; Trellue, H.; White, M. C.; Wilson, W. B.; Arcilla, R.; Dunford, C. L.; Mughabghab, S. F.; Pritychenko, B.; Rochman, D.; Sonzogni, A. A.; Lubitz, C. R.; Trumbull, T. H.; Weinman, J. P.; Brown, D. A.; Cullen, D. E.; Heinrichs, D. P.; McNabb, D. P.; Derrien, H.; Dunn, M. E.; Larson, N. M.; Leal, L. C.; Carlson, A. D.; Block, R. C.; Briggs, J. B.; Cheng, E. T.; Huria, H. C.; Zerkle, M. L.; Kozier, K. S.; Courcelle, A.; Pronyaev, V.; van der Marck, S. C.

    2006-12-01

    We describe the next generation general purpose Evaluated Nuclear Data File, ENDF/B-VII.0, of recommended nuclear data for advanced nuclear science and technology applications. The library, released by the U.S. Cross Section Evaluation Working Group (CSEWG) in December 2006, contains data primarily for reactions with incident neutrons, protons, and photons on almost 400 isotopes, based on experimental data and theory predictions. The principal advances over the previous ENDF/B-VI library are the following: (1) New cross sections for U, Pu, Th, Np and Am actinide isotopes, with improved performance in integral validation criticality and neutron transmission benchmark tests; (2) More precise standard cross sections for neutron reactions on H, 6Li, 10B, Au and for 235,238U fission, developed by a collaboration with the IAEA and the OECD/NEA Working Party on Evaluation Cooperation (WPEC); (3) Improved thermal neutron scattering; (4) An extensive set of neutron cross sections on fission products developed through a WPEC collaboration; (5) A large suite of photonuclear reactions; (6) Extension of many neutron- and proton-induced evaluations up to 150 MeV; (7) Many new light nucleus neutron and proton reactions; (8) Post-fission beta-delayed photon decay spectra; (9) New radioactive decay data; (10) New methods for uncertainties and covariances, together with covariance evaluations for some sample cases; and (11) New actinide fission energy deposition. The paper provides an overview of this library, consisting of 14 sublibraries in the same ENDF-6 format as the earlier ENDF/B-VI library. We describe each of the 14 sublibraries, focusing on neutron reactions. Extensive validation, using radiation transport codes to simulate measured critical assemblies, shows major improvements: (a) The long-standing underprediction of low enriched uranium thermal assemblies is removed; (b) The 238U and 208Pb reflector biases in fast systems are largely removed; (c) ENDF/B-VI.8 good agreement for simulations of thermal high-enriched uranium assemblies is preserved; (d) The underprediction of fast criticality of 233,235U and 239Pu assemblies is removed; and (e) The intermediate spectrum critical assemblies are predicted more accurately. We anticipate that the new library will play an important role in nuclear technology applications, including transport simulations supporting national security, nonproliferation, advanced reactor and fuel cycle concepts, criticality safety, fusion, medicine, space applications, nuclear astrophysics, and nuclear physics facility design. The ENDF/B-VII.0 library is archived at the National Nuclear Data Center, BNL, and can be retrieved from www.nndc.bnl.gov.

  12. Aerospace Toolbox--a flight vehicle design, analysis, simulation, and software development environment II: an in-depth overview

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.

    2002-07-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provided a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed included its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics that were covered in Part I included flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this series will cover a more in-depth look at the analysis and simulation capability and provide an update on the toolbox enhancements. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  13. ENDF/B-VII.0: Next Generation Evaluated Nuclear Data Library for Nuclear Science and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chadwick, M B; Oblozinsky, P; Herman, M

    2006-10-02

    We describe the next generation general purpose Evaluated Nuclear Data File, ENDF/B-VII.0, of recommended nuclear data for advanced nuclear science and technology applications. The library, released by the U.S. Cross Section Evaluation Working Group (CSEWG) in December 2006, contains data primarily for reactions with incident neutrons, protons, and photons on almost 400 isotopes. The new evaluations are based on both experimental data and nuclear reaction theory predictions. The principal advances over the previous ENDF/B-VI library are the following: (1) New cross sections for U, Pu, Th, Np and Am actinide isotopes, with improved performance in integral validation criticality and neutron transmission benchmark tests; (2) More precise standard cross sections for neutron reactions on H, 6Li, 10B, Au and for 235,238U fission, developed by a collaboration with the IAEA and the OECD/NEA Working Party on Evaluation Cooperation (WPEC); (3) Improved thermal neutron scattering; (4) An extensive set of neutron cross sections on fission products developed through a WPEC collaboration; (5) A large suite of photonuclear reactions; (6) Extension of many neutron- and proton-induced reactions up to an energy of 150 MeV; (7) Many new light nucleus neutron and proton reactions; (8) Post-fission beta-delayed photon decay spectra; (9) New radioactive decay data; and (10) New methods developed to provide uncertainties and covariances, together with covariance evaluations for some sample cases. The paper provides an overview of this library, consisting of 14 sublibraries in the same ENDF-6 format as the earlier ENDF/B-VI library. We describe each of the 14 sublibraries, focusing on neutron reactions. Extensive validation, using radiation transport codes to simulate measured critical assemblies, shows major improvements: (a) The long-standing underprediction of low enriched U thermal assemblies is removed; (b) The 238U, 208Pb, and 9Be reflector biases in fast systems are largely removed; (c) The good agreement of ENDF/B-VI.8 simulations of highly enriched uranium assemblies is preserved; (d) The underprediction of fast criticality of 233,235U and 239Pu assemblies is removed; and (e) The intermediate spectrum critical assemblies are predicted more accurately. We anticipate that the new library will play an important role in nuclear technology applications, including transport simulations supporting national security, nonproliferation, advanced reactor and fuel cycle concepts, criticality safety, medicine, space applications, nuclear astrophysics, and nuclear physics facility design. The ENDF/B-VII.0 library is archived at the National Nuclear Data Center, BNL. The complete library, or any part of it, may be retrieved from www.nndc.bnl.gov.

  14. Investigation of inconsistent ENDF/B-VII.1 independent and cumulative fission product yields with proposed revisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pigni, Marco T; Francis, Matthew W; Gauld, Ian C

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that are incompatible with the cumulative fission yields in the library, and also with experimental measurements. A comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. An important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library in the case of stable and long-lived cumulative yields, given the inconsistency of the ENDF/B-VII.1 fission product yield and decay data sub-libraries. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  15. Betty Petersen Memorial Library - NCWCP Publications - NWS

    Science.gov Websites

    Publications listing (excerpt): "…Filters to Variational Statistical Analysis with Spatially Inhomogeneous Covariances" (no. 432, 2001); Purser, R. James, "Normalization Of The Diffusive Filters That Represent The Inhomogeneous…" (nos. 456 and 457, 2008).

  16. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pigni, M.T., E-mail: pignimt@ornl.gov; Francis, M.W.; Gauld, I.C.

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library as well as with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  17. Three-dimensional evidence network plot system: covariate imbalances and effects in network meta-analysis explored using a new software tool.

    PubMed

    Batson, Sarah; Score, Robert; Sutton, Alex J

    2017-06-01

    The aim of the study was to develop the three-dimensional (3D) evidence network plot system, a novel web-based interactive 3D tool to facilitate the visualization and exploration of covariate distributions and imbalances across evidence networks for network meta-analysis (NMA). We developed the 3D evidence network plot system within an AngularJS environment using a third-party JavaScript library (Three.js) to create the 3D element of the application. Data used to enable the creation of the 3D element for a particular topic are inputted via a Microsoft Excel template spreadsheet that has been specifically formatted to hold these data. We display and discuss the findings of applying the tool to two NMA examples considering multiple covariates. These two examples have been previously identified as having potentially important covariate effects and allow us to document the various features of the tool while illustrating how it can be used. The 3D evidence network plot system provides an immediate, intuitive, and accessible way to assess the similarity and differences between the values of covariates for individual studies within and between each treatment contrast in an evidence network. In this way, differences between the studies, which may invalidate the usual assumptions of an NMA, can be identified for further scrutiny. Hence, the tool facilitates NMA feasibility/validity assessments and aids in the interpretation of NMA results. The 3D evidence network plot system is the first tool designed specifically to visualize covariate distributions and imbalances across evidence networks in 3D. This will be of primary interest to systematic review and meta-analysis researchers and, more generally, those assessing the validity and robustness of an NMA to inform reimbursement decisions.

  18. Aerospace Toolbox--a flight vehicle design, analysis, simulation, and software development environment: I. An introduction and tutorial

    NASA Astrophysics Data System (ADS)

    Christian, Paul M.; Wells, Randy

    2001-09-01

    This paper presents a demonstrated approach to significantly reduce the cost and schedule of non-real-time modeling and simulation, real-time HWIL simulation, and embedded code development. The tool and the methodology presented capitalize on a paradigm that has become a standard operating procedure in the automotive industry. The tool described is known as the Aerospace Toolbox, and it is based on the MathWorks Matlab/Simulink framework, which is a COTS application. Extrapolation of automotive industry data and initial applications in the aerospace industry show that the use of the Aerospace Toolbox can make significant contributions in the quest by NASA and other government agencies to meet aggressive cost reduction goals in development programs. Part I of this paper provides a detailed description of the GUI-based Aerospace Toolbox and how it is used in every step of a development program, from quick prototyping of concept developments that leverage built-in point-of-departure simulations through to detailed design, analysis, and testing. Some of the attributes addressed include its versatility in modeling 3 to 6 degrees of freedom, its library of flight-test-validated models (including physics, environments, hardware, and error sources), and its built-in Monte Carlo capability. Other topics to be covered in this part include flight vehicle models and algorithms, and the covariance analysis package, Navigation System Covariance Analysis Tools (NavSCAT). Part II of this paper, to be published at a later date, will conclude with a description of how the Aerospace Toolbox is an integral part of developing embedded code directly from the simulation models by using the MathWorks Real Time Workshop and optimization tools. It will also address how the Toolbox can be used as a design hub for Internet-based collaborative engineering tools such as NASA's Intelligent Synthesis Environment (ISE) and Lockheed Martin's Interactive Missile Design Environment (IMD).

  19. ZASPE: A Code to Measure Stellar Atmospheric Parameters and their Covariance from Spectra

    NASA Astrophysics Data System (ADS)

    Brahm, Rafael; Jordán, Andrés; Hartman, Joel; Bakos, Gáspár

    2017-05-01

    We describe the Zonal Atmospheric Stellar Parameters Estimator (zaspe), a new algorithm, and its associated code, for determining precise stellar atmospheric parameters and their uncertainties from high-resolution echelle spectra of FGK-type stars. zaspe estimates stellar atmospheric parameters by comparing the observed spectrum against a grid of synthetic spectra, using only the spectral zones that are most sensitive to changes in the atmospheric parameters. Realistic uncertainties in the parameters are computed from the data itself, by taking into account the systematic mismatches between the observed spectrum and the best-fitting synthetic one. The covariances between the parameters are also estimated in the process. zaspe can in principle use any pre-calculated grid of synthetic spectra, but unbiased grids are required to obtain accurate parameters. We tested the performance of two existing libraries and concluded that neither is suitable for computing precise atmospheric parameters. We describe a process to synthesize a new library of synthetic spectra that was found to generate consistent results when compared with parameters obtained with different methods (interferometry, asteroseismology, equivalent widths).
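
    The zone-selection idea above can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the published zaspe code: the model-scatter sensitivity ranking, the `frac` cutoff, and the plain chi-square grid search are all simplifying assumptions.

    ```python
    import numpy as np

    def sensitive_zone_chi2(obs_flux, obs_err, grid_fluxes, grid_params, frac=0.1):
        """Grid search restricted to the most parameter-sensitive pixels."""
        # Rank pixels by how strongly the synthetic flux varies across the grid:
        # large model-to-model scatter marks zones sensitive to the parameters.
        sensitivity = grid_fluxes.std(axis=0)
        keep = np.argsort(sensitivity)[-max(1, int(frac * obs_flux.size)):]

        # Chi-square against every grid point, using only the sensitive pixels.
        resid = (grid_fluxes[:, keep] - obs_flux[keep]) / obs_err[keep]
        chi2 = (resid ** 2).sum(axis=1)
        return grid_params[np.argmin(chi2)], chi2
    ```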

  20. Benchmarking and validation activities within JEFF project

    NASA Astrophysics Data System (ADS)

    Cabellos, O.; Alvarez-Velarde, F.; Angelone, M.; Diez, C. J.; Dyrda, J.; Fiorito, L.; Fischer, U.; Fleming, M.; Haeck, W.; Hill, I.; Ichou, R.; Kim, D. H.; Klix, A.; Kodeli, I.; Leconte, P.; Michel-Sendis, F.; Nunnenmann, E.; Pecchia, M.; Peneliau, Y.; Plompen, A.; Rochman, D.; Romojaro, P.; Stankovskiy, A.; Sublet, J. Ch.; Tamagno, P.; Marck, S. van der

    2017-09-01

    The challenge for any nuclear data evaluation project is to periodically release a revised, fully consistent and complete library, with all needed data and covariances, and to ensure that it is robust and reliable for a variety of applications. Within an evaluation effort, benchmarking activities play an important role in validating proposed libraries. The Joint Evaluated Fission and Fusion (JEFF) Project aims to provide such a nuclear data library, and thus requires a coherent and efficient benchmarking process. The aim of this paper is to present the activities carried out by the new JEFF Benchmarking and Validation Working Group, and to describe the role of the NEA Data Bank in this context. The paper also reviews the status of preliminary benchmarking for the next JEFF-3.3 candidate cross-section files.

  1. FISPACT-II: An Advanced Simulation System for Activation, Transmutation and Material Modelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sublet, J.-Ch., E-mail: jean-christophe.sublet@ukaea.uk; Eastwood, J.W.; Morgan, J.G.

    Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time dependent inventory and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. The Fispact-II system also includes several utility programs for code-use optimisation, visualisation and production of secondary radiological quantities. Included in the paper are summaries of results from the suite of verification and validation reports available with the code.

  2. FISPACT-II: An Advanced Simulation System for Activation, Transmutation and Material Modelling

    NASA Astrophysics Data System (ADS)

    Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.; Gilbert, M. R.; Fleming, M.; Arter, W.

    2017-01-01

    Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time dependent inventory and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. The Fispact-II system also includes several utility programs for code-use optimisation, visualisation and production of secondary radiological quantities. Included in the paper are summaries of results from the suite of verification and validation reports available with the code.

  3. Least-Squares Neutron Spectral Adjustment with STAYSL PNNL

    NASA Astrophysics Data System (ADS)

    Greenwood, L. R.; Johnson, C. D.

    2016-02-01

    The STAYSL PNNL computer code, a descendant of the STAY'SL code [1], performs neutron spectral adjustment of a starting neutron spectrum, applying a least squares method to determine adjustments based on saturated activation rates, neutron cross sections from evaluated nuclear data libraries, and all associated covariances. STAYSL PNNL is provided as part of a comprehensive suite of programs [2], where additional tools in the suite are used for assembling a set of nuclear data libraries and determining all required corrections to the measured data to determine saturated activation rates. Neutron cross section and covariance data are taken from the International Reactor Dosimetry File (IRDF-2002) [3], which was sponsored by the International Atomic Energy Agency (IAEA), though work is planned to update to data from the IAEA's International Reactor Dosimetry and Fusion File (IRDFF) [4]. The nuclear data and associated covariances are extracted from IRDF-2002 using the third-party NJOY99 computer code [5]. The NJpp translation code converts the extracted data into a library data array format suitable for use as input to STAYSL PNNL. The software suite also includes three utilities to calculate corrections to measured activation rates. Neutron self-shielding corrections are calculated as a function of neutron energy with the SHIELD code and are applied to the group cross sections prior to spectral adjustment, thus making the corrections independent of the neutron spectrum. The SigPhi Calculator is a Microsoft Excel spreadsheet used for calculating saturated activation rates from raw gamma activities by applying corrections for gamma self-absorption, neutron burn-up, and the irradiation history. Gamma self-absorption and neutron burn-up corrections are calculated (iteratively in the case of the burn-up) within the SigPhi Calculator spreadsheet. The irradiation history corrections are calculated using the BCF computer code and are inserted into the SigPhi Calculator workbook for use in correcting the measured activities. Output from the SigPhi Calculator is automatically produced, and consists of a portion of the STAYSL PNNL input file data that is required to run the spectral adjustment calculations. Within STAYSL PNNL, the least-squares process is performed in one step, without iteration, and provides rapid results on PC platforms. STAYSL PNNL creates multiple output files with tabulated results, data suitable for plotting, and data formatted for use in subsequent radiation damage calculations using the SPECTER computer code (which is not included in the STAYSL PNNL suite). All components of the software suite have undergone extensive testing and validation prior to release and test cases are provided with the package.
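
    The one-step least-squares adjustment described above has a compact closed form. The sketch below is a generic spectral-adjustment update in the STAY'SL spirit, under the simplifying assumption that cross-section uncertainties and cross-correlations have already been folded into the activation-rate covariance; it is not the STAYSL PNNL implementation, and the function and argument names are ours.

    ```python
    import numpy as np

    def spectral_adjust(phi0, cov_phi, sigma, act, cov_act):
        """One-step least-squares adjustment of a group-flux spectrum.

        phi0    : prior group fluxes, shape (ng,)
        cov_phi : prior flux covariance, shape (ng, ng)
        sigma   : group cross sections, one row per reaction, shape (nr, ng)
        act     : measured saturated activation rates, shape (nr,)
        cov_act : covariance of the rates (cross-section uncertainty folded in)
        """
        pred = sigma @ phi0                       # rates implied by the prior spectrum
        s = sigma @ cov_phi @ sigma.T + cov_act   # innovation covariance
        gain = cov_phi @ sigma.T @ np.linalg.inv(s)
        phi = phi0 + gain @ (act - pred)          # adjusted spectrum
        cov = cov_phi - gain @ sigma @ cov_phi    # reduced posterior covariance
        return phi, cov
    ```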

  4. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 provides many new capabilities and significant improvements of existing features.
    New capabilities include:
    • ENDF/B-VII.1 nuclear data libraries (CE and MG) with enhanced group structures;
    • neutron covariance data based on ENDF/B-VII.1 and supplemented with ORNL data;
    • covariance data for fission product yields and decay constants;
    • stochastic uncertainty and correlation quantification for any SCALE sequence with Sampler;
    • parallel calculations with KENO;
    • problem-dependent temperature corrections for CE calculations;
    • CE shielding and criticality accident alarm system analysis with MAVRIC;
    • CE depletion with TRITON (T5-DEPL/T6-DEPL);
    • CE sensitivity/uncertainty analysis with TSUNAMI-3D;
    • simplified and efficient LWR lattice physics with Polaris;
    • large-scale detailed spent fuel characterization with ORIGAMI and ORIGAMI Automator;
    • advanced fission source convergence acceleration capabilities with Sourcerer;
    • nuclear data library generation with AMPX; and
    • an integrated user interface with Fulcrum.
    Enhanced capabilities include:
    • accurate and efficient CE Monte Carlo methods for eigenvalue and fixed-source calculations;
    • improved MG resonance self-shielding methodologies and data;
    • resonance self-shielding with the modernized and efficient XSProc integrated into most sequences;
    • accelerated calculations with TRITON/NEWT (generally 4x faster than SCALE 6.1);
    • spent fuel characterization with 1470 new reactor-specific libraries for ORIGEN;
    • modernization of ORIGEN (Chebyshev Rational Approximation Method [CRAM] solver, an API for high-performance depletion, and a new keyword input format);
    • extension of the maximum mixture number from the previous limit of 2147 to ~2 billion;
    • nuclear data formats enabling the use of more than 999 energy groups;
    • an updated standard composition library for more accurate use of natural abundances; and
    • numerous other enhancements for improved usability and stability.

  5. School Librarian Staffing Levels and Student Achievement as Represented in 2006-2009 Kansas Annual Yearly Progress Data

    ERIC Educational Resources Information Center

    Dow, Mirah J.; McMahon-Lakin, Jacqueline

    2012-01-01

    To address the presence or absence of school librarians in Kansas public schools, a study using analysis of covariance (ANCOVA) was designed to investigate staffing levels for library media specialists (LMSs), the label used for school librarians in licensed-personnel data in Kansas, and student achievement at the school level. Five subject areas…

  6. Are your covariates under control? How normalization can re-introduce covariate effects.

    PubMed

    Pain, Oliver; Dudbridge, Frank; Ronald, Angelica

    2018-04-30

    Many statistical tests rely on the assumption that the residuals of a model are normally distributed. Rank-based inverse normal transformation (INT) of the dependent variable is one of the most popular approaches to satisfy the normality assumption. When covariates are included in the analysis, a common approach is to first adjust for the covariates and then normalize the residuals. This study investigated the effect of regressing covariates against the dependent variable and then applying rank-based INT to the residuals. The correlation between the dependent variable and covariates at each stage of processing was assessed. An alternative approach was tested in which rank-based INT was applied to the dependent variable before regressing covariates. Analyses based on both simulated and real data examples demonstrated that applying rank-based INT to the dependent variable residuals after regressing out covariates re-introduces a linear correlation between the dependent variable and covariates, increasing type-I errors and reducing power. On the other hand, when rank-based INT was applied prior to controlling for covariate effects, residuals were normally distributed and linearly uncorrelated with covariates. This latter approach is therefore recommended in situations where normality of the dependent variable is required.
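
    A small simulation makes the order-of-operations point concrete. This sketch is in the spirit of the study but is not the authors' code: the lognormal outcome, sample size, and Blom rank offset are our choices.

    ```python
    import numpy as np
    from scipy.stats import norm, rankdata

    def rank_int(x, c=3.0 / 8):
        """Rank-based inverse normal transformation (Blom offset)."""
        return norm.ppf((rankdata(x) - c) / (x.size - 2 * c + 1))

    rng = np.random.default_rng(0)
    covar = rng.normal(size=5000)
    y = np.exp(covar + rng.normal(size=5000))   # skewed outcome with a covariate effect

    # Problematic order: residualize on the covariate, then INT the residuals.
    resid = y - np.polyval(np.polyfit(covar, y, 1), covar)
    print(np.corrcoef(covar, rank_int(resid))[0, 1])   # typically non-zero

    # Recommended order: INT the outcome first, then adjust for the covariate.
    y_int = rank_int(y)
    resid2 = y_int - np.polyval(np.polyfit(covar, y_int, 1), covar)
    print(np.corrcoef(covar, resid2)[0, 1])            # ~0, residuals ~normal
    ```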

  7. Updating and extending the IRDF-2002 dosimetry library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capote, R.; Zolotarev, K.I.; Pronyaev, V.G.

    The International Reactor Dosimetry File (IRDF)-2002 released in 2004 by the IAEA (see http://www-nds.iaea.org/irdf2002/) contains cross-section data and corresponding uncertainties for 66 dosimetry reactions. New cross-section evaluations have become available recently that re-define some of these dosimetry reactions including: (1) high-fidelity evaluation work undertaken by one of the authors (KIZ); (2) evaluations from the US ENDF/B-VII.0 and candidate evaluations from the US ENDF/B-VII.1 libraries that cover reactions within the International Evaluation of Neutron Cross-Section Standards; (3) the European JEFF-3.1 library; and (4) the Japanese JENDL-4.0 library. Additional high-threshold reactions not included in IRDF-2002 (e.g., 59Co(n,3n) and 209Bi(n,3n)) have also been evaluated to characterize higher-energy neutron fields. Overall, 37 new evaluations of dosimetry reactions have been assessed and intercomparisons made with integral measurements in reference neutron fields to determine whether they should be adopted to update and improve IRDF-2002. Benchmark calculations performed for newly evaluated reactions using the ENDF/B-VII.0 235U thermal fission and 252Cf spontaneous fission neutron spectra show that calculated integral cross sections exhibit improved agreement with evaluated experimental data when compared with the equivalent data from the IRDF-2002 library. Data inconsistencies or deficiencies of new evaluations have been identified for the 63Cu(n,2n), 60Ni(n,p)60m+gCo, 55Mn(n,γ), and 232Th(n,f) reactions. Compared with IRDF-2002, the upper neutron energy boundary was formally increased from the actual maximum energy of typically 20 MeV up to 60 MeV by using the TENDL-2010 cross sections and covariance matrices. This extension would allow the updated IRDF library to be also used in fusion dosimetry applications. Uncertainties in the cross sections for all new evaluations are given in the form of relative covariance matrices. Newly evaluated excitation functions should be considered as suitable candidates in the preparation of an improved version of the IRDF that was planned to be released for testing in December 2011.

  8. Storage and computationally efficient permutations of factorized covariance and square-root information matrices

    NASA Technical Reports Server (NTRS)

    Muellerschoen, R. J.

    1988-01-01

    A unified method to permute vector-stored upper-triangular diagonal factorized covariance (UD) and vector-stored upper-triangular square-root information filter (SRIF) arrays is presented. The method involves cyclical permutation of the rows and columns of the arrays and retriangularization with appropriate square-root-free fast Givens rotations or elementary slow Givens reflections. A minimal amount of computation is performed and only one scratch vector of size N is required, where N is the column dimension of the arrays. To make the method efficient for large SRIF arrays on a virtual memory machine, three additional scratch vectors each of size N are used to avoid expensive paging faults. The method discussed is compared with the methods and routines of Bierman's Estimation Subroutine Library (ESL).
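
    The permutation-and-retriangularization step can be illustrated with dense linear algebra. The sketch below uses a full QR factorization for clarity, whereas the method described above works in place on vector-stored arrays with fast Givens rotations; the function name and example are hypothetical.

    ```python
    import numpy as np

    def permute_sqrt_factor(r, order):
        """Reorder the states of an upper-triangular factor R (cov = R.T @ R)
        and restore triangularity with a dense QR factorization."""
        q, r_new = np.linalg.qr(r[:, order])   # permute columns, re-triangularize
        return r_new                           # R_new.T @ R_new equals permuted cov

    # Example: a cyclic permutation moving state 0 to the end.
    rng = np.random.default_rng(1)
    r = np.triu(rng.normal(size=(4, 4)))
    order = [1, 2, 3, 0]
    r2 = permute_sqrt_factor(r, order)
    cov = r.T @ r
    assert np.allclose(r2.T @ r2, cov[np.ix_(order, order)])
    ```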

  9. Methods and Issues for the Combined Use of Integral Experiments and Covariance Data: Results of a NEA International Collaborative Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmiotti, Giuseppe; Salvatores, Massimo

    2014-04-01

    The Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD) established a Subgroup (called "Subgroup 33") in 2009 on "Methods and issues for the combined use of integral experiments and covariance data." The first stage was devoted to producing the description of different adjustment methodologies and assessing their merits. A detailed document related to this first stage has been issued. Nine leading organizations (often with a long and recognized expertise in the field) have contributed: ANL, CEA, INL, IPPE, JAEA, JSI, NRG, IRSN and ORNL. In the second stage a practical benchmark exercise was defined in order to test the reliability of the nuclear data adjustment methodology. A comparison of the results obtained by the participants and major lessons learned in the exercise are discussed in the present paper that summarizes individual contributions which often include several original developments not reported separately. The paper provides the analysis of the most important results of the adjustment of the main nuclear data of 11 major isotopes in a 33-group energy structure. This benchmark exercise was based on a set of 20 well defined integral parameters from 7 fast assembly experiments. The exercise showed that using a common shared set of integral experiments but different starting evaluated libraries and/or different covariance matrices, there is a good convergence of trends for adjustments. Moreover, a significant reduction of the original uncertainties is often observed. Using the a posteriori covariance data, there is a strong reduction of the uncertainties of integral parameters for reference reactor designs, mainly due to the new correlations in the a posteriori covariance matrix. Furthermore, criteria have been proposed and applied to verify the consistency of differential and integral data used in the adjustment. Finally, recommendations are given for an appropriate use of sensitivity analysis methods and indications for future work are provided.

  10. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  11. Definition of Contravariant Velocity Components

    NASA Technical Reports Server (NTRS)

    Hung, Ching-moa; Kwak, Dochan (Technical Monitor)

    2002-01-01

    In this paper we have reviewed the basics of tensor analysis in an attempt to clarify some misconceptions regarding contravariant and covariant vector components as used in fluid dynamics. We have indicated that contravariant components are components of a given vector expressed as a unique combination of the covariant base vector system and, vice versa, that the covariant components are components of a vector expressed with the contravariant base vector system. Mathematically, expressing a vector as a combination of base vectors is a decomposition process for a specific base vector system. Hence, the contravariant velocity components are the decomposed components of the velocity vector along the directions of the coordinate lines, with respect to the covariant base vector system. However, the contravariant (and covariant) components are not physical quantities. Their magnitudes and dimensions are controlled by their corresponding covariant (and contravariant) base vectors.

  12. Quantum Efficiency and Bandgap Analysis for Combinatorial Photovoltaics: Sorting Activity of Cu–O Compounds in All-Oxide Device Libraries

    PubMed Central

    2014-01-01

    All-oxide-based photovoltaics (PVs) encompass the potential for extremely low cost solar cells, provided they can obtain an order of magnitude improvement in their power conversion efficiencies. To achieve this goal, we perform a combinatorial materials study of metal oxide based light absorbers, charge transporters, junctions between them, and PV devices. Here we report the development of a combinatorial internal quantum efficiency (IQE) method. IQE measures the efficiency associated with the charge separation and collection processes, and thus is a proxy for PV activity of materials once placed into devices, discarding optical properties that cause uncontrolled light harvesting. The IQE is supported by high-throughput techniques for bandgap fitting, composition analysis, and thickness mapping, which are also crucial parameters for the combinatorial investigation cycle of photovoltaics. As a model system we use a library of 169 solar cells with a varying thickness of sprayed titanium dioxide (TiO2) as the window layer, and covarying thickness and composition of binary compounds of copper oxides (Cu–O) as the light absorber, fabricated by Pulsed Laser Deposition (PLD). The analysis on the combinatorial devices shows the correlation between compositions and bandgap, and their effect on PV activity within several device configurations. The analysis suggests that the presence of Cu4O3 plays a significant role in the PV activity of binary Cu–O compounds. PMID:24410367

  13. Pu239 Cross-Section Variations Based on Experimental Uncertainties and Covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sigeti, David Edward; Williams, Brian J.; Parsons, D. Kent

    2016-10-18

    Algorithms and software have been developed for producing variations in plutonium-239 neutron cross sections based on experimental uncertainties and covariances. The varied cross-section sets may be produced as random samples from the multi-variate normal distribution defined by an experimental mean vector and covariance matrix, or they may be produced as Latin-Hypercube/Orthogonal-Array samples (based on the same means and covariances) for use in parametrized studies. The variations obey two classes of constraints that are obligatory for cross-section sets and which put related constraints on the mean vector and covariance matrix that determine the sampling. Because the experimental means and covariances do not obey some of these constraints to sufficient precision, imposing the constraints requires modifying the experimental mean vector and covariance matrix. Modification is done with an algorithm based on linear algebra that minimizes changes to the means and covariances while ensuring that the operations that impose the different constraints do not conflict with each other.
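
    The sampling step described above amounts to drawing from a multivariate normal defined by the (constraint-adjusted) mean vector and covariance matrix, either randomly or via a Latin hypercube. A minimal sketch, omitting the constraint-imposing modifications that the report describes; the function name and jitter term are our choices.

    ```python
    import numpy as np
    from scipy.stats import norm, qmc

    def sample_cross_sections(mean, cov, n, method="mc", seed=0):
        """Draw n varied cross-section sets from N(mean, cov)."""
        dim = len(mean)
        chol = np.linalg.cholesky(cov + 1e-12 * np.eye(dim))  # jitter for stability
        if method == "mc":
            z = np.random.default_rng(seed).standard_normal((n, dim))
        else:  # Latin hypercube mapped through the normal quantile function
            z = norm.ppf(qmc.LatinHypercube(d=dim, seed=seed).random(n))
        return mean + z @ chol.T   # each row is one varied cross-section set
    ```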

  14. ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data

    NASA Astrophysics Data System (ADS)

    Chadwick, M. B.; Herman, M.; Obložinský, P.; Dunn, M. E.; Danon, Y.; Kahler, A. C.; Smith, D. L.; Pritychenko, B.; Arbanas, G.; Arcilla, R.; Brewer, R.; Brown, D. A.; Capote, R.; Carlson, A. D.; Cho, Y. S.; Derrien, H.; Guber, K.; Hale, G. M.; Hoblit, S.; Holloway, S.; Johnson, T. D.; Kawano, T.; Kiedrowski, B. C.; Kim, H.; Kunieda, S.; Larson, N. M.; Leal, L.; Lestone, J. P.; Little, R. C.; McCutchan, E. A.; MacFarlane, R. E.; MacInnes, M.; Mattoon, C. M.; McKnight, R. D.; Mughabghab, S. F.; Nobre, G. P. A.; Palmiotti, G.; Palumbo, A.; Pigni, M. T.; Pronyaev, V. G.; Sayer, R. O.; Sonzogni, A. A.; Summers, N. C.; Talou, P.; Thompson, I. J.; Trkov, A.; Vogt, R. L.; van der Marck, S. C.; Wallner, A.; White, M. C.; Wiarda, D.; Young, P. G.

    2011-12-01

    The ENDF/B-VII.1 library is our latest recommended evaluated nuclear data file for use in nuclear science and technology applications, and incorporates advances made in the five years since the release of ENDF/B-VII.0. These advances focus on neutron cross sections, covariances, fission product yields and decay data, and represent work by the US Cross Section Evaluation Working Group (CSEWG) in nuclear data evaluation that utilizes developments in nuclear theory, modeling, simulation, and experiment. The principal advances in the new library are: (1) An increase in the breadth of neutron reaction cross section coverage, extending from 393 nuclides to 423 nuclides; (2) Covariance uncertainty data for 190 of the most important nuclides, as documented in companion papers in this edition; (3) R-matrix analyses of neutron reactions on light nuclei, including isotopes of He, Li, and Be; (4) Resonance parameter analyses at lower energies and statistical high energy reactions for isotopes of Cl, K, Ti, V, Mn, Cr, Ni, Zr and W; (5) Modifications to thermal neutron reactions on fission products (isotopes of Mo, Tc, Rh, Ag, Cs, Nd, Sm, Eu) and neutron absorber materials (Cd, Gd); (6) Improved minor actinide evaluations for isotopes of U, Np, Pu, and Am (we are not making changes to the major actinides 235,238U and 239Pu at this point, except for delayed neutron data and covariances, and instead we intend to update them after a further period of research in experiment and theory), and our adoption of JENDL-4.0 evaluations for isotopes of Cm, Bk, Cf, Es, Fm, and some other minor actinides; (7) Fission energy release evaluations; (8) Fission product yield advances for fission-spectrum neutrons and 14 MeV neutrons incident on 239Pu; and (9) A new decay data sublibrary. Integral validation testing of the ENDF/B-VII.1 library is provided for a variety of quantities: For nuclear criticality, the VII.1 library maintains the generally-good performance seen for VII.0 for a wide range of MCNP simulations of criticality benchmarks, with improved performance coming from new structural material evaluations, especially for Ti, Mn, Cr, Zr and W. For Be we see some improvements although the fast assembly data appear to be mutually inconsistent. Actinide cross section updates are also assessed through comparisons of fission and capture reaction rate measurements in critical assemblies and fast reactors, and improvements are evident. Maxwellian-averaged capture cross sections at 30 keV are also provided for astrophysics applications. We describe the cross section evaluations that have been updated for ENDF/B-VII.1 and the measured data and calculations that motivated the changes, and therefore this paper augments the ENDF/B-VII.0 publication [M. B. Chadwick, P. Obložinský, M. Herman, N. M. Greene, R. D. McKnight, D. L. Smith, P. G. Young, R. E. MacFarlane, G. M. Hale, S. C. Frankle, A. C. Kahler, T. Kawano, R. C. Little, D. G. Madland, P. Moller, R. D. Mosteller, P. R. Page, P. Talou, H. Trellue, M. C. White, W. B. Wilson, R. Arcilla, C. L. Dunford, S. F. Mughabghab, B. Pritychenko, D. Rochman, A. A. Sonzogni, C. R. Lubitz, T. H. Trumbull, J. P. Weinman, D. A. Br, D. E. Cullen, D. P. Heinrichs, D. P. McNabb, H. Derrien, M. E. Dunn, N. M. Larson, L. C. Leal, A. D. Carlson, R. C. Block, J. B. Briggs, E. T. Cheng, H. C. Huria, M. L. Zerkle, K. S. Kozier, A. Courcelle, V. Pronyaev, and S. C. 
van der Marck, "ENDF/B-VII.0: Next Generation Evaluated Nuclear Data Library for Nuclear Science and Technology," Nuclear Data Sheets 107, 2931 (2006)].

  15. Parametric Covariance Model for Horizon-Based Optical Navigation

    NASA Technical Reports Server (NTRS)

    Hikes, Jacob; Liounis, Andrew J.; Christian, John A.

    2016-01-01

    This Note presents an entirely parametric version of the covariance for horizon-based optical navigation measurements. The covariance can be written as a function of only the spacecraft position, two sensor design parameters, the illumination direction, the size of the observed planet, the size of the lit arc to be used, and the total number of observed horizon points. As a result, one may now more clearly understand the sensitivity of horizon-based optical navigation performance as a function of these key design parameters, which is insight that was obscured in previous (and nonparametric) versions of the covariance. Finally, the new parametric covariance is shown to agree with both the nonparametric analytic covariance and results from a Monte Carlo analysis.

  16. JANIS: NEA JAva-based Nuclear Data Information System

    NASA Astrophysics Data System (ADS)

    Soppera, Nicolas; Bossant, Manuel; Cabellos, Oscar; Dupont, Emmeric; Díez, Carlos J.

    2017-09-01

    JANIS (JAva-based Nuclear Data Information System) software is developed by the OECD Nuclear Energy Agency (NEA) Data Bank to facilitate the visualization and manipulation of nuclear data, giving access to evaluated nuclear data libraries, such as ENDF, JEFF, JENDL and TENDL, and also to experimental nuclear data (EXFOR) and bibliographical references (CINDA). It is available as a standalone Java program, downloadable and distributed on DVD, and also as a web application on the NEA website. One of the main new features in JANIS is the scripting capability via the command line, which notably automates plot generation and permits automatic extraction of data from the JANIS database. Recent NEA software developments rely on these JANIS features to access nuclear data; for example, the Nuclear Data Sensitivity Tool (NDaST) makes use of covariance data in BOXER and COVERX formats, which are retrieved from the JANIS database. New features added in this version of the JANIS software are described in this paper with some examples.

  17. Evaluation of commercial DNA and RNA extraction methods for high-throughput sequencing of FFPE samples.

    PubMed

    Kresse, Stine H; Namløs, Heidi M; Lorenz, Susanne; Berner, Jeanne-Marie; Myklebost, Ola; Bjerkehagen, Bodil; Meza-Zepeda, Leonardo A

    2018-01-01

    Nucleic acid material of adequate quality is crucial for successful high-throughput sequencing (HTS) analysis. DNA and RNA isolated from archival FFPE material are frequently degraded and not readily amplifiable due to chemical damage introduced during fixation. To identify optimal nucleic acid extraction kits, DNA and RNA quantity, quality and performance in HTS applications were evaluated. DNA and RNA were isolated from five sarcoma archival FFPE blocks, using eight extraction protocols from seven kits from three different commercial vendors. For DNA extraction, the truXTRAC FFPE DNA kit from Covaris gave higher yields and better amplifiable DNA, but all protocols gave comparable HTS library yields using Agilent SureSelect XT and performed well in downstream variant calling. For RNA extraction, all protocols gave comparable yields and amplifiable RNA. However, for fusion gene detection using the Archer FusionPlex Sarcoma Assay, the truXTRAC FFPE RNA kit from Covaris and the Agencourt FormaPure kit from Beckman Coulter showed the highest percentage of unique read-pairs, providing higher complexity of HTS data and more frequent detection of recurrent fusion genes. truXTRAC simultaneous DNA and RNA extraction gave outputs similar to those of the individual protocols. These findings show that although successful HTS libraries could be generated in most cases, the different protocols gave variable quantity and quality for FFPE nucleic acid extraction. Selecting the optimal procedure is highly valuable and may generate results in borderline-quality specimens.

  18. cit: hypothesis testing software for mediation analysis in genomic applications.

    PubMed

    Millstein, Joshua; Chen, Gary K; Breton, Carrie V

    2016-08-01

    The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/).

  19. Estimation of Covariance Matrix on Bi-Response Longitudinal Data Analysis with Penalized Spline Regression

    NASA Astrophysics Data System (ADS)

    Islamiyati, A.; Fatmawati; Chamidah, N.

    2018-03-01

    In longitudinal data with bi-response, correlation occurs both between measurements on the same subject and between the responses. This induces auto-correlation of the errors, which can be handled by using a covariance matrix. In this article, we estimate the covariance matrix based on the penalized spline regression model. The penalized spline involves knot points and smoothing parameters simultaneously in controlling the smoothness of the curve. Based on our simulation study, the weighted penalized spline regression model with a covariance matrix gives a smaller error than the model without a covariance matrix.
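
    A one-response version of the weighted penalized spline idea can be sketched with a truncated-power basis. This is a hypothetical illustration of the general technique (basis choice, penalty, and names are ours), not the authors' bi-response estimator.

    ```python
    import numpy as np

    def weighted_pspline(t, y, knots, cov, lam=1.0):
        """Penalized linear-spline fit weighted by an error covariance matrix."""
        # Truncated-power basis: intercept, t, and one hinge (t - k)_+ per knot.
        b = np.column_stack([np.ones_like(t), t] +
                            [np.clip(t - k, 0.0, None) for k in knots])
        pen = np.diag([0.0, 0.0] + [1.0] * len(knots))  # penalize knot coefficients
        w = np.linalg.inv(cov)                          # GLS weight from the covariance
        coef = np.linalg.solve(b.T @ w @ b + lam * pen, b.T @ w @ y)
        return b @ coef
    ```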

  20. A Standardized Generalized Dimensionality Discrepancy Measure and a Standardized Model-Based Covariance for Dimensionality Assessment for Multidimensional Models

    ERIC Educational Resources Information Center

    Levy, Roy; Xu, Yuning; Yel, Nedim; Svetina, Dubravka

    2015-01-01

    The standardized generalized dimensionality discrepancy measure and the standardized model-based covariance are introduced as tools to critique dimensionality assumptions in multidimensional item response models. These tools are grounded in a covariance theory perspective and associated connections between dimensionality and local independence.…

  1. Modeling spatiotemporal covariance for magnetoencephalography or electroencephalography source analysis.

    PubMed

    Plis, Sergey M; George, J S; Jun, S C; Paré-Blagoev, J; Ranken, D M; Wood, C C; Schmidt, D M

    2007-01-01

    We propose a new model to approximate spatiotemporal noise covariance for use in neural electromagnetic source analysis, which better captures temporal variability in background activity. As with other existing formalisms, our model employs a Kronecker product of matrices representing temporal and spatial covariance. In our model, spatial components are allowed to have differing temporal covariances. Variability is represented as a series of Kronecker products of spatial component covariances and corresponding temporal covariances. Unlike previous attempts to model covariance through a sum of Kronecker products, our model is designed to have a computationally manageable inverse. Despite increased descriptive power, inversion of the model is fast, making it useful in source analysis. We have explored two versions of the model. One is estimated based on the assumption that spatial components of background noise have uncorrelated time courses. Another version, which gives a closer approximation, is based on the assumption that the time courses are statistically independent. The accuracy of the structural approximation is compared to that of an existing model based on a single Kronecker product, using both the Frobenius norm of the difference between the spatiotemporal sample covariance and the model, and scatter plots. The performance of our model and of previous models is compared in source analysis of a large number of single-dipole problems with simulated time courses and with background from authentic magnetoencephalography data.
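
    The structure of the model, a sum of Kronecker products pairing each spatial component with its own temporal covariance, is easy to write down explicitly. The sketch below builds the dense model covariance for illustration and ignores the fast-inverse machinery that makes the model practical for source analysis; the rank-one spatial form is our simplifying assumption.

    ```python
    import numpy as np

    def spatiotemporal_cov(spatial_patterns, temporal_covs):
        """Sum of Kronecker products: each spatial component (a rank-one
        spatial covariance u u^T) is paired with its own temporal covariance."""
        ns = spatial_patterns[0].size
        nt = temporal_covs[0].shape[0]
        c = np.zeros((ns * nt, ns * nt))
        for u, t in zip(spatial_patterns, temporal_covs):
            c += np.kron(np.outer(u, u), t)   # spatial component ⊗ its temporal cov
        return c
    ```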

  2. Space-time models based on random fields with local interactions

    NASA Astrophysics Data System (ADS)

    Hristopulos, Dionissios T.; Tsantili, Ivi C.

    2016-08-01

    The analysis of space-time data from complex, real-life phenomena requires the use of flexible and physically motivated covariance functions. In most cases, it is not possible to explicitly solve the equations of motion for the fields or the respective covariance functions. In the statistical literature, covariance functions are often based on mathematical constructions. In this paper, we propose deriving space-time covariance functions by solving “effective equations of motion”, which can be used as statistical representations of systems with diffusive behavior. In particular, we propose to formulate space-time covariance functions based on an equilibrium effective Hamiltonian using the linear response theory. The effective space-time dynamics is then generated by a stochastic perturbation around the equilibrium point of the classical field Hamiltonian leading to an associated Langevin equation. We employ a Hamiltonian which extends the classical Gaussian field theory by including a curvature term and leads to a diffusive Langevin equation. Finally, we derive new forms of space-time covariance functions.

  3. TALYS/TENDL verification and validation processes: Outcomes and recommendations

    NASA Astrophysics Data System (ADS)

    Fleming, Michael; Sublet, Jean-Christophe; Gilbert, Mark R.; Koning, Arjan; Rochman, Dimitri

    2017-09-01

    The TALYS-generated Evaluated Nuclear Data Libraries (TENDL) provide truly general-purpose nuclear data files assembled from the outputs of the T6 nuclear model code system for direct use in both basic physics and engineering applications. The most recent TENDL-2015 version is based on both default and adjusted parameters of the most recent TALYS, TAFIS, TANES, TARES, TEFAL, TASMAN codes wrapped into a Total Monte Carlo loop for uncertainty quantification. TENDL-2015 contains complete neutron-incident evaluations for all target nuclides with Z ≤ 116 and half-life longer than 1 second (2809 isotopes with 544 isomeric states), up to 200 MeV, with covariances and all reaction daughter products including isomers of half-life greater than 100 milliseconds. With the added High Fidelity Resonance (HFR) approach, all resonances are unique, following statistical rules. The TENDL-2014/2015 libraries have been validated against standard, evaluated, microscopic and integral cross sections, using a newly compiled UKAEA database of thermal, resonance integral, Maxwellian-averaged, 14 MeV and various accelerator-driven neutron source spectrum data. This has been assembled using the most up-to-date, internationally recognised data sources including the Atlas of Resonances, CRC, evaluated EXFOR, activation databases, fusion, fission and MACS. Excellent agreement was found, with a small set of errors within the reference databases and TENDL-2014 predictions.

  4. Comparison of bias-corrected covariance estimators for MMRM analysis in longitudinal data with dropouts.

    PubMed

    Gosho, Masahiko; Hirakawa, Akihiro; Noma, Hisashi; Maruo, Kazushi; Sato, Yasunori

    2017-10-01

    In longitudinal clinical trials, some subjects will drop out before completing the trial, so their measurements towards the end of the trial are not obtained. Mixed-effects models for repeated measures (MMRM) analyses with an "unstructured" (UN) covariance structure are increasingly common as a primary analysis for group comparisons in these trials. Furthermore, model-based covariance estimators have been routinely used for testing the group difference and estimating confidence intervals of the difference in the MMRM analysis using the UN covariance. However, using the MMRM analysis with the UN covariance could lead to convergence problems for numerical optimization, especially in trials with a small sample size. Although the so-called sandwich covariance estimator is robust to misspecification of the covariance structure, its performance deteriorates in settings with small sample size. We investigated the performance of the sandwich covariance estimator and covariance estimators adjusted for small-sample bias proposed by Kauermann and Carroll (J Am Stat Assoc 2001; 96: 1387-1396) and Mancl and DeRouen (Biometrics 2001; 57: 126-134) fitting simpler covariance structures through a simulation study. In terms of the type 1 error rate and coverage probability of confidence intervals, Mancl and DeRouen's covariance estimator with compound symmetry, first-order autoregressive (AR(1)), heterogeneous AR(1), and antedependence structures performed better than the original sandwich estimator and Kauermann and Carroll's estimator with these structures in the scenarios where the variance increased across visits. The performance based on Mancl and DeRouen's estimator with these structures was nearly equivalent to that based on the Kenward-Roger method for adjusting the standard errors and degrees of freedom with the UN structure. The model-based covariance estimator with the UN structure without adjustment of the degrees of freedom, which is frequently used in applications, resulted in substantial inflation of the type 1 error rate. We recommend the use of Mancl and DeRouen's estimator in MMRM analysis if the number of subjects completing is (n + 5) or less, where n is the number of planned visits. Otherwise, the use of Kenward and Roger's method with the UN structure is recommended.
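
    For readers who want the mechanics, the sketch below implements a Mancl and DeRouen-style bias-corrected sandwich covariance for a cluster-level GLS fit with a common working covariance. It is a simplified illustration of the estimator's form, not the simulation code of the study.

    ```python
    import numpy as np

    def md_sandwich(xs, ys, v):
        """GLS estimate and Mancl-DeRouen bias-corrected sandwich covariance.

        xs : list of per-subject design matrices X_i, each (t, p)
        ys : list of per-subject responses y_i, each (t,)
        v  : common working covariance for one subject, (t, t)
        """
        vi = np.linalg.inv(v)
        bread = sum(x.T @ vi @ x for x in xs)              # X' V^-1 X
        bread_inv = np.linalg.inv(bread)
        beta = bread_inv @ sum(x.T @ vi @ y for x, y in zip(xs, ys))
        meat = np.zeros_like(bread)
        for x, y in zip(xs, ys):
            h = x @ bread_inv @ x.T @ vi                   # subject-level hat matrix
            # Bias-corrected residual: (I - H_i)^{-1} (y_i - X_i beta)
            e = np.linalg.solve(np.eye(len(y)) - h, y - x @ beta)
            meat += x.T @ vi @ np.outer(e, e) @ vi @ x
        return beta, bread_inv @ meat @ bread_inv          # sandwich cov of beta
    ```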

  5. Alzheimer's Disease Diagnosis in Individual Subjects using Structural MR Images: Validation Studies

    PubMed Central

    Vemuri, Prashanthi; Gunter, Jeffrey L.; Senjem, Matthew L.; Whitwell, Jennifer L.; Kantarci, Kejal; Knopman, David S.; Boeve, Bradley F.; Petersen, Ronald C.; Jack, Clifford R.

    2008-01-01

    OBJECTIVE To develop and validate a tool for Alzheimer's disease (AD) diagnosis in individual subjects using support vector machine (SVM) based classification of structural MR (sMR) images. BACKGROUND Libraries of sMR scans of clinically well characterized subjects can be harnessed for the purpose of diagnosing new incoming subjects. METHODS 190 patients with probable AD were age- and gender-matched with 190 cognitively normal (CN) subjects. Three different classification models were implemented: Model I uses tissue densities obtained from sMR scans to give a STructural Abnormality iNDex (STAND) score; Models II and III use tissue densities as well as covariates (demographics and apolipoprotein E genotype) to give adjusted-STAND (aSTAND) scores. Data from 140 AD and 140 CN subjects were used for training. SVM parameter optimization and training were done by four-fold cross-validation. The remaining independent sample of 50 AD and 50 CN subjects was used to obtain a minimally biased estimate of the generalization error of the algorithm. RESULTS The cross-validation accuracy of the Model II and Model III aSTAND scores was 88.5% and 89.3%, respectively, and the developed models generalized well on the independent test datasets. Anatomic patterns best differentiating the groups were consistent with the known distribution of neurofibrillary AD pathology. CONCLUSIONS This paper presents preliminary evidence that application of SVM-based classification of an individual sMR scan relative to a library of scans can provide useful information in individual subjects for diagnosis of AD. Including demographic and genetic information in the classification algorithm slightly improves diagnostic accuracy. PMID:18054253
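
    The train/tune/test protocol described here (an SVM tuned by four-fold cross-validation, then scored once on a held-out sample) is easy to reproduce in outline. Below is a minimal scikit-learn sketch with synthetic stand-in features; the STAND tissue-density pipeline itself is not reproduced and all names are ours:

    ```python
    import numpy as np
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(380, 100))      # stand-in for tissue-density features
    y = rng.integers(0, 2, size=380)     # 1 = AD, 0 = CN (synthetic labels)

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=100, stratify=y, random_state=0)

    # four-fold cross-validated grid search over the SVM regularization parameter
    grid = GridSearchCV(SVC(kernel="linear"), {"C": [0.01, 0.1, 1, 10]}, cv=4)
    grid.fit(X_tr, y_tr)
    print("CV accuracy:", grid.best_score_)
    print("held-out accuracy:", grid.score(X_te, y_te))  # minimally biased estimate
    ```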

  6. HIGH DIMENSIONAL COVARIANCE MATRIX ESTIMATION IN APPROXIMATE FACTOR MODELS.

    PubMed

    Fan, Jianqing; Liao, Yuan; Mincheva, Martina

    2011-01-01

    The variance-covariance matrix plays a central role in the inferential theories of high-dimensional factor models in finance and economics. Popular regularization methods that directly exploit sparsity are not applicable to many financial problems. Classical methods of estimating the covariance matrices are based on strict factor models, which assume independent idiosyncratic components. This assumption, however, is restrictive in practical applications. By assuming a sparse error covariance matrix, we allow for cross-sectional correlation even after taking out common factors, which enables us to combine the merits of both methods. We estimate the sparse covariance using the adaptive thresholding technique as in Cai and Liu (2011), taking into account the fact that direct observations of the idiosyncratic components are unavailable. The impact of high dimensionality on the covariance matrix estimation based on the factor structure is then studied.
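
    A low-rank-plus-thresholded construction of this kind is straightforward to sketch: estimate the common component from the top principal components, then threshold the off-diagonal entries of the residual covariance. The sketch below uses a single global threshold tau for simplicity, whereas the Cai-Liu rule cited above makes the threshold entry-dependent; all names and defaults are ours:

    ```python
    import numpy as np

    def factor_threshold_cov(X, n_factors, tau):
        """Low-rank-plus-thresholded covariance estimate (simplified sketch).

        X: n x p data matrix. The top `n_factors` principal components give
        the common-factor part; the residual covariance is soft-thresholded
        off the diagonal.
        """
        Xc = X - X.mean(axis=0)
        S = Xc.T @ Xc / X.shape[0]                  # sample covariance
        vals, vecs = np.linalg.eigh(S)              # ascending eigenvalues
        idx = np.argsort(vals)[::-1][:n_factors]    # top components
        low_rank = (vecs[:, idx] * vals[idx]) @ vecs[:, idx].T
        resid = S - low_rank                        # idiosyncratic part
        thr = np.sign(resid) * np.maximum(np.abs(resid) - tau, 0.0)
        np.fill_diagonal(thr, np.diag(resid))       # never threshold variances
        return low_rank + thr
    ```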

  7. Estimation of genetic connectedness diagnostics based on prediction errors without the prediction error variance-covariance matrix.

    PubMed

    Holmes, John B; Dodds, Ken G; Lee, Michael A

    2017-03-02

    An important issue in genetic evaluation is the comparability of random effects (breeding values), particularly between pairs of animals in different contemporary groups. This is usually referred to as genetic connectedness. While various measures of connectedness have been proposed in the literature, there is general agreement that the most appropriate measure is some function of the prediction error variance-covariance matrix. However, obtaining the prediction error variance-covariance matrix is computationally demanding for large-scale genetic evaluations. Many alternative statistics have been proposed that avoid the computational cost of obtaining the prediction error variance-covariance matrix, such as counts of genetic links between contemporary groups, gene flow matrices, and functions of the variance-covariance matrix of estimated contemporary group fixed effects. In this paper, we show that a correction to the variance-covariance matrix of estimated contemporary group fixed effects will produce the exact prediction error variance-covariance matrix averaged by contemporary group for univariate models in the presence of single or multiple fixed effects and one random effect. We demonstrate the correction for a series of models and show that approximations to the prediction error matrix based solely on the variance-covariance matrix of estimated contemporary group fixed effects are inappropriate in certain circumstances. Our method allows for the calculation of a connectedness measure based on the prediction error variance-covariance matrix by calculating only the variance-covariance matrix of estimated fixed effects. Since the number of fixed effects in genetic evaluation is usually orders of magnitude smaller than the number of random effect levels, the computational requirements for our method should be reduced.

  8. HIGH DIMENSIONAL COVARIANCE MATRIX ESTIMATION IN APPROXIMATE FACTOR MODELS

    PubMed Central

    Fan, Jianqing; Liao, Yuan; Mincheva, Martina

    2012-01-01

    The variance-covariance matrix plays a central role in the inferential theories of high-dimensional factor models in finance and economics. Popular regularization methods that directly exploit sparsity are not applicable to many financial problems. Classical methods of estimating the covariance matrices are based on strict factor models, which assume independent idiosyncratic components. This assumption, however, is restrictive in practical applications. By assuming a sparse error covariance matrix, we allow for cross-sectional correlation even after taking out common factors, which enables us to combine the merits of both methods. We estimate the sparse covariance using the adaptive thresholding technique as in Cai and Liu (2011), taking into account the fact that direct observations of the idiosyncratic components are unavailable. The impact of high dimensionality on the covariance matrix estimation based on the factor structure is then studied. PMID:22661790

  9. A generalized spatiotemporal covariance model for stationary background in analysis of MEG data.

    PubMed

    Plis, S M; Schmidt, D M; Jun, S C; Ranken, D M

    2006-01-01

    A noise covariance model based on a single Kronecker product of spatial and temporal covariances has been demonstrated to improve the results of spatiotemporal analysis of MEG data over the commonly used diagonal noise covariance model. In this paper we present a model that is a generalization of all of the above models. It describes models based on a single Kronecker product of spatial and temporal covariance, as well as more complicated multi-pair models, together with any intermediate form expressed as a sum of Kronecker products of reduced-rank spatial component matrices and their corresponding temporal covariance matrices. The model provides a framework for controlling the tradeoff between the modeled complexity of the background and the computational demand of the analysis. Ways to estimate the value of the parameter controlling this tradeoff are also discussed.
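
    The sum-of-Kronecker-products form is compact enough to state in code. A toy numpy sketch, with our notation (A_k spatial, T_k temporal):

    ```python
    import numpy as np

    def kron_sum_covariance(spatial_mats, temporal_mats):
        """Spatiotemporal noise covariance C = sum_k kron(A_k, T_k)."""
        return sum(np.kron(A, T) for A, T in zip(spatial_mats, temporal_mats))

    # single-pair special case: one spatial and one temporal covariance
    A = np.array([[1.0, 0.3], [0.3, 1.0]])   # 2 sensors
    T = np.array([[1.0, 0.5], [0.5, 1.0]])   # 2 time samples
    C = kron_sum_covariance([A], [T])         # shape (4, 4)
    ```

    Adding further pairs with reduced-rank spatial matrices interpolates between the single-pair model and a fully unstructured spatiotemporal covariance, which is the tradeoff the paper's control parameter governs.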

  10. A Robust Adaptive Unscented Kalman Filter for Nonlinear Estimation with Uncertain Noise Covariance.

    PubMed

    Zheng, Binqi; Fu, Pengcheng; Li, Baoqing; Yuan, Xiaobing

    2018-03-07

    The unscented Kalman filter (UKF) may suffer from performance degradation, and even divergence, when there is a mismatch between the noise distributions assumed a priori by the user and the actual ones in a real nonlinear system. To resolve this problem, this paper proposes a robust adaptive UKF (RAUKF) to improve the accuracy and robustness of state estimation with uncertain noise covariance. More specifically, at each time step, a standard UKF is implemented first to obtain the state estimates using the newly acquired measurement data. Then an online fault-detection mechanism is adopted to judge whether it is necessary to update the current noise covariance. If necessary, an innovation-based method and a residual-based method are used to estimate the current noise covariances of the process and the measurement, respectively. By utilizing a weighting factor, the filter combines the last noise covariance matrices with these estimates to form the new noise covariance matrices. Finally, the state estimates are corrected according to the new noise covariance matrices and the previous state estimates. Compared with the standard UKF and other adaptive UKF algorithms, RAUKF converges faster to the actual noise covariance and thus achieves better robustness, accuracy, and computational performance for nonlinear estimation with uncertain noise covariance, which is demonstrated by the simulation results.
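
    The residual-based measurement-noise update is the easiest piece to illustrate. A common form in the adaptive-filtering literature (a hedged sketch for a linear measurement model, not the paper's exact algorithm, which couples it with an innovation-based process-noise update and a fault-detection trigger) blends the previous R with a one-shot estimate built from the post-fit residual:

    ```python
    import numpy as np

    def adaptive_R_update(R, residual, H, P_post, weight=0.2):
        """Residual-based measurement-noise update with a weighting factor.

        residual: z - H @ x_post (post-fit residual).
        For an optimal filter E[residual residual'] = R - H P_post H',
        so residual residual' + H P_post H' is a one-shot estimate of R;
        blending with the previous R makes the filter adapt gradually.
        """
        R_est = np.outer(residual, residual) + H @ P_post @ H.T
        return (1.0 - weight) * R + weight * R_est
    ```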

  11. A Robust Adaptive Unscented Kalman Filter for Nonlinear Estimation with Uncertain Noise Covariance

    PubMed Central

    Zheng, Binqi; Yuan, Xiaobing

    2018-01-01

    The unscented Kalman filter (UKF) may suffer from performance degradation, and even divergence, when there is a mismatch between the noise distributions assumed a priori by the user and the actual ones in a real nonlinear system. To resolve this problem, this paper proposes a robust adaptive UKF (RAUKF) to improve the accuracy and robustness of state estimation with uncertain noise covariance. More specifically, at each time step, a standard UKF is implemented first to obtain the state estimates using the newly acquired measurement data. Then an online fault-detection mechanism is adopted to judge whether it is necessary to update the current noise covariance. If necessary, an innovation-based method and a residual-based method are used to estimate the current noise covariances of the process and the measurement, respectively. By utilizing a weighting factor, the filter combines the last noise covariance matrices with these estimates to form the new noise covariance matrices. Finally, the state estimates are corrected according to the new noise covariance matrices and the previous state estimates. Compared with the standard UKF and other adaptive UKF algorithms, RAUKF converges faster to the actual noise covariance and thus achieves better robustness, accuracy, and computational performance for nonlinear estimation with uncertain noise covariance, which is demonstrated by the simulation results. PMID:29518960

  12. Relative-Error-Covariance Algorithms

    NASA Technical Reports Server (NTRS)

    Bierman, Gerald J.; Wolff, Peter J.

    1991-01-01

    Two algorithms compute error covariance of difference between optimal estimates, based on data acquired during overlapping or disjoint intervals, of state of discrete linear system. Provides quantitative measure of mutual consistency or inconsistency of estimates of states. Relative-error-covariance concept applied to determine degree of correlation between trajectories calculated from two overlapping sets of measurements and to construct real-time test of consistency of state estimates based upon recently acquired data.

  13. Further Evaluation of Covariate Analysis using Empirical Bayes Estimates in Population Pharmacokinetics: the Perception of Shrinkage and Likelihood Ratio Test.

    PubMed

    Xu, Xu Steven; Yuan, Min; Yang, Haitao; Feng, Yan; Xu, Jinfeng; Pinheiro, Jose

    2017-01-01

    Covariate analysis based on population pharmacokinetics (PPK) is used to identify clinically relevant factors. The likelihood ratio test (LRT) based on nonlinear mixed-effect model fits is currently recommended for covariate identification, whereas individual empirical Bayesian estimates (EBEs) are considered unreliable due to the presence of shrinkage. The objectives of this research were to investigate the type I error for the LRT and EBE approaches, to confirm the similarity of power between the LRT and EBE approaches reported previously, and to explore the influence of shrinkage on LRT and EBE inferences. Using an oral one-compartment PK model with a single covariate acting on clearance, we conducted a wide range of simulations according to a two-way factorial design. The results revealed that the EBE-based regression not only provided almost identical power for detecting a covariate effect, but also controlled the false positive rate better than the LRT approach. Shrinkage of EBEs is likely not the root cause of the decrease in power or the inflated false positive rate, although the size of the covariate effect tends to be underestimated at high shrinkage. In summary, contrary to the current recommendations, EBEs may be a better choice for statistical tests in PPK covariate analysis than the LRT. We proposed a three-step covariate modeling approach for population PK analysis that utilizes the advantages of EBEs while overcoming their shortcomings; it not only markedly reduces the run time of the population PK analysis, but also provides more accurate covariate tests.

  14. Earth Observation System Flight Dynamics System Covariance Realism

    NASA Technical Reports Server (NTRS)

    Zaidi, Waqar H.; Tracewell, David

    2016-01-01

    This presentation applies a covariance realism technique to the National Aeronautics and Space Administration (NASA) Earth Observation System (EOS) Aqua and Aura spacecraft based on inferential statistics. The technique consists of three parts: calculation of definitive state estimates through orbit determination, calculation of covariance realism test statistics at each covariance propagation point, and proper assessment of those test statistics.

  15. Neutron Thermal Cross Sections, Westcott Factors, Resonance Integrals, Maxwellian Averaged Cross Sections and Astrophysical Reaction Rates Calculated from the ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0, ROSFOND-2010, CENDL-3.1 and EAF-2010 Evaluated Data Libraries

    NASA Astrophysics Data System (ADS)

    Pritychenko, B.; Mughabghab, S. F.

    2012-12-01

    We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and European activation file. Extensive analysis of newly-evaluated neutron reaction cross sections, neutron covariances, and improvements in data processing techniques motivated us to calculate nuclear industry and neutron physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented. Due to space limitations, the present paper contains only calculated Maxwellian-averaged cross sections and their uncertainties. The complete data sets for all results are published in the Brookhaven National Laboratory report.
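
    For reference, the Maxwellian-averaged cross section that such compilations tabulate has a standard definition (quoted here in its conventional form, with kT the thermal energy and v_T the thermal velocity; this is the textbook definition, not a formula specific to this paper):

    ```latex
    \sigma_{\mathrm{MACS}}(kT)
      \;=\; \frac{\langle \sigma v \rangle}{v_T}
      \;=\; \frac{2}{\sqrt{\pi}}\,\frac{1}{(kT)^{2}}
            \int_{0}^{\infty} \sigma(E)\, E\, e^{-E/kT}\, \mathrm{d}E
    ```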

  16. Sensitivity and uncertainty analysis for the tritium breeding ratio of a DEMO fusion reactor with a helium cooled pebble bed blanket

    NASA Astrophysics Data System (ADS)

    Nunnenmann, Elena; Fischer, Ulrich; Stieglitz, Robert

    2017-09-01

    An uncertainty analysis was performed for the tritium breeding ratio (TBR) of a fusion power plant of the European DEMO type using the MCSEN patch to the MCNP Monte Carlo code. The breeding blanket was of the Helium Cooled Pebble Bed (HCPB) type, currently under development in the European Power Plant Physics and Technology (PPPT) programme for a fusion power demonstration reactor (DEMO). A suitable 3D model of the DEMO reactor with HCPB blanket modules, as routinely used for blanket design calculations, was employed. The nuclear cross-section data were taken from the JEFF-3.2 data library. For the uncertainty analysis, the isotopes H-1, Li-6, Li-7, Be-9, O-16, Si-28, Si-29, Si-30, Cr-52, Fe-54, Fe-56, Ni-58, W-182, W-183, W-184 and W-186 were considered. The covariance data were taken from JEFF-3.2 where available; otherwise a combination of FENDL-2.1 for Li-7, EFF-3 for Be-9 and JENDL-3.2 for O-16 was compared with data from TENDL-2014. Another comparison was performed with covariance data from JEFF-3.3T1. The analyses show an overall uncertainty of ±3.2% for the TBR when using JEFF-3.2 covariance data with the mentioned additions. When using TENDL-2014 covariance data as a replacement, the uncertainty increases to ±8.6%. For JEFF-3.3T1 the uncertainty is ±5.6%. The uncertainty is dominated by the O-16, Li-6 and Li-7 cross-sections.

  17. Real-time probabilistic covariance tracking with efficient model update.

    PubMed

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile at a modest computational cost. The covariance matrix enables efficient fusion of different types of features, whose spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric, but within a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with a novel incremental covariance tensor learning (ICTL) scheme. To address appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter as well as temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges, including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
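
    A region covariance descriptor itself is simple to construct: stack a per-pixel feature vector over the region and take its covariance. A minimal sketch with one typical feature choice, pixel coordinates, intensity, and absolute gradients (the paper's exact feature set may differ; names are ours):

    ```python
    import numpy as np

    def covariance_descriptor(patch):
        """5x5 region covariance descriptor from a grayscale patch.

        Per-pixel feature vector: (x, y, I, |Ix|, |Iy|).
        """
        I = patch.astype(float)
        h, w = I.shape
        ys, xs = np.mgrid[0:h, 0:w]
        Iy, Ix = np.gradient(I)                  # gradients along rows, cols
        F = np.stack([xs, ys, I, np.abs(Ix), np.abs(Iy)], axis=-1).reshape(-1, 5)
        return np.cov(F, rowvar=False)
    ```

    Comparing two such descriptors is then done with a Riemannian metric on the cone of positive-definite matrices rather than a Euclidean distance, which is the role of the manifold machinery in the abstract.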

  18. Fast and accurate estimation of the covariance between pairwise maximum likelihood distances.

    PubMed

    Gil, Manuel

    2014-01-01

    Pairwise evolutionary distances are a model-based summary statistic for a set of molecular sequences. They represent the leaf-to-leaf path lengths of the underlying phylogenetic tree. Estimates of pairwise distances with overlapping paths covary because of shared mutation events. It is desirable to take this covariance structure into account to increase precision in any process that compares or combines distances. This paper introduces a fast estimator for the covariance of two pairwise maximum likelihood distances, estimated under general Markov models. The estimator is based on a conjecture (going back to Nei & Jin, 1989) which links the covariance to path lengths. It is proven here under a simple symmetric substitution model. A simulation shows that the estimator outperforms previously published ones in terms of mean squared error.

  19. Fast and accurate estimation of the covariance between pairwise maximum likelihood distances

    PubMed Central

    2014-01-01

    Pairwise evolutionary distances are a model-based summary statistic for a set of molecular sequences. They represent the leaf-to-leaf path lengths of the underlying phylogenetic tree. Estimates of pairwise distances with overlapping paths covary because of shared mutation events. It is desirable to take this covariance structure into account to increase precision in any process that compares or combines distances. This paper introduces a fast estimator for the covariance of two pairwise maximum likelihood distances, estimated under general Markov models. The estimator is based on a conjecture (going back to Nei & Jin, 1989) which links the covariance to path lengths. It is proven here under a simple symmetric substitution model. A simulation shows that the estimator outperforms previously published ones in terms of mean squared error. PMID:25279263

  20. Covariance specification and estimation to improve top-down Green House Gas emission estimates

    NASA Astrophysics Data System (ADS)

    Ghosh, S.; Lopez-Coto, I.; Prasad, K.; Whetstone, J. R.

    2015-12-01

    The National Institute of Standards and Technology (NIST) operates the North-East Corridor (NEC) project and the Indianapolis Flux Experiment (INFLUX) in order to develop measurement methods for quantifying sources of greenhouse gas (GHG) emissions, as well as their uncertainties, in urban domains using a top-down inversion method. Top-down inversion updates prior knowledge using observations in a Bayesian way. One primary consideration in a Bayesian inversion framework is the covariance structure of (1) the emission prior residuals and (2) the observation residuals (i.e. the difference between observations and model-predicted observations). These covariance matrices are respectively referred to as the prior covariance matrix and the model-data mismatch covariance matrix. It is known that the choice of these covariances can have a large effect on the estimates. The main objective of this work is to determine the impact of different covariance models on inversion estimates and their associated uncertainties in urban domains. We use a pseudo-data Bayesian inversion framework, with footprints (i.e. sensitivities of tower measurements of GHGs to surface emissions) and emission priors (based on the Hestia project to quantify fossil-fuel emissions), to estimate posterior emissions under different covariance schemes. The posterior emission estimates and uncertainties are compared to the hypothetical truth. We find that, if we correctly specify the spatial variability and spatio-temporal variability in the prior and model-data mismatch covariances respectively, then we can compute more accurate posterior estimates. We discuss a few covariance models for introducing space-time interacting mismatches, along with estimation of the involved parameters. We then compare several candidate prior spatial covariance models from the Matern covariance class and estimate their parameters with specified mismatches. We find that the best-fitted prior covariances are not always best at recovering the truth. To achieve accuracy, we perform a sensitivity study to further tune the covariance parameters. Finally, we introduce a shrinkage-based sample covariance estimation technique for both the prior and mismatch covariances. This technique allows us to achieve similar accuracy nonparametrically in a more efficient and automated way.
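
    The Matern family mentioned here has convenient closed forms at half-integer smoothness. A small sketch covering the three most common cases (parameter names are ours; sigma2 is the marginal variance, rho the length scale, nu the smoothness):

    ```python
    import numpy as np

    def matern(d, sigma2=1.0, rho=1.0, nu=1.5):
        """Matern covariance as a function of distance d (array or scalar)."""
        d = np.asarray(d, dtype=float)
        if nu == 0.5:                         # exponential covariance
            return sigma2 * np.exp(-d / rho)
        if nu == 1.5:
            a = np.sqrt(3.0) * d / rho
            return sigma2 * (1.0 + a) * np.exp(-a)
        if nu == 2.5:
            a = np.sqrt(5.0) * d / rho
            return sigma2 * (1.0 + a + a**2 / 3.0) * np.exp(-a)
        raise NotImplementedError("general nu requires the Bessel-K form")
    ```

    Applied to the pairwise distances between grid cells, such a function fills in a candidate prior covariance matrix whose parameters (rho, nu) are then estimated or tuned as described in the abstract.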

  1. Controlling Surface Plasmons Through Covariant Transformation of the Spin-Dependent Geometric Phase Between Curved Metamaterials

    NASA Astrophysics Data System (ADS)

    Zhong, Fan; Li, Jensen; Liu, Hui; Zhu, Shining

    2018-06-01

    General relativity uses curved space-time to describe accelerating frames. The movement of particles in different curved space-times can be regarded as equivalent physical processes based on the covariant transformation between different frames. In this Letter, we use one-dimensional curved metamaterials to mimic accelerating particles in curved space-times. The different curved shapes of structures are used to mimic different accelerating frames. The different geometric phases along the structure are used to mimic different movements in the frame. Using the covariant principle of general relativity, we can obtain equivalent nanostructures based on space-time transformations, such as the Lorentz transformation and conformal transformation. In this way, many covariant structures can be found that produce the same surface plasmon fields when excited by spin photons. A new kind of accelerating beam, the Rindler beam, is obtained based on the Rindler metric in gravity. Very large effective indices can be obtained in such systems based on geometric-phase gradient. This general covariant design method can be extended to many other optical media.

  2. Integrated Online Software for Libraries: An Overview of Today's Best-Selling IOLS. Options from the U.S. Perspective.

    ERIC Educational Resources Information Center

    Cibbarelli, Pamela

    1996-01-01

    Profiles the top-selling IOLS (integrated online library systems) software for libraries based on sales figures reported in the 1996 "Library Journal" annual survey of the library automation marketplace. Highlights include microcomputer-based systems and minicomputer-based systems, system components, MARC formats, and market sectors.…

  3. Auto covariance computer

    NASA Technical Reports Server (NTRS)

    Hepner, T. E.; Meyers, J. F. (Inventor)

    1985-01-01

    A laser velocimeter covariance processor which calculates the auto covariance and cross covariance functions for a turbulent flow field based on Poisson sampled measurements in time from a laser velocimeter is described. The device will process a block of data that is up to 4096 data points in length and return a 512 point covariance function with 48-bit resolution along with a 512 point histogram of the interarrival times which is used to normalize the covariance function. The device is designed to interface and be controlled by a minicomputer from which the data is received and the results returned. A typical 4096 point computation takes approximately 1.5 seconds to receive the data, compute the covariance function, and return the results to the computer.
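
    The hardware described here is essentially computing a slotted autocovariance: products of velocity fluctuations are accumulated into lag bins, and the interarrival-time histogram supplies the per-bin normalization. A software analogue, as a simple quadratic-time numpy sketch (the slotting formulation and all names are our assumptions, not the device's firmware):

    ```python
    import numpy as np

    def slotted_autocovariance(t, u, max_lag, n_bins=512):
        """Autocovariance of Poisson-sampled velocity data by lag slotting.

        t: sample arrival times, u: velocities (equal-length 1-D arrays).
        Products of velocity fluctuations are accumulated into lag bins and
        normalized by the pair count per bin (the histogram's role).
        """
        uf = u - u.mean()                        # velocity fluctuations
        lags = np.abs(t[:, None] - t[None, :])   # all pairwise lags
        prod = np.outer(uf, uf)
        bins = np.linspace(0.0, max_lag, n_bins + 1)
        idx = np.digitize(lags.ravel(), bins) - 1
        ok = (idx >= 0) & (idx < n_bins)
        acc = np.bincount(idx[ok], weights=prod.ravel()[ok], minlength=n_bins)
        cnt = np.bincount(idx[ok], minlength=n_bins)
        return bins[:-1], np.where(cnt > 0, acc / np.maximum(cnt, 1), 0.0)
    ```

    For a 4096-point block this is a brute-force O(n^2) pass; the dedicated processor achieved the same result in roughly 1.5 seconds with fixed-point hardware.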

  4. Partial covariance based functional connectivity computation using Ledoit-Wolf covariance regularization.

    PubMed

    Brier, Matthew R; Mitra, Anish; McCarthy, John E; Ances, Beau M; Snyder, Abraham Z

    2015-11-01

    Functional connectivity refers to shared signals among brain regions and is typically assessed in a task-free state. Functional connectivity is commonly quantified between signal pairs using Pearson correlation. However, resting-state fMRI is a multivariate process exhibiting a complicated covariance structure. Partial covariance assesses the unique variance shared between two brain regions excluding any widely shared variance, and hence is appropriate for the analysis of multivariate fMRI datasets. However, calculation of partial covariance requires inversion of the covariance matrix, which, in most functional connectivity studies, is not invertible owing to rank deficiency. Here we apply Ledoit-Wolf shrinkage (L2 regularization) to invert the high-dimensional BOLD covariance matrix. We investigate the network organization and brain-state dependence of partial covariance-based functional connectivity. Although RSNs are conventionally defined in terms of shared variance, removal of widely shared variance, surprisingly, improved the separation of RSNs in a spring-embedded graphical model. This result suggests that pair-wise unique shared variance plays a heretofore unrecognized role in RSN covariance organization. In addition, application of partial correlation to fMRI data acquired in the eyes-open vs. eyes-closed states revealed focal changes in uniquely shared variance between the thalamus and visual cortices. This result suggests that partial correlation of resting-state BOLD time series reflects functional processes in addition to structural connectivity.
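
    The computational core here, shrinkage-regularized inversion followed by the standard precision-to-partial-correlation transform, fits in a few lines. A hedged scikit-learn sketch (not the authors' code; names are ours):

    ```python
    import numpy as np
    from sklearn.covariance import LedoitWolf

    def partial_correlation(ts):
        """Partial correlation from a Ledoit-Wolf regularized covariance.

        ts: time x regions BOLD matrix. Shrinkage keeps the covariance
        invertible even when regions outnumber time points.
        """
        lw = LedoitWolf().fit(ts)
        P = lw.precision_                   # inverse of the shrunk covariance
        d = np.sqrt(np.diag(P))
        pcorr = -P / np.outer(d, d)         # standard precision-to-partial-corr
        np.fill_diagonal(pcorr, 1.0)
        return pcorr
    ```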

  5. Partial covariance based functional connectivity computation using Ledoit-Wolf covariance regularization

    PubMed Central

    Brier, Matthew R.; Mitra, Anish; McCarthy, John E.; Ances, Beau M.; Snyder, Abraham Z.

    2015-01-01

    Functional connectivity refers to shared signals among brain regions and is typically assessed in a task-free state. Functional connectivity is commonly quantified between signal pairs using Pearson correlation. However, resting-state fMRI is a multivariate process exhibiting a complicated covariance structure. Partial covariance assesses the unique variance shared between two brain regions excluding any widely shared variance, and hence is appropriate for the analysis of multivariate fMRI datasets. However, calculation of partial covariance requires inversion of the covariance matrix, which, in most functional connectivity studies, is not invertible owing to rank deficiency. Here we apply Ledoit-Wolf shrinkage (L2 regularization) to invert the high-dimensional BOLD covariance matrix. We investigate the network organization and brain-state dependence of partial covariance-based functional connectivity. Although RSNs are conventionally defined in terms of shared variance, removal of widely shared variance, surprisingly, improved the separation of RSNs in a spring-embedded graphical model. This result suggests that pair-wise unique shared variance plays a heretofore unrecognized role in RSN covariance organization. In addition, application of partial correlation to fMRI data acquired in the eyes-open vs. eyes-closed states revealed focal changes in uniquely shared variance between the thalamus and visual cortices. This result suggests that partial correlation of resting-state BOLD time series reflects functional processes in addition to structural connectivity. PMID:26208872

  6. Getting libraries involved in industry-university-government collaboration : Libraries should support inauguration of business and lead SME into a knowledge-based society : What Toshiaki Takeuchi does as Business Library Association's President

    NASA Astrophysics Data System (ADS)

    Morita, Utako


  7. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    PubMed

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model for analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and hence conditional inference forests have been suggested for time-to-event data. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for selecting the best covariate to split on from the search for the best split point for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most of which have more than two levels (many split-points). The second dataset is based on the survival of patients with extremely drug resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models for analysing time-to-event data whose covariates have many split-points, based on the values of the bootstrap cross-validated estimates of the integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models when analysing time-to-event data whose covariates have fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of the covariates of the dataset in question.

  8. Theory and simulations of covariance mapping in multiple dimensions for data analysis in high-event-rate experiments

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Frasinski, L. J.; Eland, J. H. D.; Feifel, R.

    2014-05-01

    Multidimensional covariance analysis and its validity for correlating processes leading to multiple products are investigated from a theoretical point of view. The need to correct for false correlations induced by experimental parameters that fluctuate from shot to shot, such as the intensity of self-amplified spontaneous emission x-ray free-electron laser pulses, is emphasized. Threefold covariance analysis based on a simple extension of the two-variable formulation is shown to be valid for variables exhibiting Poisson statistics. In this case, false correlations arising from fluctuations in an unstable experimental parameter that scale linearly with the signals can be eliminated by threefold partial covariance analysis, as defined here. Fourfold covariance based on the same simple extension is found to be invalid in general. Where fluctuations in an unstable parameter induce nonlinear signal variations, a technique of contingent covariance analysis is proposed here to suppress false correlations. In this paper we also show a method to eliminate false correlations associated with fluctuations of several unstable experimental parameters.
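
    In the two-variable case, the partial covariance map that removes the linear effect of a single monitored parameter I (e.g. pulse energy) is cov(X,Y) - cov(X,I) cov(I,Y) / var(I). A compact numpy sketch of that map over shot-resolved spectra (array names are ours):

    ```python
    import numpy as np

    def partial_covariance_map(X, Y, I):
        """Two-variable partial covariance map correcting for one parameter.

        X: shots x nx, Y: shots x ny (e.g. two spectral ranges),
        I: shots, the monitored fluctuating parameter.
        """
        Xc = X - X.mean(axis=0)
        Yc = Y - Y.mean(axis=0)
        Ic = I - I.mean()
        n = X.shape[0]
        cov_xy = Xc.T @ Yc / n
        cov_xi = Xc.T @ Ic / n            # shape (nx,)
        cov_iy = Ic @ Yc / n              # shape (ny,)
        return cov_xy - np.outer(cov_xi, cov_iy) / Ic.var()
    ```

    The threefold and contingent extensions analysed in the paper generalize exactly this subtraction to higher-order product terms and to nonlinear parameter dependence.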

  9. Treatment selection in a randomized clinical trial via covariate-specific treatment effect curves.

    PubMed

    Ma, Yunbei; Zhou, Xiao-Hua

    2017-02-01

    For time-to-event data in a randomized clinical trial, we proposed two new methods for selecting an optimal treatment for a patient based on the covariate-specific treatment effect curve, which is used to represent the clinical utility of a predictive biomarker. To select an optimal treatment for a patient with a specific biomarker value, we proposed pointwise confidence intervals for each covariate-specific treatment effect curve and for the difference between the covariate-specific treatment effect curves of two treatments. Furthermore, to select an optimal treatment for a future biomarker-defined subpopulation of patients, we proposed confidence bands for each covariate-specific treatment effect curve and for the difference between each pair of covariate-specific treatment effect curves over a fixed interval of biomarker values. We constructed the confidence bands based on a resampling technique. We also conducted simulation studies to evaluate the finite-sample properties of the proposed estimation methods. Finally, we illustrated the application of the proposed methods in a real-world data set.

  10. Views From the Pacific--Military Base Hospital Libraries in Hawaii and Guam.

    PubMed

    Stephenson, Priscilla L; Trafford, Mabel A; Hadley, Alice E

    2016-01-01

    Hospital libraries serving military bases offer a different perspective on library services. Two libraries located on islands in the Pacific Ocean provide services to active duty service men and women, including those deployed to other regions of the world. In addition, these hospital libraries serve service members' families living on the base, and often citizens from the surrounding communities.

  11. Mixture-based combinatorial libraries from small individual peptide libraries: a case study on α1-antitrypsin deficiency.

    PubMed

    Chang, Yi-Pin; Chu, Yen-Ho

    2014-05-16

    The design, synthesis and screening of diversity-oriented peptide libraries using a "libraries from libraries" strategy for the development of inhibitors of α1-antitrypsin deficiency are described. The major buttress of the biochemical approach presented here is the use of the well-established solid-phase split-and-mix method for the generation of mixture-based libraries. The combinatorial technique of iterative deconvolution was employed for library screening. While molecular diversity is the general consideration for combinatorial libraries, exquisite design through systematic screening of small individual libraries is a prerequisite for effective library screening and can avoid potential problems in some cases. This review will also illustrate how large peptide libraries were designed, as well as how a conformation-sensitive assay was developed based on the mechanism of the conformational disease. Finally, the combinatorially selected peptide inhibitor capable of blocking abnormal protein aggregation will be characterized by biophysical, cellular and computational methods.

  12. The Cost of Library Services: Activity-Based Costing in an Australian Academic Library.

    ERIC Educational Resources Information Center

    Robinson, Peter; Ellis-Newman, Jennifer

    1998-01-01

    Explains activity-based costing (ABC), discusses the benefits of ABC to library managers, and describes the steps involved in implementing ABC in an Australian academic library. Discusses the budgeting process in universities, and considers benefits to the library. (Author/LRW)

  13. Continuous Covariate Imbalance and Conditional Power for Clinical Trial Interim Analyses

    PubMed Central

    Ciolino, Jody D.; Martin, Renee' H.; Zhao, Wenle; Jauch, Edward C.; Hill, Michael D.; Palesch, Yuko Y.

    2014-01-01

    Oftentimes valid statistical analyses for clinical trials involve adjustment for known influential covariates, regardless of imbalance observed in these covariates at baseline across treatment groups. Thus, it must be the case that valid interim analyses also properly adjust for these covariates. There are situations, however, in which covariate adjustment is not possible, not planned, or simply carries less merit as it makes inferences less generalizable and less intuitive. In this case, covariate imbalance between treatment groups can have a substantial effect on both interim and final primary outcome analyses. This paper illustrates the effect of influential continuous baseline covariate imbalance on unadjusted conditional power (CP), and thus, on trial decisions based on futility stopping bounds. The robustness of the relationship is illustrated for normal, skewed, and bimodal continuous baseline covariates that are related to a normally distributed primary outcome. Results suggest that unadjusted CP calculations in the presence of influential covariate imbalance require careful interpretation and evaluation. PMID:24607294

  14. Alternative Multiple Imputation Inference for Mean and Covariance Structure Modeling

    ERIC Educational Resources Information Center

    Lee, Taehun; Cai, Li

    2012-01-01

    Model-based multiple imputation has become an indispensable method in the educational and behavioral sciences. Mean and covariance structure models are often fitted to multiply imputed data sets. However, the presence of multiple random imputations complicates model fit testing, which is an important aspect of mean and covariance structure…

  15. A Virtual "Hello": A Web-Based Orientation to the Library.

    ERIC Educational Resources Information Center

    Borah, Eloisa Gomez

    1997-01-01

    Describes the development of Web-based library services and resources available at the Rosenfeld Library of the Anderson Graduate School of Management at University of California at Los Angeles. Highlights include library orientation sessions; virtual tours of the library; a database of basic business sources; and research strategies, including…

  16. Directional variance adjustment: bias reduction in covariance matrices based on factor analysis with an application to portfolio optimization.

    PubMed

    Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W; Müller, Klaus-Robert; Lemm, Steven

    2013-01-01

    Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study of the US, European, and Hong Kong stock markets we show that our proposed method leads to improved portfolio allocation.

  17. Directional Variance Adjustment: Bias Reduction in Covariance Matrices Based on Factor Analysis with an Application to Portfolio Optimization

    PubMed Central

    Bartz, Daniel; Hatrick, Kerr; Hesse, Christian W.; Müller, Klaus-Robert; Lemm, Steven

    2013-01-01

    Robust and reliable covariance estimates play a decisive role in financial and many other applications. An important class of estimators is based on factor models. Here, we show by extensive Monte Carlo simulations that covariance matrices derived from the statistical Factor Analysis model exhibit a systematic error, which is similar to the well-known systematic error of the spectrum of the sample covariance matrix. Moreover, we introduce the Directional Variance Adjustment (DVA) algorithm, which diminishes the systematic error. In a thorough empirical study of the US, European, and Hong Kong stock markets we show that our proposed method leads to improved portfolio allocation. PMID:23844016

  18. Acoustic classification of zooplankton

    NASA Astrophysics Data System (ADS)

    Martin Traykovski, Linda V.

    1998-11-01

    Work on the forward problem in zooplankton bioacoustics has resulted in the identification of three categories of acoustic scatterers: elastic-shelled (e.g. pteropods), fluid-like (e.g. euphausiids), and gas-bearing (e.g. siphonophores). The relationship between backscattered energy and animal biomass has been shown to vary by a factor of ~19,000 across these categories, so that to make accurate estimates of zooplankton biomass from acoustic backscatter measurements of the ocean, the acoustic characteristics of the species of interest must be well-understood. This thesis describes the development of both feature based and model based classification techniques to invert broadband acoustic echoes from individual zooplankton for scatterer type, as well as for particular parameters such as animal orientation. The feature based Empirical Orthogonal Function Classifier (EOFC) discriminates scatterer types by identifying characteristic modes of variability in the echo spectra, exploiting only the inherent characteristic structure of the acoustic signatures. The model based Model Parameterisation Classifier (MPC) classifies based on correlation of observed echo spectra with simplified parameterisations of theoretical scattering models for the three classes. The Covariance Mean Variance Classifiers (CMVC) are a set of advanced model based techniques which exploit the full complexity of the theoretical models by searching the entire physical model parameter space without employing simplifying parameterisations. Three different CMVC algorithms were developed: the Integrated Score Classifier (ISC), the Pairwise Score Classifier (PSC) and the Bayesian Probability Classifier (BPC); these classifiers assign observations to a class based on similarities in covariance, mean, and variance, while accounting for model ambiguity and validity. These feature based and model based inversion techniques were successfully applied to several thousand echoes acquired from broadband (~350 kHz-750 kHz) insonifications of live zooplankton collected on Georges Bank and the Gulf of Maine to determine scatterer class. CMVC techniques were also applied to echoes from fluid-like zooplankton (Antarctic krill) to invert for angle of orientation using generic and animal-specific theoretical and empirical models. Application of these inversion techniques in situ will allow correct apportionment of backscattered energy to animal biomass, significantly improving estimates of zooplankton biomass based on acoustic surveys. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)

  19. Video based object representation and classification using multiple covariance matrices.

    PubMed

    Zhang, Yurong; Liu, Quan

    2017-01-01

    Video-based object recognition and classification has been widely studied in the computer vision and image processing areas. One main issue in this task is to develop an effective representation for video. This problem can generally be formulated as image set representation. In this paper, we present a new method called Multiple Covariance Discriminative Learning (MCDL) for the image set representation and classification problem. The core idea of MCDL is to represent an image set using multiple covariance matrices, with each covariance matrix representing one cluster of images. Firstly, we use the Nonnegative Matrix Factorization (NMF) method to cluster the images within each image set, and then adopt Covariance Discriminative Learning on each cluster (subset) of images. Finally, we adopt KLDA and a nearest-neighbor classification method for image set classification. Promising experimental results on several datasets show the effectiveness of our MCDL method.
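
    The representation step is easy to sketch: NMF soft-assigns images to clusters, and each cluster is summarized by a (regularized) covariance matrix. A hedged scikit-learn/numpy sketch of that step only; the discriminative learning and KLDA stages are omitted and all names are ours:

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    def multi_covariance_representation(images, n_clusters=3, eps=1e-6):
        """One covariance matrix per NMF cluster of an image set.

        images: n_images x n_features, nonnegative (e.g. flattened pixels).
        Cluster membership is the argmax NMF coefficient per image.
        """
        W = NMF(n_components=n_clusters, init="nndsvda",
                max_iter=500).fit_transform(images)
        labels = W.argmax(axis=1)
        covs = []
        for k in range(n_clusters):
            members = images[labels == k]
            if len(members) < 2:
                continue                      # skip degenerate clusters
            C = np.cov(members, rowvar=False) + eps * np.eye(images.shape[1])
            covs.append(C)                    # regularized to stay PSD
        return covs
    ```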

  20. Does the covariance structure matter in longitudinal modelling for the prediction of future CD4 counts?

    PubMed

    Taylor, J M; Law, N

    1998-10-30

    We investigate the importance of the assumed covariance structure for longitudinal modelling of CD4 counts, and examine how individual predictions of future CD4 counts are affected by the covariance structure. We consider four covariance structures: one based on an integrated Ornstein-Uhlenbeck stochastic process; one based on Brownian motion; and two derived from standard linear and quadratic random-effects models. Using data from the Multicenter AIDS Cohort Study and from a simulation study, we show that there is a noticeable deterioration in the coverage rate of confidence intervals if we assume the wrong covariance, as well as a loss in efficiency. The quadratic random-effects model is found to be the best in terms of correctly calibrated prediction intervals, but is substantially less efficient than the others. Incorrectly specifying the covariance structure as linear random effects gives too-narrow prediction intervals with poor coverage rates. Fitting the model based on the integrated Ornstein-Uhlenbeck stochastic process is the preferred choice of the four considered because of its efficiency and robustness properties. We also use the difference between the future predicted and observed CD4 counts to assess an appropriate transformation of CD4 counts; a fourth root, cube root and square root all appear to be reasonable choices.
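
    For reference, the integrated Ornstein-Uhlenbeck process has a known closed-form covariance. One standard parameterization from the IOU literature (with α the OU mean-reversion rate and σ² the diffusion scale; we quote the generic form rather than anything specific to this paper) is:

    ```latex
    \operatorname{Cov}\{X(s), X(t)\}
      = \frac{\sigma^{2}}{2\alpha^{3}}
        \left( 2\alpha \min(s,t) + e^{-\alpha s} + e^{-\alpha t}
               - 1 - e^{-\alpha |t-s|} \right)
    ```

    In the limit of large α with σ²/α² held fixed this structure approaches scaled Brownian motion, which is how the first two covariance models considered in the paper are related.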

  1. Application of seemingly unrelated regression in medical data with intermittently observed time-dependent covariates.

    PubMed

    Keshavarzi, Sareh; Ayatollahi, Seyyed Mohammad Taghi; Zare, Najaf; Pakfetrat, Maryam

    2012-01-01

    BACKGROUND. In many studies with longitudinal data, time-dependent covariates can only be measured intermittently (not at all observation times), and this presents difficulties for standard statistical analyses. This situation is common in medical studies, and methods that deal with this challenge would be useful. METHODS. In this study, we fitted seemingly unrelated regression (SUR)-based models, with respect to each observation time, to longitudinal data with intermittently observed time-dependent covariates, and further compared these models with mixed-effects regression models (MRMs) under three classic imputation procedures. Simulation studies were performed to compare the sample-size properties of the estimated coefficients for the different modeling choices. RESULTS. In general, the proposed models performed well in the presence of intermittently observed time-dependent covariates. However, when we considered only the observed values of the covariate, without any imputation, the resulting biases were greater. The performance of the proposed SUR-based models was nearly identical to that of MRMs with classic imputation methods, with approximately equal amounts of bias and MSE. CONCLUSION. The simulation study suggests that SUR-based models work as efficiently as MRMs in the case of intermittently observed time-dependent covariates. Thus, they can be used as an alternative to MRMs.

  2. The impact of covariance misspecification in group-based trajectory models for longitudinal data with non-stationary covariance structure.

    PubMed

    Davies, Christopher E; Glonek, Gary Fv; Giles, Lynne C

    2017-08-01

    One purpose of a longitudinal study is to gain a better understanding of how an outcome of interest changes among a given population over time. In what follows, a trajectory will be taken to mean the series of measurements of the outcome variable for an individual. Group-based trajectory modelling methods seek to identify subgroups of trajectories within a population, such that trajectories that are grouped together are more similar to each other than to trajectories in distinct groups. Group-based trajectory models generally assume a certain structure in the covariances between measurements, for example conditional independence, homogeneous variance between groups, or stationary variance over time. Violations of these assumptions can be expected to result in poor model performance. We used simulation to investigate the effect of covariance misspecification on the misclassification of trajectories in commonly used models under a range of scenarios, defining a measure of performance relative to the ideal Bayesian correct classification rate. We found that the more complex models generally performed better over a range of scenarios. In particular, incorrectly specified covariance matrices could significantly bias the results, whereas using models with a correct but more complicated than necessary covariance matrix incurred little cost.

  3. A Review of DIMPACK Version 1.0: Conditional Covariance-Based Test Dimensionality Analysis Package

    ERIC Educational Resources Information Center

    Deng, Nina; Han, Kyung T.; Hambleton, Ronald K.

    2013-01-01

    DIMPACK Version 1.0 for assessing test dimensionality based on a nonparametric conditional covariance approach is reviewed. This software was originally distributed by Assessment Systems Corporation and now can be freely accessed online. The software consists of Windows-based interfaces of three components: DIMTEST, DETECT, and CCPROX/HAC, which…

  4. Web-Based Instruction: A Guide for Libraries, Third Edition

    ERIC Educational Resources Information Center

    Smith, Susan Sharpless

    2010-01-01

    Expanding on the popular, practical how-to guide for public, academic, school, and special libraries, technology expert Susan Sharpless Smith offers library instructors the confidence to take Web-based instruction into their own hands. Smith has thoroughly updated "Web-Based Instruction: A Guide for Libraries" to include new tools and trends,…

  5. Fee Based Document Delivery by a National Library: Publishing in the New Millennium.

    ERIC Educational Resources Information Center

    Smith, Malcolm D.

    1996-01-01

    An overview of the development of document supply relationships between libraries and publishers, based on the British Library's Document Supply Centre, reveals four areas leading to fee based (copyright) document delivery: libraries as markets for publishers; making users aware of what is published; making publications more accessible; and the…

  6. Structure-based design of combinatorial mutagenesis libraries

    PubMed Central

    Verma, Deeptak; Grigoryan, Gevorg; Bailey-Kellogg, Chris

    2015-01-01

    The development of protein variants with improved properties (thermostability, binding affinity, catalytic activity, etc.) has greatly benefited from the application of high-throughput screens evaluating large, diverse combinatorial libraries. At the same time, since only a very limited portion of sequence space can be experimentally constructed and tested, an attractive possibility is to use computational protein design to focus libraries on a productive portion of the space. We present a general-purpose method, called “Structure-based Optimization of Combinatorial Mutagenesis” (SOCoM), which can optimize arbitrarily large combinatorial mutagenesis libraries directly based on structural energies of their constituents. SOCoM chooses both positions and substitutions, employing a combinatorial optimization framework based on library-averaged energy potentials in order to avoid explicitly modeling every variant in every possible library. In case study applications to green fluorescent protein, β-lactamase, and lipase A, SOCoM optimizes relatively small, focused libraries whose variants achieve energies comparable to or better than previous library design efforts, as well as larger libraries (previously not designable by structure-based methods) whose variants cover greater diversity while still maintaining substantially better energies than would be achieved by representative random library approaches. By allowing the creation of large-scale combinatorial libraries based on structural calculations, SOCoM promises to increase the scope of applicability of computational protein design and improve the hit rate of discovering beneficial variants. While designs presented here focus on variant stability (predicted by total energy), SOCoM can readily incorporate other structure-based assessments, such as the energy gap between alternative conformational or bound states. PMID:25611189

  7. Structure-based design of combinatorial mutagenesis libraries.

    PubMed

    Verma, Deeptak; Grigoryan, Gevorg; Bailey-Kellogg, Chris

    2015-05-01

    The development of protein variants with improved properties (thermostability, binding affinity, catalytic activity, etc.) has greatly benefited from the application of high-throughput screens evaluating large, diverse combinatorial libraries. At the same time, since only a very limited portion of sequence space can be experimentally constructed and tested, an attractive possibility is to use computational protein design to focus libraries on a productive portion of the space. We present a general-purpose method, called "Structure-based Optimization of Combinatorial Mutagenesis" (SOCoM), which can optimize arbitrarily large combinatorial mutagenesis libraries directly based on structural energies of their constituents. SOCoM chooses both positions and substitutions, employing a combinatorial optimization framework based on library-averaged energy potentials in order to avoid explicitly modeling every variant in every possible library. In case study applications to green fluorescent protein, β-lactamase, and lipase A, SOCoM optimizes relatively small, focused libraries whose variants achieve energies comparable to or better than previous library design efforts, as well as larger libraries (previously not designable by structure-based methods) whose variants cover greater diversity while still maintaining substantially better energies than would be achieved by representative random library approaches. By allowing the creation of large-scale combinatorial libraries based on structural calculations, SOCoM promises to increase the scope of applicability of computational protein design and improve the hit rate of discovering beneficial variants. While designs presented here focus on variant stability (predicted by total energy), SOCoM can readily incorporate other structure-based assessments, such as the energy gap between alternative conformational or bound states.

  8. Evidence based library and information practice in Australia: defining skills and knowledge.

    PubMed

    Lewis, Suzanne

    2011-06-01

    This guest feature from Suzanne Lewis, a long-time advocate of evidence based library and information practice (EBLIP) in Australia, discusses a current trend within the movement that focuses on the skills, knowledge and competencies of health librarians. In particular, the feature describes three specific Australia-based research projects, on expert searching, indigenous health and future skills requirements for the health library workforce respectively, that exemplify this trend. These projects illustrate how the evidence base can be strengthened around the skills and knowledge required to deliver services that continue to meet the changing needs of health library and information users.

  9. A Comparison of Methods for Estimating the Determinant of High-Dimensional Covariance Matrix.

    PubMed

    Hu, Zongliang; Dong, Kai; Dai, Wenlin; Tong, Tiejun

    2017-09-21

    The determinant of the covariance matrix for high-dimensional data plays an important role in statistical inference and decision. It has many real applications including statistical tests and information theory. Due to the statistical and computational challenges posed by high dimensionality, little work has been proposed in the literature on estimating the determinant of a high-dimensional covariance matrix. In this paper, we estimate the determinant of the covariance matrix using some recent proposals for estimating the high-dimensional covariance matrix. Specifically, we consider a total of eight covariance matrix estimation methods for comparison. Through extensive simulation studies, we explore and summarize some interesting comparison results among all compared methods. We also provide practical guidelines, based on the sample size, the dimension, and the correlation of the data set, for estimating the determinant of a high-dimensional covariance matrix. Finally, from the perspective of the loss function, the comparison study in this paper may also serve as a proxy for assessing the performance of covariance matrix estimation.
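
    As a minimal illustration of why plug-in estimation fails here (a synthetic sketch, not one of the paper's eight methods): when the dimension p approaches the sample size n, the log-determinant of the sample covariance collapses, while a shrinkage estimator such as Ledoit-Wolf stays much closer to the truth.

```python
# A minimal sketch, assuming identity true covariance (so log|Sigma| = 0):
# compare the log-determinant of the sample covariance with a Ledoit-Wolf
# shrinkage estimate when n barely exceeds p.
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(0)
n, p = 60, 50
X = rng.standard_normal((n, p))

sample_cov = np.cov(X, rowvar=False)
lw_cov = LedoitWolf().fit(X).covariance_

_, logdet_sample = np.linalg.slogdet(sample_cov)
_, logdet_lw = np.linalg.slogdet(lw_cov)

print(f"sample covariance: log|S|  = {logdet_sample:8.2f}")  # strongly biased low
print(f"Ledoit-Wolf:       log|S*| = {logdet_lw:8.2f}")      # much closer to 0
```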

  10. Alterations in Anatomical Covariance in the Prematurely Born

    PubMed Central

    Scheinost, Dustin; Kwon, Soo Hyun; Lacadie, Cheryl; Vohr, Betty R.; Schneider, Karen C.; Papademetris, Xenophon; Constable, R. Todd; Ment, Laura R.

    2017-01-01

Preterm (PT) birth results in long-term alterations in functional and structural connectivity, but the related changes in anatomical covariance are just beginning to be explored. To test the hypothesis that PT birth alters patterns of anatomical covariance, we investigated brain volumes of 25 PTs and 22 terms at young adulthood using magnetic resonance imaging. Using regional volumetrics, seed-based analyses, and whole brain graphs, we show that PT birth is associated with reduced volume in bilateral temporal and inferior frontal lobes, left caudate, left fusiform, and posterior cingulate for prematurely born subjects at young adulthood. Seed-based analyses demonstrate altered patterns of anatomical covariance for PTs compared with terms. PTs exhibit reduced covariance with R Brodmann area (BA) 47, Broca's area, and L BA 21, Wernicke's area, and white matter volume in the left prefrontal lobe, but increased covariance with R BA 47 and left cerebellum. Graph theory analyses demonstrate that measures of network complexity are significantly less robust in PTs compared with term controls. Volumes in regions showing group differences are significantly correlated with phonological awareness, the fundamental basis for reading acquisition, for the PTs. These data suggest both long-lasting and clinically significant alterations in the covariance in the PTs at young adulthood. PMID:26494796

  11. Mapping structural covariance networks of facial emotion recognition in early psychosis: A pilot study.

    PubMed

    Buchy, Lisa; Barbato, Mariapaola; Makowski, Carolina; Bray, Signe; MacMaster, Frank P; Deighton, Stephanie; Addington, Jean

    2017-11-01

    People with psychosis show deficits recognizing facial emotions and disrupted activation in the underlying neural circuitry. We evaluated associations between facial emotion recognition and cortical thickness using a correlation-based approach to map structural covariance networks across the brain. Fifteen people with an early psychosis provided magnetic resonance scans and completed the Penn Emotion Recognition and Differentiation tasks. Fifteen historical controls provided magnetic resonance scans. Cortical thickness was computed using CIVET and analyzed with linear models. Seed-based structural covariance analysis was done using the 'mapping anatomical correlations across the cerebral cortex' methodology. To map structural covariance networks involved in facial emotion recognition, the right somatosensory cortex and bilateral fusiform face areas were selected as seeds. Statistics were run in SurfStat. Findings showed greater cortical covariance between the right fusiform face region seed and the right orbitofrontal cortex in controls than in early psychosis subjects. Facial emotion recognition scores were not significantly associated with thickness in any region. A negative effect of Penn Differentiation scores on cortical covariance was seen between the left fusiform face area seed and the right superior parietal lobule in early psychosis subjects. Results suggest that facial emotion recognition ability is related to covariance in a temporal-parietal network in early psychosis. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Toward a clearer portrayal of confounding bias in instrumental variable applications.

    PubMed

    Jackson, John W; Swanson, Sonja A

    2015-07-01

    Recommendations for reporting instrumental variable analyses often include presenting the balance of covariates across levels of the proposed instrument and levels of the treatment. However, such presentation can be misleading as relatively small imbalances among covariates across levels of the instrument can result in greater bias because of bias amplification. We introduce bias plots and bias component plots as alternative tools for understanding biases in instrumental variable analyses. Using previously published data on proposed preference-based, geography-based, and distance-based instruments, we demonstrate why presenting covariate balance alone can be problematic, and how bias component plots can provide more accurate context for bias from omitting a covariate from an instrumental variable versus non-instrumental variable analysis. These plots can also provide relevant comparisons of different proposed instruments considered in the same data. Adaptable code is provided for creating the plots.
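
    A hedged sketch of the comparison this abstract cautions against, on simulated data (the data-generating process and all names are hypothetical): balance across instrument levels can look far better than balance across treatment levels, yet even a small imbalance is amplified when scaled by the instrument's strength.

```python
# Hypothetical simulation: standardized mean differences (SMD) of a covariate
# across levels of a proposed instrument z versus levels of the treatment.
import numpy as np

rng = np.random.default_rng(1)
n = 5000
z = rng.integers(0, 2, n)                       # proposed binary instrument
u = rng.standard_normal(n)                      # covariate of interest
treat = (0.8 * z + 0.3 * u + rng.standard_normal(n) > 0.5).astype(int)

def std_mean_diff(x, group):
    a, b = x[group == 1], x[group == 0]
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (a.mean() - b.mean()) / pooled_sd

print("SMD of u across instrument levels:", round(std_mean_diff(u, z), 3))
print("SMD of u across treatment levels: ", round(std_mean_diff(u, treat), 3))
# The near-zero imbalance across z is reassuring only before dividing by the
# instrument's (possibly weak) effect on treatment: bias amplification.
```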

  13. Individual-based versus aggregate meta-analysis in multi-database studies of pregnancy outcomes: the Nordic example of selective serotonin reuptake inhibitors and venlafaxine in pregnancy.

    PubMed

    Selmer, Randi; Haglund, Bengt; Furu, Kari; Andersen, Morten; Nørgaard, Mette; Zoëga, Helga; Kieler, Helle

    2016-10-01

    Compare analyses of a pooled data set on the individual level with aggregate meta-analysis in a multi-database study. We reanalysed data on 2.3 million births in a Nordic register-based cohort study. We compared estimated odds ratios (OR) for the effect of selective serotonin reuptake inhibitor (SSRI) and venlafaxine use in pregnancy on any cardiovascular birth defect and on the rare outcome right ventricular outflow tract obstructions (RVOTO). Common covariates included maternal age, calendar year, birth order, maternal diabetes, and co-medication. Additional covariates were added in analyses with country-optimized adjustment. The country-adjusted OR (95%CI) for any cardiovascular birth defect in the individual-based pooled analysis was 1.27 (1.17-1.39), 1.17 (1.07-1.27) adjusted for common covariates, and 1.15 (1.05-1.26) adjusted for all covariates. In fixed-effects meta-analyses, the pooled OR was 1.29 (1.19-1.41) based on crude country-specific ORs, 1.19 (1.09-1.29) adjusted for common covariates, and 1.16 (1.06-1.27) with country-optimized adjustment. In a random-effects model the adjusted OR was 1.07 (0.87-1.32). For RVOTO, the OR was 1.48 (1.15-1.89) adjusted for all covariates in the pooled data set, and 1.53 (1.19-1.96) after country-optimized adjustment. Country-specific adjusted analyses at the substance level were not possible for RVOTO. Results of fixed-effects meta-analysis and individual-based analyses of a pooled dataset were similar in this study of the association between SSRI/venlafaxine and cardiovascular birth defects. Country-optimized adjustment attenuated the estimates more than adjustment for common covariates only. When data are sparse, pooled data on the individual level are needed for adjusted analyses. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Writing in the Library? Why Not! Using Google Slides to Reinvent the Library Checkout Period

    ERIC Educational Resources Information Center

    Doyle, Matthew

    2017-01-01

    The class visit to check out books has long been a staple of school library programs nationwide. This flexible or scheduled time allows students to take advantage of the library collection and enjoy reading based on their own interests. The library provides the perfect environment for students to self-select books based on their interests, an…

  15. Cartographic symbol library considering symbol relations based on anti-aliasing graphic library

    NASA Astrophysics Data System (ADS)

    Mei, Yang; Li, Lin

    2007-06-01

    Cartographic visualization represents geographic information in map form, enabling the retrieval of useful geospatial information. In a digital environment, the cartographic symbol library is the basis of cartographic visualization and an essential component of a Geographic Information System as well. Existing cartographic symbol libraries have two flaws: display quality and symbol-relation adjustment. Statistical data presented in this paper indicate that aliasing is a major factor degrading symbol display quality on graphic display devices. Effective graphic anti-aliasing methods based on a new anti-aliasing algorithm are therefore presented and encapsulated in an anti-aliasing graphic library in the form of a Component Object Model. Furthermore, cartographic visualization should represent feature relations by correctly adjusting symbol relations, not merely by displaying individual features, but current cartographic symbol libraries lack this capability. This paper creates a cartographic symbol design model to implement symbol-relation adjustment; consequently, a cartographic symbol library based on this design model can provide cartographic visualization with relation-adjusting capability. The anti-aliasing graphic library and the cartographic symbol library were sampled, and the results show that both libraries offer better efficiency and visual quality.

  16. Inadequacy of internal covariance estimation for super-sample covariance

    NASA Astrophysics Data System (ADS)

    Lacasa, Fabien; Kunz, Martin

    2017-08-01

    We give an analytical interpretation of how subsample-based internal covariance estimators lead to biased estimates of the covariance, due to underestimating the super-sample covariance (SSC). This includes the jackknife and bootstrap methods as estimators for the full survey area, and subsampling as an estimator of the covariance of subsamples. The limitations of the jackknife covariance have been presented previously in the literature, because it is effectively a rescaling of the covariance of the subsample area. However, we point out that subsampling is also biased, though for a different reason: the subsamples are not independent, and the corresponding lack of power results in SSC underprediction. We develop the formalism in the case of cluster counts, which allows the bias of each covariance estimator to be predicted exactly. We find significant effects for small survey areas or when a low number of subsamples is used, with auto-redshift biases ranging from 0.4% to 15% for subsampling and from 5% to 75% for jackknife covariance estimates. The cross-redshift covariance is even more affected; biases range from 8% to 25% for subsampling and from 50% to 90% for jackknife. Owing to the redshift evolution of the probe, the covariances cannot be debiased by a simple rescaling factor, and an exact debiasing has the same requirements as the full SSC prediction. These results thus disfavour the use of internal covariance estimators on the data itself or on a single simulation, leaving analytical prediction and simulation suites as possible SSC predictors.
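
    A toy numerical illustration of the underprediction mechanism, under an assumed model in which Poisson patch counts share one "super-sample" fluctuation per realization; this is a cartoon of the cluster-count formalism, not a reproduction of it.

```python
# Toy model: K patches with counts ~ Poisson(lam * (1 + delta)), where delta
# is a single super-sample mode drawn once per survey realization. A jackknife
# run inside one realization cannot see the variance delta induces.
import numpy as np

rng = np.random.default_rng(2)
K, lam, sigma_delta, n_real = 50, 100.0, 0.05, 2000

totals = []
for _ in range(n_real):
    delta = rng.normal(0.0, sigma_delta)
    totals.append(rng.poisson(lam * (1.0 + delta), K).sum())
true_var = np.var(totals, ddof=1)                 # includes the SSC term

delta = rng.normal(0.0, sigma_delta)              # one realization for jackknife
counts = rng.poisson(lam * (1.0 + delta), K)
theta = np.array([np.delete(counts, i).sum() * K / (K - 1) for i in range(K)])
jk_var = (K - 1) / K * ((theta - theta.mean()) ** 2).sum()

print(f"variance across realizations: {true_var:9.0f}")  # ~ K*lam + (K*lam*sigma)^2
print(f"jackknife, one realization:   {jk_var:9.0f}")    # ~ K*lam only, misses SSC
```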

  17. America's Star Libraries, 2010: Top-Rated Libraries

    ERIC Educational Resources Information Center

    Lyons, Ray; Lance, Keith Curry

    2010-01-01

    The "LJ" Index of Public Library Service 2010, "Library Journal"'s national rating of public libraries, identifies 258 "star" libraries. Created by Ray Lyons and Keith Curry Lance, and based on 2008 data from the IMLS, it rates 7,407 public libraries. The top libraries in each group get five, four, or three stars. All included libraries, stars or…

  18. America's Star Libraries: Top-Rated Libraries

    ERIC Educational Resources Information Center

    Lance, Keith Curry; Lyons, Ray

    2009-01-01

    "Library Journal"'s national rating of public libraries, the "LJ" Index of Public Library Service 2009, Round 2, identifies 258 "star" libraries. Created by Keith Curry Lance and Ray Lyons and based on 2007 data from the IMLS, it rates 7,268 public libraries. The top libraries in each group get five, four, or three stars. All included libraries,…

  19. Covariance descriptor fusion for target detection

    NASA Astrophysics Data System (ADS)

    Cukur, Huseyin; Binol, Hamidullah; Bal, Abdullah; Yavuz, Fatih

    2016-05-01

    Target detection is one of the most important topics for military and civilian applications. To address such detection tasks, hyperspectral imaging sensors provide useful image data containing both spatial and spectral information. Target detection in hyperspectral images involves various challenging scenarios. To overcome these challenges, the covariance descriptor presents many advantages. The detection capability of the conventional covariance descriptor technique can be improved by fusion methods. In this paper, hyperspectral bands are clustered according to inter-band correlation. Target detection is then realized by fusing covariance descriptor results based on the band clusters. The proposed combination technique is denoted Covariance Descriptor Fusion (CDF). The efficiency of the CDF is evaluated by applying it to hyperspectral imagery to detect man-made objects. The obtained results show that the CDF performs better than the conventional covariance descriptor.
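
    For context, a minimal sketch of the underlying region covariance descriptor on synthetic data (the band clustering and the fusion rule are the paper's contribution and are not reproduced here); the log-Euclidean distance is one common way to compare such descriptors.

```python
# Region covariance descriptor: the covariance of per-pixel feature vectors
# over a window, here a hypothetical 16x16 window with 32 spectral bands.
import numpy as np

rng = np.random.default_rng(3)
H, W, B = 16, 16, 32
region = rng.random((H, W, B))

pixels = region.reshape(-1, B)            # one feature vector per pixel
C = np.cov(pixels, rowvar=False)          # B x B covariance descriptor

def log_euclidean_dist(C1, C2):
    # descriptors are SPD matrices; compare them after a matrix logarithm
    def logm_spd(M):
        w, V = np.linalg.eigh(M)
        return (V * np.log(np.clip(w, 1e-12, None))) @ V.T
    return np.linalg.norm(logm_spd(C1) - logm_spd(C2), "fro")

print(C.shape, round(log_euclidean_dist(C, np.eye(B)), 3))
```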

  20. Schur Complement Inequalities for Covariance Matrices and Monogamy of Quantum Correlations

    NASA Astrophysics Data System (ADS)

    Lami, Ludovico; Hirche, Christoph; Adesso, Gerardo; Winter, Andreas

    2016-11-01

    We derive fundamental constraints for the Schur complement of positive matrices, which provide an operator strengthening to recently established information inequalities for quantum covariance matrices, including strong subadditivity. This allows us to prove general results on the monogamy of entanglement and steering quantifiers in continuous variable systems with an arbitrary number of modes per party. A powerful hierarchical relation for correlation measures based on the log-determinant of covariance matrices is further established for all Gaussian states, which has no counterpart among quantities based on the conventional von Neumann entropy.

  1. Schur Complement Inequalities for Covariance Matrices and Monogamy of Quantum Correlations.

    PubMed

    Lami, Ludovico; Hirche, Christoph; Adesso, Gerardo; Winter, Andreas

    2016-11-25

    We derive fundamental constraints for the Schur complement of positive matrices, which provide an operator strengthening to recently established information inequalities for quantum covariance matrices, including strong subadditivity. This allows us to prove general results on the monogamy of entanglement and steering quantifiers in continuous variable systems with an arbitrary number of modes per party. A powerful hierarchical relation for correlation measures based on the log-determinant of covariance matrices is further established for all Gaussian states, which has no counterpart among quantities based on the conventional von Neumann entropy.
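
    For reference, the central object in the two records above, written in the block notation standard for Gaussian-state covariance matrices: the Schur complement with respect to one subsystem, and the log-determinant factorization on which determinant-based correlation measures rest.

```latex
\[
\gamma_{AB} =
\begin{pmatrix}
\gamma_A & X \\
X^{\mathsf{T}} & \gamma_B
\end{pmatrix},
\qquad
\gamma_{AB} / \gamma_B := \gamma_A - X \gamma_B^{-1} X^{\mathsf{T}},
\]
\[
\det \gamma_{AB} = \det \gamma_B \, \det\left(\gamma_{AB} / \gamma_B\right).
\]
```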

  2. Structural covariance of brain region volumes is associated with both structural connectivity and transcriptomic similarity.

    PubMed

    Yee, Yohan; Fernandes, Darren J; French, Leon; Ellegood, Jacob; Cahill, Lindsay S; Vousden, Dulcie A; Spencer Noakes, Leigh; Scholz, Jan; van Eede, Matthijs C; Nieman, Brian J; Sled, John G; Lerch, Jason P

    2018-05-18

    An organizational pattern seen in the brain, termed structural covariance, is the statistical association of pairs of brain regions in their anatomical properties. These associations, measured across a population as covariances or correlations, usually in cortical thickness or volume, are thought to reflect genetic and environmental underpinnings. Here, we examine the biological basis of structural volume covariance in the mouse brain. We first examined large-scale associations between brain region volumes using an atlas-based approach that parcellated the entire mouse brain into 318 regions over which correlations in volume were assessed, for volumes obtained from 153 mouse brain images via high-resolution MRI. We then used a seed-based approach and determined, for 108 different seed regions across the brain and using mouse gene expression and connectivity data from the Allen Institute for Brain Science, the variation in structural covariance data that could be explained by distance to seed, transcriptomic similarity to seed, and connectivity to seed. We found that, overall, correlations in structure volumes hierarchically clustered into distinct anatomical systems, similar to findings from other studies and similar to other types of networks in the brain, including structural connectivity and transcriptomic similarity networks. Across seeds, this structural covariance was significantly explained by distance (17% of the variation, up to a maximum of 49% for structural covariance to the visceral area of the cortex), transcriptomic similarity (13% of the variation, up to a maximum of 28% for structural covariance to the primary visual area) and connectivity (15% of the variation, up to a maximum of 36% for structural covariance to the intermediate reticular nucleus in the medulla) of covarying structures. Together, distance, connectivity, and transcriptomic similarity explained 37% of structural covariance, up to a maximum of 63% for structural covariance to the visceral area. Additionally, this pattern of explained variation differed spatially across the brain, with transcriptomic similarity playing a larger role in the cortex than the subcortex, while connectivity explained structural covariance best in parts of the cortex, midbrain, and hindbrain. These results suggest that both gene expression and connectivity underlie structural volume covariance, albeit to different extents depending on brain region, and that this relationship is modulated by distance. Copyright © 2018. Published by Elsevier Inc.

  3. The NASA ADS Abstract Service and the Distributed Astronomy Digital Library [and] Project Soup: Comparing Evaluations of Digital Collection Efforts [and] Cross-Organizational Access Management: A Digital Library Authentication and Authorization Architecture [and] BibRelEx: Exploring Bibliographic Databases by Visualization of Annotated Content-based Relations [and] Semantics-Sensitive Retrieval for Digital Picture Libraries [and] Encoded Archival Description: An Introduction and Overview.

    ERIC Educational Resources Information Center

    Kurtz, Michael J.; Eichorn, Guenther; Accomazzi, Alberto; Grant, Carolyn S.; Demleitner, Markus; Murray, Stephen S.; Jones, Michael L. W.; Gay, Geri K.; Rieger, Robert H.; Millman, David; Bruggemann-Klein, Anne; Klein, Rolf; Landgraf, Britta; Wang, James Ze; Li, Jia; Chan, Desmond; Wiederhold, Gio; Pitti, Daniel V.

    1999-01-01

    Includes six articles that discuss a digital library for astronomy; comparing evaluations of digital collection efforts; cross-organizational access management of Web-based resources; searching scientific bibliographic databases based on content-based relations between documents; semantics-sensitive retrieval for digital picture libraries; and…

  4. Structural covariance in the hallucinating brain: a voxel-based morphometry study

    PubMed Central

    Modinos, Gemma; Vercammen, Ans; Mechelli, Andrea; Knegtering, Henderikus; McGuire, Philip K.; Aleman, André

    2009-01-01

    Background: Neuroimaging studies have indicated that a number of cortical regions express altered patterns of structural covariance in schizophrenia. The relation between these alterations and specific psychotic symptoms is yet to be investigated. We used voxel-based morphometry to examine regional grey matter volumes and structural covariance associated with severity of auditory verbal hallucinations. Methods: We applied optimized voxel-based morphometry to volumetric magnetic resonance imaging data from 26 patients with medication-resistant auditory verbal hallucinations (AVHs); statistical inferences were made at p < 0.05 after correction for multiple comparisons. Results: Grey matter volume in the left inferior frontal gyrus was positively correlated with severity of AVHs. Hallucination severity influenced the pattern of structural covariance between this region and the left superior/middle temporal gyri, the right inferior frontal gyrus and hippocampus, and the insula bilaterally. Limitations: The results are based on self-reported severity of auditory hallucinations. Complementing them with a clinician-based instrument could have made the findings more compelling. Future studies would benefit from including a measure to control for other symptoms that may covary with AVHs and for the effects of antipsychotic medication. Conclusion: The results revealed that overall severity of AVHs modulated cortical intercorrelations between frontotemporal regions involved in language production and verbal monitoring, supporting the critical role of this network in the pathophysiology of hallucinations. PMID:19949723

  5. Using machine learning to assess covariate balance in matching studies.

    PubMed

    Linden, Ariel; Yarnold, Paul R

    2016-12-01

    In order to assess the effectiveness of matching approaches in observational studies, investigators typically present summary statistics for each observed pre-intervention covariate, with the objective of showing that matching reduces the difference in means (or proportions) between groups to as close to zero as possible. In this paper, we introduce a new approach to distinguish between study groups based on their distributions of the covariates using a machine-learning algorithm called optimal discriminant analysis (ODA). Assessing covariate balance using ODA as compared with the conventional method has several key advantages: the ability to ascertain how individuals self-select based on optimal (maximum-accuracy) cut-points on the covariates; the applicability to any variable metric and number of groups; its insensitivity to skewed data or outliers; and the use of accuracy measures that can be widely applied to all analyses. Moreover, ODA accepts analytic weights, thereby extending the assessment of covariate balance to any study design where weights are used for covariate adjustment. By comparing the two approaches using empirical data, we are able to demonstrate that using measures of classification accuracy as balance diagnostics produces results highly consistent with those obtained via the conventional approach (in our matched-pairs example, ODA revealed a weak statistically significant relationship not detected by the conventional approach). Thus, investigators should consider ODA as a robust complement, or perhaps alternative, to the conventional approach for assessing covariate balance in matching studies. © 2016 John Wiley & Sons, Ltd.
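
    A simplified, hypothetical stand-in for the idea (full ODA also handles analytic weights, multiple groups, and arbitrary variable metrics): exhaustively search cut-points on one covariate and report the maximum classification accuracy, where accuracy near chance indicates good balance.

```python
# Maximum-accuracy cut-point on a single covariate as a balance diagnostic.
import numpy as np

def max_accuracy_cutpoint(x, group):
    order = np.argsort(x)
    x, group = x[order], group[order]
    best_acc, best_cut = 0.5, None
    for c in (x[:-1] + x[1:]) / 2:            # midpoints between sorted values
        pred = (x > c).astype(int)
        acc = max((pred == group).mean(), (pred != group).mean())
        if acc > best_acc:
            best_acc, best_cut = acc, c
    return best_cut, best_acc

rng = np.random.default_rng(4)
treated = rng.normal(0.2, 1.0, 200)           # residual imbalance after matching
control = rng.normal(0.0, 1.0, 200)
x = np.concatenate([treated, control])
g = np.concatenate([np.ones(200, int), np.zeros(200, int)])
cut, acc = max_accuracy_cutpoint(x, g)
print(f"optimal cut {cut:.2f}, accuracy {acc:.3f}")  # near 0.5 -> good balance
```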

  6. FoodPro: A Web-Based Tool for Evaluating Covariance and Correlation NMR Spectra Associated with Food Processes

    PubMed Central

    Chikayama, Eisuke; Yamashina, Ryo; Komatsu, Keiko; Tsuboi, Yuuri; Sakata, Kenji; Kikuchi, Jun; Sekiyama, Yasuyo

    2016-01-01

    Foods from agriculture and fishery products are processed using various technologies. Molecular mixture analysis during food processing has the potential to help us understand the molecular mechanisms involved, thus enabling better cooking of the analyzed foods. To date, there has been no web-based tool focusing on accumulating Nuclear Magnetic Resonance (NMR) spectra from various types of food processing. Therefore, we have developed a novel web-based tool, FoodPro, that includes a food NMR spectrum database and computes covariance and correlation spectra to tasting and hardness. As a result, FoodPro has accumulated 236 aqueous (extracted in D2O) and 131 hydrophobic (extracted in CDCl3) experimental bench-top 60-MHz NMR spectra, 1753 tastings scored by volunteers, and 139 hardness measurements recorded by a penetrometer, all placed into a core database. The database content was roughly classified into fish and vegetable groups from the viewpoint of different spectrum patterns. FoodPro can query a user food NMR spectrum, search similar NMR spectra with a specified similarity threshold, and then compute estimated tasting and hardness, covariance, and correlation spectra to tasting and hardness. Querying fish spectra exemplified specific covariance spectra to tasting and hardness, giving positive covariance for tasting at 1.31 ppm for lactate and 3.47 ppm for glucose and a positive covariance for hardness at 3.26 ppm for trimethylamine N-oxide. PMID:27775560

  7. FoodPro: A Web-Based Tool for Evaluating Covariance and Correlation NMR Spectra Associated with Food Processes.

    PubMed

    Chikayama, Eisuke; Yamashina, Ryo; Komatsu, Keiko; Tsuboi, Yuuri; Sakata, Kenji; Kikuchi, Jun; Sekiyama, Yasuyo

    2016-10-19

    Foods from agriculture and fishery products are processed using various technologies. Molecular mixture analysis during food processing has the potential to help us understand the molecular mechanisms involved, thus enabling better cooking of the analyzed foods. To date, there has been no web-based tool focusing on accumulating Nuclear Magnetic Resonance (NMR) spectra from various types of food processing. Therefore, we have developed a novel web-based tool, FoodPro, that includes a food NMR spectrum database and computes covariance and correlation spectra to tasting and hardness. As a result, FoodPro has accumulated 236 aqueous (extracted in D₂O) and 131 hydrophobic (extracted in CDCl₃) experimental bench-top 60-MHz NMR spectra, 1753 tastings scored by volunteers, and 139 hardness measurements recorded by a penetrometer, all placed into a core database. The database content was roughly classified into fish and vegetable groups from the viewpoint of different spectrum patterns. FoodPro can query a user food NMR spectrum, search similar NMR spectra with a specified similarity threshold, and then compute estimated tasting and hardness, covariance, and correlation spectra to tasting and hardness. Querying fish spectra exemplified specific covariance spectra to tasting and hardness, giving positive covariance for tasting at 1.31 ppm for lactate and 3.47 ppm for glucose and a positive covariance for hardness at 3.26 ppm for trimethylamine N-oxide.
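
    The core computation described above reduces to covariance and correlation "spectra" between each chemical-shift point and an external response; a sketch on synthetic data (the planted peak and every number are hypothetical) follows.

```python
# Covariance/correlation spectrum of NMR intensities against a taste score.
import numpy as np

rng = np.random.default_rng(5)
n_samples, n_points = 40, 600
ppm = np.linspace(0.5, 6.0, n_points)
spectra = rng.random((n_samples, n_points))
taste = 2.0 * spectra[:, 300] + rng.normal(0, 0.1, n_samples)  # one peak drives taste

Xc = spectra - spectra.mean(axis=0)
yc = taste - taste.mean()
cov_spec = (Xc * yc[:, None]).mean(axis=0)                  # covariance spectrum
corr_spec = cov_spec / (Xc.std(axis=0) * yc.std() + 1e-12)  # correlation spectrum

print(f"largest positive covariance at {ppm[np.argmax(cov_spec)]:.2f} ppm")
```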

  8. The probability of misassociation between neighboring targets

    NASA Astrophysics Data System (ADS)

    Areta, Javier A.; Bar-Shalom, Yaakov; Rothrock, Ronald

    2008-04-01

    This paper presents procedures to calculate the probability that the measurement originating from an extraneous target will be (mis)associated with a target of interest, for the cases of Nearest Neighbor and Global association. It is shown that these misassociation probabilities depend, under certain assumptions, on a particular covariance-weighted norm of the difference between the targets' predicted measurements. For Nearest Neighbor association, the exact solution, obtained for the case of equal innovation covariances, is based on a noncentral chi-square distribution. An approximate solution is also presented for the case of unequal innovation covariances. For the Global case, an approximation is presented for the case of "similar" innovation covariances. In the general case of unequal innovation covariances, where this approximation fails, an exact method based on the inversion of the characteristic function is presented. The theoretical results, confirmed by Monte Carlo simulations, quantify the benefit of Global versus Nearest Neighbor association. These results are applied to problems of single-sensor tracking as well as multiple-sensor tracking with a centralized fusion architecture.
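
    A hedged two-dimensional illustration of the Nearest Neighbor case with equal innovation covariances: the noncentrality parameter is the covariance-weighted separation of the predicted measurements, and the misassociation probability is estimated here by Monte Carlo rather than through the paper's exact noncentral chi-square expression.

```python
# Monte Carlo estimate of P(extraneous measurement is nearer, in the
# covariance-weighted norm, than the target's own measurement); all numbers
# are illustrative.
import numpy as np

rng = np.random.default_rng(6)
S = np.array([[2.0, 0.5], [0.5, 1.0]])          # common innovation covariance
Si = np.linalg.inv(S)
dz = np.array([1.5, 0.8])                        # predicted-measurement difference
lam = dz @ Si @ dz                               # covariance-weighted norm^2
L = np.linalg.cholesky(S)

n = 200_000
own = (L @ rng.standard_normal((2, n))).T        # target's own innovation
ext = dz + (L @ rng.standard_normal((2, n))).T   # extraneous measurement
d_own = np.einsum("ij,jk,ik->i", own, Si, own)
d_ext = np.einsum("ij,jk,ik->i", ext, Si, ext)

print(f"noncentrality = {lam:.2f}, P(misassociation) ~ {(d_ext < d_own).mean():.3f}")
```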

  9. Library-Based Learning in an Information Society.

    ERIC Educational Resources Information Center

    Breivik, Patricia Senn

    1986-01-01

    The average academic library has great potential for quality nonclassroom learning benefiting students, faculty, alumni, and the local business community. The major detriments are the limited perceptions about libraries and librarians among campus administrators and faculty. Library-based learning should be planned to be assimilated into overall…

  10. Applications of Multidimensional Item Response Theory Models with Covariates to Longitudinal Test Data. Research Report. ETS RR-16-21

    ERIC Educational Resources Information Center

    Fu, Jianbin

    2016-01-01

    The multidimensional item response theory (MIRT) models with covariates proposed by Haberman and implemented in the "mirt" program provide a flexible way to analyze data based on item response theory. In this report, we discuss applications of the MIRT models with covariates to longitudinal test data to measure skill differences at the…

  11. Standards for hospital libraries 2002

    PubMed Central

    Gluck, Jeannine Cyr; Hassig, Robin Ackley; Balogh, Leeni; Bandy, Margaret; Doyle, Jacqueline Donaldson; Kronenfeld, Michael R.; Lindner, Katherine Lois; Murray, Kathleen; Petersen, JoAn; Rand, Debra C.

    2002-01-01

    The Medical Library Association's “Standards for Hospital Libraries 2002” have been developed as a guide for hospital administrators, librarians, and accrediting bodies to ensure that hospitals have the resources and services to effectively meet their needs for knowledge-based information. Specific requirements for knowledge-based information include that the library be a separate department with its own budget. Knowledge-based information in the library should be directed by a qualified librarian who functions as a department head and is a member of the Academy of Health Information Professionals. The standards define the role of the medical librarian and the links between knowledge-based information and other functions such as patient care, patient education, performance improvement, and education. In addition, the standards address the development and implementation of the knowledge-based information needs assessment and plans, the promotion and publicity of the knowledge-based information services, and the physical space and staffing requirements. The role, qualifications, and functions of a hospital library consultant are outlined. The health sciences library is positioned to play a key role in the hospital. The increasing use of the Internet and new information technologies by medical, nursing, and allied health staffs; patients; and the community require new strategies, strategic planning, allocation of adequate resources, and selection and evaluation of appropriate information resources and technologies. The Hospital Library Standards Committee has developed this document as a guideline to be used in facing these challenges. Editor's Note: The “Standards for Hospital Libraries 2002” were approved by the members of the Hospital Library Section during MLA '02 in Dallas, Texas. They were subsequently approved by Section Council and received final approval from the MLA Board of Directors in June 2002. They succeed the Standards for Hospital Libraries published in 1994 and the Minimum Standards for Health Sciences Libraries in Hospitals from 1983. A Frequently Asked Questions document discussing the development of the new standards can be found on the Hospital Library Section Website at http://www.hls.mlanet.org. PMID:12398254

  12. Covariate analysis of bivariate survival data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  13. External Aiding Methods for IMU-Based Navigation

    DTIC Science & Technology

    2016-11-26

Carlo simulation and particle filtering. This approach allows for the utilization of highly complex systems in a black box configuration with minimal...alternative method, which has the advantage of being less computationally demanding, is to use a Kalman filtering-based approach. The particular...Kalman filtering-based approach used here is known as linear covariance analysis. In linear covariance analysis, the nonlinear systems describing the

  14. Standards for Medical Library Technicians, Medical Library Association.

    ERIC Educational Resources Information Center

    Medical Library Association, Chicago, IL.

    A medical library technician is a semiprofessional library employee whose duties require knowledge and skill based on a minimum of two years' general college education that includes library instruction beyond the clerical level. The medical library technician must have a practical knowledge of library functions and services, an understanding of…

  15. School Librarians' Experiences with Evidence-Based Library and Information Practice

    ERIC Educational Resources Information Center

    Richey, Jennifer; Cahill, Maria

    2014-01-01

    Evidence-based library and information practice (EBLIP) provides school librarians a systematic means of building, assessing, and revising a library program, thus demonstrating a school library program's worth to the larger school community. Through survey research collecting both qualitative and quantitative data, 111 public school librarians in…

  16. LISPA (Library and Information Center Staff Planning Advisor): A Microcomputer-Based System.

    ERIC Educational Resources Information Center

    Devadason, F. J.; Vespry, H. A.

    1996-01-01

    Describes LISPA (Library and Information Center Staff Planning Advisor), a set of programs based on Ranganathan's staff plan model. LISPA particularly aids in planning for library staff requirements, both professional and paraprofessional, in developing countries where automated systems for other library operations are not yet available.…

  17. The mathematics of a successful deconvolution: a quantitative assessment of mixture-based combinatorial libraries screened against two formylpeptide receptors.

    PubMed

    Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia

    2013-05-30

    In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.

  18. Robust Averaging of Covariances for EEG Recordings Classification in Motor Imagery Brain-Computer Interfaces.

    PubMed

    Uehara, Takashi; Sartori, Matteo; Tanaka, Toshihisa; Fiori, Simone

    2017-06-01

    The estimation of covariance matrices is of prime importance to analyze the distribution of multivariate signals. In motor imagery-based brain-computer interfaces (MI-BCI), covariance matrices play a central role in the extraction of features from recorded electroencephalograms (EEGs); therefore, correctly estimating covariance is crucial for EEG classification. This letter discusses algorithms to average sample covariance matrices (SCMs) for the selection of the reference matrix in tangent space mapping (TSM)-based MI-BCI. Tangent space mapping is a powerful method of feature extraction and strongly depends on the selection of a reference covariance matrix. In general, the observed signals may include outliers; therefore, taking the geometric mean of SCMs as the reference matrix may not be the best choice. In order to deal with the effects of outliers, robust estimators have to be used. In particular, we discuss and test the use of geometric medians and trimmed averages (defined on the basis of several metrics) as robust estimators. The main idea behind trimmed averages is to eliminate data that exhibit the largest distance from the average covariance calculated on the basis of all available data. The results of the experiments show that while the geometric medians differ little from conventional methods in terms of classification accuracy on electroencephalographic recordings, the trimmed averages yield significant improvement for all subjects.
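
    A sketch of the two robust estimators under the Euclidean (Frobenius) metric only (the letter also treats Riemannian metrics), showing how a single outlier SCM corrupts the plain mean but not the geometric median or the trimmed average.

```python
# Weiszfeld-style geometric median and a trimmed average of SCMs.
import numpy as np

def geometric_median(mats, n_iter=100, eps=1e-9):
    M = mats.mean(axis=0)
    for _ in range(n_iter):
        d = np.array([np.linalg.norm(C - M, "fro") for C in mats])
        w = 1.0 / np.maximum(d, eps)
        M = np.tensordot(w, mats, axes=1) / w.sum()
    return M

def trimmed_average(mats, trim=0.2):
    d = np.array([np.linalg.norm(C - mats.mean(axis=0), "fro") for C in mats])
    keep = np.argsort(d)[: int(np.ceil((1 - trim) * len(mats)))]
    return mats[keep].mean(axis=0)

rng = np.random.default_rng(7)
scms = [np.cov(rng.standard_normal((100, 4)), rowvar=False) for _ in range(19)]
scms.append(25.0 * np.eye(4))                   # one gross outlier trial
mats = np.array(scms)                           # true covariance is ~ identity

for name, M in [("mean", mats.mean(axis=0)),
                ("geometric median", geometric_median(mats)),
                ("trimmed average", trimmed_average(mats))]:
    print(f"{name:17s} error = {np.linalg.norm(M - np.eye(4)):.2f}")
```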

  19. Alterations in Anatomical Covariance in the Prematurely Born.

    PubMed

    Scheinost, Dustin; Kwon, Soo Hyun; Lacadie, Cheryl; Vohr, Betty R; Schneider, Karen C; Papademetris, Xenophon; Constable, R Todd; Ment, Laura R

    2017-01-01

    Preterm (PT) birth results in long-term alterations in functional and structural connectivity, but the related changes in anatomical covariance are just beginning to be explored. To test the hypothesis that PT birth alters patterns of anatomical covariance, we investigated brain volumes of 25 PTs and 22 terms at young adulthood using magnetic resonance imaging. Using regional volumetrics, seed-based analyses, and whole brain graphs, we show that PT birth is associated with reduced volume in bilateral temporal and inferior frontal lobes, left caudate, left fusiform, and posterior cingulate for prematurely born subjects at young adulthood. Seed-based analyses demonstrate altered patterns of anatomical covariance for PTs compared with terms. PTs exhibit reduced covariance with R Brodmann area (BA) 47, Broca's area, and L BA 21, Wernicke's area, and white matter volume in the left prefrontal lobe, but increased covariance with R BA 47 and left cerebellum. Graph theory analyses demonstrate that measures of network complexity are significantly less robust in PTs compared with term controls. Volumes in regions showing group differences are significantly correlated with phonological awareness, the fundamental basis for reading acquisition, for the PTs. These data suggest both long-lasting and clinically significant alterations in the covariance in the PTs at young adulthood. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  20. Copper benchmark experiment for the testing of JEFF-3.2 nuclear data for fusion applications

    NASA Astrophysics Data System (ADS)

    Angelone, M.; Flammini, D.; Loreti, S.; Moro, F.; Pillon, M.; Villar, R.; Klix, A.; Fischer, U.; Kodeli, I.; Perel, R. L.; Pohorecky, W.

    2017-09-01

    A neutronics benchmark experiment on a pure copper block (dimensions 60 × 70 × 70 cm³), aimed at testing and validating recent nuclear data libraries for fusion applications, was performed within the framework of the European Fusion Program at the 14 MeV ENEA Frascati Neutron Generator (FNG). Reaction rates, neutron flux spectra and doses were measured using different experimental techniques (e.g. activation foil techniques, an NE213 scintillator and thermoluminescent detectors). This paper first summarizes the analyses of the experiment carried out using the MCNP5 Monte Carlo code and the European JEFF-3.2 library. Large discrepancies between calculation (C) and experiment (E) were found for the reaction rates in both the high and low neutron energy ranges. The analysis was complemented by sensitivity/uncertainty (S/U) analyses using the deterministic SUSD3D and Monte Carlo MCSEN codes, respectively. The S/U analyses made it possible to identify the cross sections and energy ranges that most affect the calculated responses. The largest discrepancy among the C/E values was observed for the thermal (capture) reactions, indicating severe deficiencies in the 63,65Cu capture and elastic cross sections at low rather than high energies. Deterministic and MC codes produced similar results. The 14 MeV copper experiment and its analysis thus call for a revision of the JEFF-3.2 copper cross section and covariance data evaluation. A new analysis of the experiment was performed with the MCNP5 code using the revised JEFF-3.3-T2 library released by NEA and a new, not yet distributed, revised JEFF-3.2 Cu evaluation produced by KIT. A noticeable improvement of the C/E results was obtained with both new libraries.

  1. Propagation of neutron-reaction uncertainties through multi-physics models of novel LWR's

    NASA Astrophysics Data System (ADS)

    Hernandez-Solis, Augusto; Sjöstrand, Henrik; Helgesson, Petter

    2017-09-01

    The novel design of the renewable boiling water reactor (RBWR) allows a breeding ratio greater than unity and thus aims to provide a self-sustained fuel cycle. The neutron reactions underlying the different microscopic cross-sections and angular distributions are uncertain, so a methodology is needed to account for the associated uncertainties when these data are employed to determine the spatial distribution of the neutron flux in a nuclear reactor. In this work, the Total Monte Carlo (TMC) method is used to propagate the covariances of the different neutron reactions (as well as angular distributions) that are part of the TENDL-2014 nuclear data (ND) library. The main objective is to propagate them through coupled neutronic and thermal-hydraulic models in order to assess the uncertainty of important multi-physics safety parameters, such as the peak cladding temperature along the axial direction of an RBWR fuel assembly. In particular, this study quantifies the impact that the ND covariances of important nuclides such as U-235, U-238, Pu-239 and the thermal scattering of hydrogen in H2O have on the deterministic safety analysis of novel nuclear reactor designs.
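
    Schematically, TMC is a loop over random nuclear data files: sample, run the model, collect the outputs, and read the propagated uncertainty off their spread. The sketch below uses a stand-in model() in place of the coupled neutronic/thermal-hydraulic calculation, so every number is hypothetical.

```python
# Total Monte Carlo skeleton with a hypothetical stand-in for the coupled
# calculation; in real TMC each iteration reads one random TENDL file and
# runs the full transport / thermal-hydraulics chain.
import numpy as np

rng = np.random.default_rng(8)

def model(nd_sample):
    # stand-in for a coupled calculation returning a peak cladding
    # temperature (K) as a function of perturbed nuclear data
    return 600.0 + 15.0 * nd_sample.sum()

outputs = np.array([model(rng.normal(0.0, 1.0, 3)) for _ in range(300)])

# the spread of the outputs is the propagated nuclear-data uncertainty
# (less any statistical noise of the transport code itself)
print(f"PCT = {outputs.mean():.1f} K +/- {outputs.std(ddof=1):.1f} K")
```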

  2. Library Buildings Section. Papers.

    ERIC Educational Resources Information Center

    International Federation of Library Associations, The Hague (Netherlands).

    Papers on library architecture, which were presented at the 1982 International Federation of Library Associations (IFLA) conference focus on the effect of library networks on library design. Topics include: (1) "Some Problems in Designing of the University Library Buildings in China: A Developing Country University Librarian's View Based on…

  3. State Virtual Libraries

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2003-01-01

    Virtual library? Electronic library? Digital library? Online information network? These all apply to the growing number of Web-based resource collections managed by consortiums of state library entities. Some, like "INFOhio" and "KYVL" ("Kentucky Virtual Library"), have been available for a few years, but others are just starting. Searching for…

  4. Implementing a Knowledge-Based Library Information System with Typed Horn Logic.

    ERIC Educational Resources Information Center

    Ait-Kaci, Hassan; And Others

    1990-01-01

    Describes a prototype library expert system called BABEL which uses a new programming language, LOGIN, that combines the idea of attribute inheritance with logic programming. Use of hierarchical classification of library objects to build a knowledge base for a library information system is explained, and further research is suggested. (11…

  5. PuLSE: Quality control and quantification of peptide sequences explored by phage display libraries.

    PubMed

    Shave, Steven; Mann, Stefan; Koszela, Joanna; Kerr, Alastair; Auer, Manfred

    2018-01-01

    The design of highly diverse phage display libraries is based on the assumption that DNA bases are incorporated at similar rates within the randomized sequence. As library complexity increases and the expected copy numbers of unique sequences decrease, the exploration of library space becomes sparser and the presence of truly random sequences becomes critical. We present the program PuLSE (Phage Library Sequence Evaluation) as a tool for assessing the randomness, and therefore diversity, of phage display libraries. PuLSE runs on a collection of sequence reads in the fastq file format and generates tables profiling the library in terms of unique DNA sequence counts and positions, translated peptide sequences, and normalized 'expected' occurrences from base to residue codon frequencies. The output allows at-a-glance quantitative quality control of a phage library in terms of sequence coverage, both at the DNA base and translated protein residue level, which has been missing from existing toolsets and the literature. The open source program PuLSE is available in two formats: a C++ source code package for compilation and integration into existing bioinformatics pipelines, and precompiled binaries for ease of use.
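
    PuLSE itself is a C++ program; the sketch below only illustrates, in Python with Biopython, the kind of bookkeeping it automates. The file name, the randomized-region coordinates, and the seven-codon layout are all hypothetical.

```python
# Count unique DNA and peptide sequences over a randomized region of fastq
# reads, then profile per-position residue frequencies for comparison with
# what the degenerate codon scheme (e.g. NNK) would predict.
from collections import Counter
from Bio import SeqIO
from Bio.Seq import Seq

START, END = 12, 33                       # hypothetical 7-codon randomized region

dna_counts, pep_counts = Counter(), Counter()
for record in SeqIO.parse("library_reads.fastq", "fastq"):   # hypothetical file
    region = str(record.seq[START:END])
    if len(region) == END - START and "N" not in region:
        dna_counts[region] += 1
        pep_counts[str(Seq(region).translate())] += 1

print(f"{len(dna_counts)} unique DNA sequences, {len(pep_counts)} unique peptides")
for pos in range(7):                      # observed residue frequencies by position
    freqs = Counter(pep[pos] for pep in pep_counts.elements())
    total = sum(freqs.values())
    print(pos, {aa: round(c / total, 3) for aa, c in freqs.most_common(3)})
```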

  6. Structural Covariance of the Default Network in Healthy and Pathological Aging

    PubMed Central

    Turner, Gary R.

    2013-01-01

    Significant progress has been made uncovering functional brain networks, yet little is known about the corresponding structural covariance networks. The default network's functional architecture has been shown to change over the course of healthy and pathological aging. We examined cross-sectional and longitudinal datasets to reveal the structural covariance of the human default network across the adult lifespan and through the progression of Alzheimer's disease (AD). We used a novel approach to identify the structural covariance of the default network and derive individual participant scores that reflect the covariance pattern in each brain image. A seed-based multivariate analysis was conducted on structural images in the cross-sectional OASIS (N = 414) and longitudinal Alzheimer's Disease Neuroimaging Initiative (N = 434) datasets. We reproduced the distributed topology of the default network, based on a posterior cingulate cortex seed, consistent with prior reports of this intrinsic connectivity network. Structural covariance of the default network scores declined in healthy and pathological aging. Decline was greatest in the AD cohort and in those who progressed from mild cognitive impairment to AD. Structural covariance of the default network scores were positively associated with general cognitive status, reduced in APOEε4 carriers versus noncarriers, and associated with CSF biomarkers of AD. These findings identify the structural covariance of the default network and characterize changes to the network's gray matter integrity across the lifespan and through the progression of AD. The findings provide evidence for the large-scale network model of neurodegenerative disease, in which neurodegeneration spreads through intrinsically connected brain networks in a disease specific manner. PMID:24048852

  7. Dynamic Tasking of Networked Sensors Using Covariance Information

    DTIC Science & Technology

    2010-09-01

has been created under an effort called TASMAN (Tasking Autonomous Sensors in a Multiple Application Network). One of the first studies utilizing this...environment was focused on a novel resource management approach, namely covariance-based tasking. Under this scheme, the state error covariance of...resident space objects (RSO), sensor characteristics, and sensor-target geometry were used to determine the effectiveness of future observations in

  8. Using Item-Type Performance Covariance to Improve the Skill Model of an Existing Tutor

    ERIC Educational Resources Information Center

    Pavlik, Philip I., Jr.; Cen, Hao; Wu, Lili; Koedinger, Kenneth R.

    2008-01-01

    Using data from an existing pre-algebra computer-based tutor, we analyzed the covariance of item-types with the goal of describing a more effective way to assign skill labels to item-types. Analyzing covariance is important because it allows us to place the skills in a related network in which we can identify the role each skill plays in learning…

  9. Tests for detecting overdispersion in models with measurement error in covariates.

    PubMed

    Yang, Yingsi; Wong, Man Yu

    2015-11-30

    Measurement error in covariates can affect the accuracy of count data modeling and analysis. In overdispersion identification, the true mean-variance relationship can be obscured by measurement error in covariates. In this paper, we propose three tests for detecting overdispersion when covariates are measured with error: a modified score test and two score tests based on the proposed approximate likelihood and quasi-likelihood, respectively. The proposed approximate likelihood is derived under the classical measurement error model, and the resulting approximate maximum likelihood estimator is shown to have superior efficiency. Simulation results also show that the score test based on the approximate likelihood outperforms the test based on quasi-likelihood and other alternatives in terms of empirical power. By analyzing a real dataset containing health-related quality-of-life measurements of a particular group of patients, we demonstrate the importance of the proposed methods by showing that analyses with and without measurement error correction yield significantly different results. Copyright © 2015 John Wiley & Sons, Ltd.
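
    For orientation (this is the classical auxiliary-regression overdispersion check, not the paper's measurement-error-corrected score tests): after a Poisson fit, regress ((y - mu)^2 - y)/mu on mu; a significantly positive slope signals overdispersion.

```python
# Classical overdispersion check on simulated negative-binomial counts.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 2000
x = rng.standard_normal(n)
mu = np.exp(0.3 + 0.5 * x)
y = rng.negative_binomial(5, 5 / (5 + mu))     # overdispersed counts with mean mu

pois = sm.GLM(y, sm.add_constant(x), family=sm.families.Poisson()).fit()
mu_hat = pois.mu

aux_y = ((y - mu_hat) ** 2 - y) / mu_hat       # Cameron-Trivedi auxiliary outcome
aux = sm.OLS(aux_y, mu_hat).fit()              # regressor = mu_hat, no intercept
print(f"alpha-hat = {aux.params[0]:.3f}, t = {aux.tvalues[0]:.1f}")  # t >> 2
```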

  10. Estimation of Noise Properties for TV-regularized Image Reconstruction in Computed Tomography

    PubMed Central

    Sánchez, Adrian A.

    2016-01-01

    A method for predicting the image covariance resulting from total-variation-penalized iterative image reconstruction (TV-penalized IIR) is presented and demonstrated in a variety of contexts. The method is validated against the sample covariance from statistical noise realizations for a small image using a variety of comparison metrics. Potential applications for the covariance approximation include investigation of image properties such as object- and signal-dependence of noise, and noise stationarity. These applications are demonstrated, along with the construction of image pixel variance maps for two-dimensional 128 × 128 pixel images. Methods for extending the proposed covariance approximation to larger images and improving computational efficiency are discussed. Future work will apply the developed methodology to the construction of task-based image quality metrics such as the Hotelling observer detectability for TV-based IIR. PMID:26308968

  11. Estimation of noise properties for TV-regularized image reconstruction in computed tomography.

    PubMed

    Sánchez, Adrian A

    2015-09-21

    A method for predicting the image covariance resulting from total-variation-penalized iterative image reconstruction (TV-penalized IIR) is presented and demonstrated in a variety of contexts. The method is validated against the sample covariance from statistical noise realizations for a small image using a variety of comparison metrics. Potential applications for the covariance approximation include investigation of image properties such as object- and signal-dependence of noise, and noise stationarity. These applications are demonstrated, along with the construction of image pixel variance maps for two-dimensional 128 × 128 pixel images. Methods for extending the proposed covariance approximation to larger images and improving computational efficiency are discussed. Future work will apply the developed methodology to the construction of task-based image quality metrics such as the Hotelling observer detectability for TV-based IIR.

  12. Estimation of noise properties for TV-regularized image reconstruction in computed tomography

    NASA Astrophysics Data System (ADS)

    Sánchez, Adrian A.

    2015-09-01

    A method for predicting the image covariance resulting from total-variation-penalized iterative image reconstruction (TV-penalized IIR) is presented and demonstrated in a variety of contexts. The method is validated against the sample covariance from statistical noise realizations for a small image using a variety of comparison metrics. Potential applications for the covariance approximation include investigation of image properties such as object- and signal-dependence of noise, and noise stationarity. These applications are demonstrated, along with the construction of image pixel variance maps for two-dimensional 128 × 128 pixel images. Methods for extending the proposed covariance approximation to larger images and improving computational efficiency are discussed. Future work will apply the developed methodology to the construction of task-based image quality metrics such as the Hotelling observer detectability for TV-based IIR.
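
    The validation baseline in the three records above is the sample covariance over repeated noise realizations; the sketch below applies that baseline to a tiny ridge-regularized reconstruction (linear, unlike TV, so a closed-form covariance exists for comparison).

```python
# Empirical reconstruction covariance from noise realizations versus the
# closed-form covariance of a linear (ridge) reconstructor; all sizes toy.
import numpy as np

rng = np.random.default_rng(12)
n_pix, n_meas, sigma, beta = 16, 32, 0.05, 0.1
A = rng.standard_normal((n_meas, n_pix))          # toy system matrix
x_true = rng.random(n_pix)

R = np.linalg.inv(A.T @ A + beta * np.eye(n_pix)) @ A.T   # linear reconstructor

recons = np.array([R @ (A @ x_true + rng.normal(0.0, sigma, n_meas))
                   for _ in range(20000)])
C_sample = np.cov(recons, rowvar=False)           # the validation baseline
C_predicted = sigma**2 * R @ R.T                  # closed form, linear case only

print(f"max |C_sample - C_predicted| = {np.abs(C_sample - C_predicted).max():.2e}")
```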

  13. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
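
    The FOSM step at the heart of the QDE approaches is compact enough to show directly; the toy head model and the numbers below are illustrative, not the paper's MODFLOW-2000 example.

```python
# FOSM: push the input covariance through the model Jacobian, C_out = J C_in J^T.
import numpy as np

def head_model(k):
    # hypothetical 2-output model: heads as a function of three conductivities
    return np.array([3.0 * k[0] + 0.5 * k[1],
                     0.2 * k[0] + 2.0 * k[1] + 0.1 * k[2]])

k0 = np.array([1.0, 0.8, 1.2])
C_in = np.diag([0.04, 0.09, 0.01])           # input parameter covariance

eps = 1e-6                                   # numerical sensitivities (MODFLOW-2000
J = np.column_stack([                        # computes these analytically)
    (head_model(k0 + eps * np.eye(3)[i]) - head_model(k0)) / eps
    for i in range(3)
])

C_out = J @ C_in @ J.T                       # FOSM output covariance
print(np.diag(C_out))                        # head variances used to rank samples
```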

  14. The Mathematics of a Successful Deconvolution: A Quantitative Assessment of Mixture-Based Combinatorial Libraries Screened Against Two Formylpeptide Receptors

    PubMed Central

    Santos, Radleigh G.; Appel, Jon R.; Giulianotti, Marc A.; Edwards, Bruce S.; Sklar, Larry A.; Houghten, Richard A.; Pinilla, Clemencia

    2014-01-01

    In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays. PMID:23722730

  15. Use of Computer-Based Reference Services in Texas Information Exchange Libraries.

    ERIC Educational Resources Information Center

    Menges, Gary L.

    The Texas Information Exchange (TIE) is a state-wide library network organized in 1967 for the purpose of sharing resources among Texas libraries. Its membership includes 37 college and university libraries, the Texas State Library, and ten public libraries that serve as Major Resource Centers in the Texas State Library Communications Network. In…

  16. Empirical State Error Covariance Matrix for Batch Estimation

    NASA Technical Reports Server (NTRS)

    Frisbee, Joe

    2015-01-01

    State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted batch least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. The proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. This empirical error covariance matrix may be calculated as a side computation for each unique batch solution. Results based on the proposed technique will be presented for a simple, two-observer, measurement-error-only problem.
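
    The paper's exact construction is not reproduced here, but a standard residual-based ("sandwich") covariance conveys the idea of an empirical matrix computed as a side product of a single batch solution, capturing error sources the assumed weights miss:

```python
# Weighted batch least squares with (i) the theoretical covariance implied by
# the assumed weights and (ii) a residual-based sandwich covariance; a
# standard stand-in for the paper's empirical matrix, sizes illustrative.
import numpy as np

rng = np.random.default_rng(2)
m, n = 200, 4
A = rng.normal(size=(m, n))                     # measurement partials
x_true = np.array([1.0, -2.0, 0.5, 3.0])
# Actual noise is heteroscedastic, but the solution assumes unit weights:
y = A @ x_true + rng.normal(size=m) * rng.uniform(0.5, 2.0, size=m)
W = np.eye(m)

N = A.T @ W @ A                                 # normal matrix
Ninv = np.linalg.inv(N)
x_hat = Ninv @ A.T @ W @ y

P_theory = Ninv                                 # trusts the assumed weights
r = y - A @ x_hat                               # residuals carry real errors
P_emp = Ninv @ (A.T @ W @ np.diag(r**2) @ W @ A) @ Ninv
print(np.diag(P_theory))
print(np.diag(P_emp))                           # typically larger, more honest
```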

  17. Model-based reconstruction of synthetic promoter library in Corynebacterium glutamicum.

    PubMed

    Zhang, Shuanghong; Liu, Dingyu; Mao, Zhitao; Mao, Yufeng; Ma, Hongwu; Chen, Tao; Zhao, Xueming; Wang, Zhiwen

    2018-05-01

    To develop an efficient synthetic promoter library for fine-tuned expression of target genes in Corynebacterium glutamicum, a synthetic promoter library for C. glutamicum was developed based on conserved sequences of the −10 and −35 regions. The synthetic promoter library covered a wide range of strengths, ranging from 1% to 193% of the tac promoter. 68 promoters were selected and sequenced for correlation analysis between promoter sequence and strength with a statistical model. A new promoter library was then reconstructed with improved promoter strength and coverage based on the results of the correlation analysis. Finally, the tandem promoter P70 was constructed, with strength increased by 121% over the tac promoter. The promoter library developed in this study shows great potential for applications in metabolic engineering and synthetic biology for the optimization of metabolic networks. To the best of our knowledge, this is the first reconstruction of a synthetic promoter library based on statistical analysis in C. glutamicum.

  18. Automatic Assembly of Combined Checking Fixture for Auto-Body Components Based on Fixture Elements Libraries

    NASA Astrophysics Data System (ADS)

    Jiang, Jingtao; Sui, Rendong; Shi, Yan; Li, Furong; Hu, Caiqi

    In this paper, 3-D models of combined fixture elements are designed, classified by function, and stored as element libraries: supporting elements, jointing elements, basic elements, localization elements, clamping elements, and adjusting elements. Automatic assembly of a 3-D combined checking fixture for an auto-body part is then presented, based on modularization theory. In the virtual auto-body assembly space, locating-constraint mapping and rule-based assembly reasoning are used to calculate the positions of the modular elements from the localization and clamp points of the auto-body part. The auto-body part model is transformed from its own coordinate system into the virtual assembly space by a homogeneous transformation matrix. Automatic assembly of the functional fixture elements and the auto-body part is implemented with API functions based on secondary development of UG. Practice has shown the method to be feasible and efficient.

  19. An evidence-based approach to globally assess the covariate-dependent effect of the MTHFR single nucleotide polymorphism rs1801133 on blood homocysteine: a systematic review and meta-analysis.

    PubMed

    Jin, Huifeng; Cheng, Haojie; Chen, Wei; Sheng, Xiaoming; Levy, Mark A; Brown, Mark J; Tian, Junqiang

    2018-05-01

    The single nucleotide polymorphism of the gene 5,10-methylenetetrahydrofolate reductase (MTHFR) C677T (or rs1801133) is the most established genetic factor that increases plasma total homocysteine (tHcy) and consequently results in hyperhomocysteinemia. Yet, given the limited penetrance of this genetic variant, it is necessary to individually predict the risk of hyperhomocysteinemia for an rs1801133 carrier. We hypothesized that variability in this genetic risk is largely due to the presence of factors (covariates) that serve as effect modifiers, confounders, or both, such as folic acid (FA) intake, and aimed to assess this risk in the complex context of these covariates. We systematically extracted from published studies the data on tHcy, rs1801133, and any previously reported rs1801133 covariates. The resulting metadata set was first used to analyze the covariates' modifying effect by meta-regression and other statistical means. Subsequently, we controlled for this modifying effect by genotype-stratifying tHcy data and analyzed the variability in the risk resulting from the confounding of covariates. The data set contains data on 36 rs1801133 covariates that were collected from 114,799 participants and 256 qualified studies, among which 6 covariates (sex, age, race, FA intake, smoking, and alcohol consumption) are the most frequently informed and therefore included for statistical analysis. The effect of rs1801133 on tHcy exhibits significant variability that can be attributed to effect modification as well as confounding by these covariates. Via statistical modeling, we predicted the covariate-dependent risk of tHcy elevation and hyperhomocysteinemia in a systematic manner. We showed an evidence-based approach that globally assesses the covariate-dependent effect of rs1801133 on tHcy. The results should assist clinicians in interpreting the rs1801133 data from genetic testing for their patients. Such information is also important for the public, who increasingly receive genetic data from commercial services without interpretation of its clinical relevance. This study was registered at Research Registry with the registration number reviewregistry328.
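
    The workhorse of such an analysis, inverse-variance weighted meta-regression of study-level effects on a covariate, is easy to sketch; the data below are simulated and the folate covariate is a hypothetical stand-in, not the study's metadata set.

```python
# Toy inverse-variance meta-regression: per-study TT-vs-CC tHcy effects are
# regressed on a moderator (a made-up folate-intake score), weights = 1/SE^2.
import numpy as np

rng = np.random.default_rng(3)
k = 40                                         # number of studies
folate = rng.uniform(0, 1, k)                  # hypothetical effect modifier
se = rng.uniform(0.2, 0.6, k)                  # per-study standard errors
effect = 3.0 - 2.0 * folate + rng.normal(scale=se)   # simulated effects

w = 1.0 / se**2                                # inverse-variance weights
X = np.column_stack([np.ones(k), folate])
beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * effect))
print(beta)    # ~[3, -2]: the genetic effect shrinks as folate rises
```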

  20. Multiple feature fusion via covariance matrix for visual tracking

    NASA Astrophysics Data System (ADS)

    Jin, Zefenfen; Hou, Zhiqiang; Yu, Wangsheng; Wang, Xin; Sun, Hui

    2018-04-01

    To address the problem of complicated dynamic scenes in visual target tracking, a multi-feature fusion tracking algorithm based on the covariance matrix is proposed to improve the robustness of tracking. Within the framework of a quantum genetic algorithm, the region covariance descriptor is used to fuse color, edge, and texture features, and a fast covariance intersection algorithm is used to update the model. The low dimensionality of the region covariance descriptor, the fast convergence and strong global optimization ability of the quantum genetic algorithm, and the speed of the fast covariance intersection algorithm together improve the computational efficiency of the fusion, matching, and updating processes, so that the algorithm achieves fast and effective multi-feature fusion tracking. Experiments show that the proposed algorithm not only achieves fast and robust tracking but also effectively handles interference from occlusion, rotation, deformation, motion blur, and so on.
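
    The region covariance descriptor itself is simple to sketch: every pixel in a region contributes a feature vector (position, colour, gradient magnitudes, and similar cues), and the region is summarised by the covariance of those vectors. The feature set below is a common choice for illustration, not necessarily the paper's exact one.

```python
# Region covariance descriptor: a d x d covariance of per-pixel features.
import numpy as np

def region_covariance(img):
    """img: float array (H, W, 3) in [0, 1]; returns a 7 x 7 descriptor."""
    H, W, _ = img.shape
    yy, xx = np.mgrid[0:H, 0:W].astype(float)
    gray = img.mean(axis=2)
    gy, gx = np.gradient(gray)                      # intensity gradients
    feats = np.stack([xx, yy, img[..., 0], img[..., 1], img[..., 2],
                      np.abs(gx), np.abs(gy)], axis=-1)
    return np.cov(feats.reshape(-1, feats.shape[-1]).T)

patch = np.random.default_rng(4).random((32, 32, 3))
print(region_covariance(patch).shape)               # (7, 7)
```

    Candidate regions are then compared with a covariance-specific metric (for example one based on generalized eigenvalues), since covariance matrices do not live in a Euclidean space.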

  1. Plan for the Development of Library Service in Montana.

    ERIC Educational Resources Information Center

    Warncke, Ruth

    This plan for the development of Montana library service is based on the experiences of other states, opinions of library experts, written information on Montana libraries, visits to several libraries, and attendance at meetings of the Montana Library Association and its committees. Specific recommendations include: moving the State Library…

  2. Library Service in Delaware.

    ERIC Educational Resources Information Center

    Humphry, John A.; Humphry, James, III

    This study which gives detailed recommendations for the implementation of a state-wide library improvement program for Delaware is based on visits to all types of libraries and library agencies in the state and conference with members of the State Library Commission, library trustees, state and local officials, librarians and interested laymen.…

  3. E-Global Library: The Academic Campus Library Meets the Internet.

    ERIC Educational Resources Information Center

    Heilig, Jean M.

    2001-01-01

    Describes e-global library, the first Internet-based virtual library designed for online students at Jones International University and that has grown into a separately licensable product. Highlights include marketing to other academic libraries, both online and traditional; fees; the e-global library model; collection development policies;…

  4. Douglass Rationalization: An Evaluation of a Team Environment and a Computer-Based Task in Academic Libraries

    ERIC Educational Resources Information Center

    Denda, Kayo; Smulewitz, Gracemary

    2004-01-01

    In the contemporary library environment, the presence of the Internet and the infrastructure of the integrated library system suggest an integrated internal organization. The article describes the example of Douglass Rationalization, a team-based collaborative project to refocus the collection of Rutgers' Douglass Library, taking advantage of the…

  5. Digital Library Archaeology: A Conceptual Framework for Understanding Library Use through Artifact-Based Evaluation

    ERIC Educational Resources Information Center

    Nicholson, Scott

    2005-01-01

    Archaeologists have used material artifacts found in a physical space to gain an understanding about the people who occupied that space. Likewise, as users wander through a digital library, they leave behind data-based artifacts of their activity in the virtual space. Digital library archaeologists can gather these artifacts and employ inductive…

  6. What's New in the Library Automation Arena?

    ERIC Educational Resources Information Center

    Breeding, Marshall

    1998-01-01

    Reviews trends in library automation based on vendors at the 1998 American Library Association Annual Conference. Discusses the major industry trend, a move from host-based computer systems to the new generation of client/server, object-oriented, open systems-based automation. Includes a summary of developments for 26 vendors. (LRW)

  7. Library Statistical Data Base Formats and Definitions.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    Represented are the detailed set of data structures relevant to the categorization of information, terminology, and definitions employed in the design of the library statistical data base. The data base, or management information system, provides administrators with a framework of information and standardized data for library management, planning,…

  8. Commentary to Library Statistical Data Base.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    The National Center for Higher Education Management Systems (NCHEMS) has developed a library statistical data base which concentrates on the management information needs of administrators of public and academic libraries. This document provides an overview of the framework and conceptual approach employed in the design of the data base. The data…

  9. Operation Reliability Assessment for Cutting Tools by Applying a Proportional Covariate Model to Condition Monitoring Information

    PubMed Central

    Cai, Gaigai; Chen, Xuefeng; Li, Bing; Chen, Baojia; He, Zhengjia

    2012-01-01

    The reliability of cutting tools is critical to machining precision and production efficiency. The conventional statistic-based reliability assessment method aims at providing a general and overall estimation of reliability for a large population of identical units under given and fixed conditions. However, it has limited effectiveness in depicting the operational characteristics of a cutting tool. To overcome this limitation, this paper proposes an approach to assess the operation reliability of cutting tools. A proportional covariate model is introduced to construct the relationship between operation reliability and condition monitoring information. The wavelet packet transform and an improved distance evaluation technique are used to extract sensitive features from vibration signals, and a covariate function is constructed based on the proportional covariate model. Ultimately, the failure rate function of the cutting tool being assessed is calculated using the baseline covariate function obtained from a small sample of historical data. Experimental results and a comparative study show that the proposed method is effective for assessing the operation reliability of cutting tools. PMID:23201980

  10. Significance of clustering and classification applications in digital and physical libraries

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Ioannis; Koulouris, Alexandros; Zervos, Spiros; Dendrinos, Markos; Giannakopoulos, Georgios

    2015-02-01

    Applications of clustering and classification techniques can prove very significant in both digital and physical (paper-based) libraries. The most essential application, document classification and clustering, is crucial for the content that is produced and maintained in digital libraries, repositories, databases, social media, blogs etc., based on various tags and ontology elements, transcending the traditional library-oriented classification schemes. Other applications with a very useful and beneficial role in the new digital library environment involve document routing, summarization and query expansion. Paper-based libraries can benefit as well, since classification combined with advanced material characterization techniques such as FTIR (Fourier Transform InfraRed spectroscopy) can be vital for the study and prevention of material deterioration. An improved two-level self-organizing clustering architecture is proposed in order to enhance the discrimination capacity of the learning space, prior to classification, yielding promising results when applied to the above mentioned library tasks.

  11. The State of Planning of Automation Projects in the Libraries of Canada.

    ERIC Educational Resources Information Center

    Clement, Hope E. A.

    Library automation in Canada is complicated by the large size, dispersed population, and cultural diversity of the country. The National Library of Canada is actively planning a Canadian library network based on national bibliographic services for which the library is now developing automated systems. Canadian libraries are involved in the…

  12. Differently Able: A Review of Academic Library Websites for People with Disabilities

    ERIC Educational Resources Information Center

    Cassner, Mary; Maxey-Harris, Charlene; Anaya, Toni

    2011-01-01

    This research is based on the Library Services for People with Disabilities Policy, which was approved by the American Library Association in 2001. The policy identified focus areas for libraries including library services, facilities, collections, and assistive technology. Library websites frequently convey this information to users. Our study…

  13. Nigerian Library Staff and Their Perceptions of Health Risks Posed by Using Computer-Based Systems in University Libraries

    ERIC Educational Resources Information Center

    Uwaifo, Stephen Osahon

    2008-01-01

    Purpose: The paper seeks to examine the health risks faced when using computer-based systems by library staff in Nigerian libraries. Design/methodology/approach: The paper uses a survey research approach to carry out this investigation. Findings: The investigation reveals that the perceived health risk does not predict perceived ease of use of…

  14. Collection development at the NOAA Central Library

    NASA Technical Reports Server (NTRS)

    Quillen, Steve R.

    1994-01-01

    The National Oceanic and Atmospheric Administration (NOAA) Central Library collection, approximately one million volumes, incorporates the holdings of its predecessor agencies. Within the library, the collections are filed separately, based on their source and/or classification schemes. The NOAA Central Library provides a variety of services to users, ranging from quick reference and interlibrary loan to in-depth research and online data bases.

  15. How to deal with the high condition number of the noise covariance matrix of gravity field functionals synthesised from a satellite-only global gravity field model?

    NASA Astrophysics Data System (ADS)

    Klees, R.; Slobbe, D. C.; Farahani, H. H.

    2018-03-01

    The posed question arises for instance in regional gravity field modelling using weighted least-squares techniques if the gravity field functionals are synthesised from the spherical harmonic coefficients of a satellite-only global gravity model (GGM) and are used as one of the noisy datasets. The associated noise covariance matrix appeared to be extremely ill-conditioned, with a singular value spectrum that decayed gradually to zero without any noticeable gap. We analysed three methods to deal with the ill-conditioned noise covariance matrix: Tikhonov regularisation of the noise covariance matrix in combination with the standard formula for the weighted least-squares estimator, a formulation of the weighted least-squares estimator that does not involve the inverse noise covariance matrix, and an estimator based on Rao's unified theory of least-squares. Our analysis was based on a numerical experiment involving a set of height anomalies synthesised from the GGM GOCO05s, which is provided with a full noise covariance matrix. We showed that the three estimators perform similarly, provided that the two regularisation parameters involved in each method are chosen properly. As standard regularisation parameter choice rules do not apply here, we suggested a new parameter choice rule and demonstrated its performance. Using this rule, we found that the differences between the three least-squares estimates were within noise. For the standard formulation of the weighted least-squares estimator with regularised noise covariance matrix, this required an exceptionally strong regularisation, much stronger than one would expect from the condition number of the noise covariance matrix. The preferred method is the inversion-free formulation of the weighted least-squares estimator, because of its simplicity with respect to the choice of the two regularisation parameters.
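
    A toy version of the first option, Tikhonov regularisation of the noise covariance before the standard weighted least-squares formula, looks as follows (illustrative sizes and spectrum, not the GOCO05s case):

```python
# Weighted least squares with an ill-conditioned noise covariance C,
# stabilised by the Tikhonov-style shift C -> C + alpha*I.
import numpy as np

rng = np.random.default_rng(5)
m, n = 60, 4
A = rng.normal(size=(m, n))
U, _ = np.linalg.qr(rng.normal(size=(m, m)))
s = np.logspace(0, -12, m)              # gradually decaying spectrum, no gap
C = (U * s) @ U.T                       # noise covariance, cond ~ 1e12
x_true = rng.normal(size=n)
y = A @ x_true + U @ (np.sqrt(s) * rng.normal(size=m))  # noise with cov C

print(np.linalg.cond(C))                # effectively singular
alpha = 1e-6                            # regularisation parameter (tunable)
Ci = np.linalg.solve(C + alpha * np.eye(m), np.eye(m))
x_hat = np.linalg.solve(A.T @ Ci @ A, A.T @ Ci @ y)
print(x_hat - x_true)                   # estimation error
```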

  16. Automated model selection in covariance estimation and spatial whitening of MEG and EEG signals.

    PubMed

    Engemann, Denis A; Gramfort, Alexandre

    2015-03-01

    Magnetoencephalography and electroencephalography (M/EEG) measure non-invasively the weak electromagnetic fields induced by post-synaptic neural currents. The estimation of the spatial covariance of the signals recorded on M/EEG sensors is a building block of modern data analysis pipelines. Such covariance estimates are used in brain-computer interfaces (BCI) systems, in nearly all source localization methods for spatial whitening as well as for data covariance estimation in beamformers. The rationale for such models is that the signals can be modeled by a zero mean Gaussian distribution. While maximizing the Gaussian likelihood seems natural, it leads to a covariance estimate known as empirical covariance (EC). It turns out that the EC is a poor estimate of the true covariance when the number of samples is small. To address this issue the estimation needs to be regularized. The most common approach downweights off-diagonal coefficients, while more advanced regularization methods are based on shrinkage techniques or generative models with low rank assumptions: probabilistic PCA (PPCA) and factor analysis (FA). Using cross-validation all of these models can be tuned and compared based on Gaussian likelihood computed on unseen data. We investigated these models on simulations, one electroencephalography (EEG) dataset as well as magnetoencephalography (MEG) datasets from the most common MEG systems. First, our results demonstrate that different models can be the best, depending on the number of samples, heterogeneity of sensor types and noise properties. Second, we show that the models tuned by cross-validation are superior to models with hand-selected regularization. Hence, we propose an automated solution to the often overlooked problem of covariance estimation of M/EEG signals. The relevance of the procedure is demonstrated here for spatial whitening and source localization of MEG signals. Copyright © 2015 Elsevier Inc. All rights reserved.
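
    The selection principle, fit candidate covariance models and keep the one with the highest Gaussian likelihood on held-out data, can be sketched with scikit-learn estimators (the authors' own pipeline lives in the M/EEG ecosystem; the estimator names below are sklearn's, and the data are synthetic):

```python
# Cross-validated covariance model selection: empirical vs shrinkage models,
# scored by Gaussian log-likelihood on unseen samples.
import numpy as np
from sklearn.covariance import EmpiricalCovariance, LedoitWolf, OAS
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n_sensors, n_samples = 40, 100          # few samples relative to dimension
mixing = rng.normal(size=(n_sensors, n_sensors))
X = rng.normal(size=(n_samples, n_sensors)) @ mixing.T

X_tr, X_te = train_test_split(X, test_size=0.5, random_state=0)
for est in (EmpiricalCovariance(), LedoitWolf(), OAS()):
    est.fit(X_tr)
    # .score() is the mean Gaussian log-likelihood of the held-out data:
    print(type(est).__name__, est.score(X_te))
```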

  17. Growing Competition for Libraries.

    ERIC Educational Resources Information Center

    Gibbons, Susan

    2001-01-01

    Describes the Questia subscription-based online academic digital books library. Highlights include weaknesses of the collection; what college students want from a library; importance of marketing; competition for traditional academic libraries that may help improve library services; and the ability of Questia to overcome barriers and…

  18. Working covariance model selection for generalized estimating equations.

    PubMed

    Carey, Vincent J; Wang, You-Gan

    2011-11-20

    We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice. Copyright © 2011 John Wiley & Sons, Ltd.
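
    The geodesic criterion can be sketched directly, assuming statsmodels' GEE results expose the model-based and robust parameter covariances as cov_naive and cov_robust (attribute names from memory, to be checked; this is a sketch, not the authors' software):

```python
# Working-covariance choice via the geodesic distance between model-based
# and robust parameter covariance estimates: smaller distance = better model.
import numpy as np
import statsmodels.api as sm
from statsmodels.genmod.cov_struct import Independence, Exchangeable

rng = np.random.default_rng(7)
n_groups, n_per = 100, 5
g = np.repeat(np.arange(n_groups), n_per)
x = rng.normal(size=n_groups * n_per)
b = np.repeat(rng.normal(size=n_groups), n_per)       # shared cluster effect
y = 1.0 + 2.0 * x + b + rng.normal(size=n_groups * n_per)
X = sm.add_constant(x)

def geodesic(res):
    lam = np.linalg.eigvals(np.linalg.solve(res.cov_naive, res.cov_robust))
    return np.sqrt((np.log(lam.real) ** 2).sum())

for cs in (Independence(), Exchangeable()):
    res = sm.GEE(y, X, groups=g, cov_struct=cs,
                 family=sm.families.Gaussian()).fit()
    print(type(cs).__name__, geodesic(res))   # Exchangeable should win here
```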

  19. A Nakanishi-based model illustrating the covariant extension of the pion GPD overlap representation and its ambiguities

    NASA Astrophysics Data System (ADS)

    Chouika, N.; Mezrag, C.; Moutarde, H.; Rodríguez-Quintero, J.

    2018-05-01

    A systematic approach for the model building of Generalized Parton Distributions (GPDs), based on their overlap representation within the DGLAP kinematic region and a further covariant extension to the ERBL one, is applied to the valence-quark pion's case, using light-front wave functions inspired by the Nakanishi representation of the pion Bethe-Salpeter amplitudes (BSA). This simple but fruitful pion GPD model illustrates the general model building technique and, in addition, allows for the ambiguities related to the covariant extension, grounded on the Double Distribution (DD) representation, to be constrained by requiring a soft-pion theorem to be properly observed.

  20. Enriching peptide libraries for binding affinity and specificity through computationally directed library design

    PubMed Central

    Foight, Glenna Wink; Chen, T. Scott; Richman, Daniel; Keating, Amy E.

    2017-01-01

    Peptide reagents with high affinity or specificity for their target protein interaction partner are of utility for many important applications. Optimization of peptide binding by screening large libraries is a proven and powerful approach. Libraries designed to be enriched in peptide sequences that are predicted to have desired affinity or specificity characteristics are more likely to yield success than random mutagenesis. We present a library optimization method in which the choice of amino acids to encode at each peptide position can be guided by available experimental data or structure-based predictions. We discuss how to use analysis of predicted library performance to inform rounds of library design. Finally, we include protocols for more complex library design procedures that consider the chemical diversity of the amino acids at each peptide position and optimize a library score based on a user-specified input model. PMID:28236241

  1. Enriching Peptide Libraries for Binding Affinity and Specificity Through Computationally Directed Library Design.

    PubMed

    Foight, Glenna Wink; Chen, T Scott; Richman, Daniel; Keating, Amy E

    2017-01-01

    Peptide reagents with high affinity or specificity for their target protein interaction partner are of utility for many important applications. Optimization of peptide binding by screening large libraries is a proven and powerful approach. Libraries designed to be enriched in peptide sequences that are predicted to have desired affinity or specificity characteristics are more likely to yield success than random mutagenesis. We present a library optimization method in which the choice of amino acids to encode at each peptide position can be guided by available experimental data or structure-based predictions. We discuss how to use analysis of predicted library performance to inform rounds of library design. Finally, we include protocols for more complex library design procedures that consider the chemical diversity of the amino acids at each peptide position and optimize a library score based on a user-specified input model.

  2. [Progress in the spectral library based protein identification strategy].

    PubMed

    Yu, Derui; Ma, Jie; Xie, Zengyan; Bai, Mingze; Zhu, Yunping; Shu, Kunxian

    2018-04-25

    Mass spectrometry (MS) data have grown exponentially as MS-based proteomics has developed rapidly, and it is a great challenge to develop quick, accurate and reproducible methods to identify peptides and proteins. Spectral library searching has become a mature strategy for protein identification from tandem mass spectra in proteomics: experimental spectra are searched against a collection of confidently identified MS/MS spectra that have been observed previously, making full use of peak abundances, peaks from non-canonical fragment ions, and other spectral features. This review provides a comprehensive overview of the spectral library search strategy and its two key steps, spectral library construction and spectral library searching, and discusses the progress and challenges of the approach.
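
    The core scoring step of a spectral library search reduces to a similarity between the query spectrum and each library spectrum on a common m/z grid; a normalised dot product is the classic choice. A toy sketch follows (binning scheme and entries invented for illustration):

```python
# Minimal spectral library search: bin, sqrt-scale, normalise, then rank
# library entries by dot-product similarity with the query spectrum.
import numpy as np

def bin_spectrum(mz, inten, n_bins=2000, mz_max=2000.0):
    vec = np.zeros(n_bins)
    idx = np.clip((np.asarray(mz, float) / mz_max * n_bins).astype(int),
                  0, n_bins - 1)
    np.add.at(vec, idx, np.sqrt(inten))      # sqrt damps dominant peaks
    n = np.linalg.norm(vec)
    return vec / n if n else vec

library = {                                   # toy entries: (m/z, intensities)
    "PEPTIDEA": ([175.1, 262.1, 375.2], [1.0, 0.6, 0.3]),
    "PEPTIDEB": ([147.1, 248.2, 401.3], [0.8, 1.0, 0.5]),
}
query = bin_spectrum([175.1, 262.1, 375.3], [0.9, 0.7, 0.2])
scores = {p: query @ bin_spectrum(mz, it) for p, (mz, it) in library.items()}
print(max(scores, key=scores.get))            # best-matching library spectrum
```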

  3. Public Libraries and Community-Based Education: Making the Connection for Lifelong Learning. Volume 2: Commissioned Papers. A Conference Sponsored by the National Institute on Postsecondary Education, Libraries, and Lifelong Learning, Office of Educational Research and Improvement (Washington, D.C., April 12-13, 1995).

    ERIC Educational Resources Information Center

    National Inst. on Postsecondary Education, Libraries, and Lifelong Learning (ED/OERI), Washington, DC.

    This conference explored the relationship between the public library, community-based adult education, and lifelong learning. The eight commissioned papers presented include: "Community Based Adult Jewish Learning Program: Issues and Concerns" (Paul A. Flexner); "Rural and Small Libraries: Provisions for Lifelong Learning" (Bernard Vavrek);…

  4. Dynamic Pathfinders: Leveraging Your OPAC to Create Resource Guides

    ERIC Educational Resources Information Center

    Hunter, Ben

    2008-01-01

    Library pathfinders are a time-tested method of leading library users to important resources. However, paper-based pathfinders suffer from space limitations, and both paper-based and Web-based pathfinders require frequent updates to keep up with new library acquisitions. This article details a step-by-step method to create an online dynamic…

  5. High impact technologies for natural products screening.

    PubMed

    Koehn, Frank E

    2008-01-01

    Natural products have historically been a rich source of lead molecules in drug discovery. However, natural products have been de-emphasized as high throughput screening resources in the recent past, in part because of difficulties in obtaining high quality natural products screening libraries, or in applying modern screening assays to these libraries. In addition, natural products programs based on screening of extract libraries, bioassay-guided isolation, structure elucidation and subsequent production scale-up are challenged to meet the rapid cycle times that are characteristic of the modern HTS approach. Fortunately, new technologies in mass spectrometry, NMR and other spectroscopic techniques can greatly facilitate the first components of the process - namely the efficient creation of high-quality natural products libraries, biomolecular target or cell-based screening, and early hit characterization. The success of any high throughput screening campaign is dependent on the quality of the chemical library. The construction and maintenance of a high quality natural products library, whether based on microbial, plant, marine or other sources, is a costly endeavor. The library itself may be composed of samples that are themselves mixtures - such as crude extracts, semi-pure mixtures or single purified natural products. Each of these library designs carries with it distinctive advantages and disadvantages. Crude extract libraries have lower resource requirements for sample preparation, but high requirements for identification of the bioactive constituents. Pre-fractionated libraries can be an effective strategy to alleviate interferences encountered with crude libraries, and may shorten the time needed to identify the active principle. Purified natural product libraries require substantial resources for preparation, but offer the advantage that the hit detection process is reduced to that of synthetic single component libraries. Whether the natural products library consists of crude or partially fractionated mixtures, the library contents should be profiled to identify the known components present - a process known as dereplication. The use of mass spectrometry and HPLC-mass spectrometry together with spectral databases is a powerful tool in the chemometric profiling of bio-sources for natural product production. High throughput, high sensitivity flow NMR is an emerging tool in this area as well. Whether by cell-based or biomolecular target based assays, screening of natural product extract libraries continues to furnish novel lead molecules for further drug development, despite challenges in the analysis and prioritization of natural products hits. Spectroscopic techniques are now being used to directly screen natural product and synthetic libraries. Mass spectrometry in the form of methods such as ESI-ICRFTMS, and FACS-MS as well as NMR methods such as SAR by NMR and STD-NMR have been utilized to effectively screen molecular libraries. Overall, emerging advances in mass spectrometry, NMR and other technologies are making it possible to overcome the challenges encountered in screening natural products libraries in today's drug discovery environment. As we apply these technologies and develop them even further, we can look forward to increased impact of natural products in HTS-based drug discovery.

  6. Propensity score to detect baseline imbalance in cluster randomized trials: the role of the c-statistic.

    PubMed

    Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno

    2016-01-22

    Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, the diagnosis of such an imbalance is essential to adjust the statistical analysis if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for preselecting the covariates that need to be included in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
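
    The tool's central quantity is easy to reproduce in outline: fit a propensity model predicting trial arm from baseline covariates and report its c-statistic (the area under the ROC curve), which sits near 0.5 under balance and grows with imbalance. The sketch below uses simulated individual-level data and ignores the cluster-level refinements of the paper.

```python
# c-statistic of a propensity score model as a global imbalance diagnostic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(8)
n = 1000
X = rng.normal(size=(n, 6))                 # baseline covariates
arm = rng.integers(0, 2, size=n)            # randomised arm
X[:, 0] += 0.4 * arm                        # induce imbalance on one covariate

ps = LogisticRegression(max_iter=1000).fit(X, arm).predict_proba(X)[:, 1]
print(roc_auc_score(arm, ps))               # c-statistic; ~0.5 means balance
```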

  7. Interspecific analysis of covariance structure in the masticatory apparatus of galagos.

    PubMed

    Vinyard, Christopher J

    2007-01-01

    The primate masticatory apparatus (MA) is a functionally integrated set of features, each of which performs important functions in biting, ingestive, and chewing behaviors. A comparison of morphological covariance structure among species for these MA features will help us to further understand the evolutionary history of this region. In this exploratory analysis, the covariance structure of the MA is compared across seven galago species to investigate 1) whether there are differences in covariance structure in this region, and 2) if so, how has this covariation changed with respect to size, MA form, diet, and/or phylogeny? Ten measurements of the MA functionally related to bite force production and load resistance were obtained from 218 adults of seven galago species. Correlation matrices were generated for these 10 dimensions and compared among species via matrix correlations and Mantel tests. Subsequently, pairwise covariance disparity in the MA was estimated as a measure of difference in covariance structure between species. Covariance disparity estimates were correlated with pairwise distances related to differences in body size, MA size and shape, genetic distance (based on cytochrome-b sequences) and percentage of dietary foods to determine whether one or more of these factors is linked to differences in covariance structure. Galagos differ in MA covariance structure. Body size appears to be a major factor correlated with differences in covariance structure among galagos. The largest galago species, Otolemur crassicaudatus, exhibits large differences in body mass and covariance structure relative to other galagos, and thus plays a primary role in creating this association. MA size and shape do not correlate with covariance structure when body mass is held constant. Diet also shows no association. Genetic distance is significantly negatively correlated with covariance disparity when body mass is held constant, but this correlation appears to be a function of the small body size and large genetic distance for Galagoides demidoff. These exploratory results indicate that changing body size may have been a key factor in the evolution of the galago MA.
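
    The matrix-correlation-plus-Mantel machinery used here is compact enough to sketch: correlate the off-diagonal elements of two correlation matrices, then assess significance by permuting the rows and columns of one matrix (synthetic data below, not the galago measurements):

```python
# Mantel-style comparison of two correlation matrices with a permutation test.
import numpy as np

def mantel(R1, R2, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(R1, k=1)          # off-diagonal elements
    r_obs = np.corrcoef(R1[iu], R2[iu])[0, 1]   # matrix correlation
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(R1.shape[0])        # permute traits jointly
        count += np.corrcoef(R1[p][:, p][iu], R2[iu])[0, 1] >= r_obs
    return r_obs, (count + 1) / (n_perm + 1)    # correlation, p-value

rng = np.random.default_rng(9)
X = rng.normal(size=(100, 10))                  # 100 specimens, 10 traits
R1 = np.corrcoef(X.T)
R2 = np.corrcoef((X + 0.5 * rng.normal(size=X.shape)).T)
print(mantel(R1, R2))
```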

  8. The radiated noise from isotropic turbulence revisited

    NASA Technical Reports Server (NTRS)

    Lilley, Geoffrey M.

    1993-01-01

    The noise radiated from isotropic turbulence at low Mach numbers and high Reynolds numbers, as derived by Proudman (1952), was the first application of Lighthill's Theory of Aerodynamic Noise to a complete flow field. The theory presented by Proudman involves the assumption of the neglect of retarded time differences and so replaces the second-order retarded-time and space covariance of Lighthill's stress tensor, Tij, and in particular its second time derivative, by the equivalent simultaneous covariance. This assumption is a valid approximation in the derivation of the ∂²Tij/∂t² covariance at low Mach numbers, but is not justified when that covariance is reduced to the sum of products of the time derivatives of equivalent second-order velocity covariances, as required when Gaussian statistics are assumed. The present paper removes these assumptions and finds that although the changes in the analysis are substantial, the change in the numerical result for the total acoustic power is small. The present paper also considers an alternative analysis which does not neglect retarded times. It makes use of the Lighthill relationship, whereby the fourth-order Tij retarded-time covariance is evaluated from the square of the similar second-order covariance, which is assumed known. In this derivation, no statistical assumptions are involved. This result, using distributions for the second-order space-time velocity squared covariance based on the Direct Numerical Simulation (DNS) results of both Sarkar and Hussaini (1993) and Dubois (1993), is compared with the re-evaluation of Proudman's original model. These results are then compared with the sound power derived from a phenomenological model based on simple approximations to the retarded-time/space covariance of Txx. Finally, the recent numerical solutions of Sarkar and Hussaini (1993) for the acoustic power are compared with the results obtained from the analytic solutions.
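
    For orientation, a commonly quoted form of Proudman's result (stated here as background with the constant left generic; the paper's contribution is precisely to re-derive it without the simultaneous-covariance and Gaussian assumptions) gives the acoustic power radiated per unit volume of isotropic turbulence as

```latex
P_A \;=\; \alpha \,\rho_0 \,\varepsilon\, M_t^{5},
\qquad
M_t \;=\; \frac{\sqrt{2k}}{c_0},
```

    where \rho_0 is the ambient density, \varepsilon the turbulent dissipation rate, k the turbulent kinetic energy, c_0 the ambient sound speed, and \alpha a dimensionless constant calibrated analytically or, later, by DNS.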

  9. Tonic and phasic co-variation of peripheral arousal indices in infants

    PubMed Central

    Wass, S.V.; de Barbaro, K.; Clackson, K.

    2015-01-01

    Tonic and phasic differences in peripheral autonomic nervous system (ANS) indicators strongly predict differences in attention and emotion regulation in developmental populations. However, virtually all previous research has been based on individual ANS measures, which poses a variety of conceptual and methodological challenges to comparing results across studies. Here we recorded heart rate, electrodermal activity (EDA), pupil size, head movement velocity and peripheral accelerometry concurrently while a cohort of 37 typical 12-month-old infants completed a mixed assessment battery lasting approximately 20 min per participant. We analysed covariation of these autonomic indices in three ways: first, tonic (baseline) arousal; second, co-variation in spontaneous (phasic) changes during testing; third, phasic co-variation relative to an external stimulus event. We found that heart rate, head velocity and peripheral accelerometry showed strong positive co-variation across all three analyses. EDA showed no co-variation in tonic activity levels but did show phasic positive co-variation with other measures, which appeared limited to sections of high but not low general arousal. Tonic pupil size showed significant positive covariation, but phasic pupil changes were inconsistent. We conclude that: (i) there is high covariation between autonomic indices in infants, but EDA may only be sensitive at extreme arousal levels; (ii) tonic pupil size covaries with other indices, but does not show predicted patterns of phasic change; and (iii) motor activity appears to be a good proxy measure of ANS activity. The strongest patterns of covariation were observed using epoch durations of 40 s per epoch, although significant covariation between indices was also observed using shorter epochs (1 and 5 s). PMID:26316360

  10. Parcellation of the human orbitofrontal cortex based on gray matter volume covariance.

    PubMed

    Liu, Huaigui; Qin, Wen; Qi, Haotian; Jiang, Tianzi; Yu, Chunshui

    2015-02-01

    The human orbitofrontal cortex (OFC) is an enigmatic brain region that cannot be parcellated reliably using diffusion and functional magnetic resonance imaging (fMRI) because of signal dropout that results from an inherent limitation of these imaging techniques. We hypothesise that the OFC can be reliably parcellated into subregions based on gray matter volume (GMV) covariance patterns that are derived from artefact-free structural images. A total of 321 healthy young subjects were examined by high-resolution structural MRI. The OFC was parcellated into subregions based on GMV covariance patterns, and then sex and laterality differences in the GMV covariance pattern of each OFC subregion were compared. The human OFC was parcellated into the anterior (OFCa), medial (OFCm), posterior (OFCp), intermediate (OFCi), and lateral (OFCl) subregions. This parcellation scheme was validated by the same analyses of the left OFC and the bilateral OFCs in male and female subjects. Both visual observation and quantitative comparisons indicated a unique GMV covariance pattern for each OFC subregion. These OFC subregions mainly covaried with the prefrontal and temporal cortices, cingulate cortex and amygdala. In addition, GMV correlations of most OFC subregions were similar across sex and laterality, except for a significant laterality difference in the OFCl: the right OFCl had a stronger GMV correlation with the right inferior frontal cortex. Using high-resolution structural images, we established a reliable parcellation scheme for the human OFC, which may provide an in vivo guide for subregion-level studies of this region and improve our understanding of the human OFC at the subregional level. © 2014 Wiley Periodicals, Inc.

  11. Integrated Library Systems in Canadian Public, Academic and Special Libraries: Fourth Annual Survey.

    ERIC Educational Resources Information Center

    Merilees, Bobbie

    1990-01-01

    Reports the results of a survey of integrated library system vendors that examined installations in Canadian academic, public and special libraries during 1989. Findings discussed include large library system versus PC-based system market shares, an analysis of system selection by type of library, and other factors that affect system selection. A…

  12. Public Library Site Evaluation and Location: Past and Present Market-Based Modelling Tools for the Future.

    ERIC Educational Resources Information Center

    Koontz, Christine M.

    1992-01-01

    Presents a methodology for construction of location modeling for public library facilities in diverse urban environments. Historical and current research in library location is reviewed; and data collected from a survey of six library systems are analyzed according to population, spatial, library use, and library attractiveness variables. (48…

  13. Adoption of Library 2.0 Functionalities by Academic Libraries and Users: A Knowledge Management Perspective

    ERIC Educational Resources Information Center

    Kim, Yong-Mi; Abbas, June

    2010-01-01

    This study investigates the adoption of Library 2.0 functionalities by academic libraries and users through a knowledge management perspective. Based on randomly selected 230 academic library Web sites and 184 users, the authors found RSS and blogs are widely adopted by academic libraries while users widely utilized the bookmark function.…

  14. Development of a Terpenoid Alkaloid-like Compound Library Based on the Humulene Skeleton.

    PubMed

    Kikuchi, Haruhisa; Nishimura, Takehiro; Kwon, Eunsang; Kawai, Junya; Oshima, Yoshiteru

    2016-10-24

    Many natural terpenoid alkaloid conjugates show biological activity because their structures contain both sp³-rich terpenoid scaffolds and nitrogen-containing alkaloid scaffolds. However, their biosynthesis utilizes a limited set of compounds as sources of the terpenoid moiety. The production of terpenoid alkaloids containing various types of terpenoid moiety may provide useful, chemically diverse compound libraries for drug discovery. Herein, we report the construction of a library of terpenoid alkaloid-like compounds based on Lewis-acid-catalyzed transannulation of humulene diepoxide and subsequent sequential olefin metathesis. Cheminformatic analysis quantitatively showed that the synthesized terpenoid alkaloid-like compound library has a high level of three-dimensional-shape diversity. Extensive pharmacological screening of the library has led to the identification of promising compounds for the development of antihypolipidemic drugs. Therefore, the synthesis of terpenoid alkaloid-like compound libraries based on humulene is well suited to drug discovery. Synthesis of terpenoid alkaloid-like compounds based on several natural terpenoids is an effective strategy for producing chemically diverse libraries. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Using ePortfolio-Based Learning Approach to Facilitate Knowledge Sharing and Creation among College Students

    ERIC Educational Resources Information Center

    Chang, Chi-Cheng; Chou, Pao-Nan; Liang, Chaoyan

    2018-01-01

    The purpose of the present study was to examine the effects of the ePortfolio-based learning approach (ePBLA) on knowledge sharing and creation with 92 college students majoring in electrical engineering as the participants. Multivariate analysis of covariance (MANCOVA), with pretest scores on knowledge sharing and creation as covariates, was conducted…

  16. Investigation of the Public Library as a Linking Agent to Major Scientific, Educational, Social and Environmental Data Bases. Two-Year Interim Report.

    ERIC Educational Resources Information Center

    Summit, Roger K.; Firschein, Oscar

    Eight public libraries participated in a two-year experiment to investigate the potential of the public library as a "linking agent" between the public and the many machine-readable data bases currently accessible using on line computer terminals. The investigation covered users of the service, impact on the library, conditions for…

  17. Protocols for the Design of Kinase-focused Compound Libraries.

    PubMed

    Jacoby, Edgar; Wroblowski, Berthold; Buyck, Christophe; Neefs, Jean-Marc; Meyer, Christophe; Cummings, Maxwell D; van Vlijmen, Herman

    2018-05-01

    Protocols for the design of kinase-focused compound libraries are presented. Kinase-focused compound libraries can be differentiated based on the design goal. Depending on whether the library should be a discovery library specific for one particular kinase, a general discovery library for multiple distinct kinase projects, or even phenotypic screening, there exists today a variety of in silico methods to design candidate compound libraries. We address the following scenarios: 1) Data mining of SAR databases and kinase-focused vendor catalogues; 2) Predictions and virtual screening; 3) Structure-based design of combinatorial kinase inhibitors; 4) Design of covalent kinase inhibitors; 5) Design of macrocyclic kinase inhibitors; and 6) Design of allosteric kinase inhibitors and activators. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Trends in hospital librarianship and hospital library services: 1989 to 2006.

    PubMed

    Thibodeau, Patricia L; Funk, Carla J

    2009-10-01

    The research studied the status of hospital librarians and library services to better inform the Medical Library Association's advocacy activities. The Vital Pathways Survey Subcommittee of the Task Force on Vital Pathways for Hospital Librarians distributed a web-based survey to hospital librarians and academic health sciences library directors. The survey results were compared to data collected in a 1989 survey of hospital libraries by the American Hospital Association in order to identify any trends in hospital libraries, roles of librarians, and library services. A web-based hospital library report form based on the survey questions was also developed to more quickly identify changes in the status of hospital libraries on an ongoing basis. The greatest change in library services between 1989 and 2005/06 was in the area of access to information, with 40% more of the respondents providing access to commercial online services, 100% more providing access to Internet resources, and 28% more providing training in database searching and use of information resources. Twenty-nine percent (n = 587) of the 2005/06 respondents reported a decrease in staff over the last 5 years. Survey data support reported trends of consolidation of hospitals and hospital libraries and additions of new services. These services have likely required librarians to acquire new skills. It is hoped that future surveys will be undertaken to continue to study these trends.

  19. Trends in hospital librarianship and hospital library services: 1989 to 2006

    PubMed Central

    Thibodeau, Patricia L.; Funk, Carla J.

    2009-01-01

    Objective: The research studied the status of hospital librarians and library services to better inform the Medical Library Association's advocacy activities. Methods: The Vital Pathways Survey Subcommittee of the Task Force on Vital Pathways for Hospital Librarians distributed a web-based survey to hospital librarians and academic health sciences library directors. The survey results were compared to data collected in a 1989 survey of hospital libraries by the American Hospital Association in order to identify any trends in hospital libraries, roles of librarians, and library services. A web-based hospital library report form based on the survey questions was also developed to more quickly identify changes in the status of hospital libraries on an ongoing basis. Results: The greatest change in library services between 1989 and 2005/06 was in the area of access to information, with 40% more of the respondents providing access to commercial online services, 100% more providing access to Internet resources, and 28% more providing training in database searching and use of information resources. Twenty-nine percent (n = 587) of the 2005/06 respondents reported a decrease in staff over the last 5 years. Conclusions: Survey data support reported trends of consolidation of hospitals and hospital libraries and additions of new services. These services have likely required librarians to acquire new skills. It is hoped that future surveys will be undertaken to continue to study these trends. PMID:19851491

  20. Covariance Bell inequalities

    NASA Astrophysics Data System (ADS)

    Pozsgay, Victor; Hirsch, Flavien; Branciard, Cyril; Brunner, Nicolas

    2017-12-01

    We introduce Bell inequalities based on covariance, one of the most common measures of correlation. Explicit examples are discussed, and violations in quantum theory are demonstrated. A crucial feature of these covariance Bell inequalities is their nonlinearity; this has nontrivial consequences for the derivation of their local bound, which is not reached by deterministic local correlations. For our simplest inequality, we derive analytically tight bounds for both local and quantum correlations. An interesting application of covariance Bell inequalities is that they can act as "shared randomness witnesses": specifically, the value of the Bell expression gives device-independent lower bounds on both the dimension and the entropy of the shared random variable in a local model.

  1. An Empirical State Error Covariance Matrix for the Weighted Least Squares Estimation Method

    NASA Technical Reports Server (NTRS)

    Frisbee, Joseph H., Jr.

    2011-01-01

    State estimation techniques effectively provide mean state estimates. However, the theoretical state error covariance matrices provided as part of these techniques often suffer from a lack of confidence in their ability to describe the uncertainty in the estimated states. By a reinterpretation of the equations involved in the weighted least squares algorithm, it is possible to directly arrive at an empirical state error covariance matrix. This proposed empirical state error covariance matrix will contain the effect of all error sources, known or not. Results based on the proposed technique will be presented for a simple, two-observer, measurement-error-only problem.

  2. Libraries as a venue for exciting education technology, both high tech and low

    NASA Astrophysics Data System (ADS)

    Harold, J. B.; Dusenbery, P.; Holland, A.

    2016-12-01

    Public libraries provide a broad range of possibilities for reaching diverse audiences with NASA and STEM related content and activities, from hands-on activities, to interactive kiosks incorporating science based games, simulations, and real-time data. NCIL/SSI has been developing STEM-based exhibits and program activities for public libraries since 2007, and is currently managing 7 national tours in partnership with the American Library Association and other organizations. Past and current exhibitions will reach over 100 libraries and an estimated 1.5 million patrons. In this paper we will discuss a range of findings from almost a decade of deploying both high and low tech STEM learning strategies into libraries, including usage and engagement by library patrons, and challenges (and solutions) for deploying technologically sophisticated components into libraries which may or may not have dedicated technical staff.

  3. Public Libraries: Responding to Demand.

    ERIC Educational Resources Information Center

    Annichiarico, Mark; And Others

    1993-01-01

    Discussion of problems library wholesalers/distributors face trying to fulfill public libraries' needs while adjusting to a changing industry is based on responses by librarians to a survey on library jobbers. Increased services to libraries, electronic ordering, timeliness, stock management, and quality control are addressed; and a chart of…

  4. Library of the Future: Croydon's New Central Library Complex.

    ERIC Educational Resources Information Center

    Batt, Chris

    1993-01-01

    A new library and cultural center in Croydon (England) is described. Function-based areas include library, administration, technical services, museum and galleries, museum offices and store, cinema, tourist information center, and local government offices. Information technology systems include the library management system, office automation, and…

  5. The Changing Role of Librarian.

    ERIC Educational Resources Information Center

    Cummins, Thompson R.

    This paper explores the political responsibility of the library administrator, particularly in regard to government funding of public libraries and state library agencies. The history of library development in the United States is outlined, revealing the original political base of the library movement and indicating the necessity for political…

  6. Covariance Based Pre-Filters and Screening Criteria for Conjunction Analysis

    NASA Astrophysics Data System (ADS)

    George, E.; Chan, K.

    2012-09-01

    Several relationships are developed relating object size, initial covariance and range at closest approach to probability of collision. These relationships address the following questions:

    - Given the objects' initial covariance and combined hard body size, what is the maximum possible value of the probability of collision (Pc)?
    - Given the objects' initial covariance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
    - Given the objects' initial covariance and the combined hard body radius, what is the minimum miss distance for which the probability of collision does not exceed the tolerance limit?
    - Given the objects' initial covariance and the miss distance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?

    The first relationship above allows the elimination of object pairs from conjunction analysis (CA) on the basis of the initial covariance and hard-body sizes of the objects. The application of this pre-filter to present-day catalogs with estimated covariance results in the elimination of approximately 35% of object pairs as unable to ever conjunct with a probability of collision exceeding 1×10⁻⁶. Because Pc is directly proportional to object size and inversely proportional to covariance size, this pre-filter will have a significantly larger impact on future catalogs, which are expected to contain a much larger fraction of small debris tracked only by a limited subset of available sensors. This relationship also provides a mathematically rigorous basis for eliminating objects from analysis entirely based on element set age or quality - a practice commonly done by rough rules of thumb today. Further, these relations can be used to determine the required geometric screening radius for all objects. This analysis reveals that the screening volumes for small objects are much larger than needed, while the screening volumes for pairs of large objects may be inadequate. These relationships may also form the basis of an important metric for catalog maintenance by defining the maximum allowable covariance size for effective conjunction analysis. The application of these techniques promises to greatly improve the efficiency and completeness of conjunction analysis.
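
    The first question has a classical closed-form answer in the small-object limit that is easy to verify numerically (this is the textbook 2-D approximation, not necessarily the paper's exact derivation): for a circular miss covariance σ, hard-body radius r and miss distance d, Pc(σ) peaks at σ² = d²/2 with maximum value r²/(e·d²).

```python
# Numeric check of the maximum-Pc bound for a circular 2-D miss covariance.
import numpy as np

r, d = 10.0, 1000.0                     # hard-body radius, miss distance [m]
sigma = np.linspace(50.0, 5000.0, 20000)
pc = (r**2 / (2 * sigma**2)) * np.exp(-(d**2) / (2 * sigma**2))

print(pc.max(), r**2 / (np.e * d**2))       # numeric max vs analytic bound
print(sigma[pc.argmax()], d / np.sqrt(2))   # argmax vs analytic sigma
```

    A pre-filter can then discard any pair whose bound r²/(e·d²), or its covariance-limited analogue, already falls below the chosen Pc tolerance.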

  7. Modelling Nitrogen Oxides in Los Angeles Using a Hybrid Dispersion/Land Use Regression Model

    NASA Astrophysics Data System (ADS)

    Wilton, Darren C.

    The goal of this dissertation is to develop models capable of predicting long-term annual average NOx concentrations in urban areas. Predictions from simple meteorological dispersion models and seasonal proxies for NO2 oxidation were included as covariates in a land use regression (LUR) model for NOx in Los Angeles, CA. The NOx measurements were obtained from a comprehensive measurement campaign that is part of the Multi-Ethnic Study of Atherosclerosis Air Pollution Study (MESA Air). Simple land use regression models were initially developed using a suite of GIS-derived land use variables computed for various buffer sizes (R²=0.15). Caline3, a simple steady-state Gaussian line source model, was then incorporated into the land-use regression framework. The addition of this spatio-temporally varying Caline3 covariate improved the simple LUR model predictions; the extent of improvement was much more pronounced for models based solely on the summer measurements (simple LUR: R²=0.45; Caline3/LUR: R²=0.70) than for models based on all seasons (R²=0.20). We then used a Lagrangian dispersion model to convert static land use covariates for population density and commercial/industrial area into spatially and temporally varying covariates. The inclusion of these covariates resulted in significant improvement in model prediction (R²=0.57). In addition to the dispersion model covariates described above, a two-week average value of daily peak-hour ozone was included as a surrogate for the oxidation of NO2 during the different sampling periods. This additional covariate further improved performance for all models. The best model by 10-fold cross validation (R²=0.73) contained the Caline3 prediction, a static covariate for the length of A3 roads within 50 meters, the Calpuff-adjusted covariates derived from both population density and industrial/commercial land area, and the ozone covariate. This model was tested against annual average NOx concentrations from an independent data set drawn from the EPA's Air Quality System (AQS) and the MESA Air fixed-site monitors, and performed very well (R²=0.82).
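
    The hybrid idea of feeding a dispersion-model prediction into an ordinary least-squares LUR fit as one more covariate can be sketched in a few lines. The sketch below uses synthetic data and invented variable names (caline, road50, popdens); it is not the MESA Air model:

```python
import numpy as np

# Illustrative hybrid dispersion/LUR fit:
# y: measured NOx averages; caline: co-located dispersion-model predictions;
# road50: road length within a 50 m buffer; popdens: population density.
rng = np.random.default_rng(0)
n = 120
caline = rng.gamma(2.0, 5.0, n)
road50 = rng.gamma(1.5, 40.0, n)
popdens = rng.gamma(2.0, 300.0, n)
y = 5.0 + 0.8 * caline + 0.02 * road50 + 0.003 * popdens + rng.normal(0, 3, n)

X = np.column_stack([np.ones(n), caline, road50, popdens])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)    # ordinary least squares
resid = y - X @ beta
r2 = 1 - resid.var() / y.var()
print("coefficients:", beta.round(3), " R^2:", round(r2, 3))
```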

  8. Species richness in soil bacterial communities: a proposed approach to overcome sample size bias.

    PubMed

    Youssef, Noha H; Elshahed, Mostafa S

    2008-09-01

    Estimates of species richness based on 16S rRNA gene clone libraries are increasingly utilized to gauge the level of bacterial diversity within various ecosystems. However, previous studies have indicated that, regardless of the approach used, the species richness estimates obtained depend on the size of the analyzed clone libraries. Here we propose an approach to overcome sample size bias in species richness estimates in complex microbial communities. Parametric (maximum likelihood-based and rarefaction curve-based) and non-parametric approaches were used to estimate species richness in a library of 13,001 near full-length 16S rRNA clones derived from soil, as well as in multiple subsets of the original library. The species richness estimates obtained increased with library size. To obtain a sample size-unbiased estimate of species richness, we calculated the theoretical clone library sizes required to encounter the estimated species richness at various clone library sizes, used curve fitting to determine the theoretical clone library size required to encounter the "true" species richness, and subsequently determined the corresponding sample size-unbiased species richness value. Using this approach, sample size-unbiased estimates of 17,230, 15,571, and 33,912 were obtained for the ML-based, rarefaction curve-based, and ACE-1 estimators, respectively, compared to bias-uncorrected values of 15,009, 11,913, and 20,909.
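
    The extrapolation step can be illustrated generically: fit a saturating curve to richness estimates computed at increasing library sizes and read off the asymptote. This sketch uses a Monod-type curve and invented numbers; it does not reproduce the authors' exact fitting function:

```python
import numpy as np
from scipy.optimize import curve_fit

# Richness estimates obtained at increasing clone-library sizes
# (illustrative numbers, not the soil data from the paper).
lib_size = np.array([1000, 2000, 4000, 8000, 13000], dtype=float)
richness = np.array([4200, 6900, 10100, 13200, 15000], dtype=float)

def monod(n, s_max, k):
    """Saturating curve: estimated richness as a function of library size."""
    return s_max * n / (k + n)

(s_max, k), _ = curve_fit(monod, lib_size, richness, p0=(20000, 5000))
print(f"size-unbiased richness estimate (asymptote): {s_max:.0f}")
```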

  9. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    PubMed

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

    The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probing the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, resulting mainly from lack of efficacy and from side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are deftly exploited to maximize the hit rate and improve efficiency at the beginning of the drug discovery and development pipeline. This paper presents an efficient methodology for fast target-focused combinatorial library design in both reaction-based and production-based ways, with library creation rates of approximately 70,000 molecules per second. Simple, quick, and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation software, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to any other target for the identification of good chemical starting points in collaboration with either structure-based or ligand-based virtual screening.
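
    The core enumeration step of any reaction-based design is a Cartesian product over reagent pools. A minimal Python sketch (placeholder SMILES strings, no chemistry validation or SHAFTS filtering) looks like this:

```python
from itertools import product

# Reaction-based enumeration sketch: combine reagent pools into candidate
# products. The SMILES fragments below are placeholders, and the paper's
# tooling and similarity-based post-filtering are not reproduced.
amines = ["CCN", "c1ccccc1N", "C1CCNCC1"]
acids = ["CC(=O)O", "c1ccccc1C(=O)O"]

def enumerate_amides(amine_pool, acid_pool):
    """Yield (amine, acid) pairs standing in for amide-coupling products."""
    for amine, acid in product(amine_pool, acid_pool):
        yield amine, acid

library = list(enumerate_amides(amines, acids))
print(len(library), "virtual products, e.g.", library[0])
```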

  10. Comparing different methods for determining forest evapotranspiration and its components at multiple temporal scales.

    PubMed

    Tie, Qiang; Hu, Hongchang; Tian, Fuqiang; Holbrook, N Michele

    2018-08-15

    Accurately estimating forest evapotranspiration and its components is of great importance for hydrology, ecology, and meteorology. In this study, a comparison of methods for determining forest evapotranspiration and its components at annual, monthly, daily, and diurnal scales was conducted based on in situ measurements in a subhumid mountainous forest of North China. The goal of the study was to evaluate the accuracy and reliability of the different methods. The results indicate the following: (1) The sap flow upscaling procedure, which accounts for diversity in forest types and tree species, produced a component-based forest evapotranspiration estimate that agreed with the eddy covariance-based estimate at yearly, monthly, and daily scales, while the soil water budget-based estimate was also qualitatively consistent with the eddy covariance-based estimate at the daily scale; (2) At the annual scale, the catchment water balance-based estimate was significantly higher than the eddy covariance-based estimate, which probably results from non-negligible subsurface runoff caused by the widely distributed regolith and fractured bedrock; (3) At the sub-daily scale, the diurnal course of the sap flow-based canopy transpiration estimate lagged significantly behind the eddy covariance-based forest evapotranspiration estimate, which may be physiologically attributable to stem water storage and stem hydraulic conductivity. The results from this region may serve as a useful reference for forest evapotranspiration estimation and method evaluation in regions with similar environmental conditions. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Continental-scale temperature covariance in proxy reconstructions and climate models

    NASA Astrophysics Data System (ADS)

    Hartl-Meier, Claudia; Büntgen, Ulf; Smerdon, Jason; Zorita, Eduardo; Krusic, Paul; Ljungqvist, Fredrik; Schneider, Lea; Esper, Jan

    2017-04-01

    Inter-continental temperature variability over the past millennium has been reported to be more coherent in climate model simulations than in multi-proxy-based reconstructions, a finding that casts doubt on the representation of spatial variability in one or both of these approaches. We assess the covariance of summer temperatures among Northern Hemisphere continents by comparing tree-ring-based temperature reconstructions with state-of-the-art climate model simulations over the past millennium. We find inter-continental temperature covariance to be larger in tree-ring-only reconstructions compared to those derived from multi-proxy networks, thus enhancing the agreement between proxy- and model-based spatial representations. A detailed comparison of simulated temperatures, however, reveals substantial spread among the models. Over the past millennium, inter-continental temperature correlations are driven by the cooling after major volcanic eruptions in 1257, 1452, 1601, and 1815. The coherence of these synchronizing events appears to be elevated in several climate simulations relative to their own covariance baselines and the proxy reconstructions, suggesting that these models overestimate the amplitude of cooling in response to volcanic forcing at large spatial scales.

  12. Collection-based analysis of selected medical libraries in the Philippines using Doody's Core Titles.

    PubMed

    Torres, Efren

    2017-01-01

    This study assessed the book collections of five selected medical libraries in the Philippines, based on Doody's Essential Purchase List for basic sciences and clinical medicine, to compare the match and non-match titles among libraries, to determine the strong and weak disciplines of each library, and to explore the factors that contributed to the percentage of match and non-match titles. List checking was employed as the method of research. Among the medical libraries, De La Salle Health Sciences Institute and the University of Santo Tomas had the highest percentage of match titles, whereas the Ateneo School of Medicine and Public Health had the lowest. The University of the Philippines Manila had the highest percentage of near-match titles. De La Salle Health Sciences Institute and the University of Santo Tomas had sound medical collections based on Doody's Core Titles. Collectively, the medical libraries shared common collection development priorities, as evidenced by similarities in strong areas. Library budget and the role of the library director in book selection were among the factors that could contribute to a high percentage of match titles.

  13. A determinant-based criterion for working correlation structure selection in generalized estimating equations.

    PubMed

    Jaman, Ajmery; Latif, Mahbub A H M; Bari, Wasimul; Wahed, Abdus S

    2016-05-20

    In generalized estimating equations (GEE), the correlation between the repeated observations on a subject is specified with a working correlation matrix. Correct specification of the working correlation structure ensures efficient estimators of the regression coefficients. Among the criteria used in practice for selecting a working correlation structure, the Rotnitzky-Jewell criterion, the Quasi Information Criterion (QIC), and the Correlation Information Criterion (CIC) are based on the fact that, if the assumed working correlation structure is correct, the model-based (naive) and sandwich (robust) covariance estimators of the regression coefficient estimators should be close to each other. The sandwich covariance estimator used in defining the Rotnitzky-Jewell, QIC, and CIC criteria is biased downward and has larger variability than the corresponding model-based covariance estimator. Motivated by this fact, a new criterion based on the bias-corrected sandwich covariance estimator is proposed in this paper for selecting an appropriate working correlation structure in GEE. A comparison of the proposed and competing criteria is shown using simulation studies with correlated binary responses. The results revealed that the proposed criterion generally performs better than the competing criteria. An example of selecting the appropriate working correlation structure is also shown using data from the Madras Schizophrenia Study. Copyright © 2015 John Wiley & Sons, Ltd.
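
    As a toy numerical illustration, the CIC-style trace penalty can be computed directly from the two covariance estimates (smaller is better, following the trace term of QIC). The matrices below are invented, and the bias correction proposed in the paper is not implemented:

```python
import numpy as np

def cic(naive_cov_indep_inv, robust_cov):
    """CIC-style criterion: trace of the product of the inverse model-based
    covariance (computed under the independence working model) and the
    robust (sandwich) covariance under the candidate working structure."""
    return float(np.trace(naive_cov_indep_inv @ robust_cov))

# Toy 3x3 covariance estimates for the regression coefficients
# (illustrative values only).
omega_indep_inv = np.linalg.inv(np.diag([0.04, 0.09, 0.01]))
robust_exch = np.array([[0.041, 0.002, 0.000],
                        [0.002, 0.088, 0.001],
                        [0.000, 0.001, 0.011]])
robust_ar1 = np.array([[0.050, 0.004, 0.001],
                       [0.004, 0.120, 0.002],
                       [0.001, 0.002, 0.015]])
for name, v in [("exchangeable", robust_exch), ("AR(1)", robust_ar1)]:
    print(name, "CIC =", round(cic(omega_indep_inv, v), 3))
```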

  14. Evaluation of Approaches to Deal with Low-Frequency Nuisance Covariates in Population Pharmacokinetic Analyses.

    PubMed

    Lagishetty, Chakradhar V; Duffull, Stephen B

    2015-11-01

    Clinical studies include occurrences of rare variables, such as genotypes, whose low frequency and effect strength make their effects difficult to estimate from a dataset. Variables that influence the estimated value of a model-based parameter are termed covariates. It is often difficult to determine whether such an effect is significant, since type I error can be inflated when the covariate is rare. Their presence may have an insubstantial effect on the parameters of interest, and hence be ignorable, or conversely they may be influential and therefore non-ignorable. When these covariate effects cannot be estimated due to limited power yet are non-ignorable, they are considered nuisance covariates, in that they have to be accounted for but, due to type I error, are of limited interest. This study assesses methods of handling nuisance covariate effects. The specific objectives include (1) calibrating the frequency of a covariate that is associated with type I error inflation, (2) calibrating the strength that renders it non-ignorable, and (3) evaluating methods for handling these non-ignorable covariates in a nonlinear mixed effects model setting. Type I error was determined for the Wald test. Methods considered for handling the nuisance covariate effects were case deletion, Box-Cox transformation, and inclusion of a specific fixed effects parameter. Non-ignorable nuisance covariates were found to be effectively handled through the addition of a fixed effect parameter.

  15. Development and Demonstration of a Statistical Data Base System for Library and Network Planning and Evaluation. Fourth Quarterly Report.

    ERIC Educational Resources Information Center

    Jones, Dennis; And Others

    The National Center for Higher Education Management Systems (NCHEMS) has completed the development and demonstration of a library statistical data base. The data base, or management information system, was developed for administrators of public and academic libraries. The system provides administrators with a framework of information and…

  16. Filter Tuning Using the Chi-Squared Statistic

    NASA Technical Reports Server (NTRS)

    Lilly-Salkowski, Tyler

    2017-01-01

    The Goddard Space Flight Center (GSFC) Flight Dynamics Facility (FDF) performs orbit determination (OD) for the Aqua and Aura satellites. Both satellites are located in low Earth orbit (LEO) and are part of what is considered the A-Train satellite constellation. Both spacecraft are currently in the science phase of their respective missions. The FDF has recently been tasked with delivering definitive covariance for each satellite. The main source of orbit determination used for these missions is the Orbit Determination Toolkit developed by Analytical Graphics Inc. (AGI). This software uses an Extended Kalman Filter (EKF) to estimate the states of both spacecraft. The filter incorporates force modelling, ground station, and space network measurements to determine spacecraft states, and it generates a covariance at each measurement. This covariance can be useful for evaluating the overall performance of the tracking data measurements and the filter itself. An accurate covariance is also useful for covariance propagation, which is utilized in collision avoidance operations, and is valuable when attempting to determine whether the current orbital solution will meet mission requirements in the future. This paper examines the use of the Chi-square statistic as a means of evaluating filter performance. The Chi-square statistic is calculated to determine the realism of a covariance based on the prediction accuracy and the covariance values at a given point in time. Once calculated, it is the distribution of this statistic that provides insight into the accuracy of the covariance. For the EKF to correctly calculate the covariance, error models associated with tracking data measurements must be accurately tuned. Overestimating or underestimating these error values can have detrimental effects on overall filter performance. The filter incorporates ground station measurements, which can be tuned based on the accuracy of the individual ground stations, as well as measurements from the NASA Space Network (SN), which can be affected by the assumed accuracy of the TDRS satellite state at the time of the measurement. The force modelling in the EKF is also an important factor that affects the propagation accuracy and covariance sizing. The dominant force in the LEO orbit regime is atmospheric drag, and accurate accounting of the drag force is especially important for the accuracy of the propagated state. The implementation of a box-and-wing model to improve drag estimation accuracy, and its overall effect on the covariance state, is explored. The process of tuning the EKF for Aqua and Aura support is described, including examination of the measurement errors of available observation types (Doppler and range) and methods of dealing with potentially volatile atmospheric drag modeling. Predictive accuracy and the distribution of the Chi-square statistic, calculated based on the ODTK EKF solutions, are assessed against accepted norms for the orbit regime.
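
    The covariance-realism test itself is compact. Assuming the usual normalized estimation error squared (NEES) form of the Chi-square statistic, a minimal sketch is:

```python
import numpy as np
from scipy.stats import chi2

def state_chi_square(x_est, x_truth, P):
    """Normalized estimation error squared (NEES): the chi-square
    statistic for covariance realism at one epoch."""
    e = x_est - x_truth
    return float(e @ np.linalg.solve(P, e))

# If the filter covariance is realistic, the statistic over many epochs
# follows a chi-square distribution with dim(x) degrees of freedom.
rng = np.random.default_rng(1)
n, dim = 500, 3
P = np.diag([25.0, 16.0, 9.0])                        # filter covariance (m^2)
errors = rng.multivariate_normal(np.zeros(dim), P, n)  # consistent errors
stats = [state_chi_square(e, np.zeros(dim), P) for e in errors]
lo, hi = chi2.ppf([0.025, 0.975], df=dim)
inside = np.mean([(lo < s < hi) for s in stats])
print(f"fraction inside 95% bounds: {inside:.2f} (expect ~0.95)")
```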

  17. PNA-encoded chemical libraries.

    PubMed

    Zambaldo, Claudio; Barluenga, Sofia; Winssinger, Nicolas

    2015-06-01

    Peptide nucleic acid (PNA)-encoded chemical libraries along with DNA-encoded libraries have provided a powerful new paradigm for library synthesis and ligand discovery. PNA-encoding stands out for its compatibility with standard solid phase synthesis and the technology has been used to prepare libraries of peptides, heterocycles and glycoconjugates. Different screening formats have now been reported including selection-based and microarray-based methods that have yielded specific ligands against diverse target classes including membrane receptors, lectins and challenging targets such as Hsp70. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Making the Most of Youth: Rice University Library.

    ERIC Educational Resources Information Center

    Thompson, James

    1984-01-01

    Discusses the development of a research library at Rice University, whose first and only library building is less than 40 years old, highlighting the founding of the University in 1912, the Fondren Library building, library collections and cooperation, fee-based services, automation and the future, and problems and prospects. (EJS)

  19. Standards for Libraries Within Regional Library Systems in Saskatchewan.

    ERIC Educational Resources Information Center

    Saskatchewan Library Association, Regina.

    These quantitative standards for the delivery of library services to a dispersed population, which were developed by the Saskatchewan Library Association, are based on the decentralized delivery of library services backed up by the centralized provision of technical services, resource people, and special collections in Saskatchewan. The roles of…

  20. Federal Legislative Policy of the American Library Association.

    ERIC Educational Resources Information Center

    American Library Association, Chicago, IL. Legislation Committee.

    The American Library Association's policy on federal legislation is based on its objectives of promoting and improving library service and librarianship. Representing those who use libraries as well as those who operate them, the Association is a source of information on libraries and information services for those concerned with formulating and…

  1. Collaborative Portfolio's Effect on Library Usage

    ERIC Educational Resources Information Center

    Bryan, Valerie

    2011-01-01

    Library resources are expensive and it is the library media specialist's responsibility to ensure that use of the library's resources is maximized to support the School Strategic Plan (SSP). This library usage study examined data on the scheduling of high school classes for research-based assignments, related to content area curriculum standards,…

  2. Library Research Support in Queensland: A Survey

    ERIC Educational Resources Information Center

    Richardson, Joanna; Nolan-Brown, Therese; Loria, Pat; Bradbury, Stephanie

    2012-01-01

    University libraries worldwide are reconceptualising the ways in which they support the research agenda in their respective institutions. This paper is based on a survey completed by member libraries of the Queensland University Libraries Office of Cooperation (QULOC), the findings of which may be informative for other university libraries. After…

  3. Sharing the Riches--Cooperation and the Library of Congress.

    ERIC Educational Resources Information Center

    Welsh, William J.

    These remarks are based on the Association's cooperative library study finding that two-thirds of the southeastern libraries are already participating in cooperative programs. Libraries must now exploit these avenues of cooperation, especially with the Library of Congress. Cooperative projects in automation such as CONSER, COMARC, and RLG are…

  4. Cultural Dimensions of Digital Library Development, Part II: The Cultures of Innovation in Five European National Libraries (Narratives of Development)

    ERIC Educational Resources Information Center

    Dalbello, Marija

    2009-01-01

    This article presents the narrative accounts of the beginnings of digital library programs in five European national libraries: Biblioteca nacional de Portugal, Bibliotheque nationale de France, Die Deutsche Bibliothek, the National Library of Scotland, and the British Library. Based on interviews with policy makers and developers of digital…

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritychenko, B.; Mughabghab, S.F.

    We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections, and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and the European activation file. Extensive analysis of newly-evaluated neutron reaction cross sections, neutron covariances, and improvements in data processing techniques motivated us to calculate nuclear industry and neutron physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented. Due to space limitations, the present paper contains only calculated Maxwellian-averaged cross sections and their uncertainties. The complete data sets for all results are published in the Brookhaven National Laboratory report.

  6. Bayes Factor Covariance Testing in Item Response Models.

    PubMed

    Fox, Jean-Paul; Mulder, Joris; Sinharay, Sandip

    2017-12-01

    Two marginal one-parameter item response theory models are introduced by integrating out the latent variable or random item parameter. It is shown that both marginal response models are multivariate (probit) models with a compound symmetry covariance structure. Several common hypotheses concerning the underlying covariance structure are evaluated using (fractional) Bayes factor tests. The support for a unidimensional factor (i.e., the assumption of local independence) and differential item functioning is evaluated by testing the covariance components. The posterior distribution of common covariance components is obtained in closed form by transforming latent responses with an orthogonal (Helmert) matrix. This posterior distribution is defined as a shifted-inverse-gamma, thereby introducing a default prior and a balanced prior distribution. Based on that, an MCMC algorithm is described to estimate all model parameters and to compute (fractional) Bayes factor tests. Simulation studies are used to show that the (fractional) Bayes factor tests have good properties for testing the underlying covariance structure of binary response data. The method is illustrated with two real data studies.
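
    The Helmert-transform step can be demonstrated numerically: an orthogonal Helmert matrix diagonalizes any compound symmetry covariance, leaving one component with variance σ² + pτ² and the rest with variance σ². A small sketch (not the authors' MCMC code):

```python
import numpy as np
from scipy.linalg import helmert

p, sigma2, tau2 = 5, 1.0, 0.5
Sigma = sigma2 * np.eye(p) + tau2 * np.ones((p, p))  # compound symmetry

H = helmert(p, full=True)   # orthogonal Helmert matrix (first row ~ 1/sqrt(p))
D = H @ Sigma @ H.T         # diagonalized covariance

# First diagonal entry: sigma2 + p*tau2; the rest: sigma2. Off-diagonals
# vanish, so the transformed responses are independent and the covariance
# components can be handled with (shifted) inverse-gamma posteriors.
print("diagonal:", np.round(np.diag(D), 6))
print("max |off-diagonal|:", float(np.abs(D - np.diag(np.diag(D))).max()))
```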

  7. Disruption of structural covariance networks for language in autism is modulated by verbal ability.

    PubMed

    Sharda, Megha; Khundrakpam, Budhachandra S; Evans, Alan C; Singh, Nandini C

    2016-03-01

    The presence of widespread speech and language deficits is a core feature of autism spectrum disorders (ASD). These impairments have often been attributed to altered connections between brain regions. Recent developments in anatomical correlation-based approaches to map structural covariance offer an effective way of studying such connections in vivo. In this study, we employed such a structural covariance network (SCN)-based approach to investigate the integrity of anatomical networks in fronto-temporal brain regions of twenty children with ASD compared to an age and gender-matched control group of twenty-two children. Our findings reflected large-scale disruption of inter and intrahemispheric covariance in left frontal SCNs in the ASD group compared to controls, but no differences in right fronto-temporal SCNs. Interhemispheric covariance in left-seeded networks was further found to be modulated by verbal ability of the participants irrespective of autism diagnosis, suggesting that language function might be related to the strength of interhemispheric structural covariance between frontal regions. Additionally, regional cortical thickening was observed in right frontal and left posterior regions, which was predicted by decreasing symptom severity and increasing verbal ability in ASD. These findings unify reports of regional differences in cortical morphology in ASD. They also suggest that reduced left hemisphere asymmetry and increased frontal growth may not only reflect neurodevelopmental aberrations but also compensatory mechanisms.

  8. Covariance Matrix Evaluations for Independent Mass Fission Yields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, N., E-mail: nicholas.terranova@unibo.it; Serot, O.; Archier, P.

    2015-01-15

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility to generate more reliable and complete uncertainty information on independent mass fission yields. Mass yields covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describe the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squared method through the CONRAD code. Preliminary results on mass yieldsmore » variance-covariance matrix will be presented and discussed from physical grounds in the case of {sup 235}U(n{sub th}, f) and {sup 239}Pu(n{sub th}, f) reactions.« less

  9. Object Tracking Using Adaptive Covariance Descriptor and Clustering-Based Model Updating for Visual Surveillance

    PubMed Central

    Qin, Lei; Snoussi, Hichem; Abdallah, Fahed

    2014-01-01

    We propose a novel approach for tracking an arbitrary object in video sequences for visual surveillance. The first contribution of this work is an automatic feature extraction method that is able to extract compact discriminative features from a feature pool before computing the region covariance descriptor. As the feature extraction method is adaptive to a specific object of interest, we refer to the region covariance descriptor computed using the extracted features as the adaptive covariance descriptor. The second contribution is a weakly supervised method for updating the object appearance model during tracking. The method performs a mean-shift clustering procedure among the tracking result samples accumulated during a period of time and selects a group of reliable samples for updating the object appearance model. As such, the object appearance model is kept up-to-date and is prevented from contamination even in the case of tracking mistakes. We conducted comparative experiments on real-world video sequences, which confirmed the effectiveness of the proposed approaches. The tracking system that integrates the adaptive covariance descriptor and the clustering-based model updating method accomplished stable object tracking on challenging video sequences. PMID:24865883
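
    The underlying region covariance descriptor (following the standard Tuzel-style construction; the paper's adaptive feature selection is not reproduced) can be sketched as:

```python
import numpy as np

def region_covariance(region):
    """Region covariance descriptor: the covariance of per-pixel feature
    vectors [x, y, intensity, |Ix|, |Iy|] over a rectangular region of a
    grayscale image."""
    h, w = region.shape
    ys, xs = np.mgrid[0:h, 0:w]
    iy, ix = np.gradient(region.astype(float))   # image gradients
    feats = np.stack([xs.ravel(), ys.ravel(), region.ravel(),
                      np.abs(ix).ravel(), np.abs(iy).ravel()])
    return np.cov(feats)   # 5x5 symmetric positive semi-definite descriptor

rng = np.random.default_rng(2)
patch = rng.integers(0, 256, (32, 32)).astype(float)
C = region_covariance(patch)
print(C.shape, "descriptor; compact and robust to illumination offsets")
```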

  10. Comparison of Sensible Heat Flux from Eddy Covariance and Scintillometer over different land surface conditions

    NASA Astrophysics Data System (ADS)

    Zeweldi, D. A.; Gebremichael, M.; Summis, T.; Wang, J.; Miller, D.

    2008-12-01

    A large source of uncertainty in satellite-based evapotranspiration algorithms results from the estimation of the sensible heat flux H. Traditionally, eddy covariance sensors, and more recently large-aperture scintillometers, have been used as ground truth to evaluate satellite-based H estimates. The two methods rely on different physical measurement principles and represent different footprint sizes. In New Mexico, we conducted a field campaign during summer 2008 to compare H estimates obtained from the eddy covariance and scintillometer methods. During this campaign, we installed sonic anemometers; a one-propeller eddy covariance (OPEC) system equipped with a net radiometer and soil heat flux sensors; a large-aperture scintillometer (LAS); and a weather station consisting of wind speed, direction, and radiation sensors over three experimental areas with different roughness conditions (desert, irrigated area, and lake). Our results show the similarities and differences in H estimates obtained from these methods over the different land surface conditions. Further, our results show that the H estimates obtained from the LAS agree with those obtained from the eddy covariance method when high-frequency thermocouple temperature, instead of the typical weather station temperature measurement, is used in the LAS analysis.
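
    The eddy covariance estimate of H reduces to the covariance of vertical wind and temperature fluctuations scaled by air density and heat capacity. A minimal sketch on synthetic 10 Hz data, omitting coordinate rotation and other standard corrections:

```python
import numpy as np

def sensible_heat_flux(w, T, rho=1.2, cp=1004.0):
    """Eddy-covariance sensible heat flux H = rho * cp * mean(w' T'),
    from high-frequency vertical wind speed w (m/s) and sonic/air
    temperature T (K). Standard corrections are omitted in this sketch."""
    wp = w - w.mean()
    Tp = T - T.mean()
    return rho * cp * np.mean(wp * Tp)

# Synthetic 30-minute averaging block at 10 Hz (illustrative only).
rng = np.random.default_rng(3)
n = 10 * 60 * 30
w = rng.normal(0.0, 0.3, n)
T = 300.0 + 0.5 * w + rng.normal(0.0, 0.2, n)   # correlated fluctuations
print(f"H = {sensible_heat_flux(w, T):.1f} W/m^2")
```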

  11. Censored quantile regression with recursive partitioning-based weights

    PubMed Central

    Wey, Andrew; Wang, Lan; Rudser, Kyle

    2014-01-01

    Censored quantile regression provides a useful alternative to the Cox proportional hazards model for analyzing survival data. It directly models the conditional quantile of the survival time and hence is easy to interpret. Moreover, it relaxes the proportionality constraint on the hazard function associated with the popular Cox model and is natural for modeling heterogeneity of the data. Recently, Wang and Wang (2009. Locally weighted censored quantile regression. Journal of the American Statistical Association 103, 1117–1128) proposed a locally weighted censored quantile regression approach that allows for covariate-dependent censoring and is less restrictive than other censored quantile regression methods. However, their kernel smoothing-based weighting scheme requires all covariates to be continuous and encounters practical difficulty with even a moderate number of covariates. We propose a new weighting approach that uses recursive partitioning, e.g. survival trees, that offers greater flexibility in handling covariate-dependent censoring in moderately high dimensions and can incorporate both continuous and discrete covariates. We prove that this new weighting scheme leads to consistent estimation of the quantile regression coefficients and demonstrate its effectiveness via Monte Carlo simulations. We also illustrate the new method using a widely recognized data set from a clinical trial on primary biliary cirrhosis. PMID:23975800

  12. Identification of Biomarkers of Impaired Sensory Profiles among Autistic Patients

    PubMed Central

    El-Ansary, Afaf; Hassan, Wail M.; Qasem, Hanan; Das, Undurti N.

    2016-01-01

    Background: Autism is a neurodevelopmental disorder that displays significant heterogeneity. Comparison of subgroups within autism, and analyses of selected biomarkers as measures of the variation in the severity of autistic features such as cognitive dysfunction, social interaction impairment, and sensory abnormalities, might help in understanding the pathophysiology of autism. Methods and Participants: In this study, two sets of biomarkers were selected. The first included 7 biomarkers, while the second included 6. For set 1, data were collected from 35 autistic and 38 healthy control participants, while for set 2, data were collected from 29 of the same 35 autistic participants and 16 additional healthy subjects. These markers were subjected to a principal components analysis using either covariance or correlation matrices. Moreover, libraries composed of participants categorized into units were constructed. The biomarkers used include PE (phosphatidyl ethanolamine), PS (phosphatidyl serine), PC (phosphatidyl choline), MAP2K1 (dual specificity mitogen-activated protein kinase kinase 1), IL-10 (interleukin-10), IL-12, NFκB (nuclear factor-kappa B), PGE2 (prostaglandin E2), PGE2-EP2, mPGES-1 (microsomal prostaglandin synthase E-1), cPLA2 (cytosolic phospholipase A2), 8-isoprostane, and COX-2 (cyclo-oxygenase-2). Results: While none of the studied markers correlated with CARS and SRS as measures of cognitive and social impairment, six markers significantly correlated with the sensory profiles of autistic patients. Multiple regression analysis identified a combination of PGES, mPGES-1, and PE as the best predictors of the degree of sensory profile impairment. Library identification resulted in 100% correct assignment of both autistic and control participants based on either set 1 or set 2 biomarkers, together with a satisfactory rate of assignment in the case of sensory profile impairment using different sets of biomarkers. Conclusion: The two selected sets of biomarkers were effective in separating autistic from healthy control subjects, demonstrating the possibility of accurately predicting the severity of autism using the selected biomarkers. The effectiveness of the identified libraries lay in the fact that they were helpful in correctly assigning the study population as control or autistic patients and in classifying autistic patients with different degrees of sensory profile impairment. PMID:27824861
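
    The choice between covariance- and correlation-matrix PCA matters when biomarkers are on different scales. A generic sketch (synthetic data, not the study's measurements):

```python
import numpy as np

def pca(X, use_correlation=False):
    """Principal components from either the covariance or the correlation
    matrix of the biomarker matrix X (samples x markers)."""
    Xc = X - X.mean(axis=0)
    if use_correlation:
        Xc = Xc / X.std(axis=0, ddof=1)   # standardize -> correlation PCA
    M = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(M)
    order = np.argsort(evals)[::-1]
    return evals[order], evecs[:, order], Xc @ evecs[:, order]

rng = np.random.default_rng(4)
X = rng.normal(size=(35, 7)) * np.array([1, 5, 0.5, 2, 1, 3, 0.2])
for flag in (False, True):
    ev, _, _ = pca(X, use_correlation=flag)
    print("correlation" if flag else "covariance ",
          "PC1 variance share:", round(ev[0] / ev.sum(), 2))
```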

  13. Systems Analysis, Machineable Circulation Data and Library Users and Non-Users.

    ERIC Educational Resources Information Center

    Lubans, John, Jr.

    A proposed study of the use and non-use of a large academic library, based on computer-based circulation data, is discussed. A search of the literature reveals that computer-based circulation systems can be, but have not been, utilized to provide data bases for systematic analyses of library users and resources. The data gathered in the circulation…

  14. Web-Based Online Public Access Catalogues of IIT Libraries in India: An Evaluative Study

    ERIC Educational Resources Information Center

    Madhusudhan, Margam; Aggarwal, Shalini

    2011-01-01

    Purpose: The purpose of the paper is to examine the various features and components of web-based online public access catalogues (OPACs) of IIT libraries in India with the help of a specially designed evaluation checklist. Design/methodology/approach: The various features of the web-based OPACs in six IIT libraries (IIT Delhi, IIT Bombay, IIT…

  15. Library Research: A Ten Year Analysis of the Library Automation Marketplace: 1981-1990.

    ERIC Educational Resources Information Center

    Fivecoat, Martha H.

    This study focuses on the growth of the library automation market from 1981 to 1990. It draws on library automation data published annually in the Library Journal between 1981 and 1990. The data are used to examine: (1) the overall library system market trends based on the total and cumulative number of systems installed and revenue generated; (2)…

  16. Attitudes about OCLC in Small and Medium-Sized Libraries. Illinois Valley Library System OCLC Experimental Project. Report No. 4.

    ERIC Educational Resources Information Center

    Bills, Linda G.; Wilford, Valerie

    A project was conducted from 1980 to 1982 to determine the costs and benefits of OCLC use in 29 small and medium-sized member libraries of the Illinois Valley Library System (IVLS). Academic, school, public, and special libraries participated in the project. Based on written attitude surveys of and interviews with library directors, staff,…

  17. Role of Computers in Sci-Tech Libraries.

    ERIC Educational Resources Information Center

    Bichteler, Julie; And Others

    1986-01-01

    Articles in this theme issue discuss applications of microcomputers in science/technology libraries, a UNIX-based online catalog, online versus print sources, computer-based statistics, and the applicability and implications of the Matheson-Cooper Report on health science centers for science/technology libraries. A bibliography of new reference…

  18. Conditional Covariance-Based Nonparametric Multidimensionality Assessment.

    ERIC Educational Resources Information Center

    Stout, William; And Others

    1996-01-01

    Three nonparametric procedures that use estimates of covariances of item-pair responses conditioned on examinee trait level for assessing dimensionality of a test are described. The HCA/CCPROX, DIMTEST, and DETECT are applied to a dimensionality study of the Law School Admission Test. (SLD)

  19. Statistics of Public Libraries, 1977-1978.

    ERIC Educational Resources Information Center

    Eckard, Helen

    Based on a study of U.S. public libraries which was part of the 1977-78 Library General Information Survey (LIBGIS), this statistical report updates a similar 1974 publication. A definition of public libraries as used in the survey is provided, as well as information on public library service outlets, equipment, staffing, receipts, expenditures,…

  20. Introducing ORACLE: Library Processing in a Multi-User Environment.

    ERIC Educational Resources Information Center

    Queensland Library Board, Brisbane (Australia).

    Currently being developed by the State Library of Queensland, Australia, ORACLE (On-Line Retrieval of Acquisitions, Cataloguing, and Circulation Details for Library Enquiries) is a computerized library system designed to provide rapid processing of library materials in a multi-user environment. It is based on the Australian MARC format and fully…

  1. Using a Gravity Model to Predict Circulation in a Public Library System.

    ERIC Educational Resources Information Center

    Ottensmann, John R.

    1995-01-01

    Describes the development of a gravity model based upon principles of spatial interaction to predict the circulation of libraries in the Indianapolis-Marion County Public Library (Indiana). The model effectively predicted past circulation figures and was tested by predicting future library circulation, particularly for a new branch library.…
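
    A generic production-constrained gravity model of this kind allocates patrons to branches in proportion to branch attractiveness divided by a power of distance. The sketch below uses invented parameters and is not Ottensmann's exact specification:

```python
import numpy as np

def gravity_circulation(attractiveness, distance, origins, beta=1.5):
    """Production-constrained gravity model: patrons at each origin are
    allocated to branches in proportion to branch attractiveness (e.g.,
    collection size) times a distance-decay term d**(-beta)."""
    decay = attractiveness[None, :] * distance ** (-beta)  # origins x branches
    probs = decay / decay.sum(axis=1, keepdims=True)       # rows sum to 1
    return origins @ probs                                 # circulation/branch

attract = np.array([50_000, 120_000, 30_000], dtype=float)  # volumes held
dist = np.array([[1.0, 3.0, 6.0],
                 [4.0, 1.5, 2.0],
                 [7.0, 2.5, 1.0]])            # miles, origin zones x branches
patrons = np.array([8_000, 12_000, 5_000], dtype=float)
print(gravity_circulation(attract, dist, patrons).round(0))
```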

  2. Marketing Activities and Usage in Historically Black College and University Libraries 2000-2008

    ERIC Educational Resources Information Center

    Walsh, Janet

    2012-01-01

    The academic library has experienced overall growth and decline based on demographics, technology, and convenience, yet several problems face academic libraries today including: (a) perceived relevance, (b) market share, and (c) competition. The purpose of this study was to explore marketing activities and library usage in HBCU libraries. The…

  3. Making Wise Buys: Five Values to Consider when Evaluating a Library Purchase

    ERIC Educational Resources Information Center

    Durr, Chris

    2011-01-01

    Library staff members should ultimately base their purchasing choices on the mission statements of their employing institutions. Fortunately, library mission statements have much in common. Undoubtedly, for example, all libraries have a goal that includes "serve the information needs of the community," because on some level, all libraries are…

  4. Customer Satisfaction with Public Libraries.

    ERIC Educational Resources Information Center

    D'Elia, George; Rodger, Eleanor Jo

    1996-01-01

    Surveys conducted in 142 urban public libraries examined customer satisfaction, comparisons with other libraries, and factors affecting satisfaction. Overall, customers were satisfied with their libraries but experienced different levels of satisfaction based on convenience, availability of materials and information, and services facilitating…

  5. Library Standards: Evidence of Library Effectiveness and Accreditation.

    ERIC Educational Resources Information Center

    Ebbinghouse, Carol

    1999-01-01

    Discusses accreditation standards for libraries based on experiences in an academic law library. Highlights include the accreditation process; the impact of distance education and remote technologies on accreditation; and a list of Internet sources of standards and information. (LRW)

  6. Expanding roles in a library-based bioinformatics service program: a case study

    PubMed Central

    Li, Meng; Chen, Yi-Bu; Clintworth, William A

    2013-01-01

    Question: How can a library-based bioinformatics support program be implemented and expanded to continuously support the growing and changing needs of the research community? Setting: A program at a health sciences library serving a large academic medical center with a strong research focus is described. Methods: The bioinformatics service program was established at the Norris Medical Library in 2005. As part of program development, the library assessed users' bioinformatics needs, acquired additional funds, established and expanded service offerings, and explored additional roles in promoting on-campus collaboration. Results: Personnel and software have increased along with the number of registered software users and use of the provided services. Conclusion: With strategic efforts and persistent advocacy within the broader university environment, library-based bioinformatics service programs can become a key part of an institution's comprehensive solution to researchers' ever-increasing bioinformatics needs. PMID:24163602

  7. Editorial: Evidence based library and information practice.

    PubMed

    Grant, Maria J

    2011-06-01

    Whilst many of us engage in supporting clinicians in identifying, appraising, and using evidence, how many of us adopt the same approach to our own work? A recent survey by the UK LIS Research Coalition indicated that 60% of respondents use research reports as a source of information, whilst a similar proportion of health library respondents use professional events such as conferences as a source of information. This summer sees the 6th International Evidence Based Library and Information Practice conference (EBLIP6) being held at the University of Salford, UK, between 27th and 30th June, which will go some way to satisfying this latter information need, whilst the Health Information and Libraries Journal can help satisfy the need for research reports. Whatever you're doing this summer, let's make it evidence based. © 2011 The authors. Health Information and Libraries Journal © 2011 Health Libraries Group.

  8. Next-generation libraries for robust RNA interference-based genome-wide screens

    PubMed Central

    Kampmann, Martin; Horlbeck, Max A.; Chen, Yuwen; Tsai, Jordan C.; Bassik, Michael C.; Gilbert, Luke A.; Villalta, Jacqueline E.; Kwon, S. Chul; Chang, Hyeshik; Kim, V. Narry; Weissman, Jonathan S.

    2015-01-01

    Genetic screening based on loss-of-function phenotypes is a powerful discovery tool in biology. Although the recent development of clustered regularly interspaced short palindromic repeats (CRISPR)-based screening approaches in mammalian cell culture has enormous potential, RNA interference (RNAi)-based screening remains the method of choice in several biological contexts. We previously demonstrated that ultracomplex pooled short-hairpin RNA (shRNA) libraries can largely overcome the problem of RNAi off-target effects in genome-wide screens. Here, we systematically optimize several aspects of our shRNA library, including the promoter and microRNA context for shRNA expression, selection of guide strands, and features relevant for postscreen sample preparation for deep sequencing. We present next-generation high-complexity libraries targeting human and mouse protein-coding genes, which we grouped into 12 sublibraries based on biological function. A pilot screen suggests that our next-generation RNAi library performs comparably to current CRISPR interference (CRISPRi)-based approaches and can yield complementary results with high sensitivity and high specificity. PMID:26080438

  9. Adaptive pre-specification in randomized trials with and without pair-matching.

    PubMed

    Balzer, Laura B; van der Laan, Mark J; Petersen, Maya L

    2016-11-10

    In randomized trials, adjustment for measured covariates during the analysis can reduce variance and increase power. To avoid misleading inference, the analysis plan must be pre-specified. However, it is often unclear a priori which baseline covariates (if any) should be adjusted for in the analysis. Consider, for example, the Sustainable East Africa Research in Community Health (SEARCH) trial for HIV prevention and treatment. There are 16 matched pairs of communities and many potential adjustment variables, including region, HIV prevalence, male circumcision coverage, and measures of community-level viral load. In this paper, we propose a rigorous procedure to data-adaptively select the adjustment set, which maximizes the efficiency of the analysis. Specifically, we use cross-validation to select from a pre-specified library the candidate targeted maximum likelihood estimator (TMLE) that minimizes the estimated variance. For further gains in precision, we also propose a collaborative procedure for estimating the known exposure mechanism. Our small sample simulations demonstrate the promise of the methodology to maximize study power, while maintaining nominal confidence interval coverage. We show how our procedure can be tailored to the scientific question (intervention effect for the study sample vs. for the target population) and study design (pair-matched or not). Copyright © 2016 John Wiley & Sons, Ltd.
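
    A crude stand-in for the selection step is shown below: for each candidate adjustment variable, estimate the treatment coefficient on each cross-validation fold and pick the candidate whose fold estimates are least variable. The real procedure cross-validates an influence-curve variance estimate over a pre-specified library of TMLEs, which this sketch does not implement:

```python
import numpy as np

def cv_fold_variance(y, a, w, folds=5, seed=0):
    """For one candidate adjustment variable w (or w=None), compute an
    ANCOVA-style treatment effect on each fold and return the empirical
    variance of the fold estimates (a rough proxy for estimator variance)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    ests = []
    for part in np.array_split(idx, folds):
        yi, ai = y[part], a[part]
        X = np.column_stack([np.ones(len(part)), ai] +
                            ([w[part]] if w is not None else []))
        beta, *_ = np.linalg.lstsq(X, yi, rcond=None)
        ests.append(beta[1])              # coefficient on treatment
    return np.var(ests, ddof=1)

rng = np.random.default_rng(5)
n = 160
w1 = rng.normal(size=n)                   # prognostic covariate
w2 = rng.normal(size=n)                   # pure-noise covariate
a = rng.integers(0, 2, n)
y = 1.0 * a + 2.0 * w1 + rng.normal(size=n)
candidates = {"none": None, "w1": w1, "w2": w2}
best = min(candidates, key=lambda k: cv_fold_variance(y, a, candidates[k]))
print("selected adjustment set:", best)   # adjusting for w1 should win
```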

  10. Determination of a Screening Metric for High Diversity DNA Libraries.

    PubMed

    Guido, Nicholas J; Handerson, Steven; Joseph, Elaine M; Leake, Devin; Kung, Li A

    2016-01-01

    The fields of antibody engineering, enzyme optimization and pathway construction rely increasingly on screening complex variant DNA libraries. These highly diverse libraries allow researchers to sample a maximized sequence space; and therefore, more rapidly identify proteins with significantly improved activity. The current state of the art in synthetic biology allows for libraries with billions of variants, pushing the limits of researchers' ability to qualify libraries for screening by measuring the traditional quality metrics of fidelity and diversity of variants. Instead, when screening variant libraries, researchers typically use a generic, and often insufficient, oversampling rate based on a common rule-of-thumb. We have developed methods to calculate a library-specific oversampling metric, based on fidelity, diversity, and representation of variants, which informs researchers, prior to screening the library, of the amount of oversampling required to ensure that the desired fraction of variant molecules will be sampled. To derive this oversampling metric, we developed a novel alignment tool to efficiently measure frequency counts of individual nucleotide variant positions using next-generation sequencing data. Next, we apply a method based on the "coupon collector" probability theory to construct a curve of upper bound estimates of the sampling size required for any desired variant coverage. The calculated oversampling metric will guide researchers to maximize their efficiency in using highly variant libraries.
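
    The coupon-collector backbone of the oversampling metric is easy to state: under uniform, error-free sampling, the expected coverage after m screens is 1 - (1 - 1/n)^m, which can be inverted for the screening depth needed to reach a target coverage. A sketch (the paper's metric additionally folds in fidelity and skewed variant representation measured from sequencing data):

```python
import numpy as np

def expected_coverage(n_variants, n_screened):
    """Expected fraction of distinct variants observed after screening
    n_screened clones uniformly at random (with replacement)."""
    return 1.0 - (1.0 - 1.0 / n_variants) ** n_screened

def screens_needed(n_variants, target_fraction):
    """Screens required so the expected coverage reaches target_fraction,
    for the idealized uniform, error-free case."""
    return int(np.ceil(np.log(1.0 - target_fraction) /
                       np.log(1.0 - 1.0 / n_variants)))

n = 10_000
for f in (0.90, 0.95, 0.99):
    m = screens_needed(n, f)
    print(f"{f:.0%} coverage of {n} variants: screen ~{m} clones "
          f"({m / n:.1f}x oversampling)")
```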

  11. Population Pharmacokinetics of Intranasal Scopolamine

    NASA Technical Reports Server (NTRS)

    Wu, L.; Chow, D. S. L.; Putcha, L.

    2013-01-01

    Introduction: An intranasal gel dosage formulation of scopolamine (INSCOP) was developed for the treatment of Space Motion Sickness (SMS). The bioavailability and pharmacokinetics (PK) were evaluated using data collected in Phase II IND protocols. We reported earlier statistically significant gender differences in PK parameters of INSCOP at a dose level of 0.4 mg. To identify covariates that influence PK parameters of INSCOP, we examined population covariates of the INSCOP PK model for the 0.4 mg dose. Methods: Plasma scopolamine concentration versus time data were collected from 20 normal healthy human subjects (11 male/9 female) after a 0.4 mg dose. Phoenix NLME was employed for PK analysis of these data using gender, body weight, and age as covariates for model selection. Model selection was based on a likelihood ratio test on the difference of criteria (-2LL). Statistical significance for base model building and individual covariate analysis was set at P < 0.05 (Δ(-2LL) = 3.84). Results: A one-compartment pharmacokinetic model with first-order elimination best described INSCOP concentration-time profiles. Inclusion of gender, body weight, and age as covariates individually significantly reduced -2LL by the cut-off value of 3.84 (P < 0.05) when tested against the base model. After the forward stepwise selection and backward elimination steps, gender was added to the final model, in which it had a significant influence on the absorption rate constant (ka) and the volume of distribution (V) of INSCOP. Conclusion: A population pharmacokinetic model for INSCOP has been identified, and gender was a significant contributing covariate for the final model. The volume of distribution and ka were significantly higher in males than in females, which confirms gender-dependent pharmacokinetics of scopolamine after administration of a 0.4 mg dose.
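
    The selected structural model is standard; a sketch of a one-compartment oral-absorption profile with a gender covariate on ka and V follows, with all parameter values invented for illustration (they are not the study's estimates):

```python
import numpy as np

def conc_1cmt_oral(t, dose, ka, ke, v, f=1.0):
    """One-compartment model with first-order absorption and elimination:
    C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))."""
    return f * dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# Gender as a covariate on ka and V, in the spirit of the final model
# (all numbers below are hypothetical; dose in mg, V in L, t in hours).
t = np.linspace(0.0, 8.0, 100)
males = conc_1cmt_oral(t, dose=0.4, ka=2.0, ke=0.35, v=600.0)
females = conc_1cmt_oral(t, dose=0.4, ka=1.4, ke=0.35, v=450.0)
print(f"Cmax male/female: {males.max()*1e3:.2f} / {females.max()*1e3:.2f} ng/mL")
```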

  12. Simplification of the Kalman filter for meteorological data assimilation

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.

    1991-01-01

    The paper proposes a new statistical method of data assimilation that is based on a simplification of the Kalman filter equations. The forecast error covariance evolution is approximated simply by advecting the mass-error covariance field, deriving the remaining covariances geostrophically, and accounting for external model-error forcing only at the end of each forecast cycle. This greatly reduces the cost of computation of the forecast error covariance. In simulations with a linear, one-dimensional shallow-water model and data generated artificially, the performance of the simplified filter is compared with that of the Kalman filter and the optimal interpolation (OI) method. The simplified filter produces analyses that are nearly optimal, and represents a significant improvement over OI.
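
    The contrast with the full filter can be sketched abstractly: the full Kalman filter propagates the covariance through the complete dynamics with model error at every step, while the simplified scheme advects only the mass-error covariance and adds the model-error forcing once per cycle. A toy sketch on a periodic 1-D grid (the geostrophic derivation of the remaining covariances is omitted):

```python
import numpy as np

def kf_forecast_cov(M, Pa, Q):
    """Full Kalman filter forecast-error covariance: P_f = M Pa M^T + Q."""
    return M @ Pa @ M.T + Q

def simplified_forecast_cov(M_adv, Pa_mass, Q_mass, steps):
    """Simplified scheme in the spirit of the paper: advect only the
    mass-error covariance over the forecast cycle and add the model-error
    forcing once at the end."""
    P = Pa_mass
    for _ in range(steps):
        P = M_adv @ P @ M_adv.T   # pure advection, no model error yet
    return P + Q_mass

n = 8
S = np.roll(np.eye(n), 1, axis=1)       # periodic advection by one grid point
Pa = np.diag(np.arange(1.0, n + 1.0))   # analysis-error variances 1..8
Q = 0.1 * np.eye(n)
print("full, one step    :", np.diag(kf_forecast_cov(S, Pa, Q)).round(1))
print("simplified, 3 steps:", np.diag(simplified_forecast_cov(S, Pa, Q, 3)).round(1))
```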

  13. Non-Linear Optimization Applied to Angle-of-Arrival Satellite-Based Geolocation with Correlated Measurements

    DTIC Science & Technology

    2015-03-01

    Notation fragment from the report's symbol list: general covariance intersection covariance matrix; Σ1, measurement 1's covariance matrix; I(X), Fisher information matrix; g, confidence region; L, lower… This chapter discusses the motivation and background of the geolocation algorithm, with the scope of the applications for this research. The algorithm is able to produce the best description of an object given the information from a set of measurements. Determining a position requires the use of a…

  14. Large Covariance Estimation by Thresholding Principal Orthogonal Complements

    PubMed Central

    Fan, Jianqing; Liao, Yuan; Mincheva, Martina

    2012-01-01

    This paper deals with the estimation of a high-dimensional covariance with a conditional sparsity structure and fast-diverging eigenvalues. By assuming sparse error covariance matrix in an approximate factor model, we allow for the presence of some cross-sectional correlation even after taking out common but unobservable factors. We introduce the Principal Orthogonal complEment Thresholding (POET) method to explore such an approximate factor structure with sparsity. The POET estimator includes the sample covariance matrix, the factor-based covariance matrix (Fan, Fan, and Lv, 2008), the thresholding estimator (Bickel and Levina, 2008) and the adaptive thresholding estimator (Cai and Liu, 2011) as specific examples. We provide mathematical insights when the factor analysis is approximately the same as the principal component analysis for high-dimensional data. The rates of convergence of the sparse residual covariance matrix and the conditional sparse covariance matrix are studied under various norms. It is shown that the impact of estimating the unknown factors vanishes as the dimensionality increases. The uniform rates of convergence for the unobserved factors and their factor loadings are derived. The asymptotic results are also verified by extensive simulation studies. Finally, a real data application on portfolio allocation is presented. PMID:24348088

  15. Large Covariance Estimation by Thresholding Principal Orthogonal Complements.

    PubMed

    Fan, Jianqing; Liao, Yuan; Mincheva, Martina

    2013-09-01

    This paper deals with the estimation of a high-dimensional covariance with a conditional sparsity structure and fast-diverging eigenvalues. By assuming sparse error covariance matrix in an approximate factor model, we allow for the presence of some cross-sectional correlation even after taking out common but unobservable factors. We introduce the Principal Orthogonal complEment Thresholding (POET) method to explore such an approximate factor structure with sparsity. The POET estimator includes the sample covariance matrix, the factor-based covariance matrix (Fan, Fan, and Lv, 2008), the thresholding estimator (Bickel and Levina, 2008) and the adaptive thresholding estimator (Cai and Liu, 2011) as specific examples. We provide mathematical insights when the factor analysis is approximately the same as the principal component analysis for high-dimensional data. The rates of convergence of the sparse residual covariance matrix and the conditional sparse covariance matrix are studied under various norms. It is shown that the impact of estimating the unknown factors vanishes as the dimensionality increases. The uniform rates of convergence for the unobserved factors and their factor loadings are derived. The asymptotic results are also verified by extensive simulation studies. Finally, a real data application on portfolio allocation is presented.
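
    A compact numerical sketch of the POET construction (leading principal components plus a soft-thresholded residual covariance; the threshold choice and asymptotic theory are not reproduced):

```python
import numpy as np

def poet(X, n_factors, threshold):
    """POET sketch: estimate a high-dimensional covariance as the low-rank
    part from the leading principal components plus a soft-thresholded
    residual (the "principal orthogonal complement")."""
    S = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(S)
    order = np.argsort(evals)[::-1][:n_factors]
    lam, v = evals[order], evecs[:, order]
    low_rank = (v * lam) @ v.T                 # sum of leading components
    R = S - low_rank                           # residual covariance
    off = R - np.diag(np.diag(R))
    R_thr = np.diag(np.diag(R)) + np.sign(off) * np.maximum(np.abs(off) - threshold, 0.0)
    return low_rank + R_thr

rng = np.random.default_rng(6)
n, p, k = 200, 50, 3
F = rng.normal(size=(n, k))                    # common factors
B = rng.normal(size=(p, k))                    # factor loadings
X = F @ B.T + rng.normal(scale=0.5, size=(n, p))  # approximate factor model
Sigma_hat = poet(X, n_factors=k, threshold=0.05)
print("estimated covariance:", Sigma_hat.shape)
```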

  16. Complementary nonparametric analysis of covariance for logistic regression in a randomized clinical trial setting.

    PubMed

    Tangen, C M; Koch, G G

    1999-03-01

    In the randomized clinical trial setting, controlling for covariates is expected to produce variance reduction for the treatment parameter estimate and to adjust for random imbalances of covariates between the treatment groups. However, for the logistic regression model, variance reduction is not obviously obtained. This can lead to concerns about the assumptions of the logistic model. We introduce a complementary nonparametric method for covariate adjustment. It provides results that are usually compatible with expectations for analysis of covariance. The only assumptions required are based on randomization and sampling arguments. The resulting treatment parameter is a (unconditional) population average log-odds ratio that has been adjusted for random imbalance of covariates. Data from a randomized clinical trial are used to compare results from the traditional maximum likelihood logistic method with those from the nonparametric logistic method. We examine treatment parameter estimates, corresponding standard errors, and significance levels in models with and without covariate adjustment. In addition, we discuss differences between unconditional population average treatment parameters and conditional subpopulation average treatment parameters. Additional features of the nonparametric method, including stratified (multicenter) and multivariate (multivisit) analyses, are illustrated. Extensions of this methodology to the proportional odds model are also made.

  17. A stochastic multiple imputation algorithm for missing covariate data in tree-structured survival analysis.

    PubMed

    Wallace, Meredith L; Anderson, Stewart J; Mazumdar, Sati

    2010-12-20

    Missing covariate data present a challenge to tree-structured methodology because a single tree model, as opposed to an estimated parameter value, may be desired for use in a clinical setting. To address this problem, we suggest a multiple imputation algorithm that adds draws of stochastic error to a tree-based single imputation method presented by Conversano and Siciliano (Technical Report, University of Naples, 2003). Unlike previously proposed techniques for accommodating missing covariate data in tree-structured analyses, our methodology allows the modeling of complex and nonlinear covariate structures while still resulting in a single tree model. We perform a simulation study to evaluate our stochastic multiple imputation algorithm when covariate data are missing at random and compare it to other currently used methods. Our algorithm is advantageous for identifying the true underlying covariate structure when complex data and larger percentages of missing covariate observations are present. It is competitive with other current methods with respect to prediction accuracy. To illustrate our algorithm, we create a tree-structured survival model for predicting time to treatment response in older, depressed adults. Copyright © 2010 John Wiley & Sons, Ltd.

  18. Usage Analysis for the Identification of Research Trends in Digital Libraries; Keepers of the Crumbling Culture: What Digital Preservation Can Learn from Library History; Patterns of Journal Use by Scientists through Three Evolutionary Phases; Developing a Content Management System-Based Web Site; Exploring Charging Models for Digital Cultural Heritage in Europe; Visions: The Academic Library in 2012.

    ERIC Educational Resources Information Center

    Bollen, Johan; Vemulapalli, Soma Sekara; Xu, Weining; Luce, Rick; Marcum, Deanna; Friedlander, Amy; Tenopir, Carol; Grayson, Matt; Zhang, Yan; Ebuen, Mercy; King, Donald W.; Boyce, Peter; Rogers, Clare; Kirriemuir, John; Tanner, Simon; Deegan, Marilyn; Marcum, James W.

    2003-01-01

    Includes six articles that discuss use analysis and research trends in digital libraries; library history and digital preservation; journal use by scientists; a content management system-based Web site for higher education in the United Kingdom; cost studies for transitioning to digitized collections in European cultural institutions; and the…

  19. Competency-Based Education Programs: A Library Perspective

    ERIC Educational Resources Information Center

    Sanders, Colleen

    2015-01-01

    Competency-based education (CBE) is an emerging model for higher education designed to reduce certain barriers to educational attainment. This essay describes CBE and the challenges and opportunities for academic librarians desiring to serve students and faculty in Library and Information Management Master of Library Science (MLS) programs. Every…

  20. Computer-Based Training for Library Staff: From Demonstration to Continuing Program.

    ERIC Educational Resources Information Center

    Bayne, Pauline S.

    1993-01-01

    Describes a demonstration project developed at the University of Tennessee (Knoxville) libraries to train nonprofessional library staff with computer-based training using HyperCard that was created by librarians rather than by computer programmers. Evaluation methods are discussed, including formative and summative evaluation; and modifications…

  1. The Experience of Evidence-Based Practice in an Australian Public Library: An Ethnography

    ERIC Educational Resources Information Center

    Gillespie, Ann; Partridge, Helen; Bruce, Christine; Howlett, Alisa

    2016-01-01

    Introduction: This paper presents the findings from a project that investigated the lived experiences of library and information professionals in relation to evidence-based practice within an Australian public library. Method: The project employed ethnography, which allows holistic description of people's experiences within a particular community…

  2. A Framework for Concept-Based Digital Course Libraries

    ERIC Educational Resources Information Center

    Dicheva, Darina; Dichev, Christo

    2004-01-01

    This article presents a general framework for building concept-based digital course libraries. The framework is based on the idea of using a conceptual structure that represents a subject domain ontology for classification of the course library content. Two aspects, domain conceptualization, which supports findability, and ontologies, which support…

  3. A Library Based Apprenticeship in Psychology Research.

    ERIC Educational Resources Information Center

    Ross, Linda; Carter, Elizabeth W.

    A collaboration to develop materials that will foster critical thinking and communication skills through library-based research is described. Library activities were designed to promote the use of cognitive skills, such as analysis, synthesis, and evaluation. This was a pilot study designed to expose undergraduate students to one…

  4. Building Bridges: A Research Library Model for Technology-Based Partnerships

    ERIC Educational Resources Information Center

    Snyder, Carolyn A.; Carter, Howard; Soltys, Mickey

    2005-01-01

    The nature of technology-based collaboration is affected by the changing goals and priorities, budgetary considerations, staff expertise, and leadership of each of the organizations involved in the partnership. In the context of a national research library, this article will describe Southern Illinois University Carbondale Library Affairs'…

  5. Management Data for Selection Decisions in Building Library Collections.

    ERIC Educational Resources Information Center

    Hamaker, Charles A.

    1992-01-01

    Discusses the use of library management data, particularly circulation data, in making selection decisions for library collection development based on experiences at Louisiana State University. Development of a collection based on actual use rather than perceived research needs is considered, and the decision-making process for serials…

  6. Ecological covariates based predictive model of malaria risk in the state of Chhattisgarh, India.

    PubMed

    Kumar, Rajesh; Dash, Chinmaya; Rani, Khushbu

    2017-09-01

    Malaria being an endemic disease in the state of Chhattisgarh and an ecologically dependent mosquito-borne disease, this study aims to identify the ecological covariates of malaria risk in the districts of the state and to build a suitable predictive model based on those predictors, which could assist in developing a weather-based early warning system. This secondary-data-based analysis used one-month-lagged district-level malaria-positive cases as the response variable and ecological covariates as independent variables, which were tested with fixed-effect panelled negative binomial regression models. Interactions among the covariates were explored using two-way factorial interactions in the model. Although malaria risk in the state is perennial, higher parasitic incidence was observed during the rainy and winter seasons. The univariate analysis indicated that malaria incidence risk was statistically significantly associated with rainfall, maximum humidity, minimum temperature, wind speed, and forest cover (p < 0.05). The efficient predictive model includes forest cover [IRR 1.033 (1.024-1.042)], maximum humidity [IRR 1.016 (1.013-1.018)], and a two-way factorial interaction between the district-specific averaged monthly minimum temperature and the monthly minimum temperature; monthly minimum temperature was statistically significant [IRR 1.44 (1.231-1.695)], whereas the interaction term had a protective effect [IRR 0.982 (0.974-0.990)] against malaria infections. Forest cover, maximum humidity, minimum temperature and wind speed emerged as potential covariates for predictive models of malaria risk in the state, which could be used efficiently in early warning systems.
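
    As a hedged sketch of the kind of count-regression model the abstract reports, the following fits a negative binomial GLM of monthly cases on ecological covariates with statsmodels; the data frame is synthetic placeholder data, and the fixed-effect panel structure is only mimicked with district dummies.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic placeholder standing in for the district-month panel.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "cases": rng.poisson(20, n),
    "rainfall": rng.gamma(2.0, 50.0, n),
    "max_humidity": rng.uniform(60, 100, n),
    "min_temperature": rng.uniform(10, 28, n),
    "wind_speed": rng.uniform(0, 10, n),
    "forest_cover": rng.uniform(0, 60, n),
    "district": rng.integers(0, 5, n),
})

model = smf.glm(
    "cases ~ rainfall + max_humidity + min_temperature + wind_speed"
    " + forest_cover + C(district)",
    data=df, family=sm.families.NegativeBinomial(),
)
print(model.fit().summary())  # exp(coefficient) gives IRRs like those quoted above
```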

  7. Definition of Contravariant Velocity Components

    NASA Technical Reports Server (NTRS)

    Hung, Ching-Mao; Kwak, Dochan (Technical Monitor)

    2002-01-01

    This is an old issue in computational fluid dynamics (CFD). What is the so-called contravariant velocity or contravariant velocity component? In the article, we review the basics of tensor analysis and give the contravariant velocity component a rigorous explanation. For a given coordinate system, there exist two uniquely determined sets of base vector systems - one is the covariant and another is the contravariant base vector system. The two base vector systems are reciprocal. The so-called contravariant velocity component is really the contravariant component of a velocity vector for a time-independent coordinate system, or the contravariant component of a relative velocity between fluid and coordinates, for a time-dependent coordinate system. The contravariant velocity components are not physical quantities of the velocity vector. Their magnitudes, dimensions, and associated directions are controlled by their corresponding covariant base vectors. Several 2-D (two-dimensional) linear examples and the 2-D mass-conservation equation are used to illustrate the details of expressing a vector with respect to the covariant and contravariant base vector systems, respectively.
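
    The reciprocity the abstract appeals to can be written compactly (our notation, following the standard tensor-analysis convention):

```latex
% Covariant basis g_i, reciprocal (contravariant) basis g^i, and the two
% component sets of a velocity vector V:
\[
\mathbf{g}^{i}\cdot\mathbf{g}_{j}=\delta^{i}_{\;j}, \qquad
\mathbf{V}=V^{i}\,\mathbf{g}_{i}=V_{i}\,\mathbf{g}^{i}, \qquad
V^{i}=\mathbf{V}\cdot\mathbf{g}^{i}, \quad
V_{i}=\mathbf{V}\cdot\mathbf{g}_{i}.
\]
```

    The contravariant components V^i thus carry whatever dimensions the covariant base vectors g_i leave over, which is why they are not themselves physical velocity components.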

  8. The Michigan Electronic Library.

    ERIC Educational Resources Information Center

    Davidsen, Susanna L.

    1997-01-01

    Describes the Michigan Electronic Library (MEL), the largest evaluated and organized Web-based library of Internet resources, that was designed to provide a library of electronic information resources selected by librarians. MEL's partnership is explained, the collection is described, and future developments are considered. (LRW)

  9. Collection-based analysis of selected medical libraries in the Philippines using Doody’s Core Titles

    PubMed Central

    Torres, Efren

    2017-01-01

    Objectives This study assessed the book collection of five selected medical libraries in the Philippines, based on Doody’s Essential Purchase List for basic sciences and clinical medicine, to compare the match and non-match titles among libraries, to determine the strong and weak disciplines of each library, and to explore the factors that contributed to the percentage of match and non-match titles. Method List checking was employed as the method of research. Results Among the medical libraries, De La Salle Health Sciences Institute and University of Santo Tomas had the highest percentage of match titles, whereas Ateneo School of Medicine and Public Health had the lowest percentage of match titles. University of the Philippines Manila had the highest percentage of near-match titles. Conclusion De La Salle Health Sciences Institute and University of Santo Tomas had sound medical collections based on Doody’s Core Titles. Collectively, the medical libraries shared common collection development priorities, as evidenced by similarities in strong areas. Library budget and the role of the library director in book selection were among the factors that could contribute to a high percentage of match titles. PMID:28096742

  10. A Wavelet based Suboptimal Kalman Filter for Assimilation of Stratospheric Chemical Tracer Observations

    NASA Technical Reports Server (NTRS)

    Tangborn, Andrew; Auger, Ludovic

    2003-01-01

    A suboptimal Kalman filter system which evolves error covariances in terms of a truncated set of wavelet coefficients has been developed for the assimilation of chemical tracer observations of CH4. This scheme projects the discretized covariance propagation equations and covariance matrix onto an orthogonal set of compactly supported wavelets. The wavelet representation is localized in both location and scale, which allows for efficient representation of the inherently anisotropic structure of the error covariances. The truncation is carried out in such a way that the resolution of the error covariance is reduced only in the zonal direction, where gradients are smaller. Assimilation experiments lasting 24 days and using different degrees of truncation were carried out. These reduced the covariance size by 90, 97 and 99% and the computational cost of covariance propagation by 80, 93 and 96%, respectively. The differences in both the error covariance and the tracer field between the truncated and full systems over this period were found not to grow in the first case, and to grow relatively slowly in the latter two cases. The largest errors in the tracer fields were found to occur in regions of the largest zonal gradients in the constituent field. These results indicate that propagation of error covariances for a global two-dimensional data assimilation system is currently feasible. Recommendations for further reduction in computational cost are made with the goal of extending this technique to three-dimensional global assimilation systems.
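
    A toy numpy sketch of the core idea (ours, using a 1-D Haar basis rather than the paper's 2-D compactly supported wavelets): transform the covariance to wavelet space, keep only coarse-scale coefficients, and transform back.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix for n = 2**k."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                 # coarse (scaling) rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])   # finest detail rows
    return np.vstack([top, bot]) / np.sqrt(2.0)

n = 64
# Toy exponential covariance standing in for the tracer error covariance.
P = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 8.0)
W = haar_matrix(n)
C = W @ P @ W.T                  # covariance in wavelet space
keep = n // 4                    # retain coarse scales only
C[keep:, :] = 0.0
C[:, keep:] = 0.0
P_trunc = W.T @ C @ W            # truncated covariance in physical space
print("relative error:", np.linalg.norm(P - P_trunc) / np.linalg.norm(P))
```

    Because the truncation is a congruence with a projection, the truncated covariance stays positive semidefinite, which is what makes this kind of reduction safe inside a Kalman filter.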

  11. A Global Covariance Descriptor for Nuclear Atypia Scoring in Breast Histopathology Images.

    PubMed

    Khan, Adnan Mujahid; Sirinukunwattana, Korsuk; Rajpoot, Nasir

    2015-09-01

    Nuclear atypia scoring is a diagnostic measure commonly used to assess tumor grade of various cancers, including breast cancer. It provides a quantitative measure of deviation in visual appearance of cell nuclei from those in normal epithelial cells. In this paper, we present a novel image-level descriptor for nuclear atypia scoring in breast cancer histopathology images. The method is based on the region covariance descriptor that has recently become a popular method in various computer vision applications. The descriptor in its original form is not suitable for classification of histopathology images as cancerous histopathology images tend to possess diversely heterogeneous regions in a single field of view. Our proposed image-level descriptor, which we term as the geodesic mean of region covariance descriptors, possesses all the attractive properties of covariance descriptors lending itself to tractable geodesic-distance-based k-nearest neighbor classification using efficient kernels. The experimental results suggest that the proposed image descriptor yields high classification accuracy compared to a variety of widely used image-level descriptors.
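
    A minimal sketch of the region covariance building block named above (the feature choice is ours; the paper's descriptor uses richer features and a geodesic mean over regions):

```python
import numpy as np
from scipy.linalg import logm
from scipy.ndimage import sobel

def region_covariance(patch):
    """Covariance of per-pixel features (intensity, x/y gradients)."""
    gx, gy = sobel(patch, axis=1), sobel(patch, axis=0)
    feats = np.stack([patch, gx, gy], axis=-1).reshape(-1, 3)
    return np.cov(feats, rowvar=False) + 1e-6 * np.eye(3)  # keep SPD

def log_euclidean_distance(A, B):
    """A geodesic-style distance between SPD descriptors."""
    return np.linalg.norm(logm(A) - logm(B), "fro")
```

    k-nearest-neighbor scoring then amounts to comparing a test image's descriptor against labeled descriptors with this distance.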

  12. Academic Job Placements in Library and Information Science Field: A Case Study Performed on ALISE Web-Based Postings

    ERIC Educational Resources Information Center

    Abouserie, Hossam Eldin Mohamed Refaat

    2010-01-01

    The study investigated and analyzed the state of academic web-based job announcements in Library and Information Science Field. The purpose of study was to get in depth understanding about main characteristics and trends of academic job market in Library and Information science field. The study focused on web-based version announcement as it was…

  13. Evidence-based medicine and the development of medical libraries in China.

    PubMed

    Huang, Michael Bailou; Cheng, Aijun; Ma, Lu

    2009-07-01

    This article elaborates on the opportunities and challenges that evidence-based medicine (EBM) has posed to the development of medical libraries and summarizes the research in the field of evidence-based medicine and achievements of EBM practice in Chinese medical libraries. Issues such as building collections of information resources, transformation of information services models, human resources management, and training of medical librarians, clinicians, and EBM users are addressed. In view of problems encountered in EBM research and practice, several suggestions are made about important roles medical libraries can play in the future development of EBM in China.

  14. Geek the Library: A Community Awareness Campaign. A Report to the OCLC Membership

    ERIC Educational Resources Information Center

    Edvardsen, Linn Haugestad, Ed.

    2011-01-01

    Geek the Library, a community awareness campaign designed to highlight the vital role of public libraries and raise awareness about the critical funding issues many libraries face, was developed based on the research findings included in "From Awareness to Funding: A study of library support in America." This study, published by OCLC in…

  15. Enhancing Learning while Creating a Library Presence in Course Management Systems

    ERIC Educational Resources Information Center

    Mairn, Chad

    2010-01-01

    Web 2.0 has made information more accessible and offers opportunities to make library resources more visible. This article presents several strategies for incorporating libraries and library resources into Web sites and course management systems. The tools presented are appropriate for many types of libraries and work with most Web-based systems.…

  16. Library Spaces for 21st-Century Learners: A Planning Guide for Creating New School Library Concepts

    ERIC Educational Resources Information Center

    Sullivan, Margaret

    2013-01-01

    "Library Spaces for 21st-Century Learners: A Planning Guide for Creating New School Library Concepts" focuses on planning contemporary school library spaces with user-based design strategies. The book walks school librarians and administrators through the process of gathering information from students and other stakeholders involved in…

  17. Studying the Value of Library and Information Services: A Taxonomy of Users Assessments.

    ERIC Educational Resources Information Center

    Kantor, Paul B.; Saracevic, Tefko

    1995-01-01

    Describes the development of a taxonomy of the value of library services based on users' assessments from five large research libraries. Highlights include empirical and derived taxonomy, replicability of the study, reasons for using the library, how library services are related to time and money, and a theory of value. (LRW)

  18. Bringing Up Gopher: Access to Local & Remote Electronic Resources for University Library Users.

    ERIC Educational Resources Information Center

    Brown, Melvin Marlo; And Others

    Some of the administrative and organizational issues in creating a gopher, specifically a library gopher for university libraries, are discussed. In 1993 the Electronic Collections Task Force of the New Mexico State University library administration began to develop a library-based gopher system that would enable users to have unlimited access to…

  19. Tapping Teen Talent in Queens: A Library-Based, LSCA-Funded Youth Development Success Story from New York.

    ERIC Educational Resources Information Center

    Williams, Barbara Osborne

    1996-01-01

    Describes a program developed by the Youth Services Division at the Queens Borough Public Library's Central Library to help teenagers maximize growth opportunities, build self-esteem, and see the library as a life resource. Highlights include securing funding through LSCA (Library Services and Construction Act), recruiting participants, and…

  20. Organization Charts in ARL Libraries. SPEC Kit #170.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    This kit is based on a review of the organization charts of 71 member libraries of the Association of Research Libraries (ARL) gathered in January 1991, compared with an earlier SPEC Kit published in 1986, and contains charts of 29 of the libraries. A summary of the chart analyses presents information about the titles of library directors,…

  1. The Impacts of Free Public Internet Access on Public Library Patrons and Communities

    ERIC Educational Resources Information Center

    Bertot, John Carlo; McClure, Charles R.; Jaeger, Paul T.

    2008-01-01

    Public libraries have evolved into a primary source of Internet access in many communities, generating wide-ranging impacts in the communities that public libraries serve. Based on the findings of the 2007 Public Libraries and the Internet study, this article examines the ways in which the Internet access delivered by public libraries affects…

  2. Challenges to Library Materials from Principals in United States Secondary Schools--A "Victory" of Sorts.

    ERIC Educational Resources Information Center

    Hopkins, Dianne McAfee

    1995-01-01

    Examines challenges to school library materials initiated by principals in public middle, junior, and senior high school libraries based on a 1990 survey. A review of literature emphasizing the leadership of principals, their role in school library program development, and the principal and school library censorship is included. (Author/LRW)

  3. Predicting the required number of training samples. [for remotely sensed image data based on covariance matrix estimate quality criterion of normal distribution

    NASA Technical Reports Server (NTRS)

    Kalayeh, H. M.; Landgrebe, D. A.

    1983-01-01

    A criterion which measures the quality of the estimate of the covariance matrix of a multivariate normal distribution is developed. Based on this criterion, the necessary number of training samples is predicted. Experimental results which are used as a guide for determining the number of training samples are included. Previously announced in STAR as N82-28109
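
    In the spirit of the abstract, a small simulation (with our own stand-in error criterion, not the authors') shows how the covariance estimate improves with the number of training samples:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10
A = rng.standard_normal((d, d))
Sigma = A @ A.T + d * np.eye(d)          # true class covariance
for n in (20, 50, 100, 500, 2000):
    X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
    S = np.cov(X, rowvar=False)
    err = np.linalg.norm(S - Sigma) / np.linalg.norm(Sigma)
    print(f"n={n:5d}  relative Frobenius error = {err:.3f}")
```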

  4. School Libraries Empowering Learning: The Australian Landscape.

    ERIC Educational Resources Information Center

    Todd, Ross J.

    2003-01-01

    Describes school libraries in Australia. Highlights include the title of teacher librarian and their education; the history of the role of school libraries in Australian education; empowerment; information skills and benchmarks; national standards for school libraries; information literacy; learning outcomes; evidence-based practice; digital…

  5. Prospects for Public Library Evaluation.

    ERIC Educational Resources Information Center

    Van House, Nancy A.; Childers, Thomas

    1991-01-01

    Discusses methods of evaluation that can be used to measure public library effectiveness, based on a conference sponsored by the Council on Library Resources. Topics discussed include the Public Library Effectiveness Study (PLES), quantitative and qualitative evaluation, using evaluative information for resource acquisition and resource…

  6. Plausible combinations: An improved method to evaluate the covariate structure of Cormack-Jolly-Seber mark-recapture models

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.; McDonald, Trent L.; Amstrup, Steven C.

    2013-01-01

    Mark-recapture models are extensively used in quantitative population ecology, providing estimates of population vital rates, such as survival, that are difficult to obtain using other methods. Vital rates are commonly modeled as functions of explanatory covariates, adding considerable flexibility to mark-recapture models, but also increasing the subjectivity and complexity of the modeling process. Consequently, model selection and the evaluation of covariate structure remain critical aspects of mark-recapture modeling. The difficulties involved in model selection are compounded in Cormack-Jolly-Seber models because they are composed of separate sub-models for survival and recapture probabilities, which are conceptualized independently even though their parameters are not statistically independent. The construction of models as combinations of sub-models, together with multiple potential covariates, can lead to a large model set. Although desirable, estimation of the parameters of all models may not be feasible. Strategies to search a model space and base inference on a subset of all models exist and enjoy widespread use. However, even though the methods used to search a model space can be expected to influence parameter estimation, the assessment of covariate importance, and therefore the ecological interpretation of the modeling results, the performance of these strategies has received limited investigation. We present a new strategy for searching the space of a candidate set of Cormack-Jolly-Seber models and explore its performance relative to existing strategies using computer simulation. The new strategy provides an improved assessment of the importance of covariates and covariate combinations used to model survival and recapture probabilities, while requiring only a modest increase in the number of models on which inference is based in comparison to existing techniques.

  7. The effect of using genealogy-based haplotypes for genomic prediction

    PubMed Central

    2013-01-01

    Background Genomic prediction uses two sources of information: linkage disequilibrium between markers and quantitative trait loci, and additive genetic relationships between individuals. One way to increase the accuracy of genomic prediction is to capture more linkage disequilibrium by regression on haplotypes instead of regression on individual markers. The aim of this study was to investigate the accuracy of genomic prediction using haplotypes based on local genealogy information. Methods A total of 4429 Danish Holstein bulls were genotyped with the 50K SNP chip. Haplotypes were constructed using local genealogical trees. Effects of haplotype covariates were estimated with two types of prediction models: (1) assuming that effects had the same distribution for all haplotype covariates, i.e. the GBLUP method and (2) assuming that a large proportion (π) of the haplotype covariates had zero effect, i.e. a Bayesian mixture method. Results About 7.5 times more covariate effects were estimated when fitting haplotypes based on local genealogical trees compared to fitting individual markers. Genealogy-based haplotype clustering slightly increased the accuracy of genomic prediction and, in some cases, decreased the bias of prediction. With the Bayesian method, accuracy of prediction was less sensitive to parameter π when fitting haplotypes compared to fitting markers. Conclusions Use of haplotypes based on genealogy can slightly increase the accuracy of genomic prediction. Improved methods to cluster the haplotypes constructed from local genealogy could lead to additional gains in accuracy. PMID:23496971

  8. The effect of using genealogy-based haplotypes for genomic prediction.

    PubMed

    Edriss, Vahid; Fernando, Rohan L; Su, Guosheng; Lund, Mogens S; Guldbrandtsen, Bernt

    2013-03-06

    Genomic prediction uses two sources of information: linkage disequilibrium between markers and quantitative trait loci, and additive genetic relationships between individuals. One way to increase the accuracy of genomic prediction is to capture more linkage disequilibrium by regression on haplotypes instead of regression on individual markers. The aim of this study was to investigate the accuracy of genomic prediction using haplotypes based on local genealogy information. A total of 4429 Danish Holstein bulls were genotyped with the 50K SNP chip. Haplotypes were constructed using local genealogical trees. Effects of haplotype covariates were estimated with two types of prediction models: (1) assuming that effects had the same distribution for all haplotype covariates, i.e. the GBLUP method and (2) assuming that a large proportion (π) of the haplotype covariates had zero effect, i.e. a Bayesian mixture method. About 7.5 times more covariate effects were estimated when fitting haplotypes based on local genealogical trees compared to fitting individual markers. Genealogy-based haplotype clustering slightly increased the accuracy of genomic prediction and, in some cases, decreased the bias of prediction. With the Bayesian method, accuracy of prediction was less sensitive to parameter π when fitting haplotypes compared to fitting markers. Use of haplotypes based on genealogy can slightly increase the accuracy of genomic prediction. Improved methods to cluster the haplotypes constructed from local genealogy could lead to additional gains in accuracy.
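
    Prediction model (1) above (the GBLUP method) can be sketched as ridge regression on the haplotype covariates; Z, y, and the shrinkage parameter lam below are illustrative placeholders.

```python
import numpy as np

def gblup_effects(Z, y, lam=1.0):
    """Solve (Z'Z + lam*I) a = Z'(y - ybar) for covariate effects."""
    p = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ (y - y.mean()))

# Genomic prediction for new individuals with covariate matrix Z_new:
#   y_hat = y.mean() + Z_new @ a
# The Bayesian mixture variant (2) instead sets a proportion pi of the
# haplotype effects exactly to zero.
```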

  9. A causal examination of the effects of confounding factors on multimetric indices

    USGS Publications Warehouse

    Schoolmaster, Donald R.; Grace, James B.; Schweiger, E. William; Mitchell, Brian R.; Guntenspergen, Glenn R.

    2013-01-01

    The development of multimetric indices (MMIs) as a means of providing integrative measures of ecosystem condition is becoming widespread. An increasingly recognized problem for the interpretability of MMIs is controlling for the potentially confounding influences of environmental covariates. Most common approaches to handling covariates are based on simple notions of statistical control, leaving the causal implications of covariates and their adjustment unstated. In this paper, we use graphical models to examine some of the potential impacts of environmental covariates on the observed signals between human disturbance and potential response metrics. Using simulations based on various causal networks, we show how environmental covariates can both obscure and exaggerate the effects of human disturbance on individual metrics. We then examine from a causal interpretation standpoint the common practice of adjusting ecological metrics for environmental influences using only the set of sites deemed to be in reference condition. We present and examine the performance of an alternative approach to metric adjustment that uses the whole set of sites and models both environmental and human disturbance effects simultaneously. The findings from our analyses indicate that failing to model and adjust metrics can result in a systematic bias towards those metrics in which environmental covariates function to artificially strengthen the metric–disturbance relationship, resulting in MMIs that do not accurately measure impacts of human disturbance. We also find that a “whole-set modeling approach” requires fewer assumptions and is more efficient with the given information than the more commonly applied “reference-set” approach.

  10. Robust Mean and Covariance Structure Analysis through Iteratively Reweighted Least Squares.

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Bentler, Peter M.

    2000-01-01

    Adapts robust schemes to mean and covariance structures, providing an iteratively reweighted least squares approach to robust structural equation modeling. Each case is weighted according to its distance, based on first and second order moments. Test statistics and standard error estimators are given. (SLD)
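
    A hedged numpy sketch of the iteratively reweighted idea (the weight function and tuning constant are illustrative, not those of Yuan and Bentler):

```python
import numpy as np

def irls_mean_cov(X, c=2.5, n_iter=50):
    """Robust mean/covariance: down-weight cases by Mahalanobis distance."""
    mu, S = X.mean(axis=0), np.cov(X, rowvar=False)
    for _ in range(n_iter):
        R = X - mu
        d = np.sqrt(np.einsum("ij,jk,ik->i", R, np.linalg.inv(S), R))
        w = np.minimum(1.0, c / np.maximum(d, 1e-12))  # Huber-type weights
        mu = (w[:, None] * X).sum(axis=0) / w.sum()
        R = X - mu
        S = (w[:, None] * R).T @ R / w.sum()  # weighted sum of r_i r_i^T
    return mu, S
```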

  11. Determining the oxygen isotope composition of evapotranspiration with eddy covariance

    USDA-ARS?s Scientific Manuscript database

    The oxygen isotope composition of evapotranspiration (dF) represents an important tracer in the study of biosphere-atmosphere interactions, hydrology, paleoclimate, and carbon cycling. Here we demonstrate direct measurement of dF based on eddy covariance (EC) and tunable diode laser (EC-TDL) techni...
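
    The eddy-covariance building block itself is just the covariance of vertical wind and scalar fluctuations over an averaging block; a minimal sketch (series names are placeholders):

```python
import numpy as np

def ec_flux(w, c):
    """Eddy flux = mean(w'c') over the averaging period."""
    return np.mean((w - w.mean()) * (c - c.mean()))

# In outline, the isotope composition of ET then follows from the ratio
# of the fluxes of the heavy and light isotopologues resolved by the TDL.
```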

  12. Structural whole-brain covariance of the anterior and posterior hippocampus: Associations with age and memory.

    PubMed

    Nordin, Kristin; Persson, Jonas; Stening, Eva; Herlitz, Agneta; Larsson, Elna-Marie; Söderlund, Hedvig

    2018-02-01

    The hippocampus (HC) interacts with distributed brain regions to support memory and shows significant volume reductions in aging, but little is known about age effects on hippocampal whole-brain structural covariance. It is also unclear whether the anterior and posterior HC show similar or distinct patterns of whole-brain covariance and to what extent these are related to memory functions organized along the hippocampal longitudinal axis. Using the multivariate approach partial least squares, we assessed structural whole-brain covariance of the HC in addition to regional volume, in young, middle-aged and older adults (n = 221), and assessed associations with episodic and spatial memory. Based on findings of sex differences in both memory and brain aging, we further considered sex as a potential modulating factor of age effects. There were two main covariance patterns: one capturing common anterior and posterior covariance, and one differentiating the two regions by capturing anterior-specific covariance only. These patterns were differentially related to associative memory while unrelated to measures of single-item memory and spatial memory. Although patterns were qualitatively comparable across age groups, participants' expression of both patterns decreased with age, independently of sex. The results suggest that the organization of hippocampal structural whole-brain covariance remains stable across age, but that the integrity of these networks decreases as the brain undergoes age-related alterations. © 2017 Wiley Periodicals, Inc.

  13. Measuring Customer Satisfaction and Quality of Service in Special Libraries.

    ERIC Educational Resources Information Center

    White, Marilyn Domas; Abels, Eileen G.; Nitecki, Danuta

    This project tested the appropriateness of SERVQUAL (i.e., an instrument widely used in the service industry for assessing service quality based on repeated service encounters rather than a particular service encounter) to measure service quality in special libraries and developed a modified version for special libraries. SERVQUAL is based on an…

  14. Public Library Directors: A Career and Managerial Profile.

    ERIC Educational Resources Information Center

    Mech, Terrence

    1989-01-01

    Develops a career and managerial profile of directors of Northeastern medium-sized public libraries based on the responses of 217 directors. The results confirm the existence of many of the gender-based career patterns found among directors of academic and larger public libraries together with an emphasis on internal administrator roles. (16…

  15. Library Information System Time-Sharing (LISTS) Project. Final Report.

    ERIC Educational Resources Information Center

    Black, Donald V.

    The Library Information System Time-Sharing (LISTS) experiment was based on three innovations in data processing technology: (1) the advent of computer time-sharing on third-generation machines, (2) the development of general-purpose file-management software and (3) the introduction of large, library-oriented data bases. The main body of the…

  16. Building to Scale: An Analysis of Web-Based Services in CIC (Big Ten) Libraries.

    ERIC Educational Resources Information Center

    Dewey, Barbara I.

    Advancing library services in large universities requires creative approaches for "building to scale." This is the case for CIC, Committee on Institutional Cooperation (Big Ten), libraries whose home institutions serve thousands of students, faculty, staff, and others. Developing virtual Web-based services is an increasingly viable…

  17. Evidence-Based Library Management: The Leadership Challenge

    ERIC Educational Resources Information Center

    Lakos, Amos

    2007-01-01

    This paper is an extension of the author's earlier work on developing management information services and creating a culture of assessment in libraries. The author will focus observations on the use of data in decision-making in libraries, specifically on the role of leadership in making evidence-based decision a reality, and will review new…

  18. Advancing Your Library's Web-Based Services. ERIC Digest.

    ERIC Educational Resources Information Center

    Feldman, Sari; Strobel, Tracy

    This digest discusses the development of World Wide Web-based services for libraries and provides examples from the Cleveland Public Library (CPL). The first section highlights the importance of developing such services, steps to be followed for a successful project, and the importance of having the goal of replicating and enhancing traditional…

  19. Virtual Reference Services--Down-Under: A Cautionary Tale.

    ERIC Educational Resources Information Center

    Wagner, Gulten S.

    Digital reference services at university libraries in Australia and New Zealand are a recent phenomenon dating back to the late 1990s, following the developments in Web-based online library services. This paper examines the move towards the provision of e-mail reference services based on the study of 16 randomly chosen university libraries in…

  20. InfoQUEST: An Online Catalog for Small Libraries.

    ERIC Educational Resources Information Center

    Campbell, Bonnie

    1984-01-01

    InfoQUEST is a microcomputer-based online public access catalog, designed for the small library handling file sizes up to 25,000 records. Based on the IBM-PC, or compatible machines, the system will accept downloading, in batch mode, of records from the library's file on the UTLAS Catalog Support System. (Author/EJS)

  1. Evidence-Based Practice and School Libraries: Interconnections of Evidence, Advocacy, and Actions

    ERIC Educational Resources Information Center

    Todd, Ross J.

    2015-01-01

    This author states that a professional focus on evidence based practice (EBP) for school libraries emerged from the International Association of School Librarianship conference when he presented the concept. He challenged the school library profession to actively engage in professional and reflective practices that chart, measure, document, and…

  2. ORACLE INEQUALITIES FOR THE LASSO IN THE COX MODEL

    PubMed Central

    Huang, Jian; Sun, Tingni; Ying, Zhiliang; Yu, Yi; Zhang, Cun-Hui

    2013-01-01

    We study the absolute penalized maximum partial likelihood estimator in sparse, high-dimensional Cox proportional hazards regression models where the number of time-dependent covariates can be larger than the sample size. We establish oracle inequalities based on natural extensions of the compatibility and cone invertibility factors of the Hessian matrix at the true regression coefficients. Similar results based on an extension of the restricted eigenvalue can also be proved by our method. However, the presented oracle inequalities are sharper since the compatibility and cone invertibility factors are always greater than the corresponding restricted eigenvalue. In the Cox regression model, the Hessian matrix is based on time-dependent covariates in censored risk sets, so that the compatibility and cone invertibility factors, and the restricted eigenvalue as well, are random variables even when they are evaluated for the Hessian at the true regression coefficients. Under mild conditions, we prove that these quantities are bounded from below by positive constants for time-dependent covariates, including cases where the number of covariates is of greater order than the sample size. Consequently, the compatibility and cone invertibility factors can be treated as positive constants in our oracle inequalities. PMID:24086091

  3. ORACLE INEQUALITIES FOR THE LASSO IN THE COX MODEL.

    PubMed

    Huang, Jian; Sun, Tingni; Ying, Zhiliang; Yu, Yi; Zhang, Cun-Hui

    2013-06-01

    We study the absolute penalized maximum partial likelihood estimator in sparse, high-dimensional Cox proportional hazards regression models where the number of time-dependent covariates can be larger than the sample size. We establish oracle inequalities based on natural extensions of the compatibility and cone invertibility factors of the Hessian matrix at the true regression coefficients. Similar results based on an extension of the restricted eigenvalue can also be proved by our method. However, the presented oracle inequalities are sharper since the compatibility and cone invertibility factors are always greater than the corresponding restricted eigenvalue. In the Cox regression model, the Hessian matrix is based on time-dependent covariates in censored risk sets, so that the compatibility and cone invertibility factors, and the restricted eigenvalue as well, are random variables even when they are evaluated for the Hessian at the true regression coefficients. Under mild conditions, we prove that these quantities are bounded from below by positive constants for time-dependent covariates, including cases where the number of covariates is of greater order than the sample size. Consequently, the compatibility and cone invertibility factors can be treated as positive constants in our oracle inequalities.
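
    The estimator studied above, written out in our notation: the lasso-penalized negative log partial likelihood for the Cox model with possibly time-dependent covariates Z_i(t),

```latex
\[
\hat{\beta} \;=\; \arg\min_{\beta}\;
\left\{ -\frac{1}{n}\sum_{i=1}^{n}\delta_{i}\!\left[
\beta^{\top} Z_{i}(T_{i})
\;-\;\log\!\sum_{j:\,T_{j}\ge T_{i}} \exp\!\big(\beta^{\top} Z_{j}(T_{i})\big)
\right]\;+\;\lambda \lVert \beta \rVert_{1} \right\},
\]
```

    where delta_i is the censoring indicator and the inner sum runs over the risk set at time T_i; the oracle inequalities bound the estimation error of this beta-hat in terms of lambda and the compatibility factor of the Hessian of the objective.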

  4. The Marine Corps University Library.

    ERIC Educational Resources Information Center

    Ramkey, Carol E.

    2001-01-01

    Describes the Marine Corps University (Virginia) Library's collections and reserves. States that the library's resources focus on military doctrine, history, arts, and sciences, and that they include Web- and CD-ROM-based specialized military databases. Describes the library's mission to serve the university community and Marine Corps patrons…

  5. Managerial Accounting in Library and Information Science Education.

    ERIC Educational Resources Information Center

    Hayes, Robert M.

    1983-01-01

    Explores meaning of managerial accounting in libraries and discusses instructional program for students of library and information science based on experience in School of Library and Information Science at University of California, Los Angeles. Management decision making (budgeting, performance evaluation, overhead, resource allocation,…

  6. Virtual Library: Providing Accessible Online Resources.

    ERIC Educational Resources Information Center

    Kelly, Rob

    2001-01-01

    Describes e-global library, a virtual library based on the Jones International University's library that organizes Internet resources to make them more accessible to students at all skill levels. Highlights include online tutorials; research guides; financial aid and career development information; and possible partnerships with other digital…

  7. Spatio-Temporal Regression Based Clustering of Precipitation Extremes in a Presence of Systematically Missing Covariates

    NASA Astrophysics Data System (ADS)

    Kaiser, Olga; Martius, Olivia; Horenko, Illia

    2017-04-01

    Regression-based Generalized Pareto Distribution (GPD) models are often used to describe the dynamics of hydrological threshold excesses, relying on the explicit availability of all of the relevant covariates. However, in real applications the complete set of relevant covariates might not be available. In this context, it was shown that under weak assumptions the influence coming from systematically missing covariates can be reflected by nonstationary and nonhomogeneous dynamics. We present a data-driven, semiparametric, and adaptive approach for spatio-temporal regression-based clustering of threshold excesses in the presence of systematically missing covariates. The nonstationary and nonhomogeneous behavior of threshold excesses is described by a set of locally stationary GPD models, where the parameters are expressed as regression models, and a non-parametric spatio-temporal hidden switching process. Exploiting the nonparametric Finite Element time-series analysis Methodology (FEM) with Bounded Variation of the model parameters (BV) for resolving the spatio-temporal switching process, the approach goes beyond the strong a priori assumptions made in standard latent class models like mixture models and hidden Markov models. Additionally, the presented FEM-BV-GPD provides a pragmatic description of the corresponding spatial dependence structure by grouping together all locations that exhibit similar behavior of the switching process. The performance of the framework is demonstrated on daily accumulated precipitation series over 17 different locations in Switzerland from 1981 to 2013, showing that the introduced approach allows for a better description of the historical data.
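
    One local building block of this approach can be sketched as maximum-likelihood GPD regression with a covariate-dependent scale (a sketch under our own parameterization; the FEM-BV switching between local models is not reproduced here):

```python
import numpy as np
from scipy.stats import genpareto
from scipy.optimize import minimize

def fit_gpd_regression(excesses, U):
    """Fit GPD(shape xi, scale exp(U @ b)) to threshold excesses."""
    q = U.shape[1]

    def nll(theta):
        xi, b = theta[0], theta[1:]
        scale = np.exp(U @ b)
        return -genpareto.logpdf(excesses, xi, loc=0.0, scale=scale).sum()

    res = minimize(nll, x0=np.r_[0.1, np.zeros(q)], method="Nelder-Mead")
    return res.x  # [xi, b_1, ..., b_q]

# Example with synthetic data:
# rng = np.random.default_rng(1)
# U = rng.standard_normal((500, 2))
# y = genpareto.rvs(0.1, scale=np.exp(U @ [0.3, -0.2]), random_state=rng)
# print(fit_gpd_regression(y, U))
```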

  8. On the Lighthill relationship and sound generation from isotropic turbulence

    NASA Technical Reports Server (NTRS)

    Zhou, YE; Praskovsky, Alexander; Oncley, Steven

    1994-01-01

    In 1952, Lighthill developed a theory for determining the sound generated by a turbulent motion of a fluid. With some statistical assumptions, Proudman applied this theory to estimate the acoustic power of isotropic turbulence. Recently, Lighthill established a simple relationship that relates the fourth-order retarded time and space covariance of his stress tensor to the corresponding second-order covariance and the turbulent flatness factor, without making statistical assumptions for a homogeneous turbulence. Lilley revisited Proudman's work and applied the Lighthill relationship to evaluate directly the radiated acoustic power from isotropic turbulence. After choosing the time separation dependence in the two-point velocity time and space covariance based on the insights gained from direct numerical simulations, Lilley concluded that the Proudman constant is determined by the turbulent flatness factor and the second-order spatial velocity covariance. In order to estimate the Proudman constant at high Reynolds numbers, we analyzed a unique data set of measurements in a large wind tunnel and the atmospheric surface layer that covers a range of Taylor-microscale Reynolds numbers 2.0 × 10³ ≤ R_λ ≤ 12.7 × 10³. Our measurements demonstrate that the Lighthill relationship is a good approximation, providing additional support to Lilley's approach. The flatness factor is found to lie between 2.7 and 3.3, and the second-order spatial velocity covariance is obtained. Based on these experimental data, the Proudman constant is estimated to be 0.68-3.68.
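
    The scaling behind the Proudman constant, in one common convention (our notation, hedged; u' is an rms turbulence velocity, epsilon the dissipation rate, and c_0 the ambient sound speed):

```latex
\[
P_{\mathrm{ac}} \;=\; \alpha\,\rho_{0}\,\varepsilon\,M_{t}^{5},
\qquad M_{t}=\frac{u'}{c_{0}},
\]
```

    so a measured range alpha ≈ 0.68-3.68 fixes the radiated acoustic power per unit volume once epsilon and the turbulent Mach number M_t are known.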

  9. A Study with Computer-Based Circulation Data of the Non-Use and Use of a Large Academic Library. Final Report.

    ERIC Educational Resources Information Center

    Lubans, John, Jr.; And Others

    Computer-based circulation systems, it is widely believed, can be utilized to provide data for library use studies. The study described in this report involves using such a data base to analyze aspects of library use and non-use and types of users. Another major objective of this research was the testing of machine-readable circulation data…

  10. The hospital library online--a point of service for consumers and hospital staff: a case study.

    PubMed Central

    Cain, N J; Fuller, H J

    1999-01-01

    The Health Library at Stanford University is described in the context of electronic information services provided to Stanford University Medical Center, the local community, and Internet users in general. The evolution from CD-ROM-based services to Web-based services, and from in-library services to networked resources, is described. Electronic services have expanded the mission of The Health Library to include national and international users and the provision of unique services and collections. PMID:10427424

  11. School Library Media Specialists Inform Technology Preparation of Library Science Students: An Evidence-Based Discussion

    ERIC Educational Resources Information Center

    Snyder, Donna L.; Miller, Andrea L.

    2009-01-01

    What is the relative importance of current and emerging technologies in school library media programs? In order to answer this question, in Fall 2007 the authors administered a survey to 1,053 school library media specialists (SLMSs) throughout the state of Pennsylvania. As a part of the MSLS degree with Library Science K-12 certification, Clarion…

  12. Library and Information Science Journal Prestige as Assessed by Library and Information Science Faculty

    ERIC Educational Resources Information Center

    Manzari, Laura

    2013-01-01

    This prestige study surveyed full-time faculty of American Library Association (ALA)-accredited programs in library and information studies regarding library and information science (LIS) journals. Faculty were asked to rate a list of eighty-nine LIS journals on a scale from 1 to 5 based on each journal's importance to their research and teaching.…

  13. Academic Libraries in Ohio. Progress through Collaboration, Storage, and Technology. Report of the Library Study Committee.

    ERIC Educational Resources Information Center

    Ohio Board of Regents, Columbus.

    Based on a study of the need for, and alternatives to, significant expansion of space for state college and university libraries, this report discusses the resultant recommendations, which address both the long term and the immediate space needs of the state's academic libraries. Following a description of the role of academic libraries and a…

  14. Developing a general practice library: a collaborative project between a GP and librarian.

    PubMed

    Pearson, D; Rossall, H

    2001-12-01

    The authors report on a self-completed questionnaire study from a North Yorkshire-based general practice regarding the information needs of its clinicians. The work was carried out with a particular focus on the practice library, and the findings identified that a new approach to maintaining and developing the library was needed. The literature regarding the information needs of primary care clinicians and the role of practice libraries is reviewed and compared with the needs of the clinicians at the practice. Discussion follows on how a collaborative project was set up between the practice and a librarian based at the local NHS Trust library in order to improve the existing practice library. Difficulties encountered and issues unique to the project are explored, including training implications presented by the implementation of electronic resources. Marketing activities implemented are discussed, how the library will operate in its new capacity, and how ongoing support and maintenance of the library will be carried out. It is concluded that although scepticism still exists regarding librarian involvement in practice libraries, collaboration between clinicians and librarians is an effective approach to the successful development and maintenance of a practice library, and recommendations are therefore made for similar collaborative work.

  15. Health sciences libraries building survey, 1999-2009.

    PubMed

    Ludwig, Logan

    2010-04-01

    A survey was conducted of health sciences libraries to obtain information about newer buildings, additions, remodeling, and renovations. An online survey was developed, and announcements of survey availability posted to three major email discussion lists: Medical Library Association (MLA), Association of Academic Health Sciences Libraries (AAHSL), and MEDLIB-L. Previous discussions of library building projects on email discussion lists, a literature review, personal communications, and the author's consulting experiences identified additional projects. Seventy-eight health sciences library building projects at seventy-three institutions are reported. Twenty-two are newer facilities built within the last ten years; two are space expansions; forty-five are renovation projects; and nine are combinations of new and renovated space. Six institutions report multiple or ongoing renovation projects during the last ten years. The survey results confirm a continuing migration from print-based to digitally based collections and reveal trends in library space design. Some health sciences libraries report loss of space as they move toward creating space for "community" building. Libraries are becoming more proactive in using or retooling space for concentration, collaboration, contemplation, communication, and socialization. All are moving toward a clearer operational vision of the library as the institution's information nexus and not merely as a physical location with print collections.

  16. MT71x: Multi-Temperature Library Based on ENDF/B-VII.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conlin, Jeremy Lloyd; Parsons, Donald Kent; Gray, Mark Girard

    The Nuclear Data Team has released a multitemperature transport library, MT71x, based upon ENDF/B-VII.1 with a few modifications as well as additional evaluations, for a total of 427 isotope tables. The library was processed using NJOY2012.39 into 23 temperatures. MT71x consists of two sub-libraries: MT71xMG for multigroup energy representation data and MT71xCE for continuous energy representation data. These sub-libraries are suitable for deterministic transport and Monte Carlo transport applications, respectively. The SZAs used are the same for the two sub-libraries; that is, the same SZA can be used for both libraries. This makes comparisons between the two libraries, and between deterministic and Monte Carlo codes, straightforward. Both the multigroup energy and continuous energy libraries were verified and validated with our checking codes checkmg and checkace (multigroup and continuous energy, respectively). An expanded suite of tests was then used for additional verification, and the library was finally validated against an extensive suite of critical benchmark models. We feel that this library is suitable for all calculations and is particularly useful for calculations sensitive to temperature effects.

  17. Hospital library foreign language labs: the experiences of two hospital libraries.

    PubMed

    Whelan, Julia S; Schneider, Elizabeth; Woodworth, Karl; Markwell, Linda Garr

    2006-01-01

    Increasingly, hospital-based physicians, residents, and medical students are welcoming into their care foreign-born patients who do not speak English. Most hospitals today have an Interpretive Services Department, but many of the physicians, residents, and medical students want to become more proficient in the most frequently spoken foreign languages in their respective locales. To help recruit and retain a diverse workforce, some hospitals sponsor English programs for staff. The Treadwell Library at Massachusetts General Hospital in Boston, Massachusetts, and the Grady Branch Library at Grady Memorial Hospital in Atlanta, Georgia, have developed special collections and hospital library-based language laboratories in order to meet this need.

  18. Generation of an arrayed CRISPR-Cas9 library targeting epigenetic regulators: from high-content screens to in vivo assays

    PubMed Central

    2017-01-01

    The CRISPR-Cas9 system has revolutionized genome engineering, allowing precise modification of DNA in various organisms. The most popular method for conducting CRISPR-based functional screens involves the use of pooled lentiviral libraries in selection screens coupled with next-generation sequencing. Screens employing genome-scale pooled small guide RNA (sgRNA) libraries are demanding, particularly when complex assays are used. Furthermore, pooled libraries are not suitable for microscopy-based high-content screens or for systematic interrogation of protein function. To overcome these limitations and exploit CRISPR-based technologies to comprehensively investigate epigenetic mechanisms, we have generated a focused sgRNA library targeting 450 epigenetic regulators with multiple sgRNAs in human cells. The lentiviral library is available both in an arrayed and pooled format and allows temporally-controlled induction of gene knock-out. Characterization of the library showed high editing activity of most sgRNAs and efficient knock-out at the protein level in polyclonal populations. The sgRNA library can be used for both selection and high-content screens, as well as for targeted investigation of selected proteins without requiring isolation of knock-out clones. Using a variety of functional assays we show that the library is suitable for both in vitro and in vivo applications, representing a unique resource to study epigenetic mechanisms in physiological and pathological conditions. PMID:29327641

  19. A broad-group cross-section library based on ENDF/B-VII.0 for fast neutron dosimetry Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpan, F.A.

    2011-07-01

    A new ENDF/B-VII.0-based coupled 44-neutron, 20-gamma-ray-group cross-section library was developed to investigate the latest evaluated nuclear data file (ENDF), in comparison to ENDF/B-VI.3 used in BUGLE-96, as well as to generate an objective-specific library. The objectives selected for this work consisted of dosimetry calculations for in-vessel and ex-vessel reactor locations, iron atom displacement calculations for reactor internals and the pressure vessel, and the 58Ni(n,γ) calculation that is important for gas generation in the baffle plate. The new library was generated based on the contribution and point-wise cross-section-driven (CPXSD) methodology and was applied to one of the most widely used benchmarks, the Oak Ridge National Laboratory Pool Critical Assembly benchmark problem. In addition, an ENDF/B-VII.0-based coupled 47-neutron, 20-gamma-ray-group cross-section library was generated, and it and BUGLE-96 were used with both SNLRML and IRDF dosimetry cross sections to compute reaction rates. All reaction rates computed by the multigroup libraries are within ±20% of the measurement data and meet the U.S. Nuclear Regulatory Commission acceptance criterion for reactor vessel neutron exposure evaluations specified in Regulatory Guide 1.190. (authors)

  20. Homonuclear long-range correlation spectra from HMBC experiments by covariance processing.

    PubMed

    Schoefberger, Wolfgang; Smrecki, Vilko; Vikić-Topić, Drazen; Müller, Norbert

    2007-07-01

    We present a new application of covariance nuclear magnetic resonance processing based on 1H--13C-HMBC experiments which provides an effective way for establishing indirect 1H--1H and 13C--13C nuclear spin connectivity at natural isotope abundance. The method, which identifies correlated spin networks in terms of covariance between one-dimensional traces from a single decoupled HMBC experiment, derives 13C--13C as well as 1H--1H spin connectivity maps from the two-dimensional frequency domain heteronuclear long-range correlation data matrix. The potential and limitations of this novel covariance NMR application are demonstrated on two compounds: eugenyl-beta-D-glucopyranoside and an emodin-derivative. Copyright (c) 2007 John Wiley & Sons, Ltd.
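
    The core covariance processing step described above can be sketched in a few lines: treating the real 2D HMBC matrix as a set of 1D traces, the homonuclear maps follow from a matrix product and a matrix square root. The snippet below is a minimal illustration, not the authors' code; `spectrum` is a hypothetical preprocessed (real-valued, phased) 2D array.

    # Minimal covariance NMR sketch: rows of `spectrum` index the indirect
    # (13C) dimension, columns the direct (1H) dimension.
    import numpy as np
    from scipy.linalg import sqrtm

    def covariance_nmr(spectrum, mode="indirect"):
        # "indirect" -> 13C-13C map (covariance across the 1H axis);
        # "direct"   -> 1H-1H map   (covariance across the 13C axis).
        if mode == "indirect":
            cov = spectrum @ spectrum.T
        else:
            cov = spectrum.T @ spectrum
        # The matrix square root restores spectrum-like (rather than
        # squared) intensities; some implementations also mean-center first.
        return np.real(sqrtm(cov))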

  1. Optical Implementation of the Optimal Universal and Phase-Covariant Quantum Cloning Machines

    NASA Astrophysics Data System (ADS)

    Ye, Liu; Song, Xue-Ke; Yang, Jie; Yang, Qun; Ma, Yang-Cheng

    Quantum cloning relates to the security of quantum computation and quantum communication. In this paper, we first propose a feasible unified scheme to implement optimal 1 → 2 universal, 1 → 2 asymmetric and symmetric phase-covariant cloning, and 1 → 2 economical phase-covariant quantum cloning machines using only a beam splitter. A 1 → 3 economical phase-covariant quantum cloning machine can then be realized by adding another beam splitter in the context of linear optics. The scheme is based on the interference of two photons on a beam splitter with different splitting ratios for the vertical and horizontal polarization components. It is shown that, under certain conditions, the scheme is feasible with current experimental technology.

  2. Marginalized zero-inflated Poisson models with missing covariates.

    PubMed

    Benecha, Habtamu K; Preisser, John S; Divaris, Kimon; Herring, Amy H; Das, Kalyan

    2018-05-11

    Unlike zero-inflated Poisson regression, marginalized zero-inflated Poisson (MZIP) models for counts with excess zeros provide estimates with direct interpretations for the overall effects of covariates on the marginal mean. In the presence of missing covariates, MZIP and many other count data models are ordinarily fitted using complete case analysis methods due to lack of appropriate statistical methods and software. This article presents an estimation method for MZIP models with missing covariates. The method, which is applicable to other missing data problems, is illustrated and compared with complete case analysis by using simulations and dental data on the caries preventive effects of a school-based fluoride mouthrinse program. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
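
    For orientation, the base MZIP likelihood (before any treatment of missing covariates, which is the article's actual contribution) can be written down directly: the marginal mean nu = exp(X·alpha) and zero-inflation probability psi = logit^{-1}(Z·gamma) imply a Poisson mean mu = nu/(1 - psi) for the susceptible class. The complete-case sketch below, with simulated stand-in data, only illustrates that parameterization.

    # Complete-case MZIP fit via the marginalized ZIP log-likelihood.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import gammaln, expit

    def neg_loglik(theta, y, X, Z):
        p = X.shape[1]
        alpha, gamma = theta[:p], theta[p:]
        nu = np.exp(X @ alpha)            # marginal mean (log link)
        psi = expit(Z @ gamma)            # zero-inflation probability (logit link)
        mu = nu / (1.0 - psi)             # implied Poisson mean of susceptibles
        zero = y == 0
        ll = np.empty_like(nu)
        ll[zero] = np.log(psi[zero] + (1 - psi[zero]) * np.exp(-mu[zero]))
        ll[~zero] = (np.log1p(-psi[~zero]) - mu[~zero]
                     + y[~zero] * np.log(mu[~zero]) - gammaln(y[~zero] + 1))
        return -ll.sum()

    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    Z = X.copy()
    y = rng.poisson(1.5, n) * (rng.random(n) > 0.3)   # crude zero-inflated counts
    fit = minimize(neg_loglik, np.zeros(4), args=(y, X, Z), method="BFGS")
    print(fit.x[:2])   # alpha: directly interpretable marginal-mean effects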

  3. Addressing missing covariates for the regression analysis of competing risks: Prognostic modelling for triaging patients diagnosed with prostate cancer.

    PubMed

    Escarela, Gabriel; Ruiz-de-Chavez, Juan; Castillo-Morales, Alberto

    2016-08-01

    Competing risks arise in medical research when subjects are exposed to various types or causes of death. Data from large cohort studies usually exhibit subsets of regressors that are missing for some study subjects. Furthermore, such studies often give rise to censored data. In this article, a carefully formulated likelihood-based technique for the regression analysis of right-censored competing risks data, when two of the covariates are discrete and partially missing, is developed. The approach envisaged here comprises two models: one describes the covariate effects on both long-term incidence and conditional latencies for each cause of death, whilst the other deals with the observation process by which the covariates are missing. The former is formulated with a well-established mixture model and the latter is characterised by copula-based bivariate probability functions for both the missing covariates and the missing data mechanism. The resulting formulation lends itself to the empirical assessment of non-ignorability by performing sensitivity analyses using models with and without a non-ignorable component. The methods are illustrated on a 20-year follow-up of a prostate cancer cohort from the National Cancer Institute's Surveillance, Epidemiology, and End Results program. © The Author(s) 2013.

  4. Independent component analysis of DTI data reveals white matter covariances in Alzheimer's disease

    NASA Astrophysics Data System (ADS)

    Ouyang, Xin; Sun, Xiaoyu; Guo, Ting; Sun, Qiaoyue; Chen, Kewei; Yao, Li; Wu, Xia; Guo, Xiaojuan

    2014-03-01

    Alzheimer's disease (AD) is a progressive neurodegenerative disease whose clinical symptoms include the continuous deterioration of cognitive and memory functions. Multiple diffusion tensor imaging (DTI) indices such as fractional anisotropy (FA) and mean diffusivity (MD) can successfully characterize the white matter damage in AD patients. However, most studies have focused on univariate measures (voxel-based analysis) to examine the differences between AD patients and normal controls (NCs). In this investigation, we applied multivariate independent component analysis (ICA) to investigate white matter covariances based on FA measurements from DTI data in 35 AD patients and 45 NCs from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. We found that six independent components (ICs) showed significant FA reductions in white matter covariances in AD compared with NC, including the genu and splenium of the corpus callosum (IC-1 and IC-2), the middle temporal gyrus (IC-3), and sub-gyral white matter of the frontal lobe (IC-4 and IC-5) and parietal lobe (IC-6). Our findings revealed covariant white matter loss in AD patients and suggest that the unsupervised, data-driven ICA method is effective for exploring changes of FA in AD. This study assists us in understanding the mechanism of covariant white matter reductions in the development of AD.
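
    Minus the imaging specifics, the analysis pipeline described reduces to a spatial ICA on stacked FA maps followed by a group comparison of subject loadings. The sketch below uses random stand-in data of the same dimensions (35 AD, 45 NC) and scikit-learn's FastICA; it illustrates the workflow only, since component maps from random data are meaningless.

    # Spatial ICA on vectorized FA maps, then per-component group tests.
    import numpy as np
    from sklearn.decomposition import FastICA
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(6)
    n_ad, n_nc, n_vox = 35, 45, 5000
    fa = rng.normal(0.45, 0.05, size=(n_ad + n_nc, n_vox))   # stand-in FA data
    labels = np.array([1] * n_ad + [0] * n_nc)               # 1 = AD, 0 = NC

    ica = FastICA(n_components=6, random_state=0)
    loadings = ica.fit_transform(fa)       # (subjects x components) weights
    maps = ica.components_                 # (components x voxels) spatial ICs
    for k in range(loadings.shape[1]):
        t, p = ttest_ind(loadings[labels == 1, k], loadings[labels == 0, k])
        print(f"IC-{k + 1}: t = {t:.2f}, p = {p:.3f}")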

  5. Automation at the University of Georgia Libraries.

    ERIC Educational Resources Information Center

    Christoffersson, John G.

    1979-01-01

    Presents the design procedures, bibliographic system, file structures, acquisitions and circulation systems, functional implementation, and future development of the Managing Resources for University Libraries (MARVEL) data base at the University of Georgia Libraries, which accepts MARC input from OCLC and Library of Congress (LC) MARC tapes. (CWM)

  6. C3 Domain Analysis, Lessons Learned

    DTIC Science & Technology

    1993-09-30

    organize the domain. This approach is heavily based on the principles of library science and is geared toward a reuse effort with a large library-like...method adapts many principles from library science to the organization and implementation of a reuse library.

  7. Humanities Programming in Public Libraries: The Connecticut Perspective.

    ERIC Educational Resources Information Center

    Rader, Barbara A.

    1990-01-01

    Describes how public libraries can plan, fund, and implement scholar-led, library-based, humanities book discussion programs using the example of the Southern Connecticut Library Council. Key steps in planning, funding, targeting the audience, selecting topics and books, obtaining community support, recruiting scholars, marketing, administration,…

  8. Ignoring the Evidence: Another Decade of Decline for School Libraries

    ERIC Educational Resources Information Center

    Oberg, Dianne

    2012-01-01

    Four decades of research indicates that well-staffed, well-stocked, and well-used school libraries are correlated with increases in student achievement. Well-staffed school libraries have qualified teacher-librarians with qualifications in librarianship, digital technologies, and inquiry-based pedagogies. Well-stocked school libraries include…

  9. Partners for Success: A School Library Advocacy Training Program for Principals.

    ERIC Educational Resources Information Center

    Kachel, Debra E.

    2003-01-01

    Describes a program developed to help school principals understand the importance of school library media specialists based on "Information Power." Explains modules on academic achievement and school libraries, information literacy and academic standards, library collections and flexible access, and revitalization and evaluation of…

  10. Detection of fungal damaged popcorn using image property covariance features

    USDA-ARS?s Scientific Manuscript database

    Covariance-matrix-based features were applied to the detection of popcorn infected by a fungus that causes a symptom called “blue-eye.” This infection of popcorn kernels causes economic losses because of their poor appearance and the frequently disagreeable flavor of the popped kernels. Images of ker...

  11. The value of Web-based library services at Cedars-Sinai Health System.

    PubMed

    Halub, L P

    1999-07-01

    Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services.

  12. The value of Web-based library services at Cedars-Sinai Health System.

    PubMed Central

    Halub, L P

    1999-01-01

    Cedars-Sinai Medical Library/Information Center has maintained Web-based services since 1995 on the Cedars-Sinai Health System network. In that time, the librarians have found the provision of Web-based services to be a very worthwhile endeavor. Library users value the services that they access from their desktops because the services save time. They also appreciate being able to access services at their convenience, without restriction by the library's hours of operation. The library values its Web site because it brings increased visibility within the health system, and it enables library staff to expand services when budget restrictions have forced reduced hours of operation. In creating and maintaining the information center Web site, the librarians have learned the following lessons: consider the design carefully; offer what services you can, but weigh the advantages of providing the services against the time required to maintain them; make the content as accessible as possible; promote your Web site; and make friends in other departments, especially information services. PMID:10427423

  13. Experimental OAI-Based Digital Library Systems

    NASA Technical Reports Server (NTRS)

    Nelson, Michael L. (Editor); Maly, Kurt (Editor); Zubair, Mohammad (Editor); Rusch-Feja, Diann (Editor)

    2002-01-01

    The objective of the Open Archives Initiative (OAI) is to develop a simple, lightweight framework to facilitate the discovery of content in distributed archives (http://www.openarchives.org). The focus of the workshop held at the 5th European Conference on Research and Advanced Technology for Digital Libraries (ECDL 2001) was to bring together researchers in the area of digital libraries who are building OAI-based systems, so that they could share their experiences, the problems they are facing, and the approaches they are taking to address them. The workshop consisted of invited talks from well-established researchers working on building OAI-based digital library systems, along with short paper presentations.

  14. Sparse Covariance Matrix Estimation With Eigenvalue Constraints

    PubMed Central

    LIU, Han; WANG, Lie; ZHAO, Tuo

    2014-01-01

    We propose a new approach for estimating high-dimensional, positive-definite covariance matrices. Our method extends the generalized thresholding operator by adding an explicit eigenvalue constraint. The estimated covariance matrix simultaneously achieves sparsity and positive definiteness. The estimator is rate optimal in the minimax sense and we develop an efficient iterative soft-thresholding and projection algorithm based on the alternating direction method of multipliers. Empirically, we conduct thorough numerical experiments on simulated datasets as well as real data examples to illustrate the usefulness of our method. Supplementary materials for the article are available online. PMID:25620866
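
    The two ingredients of the estimator, sparsity via generalized thresholding and positive definiteness via an eigenvalue floor, are easy to see in isolation. The loop below simply alternates the two operations as an illustration; the paper solves the joint problem properly with an ADMM scheme, which this sketch does not reproduce.

    # Alternate soft-thresholding (sparsity) with eigenvalue projection
    # (positive definiteness), starting from a sample covariance S.
    import numpy as np

    def soft_threshold(A, lam):
        out = np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)
        np.fill_diagonal(out, np.diag(A))   # leave the diagonal unpenalized
        return out

    def sparse_pd_covariance(S, lam=0.1, eps=1e-3, iters=50):
        sigma = S.copy()
        for _ in range(iters):
            sigma = soft_threshold(sigma, lam)
            w, V = np.linalg.eigh(sigma)
            sigma = (V * np.maximum(w, eps)) @ V.T   # floor the eigenvalues at eps
        return sigma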

  15. Quasilocal conserved charges in a covariant theory of gravity.

    PubMed

    Kim, Wontae; Kulkarni, Shailesh; Yi, Sang-Heon

    2013-08-23

    In any generally covariant theory of gravity, we show the relationship between the linearized asymptotically conserved current and its nonlinear completion through the identically conserved current. Our formulation for conserved charges is based on the Lagrangian description, and so completely covariant. By using this result, we give a prescription to define quasilocal conserved charges in any higher derivative gravity. As applications of our approach, we demonstrate the angular momentum invariance along the radial direction of black holes and reproduce more efficiently the linearized potential on the asymptotic anti-de Sitter space.

  16. Competencies for Librarians. Proceedings from the 1985 Spring Meeting of the Nebraska Library Association: College and University Section (Omaha, Nebraska, April 26, 1985).

    ERIC Educational Resources Information Center

    Krzywkowski, Valerie I., Ed.

    Based on the conference theme, "Competencies for Librarians," papers presented at the 1985 meeting of the association include: (1) "Planning a Library-Based Public Access Microcomputer Facility" (Suzanne Kehm); (2) "Processing and Circulating Microcomputer Software in the Academic Library: A Sharing Session" (Jan…

  17. IDENTIFICATION OF ACTIVE BACTERIAL COMMUNITIES IN A MODEL DRINKING WATER BIOFILM SYSTEM USING 16S RRNA-BASED CLONE LIBRARIES

    EPA Science Inventory

    Recent phylogenetic studies have used DNA as the target molecule for the development of environmental 16S rDNA clone libraries. As DNA may persist in the environment, DNA-based libraries cannot be used to identify metabolically active bacteria in water systems. In this study, a...

  18. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries.

    PubMed

    Wu, Jemma X; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P

    2016-07-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  19. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries*

    PubMed Central

    Wu, Jemma X.; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P.

    2016-01-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. PMID:27161445

  20. The Weapons Laboratory Technical Library: Automating with ’Stilas’

    DTIC Science & Technology

    1990-03-01

    version of the system to LC in October 1988. A small business specializing in library automation, SIRSI was founded in 1979 by library and...computer specialists, and has a strong reputation based upon the success of their UNIX-based Unicorn Collection Management System. SIRSI offers a complete...system based on the Unicorn and BRS/Search systems. The contracted STILAS package includes UNISYS hardware, software written in the C language

  1. Specifications of a Mechanized Center for Information Services for a Public Library Reference Center. Final Report. Part 1, Preliminary Specification: Mechanized Information Services in Public Library Reference Centers.

    ERIC Educational Resources Information Center

    California Univ., Los Angeles. Inst. of Library Research.

    This document presents preliminary specifications for a library-based Center for Information Services (CIS). Four sets of issues are covered: (1) data base inventory, providing a listing of magnetic tape data bases now available from national sources or soon to be so; (2) administrative issues, including the organization of the CIS within the…

  2. CIELO Collaboration Summary Results: International Evaluations of Neutron Reactions on Uranium, Plutonium, Iron, Oxygen and Hydrogen

    DOE PAGES

    Chadwick, M. B.; Capote, R.; Trkov, A.; ...

    2018-03-07

    The CIELO collaboration has studied neutron cross sections on nuclides that significantly impact criticality in nuclear technologies - 235,238U, 239Pu, 56Fe, 16O and 1H - with the aim of improving the accuracy of the data and resolving previous discrepancies in our understanding. This multi-laboratory pilot project, coordinated via the OECD/NEA Working Party on Evaluation Cooperation (WPEC) Subgroup 40 with support also from the IAEA, has motivated experimental and theoretical work and led to suites of new evaluated libraries that accurately reflect measured data and also perform well in integral simulations of criticality. This report summarizes our results on cross sections and preliminary work on covariances, and outlines plans for the next phase of this collaboration.

  3. CIELO Collaboration Summary Results: International Evaluations of Neutron Reactions on Uranium, Plutonium, Iron, Oxygen and Hydrogen

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chadwick, M. B.; Capote, R.; Trkov, A.

    The CIELO collaboration has studied neutron cross sections on nuclides that significantly impact criticality in nuclear technologies - 235,238U, 239Pu, 56Fe, 16O and 1H - with the aim of improving the accuracy of the data and resolving previous discrepancies in our understanding. This multi-laboratory pilot project, coordinated via the OECD/NEA Working Party on Evaluation Cooperation (WPEC) Subgroup 40 with support also from the IAEA, has motivated experimental and theoretical work and led to suites of new evaluated libraries that accurately reflect measured data and also perform well in integral simulations of criticality. This report summarizes our results on cross sections and preliminary work on covariances, and outlines plans for the next phase of this collaboration.

  4. FORTRAN multitasking library for use on the ELXSI 6400 and the CRAY XMP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montry, G.R.

    1985-07-16

    A library of FORTRAN-based multitasking routines has been written for the ELXSI 6400 and the CRAY XMP. This library is designed to make multitasking codes easily transportable between machines with different hardware configurations. The library provides enhanced error checking and diagnostics over vendor-supplied multitasking intrinsics. The library also contains multitasking control structures not normally supplied by the vendor.

  5. So Different yet so Similar: A Tale of Two Academic Libraries in the United States and Poland

    ERIC Educational Resources Information Center

    Cyran, Katarzyna

    2017-01-01

    The aim of this article is to present a comparison between two academic libraries: Indiana University Libraries in the United States and the Library of Pope John Paul II State School of Higher Education in Biala Podlaska, Poland. This comparison is based on data that each of these libraries published in statistical reports. Much of the information…

  6. Ethnographic Methods in Academic Libraries: A Review

    ERIC Educational Resources Information Center

    Ramsden, Bryony

    2016-01-01

    Research in academic libraries has recently seen an increase in the use of ethnographic-based methods to collect data. Primarily used to learn about library users and their interaction with spaces and resources, the methods are proving particularly useful to academic libraries. The data ethnographic methods retrieve is rich, context specific, and…

  7. Library Automation with Workstations: Using Apple Macintoshes in a Special Library.

    ERIC Educational Resources Information Center

    Valauskas, Edward J.

    1988-01-01

    Describes an automation project at the Merriam Center Library in which Apple Macintoshes were introduced as library workstations. The implementation process, staff involvement and reactions, and current configurations and applications of the workstations are discussed. An appendix provides a comparison of current microcomputer based workstations…

  8. Diversifying Fiscal Support by Pricing Public Library Services: A Policy Impact Analysis.

    ERIC Educational Resources Information Center

    Hicks, Donald A.

    1980-01-01

    Addresses the possibility of diversifying the resource base of public libraries dependent on property taxes for funding through the setting of fees for library services, and reports on a pricing policy adopted by the Dallas Public Library System. Twenty-seven references are cited. (FM)

  9. What's Wrong with Library Organization? Factors Leading to Restructuring in Research Libraries.

    ERIC Educational Resources Information Center

    Hewitt, Joe A.

    1997-01-01

    Discusses the need for organizational change in academic research libraries, based on a study of a small group of libraries that had experienced varying degrees of restructuring and had analyzed factors that energized change. Highlights include organizational flexibility, external or client-centered orientation, staff empowerment, and improving…

  10. Business as Usual: Amazon.com and the Academic Library

    ERIC Educational Resources Information Center

    Van Ullen, Mary K.; Germain, Carol Anne

    2002-01-01

    In 1999, Steve Coffman proposed that libraries form a single interlibrary loan based entity patterned after Amazon.com. This study examined the suitability of Amazon.com's Web interface and record enhancements for academic libraries. Amazon.com could not deliver circulating monographs in the University at Albany Libraries' collection quickly…

  11. Changing State Digital Libraries

    ERIC Educational Resources Information Center

    Pappas, Marjorie L.

    2006-01-01

    Research has shown that state virtual or digital libraries are evolving into websites that are loaded with free resources, subscription databases, and instructional tools. In this article, the author explores these evolving libraries based on the following questions: (1) How user-friendly are the state digital libraries?; (2) How do state digital…

  12. How large are the consequences of covariate imbalance in cluster randomized trials: a simulation study with a continuous outcome and a binary covariate at the cluster level.

    PubMed

    Moerbeek, Mirjam; van Schie, Sander

    2016-07-11

    The number of clusters in a cluster randomized trial is often low. It is therefore likely that random assignment of clusters to treatment conditions results in covariate imbalance. There are no studies that quantify the consequences of covariate imbalance in cluster randomized trials on parameter and standard error bias and on power to detect treatment effects. The consequences of covariate imbalance in unadjusted and adjusted linear mixed models are investigated by means of a simulation study. The factors in this study are the degree of imbalance, the covariate effect size, the cluster size and the intraclass correlation coefficient. The covariate is binary and measured at the cluster level; the outcome is continuous and measured at the individual level. The results show that covariate imbalance results in negligible parameter bias and small standard error bias in adjusted linear mixed models. Ignoring the possibility of covariate imbalance while calculating the sample size at the cluster level may result in a loss in power of at most 25% in the adjusted linear mixed model. The results are more severe for the unadjusted linear mixed model: parameter biases up to 100% and standard error biases up to 200% may be observed. Power levels based on the unadjusted linear mixed model are often too low. The consequences are most severe for large clusters and/or small intraclass correlation coefficients, since then the required number of clusters to achieve a desired power level is smallest. The possibility of covariate imbalance should be taken into account while calculating the sample size of a cluster randomized trial. Otherwise, more sophisticated methods to randomize clusters to treatments should be used, such as stratification or balance algorithms. All relevant covariates should be carefully identified, actually measured, and included in the statistical model to avoid severe levels of parameter and standard error bias and insufficient power levels.
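
    A single replicate of this kind of simulation is straightforward to set up. The sketch below generates clusters with an imbalanced binary cluster-level covariate and compares the treatment effect from unadjusted and adjusted mixed models using statsmodels; all parameter values are illustrative, not those of the study.

    # One replicate: imbalanced cluster-level covariate in a cluster RCT.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n_clusters, m, icc = 20, 30, 0.1
    treat = np.repeat([0, 1], n_clusters // 2)
    p_x = np.where(treat == 0, 0.3, 0.7)              # covariate imbalance across arms
    x = rng.binomial(1, p_x)
    u = rng.normal(0, np.sqrt(icc), n_clusters)       # random cluster effects
    rows = []
    for c in range(n_clusters):
        e = rng.normal(0, np.sqrt(1 - icc), m)
        y = 0.5 * treat[c] + 0.4 * x[c] + u[c] + e    # true treatment effect = 0.5
        rows.append(pd.DataFrame({"y": y, "treat": treat[c], "x": x[c], "cluster": c}))
    data = pd.concat(rows, ignore_index=True)

    unadj = smf.mixedlm("y ~ treat", data, groups=data["cluster"]).fit()
    adj = smf.mixedlm("y ~ treat + x", data, groups=data["cluster"]).fit()
    print(unadj.params["treat"], adj.params["treat"])  # compare bias across models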

  13. a New ENDF/B-VII.0 Based Multigroup Cross-Section Library for Reactor Dosimetry

    NASA Astrophysics Data System (ADS)

    Alpan, F. A.; Anderson, S. L.

    2009-08-01

    The latest of the ENDF/B libraries, ENDF/B-VII.0, was released in December 2006. In this paper, the ENDF/B-VII.0 evaluations were used to generate a new coupled neutron/gamma multigroup library having the same group structure as VITAMIN-B6, i.e., a 199-neutron, 42-gamma group library. The new library was generated utilizing NJOY99.259 for pre-processing and the AMPX modules for post-processing of cross sections. An ENDF/B-VI.3-based VITAMIN-B6-like library was also generated. The fine-group libraries and the ENDF/B-VI.3-based 47-neutron, 20-gamma group BUGLE-96 library were used with the discrete ordinates code DORT to obtain a three-dimensional synthesized flux distribution from r, r-θ, and r-z models for a standard Westinghouse 3-loop design reactor. Reaction rates were calculated for ex-vessel neutron dosimetry containing 63Cu(n,α)60Co, 46Ti(n,p)46Sc, 54Fe(n,p)54Mn, 58Ni(n,p)58Co, 238U(n,f)137Cs, 237Np(n,f)137Cs, and 59Co(n,γ)60Co (bare and cadmium-covered) reactions. Results were compared to measurements. In comparing the 199-neutron, 42-gamma group ENDF/B-VI.3 and ENDF/B-VII.0 libraries, it was observed that the ENDF/B-VI.3-based library results were in better agreement with measurements. There is a maximum difference of 7% (for the 63Cu(n,α)60Co reaction rate calculation) between ENDF/B-VI.3 and ENDF/B-VII.0. Differences between the ENDF/B-VI.3 and ENDF/B-VII.0 libraries are due to the 16O, 1H, 90Zr, 91Zr, 92Zr, 238U, and 239Pu evaluations. Both ENDF/B-VI.3 and ENDF/B-VII.0 library calculated reaction rates are within 20% of measurement and meet the criterion specified in the U.S. Nuclear Regulatory Commission Regulatory Guide 1.190, "Calculational and Dosimetry Methods for Determining Pressure Vessel Neutron Fluence."

  14. A study of library use in problem-based and traditional medical curricula.

    PubMed

    Marshall, J G; Fitzgerald, D; Busby, L; Heaton, G

    1993-07-01

    A key question for librarians and medical educators who are planning for curriculum change is whether students and faculty in problem-based learning (PBL) programs use the library and its resources differently than do participants in traditional programs. During 1991, this research question was explored at three medical schools in the province of Ontario, Canada. At the time of the study, McMaster University medical school was totally problem-based, the University of Western Ontario had one PBL day each week for first-year medical students, and the University of Toronto, although planning for medical curriculum change, had not yet initiated PBL. Data collected in the study suggest that more medical students in the problem-based curriculum than in the more traditional programs use the library and that, when the PBL students use the library, they do so more frequently, for longer periods of time, and as a source of a greater proportion of their study materials. PBL students also use the library more than their counterparts as a place to study and meet other students. Students in the problem-based curriculum use the following resources more extensively: end-user MEDLINE searching, library journals, reserve or short-term loan materials, photocopy services, and audiovisual materials. PBL students also report purchasing more textbooks. In contrast to the differences found among medical students, however, patterns of library and resource use by medical faculty at the three schools were quite similar.

  15. Academic health sciences library Website navigation: an analysis of forty-one Websites and their navigation tools.

    PubMed

    Brower, Stewart M

    2004-10-01

    The analysis included forty-one academic health sciences library (HSL) Websites as captured in the first two weeks of January 2001. Home pages and persistent navigational tools (PNTs) were analyzed for layout, technology, and links, and other general site metrics were taken. Websites were selected based on rank in the National Network of Libraries of Medicine, with regional and resource libraries given preference on the basis that these libraries are recognized as leaders in their regions and would be the most reasonable source of standards for best practice. A three-page evaluation tool was developed based on previous similar studies. All forty-one sites were evaluated in four specific areas: library general information, Website aids and tools, library services, and electronic resources. Metrics taken for electronic resources included orientation of bibliographic databases alphabetically by title or by subject area and with links to specifically named databases. Based on the results, a formula for determining obligatory links was developed, listing items that should appear on all academic HSL Web home pages and PNTs. These obligatory links demonstrate a series of best practices that may be followed in the design and construction of academic HSL Websites.

  16. Eigenvalue-eigenvector decomposition (EED) analysis of dissimilarity and covariance matrix obtained from total synchronous fluorescence spectral (TSFS) data sets of herbal preparations: Optimizing the classification approach

    NASA Astrophysics Data System (ADS)

    Tarai, Madhumita; Kumar, Keshav; Divya, O.; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar

    2017-09-01

    The present work compares dissimilarity-based and covariance-based unsupervised chemometric classification approaches, using total synchronous fluorescence spectroscopy data sets acquired for cumin and non-cumin based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that can explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix.

  17. Generating Nonnormal Multivariate Data Using Copulas: Applications to SEM.

    PubMed

    Mair, Patrick; Satorra, Albert; Bentler, Peter M

    2012-07-01

    This article develops a procedure based on copulas to simulate multivariate nonnormal data that satisfy a prespecified variance-covariance matrix. The covariance matrix used can comply with a specific moment structure form (e.g., a factor analysis or a general structural equation model). Thus, the method is particularly useful for Monte Carlo evaluation of structural equation models within the context of nonnormal data. The new procedure for nonnormal data simulation is theoretically described and also implemented in the widely used R environment. The quality of the method is assessed by Monte Carlo simulations. A 1-sample test on the observed covariance matrix based on the copula methodology is proposed. This new test for evaluating the quality of a simulation is defined through a particular structural model specification and is robust against normality violations.
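
    The copula mechanics at the heart of the procedure are compact: draw correlated normals, map them to uniforms through the normal CDF, and apply the inverse target marginals. The sketch below shows only that base step; the article's method additionally calibrates the intermediate correlation matrix so that the final data match the prespecified covariance, which is omitted here.

    # Gaussian-copula draw with arbitrary marginals.
    import numpy as np
    from scipy import stats

    def gaussian_copula_sample(R, marginals, n, rng=None):
        # R: latent correlation matrix; marginals: frozen scipy.stats distributions.
        rng = rng or np.random.default_rng()
        z = rng.multivariate_normal(np.zeros(len(marginals)), R, size=n)
        u = stats.norm.cdf(z)                  # uniforms carrying the dependence
        return np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(marginals)])

    R = np.array([[1.0, 0.5], [0.5, 1.0]])
    data = gaussian_copula_sample(R, [stats.expon(), stats.t(df=5)], n=1000)
    print(np.corrcoef(data.T))                 # close to, but not exactly, R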

  18. Context and learning: the value and limits of library-based information literacy teaching.

    PubMed

    Eyre, Jason

    2012-12-01

    This month's regular feature will discuss some of the implications for library-based information literacy teaching that have emerged from a HEA-funded research project conducted at De Montfort University. It is argued that information literacy teaching as it has evolved in a university setting, while having a greater degree of relevance and value than ever before, nevertheless has inherent limits when it comes to its transferability beyond the academy and into a workplace setting. © 2012 The authors. Health Information and Libraries Journal © 2012 Health Libraries Group.

  19. 3D Riesz-wavelet based Covariance descriptors for texture classification of lung nodule tissue in CT.

    PubMed

    Cirujeda, Pol; Muller, Henning; Rubin, Daniel; Aguilera, Todd A; Loo, Billy W; Diehn, Maximilian; Binefa, Xavier; Depeursinge, Adrien

    2015-01-01

    In this paper we present a novel technique for characterizing and classifying 3D textured volumes belonging to different lung tissue types in 3D CT images. We build a volume-based 3D descriptor, robust to changes of size, rigid spatial transformations and texture variability, thanks to the integration of Riesz-wavelet features within a Covariance-based descriptor formulation. 3D Riesz features characterize the morphology of tissue density due to their response to changes in intensity in CT images. These features are encoded in a Covariance-based descriptor formulation: this provides a compact and flexible representation thanks to the use of feature variations rather than dense features themselves and adds robustness to spatial changes. Furthermore, the particular symmetric positive definite matrix form of these descriptors causes them to lie in a Riemannian manifold. Thus, descriptors can be compared with analytical measures, and accurate techniques from machine learning and clustering can be adapted to their spatial domain. Additionally we present a classification model following a "Bag of Covariance Descriptors" paradigm in order to distinguish three different nodule tissue types in CT: solid, ground-glass opacity, and healthy lung. The method is evaluated on an acquired dataset of 95 patients with ground truth manually delineated in 3D by radiation oncology specialists, and quantitative sensitivity and specificity values are presented.
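
    Stripped of the Riesz-wavelet feature extraction, a covariance descriptor is just the covariance of per-voxel feature vectors, and the Riemannian geometry enters through how two such SPD matrices are compared. The sketch below uses a log-Euclidean distance as one standard choice; the feature extraction and the "Bag of Covariance Descriptors" classifier are not reproduced.

    # Covariance descriptors and a log-Euclidean distance on the SPD manifold.
    import numpy as np

    def covariance_descriptor(features):
        # features: (n_voxels, n_features) array of per-voxel feature vectors.
        c = np.cov(features, rowvar=False)
        return c + 1e-6 * np.eye(c.shape[0])   # keep it safely positive definite

    def spd_log(C):
        w, V = np.linalg.eigh(C)
        return (V * np.log(w)) @ V.T           # matrix logarithm via eigendecomposition

    def log_euclidean_distance(C1, C2):
        return np.linalg.norm(spd_log(C1) - spd_log(C2), ord="fro")

    rng = np.random.default_rng(7)
    d1 = covariance_descriptor(rng.normal(size=(500, 6)))
    d2 = covariance_descriptor(rng.normal(size=(500, 6)) * 1.5)
    print(log_euclidean_distance(d1, d2))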

  20. Quiet in the Library: An Evidence-Based Approach to Improving the Student Experience

    ERIC Educational Resources Information Center

    McCaffrey, Ciara; Breen, Michelle

    2016-01-01

    This article deals with the management of noise in an academic library by outlining an evidence-based approach taken over seven years by the University of Limerick in the Republic of Ireland. The objective of this study was to measure the impact on library users of noise management interventions implemented from 2007 to 2014 through retrospective…

  1. Building an Online Library for Interpretation Training: Explorations into an Effective Blended-Learning Mode

    ERIC Educational Resources Information Center

    Chan, Clara Ho-yan

    2014-01-01

    This paper reports on a blended-learning project that aims to develop a web-based library of interpreting practice resources built on the course management system Blackboard for Hong Kong interpretation students to practise outside the classroom. It also evaluates the library's effectiveness for learning, based on a case study that uses it to…

  2. "For Poor Nations a Library Service Is Vital": Establishing a National Public Library Service in Tanzania in the 1960s

    ERIC Educational Resources Information Center

    Olden, Anthony

    2005-01-01

    The Tanganyika Library Service (TLS) was the national public library service set up in Tanzania, East Africa, in the 1960s. By the end of the decade, it was generally regarded as a model of Western-style public library development in Africa. This is an account of its establishment and early years based on accessible documentary sources in Tanzania…

  3. Health sciences libraries building survey, 1999–2009

    PubMed Central

    Ludwig, Logan

    2010-01-01

    Objective: A survey was conducted of health sciences libraries to obtain information about newer buildings, additions, remodeling, and renovations. Method: An online survey was developed, and announcements of survey availability posted to three major email discussion lists: Medical Library Association (MLA), Association of Academic Health Sciences Libraries (AAHSL), and MEDLIB-L. Previous discussions of library building projects on email discussion lists, a literature review, personal communications, and the author's consulting experiences identified additional projects. Results: Seventy-eight health sciences library building projects at seventy-three institutions are reported. Twenty-two are newer facilities built within the last ten years; two are space expansions; forty-five are renovation projects; and nine are combinations of new and renovated space. Six institutions report multiple or ongoing renovation projects during the last ten years. Conclusions: The survey results confirm a continuing migration from print-based to digitally based collections and reveal trends in library space design. Some health sciences libraries report loss of space as they move toward creating space for “community” building. Libraries are becoming more proactive in using or retooling space for concentration, collaboration, contemplation, communication, and socialization. All are moving toward a clearer operational vision of the library as the institution's information nexus and not merely as a physical location with print collections. PMID:20428277

  4. Library services for persons with disabilities: twentieth anniversary update.

    PubMed

    Willis, Christine A

    2012-01-01

    In recognition of the twentieth anniversary of the Americans with Disabilities Act (ADA), this survey updates the progress and reflects on the status of academic health sciences library services for people with disabilities since the Nelson study in 1996. The results of this survey extend beyond academic libraries to hospital libraries and include areas where all libraries can improve disability access. Based on a 24% response rate, libraries have addressed accessibility of technology in cost-effective and relatively easy ways. Libraries are reactively rather than proactively making changes to services for persons with disabilities. Copyright © Taylor & Francis Group, LLC

  5. A simple method for estimating frequency response corrections for eddy covariance systems

    Treesearch

    W. J. Massman

    2000-01-01

    A simple analytical formula is developed for estimating the frequency attenuation of eddy covariance fluxes due to sensor response, path-length averaging, sensor separation, signal processing, and flux averaging periods. Although it is an approximation based on flat terrain cospectra, this analytical formula should have broader applicability than just flat-terrain...
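
    The quantity being approximated is the ratio of the transfer-function-weighted cospectral integral to the unweighted one. A numerical version is shown below with generic first-order transfer functions and a Kaimal-like cospectral shape as a stand-in; the specific analytical formula and cospectra of the paper are not reproduced.

    # Numerical flux-attenuation / correction factor for an eddy covariance system.
    import numpy as np
    from scipy.integrate import trapezoid

    f = np.logspace(-4, 2, 4000)                 # frequency grid (Hz)
    co = f / (1 + 1.9 * f) ** (7 / 3)            # hypothetical flat-terrain cospectrum

    def first_order(tau):                        # first-order response, time constant tau
        return 1.0 / (1.0 + (2 * np.pi * f * tau) ** 2)

    H = first_order(0.1) * first_order(0.05)     # e.g. sensor response x path averaging
    attenuation = trapezoid(H * co, f) / trapezoid(co, f)
    print(1.0 / attenuation)                     # multiply the measured flux by this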

  6. The Impact of Conditional Scores on the Performance of DETECT.

    ERIC Educational Resources Information Center

    Zhang, Yanwei Oliver; Yu, Feng; Nandakumar, Ratna

    DETECT is a nonparametric, conditional covariance-based procedure to identify dimensional structure and the degree of multidimensionality of test data. The ability composite or conditional score used to estimate conditional covariance plays a significant role in the performance of DETECT. The number correct score of all items in the test (T) and…

  7. Generating Nonnormal Multivariate Data Using Copulas: Applications to SEM

    ERIC Educational Resources Information Center

    Mair, Patrick; Satorra, Albert; Bentler, Peter M.

    2012-01-01

    This article develops a procedure based on copulas to simulate multivariate nonnormal data that satisfy a prespecified variance-covariance matrix. The covariance matrix used can comply with a specific moment structure form (e.g., a factor analysis or a general structural equation model). Thus, the method is particularly useful for Monte Carlo…

  8. Performance Appraisal in Research Libraries. SPEC Kit 140.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    This kit and flyer produced by the Systems and Procedures Exchange Center of the Association of Research Libraries provides documents submitted by 14 universities that are used in the performance evaluation of professional library staff. Commentary based on a thorough review of documents submitted by 60 libraries includes an overview of the…

  9. Incorporating Library School Interns on Academic Library Subject Teams

    ERIC Educational Resources Information Center

    Sargent, Aloha R.; Becker, Bernd W.; Klingberg, Susan

    2011-01-01

    This case study analyzes the use of library school interns on subject-based teams for the social sciences, humanities, and sciences in the San Jose State University Library. Interns worked closely with team librarians on reference, collection development/management, and instruction activities. In a structured focus group, interns reported that the…

  10. Automating Media Centers and Small Libraries: A Microcomputer-Based Approach.

    ERIC Educational Resources Information Center

    Meghabghab, Dania Bilal

    Although the general automation process can be applied to most libraries, small libraries and media centers require a customized approach. Using a systematic approach, this guide covers each step and aspect of automation in a small library setting, and combines the principles of automation with field- tested activities. After discussing needs…

  11. An Inventory of Library Services and Resources of the State of Washington, 1965.

    ERIC Educational Resources Information Center

    Bevis, L. Dorothy

    This survey of current library resources and services in Washington is based on questionnaires; visits to public, university, college and community college libraries in the state; and statistics from state and national governmental sources. The inventories of public and academic libraries include discussions of standards applicable to the…

  12. Integrated Library Systems in Canadian Public, Academic and Special Libraries: The Sixth Annual Survey.

    ERIC Educational Resources Information Center

    Merilees, Bobbie

    1992-01-01

    Reports results of a survey of vendors of large and microcomputer-based integrated library systems. Data presented on Canadian installations include total systems installed, comparisons with earlier years, market segments, and installations by type of library (excluding school). International sales and automation requirements for music are…

  13. Total Library Computerization, Version 2: A DOS-Based Program from On Point, Inc., for Managing Small to Midsized Libraries.

    ERIC Educational Resources Information Center

    Combs, Joseph, Jr.

    1995-01-01

    Reviews the Total Library Computerization program, which can be used to manage small to midsized libraries. Discusses costs; operating system requirements; security features; user-interface styles; and system modules including online cataloging, circulation, serials control, acquisitions, authorities control, and interlibrary loan. (Author/JMV)

  14. Prospecting for New Collaborations: Mining Syllabi for Library Service Opportunities

    ERIC Educational Resources Information Center

    Williams, Lisa M.; Cody, Sue Ann; Parnell, Jerry

    2004-01-01

    Online course syllabi provide a convenient source of information about library use. This article discusses the strategies used to retrieve syllabi, analyze library use, and develop new opportunities to collaborate with faculty. A new Web-based service was developed to pull course- and library-related materials into a convenient package.

  15. Library Performance Measurement in the UK and Ireland

    ERIC Educational Resources Information Center

    Stanley, Tracey; Killick, Selena

    2009-01-01

    This survey was a joint initiative between the Society of College, National and University Libraries (SCONUL) and the Association of Research Libraries (ARL), which since 2004 has sponsored a program to assist libraries with the assessment of the services that they offer their users and the processes that support those services. It was based on an ARL…

  16. Telecommuting for Original Cataloging at the Michigan State University Libraries.

    ERIC Educational Resources Information Center

    Black, Leah; Hyslop, Colleen

    1995-01-01

    Working conditions in library technical services departments can be a problem for catalogers in need of a quiet work environment. Based on a successful program for indexers at the National Agriculture Library, a proposal for an experimental telecommuting program for original cataloging at the Michigan State University Libraries was developed and…

  17. The Management of the Scientific Information Environment: The Role of the Research Library Web Site.

    ERIC Educational Resources Information Center

    Arte, Assunta

    2001-01-01

    Describes the experiences of the Italian National Research Council Library staff in the successful development and implementation of its Web site. Discusses electronic information sources that interface with the Web site; library services; technical infrastructure; and the choice of a Web-based library management system. (Author/LRW)

  18. The University Library: A Study of Services Offered the Blind.

    ERIC Educational Resources Information Center

    Parkin, Derral

    A survey based on the American Library Association's "Standards for Library Services for the Blind and Visually Handicapped" (1966) was sent to 65 four-year universities in Utah, Idaho, Wyoming, Colorado, New Mexico, Nevada, Montana, and Arizona. Libraries were asked how many blind patrons they had and what services and facilities were…

  19. Digital Ethnography: Library Web Page Redesign among Digital Natives

    ERIC Educational Resources Information Center

    Klare, Diane; Hobbs, Kendall

    2011-01-01

    Presented with an opportunity to improve Wesleyan University's dated library home page, a team of librarians employed ethnographic techniques to explore how its users interacted with Wesleyan's current library home page and web pages in general. Based on the data that emerged, a group of library staff and members of the campus' information…

  20. Analyses of the 1981-82 Illinois Public Library Statistics.

    ERIC Educational Resources Information Center

    Wallace, Danny P.

    Using data provided by the annual reports of Illinois public libraries and by the Illinois state library, this publication is a companion to the November 1982 issue of "Illinois Libraries," which enumerated the 16 data elements upon which the analyses are based. Three additional types of information are provided for each of six…

  1. Small Libraries Online: Automating Circulation and Public Access Catalogs. Revised and Updated.

    ERIC Educational Resources Information Center

    Peterson, Christine

    This manual provides information to help libraries in Texas considering an automation project, with special emphasis on smaller libraries. The solutions discussed are microcomputer-based. The manual begins with a discussion of how to prepare for the automation of a library, including planning, approval, collection decisions, policy, and staffing.…

  2. Colorado Academic Library Master Plan, Spring 1982.

    ERIC Educational Resources Information Center

    Breivik, Patricia Senn; And Others

    Based on a need to assess current library strengths and weaknesses and to project potential library roles in supporting higher education, this master plan makes a series of recommendations to Colorado's academic libraries. It is noted that the plan was endorsed by both the Colorado Commission on Higher Education and the Colorado State Department…

  3. Current and future resources for functional metagenomics.

    PubMed

    Lam, Kathy N; Cheng, Jiujun; Engel, Katja; Neufeld, Josh D; Charles, Trevor C

    2015-01-01

    Functional metagenomics is a powerful experimental approach for studying gene function, starting from the extracted DNA of mixed microbial populations. A functional approach relies on the construction and screening of metagenomic libraries: physical libraries that contain DNA cloned from environmental metagenomes. The information obtained from functional metagenomics can help in future annotations of gene function and serve as a complement to sequence-based metagenomics. In this Perspective, we begin by summarizing the technical challenges of constructing metagenomic libraries and emphasize their value as resources. We then discuss libraries constructed using the popular cloning vector, pCC1FOS, and highlight the strengths and shortcomings of this system, alongside possible strategies to maximize existing pCC1FOS-based libraries by screening in diverse hosts. Finally, we discuss the known bias of libraries constructed from human gut and marine water samples, present results that suggest bias may also occur for soil libraries, and consider factors that bias metagenomic libraries in general. We anticipate that discussion of current resources and limitations will advance tools and technologies for functional metagenomics research.

  4. Filter Tuning Using the Chi-Squared Statistic

    NASA Technical Reports Server (NTRS)

    Lilly-Salkowski, Tyler B.

    2017-01-01

    This paper examines the use of the chi-squared statistic as a means of evaluating filter performance. The goal of the process is to characterize filter performance in the metric of covariance realism. The chi-squared statistic is calculated to determine the realism of a covariance based on the prediction accuracy and the covariance values at a given point in time. Once calculated, it is the distribution of this statistic that provides insight into the accuracy of the covariance. The process of tuning an Extended Kalman Filter (EKF) for Aqua and Aura support is described, including examination of the measurement errors of available observation types and methods of dealing with potentially volatile atmospheric drag modeling. Predictive accuracy and the distribution of the chi-squared statistic, calculated from EKF solutions, are assessed.
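
    The statistic itself is a Mahalanobis-type quadratic form: for prediction error e and filter covariance P, epsilon = e' P^{-1} e, which should follow a chi-squared distribution with dim(e) degrees of freedom when the covariance is realistic. The sketch below checks this on synthetic, self-consistent errors; it illustrates the realism test, not the EKF tuning itself.

    # Chi-squared covariance-realism check on synthetic filter output.
    import numpy as np
    from scipy import stats

    def chi_square_stats(errors, covariances):
        return np.array([e @ np.linalg.solve(P, e)
                         for e, P in zip(errors, covariances)])

    rng = np.random.default_rng(2)
    d, n = 3, 2000
    Ps = np.array([np.diag(rng.uniform(0.5, 2.0, d)) for _ in range(n)])
    es = np.array([rng.multivariate_normal(np.zeros(d), P) for P in Ps])
    s = chi_square_stats(es, Ps)
    print(s.mean(), stats.chi2(df=d).mean())   # both ~ d when covariances are realistic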

  5. A Wavelet Based Suboptimal Kalman Filter for Assimilation of Stratospheric Chemical Tracer Observations

    NASA Technical Reports Server (NTRS)

    Auger, Ludovic; Tangborn, Andrew; Atlas, Robert (Technical Monitor)

    2002-01-01

    A suboptimal Kalman filter system which evolves error covariances in terms of a truncated set of wavelet coefficients has been developed for the assimilation of chemical tracer observations of CH4. The truncation is carried out in such a way that the resolution of the error covariance is reduced only in the zonal direction, where gradients are smaller. Assimilation experiments lasting 24 days and using different degrees of truncation were carried out. These truncations reduced the covariance representation by 90%, 97%, and 99%, and the computational cost of covariance propagation by 80%, 93%, and 96%, respectively. The differences in both the error covariance and the tracer field between the truncated and full systems over this period were found not to grow in the first case, and to grow relatively slowly in the latter two cases. The largest errors in the tracer fields were found to occur in regions with the largest zonal gradients in the tracer field.
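
    The storage idea can be demonstrated on a toy covariance: transform, keep a small fraction of wavelet coefficients, and reconstruct. The sketch below truncates 2D coefficients globally for simplicity (the paper truncates only in the zonal direction) and requires the PyWavelets package.

    # Wavelet compression of a smooth toy error covariance matrix.
    import numpy as np
    import pywt

    n = 128
    x = np.linspace(0, 1, n)
    P = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.1)    # smooth toy covariance

    coeffs = pywt.wavedec2(P, "db2", level=3)
    arr, slices = pywt.coeffs_to_array(coeffs)
    keep = np.abs(arr) >= np.quantile(np.abs(arr), 0.97)  # keep top 3% of coefficients
    trunc = pywt.array_to_coeffs(arr * keep, slices, output_format="wavedec2")
    P_hat = pywt.waverec2(trunc, "db2")[:n, :n]
    print(np.linalg.norm(P - P_hat) / np.linalg.norm(P))  # relative reconstruction error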

  6. HIGHLIGHTING DIFFERENCES BETWEEN CONDITIONAL AND UNCONDITIONAL QUANTILE REGRESSION APPROACHES THROUGH AN APPLICATION TO ASSESS MEDICATION ADHERENCE

    PubMed Central

    BORAH, BIJAN J.; BASU, ANIRBAN

    2014-01-01

    The quantile regression (QR) framework provides a pragmatic approach to understanding the differential impacts of covariates along the distribution of an outcome. However, the QR framework that has pervaded the applied economics literature is based on the conditional quantile regression method. It is used to assess the impact of a covariate on a quantile of the outcome conditional on specific values of other covariates. Conditional quantile regression may therefore generate results that are often not generalizable or interpretable in a policy or population context. In contrast, the unconditional quantile regression method provides more interpretable results, as it marginalizes the effect over the distributions of the other covariates in the model. In this paper, the differences between these two regression frameworks are highlighted, both conceptually and econometrically. Additionally, using real-world claims data from a large US health insurer, alternative QR frameworks are implemented to assess the differential impacts of covariates along the distribution of medication adherence among elderly patients with Alzheimer's disease. PMID:23616446
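
    One common way to obtain the unconditional (marginal) estimates contrasted here is regression on the recentered influence function (RIF) of the quantile, in the spirit of Firpo, Fortin and Lemieux: RIF(y; q_tau) = q_tau + (tau - 1{y <= q_tau}) / f_Y(q_tau), followed by OLS on the covariates. The sketch below is a generic illustration on simulated data, not the paper's claims-data analysis.

    # Unconditional quantile regression via RIF-OLS.
    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import gaussian_kde

    def rif_quantile_regression(y, X, tau):
        q = np.quantile(y, tau)
        f_q = gaussian_kde(y)(q)[0]            # density of y at the tau-th quantile
        rif = q + (tau - (y <= q).astype(float)) / f_q
        return sm.OLS(rif, sm.add_constant(X)).fit()

    rng = np.random.default_rng(3)
    X = rng.normal(size=(1000, 2))
    y = 1 + X @ np.array([0.5, -0.3]) + rng.normal(size=1000)
    print(rif_quantile_regression(y, X, tau=0.9).params)  # effects on the marginal 0.9 quantile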

  7. Mixed model approaches for diallel analysis based on a bio-model.

    PubMed

    Zhu, J; Weir, B S

    1996-12-01

    A MINQUE(1) procedure, which is the minimum norm quadratic unbiased estimation (MINQUE) method with all prior values set to 1, is suggested for estimating variance and covariance components in a bio-model for diallel crosses. Unbiasedness and efficiency of estimation were compared for MINQUE(1), restricted maximum likelihood (REML) and MINQUE(θ), which uses the parameter values as the prior values. MINQUE(1) is almost as efficient as MINQUE(θ) for unbiased estimation of genetic variance and covariance components. The bio-model is efficient and robust for estimating variance and covariance components for maternal and paternal effects as well as for nuclear effects. A procedure of adjusted unbiased prediction (AUP) is proposed for predicting random genetic effects in the bio-model. The jack-knife procedure is suggested for estimating the sampling variances of the estimated variance and covariance components and of the predicted genetic effects. Worked examples are given for estimation of variance and covariance components and for prediction of genetic merits.

  8. A Robust Statistics Approach to Minimum Variance Portfolio Optimization

    NASA Astrophysics Data System (ADS)

    Yang, Liusha; Couillet, Romain; McKay, Matthew R.

    2015-12-01

    We study the design of portfolios under a minimum risk criterion. The performance of the optimized portfolio relies on the accuracy of the estimated covariance matrix of the portfolio asset returns. For large portfolios, the number of available market returns is often of similar order to the number of assets, so that the sample covariance matrix performs poorly as a covariance estimator. Additionally, financial market data often contain outliers which, if not correctly handled, may further corrupt the covariance estimation. We address these shortcomings by studying the performance of a hybrid covariance matrix estimator based on Tyler's robust M-estimator and on Ledoit-Wolf's shrinkage estimator while assuming samples with heavy-tailed distribution. Employing recent results from random matrix theory, we develop a consistent estimator of (a scaled version of) the realized portfolio risk, which is minimized by optimizing online the shrinkage intensity. Our portfolio optimization method is shown via simulations to outperform existing methods both for synthetic and real market data.
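
    A simplified sketch of shrinkage-based minimum variance optimization: it uses plain Ledoit-Wolf shrinkage from scikit-learn on heavy-tailed simulated returns, not the paper's hybrid Tyler/Ledoit-Wolf robust estimator or its online intensity tuning; sample sizes are illustrative.

      import numpy as np
      from sklearn.covariance import LedoitWolf

      rng = np.random.default_rng(2)
      n_obs, n_assets = 120, 100   # observations of similar order to assets
      returns = rng.standard_t(df=4, size=(n_obs, n_assets)) * 0.01  # heavy tails

      sigma = LedoitWolf().fit(returns).covariance_
      ones = np.ones(n_assets)
      w = np.linalg.solve(sigma, ones)
      w /= ones @ w                # minimum-variance weights, summing to 1

      print("estimated portfolio risk:", np.sqrt(w @ sigma @ w))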

  9. Using Bayesian regression to test hypotheses about relationships between parameters and covariates in cognitive models.

    PubMed

    Boehm, Udo; Steingroever, Helen; Wagenmakers, Eric-Jan

    2018-06-01

    Quantitative models that represent different cognitive variables in terms of model parameters are an important tool in the advancement of cognitive science. To evaluate such models, their parameters are typically tested for relationships with behavioral and physiological variables that are thought to reflect specific cognitive processes. However, many models do not come equipped with the statistical framework needed to relate model parameters to covariates. Instead, researchers often revert to classifying participants into groups depending on their values on the covariates, and subsequently comparing the estimated model parameters between these groups. Here we develop a comprehensive solution to the covariate problem in the form of a Bayesian regression framework. Our framework can be easily added to existing cognitive models and allows researchers to quantify the evidential support for relationships between covariates and model parameters using Bayes factors. Moreover, we present a simulation study that demonstrates the superiority of the Bayesian regression framework to the conventional classification-based approach.
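
    The flavor of such a test can be conveyed with a crude stand-in, not the authors' hierarchical framework: a BIC-based approximation to the Bayes factor (Wagenmakers, 2007) comparing a regression of estimated parameters on a covariate against an intercept-only model. All data and names below are simulated for illustration.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 80
      covariate = rng.normal(size=n)                    # e.g. a physiological measure
      theta_hat = 0.3 * covariate + rng.normal(size=n)  # estimated model parameters

      m0 = sm.OLS(theta_hat, np.ones(n)).fit()                   # no relationship
      m1 = sm.OLS(theta_hat, sm.add_constant(covariate)).fit()   # linear relationship

      bf10 = np.exp((m0.bic - m1.bic) / 2.0)   # rough evidence for a relationship
      print(f"approximate BF10 = {bf10:.1f}")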

  10. Use of Information and Communication Technology (ICT) in Collection Development in Scientific and Research Institute Libraries in Iran: A study

    NASA Astrophysics Data System (ADS)

    Khademizadeh, Shahnaz

    2012-08-01

    The explosion of information and communication technology (ICT) since the beginning of the 20th century has been rendering manual-based library systems in academic, research, special and public libraries less relevant. This is because using and implementing information and communication technology in the library depends largely on librarians' attitudes toward the current digital age. This study examined the attitudinal correlates of selected scientific and research institute libraries in Iran towards the use and application of ICT in their various libraries. A total of ten libraries from all the forty-nine libraries in Iran formed the study's population. It was observed that 'Internet/intranet etc.' (1046; 67.5%) is the most important source through which users become aware of modern information technologies used in their libraries. The vast majority of respondents (1313; 84.7%) answered that electronic sources make it 'easier' to gather and use information. The results indicate that there is a significant relationship between the e-environment and collection development (χ² = 62.86, p = 0.000). Findings further show that all of the librarians (9; 100%) opined that ICT application affects the collection development of the library. Based on these findings, it is recommended that libraries in developing countries should consider training those librarians who do not have knowledge of ICT in order to remove the fear and anxiety hindering them from developing a good attitude towards the use of ICT in their libraries.

  11. Poisson Statistics of Combinatorial Library Sampling Predict False Discovery Rates of Screening

    PubMed Central

    2017-01-01

    Microfluidic droplet-based screening of DNA-encoded one-bead-one-compound combinatorial libraries is a miniaturized, potentially widely distributable approach to small molecule discovery. In these screens, a microfluidic circuit distributes library beads into droplets of activity assay reagent, photochemically cleaves the compound from the bead, then incubates and sorts the droplets based on assay result for subsequent DNA sequencing-based hit compound structure elucidation. Pilot experimental studies revealed that Poisson statistics describe nearly all aspects of such screens, prompting the development of simulations to understand system behavior. Monte Carlo screening simulation data showed that increasing mean library sampling (ε), mean droplet occupancy, or library hit rate all increase the false discovery rate (FDR). Compounds identified as hits on k > 1 beads (the replicate k class) were much more likely to be authentic hits than singletons (k = 1), in agreement with previous findings. Here, we explain this observation by deriving an equation for authenticity, which reduces to the product of a library sampling bias term (exponential in k) and a sampling saturation term (exponential in ε) setting a threshold that the k-dependent bias must overcome. The equation thus quantitatively describes why each hit structure’s FDR is based on its k class, and further predicts the feasibility of intentionally populating droplets with multiple library beads, assaying the micromixtures for function, and identifying the active members by statistical deconvolution. PMID:28682059
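
    A toy Monte Carlo sketch of the sampling statistics described above; the hit rate, per-bead false-positive rate, and mean sampling depth ε are assumed values, not the paper's parameters. It reproduces the qualitative finding that hits observed on k > 1 beads are far more likely to be authentic than singletons.

      import numpy as np

      rng = np.random.default_rng(4)
      n_compounds, hit_rate, eps, fp_rate = 100_000, 0.001, 2.0, 0.002

      true_hit = rng.random(n_compounds) < hit_rate
      beads = rng.poisson(eps, n_compounds)      # beads sampled per compound
      # Every bead of an active compound sorts as a hit; inactive compounds'
      # beads sort as hits only through assay noise
      hit_beads = np.where(true_hit, beads, rng.binomial(beads, fp_rate))

      for k in range(1, 5):
          sel = hit_beads == k                   # the replicate-k class
          if sel.sum():
              fdr = 1.0 - true_hit[sel].mean()
              print(f"k = {k}: {sel.sum():6d} hit structures, FDR = {fdr:.3f}")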

  12. Effect of condensed tannins on bovine rumen protist diversity based on 18S rRNA gene sequences.

    PubMed

    Tan, Hui Yin; Sieo, Chin Chin; Abdullah, Norhani; Liang, Juan Boo; Huang, Xiao Dan; Ho, Yin Wan

    2013-01-01

    Molecular diversity of protists from bovine rumen fluid incubated with condensed tannins of Leucaena leucocephala hybrid-Rendang at 20 mg/500 mg dry matter (treatment) or without condensed tannins (control) was investigated using an 18S rRNA gene library. Clones from the control library were distributed within nine genera, but clones from the condensed tannin treatment clone library were related to only six genera. Diversity estimators such as abundance-based coverage estimation and Chao1 showed significant differences between the two libraries, although no differences were found based on the Shannon-Weaver index and Libshuff.

  13. Cortical Thinning and Altered Cortico-Cortical Structural Covariance of the Default Mode Network in Patients with Persistent Insomnia Symptoms.

    PubMed

    Suh, Sooyeon; Kim, Hosung; Dang-Vu, Thien Thanh; Joo, Eunyeon; Shin, Chol

    2016-01-01

    Recent studies have suggested that structural abnormalities in insomnia may be linked with alterations in the default-mode network (DMN). This study compared cortical thickness and structural connectivity linked to the DMN in patients with persistent insomnia (PI) and good sleepers (GS). The current study used a clinical subsample from the longitudinal community-based Korean Genome and Epidemiology Study (KoGES). Cortical thickness and structural connectivity linked to the DMN in patients with persistent insomnia symptoms (PIS; n = 57) were compared to good sleepers (GS; n = 40). All participants underwent MRI acquisition. Based on literature review, we selected cortical regions corresponding to the DMN. A seed-based structural covariance analysis measured cortical thickness correlation between each seed region of the DMN and other cortical areas. Association of cortical thickness and covariance with sleep quality and neuropsychological assessments were further assessed. Compared to GS, cortical thinning was found in PIS in the anterior cingulate cortex, precentral cortex, and lateral prefrontal cortex. Decreased structural connectivity between anterior and posterior regions of the DMN was observed in the PIS group. Decreased structural covariance within the DMN was associated with higher PSQI scores. Cortical thinning in the lateral frontal lobe was related to poor performance in executive function in PIS. Disrupted structural covariance network in PIS might reflect malfunctioning of antero-posterior disconnection of the DMN during the wake to sleep transition that is commonly found during normal sleep. The observed structural network alteration may further implicate commonly observed sustained sleep difficulties and cognitive impairment in insomnia.

  14. Assessing covariate balance when using the generalized propensity score with quantitative or continuous exposures.

    PubMed

    Austin, Peter C

    2018-01-01

    Propensity score methods are increasingly being used to estimate the effects of treatments and exposures when using observational data. The propensity score was initially developed for use with binary exposures (e.g., active treatment vs. control). The generalized propensity score is an extension of the propensity score for use with quantitative exposures (e.g., dose or quantity of medication, income, years of education). A crucial component of any propensity score analysis is that of balance assessment. This entails assessing the degree to which conditioning on the propensity score (via matching, weighting, or stratification) has balanced measured baseline covariates between exposure groups. Methods for balance assessment have been well described and are frequently implemented when using the propensity score with binary exposures. However, there is a paucity of information on how to assess baseline covariate balance when using the generalized propensity score. We describe how methods based on the standardized difference can be adapted for use with quantitative exposures when using the generalized propensity score. We also describe a method based on assessing the correlation between the quantitative exposure and each covariate in the sample when weighted using generalized propensity score-based weights. We conducted a series of Monte Carlo simulations to evaluate the performance of these methods. We also compared two different methods of estimating the generalized propensity score: ordinary least squares regression and the covariate balancing propensity score method. We illustrate the application of these methods using data on patients hospitalized with a heart attack, with the quantitative exposure being creatinine level.
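
    A minimal sketch of the weighted-correlation balance diagnostic described above, assuming a normal model for the generalized propensity score (GPS) and stabilized inverse-probability weights; the covariates and exposure are simulated, not the creatinine data from the paper.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(5)
      n = 5000
      x1, x2 = rng.normal(size=n), rng.normal(size=n)    # baseline covariates
      dose = 0.4 * x1 + 0.2 * x2 + rng.normal(size=n)    # quantitative exposure

      # GPS from an OLS model of the exposure; stabilized weights
      X = np.column_stack([np.ones(n), x1, x2])
      beta, *_ = np.linalg.lstsq(X, dose, rcond=None)
      resid = dose - X @ beta
      gps = norm.pdf(dose, loc=X @ beta, scale=resid.std())
      w = norm.pdf(dose, loc=dose.mean(), scale=dose.std()) / gps

      def weighted_corr(a, b, w):
          am, bm = np.average(a, weights=w), np.average(b, weights=w)
          cov = np.average((a - am) * (b - bm), weights=w)
          return cov / np.sqrt(np.average((a - am) ** 2, weights=w) *
                               np.average((b - bm) ** 2, weights=w))

      for name, cov in [("x1", x1), ("x2", x2)]:
          print(name, "raw r =", round(np.corrcoef(dose, cov)[0, 1], 3),
                " weighted r =", round(weighted_corr(dose, cov, w), 3))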

  15. Discovery of potent inhibitors of soluble epoxide hydrolase by combinatorial library design and structure-based virtual screening.

    PubMed

    Xing, Li; McDonald, Joseph J; Kolodziej, Steve A; Kurumbail, Ravi G; Williams, Jennifer M; Warren, Chad J; O'Neal, Janet M; Skepner, Jill E; Roberds, Steven L

    2011-03-10

    Structure-based virtual screening was applied to design combinatorial libraries to discover novel and potent soluble epoxide hydrolase (sEH) inhibitors. X-ray crystal structures revealed unique interactions for a benzoxazole template in addition to the conserved hydrogen bonds with the catalytic machinery of sEH. By exploitation of the favorable binding elements, two iterations of library design based on amide coupling were employed, guided principally by the docking results of the enumerated virtual products. Biological screening of the libraries demonstrated hit rates as high as 90%, of which over two dozen compounds were single-digit nanomolar sEH inhibitors by IC(50) determination. In total, the library design and synthesis produced more than 300 submicromolar sEH inhibitors. In cellular systems, activities consistent with the biochemical measurements were demonstrated. The SAR understanding of the benzoxazole template provides valuable insights into the discovery of novel sEH inhibitors as therapeutic agents.

  16. Robust Covariate-Adjusted Log-Rank Statistics and Corresponding Sample Size Formula for Recurrent Events Data

    PubMed Central

    Song, Rui; Kosorok, Michael R.; Cai, Jianwen

    2009-01-01

    Summary Recurrent events data are frequently encountered in clinical trials. This article develops robust covariate-adjusted log-rank statistics applied to recurrent events data with arbitrary numbers of events under independent censoring, together with the corresponding sample size formula. The proposed log-rank tests are robust with respect to different data-generating processes and are adjusted for predictive covariates. They reduce to the Kong and Slud (1997, Biometrika 84, 847–862) setting in the case of a single event. The sample size formula is derived based on the asymptotic normality of the covariate-adjusted log-rank statistics under certain local alternatives and a working model for baseline covariates in the recurrent event data context. When the effect size is small and the baseline covariates do not contain significant information about event times, it reduces to the same form as that of Schoenfeld (1983, Biometrics 39, 499–503) for cases of a single event or independent event times within a subject. We carry out simulations to study the control of type I error and to compare the power of several methods in finite samples. The proposed sample size formula is illustrated using data from an rhDNase study. PMID:18162107

  17. Shrinkage Estimation of Varying Covariate Effects Based On Quantile Regression

    PubMed Central

    Peng, Limin; Xu, Jinfeng; Kutner, Nancy

    2013-01-01

    Varying covariate effects often manifest meaningful heterogeneity in covariate-response associations. In this paper, we adopt a quantile regression model that assumes linearity at a continuous range of quantile levels as a tool to explore such data dynamics. The consideration of potential non-constancy of covariate effects necessitates a new perspective for variable selection, which, under the assumed quantile regression model, is to retain variables that have effects on all quantiles of interest as well as those that influence only part of quantiles considered. Current work on l1-penalized quantile regression either does not concern varying covariate effects or may not produce consistent variable selection in the presence of covariates with partial effects, a practical scenario of interest. In this work, we propose a shrinkage approach by adopting a novel uniform adaptive LASSO penalty. The new approach enjoys easy implementation without requiring smoothing. Moreover, it can consistently identify the true model (uniformly across quantiles) and achieve the oracle estimation efficiency. We further extend the proposed shrinkage method to the case where responses are subject to random right censoring. Numerical studies confirm the theoretical results and support the utility of our proposals. PMID:25332515

  18. Demographic and Psychological Predictors of Panel Attrition: Evidence from the New Zealand Attitudes and Values Study

    PubMed Central

    Satherley, Nicole; Milojev, Petar; Greaves, Lara M.; Huang, Yanshu; Osborne, Danny; Bulbulia, Joseph; Sibley, Chris G.

    2015-01-01

    This study examines attrition rates over the first four years of the New Zealand Attitudes and Values Study, a longitudinal national panel sample of New Zealand adults. We report the base rate and covariates for the following four distinct classes of respondents: explicit withdrawals, lost respondents, intermittent respondents and constant respondents. A multinomial logistic regression examined an extensive range of demographic and socio-psychological covariates (among them the Big-Six personality traits) associated with membership in these classes (N = 5,814). Results indicated that men, Māori and Asian peoples were less likely to be constant respondents. Conscientiousness and Honesty-Humility were also positively associated with membership in the constant respondent class. Notably, the effect sizes for the socio-psychological covariates of panel attrition tended to match or exceed those of standard demographic covariates. This investigation broadens the focus of research on panel attrition beyond demographics by including a comprehensive set of socio-psychological covariates. Our findings show that core psychological covariates convey important information about panel attrition, and are practically important to the management of longitudinal panel samples like the New Zealand Attitudes and Values Study. PMID:25793746

  19. The Catalog Takes to the Highway.

    ERIC Educational Resources Information Center

    Chesbro, Melinda

    1999-01-01

    Discusses new developments in online library catalogs, including Web-based catalogs; interconnectivity within the library; interconnectivity between libraries; graphical user interfaces; pricing models; and a checklist of questions to ask when purchasing a new online catalog. (LRW)

  20. Analysis of stock investment selection based on CAPM using covariance and genetic algorithm approach

    NASA Astrophysics Data System (ADS)

    Sukono; Susanti, D.; Najmia, M.; Lesmana, E.; Napitupulu, H.; Supian, S.; Putra, A. S.

    2018-03-01

    Investment is one of the economic growth factors of countries, especially Indonesia. Stocks are a liquid form of investment. In making stock investment decisions, investors need to choose stocks that can generate maximum returns at a minimum level of risk; they therefore need to know how to allocate capital so as to obtain the optimal benefit. This study discusses stock investment based on the CAPM, estimated using covariance and genetic algorithm approaches. It is assumed that the stocks analyzed follow the CAPM. The beta parameter of the CAPM equation is estimated in two ways: first by a covariance approach, and second by genetic algorithm optimization. As a numerical illustration, ten stocks traded on the Indonesian capital market are analyzed. The results show that estimating the beta parameter using the covariance and genetic algorithm approaches gives the same decision: six underpriced stocks with a buying decision, and four overpriced stocks with a selling decision. Based on the analysis, it can be concluded that the results can be used as a consideration for investors to buy the six underpriced stocks and sell the four overpriced stocks.
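
    A small sketch of the covariance-based beta estimate and the CAPM pricing check described above; the return series and risk-free rate are made up for illustration, not the ten Indonesian stocks from the paper.

      import numpy as np

      rng = np.random.default_rng(6)
      rm = rng.normal(0.001, 0.01, 250)                    # market returns
      ri = 0.0005 + 1.2 * rm + rng.normal(0, 0.005, 250)   # one stock's returns
      rf = 0.0002                                          # risk-free rate per period

      beta = np.cov(ri, rm)[0, 1] / np.var(rm, ddof=1)     # covariance approach
      expected = rf + beta * (rm.mean() - rf)              # CAPM expected return

      decision = "buy (underpriced)" if ri.mean() > expected else "sell (overpriced)"
      print(f"beta = {beta:.2f}, expected = {expected:.5f}, "
            f"realized = {ri.mean():.5f} -> {decision}")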

  1. Efficient Storage Scheme of Covariance Matrix during Inverse Modeling

    NASA Astrophysics Data System (ADS)

    Mao, D.; Yeh, T. J.

    2013-12-01

    During stochastic inverse modeling, the covariance matrix of geostatistics-based methods carries the information about the geologic structure. Its update during iterations reflects the decrease of uncertainty with the incorporation of observed data. For large-scale problems, its storage and update cost too much memory and computational resources. In this study, we propose a new efficient scheme for storage and update. The Compressed Sparse Column (CSC) format is utilized to store the covariance matrix, and users can assign how much data they prefer to store based on correlation scales, since data beyond several correlation scales are usually not very informative for inverse modeling. After every iteration, only the diagonal terms of the covariance matrix are updated. The off-diagonal terms are calculated and updated based on shortened correlation scales with a pre-assigned exponential model. The correlation scales are shortened by a coefficient, e.g. 0.95, every iteration to reflect the decrease of uncertainty. There is no universal coefficient for all problems, and users are encouraged to try several values. The new scheme is first tested with 1D examples, and the estimated results and uncertainty are compared with those of the traditional full storage method. Finally, a large-scale numerical model is used to validate the new scheme.
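
    An illustrative sketch of the scheme on a 1-D grid: store only entries within a few correlation scales in CSC format, shrink the correlation scale by the 0.95 coefficient each iteration, and rebuild the off-diagonals from the exponential model. The grid size, cutoff, and the diagonal update rule are assumptions for illustration.

      import numpy as np
      from scipy.sparse import csc_matrix

      n, dx, var, L, cutoff = 1000, 1.0, 1.0, 20.0, 3.0  # 1-D grid, exponential model

      def sparse_cov(L):
          rows, cols, vals = [], [], []
          width = int(cutoff * L / dx)          # drop entries beyond ~3 scales
          for i in range(n):
              for j in range(max(0, i - width), min(n, i + width + 1)):
                  rows.append(i); cols.append(j)
                  vals.append(var * np.exp(-abs(i - j) * dx / L))
          return csc_matrix((vals, (rows, cols)), shape=(n, n))

      P = sparse_cov(L)
      print(f"stored entries: {P.nnz} of {n * n}")
      for it in range(3):                        # iterations of inverse modeling
          L *= 0.95                              # shorten the correlation scale
          P = sparse_cov(L)
          P.setdiag(P.diagonal() * 0.9)          # stand-in for the diagonal update
          print(f"iter {it}: L = {L:.2f}, nnz = {P.nnz}")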

  2. Data Fusion of Gridded Snow Products Enhanced with Terrain Covariates and a Simple Snow Model

    NASA Astrophysics Data System (ADS)

    Snauffer, A. M.; Hsieh, W. W.; Cannon, A. J.

    2017-12-01

    Hydrologic planning requires accurate estimates of regional snow water equivalent (SWE), particularly in areas with hydrologic regimes dominated by spring melt. While numerous gridded data products provide such estimates, accurate representations are particularly challenging under conditions of mountainous terrain, heavy forest cover and large snow accumulations, contexts which in many ways define the province of British Columbia (BC), Canada. One promising avenue for improving SWE estimates is a data fusion approach which combines field observations with gridded SWE products and relevant covariates. A base artificial neural network (ANN) was constructed using three of the best-performing gridded SWE products over BC (ERA-Interim/Land, MERRA and GLDAS-2) and simple location and time covariates. This base ANN was then enhanced to include terrain covariates (slope, aspect and Terrain Roughness Index, TRI) as well as a simple one-layer energy balance snow model driven by gridded bias-corrected ANUSPLIN temperature and precipitation values. The ANN enhanced with all aforementioned covariates performed better than the base ANN, but most of the skill improvement was attributable to the snow model, with very little contribution from the terrain covariates. The enhanced ANN improved station mean absolute error (MAE) by an average of 53% relative to the composing gridded products over the province. The interannual peak SWE correlation coefficient was found to be 0.78, an improvement of 0.05 to 0.18 over the composing products. This nonlinear approach outperformed a comparable multiple linear regression (MLR) model by 22% in MAE and 0.04 in interannual correlation. The enhanced ANN was also shown to estimate SWE better than the Variable Infiltration Capacity (VIC) hydrologic model calibrated and run for four BC watersheds, improving MAE by 22% and correlation by 0.05. The performance improvements of the enhanced ANN are statistically significant at the 5% level across the province and in four out of five physiographic regions.
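
    A toy sketch of the data-fusion idea: an ANN maps gridded SWE products plus simple covariates to observed station SWE. All inputs below are simulated stand-ins; the study used ERA-Interim/Land, MERRA, GLDAS-2, terrain covariates, and a snow model rather than these synthetic columns.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(7)
      n = 3000
      products = rng.gamma(2.0, 100.0, size=(n, 3))  # three gridded SWE estimates
      doy = rng.uniform(0, 365, size=(n, 1))         # day-of-year covariate
      X = np.hstack([products, doy])
      y = products.mean(axis=1) * 1.2 + rng.normal(0, 30, n)  # "observed" SWE

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      ann = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                                       random_state=0)).fit(X_tr, y_tr)
      mae = np.abs(ann.predict(X_te) - y_te).mean()
      print(f"fused-product MAE: {mae:.1f} mm")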

  3. Cortical Thinning and Altered Cortico-Cortical Structural Covariance of the Default Mode Network in Patients with Persistent Insomnia Symptoms

    PubMed Central

    Suh, Sooyeon; Kim, Hosung; Dang-Vu, Thien Thanh; Joo, Eunyeon; Shin, Chol

    2016-01-01

    Study Objectives: Recent studies have suggested that structural abnormalities in insomnia may be linked with alterations in the default-mode network (DMN). This study compared cortical thickness and structural connectivity linked to the DMN in patients with persistent insomnia (PI) and good sleepers (GS). Methods: The current study used a clinical subsample from the longitudinal community-based Korean Genome and Epidemiology Study (KoGES). Cortical thickness and structural connectivity linked to the DMN in patients with persistent insomnia symptoms (PIS; n = 57) were compared to good sleepers (GS; n = 40). All participants underwent MRI acquisition. Based on literature review, we selected cortical regions corresponding to the DMN. A seed-based structural covariance analysis measured cortical thickness correlation between each seed region of the DMN and other cortical areas. Association of cortical thickness and covariance with sleep quality and neuropsychological assessments were further assessed. Results: Compared to GS, cortical thinning was found in PIS in the anterior cingulate cortex, precentral cortex, and lateral prefrontal cortex. Decreased structural connectivity between anterior and posterior regions of the DMN was observed in the PIS group. Decreased structural covariance within the DMN was associated with higher PSQI scores. Cortical thinning in the lateral frontal lobe was related to poor performance in executive function in PIS. Conclusion: Disrupted structural covariance network in PIS might reflect malfunctioning of antero-posterior disconnection of the DMN during the wake to sleep transition that is commonly found during normal sleep. The observed structural network alteration may further implicate commonly observed sustained sleep difficulties and cognitive impairment in insomnia. Citation: Suh S, Kim H, Dang-Vu TT, Joo E, Shin C. Cortical thinning and altered cortico-cortical structural covariance of the default mode network in patients with persistent insomnia symptoms. SLEEP 2016;39(1):161–171. PMID:26414892

  4. Customized Consensus Spectral Library Building for Untargeted Quantitative Metabolomics Analysis with Data Independent Acquisition Mass Spectrometry and MetaboDIA Workflow.

    PubMed

    Chen, Gengbo; Walmsley, Scott; Cheung, Gemmy C M; Chen, Liyan; Cheng, Ching-Yu; Beuerman, Roger W; Wong, Tien Yin; Zhou, Lei; Choi, Hyungwon

    2017-05-02

    Data independent acquisition-mass spectrometry (DIA-MS) coupled with liquid chromatography is a promising approach for rapid, automatic sampling of MS/MS data in untargeted metabolomics. However, wide isolation windows in DIA-MS generate MS/MS spectra containing a mixed population of fragment ions together with their precursor ions. This precursor-fragment ion map in a comprehensive MS/MS spectral library is crucial for relative quantification of fragment ions uniquely representative of each precursor ion. However, existing reference libraries are not sufficient for this purpose since the fragmentation patterns of small molecules can vary in different instrument setups. Here we developed a bioinformatics workflow called MetaboDIA to build customized MS/MS spectral libraries using a user's own data dependent acquisition (DDA) data and to perform MS/MS-based quantification with DIA data, thus complementing conventional MS1-based quantification. MetaboDIA also allows users to build a spectral library directly from DIA data in studies of a large sample size. Using a marine algae data set, we show that quantification of fragment ions extracted with a customized MS/MS library can provide as reliable quantitative data as the direct quantification of precursor ions based on MS1 data. To test its applicability in complex samples, we applied MetaboDIA to a clinical serum metabolomics data set, where we built a DDA-based spectral library containing consensus spectra for 1829 compounds. We performed fragment ion quantification using DIA data using this library, yielding sensitive differential expression analysis.

  5. Spectrum-to-Spectrum Searching Using a Proteome-wide Spectral Library*

    PubMed Central

    Yen, Chia-Yu; Houel, Stephane; Ahn, Natalie G.; Old, William M.

    2011-01-01

    The unambiguous assignment of tandem mass spectra (MS/MS) to peptide sequences remains a key unsolved problem in proteomics. Spectral library search strategies have emerged as a promising alternative for peptide identification, in which MS/MS spectra are directly compared against a reference library of confidently assigned spectra. Two problems relate to library size. First, reference spectral libraries are limited to rediscovery of previously identified peptides and are not applicable to new peptides, because of their incomplete coverage of the human proteome. Second, problems arise when searching a spectral library the size of the entire human proteome. We observed that traditional dot product scoring methods do not scale well with spectral library size, showing reduction in sensitivity when library size is increased. We show that this problem can be addressed by optimizing scoring metrics for spectrum-to-spectrum searches with large spectral libraries. MS/MS spectra for the 1.3 million predicted tryptic peptides in the human proteome are simulated using a kinetic fragmentation model (MassAnalyzer version2.1) to create a proteome-wide simulated spectral library. Searches of the simulated library increase MS/MS assignments by 24% compared with Mascot, when using probabilistic and rank based scoring methods. The proteome-wide coverage of the simulated library leads to 11% increase in unique peptide assignments, compared with parallel searches of a reference spectral library. Further improvement is attained when reference spectra and simulated spectra are combined into a hybrid spectral library, yielding 52% increased MS/MS assignments compared with Mascot searches. Our study demonstrates the advantages of using probabilistic and rank based scores to improve performance of spectrum-to-spectrum search strategies. PMID:21532008
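
    A minimal sketch of the dot-product scoring that the paper builds on: bin the query and library spectra onto a shared m/z grid and take the normalized dot product. The 1 Da bin width and the square-root intensity transform are common choices assumed here, not the paper's optimized probabilistic or rank-based metrics.

      import numpy as np

      def bin_spectrum(mz, intensity, bin_width=1.0, max_mz=2000.0):
          vec = np.zeros(int(max_mz / bin_width))
          idx = np.clip((np.asarray(mz) / bin_width).astype(int), 0, len(vec) - 1)
          np.add.at(vec, idx, intensity)
          return np.sqrt(vec)                  # soften dominant peaks

      def dot_score(query, library):
          q, l = bin_spectrum(*query), bin_spectrum(*library)
          return q @ l / (np.linalg.norm(q) * np.linalg.norm(l) + 1e-12)

      query = ([175.1, 276.2, 389.3], [100.0, 40.0, 80.0])          # (m/z, intensity)
      library = ([175.1, 276.1, 389.3, 500.2], [90.0, 35.0, 70.0, 5.0])
      print(f"dot product score: {dot_score(query, library):.3f}")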

  6. The Library as Leader: Computer Assisted Information Services at Northwestern University. A Report of the NULCAIS Committee on the Present Status, and Proposals for the Future, of Computer Assisted Information Services at Northwestern University Library.

    ERIC Educational Resources Information Center

    Northwestern Univ., Evanston, IL. Univ. Libraries.

    In March 1974, a study was undertaken at Northwestern University to examine the role of the library in providing information services based on computerized data bases. After taking an inventory of existing data bases at Northwestern and in the greater Chicago area, a committee suggested ways to continue and expand the scope of information…

  7. Card Sorting in an Online Environment: Key to Involving Online-Only Student Population in Usability Testing of an Academic Library Web Site?

    ERIC Educational Resources Information Center

    Paladino, Emily B.; Klentzin, Jacqueline C.; Mills, Chloe P.

    2017-01-01

    Based on in-person, task-based usability testing and interviews, the authors' library Web site was recently overhauled in order to improve user experience. This led to the authors' interest in additional usability testing methods and test environments that would most closely fit their library's goals and situation. The appeal of card sorting…

  8. Modeling of frequency agile devices: development of PKI neuromodeling library based on hierarchical network structure

    NASA Astrophysics Data System (ADS)

    Sanchez, P.; Hinojosa, J.; Ruiz, R.

    2005-06-01

    Recently, neuromodeling methods for microwave devices have been developed. These methods are suitable for generating models of novel devices, and they allow fast and accurate simulations and optimizations. However, library development makes these methods a formidable task, since it requires massive input-output data provided by an electromagnetic simulator or measurements, and repeated artificial neural network (ANN) training. This paper presents a strategy that reduces the cost of library development while retaining the advantages of neuromodeling methods: high accuracy, a large range of geometrical and material parameters, and reduced CPU time. The library models are developed from a set of base prior knowledge input (PKI) models, which capture the characteristics common to all models in the library, and high-level ANNs which give the library model outputs from the base PKI models. The technique is illustrated for a microwave multiconductor tunable phase shifter using anisotropic substrates. Closed-form relationships have been developed and are presented in this paper. The results show good agreement with the expected ones.

  9. Development and application of a recombination-based library versus library high- throughput yeast two-hybrid (RLL-Y2H) screening system.

    PubMed

    Yang, Fang; Lei, Yingying; Zhou, Meiling; Yao, Qili; Han, Yichao; Wu, Xiang; Zhong, Wanshun; Zhu, Chenghang; Xu, Weize; Tao, Ran; Chen, Xi; Lin, Da; Rahman, Khaista; Tyagi, Rohit; Habib, Zeshan; Xiao, Shaobo; Wang, Dang; Yu, Yang; Chen, Huanchun; Fu, Zhenfang; Cao, Gang

    2018-02-16

    Protein-protein interaction (PPI) network maintains proper function of all organisms. Simple high-throughput technologies are desperately needed to delineate the landscape of PPI networks. While recent state-of-the-art yeast two-hybrid (Y2H) systems improved screening efficiency, either individual colony isolation, library preparation arrays, gene barcoding or massive sequencing are still required. Here, we developed a recombination-based 'library vs library' Y2H system (RLL-Y2H), by which multi-library screening can be accomplished in a single pool without any individual treatment. This system is based on the phiC31 integrase-mediated integration between bait and prey plasmids. The integrated fragments were digested by MmeI and subjected to deep sequencing to decode the interaction matrix. We applied this system to decipher the trans-kingdom interactome between Mycobacterium tuberculosis and host cells and further identified Rv2427c interfering with the phagosome-lysosome fusion. This concept can also be applied to other systems to screen protein-RNA and protein-DNA interactions and delineate signaling landscape in cells.

  10. Development of a multi-data assimilation scheme to integrate Bio-Argo floats data with ocean colour satellite data into the CMEMS MFC-Biogeochemistry

    NASA Astrophysics Data System (ADS)

    Cossarini, Gianpiero; D'Ortenzio, Fabrizio; Mariotti, Laura; Mignot, Alexandre; Salon, Stefano

    2017-04-01

    The Mediterranean Sea is a very promising site for developing and testing the assimilation of Bio-Argo data since 1) the Bio-Argo network is one of the densest of the global ocean, and 2) a consolidated data assimilation framework for biogeochemical variables (3DVAR-BIO, presently based on assimilation of satellite-estimated surface chlorophyll data) already exists within the CMEMS biogeochemical model system for the Mediterranean Sea. The MASSIMILI project, granted by the CMEMS Service Evolution initiative, aims to develop the assimilation of Bio-Argo float data into the CMEMS biogeochemical model system of the Mediterranean Sea by means of an upgrade of the 3DVAR-BIO scheme. Specific developments of the 3DVAR-BIO scheme focus on the estimation of new operators for the variational decomposition of the background error covariance matrix and on the implementation of a new observation operator specifically for the Bio-Argo float vertical profile data. In particular, a new horizontal covariance operator for chlorophyll, nitrate and oxygen is based on 3D fields of horizontal correlation radius calculated from a long-term reanalysis simulation. A new vertical covariance operator is built on monthly and spatially varying EOF decompositions to account for the spatiotemporal variability of the vertical structure of the three variables' error covariance. Further, the observation error covariance is a key factor for effective assimilation of the Bio-Argo data into the model dynamics. The sensitivities of the assimilation to the different factors are estimated. First results of the implementation of the new 3DVAR-BIO scheme show the impact of Bio-Argo data on the 3D fields of chlorophyll, nitrate and oxygen. Tuning the length scale factors of the horizontal covariance, analysing the sensitivity of the observation error covariance, and introducing a non-diagonal biogeochemical covariance operator and a non-diagonal multi-platform operator (i.e. Bio-Argo and satellite) are crucial future steps for the success of the MASSIMILI project. In our contribution, we will discuss the recent and promising advances this strategic project has made over the past year and its potential for the whole operational biogeochemical modelling community.
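
    A schematic single 3D-Var step in the spirit of the scheme described above, minimizing J(x) = (x-xb)'B⁻¹(x-xb) + (y-Hx)'R⁻¹(y-Hx) in closed form. The tiny one-column state, the Gaussian background covariance, and the float-like observation pattern are all illustrative assumptions, not the operational 3DVAR-BIO operators.

      import numpy as np

      n = 50                                         # 1-D column of chlorophyll
      z = np.arange(n, dtype=float)
      B = 0.04 * np.exp(-0.5 * ((z[:, None] - z[None, :]) / 5.0) ** 2)  # background cov
      xb = np.exp(-((z - 20.0) / 10.0) ** 2)         # background profile

      obs_idx = np.arange(0, n, 5)                   # a Bio-Argo-like vertical profile
      H = np.eye(n)[obs_idx]                         # observation operator
      R = 0.01 * np.eye(len(obs_idx))                # observation error covariance
      y = H @ xb + 0.15 + 0.1 * np.random.default_rng(8).normal(size=len(obs_idx))

      # Closed-form minimizer of J: xa = xb + B H'(H B H' + R)^{-1}(y - H xb)
      K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
      xa = xb + K @ (y - H @ xb)
      print("mean analysis increment:", (xa - xb).mean())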

  11. Gray matter volume covariance patterns associated with gait speed in older adults: a multi-cohort MRI study.

    PubMed

    Blumen, Helena M; Brown, Lucy L; Habeck, Christian; Allali, Gilles; Ayers, Emmeline; Beauchet, Olivier; Callisaya, Michele; Lipton, Richard B; Mathuranath, P S; Phan, Thanh G; Pradeep Kumar, V G; Srikanth, Velandai; Verghese, Joe

    2018-04-09

    Accelerated gait decline in aging is associated with many adverse outcomes, including an increased risk for falls, cognitive decline, and dementia. Yet, the brain structures associated with gait speed, and how they relate to specific cognitive domains, are not well-understood. We examined structural brain correlates of gait speed, and how they relate to processing speed, executive function, and episodic memory in three non-demented and community-dwelling older adult cohorts (Overall N = 352), using voxel-based morphometry and multivariate covariance-based statistics. In all three cohorts, we identified gray matter volume covariance patterns associated with gait speed that included brain stem, precuneus, fusiform, motor, supplementary motor, and prefrontal (particularly ventrolateral prefrontal) cortex regions. Greater expression of these gray matter volume covariance patterns linked to gait speed were associated with better processing speed in all three cohorts, and with better executive function in one cohort. These gray matter covariance patterns linked to gait speed were not associated with episodic memory in any of the cohorts. These findings suggest that gait speed, processing speed (and to some extent executive functions) rely on shared neural systems that are subject to age-related and dementia-related change. The implications of these findings are discussed within the context of the development of interventions to compensate for age-related gait and cognitive decline.

  12. Structural covariance and cortical reorganisation in schizophrenia: a MRI-based morphometric study.

    PubMed

    Palaniyappan, Lena; Hodgson, Olha; Balain, Vijender; Iwabuchi, Sarina; Gowland, Penny; Liddle, Peter

    2018-05-06

    In patients with schizophrenia, distributed abnormalities are observed in grey matter volume. A recent hypothesis posits that these distributed changes are indicative of a plastic reorganisation process occurring in response to a functional defect in neuronal information transmission. We investigated the structural covariance across various brain regions in early-stage schizophrenia to determine if indeed the observed patterns of volumetric loss conform to a coordinated pattern of structural reorganisation. Structural magnetic resonance imaging scans were obtained from 40 healthy adults and 41 age, gender and parental socioeconomic status matched patients with schizophrenia. Volumes of grey matter tissue were estimated at the regional level across 90 atlas-based parcellations. Group-level structural covariance was studied using a graph theoretical framework. Patients had distributed reduction in grey matter volume, with a high degree of localised covariance (clustering) compared with controls. Patients with schizophrenia had reduced centrality of anterior cingulate and insula but increased centrality of the fusiform cortex, compared with controls. Simulating targeted removal of highly central nodes resulted in significant loss of the overall covariance patterns in patients compared with controls. Regional volumetric deficits in schizophrenia are not a result of random, mutually independent processes. Our observations support the occurrence of a spatially interconnected reorganisation with the systematic de-escalation of conventional 'hub' regions. This raises the question of whether the morphological architecture in schizophrenia is primed for compensatory functions, albeit with a high risk of inefficiency.

  13. Non-stationary pre-envelope covariances of non-classically damped systems

    NASA Astrophysics Data System (ADS)

    Muscolino, G.

    1991-08-01

    A new formulation is given to evaluate the stationary and non-stationary response of linear non-classically damped systems subjected to multi-correlated non-separable Gaussian input processes. This formulation is based on a new and more suitable definition of the impulse response function matrix for such systems. It is shown that, when using this definition, the stochastic response of non-classically damped systems involves the evaluation of quantities similar to those of classically damped ones. Furthermore, considerations about non-stationary cross-covariances, spectral moments and pre-envelope cross-covariances are presented for a monocorrelated input process.

  14. The interdependence between screening methods and screening libraries.

    PubMed

    Shelat, Anang A; Guy, R Kiplin

    2007-06-01

    The most common methods for discovery of chemical compounds capable of manipulating biological function involve some form of screening. The success of such screens is highly dependent on the chemical materials - commonly referred to as libraries - that are assayed. Classic methods for the design of screening libraries have depended on knowledge of target structure and relevant pharmacophores for target focus, and on simple count-based measures to assess other properties. The recent proliferation of two novel screening paradigms, structure-based screening and high-content screening, prompts a profound rethink about the ideal composition of small-molecule screening libraries. We suggest that currently utilized libraries are not optimal for addressing new targets by high-throughput screening, or complex phenotypes by high-content screening.

  15. Construction and characterization of a bacterial artificial chromosome library for hexaploid wheat line 92R137

    USDA-ARS?s Scientific Manuscript database

    For map-based cloning of genes conferring important traits in the hexaploid wheat line 92R137, a bacterial artificial chromosome (BAC) library, including two sub libraries, was constructed using the genomic DNA of 92R137 digested with restriction enzymes HindIII and BamHI. The BAC library was compos...

  16. Academic Librarians' Practices and Perceptions on Web-Based Instruction for Academic Library Patrons as Adult Learners

    ERIC Educational Resources Information Center

    Taylor, Deborah Michelle

    2016-01-01

    Academic librarians are encouraged to provide library services, resources, and instruction to all patrons, including the adult learner. Statistics reported that worldwide, adults are a growing student population in colleges and universities; however, the adult learner as an academic library patron is often neglected. Academic libraries can…

  17. From LAMP to Koha: Case Study of the Pakistan Legislative Assembly Libraries

    ERIC Educational Resources Information Center

    Shafi-Ullah, Farasat; Qutab, Saima

    2012-01-01

    Purpose: This paper aims to elaborate the library data migration process from LAMP (Library Automation Management Program) to the open source software Koha's (2.2.8 Windows based) Pakistani flavour PakLAG-Koha in six legislative assembly libraries of Pakistan. Design/methodology/approach: The paper explains different steps of the data migration…

  18. National Library Service for the Blind and Physically Handicapped. Working Paper.

    ERIC Educational Resources Information Center

    Battelle Memorial Inst., Columbus, OH. Columbus Labs.

    Based on observations made at the National Library Service for the Blind and Physically Handicapped (NLS) on September 8, 9, and 10, 1982 and on documents supplied by NLS, this report compares standards published in the 1979 document entitled "Standards of Service for the Library of Congress Network of Libraries for the Blind and Physically…

  19. Public Library Service to Children in Oklahoma.

    ERIC Educational Resources Information Center

    Wentroth, Mary Ann

    Because of the low density of its population and subsequent low property tax support, library service in Oklahoma is based on the multicounty library operating as a single unit. With the help of federal funds, such units now cover one-third of the state and 60 percent of its population utilizing branch libraries and bookmobile service. Service to…

  20. 36 CFR Appendix A to Subpart A of... - Fees and Charges for Services Provided to Requesters of Records

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... LIBRARY OF CONGRESS DISCLOSURE OR PRODUCTION OF RECORDS OR INFORMATION Availability of Library of Congress... based on the direct cost to the Library, including labor, material, and computer time. (b) Duplication... established by the Library's Photoduplication Service, or in the case of machine media duplication, by the...

  1. 36 CFR Appendix A to Subpart A of... - Fees and Charges for Services Provided to Requesters of Records

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... LIBRARY OF CONGRESS DISCLOSURE OR PRODUCTION OF RECORDS OR INFORMATION Availability of Library of Congress... based on the direct cost to the Library, including labor, material, and computer time. (b) Duplication... established by the Library's Photoduplication Service, or in the case of machine media duplication, by the...

  2. 36 CFR Appendix A to Subpart A of... - Fees and Charges for Services Provided to Requesters of Records

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... LIBRARY OF CONGRESS DISCLOSURE OR PRODUCTION OF RECORDS OR INFORMATION Availability of Library of Congress... based on the direct cost to the Library, including labor, material, and computer time. (b) Duplication... established by the Library's Photoduplication Service, or in the case of machine media duplication, by the...

  3. 36 CFR Appendix A to Subpart A of... - Fees and Charges for Services Provided to Requesters of Records

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... LIBRARY OF CONGRESS DISCLOSURE OR PRODUCTION OF RECORDS OR INFORMATION Availability of Library of Congress... based on the direct cost to the Library, including labor, material, and computer time. (b) Duplication... established by the Library's Photoduplication Service, or in the case of machine media duplication, by the...

  4. Impressions of an old master: hospital libraries and librarians, 1970-2014.

    PubMed

    Taylor, Mary Virginia

    2015-01-01

    This article is a retrospective look at the changes in hospital libraries from 1970 to 2014 based on the author's experience and a survey of the literature related to hospital libraries indexed in PubMed from 1970 to the present. New roles for librarians and methods for conveying the value of libraries to administrators are described.

  5. 36 CFR Appendix A to Subpart A of... - Fees and Charges for Services Provided to Requesters of Records

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... LIBRARY OF CONGRESS DISCLOSURE OR PRODUCTION OF RECORDS OR INFORMATION Availability of Library of Congress... based on the direct cost to the Library, including labor, material, and computer time. (b) Duplication... established by the Library's Photoduplication Service, or in the case of machine media duplication, by the...

  6. Proposals for a Dynamic Library. Technical Report.

    ERIC Educational Resources Information Center

    Salton, Gerard

    The current library environment is first examined, and an attempt is made to explain why the standard approaches to the library problem have been less productive than had been anticipated. A new design is then introduced for modern library operations based on a two-fold strategy: on the input side, the widest possible utilization should be made of…

  7. Learning from Distance Faculty: A Faculty Needs Assessment at the University of Wyoming

    ERIC Educational Resources Information Center

    Kvenild, Cassandra; Bowles-Terry, Melissa

    2011-01-01

    Distance educators have special library needs. This article discusses the results of a library needs assessment of distance instructors at the University of Wyoming. Access to resources, use of library instructional services, barriers to distance library use, and perceived gaps in service are all addressed. Follow-up actions, based on survey…

  8. A Study of Organization and Governance of Alabama State Library Systems.

    ERIC Educational Resources Information Center

    Public Administration Service, Washington, DC.

    In order to provide the citizens of Alabama with the best possible library service for a given level of funding, this study recommends a model for the organization and funding of multi-type library cooperation in the state, based on a review of past developments and current conditions, together with proposed changes in state library legislation…

  9. Academic Libraries as High-Tech Gateways: A Guide to Design & Space Decisions. Second Edition.

    ERIC Educational Resources Information Center

    Bazillion, Richard J.; Braun, Connie L.

    This book, based on research about libraries around the country, provides tools that can be used for planning and building an academic library space that streamlines access to information. It explains how to incorporate the latest innovations in academic library facility design; how to make the facility flexible for changing information technology…

  10. The Performance Analysis Based on SAR Sample Covariance Matrix

    PubMed Central

    Erten, Esra

    2012-01-01

    Multi-channel systems appear in several fields of application in science. In the Synthetic Aperture Radar (SAR) context, multi-channel systems may refer to different domains, such as multi-polarization, multi-interferometric or multi-temporal data, or even a combination of them. Due to the inherent speckle phenomenon present in SAR images, a statistical description of the data is almost mandatory for its utilization. The complex images acquired over natural media present in general zero-mean circular Gaussian characteristics. In this case, second-order statistics such as the multi-channel covariance matrix fully describe the data. For practical situations, however, the covariance matrix has to be estimated using a limited number of samples, and this sample covariance matrix follows the complex Wishart distribution. In this context, the eigendecomposition of the multi-channel covariance matrix has been shown to be of high relevance in different areas regarding the physical properties of the imaged scene. Specifically, the maximum eigenvalue of the covariance matrix has been frequently used in different applications such as target or change detection, estimation of the dominant scattering mechanism in polarimetric data, moving target indication, etc. In this paper, the statistical behavior of the maximum eigenvalue derived from the eigendecomposition of the sample multi-channel covariance matrix in terms of multi-channel SAR images is simplified for the SAR community. Validation is performed against simulated data, and examples of estimation and detection problems using the analytical expressions are given as well. PMID:22736976
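
    A quick numerical sketch of the quantity discussed above: the maximum eigenvalue of a sample covariance matrix estimated from zero-mean circular complex Gaussian "channels", so that the estimate is complex-Wishart distributed. The channel count and number of looks are illustrative, and the true covariance is taken as the identity for simplicity.

      import numpy as np

      rng = np.random.default_rng(9)
      p, n_looks, n_trials = 3, 25, 5000           # channels, samples, replicates

      max_eig = np.empty(n_trials)
      for t in range(n_trials):
          z = (rng.normal(size=(p, n_looks)) +
               1j * rng.normal(size=(p, n_looks))) / np.sqrt(2)
          C = z @ z.conj().T / n_looks             # sample covariance (Wishart / n)
          max_eig[t] = np.linalg.eigvalsh(C).max() # statistic used for detection

      print(f"max-eigenvalue mean = {max_eig.mean():.3f}, "
            f"95th percentile = {np.quantile(max_eig, 0.95):.3f}")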

  11. A trade-off solution between model resolution and covariance in surface-wave inversion

    USGS Publications Warehouse

    Xia, J.; Xu, Y.; Miller, R.D.; Zeng, C.

    2010-01-01

    Regularization is necessary for inversion of ill-posed geophysical problems. Appraisal of inverse models is essential for meaningful interpretation of these models. Because uncertainties are associated with regularization parameters, extra conditions are usually required to determine proper parameters for assessing inverse models. Commonly used techniques for assessment of a geophysical inverse model derived (generally iteratively) from a linear system are based on calculating the model resolution and the model covariance matrices. Because the model resolution and the model covariance matrices of the regularized solutions are controlled by the regularization parameter, direct assessment of inverse models using only the covariance matrix may provide incorrect results. To assess an inverted model, we use the concept of a trade-off between model resolution and covariance to find a proper regularization parameter with singular values calculated in the last iteration. We plot the singular values from large to small to form a singular value plot. A proper regularization parameter is normally the first singular value that approaches zero in the plot. With this regularization parameter, we obtain a trade-off solution between model resolution and model covariance in the vicinity of a regularized solution. The unit covariance matrix can then be used to calculate error bars of the inverse model at a resolution level determined by the regularization parameter. We demonstrate this approach with both synthetic and real surface-wave data.
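
    A small sketch of the singular-value-based choice described above, for a generic linear system G m = d; the relative tolerance used to decide where singular values "approach zero" is an assumed value, and the matrix is synthetic.

      import numpy as np

      rng = np.random.default_rng(10)
      G = rng.normal(size=(40, 20)) @ np.diag(np.logspace(0, -8, 20))  # ill-posed

      U, s, Vt = np.linalg.svd(G, full_matrices=False)   # s sorted large to small
      p = int((s > s[0] * 1e-6).sum())    # keep values above an assumed tolerance
      Vp = Vt[:p].T

      R = Vp @ Vp.T                                      # model resolution matrix
      unit_cov = Vp @ np.diag(1.0 / s[:p] ** 2) @ Vp.T   # unit model covariance
      print(f"kept {p} of {len(s)} singular values")
      print("resolution diagonal (first 5):", np.round(np.diag(R)[:5], 3))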

  12. Machine learning for predicting soil classes in three semi-arid landscapes

    USGS Publications Warehouse

    Brungard, Colby W.; Boettinger, Janis L.; Duniway, Michael C.; Wills, Skye A.; Edwards, Thomas C.

    2015-01-01

    Mapping the spatial distribution of soil taxonomic classes is important for informing soil use and management decisions. Digital soil mapping (DSM) can quantitatively predict the spatial distribution of soil taxonomic classes. Key components of DSM are the method and the set of environmental covariates used to predict soil classes. Machine learning is a general term for a broad set of statistical modeling techniques. Many different machine learning models have been applied in the literature and there are different approaches for selecting covariates for DSM. However, there is little guidance as to which, if any, machine learning model and covariate set might be optimal for predicting soil classes across different landscapes. Our objective was to compare multiple machine learning models and covariate sets for predicting soil taxonomic classes at three geographically distinct areas in the semi-arid western United States of America (southern New Mexico, southwestern Utah, and northeastern Wyoming). All three areas were the focus of digital soil mapping studies. Sampling sites at each study area were selected using conditioned Latin hypercube sampling (cLHS). We compared models that had been used in other DSM studies, including clustering algorithms, discriminant analysis, multinomial logistic regression, neural networks, tree based methods, and support vector machine classifiers. Tested machine learning models were divided into three groups based on model complexity: simple, moderate, and complex. We also compared environmental covariates derived from digital elevation models and Landsat imagery that were divided into three different sets: 1) covariates selected a priori by soil scientists familiar with each area and used as input into cLHS, 2) the covariates in set 1 plus 113 additional covariates, and 3) covariates selected using recursive feature elimination. Overall, complex models were consistently more accurate than simple or moderately complex models. Random forests (RF) using covariates selected via recursive feature elimination was consistently the most accurate classifier, or among the most accurate, across study areas and across covariate sets within each study area. We recommend that for soil taxonomic class prediction, complex models and covariates selected by recursive feature elimination be used. Overall classification accuracy in each study area was largely dependent upon the number of soil taxonomic classes and the frequency distribution of pedon observations between taxonomic classes. Individual subgroup class accuracy was generally dependent upon the number of soil pedon observations in each taxonomic class. The number of soil classes is related to the inherent variability of a given area. The imbalance of soil pedon observations between classes is likely related to cLHS. Imbalanced frequency distributions of soil pedon observations between classes must be addressed to improve model accuracy. Solutions include increasing the number of soil pedon observations in classes with few observations or decreasing the number of classes. Spatial predictions using the most accurate models generally agree with expected soil–landscape relationships. Spatial prediction uncertainty was lowest in areas of relatively low relief for each study area.
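
    A condensed sketch of the best-performing workflow from the study: recursive feature elimination to select covariates, followed by a random forest classifier. Synthetic data stand in for the terrain and Landsat covariates, and the sample sizes and fold counts are assumptions.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_selection import RFECV
      from sklearn.model_selection import cross_val_score

      # Stand-in for soil pedon observations with many candidate covariates
      X, y = make_classification(n_samples=300, n_features=60, n_informative=10,
                                 n_classes=5, n_clusters_per_class=1, random_state=0)

      rf = RandomForestClassifier(n_estimators=200, random_state=0)
      selector = RFECV(rf, step=5, cv=3).fit(X, y)   # recursive feature elimination
      acc = cross_val_score(rf, X[:, selector.support_], y, cv=3).mean()
      print(f"selected {selector.n_features_} covariates, CV accuracy = {acc:.2f}")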

  13. Major Decision Points in Library Automation

    ERIC Educational Resources Information Center

    Veaner, Allen B.

    1970-01-01

    Based on a longer, more detailed paper prepared for the 1970 Midwinter Meeting of the Association of Research Libraries, this article discusses automation in the context of the management, facilities, and system requirements for large research libraries. (Author/NH)

  14. A lanthipeptide library used to identify a protein-protein interaction inhibitor.

    PubMed

    Yang, Xiao; Lennard, Katherine R; He, Chang; Walker, Mark C; Ball, Andrew T; Doigneaux, Cyrielle; Tavassoli, Ali; van der Donk, Wilfred A

    2018-04-01

    In this article we describe the production and screening of a genetically encoded library of 10(6) lanthipeptides in Escherichia coli using the substrate-tolerant lanthipeptide synthetase ProcM. This plasmid-encoded library was combined with a bacterial reverse two-hybrid system for the interaction of the HIV p6 protein with the UEV domain of the human TSG101 protein, which is a critical protein-protein interaction for HIV budding from infected cells. Using this approach, we identified an inhibitor of this interaction from the lanthipeptide library, whose activity was verified in vitro and in cell-based virus-like particle-budding assays. Given the variety of lanthipeptide backbone scaffolds that may be produced with ProcM, this method may be used for the generation of genetically encoded libraries of natural product-like lanthipeptides containing substantial structural diversity. Such libraries may be combined with any cell-based assay to identify lanthipeptides with new biological activities.

  15. The Value of Library and Information Services in Nursing and Patient Care.

    PubMed

    Gard Marshall, Joanne; Morgan, Jennifer; Klem, Mary Lou; Thompson, Cheryl; Wells, Amber

    2014-08-18

    Libraries are a primary resource for evidence-based practice. This study, using a critical incident survey administered to 6,788 nurses at 118 hospitals, sought to explore the influence of nurses' use of library resources on both nursing and patient outcomes. In this article, the authors describe the background events motivating this study, the survey methods used, and the study results. They also discuss their findings, noting that use of library resources showed consistently positive relationships with changing advice given to patients, handling patient care differently, avoiding adverse events, and saving time. The authors discuss the study limitations and conclude that the availability and use of library and information resources and services had a positive impact on nursing and patient outcomes, and that nurse managers play an important role both by encouraging nurses to use evidence-based library resources and services and by supporting the availability of these resources in healthcare settings.

  16. Real-time implementation of optimized maximum noise fraction transform for feature extraction of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Wu, Yuanfeng; Gao, Lianru; Zhang, Bing; Zhao, Haina; Li, Jun

    2014-01-01

    We present a parallel implementation of the optimized maximum noise fraction (G-OMNF) transform algorithm for feature extraction of hyperspectral images on commodity graphics processing units (GPUs). The proposed approach explored the algorithm's data-level concurrency and optimized the computing flow. We first defined a three-dimensional grid, in which each thread calculates a sub-block of data to easily facilitate the spatial and spectral neighborhood data searches in noise estimation, which is one of the most important steps involved in OMNF. Then, we optimized the processing flow and computed the noise covariance matrix before computing the image covariance matrix to reduce the original hyperspectral image data transmission. These optimization strategies can greatly improve the computing efficiency and can be applied to other feature extraction algorithms. The proposed parallel feature extraction algorithm was implemented on an Nvidia Tesla GPU using the compute unified device architecture and basic linear algebra subroutines library. Through the experiments on several real hyperspectral images, our GPU parallel implementation provides a significant speedup of the algorithm compared with the CPU implementation, especially for highly data parallelizable and arithmetically intensive algorithm parts, such as noise estimation. In order to further evaluate the effectiveness of G-OMNF, we used two different applications: spectral unmixing and classification. Considering the sensor scanning rate and the data acquisition time, the proposed parallel implementation met the on-board real-time feature extraction requirement.
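
    The computing flow described above (noise covariance from neighbor differences first, then the image covariance, then the maximum-SNR components) can be sketched on the CPU with NumPy/SciPy. The cube and the noise estimator below are illustrative assumptions, not the authors' GPU kernels.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # Hypothetical hyperspectral cube: rows x cols x bands (CPU stand-in for
    # the GPU kernel flow described in the abstract).
    rng = np.random.default_rng(1)
    cube = rng.standard_normal((64, 64, 40))

    # Step 1: noise estimation from horizontal neighbor differences; the noise
    # covariance is computed before the (larger) image covariance.
    diff = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, 40) / np.sqrt(2.0)
    cov_noise = np.cov(diff, rowvar=False)

    # Step 2: image covariance of the pixel spectra.
    X = cube.reshape(-1, 40)
    cov_image = np.cov(X, rowvar=False)

    # Step 3: MNF as a generalized eigenproblem; eigenvectors ordered by SNR.
    eigvals, eigvecs = eigh(cov_image, cov_noise)
    order = np.argsort(eigvals)[::-1]
    mnf = (X - X.mean(axis=0)) @ eigvecs[:, order][:, :10]  # first 10 components
    ```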

  17. Survival analysis with functional covariates for partial follow-up studies.

    PubMed

    Fang, Hong-Bin; Wu, Tong Tong; Rapoport, Aaron P; Tan, Ming

    2016-12-01

    Predictive or prognostic analysis plays an increasingly important role in the era of personalized medicine to identify subsets of patients whom the treatment may benefit the most. Although various time-dependent covariate models are available, such models require that covariates be followed over the whole follow-up period. This article studies a new class of functional survival models where the covariates are only monitored in a time interval that is shorter than the whole follow-up period. This paper is motivated by the analysis of a longitudinal study on advanced myeloma patients who received stem cell transplants and T cell infusions after the transplants. The absolute lymphocyte cell counts were collected serially during hospitalization. Those patients are still followed up if they are alive after hospitalization, while their absolute lymphocyte cell counts cannot be measured after that. Another complication is that absolute lymphocyte cell counts are sparsely and irregularly measured. The conventional method using Cox model with time-varying covariates is not applicable because of the different lengths of observation periods. Analysis based on each single observation obviously underutilizes available information and, more seriously, may yield misleading results. This so-called partial follow-up study design represents an increasingly common predictive modeling problem where we have serial multiple biomarkers up to a certain time point, which is shorter than the total length of follow-up. We therefore propose a solution to the partial follow-up design. The new method combines functional principal components analysis and survival analysis with selection of those functional covariates. It also has the advantage of handling sparse and irregularly measured longitudinal observations of covariates and measurement errors. Our analysis based on functional principal components reveals that it is the patterns of the trajectories of absolute lymphocyte cell counts, instead of the actual counts, that affect patients' disease-free survival time. © The Author(s) 2014.
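
    A minimal sketch of the proposed pipeline, FPCA on the monitored trajectory segment followed by a Cox model on the component scores, is given below. It assumes trajectories already smoothed onto a common grid (the paper's sparse, irregular case needs a dedicated FPCA smoother), uses the lifelines package for the survival fit, and simulates all data.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n, grid = 100, 15   # patients x common time grid during hospitalization

    # Hypothetical ALC trajectories already smoothed onto a common grid; real
    # sparse/irregular data would first need a PACE-style FPCA smoother.
    curves = np.cumsum(rng.standard_normal((n, grid)), axis=1)

    # Functional principal component scores summarize each trajectory's shape.
    scores = PCA(n_components=3).fit_transform(curves)

    df = pd.DataFrame(scores, columns=["fpc1", "fpc2", "fpc3"])
    df["time"] = rng.exponential(24, n)        # disease-free survival (months)
    df["event"] = rng.integers(0, 2, n)        # 1 = event observed

    # Cox model on FPC scores: trajectory patterns, not raw counts, enter the hazard.
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
    cph.print_summary()
    ```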

  18. The Role of the Library Media Specialist in Standards-Based Learning.

    ERIC Educational Resources Information Center

    Corey, Linda

    2002-01-01

    Discusses the role of the school library media specialist in standards-based learning. Topics include standards-based assessment; information literacy standards; collaboration with classroom teachers; benchmarks and indicators for student performance; leadership in a standards-based climate; and the use of technology to support curriculum and…

  19. Factors Affecting the Mental Development of Very Low Birthweight Infants: An Evaluation Based Primarily on Covariance Structure Analysis.

    ERIC Educational Resources Information Center

    Honjo, Shuji; And Others

    1998-01-01

    Evaluated statistically the effect of intranatal and early postnatal period factors on mental development of very low-birth-weight infants. Covariance structure analysis revealed direct influence of birth weight and gestational age in weeks on mental development at age 1, and of ophthalmological aberrations and respiratory disorder on mental…

  20. Technical Report of: Assessing Teacher Preparation Program Effectiveness--A Pilot Examination of Value Added Approaches

    ERIC Educational Resources Information Center

    Noell, George H.

    2004-01-01

    A preliminary set of analyses was conducted linking students to courses and courses to teachers based upon data collected by the Louisiana Department of Education's Divisions of Planning, Analysis, and Information Resources and Student Standards and Assessments. An analysis of covariance, a weighted analysis of covariance, and a hierarchical…
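
    A hedged sketch of the first two analyses named above, analysis of covariance and its weighted variant, using statsmodels; the variables and weights are hypothetical stand-ins for the report's Louisiana data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 200

    # Hypothetical records: student test score, prior-year score as the
    # covariate, and the preparation program that trained the teacher.
    df = pd.DataFrame({
        "score": rng.normal(70, 10, n),
        "prior": rng.normal(65, 10, n),
        "program": rng.choice(["A", "B", "C"], n),
    })

    # Analysis of covariance: program effect adjusted for prior achievement.
    ancova = smf.ols("score ~ C(program) + prior", data=df).fit()

    # Weighted analysis of covariance, e.g., weighting by class size.
    weights = rng.integers(10, 31, n)
    wancova = smf.wls("score ~ C(program) + prior", data=df, weights=weights).fit()
    print(ancova.summary())
    ```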

  1. Eigenvalue-eigenvector decomposition (EED) analysis of dissimilarity and covariance matrix obtained from total synchronous fluorescence spectral (TSFS) data sets of herbal preparations: Optimizing the classification approach.

    PubMed

    Tarai, Madhumita; Kumar, Keshav; Divya, O; Bairi, Partha; Mishra, Kishor Kumar; Mishra, Ashok Kumar

    2017-09-05

    The present work compares the dissimilarity- and covariance-based unsupervised chemometric classification approaches by taking the total synchronous fluorescence spectroscopy data sets acquired for the cumin- and non-cumin-based herbal preparations. The conventional decomposition method involves eigenvalue-eigenvector analysis of the covariance of the data set and finds the factors that can explain the overall major sources of variation present in the data set. The conventional approach does this irrespective of the fact that the samples belong to intrinsically different groups and hence leads to poor class separation. The present work shows that classification of such samples can be optimized by performing the eigenvalue-eigenvector decomposition on the pair-wise dissimilarity matrix. Copyright © 2017 Elsevier B.V. All rights reserved.
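
    The contrast drawn above can be reproduced in a few lines: eigenvalue-eigenvector decomposition of the covariance matrix (conventional PCA) versus decomposition of the pairwise dissimilarity matrix. The two-group spectra below are simulated stand-ins for the TSFS data sets.

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    rng = np.random.default_rng(4)
    # Stand-in for TSFS intensities: samples x wavelength channels, two groups.
    A = rng.normal(0.0, 1.0, (20, 50)) + np.linspace(0, 1, 50)
    B = rng.normal(0.3, 1.0, (20, 50)) - np.linspace(0, 1, 50)
    X = np.vstack([A, B])

    # Conventional route: eigendecomposition of the covariance matrix (PCA).
    Xc = X - X.mean(axis=0)
    w_cov, v_cov = np.linalg.eigh(np.cov(Xc, rowvar=False))
    scores_cov = Xc @ v_cov[:, ::-1][:, :2]

    # Proposed route: eigendecomposition of the pairwise dissimilarity matrix,
    # which emphasizes between-sample (and hence between-group) differences.
    D = squareform(pdist(X, metric="euclidean"))
    w_dis, v_dis = np.linalg.eigh(D)
    scores_dis = v_dis[:, np.argsort(np.abs(w_dis))[::-1][:2]]  # leading components
    ```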

  2. The use of quality benchmarking in assessing web resources for the dermatology virtual branch library of the National electronic Library for Health (NeLH).

    PubMed

    Kamel Boulos, M N; Roudsari, A V; Gordon, C; Muir Gray, J A

    2001-01-01

    In 1998, the U.K. National Health Service Information for Health Strategy proposed the implementation of a National electronic Library for Health to provide clinicians, healthcare managers and planners, patients and the public with easy, round the clock access to high quality, up-to-date electronic information on health and healthcare. The Virtual Branch Libraries are among the most important components of the National electronic Library for Health. They aim at creating online knowledge-based communities, each concerned with some specific clinical and other health-related topics. This study is about the envisaged Dermatology Virtual Branch Libraries of the National electronic Library for Health. It aims at selecting suitable dermatology Web resources for inclusion in the forthcoming Virtual Branch Libraries after establishing preliminary quality benchmarking rules for this task. Psoriasis, being a common dermatological condition, has been chosen as a starting point. Because quality is a principal concern of the National electronic Library for Health, the study includes a review of the major quality benchmarking systems available today for assessing health-related Web sites. The methodology of developing a quality benchmarking system has also been reviewed. Aided by metasearch Web tools, candidate resources were hand-selected in light of the reviewed benchmarking systems and specific criteria set by the authors. Over 90 professional and patient-oriented Web resources on psoriasis and dermatology in general are suggested for inclusion in the forthcoming Dermatology Virtual Branch Libraries. The idea of an all-in knowledge-hallmarking instrument for the National electronic Library for Health is also proposed based on the reviewed quality benchmarking systems. Skilled, methodical, organized human reviewing, selection and filtering based on well-defined quality appraisal criteria seem likely to be the key ingredient in the envisaged National electronic Library for Health service. Furthermore, by promoting the application of agreed quality guidelines and codes of ethics by all health information providers and not just within the National electronic Library for Health, the overall quality of the Web will improve with time and the Web will ultimately become a reliable and integral part of the care space.

  3. Academic health sciences library Website navigation: an analysis of forty-one Websites and their navigation tools

    PubMed Central

    Brower, Stewart M.

    2004-01-01

    Background: The analysis included forty-one academic health sciences library (HSL) Websites as captured in the first two weeks of January 2001. Home pages and persistent navigational tools (PNTs) were analyzed for layout, technology, and links, and other general site metrics were taken. Methods: Websites were selected based on rank in the National Network of Libraries of Medicine, with regional and resource libraries given preference on the basis that these libraries are recognized as leaders in their regions and would be the most reasonable source of standards for best practice. A three-page evaluation tool was developed based on previous similar studies. All forty-one sites were evaluated in four specific areas: library general information, Website aids and tools, library services, and electronic resources. Metrics taken for electronic resources included orientation of bibliographic databases alphabetically by title or by subject area and with links to specifically named databases. Results: Based on the results, a formula for determining obligatory links was developed, listing items that should appear on all academic HSL Web home pages and PNTs. Conclusions: These obligatory links demonstrate a series of best practices that may be followed in the design and construction of academic HSL Websites. PMID:15494756

  4. Nursing staff connect libraries with improving patient care but not with achieving organisational objectives: a grounded theory approach.

    PubMed

    Chamberlain, David; Brook, Richard

    2014-03-01

    Health organisations are often driven by specific targets defined by mission statements, aims and objectives to improve patient care. Health libraries need to demonstrate that they contribute to organisational objectives, but it is not clear how nurses view that contribution. To investigate ward nursing staff motivations, their awareness of ward and organisational objectives; and their attitudes towards the contribution of health library services to improving patient care. Qualitative research using focus group data was combined with content analysis of literature evidence and library statistics (quantitative data). Data were analysed using thematic coding, divided into five group themes: understanding of trust, ward and personal objectives; use of the library; use of other information sources; quality; and issues. Four basic social-psychological processes were then developed. Behaviour indicates low awareness of organisational objectives despite patient-centric motivation. High awareness of library services is shown with some connection made by ward staff between improved knowledge and improved patient care. There was a two-tiered understanding of ward objectives and library services, based on level of seniority. However, an evidence-based culture needs to be intrinsic in the organisation before all staff benefit. Libraries can actively engage in this at ward and board level and improve patient care by supporting organisational objectives. © 2014 The author. Health Information and Libraries Journal © 2014 Health Libraries Group.

  5. Structure-Based Virtual Screening of Commercially Available Compound Libraries.

    PubMed

    Kireev, Dmitri

    2016-01-01

    Virtual screening (VS) is an efficient hit-finding tool. Its distinctive strength is that it allows one to screen compound libraries that are not available in the lab. Moreover, structure-based (SB) VS also enables an understanding of how the hit compounds bind the protein target, thus laying the groundwork for rational hit-to-lead progression. SBVS requires a very limited experimental effort and is particularly well suited for academic labs and small biotech companies that, unlike pharmaceutical companies, do not have physical access to quality small-molecule libraries. Here, we describe SBVS of commercial compound libraries for Mer kinase inhibitors. The screening protocol relies on the docking algorithm Glide complemented by a post-docking filter based on structural protein-ligand interaction fingerprints (SPLIF).
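
    Glide and SPLIF are proprietary or specialized tools, so the post-docking filter below is only a conceptual stand-in: a Morgan-fingerprint Tanimoto filter in RDKit that keeps virtual hits resembling a reference active. The SMILES strings and the 0.3 threshold are hypothetical (a true SPLIF filter operates on the docked 3D protein-ligand complexes, not 2D structures).

    ```python
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    # Hypothetical docking output: SMILES of top-scoring virtual hits and a
    # made-up reference active standing in for a known Mer kinase inhibitor.
    reference = Chem.MolFromSmiles("c1ccc2nc(Nc3ccccc3)ncc2c1")
    hits = ["CCOc1ccc2nc(N)ncc2c1", "c1ccc2[nH]cnc2c1", "CC(=O)Nc1ccccc1"]

    ref_fp = AllChem.GetMorganFingerprintAsBitVect(reference, 2, nBits=2048)

    for smi in hits:
        mol = Chem.MolFromSmiles(smi)
        fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)
        sim = DataStructs.TanimotoSimilarity(ref_fp, fp)
        if sim > 0.3:                   # keep hits resembling the reference
            print(f"{smi}  Tanimoto={sim:.2f}")
    ```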

  6. Differential Age-Related Changes in Structural Covariance Networks of Human Anterior and Posterior Hippocampus.

    PubMed

    Li, Xinwei; Li, Qiongling; Wang, Xuetong; Li, Deyu; Li, Shuyu

    2018-01-01

    The hippocampus plays an important role in memory function relying on information interaction between distributed brain areas. The hippocampus can be divided into the anterior and posterior sections with different structure and function along its long axis. The aim of this study is to investigate the effects of normal aging on the structural covariance of the anterior hippocampus (aHPC) and the posterior hippocampus (pHPC). In this study, 240 healthy subjects aged 18-89 years were selected and subdivided into young (18-23 years), middle-aged (30-58 years), and older (61-89 years) groups. The aHPC and pHPC were divided based on the location of the uncal apex in the MNI space. Then, the structural covariance networks were constructed by examining their covariance in gray matter volumes with other brain regions. Finally, the influence of age on the structural covariance of these hippocampal sections was explored. We found that the aHPC and pHPC had different structural covariance patterns, but both of them were associated with the medial temporal lobe and insula. Moreover, both increased and decreased covariances were found with the aHPC but only increased covariance was found with the pHPC with age (p < 0.05, family-wise error corrected). These decreased connections occurred within the default mode network, while the increased connectivity mainly occurred in other memory systems that differ from the hippocampus. This study reveals different age-related influences on the structural networks of the aHPC and pHPC, providing an essential insight into the mechanisms of the hippocampus in normal aging.
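
    Structural covariance itself reduces to correlating a seed region's grey matter volume with every other region across subjects. A simulated sketch, with column 0 playing the role of the aHPC seed (the study does this voxel-wise with age-group contrasts and FWE correction):

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_subj, n_regions = 240, 90

    # Hypothetical regional grey matter volumes (subjects x atlas regions);
    # column 0 plays the role of the anterior-hippocampus seed.
    gm = rng.standard_normal((n_subj, n_regions))
    gm[:, 1:10] += 0.6 * gm[:, [0]]      # inject covariance with the seed

    seed = gm[:, 0]
    covariance_map = np.array([np.corrcoef(seed, gm[:, j])[0, 1]
                               for j in range(1, n_regions)])
    print(covariance_map[:9].round(2))   # regions co-varying with the seed
    ```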

  7. ENDF/B-VII.1 Neutron Cross Section Data Testing with Critical Assembly Benchmarks and Reactor Experiments

    NASA Astrophysics Data System (ADS)

    Kahler, A. C.; MacFarlane, R. E.; Mosteller, R. D.; Kiedrowski, B. C.; Frankle, S. C.; Chadwick, M. B.; McKnight, R. D.; Lell, R. M.; Palmiotti, G.; Hiruta, H.; Herman, M.; Arcilla, R.; Mughabghab, S. F.; Sublet, J. C.; Trkov, A.; Trumbull, T. H.; Dunn, M.

    2011-12-01

    The ENDF/B-VII.1 library is the latest revision to the United States' Evaluated Nuclear Data File (ENDF). The ENDF library is currently in its seventh generation, with ENDF/B-VII.0 being released in 2006. This revision expands upon that library, including the addition of new evaluated files (previously 393 neutron files, now 423, including the replacement of elemental vanadium and zinc evaluations with isotopic evaluations) and extension or updating of many existing neutron data files. Complete details are provided in the companion paper [M. B. Chadwick et al., "ENDF/B-VII.1 Nuclear Data for Science and Technology: Cross Sections, Covariances, Fission Product Yields and Decay Data," Nuclear Data Sheets, 112, 2887 (2011)]. This paper focuses on how accurately application libraries may be expected to perform in criticality calculations with these data. Continuous energy cross section libraries, suitable for use with the MCNP Monte Carlo transport code, have been generated and applied to a suite of nearly one thousand critical benchmark assemblies defined in the International Criticality Safety Benchmark Evaluation Project's International Handbook of Evaluated Criticality Safety Benchmark Experiments. This suite covers uranium and plutonium fuel systems in a variety of forms such as metallic, oxide or solution, and under a variety of spectral conditions, including unmoderated (i.e., bare), metal reflected and water or other light element reflected. Assembly eigenvalues that were accurately predicted with ENDF/B-VII.0 cross sections such as unmoderated and uranium reflected 235U and 239Pu assemblies, HEU solution systems and LEU oxide lattice systems that mimic commercial PWR configurations continue to be accurately calculated with ENDF/B-VII.1 cross sections, and deficiencies in predicted eigenvalues for assemblies containing selected materials, including titanium, manganese, cadmium and tungsten are greatly reduced. Improvements are also confirmed for selected actinide reaction rates such as 236U, 238,242Pu and 241,243Am capture in fast systems. Other deficiencies, such as the overprediction of Pu solution system critical eigenvalues and a decreasing trend in calculated eigenvalue for 233U fueled systems as a function of Above-Thermal Fission Fraction remain. The comprehensive nature of this critical benchmark suite and the generally accurate calculated eigenvalues obtained with ENDF/B-VII.1 neutron cross sections support the conclusion that this is the most accurate general purpose ENDF/B cross section library yet released to the technical community.

  8. Availability and accessibility of evidence-based information resources provided by medical libraries in Australia.

    PubMed

    Ritchie, A; Sowter, B

    2000-01-01

    This article reports on the results of an exploratory survey of the availability and accessibility of evidence-based information resources provided by medical libraries in Australia. Although barriers impede access to evidence-based information for hospital clinicians, the survey revealed that Medline and Cinahl are available in over 90% of facilities. In most cases they are widely accessible via internal networks and the Internet. The Cochrane Library is available in 69% of cases. The Internet is widely accessible and most libraries provide access to some full-text, electronic journals. Strategies for overcoming restrictions and integrating information resources with clinical workflow are being pursued. State, regional and national public and private consortia are developing agreements utilising on-line technology. These could produce cost savings and more equitable access to a greater range of evidence-based resources.

  9. The Weakest Link: Library Catalogs.

    ERIC Educational Resources Information Center

    Young, Terrence E., Jr.

    2002-01-01

    Describes methods of correcting MARC records in online public access catalogs in school libraries. Highlights include in-house methods; professional resources; conforming to library cataloging standards; vendor services, including Web-based services; software specifically developed for record cleanup; and outsourcing. (LRW)

  10. Evaluating the Effect of Web-Based Iranian Diabetic Personal Health Record App on Self-Care Status and Clinical Indicators: Randomized Controlled Trial.

    PubMed

    Azizi, Amirabbas; Aboutorabi, Robab; Mazloum-Khorasani, Zahra; Afzal-Aghaea, Monavar; Tabesh, Hamed; Tara, Mahmood

    2016-10-21

    There are 4 main types of chronic or noncommunicable diseases. Of these, diabetes is one of the major therapeutic concerns globally. Moreover, Iran is among the countries with the highest incidence of diabetic patients. Furthermore, library-based searches by the researchers showed that thus far no study had been carried out to evaluate the relationship between Web-based diabetic personal health records (DPHR) and self-care indicators in Iran. The objective of this study is to examine the effect of Web-based DPHR on self-care status of diabetic patients in an intervention group as compared with a control group. The effect of DPHR on self-care was assessed by using a randomized controlled trial (RCT) protocol for a 2-arm parallel group with a 1:1 allocation ratio. During a 4-month trial period, the control group benefited from the routine care; the intervention group additionally had access to the Web-based DPHR app besides routine care. During the trial, 2 time points at baseline and postintervention were used to evaluate the impact of the DPHR app. A sample size of 72 people was randomly and equally assigned to both the control and intervention groups. The primary outcome measure was the self-care status of the participants. Test results showed that the self-care status in the intervention group in comparison with the control group had a significant difference. In addition, the dimensions of self-care, including normal values, change trends, the last measured value, and the time of the last measurement, had a significant difference while other dimensions had no significant difference. Furthermore, we found no correlation between the Web-based DPHR system and covariates, including scores of weight, glycated hemoglobin (HbA1c), serum creatinine, high-density lipoprotein (HDL), low-density lipoprotein (LDL), total cholesterol, and planned visit adherence, as well as the change trend of mean for blood glucose and blood pressure. We found that as a result of the Web-based DPHR app, the self-care scores in the intervention group were significantly higher than those of the control group. In total, we found no correlation between the Web-based DPHR app and covariates, including planned visit adherence, HbA1c, serum creatinine, HDL, LDL, total cholesterol, weight, and the change trend of mean for blood glucose and blood pressure. Iranian Registry of Clinical Trials (IRCT): 2013082914522N1; http://www.irct.ir/searchresult.php?id=14522&number=1 (Archived by WebCite at http://www.webcitation.org/6cC4PCcau).

  11. Dominant genetics using a yeast genomic library under the control of a strong inducible promoter.

    PubMed

    Ramer, S W; Elledge, S J; Davis, R W

    1992-12-01

    In Saccharomyces cerevisiae, numerous genes have been identified by selection from high-copy-number libraries based on "multicopy suppression" or other phenotypic consequences of overexpression. Although fruitful, this approach suffers from two major drawbacks. First, high copy number alone may not permit high-level expression of tightly regulated genes. Conversely, other genes expressed in proportion to dosage cannot be identified if their products are toxic at elevated levels. This work reports construction of a genomic DNA expression library for S. cerevisiae that circumvents both limitations by fusing randomly sheared genomic DNA to the strong, inducible yeast GAL1 promoter, which can be regulated by carbon source. The library obtained contains 5 x 10(7) independent recombinants, representing a breakpoint at every base in the yeast genome. This library was used to examine aberrant gene expression in S. cerevisiae. A screen for dominant activators of yeast mating response identified eight genes that activate the pathway in the absence of exogenous mating pheromone, including one previously unidentified gene. One activator was a truncated STE11 gene lacking approximately 1000 base pairs of amino-terminal coding sequence. In two different clones, the same GAL1 promoter-proximal ATG is in-frame with the coding sequence of STE11, suggesting that internal initiation of translation there results in production of a biologically active, truncated STE11 protein. Thus this library allows isolation based on dominant phenotypes of genes that might have been difficult or impossible to isolate from high-copy-number libraries.

  12. McGill Library Makes E-Books Portable: E-Reader Loan Service in a Canadian Academic Library

    ERIC Educational Resources Information Center

    Savova, Maria; Garsia, Matthew

    2012-01-01

    E-readers are increasingly popular personal devices, but can they be effectively used for the needs of academic libraries' clients? This paper employs an evidence-based approach that examines the role and efficacy of implementing an E-reader Loan Service at McGill University Library. Suggestions are offered as to what lending model and device…

  13. A Prototype System for a Computer-Based Statewide Film Library Network: A Model for Operation. Final Report.

    ERIC Educational Resources Information Center

    Bidwell, Charles M.; Auricchio, Dominick

    The project set out to establish an operational film scheduling network to improve service to New York State teachers using 16mm educational films. The Network is designed to serve local libraries located in Boards of Cooperative Educational Services (BOCES), regional libraries, and a statewide Syracuse University Film Rental Library (SUFRL). The…

  14. What Students Want: Generation Y and the Changing Function of the Academic Library

    ERIC Educational Resources Information Center

    Gardner, Susan; Eng, Susanna

    2005-01-01

    This article presents the results of a 2003 undergraduate library user survey as a case study of Generation Y. Survey data support four main traits attributed to Generation Y, which are discussed within the context of library use and satisfaction. Implications for future directions in academic library services based on the new ways Generation Y…

  15. Evaluation of Library Utilization by Students Enrolled in External Degree Programme in University of Nairobi, Kenya

    ERIC Educational Resources Information Center

    Gor, Peter Ochieng

    2012-01-01

    With the increasing popularity of distance education, focus has turned to the role of libraries in distance learning process. It is widely agreed that like their campus-based counterparts, distance education learners need adequate library services if they are to gain quality education. This study sought to examine library utilization by students…

  16. The Impact of the Economic Downturn on Libraries: With Special Reference to University Libraries

    ERIC Educational Resources Information Center

    Nicholas, David; Rowlands, Ian; Jubb, Michael; Jamali, Hamid R.

    2010-01-01

    Evidence is presented of the extent to which libraries from around the world are experiencing financial hardship as a result of the world-wide economic downturn. Comparative analyses are provided by country, sector, and size of institution. The article concentrates on the situation of UK and US university libraries and is based on…

  17. Assignment Report on Library Banks in Health Institution in Indonesia (Draft).

    ERIC Educational Resources Information Center

    Urata, Takeo

    The medical library needs of Indonesia were surveyed and recommendations for improving the existing situation are made based on the results of the survey. The survey indicates that: (1) the library collections are out of date and inadequate, (2) there is a need for more and better trained medical librarians and library assistants; (3) there is no…

  18. A Library Service Center for Suburban Maryland County Library Systems, Anne Arundel, Baltimore, Montgomery, Prince George's; An Establishment Proposal.

    ERIC Educational Resources Information Center

    Duchac, Kenneth F.

    Based on a year of inquiry and consultation, this report of the Suburban Maryland Project confirms the feasibility of cooperative technical service functions for the four public library systems of suburban Maryland. It is recommended that the proposed Library Service Center be assigned the ordering, acquisition, cataloging, preparation for book…

  19. Suboptimal schemes for atmospheric data assimilation based on the Kalman filter

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Cohn, Stephen E.

    1994-01-01

    This work is directed toward approximating the evolution of forecast error covariances for data assimilation. The performance of different algorithms based on simplification of the standard Kalman filter (KF) is studied. These are suboptimal schemes (SOSs) when compared to the KF, which is optimal for linear problems with known statistics. The SOSs considered here are several versions of optimal interpolation (OI), a scheme for height error variance advection, and a simplified KF in which the full height error covariance is advected. To employ a methodology for exact comparison among these schemes, a linear environment is maintained, in which a beta-plane shallow-water model linearized about a constant zonal flow is chosen for the test-bed dynamics. The results show that constructing dynamically balanced forecast error covariances rather than using conventional geostrophically balanced ones is essential for successful performance of any SOS. A posteriori initialization of SOSs to compensate for model - data imbalance sometimes results in poor performance. Instead, properly constructed dynamically balanced forecast error covariances eliminate the need for initialization. When the SOSs studied here make use of dynamically balanced forecast error covariances, the difference among their performances progresses naturally from conventional OI to the KF. In fact, the results suggest that even modest enhancements of OI, such as including an approximate dynamical equation for height error variances while leaving height error correlation structure homogeneous, go a long way toward achieving the performance of the KF, provided that dynamically balanced cross-covariances are constructed and that model errors are accounted for properly. The results indicate that such enhancements are necessary if unconventional data are to have a positive impact.
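
    The essential difference among the schemes compared above is whether the forecast error covariance is evolved by the dynamics or held fixed between analyses. A toy sketch follows, with a random linear model standing in for the linearized shallow-water dynamics; all matrices are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n = 20
    M = np.eye(n) + 0.05 * rng.standard_normal((n, n))  # linear dynamics (stand-in)
    Q = 0.01 * np.eye(n)                                # model error covariance
    H = np.eye(n)[::4]                                  # observe every 4th state
    R = 0.1 * np.eye(H.shape[0])
    I = np.eye(n)

    P_kf = np.eye(n)   # Kalman filter: forecast error covariance is evolved
    P_oi = np.eye(n)   # OI: prescribed covariance, never evolved

    for _ in range(10):
        P_kf = M @ P_kf @ M.T + Q                       # KF forecast step
        K = P_kf @ H.T @ np.linalg.inv(H @ P_kf @ H.T + R)
        P_kf = (I - K @ H) @ P_kf                       # KF analysis step

        K_oi = P_oi @ H.T @ np.linalg.inv(H @ P_oi @ H.T + R)
        # OI reuses the same static P_oi at every analysis time; the paper's
        # SOSs sit between these extremes (e.g., advecting height variances only).
    print(np.trace(P_kf))
    ```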

  20. Open resource metagenomics: a model for sharing metagenomic libraries.

    PubMed

    Neufeld, J D; Engel, K; Cheng, J; Moreno-Hagelsieb, G; Rose, D R; Charles, T C

    2011-11-30

    Both sequence-based and activity-based exploitation of environmental DNA have provided unprecedented access to the genomic content of cultivated and uncultivated microorganisms. Although researchers deposit microbial strains in culture collections and DNA sequences in databases, activity-based metagenomic studies typically only publish sequences from the hits retrieved from specific screens. Physical metagenomic libraries, conceptually similar to entire sequence datasets, are usually not straightforward to obtain by interested parties subsequent to publication. In order to facilitate unrestricted distribution of metagenomic libraries, we propose the adoption of open resource metagenomics, in line with the trend towards open access publishing, and similar to culture- and mutant-strain collections that have been the backbone of traditional microbiology and microbial genetics. The concept of open resource metagenomics includes preparation of physical DNA libraries, preferably in versatile vectors that facilitate screening in a diversity of host organisms, and pooling of clones so that single aliquots containing complete libraries can be easily distributed upon request. Database deposition of associated metadata and sequence data for each library provides researchers with information to select the most appropriate libraries for further research projects. As a starting point, we have established the Canadian MetaMicroBiome Library (CM(2)BL [1]). The CM(2)BL is a publicly accessible collection of cosmid libraries containing environmental DNA from soils collected from across Canada, spanning multiple biomes. The libraries were constructed such that the cloned DNA can be easily transferred to Gateway® compliant vectors, facilitating functional screening in virtually any surrogate microbial host for which there are available plasmid vectors. The libraries, which we are placing in the public domain, will be distributed upon request without restriction to members of both the academic research community and industry. This article invites the scientific community to adopt this philosophy of open resource metagenomics to extend the utility of functional metagenomics beyond initial publication, circumventing the need to start from scratch with each new research project.

  1. Open resource metagenomics: a model for sharing metagenomic libraries

    PubMed Central

    Neufeld, J.D.; Engel, K.; Cheng, J.; Moreno-Hagelsieb, G.; Rose, D.R.; Charles, T.C.

    2011-01-01

    Both sequence-based and activity-based exploitation of environmental DNA have provided unprecedented access to the genomic content of cultivated and uncultivated microorganisms. Although researchers deposit microbial strains in culture collections and DNA sequences in databases, activity-based metagenomic studies typically only publish sequences from the hits retrieved from specific screens. Physical metagenomic libraries, conceptually similar to entire sequence datasets, are usually not straightforward to obtain by interested parties subsequent to publication. In order to facilitate unrestricted distribution of metagenomic libraries, we propose the adoption of open resource metagenomics, in line with the trend towards open access publishing, and similar to culture- and mutant-strain collections that have been the backbone of traditional microbiology and microbial genetics. The concept of open resource metagenomics includes preparation of physical DNA libraries, preferably in versatile vectors that facilitate screening in a diversity of host organisms, and pooling of clones so that single aliquots containing complete libraries can be easily distributed upon request. Database deposition of associated metadata and sequence data for each library provides researchers with information to select the most appropriate libraries for further research projects. As a starting point, we have established the Canadian MetaMicroBiome Library (CM2BL [1]). The CM2BL is a publicly accessible collection of cosmid libraries containing environmental DNA from soils collected from across Canada, spanning multiple biomes. The libraries were constructed such that the cloned DNA can be easily transferred to Gateway® compliant vectors, facilitating functional screening in virtually any surrogate microbial host for which there are available plasmid vectors. The libraries, which we are placing in the public domain, will be distributed upon request without restriction to members of both the academic research community and industry. This article invites the scientific community to adopt this philosophy of open resource metagenomics to extend the utility of functional metagenomics beyond initial publication, circumventing the need to start from scratch with each new research project. PMID:22180823

  2. Spectrum-based method to generate good decoy libraries for spectral library searching in peptide identifications.

    PubMed

    Cheng, Chia-Ying; Tsai, Chia-Feng; Chen, Yu-Ju; Sung, Ting-Yi; Hsu, Wen-Lian

    2013-05-03

    As spectral library searching has received increasing attention for peptide identification, constructing good decoy spectra from the target spectra is the key to correctly estimating the false discovery rate in searching against the concatenated target-decoy spectral library. Several methods have been proposed to construct decoy spectral libraries. Most of them construct decoy peptide sequences and then generate theoretical spectra accordingly. In this paper, we propose a method, called precursor-swap, which constructs decoy spectral libraries directly at the "spectrum level" without generating decoy peptide sequences, by swapping the precursors of two spectra selected according to a very simple rule. Our spectrum-based method does not require additional efforts to deal with ion types (e.g., a, b or c ions), fragment mechanism (e.g., CID or ETD), or unannotated peaks, but preserves many spectral properties. The precursor-swap method is evaluated on different spectral libraries and the obtained decoy ratios show that it is comparable to other methods. Notably, it is efficient in time and memory usage for constructing decoy libraries. A software tool called Precursor-Swap-Decoy-Generation (PSDG) is publicly available for download at http://ms.iis.sinica.edu.tw/PSDG/.
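
    A simplified sketch of the spectrum-level idea follows; the pairing rule used here (swap precursors of randomly paired spectra whose precursor masses differ by more than a tolerance) is an assumption standing in for the paper's rule, and the spectra are hypothetical.

    ```python
    import random

    def precursor_swap(library, tol=25.0, seed=0):
        """Build a decoy library by swapping precursor m/z between spectrum
        pairs. `library` is a list of dicts with 'precursor_mz' and 'peaks';
        fragment peaks are left untouched, preserving spectral properties."""
        rng = random.Random(seed)
        decoys = [dict(s, peaks=list(s["peaks"])) for s in library]
        idx = list(range(len(decoys)))
        rng.shuffle(idx)
        for a, b in zip(idx[::2], idx[1::2]):
            # Simplified stand-in rule: only swap sufficiently distant precursors.
            if abs(decoys[a]["precursor_mz"] - decoys[b]["precursor_mz"]) > tol:
                decoys[a]["precursor_mz"], decoys[b]["precursor_mz"] = (
                    decoys[b]["precursor_mz"], decoys[a]["precursor_mz"])
        return decoys

    target = [{"precursor_mz": 450.7, "peaks": [(175.1, 1.0), (263.1, 0.4)]},
              {"precursor_mz": 612.3, "peaks": [(147.1, 0.9), (358.2, 0.6)]}]
    decoy = precursor_swap(target)
    ```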

  3. ChemScreener: A Distributed Computing Tool for Scaffold based Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Vyas, Renu

    2015-01-01

    In this work we present ChemScreener, a Java-based application to perform virtual library generation combined with virtual screening in a platform-independent distributed computing environment. ChemScreener comprises a scaffold identifier, a distinct scaffold extractor, an interactive virtual library generator as well as a virtual screening module for subsequently selecting putative bioactive molecules. The virtual libraries are annotated with chemophore-, pharmacophore- and toxicophore-based information for compound prioritization. The hits selected can then be further processed using QSAR, docking and other in silico approaches which can all be interfaced within the ChemScreener framework. As a sample application, in this work scaffold selectivity, diversity, connectivity and promiscuity towards six important therapeutic classes have been studied. In order to illustrate the computational power of the application, 55 scaffolds extracted from 161 anti-psychotic compounds were enumerated to produce a virtual library comprising 118 million compounds (17 GB) and annotated with chemophore, pharmacophore and toxicophore based features in a single step which would be non-trivial to perform with many standard software tools today on libraries of this size.
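
    ChemScreener itself is a Java application; as a language-neutral illustration of its first step, distinct scaffold extraction, the sketch below uses RDKit's Murcko framework routine on hypothetical SMILES (the anti-psychotic input set is not reproduced here).

    ```python
    from rdkit import Chem
    from rdkit.Chem.Scaffolds import MurckoScaffold

    # Hypothetical input molecules; the set of distinct Murcko frameworks
    # approximates a "distinct scaffold extractor".
    smiles = ["CC(=O)Nc1ccc(O)cc1", "O=C(O)c1ccccc1OC(C)=O",
              "Clc1ccc2nc(N3CCNCC3)ccc2c1"]

    scaffolds = set()
    for smi in smiles:
        mol = Chem.MolFromSmiles(smi)
        core = MurckoScaffold.GetScaffoldForMol(mol)
        scaffolds.add(Chem.MolToSmiles(core))

    print(scaffolds)   # unique scaffold SMILES, ready for enumeration/annotation
    ```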

  4. Network-Based Detection and Classification of Seismovolcanic Tremors: Example From the Klyuchevskoy Volcanic Group in Kamchatka

    NASA Astrophysics Data System (ADS)

    Soubestre, Jean; Shapiro, Nikolai M.; Seydoux, Léonard; de Rosny, Julien; Droznin, Dmitry V.; Droznina, Svetlana Ya.; Senyukov, Sergey L.; Gordeev, Evgeniy I.

    2018-01-01

    We develop a network-based method for detecting and classifying seismovolcanic tremors. The proposed approach exploits the coherence of tremor signals across the network that is estimated from the array covariance matrix. The method is applied to four and a half years of continuous seismic data recorded by 19 permanent seismic stations in the vicinity of the Klyuchevskoy volcanic group in Kamchatka (Russia), where five volcanoes were erupting during the considered time period. We compute and analyze daily covariance matrices together with their eigenvalues and eigenvectors. As a first step, most coherent signals corresponding to dominating tremor sources are detected based on the width of the covariance matrix eigenvalues distribution. Thus, volcanic tremors of the two volcanoes known as most active during the considered period, Klyuchevskoy and Tolbachik, are efficiently detected. As a next step, we consider the daily array covariance matrix's first eigenvector. Our main hypothesis is that these eigenvectors represent the principal components of the daily seismic wavefield and, for days with tremor activity, characterize dominant tremor sources. Those daily first eigenvectors, which can be used as network-based fingerprints of tremor sources, are then grouped into clusters using correlation coefficient as a measure of the vector similarity. As a result, we identify seven clusters associated with different periods of activity of four volcanoes: Tolbachik, Klyuchevskoy, Shiveluch, and Kizimen. The developed method does not require a priori knowledge and is fully automatic; and the database of the network-based tremor fingerprints can be continuously enriched with newly available data.
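
    The detection statistic described above, the width of the covariance matrix eigenvalue distribution, together with the first-eigenvector fingerprint, can be sketched on simulated network data as follows (the paper computes these quantities per frequency band on real daily recordings):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_sta, n_samp = 19, 4096

    # Hypothetical day of network data: a coherent tremor-like signal buried
    # in station noise (stand-in for the 19-station Klyuchevskoy network).
    source = rng.standard_normal(n_samp)
    data = 0.5 * np.outer(rng.standard_normal(n_sta), np.ones(n_samp)) * source \
           + rng.standard_normal((n_sta, n_samp))

    # Array covariance matrix (here in the time domain), then its eigenvalues.
    C = data @ data.T / n_samp
    eigvals, eigvecs = np.linalg.eigh(C)
    eigvals = eigvals[::-1] / eigvals.sum()   # sorted descending, normalized

    # Spectral width of the eigenvalue distribution: small when one coherent
    # source dominates (tremor), large for incoherent noise.
    width = np.sum(np.arange(n_sta) * eigvals)
    fingerprint = eigvecs[:, -1]   # first eigenvector = daily tremor fingerprint
    print(width)
    ```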

  5. An Investigation of the Educational Needs of Health Sciences Library Manpower II. Health-Related Institutions and Their Library Resources *

    PubMed Central

    Rothenberg, Lesliebeth; Rees, Alan M.; Kronick, David A.

    1970-01-01

    As part of an investigation of health sciences library manpower, the universe of health-related institutions and programs (excluding hospitals) was surveyed by postcard questionnaire to produce an inventory and description of libraries providing services to these institutions and programs. Seventy-six percent (5,215) of the institutions reported access to library resources, indicating usage of some 2,207 non-hospital libraries. Eighty percent (2,431) of the institutions reported that the library used was “within” their own institution; 20 percent (608) noted that the library was “outside” of their institution. The distribution of health-related institutions and libraries is shown by RML districts, together with relevant census data. A classification of libraries, based on the degree of involvement of the libraries' facilities, resources and personnel in supplying services to health-related institutions, was developed. It is concluded that projections of manpower needs should take into account institutions and programs not at present possessing health sciences libraries as well as documented demand in existing health sciences libraries. PMID:5496236

  6. A Balancing Act.

    ERIC Educational Resources Information Center

    Bilal, Dania; Barry, Jeff; Penniman, W. David

    1999-01-01

    Reviews automated-systems activities in libraries during the past year and profiles major vendors. Topics include new partnership arrangements driven by competition; library-systems revenues; Y2K issues; Windows-based interfaces; consulting; outsourcing; development trends; global system sales; and sales by type of library. (LRW)

  7. Measuring patrons' technology habits: an evidence-based approach to tailoring library services

    PubMed Central

    Wu, Jin; Chatfield, Amy J.; Hughes, Annie M.; Kysh, Lynn; Rosenbloom, Megan Curran

    2014-01-01

    Librarians continually integrate new technologies into library services for health sciences students. Recently published data are lacking about student ownership of technological devices, awareness of new technologies, and interest in using devices and technologies to interact with the library. A survey was implemented at seven health sciences libraries to help answer these questions. Results show that librarian assumptions about awareness of technologies are not supported, and student interest in using new technologies to interact with the library varies widely. Collecting this evidence provides useful information for successfully integrating technologies into library services. PMID:24860272

  8. Measuring patrons' technology habits: an evidence-based approach to tailoring library services.

    PubMed

    Wu, Jin; Chatfield, Amy J; Hughes, Annie M; Kysh, Lynn; Rosenbloom, Megan Curran

    2014-04-01

    Librarians continually integrate new technologies into library services for health sciences students. Recently published data are lacking about student ownership of technological devices, awareness of new technologies, and interest in using devices and technologies to interact with the library. A survey was implemented at seven health sciences libraries to help answer these questions. Results show that librarian assumptions about awareness of technologies are not supported, and student interest in using new technologies to interact with the library varies widely. Collecting this evidence provides useful information for successfully integrating technologies into library services.

  9. The AHEC library program and consortia development in California.

    PubMed

    Jensen, M A; Maddalena, B

    1986-07-01

    A brief history of the first Area Health Education Center (AHEC) Library Program in California is presented, with a description of methodology and results. The goals of this program were to develop and improve hospital library resources and services, to train hospital library personnel, and to promote resource sharing in a medically underserved area. The health sciences library consortium that evolved became a model for the ten other library consortia in the state. Based on AHEC's twelve years' experience with consortia, from 1973 to 1985, recommendations are made as to size, composition, leadership, outside funding, group participation, publicity, and linkages.

  10. The NEOUCOM Cooperative Cataloging Service: development and review of the first four years.

    PubMed Central

    Miller, D R

    1983-01-01

    The Basic Medical Sciences Library of the Northeastern Ohio Universities College of Medicine (NEOUCOM) has provided a Cooperative Cataloging Service to fourteen of its affiliated hospitals' libraries since March 1978, using the OCLC system. Analysis of the first four years of service showed that the hospital libraries spent almost $30,000 to catalog more than 18,000 titles. Personnel expenses and other costs eclipsed the savings from a 31.3% duplication rate. Centralized bibliographic control and the principal by-product of the service, a uniform, machine-readable data base, provided the foundation for an on-line integrated library system to serve the consortium. The hospital libraries contributed 44% of the unique titles in this data base, which emphasizes the need to share resources and continue cooperation. PMID:6860826

  11. The NEOUCOM Cooperative Cataloging Service: development and review of the first four years.

    PubMed

    Miller, D R

    1983-04-01

    The Basic Medical Sciences Library of the Northeastern Ohio Universities College of Medicine (NEOUCOM) has provided a Cooperative Cataloging Service to fourteen of its affiliated hospitals' libraries since March 1978, using the OCLC system. Analysis of the first four years of service showed that the hospital libraries spent almost $30,000 to catalog more than 18,000 titles. Personnel expenses and other costs eclipsed the savings from a 31.3% duplication rate. Centralized bibliographic control and the principal by-product of the service, a uniform, machine-readable data base, provided the foundation for an on-line integrated library system to serve the consortium. The hospital libraries contributed 44% of the unique titles in this data base, which emphasizes the need to share resources and continue cooperation.

  12. Design and Synthesis of Biaryl DNA-Encoded Libraries.

    PubMed

    Ding, Yun; Franklin, G Joseph; DeLorey, Jennifer L; Centrella, Paolo A; Mataruse, Sibongile; Clark, Matthew A; Skinner, Steven R; Belyanskaya, Svetlana

    2016-10-10

    DNA-encoded library technology (ELT) is a powerful tool for the discovery of new small-molecule ligands to various protein targets. Here we report the design and synthesis of biaryl DNA-encoded libraries based on the scaffold of 5-formyl 3-iodobenzoic acid. Three reactions on DNA template, acylation, Suzuki-Miyaura coupling and reductive amination, were applied in the library synthesis. The three cycle library of 3.5 million diversity has delivered potent hits for phosphoinositide 3-kinase α (PI3Kα).

  13. Constructing statistically unbiased cortical surface templates using feature-space covariance

    NASA Astrophysics Data System (ADS)

    Parvathaneni, Prasanna; Lyu, Ilwoo; Huo, Yuankai; Blaber, Justin; Hainline, Allison E.; Kang, Hakmook; Woodward, Neil D.; Landman, Bennett A.

    2018-03-01

    The choice of surface template plays an important role in cross-sectional subject analyses involving cortical brain surfaces because there is a tendency toward registration bias given variations in inter-individual and inter-group sulcal and gyral patterns. In order to account for the bias and spatial smoothing, we propose a feature-based unbiased average template surface. In contrast to prior approaches, we factor in the sample population covariance and assign weights based on feature information to minimize the influence of covariance in the sampled population. The mean surface is computed by applying the weights obtained from an inverse covariance matrix, which guarantees that multiple representations from similar groups (e.g., involving imaging, demographic, diagnosis information) are down-weighted to yield an unbiased mean in feature space. Results are validated by applying this approach in two different applications. For evaluation, the proposed unbiased weighted surface mean is compared with un-weighted means both qualitatively and quantitatively (mean squared error and absolute relative distance of both means from a baseline). In the first application, we validated the stability of the proposed optimal mean on a scan-rescan reproducibility dataset by incrementally adding duplicate subjects. In the second application, we used clinical research data to evaluate the difference between the weighted and unweighted mean when different numbers of subjects were included in the control versus schizophrenia groups. In both cases, the proposed method achieved greater stability, indicating a reduced impact of sampling bias. The weighted mean is built based on covariance information in feature space as opposed to spatial location, thus making this a generic approach applicable to any feature of interest.
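
    The weighting scheme described above amounts to a generalized-least-squares mean, w = S⁻¹1 / (1ᵀS⁻¹1), computed from the between-subject covariance in feature space. A simulated sketch with one deliberately duplicated subject; the features and regularization are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n_subj, n_feat = 30, 200

    # Hypothetical per-subject surface features (e.g., curvature sampled at
    # common vertices); a near-duplicate subject creates sampling bias.
    F = rng.standard_normal((n_subj, n_feat))
    F[1] = F[0] + 0.01 * rng.standard_normal(n_feat)   # duplicated subject

    # Sample covariance between subjects in feature space (regularized).
    S = np.cov(F) + 1e-3 * np.eye(n_subj)
    Sinv = np.linalg.inv(S)

    # GLS-style weights: correlated (duplicate) subjects are down-weighted,
    # yielding an unbiased mean in feature space.
    ones = np.ones(n_subj)
    w = Sinv @ ones / (ones @ Sinv @ ones)

    unbiased_mean = w @ F          # weighted template
    naive_mean = F.mean(axis=0)    # plain average, biased toward duplicates
    ```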

  14. Early grey matter changes in structural covariance networks in Huntington's disease.

    PubMed

    Coppen, Emma M; van der Grond, Jeroen; Hafkemeijer, Anne; Rombouts, Serge A R B; Roos, Raymund A C

    2016-01-01

    Progressive subcortical changes are known to occur in Huntington's disease (HD), a hereditary neurodegenerative disorder. Less is known about the occurrence and cohesion of whole brain grey matter changes in HD. We aimed to detect network integrity changes in grey matter structural covariance networks and examined relationships with clinical assessments. Structural magnetic resonance imaging data of premanifest HD (n = 30), HD patients (n = 30) and controls (n = 30) were used to identify ten structural covariance networks based on a novel technique using the co-variation of grey matter with independent component analysis in FSL. Group differences were studied controlling for age and gender. To explore whether our approach is effective in examining grey matter changes, regional voxel-based analysis was additionally performed. Premanifest HD and HD patients showed decreased network integrity in two networks compared to controls. One network included the caudate nucleus, precuneus and anterior cingulate cortex (in HD p < 0.001, in pre-HD p = 0.003). One other network contained the hippocampus, premotor, sensorimotor, and insular cortices (in HD p < 0.001, in pre-HD p = 0.023). Additionally, in HD patients only, decreased network integrity was observed in a network including the lingual gyrus, intracalcarine, cuneal, and lateral occipital cortices (p = 0.032). Changes in network integrity were significantly associated with scores of motor and neuropsychological assessments. In premanifest HD, voxel-based analyses showed pronounced volume loss in the basal ganglia, but less prominent in cortical regions. Our results suggest that structural covariance might be a sensitive approach to reveal early grey matter changes, especially for premanifest HD.

  15. Study of continuous blood pressure estimation based on pulse transit time, heart rate and photoplethysmography-derived hemodynamic covariates.

    PubMed

    Feng, Jingjie; Huang, Zhongyi; Zhou, Congcong; Ye, Xuesong

    2018-06-01

    It is widely recognized that pulse transit time (PTT) can track blood pressure (BP) over short periods of time, and hemodynamic covariates such as heart rate and stiffness index may also contribute to BP monitoring. In this paper, we derived a proportional relationship between BP and PTT⁻² and proposed an improved method adopting hemodynamic covariates in addition to PTT for continuous BP estimation. We divided 28 subjects from the Multi-parameter Intelligent Monitoring in Intensive Care database into two groups (with/without cardiovascular diseases) and utilized a machine learning strategy based on regularized linear regression (RLR) to construct BP models with different covariates for corresponding groups. RLR was performed for individuals as the initial calibration, while recursive least square algorithm was employed for the re-calibration. The results showed that errors of BP estimation by our method stayed within the Association for the Advancement of Medical Instrumentation limits (-0.98 ± 6.00 mmHg @ SBP, 0.02 ± 4.98 mmHg @ DBP) when the calibration interval extended to 1200-beat cardiac cycles. In comparison with other two representative studies, Chen's method remained accurate (0.32 ± 6.74 mmHg @ SBP, 0.94 ± 5.37 mmHg @ DBP) using a 400-beat calibration interval, while Poon's failed (-1.97 ± 10.59 mmHg @ SBP, 0.70 ± 4.10 mmHg @ DBP) when using a 200-beat calibration interval. With additional hemodynamic covariates utilized, our method improved the accuracy of PTT-based BP estimation, decreased the calibration frequency and had the potential for better continuous BP estimation.
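
    A hedged sketch of the modeling step: regularized linear regression of SBP on PTT⁻² plus hemodynamic covariates, echoing the per-subject initial calibration. All signals and coefficients are simulated; the MIMIC data and the paper's exact covariate set are not reproduced here.

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge

    rng = np.random.default_rng(9)
    n = 500

    # Hypothetical beat-by-beat features: PTT (s), heart rate (bpm), and a
    # PPG-derived stiffness index; SBP is simulated from the PTT^-2 relation.
    ptt = rng.uniform(0.2, 0.35, n)
    hr = rng.uniform(55, 110, n)
    si = rng.uniform(5, 12, n)
    sbp = 4.0 * ptt ** -2 + 0.25 * hr + 1.5 * si + rng.normal(0, 3, n)

    # Regularized linear regression on PTT^-2 plus hemodynamic covariates.
    X = np.column_stack([ptt ** -2, hr, si])
    model = Ridge(alpha=1.0).fit(X[:300], sbp[:300])   # initial calibration
    print("held-out MAE:", np.abs(model.predict(X[300:]) - sbp[300:]).mean())
    ```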

  16. Graphical User Interfaces and Library Systems: End-User Reactions.

    ERIC Educational Resources Information Center

    Zorn, Margaret; Marshall, Lucy

    1995-01-01

    Describes a study by the Parke-Davis Pharmaceutical Research Library to determine user satisfaction with the graphical user interface-based (GUI) Dynix Marquis compared with the text-based Dynix Classic Online Public Access Catalog (OPAC). Results show that the GUI-based OPAC was preferred by end users over the text-based OPAC. (eight references) (DGM)

  17. Current and future resources for functional metagenomics

    PubMed Central

    Lam, Kathy N.; Cheng, Jiujun; Engel, Katja; Neufeld, Josh D.; Charles, Trevor C.

    2015-01-01

    Functional metagenomics is a powerful experimental approach for studying gene function, starting from the extracted DNA of mixed microbial populations. A functional approach relies on the construction and screening of metagenomic libraries—physical libraries that contain DNA cloned from environmental metagenomes. The information obtained from functional metagenomics can help in future annotations of gene function and serve as a complement to sequence-based metagenomics. In this Perspective, we begin by summarizing the technical challenges of constructing metagenomic libraries and emphasize their value as resources. We then discuss libraries constructed using the popular cloning vector, pCC1FOS, and highlight the strengths and shortcomings of this system, alongside possible strategies to maximize existing pCC1FOS-based libraries by screening in diverse hosts. Finally, we discuss the known bias of libraries constructed from human gut and marine water samples, present results that suggest bias may also occur for soil libraries, and consider factors that bias metagenomic libraries in general. We anticipate that discussion of current resources and limitations will advance tools and technologies for functional metagenomics research. PMID:26579102

  18. An Investigation of the Educational Needs of Health Sciences Library Manpower: I. Definition of the Manpower Problem and Research Design*

    PubMed Central

    Kronick, David A.; Rees, Alan M.; Rothenberg, Lesliebeth

    1970-01-01

    In order to plan adequately for education in health science librarianship and to project future demands and needs, we need to know a great deal more about existing manpower in health science libraries. This paper, the first in a series of reports on an investigation to gather these data, discusses the research methodology and the development of an inventory of the institution-program population upon which the survey is based. An analysis of these institutions in terms of geographic location, type (educational, research, etc.), administrative control, and primary cognate area is presented, and their distribution across the various Regional Medical Library areas is noted. Preliminary estimates, based on a questionnaire to the libraries, are made of the size of the library population and its relationship to reporting programs or institutions, exclusive of the hospital population, which is covered in an independent survey. A questionnaire to library personnel is under way; along with the other questionnaires, it will establish a basis for exploring the relationships among institutions or programs, libraries, and manpower. PMID:5411708

  19. Linking Formal and Informal Science Education: A Successful Model using Libraries, Volunteers and NASA Resources

    NASA Astrophysics Data System (ADS)

    Race, M. S.; Lafayette Library; Learning Center Foundation (Lllcf)

    2011-12-01

    In these times of budget cuts, tight school schedules, and limited opportunities for student field trips and teacher professional development, it is especially difficult to expose elementary and middle school students to the latest STEM information, particularly in the space sciences. Using our library as a facilitator and catalyst, we built a volunteer-based, multi-faceted, curriculum-linked program for students and teachers in local middle schools (Grade 8) and showcased new astronomical and planetary science information using mainly NASA resources and volunteer effort. The project began with the idea of bringing free NASA photo exhibits (FETTU) to the Lafayette and Antioch Libraries for public display. Subsequently, the effort expanded by adding layers of activities that brought space and science information to teachers, students and the public at 5 libraries and schools in the 2 cities, one of which serves a diverse, underserved community. Overall, the effort (supported by a pilot grant from the Bechtel Foundation) included school- and library-based teacher workshops with resource materials; travelling space museum visits with hands-on activities (Chabot-to-Go); separate PowerPoint presentations for students and adults at the library; and concurrent ancillary space-related themes for young children's programs at the library. This pilot project, based largely on the use of free government resources and online materials, demonstrated that volunteer-based, standards-linked STEM efforts can enhance curriculum at the middle school level, with libraries serving a special role. Using this model, we subsequently obtained a small NASA Space Grant award to bring star parties and hands-on science activities to three libraries this fall, linking with numerous Grade 5 teachers and students in two additional underserved areas of our county. It is not necessary to reinvent the wheel; you just collect the pieces and build on what you already have.

  20. The construction of an EST database for Bombyx mori and its application

    PubMed Central

    Mita, Kazuei; Morimyo, Mitsuoki; Okano, Kazuhiro; Koike, Yoshiko; Nohata, Junko; Kawasaki, Hideki; Kadono-Okuda, Keiko; Yamamoto, Kimiko; Suzuki, Masataka G.; Shimada, Toru; Goldsmith, Marian R.; Maeda, Susumu

    2003-01-01

    To build a foundation for the complete genome analysis of Bombyx mori, we have constructed an EST database. Because gene expression patterns depend strongly on tissue as well as developmental stage, we analyzed many cDNA libraries prepared from various tissues and different developmental stages to cover the entire set of Bombyx genes. So far, the Bombyx EST database contains 35,000 ESTs from 36 cDNA libraries, which are grouped into ≈11,000 nonredundant ESTs with an average length of 1.25 kb. Comparison with FlyBase suggests that the present EST database, SilkBase, covers >55% of all Bombyx genes. The fraction of library-specific ESTs in each cDNA library indicates that we have not yet reached saturation, supporting the validity of our strategy of constructing an EST database from many libraries to cover all genes. To tackle the coming saturation problem, we tested two methods, subtraction and normalization, to increase coverage and decrease the number of housekeeping genes, resulting in a 5–11% increase in library-specific ESTs. The identification of a number of genes and comprehensive cloning of gene families have already emerged from SilkBase searches. Direct links of SilkBase with FlyBase and WormBase provide ready identification of candidate Lepidoptera-specific genes. PMID:14614147

  1. The Evidence-Based Manifesto for School Librarians

    ERIC Educational Resources Information Center

    Todd, Ross

    2008-01-01

    School Library Journal's 2007 Leadership Summit, "Where's the Evidence? Understanding the Impact of School Libraries," focused on the topic of evidence-based practice. Evidence-based school librarianship is a systematic approach that engages research-derived evidence, school librarian-observed evidence, and user-reported evidence in the processes…

  2. A Fragment-Based Method of Creating Small-Molecule Libraries to Target the Aggregation of Intrinsically Disordered Proteins.

    PubMed

    Joshi, Priyanka; Chia, Sean; Habchi, Johnny; Knowles, Tuomas P J; Dobson, Christopher M; Vendruscolo, Michele

    2016-03-14

    The aggregation process of intrinsically disordered proteins (IDPs) has been associated with a wide range of neurodegenerative disorders, including Alzheimer's and Parkinson's diseases. Currently, however, no drug in clinical use targets IDP aggregation. To facilitate drug discovery programs in this important and challenging area, we describe a fragment-based approach to generating small-molecule libraries that target specific IDPs. The method is based on the use of molecular fragments extracted from compounds reported in the literature to inhibit the aggregation of IDPs. These fragments are used to screen existing large generic libraries of small molecules to form smaller libraries specific for given IDPs. We illustrate this approach by describing three distinct small-molecule libraries to target Aβ, tau, and α-synuclein, three IDPs implicated in Alzheimer's and Parkinson's diseases. The strategy described here offers novel opportunities for the identification of effective molecular scaffolds for drug discovery for neurodegenerative disorders and provides insights into the mechanism of small-molecule binding to IDPs.
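
    The core filtering step, keeping library compounds that contain a fragment from known inhibitors, can be sketched with RDKit; the SMILES strings below are arbitrary examples, not the fragments or libraries used in the paper:

```python
# Hedged sketch of a fragment-based substructure screen (illustrative only).
from rdkit import Chem

# fragments assumed to come from reported aggregation inhibitors
fragments = [Chem.MolFromSmiles(s) for s in ["Oc1ccccc1",            # phenol
                                             "c1ccc2c(c1)cc[nH]2"]]  # indole
library = ["Cc1ccc(O)cc1",             # p-cresol (contains phenol)
           "CCCCCC",                   # hexane (no match)
           "OC(=O)Cc1c[nH]c2ccccc12"]  # indole-3-acetic acid (contains indole)

hits = []
for smi in library:
    mol = Chem.MolFromSmiles(smi)
    # keep a compound if it embeds at least one inhibitor-derived fragment
    if mol and any(mol.HasSubstructMatch(f) for f in fragments):
        hits.append(smi)
print(hits)
```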

  3. SPlinted Ligation Adapter Tagging (SPLAT), a novel library preparation method for whole genome bisulphite sequencing

    PubMed Central

    Manlig, Erika; Wahlberg, Per

    2017-01-01

    Sodium bisulphite treatment of DNA combined with next generation sequencing (NGS) is a powerful combination for the interrogation of genome-wide DNA methylation profiles. Library preparation for whole genome bisulphite sequencing (WGBS) is challenging due to side effects of the bisulphite treatment, which leads to extensive DNA damage. Recently, a new generation of methods for bisulphite sequencing library preparation has been devised. They are based on initial bisulphite treatment of the DNA, followed by adaptor tagging of single-stranded DNA fragments, and enable WGBS using low quantities of input DNA. In this study, we present a novel approach for quick and cost-effective WGBS library preparation that is based on splinted adaptor tagging (SPLAT) of bisulphite-converted single-stranded DNA. Moreover, we validate SPLAT against three commercially available WGBS library preparation techniques, two of which are based on bisulphite treatment prior to adaptor tagging and one of which is a conventional WGBS method. PMID:27899585

  4. Creating a pediatric digital library for pediatric health care providers and families: using literature and data to define common pediatric problems.

    PubMed

    D'Alessandro, Donna; Kingsley, Peggy

    2002-01-01

    The goal of this study was to complete a literature-based needs assessment with regard to common pediatric problems encountered by pediatric health care providers (PHCPs) and families, and to develop a problem-based pediatric digital library to meet those needs. The needs assessment yielded 65 information sources. Common problems were identified and categorized, and the Internet was manually searched for authoritative Web sites. The created pediatric digital library (www.generalpediatrics.com) used a problem-based interface and was deployed in November 1999. From November 1999 to November 2000, the number of hyperlinks and authoritative Web sites increased 51.1 and 32.2 percent, respectively. Over the same time, visitors increased by 57.3 percent and overall usage increased by 255 percent. A pediatric digital library has been created that begins to bring order to general pediatric resources on the Internet. This pediatric digital library provides current, authoritative, easily accessed pediatric information whenever and wherever the PHCPs and families want assistance.

  5. A Web-Based Library and Algorithm System for Satellite and Airborne Image Products

    DTIC Science & Technology

    2011-06-28

    Sequoia Scientific, Inc., and Dr. Paul Bissett at FERI, under other 6.1/6.2 program funding. ... of the spectrum matching approach to inverting hyperspectral imagery created by Drs. C. Mobley (Sequoia Scientific) and P. Bissett (FERI) ... algorithms developed by Sequoia Scientific and FERI. Testing and Implementation of Library: this project will result in the delivery of a WeoGeo ...

  6. OCTANET--an electronic library network: I. Design and development.

    PubMed Central

    Johnson, M F; Pride, R B

    1983-01-01

    The design and development of the OCTANET system for networking among medical libraries in the midcontinental region is described. This system's features and configuration may be attributed, at least in part, to normal evolution of technology in library networking, remote access to computers, and development of machine-readable data bases. Current functions and services of the system are outlined and implications for future developments in computer-based networking are discussed. PMID:6860825

  7. Testing a single regression coefficient in high dimensional linear models

    PubMed Central

    Zhong, Ping-Shou; Li, Runze; Wang, Hansheng; Tsai, Chih-Ling

    2017-01-01

    In linear regression models with high dimensional data, the classical z-test (or t-test) for testing the significance of each single regression coefficient is no longer applicable. This is mainly because the number of covariates exceeds the sample size. In this paper, we propose a simple and novel alternative by introducing the Correlated Predictors Screening (CPS) method to control for predictors that are highly correlated with the target covariate. Accordingly, the classical ordinary least squares approach can be employed to estimate the regression coefficient associated with the target covariate. In addition, we demonstrate that the resulting estimator is consistent and asymptotically normal even if the random errors are heteroscedastic. This enables us to apply the z-test to assess the significance of each covariate. Based on the p-value obtained from testing the significance of each covariate, we further conduct multiple hypothesis testing by controlling the false discovery rate at the nominal level. Then, we show that the multiple hypothesis testing achieves consistent model selection. Simulation studies and empirical examples are presented to illustrate the finite sample performance and the usefulness of the proposed method, respectively. PMID:28663668

  8. Testing a single regression coefficient in high dimensional linear models.

    PubMed

    Lan, Wei; Zhong, Ping-Shou; Li, Runze; Wang, Hansheng; Tsai, Chih-Ling

    2016-11-01

    In linear regression models with high dimensional data, the classical z-test (or t-test) for testing the significance of each single regression coefficient is no longer applicable. This is mainly because the number of covariates exceeds the sample size. In this paper, we propose a simple and novel alternative by introducing the Correlated Predictors Screening (CPS) method to control for predictors that are highly correlated with the target covariate. Accordingly, the classical ordinary least squares approach can be employed to estimate the regression coefficient associated with the target covariate. In addition, we demonstrate that the resulting estimator is consistent and asymptotically normal even if the random errors are heteroscedastic. This enables us to apply the z-test to assess the significance of each covariate. Based on the p-value obtained from testing the significance of each covariate, we further conduct multiple hypothesis testing by controlling the false discovery rate at the nominal level. Then, we show that the multiple hypothesis testing achieves consistent model selection. Simulation studies and empirical examples are presented to illustrate the finite sample performance and the usefulness of the proposed method, respectively.
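
    The CPS idea described in this record and the preceding one, controlling for the predictors most correlated with the target covariate and then applying OLS with a z-test, can be sketched as follows; the screening size and the robust-variance form are illustrative assumptions:

```python
# Hedged sketch of Correlated Predictors Screening (CPS) plus OLS and a z-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, p, target, k = 100, 500, 0, 5             # n << p; test coefficient of X[:,0]
X = rng.normal(size=(n, p))
y = 0.8 * X[:, target] + rng.normal(size=n)

others = np.delete(np.arange(p), target)
corr = np.abs([np.corrcoef(X[:, target], X[:, j])[0, 1] for j in others])
controls = others[np.argsort(corr)[-k:]]     # k predictors most correlated with target

Z = np.column_stack([X[:, target], X[:, controls], np.ones(n)])
beta = np.linalg.lstsq(Z, y, rcond=None)[0]  # ordinary least squares
resid = y - Z @ beta
# heteroscedasticity-robust (sandwich) variance for the target coefficient
bread = np.linalg.inv(Z.T @ Z)
meat = Z.T @ (Z * resid[:, None] ** 2)
se = np.sqrt((bread @ meat @ bread)[0, 0])
z = beta[0] / se
print(f"z = {z:.2f}, p = {2 * stats.norm.sf(abs(z)):.3g}")
```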

  9. Highlighting differences between conditional and unconditional quantile regression approaches through an application to assess medication adherence.

    PubMed

    Borah, Bijan J; Basu, Anirban

    2013-09-01

    The quantile regression (QR) framework provides a pragmatic approach to understanding the differential impacts of covariates along the distribution of an outcome. However, the QR framework that has pervaded the applied economics literature is based on the conditional quantile regression method. It is used to assess the impact of a covariate on a quantile of the outcome conditional on specific values of other covariates. In most cases, conditional quantile regression may generate results that are not generalizable or interpretable in a policy or population context. In contrast, the unconditional quantile regression method provides more interpretable results, as it marginalizes the effect over the distributions of other covariates in the model. In this paper, the differences between these two regression frameworks are highlighted, both conceptually and econometrically. Additionally, using real-world claims data from a large US health insurer, alternative QR frameworks are implemented to assess the differential impacts of covariates along the distribution of medication adherence among elderly patients with Alzheimer's disease. Copyright © 2013 John Wiley & Sons, Ltd.
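
    The contrast can be made concrete on simulated data: conditional QR via statsmodels, and unconditional QR via the recentered influence function (RIF) of Firpo et al., a standard route to unconditional quantile effects (the paper's exact implementation may differ):

```python
# Hedged sketch: conditional vs unconditional quantile regression at tau = 0.75.
import numpy as np
import statsmodels.api as sm
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
x = rng.normal(size=300)
y = 1.0 + 0.5 * x + rng.normal(size=300) * (1 + 0.5 * np.abs(x))
X = sm.add_constant(x)
tau = 0.75

# Conditional QR: effect of x on the tau-quantile of y given x
cqr = sm.QuantReg(y, X).fit(q=tau)

# Unconditional QR: OLS of the RIF of the marginal tau-quantile on x
q = np.quantile(y, tau)
f_q = gaussian_kde(y)(q)[0]            # density of y at the quantile
rif = q + (tau - (y <= q)) / f_q
uqr = sm.OLS(rif, X).fit()
print(cqr.params[1], uqr.params[1])    # the two slopes generally differ
```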

  10. Model selection for marginal regression analysis of longitudinal data with missing observations and covariate measurement error.

    PubMed

    Shen, Chung-Wei; Chen, Yi-Hau

    2015-10-01

    Missing observations and covariate measurement error commonly arise in longitudinal data. However, existing methods for model selection in marginal regression analysis of longitudinal data fail to address the potential bias resulting from these issues. To tackle this problem, we propose a new model selection criterion, the Generalized Longitudinal Information Criterion, which is based on an approximately unbiased estimator for the expected quadratic error of a considered marginal model, accounting for both data missingness and covariate measurement error. The simulation results reveal that the proposed method performs quite well in the presence of missing data and covariate measurement error. In contrast, naive procedures that ignore such complexity in the data may perform quite poorly. The proposed method is applied to data from the Taiwan Longitudinal Study on Aging to assess the relationship of depression with health and social status in the elderly, accommodating measurement error in the covariate as well as missing observations. © The Author 2015. Published by Oxford University Press. All rights reserved.

  11. Using SAS PROC CALIS to fit Level-1 error covariance structures of latent growth models.

    PubMed

    Ding, Cherng G; Jane, Ten-Der

    2012-09-01

    In the present article, we demonstrate the use of SAS PROC CALIS to fit various types of Level-1 error covariance structures of latent growth models (LGM). Advantages of the SEM approach, on which PROC CALIS is based, include the capability of modeling change over time for latent constructs measured by multiple indicators; embedding LGM into a larger latent variable model; incorporating measurement models for latent predictors; and better assessment of model fit, together with flexibility in specifying error covariance structures. The strength of PROC CALIS is always accompanied by technical coding work, which needs to be specifically addressed. We provide a tutorial on the SAS syntax for modeling the growth of a manifest variable and the growth of a latent construct, focusing the documentation on the specification of Level-1 error covariance structures. Illustrations are conducted with data generated from two given latent growth models. The coding provided is helpful when the growth model has been well determined and the Level-1 error covariance structure is to be identified.
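
    For readers who do not use SAS, the kinds of Level-1 error covariance structures one would specify in PROC CALIS can be written out directly; a minimal NumPy sketch (illustrative parameter values, not from the article):

```python
# Hedged sketch: three common Level-1 error covariance structures for T
# repeated measures in a latent growth model.
import numpy as np

T, sigma2, rho = 4, 1.0, 0.4
lags = np.abs(np.subtract.outer(np.arange(T), np.arange(T)))

ar1 = sigma2 * rho ** lags                   # first-order autoregressive
cs = sigma2 * ((1 - rho) * np.eye(T) + rho)  # compound symmetry
bands = np.array([1.0, rho, 0.1, 0.0])       # assumed band values
toep = sigma2 * bands[lags]                  # banded Toeplitz
print(ar1.round(2), cs.round(2), toep.round(2), sep="\n\n")
```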

  12. Approximations of noise covariance in multi-slice helical CT scans: impact on lung nodule size estimation.

    PubMed

    Zeng, Rongping; Petrick, Nicholas; Gavrielides, Marios A; Myers, Kyle J

    2011-10-07

    Multi-slice computed tomography (MSCT) scanners have become popular volumetric imaging tools. The deterministic and random properties of the resulting CT scans have been studied in the literature. Due to the large number of voxels in the three-dimensional (3D) volumetric dataset, full characterization of the noise covariance in MSCT scans is difficult. However, as usage of such datasets for quantitative disease diagnosis grows, so does the importance of understanding the noise properties, because of their effect on the accuracy of the clinical outcome. The goal of this work is to study noise covariance in the helical MSCT volumetric dataset. We explore possible approximations to the noise covariance matrix with reduced degrees of freedom, including voxel-based variance, one-dimensional (1D) correlation, two-dimensional (2D) in-plane correlation and the noise power spectrum (NPS). We further examine the effect of various noise covariance models on the accuracy of a prewhitening matched filter nodule size estimation strategy. Our simulation results suggest that the 1D longitudinal, 2D in-plane and NPS prewhitening approaches can improve the performance of nodule size estimation algorithms. When taking into account the computational costs of determining noise characterizations, the NPS model may be the most efficient approximation to the MSCT noise covariance matrix.
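
    A toy 2-D version of the NPS-prewhitened matched filter for size estimation might look as follows; the flat (white-noise) NPS, Gaussian "nodule" templates, and all sizes are assumptions for illustration, not the paper's simulation:

```python
# Hedged sketch: pick the template radius maximizing the NPS-prewhitened
# matched-filter statistic, computed in the Fourier domain.
import numpy as np

rng = np.random.default_rng(4)
N = 64
yy, xx = np.mgrid[:N, :N] - N // 2

def blob(r):                                    # simple "nodule" of radius r
    return np.exp(-(xx**2 + yy**2) / (2 * r**2))

nps = np.full((N, N), 1.0)                      # assumed flat NPS (white noise)
data = blob(5.0) + rng.normal(0, 0.2, (N, N))   # true radius 5 plus noise

radii, scores = (3.0, 4.0, 5.0, 6.0), []
for r in radii:
    T, G = np.fft.fft2(blob(r)), np.fft.fft2(data)
    # prewhitened matched-filter statistic, normalized by template energy
    num = np.real(np.sum(np.conj(T) * G / nps))
    den = np.sqrt(np.real(np.sum(np.abs(T) ** 2 / nps)))
    scores.append(num / den)
print("estimated radius:", radii[int(np.argmax(scores))])
```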

  13. On the Likely Utility of Hybrid Weights Optimized for Variances in Hybrid Error Covariance Models

    NASA Astrophysics Data System (ADS)

    Satterfield, E.; Hodyss, D.; Kuhl, D.; Bishop, C. H.

    2017-12-01

    Because of imperfections in ensemble data assimilation schemes, one cannot assume that the ensemble covariance is equal to the true error covariance of a forecast. Previous work demonstrated how information about the distribution of true error variances given an ensemble sample variance can be revealed from an archive of (observation-minus-forecast, ensemble-variance) data pairs. Here, we derive a simple and intuitively compelling formula to obtain the mean of this distribution of true error variances given an ensemble sample variance from (observation-minus-forecast, ensemble-variance) data pairs produced by a single run of a data assimilation system. This formula takes the form of a Hybrid weighted average of the climatological forecast error variance and the ensemble sample variance. Here, we test the extent to which these readily obtainable weights can be used to rapidly optimize the covariance weights used in Hybrid data assimilation systems that employ weighted averages of static covariance models and flow-dependent ensemble based covariance models. Univariate data assimilation and multi-variate cycling ensemble data assimilation are considered. In both cases, it is found that our computationally efficient formula gives Hybrid weights that closely approximate the optimal weights found through the simple but computationally expensive process of testing every plausible combination of weights.
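
    The key quantity, the mean true error variance given an ensemble sample variance, expressed as a weighted average of climatology and the ensemble variance, can be estimated from (observation-minus-forecast, ensemble-variance) pairs; the regression-based weight below is an illustrative estimator on synthetic data, not the authors' exact formula:

```python
# Hedged sketch: estimate a hybrid weight w such that
# E[innov^2 | ens_var] ~ w * ens_var + (1 - w) * clim_var.
import numpy as np

rng = np.random.default_rng(5)
n = 5000
true_var = rng.gamma(4.0, 0.5, n)                # flow-dependent true error variance
innov = rng.normal(0, np.sqrt(true_var))         # observation-minus-forecast pairs
ens_var = true_var * rng.gamma(8.0, 1 / 8.0, n)  # noisy ensemble sample variance

clim_var = ens_var.mean()
# least-squares slope of (innov^2 - clim_var) on (ens_var - clim_var)
w = (np.sum((ens_var - clim_var) * (innov**2 - clim_var))
     / np.sum((ens_var - clim_var) ** 2))
hybrid = w * ens_var + (1 - w) * clim_var        # hybrid variance estimate
print(f"estimated hybrid weight w = {w:.2f}")
```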

  14. Estimating and testing interactions when explanatory variables are subject to non-classical measurement error.

    PubMed

    Murad, Havi; Kipnis, Victor; Freedman, Laurence S

    2016-10-01

    Assessing interactions in linear regression models when covariates have measurement error (ME) is complex. We previously described regression calibration (RC) methods that yield consistent estimators and standard errors for interaction coefficients of normally distributed covariates having classical ME. Here we extend normal-based RC (NBRC) and linear RC (LRC) methods to a non-classical ME model, and describe more efficient versions that combine estimates from the main study and an internal sub-study. We apply these methods to data from the Observing Protein and Energy Nutrition (OPEN) study. Using simulations we show that (i) for normally distributed covariates, efficient NBRC and LRC were nearly unbiased and performed well with sub-study size ≥200; (ii) efficient NBRC had lower MSE than efficient LRC; (iii) the naïve test for a single interaction had type I error probability close to the nominal significance level, whereas efficient NBRC and LRC were slightly anti-conservative but more powerful; (iv) for markedly non-normal covariates, efficient LRC yielded less biased estimators with smaller variance than efficient NBRC. Our simulations suggest that it is preferable to use (i) efficient NBRC for estimating and testing interaction effects of normally distributed covariates and (ii) efficient LRC for estimating and testing interactions of markedly non-normal covariates. © The Author(s) 2013.
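
    The basic regression-calibration idea underlying these methods (though not the exact NBRC/LRC estimators) is to model the true covariate from its error-prone surrogate in an internal sub-study, impute it in the main study, and fit the interaction model on the imputed values; a minimal sketch under those assumptions:

```python
# Hedged sketch of naive regression calibration for an interaction model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n, n_sub = 2000, 300
x = rng.normal(size=n)                      # true covariate (known only in sub-study)
z = rng.normal(size=n)                      # second, error-free covariate
w = x + rng.normal(0, 0.5, n)               # error-prone measurement of x
y = 1 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.normal(0, 1, n)

# calibration model E[X | W], fit on the sub-study where x is observed
cal = sm.OLS(x[:n_sub], sm.add_constant(w[:n_sub])).fit()
x_hat = cal.predict(sm.add_constant(w))     # imputed covariate for everyone

X = sm.add_constant(np.column_stack([x_hat, z, x_hat * z]))
fit = sm.OLS(y, X).fit()
print(fit.params.round(2))                  # compare with (1, 0.5, 0.3, 0.4)
```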

  15. Bio-Optical Data Assimilation With Observational Error Covariance Derived From an Ensemble of Satellite Images

    NASA Astrophysics Data System (ADS)

    Shulman, Igor; Gould, Richard W.; Frolov, Sergey; McCarthy, Sean; Penta, Brad; Anderson, Stephanie; Sakalaukus, Peter

    2018-03-01

    An ensemble-based approach to specifying the observational error covariance in the data assimilation of satellite bio-optical properties is proposed. The observational error covariance is derived from statistical properties of a generated ensemble of satellite MODIS-Aqua chlorophyll (Chl) images. The proposed observational error covariance is used in an Optimal Interpolation scheme for the assimilation of MODIS-Aqua Chl observations. The forecast error covariance is specified in the subspace of the multivariate (bio-optical, physical) empirical orthogonal functions (EOFs) estimated from a month-long model run. The assimilation of surface MODIS-Aqua Chl improved surface and subsurface model Chl predictions. Comparisons with surface and subsurface water samples demonstrate that the data assimilation run with the proposed observational error covariance has higher RMSE than the data assimilation run with an "optimistic" assumption about observational errors (10% of the ensemble mean), but smaller or comparable RMSE relative to the data assimilation run assuming observational errors equal to 35% of the ensemble mean (the target error for the satellite chlorophyll data product). Also, with the assimilation of the MODIS-Aqua Chl data, the RMSE between observed and model-predicted fractions of diatoms in the total phytoplankton is reduced by a factor of two in comparison to the non-assimilative run.
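
    In spirit, the scheme is a standard Optimal Interpolation update with R estimated from an ensemble of images; a toy sketch with invented dimensions and statistics (not the paper's model or covariances):

```python
# Hedged sketch: Optimal Interpolation with an ensemble-derived observation
# error covariance R.
import numpy as np

rng = np.random.default_rng(7)
n, m, n_ens = 50, 10, 30                       # state size, obs count, ensemble size

H = np.zeros((m, n))                           # observation operator (subsampling)
H[np.arange(m), np.arange(0, n, n // m)] = 1.0
# assumed exponentially decaying forecast error covariance
B = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 5.0)

obs_ens = rng.normal(1.0, 0.1, (n_ens, m))     # ensemble of satellite Chl retrievals
R = np.cov(obs_ens, rowvar=False)              # ensemble-derived obs error covariance

xb = np.ones(n)                                # background (forecast) state
yo = obs_ens.mean(axis=0) + rng.normal(0, 0.05, m)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # OI gain
xa = xb + K @ (yo - H @ xb)                    # analysis update
print(xa[:5].round(3))
```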

  16. Nambu-Poisson gauge theory

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav; Schupp, Peter; Vysoký, Jan

    2014-06-01

    We generalize noncommutative gauge theory using Nambu-Poisson structures to obtain a new type of gauge theory with higher brackets and gauge fields. The approach is based on covariant coordinates and higher versions of the Seiberg-Witten map. We construct a covariant Nambu-Poisson gauge theory action, give its first order expansion in the Nambu-Poisson tensor and relate it to a Nambu-Poisson matrix model.
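
    For orientation, the Nambu-Poisson structure underlying the construction is the standard 3-bracket (background definition only; the paper's action and Seiberg-Witten map are not reproduced here):

```latex
% A Nambu-Poisson trivector \Pi defines a ternary bracket obeying the
% Leibniz rule and the fundamental identity:
\begin{align}
  \{f, g, h\} &= \Pi^{ijk}\,\partial_i f\,\partial_j g\,\partial_k h, \\
  \{f_1, f_2, \{g_1, g_2, g_3\}\} &= \{\{f_1, f_2, g_1\}, g_2, g_3\}
    + \{g_1, \{f_1, f_2, g_2\}, g_3\}
    + \{g_1, g_2, \{f_1, f_2, g_3\}\}.
\end{align}
```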

  17. Covariance Structure Models for Gene Expression Microarray Data

    ERIC Educational Resources Information Center

    Xie, Jun; Bentler, Peter M.

    2003-01-01

    Covariance structure models are applied to gene expression data using a factor model, a path model, and their combination. The factor model is based on a few factors that capture most of the expression information. A common factor of a group of genes may represent a common protein factor for the transcript of the co-expressed genes, and hence, it…
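
    The implied covariance structure of such a factor model is Sigma = Lambda Phi Lambda' + Psi; a minimal sketch with invented loadings (not the paper's gene-expression estimates):

```python
# Hedged sketch: model-implied covariance of a two-factor model for six genes.
import numpy as np

Lambda = np.array([[0.8, 0.0],   # genes 1-3 load on factor 1
                   [0.7, 0.0],
                   [0.9, 0.0],
                   [0.0, 0.6],   # genes 4-6 load on factor 2
                   [0.0, 0.8],
                   [0.0, 0.7]])
Phi = np.array([[1.0, 0.3],      # correlated common factors
                [0.3, 1.0]])
Psi = np.diag([0.3] * 6)         # unique (residual) variances

Sigma = Lambda @ Phi @ Lambda.T + Psi   # implied gene expression covariance
print(Sigma.round(2))
```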

  18. Library Automation in the Netherlands and Pica.

    ERIC Educational Resources Information Center

    Bossers, Anton; Van Muyen, Martin

    1984-01-01

    Describes the Pica Library Automation Network (originally the Project for Integrated Catalogue Automation), which is based on a centralized bibliographic database. Highlights include the Pica conception of library automation, online shared cataloging system, circulation control system, acquisition system, and online Dutch union catalog with…

  19. Content Management and the Future of Academic Libraries.

    ERIC Educational Resources Information Center

    Wu, Yuhfen Diana; Liu, Mengxiong

    2001-01-01

    Discusses Internet-based electronic content management in digital libraries and considers the future of academic libraries. Topics include digital technologies; content management systems; standards; bandwidth; security and privacy concerns; legal matters, including copyrights and ownership; lifecycle; and multilingual access and interface. (LRW)

  20. Librarians as Money Makers: The Bottom Line.

    ERIC Educational Resources Information Center

    Warner, Alice Sizer

    1990-01-01

    Discusses the monetary aspects of librarianship, focusing on nontraditional library careers such as intrapreneurship (managing a library within an organization as a business) and entrepreneurship. Fee-based library services are considered, and suggestions for becoming a successful information entrepreneur are offered. (10 references) (MES)
