A comparison of different functions for predicted protein model quality assessment.
Li, Juan; Fang, Huisheng
2016-07-01
In protein structure prediction, a considerable number of models are usually produced by either the Template-Based Method (TBM) or ab initio prediction. The purpose of this study is to find the critical parameter for assessing the quality of these predicted models. A non-redundant template library was developed and 138 target sequences were modeled. The target sequences were all distant from the proteins in the template library and were aligned with template library proteins on the basis of the transformation matrix. The quality of each model was first assessed with QMEAN and its six parameters: C_beta interaction energy (C_beta), all-atom pairwise energy (PE), solvation energy (SE), torsion angle energy (TAE), secondary structure agreement (SSA), and solvent accessibility agreement (SAE). Finally, the alignment score (score) was used as well. Hence, a total of eight parameters (QMEAN, C_beta, PE, SE, TAE, SSA, SAE, score) were independently used to assess the quality of each model. The results indicate that SSA is the best parameter for estimating the quality of a model.
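To make the comparison of assessment parameters concrete, here is a minimal sketch of ranking parameters by their correlation with a reference quality score. The data, the choice of Pearson correlation, and the GDT-TS-like reference are illustrative assumptions, not the paper's actual protocol.

```python
# Hypothetical sketch: rank quality-assessment parameters by their
# correlation with a reference quality score (random stand-in data).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_models = 138
true_quality = rng.uniform(0.2, 0.9, n_models)   # e.g., a GDT-TS-like score

# Stand-in per-model scores for three of the eight assessment parameters.
params = {
    "QMEAN": true_quality + rng.normal(0, 0.10, n_models),
    "SSA":   true_quality + rng.normal(0, 0.05, n_models),
    "score": true_quality + rng.normal(0, 0.20, n_models),
}

for name, values in sorted(params.items()):
    r, _ = pearsonr(values, true_quality)
    print(f"{name:6s} r = {r:+.3f}")
```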
MatProps: Material Properties Database and Associated Access Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durrenberger, J K; Becker, R C; Goto, D M
2007-08-13
Coefficients for analytic constitutive and equation of state models (EOS), which are used by many hydro codes at LLNL, are currently stored in a legacy material database (Steinberg, UCRL-MA-106349). Parameters for numerous materials are available through this database, and include Steinberg-Guinan and Steinberg-Lund constitutive models for metals, JWL equations of state for high explosives, and Mie-Grüneisen equations of state for metals. These constitutive models are used in most of the simulations done by ASC codes today at Livermore. Analytic EOSs are also still used, but have been superseded in many cases by tabular representations in LEOS (http://leos.llnl.gov). Numerous advanced constitutive models have been developed and implemented into ASC codes over the past 20 years. These newer models have more physics and better representations of material strength properties than their predecessors, and therefore more model coefficients. However, a material database of these coefficients is not readily available. Therefore, incorporating these coefficients with those of the legacy models into a portable database that could be shared amongst codes would be most welcome. The goal of this paper is to describe the MatProp effort at LLNL to create such a database and associated access library that could be used by codes throughout the DOE complex and beyond. We have written an initial version of the MatProp database and access library, and our DOE/ASC code ALE3D (Nichols et al., UCRL-MA-152204) is able to import information from the database. The database, a link to which exists on the Sourceforge server at LLNL, contains coefficients for many materials and models (see Appendix), and includes material parameters in the following categories: flow stress, shear modulus, strength, damage, and equation of state. Future versions of the MatProp database and access library will include the ability to read and write material descriptions that can be exchanged between codes. It will also include an ability to do unit changes, i.e., have the library return parameters in user-specified unit systems. Additional material categories can also be added (e.g., phase change kinetics). The MatProp database and access library is part of a larger set of tools used at LLNL for assessing material model behavior. One of these is MSlib, a shared constitutive material model library. Another is the Material Strength Database (MSD), which allows users to compare parameter fits for specific constitutive models to available experimental data. Together with MatProp, these tools create a suite of capabilities that provide state-of-the-art models and parameters for those models to integrated simulation codes. This document is broken into several appendices. Appendix A contains a code example to retrieve several material coefficients. Appendix B contains the API for the MatProp data access library. Appendix C contains a list of the material names and model types currently available in the MatProp database. Appendix D contains a list of the parameter names for the currently recognized model types. Appendix E contains a full XML description of the material Tantalum.
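Appendix A of the report contains the actual code example; since it is not reproduced here, the following Python sketch only illustrates the kind of lookup such an access library performs against an XML material description like the Tantalum example in Appendix E. All element names, attribute names, and values below are invented.

```python
# Hypothetical MatProp-style lookup; the real access library's API
# (Appendix B) is not reproduced here, so names below are invented.
import xml.etree.ElementTree as ET

XML = """
<material name="Tantalum">
  <model type="SteinbergGuinan">
    <param name="G0" value="69.0" units="GPa"/>
    <param name="Y0" value="0.77" units="GPa"/>
  </model>
</material>
"""

def get_param(xml_text, model_type, param_name):
    """Return (value, units) for one coefficient of one model type."""
    root = ET.fromstring(xml_text)
    for model in root.findall("model"):
        if model.get("type") == model_type:
            for p in model.findall("param"):
                if p.get("name") == param_name:
                    return float(p.get("value")), p.get("units")
    raise KeyError(f"{model_type}/{param_name} not found")

print(get_param(XML, "SteinbergGuinan", "G0"))   # (69.0, 'GPa')
```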
NASA Astrophysics Data System (ADS)
Cary, J. R.; Shasharina, S.; Bruhwiler, D. L.
1998-04-01
The MAPA code is a fully interactive accelerator modeling and design tool consisting of a GUI and two object-oriented C++ libraries: a general library suitable for treatment of any dynamical system, and an accelerator library including many element types plus an accelerator class. The accelerator library inherits directly from the system library, which uses hash tables to store any relevant parameters or strings. The GUI can access these hash tables in a general way, allowing the user to invoke a window displaying all relevant parameters for a particular element type or for the accelerator class, with the option to change those parameters. The system library can advance an arbitrary number of dynamical variables through an arbitrary mapping. The accelerator class inherits this capability and overloads the relevant functions to advance the phase space variables of a charged particle through a string of elements. Among other things, the GUI makes phase space plots and finds fixed points of the map. We discuss the object hierarchy of the two libraries and use of the code.
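As a rough illustration of the design described (parameters kept in generic hash tables, and an accelerator that overloads a generic advance to push phase-space coordinates through a string of elements), here is a Python analogue of the C++ class hierarchy; the element maps and names are simplified stand-ins, not MAPA's actual API.

```python
# Illustrative sketch: elements store parameters in a dict (the "hash
# table"), and a beamline advances phase space by composing element maps.
import numpy as np

class Element:
    def __init__(self, **params):
        self.params = dict(params)          # generic parameter table
    def advance(self, state):
        raise NotImplementedError

class Drift(Element):
    def advance(self, state):
        x, xp = state
        return np.array([x + self.params["L"] * xp, xp])

class ThinQuad(Element):
    def advance(self, state):
        x, xp = state
        return np.array([x, xp - x / self.params["f"]])

class Accelerator:
    def __init__(self, elements):
        self.elements = elements
    def track(self, state, turns=1):
        for _ in range(turns):
            for el in self.elements:
                state = el.advance(state)
        return state

fodo = Accelerator([Drift(L=1.0), ThinQuad(f=2.0),
                    Drift(L=1.0), ThinQuad(f=-2.0)])
print(fodo.track(np.array([1e-3, 0.0]), turns=100))
```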
AQUATOX Data Sources Documents
Contains the data sources for parameter values of the AQUATOX model including: a bibliography for the AQUATOX data libraries and the compendia of parameter values for US Army Corps of Engineers models.
RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations
NASA Astrophysics Data System (ADS)
Capote, R.; Herman, M.; Obložinský, P.; Young, P. G.; Goriely, S.; Belgya, T.; Ignatyuk, A. V.; Koning, A. J.; Hilaire, S.; Plujko, V. A.; Avrigeanu, M.; Bersillon, O.; Chadwick, M. B.; Fukahori, T.; Ge, Zhigang; Han, Yinlu; Kailas, S.; Kopecky, J.; Maslov, V. M.; Reffo, G.; Sin, M.; Soukhovitskii, E. Sh.; Talou, P.
2009-12-01
We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input; therefore, the International Atomic Energy Agency (IAEA) has worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties of nuclei for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models and microscopic calculations which are based on a realistic microscopic single-particle level scheme. Partial level density formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.
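For a flavor of the GAMMA segment's content, the standard Lorentzian shape commonly used for GDR photo-absorption fits can be evaluated as below; the parameter values are illustrative, not actual RIPL entries.

```python
# Standard Lorentzian GDR photo-absorption shape; E0, Gamma0, sigma0 are
# illustrative values, not taken from the RIPL-3 GAMMA segment.
import numpy as np

def gdr_lorentzian(E, E0, Gamma0, sigma0):
    """Photo-absorption cross section sigma(E), same units as sigma0."""
    return sigma0 * (E * Gamma0) ** 2 / ((E**2 - E0**2) ** 2 + (E * Gamma0) ** 2)

E = np.linspace(5.0, 25.0, 5)                                 # MeV
print(gdr_lorentzian(E, E0=14.0, Gamma0=5.0, sigma0=300.0))   # mb
```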
NASA Astrophysics Data System (ADS)
Tsalmantza, P.; Kontizas, M.; Rocca-Volmerange, B.; Bailer-Jones, C. A. L.; Kontizas, E.; Bellas-Velidis, I.; Livanou, E.; Korakitis, R.; Dapergolas, A.; Vallenari, A.; Fioc, M.
2009-09-01
Aims: This paper is the second in a series, implementing a classification system for Gaia observations of unresolved galaxies. Our goals are to determine spectral classes and estimate intrinsic astrophysical parameters via synthetic templates. Here we describe (1) a new extended library of synthetic galaxy spectra; (2) its comparison with various observations; and (3) first results of classification and parametrization experiments using simulated Gaia spectrophotometry of this library. Methods: Using the PÉGASE.2 code, based on galaxy evolution models that take into account metallicity evolution, extinction correction, and emission lines (with stellar spectra based on the BaSeL library), we improved our first library and extended it to cover the domain of most of the SDSS catalogue. Our classification and regression models were support vector machines (SVMs). Results: We produce an extended library of 28 885 synthetic galaxy spectra at zero redshift covering four general Hubble types of galaxies, over the wavelength range between 250 and 1050 nm at a sampling of 1 nm or less. The library is also produced for four random values of redshift in the range 0-0.2. It is computed on a random grid of four key astrophysical parameters (infall timescale and three parameters defining the SFR) and, depending on the galaxy type, on two values of the age of the galaxy. The synthetic library was compared and found to be in good agreement with various observations. The first results from the SVM classifiers and parametrizers are promising, indicating that Hubble types can be reliably predicted and several parameters estimated with low bias and variance.
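A minimal sketch of the SVM classification and regression step with scikit-learn, on random stand-in spectrophotometry rather than the paper's PÉGASE.2 library and Gaia noise model:

```python
# Hedged sketch of SVM classification (Hubble type) and regression
# (redshift) on simulated spectrophotometry; data are random stand-ins.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, SVR

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 80))          # 400 spectra x 80 bands (stand-in)
hubble_type = rng.integers(0, 4, 400)   # four general Hubble types
redshift = rng.uniform(0.0, 0.2, 400)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, hubble_type)
reg = make_pipeline(StandardScaler(), SVR(kernel="rbf")).fit(X, redshift)
print(clf.predict(X[:3]), reg.predict(X[:3]))
```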
Planetary Image Geometry Library
NASA Technical Reports Server (NTRS)
Deen, Robert C.; Pariser, Oleg
2010-01-01
The Planetary Image Geometry (PIG) library is a multi-mission library used for projecting images (EDRs, or Experiment Data Records) and managing their geometry for in-situ missions. A collection of models describes cameras and their articulation, allowing application programs such as mosaickers, terrain generators, and pointing correction tools to be written in a multi-mission manner, without any knowledge of parameters specific to the supported missions. Camera model objects allow transformation of image coordinates to and from view vectors in XYZ space. Pointing models, specific to each mission, describe how to orient the camera models based on telemetry or other information. Surface models describe the surface in general terms. Coordinate system objects manage the various coordinate systems involved in most missions. File objects manage access to metadata (labels, including telemetry information) in the input EDRs and RDRs (Reduced Data Records). Label models manage metadata information in output files. Site objects keep track of different locations where the spacecraft might be at a given time. Radiometry models allow correction of radiometry for an image. Mission objects contain basic mission parameters. Pointing adjustment ("nav") files allow pointing to be corrected. The object-oriented structure (C++) makes it easy to subclass just the pieces of the library that are truly mission-specific. Typically, this involves just the pointing model and coordinate systems, and parts of the file model. Once the library was developed (initially for Mars Polar Lander, MPL), adding new missions ranged from two days to a few months, resulting in significant cost savings as compared to rewriting all the application programs for each mission. Currently supported missions include Mars Pathfinder (MPF), MPL, Mars Exploration Rover (MER), Phoenix, and Mars Science Lab (MSL). Applications based on this library create the majority of operational image RDRs for those missions. A Java wrapper around the library allows parts of it to be used from Java code (via a native JNI interface). Future conversions of all or part of the library to Java are contemplated.
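A toy pinhole example of the camera-model role described above, transforming image coordinates to and from view vectors in XYZ space; PIG's real camera models (e.g., the CAHV family used for Mars cameras) are considerably more elaborate.

```python
# Conceptual pinhole stand-in for the camera-model interface described.
import numpy as np

class PinholeCamera:
    def __init__(self, f, cx, cy):
        self.f, self.cx, self.cy = f, cx, cy
    def image_to_view(self, u, v):
        ray = np.array([u - self.cx, v - self.cy, self.f])
        return ray / np.linalg.norm(ray)          # unit view vector in XYZ
    def view_to_image(self, ray):
        x, y, z = ray
        return self.cx + self.f * x / z, self.cy + self.f * y / z

cam = PinholeCamera(f=1000.0, cx=512.0, cy=512.0)
ray = cam.image_to_view(600.0, 400.0)
print(ray, cam.view_to_image(ray))                # round-trips to (600, 400)
```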
The relative pose estimation of aircraft based on contour model
NASA Astrophysics Data System (ADS)
Fu, Tai; Sun, Xiangyi
2017-02-01
This paper proposes a relative pose estimation approach based on an object contour model. The first step is to obtain two-dimensional (2D) projections of the three-dimensional (3D) target model, which are divided into 40 forms by clustering and LDA analysis. We then extract the target contour in each projection and compute its Pseudo-Zernike Moments (PZM), so that a model library is constructed offline. Next, the projection contour that most resembles the target silhouette in the current image is retrieved from the model library by reference to the PZM; similarity transformation parameters are then generated by applying shape context matching to the silhouette sampling locations, from which the identification parameters of the target can be derived. The identification parameters are converted to relative pose parameters, which serve as initial values for an iterative refinement algorithm, on the premise that they lie in the neighborhood of the actual values. Finally, Distance Image Iterative Least Squares (DI-ILS) is employed to acquire the final relative pose parameters.
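The offline-library/online-matching loop can be sketched with OpenCV as below; Hu-moment shape matching stands in for the paper's Pseudo-Zernike Moments, and the ellipse "views" are stand-ins for 2D projections of the 3D model.

```python
# Sketch of the offline-library / online-matching idea using OpenCV.
# Hu moments (cv2.matchShapes) substitute for the paper's PZM descriptor.
import cv2
import numpy as np

def largest_contour(binary_img):
    # OpenCV 4 signature; OpenCV 3 returns an extra leading value.
    contours, _ = cv2.findContours(binary_img, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    return max(contours, key=cv2.contourArea)

# Offline: one descriptor per projected model view (stand-in images).
library = {}
for view_id in range(3):
    img = np.zeros((200, 200), np.uint8)
    cv2.ellipse(img, (100, 100), (60, 20 + 10 * view_id), 0, 0, 360, 255, -1)
    library[view_id] = largest_contour(img)

# Online: pick the library view whose contour best matches the target.
target = np.zeros((200, 200), np.uint8)
cv2.ellipse(target, (100, 100), (60, 31), 0, 0, 360, 255, -1)
tc = largest_contour(target)
best = min(library, key=lambda k: cv2.matchShapes(tc, library[k],
                                                  cv2.CONTOURS_MATCH_I1, 0.0))
print("best view:", best)
```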
NASA Astrophysics Data System (ADS)
Sanchez, P.; Hinojosa, J.; Ruiz, R.
2005-06-01
Recently, neuromodeling methods for microwave devices have been developed. These methods are suitable for generating models of novel devices, and they allow fast and accurate simulations and optimizations. However, developing model libraries with these methods is a formidable task, since it requires massive input-output data provided by an electromagnetic simulator or by measurements, together with repeated artificial neural network (ANN) training. This paper presents a strategy that reduces the cost of library development while retaining the advantages of neuromodeling methods: high accuracy, a large range of geometrical and material parameters, and reduced CPU time. The library models are developed from a set of base prior knowledge input (PKI) models, which capture the characteristics common to all models in the library, and high-level ANNs which produce the library model outputs from the base PKI models. The technique is illustrated for a microwave multiconductor tunable phase shifter using anisotropic substrates. Closed-form relationships have been developed and are presented in this paper. The results show good agreement with expectations.
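The PKI idea, feeding a cheap base model's output to a small high-level ANN that learns the mapping to the accurate response, can be sketched with scikit-learn on synthetic data; the device response below is a made-up function, not the phase shifter of the paper.

```python
# Hedged PKI sketch: a crude base model's output becomes an input feature
# of an ANN that learns the accurate response (synthetic stand-in data).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
geom = rng.uniform(0.1, 1.0, size=(500, 2))          # geometrical parameters
base = geom[:, 0] / (1.0 + geom[:, 1])               # crude PKI base model
truth = base * (1.0 + 0.3 * np.sin(5 * geom[:, 1]))  # "EM simulator" response

X = np.column_stack([geom, base])                    # PKI output as a feature
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000,
                   random_state=0).fit(X, truth)
print("max |error|:", np.max(np.abs(ann.predict(X) - truth)))
```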
Multifractal Characterization of Geologic Noise for Improved UXO Detection and Discrimination
2008-03-01
Recovery of the Universal Multifractal Parameters ... dipole model to each magnetic anomaly and compares the extracted model parameters with a library of UXO items. They found that remnant magnetization ... the survey parameters, and the geologic environment. In this pilot study we have focused on the multifractal representation of natural variations
NASA Astrophysics Data System (ADS)
Cheng, Liantao; Zhang, Fenghui; Kang, Xiaoyu; Wang, Lang
2018-05-01
In evolutionary population synthesis (EPS) models, we need to convert stellar evolutionary parameters into spectra via interpolation in a stellar spectral library. For theoretical stellar spectral libraries, the spectrum grid is homogeneous on the effective-temperature and gravity plane for a given metallicity. It is relatively easy to derive stellar spectra. For empirical stellar spectral libraries, stellar parameters are irregularly distributed and the interpolation algorithm is relatively complicated. In those EPS models that use empirical stellar spectral libraries, different algorithms are used and the codes are often not released. Moreover, these algorithms are often complicated. In this work, based on a radial basis function (RBF) network, we present a new spectrum interpolation algorithm and its code. Compared with the other interpolation algorithms that are used in EPS models, it can be easily understood and is highly efficient in terms of computation. The code is written in MATLAB scripts and can be used on any computer system. Using it, we can obtain the interpolated spectra from a library or a combination of libraries. We apply this algorithm to several stellar spectral libraries (such as MILES, ELODIE-3.1 and STELIB-3.2) and give the integrated spectral energy distributions (ISEDs) of stellar populations (with ages from 1 Myr to 14 Gyr) by combining them with Yunnan-III isochrones. Our results show that the differences caused by the adoption of different EPS model components are less than 0.2 dex. All data about the stellar population ISEDs in this work and the RBF spectrum interpolation code can be obtained by request from the first author or downloaded from http://www1.ynao.ac.cn/~zhangfh.
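A minimal RBF interpolation sketch with SciPy over an irregular (Teff, log g, [Fe/H]) grid; the spectra are random stand-ins and the parameter scaling is an assumption (the authors' MATLAB implementation differs in detail).

```python
# RBF-based spectrum interpolation over irregular stellar parameters.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(3)
# Library: irregular (Teff, logg, [Fe/H]) points, one 1000-pixel spectrum each.
params = np.column_stack([rng.uniform(4000, 7000, 200),
                          rng.uniform(0.5, 5.0, 200),
                          rng.uniform(-2.0, 0.5, 200)])
spectra = rng.random((200, 1000))

# Scale parameters to comparable ranges before computing RBF distances.
scale = params.std(axis=0)
interp = RBFInterpolator(params / scale, spectra, neighbors=50,
                         kernel="thin_plate_spline")

star = np.array([[5777.0, 4.44, 0.0]])        # Sun-like parameters
print(interp(star / scale).shape)             # (1, 1000) interpolated spectrum
```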
Computer Simulation of the Circulation Subsystem of a Library
ERIC Educational Resources Information Center
Shaw, W. M., Jr.
1975-01-01
When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)
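A toy discrete-event version of such a circulation simulation, reporting availability and mean delay under one loan policy; all rates and counts below are invented.

```python
# Toy circulation-subsystem simulation: requests arrive at random against a
# small number of copies; report book availability and mean delay.
import heapq, random

random.seed(0)
LOAN_DAYS, N_COPIES, N_REQUESTS = 14, 2, 200
returns = []                                  # due-back times for loaned copies
available, t, delays = N_COPIES, 0.0, []

for _ in range(N_REQUESTS):
    t += random.expovariate(1 / 8.0)          # a request every ~8 days
    while returns and returns[0] <= t:        # process returns now due
        heapq.heappop(returns)
        available += 1
    if available:                             # copy on the shelf: no delay
        available -= 1
        delays.append(0.0)
        heapq.heappush(returns, t + LOAN_DAYS)
    else:                                     # wait for the next return
        back = heapq.heappop(returns)
        delays.append(back - t)
        heapq.heappush(returns, back + LOAN_DAYS)

print(f"availability {sum(d == 0 for d in delays) / len(delays):.0%}, "
      f"mean delay {sum(delays) / len(delays):.1f} days")
```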
Atmospheric and Fundamental Parameters of Stars in Hubble's Next Generation Spectral Library
NASA Technical Reports Server (NTRS)
Heap, Sally
2010-01-01
Hubble's Next Generation Spectral Library (NGSL) consists of R ≈ 1000 spectra of 374 stars of assorted temperature, gravity, and metallicity. We are presently working to determine the atmospheric and fundamental parameters of the stars from the NGSL spectra themselves via full-spectrum fitting of model spectra to the observed (extinction-corrected) spectrum over the full wavelength range, 0.2-1.0 micron. We use two grids of model spectra for this purpose: the very low-resolution spectral grid from Castelli-Kurucz (2004), and the grid from MARCS (2008). Both the observed spectrum and the MARCS spectra are first degraded in resolution to match the very low resolution of the Castelli-Kurucz models, so that our fitting technique is the same for both model grids. We will present our preliminary results, together with a comparison to those from the Sloan/SEGUE Stellar Parameter Pipeline, ELODIE, MILES, etc.
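Schematically, the full-spectrum fitting step reduces to degrading everything to a common resolution and minimizing chi-squared over a model grid, as in this stand-in sketch (random "model" spectra replace the Castelli-Kurucz and MARCS grids):

```python
# Schematic full-spectrum fit: degrade to a common resolution, then pick
# the grid model minimizing chi^2 (stand-in grid, mock observation).
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(4)
wave = np.linspace(200, 1000, 2000)                       # nm
grid = {(teff, logg): gaussian_filter1d(rng.random(2000), 5)
        for teff in (5000, 5500, 6000) for logg in (2.0, 4.5)}

observed = grid[(5500, 4.5)] + rng.normal(0, 0.01, 2000)  # mock observation
sigma = 0.01

def chi2(model):
    return np.sum(((observed - model) / sigma) ** 2)

best = min(grid, key=lambda k: chi2(grid[k]))
print("best (Teff, logg):", best)
```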
Architectural Optimization of Digital Libraries
NASA Technical Reports Server (NTRS)
Biser, Aileen O.
1998-01-01
This work investigates performance and scaling issues relevant to large-scale distributed digital libraries. Presently, performance and scaling studies focus on specific implementations of production or prototype digital libraries. Although useful information is gained that aids these designers and other researchers with insights into performance and scaling issues, the broader issues relevant to very large-scale distributed libraries are not addressed. Specifically, no current studies look at the extreme or worst-case possibilities in digital library implementations. A survey of digital library research issues is presented. Scaling and performance issues are mentioned frequently in the digital library literature but are generally not the focus of much of the current research. In this thesis a model for a Generic Distributed Digital Library (GDDL) and nine cases of typical user activities are defined. This model is used to facilitate some basic analysis of scaling issues: specifically, the calculation of the Internet traffic generated for different configurations of the study parameters, and an estimate of the future bandwidth needed for a large-scale distributed digital library implementation. This analysis demonstrates the potential impact a future distributed digital library implementation would have on the Internet traffic load and raises questions concerning the architecture decisions being made for future distributed digital library designs.
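A back-of-envelope traffic estimate of the kind described can be set up as below; every parameter value is illustrative and not taken from the thesis.

```python
# Illustrative traffic estimate for one configuration of study parameters.
users = 1_000_000            # user population
searches_per_user_day = 5
results_kb = 50              # metadata returned per search
retrievals_per_user_day = 1
document_mb = 2              # average retrieved object

daily_bytes = users * (searches_per_user_day * results_kb * 1024
                       + retrievals_per_user_day * document_mb * 1024**2)
print(f"~{daily_bytes / 1e12:.1f} TB/day, "
      f"~{daily_bytes * 8 / 86400 / 1e9:.2f} Gbit/s average")
```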
CHEMICAL EVOLUTION LIBRARY FOR GALAXY FORMATION SIMULATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saitoh, Takayuki R., E-mail: saitoh@elsi.jp
We have developed a software library for chemical evolution simulations of galaxy formation under the simple stellar population (SSP) approximation. In this library, all of the necessary components concerning chemical evolution, such as initial mass functions, stellar lifetimes, yields from Type II and Type Ia supernovae, asymptotic giant branch stars, and neutron star mergers, are compiled from the literature. Various models are pre-implemented in this library so that users can choose their favorite combination of models. Subroutines of this library return released energy and masses of individual elements depending on a given event type. Since the manner in which these quantities are redistributed depends on the implementation of the user's simulation code, the library leaves that step to the simulation code. As demonstrations, we carry out both one-zone, closed-box simulations and 3D simulations of a collapsing gas and dark matter system using this library. In these simulations, we can easily compare the impact of individual models on the chemical evolution of galaxies, just by changing the control flags and parameters of the library. Since this library only deals with the part of chemical evolution under the SSP approximation, any simulation code that uses the SSP approximation (namely, particle-based and mesh codes, as well as semianalytical models) can use it. This library is named "CELib" after the term "Chemical Evolution Library" and is made available to the community.
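Conceptually, a subroutine of such a library maps (event type, SSP properties) to released energy and per-element masses. The Python sketch below invents both the API and the yield numbers, so it shows only the calling pattern, not CELib's actual C interface or data.

```python
# Invented stand-in for an SSP-feedback query; yields are illustrative only.
def ssp_feedback(event_type, ssp_mass, metallicity, age_gyr):
    """Return (energy_erg, ejecta_masses_msun) for one feedback event type."""
    if event_type == "SNII":
        n_sn = 0.01 * ssp_mass                    # assumed SNe per Msun of SSP
        return n_sn * 1e51, {"Fe": 0.07 * n_sn, "O": 1.5 * n_sn}
    if event_type == "SNIa":
        n_sn = 0.001 * ssp_mass
        return n_sn * 1e51, {"Fe": 0.7 * n_sn, "O": 0.1 * n_sn}
    raise ValueError(event_type)

# The redistribution of the returned energy/masses onto gas elements is
# left to the caller, mirroring the division of labor described above.
print(ssp_feedback("SNII", ssp_mass=1e4, metallicity=0.02, age_gyr=0.01))
```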
Environment Modeling Using Runtime Values for JPF-Android
NASA Technical Reports Server (NTRS)
van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem
2015-01-01
Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that trigger the application's execution. For testing and verification, the environment of an application is simplified and abstracted using models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generate drivers and stubs based on values collected during runtime instead of using default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.
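The record-then-replay idea can be sketched in a few lines of Python: instrument a method to log its runtime return values, then generate a stub that replays a concrete value during verification. The real tool instruments Java code for JPF-Android; this is only an analogue.

```python
# Record phase: a decorator logs arguments and return values at runtime.
import functools

RECORDED = {}

def record(fn):
    @functools.wraps(fn)
    def wrapper(*args):
        out = fn(*args)
        RECORDED.setdefault(fn.__name__, []).append((args, out))
        return out
    return wrapper

@record
def get_device_id():            # imagine a native/library call
    return "A1B2-C3D4"

get_device_id()                 # "runtime" phase populates the log

# Replay phase: generate a stub that returns a recorded concrete value
# instead of an empty default.
def make_stub(name):
    values = [out for _, out in RECORDED.get(name, [])]
    def stub(*_args):
        return values[0] if values else None
    return stub

print(make_stub("get_device_id")())            # 'A1B2-C3D4'
```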
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram
Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach where a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.
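A minimal sketch of the empirical ingredient: time a constituent operation at a few sizes and use the measured costs, rather than an analytical machine model, to drive a choice. The block-size example is illustrative, not the Tensor Contraction Engine's actual cost model.

```python
# Empirically measure a constituent operation (blocked matrix multiply)
# and pick the block size with the lowest measured cost per flop.
import time
import numpy as np

def time_op(n, reps=3):
    a, b = np.random.rand(n, n), np.random.rand(n, n)
    t0 = time.perf_counter()
    for _ in range(reps):
        a @ b
    return (time.perf_counter() - t0) / reps

costs = {n: time_op(n) for n in (64, 128, 256)}
# A model of a larger computation can now be assembled from these measured
# per-block costs instead of an analytical machine model.
best = min(costs, key=lambda n: costs[n] / n**3)
print("cheapest block size per flop:", best)
```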
Villarrubia, J S; Vladár, A E; Ming, B; Kline, R J; Sunday, D F; Chawla, J S; List, S
2015-07-01
The width and shape of 10 nm to 12 nm wide lithographically patterned SiO2 lines were measured in the scanning electron microscope by fitting the measured intensity vs. position to a physics-based model in which the lines' widths and shapes are parameters. The approximately 32 nm pitch sample was patterned at Intel using a state-of-the-art pitch quartering process. Their narrow widths and asymmetrical shapes are representative of near-future generation transistor gates. These pose a challenge: the narrowness because electrons landing near one edge may scatter out of the other, so that the intensity profile at each edge becomes width-dependent, and the asymmetry because the shape requires more parameters to describe and measure. Modeling was performed by JMONSEL (Java Monte Carlo Simulation of Secondary Electrons), which produces a predicted yield vs. position for a given sample shape and composition. The simulator produces a library of predicted profiles for varying sample geometry. Shape parameter values are adjusted until interpolation of the library with those values best matches the measured image. Profiles thereby determined agreed with those determined by transmission electron microscopy and critical dimension small-angle x-ray scattering to better than 1 nm.
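The fitting loop, interpolating a library of simulated profiles in a shape parameter and minimizing the misfit to the measured profile, can be sketched as below; Gaussian stand-ins replace the JMONSEL physics.

```python
# Model-based metrology sketch: interpolate a profile library in a width
# parameter and fit it to a measured intensity profile.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(5)
x = np.linspace(-30, 30, 301)                       # nm across the line
widths = np.arange(8.0, 16.1, 1.0)                  # library grid, nm
library = np.array([np.exp(-x**2 / (2 * (w / 2.355) ** 2)) for w in widths])

def profile(w):                                     # interpolate in width
    return np.array([np.interp(w, widths, library[:, i])
                     for i in range(x.size)])

measured = (np.exp(-x**2 / (2 * (11.3 / 2.355) ** 2))
            + 0.01 * rng.standard_normal(x.size))
fit = minimize_scalar(lambda w: np.sum((profile(w) - measured) ** 2),
                      bounds=(8.0, 16.0), method="bounded")
print(f"fitted width: {fit.x:.2f} nm")
```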
AQUATOX Frequently Asked Questions
Capabilities, Installation, Source Code, Example Study Files, Biotic State Variables, Initial Conditions, Loadings, Volume, Sediments, Parameters, Libraries, Ecotoxicology, Waterbodies, Link to Watershed Models, Output, Metals, Troubleshooting
The effect of call libraries and acoustic filters on the identification of bat echolocation.
Clement, Matthew J; Murray, Kevin L; Solick, Donald I; Gruver, Jeffrey C
2014-09-01
Quantitative methods for species identification are commonly used in acoustic surveys for animals. While various identification models have been studied extensively, there has been little study of methods for selecting calls prior to modeling or methods for validating results after modeling. We obtained two call libraries with a combined 1556 pulse sequences from 11 North American bat species. We used four acoustic filters to automatically select and quantify bat calls from the combined library. For each filter, we trained a species identification model (a quadratic discriminant function analysis) and compared the classification ability of the models. In a separate analysis, we trained a classification model using just one call library. We then compared a conventional model assessment that used the training library against an alternative approach that used the second library. We found that filters differed in the share of known pulse sequences that were selected (68 to 96%), the share of non-bat noises that were excluded (37 to 100%), their measurement of various pulse parameters, and their overall correct classification rate (41% to 85%). Although the top two filters did not differ significantly in overall correct classification rate (85% and 83%), rates differed significantly for some bat species. In our assessment of call libraries, overall correct classification rates were significantly lower (15% to 23% lower) when tested on the second call library instead of the training library. Well-designed filters obviated the need for subjective and time-consuming manual selection of pulses. Accordingly, researchers should carefully design and test filters and include adequate descriptions in publications. Our results also indicate that it may not be possible to extend inferences about model accuracy beyond the training library. If so, the accuracy of acoustic-only surveys may be lower than commonly reported, which could affect ecological understanding or management decisions based on acoustic surveys.
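A hedged sketch of the paper's two assessments with scikit-learn's QDA: score on the training library versus on an independent second library (random stand-in pulse parameters for 11 species).

```python
# Train a quadratic discriminant classifier on one call library, then
# compare same-library vs. cross-library accuracy (stand-in features).
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

rng = np.random.default_rng(6)

def fake_library(n, shift):          # pulse parameters for 11 species
    y = rng.integers(0, 11, n)
    X = rng.normal(y[:, None] + shift, 1.0, size=(n, 6))
    return X, y

X_train, y_train = fake_library(800, shift=0.0)
X_test,  y_test  = fake_library(800, shift=0.5)   # second, differing library

qda = QuadraticDiscriminantAnalysis().fit(X_train, y_train)
print("same-library accuracy: ", qda.score(X_train, y_train))
print("cross-library accuracy:", qda.score(X_test, y_test))
```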
Library of Giant Planet Reflection Spectra for WFIRST and Future Space Telescopes
NASA Astrophysics Data System (ADS)
Smith, Adam J. R. W.; Fortney, Jonathan; Morley, Caroline; Batalha, Natasha E.; Lewis, Nikole K.
2018-01-01
Future large space telescopes will be able to directly image exoplanets in optical light. The optical light of a resolved planet is due to stellar flux reflected by Rayleigh scattering or cloud scattering, with absorption features imprinted due to molecular bands in the planetary atmosphere. To aid in the design of such missions, and to better understand a wide range of giant planet atmospheres, we have built a library of model giant planet reflection spectra, for the purpose of determining effective methods of spectral analysis as well as for comparison with actual imaged objects. This library covers a wide range of parameters: objects are modeled at ten orbital distances between 0.5 AU and 5.0 AU, a range that extends from planets too warm for water clouds out to true Jupiter analogs. These calculations include six metallicities between solar and 100x solar, a variety of cloud thickness parameters, and all possible phase angles.
AGAMA: Action-based galaxy modeling framework
NASA Astrophysics Data System (ADS)
Vasiliev, Eugene
2018-05-01
The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).
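A short usage sketch of the Python interface, following the AGAMA documentation as best recalled; parameter names and return conventions should be checked against the current docs.

```python
# Hedged AGAMA usage sketch: potential, action finder, orbit integration.
import agama

agama.setUnits(mass=1, length=1, velocity=1)      # Msun, kpc, km/s
pot = agama.Potential(type="MiyamotoNagai", mass=1e11,
                      scaleRadius=3.0, scaleHeight=0.3)
af = agama.ActionFinder(pot)

# Phase-space point (x, y, z, vx, vy, vz) -> actions (Jr, Jz, Jphi).
point = [8.0, 0.0, 0.0, 0.0, 200.0, 0.0]
print("actions:", af(point))

# Orbit integration from the same initial conditions.
times, orbit = agama.orbit(potential=pot, ic=point, time=2.0, trajsize=101)
print("final position:", orbit[-1, :3])
```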
X-ray Pulsars Across the Parameter Space of Luminosity, Accretion Mode, and Spin
NASA Astrophysics Data System (ADS)
Laycock, Silas; Yang, Jun; Christodoulou, Dimitris; Coe, Malcolm; Cappallo, Rigel; Zezas, Andreas; Ho, Wynn C. G.; Hong, JaeSub; Fingerman, Samuel; Drake, Jeremy J.; Kretschmar, Peter; Antoniou, Vallia
2017-08-01
We present our multi-satellite library of X-ray pulsar observations to the community, and highlight recent science results. Available at www.xraypulsars.space, the library provides a range of high-level data products, including activity histories, pulse-profiles, phased event files, and a unique pulse-profile modeling interface. The initial release (v1.0) contains some 15 years of RXTE-PCA, Chandra ACIS-I, and XMM-PN observations of the Small Magellanic Cloud, creating a valuable record of pulsar behavior. Our library is intended to enable new progress on fundamental NS parameters and accretion physics. The major motivations are (1) to assemble a large homogeneous sample to enable population statistics; this has so far been used to map the propeller transition and to explore the role of retrograde and prograde accretion disks; and (2) to obtain pulse-profiles for the same pulsars on many different occasions, at different luminosities and states, in order to break model degeneracies; this effort has led to preliminary measurements of the offsets between magnetic and spin axes. With the addition of other satellites, and Galactic pulsars, the library will cover the entire available range of luminosity, variability timescales and accretion regimes.
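A generic epoch-folding sketch of the pulse-profile product such a library serves; event times are synthetic and the period is merely SMC-pulsar-like, not the project's actual pipeline.

```python
# Epoch folding: fold event arrival times modulo the spin period and
# histogram the phases into a pulse profile.
import numpy as np

rng = np.random.default_rng(7)
period = 7.78                                  # s, an SMC-pulsar-like spin
t_bg = rng.uniform(0, 1e4, 5000)               # unpulsed background events
n_cyc = int(1e4 / period)
t_pl = (rng.integers(0, n_cyc, 2000)           # pulsed events near phase 0.3
        + rng.normal(0.3, 0.05, 2000)) * period
times = np.concatenate([t_bg, t_pl])

phases = (times / period) % 1.0                # the folding step
profile, _ = np.histogram(phases, bins=20, range=(0, 1))
print(profile)
```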
Chodkiewicz, Michał L; Migacz, Szymon; Rudnicki, Witold; Makal, Anna; Kalinowski, Jarosław A; Moriarty, Nigel W; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Adams, Paul D; Dominiak, Paulina Maria
2018-02-01
It has been recently established that the accuracy of structural parameters from X-ray refinement of crystal structures can be improved by using a bank of aspherical pseudoatoms instead of the classical spherical model of atomic form factors. This comes, however, at the cost of increased complexity of the underlying calculations. In order to facilitate the adoption of this more advanced electron density model by the broader community of crystallographers, a new software implementation called DiSCaMB ('densities in structural chemistry and molecular biology') has been developed. It addresses the challenge of providing high performance on modern computing architectures. With parallelization options for both multi-core processors and graphics processing units (using CUDA), the library features calculation of X-ray scattering factors and their derivatives with respect to structural parameters, gives access to intermediate steps of the scattering factor calculations (thus allowing for experimentation with modifications of the underlying electron density model), and provides tools for basic structural crystallographic operations. Permissively (MIT) licensed, DiSCaMB is an open-source C++ library that can be embedded in both academic and commercial tools for X-ray structure refinement.
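For orientation, the classical spherical-atom structure factor that DiSCaMB's aspherical pseudoatom model improves upon is just a form-factor-weighted Fourier sum, sketched below with toy values.

```python
# Classical spherical-atom structure factor for one reflection h:
# F(h) = sum_j f_j(h) * exp(2*pi*i * h . x_j), x_j in fractional coordinates.
# DiSCaMB's aspherical pseudoatom model and parameter derivatives go beyond
# this; the form-factor values here are toys.
import numpy as np

positions = np.array([[0.00, 0.00, 0.00],
                      [0.25, 0.25, 0.25]])   # fractional coordinates
f_spherical = np.array([6.0, 8.0])           # form factors at this h (toy)
h = np.array([1, 1, 1])

F = np.sum(f_spherical * np.exp(2j * np.pi * positions @ h))
print(abs(F))                                # structure-factor amplitude
```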
NASA Astrophysics Data System (ADS)
Fallahpour, Mojtaba Behzad; Dehghani, Hamid; Jabbar Rashidi, Ali; Sheikhi, Abbas
2018-05-01
Target recognition is one of the most important issues in the interpretation of synthetic aperture radar (SAR) images. Modelling, analysis, and recognition of the effects of influential parameters in SAR can provide a better understanding of SAR imaging systems, and therefore facilitate the interpretation of the produced images. Influential parameters in SAR images can be divided into five general categories: radar, radar platform, channel, imaging region, and processing section, each of which has different physical, structural, hardware, and software sub-parameters with clear roles in the final images. In this paper, for the first time, a behaviour library is extracted that captures the effects of polarisation, incidence angle, and target shape, as radar and imaging-region sub-parameters, in SAR images. This library shows that the created pattern for each of cylindrical, conical, and cubic shapes is unique, and due to their unique properties these types of shapes can be recognised in SAR images. This capability is applied to data acquired with the Canadian RADARSAT-1 satellite.
Multigroup cross section library for GFR2400
NASA Astrophysics Data System (ADS)
Čerba, Štefan; Vrban, Branislav; Lüley, Jakub; Haščík, Ján; Nečas, Vladimír
2017-09-01
In this paper, the development and optimization of the SBJ_E71 multigroup cross section library for GFR2400 applications is discussed. A cross section processing scheme, merging Monte Carlo and deterministic codes, was developed. Several fine and coarse group structures and two weighting flux options were analysed against 18 benchmark experiments selected from the ICSBEP handbook on the basis of similarity assessments. The performance of the collapsed version of the SBJ_E71 library was compared with MCNP5 CE ENDF/B-VII.1 and the Korean KAFAX-E70 library. The comparison was based on integral parameters of calculations performed on full-core homogeneous models.
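The flux-weighted group collapse at the heart of such library preparation is compactly expressible: for coarse group G, sigma_G = sum over fine groups g in G of sigma_g * phi_g, divided by sum of phi_g. Numbers below are illustrative.

```python
# Flux-weighted collapse from a fine to a coarse group structure.
import numpy as np

sigma_fine = np.array([2.1, 1.8, 1.5, 1.2, 0.9, 0.7])   # barns, 6 fine groups
phi_fine   = np.array([0.5, 1.0, 2.0, 2.5, 1.5, 0.5])   # weighting flux
coarse_map = [(0, 3), (3, 6)]                            # two coarse groups

sigma_coarse = [np.sum(sigma_fine[a:b] * phi_fine[a:b]) / np.sum(phi_fine[a:b])
                for a, b in coarse_map]
print(sigma_coarse)
```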
NASA Astrophysics Data System (ADS)
Brown, Alexander; Eviston, Connor
2017-02-01
Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change in impedance due to the presence of a notch. Realistic simulation capability for eddy current inspections is required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations, including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance in several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter-based eddy current models for other project applications.
Analysis of Brown camera distortion model
NASA Astrophysics Data System (ADS)
Nowakowski, Artur; Skarbek, Władysław
2013-10-01
Contemporary image acquisition devices introduce optical distortion into the image. This results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze the orthogonality, with regard to radius, of its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of the distortion parameter estimation is evaluated.
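The following sketch applies the Brown model exactly as parameterized by OpenCV's (k1, k2, p1, p2, k3) distortion vector and then recovers the undistorted point with cv2.undistortPoints; the coefficient values are arbitrary.

```python
# Brown model, OpenCV parameterization: radial terms k1, k2, k3 and
# tangential (decentering) terms p1, p2; forward-distort then undistort.
import cv2
import numpy as np

k1, k2, p1, p2, k3 = -0.28, 0.07, 1e-3, -5e-4, 0.0
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
dist = np.array([k1, k2, p1, p2, k3])

x, y = 0.20, -0.10                       # normalized image coordinates
r2 = x * x + y * y
radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y

pix = (K @ np.array([xd, yd, 1.0]))[:2]              # distorted pixel
undist = cv2.undistortPoints(pix.reshape(1, 1, 2), K, dist)
print(undist.ravel(), "vs", (x, y))                  # recovers (x, y)
```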
SP_Ace: a new code to derive stellar parameters and elemental abundances
NASA Astrophysics Data System (ADS)
Boeche, C.; Grebel, E. K.
2016-03-01
Context. Ongoing and future massive spectroscopic surveys will collect large numbers (106-107) of stellar spectra that need to be analyzed. Highly automated software is needed to derive stellar parameters and chemical abundances from these spectra. Aims: We developed a new method of estimating the stellar parameters Teff, log g, [M/H], and elemental abundances. This method was implemented in a new code, SP_Ace (Stellar Parameters And Chemical abundances Estimator). This is a highly automated code suitable for analyzing the spectra of large spectroscopic surveys with low or medium spectral resolution (R = 2000-20 000). Methods: After the astrophysical calibration of the oscillator strengths of 4643 absorption lines covering the wavelength ranges 5212-6860 Å and 8400-8924 Å, we constructed a library that contains the equivalent widths (EW) of these lines for a grid of stellar parameters. The EWs of each line are fit by a polynomial function that describes the EW of the line as a function of the stellar parameters. The coefficients of these polynomial functions are stored in a library called the "GCOG library". SP_Ace, a code written in FORTRAN95, uses the GCOG library to compute the EWs of the lines, constructs models of spectra as a function of the stellar parameters and abundances, and searches for the model that minimizes the χ2 deviation when compared to the observed spectrum. The code has been tested on synthetic and real spectra for a wide range of signal-to-noise and spectral resolutions. Results: SP_Ace derives stellar parameters such as Teff, log g, [M/H], and chemical abundances of up to ten elements for low to medium resolution spectra of FGK-type stars with precision comparable to the one usually obtained with spectra of higher resolution. Systematic errors in stellar parameters and chemical abundances are presented and identified with tests on synthetic and real spectra. Stochastic errors are automatically estimated by the code for all the parameters. A simple Web front end of SP_Ace can be found at http://dc.g-vo.org/SP_ACE while the source code will be published soon. Full Tables D.1-D.3 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/587/A2
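A toy rendering of the GCOG idea: each line's EW is a polynomial in the stellar parameters, and the parameters are recovered by least-squares matching of model EWs to "observed" ones. The coefficients below are invented and linear, so this only illustrates the mechanism, not SP_Ace's actual library.

```python
# Toy GCOG-style fit: EWs as polynomials of (Teff, logg, [M/H]), matched to
# observed EWs by least squares (three invented "lines").
import numpy as np
from scipy.optimize import minimize

coeffs = np.array([[80, -30,  -8, 25],      # per-line coefficients (toy):
                   [40, -10, -15, 40],      # EW = c0 + c1*t + c2*(logg-4)
                   [60,  20,  -5, 10]])     #      + c3*[M/H]

def ews(p):
    teff, logg, mh = p
    t = (teff - 5500.0) / 1000.0
    return coeffs @ np.array([1.0, t, logg - 4.0, mh])

obs = ews([5777.0, 4.44, 0.0])              # pretend-measured EWs
fit = minimize(lambda p: np.sum((ews(p) - obs) ** 2), x0=[5000.0, 3.0, -0.5])
print(fit.x)                                # ~ [5777, 4.44, 0.0]
```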
Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.
Huynh, Linh; Tagkopoulos, Ilias
2015-08-21
In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
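The two-step search can be sketched as pruning with a cheap surrogate and re-scoring the survivors with an expensive model; the part library and both scoring functions below are toys, not the paper's circuit models.

```python
# Hierarchical model switching, toy version: cheap model prunes the design
# space, expensive model refines the shortlist.
import itertools

parts = {"promoter": ["pA", "pB", "pC"], "rbs": ["r1", "r2", "r3"]}
STRENGTH = {"pA": 1.0, "pB": 2.0, "pC": 3.0, "r1": 0.5, "r2": 1.0, "r3": 2.0}
TARGET = 3.0

def cheap_score(design):                      # fast linear surrogate
    return abs(STRENGTH[design[0]] * STRENGTH[design[1]] - TARGET)

def expensive_score(design):                  # stand-in for a nonlinear model
    flux = STRENGTH[design[0]] * STRENGTH[design[1]]
    return abs(flux / (1.0 + 0.1 * flux) - TARGET / (1.0 + 0.1 * TARGET))

candidates = list(itertools.product(parts["promoter"], parts["rbs"]))
shortlist = sorted(candidates, key=cheap_score)[:3]     # step 1: prune
best = min(shortlist, key=expensive_score)              # step 2: refine
print(best)
```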
INTRIGOSS: A new Library of High Resolution Synthetic Spectra
NASA Astrophysics Data System (ADS)
Franchini, Mariagrazia; Morossi, Carlo; Di Marcantonio, Paolo; Chavez, Miguel; GES-Builders
2018-01-01
INTRIGOSS (INaf TRIeste Grid Of Synthetic Spectra) is a new high-resolution (HiRes) synthetic spectral library designed for studying F, G, and K stars. The library is based on atmosphere models computed with specified individual element abundances via the ATLAS12 code. Normalized SPectra (NSP) and surface Flux SPectra (FSP), in the 4800-5400 Å wavelength range, were computed by means of the SPECTRUM code. The synthetic spectra are computed with an atomic and bi-atomic molecular line list including "bona fide" Predicted Lines (PLs) built by tuning log gf to reproduce a very high-S/N solar spectrum and the UVES-U580 spectra of five cool giants extracted from the Gaia-ESO survey (GES). The astrophysical gf-values were then assessed by using more than 2000 stars with homogeneous and accurate atmosphere parameters and detailed chemical composition from GES. The validity and greater accuracy of INTRIGOSS NSPs and FSPs with respect to other available spectral libraries are discussed. INTRIGOSS will be available on the web and will be a valuable tool for both stellar atmospheric parameter and stellar population studies.
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2013-02-01
This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
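A Python analogue of the procedural idea (GDL itself is ArchiCAD's embedded language): a façade routine places library objects by simple rules and proportions driven by user parameters. The rules and object names below are invented for illustration.

```python
# Toy procedural facade: compose library objects from user parameters
# using simple rules and proportions (Python stand-in for GDL).
def facade(width_m, storeys, bay_m=3.0, storey_h=3.2):
    objects = []
    bays = max(1, round(width_m / bay_m))     # proportion rule: ~3 m bays
    for s in range(storeys):
        for b in range(bays):
            # Rule: ground-floor centre bay gets the door.
            kind = "door" if (s == 0 and b == bays // 2) else "sash_window"
            objects.append((kind, b * width_m / bays, s * storey_h))
    return objects

for obj in facade(width_m=9.0, storeys=2):
    print(obj)                                # (type, x, z) placements
```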
Revived STIS. II. Properties of Stars in the Next Generation Spectral Library
NASA Technical Reports Server (NTRS)
Heap, Sara R.; Lindler, D.
2010-01-01
Spectroscopic surveys of galaxies at high redshift will bring the rest-frame ultraviolet into view of large, ground-based telescopes. The UV-blue spectral region is rich in diagnostics, but these diagnostics have not yet been calibrated in terms of the properties of the responsible stellar population(s). Such calibrations are now possible with Hubble's Next Generation Spectral Library (NGSL). The NGSL contains UV-optical spectra (0.2 - 1.0 microns) of 374 stars having a wide range in temperature, luminosity, and metallicity. We will describe our work to derive basic stellar parameters from NGSL spectra using modern model spectra and to use these stellar parameters to develop UV-blue spectral diagnostics.
X-ray Pulsars Across the Parameter Space of Luminosity, Accretion Mode, and Spin
NASA Astrophysics Data System (ADS)
Laycock, Silas
We propose to expand the scope of our successful project providing a multi-satellite library of X-ray pulsar observations to the community. The library provides high-level products, activity monitoring, pulse profiles, phased event files, spectra, and a unique pulse-profile modeling interface. The library's scientific footprint will expand in four key directions: (1) Update, by processing all new XMM-Newton and Chandra observations (2015-2017) of X-ray binary pulsars in the Magellanic Clouds. (2) Expand, by including all archival Suzaku, Swift, and NuSTAR observations, and including Galactic pulsars. (3) Improve, by offering innovative data products that provide deeper insight. (4) Advance, by implementing a new generation of physically motivated emission and pulse-profile models. The library currently includes some 2000 individual RXTE-PCA, 200 Chandra ACIS-I, and 120 XMM-PN observations of the SMC spanning 15 years, creating an unrivaled record of pulsar temporal behavior. In Phase 2, additional observations of SMC pulsars will be added: 221 Chandra (ACIS-S and ACIS-I), 22 XMM-PN, 142 XMM-MOS, 92 Suzaku, 25 NuSTAR, and >10,000 Swift, leveraging the pipeline and analysis techniques we have already developed. With the addition of 7 Galactic pulsars, each having several hundred multi-satellite observations, these datasets cover the entire range of variability timescales and accretion regimes. We will model the pulse profiles using state-of-the-art techniques to parameterize their morphology, obtain the distribution of offsets between magnetic and spin axes, and create samples of profiles under specific accretion modes (whether pencil-beam or fan-beam dominated). These products are needed for the next generation of advances in neutron star theory and modeling. The long duration of the dataset and the "whole-galaxy" nature of the SMC sample make possible a new statistical approach to uncovering the duty-cycle distribution, and hence the population demographics, of transient High Mass X-ray Binary (HMXB) populations. Our unique library is already fueling progress on fundamental NS parameters and accretion physics.
Determination of elastomeric foam parameters for simulations of complex loading.
Petre, M T; Erdemir, A; Cavanagh, P R
2006-08-01
Finite element (FE) analysis has shown promise for the evaluation of elastomeric foam personal protection devices. Although appropriate representation of foam materials is necessary in order to obtain realistic simulation results, material definitions used in the literature vary widely and often fail to account for the multi-mode loading experienced by these devices. This study aims to provide a library of elastomeric foam material parameters that can be used in FE simulations of complex loading scenarios. Twelve foam materials used in footwear were tested in uniaxial compression, simple shear, and volumetric compression. For each material, parameters for a common compressible hyperelastic material model used in FE analysis were determined using: (a) compression data; (b) compression and shear data; and (c) data from all three tests. Material parameters and Drucker stability limits for the best fits are provided with their associated errors. The material model was able to reproduce the deformation modes for which data were provided during parameter determination but was unable to predict behavior in other deformation modes. Simulation results were found to be highly dependent on the extent of the test data used to determine the parameters in the material definition. This finding calls into question the many published results of simulations of complex loading that use foam material parameters obtained from a single mode of testing. The library of foam parameters developed here presents associated errors in three deformation modes, which should allow a more informed selection of material parameters.
PAR -- Interface to the ADAM Parameter System
NASA Astrophysics Data System (ADS)
Currie, Malcolm J.; Chipperfield, Alan J.
PAR is a library of Fortran subroutines that provides convenient mechanisms for applications to exchange information with the outside world, through input-output channels called parameters. Parameters enable a user to control an application's behaviour. PAR supports numeric, character, and logical parameters, and is currently implemented only on top of the ADAM parameter system. The PAR library permits parameter values to be obtained, either without constraints or subject to a variety of constraints. Results may be put into parameters to be passed on to other applications. Other facilities include setting a prompt string and suggested defaults. This document also introduces a preliminary C interface for the PAR library; this may be subject to change in the light of experience.
Modelling of backscatter from vegetation layers
NASA Technical Reports Server (NTRS)
Van Zyl, J. J.; Engheta, N.; Papas, C. H.; Elachi, C.; Zebker, H.
1985-01-01
A simple way to build up a library of models that may be used to distinguish between different types of vegetation and ground surfaces by means of their backscatter properties is presented. The curve of constant power received by the antenna (the Gamma sphere) is calculated for a given Stokes scattering operator, and the model parameters of the most similar library-model Gamma sphere are adopted. Results calculated for a single-scattering model resembling coniferous trees are compared with the Gamma spheres of a model resembling tropical-region trees. The polarization that minimizes the effect of either the ground surface or the vegetation layer can be calculated and used to analyze the backscatter from the ground-surface/vegetation-layer combination, enhancing the power received from the desired part of the combination.
Development of Probabilistic Socio-Economic Emissions Scenarios (2012)
The purpose of this analysis is to help overcome these limitations through the development of a publicly available library of socio-economic-emissions projections derived from a systematic examination of uncertainty in key underlying model parameters…
Metrology of deep trench etched memory structures using 3D scatterometry
NASA Astrophysics Data System (ADS)
Reinig, Peter; Dost, Rene; Moert, Manfred; Hingst, Thomas; Mantz, Ulrich; Moffitt, Jasen; Shakya, Sushil; Raymond, Christopher J.; Littau, Mike
2005-05-01
Scatterometry is receiving considerable attention as an emerging optical metrology in the silicon industry. One area of progress in deploying these powerful measurements in process control is performing measurements on real device structures, as opposed to limiting scatterometry measurements to periodic structures, such as line-space gratings, placed in the wafer scribe. In this work we discuss applications of 3D scatterometry to the measurement of advanced trench memory devices. This is a challenging and complex scatterometry application that requires exceptionally high-performance computational abilities. In order to represent the physical device, the relatively tall structures require a high number of slices in the rigorous coupled-wave analysis (RCWA) theoretical model. This is complicated further by the presence of an amorphous silicon hard mask on the surface, to which the reflectance signature is highly sensitive and which therefore needs to be modeled in detail. The overall structure comprises several layers, with the trenches presenting a complex bow-shaped sidewall that must be measured. Finally, the double periodicity in the structures demands significantly greater computational capabilities. Our results demonstrate that angular scatterometry is sensitive to the key parameters of interest. The influence of further model parameters and parameter cross-correlations has to be carefully taken into account. Profile results obtained by non-library optimization methods compare favorably with cross-section SEM images. Generating a model library suitable for process control, which is preferred for precision, presents numerical throughput challenges. Details are discussed regarding library-generation approaches and strategies for reducing the numerical overhead. Scatterometry and SEM results are compared, leading to conclusions about the feasibility of this advanced application.
Wrapping Python around MODFLOW/MT3DMS based groundwater models
NASA Astrophysics Data System (ADS)
Post, V.
2008-12-01
Numerical models that simulate groundwater flow and solute transport require a great amount of input data that is often organized into different files. A large proportion of the input data consists of spatially distributed model parameters. The model output consists of a variety of data, such as heads, fluxes, and concentrations. Typically, these files all have different formats. Consequently, preparing input and managing output is a complex and error-prone task. Proprietary software tools are available that facilitate the preparation of input files and analysis of model outcomes. The use of such software may be limited if it does not support all the features of the groundwater model or when the costs of such tools are prohibitive. Therefore a Python library was developed that contains routines to generate input files and process output files of MODFLOW/MT3DMS based models. The library is freely available and has an open structure so that the routines can be customized and linked into other scripts and libraries. The current set of functions supports the generation of input files for MODFLOW and MT3DMS, including the capability to read spatially distributed input parameters (e.g. hydraulic conductivity) from PNG files. Both ASCII and binary output files can be read efficiently, allowing for visualization of, for example, solute concentration patterns in contour plots with superimposed flow vectors using matplotlib. Series of contour plots are then easily saved as an animation. The subroutines can also be used within scripts to calculate derived quantities such as the mass of a solute within a particular region of the model domain. Using Python as a wrapper around groundwater models provides an efficient and flexible way of processing input and output data, which is not constrained by limitations of third-party products.
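As a rough illustration of the post-processing such a wrapper enables, the sketch below contours a concentration field with superimposed flow vectors using numpy and matplotlib. The grid and fields are invented for the example; a real script would first decode the MODFLOW/MT3DMS binary records rather than synthesize data.

```python
import numpy as np
import matplotlib.pyplot as plt

nrow, ncol = 50, 80                       # hypothetical model grid
y, x = np.mgrid[0:nrow, 0:ncol]

# In practice these arrays would be read from MODFLOW/MT3DMS binary
# output (e.g., np.fromfile plus the file's record structure); synthetic
# fields are used here so the sketch runs stand-alone.
conc = np.exp(-((x - 30)**2 + (y - 25)**2) / 200.0)   # toy solute plume
qx = 0.5 * np.ones_like(conc)                          # uniform flow in x
qy = np.zeros_like(conc)

fig, ax = plt.subplots()
cs = ax.contourf(conc, levels=10, cmap="viridis")      # concentration pattern
step = 5                                               # thin the vector field
ax.quiver(x[::step, ::step], y[::step, ::step],
          qx[::step, ::step], qy[::step, ::step], color="white")
fig.colorbar(cs, label="relative concentration")
fig.savefig("conc_with_flow.png", dpi=150)
```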
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Nathan; Menikoff, Ralph
2017-02-03
Equilibrium thermodynamics underpins many of the technologies used throughout theoretical physics, yet verification of the various theoretical models in the open literature remains challenging. EOSlib provides a single, consistent, verifiable implementation of these models, in a single, easy-to-use software package. It consists of three parts: a software library implementing various published equation-of-state (EOS) models; a database of fitting parameters for various materials for these models; and a number of useful utility functions for simplifying thermodynamic calculations such as computing Hugoniot curves or Riemann problem solutions. Ready availability of this library will enable reliable code-to-code testing of equation-of-state implementations, as well as a starting point for more rigorous verification work. EOSlib also provides a single, consistent API for its analytic and tabular EOS models, which simplifies the process of comparing models for a particular application.
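To make the flavor of such utility functions concrete, here is a hedged, self-contained sketch of one classic EOS calculation: the principal Hugoniot implied by a linear Us-up shock relation, which underlies the Mie-Gruneisen form. This is not EOSlib's actual API, and the aluminum-like constants are illustrative only.

```python
# Principal Hugoniot from Us = c0 + s*up and the Rankine-Hugoniot jump
# conditions.  Material constants are rough aluminum-like values chosen
# for illustration, not vetted database parameters.
import numpy as np

rho0 = 2700.0    # initial density, kg/m^3 (illustrative)
c0   = 5350.0    # bulk sound speed, m/s (illustrative)
s    = 1.34      # Us-up slope (illustrative)

def hugoniot(eta):
    """Pressure and specific energy on the Hugoniot.

    eta = 1 - v/v0 is the compression; the jump conditions give
    P_H = rho0*c0^2*eta/(1 - s*eta)^2 and E_H - E0 = P_H*eta/(2*rho0).
    """
    p = rho0 * c0**2 * eta / (1.0 - s * eta)**2
    e = p * eta / (2.0 * rho0)
    return p, e

for eta in np.linspace(0.0, 0.3, 7):
    p, e = hugoniot(eta)
    print(f"eta={eta:4.2f}  P={p/1e9:7.2f} GPa  dE={e/1e6:6.3f} MJ/kg")
```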
Economical analysis of saturation mutagenesis experiments
Acevedo-Rocha, Carlos G.; Reetz, Manfred T.; Nov, Yuval
2015-01-01
Saturation mutagenesis is a powerful technique for engineering proteins, metabolic pathways and genomes. In spite of its numerous applications, creating high-quality saturation mutagenesis libraries remains a challenge, as various experimental parameters influence in a complex manner the resulting diversity. We explore from the economical perspective various aspects of saturation mutagenesis library preparation: We introduce a cheaper and faster control for assessing library quality based on liquid media; analyze the role of primer purity and supplier in libraries with and without redundancy; compare library quality, yield, randomization efficiency, and annealing bias using traditional and emergent randomization schemes based on mixtures of mutagenic primers; and establish a methodology for choosing the most cost-effective randomization scheme given the screening costs and other experimental parameters. We show that by carefully considering these parameters, laboratory expenses can be significantly reduced. PMID:26190439
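The oversampling arithmetic underlying such cost analyses can be made concrete with a short calculation. The sketch below uses the standard sampling statistics for library completeness, not the paper's own code; NNK randomization with 32 codons per site is assumed for the example.

```python
# How many transformants must be screened so that each codon variant is
# sampled with a given probability?  Standard coupon-collector-style
# bound: (1 - 1/V)^L <= 1 - completeness.
import math

def clones_needed(variants: int, completeness: float) -> int:
    """Smallest sample size L such that each of V variants is seen
    with probability >= completeness."""
    return math.ceil(math.log(1.0 - completeness) /
                     math.log(1.0 - 1.0 / variants))

# NNK randomization encodes 32 codons per site; k sites give 32^k variants.
for sites in (1, 2, 3):
    v = 32 ** sites
    print(f"{sites} site(s): {clones_needed(v, 0.95)} clones for 95% coverage")
```

With these assumptions, one NNK site needs about 95 clones for 95% completeness, and the requirement grows roughly as 3V for V variants, which is why randomization schemes with reduced redundancy pay off quickly.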
Different Manhattan project: automatic statistical model generation
NASA Astrophysics Data System (ADS)
Yap, Chee Keng; Biermann, Henning; Hertzmann, Aaron; Li, Chen; Meyer, Jon; Pao, Hsing-Kuo; Paxia, Salvatore
2002-03-01
We address the automatic generation of large geometric models. This is important in visualization for several reasons. First, many applications need access to large but interesting data models. Second, we often need such data sets with particular characteristics (e.g., urban models, park and recreation landscapes). Thus we need the ability to generate models with different parameters. We propose a new approach for generating such models. It is based on a top-down propagation of statistical parameters. We illustrate the method in the generation of a statistical model of Manhattan, but the method is generally applicable to the generation of models of large geographical regions. Our work is related to the literature on generating complex natural scenes (smoke, forests, etc.) based on procedural descriptions. The difference in our approach stems from three characteristics: modeling with statistical parameters, integration of ground truth (actual map data), and a library-based approach to texture mapping.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aryal, Prakash; Molloy, Janelle A.; Rivard, Mark J., E-mail: mark.j.rivard@gmail.com
2014-02-15
Purpose: To investigate potential causes for differences in TG-43 brachytherapy dosimetry parameters in the existing literature for the model IAI-125A ^125I seed and to propose new standard dosimetry parameters. Methods: The MCNP5 code was used for Monte Carlo (MC) simulations. The sensitivity of dose distributions, and subsequently of TG-43 dosimetry parameters, was explored to reproduce the historical methods upon which American Association of Physicists in Medicine (AAPM) consensus data are based. Twelve simulation conditions varying ^125I coating thickness, coating mass density, photon interaction cross-section library, and photon emission spectrum were examined. Results: Varying ^125I coating thickness, coating mass density, photon cross-section library, and photon emission spectrum for the model IAI-125A seed changed the dose-rate constant by up to 0.9%, about 1%, about 3%, and 3%, respectively, in comparison to the proposed standard value of 0.922 cGy h^-1 U^-1. The dose-rate constant values by Solberg et al. ["Dosimetric parameters of three new solid core ^125I brachytherapy sources," J. Appl. Clin. Med. Phys. 3, 119-134 (2002)], Meigooni et al. ["Experimental and theoretical determination of dosimetric characteristics of IsoAid ADVANTAGE™ ^125I brachytherapy source," Med. Phys. 29, 2152-2158 (2002)], and Taylor and Rogers ["An EGSnrc Monte Carlo-calculated database of TG-43 parameters," Med. Phys. 35, 4228-4241 (2008)] for the model IAI-125A seed, and by Kennedy et al. ["Experimental and Monte Carlo determination of the TG-43 dosimetric parameters for the model 9011 THINSeed™ brachytherapy source," Med. Phys. 37, 1681-1688 (2010)] for the model 6711 seed, were +4.3% (0.962 cGy h^-1 U^-1), +6.2% (0.98 cGy h^-1 U^-1), +0.3% (0.925 cGy h^-1 U^-1), and -0.2% (0.921 cGy h^-1 U^-1), respectively, in comparison to the proposed standard value. Differences in the radial dose functions between the current study and both Solberg et al. and Meigooni et al. were <10% for r ≤ 5 cm, and increased for r > 5 cm with a maximum difference of 29% at r = 9 cm. In comparison to Taylor and Rogers, these differences were lower (maximum of 2% at r = 9 cm). For the similarly designed model 6711 ^125I seed, differences did not exceed 0.5% for 0.5 ≤ r ≤ 10 cm. Radial dose function values varied by 1% as coating thickness and coating density were changed. Varying the cross-section library and source spectrum altered the radial dose function by 25% and 12%, respectively, but these differences occurred at r = 10 cm where the dose rates were very low. The 2D anisotropy function results were most similar to those of Solberg et al. and most different from those of Meigooni et al. The observed order of simulation condition variables from most to least important for influencing the 2D anisotropy function was spectrum, coating thickness, coating density, and cross-section library. Conclusions: Several MC radiation transport codes are available for calculation of the TG-43 dosimetry parameters for brachytherapy seeds. The physics models in these codes and their related cross-section libraries have been updated and improved since publication of the 2007 AAPM TG-43U1S1 report. Results using modern data indicated statistically significant differences in these dosimetry parameters in comparison to data recommended in the TG-43U1S1 report. Therefore, professional societies such as the AAPM should consider reevaluating the consensus data for this and other seeds, and establishing a process of regular evaluations in which consensus data are based upon methods that remain state-of-the-art.
Developing a Suitable Model for Water Uptake for Biodegradable Polymers Using Small Training Sets.
Valenzuela, Loreto M; Knight, Doyle D; Kohn, Joachim
2016-01-01
Prediction of the dynamic properties of water uptake across polymer libraries can accelerate polymer selection for a specific application. We first built semiempirical models using artificial neural networks and all water uptake data as individual inputs. These models give very good correlations (R^2 > 0.78 for the test set) but very low accuracy on cross-validation sets (less than 19% of experimental points within experimental error). Instead, using consolidated parameters such as equilibrium water uptake, a good model is obtained (R^2 = 0.78 for the test set), with accurate predictions for 50% of tested polymers. The semiempirical model was applied to the 56-polymer library of L-tyrosine-derived polyarylates, identifying groups of polymers that are likely to satisfy design criteria for water uptake. This research demonstrates that a surrogate modeling effort can reduce the number of polymers that must be synthesized and characterized to identify an appropriate polymer that meets certain performance criteria.
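A minimal sketch of this surrogate-modeling workflow, using scikit-learn in place of whatever ANN implementation the authors used, is shown below; the descriptors and responses are synthetic stand-ins, so only the shape of the workflow carries over.

```python
# Fit a small neural-network surrogate to polymer descriptors and score
# it on a held-out test set.  Data are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(56, 4))        # 56 polymers x 4 descriptors (toy)
y = X @ np.array([0.5, 1.5, -0.8, 0.3]) + 0.05 * rng.normal(size=56)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)
print("test R^2:", r2_score(y_te, model.predict(X_te)))
```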
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamont, Stephen Philip; Brisson, Marcia; Curry, Michael
2011-02-17
Nuclear forensics assessments to determine material process history require careful comparison of sample data to both measured and modeled nuclear material characteristics. Developing centralized databases, or nuclear forensics libraries, to house this information is an important step to ensure all relevant data will be available for comparison during a nuclear forensics analysis and to help expedite the assessment of material history. The approach most widely accepted by the international community at this time is the implementation of National Nuclear Forensics Libraries, which would be developed and maintained by individual nations. This is an attractive alternative to an international database, since it provides an understanding that each country has data on materials produced and stored within its borders, but eliminates the need to reveal any proprietary or sensitive information to other nations. To support the concept of National Nuclear Forensics Libraries, the United States Department of Energy has developed a model library, based on a data dictionary, or set of parameters, designed to capture all nuclear-forensics-relevant information about a nuclear material. Specifically, this information includes material identification, collection background and current location, analytical laboratories where measurements were made, material packaging and container descriptions, physical characteristics including mass and dimensions, chemical and isotopic characteristics, particle morphology or metallurgical properties, process history including facilities, and measurement quality assurance information. While not necessarily required, it may also be valuable to store modeled data sets, including reactor burn-up or enrichment cascade data, for comparison. It is fully expected that only a subset of this information is available or relevant for many materials, and much of the data populating a National Nuclear Forensics Library would be process analytical or material accountability measurement data, as opposed to a complete forensic analysis of each material in the library.
Electron lithography STAR design guidelines. Part 2: The design of a STAR for space applications
NASA Technical Reports Server (NTRS)
Trotter, J. D.; Newman, W.
1982-01-01
The STAR design system developed by NASA enables any user with a logic diagram to design a semicustom digital MOS integrated circuit. The system comprises a library of standard logic cells and computer programs to place, route, and display designs implemented with cells from the library. Also described is the development of a radiation-hard array designed for the STAR system. The design is based on the CMOS silicon-gate technology developed by Sandia National Laboratories. The design rules used are given, as well as the model parameters developed for the basic array element. Library cells of the CMOS metal-gate and CMOS silicon-gate technologies were simulated using SPICE, and the results are shown and compared.
NASA Technical Reports Server (NTRS)
Garcia, J.; Dauser, T.; Reynolds, C. S.; Kallman, T. R.; McClintock, J. E.; Wilms, J.; Ekmann, W.
2013-01-01
We present a new and complete library of synthetic spectra for modeling the component of emission that is reflected from an illuminated accretion disk. The spectra were computed using an updated version of our code xillver that incorporates new routines and a richer atomic database. We offer, in the form of a table model, an extensive grid of reflection models that cover a wide range of parameters. Each individual model is characterized by the photon index Γ of the illuminating radiation, the ionization parameter ξ at the surface of the disk (i.e., the ratio of the X-ray flux to the gas density), and the iron abundance A_Fe relative to the solar value. The ranges of the parameters covered are 1.2 ≤ Γ ≤ 3.4, 1 ≤ ξ ≤ 10^4, and 0.5 ≤ A_Fe ≤ 10. These ranges capture the physical conditions typically inferred from observations of active galactic nuclei, and also stellar-mass black holes in the hard state. This library is intended for use when the thermal disk flux is faint compared to the incident power-law flux. The models are expected to provide an accurate description of the Fe K emission line, which is the crucial spectral feature used to measure black hole spin. A total of 720 reflection spectra are provided in a single FITS file suitable for the analysis of X-ray observations via the atable model in xspec. Detailed comparisons with previous reflection models illustrate the improvements incorporated in this version of xillver.
mr: A C++ library for the matching and running of the Standard Model parameters
NASA Astrophysics Data System (ADS)
Kniehl, Bernd A.; Pikelner, Andrey F.; Veretin, Oleg L.
2016-09-01
We present the C++ program library mr that allows us to reliably calculate the values of the running parameters in the Standard Model at high energy scales. The initial conditions are obtained by relating the running parameters in the MS-bar renormalization scheme to observables at lower energies with full two-loop precision. The evolution is then performed in accordance with the renormalization group equations with full three-loop precision. Pure QCD corrections to the matching and running are included through four loops. We also provide a Mathematica interface for this program library. Catalogue identifier: AFAI_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFAI_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 517613. No. of bytes in distributed program, including test data, etc.: 2358729. Distribution format: tar.gz. Programming language: C++. Computer: IBM PC. Operating system: Linux, Mac OS X. RAM: 1 GB. Classification: 11.1. External routines: TSIL [1], OdeInt [2], boost [3]. Nature of problem: The running parameters of the Standard Model renormalized in the MS-bar scheme at some high renormalization scale, which is chosen by the user, are evaluated in perturbation theory as precisely as possible in two steps. First, the initial conditions at the electroweak energy scale are evaluated from the Fermi constant GF and the pole masses of the W, Z, and Higgs bosons and the bottom and top quarks, including the full two-loop threshold corrections. Second, the evolution to the high energy scale is performed by numerically solving the renormalization group evolution equations through three loops. Pure QCD corrections to the matching and running are included through four loops. Solution method: Numerical integration of analytic expressions. Additional comments: Available for download from URL: http://apik.github.io/mr/. The MathLink interface is tested to work with Mathematica 7-9 and, with an additional flag, also with Mathematica 10 under Linux and with Mathematica 10 under Mac OS X. Running time: less than 1 second. References: [1] S. P. Martin and D. G. Robertson, Comput. Phys. Commun. 174 (2006) 133-151 [hep-ph/0501132]. [2] K. Ahnert and M. Mulansky, AIP Conf. Proc. 1389 (2011) 1586-1589 [arXiv:1110.3397 [cs.MS]]. [3] boost, http://www.boost.org.
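For orientation, the sketch below integrates the one-loop QCD renormalization group equation for the strong coupling, the simplest instance of the running that mr performs at far higher precision. It keeps nf = 5 fixed and ignores flavor thresholds, so it is a toy illustration of the method, not a substitute for the library.

```python
# One-loop running of alpha_s from MZ to a high scale:
# d(alpha_s)/d(ln mu) = -b0/(2*pi) * alpha_s^2, b0 = 11 - 2*nf/3.
import numpy as np
from scipy.integrate import solve_ivp

b0 = 11.0 - 2.0 * 5 / 3.0            # one-loop beta coefficient, nf = 5

def beta(t, alpha):                   # t = ln(mu / MZ)
    return -b0 / (2.0 * np.pi) * alpha**2

alpha_mz = 0.1181                     # alpha_s(MZ), PDG-like input value
sol = solve_ivp(beta, [0.0, np.log(1e16 / 91.19)], [alpha_mz], rtol=1e-8)
print("alpha_s at 1e16 GeV (one loop, toy):", sol.y[0, -1])
```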
An advanced environment for hybrid modeling of biological systems based on modelica.
Pross, Sabrina; Bachmann, Bernhard
2011-01-20
Biological systems are often very complex, so an appropriate formalism is needed for modeling their behavior. Hybrid Petri nets, consisting of time-discrete Petri net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri net library was implemented based on the object-oriented modeling language Modelica, which allows the modeling of discrete, stochastic, and continuous Petri net elements by differential, algebraic, and discrete equations. An appropriate Modelica tool performs the hybrid simulation with discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica models can be connected to Simulink models for parameter optimization, sensitivity analysis, and stochastic simulation in Matlab. The present paper illustrates the implementation of the Petri net component models, their usage within the modeling process, and the coupling between the Modelica tool Dymola and Matlab/Simulink. The application is demonstrated by modeling the metabolism of Chinese hamster ovary cells.
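The hybrid discrete/continuous idea can be illustrated in a few lines of Python (the actual library is written in Modelica): a continuous place evolves under an ODE while a discrete event switches the system's mode. This is a generic illustration, not the library's implementation.

```python
# A continuous place (substrate/product pair) integrated with forward
# Euler, plus one discrete transition that removes a token when the
# product crosses a threshold.  All values are toy.
def simulate(t_end=10.0, dt=0.01):
    substrate, product = 5.0, 0.0     # continuous places
    gene_on = 1                       # discrete place (token count)
    t = 0.0
    while t < t_end:
        if gene_on:                   # continuous transition: enzyme reaction
            rate = 0.8 * substrate / (1.0 + substrate)  # Michaelis-Menten
            substrate -= rate * dt
            product += rate * dt
        if gene_on and product > 2.0: # discrete event: switch gene off
            gene_on = 0
        t += dt
    return substrate, product, gene_on

print(simulate())
```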
Ideas for the rapid development of the structural models in mechanical engineering
NASA Astrophysics Data System (ADS)
Oanta, E.; Raicu, A.; Panait, C.
2017-08-01
Conceiving computer-based instruments has been a long-standing concern of the authors. Some of their original solutions are: optimal processing of large matrices, interfaces between programming languages, approximation theory using spline functions, and increased accuracy in numerical programming based on extended arbitrary-precision libraries. For the rapid development of models we identified the following directions: atomization, ‘librarization’, parameterization, automatization, and integration. Each of these directions has particular aspects depending on whether we approach mechanical design problems or software development. Atomization means a thorough top-down decomposition analysis that offers insight into the basic features of the phenomenon. Creation of libraries of reusable mechanical parts and libraries of programs (data types, functions) saves time, cost, and effort when a new model must be conceived. Parameterization leads to flexible definition of mechanical parts, the values of the parameters being changed either by a dimensioning program or in accordance with other parts belonging to the same assembly. The resulting templates may also be included in libraries. Original software applications are useful for generating the model's input data, for feeding these data into commercial CAD/FEA applications, and for the data integration of the various types of studies included in the same project.
EMPIRE: Nuclear Reaction Model Code System for Data Evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herman, M.; Capote, R.; Carlson, B.V.
EMPIRE is a modular system of nuclear reaction codes, comprising various nuclear models, and designed for calculations over a broad range of energies and incident particles. A projectile can be a neutron, proton, any ion (including heavy ions), or a photon. The energy range extends from the beginning of the unresolved resonance region for neutron-induced reactions (~keV) up to several hundred MeV for heavy-ion induced reactions. The code accounts for the major nuclear reaction mechanisms, including direct, pre-equilibrium, and compound nucleus ones. Direct reactions are described by a generalized optical model (ECIS03) or by the simplified coupled-channels approach (CCFUS). The pre-equilibrium mechanism can be treated by a deformation-dependent multi-step direct (ORION + TRISTAN) model, by a NVWY multi-step compound one, or by either a pre-equilibrium exciton model with cluster emission (PCROSS) or another with full angular momentum coupling (DEGAS). Finally, the compound nucleus decay is described by the full-featured Hauser-Feshbach model with γ-cascade and width fluctuations. Advanced treatment of the fission channel takes into account transmission through a multiple-humped fission barrier with absorption in the wells. The fission probability is derived in the WKB approximation within the optical model of fission. Several options for nuclear level densities include the EMPIRE-specific approach, which accounts for the effects of the dynamic deformation of a fast rotating nucleus, the classical Gilbert-Cameron approach, and pre-calculated tables obtained with a microscopic model based on HFB single-particle level schemes with collective enhancement. A comprehensive library of input parameters covers nuclear masses, optical model parameters, ground state deformations, discrete levels and decay schemes, level densities, fission barriers, moments of inertia, and γ-ray strength functions. The results can be converted into ENDF-6 formatted files using the accompanying code EMPEND and completed with neutron resonances extracted from the existing evaluations. The package contains the full EXFOR (CSISRS) library of experimental reaction data, which are automatically retrieved during the calculations. Publication-quality graphs can be obtained using the powerful and flexible plotting package ZVView. The graphical user interface, written in Tcl/Tk, provides for easy operation of the system. This paper describes the capabilities of the code, outlines the physical models, and indicates the parameter libraries used by EMPIRE to predict reaction cross sections and spectra, mainly for nucleon-induced reactions. Selected applications of EMPIRE are discussed, the most important being the extensive use of the code in evaluations of neutron reactions for the new US library ENDF/B-VII.0. Future extensions of the system are outlined, including a neutron resonance module as well as capabilities for generating covariances, using both KALMAN and Monte-Carlo methods, that are still being advanced and refined.
Chang, Ye; Tang, Ning; Qu, Hemi; Liu, Jing; Zhang, Daihua; Zhang, Hao; Pang, Wei; Duan, Xuexin
2016-01-01
In this paper, we have modeled and analyzed affinities and kinetics of volatile organic compounds (VOCs) adsorption (and desorption) on various surface chemical groups using multiple self-assembled monolayers (SAMs) functionalized film bulk acoustic resonator (FBAR) array. The high-frequency and micro-scale resonator provides improved sensitivity in the detections of VOCs at trace levels. With the study of affinities and kinetics, three concentration-independent intrinsic parameters (monolayer adsorption capacity, adsorption energy constant and desorption rate) of gas-surface interactions are obtained to contribute to a multi-parameter fingerprint library of VOC analytes. Effects of functional group’s properties on gas-surface interactions are also discussed. The proposed sensor array with concentration-independent fingerprint library shows potential as a portable electronic nose (e-nose) system for VOCs discrimination and gas-sensitive materials selections. PMID:27045012
Research and realization of key technology in HILS interactive system
NASA Astrophysics Data System (ADS)
Liu, Che; Lu, Huiming; Wang, Fankai
2018-03-01
This paper presents the design of an HILS (Hardware-In-the-Loop Simulation) interactive system based on the xPC platform. Through the interface between C++ and the MATLAB engine, a seamless data connection between Simulink and the interactive system is established, completing the data interaction between the system and Simulink and realizing model configuration, parameter modification, and offline simulation. Data communication between the host and target machines is established over TCP/IP to enable model download and real-time simulation. A database is used to store simulation data, enabling real-time simulation monitoring and simulation data management. System functions are integrated using the Qt graphical interface library and dynamic link libraries. Finally, a typical control system is used as an example to verify the feasibility of the HILS interactive system.
Comprehensive Assessment of Models and Events based on Library tools (CAMEL)
NASA Astrophysics Data System (ADS)
Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.
2017-12-01
At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL, SWMF-SC+IH for the heliosphere, SWMF-GM, OpenGGCM, LFM, GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experiences with model-data comparisons of magnetosphere and ionosphere model outputs in GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). We can apply the framework also to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
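For reference, the sketch below computes several of the named metrics for a single synthetic model-observation pair; CAMEL's own implementation and conventions may differ in detail.

```python
# RMSE, prediction efficiency, and event-based skill scores from a
# contingency table of threshold crossings.
import numpy as np

def skill_scores(model, obs, event_threshold):
    err = model - obs
    rmse = np.sqrt(np.mean(err**2))
    pe = 1.0 - np.mean(err**2) / np.var(obs)   # prediction efficiency
    m_ev, o_ev = model >= event_threshold, obs >= event_threshold
    hits = np.sum(m_ev & o_ev); misses = np.sum(~m_ev & o_ev)
    false_al = np.sum(m_ev & ~o_ev); corr_neg = np.sum(~m_ev & ~o_ev)
    pod = hits / (hits + misses)               # probability of detection
    pofd = false_al / (false_al + corr_neg)    # prob. of false detection
    hss = (2.0 * (hits * corr_neg - false_al * misses) /
           ((hits + misses) * (misses + corr_neg) +
            (hits + false_al) * (false_al + corr_neg)))  # Heidke skill score
    return rmse, pe, pod, pofd, hss

rng = np.random.default_rng(2)
obs = rng.normal(0, 1, 500)
model = obs + rng.normal(0, 0.5, 500)          # imperfect "model"
print(skill_scores(model, obs, event_threshold=1.0))
```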
libSRES: a C library for stochastic ranking evolution strategy for parameter estimation.
Ji, Xinglai; Xu, Ying
2006-01-01
Estimation of kinetic parameters in a biochemical pathway or network represents a common problem in systems studies of biological processes. We have implemented a C library, named libSRES, to facilitate a fast implementation of computer software for study of non-linear biochemical pathways. This library implements a (mu, lambda)-ES evolutionary optimization algorithm that uses stochastic ranking as the constraint handling technique. Considering the amount of computing time it might require to solve a parameter-estimation problem, an MPI version of libSRES is provided for parallel implementation, as well as a simple user interface. libSRES is freely available and could be used directly in any C program as a library function. We have extensively tested the performance of libSRES on various pathway parameter-estimation problems and found its performance to be satisfactory. The source code (in C) is free for academic users at http://csbl.bmb.uga.edu/~jix/science/libSRES/
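The core of the method is the stochastic-ranking bubble sort of Runarsson and Yao, which libSRES implements in C. A compact Python transcription of that comparison rule (toy data, not the library itself) looks like this:

```python
# Adjacent individuals are ordered by objective value either when both
# are feasible or, with probability pf, regardless of feasibility;
# otherwise they are ordered by constraint violation.
import random

def stochastic_rank(f, phi, pf=0.45):
    """Return indices ranked by stochastic ranking.
    f: objective values; phi: summed constraint violations (0 = feasible)."""
    n = len(f)
    idx = list(range(n))
    for _ in range(n):                       # at most n bubble-sort sweeps
        swapped = False
        for i in range(n - 1):
            a, b = idx[i], idx[i + 1]
            by_objective = (phi[a] == 0 and phi[b] == 0) or random.random() < pf
            key = f if by_objective else phi
            if key[a] > key[b]:
                idx[i], idx[i + 1] = b, a
                swapped = True
        if not swapped:
            break
    return idx

f   = [3.0, 1.0, 2.0, 0.5]
phi = [0.0, 0.0, 1.5, 4.0]                   # last two violate constraints
print(stochastic_rank(f, phi))
```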
General Economic and Demographic Background and Projections for Indiana Library Services.
ERIC Educational Resources Information Center
Foust, James D.; Tower, Carl B.
Before future library needs can be estimated, economic and demographic variables that influence the demand for library services must be projected and estimating equations relating library needs to economic and demographic parameters developed. This study considers the size, location and age-sex characteristics of Indiana's current population and…
Editorial Library: User Survey.
ERIC Educational Resources Information Center
Surace, Cecily J.
This report presents the findings of a survey conducted by the editorial library of the Los Angeles Times to measure usage and satisfaction with library service, provide background information on library user characteristics, collect information on patterns of use of the Times' clipping files, relate data on usage and satisfaction parameters to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia, J.; McClintock, J. E.; Dauser, T.
2013-05-10
We present a new and complete library of synthetic spectra for modeling the component of emission that is reflected from an illuminated accretion disk. The spectra were computed using an updated version of our code XILLVER that incorporates new routines and a richer atomic database. We offer in the form of a table model an extensive grid of reflection models that cover a wide range of parameters. Each individual model is characterized by the photon index Γ of the illuminating radiation, the ionization parameter ξ at the surface of the disk (i.e., the ratio of the X-ray flux to the gas density), and the iron abundance A_Fe relative to the solar value. The ranges of the parameters covered are 1.2 ≤ Γ ≤ 3.4, 1 ≤ ξ ≤ 10^4, and 0.5 ≤ A_Fe ≤ 10. These ranges capture the physical conditions typically inferred from observations of active galactic nuclei, and also stellar-mass black holes in the hard state. This library is intended for use when the thermal disk flux is faint compared to the incident power-law flux. The models are expected to provide an accurate description of the Fe K emission line, which is the crucial spectral feature used to measure black hole spin. A total of 720 reflection spectra are provided in a single FITS file (http://hea-www.cfa.harvard.edu/~javier/xillver/) suitable for the analysis of X-ray observations via the atable model in XSPEC. Detailed comparisons with previous reflection models illustrate the improvements incorporated in this version of XILLVER.
NASA Astrophysics Data System (ADS)
Marchand, R.; Purschke, D.; Samson, J.
2013-03-01
Understanding the physics of interaction between satellites and the space environment is essential in planning and exploiting space missions. Several computer models have been developed over the years to study this interaction. In all cases, simulations are carried out in the reference frame of the spacecraft, and effects such as charging and the formation of electrostatic sheaths and wakes are calculated for given conditions of the space environment. In this paper we present a program used to compute magnetic fields and a number of space plasma and space environment parameters relevant to Low Earth Orbit (LEO) spacecraft-plasma interaction modeling. Magnetic fields are obtained from the International Geomagnetic Reference Field (IGRF) and plasma parameters are obtained from the International Reference Ionosphere (IRI) model. All parameters are computed in the spacecraft frame of reference as a function of its six Keplerian elements. They are presented in a format that can be used directly in most spacecraft-plasma interaction models. Catalogue identifier: AENY_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENY_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 270308. No. of bytes in distributed program, including test data, etc.: 2323222. Distribution format: tar.gz. Programming language: FORTRAN 90. Computer: Non-specific. Operating system: Non-specific. RAM: 7.1 MB. Classification: 19, 4.14. External routines: IRI, IGRF (included in the package). Nature of problem: Compute magnetic field components, direction of the sun, sun visibility factor, and approximate plasma parameters in the reference frame of a Low Earth Orbit satellite. Solution method: Orbit integration, calls to the IGRF and IRI libraries, and transformation of coordinates from the geocentric frame to the spacecraft reference frame. Restrictions: Low Earth orbits, altitudes between 150 and 2000 km. Running time: Approximately two seconds to parameterize a full orbit with 1000 points.
Literature review of outcome parameters used in studies of Geriatric Fracture Centers.
Liem, I S L; Kammerlander, C; Suhm, N; Kates, S L; Blauth, M
2014-02-01
A variety of multidisciplinary treatment models have been described to improve outcome after osteoporotic hip fractures. There is a tendency toward better outcomes after implementation of the most sophisticated model, with shared leadership between orthopedic surgeons and geriatricians: the Geriatric Fracture Center. The purpose of this review is to evaluate the use of outcome parameters in the published literature on Geriatric Fracture Center evaluation studies. A literature search was performed using Medline and the Cochrane Library to identify Geriatric Fracture Center evaluation studies. The outcome parameters used in the included studies were evaluated. A total of 16 outcome parameters were used in 11 studies to evaluate patient outcome in 8 different Geriatric Fracture Centers. Two of these outcome parameters are patient-reported outcome measures and 14 are objective measures. In-hospital mortality, length of stay, time to surgery, place of residence, and complication rate are the most frequently used outcome parameters. The patient-reported outcomes included activities of daily living and mobility scores. There is a need for generally agreed-upon outcome measures to facilitate comparison of different care models.
NASA Astrophysics Data System (ADS)
Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo
2014-05-01
Seismic data, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the question of which of the provided kinematic source inversion solutions is most reliable and most robust, and, more generally, how accurate fault parameterization and solution predictions are. These issues are not addressed by "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based (a) on a forward-modeling scheme that computes synthetic body waves for a given kinematic rupture model, and (b) on the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library, which uses MCMC algorithms and Bayes' theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e., earthquakes for which the exact rupture history is known) in an attempt to identify the cross-over at which further model discretization (spatial and temporal resolution of the parameter space) is no longer justified by a decreasing misfit. Identification of this cross-over is important because it reveals the resolution power of the studied data set (i.e., teleseismic body waves), enabling one to constrain kinematic earthquake rupture histories of real earthquakes at a resolution that is supported by the data. In addition, the Bayesian approach allows for mapping complete posterior probability density functions of the desired kinematic source parameters, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
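The sampling machinery can be illustrated with a bare-bones random-walk Metropolis-Hastings loop, shown below with a toy forward model standing in for the synthetic-seismogram computation; QUESO provides far more sophisticated algorithms, so this is purely schematic.

```python
# Recover two "rupture" parameters (amplitude, onset time of a Gaussian
# pulse) from noisy synthetic data via Metropolis-Hastings sampling.
import numpy as np

rng = np.random.default_rng(3)

def forward(params, t):
    amp, t0 = params                     # toy source parameters
    return amp * np.exp(-0.5 * (t - t0)**2)

t = np.linspace(0, 10, 100)
data = forward((2.0, 4.0), t) + rng.normal(0, 0.1, t.size)  # "waveform"

def log_post(params):
    if not (0 < params[0] < 10 and 0 < params[1] < 10):     # flat prior box
        return -np.inf
    resid = data - forward(params, t)
    return -0.5 * np.sum(resid**2) / 0.1**2                 # Gaussian likelihood

theta, lp = np.array([1.0, 5.0]), -np.inf
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05, 2)                   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:                # accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
post = np.array(samples[5000:])                             # drop burn-in
print("posterior mean:", post.mean(axis=0), "std:", post.std(axis=0))
```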
Demand-Adjusted Shelf Availability Parameters: A Second Look.
ERIC Educational Resources Information Center
Schwarz, Philip
1983-01-01
Data gathered in an application of Paul Kantor's demand-adjusted shelf availability model to a medium-sized academic library indicate significant differences in shelf availability when the data are analyzed by last circulation date, acquisition date, and imprint date, and when they are gathered during periods of low and high use. Ten references are cited.…
NASA Astrophysics Data System (ADS)
Cenarro, A. J.; Cardiel, N.; Gorgas, J.; Peletier, R. F.; Vazdekis, A.; Prada, F.
2001-09-01
A new stellar library in the near-IR spectral region, developed for the empirical calibration of the Ca II triplet and for stellar population synthesis modelling, is presented. The library covers the range λλ8348-9020 at 1.5-Å (FWHM) spectral resolution, and consists of 706 stars spanning a wide range of atmospheric parameters. We have defined a new set of near-IR indices, CaT*, CaT and PaT, which largely overcome the limitations of previous definitions, the first being especially suited for measuring the Ca II triplet strength corrected for contamination from Paschen lines. We also present a comparative study of the new and previous Ca indices, as well as the corresponding transformations between the different systems. A thorough analysis of the sources of index error and the procedure for calculating them is given. Finally, index and error measurements for the whole stellar library are provided together with the final spectra.
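Generic line-strength indices of this kind reduce to integrating the flux deficit under a pseudo-continuum. The sketch below shows that generic computation with placeholder bandpasses; it does not reproduce the paper's CaT*, CaT, or PaT definitions.

```python
# Equivalent width of a feature band relative to a linear
# pseudo-continuum anchored on two flanking bands.  Bandpasses and the
# toy spectrum are illustrative only.
import numpy as np

def index_ew(wave, flux, blue, feature, red):
    def band_mean(lo, hi):
        m = (wave >= lo) & (wave <= hi)
        return wave[m].mean(), flux[m].mean()
    (wb, fb), (wr, fr) = band_mean(*blue), band_mean(*red)
    m = (wave >= feature[0]) & (wave <= feature[1])
    cont = fb + (fr - fb) * (wave[m] - wb) / (wr - wb)   # linear continuum
    dl = wave[1] - wave[0]                               # uniform grid assumed
    return np.sum(1.0 - flux[m] / cont) * dl             # EW in Angstrom

wave = np.linspace(8400, 8750, 1000)
flux = 1.0 - 0.5 * np.exp(-0.5 * ((wave - 8542.0) / 3.0)**2)  # toy Ca II line
print(index_ew(wave, flux, (8450, 8500), (8522, 8562), (8600, 8650)))
```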
NanoTopoChip: High-throughput nanotopographical cell instruction.
Hulshof, Frits F B; Zhao, Yiping; Vasilevich, Aliaksei; Beijer, Nick R M; de Boer, Meint; Papenburg, Bernke J; van Blitterswijk, Clemens; Stamatialis, Dimitrios; de Boer, Jan
2017-10-15
Surface topography is able to influence cell phenotype in numerous ways and offers opportunities to manipulate cells and tissues. In this work, we develop the NanoTopoChip and study the cell-instructive effects of nanoscale topographies. A combination of deep-UV projection lithography and conventional lithography was used to fabricate a library of more than 1200 different defined nanotopographies. To illustrate the cell-instructive effects of nanotopography, actin-RFP labeled U2OS osteosarcoma cells were cultured and imaged on the NanoTopoChip. Automated image analysis shows that, of many cell morphological parameters, cell spreading, cell orientation, and actin morphology are the most affected by the nanotopographies. Additionally, by using modeling, the changes in cell morphological parameters could be predicted from several feature shape parameters such as lateral size and spacing. This work overcomes the technological challenges of fabricating high-quality defined nanoscale features on unprecedentedly large surface areas of a material relevant for tissue culture such as polystyrene (PS), and the screening system is able to infer nanotopography-cell morphology relationships. Our screening platform provides opportunities to identify and study nanotopographies with beneficial properties for the culture of various cell types. The nanotopography of biomaterial surfaces can be modified to influence adhering cells with the aim of improving the performance of medical implants and tissue culture substrates. However, the necessary knowledge of the underlying mechanisms remains incomplete. One reason for this is the limited availability of high-resolution nanotopographies on relevant biomaterials suitable for systematic biological studies. The present study shows the fabrication of a library of nano-sized surface topographies with high fidelity. The potential of this library, called the 'NanoTopoChip', is shown in a proof-of-principle HTS study which demonstrates how cells are affected by nanotopographies. The large dataset, acquired by quantitative high-content imaging, allowed us to use predictive modeling to describe how feature dimensions affect cell morphology.
System Architecture Modeling for Technology Portfolio Management using ATLAS
NASA Technical Reports Server (NTRS)
Thompson, Robert W.; O'Neil, Daniel A.
2006-01-01
Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as the Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD), in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system-level requirements. Second, the modeler identifies the technologies of interest whose impact on the SEA is to be assessed. Third, the system modeling team creates models of architecture elements (e.g., launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.
Algorithm for retrieving vegetative canopy and leaf parameters from multi- and hyperspectral imagery
NASA Astrophysics Data System (ADS)
Borel, Christoph
2009-05-01
In recent years hyperspectral data have been used to retrieve information about vegetative canopies, such as leaf area index and canopy water content. For the environmental scientist these two parameters are valuable, but there is potentially more information to be gained as high-spatial-resolution data become available. We developed an Amoeba (Nelder-Mead simplex) based program that fits to a measured reflectance spectrum a vegetative canopy radiosity model coupled with a leaf reflectance model (PROSPECT5) and a model for the background reflectance (e.g., soil, water, leaf litter). The PROSPECT5 leaf model has five parameters: leaf structure parameter Nstru, chlorophyll a+b concentration Cab, carotenoid content Car, equivalent water thickness Cw, and dry matter content Cm. The canopy model has two parameters: total leaf area index (LAI) and number of layers. The background reflectance model is either a single reflectance spectrum (from a spectral library, or derived from a bare-area pixel in an image) or a linear mixture of soil spectra. We summarize the radiosity model of a layered canopy and give references for the leaf/needle models. The method is then tested on simulated and measured data. We investigate the uniqueness, limitations, and accuracy of the retrieved parameters as functions of canopy parameters (low, medium, and high leaf area index), spectral resolution (32- to 211-band hyperspectral), sensor noise, and initial conditions.
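The inversion strategy itself is easy to demonstrate: minimize the model-measurement misfit with the Nelder-Mead simplex, as in the sketch below, where a toy two-parameter canopy model stands in for the radiosity + PROSPECT5 combination.

```python
# Fit a toy canopy reflectance model (LAI and a water-content-like
# parameter) to a synthetic "measured" spectrum with Nelder-Mead.
import numpy as np
from scipy.optimize import minimize

wl = np.linspace(400, 2400, 200)                 # wavelength grid, nm

def toy_canopy(params):
    lai, cw = params                             # leaf area index, "water"
    leaf = 0.45 * np.exp(-cw * np.abs(wl - 1900) / 500.0)  # toy leaf refl.
    soil = 0.25                                             # toy background
    gap = np.exp(-0.5 * lai)                     # canopy gap fraction
    return gap * soil + (1 - gap) * leaf

measured = toy_canopy((3.0, 0.8)) \
    + 0.005 * np.random.default_rng(4).normal(size=wl.size)

res = minimize(lambda p: np.sum((toy_canopy(p) - measured)**2),
               x0=[1.0, 0.2], method="Nelder-Mead")
print("retrieved LAI, Cw:", res.x)
```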
Automated system for generation of soil moisture products for agricultural drought assessment
NASA Astrophysics Data System (ADS)
Raja Shekhar, S. S.; Chandrasekar, K.; Sesha Sai, M. V. R.; Diwakar, P. G.; Dadhwal, V. K.
2014-11-01
Drought is a frequently occurring disaster affecting the lives of millions of people across the world every year. Several parameters, indices, and models are being used globally for drought forecasting / early warning and for monitoring drought prevalence, persistence, and severity. Since drought is a complex phenomenon, a large number of parameters/indices need to be evaluated to sufficiently address the problem. It is a challenge to generate input parameters from different sources such as space-based data, ground data, and collateral data at short intervals, where there may be limitations in processing power, availability of domain expertise, and specialized models and tools. In this study, an effort has been made to automate the derivation of one of the important parameters in drought studies, viz., soil moisture. The soil-water-balance bucket model is in vogue for deriving soil moisture products and is widely popular for its sensitivity to soil conditions and rainfall parameters. This model has been encoded into a "fish-bone" architecture using COM technologies and open-source libraries for the best possible automation, to fulfill the need for a standard procedure of preparing input parameters and processing routines. The main aim of the system is to provide an operational environment for the generation of soil moisture products, allowing users to concentrate on further enhancements and on applying these parameters in related areas of research, without re-discovering the established models. The emphasis of the architecture is mainly on available open-source libraries for GIS and raster I/O operations on different file formats, to ensure that the products can be widely distributed without the burden of any commercial dependencies. Further, the system is automated to the extent of user-free operation if required, with inbuilt chain processing for everyday generation of products at specified intervals. The operational software has inbuilt capabilities to automatically download requisite input parameters such as rainfall and potential evapotranspiration (PET) from the respective servers. It can import file formats such as .grd, .hdf, .img, and generic binary, perform geometric correction, and re-project the files to the native projection system. The software takes into account the weather, crop, and soil parameters to run the designed soil water balance model. It also has additional features such as time compositing of outputs to generate weekly and fortnightly profiles for further analysis. A tool to generate "area favorable for crop sowing" maps from the daily soil moisture, with a highly customizable parameter interface, has also been provided. A whole-India analysis now takes a mere 20 seconds for the generation of soil moisture products, which would normally take one hour per day using commercial software.
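The bucket model at the core of such a system can be written in a few lines. The sketch below is a generic daily soil-water balance with illustrative parameter values, not the operational implementation described here.

```python
# Daily soil-water-balance "bucket": storage is updated with rainfall
# and evapotranspiration, capped at the soil's water-holding capacity.
import numpy as np

def bucket(rain, pet, capacity=150.0, s0=50.0):
    """Daily soil moisture (mm) given rainfall and PET series (mm/day)."""
    s, out = s0, []
    for p, e in zip(rain, pet):
        aet = e * s / capacity            # actual ET scales with wetness
        s = s + p - aet
        s = min(max(s, 0.0), capacity)    # runoff beyond capacity, floor at 0
        out.append(s)
    return np.array(out)

rng = np.random.default_rng(5)
rain = rng.choice([0, 0, 0, 12.0], size=30)   # sporadic rain events (toy)
pet = np.full(30, 5.0)                        # constant PET, mm/day (toy)
print(bucket(rain, pet).round(1))
```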
Engineering emergent multicellular behavior through synthetic adhesion
NASA Astrophysics Data System (ADS)
Glass, David; Riedel-Kruse, Ingmar
Over more than a decade, synthetic biology has developed increasingly robust gene networks within single cells, but has constructed very few systems that demonstrate multicellular spatio-temporal dynamics. We are filling this gap in synthetic biology's toolbox by developing an E. coli self-assembly platform based on modular cell-cell adhesion. We developed a system in which adhesive selectivity is provided by a library of outer-membrane-displayed peptides with intra-library specificities, while affinity is provided by consistent expression across the entire library. We further provide a biophysical model to help understand the parameter regimes in which this tool can be used to self-assemble cellular clusters, filaments, or meshes. The combined platform will enable future development of synthetic multicellular systems for use in consortia-based metabolic engineering, in living materials, and in the controlled study of minimal multicellular systems. Stanford Bio-X Bowes Fellowship.
Shearer, Barbara S.; Nagy, Suzanne P.
2003-07-01
The Florida State University (FSU) College of Medicine Medical Library is the first academic medical library to be established since the Web's dramatic appearance during the 1990s. A large customer base for electronic medical information resources is both comfortable with and eager to migrate to the electronic format completely, and vendors are designing radical pricing models that make print journal cancellations economically advantageous. In this (almost) post-print environment, the new FSU Medical Library is being created and will continue to evolve. By analyzing print journal subscription lists of eighteen academic medical libraries with similar missions to the community-based FSU College of Medicine and by entering these and selected quality indicators into a Microsoft Access database, a core list was created. This list serves as a selection guide, as a point for discussion with faculty and curriculum leaders when creating budgets, and for financial negotiations in a broader university environment. After journal titles specific to allied health sciences, veterinary medicine, dentistry, pharmacy, library science, and nursing were eliminated from the list, 4,225 unique journal titles emerged. Based on a ten-point scale including SERHOLD holdings and DOCLINE borrowing activity, a list of 449 core titles is identified. The core list has been saved in spreadsheet format for easy sorting by a number of parameters. PMID:12883565
Overview of refinement procedures within REFMAC5: utilizing data from different sources.
Kovalevskiy, Oleg; Nicholls, Robert A; Long, Fei; Carlon, Azzurra; Murshudov, Garib N
2018-03-01
Refinement is a process that involves bringing into agreement the structural model, available prior knowledge and experimental data. To achieve this, the refinement procedure optimizes a posterior conditional probability distribution of model parameters, including atomic coordinates, atomic displacement parameters (B factors), scale factors, parameters of the solvent model and twin fractions in the case of twinned crystals, given observed data such as observed amplitudes or intensities of structure factors. A library of chemical restraints is typically used to ensure consistency between the model and the prior knowledge of stereochemistry. If the observation-to-parameter ratio is small, for example when diffraction data only extend to low resolution, the Bayesian framework implemented in REFMAC5 uses external restraints to inject additional information extracted from structures of homologous proteins, prior knowledge about secondary-structure formation and even data obtained using different experimental methods, for example NMR. The refinement procedure also generates the `best' weighted electron-density maps, which are useful for further model (re)building. Here, the refinement of macromolecular structures using REFMAC5 and related tools distributed as part of the CCP4 suite is discussed.
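A schematic of the kind of target function such refinement minimizes: a data term plus a weighted restraint term. This least-squares caricature stands in for REFMAC5's actual maximum-likelihood target, and all helper names are hypothetical:

```python
import numpy as np

def restrained_target(params, f_obs, sigma_obs, calc_amplitudes,
                      restraint_residuals, weight):
    """Data misfit plus weighted prior restraints.

    Hypothetical helpers: calc_amplitudes(params) returns model
    structure-factor amplitudes; restraint_residuals(params) returns
    deviations from ideal stereochemistry (bonds, angles, ...).
    """
    data_term = np.sum(((f_obs - calc_amplitudes(params)) / sigma_obs) ** 2)
    prior_term = weight * np.sum(restraint_residuals(params) ** 2)
    return data_term + prior_term  # minimized over coordinates, B factors, ...
```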
LibKiSAO: a Java library for Querying KiSAO.
Zhukova, Anna; Adams, Richard; Laibe, Camille; Le Novère, Nicolas
2012-09-24
The Kinetic Simulation Algorithm Ontology (KiSAO) supplies information about existing algorithms available for the simulation of Systems Biology models, their characteristics, parameters and inter-relationships. KiSAO enables the unambiguous identification of algorithms from simulation descriptions. Information about analogous methods having similar characteristics and about algorithm parameters incorporated into KiSAO is desirable for simulation tools. To retrieve this information programmatically an application programming interface (API) for KiSAO is needed. We developed libKiSAO, a Java library to enable querying of the KiSA Ontology. It implements methods to retrieve information about simulation algorithms stored in KiSAO, their characteristics and parameters, and methods to query the algorithm hierarchy and search for similar algorithms providing comparable results for the same simulation set-up. Using libKiSAO, simulation tools can make logical inferences based on this knowledge and choose the most appropriate algorithm to perform a simulation. LibKiSAO also enables simulation tools to handle a wider range of simulation descriptions by determining which of the available methods are similar and can be used instead of the one indicated in the simulation description if that one is not implemented. LibKiSAO enables Java applications to easily access information about simulation algorithms, their characteristics and parameters stored in the OWL-encoded Kinetic Simulation Algorithm Ontology. LibKiSAO can be used by simulation description editors and simulation tools to improve reproducibility of computational simulation tasks and facilitate model re-use.
Design of a Model Execution Framework: Repetitive Object-Oriented Simulation Environment (ROSE)
NASA Technical Reports Server (NTRS)
Gray, Justin S.; Briggs, Jeffery L.
2008-01-01
The ROSE framework was designed to facilitate complex system analyses. It completely divorces the model execution process from the model itself. By doing so ROSE frees the modeler to develop a library of standard modeling processes such as Design of Experiments, optimizers, parameter studies, and sensitivity studies which can then be applied to any of their available models. The ROSE framework accomplishes this by means of a well defined API and object structure. Both the API and object structure are presented here with enough detail to implement ROSE in any object-oriented language or modeling tool.
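A minimal Python sketch of this separation of execution process from model; the class and method names are hypothetical, not the ROSE API:

```python
from abc import ABC, abstractmethod

class Model(ABC):
    """Anything that maps a dict of inputs to a dict of outputs."""
    @abstractmethod
    def run(self, inputs: dict) -> dict: ...

class Process(ABC):
    """A reusable execution process that can drive any Model."""
    @abstractmethod
    def execute(self, model: Model): ...

class ParameterStudy(Process):
    def __init__(self, name, values, base_inputs):
        self.name, self.values, self.base = name, values, base_inputs
    def execute(self, model):
        results = []
        for v in self.values:
            inputs = dict(self.base, **{self.name: v})  # vary one input
            results.append((v, model.run(inputs)))
        return results

class Parabola(Model):
    def run(self, inputs):
        x = inputs["x"]
        return {"y": (x - 3.0) ** 2}

# The same ParameterStudy could drive any other Model unchanged.
study = ParameterStudy("x", [0, 1, 2, 3, 4], {"x": 0.0})
print(study.execute(Parabola()))
```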
NASA Technical Reports Server (NTRS)
He, Yuning
2015-01-01
The behavior of complex aerospace systems is governed by numerous parameters. For safety analysis it is important to understand how the system behaves with respect to these parameter values. In particular, understanding the boundaries between safe and unsafe regions is of major importance. In this paper, we describe a hierarchical Bayesian statistical modeling approach for the online detection and characterization of such boundaries. Our method for classification with active learning uses a particle filter-based model and a boundary-aware metric for best performance. From a library of candidate shapes incorporated with domain expert knowledge, the location and parameters of the boundaries are estimated using advanced Bayesian modeling techniques. The results of our boundary analysis are then provided in a form understandable by the domain expert. We illustrate our approach using a simulation model of a NASA neuro-adaptive flight control system, as well as a system for the detection of separation violations in the terminal airspace.
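A minimal sketch of boundary-focused active learning in the same spirit, using a Gaussian-process classifier and a most-ambiguous-point acquisition rule; the paper's particle-filter model and candidate shape library are not reproduced here:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier

rng = np.random.default_rng(0)

def simulator(x):                        # hypothetical safe/unsafe oracle
    return int(x[0] ** 2 + x[1] ** 2 < 1.0)   # "safe" inside the unit circle

# Seed design including one known-safe and one known-unsafe point
X = np.vstack([[0.0, 0.0], [1.8, 1.8], rng.uniform(-2, 2, size=(8, 2))])
y = np.array([simulator(x) for x in X])

for _ in range(30):                      # active-learning loop
    clf = GaussianProcessClassifier().fit(X, y)
    cand = rng.uniform(-2, 2, size=(200, 2))
    p = clf.predict_proba(cand)[:, 1]
    pick = cand[np.argmin(np.abs(p - 0.5))]   # most ambiguous ~ near boundary
    X, y = np.vstack([X, pick]), np.append(y, simulator(pick))
# Sampled points now concentrate near the circle x^2 + y^2 = 1.
```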
Wenderski, Todd A.; Stratton, Christopher F.; Bauer, Renato A.; Kopp, Felix; Tan, Derek S.
2015-01-01
Principal component analysis (PCA) is a useful tool in the design and planning of chemical libraries. PCA can be used to reveal differences in structural and physicochemical parameters between various classes of compounds by displaying them in a convenient graphical format. Herein, we demonstrate the use of PCA to gain insight into structural features that differentiate natural products, synthetic drugs, natural product-like libraries, and drug-like libraries, and show how the results can be used to guide library design. PMID:25618349
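A minimal sketch of the PCA workflow on compound descriptor matrices; the descriptor values here are synthetic, whereas a real analysis would compute descriptors from chemical structures:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows: compounds; columns: physicochemical descriptors
# (e.g. MW, cLogP, H-bond donors, H-bond acceptors, rotatable bonds)
rng = np.random.default_rng(1)
drugs = rng.normal([350, 2.5, 2, 5, 5], [80, 1.5, 1, 2, 3], size=(100, 5))
naturals = rng.normal([450, 1.0, 4, 8, 3], [120, 2.0, 2, 3, 2], size=(100, 5))

X = StandardScaler().fit_transform(np.vstack([drugs, naturals]))
scores = PCA(n_components=2).fit_transform(X)
# Plotting scores[:100] against scores[100:] reveals how the two
# compound classes separate in the first two principal components.
print(scores.shape)
```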
Evaluation of chiller modeling approaches and their usability for fault detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sreedharan, Priya
Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Several factors must be considered in model evaluation, including accuracy, training data requirements, calibration effort, generality, and computational requirements. All modeling approaches fall somewhere between pure first-principles models and empirical models. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression air conditioning units, commonly known as chillers. Three different models were studied: two based on first principles and a third that is empirical in nature. The first-principles models are the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model. The DOE-2 chiller model, as implemented in CoolTools™, was selected for the empirical category. The models were compared in terms of their ability to reproduce the observed performance of an older chiller operating in a commercial building and a newer chiller in a laboratory. The DOE-2 and Gordon-Ng models were calibrated by linear regression, while a direct-search method was used to calibrate the Toolkit model. The CoolTools package contains a library of calibrated DOE-2 curves for a variety of different chillers, and was used to calibrate the building chiller to the DOE-2 model. All three models displayed similar levels of accuracy. Of the first-principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.
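A minimal sketch of why linearity in the parameters helps: a linear-in-parameters model calibrates by ordinary least squares. The regressors below are illustrative placeholders, not the Gordon-Ng functional form:

```python
import numpy as np

# Hypothetical chiller data: evaporator/condenser temperatures (K) and COP
Tev = np.array([278.0, 280.0, 279.0, 281.0, 277.0])
Tcd = np.array([305.0, 308.0, 306.0, 310.0, 304.0])
cop = np.array([4.1, 3.8, 4.0, 3.6, 4.2])

# A model of the form y = A @ beta is linear in beta and can be
# calibrated in one shot with ordinary least squares.
y = 1.0 / cop
A = np.column_stack([np.ones_like(Tev), (Tcd - Tev) / Tev, 1.0 / Tev])
beta, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(beta)   # calibrated coefficients; residuals quantify the misfit
```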
NASA Astrophysics Data System (ADS)
Lyapin, Sergey; Kukovyakin, Alexey
Within the framework of the research program "Textaurus", an operational prototype of the multifunctional library T-Libra v.4.1 has been created, which makes it possible to carry out flexible, parametrizable searches within a full-text database. The information system is realized in a Web-browser / Web-server / SQL-server architecture. This achieves an optimal combination of universality and efficiency of text processing on the one hand, and convenience and minimal cost for the end user (a standard Web browser serves as the client application) on the other. The following principles underlie the information system: (a) multifunctionality, (b) intelligence, (c) multilingual primary texts and full-text searching, (d) development of the digital library (DL) by a user ("administrative client"), and (e) multi-platform operation. A "library of concepts", i.e., a block of functional models for semantic (concept-oriented) searching, together with a closely connected subsystem of parametrizable queries to the full-text database, serves as the conceptual basis of the multifunctionality and "intelligence" of the DL T-Libra v.4.1. The author's paragraph is the unit of full-text searching in the proposed technology; the "logic" of an educational or scientific topic or problem can be built into a multilevel, flexible query structure and into the "library of concepts", which developers and experts can extend. About 10 queries of various levels of complexity and conceptuality are realized in the current version of the system: from simple terminological searching (taking into account the lexical and grammatical paradigms of Russian) to several kinds of explication of terminological fields and adjustable two-parameter thematic searching, the parameters being a [set of terms] and a [distance between terms] within the limits of an author's paragraph.
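A minimal sketch of the two-parameter thematic query, a [set of terms] and a [distance between terms] evaluated within a paragraph; the tokenization and data structures are assumptions:

```python
import itertools
import re

def thematic_search(paragraphs, terms, max_distance):
    """Return paragraphs in which every term occurs and some combination
    of occurrences falls within a window of max_distance words."""
    hits = []
    for para in paragraphs:
        words = [w.lower() for w in re.findall(r"\w+", para)]
        pos = [[i for i, w in enumerate(words) if w == t.lower()]
               for t in terms]
        if all(pos) and any(max(c) - min(c) <= max_distance
                            for c in itertools.product(*pos)):
            hits.append(para)
    return hits

docs = ["The library of concepts drives semantic searching.",
        "A semantic model unrelated to libraries."]
print(thematic_search(docs, ["library", "semantic"], 5))  # first doc only
```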
A Multi-User Microcomputer System for Small Libraries.
ERIC Educational Resources Information Center
Leggate, Peter
1988-01-01
Describes the development of Bookshelf, a multi-user microcomputer system for small libraries that uses an integrated software package. The discussion covers the design parameters of the package, which were based on a survey of seven small libraries, and some characteristics of the software. (three notes with references) (CLB)
Exact mass libraries of ESI and APCI mass spectra are not commercially available. In-house libraries are dependent on CID parameters and are instrument specific. The ability to identify compounds without reliance on mass spectral libraries is therefore more crucial for liquid sam...
Macro and Microenvironments at the British Library.
ERIC Educational Resources Information Center
Shenton, Helen
This paper describes the storage of the 12 million items that have just been moved into the new British Library building. The specifications for the storage and environmental conditions for different types of library and archive material are explained. The varying environmental parameters for storage areas and public areas, including reading rooms…
The WAGGS project - I. The WiFeS Atlas of Galactic Globular cluster Spectra
NASA Astrophysics Data System (ADS)
Usher, Christopher; Pastorello, Nicola; Bellstedt, Sabine; Alabi, Adebusola; Cerulo, Pierluigi; Chevalier, Leonie; Fraser-McKelvie, Amelia; Penny, Samantha; Foster, Caroline; McDermid, Richard M.; Schiavon, Ricardo P.; Villaume, Alexa
2017-07-01
We present the WiFeS Atlas of Galactic Globular cluster Spectra, a library of integrated spectra of Milky Way and Local Group globular clusters. We used the WiFeS integral field spectrograph on the Australian National University 2.3 m telescope to observe the central regions of 64 Milky Way globular clusters and 22 globular clusters hosted by the Milky Way's low-mass satellite galaxies. The spectra have wider wavelength coverage (3300-9050 Å) and higher spectral resolution (R = 6800) than existing spectral libraries of Milky Way globular clusters. By including Large and Small Magellanic Cloud star clusters, we extend the coverage of parameter space of existing libraries towards young and intermediate ages. While testing stellar population synthesis models and analysis techniques is the main aim of this library, the observations may also further our understanding of the stellar populations of Local Group globular clusters and make possible the direct comparison of extragalactic globular cluster integrated light observations with well-understood globular clusters in the Milky Way. The integrated spectra are publicly available via the project website.
Design and implementation of a cloud based lithography illumination pupil processing application
NASA Astrophysics Data System (ADS)
Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie
2017-02-01
Pupil parameters are important for evaluating the quality of a lithography illumination system. In this paper, a cloud-based, full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the websocket protocol and JSON format are used for communication between the client and the server, and the computing part is implemented on the server side, which integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick and the automatic reporting system LaTeX, to support the program. The cloud-based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared to the traditional software operation model of purchased, licensed, shipped, downloaded, installed, maintained, and upgraded, the new cloud-based approach, with no installation and easy use and maintenance, opens up a new way. Cloud-based applications may well be the future of software development.
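A minimal sketch of the client-server exchange pattern described above, using the third-party Python websockets package and JSON messages; the command name, payload, and computed value are placeholders, not the application's actual protocol:

```python
# pip install websockets   (handler signature varies slightly across
# versions of the package; some older releases require handle(ws, path))
import asyncio
import json
import websockets

async def handle(ws):
    async for message in ws:
        req = json.loads(message)           # e.g. {"cmd": "mean", "data": [...]}
        value = sum(req["data"]) / len(req["data"])   # placeholder computation
        await ws.send(json.dumps({"cmd": req["cmd"], "value": value}))

async def main():
    async with websockets.serve(handle, "localhost", 8765):
        await asyncio.Future()              # serve until cancelled

# asyncio.run(main())   # a browser-side JS client exchanges the same JSON
```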
A Computerized Library and Evaluation System for Integral Neutron Experiments.
ERIC Educational Resources Information Center
Hampel, Viktor E.; And Others
A computerized library of references to integral neutron experiments has been developed at the Lawrence Radiation Laboratory at Livermore. This library serves as a data base for the systematic retrieval of documents describing diverse critical and bulk nuclear experiments. The evaluation and reduction of the physical parameters of the experiments…
CONSTRUCTING A FLEXIBLE LIKELIHOOD FUNCTION FOR SPECTROSCOPIC INFERENCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.
2015-10-20
We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line “outliers.” By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high-resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate-resolution K-band spectrum of Gliese 51, an M5 field dwarf.
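A minimal numpy/scipy sketch of the core idea: the log-likelihood of a residual spectrum under a global Gaussian-process covariance kernel plus per-pixel noise. The Matern-3/2 kernel choice and parameter names follow common practice; the framework's local outlier kernels are omitted:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def log_likelihood(resid, wl, amp, scale, sigma):
    """Gaussian log-likelihood of a residual spectrum under a Matern-3/2
    covariance kernel plus independent per-pixel noise."""
    r = np.abs(wl[:, None] - wl[None, :]) / scale
    C = amp ** 2 * (1 + np.sqrt(3) * r) * np.exp(-np.sqrt(3) * r)
    C[np.diag_indices_from(C)] += sigma ** 2
    cf = cho_factor(C)                       # Cholesky factorization of C
    logdet = 2.0 * np.sum(np.log(np.diag(cf[0])))
    return -0.5 * (resid @ cho_solve(cf, resid) + logdet
                   + resid.size * np.log(2.0 * np.pi))

wl = np.linspace(5100.0, 5110.0, 200)        # wavelength grid (Angstroms)
resid = np.random.default_rng(1).normal(0.0, 0.01, wl.size)
print(log_likelihood(resid, wl, amp=0.01, scale=0.5, sigma=0.01))
```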
Trabelsi, Heykel; Koch, Mathilde; Faulon, Jean-Loup
2018-05-07
Progress in synthetic biology tools has transformed the way we engineer living cells. Applications of circuit design have reached a new level, offering solutions for metabolic engineering challenges that include developing screening approaches for libraries of pathway variants. The use of transcription-factor-based biosensors for screening has shown promising results, but the quantitative relationship between the sensors and the sensed molecules still needs more rational understanding. Herein, we have successfully developed a novel biosensor to detect pinocembrin based on a transcriptional regulator. The FdeR transcription factor (TF), known to respond to naringenin, was combined with a fluorescent reporter protein. By varying the copy number of its plasmid and the concentration of the biosensor TF through a combinatorial library, different responses have been recorded and modeled. The fitted model provides a tool to understand the impact of these parameters on the biosensor behavior in terms of dose-response and time curves and offers guidelines for building constructs oriented toward increased sensitivity and/or linear detection at higher titers. Our model, the first to explicitly take into account the impact of plasmid copy number on biosensor sensitivity using Hill-based formalism, is able to explain uncharacterized systems without extensive knowledge of the properties of the TF. Moreover, it can be used to model the response of the biosensor to different compounds (here naringenin and pinocembrin) with minimal parameter refitting. © 2018 Wiley Periodicals, Inc.
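A minimal sketch of a Hill-type dose-response of the kind the fitted model generalizes; the linear scaling with plasmid copy number and all parameter values are illustrative assumptions, not the paper's fitted parameters:

```python
import numpy as np

def biosensor_response(ligand, copy_number, basal=0.05, vmax=1.0, K=50.0, n=1.5):
    """Hill-type dose-response of a reporter.  Output is assumed (as a
    simplification) to scale linearly with the plasmid copy number of
    the reporter construct; all constants are illustrative."""
    hill = ligand ** n / (K ** n + ligand ** n)
    return copy_number * (basal + vmax * hill)

conc = np.logspace(-1, 3, 5)   # ligand concentration (e.g. uM pinocembrin)
print(biosensor_response(conc, copy_number=20))
```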
Library based x-ray scatter correction for dedicated cone beam breast CT
Shi, Linxi; Karellas, Andrew; Zhu, Lei
2016-01-01
Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the geant4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in sagittal views. Conclusions: The library-based scatter correction does not require increase in radiation dose or hardware modifications, and it improves over the existing methods on implementation simplicity and computational efficiency. As demonstrated through patient studies, the authors’ approach is effective and stable, and is therefore clinically attractive for CBBCT imaging. PMID:27487870
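A minimal sketch of the correction step described above: select the precomputed scatter map for the nearest breast diameter, translate it to align with the measured projection, and subtract. The array names, the dictionary-based library, and the integer-pixel shift are assumptions:

```python
import numpy as np

def correct_projection(projection, scatter_library, diameters,
                       breast_diameter, shift_px):
    """Subtract the library scatter map for the closest breast diameter,
    after an integer-pixel translation aligning it with the projection."""
    idx = int(np.argmin(np.abs(np.asarray(diameters) - breast_diameter)))
    scatter = np.roll(scatter_library[idx], shift_px, axis=(0, 1))
    return projection - scatter

proj = np.full((64, 64), 100.0)                    # measured projection
lib = {0: np.full((64, 64), 12.0),                 # precomputed MC scatter
       1: np.full((64, 64), 20.0)}                 # maps, one per diameter
corrected = correct_projection(proj, lib, diameters=[10.0, 14.0],
                               breast_diameter=13.2, shift_px=(2, -3))
print(corrected.mean())
```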
Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries
Lockhart, Madeline Louise; McMath, Garrett Earl
2017-10-26
Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content are in existence around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. Theoretically, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP 6.2.0 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha transport capabilities of MCNP 6.2.0 Beta with the TENDL 2012 alpha transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters, in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.
UQTk Version 3.0.3 User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sargsyan, Khachik; Safta, Cosmin; Chowdhary, Kamaljit Singh
2017-05-01
The UQ Toolkit (UQTk) is a collection of libraries and tools for the quantification of uncertainty in numerical model predictions. Version 3.0.3 offers intrusive and non-intrusive methods for propagating input uncertainties through computational models, tools for sensitivity analysis, methods for sparse surrogate construction, and Bayesian inference tools for inferring parameters from experimental data. This manual discusses the download and installation process for UQTk, provides pointers to the UQ methods used in the toolkit, and describes some of the examples provided with the toolkit.
Library fingerprints: a novel approach to the screening of virtual libraries.
Klon, Anthony E; Diller, David J
2007-01-01
We propose a novel method to prioritize libraries for combinatorial synthesis and high-throughput screening that assesses the viability of a particular library on the basis of the aggregate physical-chemical properties of the compounds using a naïve Bayesian classifier. This approach prioritizes collections of related compounds according to the aggregate values of their physical-chemical parameters in contrast to single-compound screening. The method is also shown to be useful in screening existing noncombinatorial libraries when the compounds in these libraries have been previously clustered according to their molecular graphs. We show that the method used here is comparable or superior to the single-compound virtual screening of combinatorial libraries and noncombinatorial libraries and is superior to the pairwise Tanimoto similarity searching of a collection of combinatorial libraries.
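A minimal sketch of scoring a library in aggregate with a naive Bayesian classifier trained on per-compound descriptors; the descriptors, labels, and distributions are synthetic toys, not the paper's trained model:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(2)
# Per-compound descriptors (e.g. MW, cLogP, TPSA) and activity labels
# from past screens; here generated synthetically.
X_train = rng.normal(0, 1, size=(500, 3))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)
clf = GaussianNB().fit(X_train, y_train)

def library_score(compound_descriptors):
    """Aggregate score of a candidate library: mean predicted probability
    of activity over its members (one way to score 'in aggregate')."""
    return clf.predict_proba(compound_descriptors)[:, 1].mean()

library_a = rng.normal(0.5, 1, size=(200, 3))
library_b = rng.normal(-0.5, 1, size=(200, 3))
print(library_score(library_a), library_score(library_b))
```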
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKenzie, IV, George Espy; Goda, Joetta Marie; Grove, Travis Justin
This paper examines the MCNP® code's capability to calculate kinetics parameters effectively for a thermal system containing highly enriched uranium (HEU). The Rossi-α parameter was chosen for this examination because it is relatively easy to measure as well as easy to calculate using MCNP®'s kopts card. The Rossi-α also incorporates many other parameters of interest in nuclear kinetics, most of which are more difficult to measure precisely. The calculations are compared against the experimental data using two different nuclear data libraries, ENDF/B-VI (.66c) and ENDF/B-VII (.80c).
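For reference, the point-kinetics relation linking the Rossi-α to quantities a criticality code reports; the numerical values below are illustrative, not the measured ones:

```python
def rossi_alpha(keff, beta_eff, gen_time):
    """Prompt-neutron decay constant (Rossi-alpha), from the
    point-kinetics relation alpha = (rho - beta_eff) / Lambda, with
    reactivity rho = (keff - 1) / keff.  At delayed critical (rho = 0)
    this reduces to -beta_eff / Lambda."""
    rho = (keff - 1.0) / keff
    return (rho - beta_eff) / gen_time

# Illustrative values of the kind reported by MCNP's kopts card:
print(rossi_alpha(keff=1.0, beta_eff=0.0072, gen_time=5.0e-5))  # ~ -144 1/s
```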
A reliable computational workflow for the selection of optimal screening libraries.
Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch
2015-01-01
The experimental screening of compound collections is a common starting point in many drug discovery projects. The success of such screening campaigns critically depends on the quality of the screened library. Many libraries are currently available from different vendors, yet the selection of the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented and a new blood-brain barrier permeation model was developed and validated (85% and 74% success rates for the training set and test set, respectively). Diversity and similarity descriptors which demonstrated the best performance in terms of their ability to select either diverse or focused sets of compounds from three databases (DrugBank, CMC and ChEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library, providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have devised and tested a new workflow for the rational selection of screening libraries under different scenarios. The current workflow was implemented using the Pipeline Pilot software, yet due to the usage of generic components it can be easily adapted and reproduced by computational groups interested in the rational selection of screening libraries. Furthermore, the workflow can be readily modified to include additional components. This workflow has been routinely used in our laboratory for the selection of libraries in multiple projects and consistently selects libraries which are well balanced across multiple parameters.
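A minimal sketch of one way such a consensus score can be formed from per-library metrics; the metric names, min-max normalization, and weights are assumptions, not the published scheme:

```python
import numpy as np

def consensus_score(libraries, weights):
    """Min-max normalize each metric across libraries (higher = better
    assumed), then combine with user weights into a single score."""
    names = list(weights)
    M = np.array([[lib[k] for k in names] for lib in libraries], float)
    span = np.ptp(M, axis=0)
    M = (M - M.min(axis=0)) / np.where(span == 0, 1.0, span)
    w = np.array([weights[k] for k in names], float)
    return M @ (w / w.sum())

libs = [{"admet_pass": 0.81, "diversity": 0.64, "non_promiscuous": 0.92},
        {"admet_pass": 0.74, "diversity": 0.71, "non_promiscuous": 0.88}]
print(consensus_score(libs, {"admet_pass": 2, "diversity": 1,
                             "non_promiscuous": 1}))
```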
DPN-Generated Combinatorial Libraries
2012-02-29
Control over deposition parameters was examined for two model proteins, including cholera toxin β subunit (CTβ), detected with fluorophore-conjugated anti-cholera toxin beta (anti-CTb). The wells in the mould are inverted pyramids with an average depth of 86 µm and an edge length of 120 µm.
Rowland, Mark S.; Howard, Douglas E.; Wong, James L.; Jessup, James L.; Bianchini, Greg M.; Miller, Wayne O.
2007-10-23
A real-time method and computer system for identifying radioactive materials that collects gamma count rates from an HPGe gamma-radiation detector to produce a high-resolution gamma-ray energy spectrum. A library of nuclear material definitions ("library definitions") is provided, each uniquely associated with a nuclide or isotope material and each comprising at least one logic condition on a spectral parameter of a gamma-ray energy spectrum. The method determines whether the spectral parameters of the high-resolution gamma-ray energy spectrum satisfy all the logic conditions of any one of the library definitions, and then uniquely identifies the material type as the nuclide or isotope material associated with the satisfied library definition. The method is iteratively repeated to update the spectrum and the identification in real time.
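A minimal sketch of the library-definition idea: each definition is a set of logic conditions on spectral parameters, and a material is identified when all its conditions are satisfied. The peak energies for Cs-137 and Co-60 are standard values; the tolerances and data structures are assumptions, not the patented method's:

```python
def has_peak(spectrum, energy_kev, tol=1.5):
    """Logic condition: a fitted peak exists near the given energy."""
    return any(abs(p - energy_kev) <= tol for p in spectrum["peaks_kev"])

# One "library definition" per nuclide: a conjunction of logic conditions.
LIBRARY = {
    "Cs-137": lambda s: has_peak(s, 661.7),
    "Co-60":  lambda s: has_peak(s, 1173.2) and has_peak(s, 1332.5),
}

def identify(spectrum):
    """Return the materials whose every logic condition is satisfied."""
    return [name for name, cond in LIBRARY.items() if cond(spectrum)]

measured = {"peaks_kev": [661.9, 1460.8]}
print(identify(measured))   # -> ['Cs-137']
```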
Model-Based Thermal System Design Optimization for the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.
2017-01-01
Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.
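A minimal sketch of the underlying mathematical task, least-squares adjustment of model parameters to test data; the toy model, data values, and parameter names are placeholders for the thermal model and sensor readings:

```python
import numpy as np
from scipy.optimize import least_squares

test_data = np.array([42.1, 40.3, 38.9, 37.8])   # measured temperatures (K)

def thermal_model(params):
    """Stand-in for a thermal model's predicted sensor temperatures."""
    a, b = params
    t = np.arange(4.0)
    return a * np.exp(-b * t) + 37.0

def residuals(params):
    return thermal_model(params) - test_data

fit = least_squares(residuals, x0=[5.0, 0.3], jac="2-point")
print(fit.x, fit.cost)   # optimal parameters and remaining discrepancy
```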
NASA Astrophysics Data System (ADS)
Parviainen, Hannu
2015-10-01
PyLDTk automates the calculation of custom stellar limb darkening (LD) profiles and model-specific limb darkening coefficients (LDC) using the library of PHOENIX-generated specific intensity spectra by Husser et al. (2013). It facilitates exoplanet transit light curve modeling, especially transmission spectroscopy, where the modeling is carried out for custom narrow passbands. PyLDTk constructs model-specific priors on the limb darkening coefficients prior to the transit light curve modeling. It can also be integrated directly into the log-posterior computation of any pre-existing transit modeling code, with minimal modifications, to constrain the LD model parameter space directly by the LD profile. This allows marginalization over the whole parameter space that can explain the profile, without the need to approximate this constraint by a prior distribution. This is useful when using a high-order limb darkening model, where the coefficients are often correlated and priors estimated from the tabulated values usually fail to include these correlations.
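A usage sketch following the package's documented pattern; exact function signatures may differ between PyLDTk versions:

```python
# pip install ldtk
from ldtk import LDPSetCreator, BoxcarFilter

# Custom narrow passbands (names and wavelength bounds in nm are examples)
filters = [BoxcarFilter('blue', 400, 550), BoxcarFilter('red', 550, 700)]

# Stellar parameters given as (value, uncertainty) pairs
sc = LDPSetCreator(teff=(5500, 100), logg=(4.5, 0.1), z=(0.0, 0.05),
                   filters=filters)
profiles = sc.create_profiles()           # downloads/caches PHOENIX spectra
qc, qe = profiles.coeffs_qd(do_mc=True)   # quadratic-law coefficients + errors
print(qc, qe)
```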
Parameterizable Library Components for SAW Devices
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2006-01-01
To facilitate quick fabrication of Surface Acoustic Wave (SAW) sensors, we have found it necessary to develop a library of parameterizable components. This library is the first module in our strategy towards a design tool that is integrated into existing Electronic Design Automation (EDA) tools. This library is similar to the standard cell libraries found in digital design packages. The library cells allow the user to input design parameters from which a detailed layout of the SAW component is automatically generated. This paper presents the results of our development of parameterizable cells for an InterDigitated Transducer (IDT), a reflector, a SAW delay line, and both one- and two-port resonators.
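A minimal sketch of what a parameterizable IDT cell generator does, turning design parameters into layout geometry; the rectangle representation and geometry rules are schematic assumptions, and a real cell would emit foundry-ready layout (e.g. GDSII):

```python
def idt_fingers(pairs, pitch_um, overlap_um, finger_len_um=120.0):
    """Rectangles (x, y, width, height) describing the fingers of an
    InterDigitated Transducer: alternate fingers attach to opposite bus
    bars and overlap by overlap_um.  Schematic geometry only."""
    width = pitch_um / 2.0                     # 0.5 metallization ratio
    rects = []
    for i in range(2 * pairs):
        x = i * pitch_um
        # Even fingers rise from the bottom bus; odd fingers hang from
        # the top bus, offset so the active overlap is overlap_um.
        y0 = 0.0 if i % 2 == 0 else finger_len_um - overlap_um
        rects.append((x, y0, width, finger_len_um))
    return rects

print(len(idt_fingers(pairs=25, pitch_um=4.0, overlap_um=100.0)))  # 50 fingers
```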
libprofit: Image creation from luminosity profiles
NASA Astrophysics Data System (ADS)
Robotham, A. S. G.; Taranu, D.; Tobar, R.
2016-12-01
libprofit is a C++ library for image creation based on different luminosity profiles. It offers fast and accurate two-dimensional integration for a useful number of profiles, including Sersic, Core-Sersic, broken-exponential, Ferrer, Moffat, empirical King, point-source and sky, with a simple mechanism for adding new profiles. libprofit provides a utility to read the model and profile parameters from the command-line and generate the corresponding image. It can output the resulting image as text values, a binary stream, or as a simple FITS file. It also provides a shared library exposing an API that can be used by any third-party application. R and Python interfaces are available: ProFit (ascl:1612.004) and PyProfit (ascl:1612.005).
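For illustration, a direct pixel-sampled rendering of the Sersic profile, one of the profiles listed above. This is not libprofit's API, which performs careful sub-pixel integration; the b_n value uses the common 2n - 1/3 approximation:

```python
import numpy as np

def sersic_image(n=4.0, re=10.0, ie=1.0, size=128):
    """Render a circular Sersic profile on a pixel grid:
    I(r) = Ie * exp(-b_n * ((r / Re)**(1/n) - 1)),
    with b_n ~ 2n - 1/3 (adequate for n not too small)."""
    bn = 2.0 * n - 1.0 / 3.0
    y, x = np.mgrid[0:size, 0:size]
    r = np.hypot(x - size / 2, y - size / 2)
    return ie * np.exp(-bn * ((r / re) ** (1.0 / n) - 1.0))

img = sersic_image()
print(img.shape, img.max())
```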
Biomolecular Force Field Parameterization via Atoms-in-Molecule Electron Density Partitioning.
Cole, Daniel J; Vilseck, Jonah Z; Tirado-Rives, Julian; Payne, Mike C; Jorgensen, William L
2016-05-10
Molecular mechanics force fields, which are commonly used in biomolecular modeling and computer-aided drug design, typically treat nonbonded interactions using a limited library of empirical parameters that are developed for small molecules. This approach does not account for polarization in larger molecules or proteins, and the parametrization process is labor-intensive. Using linear-scaling density functional theory and atoms-in-molecule electron density partitioning, environment-specific charges and Lennard-Jones parameters are derived directly from quantum mechanical calculations for use in biomolecular modeling of organic and biomolecular systems. The proposed methods significantly reduce the number of empirical parameters needed to construct molecular mechanics force fields, naturally include polarization effects in charge and Lennard-Jones parameters, and scale well to systems comprised of thousands of atoms, including proteins. The feasibility and benefits of this approach are demonstrated by computing free energies of hydration, properties of pure liquids, and the relative binding free energies of indole and benzofuran to the L99A mutant of T4 lysozyme.
Chemical Space of DNA-Encoded Libraries.
Franzini, Raphael M; Randolph, Cassie
2016-07-28
In recent years, DNA-encoded chemical libraries (DECLs) have attracted considerable attention as a potential discovery tool in drug development. Screening encoded libraries may offer advantages over conventional hit discovery approaches and has the potential to complement such methods in pharmaceutical research. As a result of the increased application of encoded libraries in drug discovery, a growing number of hit compounds are emerging in scientific literature. In this review we evaluate reported encoded library-derived structures and identify general trends of these compounds in relation to library design parameters. We in particular emphasize the combinatorial nature of these libraries. Generally, the reported molecules demonstrate the ability of this technology to afford hits suitable for further lead development, and on the basis of them, we derive guidelines for DECL design.
NASA Astrophysics Data System (ADS)
Kaliuzhnyi, Mykola; Bushuev, Felix; Shulga, Oleksandr; Sybiryakova, Yevgeniya; Shakun, Leonid; Bezrukovs, Vladislavs; Moskalenko, Sergiy; Kulishenko, Vladislav; Malynovskyi, Yevgen
2016-12-01
An international network for passive correlation ranging of a geostationary telecommunication satellite is considered in the article. The network was developed by the RI "MAO" and consists of five spatially separated stations for synchronized reception of DVB-S signals of digital satellite TV, located in Ukraine and Latvia. The time difference of arrival (TDOA) of the DVB-S signals radiated by the satellite at the network stations is the measured parameter. The results of TDOA estimation obtained by the network in May-August 2016 are presented in the article. Orbital parameters of the tracked satellite are determined using the measured values of the TDOA and two models of satellite motion: the analytical model SGP4/SDP4 and a model based on numerical integration of the equations of satellite motion. Both models are implemented using the free low-level space dynamics library OREKIT (ORbit Extrapolation KIT).
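The measured quantity itself is simple to state; a minimal sketch, with purely illustrative station and satellite coordinates (not the network's):

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s

def tdoa(sat, station_i, station_j):
    """Time difference of arrival of the satellite signal at two
    receiving stations (positions in an Earth-fixed frame, metres)."""
    sat, si, sj = map(np.asarray, (sat, station_i, station_j))
    return (np.linalg.norm(sat - si) - np.linalg.norm(sat - sj)) / C

sat = [26_600e3, 30_000e3, 0.0]          # rough GEO-scale position
station_a = [3_700e3, 2_700e3, 4_600e3]  # illustrative only
station_b = [3_200e3, 1_300e3, 5_300e3]  # illustrative only
print(tdoa(sat, station_a, station_b))   # seconds, the measured parameter
```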
Using a Gravity Model to Predict Circulation in a Public Library System.
ERIC Educational Resources Information Center
Ottensmann, John R.
1995-01-01
Describes the development of a gravity model based upon principles of spatial interaction to predict the circulation of libraries in the Indianapolis-Marion County Public Library (Indiana). The model effectively predicted past circulation figures and was tested by predicting future library circulation, particularly for a new branch library.…
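A minimal sketch of the spatial-interaction form underlying such a gravity model; the attractiveness measure, distances, and distance-decay exponent are illustrative:

```python
import numpy as np

def predicted_shares(attractiveness, distances, beta=1.5):
    """Gravity-model allocation: patronage of branch j from an area is
    proportional to A_j / d_j**beta, normalized over branches."""
    w = attractiveness / distances ** beta
    return w / w.sum()

# One residential area, three branches:
A = np.array([40_000, 15_000, 25_000])   # e.g. collection sizes
d = np.array([2.0, 1.0, 4.0])            # travel distance (km)
print(predicted_shares(A, d))            # predicted share of trips/circulation
```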
Welter, David E.; Doherty, John E.; Hunt, Randall J.; Muffels, Christopher T.; Tonkin, Matthew J.; Schreuder, Willem A.
2012-01-01
An object-oriented parameter estimation code was developed to incorporate benefits of object-oriented programming techniques for solving large parameter estimation modeling problems. The code is written in C++ and is a formulation and expansion of the algorithms included in PEST, a widely used parameter estimation code written in Fortran. The new code is called PEST++ and is designed to lower the barriers of entry for users and developers while providing efficient algorithms that can accommodate large, highly parameterized problems. This effort has focused on (1) implementing the most popular features of PEST in a fashion that is easy for novice or experienced modelers to use and (2) creating a software design that is easy to extend; that is, this effort provides a documented object-oriented framework designed from the ground up to be modular and extensible. In addition, all PEST++ source code and its associated libraries, as well as the general run manager source code, have been integrated in the Microsoft Visual Studio® 2010 integrated development environment. The PEST++ code is designed to provide a foundation for an open-source development environment capable of producing robust and efficient parameter estimation tools for the environmental modeling community into the future.
Impact of New Nuclear Data Libraries on Small Sized Long Life CANDLE HTGR Design Parameters
NASA Astrophysics Data System (ADS)
Liem, Peng Hong; Hartanto, Donny; Tran, Hoai Nam
2017-01-01
The impact of new evaluated nuclear data libraries (JENDL-4.0, ENDF/B-VII.0 and JEFF-3.1) on the core characteristics of small-sized long-life CANDLE High Temperature Gas-Cooled Reactors (HTGRs) with uranium and thorium fuel cycles was investigated. The most important parameters of the CANDLE core characteristics investigated here were (1) the infinite multiplication factor of the fresh fuel containing burnable poison, (2) the effective multiplication factor of the equilibrium core, (3) the moving velocity of the burning region, (4) the attained discharge burnup, and (5) the maximum power density. The reference case was taken from current JENDL-3.3 results. For the uranium fuel cycle, the impact of the new libraries was small, while a significant impact was found for the thorium fuel cycle. The findings indicate the need for more accurate nuclear data libraries for nuclides involved in the thorium fuel cycle.
Field size dependent mapping of medical linear accelerator radiation leakage
NASA Astrophysics Data System (ADS)
Vũ Bezin, Jérémi; Veres, Attila; Lefkopoulos, Dimitri; Chavaudra, Jean; Deutsch, Eric; de Vathaire, Florent; Diallo, Ibrahima
2015-03-01
The purpose of this study was to investigate the suitability of a graphics-library-based model for the assessment of linear accelerator radiation leakage. Transmission through the shielding elements was evaluated using the build-up-factor-corrected exponential attenuation law, and the contribution from the electron guide was estimated using the approximation of a linear isotropic radioactive source. Model parameters were estimated by fitting a series of thermoluminescent dosimeter leakage measurements acquired up to 100 cm from the beam central axis along three directions. The distribution of leakage data at the patient plane reflected the architecture of the shielding elements: the maximum leakage dose was found under the collimator when only one jaw shielded the primary beam and was about 0.08% of the dose at the isocentre. Overall, the main contributor to the leakage dose according to our model was the electron beam guide. Concerning the discrepancies between the measurements used to calibrate the model and the calculations from the model, the average difference was about 7%. Finally, graphics library modelling is a ready and suitable way to estimate the leakage dose distribution on a personal computer. Such data could be useful for dosimetric evaluations in late-effect studies.
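Minimal sketches of the two ingredients named above, build-up-corrected exponential attenuation and a finite isotropic line source; all values are illustrative and normalization constants are omitted:

```python
import numpy as np

def leakage_dose(d0, mu, x, buildup=1.0):
    """Build-up-factor-corrected exponential attenuation:
    D = B * D0 * exp(-mu * x)."""
    return buildup * d0 * np.exp(-mu * x)

def line_source_dose(s_per_len, length, h):
    """Dose rate at perpendicular distance h from the midpoint of a
    finite, isotropically emitting line source: proportional to
    (S_L / h) * theta, where theta is the subtended angle."""
    theta = 2.0 * np.arctan(0.5 * length / h)
    return s_per_len * theta / h

print(leakage_dose(d0=1.0, mu=0.5, x=8.0, buildup=3.0))    # shielded leakage
print(line_source_dose(s_per_len=1.0, length=1.5, h=1.0))  # guide contribution
```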
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sin, M.; Capote, R.; Herman, M. W.
Comprehensive calculations of cross sections for neutron-induced reactions on 232–237U targets are performed in this paper in the 10 keV–30 MeV incident energy range with the code EMPIRE-3.2 Malta. The advanced modelling and consistent calculation scheme are aimed at improving our knowledge of the neutron scattering and emission cross sections, and at assessing the consistency of available evaluated libraries for light uranium isotopes. The reaction model considers a dispersive optical potential (RIPL 2408) that couples from five (even targets) to nine (odd targets) levels of the ground-state rotational band, and a triple-humped fission barrier with absorption in the wells described within the optical model for fission. A modified Lorentzian model (MLO) of the radiative strength function and Enhanced Generalized Superfluid Model nuclear level densities are used in Hauser-Feshbach calculations of the compound-nuclear decay that include width fluctuation corrections. The starting values for the model parameters are retrieved from RIPL. Excellent agreement with available experimental data for neutron emission and fission is achieved, giving confidence that the quantities for which there is no experimental information are also accurately predicted. Finally, deficiencies in existing evaluated libraries are highlighted.
Optimization of a Thermodynamic Model Using a Dakota Toolbox Interface
NASA Astrophysics Data System (ADS)
Cyrus, J.; Jafarov, E. E.; Schaefer, K. M.; Wang, K.; Clow, G. D.; Piper, M.; Overeem, I.
2016-12-01
Scientific modeling of the Earth's physical processes is an important driver of modern science. The behavior of these scientific models is governed by a set of input parameters. It is crucial to choose accurate input parameters that will also preserve the corresponding physics being simulated in the model. In order to simulate real-world processes effectively, the model's output must be close to the observed measurements. To achieve this optimal simulation, input parameters are tuned until we have minimized the objective function, which is the error between the simulation model outputs and the observed measurements. We developed an auxiliary package that serves as a Python interface between the user and DAKOTA. The package makes it easy for the user to conduct parameter space explorations, parameter optimizations, and sensitivity analyses, while tracking and storing results in a database. The ability to perform these analyses via a Python library also allows users to combine analysis techniques, for example finding an approximate equilibrium with optimization and then immediately exploring the space around it. We used the interface to calibrate input parameters for the heat flow model, which is commonly used in permafrost science. We performed optimization on the first three layers of the permafrost model, each with two thermal conductivity coefficients as input parameters. Results of the parameter space explorations indicate that the objective function does not always have a unique minimum. We found that gradient-based optimization works best for objective functions with a single minimum; otherwise, we employ more advanced Dakota methods, such as genetic optimization and mesh-based convergence, to find the optimal input parameters. We were able to recover six initially unknown thermal conductivity parameters to within 2% of their known values. Our initial tests indicate that the developed interface to the Dakota toolbox can be used to perform analysis and optimization on a 'black box' scientific model more efficiently than using Dakota alone.
Creating a library holding group: an approach to large system integration.
Huffman, Isaac R; Martin, Heather J; Delawska-Elliott, Basia
2016-10-01
Faced with resource constraints, many hospital libraries have considered joint operations. This case study describes how Providence Health & Services created a single group to provide library services. Using a holding group model, staff worked to unify more than 6,100 nonlibrary subscriptions and 14 internal library sites. Our library services grew by unifying 2,138 nonlibrary subscriptions and 11 library sites and hiring more library staff. We expanded access to 26,018 more patrons. A model with built-in flexibility allowed successful library expansion. Although challenges remain, this success points to a viable model of unified operations.
EMPIRE: A code for nuclear astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palumbo, A.
The nuclear reaction code EMPIRE is presented as a useful tool for nuclear astrophysics. EMPIRE combines a variety of reaction models with a comprehensive library of input parameters, providing a diversity of options for the user. With the exclusion of direct-semidirect capture, all reaction mechanisms relevant to the nuclear astrophysics energy range of interest are implemented in the code. Comparisons to experimental data show consistent agreement for all relevant channels.
Computation and Validation of the Dynamic Response Index (DRI)
2013-08-06
The DRI computation tool is built on the matplotlib plotting library, is executed from the command line with several optional arguments, and runs on Windows, Linux, UNIX, and Mac OS X. Example outputs include acceleration-vs.-time plots for a triangular pulse input with a given time duration and peak acceleration. The accompanying EARTH code is motivated by error assessment of test vs. simulation data; an ARC-provided electrothermal battery model example compares test and simulated terminal voltage given the EARTH input parameters. Approved for public release.
OnGuard, a Computational Platform for Quantitative Kinetic Modeling of Guard Cell Physiology
Hills, Adrian; Chen, Zhong-Hua; Amtmann, Anna; Blatt, Michael R.; Lew, Virgilio L.
2012-01-01
Stomatal guard cells play a key role in gas exchange for photosynthesis while minimizing transpirational water loss from plants by opening and closing the stomatal pore. Foliar gas exchange has long been incorporated into mathematical models, several of which are robust enough to recapitulate transpirational characteristics at the whole-plant and community levels. Few models of stomata have been developed from the bottom up, however, and none are sufficiently generalized to be widely applicable in predicting stomatal behavior at a cellular level. We describe here the construction of computational models for the guard cell, building on the wealth of biophysical and kinetic knowledge available for guard cell transport, signaling, and homeostasis. The OnGuard software was constructed with the HoTSig library to incorporate explicitly all of the fundamental properties for transporters at the plasma membrane and tonoplast, the salient features of osmolyte metabolism, and the major controls of cytosolic-free Ca2+ concentration and pH. The library engenders a structured approach to tier and interrelate computational elements, and the OnGuard software allows ready access to parameters and equations 'on the fly' while enabling the network of components within each model to interact computationally. We show that an OnGuard model readily achieves stability in a set of physiologically sensible baseline or Reference States; we also show the robustness of these Reference States in adjusting to changes in environmental parameters and the activities of major groups of transporters both at the tonoplast and plasma membrane. The following article addresses the predictive power of the OnGuard model to generate unexpected and counterintuitive outputs. PMID:22635116
Visualization Based Data Mining for Comparison Between Two Solar Cell Libraries.
Yosipof, Abraham; Kaspi, Omer; Majhi, Koushik; Senderowitz, Hanoch
2016-12-01
Material informatics may provide meaningful insights and powerful predictions for the development of new and efficient Metal Oxide (MO) based solar cells. The main objective of this paper is to establish the usefulness of data reduction and visualization methods for analyzing data sets emerging from multiple all-MO solar cell libraries. For this purpose, two libraries, TiO2|Co3O4 and TiO2|Co3O4|MoO3, differing only by the presence of a MoO3 layer in the latter, were analyzed with Principal Component Analysis and Self-Organizing Maps. Both analyses suggest that the addition of the MoO3 layer to the TiO2|Co3O4 library affected the overall photovoltaic (PV) activity profile of the solar cells, making the two libraries clearly distinguishable from one another. Furthermore, while MoO3 had an overall favorable effect on PV parameters, a sub-population of cells was identified that were either indifferent to its presence or even demonstrated a reduction in several parameters. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
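As a schematic illustration of the data-reduction step described above (not the authors' code), the following applies Principal Component Analysis to a mock table of per-cell photovoltaic parameters; the library names, array shapes, and offsets are assumptions.

    # PCA on a mock solar-cell library: rows are cells, columns are
    # photovoltaic parameters (e.g. Voc, Jsc, fill factor, efficiency).
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    lib_a = rng.normal(0.0, 1.0, size=(96, 4))      # TiO2|Co3O4 cells
    lib_b = rng.normal(0.8, 1.0, size=(96, 4))      # TiO2|Co3O4|MoO3 cells
    X = StandardScaler().fit_transform(np.vstack([lib_a, lib_b]))

    scores = PCA(n_components=2).fit_transform(X)
    # A scatter of the two libraries in PC space would show whether the
    # MoO3 layer makes them distinguishable, as reported above.
    print(scores[:3])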
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-12-09
PV_LIB comprises a library of MATLAB code for modeling photovoltaic (PV) systems. Included are functions to compute solar position and to estimate irradiance in the PV system's plane of array, cell temperature, PV module electrical output, and conversion from DC to AC power. Also included are functions that aid in determining parameters for module performance models from module characterization testing. PV_LIB is open-source code primarily intended for research and academic purposes. All algorithms are documented in openly available literature, with the appropriate references included in comments within the code.
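PV_LIB as described here is MATLAB code, but a community Python port (pvlib) exposes similar functions; assuming that package is installed, a minimal solar-position query looks like this (location and times are invented).

    # Minimal solar-position query with the pvlib Python port.
    import pandas as pd
    import pvlib

    times = pd.date_range("2015-12-09 08:00", periods=4, freq="1h",
                          tz="US/Mountain")
    solpos = pvlib.solarposition.get_solarposition(times, latitude=35.05,
                                                   longitude=-106.54)
    print(solpos[["apparent_zenith", "azimuth"]])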
ERIC Educational Resources Information Center
Ohio Library Foundation, Columbus.
A guide which any library may use to achieve its own statement of personnel policy presents policy models which suggest rules and regulations to be used to supervise the staffs of public and academic libraries. These policies cover: (1) appointments; (2) classification of positions; (3) faculty and staff development; (4) performance evaluations;…
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
NASA Astrophysics Data System (ADS)
Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao
2017-05-01
Cable overheating reduces the cable insulation level, accelerates insulation aging, and can even cause short-circuit faults, so identification of and warning about cable overheating risk is necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. First, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system to improve the impedance parameter estimation accuracy. Second, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from future forecast data in the distribution SCADA system. Third, a rules library for cable overheating risk warning is established; the cable impedance forecast value and the rate of change of impedance are calculated, and the overheating risk of the cable line is flagged according to the rules library, based on the relationship between impedance and line temperature rise. The method is simulated in the paper, and the simulation results show that it can accurately identify the impedance and forecast the temperature rise of cable lines in a distribution network. The overheating risk warnings can provide a decision basis for operation, maintenance, and repair.
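The abstract does not give the regression itself; as an illustrative assumption, the sketch below estimates a series resistance and reactance from SCADA-style voltage-drop samples with ordinary least squares, which is the kind of fit the paper describes.

    # Least-squares estimate of cable series impedance (R, X) from
    # mock SCADA samples; the linear voltage-drop model is an assumption.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    i_re, i_im = rng.uniform(50, 200, n), rng.uniform(-40, 40, n)
    R_true, X_true = 0.12, 0.31                 # ohms (synthetic)
    dv = R_true * i_re - X_true * i_im + rng.normal(0, 0.5, n)

    A = np.column_stack([i_re, -i_im])          # regressors for [R, X]
    (R_est, X_est), *_ = np.linalg.lstsq(A, dv, rcond=None)
    print(f"R={R_est:.3f} ohm, X={X_est:.3f} ohm")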
Rubble-pile Simulations Using The Open Dynamics Engine
NASA Astrophysics Data System (ADS)
Korycansky, Donald; Asphaug, E.
2008-09-01
We describe a series of calculations of low-speed collisions of km-scale rubble piles (i.e. asteroids or planetesimals), similar to previous work (Korycansky and Asphaug 2006). The rubble piles are aggregates of polyhedra held together by gravity and friction. Collision velocities are typically of order 1 to 100 m/sec. In this work we make use of a so-called "physics engine" to solve the equations of rigid-body motion and collisions of the polyhedra. Such code libraries have been primarily developed for computer simulations and games. The chief advantage of these libraries is the inclusion of sophisticated algorithms for collision detection, which we have found to be the main computational bottleneck in our calculations. The package we have used is the Open Dynamics Engine, a freely available open-source library (www.ode.org). It solves the equations of motion to first-order accuracy in time and utilizes a fast algorithm for collision detection. We have found a factor of approximately 30 speed-up for our calculations, allowing the exploration of a much larger range of parameter space and the running of multiple calculations in order to sample the stochasticity of the results. For the calculations we report on here, the basic model is the collision of an impactor in the range 0.1-1 km in diameter with a target of 1 km diameter. Targets are modeled with 1000 polyhedral elements and impactors are modeled with 1 to 1000 elements depending on mass. Collisions of objects with both equal-mass elements, and elements chosen from a power-law distribution, are studied. We concentrate on determining the energy required for catastrophic disruption (Q*D) as a function of impactor/target mass ratio and impact parameter for off-center collisions. This work has been supported by NASA Planetary Geology and Geophysics Program grant NNX07AQ04G.
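A physics engine such as the Open Dynamics Engine supplies contact and rigid-body dynamics but not mutual gravity, so a rubble-pile code must add gravitational forces itself. The numpy sketch below shows that pairwise force-accumulation step only; element masses, positions, and the softening length are invented.

    # Pairwise gravitational accelerations for rubble-pile elements;
    # these forces would be fed to the physics engine each time step.
    import numpy as np

    G = 6.674e-11
    rng = np.random.default_rng(2)
    pos = rng.normal(0, 500.0, size=(1000, 3))      # m, roughly a 1 km pile
    m = np.full(1000, 4.0e9)                        # kg per polyhedral element

    def gravity_accel(pos, m, soft=1.0):
        d = pos[None, :, :] - pos[:, None, :]       # pairwise separations
        r2 = np.einsum("ijk,ijk->ij", d, d) + soft**2
        np.fill_diagonal(r2, np.inf)                # no self-force
        return G * np.einsum("ij,ijk->ik", m / r2**1.5, d)

    a = gravity_accel(pos, m)
    print(a.shape)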
ERIC Educational Resources Information Center
Koontz, Christine M.
1992-01-01
Presents a methodology for construction of location modeling for public library facilities in diverse urban environments. Historical and current research in library location is reviewed; and data collected from a survey of six library systems are analyzed according to population, spatial, library use, and library attractiveness variables. (48…
Random sampling and validation of covariance matrices of resonance parameters
NASA Astrophysics Data System (ADS)
Plevnik, Lucijan; Zerovnik, Gašper
2017-09-01
Analytically exact methods for random sampling of arbitrarily correlated parameters are presented. Emphasis is given, on the one hand, to possible inconsistencies in the covariance data, concentrating on positive semi-definiteness and on consistent sampling of correlated, inherently positive parameters, and, on the other hand, to optimization of the implementation of the methods themselves. The methods have been applied in the program ENDSAM, written in Fortran, which, from a file of a chosen isotope in ENDF-6 format from a nuclear data library, produces an arbitrary number of new ENDF-6 files containing random samples of the resonance parameters (in accordance with the corresponding covariance matrices) in place of the original values. The source code for the program ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: it reads resonance parameters and their covariance data from the nuclear data library, checks whether the covariance data are consistent, and produces random samples of the resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries. A list of inconsistencies observed in the covariance data of resonance parameters in ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now the work has been limited to resonance parameters; however, the methods presented are general and can in principle be extended to the sampling and validation of any nuclear data.
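A minimal version of the two checks described, positive semi-definiteness of the covariance matrix and sampling of correlated positive parameters, can be sketched as follows; the lognormal transform used here is one common choice for keeping parameters positive, not necessarily ENDSAM's, and the toy numbers are invented.

    # Check a covariance matrix and draw correlated positive samples.
    import numpy as np

    cov = np.array([[0.04, 0.01], [0.01, 0.09]])   # toy relative covariances
    mean = np.array([2.5, 0.8])                    # toy resonance parameters

    eigvals = np.linalg.eigvalsh(cov)
    if eigvals.min() < -1e-12:
        raise ValueError("covariance matrix is not positive semi-definite")

    rng = np.random.default_rng(3)
    # Sample in log space so every drawn parameter stays positive.
    g = rng.multivariate_normal(np.zeros(2), cov, size=1000)
    samples = mean * np.exp(g - 0.5 * np.diag(cov))   # bias-corrected lognormal
    print(samples.mean(axis=0))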
New H-band Stellar Spectral Libraries for the SDSS-III/APOGEE Survey
NASA Astrophysics Data System (ADS)
Zamora, O.; García-Hernández, D. A.; Allende Prieto, C.; Carrera, R.; Koesterke, L.; Edvardsson, B.; Castelli, F.; Plez, B.; Bizyaev, D.; Cunha, K.; García Pérez, A. E.; Gustafsson, B.; Holtzman, J. A.; Lawler, J. E.; Majewski, S. R.; Manchado, A.; Mészáros, Sz.; Shane, N.; Shetrone, M.; Smith, V. V.; Zasowski, G.
2015-06-01
The Sloan Digital Sky Survey-III (SDSS-III) Apache Point Observatory Galactic Evolution Experiment (APOGEE) has obtained high-resolution (R ≈ 22,500), high signal-to-noise ratio (>100) spectra in the H-band (~1.5-1.7 μm) for about 146,000 stars in the Milky Way galaxy. We have computed spectral libraries with effective temperature (T_eff) ranging from 3500 to 8000 K for the automated chemical analysis of the survey data. The libraries, used to derive stellar parameters and abundances from the APOGEE spectra in the SDSS-III data release 12 (DR12), are based on ATLAS9 model atmospheres and the ASSɛT spectral synthesis code. We present a second set of libraries based on MARCS model atmospheres and the spectral synthesis code Turbospectrum. The ATLAS9/ASSɛT (T_eff = 3500-8000 K) and MARCS/Turbospectrum (T_eff = 3500-5500 K) grids cover a wide range of metallicity (-2.5 ≤ [M/H] ≤ +0.5 dex), surface gravity (0 ≤ log g ≤ 5 dex), microturbulence (0.5 ≤ ξ ≤ 8 km s⁻¹), carbon (-1 ≤ [C/M] ≤ +1 dex), nitrogen (-1 ≤ [N/M] ≤ +1 dex), and α-element (-1 ≤ [α/M] ≤ +1 dex) variations, and thus have seven dimensions. We compare the ATLAS9/ASSɛT and MARCS/Turbospectrum libraries and apply both of them to the analysis of the observed H-band spectra of the Sun and the K2 giant Arcturus, as well as to a selected sample of well-known giant stars observed at very high resolution. The new APOGEE libraries are publicly available and can be employed for chemical studies in the H-band using other high-resolution spectrographs.
Application of heterogeneous pulse coupled neural network in image quantization
NASA Astrophysics Data System (ADS)
Huang, Yi; Ma, Yide; Li, Shouliang; Zhan, Kun
2016-11-01
On the basis of the different strengths of synaptic connections between actual neurons, this paper proposes a heterogeneous pulse coupled neural network (HPCNN) algorithm to perform quantization on images. HPCNNs are developed from traditional pulse coupled neural network (PCNN) models and have different parameters for different image regions. This allows pixels of different gray levels to be classified broadly into two categories: background regions and object regions. Moreover, the HPCNN also conforms to human visual characteristics. The parameters of the HPCNN model are calculated automatically according to these categories, and the quantized results are optimal and more suitable for human observation. At the same time, experimental results on natural images from the standard image library show the validity and efficiency of our proposed quantization method.
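For orientation, a single iteration of the traditional PCNN that HPCNN extends can be written as below; the feeding, linking, and dynamic-threshold constants are illustrative, not the paper's values.

    # One homogeneous PCNN iteration (HPCNN generalizes this by giving
    # different beta/threshold parameters to different image regions).
    import numpy as np
    from scipy.signal import convolve2d

    rng = np.random.default_rng(4)
    S = rng.random((64, 64))                 # normalized input image
    Y = np.zeros_like(S)                     # pulse output
    theta = np.ones_like(S)                  # dynamic threshold
    W = np.array([[0.5, 1.0, 0.5], [1.0, 0.0, 1.0], [0.5, 1.0, 0.5]])
    beta, a_theta, V_theta = 0.2, 0.3, 20.0  # illustrative constants

    for _ in range(10):
        L = convolve2d(Y, W, mode="same")    # linking input from neighbors
        U = S * (1.0 + beta * L)             # internal activity
        Y = (U > theta).astype(float)        # pulse where activity exceeds it
        theta = theta * np.exp(-a_theta) + V_theta * Y   # threshold decay/reset
    print(Y.sum())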
Hypercluster parallel processing library user's manual
NASA Technical Reports Server (NTRS)
Quealy, Angela
1990-01-01
This User's Manual describes the Hypercluster Parallel Processing Library, composed of FORTRAN-callable subroutines which enable a FORTRAN programmer to manipulate and transfer information throughout the Hypercluster at NASA Lewis Research Center. Each subroutine and its parameters are described in detail. A simple heat flow application using Laplace's equation is included to demonstrate the use of some of the library's subroutines. The manual can be used initially as an introduction to the parallel features provided by the library. Thereafter it can be used as a reference when programming an application.
A Physically based Model of the Ionizing Radiation from Active Galaxies for Photoionization Modeling
NASA Astrophysics Data System (ADS)
Thomas, A. D.; Groves, B. A.; Sutherland, R. S.; Dopita, M. A.; Kewley, L. J.; Jin, C.
2016-12-01
We present a simplified model of active galactic nucleus (AGN) continuum emission designed for photoionization modeling. The new model oxaf reproduces the diversity of spectral shapes that arise in physically based models. We identify and explain degeneracies in the effects of AGN parameters on model spectral shapes, with a focus on the complete degeneracy between the black hole mass and AGN luminosity. Our reparametrized model oxaf removes these degeneracies and accepts three parameters that directly describe the output spectral shape: the energy of the peak of the accretion disk emission E_peak, the photon power-law index of the non-thermal emission Γ, and the proportion of the total flux that is emitted in the non-thermal component p_NT. The parameter E_peak is presented as a function of the black hole mass, AGN luminosity, and "coronal radius" of the optxagnf model upon which oxaf is based. We show that the soft X-ray excess does not significantly affect photoionization modeling predictions of strong emission lines in Seyfert narrow-line regions. Despite its simplicity, oxaf accounts for opacity effects where the accretion disk is ionized because it inherits the "color correction" of optxagnf. We use a grid of MAPPINGS photoionization models with oxaf ionizing spectra to demonstrate how predicted emission-line ratios on standard optical diagnostic diagrams are sensitive to each of the three oxaf parameters. The oxaf code is publicly available in the Astrophysics Source Code Library.
Self-consistent two-phase AGN torus models. SED library for observers
NASA Astrophysics Data System (ADS)
Siebenmorgen, Ralf; Heymann, Frank; Efstathiou, Andreas
2015-11-01
We assume that dust near active galactic nuclei (AGNs) is distributed in a torus-like geometry, which can be described as a clumpy medium or a homogeneous disk, or as a combination of the two (i.e. a two-phase medium). The dust particles considered are fluffy and have higher submillimeter emissivities than grains in the diffuse interstellar medium. The dust-photon interaction is treated in a fully self-consistent three-dimensional radiative transfer code. We provide an AGN library of spectral energy distributions (SEDs). Its purpose is to quickly obtain estimates of the basic parameters of the AGNs, such as the intrinsic luminosity of the central source, the viewing angle, the inner radius, the volume filling factor and optical depth of the clouds, and the optical depth of the disk midplane, and to predict the flux at yet unobserved wavelengths. The procedure is simple and consists of finding an element in the library that matches the observations. We discuss the general properties of the models and in particular the 10 μm silicate band. The AGN library accounts well for the observed scatter of the feature strengths and wavelengths of the peak emission. AGN extinction curves are discussed and we find that there is no direct one-to-one link between the observed extinction and the wavelength dependence of the dust cross sections. We show that objects in the library cover the observed range of mid-infrared colors of known AGNs. The validity of the approach is demonstrated by matching the SEDs of a number of representative objects: Four Seyferts and two quasars for which we present new Herschel photometry, two radio galaxies, and one hyperluminous infrared galaxy. Strikingly, for the five luminous objects we find that pure AGN models fit the SED without needing to postulate starburst activity. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.The SED library of the AGN models is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/583/A120, and at http://www.eso.org/~rsiebenm/agn_models/
Unresolved Galaxy Classifier for ESA/Gaia mission: Support Vector Machines approach
NASA Astrophysics Data System (ADS)
Bellas-Velidis, Ioannis; Kontizas, Mary; Dapergolas, Anastasios; Livanou, Evdokia; Kontizas, Evangelos; Karampelas, Antonios
A software package, the Unresolved Galaxy Classifier (UGC), is being developed for the ground-based pipeline of ESA's Gaia mission. It aims to provide automated taxonomic classification and estimation of specific parameters by analyzing low-dispersion spectra of unresolved galaxies from the Gaia BP/RP instrument. The UGC algorithm is based on a supervised learning technique, Support Vector Machines (SVM). The software is implemented in Java as two separate modules. An offline learning module provides functions for training SVM models. Once trained, the set of models can be applied repeatedly to unknown galaxy spectra by the pipeline's application module. A library of synthetic spectra of galaxy models, simulated for the BP/RP instrument, is used to train and test the modules. Science tests show very good classification performance of UGC and relatively good regression performance, except for some of the parameters. Possible approaches to improve the performance are discussed.
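The UGC itself is Java, but the train-once, apply-many pattern it describes maps directly onto any SVM library; here is a compact Python illustration on synthetic "spectra" (all shapes, labels, and hyperparameters are invented).

    # Train an SVM classifier on synthetic low-dispersion spectra,
    # then apply the trained model to unknown spectra.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)
    n, npix = 400, 120
    labels = rng.integers(0, 3, n)               # three mock galaxy classes
    spectra = rng.normal(0, 1, (n, npix)) + labels[:, None] * 0.5

    model = SVC(kernel="rbf", C=10.0)            # "offline learning module"
    model.fit(spectra[:300], labels[:300])

    predicted = model.predict(spectra[300:])     # "application module"
    print((predicted == labels[300:]).mean())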
A Selected Library of Transport Coefficients for Combustion and Plasma Physics Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cloutman, L.D.
2000-08-01
COYOTE and similar combustion programs based on the multicomponent Navier-Stokes equations require the mixture viscosity, thermal conductivity, and species transport coefficients as input. This report documents a model of these molecular transport coefficients that is simpler than the general theory, but which provides adequate accuracy for many purposes. This model leads to a computationally convenient, self-contained, and easy-to-use source of such data in a format suitable for use by such programs. We present the data for various neutral species in two forms. The first form is a simple functional fit to the transport coefficients. The second form is the use of tabulated Lennard-Jones parameters in simple theoretical expressions for the gas-phase transport coefficients. The model then is extended to the case of a two-temperature plasma. Lennard-Jones parameters are given for a number of chemical species of interest in combustion research.
Cost Accounting and Analysis for University Libraries
ERIC Educational Resources Information Center
Leimkuhler, Ferdinand F.; Cooper, Michael D.
1971-01-01
The approach to library planning studied in this paper is the use of accounting models to measure library costs and implement program budgets. A cost-flow model for a university library is developed and tested with historical data from the General Library at the University of California, Berkeley. (4 references) (Author)
Modelling Neutron-induced Reactions on 232–237U from 10 keV up to 30 MeV
Sin, M.; Capote, R.; Herman, M. W.; ...
2017-01-17
Comprehensive calculations of cross sections for neutron-induced reactions on 232–237U targets are performed in this paper in the 10 keV–30 MeV incident energy range with the code EMPIRE–3.2 Malta. The advanced modelling and consistent calculation scheme are aimed at improving our knowledge of the neutron scattering and emission cross sections, and at assessing the consistency of available evaluated libraries for light uranium isotopes. The reaction model considers a dispersive optical potential (RIPL 2408) that couples from five (even targets) to nine (odd targets) levels of the ground-state rotational band, and a triple-humped fission barrier with absorption in the wells described within the optical model for fission. A modified Lorentzian model (MLO) of the radiative strength function and Enhanced Generalized Superfluid Model nuclear level densities are used in Hauser-Feshbach calculations of the compound-nuclear decay that include width fluctuation corrections. The starting values for the model parameters are retrieved from RIPL. Excellent agreement with available experimental data for neutron emission and fission is achieved, giving confidence that the quantities for which there is no experimental information are also accurately predicted. Finally, deficiencies in existing evaluated libraries are highlighted.
Minimizing energy dissipation of matrix multiplication kernel on Virtex-II
NASA Astrophysics Data System (ADS)
Choi, Seonil; Prasanna, Viktor K.; Jang, Ju-wook
2002-07-01
In this paper, we develop energy-efficient designs for matrix multiplication on FPGAs. To analyze the energy dissipation, we develop a high-level model using domain-specific modeling techniques. In this model, we identify architecture parameters that significantly affect the total (system-wide) energy dissipation. Then, we explore design trade-offs by varying these parameters to minimize the system-wide energy. For matrix multiplication, we consider a uniprocessor architecture and a linear array architecture to develop energy-efficient designs. For the uniprocessor architecture, the cache size is a parameter that affects the I/O complexity and the system-wide energy. For the linear array architecture, the amount of storage per processing element is a parameter affecting the system-wide energy. By using the maximum amount of storage per processing element and the minimum number of multipliers, we obtain a design that minimizes the system-wide energy. We develop several energy-efficient designs for matrix multiplication. For example, for 6×6 matrix multiplication, energy savings of up to 52% for the uniprocessor architecture and 36% for the linear array architecture are achieved over an optimized library for the Virtex-II FPGA from Xilinx.
NASA Astrophysics Data System (ADS)
Nasertdinova, A. D.; Bochkarev, V. V.
2017-11-01
Deep neural networks with a large number of parameters are a powerful tool for solving problems of pattern recognition, prediction, and classification. Nevertheless, overfitting remains a serious problem in the use of such networks. A method for solving the problem of overfitting is proposed in this article. The method is based on reducing the number of independent parameters of a neural network model using principal component analysis, and it can be implemented using existing libraries for neural computing. The algorithm was tested on the recognition of handwritten digits from the MNIST database, as well as on time series prediction (series of the average monthly number of sunspots and of the Lorenz system were used). It is shown that applying principal component analysis makes it possible to reduce the number of parameters of the neural network model while maintaining good results. The average error rate for the recognition of handwritten digits from the MNIST database was 1.12% (comparable to the results obtained using deep learning methods), while the number of parameters of the neural network could be reduced by a factor of up to 130.
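A minimal sketch of the idea, reducing input dimensionality with PCA in front of a small network so that far fewer weights are needed; scikit-learn's bundled digits set stands in for MNIST, and the component count is an assumption.

    # PCA in front of a small neural network: fewer independent weights.
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # 64 input pixels -> 16 principal components -> small classifier.
    model = make_pipeline(
        PCA(n_components=16),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0))
    model.fit(X_tr, y_tr)
    print(f"test accuracy: {model.score(X_te, y_te):.3f}")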
ZASPE: A Code to Measure Stellar Atmospheric Parameters and their Covariance from Spectra
NASA Astrophysics Data System (ADS)
Brahm, Rafael; Jordán, Andrés; Hartman, Joel; Bakos, Gáspár
2017-05-01
We describe the Zonal Atmospheric Stellar Parameters Estimator (zaspe), a new algorithm, and its associated code, for determining precise stellar atmospheric parameters and their uncertainties from high-resolution echelle spectra of FGK-type stars. zaspe estimates the stellar atmospheric parameters by comparing the observed spectrum against a grid of synthetic spectra, using only the spectral zones that are most sensitive to changes in the atmospheric parameters. Realistic uncertainties in the parameters are computed from the data itself, by taking into account the systematic mismatches between the observed spectrum and the best-fitting synthetic one. The covariances between the parameters are also estimated in the process. zaspe can in principle use any pre-calculated grid of synthetic spectra, but unbiased grids are required to obtain accurate parameters. We tested the performance of two existing libraries and concluded that neither is suitable for computing precise atmospheric parameters. We therefore describe a process to synthesize a new library of synthetic spectra, which was found to generate consistent results when compared with parameters obtained by different methods (interferometry, asteroseismology, equivalent widths).
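The core idea, comparing observation and grid only where pixels are sensitive to the parameter, can be illustrated as a masked chi-squared search over a toy one-parameter grid; the line-profile model, noise level, and sensitivity threshold below are all assumptions for illustration.

    # Masked chi-squared over a toy 1-parameter grid of synthetic spectra,
    # restricted to the pixels most sensitive to the parameter.
    import numpy as np

    rng = np.random.default_rng(6)
    teff_grid = np.linspace(5000, 6000, 41)
    pix = np.linspace(0, 1, 300)
    grid = np.array([1 - 0.3 * np.exp(-((pix - 0.5) * (t / 5500)) ** 2 * 400)
                     for t in teff_grid])         # toy line profiles

    sensitivity = np.abs(np.gradient(grid, axis=0)).mean(axis=0)
    mask = sensitivity > np.percentile(sensitivity, 80)   # keep top 20% zones

    observed = grid[23] + rng.normal(0, 0.005, pix.size)
    chi2 = (((grid - observed) ** 2)[:, mask]).sum(axis=1)
    print(f"best Teff: {teff_grid[chi2.argmin()]:.0f} K")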
ERIC Educational Resources Information Center
Jantz, Ronald
2001-01-01
Analyzes the implications of electronic book technology (e-books) on academic libraries. Discusses new business models for publishers, including self-publishing, Internet publishing, and partnerships with libraries as publishers; impact on library services, including cataloging, circulation, and digital preservation; user benefits; standards;…
Law Libraries as Special Libraries: An Educational Model.
ERIC Educational Resources Information Center
Hazelton, Penny A.
1993-01-01
Summarizes the history of the law library profession and the development of the educational model for law librarians in light of the particular demands and needs of corporate and law firm libraries. Guidelines of the American Association of Law Libraries for graduate programs in law librarianship are discussed. (Contains 17 references.) (LRW)
Cost Accounting and Analysis for University Libraries.
ERIC Educational Resources Information Center
Leimkuhler, Ferdinand F.; Cooper, Michael D.
The approach to library planning studied in this report is the use of accounting models to measure library costs and implement program budgets. A cost-flow model for a university library is developed and tested with historical data from the Berkeley General Library. Various comparisons of an exploratory nature are made of the unit costs for…
Development of the FITS tools package for multiple software environments
NASA Technical Reports Server (NTRS)
Pence, W. D.; Blackburn, J. K.
1992-01-01
The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.
Vieira, Ricardo P; Gonzalez, Alessandra M; Cardoso, Alexander M; Oliveira, Denise N; Albano, Rodolpho M; Clementino, Maysa M; Martins, Orlando B; Paranhos, Rodolfo
2008-01-01
This study is the first to apply a comparative analysis of environmental chemistry, microbiological parameters and bacterioplankton 16S rRNA clone libraries from different areas of a 50 km transect along a trophic gradient in the tropical Guanabara Bay ecosystem. Higher bacterial diversity was found in the coastal area, whereas lower richness was observed in the more polluted inner bay water. The significance of differences between clone libraries was examined with LIBSHUFF statistics. Paired reciprocal comparisons indicated that each of the libraries differs significantly from the others, and this is in agreement with direct interpretation of the phylogenetic tree. Furthermore, correspondence analyses showed that some taxa are related to specific abiotic, trophic and microbiological parameters in Guanabara Bay estuarine system.
An efficient platform for genetic selection and screening of gene switches in Escherichia coli
Muranaka, Norihito; Sharma, Vandana; Nomura, Yoko; Yokobayashi, Yohei
2009-01-01
Engineered gene switches and circuits that can sense various biochemical and physical signals, perform computation, and produce predictable outputs are expected to greatly advance our ability to program complex cellular behaviors. However, rational design of gene switches and circuits that function in living cells is challenging due to the complex intracellular milieu. Consequently, most successful designs of gene switches and circuits have relied, to some extent, on high-throughput screening and/or selection from combinatorial libraries of gene switch and circuit variants. In this study, we describe a generic and efficient platform for selection and screening of gene switches and circuits in Escherichia coli from large libraries. The single-gene dual selection marker tetA was translationally fused to green fluorescent protein (gfpuv) via a flexible peptide linker and used as a dual selection and screening marker for laboratory evolution of gene switches. Single-cycle (sequential positive and negative selections) enrichment efficiencies of >7000 were observed in mock selections of model libraries containing functional riboswitches in liquid culture. The technique was applied to optimize various parameters affecting the selection outcome, and to isolate novel thiamine pyrophosphate riboswitches from a complex library. Artificial riboswitches with excellent characteristics were isolated that exhibit up to 58-fold activation as measured by fluorescent reporter gene assay. PMID:19190095
NASA Astrophysics Data System (ADS)
Lapteva, M. V.
The building up of the specialized collection of the Library of the Institute of Theoretical Astronomy of the Russian Academy of Sciences, from the foundation of the Library (1924) up to the present time, is considered in historical perspective. The main acquisition sources, stock figures, and various parameters of the collection composition, including information on rare foreign editions, are also dealt with. The data on the existing retrieval systems and the prospects for developing computerized problem-oriented reference bibliographic complexes are also considered.
ERIC Educational Resources Information Center
Choi, Youngok; Rasmussen, Edie
2009-01-01
As academic library functions and activities continue to evolve, libraries have broadened the traditional library model, which focuses on management of physical resources and activities, to include a digital library model, transforming resources and services into digital formats to support teaching, learning, and research. This transition has…
Mining for osteogenic surface topographies: In silico design to in vivo osseo-integration.
Hulshof, Frits F B; Papenburg, Bernke; Vasilevich, Aliaksei; Hulsman, Marc; Zhao, Yiping; Levers, Marloes; Fekete, Natalie; de Boer, Meint; Yuan, Huipin; Singh, Shantanu; Beijer, Nick; Bray, Mark-Anthony; Logan, David J; Reinders, Marcel; Carpenter, Anne E; van Blitterswijk, Clemens; Stamatialis, Dimitrios; de Boer, Jan
2017-08-01
Stem cells respond to the physicochemical parameters of the substrate on which they grow. Quantitative material activity relationships - the relationships between substrate parameters and the phenotypes they induce - have so far poorly predicted the success of bioactive implant surfaces. In this report, we screened a library of randomly selected designed surface topographies for those inducing osteogenic differentiation of bone marrow-derived mesenchymal stem cells. Cell shape features, surface design parameters, and osteogenic marker expression were strongly correlated in vitro. Furthermore, the surfaces with the highest osteogenic potential in vitro also demonstrated their osteogenic effect in vivo: these indeed strongly enhanced bone bonding in a rabbit femur model. Our work shows that by giving stem cells specific physicochemical parameters through designed surface topographies, differentiation of these cells can be dictated. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Blöcher, Johanna; Kuraz, Michal
2017-04-01
In this contribution we propose implementations of the dual permeability model with different inter-domain exchange descriptions, together with metaheuristic optimization algorithms for parameter identification and mesh optimization. We compare variants of the coupling term with different numbers of parameters to test whether a reduction of parameters is feasible; this can reduce parameter uncertainty in inverse modeling but also allows for different conceptual models of the domain and matrix coupling. The different variants of the dual permeability model are implemented in 1D and 2D in the open-source library DRUtES, written in Fortran 2003/2008. For parameter identification we use adaptations of particle swarm optimization (PSO) and teaching-learning-based optimization (TLBO), which are population-based metaheuristics with different learning strategies; these are high-level stochastic search algorithms that do not require gradient information or a convex search space. Despite increasing computing power and parallel processing, an overly fine mesh is not feasible for parameter identification, which creates the need for a mesh that optimizes both accuracy and simulation time; we use a bi-objective PSO algorithm to generate a Pareto front of meshes that accounts for both objectives. The dual permeability model and the optimization algorithms were tested on virtual data and on field TDR sensor readings. The TDR sensor readings showed a very steep increase during rapid rainfall events and a subsequent steep decrease, theorized to be an effect of artificial macroporous envelopes surrounding the TDR sensors, which create an anomalous region with distinct local soil hydraulic properties. One of our objectives is to test how well the dual permeability model can describe this infiltration behavior and which coupling term is most suitable.
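As a reference point for the metaheuristics mentioned, a bare-bones global-best particle swarm optimizer looks like the sketch below; the inertia and acceleration constants are conventional illustrative values, not the authors' settings.

    # Minimal global-best PSO minimizing a test objective.
    import numpy as np

    def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = bounds
        dim = lo.size
        x = rng.uniform(lo, hi, (n_particles, dim))
        v = np.zeros_like(x)
        pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
        gbest = pbest[pbest_f.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            fx = np.array([f(p) for p in x])
            improved = fx < pbest_f
            pbest[improved], pbest_f[improved] = x[improved], fx[improved]
            gbest = pbest[pbest_f.argmin()].copy()
        return gbest, pbest_f.min()

    sphere = lambda p: float(np.sum((p - 1.5) ** 2))   # toy objective
    best, best_f = pso(sphere, (np.full(4, -5.0), np.full(4, 5.0)))
    print(best, best_f)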
Discrete RNA libraries from pseudo-torsional space
Humphris-Narayanan, Elisabeth
2012-01-01
The discovery that RNA molecules can fold into complex structures and carry out diverse cellular roles has led to interest in developing tools for modeling RNA tertiary structure. While significant progress has been made in establishing that the RNA backbone is rotameric, few libraries of discrete conformations specifically for use in RNA modeling have been validated. Here, we present six libraries of discrete RNA conformations based on a simplified pseudo-torsional notation of the RNA backbone, comparable to phi and psi in the protein backbone. We evaluate the ability of each library to represent single nucleotide backbone conformations and we show how individual library fragments can be assembled into dinucleotides that are consistent with established RNA backbone descriptors spanning from sugar to sugar. We then use each library to build all-atom models of 20 test folds and we show how the composition of a fragment library can limit model quality. Despite the limitations inherent in using discretized libraries, we find that several hundred discrete fragments can rebuild RNA folds up to 174 nucleotides in length with atomic-level accuracy (<1.5Å RMSD). We anticipate the libraries presented here could easily be incorporated into RNA structural modeling, analysis, or refinement tools. PMID:22425640
Sand, Andreas; Kristiansen, Martin; Pedersen, Christian N S; Mailund, Thomas
2013-11-22
Hidden Markov models are widely used for genome analysis as they combine ease of modelling with efficient analysis algorithms. Calculating the likelihood of a model using the forward algorithm has worst case time complexity linear in the length of the sequence and quadratic in the number of states in the model. For genome analysis, however, the length runs to millions or billions of observations, and when maximising the likelihood hundreds of evaluations are often needed. A time efficient forward algorithm is therefore a key ingredient in an efficient hidden Markov model library. We have built a software library for efficiently computing the likelihood of a hidden Markov model. The library exploits commonly occurring substrings in the input to reuse computations in the forward algorithm. In a pre-processing step our library identifies common substrings and builds a structure over the computations in the forward algorithm which can be reused. This analysis can be saved between uses of the library and is independent of concrete hidden Markov models so one preprocessing can be used to run a number of different models. Using this library, we achieve up to 78 times shorter wall-clock time for realistic whole-genome analyses with a real and reasonably complex hidden Markov model. In one particular case the analysis was performed in less than 8 minutes compared to 9.6 hours for the previously fastest library. We have implemented the preprocessing procedure and forward algorithm as a C++ library, zipHMM, with Python bindings for use in scripts. The library is available at http://birc.au.dk/software/ziphmm/.
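For reference, the standard forward algorithm that zipHMM accelerates can be written in plain numpy with per-step scaling (O(T·K²) time); the tiny two-state model at the end is a made-up example.

    # Scaled forward algorithm for a discrete-emission HMM.
    import numpy as np

    def forward_loglik(pi, A, B, obs):
        """pi: (K,) initial; A: (K,K) transitions; B: (K,M) emissions."""
        alpha = pi * B[:, obs[0]]
        loglik = np.log(alpha.sum())
        alpha /= alpha.sum()                 # rescale to avoid underflow
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            s = alpha.sum()
            loglik += np.log(s)
            alpha /= s
        return loglik

    pi = np.array([0.6, 0.4])
    A = np.array([[0.9, 0.1], [0.2, 0.8]])
    B = np.array([[0.7, 0.3], [0.1, 0.9]])
    obs = np.array([0, 1, 1, 0, 1])
    print(forward_loglik(pi, A, B, obs))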
NASA Astrophysics Data System (ADS)
Steinberg, P. D.; Bednar, J. A.; Rudiger, P.; Stevens, J. L. R.; Ball, C. E.; Christensen, S. D.; Pothina, D.
2017-12-01
The rich variety of software libraries available in the Python scientific ecosystem provides a flexible and powerful alternative to traditional integrated GIS (geographic information system) programs. Each such library focuses on doing a certain set of general-purpose tasks well, and Python makes it relatively simple to glue the libraries together to solve a wide range of complex, open-ended problems in Earth science. However, choosing an appropriate set of libraries can be challenging, and it is difficult to predict how much "glue code" will be needed for any particular combination of libraries and tasks. Here we present a set of libraries that have been designed to work well together to build interactive analyses and visualizations of large geographic datasets, in standard web browsers. The resulting workflows run on ordinary laptops even for billions of data points, and easily scale up to larger compute clusters when available. The declarative top-level interface used in these libraries means that even complex, fully interactive applications can be built and deployed as web services using only a few dozen lines of code, making it simple to create and share custom interactive applications even for datasets too large for most traditional GIS systems. The libraries we will cover include GeoViews (HoloViews extended for geographic applications) for declaring visualizable/plottable objects, Bokeh for building visual web applications from GeoViews objects, Datashader for rendering arbitrarily large datasets faithfully as fixed-size images, Param for specifying user-modifiable parameters that model your domain, Xarray for computing with n-dimensional array data, Dask for flexibly dispatching computational tasks across processors, and Numba for compiling array-based Python code down to fast machine code. We will show how to use the resulting workflow with static datasets and with simulators such as GSSHA or AdH, allowing you to deploy flexible, high-performance web-based dashboards for your GIS data or simulations without needing major investments in code development or maintenance.
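Assuming the stack described above is installed, rendering a very large point dataset to a fixed-size image with Datashader takes only a few lines; the column names, point counts, and output filename are invented.

    # Render ten million points as a fixed-size image with Datashader.
    import numpy as np
    import pandas as pd
    import datashader as ds
    import datashader.transfer_functions as tf

    rng = np.random.default_rng(7)
    df = pd.DataFrame({"lon": rng.normal(-95, 10, 10_000_000),
                       "lat": rng.normal(38, 5, 10_000_000)})

    canvas = ds.Canvas(plot_width=800, plot_height=400)
    agg = canvas.points(df, "lon", "lat", ds.count())  # faithful aggregation
    img = tf.shade(agg, how="log")                     # colorize densities
    img.to_pil().save("points.png")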
NASA Astrophysics Data System (ADS)
Plessis, S.; McDougall, D.; Mandt, K.; Greathouse, T.; Luspay-Kuti, A.
2015-11-01
Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere, to provide a better understanding of the impact of this parameter's uncertainty on models. Because temperature and pressure conditions are much lower than the laboratory conditions under which the bimolecular diffusion parameters were measured, we apply a problem-agnostic Bayesian framework to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library, which also performs a propagation of uncertainties in the calibrated parameters to the temperature and pressure conditions observed in Titan's upper atmosphere. Our results show that, after propagating uncertainty through the Massman model, the uncertainty in molecular diffusion is highly correlated with temperature, and we observe no noticeable correlation with pressure. We propagate the calibrated molecular diffusion estimate and its associated uncertainty to obtain an estimate, with uncertainty due to bimolecular diffusion, of the methane molar fraction as a function of altitude. Results show that the uncertainty in methane abundance due to molecular diffusion is in general small compared to eddy diffusion and the chemical kinetics description. However, methane abundance is most sensitive to uncertainty in molecular diffusion above 1200 km, where the errors are nontrivial and could have important implications for scientific research based on diffusion models in this altitude range.
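QUESO is a C++ library; as a stand-in, the same calibrate-then-propagate pattern can be sketched with the emcee sampler in Python. The power-law diffusion law and all data below are invented placeholders, not Massman's actual parameterization or the paper's measurements.

    # Bayesian calibration of a toy diffusion law D(T) = D0*(T/300 K)**s,
    # then propagation of posterior samples to a colder temperature.
    import numpy as np
    import emcee

    rng = np.random.default_rng(8)
    T_lab = np.linspace(250, 350, 12)                  # lab temperatures (K)
    D_obs = 1e-5 * (T_lab / 300) ** 1.75 * rng.lognormal(0, 0.05, T_lab.size)

    def log_prob(theta):
        logD0, s = theta
        if not (-13 < logD0 < -9) or not (0 < s < 3):
            return -np.inf                             # flat priors
        model = np.exp(logD0) * (T_lab / 300) ** s
        return -0.5 * np.sum((np.log(D_obs) - np.log(model)) ** 2 / 0.05**2)

    ndim, nwalkers = 2, 16
    p0 = np.column_stack([rng.normal(-11.5, 0.1, nwalkers),
                          rng.normal(1.7, 0.1, nwalkers)])
    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
    sampler.run_mcmc(p0, 2000, progress=False)
    chain = sampler.get_chain(discard=500, flat=True)

    # Propagate the posterior to a Titan-like temperature (e.g. 160 K).
    D_160 = np.exp(chain[:, 0]) * (160 / 300) ** chain[:, 1]
    print(D_160.mean(), D_160.std())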
Some Reflections on Strategic Planning in Public Libraries.
ERIC Educational Resources Information Center
Palmour, Vernon E.
1985-01-01
Presents the Public Library Association's planning model for strategic planning in public libraries. The development of the model is explained, the basic steps of the planning process are described, and improvements to the model are suggested. (CLB)
Comparison of chiller models for use in model-based fault detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sreedharan, Priya; Haves, Philip
Selecting the model is an important and essential step in model-based fault detection and diagnosis (FDD). Factors that are considered in evaluating a model include accuracy, training data requirements, calibration effort, generality, and computational requirements. The objective of this study was to evaluate different modeling approaches for their applicability to model-based FDD of vapor compression chillers. Three different models were studied: the Gordon and Ng Universal Chiller model (2nd generation) and a modified version of the ASHRAE Primary Toolkit model, which are both based on first principles, and the DOE-2 chiller model, as implemented in CoolTools™, which is empirical. The models were compared in terms of their ability to reproduce the observed performance of an older centrifugal chiller operating in a commercial office building and a newer centrifugal chiller in a laboratory. All three models displayed similar levels of accuracy. Of the first-principles models, the Gordon-Ng model has the advantage of being linear in the parameters, which allows more robust parameter estimation methods to be used and facilitates estimation of the uncertainty in the parameter values. The ASHRAE Toolkit model may have advantages when refrigerant temperature measurements are also available. The DOE-2 model can be expected to have advantages when very limited data are available to calibrate the model, as long as one of the previously identified models in the CoolTools library matches the performance of the chiller in question.
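The practical benefit of a model that is linear in its parameters is that calibration reduces to linear least squares with a closed-form parameter covariance; schematically (the regressors below are placeholders, not the actual Gordon-Ng terms):

    # Ordinary least squares for a model linear in its parameters:
    # y = a1*x1 + a2*x2 + a3*x3, with closed-form parameter uncertainty.
    import numpy as np

    rng = np.random.default_rng(9)
    n = 150
    X = rng.uniform(0.5, 2.0, (n, 3))            # placeholder regressors
    a_true = np.array([0.8, -0.3, 1.2])
    y = X @ a_true + rng.normal(0, 0.05, n)

    a_hat, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = res[0] / (n - 3)                    # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)        # parameter covariance
    print(a_hat, np.sqrt(np.diag(cov)))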
NASA Astrophysics Data System (ADS)
De Geyter, G.; Baes, M.; Fritz, J.; Camps, P.
2013-02-01
We present FitSKIRT, a method to efficiently fit radiative transfer models to UV/optical images of dusty galaxies. These images have the advantage of better spatial resolution compared to FIR/submm data. FitSKIRT uses the GAlib genetic algorithm library to optimize the output of the SKIRT Monte Carlo radiative transfer code. Genetic algorithms prove to be a valuable tool in handling the multi-dimensional search space as well as the noise induced by the random nature of the Monte Carlo radiative transfer code. FitSKIRT is tested on artificial images of a simulated edge-on spiral galaxy, where we gradually increase the number of fitted parameters. We find that we can recover all model parameters, even if all 11 model parameters are left unconstrained. Finally, we apply the FitSKIRT code to a V-band image of the edge-on spiral galaxy NGC 4013. This galaxy has been modeled previously by other authors using different combinations of radiative transfer codes and optimization methods. Given the different models and techniques and the complexity and degeneracies in the parameter space, we find reasonable agreement between the different models. We conclude that the FitSKIRT method allows comparison between different models and geometries in a quantitative manner and minimizes the need for human intervention and biasing. The high level of automation makes it an ideal tool to use on larger sets of observed data.
Multi-registration of software library resources
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN
2011-04-05
Data communications, including issuing, by an application program to a high level data communications library, a request for initialization of a data communications service; issuing to a low level data communications library a request for registration of data communications functions; registering the data communications functions, including instantiating a factory object for each of the one or more data communications functions; issuing by the application program an instruction to execute a designated data communications function; issuing, to the low level data communications library, an instruction to execute the designated data communications function, including passing to the low level data communications library a call parameter that identifies a factory object; creating with the identified factory object the data communications object that implements the data communications function according to the protocol; and executing by the low level data communications library the designated data communications function.
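The registration-and-factory pattern this patent abstract describes can be illustrated generically; all names below are hypothetical, and this sketch is not the patented implementation.

    # Generic sketch: an application registers data communications
    # functions with a low-level library via factory objects, then
    # executes one by passing the factory identifier as a call parameter.
    from typing import Callable, Dict

    class Factory:
        def __init__(self, make: Callable[[], Callable[[bytes], bytes]]):
            self.make = make                  # instantiates the comms object

    registry: Dict[str, Factory] = {}

    def register(name: str, make) -> None:
        registry[name] = Factory(make)        # one factory per function

    def execute(name: str, payload: bytes) -> bytes:
        comms_object = registry[name].make()  # factory builds the object
        return comms_object(payload)          # run the designated function

    register("broadcast", lambda: (lambda p: p * 2))   # toy protocol
    print(execute("broadcast", b"hi"))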
ERIC Educational Resources Information Center
Coyle, William J.
1989-01-01
Discusses the current widespread acceptance of the public library model for prison libraries, in which preferences of the inmates are the chief consideration in programing and collection development. It is argued that this model results in recreational programs and collections that fail to fulfill the prison library's role in education and…
ERIC Educational Resources Information Center
Sullivan, Todd
Using an IBM System/360 Model 50 computer, the New York Statewide Film Library Network schedules film use, reports on materials handling and statistics, and provides for interlibrary loan of films. Communications between the film libraries and the computer are maintained by Teletype model 33 ASR Teletypewriter terminals operating on TWX…
Anxiety-Expectation Mediation Model of Library Anxiety.
ERIC Educational Resources Information Center
Jiao, Qun G.; Onwuegbuzie, Anthony J.
This study presents a test of the Anxiety-Expectation Mediation (AEM) model of library anxiety. The AEM model contains variables that are directly or indirectly related to information search performance, as measured by students' scores on their research proposals. This model posits that library anxiety and self-perception serve as factors that…
Adaptive control based on an on-line parameter estimation of an upper limb exoskeleton.
Riani, Akram; Madani, Tarek; Hadri, Abdelhafid El; Benallegue, Abdelaziz
2017-07-01
This paper presents an adaptive control strategy for an upper-limb exoskeleton based on an on-line dynamic parameter estimator. The objective is to improve the control performance of this system, which plays a critical role in assisting patients with shoulder, elbow, and wrist joint movements. In general, the dynamic parameters of the human limb are unknown and differ from one person to another, which degrades the performance of the exoskeleton-human control system. For this reason, the proposed control scheme contains a supplementary loop based on a new, efficient on-line estimator of the dynamic parameters; the estimator acts upon the parameter adaptation of the controller to ensure the performance of the system in the presence of parameter uncertainties and perturbations. The exoskeleton used in this work is presented, and a physical model of the exoskeleton interacting with a 7 Degree of Freedom (DoF) upper limb model is generated using the SimMechanics library of MatLab/Simulink. To illustrate the effectiveness of the proposed approach, an example of passive rehabilitation movements is performed using multi-body dynamic simulation. The aim is to maneuver the exoskeleton so that it drives the upper limb to track desired trajectories during passive arm movements.
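On-line parameter estimation of this kind is often realized with recursive least squares; the generic sketch below (not the authors' estimator) updates estimates sample by sample for a model linear in the unknown dynamic parameters, with invented regressors and noise.

    # Recursive least squares: update parameter estimates sample by
    # sample for a regression y_t = phi_t . theta + noise.
    import numpy as np

    rng = np.random.default_rng(10)
    theta_true = np.array([2.0, -0.5, 0.8])      # unknown dynamic parameters
    theta = np.zeros(3)                          # running estimate
    P = np.eye(3) * 1e3                          # estimate covariance
    lam = 0.99                                   # forgetting factor

    for _ in range(500):
        phi = rng.normal(0, 1, 3)                # regressor (e.g. joint states)
        y = phi @ theta_true + rng.normal(0, 0.05)
        K = P @ phi / (lam + phi @ P @ phi)      # gain
        theta = theta + K * (y - phi @ theta)    # correct with new sample
        P = (P - np.outer(K, phi @ P)) / lam     # covariance update
    print(theta)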
Neutrino oscillation parameter sampling with MonteCUBES
NASA Astrophysics Data System (ADS)
Blennow, Mattias; Fernandez-Martinez, Enrique
2010-01-01
We present MonteCUBES ("Monte Carlo Utility Based Experiment Simulator"), a software package designed to sample the neutrino oscillation parameter space through Markov Chain Monte Carlo algorithms. MonteCUBES makes use of the GLoBES software so that the existing experiment definitions for GLoBES, describing long baseline and reactor experiments, can be used with MonteCUBES. MonteCUBES consists of two main parts: The first is a C library, written as a plug-in for GLoBES, implementing the Markov Chain Monte Carlo algorithm to sample the parameter space. The second part is a user-friendly graphical Matlab interface to easily read, analyze, plot and export the results of the parameter space sampling. Program summary. Program title: MonteCUBES (Monte Carlo Utility Based Experiment Simulator). Catalogue identifier: AEFJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public Licence. No. of lines in distributed program, including test data, etc.: 69 634. No. of bytes in distributed program, including test data, etc.: 3 980 776. Distribution format: tar.gz. Programming language: C. Computer: MonteCUBES builds and installs on 32 bit and 64 bit Linux systems where GLoBES is installed. Operating system: 32 bit and 64 bit Linux. RAM: Typically a few MBs. Classification: 11.1. External routines: GLoBES [1,2] and routines/libraries used by GLoBES. Subprograms used: Cat Id ADZI_v1_0, Title GLoBES, Reference CPC 177 (2007) 439. Nature of problem: Since neutrino masses do not appear in the standard model of particle physics, many models of neutrino masses also induce other types of new physics, which could affect the outcome of neutrino oscillation experiments. In general, these new physics imply high-dimensional parameter spaces that are difficult to explore using classical methods such as multi-dimensional projections and minimizations, such as those used in GLoBES [1,2]. Solution method: MonteCUBES is written as a plug-in to the GLoBES software [1,2] and provides the necessary methods to perform Markov Chain Monte Carlo sampling of the parameter space. This allows an efficient sampling of the parameter space and has a complexity which does not grow exponentially with the parameter space dimension. The integration of the MonteCUBES package with the GLoBES software makes sure that the experimental definitions already in use by the community can also be used with MonteCUBES, while also lowering the learning threshold for users who already know GLoBES. Additional comments: A Matlab GUI for interpretation of results is included in the distribution. Running time: The typical running time varies depending on the dimensionality of the parameter space, the complexity of the experiment, and how well the parameter space should be sampled. The running time for our simulations [3] with 15 free parameters at a Neutrino Factory with O(10) samples varied from a few hours to tens of hours. References: [1] P. Huber, M. Lindner, W. Winter, Comput. Phys. Comm. 167 (2005) 195, hep-ph/0407333. [2] P. Huber, J. Kopp, M. Lindner, M. Rolinec, W. Winter, Comput. Phys. Comm. 177 (2007) 432, hep-ph/0701187. [3] S. Antusch, M. Blennow, E. Fernandez-Martinez, J. Lopez-Pavon, arXiv:0903.3986 [hep-ph].
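The Markov Chain Monte Carlo sampling that MonteCUBES implements can be illustrated with a minimal random-walk Metropolis sampler over a toy two-parameter posterior; the target density below is a placeholder Gaussian, not GLoBES physics, and all numbers are invented.

    # Random-walk Metropolis sampling of a toy 2-parameter posterior.
    import numpy as np

    def log_post(theta):
        # Placeholder correlated Gaussian standing in for an
        # oscillation-parameter likelihood.
        d = theta - np.array([0.31, 2.5e-3])
        cov = np.array([[0.01**2, 0.0], [0.0, (1e-4)**2]])
        return -0.5 * d @ np.linalg.solve(cov, d)

    rng = np.random.default_rng(11)
    theta = np.array([0.3, 2.4e-3])
    step = np.array([0.005, 5e-5])               # proposal scale per parameter
    chain = []
    for _ in range(20000):
        prop = theta + step * rng.normal(size=2)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop                         # accept the proposal
        chain.append(theta)
    chain = np.array(chain)
    print(chain.mean(axis=0), chain.std(axis=0))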
Interface for the documentation and compilation of a library of computer models in physiology.
Summers, R. L.; Montani, J. P.
1994-01-01
A software interface for the documentation and compilation of a library of computer models in physiology was developed. The interface is an interactive program built within a word processing template in order to provide ease and flexibility of documentation. A model editor within the interface directs the model builder as to standardized requirements for incorporating models into the library and provides the user with an index to the levels of documentation. The interface and accompanying library are intended to facilitate model development, preservation and distribution and will be available for public use. PMID:7950046
Muñoz, Enrique
2015-01-01
We compare the results obtained from searching a smaller library thoroughly versus searching a more diverse, larger library sparsely. We study protein evolution with reduced amino acid alphabets, by simulating directed evolution experiments at three different alphabet sizes: 20, 5 and 2. We employ a physical model for evolution, the generalized NK model, which has proved successful in modeling protein evolution, antibody evolution, and T cell selection. We find that antibodies with higher affinity are found by searching a library with a larger alphabet sparsely rather than by searching a smaller library thoroughly, even with well-designed reduced libraries. We find ranked amino acid usage frequencies in agreement with observations of the CDR-H3 variable region of human antibodies. PMID:18375453
Low temperature Grüneisen parameter of cubic ionic crystals
NASA Astrophysics Data System (ADS)
Batana, Alicia; Monard, María C.; Rosario Soriano, María
1987-02-01
Title of program: CAROLINA
Catalogue number: AATG
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland (see application form in this issue)
Computer: IBM/370, Model 158
Installation: Centro de Tecnología y Ciencia de Sistemas, Universidad de Buenos Aires
Operating system: VM/370
Programming language used: FORTRAN
High speed storage required: 3 kwords
No. of bits in a word: 32
Peripherals used: disk IBM 3340/70 MB
No. of lines in combined program and test deck: 447
Collection Development Policy: Academic Library, St. Mary's University. Revised.
ERIC Educational Resources Information Center
Sylvia, Margaret
This guide spells out the collection development policy of the library of St. Mary's University in San Antonio, Texas. The guide is divided into the following five topic areas: (1) introduction to the community served, parameters of the collection, cooperation in collection development, and priorities of the collection; (2) considerations in…
Smoothing Forecasting Methods for Academic Library Circulations: An Evaluation and Recommendation.
ERIC Educational Resources Information Center
Brooks, Terrence A.; Forys, John W., Jr.
1986-01-01
Circulation time-series data from 50 Midwest academic libraries were used to test 110 variants of 8 smoothing forecasting methods. Data, methodologies, and illustrations of two recommended methods, the single exponential smoothing method and Brown's one-parameter linear exponential smoothing method, are given. Eight references are cited. (EJS)
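For readers unfamiliar with the two recommended methods, here is a brief Python sketch of both; the circulation figures are hypothetical.

```python
import numpy as np

def single_exponential(x, alpha):
    """Single exponential smoothing; the one-step-ahead forecast is the last smoothed value."""
    s = x[0]
    for v in x[1:]:
        s = alpha * v + (1 - alpha) * s
    return s

def brown_linear(x, alpha, m=1):
    """Brown's one-parameter linear (double) exponential smoothing, m-step-ahead forecast."""
    s1 = s2 = x[0]
    for v in x[1:]:
        s1 = alpha * v + (1 - alpha) * s1      # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2     # smoothing of the smoothed series
    a = 2 * s1 - s2                            # level estimate
    b = alpha / (1 - alpha) * (s1 - s2)        # trend estimate
    return a + b * m

circ = np.array([1200, 1260, 1310, 1295, 1380, 1420.0])  # hypothetical monthly circulations
print(single_exponential(circ, 0.3), brown_linear(circ, 0.3))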
Fast computation of close-coupling exchange integrals using polynomials in a tree representation
NASA Astrophysics Data System (ADS)
Wallerberger, Markus; Igenbergs, Katharina; Schweinzer, Josef; Aumayr, Friedrich
2011-03-01
The semi-classical atomic-orbital close-coupling method is a well-known approach for the calculation of cross sections in ion-atom collisions. It strongly relies on the fast and stable computation of exchange integrals. We present an upgrade to earlier implementations of the Fourier-transform method. For this purpose, we implement an extensive library for symbolic storage of polynomials, relying on sophisticated tree structures to allow fast manipulation and numerically stable evaluation. Using this library, we considerably speed up creation and computation of exchange integrals. This enables us to compute cross sections for more complex collision systems.
Program summary
Program title: TXINT
Catalogue identifier: AEHS_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHS_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 12 332
No. of bytes in distributed program, including test data, etc.: 157 086
Distribution format: tar.gz
Programming language: Fortran 95
Computer: All with a Fortran 95 compiler
Operating system: All with a Fortran 95 compiler
RAM: Depends heavily on input, usually less than 100 MiB
Classification: 16.10
Nature of problem: Analytical calculation of one- and two-center exchange matrix elements for the close-coupling method in the impact parameter model.
Solution method: Similar to the code of Hansen and Dubois [1], we use the Fourier-transform method suggested by Shakeshaft [2] to compute the integrals. However, we heavily speed up the calculation using a library for symbolic manipulation of polynomials.
Restrictions: We restrict ourselves to a defined collision system in the impact parameter model.
Unusual features: A library for symbolic manipulation of polynomials, where polynomials are stored in a space-saving left-child right-sibling binary tree. This provides stable numerical evaluation and fast mutation while maintaining full compatibility with the original code.
Additional comments: This program makes heavy use of the new features provided by the Fortran 90 standard, most prominently pointers, derived types and allocatable structures, as well as a small portion of Fortran 95. Only newer compilers support these features. The following compilers support all features needed by the program: GNU Fortran Compiler "gfortran" from version 4.3.0; GNU Fortran 95 Compiler "g95" from version 4.2.0; Intel Fortran Compiler "ifort" from version 11.0.
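The left-child right-sibling storage idea can be sketched in a few lines of Python (the distributed TXINT code is Fortran 95; this toy evaluator illustrates only the data structure, not the program's routines).

```python
class Node:
    """LCRS node: one term  x_d^exp * (child polynomial in the remaining variables).
    `child` points to the first factor one level down; `sibling` to the next term."""
    __slots__ = ("exp", "coeff", "child", "sibling")
    def __init__(self, exp=0, coeff=1.0, child=None, sibling=None):
        self.exp, self.coeff, self.child, self.sibling = exp, coeff, child, sibling

def evaluate(node, vars, d=0):
    """Sum over siblings at this level; recurse into children for later variables."""
    total = 0.0
    while node is not None:
        factor = node.coeff if node.child is None else evaluate(node.child, vars, d + 1)
        total += factor * vars[d] ** node.exp
        node = node.sibling
    return total

# p(x, y) = 3*x^2*y + 2*y^2, stored as two sibling terms in x, each with a child in y.
p = Node(2, child=Node(1, 3.0), sibling=Node(0, child=Node(2, 2.0)))
print(evaluate(p, (2.0, 1.5)))   # 3*4*1.5 + 2*2.25 = 22.5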
FITspec: A New Algorithm for the Automated Fit of Synthetic Stellar Spectra for OB Stars
NASA Astrophysics Data System (ADS)
Fierro-Santillán, Celia R.; Zsargó, Janos; Klapp, Jaime; Díaz-Azuara, Santiago A.; Arrieta, Anabel; Arias, Lorena; Sigalotti, Leonardo Di G.
2018-06-01
In this paper we describe the FITspec code, a data mining tool for the automatic fitting of synthetic stellar spectra. The program uses a database of 27,000 CMFGEN models of stellar atmospheres arranged in a six-dimensional (6D) space, where each dimension corresponds to one model parameter. From these models a library of 2,835,000 synthetic spectra was generated covering the ultraviolet, optical, and infrared regions of the electromagnetic spectrum. Using FITspec we adjust the effective temperature and the surface gravity. From the 6D array we also obtain the luminosity, the metallicity, and three parameters for the stellar wind: the terminal velocity (v∞), the β exponent of the velocity law, and the clumping filling factor (F_cl). Finally, the projected rotational velocity (v sin i) can be obtained from the library of stellar spectra. Validation of the algorithm was performed by analyzing the spectra of a sample of eight O-type stars taken from the IACOB spectroscopic survey of Northern Galactic OB stars. The spectral lines used for the adjustment of the analyzed stars are reproduced with good accuracy. In particular, the effective temperatures calculated with FITspec are in good agreement with those derived from spectral type and other calibrations for the same stars. The stellar luminosities and projected rotational velocities are also in good agreement with previous quantitative spectroscopic analyses in the literature. An important advantage of FITspec over traditional codes is that the time required for spectral analyses is reduced from months to a few hours.
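At its core, fitting against a precomputed grid is a nearest-model search under a misfit measure. The Python sketch below performs a plain chi-square search over a random stand-in library; FITspec's actual adjustment strategy is more elaborate than this.

```python
import numpy as np

def best_grid_model(obs_flux, library_fluxes, params):
    """Pick the library spectrum minimizing chi^2 against the observation.
    library_fluxes: (N_models, N_wavelengths); params: (N_models, 6) grid coordinates."""
    chi2 = np.sum((library_fluxes - obs_flux) ** 2, axis=1)
    i = np.argmin(chi2)
    return params[i], chi2[i]

rng = np.random.default_rng(1)
lib = rng.random((1000, 200))          # stand-in for a synthetic-spectra library
pars = rng.random((1000, 6))           # Teff, log g, L, Z, v_inf, beta (scaled, hypothetical)
obs = lib[123] + 0.01 * rng.standard_normal(200)
print(best_grid_model(obs, lib, pars)[0])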
NASA Astrophysics Data System (ADS)
Marrero, Carlos Sosa; Aubert, Vivien; Ciferri, Nicolas; Hernández, Alfredo; de Crevoisier, Renaud; Acosta, Oscar
2017-11-01
Understanding the response to irradiation in cancer radiotherapy (RT) may help devise new strategies with improved tumor local control. Computational models may help unravel the underlying radiosensitivity mechanisms that shape the dose-response relationship. Extensive simulations allow a wide range of parameters to be evaluated, providing insights into tumor response and generating useful data for planning modified treatments. We propose in this paper a computational model of tumor growth and radiation response that allows a whole RT protocol to be simulated. Proliferation of tumor cells, the cell life-cycle, oxygen diffusion, radiosensitivity, RT response and resorption of killed cells were implemented in a multiscale framework. The model was developed in C++, using the Multi-formalism Modeling and Simulation Library (M2SL). Radiosensitivity parameters extracted from the literature enabled us to simulate prostate tissue on a regular (voxel-wise) grid. Histopathological specimens with different aggressiveness levels, extracted from patients after prostatectomy, were used to initialize the in silico simulations. Results on tumor growth exhibit good agreement with data from in vitro studies. Moreover, a standard fractionation scheme (2 Gy/fraction to a total dose of 80 Gy, as in a real RT treatment) was applied with varying radiosensitivity and oxygen diffusion parameters. As expected, the high influence of these parameters was observed by measuring the percentage of surviving tumor cells after RT. This work paves the way for models that simulate increased doses in modified hypofractionated schemes and for the development of new patient-specific combined therapies.
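The radiation-response ingredient of such models is commonly the linear-quadratic (LQ) cell survival law. A minimal sketch follows, with purely illustrative parameter values that are not taken from the paper:

```python
import numpy as np

def surviving_fraction(alpha, beta, dose_per_fraction, n_fractions):
    """Linear-quadratic (LQ) model: S = exp(-n * (alpha*d + beta*d^2)).
    Standard radiobiology formula; the parameter values below are illustrative only."""
    d = dose_per_fraction
    return np.exp(-n_fractions * (alpha * d + beta * d ** 2))

# 80 Gy delivered in 2 Gy fractions, with hypothetical prostate-like alpha and beta
print(surviving_fraction(alpha=0.15, beta=0.05, dose_per_fraction=2.0, n_fractions=40))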
Structured grid technology to enable flow simulation in an integrated system environment
NASA Astrophysics Data System (ADS)
Remotigue, Michael Gerard
An application-driven Computational Fluid Dynamics (CFD) environment needs flexible and general tools to effectively solve complex problems in a timely manner. In addition, reusable, portable, and maintainable specialized libraries will aid in rapidly developing integrated systems or procedures. The presented structured grid technology enables the flow simulation for complex geometries by addressing grid generation, grid decomposition/solver setup, solution, and interpretation. Grid generation is accomplished with the graphical, arbitrarily-connected, multi-block structured grid generation software system (GUM-B) developed and presented here. GUM-B is an integrated system comprised of specialized libraries for the graphical user interface and graphical display coupled with a solid-modeling data structure that utilizes a structured grid generation library and a geometric library based on Non-Uniform Rational B-Splines (NURBS). A presented modification of the solid-modeling data structure provides the capability for arbitrarily-connected regions between the grid blocks. The presented grid generation library provides algorithms that are reliable and accurate. GUM-B has been utilized to generate numerous structured grids for complex geometries in hydrodynamics, propulsors, and aerodynamics. The versatility of the libraries that compose GUM-B is also displayed in a prototype to automatically regenerate a grid for a free-surface solution. Grid decomposition and solver setup is accomplished with the graphical grid manipulation and repartition software system (GUMBO) developed and presented here. GUMBO is an integrated system comprised of specialized libraries for the graphical user interface and graphical display coupled with a structured grid-tools library. The described functions within the grid-tools library reduce the possibility of human error during decomposition and setup for the numerical solver by accounting for boundary conditions and connectivity. GUMBO is linked with a flow solver interface, to the parallel UNCLE code, to provide load balancing tools and solver setup. Weeks of boundary condition and connectivity specification and validation have been reduced to hours. The UNCLE flow solver is utilized for the solution of the flow field. To accelerate convergence toward a quick engineering answer, a full multigrid (FMG) approach coupled with UNCLE, which is a full approximation scheme (FAS), is presented. The prolongation operators used in the FMG-FAS method are compared. The procedure is demonstrated on a marine propeller in incompressible flow. Interpretation of the solution is accomplished by vortex feature detection. Regions of "Intrinsic Swirl" are located by interrogating the velocity gradient tensor for complex eigenvalues. The "Intrinsic Swirl" parameter is visualized on a solution of a marine propeller to determine if any vortical features are captured. The libraries and the structured grid technology presented herein are flexible and general enough to tackle a variety of complex applications. This technology has significantly enabled the capability of the ERC personnel to effectively calculate solutions for complex geometries.
TAP 1: A Finite Element Program for Steady-State Thermal Analysis of Convectively Cooled Structures
NASA Technical Reports Server (NTRS)
Thornton, E. A.
1976-01-01
The program has a finite element library of six elements: two conduction/convection elements to model heat transfer in a solid, two convection elements to model heat transfer in a fluid, and two integrated conduction/convection elements to represent combined heat transfer in tubular and plate/fin fluid passages. Nonlinear thermal analysis due to temperature-dependent thermal parameters is performed using the Newton-Raphson iteration method. Program output includes nodal temperatures and element heat fluxes. Pressure drops in fluid passages may be computed as an option. A companion plotting program for displaying the finite element model and predicted temperature distributions is presented. User instructions and sample problems are presented in appendixes.
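The Newton-Raphson treatment of temperature-dependent parameters can be illustrated on a 1D steady conduction problem. The finite-difference residual and the k(T) law below are assumptions made for the sketch; TAP 1 itself uses finite elements.

```python
import numpy as np

def residual(T, k_of_T, q, dx, T_left, T_right):
    """Steady 1D conduction residual with temperature-dependent conductivity k(T)."""
    Tfull = np.concatenate(([T_left], T, [T_right]))
    k = k_of_T(0.5 * (Tfull[:-1] + Tfull[1:]))       # face conductivities
    flux = k * np.diff(Tfull) / dx
    return np.diff(flux) + q * dx                    # energy balance per interior node

def newton_solve(T0, *args, tol=1e-10, max_iter=50):
    """Newton-Raphson with a finite-difference Jacobian (illustrative, not TAP 1)."""
    T = T0.copy()
    for _ in range(max_iter):
        r = residual(T, *args)
        if np.max(np.abs(r)) < tol:
            break
        J = np.empty((T.size, T.size))
        for j in range(T.size):                      # numerical Jacobian, column by column
            Tp = T.copy()
            Tp[j] += 1e-6
            J[:, j] = (residual(Tp, *args) - r) / 1e-6
        T -= np.linalg.solve(J, r)
    return T

k_of_T = lambda T: 1.0 + 0.01 * T                    # hypothetical k(T) law
print(newton_solve(np.full(9, 300.0), k_of_T, 1e3, 0.01, 300.0, 400.0))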
ADVANCED WAVEFORM SIMULATION FOR SEISMIC MONITORING EVENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helmberger, Donald V.; Tromp, Jeroen; Rodgers, Arthur J.
Earthquake source parameters underpin several aspects of nuclear explosion monitoring. Such aspects are: calibration of moment magnitudes (including coda magnitudes) and magnitude and distance amplitude corrections (MDAC); source depths; discrimination by isotropic moment tensor components; and waveform modeling for structure (including waveform tomography). This project seeks to improve methods for and broaden the applicability of estimating source parameters from broadband waveforms using the Cut-and-Paste (CAP) methodology. The CAP method uses a library of Green's functions for a one-dimensional (1D, depth-varying) seismic velocity model. The method separates the main arrivals of the regional waveform into 5 windows: Pnl (vertical and radial components), Rayleigh (vertical and radial components) and Love (transverse component). Source parameters are estimated by grid search over strike, dip, rake and depth, while the seismic moment, or equivalently the moment magnitude MW, is adjusted to fit the amplitudes. Key to the CAP method is allowing the synthetic seismograms to shift in time relative to the data in order to account for path-propagation errors (delays) in the 1D seismic velocity model used to compute the Green's functions. The CAP method has been shown to improve estimates of source parameters, especially when delay and amplitude biases are calibrated using high signal-to-noise data from moderate earthquakes, CAP+.
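The essence of CAP, a brute-force search over source parameters with per-window time shifts absorbing 1D-model path errors, can be sketched as follows. Here synth_for is a hypothetical user-supplied forward model, and the sine-based toy stands in for real Green's-function synthetics.

```python
import numpy as np

def best_shift_misfit(data, synth, max_shift):
    """Slide the synthetic against the data (CAP-style time shift) and return the
    minimum L2 misfit over allowed lags; purely illustrative of the windowed fit."""
    return min(np.sum((data - np.roll(synth, lag)) ** 2)
               for lag in range(-max_shift, max_shift + 1))

def grid_search(data_windows, synth_for, strikes, dips, rakes, depths, max_shift=5):
    """Brute-force search over (strike, dip, rake, depth)."""
    best = (np.inf, None)
    for params in ((s, d, r, h) for s in strikes for d in dips
                   for r in rakes for h in depths):
        misfit = sum(best_shift_misfit(w, s, max_shift)
                     for w, s in zip(data_windows, synth_for(params)))
        if misfit < best[0]:
            best = (misfit, params)
    return best

# Toy usage with a sine-based stand-in forward model (one waveform window):
t = np.linspace(0, 1, 100)
synth_for = lambda p: [np.sin(2 * np.pi * (2 + 0.01 * p[3]) * t + np.deg2rad(p[0]))]
data = [synth_for((40, 60, 90, 10))[0]]
print(grid_search(data, synth_for, range(0, 90, 10), [60], [90], [5, 10, 15]))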
Building and Sustaining Digital Collections: Models for Libraries and Museums.
ERIC Educational Resources Information Center
Council on Library and Information Resources, Washington, DC.
In February 2001, the Council on Library and Information Resources (CLIR) and the National Initiative for a Networked Cultural Heritage (NINCH) convened a meeting to discuss how museums and libraries are building digital collections and what business models are available to sustain them. A group of museum and library senior executives met with…
ERIC Educational Resources Information Center
Martins, Jorge Tiago; Martins, Rosa Maria
2012-01-01
This paper reports the implementation results of the Portuguese School Libraries Evaluation Model, more specifically the results of primary schools self-evaluation of their libraries' reading promotion and information literacy development activities. School libraries that rated their performance as either "Excellent" or "Poor"…
FISPACT-II: An Advanced Simulation System for Activation, Transmutation and Material Modelling
NASA Astrophysics Data System (ADS)
Sublet, J.-Ch.; Eastwood, J. W.; Morgan, J. G.; Gilbert, M. R.; Fleming, M.; Arter, W.
2017-01-01
Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time dependent inventory and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. The Fispact-II system also includes several utility programs for code-use optimisation, visualisation and production of secondary radiological quantities. Included in the paper are summaries of results from the suite of verification and validation reports available with the code.
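At its simplest, the inventory problem such codes solve is a linear system of rate equations dN/dt = MN. Below is a toy three-nuclide chain solved with a matrix exponential; the rates are invented for illustration, and Fispact-II's numerics and data are of course far richer.

```python
import numpy as np
from scipy.linalg import expm

# Minimal decay/transmutation chain  A -> B -> C  (rates in s^-1, illustrative).
# The inventory evolution dN/dt = M N is solved here with a matrix exponential,
# one common approach to the depletion/activation problem.
l_ab, l_bc = 1e-3, 5e-4
M = np.array([[-l_ab,  0.0,   0.0],
              [ l_ab, -l_bc,  0.0],
              [ 0.0,   l_bc,  0.0]])
N0 = np.array([1e24, 0.0, 0.0])            # initial nuclide numbers
for t in (0.0, 1e3, 1e4):
    print(t, expm(M * t) @ N0)             # inventories at time t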
Myokit: A simple interface to cardiac cellular electrophysiology.
Clerx, Michael; Collins, Pieter; de Lange, Enno; Volders, Paul G A
2016-01-01
Myokit is a new powerful and versatile software tool for modeling and simulation of cardiac cellular electrophysiology. Myokit consists of an easy-to-read modeling language, a graphical user interface, single and multi-cell simulation engines and a library of advanced analysis tools accessible through a Python interface. Models can be loaded from Myokit's native file format or imported from CellML. Model export is provided to C, MATLAB, CellML, CUDA and OpenCL. Patch-clamp data can be imported and used to estimate model parameters. In this paper, we review existing tools to simulate the cardiac cellular action potential to find that current tools do not cater specifically to model development and that there is a gap between easy-to-use but limited software and powerful tools that require strong programming skills from their users. We then describe Myokit's capabilities, focusing on its model description language, simulation engines and import/export facilities in detail. Using three examples, we show how Myokit can be used for clinically relevant investigations, multi-model testing and parameter estimation in Markov models, all with minimal programming effort from the user. This way, Myokit bridges a gap between performance, versatility and user-friendliness. Copyright © 2015 Elsevier Ltd. All rights reserved.
A study of the 3D radiative transfer effect in cloudy atmospheres
NASA Astrophysics Data System (ADS)
Okata, M.; Teruyuki, N.; Suzuki, K.
2015-12-01
Evaluating the effect of clouds in the atmosphere is a significant problem in Earth's radiation budget studies, given the large uncertainties in cloud microphysics and optical properties. In this situation, we still need further investigation of 3D cloud radiative transfer problems using not only models but also satellite observational data. For this purpose, we have developed a 3D Monte Carlo radiative transfer code that is implemented with various functions compatible with the OpenCLASTR R-Star radiation code for radiance and flux computation, i.e. forward and backward tracing routines, a non-linear k-distribution parameterization (Sekiguchi and Nakajima, 2008) for broadband solar flux calculation, the DM-method for flux, and the TMS-method for upward radiance (Nakajima and Tanaka, 1998). We also developed a Minimum cloud Information Deviation Profiling Method (MIDPM) for constructing 3D cloud fields from MODIS/AQUA and CPR/CloudSat data. For each off-nadir MODIS pixel where no CPR profile is available, we selected the best-matched radar reflectivity factor profile from the library by minimizing the deviation between the library MODIS parameters and those at the pixel. In this study, we used three cloud microphysical parameters as key parameters for the MIDPM, i.e. effective particle radius, cloud optical thickness and cloud-top temperature, and estimated the 3D cloud radiation budget. We examined the discrepancies between satellite-observed and model-simulated radiances, and the patterns of the three cloud microphysical parameters, in order to study the effects of cloud optical and microphysical properties on the radiation budget of cloud-laden atmospheres.
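The MIDPM selection step can be sketched as a minimum-deviation lookup over the three key parameters; the per-parameter normalization and the random stand-in library below are assumptions for illustration, not the authors' exact metric.

```python
import numpy as np

def midpm_match(pixel_params, lib_params, lib_profiles):
    """Pick the radar-reflectivity profile whose library (r_eff, tau, T_top) triple
    is closest to the off-nadir MODIS pixel, after normalizing each parameter."""
    scale = lib_params.std(axis=0)
    dev = np.sum(((lib_params - pixel_params) / scale) ** 2, axis=1)
    return lib_profiles[np.argmin(dev)]

rng = np.random.default_rng(2)
lib_params = rng.random((500, 3)) * [20, 50, 60] + [5, 1, 220]   # r_eff, tau, T_top
lib_profiles = rng.random((500, 125))                            # CPR-like profiles
print(midpm_match(np.array([12.0, 18.0, 250.0]), lib_params, lib_profiles).shape)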
Collection Metadata Solutions for Digital Library Applications
NASA Technical Reports Server (NTRS)
Hill, Linda L.; Janee, Greg; Dolin, Ron; Frew, James; Larsgaard, Mary
1999-01-01
Within a digital library, collections may range from an ad hoc set of objects that serve a temporary purpose to established library collections intended to persist through time. The objects in these collections vary widely, from library and data center holdings to pointers to real-world objects, such as geographic places, and the various metadata schemas that describe them. The key to integrated use of such a variety of collections in a digital library is collection metadata that represents the inherent and contextual characteristics of a collection. The Alexandria Digital Library (ADL) Project has designed and implemented collection metadata for several purposes: in XML form, the collection metadata "registers" the collection with the user interface client; in HTML form, it is used for user documentation; eventually, it will be used to describe the collection to network search agents; and it is used for internal collection management, including mapping the object metadata attributes to the common search parameters of the system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Starodumov, Ilya; Kropotin, Nikolai
2016-08-10
We investigate the three-dimensional mathematical model of crystal growth called PFC (Phase Field Crystal) in a hyperbolic modification. This model, also called the modified PFC model (the original PFC model is formulated in parabolic form), describes both slow and rapid crystallization processes on atomic length scales and on diffusive time scales. The modified PFC model is described by a partial differential equation of sixth order in space and second order in time. The solution of this equation is possible only by numerical methods. Previously, the authors created a software package for the solution of the Phase Field Crystal problem, based on the method of isogeometric analysis (IGA) and the PetIGA program library. During further investigation it was found that the quality of the solution can depend strongly on the discretization parameters of the numerical method. In this report, we show the features that should be taken into account when constructing the computational grid for the numerical simulation.
Marchetti, Luca; Manca, Vincenzo
2015-04-15
MpTheory Java library is an open-source project collecting a set of objects and algorithms for modeling observed dynamics by means of the Metabolic P (MP) theory, that is, a mathematical theory introduced in 2004 for modeling biological dynamics. By means of the library, it is possible to model biological systems in both continuous and discrete time. Moreover, the library comprises a set of regression algorithms for inferring MP models starting from time series of observations. To enhance the modeling experience, besides pure Java usage, the library can be directly used within the most popular computing environments, such as MATLAB, GNU Octave, Mathematica and R. The library is open-source and licensed under the GNU Lesser General Public License (LGPL) Version 3.0. Source code, binaries and complete documentation are available at http://mptheory.scienze.univr.it. luca.marchetti@univr.it, marchetti@cosbi.eu Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Health sciences library building projects: 1995 survey.
Ludwig, L
1996-01-01
The Medical Library Association's fifth annual survey of recent health sciences library building projects identified twenty-five libraries planning, expanding, or constructing new library facilities. None of the fifteen new library projects are free-standing structures; however, several occupy a major portion of the project space. Ten projects involve renovation of or addition to existing space. Information regarding size, cost of project, type of construction, completion date, and other factual data was provided for twelve projects. The remaining identified projects are in pre-design or early-design stages, or are awaiting funding approval. Library building projects for three hospital libraries, three academic medical libraries, and an association library are described. Each illustrates how considerations of economics and technology are changing the traditional library model from a centrally stored information depository housing a wide range of information under one roof, where users come to the information, into an electronic model gradually shifting from investment in the physical presence of resources to investment in creating work space for credible information specialists who help in-house and distant users to obtain information electronically from any place and at any time. This new model includes a highly skilled library team to manage, filter, and package the information to users trained by these resident experts. PMID:8883981
Toward Continual Reform: Progress in Academic Libraries in China.
ERIC Educational Resources Information Center
Ping, Ke
2002-01-01
Traces developments in China's academic libraries: managing human resources, restructuring library developments, revising and implementing new policies, evaluating services and operations, establishing library systems, building new structures, and exploring joint-use library models. The major focus was to improve services for library users. (Author/LRW)
The R.E.D. tools: advances in RESP and ESP charge derivation and force field library building.
Dupradeau, François-Yves; Pigache, Adrien; Zaffran, Thomas; Savineau, Corentin; Lelong, Rodolphe; Grivel, Nicolas; Lelong, Dimitri; Rosanski, Wilfried; Cieplak, Piotr
2010-07-28
Deriving atomic charges and building a force field library for a new molecule are key steps when developing a force field required for conducting structural and energy-based analysis using molecular mechanics. Derivation of popular RESP charges for a set of residues is a complex and error-prone procedure because it depends on numerous input parameters. To overcome these problems, the R.E.D. Tools (RESP and ESP charge Derive) have been developed to perform charge derivation in an automatic and straightforward way. The R.E.D. program handles chemical elements up to bromine in the periodic table. It interfaces different quantum mechanical programs employed for geometry optimization and computing molecular electrostatic potential(s), and performs charge fitting using the RESP program. By defining tight optimization criteria and by controlling the molecular orientation of each optimized geometry, charge values are reproduced on any computer platform with an accuracy of 0.0001 e. The charges can be fitted using multiple conformations, making them suitable for molecular dynamics simulations. R.E.D. also allows charge constraints to be defined during multiple-molecule charge fitting, which are used to derive charges for molecular fragments. Finally, R.E.D. incorporates charges into a force field library, readily usable in molecular dynamics computer packages. For complex cases, such as a set of homologous molecules belonging to a common family, an entire force field topology database is generated. Currently, the atomic charges and force field libraries have been developed for more than fifty model systems and stored in the RESP ESP charge DDataBase. Selected results related to non-polarizable charge models are presented and discussed.
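The core of such charge fitting is a constrained least-squares problem: reproduce the quantum-mechanical electrostatic potential on a grid of points while enforcing the total molecular charge. Below is a sketch of plain ESP fitting solved as a KKT system; the hyperbolic restraint that distinguishes RESP, and R.E.D.'s multi-conformation and multi-orientation machinery, are omitted.

```python
import numpy as np

def fit_esp_charges(coords, grid_pts, V, total_charge=0.0):
    """Least-squares ESP charge fit with a total-charge constraint (KKT system).
    A[i, j] = 1 / |grid_i - atom_j| in atomic units."""
    A = 1.0 / np.linalg.norm(grid_pts[:, None, :] - coords[None, :, :], axis=2)
    n = coords.shape[0]
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = A.T @ A
    K[:n, n] = K[n, :n] = 1.0                 # Lagrange row/column: sum(q) = total_charge
    rhs = np.concatenate([A.T @ V, [total_charge]])
    return np.linalg.solve(K, rhs)[:n]

rng = np.random.default_rng(3)
atoms = rng.standard_normal((3, 3))           # hypothetical 3-atom geometry
grid = rng.standard_normal((200, 3)) * 4.0    # ESP grid points around the molecule
true_q = np.array([0.4, -0.7, 0.3])
V = (1.0 / np.linalg.norm(grid[:, None] - atoms[None], axis=2)) @ true_q
print(fit_esp_charges(atoms, grid, V))        # recovers ~[0.4, -0.7, 0.3]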
The Academic Library/High School Library Connection: Needs Assessment and Proposed Model.
ERIC Educational Resources Information Center
LeClercq, Angie
1986-01-01
Recognizing the limited resources available to gifted high school students and the inadequate level of information-gathering skills students often bring to college, the University of Tennessee-Knoxville was awarded a grant from the Council on Library Resources to develop a model for providing access to research library resources for high school students.…
America's Youth Are at Risk: Developing Models for Action in the Nation's Public Libraries.
ERIC Educational Resources Information Center
Flum, Judith G.; Weisner, Stan
1993-01-01
Discussion of public library support systems for at-risk teens focuses on the Bay Area Library and Information System (BALIS) that was developed to improve library services to at-risk teenagers in the San Francisco Bay area. Highlights include needs assessment; staff training; intervention models; and project evaluation. (10 references) (LRW)
A Method of Predicting Queuing at Library Online PCs
ERIC Educational Resources Information Center
Beranek, Lea G.
2006-01-01
On-campus networked personal computer (PC) usage at La Trobe University Library was surveyed during September 2005. The survey's objectives were to confirm peak usage times, to measure some of the relevant parameters of online PC usage, and to determine the effect that 24 new networked PCs had on service quality. The survey found that clients…
Cao, Shuanghe; Siriwardana, Chamindika L; Kumimoto, Roderick W; Holt, Ben F
2011-05-19
Monocots, especially the temperate grasses, represent some of the most agriculturally important crops for both current food needs and future biofuel development. Because most of the agriculturally important grass species are difficult to study (e.g., they often have large, repetitive genomes and can be difficult to grow in laboratory settings), developing genetically tractable model systems is essential. Brachypodium distachyon (hereafter Brachypodium) is an emerging model system for the temperate grasses. To fully realize the potential of this model system, publicly accessible discovery tools are essential. High quality cDNA libraries that can be readily adapted for multiple downstream purposes are a needed resource. Additionally, yeast two-hybrid (Y2H) libraries are an important discovery tool for protein-protein interactions and are not currently available for Brachypodium. We describe the creation of two high quality, publicly available Gateway™ cDNA entry libraries and their derived Y2H libraries for Brachypodium. The first entry library represents cloned cDNA populations from both short day (SD, 8/16-h light/dark) and long day (LD, 20/4-h light/dark) grown plants, while the second library was generated from hormone treated tissues. Both libraries have extensive genome coverage (~5 × 10⁷ primary clones each) and average clone lengths of ~1.5 Kb. These entry libraries were then used to create two recombination-derived Y2H libraries. Initial proof-of-concept screens demonstrated that a protein with known interaction partners could readily re-isolate those partners, as well as novel interactors. Accessible community resources are a hallmark of successful biological model systems. Brachypodium has the potential to be a broadly useful model system for the grasses, but still requires many of these resources. The Gateway™ compatible entry libraries created here will facilitate studies for multiple user-defined purposes and the derived Y2H libraries can be immediately applied to large scale screening and discovery of novel protein-protein interactions. All libraries are freely available for distribution to the research community.
Wu, Y.; Liu, S.
2012-01-01
Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching a set of optimal parameters. Nonetheless, R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics). The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty analysis.
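The wrapper idea, treating the simulator as a function of its parameters and handing it to an optimizer, is language independent. Here is a Python sketch with a toy stand-in for the external model call (the framework above does this in R with FME around the Fortran-based SWAT):

```python
import numpy as np
from scipy.optimize import minimize

def run_model(params, forcing):
    """Stand-in for an external simulator call (writing input files, invoking the
    executable, reading outputs); here just a toy two-parameter runoff model."""
    a, b = params
    return a * forcing + b * np.sqrt(forcing)

def objective(params, forcing, observed):
    """Sum of squared residuals between simulated and observed series."""
    return np.sum((run_model(params, forcing) - observed) ** 2)

forcing = np.linspace(1.0, 10.0, 50)
observed = run_model((0.6, 1.2), forcing) \
           + 0.05 * np.random.default_rng(4).standard_normal(50)
result = minimize(objective, x0=(1.0, 0.0), args=(forcing, observed),
                  method="Nelder-Mead")
print(result.x)                               # ~ (0.6, 1.2)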
Cartographic symbol library considering symbol relations based on anti-aliasing graphic library
NASA Astrophysics Data System (ADS)
Mei, Yang; Li, Lin
2007-06-01
Cartographic visualization represents geographic information in map form, which enables us to retrieve useful geospatial information. In the digital environment, the cartographic symbol library is the basis of cartographic visualization and an essential component of any Geographic Information System. Existing cartographic symbol libraries have two flaws: poor display quality and the lack of symbol-relation adjustment. Statistical data presented in this paper indicate that aliasing is a major factor in the quality of symbol display on graphic display devices. Effective graphic anti-aliasing methods based on a new anti-aliasing algorithm are therefore presented and encapsulated in an anti-aliasing graphic library in the form of a Component Object Model. Furthermore, cartographic visualization should represent feature relations by correctly adjusting symbol relations, rather than merely displaying individual features, but current cartographic symbol libraries lack this capability. This paper creates a cartographic symbol design model that implements symbol-relation adjustment, so that a cartographic symbol library based on this design model can provide cartographic visualization with relation-adjusting capability. Sample tests of the anti-aliasing graphic library and the cartographic symbol library show that both libraries achieve better efficiency and display quality.
Impact of nuclear data on sodium-cooled fast reactor calculations
NASA Astrophysics Data System (ADS)
Aures, Alexander; Bostelmann, Friederike; Zwermann, Winfried; Velkov, Kiril
2016-03-01
Neutron transport and depletion calculations are performed in combination with various nuclear data libraries in order to assess the impact of nuclear data on safety-relevant parameters of sodium-cooled fast reactors. These calculations are supplemented by systematic uncertainty analyses with respect to nuclear data. The analysed quantities are the multiplication factor and nuclide densities as functions of burn-up, and the Doppler and Na-void reactivity coefficients at beginning of cycle. While ENDF/B-VII.0 and ENDF/B-VII.1 yield rather consistent results, larger discrepancies are observed among the JEFF libraries: the newest evaluation, JEFF-3.2, agrees with the ENDF/B-VII libraries, whereas JEFF-3.1.2 yields significantly larger multiplication factors.
E-Global Library: The Academic Campus Library Meets the Internet.
ERIC Educational Resources Information Center
Heilig, Jean M.
2001-01-01
Describes e-global library, the first Internet-based virtual library designed for online students at Jones International University, which has grown into a separately licensable product. Highlights include marketing to other academic libraries, both online and traditional; fees; the e-global library model; collection development policies;…
A communication library for the parallelization of air quality models on structured grids
NASA Astrophysics Data System (ADS)
Miehe, Philipp; Sandu, Adrian; Carmichael, Gregory R.; Tang, Youhua; Dăescu, Dacian
PAQMSG is an MPI-based, Fortran 90 communication library for the parallelization of air quality models (AQMs) on structured grids. It consists of distribution, gathering and repartitioning routines for different domain decompositions implementing a master-worker strategy. The library is architecture and application independent and includes optimization strategies for different architectures. This paper presents the library from a user perspective. Results are shown from the parallelization of STEM-III on Beowulf clusters. The PAQMSG library is available on the web. The communication routines are easy to use, and should allow for an immediate parallelization of existing AQMs. PAQMSG can also be used for constructing new models.
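A master-worker distribution and gathering scheme of the kind PAQMSG provides can be sketched with mpi4py; the column work unit and the exp() kernel are placeholders, and this is not PAQMSG's Fortran 90 interface.

```python
# Run with e.g.:  mpiexec -n 4 python master_worker_sketch.py  (filename hypothetical)
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:                                    # master: distribute and gather columns
    cols = [np.full(10, i, dtype=float) for i in range(4 * (size - 1))]
    done = [None] * len(cols)
    pending, busy, status = list(enumerate(cols)), 0, MPI.Status()
    while pending or busy:
        msg = comm.recv(source=MPI.ANY_SOURCE, status=status)   # "ready" or a result
        if msg is not None:
            i, col = msg
            done[i] = col                        # gather the processed column
            busy -= 1
        if pending:
            comm.send(pending.pop(), dest=status.Get_source())
            busy += 1
        else:
            comm.send(None, dest=status.Get_source())           # stop signal
else:                                            # worker: request work until stopped
    comm.send(None, dest=0)                      # announce readiness
    while (task := comm.recv(source=0)) is not None:
        i, col = task
        comm.send((i, np.exp(col)), dest=0)      # stand-in chemistry step, return result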
Shared ownership: what's the future?
Roth, Karen L
2013-01-01
Library consortia have evolved over time in terms of their composition and alternative negotiating models. New purchasing models may allow improved library involvement in the acquisitions process and improved methods for meeting users' future needs. Ever-increasing costs of library resources and the need to reduce expenses make it necessary to continue the exploration of library consortia for group purchases.
A Two-Tiered Model for Analyzing Library Web Site Usage Statistics, Part 1: Web Server Logs.
ERIC Educational Resources Information Center
Cohen, Laura B.
2003-01-01
Proposes a two-tiered model for analyzing web site usage statistics for academic libraries: one tier for library administrators that analyzes measures indicating library use, and a second tier for web site managers that analyzes measures aiding in server maintenance and site design. Discusses the technology of web site usage statistics, and…
Autonomous Modelling of X-ray Spectra Using Robust Global Optimization Methods
NASA Astrophysics Data System (ADS)
Rogers, Adam; Safi-Harb, Samar; Fiege, Jason
2015-08-01
The standard approach to model fitting in X-ray astronomy is by means of local optimization methods. However, these local optimizers suffer from a number of problems, such as a tendency for the fit parameters to become trapped in local minima, and can require an involved process of detailed user intervention to guide them through the optimization process. In this work we introduce a general GUI-driven global optimization method for fitting models to X-ray data, written in MATLAB, which searches for optimal models with minimal user interaction. We directly interface with the commonly used XSPEC libraries to access the full complement of pre-existing spectral models that describe a wide range of physics appropriate for modelling astrophysical sources, including supernova remnants and compact objects. Our algorithm is powered by the Ferret genetic algorithm and Locust particle swarm optimizer from the Qubist Global Optimization Toolbox, which are robust at finding families of solutions and identifying degeneracies. This technique will be particularly instrumental for multi-parameter models and high-fidelity data. In this presentation, we provide details of the code and use our techniques to analyze X-ray data obtained from a variety of astrophysical sources.
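To make the global-optimization idea concrete, here is a bare-bones real-coded genetic algorithm fitting a toy power-law spectrum; the Ferret and Locust optimizers used in the actual work are far more sophisticated than this sketch.

```python
import numpy as np

def genetic_fit(misfit, bounds, pop_size=60, n_gen=200, rng=None):
    """Bare-bones real-coded genetic algorithm: tournament selection, blend
    crossover, Gaussian mutation, and elitism."""
    rng = rng or np.random.default_rng(5)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, (pop_size, lo.size))
    for _ in range(n_gen):
        fit = np.array([misfit(p) for p in pop])
        new = [pop[np.argmin(fit)]]                       # keep the best (elitism)
        while len(new) < pop_size:
            i, j = rng.integers(pop_size, size=2)
            a = pop[i] if fit[i] < fit[j] else pop[j]     # tournament pick 1
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if fit[i] < fit[j] else pop[j]     # tournament pick 2
            w = rng.random(lo.size)
            child = w * a + (1 - w) * b                   # blend crossover
            child += 0.02 * (hi - lo) * rng.standard_normal(lo.size)  # mutation
            new.append(np.clip(child, lo, hi))
        pop = np.array(new)
    fit = np.array([misfit(p) for p in pop])
    return pop[np.argmin(fit)]

# Toy "spectrum": recover a power-law normalization and index from model counts.
E = np.linspace(1.0, 10.0, 64)
obs = 5.0 * E ** -1.7
print(genetic_fit(lambda p: np.sum((p[0] * E ** -p[1] - obs) ** 2),
                  bounds=[(0.1, 10.0), (0.5, 3.0)]))      # ~ (5.0, 1.7)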
Model Preservation Program for a Small University Library.
ERIC Educational Resources Information Center
Robbins, Louise S.
This report proposes a preservation program assuming a model of a university library serving 5,000 or fewer students and 350 or fewer faculty members. The model program is not for a comprehensive university or research institution, and the library's collection is one developed and used as a curriculum-support collection. The goal of the…
An Integrated Nonlinear Analysis library - (INA) for solar system plasma turbulence
NASA Astrophysics Data System (ADS)
Munteanu, Costel; Kovacs, Peter; Echim, Marius; Koppan, Andras
2014-05-01
We present an integrated software library dedicated to the analysis of time series recorded in space and adapted to investigate turbulence, intermittency and multifractals. The library is written in MATLAB and provides a graphical user interface (GUI) customized for the analysis of space physics data available online, such as: Coordinated Data Analysis Web (CDAWeb), Automated Multi Dataset Analysis system (AMDA), Planetary Science Archive (PSA), World Data Center Kyoto (WDC), Ulysses Final Archive (UFA) and Cluster Active Archive (CAA). Three main modules are already implemented in INA: the Power Spectral Density (PSD) analysis, the Wavelet and Intermittency analysis, and the Probability Density Functions (PDF) analysis. The layered structure of the software allows the user to easily switch between different modules/methods while retaining the same time interval for the analysis. The wavelet analysis module includes algorithms to compute and analyse the PSD, the scalogram, the Local Intermittency Measure (LIM) and the flatness parameter. The PDF analysis module includes algorithms for computing the PDFs for a range of scales and parameters fully customizable by the user; it also computes the flatness parameter and enables fast comparison with standard PDF profiles such as the Gaussian PDF. The library has already been tested on Cluster and Venus Express data and we will show relevant examples. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM, and a grant of the Romanian Ministry of National Education, CNCS UEFISCDI, project number PN-II-ID PCE-2012-4-0418.
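The flatness parameter mentioned above is typically the normalized fourth moment of scale-dependent increments; a minimal sketch follows (the exact estimator used by INA may differ).

```python
import numpy as np

def flatness(signal, scales):
    """Flatness of increments F(tau) = <dx^4> / <dx^2>^2. Values above the Gaussian
    level of 3 at small scales are a common signature of intermittency."""
    out = {}
    for tau in scales:
        dx = signal[tau:] - signal[:-tau]
        out[tau] = np.mean(dx ** 4) / np.mean(dx ** 2) ** 2
    return out

rng = np.random.default_rng(6)
brownian = np.cumsum(rng.standard_normal(100_000))
print(flatness(brownian, [1, 4, 16, 64]))   # ~3 at every scale for this Gaussian signal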
Witoonchart, Peerajak; Chongstitvatana, Prabhas
2017-08-01
In this study, for the first time, we show how to formulate a structured support vector machine (SSVM) as two layers in a convolutional neural network, where the top layer is a loss-augmented inference layer and the bottom layer is the normal convolutional layer. We show that a deformable part model can be learned with the proposed structured SVM neural network by backpropagating the error of the deformable part model to the convolutional neural network. The forward propagation calculates the loss-augmented inference and the backpropagation calculates the gradient from the loss-augmented inference layer to the convolutional layer. Thus, we obtain a new type of convolutional neural network called a structured SVM convolutional neural network, which we applied to the human pose estimation problem. This new neural network can be used as the final layers in deep learning. Our method jointly learns the structural model parameters and the appearance model parameters. We implemented our method as a new layer in the existing Caffe library. Copyright © 2017 Elsevier Ltd. All rights reserved.
Library Web Site Administration: A Strategic Planning Model For the Smaller Academic Library
ERIC Educational Resources Information Center
Ryan, Susan M.
2003-01-01
Strategic planning provides a useful structure for creating and implementing library web sites. The planned integration of a library's web site into its mission and objectives ensures that the library's community of users will consider the web site one of the most important information tools the library offers.
A Feminist Paradigm for Library and Information Science.
ERIC Educational Resources Information Center
Hannigan, Jane Anne; Crew, Hilary
1993-01-01
Discussion of feminist scholarship and feminist thinking focuses on feminism in librarianship. Topics addressed include research methodologies; implications for library and information science; a feminist model, including constructed knowledge; standpoint theory; benefits of feminist scholarship; and a library model. (Contains 14 references.) (LRW)
NASA Astrophysics Data System (ADS)
Baudette, Maxime; Castro, Marcelo; Rabuzin, Tin; Lavenius, Jan; Bogodorova, Tetiana; Vanfretti, Luigi
2018-01-01
This paper presents the latest improvements implemented in the Open-Instance Power System Library (OpenIPSL). The OpenIPSL is a fork of the original iTesla Power Systems Library (iPSL) by some of the original developers of the iPSL. The fork was motivated by the authors' wish to further develop the library with additional features tailored to research and teaching purposes. The enhancements include improvements to existing models, the addition of a new package of three-phase models, and the implementation of automated tests through continuous integration.
egs_brachy: a versatile and fast Monte Carlo code for brachytherapy
NASA Astrophysics Data System (ADS)
Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.
2016-12-01
egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)³ voxels) and eye plaque (with (1 mm)³ voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
Library of Advanced Materials for Engineering (LAME) 4.44.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scherzinger, William M.; Lester, Brian T.
Accurate and efficient constitutive modeling remains a cornerstone issue for solid mechanics analysis. Over the years, the LAME advanced material model library has grown to address this challenge by implementing models capable of describing material systems spanning soft polymers to stiff ceramics, including both isotropic and anisotropic responses. Inelastic behaviors including (visco)plasticity, damage, and fracture have all been incorporated for use in various analyses. This multitude of options and flexibility, however, comes at the cost of many capabilities, features, and responses, and the ensuing complexity of the resulting implementation. Therefore, to enhance confidence and enable the utilization of the LAME library in application, this effort seeks to document and verify the various models in the LAME library. Specifically, the broader strategy, organization, and interface of the library itself is first presented. The physical theory, numerical implementation, and user guide for a large set of models is then discussed. Importantly, a number of verification tests are performed with each model to not only build confidence in the model itself but also highlight some important response characteristics and features that may be of interest to end-users. Finally, looking ahead to the future, approaches to add material models to this library and further expand its capabilities are presented.
Library of Advanced Materials for Engineering (LAME) 4.48.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scherzinger, William M.; Lester, Brian T.
Accurate and efficient constitutive modeling remains a cornerstone issue for solid mechanics analysis. Over the years, the LAME advanced material model library has grown to address this challenge by implementing models capable of describing material systems spanning soft polymers to stiff ceramics, including both isotropic and anisotropic responses. Inelastic behaviors including (visco)plasticity, damage, and fracture have all been incorporated for use in various analyses. This multitude of options and flexibility, however, comes at the cost of many capabilities, features, and responses, and the ensuing complexity of the resulting implementation. Therefore, to enhance confidence and enable the utilization of the LAME library in application, this effort seeks to document and verify the various models in the LAME library. Specifically, the broader strategy, organization, and interface of the library itself is first presented. The physical theory, numerical implementation, and user guide for a large set of models is then discussed. Importantly, a number of verification tests are performed with each model to not only build confidence in the model itself but also highlight some important response characteristics and features that may be of interest to end-users. Finally, looking ahead to the future, approaches to add material models to this library and further expand its capabilities are presented.
NASA Astrophysics Data System (ADS)
Silversides, Katherine L.; Melkumyan, Arman
2017-03-01
Machine learning techniques such as Gaussian Processes can be used to identify stratigraphically important features in geophysical logs. The marker shales in the banded iron formation hosted iron ore deposits of the Hamersley Ranges, Western Australia, form distinctive signatures in the natural gamma logs. The identification of these marker shales is important for the stratigraphic identification of unit boundaries for the geological modelling of the deposit. Each machine learning technique has unique properties that will impact the results. For Gaussian Processes (GPs), the output values are inclined towards the mean value, particularly when there is not sufficient information in the library. The impact that this inclination has on the classification can vary depending on the parameter values selected by the user. Therefore, when applying machine learning techniques, care must be taken to fit the technique to the problem correctly. This study focuses on optimising the settings and choices for training a GP system to identify a specific marker shale. We show that the final results converge even when different, but equally valid, starting libraries are used for the training. To analyse the impact on feature identification, GP models were trained so that the output was inclined towards a positive, neutral or negative value. For this type of classification, the best results were obtained when the pull was towards a negative output. We also show that the GP output can be adjusted by using a standard deviation coefficient that changes the balance between certainty and accuracy in the results.
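The pull toward the mean is visible directly in the GP posterior-mean formula: far from the training data the kernel terms vanish and the prediction reverts to the prior mean. A small sketch with hypothetical natural-gamma values:

```python
import numpy as np

def gp_posterior_mean(X, y, Xs, ell=1.0, sigma=1.0, noise=1e-2, prior_mean=0.0):
    """GP regression mean with a squared-exponential kernel. Far from the training
    logs the prediction relaxes back to prior_mean, the 'pull' discussed above."""
    k = lambda a, b: sigma**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K = k(X, X) + noise * np.eye(X.size)
    return prior_mean + k(Xs, X) @ np.linalg.solve(K, y - prior_mean)

depth = np.array([0.0, 1.0, 2.0, 3.0])
gamma = np.array([1.0, 0.8, 1.2, 0.9])           # hypothetical natural-gamma labels
query = np.array([1.5, 10.0])                    # near vs. far from the library
print(gp_posterior_mean(depth, gamma, query, prior_mean=-1.0))  # far point -> ~ -1.0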
Library Catalog Log Analysis in E-Book Patron-Driven Acquisitions (PDA): A Case Study
ERIC Educational Resources Information Center
Urbano, Cristóbal; Zhang, Yin; Downey, Kay; Klingler, Thomas
2015-01-01
Patron-Driven Acquisitions (PDA) is a new model used for e-book acquisition by academic libraries. A key component of this model is to make records of e-books available in a library catalog and let actual patron usage decide whether or not an item is purchased. However, there has been a lack of research examining the role of the library catalog as…
Library Literacy Programs for English Language Learners. ERIC Digest.
ERIC Educational Resources Information Center
McMurrer, Eileen; Terrill, Lynda
This digest summarizes the history of public libraries and library literacy programs; describes current delivery models; and discusses initiatives in library literacy, profiling one successful public library program that serves adult English language learners and their families. (Adjunct ERIC Clearinghouse for ESL Literacy Education) (Author/VWL)
Evolution of Reference: A New Service Model for Science and Engineering Libraries
ERIC Educational Resources Information Center
Bracke, Marianne Stowell; Chinnaswamy, Sainath; Kline, Elizabeth
2008-01-01
This article explores the different steps involved in adopting a new service model at the University of Arizona Science-Engineering Library. In a time of shrinking budgets and changing user behavior, the library was forced to rethink its reference services to be cost effective while providing quality service at the same time. The new model required…
Chapelle, D; Fragu, M; Mallet, V; Moireau, P
2013-11-01
We present the fundamental principles of data assimilation underlying the Verdandi library, and how they are articulated with the modular architecture of the library. This translates--in particular--into the definition of standardized interfaces through which the data assimilation library interoperates with the model simulation software and the so-called observation manager. We also survey various examples of data assimilation applied to the personalization of biophysical models, in particular, for cardiac modeling applications within the euHeart European project. This illustrates the power of data assimilation concepts in such novel applications, with tremendous potential in clinical diagnosis assistance.
CHARMM-GUI ligand reader and modeler for CHARMM force field generation of small molecules.
Kim, Seonghoon; Lee, Jumin; Jo, Sunhwan; Brooks, Charles L; Lee, Hui Sun; Im, Wonpil
2017-06-05
Reading ligand structures into any simulation program is often nontrivial and time consuming, especially when the force field parameters and/or structure files of the corresponding molecules are not available. To address this problem, we have developed Ligand Reader & Modeler in CHARMM-GUI. Users can upload ligand structure information in various forms (using PDB ID, ligand ID, SMILES, MOL/MOL2/SDF file, or PDB/mmCIF file), and the uploaded structure is displayed on a sketchpad for verification and further modification. Based on the displayed structure, Ligand Reader & Modeler generates the ligand force field parameters and necessary structure files by searching for the ligand in the CHARMM force field library or using the CHARMM general force field (CGenFF). In addition, users can define chemical substitution sites and draw substituents in each site on the sketchpad to generate a set of combinatorial structure files and corresponding force field parameters for high-throughput or alchemical free energy simulations. Finally, the output from Ligand Reader & Modeler can be used in other CHARMM-GUI modules to build a protein-ligand simulation system for all supported simulation programs, such as CHARMM, NAMD, GROMACS, AMBER, GENESIS, LAMMPS, Desmond, OpenMM, and CHARMM/OpenMM. Ligand Reader & Modeler is available as a functional module of CHARMM-GUI at http://www.charmm-gui.org/input/ligandrm. © 2017 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Marshall, Emily L.; Borrego, David; Tran, Trung; Fudge, James C.; Bolch, Wesley E.
2018-03-01
Epidemiologic data demonstrate that pediatric patients face a higher relative risk of radiation induced cancers than their adult counterparts at equivalent exposures. Infants and children with congenital heart defects are a critical patient population exposed to ionizing radiation during life-saving procedures. These patients will likely incur numerous procedures throughout their lifespan, each time increasing their cumulative radiation absorbed dose. As continued improvements in the long-term prognosis of congenital heart defect patients are achieved, a better understanding of organ radiation dose following treatment becomes increasingly vital. Dosimetry of these patients can be accomplished using Monte Carlo radiation transport simulations, coupled with modern anatomical patient models. The aim of this study was to evaluate the performance of the University of Florida/National Cancer Institute (UF/NCI) pediatric hybrid computational phantom library for organ dose assessment of patients that have undergone fluoroscopically guided cardiac catheterizations. In this study, two types of simulations were modeled. A dose assessment was performed on 29 patient-specific voxel phantoms (taken as representing the patient's true anatomy), height/weight-matched hybrid library phantoms, and age-matched reference phantoms. Two exposure studies were conducted for each phantom type. First, a parametric study was constructed by the attending pediatric interventional cardiologist at the University of Florida to model the range of parameters seen clinically. Second, four clinical cardiac procedures were simulated based upon internal logfiles captured by a Toshiba Infinix-i Cardiac Bi-Plane fluoroscopic unit. Performance of the phantom library was quantified by computing both the percent difference in individual organ doses and the organ dose root mean square (RMS) values for overall phantom assessment between the matched phantoms (UF/NCI library or reference) and the patient-specific phantoms. The UF/NCI hybrid phantoms performed with percent differences between 15% and 30% for the parametric set of irradiation events. Among internal logfile reconstructed procedures, the UF/NCI hybrid phantoms performed with RMS organ dose values between 7% and 29%. Percent improvement in organ dosimetry via the use of hybrid library phantoms over the reference phantoms ranged from 6.6% to 93%. The use of a hybrid phantom library, Monte Carlo radiation transport methods, and clinical information on irradiation events provide a means for tracking organ dose in these radiosensitive patients undergoing fluoroscopically guided cardiac procedures. This work was supported by Advanced Laboratory for Radiation Dosimetry Studies, University of Florida, American Association of University Women, National Cancer Institute Grant 1F31 CA159464.
FreeSASA: An open source C library for solvent accessible surface area calculations.
Mitternacht, Simon
2016-01-01
Calculating the solvent accessible surface area (SASA) is a routine task in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards' and Shrake and Rupley's approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
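For orientation, a minimal usage sketch of the Python interface follows; the names are taken from FreeSASA's documentation and should be checked against the installed version.

# Minimal sketch of the FreeSASA Python interface; verify names against
# the installed version's documentation before relying on them.
import freesasa

structure = freesasa.Structure("protein.pdb")  # parse atoms from a PDB file
result = freesasa.calc(structure)              # Lee & Richards by default
print("Total SASA: %.2f A^2" % result.totalArea())

# One level of output granularity: polar/apolar class breakdown.
for name, area in freesasa.classifyResults(result, structure).items():
    print(name, area)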
Public Libraries and Internet Public Access Models: Describing Possible Approaches.
ERIC Educational Resources Information Center
Tomasello, Tami K.; McClure, Charles R.
2002-01-01
Discusses ways of providing Internet access to the general public and analyzes eight models currently in use: public schools, public libraries, cybermobiles, public housing, community technology centers, community networks, kiosks, and cyber cafes. Concludes that public libraries may wish to develop collaborative strategies with other…
Genetic Control of Plant Root Colonization by the Biocontrol agent, Pseudomonas fluorescens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Benjamin J.; Fletcher, Meghan; Waters, Jordan
Plant growth promoting rhizobacteria (PGPR) are a critical component of plant root ecosystems. PGPR promote plant growth by solubilizing inaccessible minerals, suppressing pathogenic microorganisms in the soil, and directly stimulating growth through hormone synthesis. Pseudomonas fluorescens is a well-established PGPR isolated from wheat roots that can also colonize the root system of the model plant, Arabidopsis thaliana. We have created barcoded transposon insertion mutant libraries suitable for genome-wide transposon-mediated mutagenesis followed by sequencing (TnSeq). These libraries consist of over 10^5 independent insertions, collectively providing loss-of-function mutants for nearly all genes in the P. fluorescens genome. Each insertion mutant can be unambiguously identified by a randomized 20 nucleotide sequence (barcode) engineered into the transposon sequence. We used these libraries in a gnotobiotic assay to examine the colonization ability of P. fluorescens on A. thaliana roots. Taking advantage of the ability to distinguish individual colonization events using barcode sequences, we assessed the timing and microbial concentration dependence of colonization of the rhizoplane niche. These data provide direct insight into the dynamics of plant root colonization in an in vivo system and define baseline parameters for the systematic identification of bacterial genes and molecular pathways using TnSeq assays. Having determined parameters that facilitate potential colonization of roots by thousands of independent insertion mutants in a single assay, we are currently establishing a genome-wide functional map of genes required for root colonization in P. fluorescens. Importantly, the approach developed and optimized here for P. fluorescens colonization of A. thaliana will be applicable to a wide range of plant-microbe interactions, including biofuel feedstock plants and microbes known or hypothesized to impact biofuel-relevant traits, including biomass productivity and pathogen resistance.
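A hypothetical sketch of the bookkeeping behind such an assay is shown below; the flanking sequences and read layout are invented for illustration and are not taken from the study.

# Hypothetical sketch of barcoded TnSeq bookkeeping: count how often each
# 20-nt barcode appears in sequencing reads so individual insertion
# mutants can be tracked through a colonization experiment. The constant
# flanking sequences here are illustrative assumptions, not the study's.
from collections import Counter

FLANK5, FLANK3 = "CGTACG", "AGAGACC"  # assumed constant regions around the barcode
BARCODE_LEN = 20

def extract_barcode(read):
    # Return the 20-nt barcode between the assumed flanks, or None.
    i = read.find(FLANK5)
    if i == -1:
        return None
    start = i + len(FLANK5)
    end = start + BARCODE_LEN
    if read[end:end + len(FLANK3)] != FLANK3:
        return None
    barcode = read[start:end]
    return barcode if len(barcode) == BARCODE_LEN else None

def count_barcodes(reads):
    counts = Counter()
    for read in reads:
        bc = extract_barcode(read)
        if bc is not None:
            counts[bc] += 1
    return counts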
Van Moorsel, Guillaume
2005-01-01
Libraries often do not know how clients value their product/service offerings. Yet at a time when the mounting costs of library support are increasingly difficult to justify to the parent institution, the library's ability to gauge the value of its offerings to clients has never been more critical. Client Value Models (CVMs) establish a common definition of value elements, or a "value vocabulary", for libraries and their clients, thereby providing a basis upon which to make rational planning decisions regarding product/service acquisition and development. The CVM concept is borrowed from business and industry, but its application is a natural fit in libraries. This article offers a theoretical consideration and practical illustration of CVM application in libraries.
Yu, Hai-bo; Zou, Bei-yan; Wang, Xiao-liang; Li, Min
2016-01-01
Aim: hERG potassium channels display miscellaneous interactions with diverse chemical scaffolds. In this study we assessed hERG inhibition in a large compound library of diverse chemical entities and provided data for better understanding the mechanisms underlying the promiscuity of hERG inhibition. Methods: Approximately 300 000 compounds contained in the Molecular Library Small Molecular Repository (MLSMR) library were tested. Compound profiling was conducted on hERG-CHO cells using the automated patch-clamp platform IonWorks Quattro™. Results: The compound library was tested at 1 and 10 μmol/L. IC50 values were predicted using a modified 4-parameter logistic model. Inhibitor hits were binned into three groups based on their potency: high (IC50 < 1 μmol/L), intermediate (1 μmol/L < IC50 < 10 μmol/L), and low (IC50 > 10 μmol/L), with hit rates of 1.64%, 9.17% and 16.63%, respectively. Six physicochemical properties of each compound were acquired and calculated using ACD software to evaluate the correlation between hERG inhibition and the properties: hERG inhibition was positively correlated with ALogP, molecular weight and RTB, and negatively correlated with TPSA. Conclusion: Based on a large diverse compound collection, this study provides experimental evidence for understanding the promiscuity of hERG inhibition. It further demonstrates that hERG liability compounds tend to be more hydrophobic, of higher molecular weight, more flexible, and more polarizable. PMID:26725739
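The 4-parameter logistic model referred to above can be written and fitted as in the following sketch. The study predicted IC50 from only the two test concentrations with a modified form; this example fits the generic model to an illustrative concentration-response series.

# Generic 4-parameter logistic (Hill) fit for IC50 estimation; the data
# values below are illustrative, not from the study.
import numpy as np
from scipy.optimize import curve_fit

def logistic4(conc, bottom, top, ic50, hill):
    # Fraction inhibited, rising from `bottom` at low conc to `top` at high conc.
    return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** hill)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # umol/L
inhib = np.array([0.05, 0.12, 0.33, 0.61, 0.85, 0.95])   # fraction inhibited

popt, _ = curve_fit(logistic4, conc, inhib, p0=[0.0, 1.0, 2.0, 1.0])
print("Estimated IC50: %.2f umol/L" % popt[2])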
Matching user needs in health care.
Bremer, G; Leggate, P
1982-01-01
Outlines patterns of library service available to users in the (British) National Health Service (NHS); gives preliminary results of a study detailing the character of the library system users, the nature of their needs and the degree to which the library system output meshes with user needs. The NHS library system is composed of many small, widely scattered libraries, most of which serve a narrow range of needs. The largest libraries of the system, the teaching hospital libraries, share features of the NHS and university library systems. The most numerous, however, are the postgraduate medical education centre libraries and nurse training school libraries. Until recently, there had been no cohesive organizational structure uniting the facilities and services of these various library groups. Regional library systems dating from 1967 have sought to encourage collaboration among libraries and to take user needs into account in policy planning. Recent research into medical libraries conducted by the British Library Research and Development Department points out the need for medical students and doctors to be able to extract needed information from any library they might use during their careers. Data showed that both clinical and preclinical medical personnel use a wide range of libraries and are most satisfied with a multifunction library. Interpretation of the data was undertaken within the conceptual modelling framework developed by Checkland, which provides a structure for thinking about the system. It is hoped that this approach will help to identify the sort of model of users which the library must maintain in order to provide them with the services they desire.
Thanapaisal, Soodjai; Thanapaisal, Chaiwit
2013-09-01
The Faculty of Medicine Library, Khon Kaen University began acquiring online information resources in 2001 with subscriptions to 2 databases. It now holds 29 subscriptions, and spending on online information resources has reached 17 million baht, more than 70 percent of the information resources budget, serving the academic purposes of the Faculty of Medicine. The problems of online information resource acquisition fall into 4 categories, which lead to 4 aspects shaping the acquisition model, benchmarked against 4 selected medical school libraries in Bangkok, Chiang Mai, and Songkhla and discussed with other Thai and foreign libraries. The acquisition model for online information resources developed from these problems is proposed for the Faculty of Medicine Library, Khon Kaen University, as well as for any medical library that may find it useful.
NASA Technical Reports Server (NTRS)
Lagar, Gunnar
1994-01-01
The scope of this presentation is to give a state-of-the-art report on the present situation of Nordic technology libraries, to elaborate on a plan for national resource libraries in Sweden, and to share how the Royal Institute of Technology Library in Stockholm (KTHB) has fostered a network of cooperating libraries in order to optimize government funding for the system of resource libraries.
A Model Privacy Statement for Ohio Library Web Sites.
ERIC Educational Resources Information Center
Monaco, Michael J.
The purpose of this research was to develop a model privacy policy statement for library World Wide Web sites. First, standards of privacy protection were identified. These standards were culled from the privacy and confidentiality policies of the American Library Association, the Federal Trade Commission's online privacy reports, the guidelines…
Using the Gamma-Poisson Model to Predict Library Circulations.
ERIC Educational Resources Information Center
Burrell, Quentin L.
1990-01-01
Argues that the gamma mixture of Poisson processes, for all its perceived defects, can be used to make predictions regarding future library book circulations of a quality adequate for general management requirements. The use of the model is extensively illustrated with data from two academic libraries. (Nine references) (CLB)
Cloud Computing and Your Library
ERIC Educational Resources Information Center
Mitchell, Erik T.
2010-01-01
One of the first big shifts in how libraries manage resources was the move from print-journal purchasing models to database-subscription and electronic-journal purchasing models. Libraries found that this transition helped them scale their resources and provide better service just by thinking a bit differently about their services. Likewise,…
ERIC Educational Resources Information Center
Lewin, Heather S.; Passonneau, Sarah M.
2012-01-01
This research provides the first review of publicly available assessment information found on Association of Research Libraries (ARL) members' websites. After providing an overarching review of benchmarking assessment data, and of professionally recommended assessment models, this paper examines if libraries contextualized their assessment…
ERIC Educational Resources Information Center
Kavass, Igor
Examination of several library legislation models developed to meet the needs of developed and developing nations reveals that our traditional notion of the library's role in society must be abandoned if we wish to reconcile its benefits to its costs. Four models currently exist: many nations, particularly Asian, have no legislation; most nations,…
NASA Astrophysics Data System (ADS)
Li, D. Y.; Li, K.; Wu, C.
2017-08-01
As the surveying and mapping of heritage buildings becomes increasingly fine-grained, building information modelling (BIM) technology is beginning to be used in the surveying, mapping, renovation, recording and research of heritage buildings, termed historical building information modelling (HBIM). The hierarchical framework of the BIM parametric component library, in which components of the same type share the same parameters, has the same internal logic as archaeological typology, which is increasingly popular for dating ancient buildings. Compared with common materials such as 2D drawings and photos, typology with HBIM has two advantages: (1) comprehensive building information in both collection and representation, and (2) uniform and reasonable classification criteria. This paper takes the information surveying and mapping of Jiayuguan Fortress Town as an example to introduce the field method of information surveying and mapping based on HBIM technology and the construction of the Revit family library. Then, to demonstrate the feasibility and advantages of HBIM technology in the typology method, the paper dates the Guanghua gate tower, Rouyuan gate tower, Wenchang pavilion and the theater building of Jiayuguan Fortress Town using HBIM technology and the typology method.
Research evaluation support services in biomedical libraries.
Gutzman, Karen Elizabeth; Bales, Michael E; Belter, Christopher W; Chambers, Thane; Chan, Liza; Holmes, Kristi L; Lu, Ya-Ling; Palmer, Lisa A; Reznik-Zellen, Rebecca C; Sarli, Cathy C; Suiter, Amy M; Wheeler, Terrie R
2018-01-01
The paper provides a review of current practices related to evaluation support services reported by seven biomedical and research libraries. A group of seven libraries from the United States and Canada described their experiences with establishing evaluation support services at their libraries. A questionnaire was distributed among the libraries to elicit information as to program development, service and staffing models, campus partnerships, training, products such as tools and reports, and resources used for evaluation support services. The libraries also reported interesting projects, lessons learned, and future plans. The seven libraries profiled in this paper report a variety of service models in providing evaluation support services to meet the needs of campus stakeholders. The service models range from research center cores, partnerships with research groups, and library programs with staff dedicated to evaluation support services. A variety of products and services were described such as an automated tool to develop rank-based metrics, consultation on appropriate metrics to use for evaluation, customized publication and citation reports, resource guides, classes and training, and others. Implementing these services has allowed the libraries to expand their roles on campus and to contribute more directly to the research missions of their institutions. Libraries can leverage a variety of evaluation support services as an opportunity to successfully meet an array of challenges confronting the biomedical research community, including robust efforts to report and demonstrate tangible and meaningful outcomes of biomedical research and clinical care. These services represent a transformative direction that can be emulated by other biomedical and research libraries.
Modeling Enclosure Design in Above-Grade Walls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lstiburek, J.; Ueno, K.; Musunuru, S.
2016-03-01
This report describes the modeling of typical wall assemblies that have performed well historically in various climate zones. The WUFI (Wärme und Feuchte instationär) software model (Version 5.3) was used. A library of input data and results is provided. The information can be generalized for application to a broad population of houses, within the limits of existing experience. The WUFI model was calibrated or tuned using wall assemblies with historically successful performance. The primary performance criterion, or failure criterion, establishing historic performance was the moisture content of the exterior sheathing. The primary tuning parameters (simulation inputs) were airflow and appropriate material properties. Rational hygric loads were established based on experience, specifically rain wetting and interior moisture (relative humidity levels). The tuning parameters were limited or bounded by published data or experience. The WUFI templates provided with this report supply useful information resources to new or less-experienced users. The files present various custom settings that will help avoid results requiring overly conservative enclosure assemblies. Overall, better material data, consistent initial assumptions, and consistent inputs among practitioners will improve the quality of WUFI modeling and raise the level of sophistication in the field.
Experience, Challenges, and Opportunities of Being Fully Embedded in a User Group.
Wu, Lin; Thornton, Joel
2017-01-01
Embedded librarian models can assume different forms and levels, depending on patron needs and a library's choice of delivery services. An academic health sciences library decided to enhance its service delivery model by integrating a librarian into the College of Pharmacy, approximately 250 miles away from the main library. This article describes the embedded librarian's first-year experience, challenges, and opportunities working as library faculty in the college. A comparison of one year of recorded statistics on pre- and post-embedding activities demonstrated the effectiveness and impact of this embedded librarian model.
Latin Hypercube Sampling (LHS) UNIX Library/Standalone
DOE Office of Scientific and Technical Information (OSTI.GOV)
2004-05-13
The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions, so LHS UNIX Library/Standalone provides a way to generate multi-variate samples. The LHS samples can be generated either through a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability, and a sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, the values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
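A minimal sketch of the sampling scheme just described, written here with NumPy/SciPy rather than the library's own interface, makes the equal-probability stratification and random pairing explicit.

# Minimal Latin Hypercube sketch: split each variable's range into n
# equal-probability strata, draw once per stratum via the inverse CDF,
# then pair values across variables at random. Illustrative only.
import numpy as np
from scipy import stats

def lhs(dists, n, seed=None):
    # dists: list of frozen scipy.stats distributions; returns (n, d) samples.
    rng = np.random.default_rng(seed)
    samples = np.empty((n, len(dists)))
    for j, dist in enumerate(dists):
        u = (np.arange(n) + rng.random(n)) / n  # one draw inside each stratum
        x = dist.ppf(u)                         # map through the inverse CDF
        rng.shuffle(x)                          # random pairing across variables
        samples[:, j] = x
    return samples

sample = lhs([stats.norm(0, 1), stats.uniform(0, 10)], n=100, seed=1)

Note that this unrestricted shuffle produces random pairing only; imposing user-specified correlations, as the library supports, requires a restricted pairing step.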
Using the results of a satisfaction survey to demonstrate the impact of a new library service model.
Powelson, Susan E; Reaume, Renee D
2012-09-01
In 2005, the University of Calgary entered into a contract to provide library services to the staff and physicians of Alberta Health Services Calgary Zone (AHS CZ), creating the Health Information Network Calgary (HINC). A user satisfaction survey was contractually required to determine whether the new library service model created through the agreement with the University of Calgary was successful. Our additional objective was to determine whether information and resources provided through the HINC were making an impact on patient care. A user satisfaction survey of 18 questions was created in collaboration with AHS CZ contract partners and distributed using the snowball or convenience sample method. Six hundred and ninety-four surveys were returned. Of respondents, 75% use the HINC library services. More importantly, 43% of respondents indicated that search results provided by library staff had a direct impact on patient care decisions. Alberta Health Services Calgary Zone staff are satisfied with the new service delivery model; they are taking advantage of the services offered and using library-provided information to improve patient care. © 2012 The authors. Health Information and Libraries Journal © 2012 Health Libraries Group.
NASA Astrophysics Data System (ADS)
Geyer, Amy M.; O'Reilly, Shannon; Lee, Choonsik; Long, Daniel J.; Bolch, Wesley E.
2014-09-01
Substantial increases in pediatric and adult obesity in the US have prompted a major revision to the current UF/NCI (University of Florida/National Cancer Institute) family of hybrid computational phantoms to more accurately reflect current trends in larger body morphometry. A decision was made to construct the new library in a gridded fashion by height/weight without further reference to age-dependent weight/height percentiles as these become quickly outdated. At each height/weight combination, circumferential parameters were defined and used for phantom construction. All morphometric data for the new library were taken from the CDC NHANES survey data over the time period 1999-2006, the most recent reported survey period. A subset of the phantom library was then used in a CT organ dose sensitivity study to examine the degree to which body morphometry influences the magnitude of organ doses for patients that are underweight to morbidly obese in body size. Using primary and secondary morphometric parameters, grids containing 100 adult male height/weight bins, 93 adult female height/weight bins, 85 pediatric male height/weight bins and 73 pediatric female height/weight bins were constructed. These grids served as the blueprints for construction of a comprehensive library of patient-dependent phantoms containing 351 computational phantoms. At a given phantom standing height, normalized CT organ doses were shown to linearly decrease with increasing phantom BMI for pediatric males, while curvilinear decreases in organ dose were shown with increasing phantom BMI for adult females. These results suggest that one very useful application of the phantom library would be the construction of a pre-computed dose library for CT imaging as needed for patient dose-tracking.
Computational hybrid anthropometric paediatric phantom library for internal radiation dosimetry
NASA Astrophysics Data System (ADS)
Xie, Tianwu; Kuster, Niels; Zaidi, Habib
2017-04-01
Hybrid computational phantoms combine voxel-based and simplified equation-based modelling approaches to provide unique advantages and more realism for the construction of anthropomorphic models. In this work, a methodology and C++ code are developed to generate hybrid computational phantoms covering statistical distributions of body morphometry in the paediatric population. The paediatric phantoms of the Virtual Population Series (IT’IS Foundation, Switzerland) were modified to match target anthropometric parameters, including body mass, body length, standing height and sitting height/stature ratio, determined from reference databases of the National Centre for Health Statistics and the National Health and Nutrition Examination Survey. The phantoms were selected as representative anchor phantoms for newborn and 1-, 2-, 5-, 10- and 15-year-old children, and were subsequently remodelled to create 1100 female and male phantoms with 10th, 25th, 50th, 75th and 90th percentile body morphometries. Evaluation was performed qualitatively using 3D visualization and quantitatively by analysing internal organ masses. Overall, the newly generated phantoms appear very reasonable and representative of the main characteristics of the paediatric population at various ages and for different genders, body sizes and sitting stature ratios. The mass of internal organs increases with height and body mass. The comparison of organ masses of the heart, kidney, liver, lung and spleen with published autopsy and ICRP reference data for children demonstrated that they follow the same trend when correlated with age. The constructed hybrid computational phantom library opens up the prospect of comprehensive radiation dosimetry calculations and risk assessment for the paediatric population of different age groups and diverse anthropometric parameters.
Ebalunode, Jerry O; Zheng, Weifan; Tropsha, Alexander
2011-01-01
Optimization of chemical library composition affords more efficient identification of hits from biological screening experiments. The optimization could be achieved through rational selection of reagents used in combinatorial library synthesis. However, with a rapid advent of parallel synthesis methods and availability of millions of compounds synthesized by many vendors, it may be more efficient to design targeted libraries by means of virtual screening of commercial compound collections. This chapter reviews the application of advanced cheminformatics approaches such as quantitative structure-activity relationships (QSAR) and pharmacophore modeling (both ligand and structure based) for virtual screening. Both approaches rely on empirical SAR data to build models; thus, the emphasis is placed on achieving models of the highest rigor and external predictive power. We present several examples of successful applications of both approaches for virtual screening to illustrate their utility. We suggest that the expert use of both QSAR and pharmacophore models, either independently or in combination, enables users to achieve targeted libraries enriched with experimentally confirmed hit compounds.
NASA Astrophysics Data System (ADS)
Hardyanto, W.; Purwinarko, A.; Adhi, M. A.
2018-03-01
The library, as the gateway of the university, should be supported by an adequate information system in order to provide excellent and optimal service to every user. The library management system in place since 2009 needs to be re-evaluated so that it can meet the needs of operators and Unnes users in particular, and users from outside Unnes in general. This study aims to evaluate and improve the existing library management system to produce a system that is accountable and able to meet the needs of end users, as well as to produce an integrated Unnes library management system. The research delivers an evaluation report based on the Technology Acceptance Model (TAM) approach and a library management system integrated with the national standard.
The Chandra X-ray Observatory PSF Library
NASA Astrophysics Data System (ADS)
Karovska, M.; Beikman, S. J.; Elvis, M. S.; Flanagan, J. M.; Gaetz, T.; Glotfelty, K. J.; Jerius, D.; McDowell, J. C.; Rots, A. H.
Pre-flight and on-orbit calibration of the Chandra X-Ray Observatory provided a unique base for developing detailed models of the optics and detectors. Using these models we have produced a set of simulations of the Chandra point spread function (PSF) which is available to the users via PSF library files. We describe here how the PSF models are generated and the design and content of the Chandra PSF library files.
NASA Astrophysics Data System (ADS)
Marsh, C.; Pomeroy, J. W.; Wheater, H. S.
2016-12-01
There is a need for hydrological land surface schemes that can link to atmospheric models, provide hydrological prediction at multiple scales and guide the development of multiple objective water predictive systems. Distributed raster-based models suffer from an overrepresentation of topography, leading to wasted computational effort that increases uncertainty due to greater numbers of parameters and initial conditions. The Canadian Hydrological Model (CHM) is a modular, multiphysics, spatially distributed modelling framework designed for representing hydrological processes, including those that operate in cold regions. Unstructured meshes permit variable spatial resolution, allowing coarse resolution where spatial variability is low and fine resolution where required. Model uncertainty is reduced by reducing the number of computational elements relative to high-resolution rasters. CHM uses a novel multi-objective approach for unstructured triangular mesh generation that fulfills hydrologically important constraints (e.g., basin boundaries, water bodies, soil classification, land cover, elevation, and slope/aspect). This provides an efficient spatial representation of parameters and initial conditions, as well as well-formed and well-graded triangles that are suitable for numerical discretization. CHM uses high-quality open source libraries and high performance computing paradigms to provide a framework that allows for integrating current state-of-the-art process algorithms. The impact of changes to model structure, including individual algorithms, parameters, initial conditions, driving meteorology, and spatial/temporal discretization can be easily tested. Initial testing of CHM compared spatial scales and model complexity for a spring melt period in a sub-arctic mountain basin. The meshing algorithm reduced the total number of computational elements and preserved the spatial heterogeneity of predictions.
Public Library and Community College: A Model for Off-Campus Instruction.
ERIC Educational Resources Information Center
Stevens, Mary A.
Black Hawk College's Study Unlimited cooperative program with the River Bend Library System, established in 1972, is presented as a model for community college and public library cooperation in offering off-campus instructional opportunities to new student populations by breaking time and place access barriers. Study Unlimited's objectives are to…
The Public Library User and the Charter Tourist: Two Travellers, One Analogy
ERIC Educational Resources Information Center
Eriksson, Catarina A. M.; Michnik, Katarina E.; Nordeborg, Yoshiko
2013-01-01
Introduction: A new theoretical model, relevant to library and information science, is implemented in this paper. The aim of this study is to contribute to the theoretical concepts of library and information science by introducing an ethnological model developed for investigating charter tourist styles thereby increasing our knowledge of users'…
NASA Technical Reports Server (NTRS)
Cullimore, B.
1994-01-01
SINDA, the Systems Improved Numerical Differencing Analyzer, is a software system for solving lumped parameter representations of physical problems governed by diffusion-type equations. SINDA was originally designed for analyzing thermal systems represented in electrical analog, lumped parameter form, although its use may be extended to include other classes of physical systems which can be modeled in this form. As a thermal analyzer, SINDA can handle such interrelated phenomena as sublimation, diffuse radiation within enclosures, transport delay effects, and sensitivity analysis. FLUINT, the FLUid INTegrator, is an advanced one-dimensional fluid analysis program that solves arbitrary fluid flow networks. The working fluids can be single phase vapor, single phase liquid, or two phase. The SINDA'85/FLUINT system permits the mutual influences of thermal and fluid problems to be analyzed. The SINDA system consists of a programming language, a preprocessor, and a subroutine library. The SINDA language is designed for working with lumped parameter representations and finite difference solution techniques. The preprocessor accepts programs written in the SINDA language and converts them into standard FORTRAN. The SINDA library consists of a large number of FORTRAN subroutines that perform a variety of commonly needed actions. The use of these subroutines can greatly reduce the programming effort required to solve many problems. A complete run of a SINDA'85/FLUINT model is a four step process. First, the user's desired model is run through the preprocessor which writes out data files for the processor to read and translates the user's program code. Second, the translated code is compiled. The third step requires linking the user's code with the processor library. Finally, the processor is executed. SINDA'85/FLUINT program features include up to 20,000 nodes, 100,000 conductors, 100 thermal submodels, and 10 fluid submodels. SINDA'85/FLUINT can also model two phase flow, capillary devices, user defined fluids, gravity and acceleration body forces on a fluid, and variable volumes. SINDA'85/FLUINT offers the following numerical solution techniques. The finite-difference formulation of the explicit method is the forward-difference explicit approximation. The formulation of the implicit method is the Crank-Nicolson approximation. The program allows simulation of non-uniform heating and facilitates modeling thin-walled heat exchangers. The ability to model non-equilibrium behavior within two-phase volumes is included. Recent improvements to the program were made in modeling real evaporator-pumps and other capillary-assist evaporators. SINDA'85/FLUINT is available by license for a period of ten (10) years to approved licensees. The licensed program product includes the source code and one copy of the supporting documentation. Additional copies of the documentation may be purchased separately at any time. SINDA'85/FLUINT is written in FORTRAN 77. Version 2.3 has been implemented on Cray series computers running UNICOS, CONVEX computers running CONVEX OS, and DEC RISC computers running ULTRIX. Binaries are included with the Cray version only. The Cray version of SINDA'85/FLUINT also contains SINGE, an additional graphics program developed at Johnson Space Center. Both source and executable code are provided for SINGE. Users wishing to create their own SINGE executable will also need the NASA Device Independent Graphics Library (NASADIG, previously known as SMDDIG; UNIX version, MSC-22001).
The Cray and CONVEX versions of SINDA'85/FLUINT are available on 9-track 1600 BPI UNIX tar format magnetic tapes. The CONVEX version is also available on a .25 inch streaming magnetic tape cartridge in UNIX tar format. The DEC RISC ULTRIX version is available on a TK50 magnetic tape cartridge in UNIX tar format. SINDA was developed in 1971, and first had fluid capability added in 1975. SINDA'85/FLUINT version 2.3 was released in 1990.
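The lumped-parameter, forward-difference explicit scheme named above can be illustrated generically (in Python, not SINDA's own input language): each node carries a capacitance C_i, conductors G_ij couple node pairs, and temperatures advance as T_i(t+dt) = T_i(t) + (dt/C_i)(Q_i + sum_j G_ij (T_j - T_i)).

# Generic explicit thermal-network sketch; node values and loads below
# are illustrative, not from any SINDA model.
import numpy as np

def step_explicit(T, C, conductors, Q, dt):
    # T: node temperatures; C: capacitances; conductors: (i, j, G) tuples;
    # Q: applied heat loads. Advance one forward-difference step.
    dTdt = Q / C
    for i, j, G in conductors:
        dTdt[i] += G * (T[j] - T[i]) / C[i]
        dTdt[j] += G * (T[i] - T[j]) / C[j]
    return T + dt * dTdt

# Two-node example: a heated node coupled to a large (near-sink) node.
T = np.array([300.0, 300.0])
C = np.array([50.0, 1e6])
for _ in range(1000):
    T = step_explicit(T, C, [(0, 1, 2.0)], np.array([10.0, 0.0]), dt=0.1)
print(T)  # node 0 approaches 300 + Q/G = 305 K

As with any explicit scheme, the step dt must stay below the smallest node time constant (roughly C_i divided by the sum of its conductances) for stability, which is why implicit options such as Crank-Nicolson are also offered.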
The SCALE Verified, Archived Library of Inputs and Data - VALID
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Rearden, Bradley T
The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, experiments from areas beyond criticality safety, such as reactor physics and shielding, and application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.
CHEMKIN2. General Gas-Phase Chemical Kinetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rupley, F.M.
1992-01-24
CHEMKIN is a high-level tool for chemists to describe arbitrary gas-phase chemical reaction mechanisms and systems of governing equations. It remains, however, for the user to select and implement a solution method; this is not provided. CHEMKIN consists of two major components: the Interpreter and the Gas-Phase Subroutine Library. The Interpreter reads a symbolic description of an arbitrary, user-specified chemical reaction mechanism. A data file is generated which forms a link to the Gas-Phase Subroutine Library, a collection of about 200 modular subroutines which may be called to return thermodynamic properties, chemical production rates, derivatives of thermodynamic properties, derivatives of chemical production rates, or sensitivity parameters. Both single and double precision versions of CHEMKIN are included. Also provided is a set of FORTRAN subroutines for evaluating gas-phase transport properties such as thermal conductivities, viscosities, and diffusion coefficients. These properties are an important part of any computational simulation of a chemically reacting flow. The transport property subroutines are designed to be used in conjunction with the CHEMKIN Subroutine Library. The transport properties depend on the state of the gas and on certain molecular parameters. The parameters considered are the Lennard-Jones potential well depth and collision diameter, the dipole moment, the polarizability, and the rotational relaxation collision number.
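The kinds of quantities the Gas-Phase Subroutine Library returns can be illustrated generically (this is not the CHEMKIN API): a rate constant in the modified Arrhenius form k = A T^b exp(-Ea/RT), and species production rates from the law of mass action.

# Generic illustration, not the CHEMKIN API; mechanism and parameter
# values are toy examples only.
import math

R = 8.314  # J/(mol K)

def arrhenius(A, b, Ea, T):
    # Modified Arrhenius rate constant k = A * T**b * exp(-Ea / (R*T)).
    return A * T**b * math.exp(-Ea / (R * T))

def production_rates(k, conc, reactants, products):
    # Net molar production rate of each species for one irreversible
    # reaction, from the law of mass action. conc: {species: concentration}.
    rate = k
    for sp, nu in reactants.items():
        rate *= conc[sp] ** nu
    wdot = {sp: -nu * rate for sp, nu in reactants.items()}
    for sp, nu in products.items():
        wdot[sp] = wdot.get(sp, 0.0) + nu * rate
    return wdot

# Toy example: H2 + O2 -> 2 OH at 1500 K (illustrative parameters only).
k = arrhenius(A=1.7e10, b=0.0, Ea=2.0e5, T=1500.0)
print(production_rates(k, {"H2": 0.5, "O2": 0.5}, {"H2": 1, "O2": 1}, {"OH": 2}))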
ERIC Educational Resources Information Center
BIVONA, WILLIAM A.
This report presents an analysis of over eighteen small, intermediate, and large scale systems for the selective dissemination of information (SDI). Systems are compared and analyzed with respect to design criteria and the following nine system parameters: (1) information input, (2) methods of indexing and abstracting, (3) user interest profile…
ERIC Educational Resources Information Center
Witt, Steven
2013-01-01
Amid growing isolationism after World War I, the American Library Association transferred its wartime programs to train librarians in Europe and promote the American model of public libraries. Working in collaboration with American philanthropists and members of the French library community, ALA established a permanent library school in Paris that…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banerjee, Kaushik; Clarity, Justin B; Cumberland, Riley M
This will be licensed via RSICC. A new, integrated data and analysis system has been designed to simplify and automate the performance of accurate and efficient evaluations for characterizing the input to the overall nuclear waste management system: the UNF-Storage, Transportation & Disposal Analysis Resource and Data System (UNF-ST&DARDS). A relational database within UNF-ST&DARDS provides a standard means by which UNF-ST&DARDS can succinctly store and retrieve modeling and simulation (M&S) parameters for specific spent nuclear fuel analyses. A library of analysis model templates provides the ability to communicate each set of M&S parameters to the most appropriate M&S application. Interactive visualization capabilities facilitate data analysis and results interpretation. UNF-ST&DARDS current analysis capabilities include (1) assembly-specific depletion and decay, and (2) spent nuclear fuel cask-specific criticality and shielding. Currently, UNF-ST&DARDS uses the SCALE nuclear analysis code system for performing nuclear analysis.
Jones, Ryan J. R.; Shinde, Aniketa; Guevarra, Dan; ...
2015-01-05
Many energy technologies require electrochemical stability or preactivation of functional materials. Due to the long experiment duration required for either electrochemical preactivation or evaluation of operational stability, parallel screening is required to enable high-throughput experimentation. We found that imposing operational electrochemical conditions on a library of materials in parallel creates several opportunities for experimental artifacts. We discuss the electrochemical engineering principles and operational parameters that mitigate artifacts in the parallel electrochemical treatment system. We also demonstrate the effects of resistive losses within the planar working electrode through a combination of finite element modeling and illustrative experiments. Operation of the parallel-plate, membrane-separated electrochemical treatment system is demonstrated by exposing a composition library of mixed metal oxides to oxygen evolution conditions in 1 M sulfuric acid for 2 h. This application is particularly important because the electrolysis and photoelectrolysis of water are promising future energy technologies inhibited by the lack of highly active, acid-stable catalysts containing only earth-abundant elements.
Parameter estimation and prediction for the course of a single epidemic outbreak of a plant disease.
Kleczkowski, A; Gilligan, C A
2007-10-22
Many epidemics of plant diseases are characterized by large variability among individual outbreaks. However, individual epidemics often follow a well-defined trajectory which is much more predictable in the short term than the ensemble (collection) of potential epidemics. In this paper, we introduce a modelling framework that allows us to deal with individual replicated outbreaks, based upon a Bayesian hierarchical analysis. Information about 'similar' replicate epidemics can be incorporated into a hierarchical model, allowing both ensemble and individual parameters to be estimated. The model is used to analyse the data from a replicated experiment involving spread of Rhizoctonia solani on radish in the presence or absence of a biocontrol agent, Trichoderma viride. The rate of primary (soil-to-plant) infection is found to be the most variable factor determining the final size of epidemics. Breakdown of biological control in some replicates results in high levels of primary infection and increased variability. The model can be used to predict new outbreaks of disease based upon knowledge from a 'library' of previous epidemics and partial information about the current outbreak. We show that forecasting improves significantly with knowledge about the history of a particular epidemic, whereas the precision of hindcasting to identify the past course of the epidemic is largely independent of detailed knowledge of the epidemic trajectory. The results have important consequences for parameter estimation, inference and prediction for emerging epidemic outbreaks.
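A sketch of the deterministic skeleton of such an epidemic model, with illustrative parameter values: plants become infected through primary (soil-to-plant) infection at rate rp and secondary (plant-to-plant) infection at rate rs, so dI/dt = (rp + rs I)(N - I); drawing rp per replicate from an ensemble distribution mimics the hierarchical structure described above.

# Illustrative sketch only; rate values and distributions are assumptions,
# not the paper's fitted estimates.
import numpy as np
from scipy.integrate import odeint

def epidemic(I, t, rp, rs, N):
    # Primary (soil-to-plant) plus secondary (plant-to-plant) infection.
    return (rp + rs * I) * (N - I)

N, t = 50.0, np.linspace(0, 30, 200)
rng = np.random.default_rng(1)
for rep in range(5):                              # five replicate outbreaks
    rp = rng.lognormal(mean=-4.0, sigma=0.8)      # replicate-level primary rate
    I = odeint(epidemic, 0.1, t, args=(rp, 0.01, N))
    print("replicate %d: final size %.1f of %.0f plants" % (rep, I[-1, 0], N))

Because rp varies across replicates, the simulated final sizes spread widely, reproducing the paper's observation that variability in primary infection dominates the variability of outbreak sizes.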
Computer Center CDC Libraries/NSRD (Subprograms).
1984-06-01
Estimates of the atmospheric parameters of M-type stars: a machine-learning perspective
NASA Astrophysics Data System (ADS)
Sarro, L. M.; Ordieres-Meré, J.; Bello-García, A.; González-Marcos, A.; Solano, E.
2018-05-01
Estimating the atmospheric parameters of M-type stars has been a difficult task due to the lack of simple diagnostics in the stellar spectra. We aim at uncovering good sets of predictive features of stellar atmospheric parameters (Teff, log(g), [M/H]) in spectra of M-type stars. We define two types of potential features (equivalent widths and integrated flux ratios) able to explain the atmospheric physical parameters. We search the space of feature sets using a genetic algorithm that evaluates solutions by their prediction performance in the framework of the BT-Settl library of stellar spectra. Thereafter, we construct eight regression models using different machine-learning techniques and compare their performances with those obtained using the classical χ2 approach and independent component analysis (ICA) coefficients. Finally, we validate the various alternatives using two sets of real spectra from the NASA Infrared Telescope Facility (IRTF) and Dwarf Archives collections. We find that the cross-validation errors are poor measures of the performance of regression models in the context of physical parameter prediction in M-type stars. For R ~ 2000 spectra with signal-to-noise ratios typical of the IRTF and Dwarf Archives, feature selection with genetic algorithms or alternative techniques produces only marginal advantages with respect to representation spaces that are unconstrained in wavelength (full spectrum or ICA). We make available the atmospheric parameters for the two collections of observed spectra as online material.
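The second feature type, an integrated flux ratio, can be sketched as follows; the band edges and the choice of regressor are illustrative and not the paper's optimized solution.

# Sketch of integrated flux ratios as predictive features; band edges
# (in microns) and the regressor are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def flux_ratio(wave, flux, band_a, band_b):
    # Flux integrated over band_a divided by flux integrated over band_b.
    in_a = (wave >= band_a[0]) & (wave <= band_a[1])
    in_b = (wave >= band_b[0]) & (wave <= band_b[1])
    return np.trapz(flux[in_a], wave[in_a]) / np.trapz(flux[in_b], wave[in_b])

def features(wave, flux):
    bands = [((0.84, 0.88), (1.00, 1.04)), ((1.10, 1.14), (1.30, 1.34))]
    return [flux_ratio(wave, flux, a, b) for a, b in bands]

# Train on synthetic-library spectra with known Teff, predict for observed ones:
# X = [features(wave, f) for f in library_fluxes]; y = library_teff
# model = RandomForestRegressor(n_estimators=200).fit(X, y)
# teff = model.predict([features(wave, observed_flux)])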
Balancing novelty with confined chemical space in modern drug discovery.
Medina-Franco, José L; Martinez-Mayorga, Karina; Meurice, Nathalie
2014-02-01
The concept of chemical space has broad applications in drug discovery. In response to the needs of drug discovery campaigns, different approaches are followed to efficiently populate, mine and select relevant chemical spaces that overlap with biologically relevant chemical spaces. This paper reviews major trends in current drug discovery and their impact on the mining and population of chemical space. We also survey different approaches to develop screening libraries with confined chemical spaces balancing physicochemical properties. In this context, the confinement is guided by criteria that can be divided in two broad categories: i) library design focused on a relevant therapeutic target or disease and ii) library design focused on the chemistry or a desired molecular function. The design and development of chemical libraries should be associated with the specific purpose of the library and the project goals. The high complexity of drug discovery and the inherent imperfection of individual experimental and computational technologies prompt the integration of complementary library design and screening approaches to expedite the identification of new and better drugs. Library design approaches including diversity-oriented synthesis, biological-oriented synthesis or combinatorial library design, to name a few, and the design of focused libraries driven by target/disease, chemical structure or molecular function are more efficient if they are guided by multi-parameter optimization. In this context, consideration of pharmaceutically relevant properties is essential for balancing novelty with chemical space in drug discovery.
Star clusters: age, metallicity and extinction from integrated spectra
NASA Astrophysics Data System (ADS)
González Delgado, Rosa M.; Cid Fernandes, Roberto
2010-01-01
Integrated optical spectra of star clusters in the Magellanic Clouds and a few Galactic globular clusters are fitted using high-resolution spectral models for single stellar populations. The goal is to estimate the age, metallicity and extinction of the clusters, and to evaluate the degeneracies among these parameters. Several sets of evolutionary models, computed with recent high-spectral-resolution stellar libraries (MILES, GRANADA, STELIB), are used as inputs to the starlight code to perform the fits. The comparison of the results derived from this method with previous estimates available in the literature allows us to evaluate the pros and cons of each set of models in determining star cluster properties. In addition, we quantify the uncertainties in the age, metallicity and extinction determinations resulting from variance in the ingredients of the analysis.
Hubble's Next Generation Spectral Library
NASA Astrophysics Data System (ADS)
Heap, Sara R.; Lindler, D.
2008-03-01
Spectroscopic surveys of galaxies at z ≳ 1 bring the rest-frame ultraviolet into view of large, ground-based telescopes. This spectral region is rich in diagnostics, but these diagnostics have not yet been calibrated in terms of the properties of the responsible stellar population(s). Such calibrations are now possible with Hubble's Next Generation Spectral Library (NGSL). This library contains UV-optical spectra (0.2-1.0 microns) of 378 stars having a wide range in temperature, luminosity, and metallicity. We have derived the basic stellar parameters from the optical spectral region (0.35-1.0 microns) and are using them to calibrate UV spectral diagnostic indices and colors.
PeTTSy: a computational tool for perturbation analysis of complex systems biology models.
Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A
2016-03-10
Over the last decade sensitivity analysis techniques have been shown to be very useful for analysing complex and high-dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines the sensitivity of models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix, it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems. It allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
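PeTTSy itself is a MATLAB package; as a language-neutral illustration of the kind of experiment it automates, the sketch below applies a temporary perturbation to one parameter of a Van der Pol oscillator (a stand-in model, not one from the paper) and measures the persistent phase shift it leaves behind. All model choices and numbers are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

def vdp(t, y, mu, bump, t0, t1):
    # Van der Pol oscillator; parameter mu receives a temporary additive
    # bump on the window [t0, t1], mimicking a transient perturbation.
    m = mu + (bump if t0 <= t <= t1 else 0.0)
    return [y[1], m * (1.0 - y[0] ** 2) * y[1] - y[0]]

def first_upcrossing(bump, t_ref=150.0):
    # Integrate well past the perturbation window and locate the first
    # upward zero crossing after t_ref; its shift measures the phase change.
    sol = solve_ivp(vdp, (0.0, 200.0), [2.0, 0.0],
                    args=(1.0, bump, 50.0, 60.0),
                    dense_output=True, max_step=0.05)
    t = np.linspace(t_ref, 200.0, 20000)
    x = sol.sol(t)[0]
    ups = t[1:][(x[:-1] < 0.0) & (x[1:] >= 0.0)]
    return ups[0]

base = first_upcrossing(0.0)
for b in (0.2, 0.5):
    shift = first_upcrossing(b) - base
    print(f"bump {b:+.1f} on t in [50, 60]: phase shift = {shift:+.4f}")
```

Because the perturbation is temporary, the trajectory relaxes back to the same limit cycle, so the lasting effect shows up in phase rather than period, which is why phase is one of the output types named above.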
Great Service Pays: A Model for Service Delivery in an Academic Music Library
ERIC Educational Resources Information Center
Wilson, Andrew M.
2007-01-01
Special-subject libraries can be particularly intimidating for casual and seasoned patrons alike. Music libraries, with their variety of materials, formats, and vocabularies, can present particular challenges for the user. With this proposal for a model of service delivery, as well as many tips gained through experience working the front-of-house…
Measuring Levels of End-Users' Acceptance and Use of Hybrid Library Services
ERIC Educational Resources Information Center
Tibenderana, Prisca; Ogao, Patrick; Ikoja-Odongo, J.; Wokadala, James
2010-01-01
This study concerns the adoption of Information Communication Technology (ICT) services in libraries. The study collected 445 usable data from university library end-users using a cross-sectional survey instrument. It develops, applies and tests a research model of acceptance and use of such services based on an existing UTAUT model by Venkatesh,…
ERIC Educational Resources Information Center
Chen, Ching-chih; Hernon, Peter
This publication, extracted from a full study report, summarizes the development and utilization of an assessment model for the effectiveness of library and non-library network delivery of consumer information. Consumer information is defined as that needed by the general public to resolve problems within the family or household. The network…
Designing Public Library Websites for Teens: A Conceptual Model
ERIC Educational Resources Information Center
Naughton, Robin Amanda
2012-01-01
The main goal of this research study was to develop a conceptual model for the design of public library websites for teens (TLWs) that would enable designers and librarians to create library websites that better suit teens' information needs and practices. It bridges a gap in the research literature between user interface design in human-computer…
EPA EXPOSURE MODELS LIBRARY AND INTEGRATED MODEL EVALUATION SYSTEM
The third edition of the U.S. Environmental Protection Agency's (EPA) EML/IMES (Exposure Models Library and Integrated Model Evaluation System) on CD-ROM is now available. The purpose of the disc is to provide a compact and efficient means to distribute exposure models, documentat...
A standard library for modeling satellite orbits on a microcomputer
NASA Astrophysics Data System (ADS)
Beutel, Kenneth L.
1988-03-01
Introductory students of astrodynamics and the space environment are required to have a fundamental understanding of the kinematic behavior of satellite orbits. This thesis develops a standard library that contains the basic formulas for modeling earth orbiting satellites. This library is used as a basis for implementing a satellite motion simulator that can be used to demonstrate orbital phenomena in the classroom. Surveyed are the equations of orbital elements, coordinate systems and analytic formulas, which are made into a standard method for modeling earth orbiting satellites. The standard library is written in the C programming language and is designed to be highly portable between a variety of computer environments. The simulation draws heavily on the standards established by the library to produce a graphics-based orbit simulation program written for the Apple Macintosh computer. The simulation demonstrates the utility of the standard library functions but, because of its extensive use of the Macintosh user interface, is not portable to other operating systems.
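The library's C source is not reproduced in the abstract; as a sketch of the sort of basic kinematic routine such a standard library collects, the following Python version solves Kepler's equation M = E - e*sin(E) by Newton iteration and returns the in-plane position. The gravitational parameter is standard; the orbit values are illustrative.

```python
import math

MU_EARTH = 398600.4418  # Earth's gravitational parameter (km^3/s^2)

def kepler_E(M, e, tol=1e-12):
    """Eccentric anomaly from mean anomaly via Newton's method."""
    E = M if e < 0.8 else math.pi  # standard starting guess
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def position(a, e, t):
    """Perifocal (x, y) in km at t seconds after perigee passage."""
    n = math.sqrt(MU_EARTH / a ** 3)          # mean motion (rad/s)
    E = kepler_E(n * t % (2.0 * math.pi), e)  # mean anomaly, wrapped
    return a * (math.cos(E) - e), a * math.sqrt(1.0 - e ** 2) * math.sin(E)

print(position(7000.0, 0.01, 1800.0))  # a LEO example, 30 min after perigee
```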
Intelligent Libraries and Apomediators: Distinguishing between Library 3.0 and Library 2.0
ERIC Educational Resources Information Center
Kwanya, Tom; Stilwell, Christine; Underwood, Peter G.
2013-01-01
Using the "point oh" naming system for developments in librarianship is attracting debate about its appropriateness, basis and syntax and about the meaning and potential of Library 2.0. Now a new term, Library 3.0, has emerged. Is there any significant difference between the two models? Using documentary analysis to explore the terms, the…
Automatic design optimization tool for passive structural control systems
NASA Astrophysics Data System (ADS)
Mojolic, Cristian; Hulea, Radu; Parv, Bianca Roxana
2017-07-01
The present paper proposes an automatic dynamic process for finding the parameters of seismic isolation systems applied to large-span structures. Three seismic isolation solutions are proposed for the model of the new Slatina Sport Hall: the first case uses a friction pendulum (FP) system, the second uses High Damping Rubber Bearings (HDRB), and Lead Rubber Bearings (LRB) are used for the last case of isolation. The isolation level is placed at the top end of the roof-supporting columns. The aim is to calculate the parameters of each isolation system so that the structure's first vibration period is the one desired by the user. The model is computed with the use of SAP2000 software. In order to find the best solution for the optimization problem, an optimization process based on Genetic Algorithms (GA) has been developed in Matlab. With the use of the API (Application Programming Interface) libraries, a two-way link is created between the two programs in order to exchange results and link parameters. The main goal is to find the best seismic isolation method for each desired modal period so that the bending moment in the supporting columns is minimized.
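The actual workflow couples Matlab's GA to SAP2000 modal analysis through the API; the sketch below keeps only the GA skeleton and replaces the finite-element call with a one-mass analytic period T = 2*pi*sqrt(m/k). The mass, target period, bounds and GA settings are illustrative assumptions, not values from the paper.

```python
import math
import random

M = 5.0e5       # effective modal mass (kg), assumed
T_TARGET = 3.0  # desired first-mode period (s), assumed

def period(k):
    # Stand-in for the SAP2000 modal analysis: single-DOF period.
    return 2.0 * math.pi * math.sqrt(M / k)

def fitness(k):
    return -abs(period(k) - T_TARGET)  # closer to the target is fitter

random.seed(7)
pop = [random.uniform(1e5, 1e7) for _ in range(30)]  # isolator stiffness (N/m)
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                               # elitist selection
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        child = 0.5 * (a + b) * random.gauss(1.0, 0.1)  # crossover + mutation
        children.append(min(max(child, 1e5), 1e7))      # keep within bounds
    pop = parents + children

best = max(pop, key=fitness)
print(f"best k = {best:.3e} N/m, period = {period(best):.3f} s")
```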
Fantini, Marco; Pandolfini, Luca; Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Terrigno, Marco; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino
2017-01-01
Antibody libraries are important resources to derive antibodies for a wide range of applications, from structural and functional studies to intracellular protein interference studies to developing new diagnostics and therapeutics. Whatever the goal, the key parameter for an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody against a given antigen of sufficiently high affinity. Quantitative evaluation of antibody library complexity and quality has long been inadequately addressed, due to the high similarity and length of the sequences of the library. Complexity was usually inferred from the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sampling is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle antibody library complexity and quality assessment. However, much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable antibody library complexity estimate, we show here a new, PCR-free, NGS approach to sequencing antibody libraries on the Illumina platform, coupled to a new bioinformatic analysis and software (Diversity Estimator of Antibody Library, DEAL) that allows the complexity to be reliably estimated, taking the sequencing error into consideration.
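DEAL's algorithm is not reproduced here; as a minimal, self-contained illustration of the abstract's central point, that observed diversity does not scale linearly with sample size, the sketch below applies the classic Chao1 richness estimator to synthetic clone counts. The library size, read depths and clone names are all made up.

```python
import random
from collections import Counter

def chao1(counts):
    """Bias-corrected Chao1 lower bound on richness from clone-size counts."""
    f1 = sum(1 for c in counts if c == 1)  # clones seen exactly once
    f2 = sum(1 for c in counts if c == 2)  # clones seen exactly twice
    return len(counts) + f1 * (f1 - 1) / (2.0 * (f2 + 1))

random.seed(0)
library = [f"clone{i}" for i in range(100000)]  # true complexity: 1e5
for n in (1_000, 10_000, 100_000, 500_000):
    sample = Counter(random.choices(library, k=n))  # n sequencing reads
    est = chao1(list(sample.values()))
    print(f"reads={n:>7}: observed={len(sample):>6}, chao1={est:9.0f}")
```

With shallow sampling the observed count badly underestimates the true complexity, which is exactly why transformation efficiency plus a few hundred sequenced clones is described above as rudimentary.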
Centralization vs. Decentralization: A Location Analysis Approach for Librarians
ERIC Educational Resources Information Center
Raffel, Jeffrey; Shishko, Robert
1972-01-01
An application of location theory to the question of centralized versus decentralized library facilities for a university, with relevance for special libraries is presented. The analysis provides models for a single library, for two or more libraries, or for decentralized facilities. (6 references) (Author/NH)
NASA Astrophysics Data System (ADS)
Holmes, Jesse Curtis
Nuclear data libraries provide fundamental reaction information required by nuclear system simulation codes. The inclusion of data covariances in these libraries allows the user to assess uncertainties in system response parameters as a function of uncertainties in the nuclear data. Formats and procedures are currently established for representing covariances for various types of reaction data in ENDF libraries. This covariance data is typically generated utilizing experimental measurements and empirical models, consistent with the method of parent data production. However, ENDF File 7 thermal neutron scattering library data is, by convention, produced theoretically through fundamental scattering physics model calculations. Currently, there is no published covariance data for ENDF File 7 thermal libraries. Furthermore, no accepted methodology exists for quantifying or representing uncertainty information associated with this thermal library data. The quality of thermal neutron inelastic scattering cross section data can be of high importance in reactor analysis and criticality safety applications. These cross sections depend on the material's structure and dynamics. The double-differential scattering law, S(alpha, beta), tabulated in ENDF File 7 libraries contains this information. For crystalline solids, S(alpha, beta) is primarily a function of the material's phonon density of states (DOS). Published ENDF File 7 libraries are commonly produced by calculation and processing codes, such as the LEAPR module of NJOY, which utilize the phonon DOS as the fundamental input for inelastic scattering calculations to directly output an S(alpha, beta) matrix. To determine covariances for the S(alpha, beta) data generated by this process, information about uncertainties in the DOS is required. The phonon DOS may be viewed as a probability density function of atomic vibrational energy states that exist in a material. Probable variation in the shape of this spectrum may be established that depends on uncertainties in the physics models and methodology employed to produce the DOS. Through Monte Carlo sampling of perturbations from the reference phonon spectrum, an S(alpha, beta) covariance matrix may be generated. In this work, density functional theory and lattice dynamics in the harmonic approximation are used to calculate the phonon DOS for hexagonal crystalline graphite. This form of graphite is used as an example material for the purpose of demonstrating procedures for analyzing, calculating and processing thermal neutron inelastic scattering uncertainty information. Several sources of uncertainty in thermal neutron inelastic scattering calculations are examined, including sources which cannot be directly characterized through a description of the phonon DOS uncertainty, and their impacts are evaluated. Covariances for hexagonal crystalline graphite S(alpha, beta) data are quantified by coupling the standard methodology of LEAPR with a Monte Carlo sampling process. The mechanics of efficiently representing and processing this covariance information is also examined. Finally, with appropriate sensitivity information, it is shown that an S(alpha, beta) covariance matrix can be propagated to generate covariance data for integrated cross sections, secondary energy distributions, and coupled energy-angle distributions. This approach enables a complete description of thermal neutron inelastic scattering cross section uncertainties which may be employed to improve the simulation of nuclear systems.
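A schematic sketch of the covariance-generation step described above follows. The real chain runs perturbed phonon spectra through LEAPR/NJOY; here a toy observable (energy moments of the DOS) stands in for the S(alpha, beta) calculation, so only the Monte Carlo bookkeeping is faithful. The grid, reference spectrum and perturbation sizes are all assumed.

```python
import numpy as np

rng = np.random.default_rng(1)
E = np.linspace(0.001, 0.2, 50)  # phonon energy grid (eV), assumed
dE = E[1] - E[0]
dos_ref = E ** 2                  # Debye-like toy reference DOS
dos_ref /= dos_ref.sum() * dE     # normalize to unit area

def observable(dos):
    # Stand-in for the LEAPR step: a few energy moments of the DOS take
    # the place of the full S(alpha, beta) matrix.
    return np.array([(dos * E ** k).sum() * dE for k in (1, 2, 3, 4)])

samples = []
for _ in range(2000):
    # Smooth, correlated perturbation of the spectrum shape, renormalized.
    raw = rng.normal(0.0, 0.05, E.size)
    noise = np.convolve(raw, np.ones(8) / 8.0, mode="same")
    dos = np.clip(dos_ref * (1.0 + noise), 0.0, None)
    dos /= dos.sum() * dE
    samples.append(observable(dos))

cov = np.cov(np.array(samples).T)  # covariance of the derived quantities
sd = np.sqrt(cov.diagonal())
print(np.round(cov / np.outer(sd, sd), 3))  # corresponding correlations
```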
NASA Astrophysics Data System (ADS)
Sicot, G.; Lennon, M.; Miegebielle, V.; Dubucq, D.
2015-08-01
The thickness and the emulsion rate of an oil spill are two key parameters for designing a tailored response to an oil discharge. If estimated on a per-pixel basis at high spatial resolution, the oil thickness allows the volume of pollutant to be estimated; that volume is needed to evaluate the magnitude of the pollution and to determine the most suitable recovery means. The estimation of the spatial distribution of thicknesses also allows the guidance of recovery means at sea. The emulsion rate can guide the strategy adopted to deal with an offshore oil spill: the efficiency of dispersants, for example, is not identical on pure oil and on an emulsion. Moreover, the thickness and emulsion rate together allow the amount of oil that has been discharged to be estimated. The shape of the reflectance spectrum of oil in the SWIR range (1000-2500 nm) varies with the emulsion rate and the layer thickness. That shape still varies when the oil layer reaches a few millimetres, which is not the case in the visible range (400-700 nm), where the spectral variation saturates around 200 μm (the upper limit of the Bonn Agreement oil appearance code). In that context, hyperspectral imagery in the SWIR range shows high potential for describing and characterizing oil spills. Previous methods for estimating these two parameters are based on the use of a spectral library. In this paper, we present a method based on the inversion of a simple radiative transfer model of the oil layer. We show that the proposed method is robust against another parameter that affects the reflectance spectrum: the size of water droplets in the emulsion. The method shows relevant results on measurements made in the laboratory, equivalent to those obtained with methods based on a spectral library. The method has the advantage of removing the need for a spectral library and of providing per-pixel maps of thickness and emulsion rate. The maps obtained are not composed of regions of thickness ranges, such as those obtained using discretized levels of measurements in the spectral library, or maps made from visual observations following the Bonn Agreement oil appearance code.
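The paper's radiative transfer model is not given in the abstract, so the sketch below only shows the per-pixel inversion pattern, with a deliberately crude two-parameter stand-in (Beer-Lambert attenuation through the layer plus linear mixing of absorbers with emulsion rate). The spectra, absorption curves and all numbers are synthetic.

```python
import numpy as np
from scipy.optimize import least_squares

wl = np.linspace(1.0, 2.5, 60)               # SWIR wavelengths (um), assumed
k_oil = 0.8 + 0.5 * wl                       # toy oil absorption (1/mm)
k_water = 2.0 + 2.5 * wl                     # toy water absorption (1/mm)
r_bg, r_film = 0.02, 0.35                    # toy background/film reflectances

def model(thick_mm, emul):
    # Crude stand-in: emulsion mixes the absorbers, the layer attenuates.
    k = (1.0 - emul) * k_oil + emul * k_water
    return r_bg + (r_film - r_bg) * np.exp(-2.0 * k * thick_mm)

rng = np.random.default_rng(0)
truth = (1.5, 0.4)                           # 1.5 mm layer, 40% emulsion
pixel = model(*truth) + rng.normal(0.0, 0.002, wl.size)  # one noisy pixel

fit = least_squares(lambda p: model(p[0], p[1]) - pixel,
                    x0=[0.5, 0.2], bounds=([0.0, 0.0], [10.0, 1.0]))
print("estimated thickness (mm), emulsion rate:", np.round(fit.x, 3))
```

Run over every pixel, the same inversion yields the continuous thickness and emulsion-rate maps the paper describes, with no spectral library involved.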
Design, fabrication and characterization of a poly-silicon PN junction
NASA Astrophysics Data System (ADS)
Tower, Jason D.
This thesis details the design, fabrication, and characterization of a PN junction formed from p-type mono-crystalline silicon and n-type poly-crystalline silicon. The primary product of this project was a library of standard operating procedures (SOPs) for the fabrication of such devices, laying the foundations for future work and the development of a class in fabrication processes. The fabricated PN junction was characterized; in particular its current-voltage relationship was measured and fit to models. This characterization was to determine whether or not the fabrication process could produce working PN junctions with acceptable operational parameters.
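The thesis's measured I-V data are not available here, but the fitting step it describes follows a standard pattern: extract the saturation current and ideality factor from the Shockley diode equation I = Is*(exp(V/(n*VT)) - 1). The sketch below does this on synthetic data, fitting in log space to keep the exponential well conditioned; all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

VT = 0.02585  # thermal voltage at ~300 K (V)

def diode(v, i_s, n):
    # Ideal-diode (Shockley) equation.
    return i_s * (np.exp(v / (n * VT)) - 1.0)

# Synthetic "measurement" standing in for the fabricated junction's data.
rng = np.random.default_rng(3)
v = np.linspace(0.2, 0.7, 25)
i_meas = diode(v, 1e-9, 1.8) * rng.normal(1.0, 0.05, v.size)

def log_diode(v, log10_is, n):
    # log10(I) is nearly linear in V for V >> VT, which stabilizes the fit.
    return log10_is + v / (n * VT * np.log(10.0))

(log10_is, n_fit), _ = curve_fit(log_diode, v, np.log10(i_meas), p0=(-12.0, 1.5))
print(f"Is = {10 ** log10_is:.2e} A, ideality factor n = {n_fit:.2f}")
```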
compuGUT: An in silico platform for simulating intestinal fermentation
NASA Astrophysics Data System (ADS)
Moorthy, Arun S.; Eberl, Hermann J.
The microbiota inhabiting the colon and their effect on health are a topic of significant interest. In this paper, we describe the compuGUT, a simulation tool developed to assist in exploring interactions between intestinal microbiota and their environment. The primary numerical machinery is implemented in C, and the accessory scripts for loading and visualization are prepared in bash (LINUX) and R. SUNDIALS libraries are employed for numerical integration, and the googleVis API for interactive visualization. Supplementary material includes a concise description of the underlying mathematical model and a detailed characterization of the numerical errors and computing times associated with implementation parameters.
Viscoelastic material inversion using Sierra-SD and ROL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walsh, Timothy; Aquino, Wilkins; Ridzal, Denis
2014-11-01
In this report we derive frequency-domain methods for inverse characterization of the constitutive parameters of viscoelastic materials. The inverse problem is cast in a PDE-constrained optimization framework with efficient computation of gradients and Hessian vector products through matrix free operations. The abstract optimization operators for first and second derivatives are derived from first principles. Various methods from the Rapid Optimization Library (ROL) are tested on the viscoelastic inversion problem. The methods described herein are applied to compute the viscoelastic bulk and shear moduli of a foam block model, which was recently used in experimental testing for viscoelastic property characterization.
Sathe, Nila A; Lee, Patricia; Giuse, Nunzia Bettinsoli
2004-10-01
Observation and immersion in the user community are critical factors in designing and implementing informatics solutions; such practices ensure relevant interventions and promote user acceptance. Libraries can adapt these strategies to developing instruction and outreach. While needs assessment is typically a core facet of library instruction, sustained, iterative assessment underlying the development of user-centered instruction is key to integrating resource use into the workflow. This paper describes the Eskind Biomedical Library's (EBL's) recent work with the Tennessee public health community to articulate a training model centered around developing power information users (PIUs). PIUs are community-based individuals with an advanced understanding of information seeking and resource use and are committed to championing information integration. As model development was informed by observation of PIU workflow and information needs, it also allowed for informal testing of the applicability of assessment via domain immersion in library outreach. Though the number of PIUs involved in the project was small, evaluation indicated that the model was useful for promoting information use in PIU workgroups and that the concept of domain immersion was relevant to library-related projects. Moreover, EBL continues to employ principles of domain understanding inherent in the PIU model to develop further interventions for the public health community and library users.
NASA Technical Reports Server (NTRS)
McComas, David
2013-01-01
The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter orders, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the libraries to be maintained with the same strategy used in their initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments, like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.
SP_Ace: A new code to estimate T_eff, log g, and elemental abundances
NASA Astrophysics Data System (ADS)
Boeche, C.
2016-09-01
SP_Ace is a FORTRAN95 code that derives stellar parameters and elemental abundances from stellar spectra. To derive these parameters, SP_Ace neither measures equivalent widths of lines nor uses templates of synthetic spectra; instead, it employs a new method based on a library of General Curve-Of-Growths. To date SP_Ace works in the wavelength ranges 5212-6860 Å and 8400-8921 Å, at spectral resolutions R=2000-20000. Extensions of these limits are possible. SP_Ace is a highly automated code suitable for application to large spectroscopic surveys. A web front end to this service is publicly available at http://dc.g-vo.org/SP_ACE together with the library and the binary code.
The HST/STIS Next Generation Spectral Library
NASA Technical Reports Server (NTRS)
Gregg, M. D.; Silva, D.; Rayner, J.; Worthey, G.; Valdes, F.; Pickles, A.; Rose, J.; Carney, B.; Vacca, W.
2006-01-01
During Cycles 10, 12, and 13, we obtained STIS G230LB, G430L, and G750L spectra of 378 bright stars covering a wide range in abundance, effective temperature, and luminosity. This HST/STIS Next Generation Spectral Library was scheduled to reach its goal of 600 targets by the end of Cycle 13 when STIS came to an untimely end. Even at 2/3 complete, the library significantly improves the sampling of stellar atmosphere parameter space compared to most other spectral libraries by including the near-UV and significant numbers of metal poor and super-solar abundance stars. Numerous calibration challenges have been encountered, some expected, some not; these arise from the use of the E1 aperture location, non-standard wavelength calibration, and, most significantly, the serious contamination of the near-UV spectra by red light. Maximizing the utility of the library depends directly on overcoming or at least minimizing these problems, especially correcting the UV spectra.
Pauthenier, Cyrille; Faulon, Jean-Loup
2014-07-01
PrecisePrimer is web-based primer design software made to assist experimentalists in any repetitive primer design task, such as preparing, cloning and shuffling DNA libraries. Unlike other popular primer design tools, it is conceived to generate primer libraries with popular PCR polymerase buffers offered as pre-set options. PrecisePrimer is also meant to design primers in batches, such as for DNA library creation or DNA shuffling experiments, and to have the simplest interface possible. It integrates the most up-to-date melting temperature algorithms, validated with experimental data and cross-validated with other computational tools. We generated a library of primers for the extraction and cloning of 61 genes from a yeast genomic DNA extract using default parameters. All primer pairs efficiently amplified their target without any optimization of the PCR conditions. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
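PrecisePrimer's exact melting temperature algorithms are nearest-neighbor models too long to reproduce here; the two classic closed-form estimates below (which such tools refine with thermodynamic and buffer corrections) show the kind of calculation involved. The primer sequence is a placeholder, not one from the paper.

```python
def tm_wallace(seq):
    # Wallace rule: Tm = 2(A+T) + 4(G+C); a rough guide for short oligos.
    s = seq.upper()
    return 2 * (s.count("A") + s.count("T")) + 4 * (s.count("G") + s.count("C"))

def tm_gc(seq):
    # GC-content formula Tm = 64.9 + 41*(G+C-16.4)/N; no salt correction.
    s = seq.upper()
    return 64.9 + 41.0 * (s.count("G") + s.count("C") - 16.4) / len(s)

primer = "ATGGCTAGCTTGGTGCAAGT"  # placeholder 20-mer
print(f"{primer}: Wallace {tm_wallace(primer)} C, GC-based {tm_gc(primer):.1f} C")
```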
Raster graphics display library
NASA Technical Reports Server (NTRS)
Grimsrud, Anders; Stephenson, Michael B.
1987-01-01
The Raster Graphics Display Library (RGDL) is a high-level subroutine package that gives the advanced raster graphics display capabilities needed. The RGDL uses FORTRAN source code routines to build subroutines modular enough to use as stand-alone routines in a black-box type of environment. Six examples are presented which will teach the use of RGDL in the fastest, most complete way possible. Routines within the display library that are used to produce raster graphics are presented in alphabetical order, each on a separate page. Each user-callable routine is described by function and calling parameters. All common blocks that are used in the display library are listed, and the use of each variable within each common block is discussed. A reference on the include files that are necessary to compile the display library is also provided; each include file and its purpose are listed. The link map for MOVIE.BYU version 6, a general-purpose computer graphics display system that uses RGDL software, is also included.
Jones, Josette; Harris, Marcelline; Bagley-Thompson, Cheryl; Root, Jane
2003-01-01
This poster describes the development of user-centered interfaces in order to extend the functionality of the Virginia Henderson International Nursing Library (VHINL) from library to web based portal to nursing knowledge resources. The existing knowledge structure and computational models are revised and made complementary. Nurses' search behavior is captured and analyzed, and the resulting search models are mapped to the revised knowledge structure and computational model.
The Evolution of PLA's Planning Model.
ERIC Educational Resources Information Center
Elsner, Edward J.
2002-01-01
Explores the movement toward community-centered standards in public libraries. Tracks the changes of the Public Library Association's (PLA's) planning model through four incarnations, summarizes each model, and examines trends and suggests a way to use the various models together for an easier planning process. (Author/LRW)
Simscape Modeling of a Custom Closed-Volume Tank
NASA Technical Reports Server (NTRS)
Fischer, Nathaniel P.
2015-01-01
The library for Mathworks Simscape does not currently contain a model for a closed volume fluid tank where the ullage pressure is variable. In order to model a closed-volume variable ullage pressure tank, it was necessary to consider at least two separate cases: a vertical cylinder, and a sphere. Using library components, it was possible to construct a rough model for the cylindrical tank. It was not possible to construct a model for a spherical tank, using library components, due to the variable area. It was decided that, for these cases, it would be preferable to create a custom library component to represent each case, using the Simscape language. Once completed, the components were added to models, where filling and draining the tanks could be simulated. When the models were performing as expected, it was necessary to generate code from the models and run them in Trick (a real-time simulation program). The data output from Trick was then compared to the output from Simscape and found to be within acceptable limits.
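The Simscape source is not given here; as a plain-Python illustration of the physics the custom component must capture (and the reason stock library components fall short), the sketch below tracks ullage pressure in a rigid closed tank as liquid fills it, assuming isothermal ideal-gas compression of the trapped ullage. All volumes and rates are made up.

```python
V_TANK = 1.0    # total tank volume (m^3), assumed
P0 = 101325.0   # initial ullage pressure (Pa), assumed
V_LIQ0 = 0.2    # initial liquid volume (m^3), assumed

def ullage_pressure(v_liq):
    # Isothermal compression of the trapped gas: P * V_ullage = const.
    return P0 * (V_TANK - V_LIQ0) / (V_TANK - v_liq)

q = 0.001  # constant fill rate (m^3/s), assumed
for t in range(0, 700, 100):
    v = V_LIQ0 + q * t
    print(f"t={t:3d} s  liquid={v:.2f} m^3  ullage P={ullage_pressure(v)/1e3:7.1f} kPa")
```

The coupling of liquid level to a rising, state-dependent ullage pressure is exactly what the custom cylindrical and spherical components expose to the rest of the Simscape network.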
ModFossa: A library for modeling ion channels using Python.
Ferneyhough, Gareth B; Thibealut, Corey M; Dascalu, Sergiu M; Harris, Frederick C
2016-06-01
The creation and simulation of ion channel models using continuous-time Markov processes is a powerful and well-used tool in the field of electrophysiology and ion channel research. While several software packages exist for the purpose of ion channel modeling, most are GUI based, and none are available as a Python library. In an attempt to provide an easy-to-use, yet powerful Markov model-based ion channel simulator, we have developed ModFossa, a Python library supporting easy model creation and stimulus definition, complete with a fast numerical solver, and attractive vector graphics plotting.
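ModFossa's documented API is not shown in the abstract, so the sketch below illustrates the underlying technique rather than the library's interface: a two-state continuous-time Markov channel whose occupancy probabilities evolve by dp/dt = pQ, with the macroscopic current taken from the open-state probability. All rates and biophysical constants are assumed.

```python
import numpy as np
from scipy.integrate import solve_ivp

def q_matrix(v):
    """Generator for a closed<->open channel with toy voltage-dependent rates (1/ms)."""
    alpha = 0.1 * np.exp(v / 25.0)    # opening rate
    beta = 0.05 * np.exp(-v / 25.0)   # closing rate
    return np.array([[-alpha, alpha],
                     [beta, -beta]])

def simulate(v, p0=(1.0, 0.0), t_end=50.0):
    # Master equation dp/dt = p Q for the state-occupancy row vector p.
    rhs = lambda t, p: p @ q_matrix(v)
    return solve_ivp(rhs, (0.0, t_end), p0, dense_output=True)

g_max, e_rev = 1.0, -80.0             # conductance (nS) and reversal (mV), assumed
sol = simulate(v=0.0)                 # step to 0 mV
p_open = sol.sol(np.linspace(0.0, 50.0, 6))[1]
print("open probability:", np.round(p_open, 3))
print("current (pA):", np.round(g_max * p_open * (0.0 - e_rev), 2))
```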
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wetter, Michael; Zuo, Wangda; Nouidui, Thierry S.
2013-03-13
This paper describes the Buildings library, a free open-source library that is implemented in Modelica, an equation-based object-oriented modeling language. The library supports rapid prototyping, as well as design and operation of building energy and control systems. First, we describe the scope of the library, which covers HVAC systems, multi-zone heat transfer and multi-zone airflow and contaminant transport. Next, we describe differentiability requirements and address how we implemented them. We describe the class hierarchy that allows implementing component models by extending partial implementations of base models of heat and mass exchangers, and by instantiating basic models for conservation equations and flow resistances. We also describe associated tools for pre- and post-processing, regression tests, co-simulation and real-time data exchange with building automation systems. Finally, the paper closes with an example of a chilled water plant, with and without a water-side economizer, in which we analyzed the system-level efficiency for different control setpoints.
What Is Your Budget Saying about Your Library?
ERIC Educational Resources Information Center
Jacobs, Leslie; Strouse, Roger
2002-01-01
Discusses budgeting for corporate libraries and how to keep budgets from getting cut. Topics include whether the budget is considered corporate overhead; recovering costs; models for content cost recovery; showing return on library investment; marketing library value to senior management; user needs and satisfaction; and comparing budgets to other…
Tomorrow's Research Library: Vigor or Rigor Mortis?
ERIC Educational Resources Information Center
Hacken, Richard D.
1988-01-01
Compares, contrasts, and critiques predictions that have been made about the future of research libraries, focusing on the impact of technology on the library's role and users' needs. The discussion includes models for the adaptation of new technologies that may assist in library planning and change. (38 references) (CLB)
Financing the Electronic Library: Models and Options.
ERIC Educational Resources Information Center
Waters, Richard L.; Kralisz, Victor Frank
1981-01-01
Places the cost considerations associated with public library automation in a framework of public finance comfortable to most administrators, discusses the importance of experience with use patterns in the electronic library in opening up new and innovative financing methods, and stresses the role of the library in the information industry. (JL)
Content and Workflow Management for Library Websites: Case Studies
ERIC Educational Resources Information Center
Yu, Holly, Ed.
2005-01-01
Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…
Building a Learning Commons: Necessary Conditions for Success
ERIC Educational Resources Information Center
McKay, Richard
2014-01-01
Remodeling an academic library according to the Learning Commons service model will challenge the library staff. This paper gives insight into four of these challenges: Working with the design team, preserving a scholarly environment, ensuring the most efficient arrangement of the library's service centers, and moving the library's collection. It…
Web 2.0 Strategy in Libraries and Information Services
ERIC Educational Resources Information Center
Byrne, Alex
2008-01-01
Web 2.0 challenges libraries to change from their predominantly centralised service models with integrated library management systems at the hub. Implementation of Web 2.0 technologies and the accompanying attitudinal shifts will demand reconceptualisation of the nature of library and information service around a dynamic, ever changing, networked,…
Library and Information Networks: Centralization and Decentralization.
ERIC Educational Resources Information Center
Segal, JoAnn S.
1988-01-01
Describes the development of centralized library networks and the current factors that make library sharing on a smaller scale feasible. The discussion covers the need to decide the level at which library cooperation should occur and the possibility of linking via the Open System Interface Reference Model. (37 references) (CLB)
A New Consortial Model for Building Digital Libraries.
ERIC Educational Resources Information Center
Neff, Raymond K.
The libraries in U.S. research universities are being systematically depopulated of current subscriptions to scholarly journals. Annual increases in subscription costs are consistently outpacing the growth in library budgets; this has become a chronic problem for academic libraries which collect in the fields of science, engineering, and medicine.…
Library Automation: Guidelines to Costing.
ERIC Educational Resources Information Center
Ford, Geoffrey
As with all new programs, the costs associated with library automation must be carefully considered before implementation. This document suggests guidelines to be followed and areas to be considered in the costing of library procedures. An existing system model has been suggested as a standard (Appendix A) and a classification of library tasks…
Mobile Libraries in Vietnam in 21st Century.
ERIC Educational Resources Information Center
The Khang, Pham
With encouragement from IFLA (International Federation of Library Associations and Institutions) and support from the government, over 150 mobile libraries have been established in Vietnam and have been in active operation for the last 10 years. Various models of mobile libraries suited to different areas have been identified, such as…
LISPA (Library and Information Center Staff Planning Advisor): A Microcomputer-Based System.
ERIC Educational Resources Information Center
Devadason, F. J.; Vespry, H. A.
1996-01-01
Describes LISPA (Library and Information Center Staff Planning Advisor), a set of programs based on Ranganathan's staff plan model. LISPA particularly aids in planning for library staff requirements, both professional and paraprofessional, in developing countries where automated systems for other library operations are not yet available.…
Elegent—An elastic event generator
NASA Astrophysics Data System (ADS)
Kašpar, J.
2014-03-01
Although elastic scattering of nucleons may look like a simple process, it presents a long-lasting challenge for theory. Due to the missing hard energy scale, perturbative QCD cannot be applied; instead, many phenomenological/theoretical models have emerged. In this paper we present a unified implementation of some of the most prominent models in a C++ library, moreover extended to account for effects of the electromagnetic interaction. The library is complemented with a number of utilities, for instance programs to sample many distributions of interest in four-momentum transfer squared, t, impact parameter, b, and collision energy √s. These distributions at ISR, Spp̄S, RHIC, Tevatron and LHC energies are available for download from the project web site, both in the form of ROOT files and PDF figures providing comparisons among the models. The package also includes a tool for Monte-Carlo generation of elastic scattering events, which can easily be embedded in any other program framework. Catalogue identifier: AERT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 3. No. of lines in distributed program, including test data, etc.: 10551. No. of bytes in distributed program, including test data, etc.: 126316. Distribution format: tar.gz. Programming language: C++. Computer: any in principle, tested on x86-64 architecture. Operating system: any in principle, tested on GNU/Linux. RAM: strongly depends on the task, but typically below 20 MB. Classification: 11.6. External routines: ROOT, HepMC. Nature of problem: Monte-Carlo simulation of elastic nucleon-nucleon collisions. Solution method: implementation of some of the most prominent phenomenological/theoretical models, providing a cumulative distribution function that is used for random event generation. Running time: strongly depends on the task, but typically below 1 h.
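The solution method named above, random event generation from a cumulative distribution function, is easy to show in miniature. The sketch below replaces Elegent's full model set with a simple exponential forward peak dsigma/dt proportional to exp(-B|t|) and samples |t| by inverting the CDF analytically; the slope B is an illustrative value, not taken from any particular model.

```python
import math
import random

B = 20.0  # forward slope (GeV^-2), assumed

def sample_t(u):
    """Invert the CDF of f(|t|) = B*exp(-B*|t|): |t| = -ln(1-u)/B."""
    return -math.log(1.0 - u) / B

random.seed(42)
events = [sample_t(random.random()) for _ in range(100000)]
mean_t = sum(events) / len(events)
print(f"mean |t| = {mean_t:.4f} GeV^2 (analytic 1/B = {1.0 / B:.4f})")
```

For model curves with no closed-form inverse, the same idea applies with a numerically tabulated CDF, which is effectively what an event generator built on a cumulative distribution does.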
Performance of the libraries in Isfahan University of Medical Sciences based on the EFQM model.
Karimi, Saeid; Atashpour, Bahareh; Papi, Ahmad; Nouri, Rasul; Hasanzade, Akbar
2014-01-01
Performance measurement is inevitable for university libraries. Hence, planning and establishing a constant and up-to-date measurement system is required for libraries, especially university libraries. The primary studies and analyses reveal that the EFQM Excellence Model has been efficient, and the administrative reform program has focused on the implementation of this model. Therefore, on the basis of these facts as well as the need for a measurement system, the researchers measured the performance of libraries in schools and hospitals supported by Isfahan University of Medical Sciences, using the EFQM Organizational Excellence Model. This descriptive research study was carried out by a cross-sectional survey method in 2011 and included the librarians and library directors of Isfahan University of Medical Sciences (70 people). The validity of the instrument was established by specialists in the fields of Management and Library Science. To measure the reliability of the questionnaire, the Cronbach's alpha coefficient was computed (0.93). The t-test, ANOVA, and Spearman's rank correlation coefficient were used, and the data were analyzed with SPSS. Data analysis revealed that, among the nine dimensions of the performance measurement for the libraries under study, the highest mean score was 65.3% for the leadership dimension and the lowest scores were 55.1% for both the people and the society results. In general, the average level of all nine EFQM model dimensions was in good agreement with normal values; compared with the other results, however, the people and society results criteria were poor. It is recommended that an expert committee on the people and society results criteria be formed and that the individuals concerned improve these aspects through various conferences and training courses.
Ertl, P
1998-02-01
Easy to use, interactive, and platform-independent WWW-based tools are ideal for development of chemical applications. By using the newly emerging Web technologies such as Java applets and sophisticated scripting, it is possible to deliver powerful molecular processing capabilities directly to the desk of synthetic organic chemists. In Novartis Crop Protection in Basel, a Web-based molecular modelling system has been in use since 1995. In this article two new modules of this system are presented: a program for interactive calculation of important hydrophobic, electronic, and steric properties of organic substituents, and a module for substituent similarity searches enabling the identification of bioisosteric functional groups. Various possible applications of calculated substituent parameters are also discussed, including automatic design of molecules with the desired properties and creation of targeted virtual combinatorial libraries.
ERIC Educational Resources Information Center
Indiana Univ., Bloomington.
The Dedication Address was given by Gordon N. Ray. Alan R. Taylor presented "A Model of Academic Library Service," which was followed by "Views and Reviews," given by Edwin H. Cady; "Comments on 'A Model of Academic Library Service'," by Stephen A. McCarthy and "Critique of Taylor Paper," by Marvin E. Olsen.…
LibHalfSpace: A C++ object-oriented library to study deformation and stress in elastic half-spaces
NASA Astrophysics Data System (ADS)
Ferrari, Claudio; Bonafede, Maurizio; Belardinelli, Maria Elina
2016-11-01
The study of deformation processes in elastic half-spaces is widely employed for many purposes (e.g. didactic use, scientific investigation of real processes, inversion of geodetic data, etc.). We present a coherent programming interface containing a set of tools designed to make the study of processes in an elastic half-space easier and faster. LibHalfSpace is presented in the form of an object-oriented library. A set of well-known and frequently used source models (Mogi source, penny-shaped horizontal crack, inflating spheroid, Okada rectangular dislocation, etc.) are implemented to illustrate the potential usage and the versatility of the library. The common interface given to the library tools enables us to switch easily among the effects produced by different deformation sources that can be monitored at the free surface. Furthermore, the library offers an interface which simplifies the creation of new source models by exploiting the features of object-oriented programming (OOP). These source models can be built as distributions of rectangular boundary elements. In order to better explain how new models can be deployed, some examples are included in the library.
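As a taste of what such a library computes, the sketch below evaluates the simplest source in the list above, the Mogi point source, from its textbook closed form for free-surface displacements. This is an independent Python transcription of the standard formulas, not LibHalfSpace's C++ interface; the source depth and volume change are illustrative.

```python
import numpy as np

def mogi(r, d, dV, nu=0.25):
    """Radial and vertical surface displacement (m) for a point pressure
    source with volume change dV (m^3) at depth d (m), Poisson ratio nu."""
    c = (1.0 - nu) * dV / np.pi          # textbook point-source prefactor
    R3 = (r ** 2 + d ** 2) ** 1.5
    return c * r / R3, c * d / R3        # u_r, u_z

r = np.linspace(0.0, 10e3, 5)            # radial distances (m)
ur, uz = mogi(r, d=3e3, dV=1e6)          # 10^6 m^3 inflation at 3 km depth
for ri, a, b in zip(r, ur, uz):
    print(f"r={ri/1e3:4.1f} km  u_r={a*100:6.2f} cm  u_z={b*100:6.2f} cm")
```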
The Structure of a Thermophilic Kinase Shapes Fitness upon Random Circular Permutation.
Jones, Alicia M; Mehta, Manan M; Thomas, Emily E; Atkinson, Joshua T; Segall-Shapiro, Thomas H; Liu, Shirley; Silberg, Jonathan J
2016-05-20
Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement in which native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein's functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AKs with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and it reveals a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection.
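The transposon protocol itself is experimental, but the sequence operation it realizes is easy to state in code. This toy sketch enumerates the circular permutants of a protein sequence by fusing the native termini (optionally through a linker) and cutting the backbone at each new position; the sequence and linker below are placeholders, not from the paper.

```python
def circular_permutants(seq, linker=""):
    """Yield (cut_site, permuted_sequence) for every backbone fission point."""
    joined = seq + linker                  # native termini fused via linker
    for cut in range(1, len(seq)):         # new N-terminus after each residue
        yield cut, joined[cut:] + joined[:cut]

toy = "MKTAYIAKQR"  # placeholder sequence
for cut, perm in list(circular_permutants(toy, linker="GS"))[:3]:
    print(f"cut after residue {cut}: {perm}")
print(f"total permutants: {len(list(circular_permutants(toy)))}")
```

A transposase library samples these cut sites randomly in vivo; mapping which of them retain activity onto the structure is what yields the distance-to-active-site correlations reported above.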
Boyce, Lindsay M
2016-01-01
Library orientation at an academic health sciences library consisted of a five-minute overview within new student orientation. Past experience indicated this brief presentation was insufficient for students to learn about library resources. In 2014, an effort was made to supplement orientation by developing an online game aimed at enabling students to become self-sufficient through hands-on learning. A gaming model was chosen with expectations that competition and rewards would motivate students. Although the pilots suffered from low participation rates, the experience merits further research into the potential of a broader model of online library instruction in the health sciences environment.
Learning Extended Finite State Machines
NASA Technical Reports Server (NTRS)
Cassel, Sofia; Howar, Falk; Jonsson, Bengt; Steffen, Bernhard
2014-01-01
We present an active learning algorithm for inferring extended finite state machines (EFSMs), combining data flow and control behavior. Key to our learning technique is a novel learning model based on so-called tree queries. The learning algorithm uses the tree queries to infer symbolic data constraints on parameters, e.g., sequence numbers, time stamps, identifiers, or even simple arithmetic. We describe sufficient conditions for the properties that the symbolic constraints provided by a tree query must have in general to be usable in our learning model. We have evaluated our algorithm in a black-box scenario, where tree queries are realized through (black-box) testing. Our case studies include connection establishment in TCP and a priority queue from the Java Class Library.
Zarrinabadi, Zarrin; Isfandyari-Moghaddam, Alireza; Erfani, Nasrolah; Tahour Soltani, Mohsen Ahmadi
2018-01-01
INTRODUCTION: According to the research mission of the library and information sciences field, students in this field need the ability to mediate constructively between information users and information, and this is all the more important in medical librarianship and information sciences because of clinicians' need for quick access to information. The role of spiritual intelligence in the capability to establish effective and balanced communication makes it important to study this variable in library and information students. One of the main factors that can affect the results of any research is the conceptual model used to measure its variables. Accordingly, the purpose of this study was the codification of a spiritual intelligence measurement model. METHODS: This correlational study was conducted through structural equation modeling; 270 students were selected from the library and medical information students of nationwide medical universities by simple random sampling and responded to the King spiritual intelligence questionnaire (2008). Initially, based on the data, the model parameters were estimated using the maximum likelihood method; then, the spiritual intelligence measurement model was tested by fit indices. Data analysis was performed with Smart-Partial Least Squares software. RESULTS: Preliminary results showed that, given the positive indicators of predictive association and the t-test results for the spiritual intelligence parameters, the King measurement model has an acceptable fit, and the internal correlation of the questionnaire items was significant. The composite reliability and Cronbach's alpha of the parameters indicated high reliability of the spiritual intelligence model. CONCLUSIONS: The spiritual intelligence measurement model was evaluated, and the results showed that the model has a good fit, so it is recommended that domestic researchers use this questionnaire to assess spiritual intelligence. PMID:29922688
The U. S. Geological Survey, Digital Spectral Library: Version 1 (0.2 to 3.0um)
Clark, Roger N.; Swayze, Gregg A.; Gallagher, Andrea J.; King, Trude V.V.; Calvin, Wendy M.
1993-01-01
We have developed a digital reflectance spectral library, with management and spectral analysis software. The library includes 498 spectra of 444 samples (some samples include a series of grain sizes) measured from approximately 0.2 to 3.0 um. The spectral resolution (Full Width at Half Maximum) of the reflectance data is <= 4 nm in the visible (0.2-0.8 um) and <= 10 nm in the NIR (0.8-2.35 um). All spectra were corrected to absolute reflectance using an NIST Halon standard. Library management software lets users search on parameters (e.g., chemical formulae, chemical analyses, purity of samples, mineral groups) as well as spectral features. Minerals from the borate, carbonate, chloride, element, halide, hydroxide, nitrate, oxide, phosphate, sulfate, sulfide, sulfosalt, and silicate (cyclosilicate, inosilicate, nesosilicate, phyllosilicate, sorosilicate, and tectosilicate) classes are represented. X-ray and chemical analyses are tabulated for many of the entries, and all samples have been evaluated for spectral purity. The library also contains end and intermediate members for the olivine, garnet, scapolite, montmorillonite, muscovite, jarosite, and alunite solid-solution series. We have included representative spectra of H2O ice, kerogen, ammonium-bearing minerals, rare-earth oxides, desert varnish coatings, a kaolinite crystallinity series, a kaolinite-smectite series, a zeolite series, and an extensive evaporite series. Because of the importance of vegetation to climate-change studies, we have included 17 spectra of tree leaves, bushes, and grasses. The library and software are available as a series of U.S.G.S. Open File reports. PC software is available to convert the binary data to ASCII files (a separate U.S.G.S. Open File report). Additionally, the binary data files are online at the U.S.G.S. in Denver for anonymous ftp access over the Internet. The library search software enables a user to search on documentation parameters as well as spectral features. The analysis system includes general spectral analysis routines, plotting packages, radiative transfer software for computing intimate mixtures, routines to derive optical constants from reflectance spectra, tools to analyze spectral features, and the capability to access imaging spectrometer data cubes for spectral analysis. Users may build customized libraries (at specific wavelengths and spectral resolution) for their own instruments using the library software. We are currently extending spectral coverage to 150 um. The libraries (original and convolved) will be made available in the future on a CD-ROM.
2006-09-01
...of Library Science. It could be possible that just by looking into classic Library Science a good working model for the web can be designed... projects confirmed that the principles of Library Science could be applied to the world of electronic media, they identified a significant void in... Library community to apply Library Science to the realm of electronic data resources. [29] In the environment of physical media, librarians have become so...
A case study: the evolution of a "facilitator model" liaison program in an academic medical library.
Crossno, Jon E; DeShay, Claudia H; Huslig, Mary Ann; Mayo, Helen G; Patridge, Emily F
2012-07-01
What type of liaison program would best utilize both librarians and other library staff to effectively promote library services and resources to campus departments? The case is an academic medical center library serving a large, diverse campus. The library implemented a "facilitator model" program to provide personalized service to targeted clients that allowed for maximum staff participation with limited subject familiarity. To determine success, details of liaison-contact interactions and results of liaison and department surveys were reviewed. Liaisons successfully recorded 595 interactions during the program's first 10 months of existence. A significant majority of departmental contact persons (82.5%) indicated they were aware of the liaison program, and 75% indicated they preferred email communication. The "facilitator model" provides a well-defined structure for assigning liaisons to departments or groups; however, training is essential to ensure that liaisons are able to communicate effectively with their clients.
Building a Better Fragment Library for De Novo Protein Structure Prediction
de Oliveira, Saulo H. P.; Shi, Jiye; Deane, Charlotte M.
2015-01-01
Fragment-based approaches are the current standard for de novo protein structure prediction. These approaches rely on accurate and reliable fragment libraries to generate good structural models. In this work, we describe a novel method for structure fragment library generation and its application in fragment-based de novo protein structure prediction. The importance of correct testing procedures in assessing the quality of fragment libraries is demonstrated, in particular the exclusion of homologs to the target from the libraries to correctly simulate a de novo protein structure prediction scenario, something which surprisingly is not always done. We demonstrate that fragments with different predominant predicted secondary structures should be treated differently during the fragment library generation step and that exhaustive and random search strategies should both be used. This information was used to develop a novel method, Flib. On a validation set of 41 structurally diverse proteins, Flib libraries present both higher precision and higher coverage than two state-of-the-art methods, NNMake and HHFrag. Flib also achieves better precision and coverage on the set of 275 protein domains used in the two previous experiments of the Critical Assessment of Structure Prediction (CASP9 and CASP10). We compared Flib libraries against NNMake libraries in a structure prediction context: of the 13 cases in which a correct answer was generated, Flib models were more accurate than NNMake models for 10. Flib is available for download at http://www.stats.ox.ac.uk/research/proteins/resources. PMID:25901595
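The precision and coverage measures used to compare fragment libraries can be made concrete with a small sketch. The 1.5 Å RMSD cutoff and the helper names below are assumptions for illustration, not Flib's actual code.

```python
import numpy as np

def fragment_rmsd(frag_coords, native_coords):
    # RMSD after optimal superposition is the usual choice; for brevity this
    # sketch assumes coordinates are already aligned (an illustrative shortcut).
    return np.sqrt(np.mean(np.sum((frag_coords - native_coords) ** 2, axis=1)))

def precision_and_coverage(library, native, cutoff=1.5):
    """library: dict position -> list of fragment coordinate arrays.
    native: dict position -> native coordinates for that window.
    Precision: fraction of fragments within `cutoff` of the native structure.
    Coverage: fraction of positions with at least one good fragment."""
    good, total, covered = 0, 0, 0
    for pos, frags in library.items():
        hits = [f for f in frags if fragment_rmsd(f, native[pos]) <= cutoff]
        good += len(hits)
        total += len(frags)
        covered += bool(hits)
    return good / total, covered / len(library)

# Toy demo: one position with two 3-residue fragments.
native = {0: np.zeros((3, 3))}
frags = {0: [np.zeros((3, 3)), np.ones((3, 3))]}
print(precision_and_coverage(frags, native))   # -> (0.5, 1.0)
```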
CCFpams: Atmospheric stellar parameters from cross-correlation functions
NASA Astrophysics Data System (ADS)
Malavolta, Luca; Lovis, Christophe; Pepe, Francesco; Sneden, Christopher; Udry, Stephane
2017-07-01
CCFpams allows the measurement of stellar temperature, metallicity and gravity within a few seconds and in a completely automated fashion. Rather than performing comparisons with spectral libraries, the technique is based on the determination of several cross-correlation functions (CCFs) obtained by including spectral features with different sensitivity to the photospheric parameters. Literature stellar parameters of high signal-to-noise (SNR) and high-resolution HARPS spectra of FGK Main Sequence stars are used to calibrate the stellar parameters as a function of CCF areas.
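The calibration idea, expressing each atmospheric parameter as a smooth function of CCF areas, can be sketched as a simple least-squares polynomial fit. The two-predictor quadratic design matrix below is an assumption for illustration; the released CCFpams calibrations are more elaborate.

```python
import numpy as np

# Toy calibration: literature Teff values for a set of stars versus the areas
# of two CCFs built from line lists with different parameter sensitivities.
rng = np.random.default_rng(0)
area1, area2 = rng.uniform(0.5, 1.5, 200), rng.uniform(0.5, 1.5, 200)
teff = 5500 + 900 * (area1 - 1) - 400 * (area2 - 1) + rng.normal(0, 30, 200)

# Quadratic design matrix in the two CCF areas, fit by linear least squares.
X = np.column_stack([np.ones_like(area1), area1, area2,
                     area1**2, area2**2, area1 * area2])
coeff, *_ = np.linalg.lstsq(X, teff, rcond=None)

# Once calibrated, a new star's Teff follows directly from its CCF areas.
predict = lambda a1, a2: np.array([1, a1, a2, a1**2, a2**2, a1 * a2]) @ coeff
print(round(predict(1.1, 0.9)))
```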
NASA Technical Reports Server (NTRS)
Parker, K. C.; Torian, J. G.
1980-01-01
A sample environmental control and life support model performance analysis using the environmental analysis routines library is presented. An example of a complete model set up and execution is provided. The particular model was synthesized to utilize all of the component performance routines and most of the program options.
Alzheimer's Disease Diagnosis in Individual Subjects using Structural MR Images: Validation Studies
Vemuri, Prashanthi; Gunter, Jeffrey L.; Senjem, Matthew L.; Whitwell, Jennifer L.; Kantarci, Kejal; Knopman, David S.; Boeve, Bradley F.; Petersen, Ronald C.; Jack, Clifford R.
2008-01-01
OBJECTIVE To develop and validate a tool for Alzheimer's disease (AD) diagnosis in individual subjects using support vector machine (SVM) based classification of structural MR (sMR) images. BACKGROUND Libraries of sMR scans of clinically well characterized subjects can be harnessed for the purpose of diagnosing new incoming subjects. METHODS 190 patients with probable AD were age- and gender-matched with 190 cognitively normal (CN) subjects. Three different classification models were implemented: Model I uses tissue densities obtained from sMR scans to give a STructural Abnormality iNDex (STAND) score; Models II and III use tissue densities as well as covariates (demographics and apolipoprotein E genotype) to give an adjusted STAND (aSTAND) score. Data from 140 AD and 140 CN subjects were used for training. The SVM parameter optimization and training were done by four-fold cross-validation. The remaining independent sample of 50 AD and 50 CN subjects was used to obtain a minimally biased estimate of the generalization error of the algorithm. RESULTS The cross-validation accuracies of the Model II and Model III aSTAND scores were 88.5% and 89.3%, respectively, and the developed models generalized well on the independent test datasets. Anatomic patterns best differentiating the groups were consistent with the known distribution of neurofibrillary AD pathology. CONCLUSIONS This paper presents preliminary evidence that application of SVM-based classification of an individual sMR scan relative to a library of scans can provide useful information in individual subjects for diagnosis of AD. Including demographic and genetic information in the classification algorithm slightly improves diagnostic accuracy. PMID:18054253
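A minimal sketch of the classification scheme described above (SVM training with four-fold cross-validated parameter selection, then evaluation on a held-out set) using scikit-learn; the feature matrix and parameter grid are placeholders, not the authors' pipeline.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(380, 50))      # stand-in for tissue-density features
y = np.repeat([0, 1], 190)          # 190 CN vs 190 AD labels
X[y == 1, :10] += 0.8               # inject separation so the toy task is learnable

# Hold out an independent test set, mirroring the 140/50 per-group split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=100,
                                          stratify=y, random_state=0)

# Four-fold cross-validation to pick the SVM regularization parameter.
search = GridSearchCV(SVC(kernel="linear"), {"C": [0.01, 0.1, 1, 10]}, cv=4)
search.fit(X_tr, y_tr)
print("CV accuracy:", search.best_score_)
print("held-out accuracy:", search.score(X_te, y_te))
```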
ERIC Educational Resources Information Center
Ratto, Brooke Gilmore; Lynch, Andy
2012-01-01
Student and faculty frustrations with traditional higher education textbook models continue to escalate. These frustrations present an opportunity for academic libraries to forge partnerships with teaching faculty and vendors to repurpose existing library resources. Library and teaching faculty at Southern New Hampshire University collaborated to…
Library Faculty Workload: A Case Study in Implementing a Teaching Faculty Model.
ERIC Educational Resources Information Center
Goudy, Frank Wm.
In the January 1988 issue of "Library Administration & Management," an article titled "The Dilemma of Library Faculty Workload: One Solution" described the efforts of the library faculty at Western Illinois University to achieve a more equitable situation compared to other faculty on the campus. A totally new approach to…
ERIC Educational Resources Information Center
Cigale, George
2005-01-01
The surprising success of Live Homework Help, a product of Tutor.com unknown to libraries before 2001, is in very large part owing to the company's effort to understand library funding and help libraries tap into all kinds of potential funding sources. The story is a model for partnerships between other companies and libraries. For the people of…
"A Really Nice Spot": Evaluating Place, Space, and Technology in Academic Libraries
ERIC Educational Resources Information Center
Khoo, Michael J.; Rozaklis, Lily; Hall, Catherine; Kusunoki, Diana
2016-01-01
This article describes a qualitative mixed-method study of students' perceptions of place and space in an academic library. The approach is informed by Scott Bennett's model of library design, which posits a shift from a "book-centered" to a technology supported "learning centered" paradigm of library space. Two surveys…
Best Small Library in America 2010
ERIC Educational Resources Information Center
Berry, John N., III
2010-01-01
This article features Glen Carbon Centennial Library (GCCL), Illinois, which is named "LJ"'s Best Small Library in America 2010. The attitude of doing whatever it takes to encourage every patron to come back permeates GCCL and is the foundation that makes it a model small library. GCCL delivers much "more than one expects." The…
ERIC Educational Resources Information Center
Dubnjakovic, Ana
2012-01-01
The current study investigates factors influencing increase in reference transactions in a typical week in academic libraries across the United States of America. Employing multiple regression analysis and general linear modeling, variables of interest from the "Academic Library Survey (ALS) 2006" survey (sample size 3960 academic libraries) were…
A Framework for Studying Organizational Innovation in Research Libraries
ERIC Educational Resources Information Center
Jantz, Ronald C.
2012-01-01
The objective of this paper is two-fold: to propose a theoretical framework and model for studying organizational innovation in research libraries and to set forth propositions that can provide directions for future empirical studies of innovation in research libraries. Research libraries can be considered members of a class of organizations…
ERIC Educational Resources Information Center
Olden, Anthony
2005-01-01
The Tanganyika Library Service (TLS) was the national public library service set up in Tanzania, East Africa, in the 1960s. By the end of the decade, it was generally regarded as a model of Western-style public library development in Africa. This is an account of its establishment and early years based on accessible documentary sources in Tanzania…
ERIC Educational Resources Information Center
Greene, Gloria; Robb, Reive
The main concerns of this manpower survey were to examine and, where possible, modify and expand on the manpower planning model generated in the 1982 pilot study, and to use the model to assist with the forecasting of manpower requirements for library and information systems in the Caribbean region. Libraries and information systems in this area…
SLHAplus: A library for implementing extensions of the standard model
NASA Astrophysics Data System (ADS)
Bélanger, G.; Christensen, Neil D.; Pukhov, A.; Semenov, A.
2011-03-01
We provide a library to facilitate the implementation of new models in codes such as matrix element and event generators or codes for computing dark matter observables. The library contains an SLHA reader routine as well as diagonalisation routines. This library is available in CalcHEP and micrOMEGAs. The implementation of models based on this library is supported by LanHEP and FeynRules.
Program summary: Program title: SLHAplus_1.3. Catalogue identifier: AEHX_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHX_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 6283. No. of bytes in distributed program, including test data, etc.: 52 119. Distribution format: tar.gz. Programming language: C. Computer: IBM PC, MAC. Operating system: UNIX (Linux, Darwin, Cygwin). RAM: 2000 MB. Classification: 11.1. Nature of problem: Implementation of extensions of the standard model in matrix element and event generators and codes for dark matter observables. Solution method: For generic extensions of the standard model we provide routines for reading files that adopt the standard format of the SUSY Les Houches Accord (SLHA) file. The procedure has been generalized to take into account an arbitrary number of blocks so that the reader can be used in generic models, including non-supersymmetric ones. The library also contains routines to diagonalize real and complex mass matrices with either unitary or bi-unitary transformations, as well as routines for evaluating the running strong coupling constant, running quark masses, and effective quark masses. Running time: 0.001 sec.
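To make the SLHA-reading idea concrete, here is a hedged Python sketch of a minimal parser for SLHA-style `BLOCK` sections (the real SLHAplus reader is written in C and handles far more of the accord); the layout assumed is a `BLOCK <name>` header followed by numerically indexed numeric entries.

```python
def read_slha_blocks(path):
    """Parse an SLHA-style file into {block_name: {index_tuple: value}}.
    Comments start with '#'; a 'BLOCK <NAME>' line opens a new block.
    Only purely numeric entries are handled in this sketch."""
    blocks, current = {}, None
    with open(path) as fh:
        for raw in fh:
            line = raw.split("#", 1)[0].strip()   # strip comments
            if not line:
                continue
            tokens = line.split()
            if tokens[0].upper() == "BLOCK":
                current = tokens[1].upper()
                blocks[current] = {}
            elif current is not None:
                *idx, value = tokens
                blocks[current][tuple(int(i) for i in idx)] = float(value)
    return blocks

# Example: blocks["MASS"][(25,)] would give the entry for PDG code 25.
```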
iTesla Power Systems Library (iPSL): A Modelica library for phasor time-domain simulations
NASA Astrophysics Data System (ADS)
Vanfretti, L.; Rabuzin, T.; Baudette, M.; Murad, M.
The iTesla Power Systems Library (iPSL) is a Modelica package providing a set of power system components for phasor time-domain modeling and simulation. The Modelica language provides a systematic approach to developing models using a formal mathematical description that uniquely specifies the physical behavior of a component or of the entire system. Furthermore, the standardized specification of the Modelica language (Modelica Association [1]) enables unambiguous model exchange by allowing any Modelica-compliant tool to utilize the models for simulation and analysis without the need for a specific model transformation tool. As the Modelica language is developed with open specifications, any tool that implements these requirements can be utilized, giving users the freedom to choose an Integrated Development Environment (IDE) of their choice; likewise, any integration solver can be implemented within a Modelica tool to simulate Modelica models. Additionally, Modelica is an object-oriented language, enabling code factorization and model re-use and improving the readability of a library by structuring it with an object-oriented hierarchy. The developed library is released under an open-source license to enable wider distribution and let users customize it to their specific needs. This paper describes the iPSL and provides illustrative application examples.
Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio
2016-11-01
The study focuses on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models in which a significant number of parameters and complex exposure scenarios may be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake', and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty to the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the input data, and its effect on model results has been examined by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food), and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated that MERLIN-Expo can be successfully employed in integrated, high-tier exposure assessment.
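The Monte Carlo propagation step can be illustrated with a generic sketch (not MERLIN-Expo code): sample uncertain parameters from their distributions, run the model for each draw, and apportion output variance with a crude correlation-based sensitivity measure. The toy model and the distributions are assumptions.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
N = 10_000

# Toy exposure model: blood concentration as a function of three uncertain
# inputs (chemical half-life, body weight, food intake rate).
half_life = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=N)   # years
body_wt = rng.normal(70.0, 10.0, size=N)                         # kg
intake = rng.uniform(0.1, 0.4, size=N)                           # kg/day

conc = intake * half_life / body_wt    # stand-in for the full model chain

# Uncertainty on the prediction:
print("mean, 95% interval:", conc.mean(), np.percentile(conc, [2.5, 97.5]))

# Crude sensitivity ranking via squared rank correlations (a simple
# stand-in for Morris screening or EFAST).
for name, x in [("half_life", half_life), ("body_wt", body_wt),
                ("intake", intake)]:
    rho, _ = spearmanr(x, conc)
    print(f"{name}: rho^2 = {rho**2:.2f}")
```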
The Catalog Takes to the Highway.
ERIC Educational Resources Information Center
Chesbro, Melinda
1999-01-01
Discusses new developments in online library catalogs, including Web-based catalogs; interconnectivity within the library; interconnectivity between libraries; graphical user interfaces; pricing models; and a checklist of questions to ask when purchasing a new online catalog. (LRW)
A Library Planning Model--Some Theory and How It Works.
ERIC Educational Resources Information Center
Goldberg, Robert L.
1985-01-01
The first part of the article describes the theoretical background of a library planning model based on the concept of right brain/left brain activities. The second describes the implementation of a short term planning model based on this theory. (CLB)
Model-based reconstruction of synthetic promoter library in Corynebacterium glutamicum.
Zhang, Shuanghong; Liu, Dingyu; Mao, Zhitao; Mao, Yufeng; Ma, Hongwu; Chen, Tao; Zhao, Xueming; Wang, Zhiwen
2018-05-01
To develop an efficient synthetic promoter library for fine-tuned expression of target genes in Corynebacterium glutamicum. A synthetic promoter library for C. glutamicum was developed based on conserved sequences of the -10 and -35 regions. The synthetic promoter library covered a wide range of strengths, ranging from 1 to 193% of the tac promoter. Sixty-eight promoters were selected and sequenced for correlation analysis between promoter sequence and strength with a statistical model. A new promoter library was then reconstructed with improved promoter strength and coverage based on the results of the correlation analysis. The tandem promoter P70 was finally constructed, with strength increased by 121% over the tac promoter. The promoter library developed in this study shows great potential for applications in metabolic engineering and synthetic biology for the optimization of metabolic networks. To the best of our knowledge, this is the first reconstruction of a synthetic promoter library based on statistical analysis in C. glutamicum.
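A sketch of the sequence-to-strength correlation step: one-hot encode the variable promoter positions and fit a linear model whose coefficients indicate which bases raise or lower strength. The encoding and the ordinary-least-squares choice are illustrative assumptions, not the paper's exact statistical model.

```python
import numpy as np

BASES = "ACGT"

def one_hot(seq):
    # Flatten a promoter region into a binary feature vector (4 bits/position).
    return np.array([b == c for b in seq for c in BASES], dtype=float)

# Toy data: (variable-region sequence, measured relative strength).
library = [("TATAAT", 1.00), ("TATGAT", 0.62), ("CATAAT", 0.35),
           ("TACAAT", 0.80), ("TGTAAT", 0.41), ("TATACT", 0.55)]

X = np.stack([one_hot(s) for s, _ in library])
y = np.array([v for _, v in library])

# Least-squares fit; coefficients estimate per-position base effects.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
predict = lambda seq: float(one_hot(seq) @ coef)
print(f"predicted strength of TATAAT: {predict('TATAAT'):.2f}")
```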
NASA Astrophysics Data System (ADS)
Jiang, Jingtao; Sui, Rendong; Shi, Yan; Li, Furong; Hu, Caiqi
In this paper, 3-D models of combined fixture elements are designed, classified by function, and saved in the computer as libraries of supporting elements, jointing elements, basic elements, localization elements, clamping elements, adjusting elements, etc. Automatic assembly of a 3-D combined checking fixture for an auto-body part is then presented based on modularization theory. In the virtual auto-body assembly space, a locating-constraint mapping technique and assembly rule-based reasoning are used to calculate the positions of the modular elements according to the localization points and clamp points of the auto-body part. The auto-body part model is transformed from its own coordinate system to the virtual assembly space by a homogeneous transformation matrix. Automatic assembly of the different functional fixture elements and the auto-body part is implemented with API functions based on secondary development of UG. Practice has shown that the method presented in this paper is feasible and highly efficient.
A portable MPI-based parallel vector template library
NASA Technical Reports Server (NTRS)
Sheffler, Thomas J.
1995-01-01
This paper discusses the design and implementation of a polymorphic collection library for distributed address-space parallel computers. The library provides a data-parallel programming model for C++ by providing three main components: a single generic collection class, generic algorithms over collections, and generic algebraic combining functions. Collection elements are the fourth component of a program written using the library and may be either of the built-in types of C or of user-defined types. Many ideas are borrowed from the Standard Template Library (STL) of C++, although a restricted programming model is proposed because of the distributed address-space memory model assumed. Whereas the STL provides standard collections and implementations of algorithms for uniprocessors, this paper advocates standardizing interfaces that may be customized for different parallel computers. Just as the STL attempts to increase programmer productivity through code reuse, a similar standard for parallel computers could provide programmers with a standard set of algorithms portable across many different architectures. The efficacy of this approach is verified by examining performance data collected from an initial implementation of the library running on an IBM SP-2 and an Intel Paragon.
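The data-parallel collection model described here (local elements per process, generic algorithms over collections, and algebraic combining functions) can be sketched in Python with mpi4py standing in for the paper's C++/MPI template library; the `PVector` class, its methods, and the script name are hypothetical, and mpi4py is assumed to be installed.

```python
# Run with, e.g.: mpiexec -n 4 python pvector.py
from mpi4py import MPI

class PVector:
    """Minimal distributed collection: each rank owns a slice of elements."""
    def __init__(self, local_elems, comm=MPI.COMM_WORLD):
        self.local, self.comm = list(local_elems), comm

    def map(self, f):                 # a generic algorithm over collections
        return PVector(map(f, self.local), self.comm)

    def reduce(self, op=MPI.SUM):     # an algebraic combining function
        return self.comm.allreduce(sum(self.local), op=op)

comm = MPI.COMM_WORLD
# Each rank holds 1000 consecutive integers of a distributed vector.
lo = comm.rank * 1000
v = PVector(range(lo, lo + 1000))
total = v.map(lambda x: x * x).reduce()
if comm.rank == 0:
    print("sum of squares:", total)
```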
BurnMan: Towards a multidisciplinary toolkit for reproducible deep Earth science
NASA Astrophysics Data System (ADS)
Myhill, R.; Cottaar, S.; Heister, T.; Rose, I.; Unterborn, C. T.; Dannberg, J.; Martin-Short, R.
2016-12-01
BurnMan (www.burnman.org) is an open-source toolbox to compute thermodynamic and thermoelastic properties as a function of pressure and temperature using published mineral physical parameters and equations of state. The framework is user-friendly, written in Python, and modular, allowing the user to implement their own equations of state, endmember and solution model libraries, geotherms, and averaging schemes. Here we introduce various new modules, which can be used to: fit thermodynamic variables to data from high-pressure static and shock wave experiments; calculate equilibrium assemblages given a bulk composition, pressure, and temperature; calculate chemical potentials and oxygen fugacities for given assemblages; compute 3D synthetic seismic models using output from geodynamic models and compare these results with global seismic tomographic models; and create input files for synthetic seismogram codes. Users can contribute scripts that reproduce the results from peer-reviewed articles and practical demonstrations (e.g. Cottaar et al., 2014).
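As a flavor of the property evaluations such a toolbox performs, here is a self-contained sketch of the third-order Birch-Murnaghan equation of state (a standard form; the periclase parameter values are approximate literature numbers, and this is not BurnMan's API):

```python
import numpy as np

def birch_murnaghan_pressure(V, V0, K0, K0_prime):
    """Third-order Birch-Murnaghan P(V), returned in the units of K0."""
    x = (V0 / V) ** (1.0 / 3.0)
    return (1.5 * K0 * (x**7 - x**5)
            * (1.0 + 0.75 * (K0_prime - 4.0) * (x**2 - 1.0)))

# Approximate room-temperature parameters for periclase (MgO).
V0, K0, K0p = 11.24, 160.0, 4.0    # cm^3/mol, GPa, dimensionless

for compression in (1.0, 0.95, 0.9, 0.8):
    P = birch_murnaghan_pressure(V0 * compression, V0, K0, K0p)
    print(f"V/V0 = {compression:.2f}:  P = {P:7.2f} GPa")
```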
Library Services Funding Assessment
NASA Technical Reports Server (NTRS)
Lorig, Jonathan A.
2004-01-01
The Glenn Technical Library is a science and engineering library that primarily supports research activities at the Glenn Research Center, and provides selected services to researchers at all of the NASA research centers. Resources available in the library include books, journals, CD-ROMs, and access to various online sources, as well as live reference and inter-library loan services. The collection contains over 77,000 books, 800,000 research reports, and print or online access to over 1,400 journals. Currently the library operates within the Logistics and Technical Information Division, and is funded as an open-access resource within the GRC. Some of the research units at the GRC have recently requested that the library convert to a "pay-for-services" model, in which individual research units could fund only those journal subscriptions for which they have a specific need. Under this model, the library would always maintain a certain minimum level of pooled-expense services, including the ready reference and book collections, and inter-library loan services. Theoretically the "pay-for-services" model would encourage efficient financial allocation, and minimize the extent to which paid journal subscriptions go unused. However, this model also could potentially negate the benefits of group purchases for journal subscriptions and access. All of the major journal publishers offer package subscriptions that compare favorably in cost with the sum of individual subscription costs for a similar selection of titles. Furthermore, some of these subscription packages are "consortium" purchases that are funded collectively by the libraries at multiple NASA research centers; such consortial memberships would be difficult for the library to pay if enough GRC research units were to withdraw their pooled contributions. A sound decision therefore requires comparing the cost of collectively-funded journal access with the cost of individual subscriptions. My primary task this summer is to create the cost dataset framework, and collect as much of the relevant data as possible. Hopefully this dataset will permit the research units at the GRC, and library administration as well, to make informed decisions about future library funding. Prior to the creation of the actual dataset, I established a comprehensive list of the library's print and online journal subscriptions. This list will be useful outside the context of the cost analysis project, as an addition to the library website. The cost analysis dataset's primary fields are: journal name, vendor, publisher, ISSN (International Standard Serial Number, to uniquely identify the titles), stand-alone price, and an indication as to the presence of the journal in current GRC Technical Library consortium membership subscriptions. The dataset will hopefully facilitate comparisons between the stand-alone journal prices and the cost of shared journal subscriptions for groups of titles.
Damsel: A Data Model Storage Library for Exascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koziol, Quincey
The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. We will accomplish this through three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community.
An Interaction Library for the FcεRI Signaling Network
Chylek, Lily A.; Holowka, David A.; Baird, Barbara A.; ...
2014-04-15
Antigen receptors play a central role in adaptive immune responses. Although the molecular networks associated with these receptors have been extensively studied, we currently lack a systems-level understanding of how combinations of non-covalent interactions and post-translational modifications are regulated during signaling to impact cellular decision-making. To fill this knowledge gap, it will be necessary to formalize and piece together information about individual molecular mechanisms to form large-scale computational models of signaling networks. To this end, we have developed an interaction library for signaling by the high-affinity IgE receptor, FcεRI. The library consists of executable rules for protein–protein and protein–lipid interactions. This library extends earlier models for FcεRI signaling and introduces new interactions that have not previously been considered in a model. Thus, this interaction library is a toolkit with which existing models can be expanded and from which new models can be built. As an example, we present models of branching pathways from the adaptor protein Lat, which influence production of the phospholipid PIP3 at the plasma membrane and the soluble second messenger IP3. We find that inclusion of a positive feedback loop gives rise to a bistable switch, which may ensure robust responses to stimulation above a threshold level. In addition, the library is visualized to facilitate understanding of network circuitry and identification of network motifs.
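The bistable-switch behavior attributed to the positive feedback loop can be reproduced with a generic toy model (not the paper's rule-based FcεRI model): a species that promotes its own production with cooperative, Hill-type feedback settles into either a low or a high steady state depending on where it starts.

```python
from scipy.integrate import solve_ivp

# dx/dt = basal + fb * x^n / (K^n + x^n) - decay * x
# Cooperative positive feedback (n > 1) yields two stable steady states.
def rhs(t, x, basal=0.05, fb=1.0, K=0.5, n=4, decay=1.0):
    return basal + fb * x**n / (K**n + x**n) - decay * x

for x0 in (0.1, 0.6):   # start below vs. above the switching threshold
    sol = solve_ivp(rhs, (0.0, 50.0), [x0], rtol=1e-8)
    print(f"x0 = {x0}: steady state ~ {sol.y[0, -1]:.3f}")
# Prints a low state (~0.06) for x0 = 0.1 and a high state (~0.99) for x0 = 0.6.
```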
The ESA Space Environment Information System (SPENVIS)
NASA Astrophysics Data System (ADS)
Heynderickx, D.; Quaghebeur, B.; Evans, H. D. R.
2002-01-01
The ESA SPace ENVironment Information System (SPENVIS) provides standardized access to models of the hazardous space environment through a user-friendly WWW interface. The interface includes parameter input with extensive defaulting, definition of user environments, streamlined production of results (both in graphical and textual form), background information, and on-line help. It is available on-line at http://www.spenvis.oma.be/spenvis/. SPENVIS is designed to help spacecraft engineers perform rapid analyses of environmental problems and, with extensive documentation and tutorial information, allows engineers with relatively little familiarity with the models to produce reliable results. It has been developed in response to the increasing pressure for rapid-response tools for system engineering, especially in low-cost commercial and educational programmes. It is very useful in conjunction with radiation effects and electrostatic charging testing in the context of hardness assurance. SPENVIS is based on internationally recognized standard models and methods in many domains. It uses an ESA-developed orbit generator to produce orbital point files necessary for many different types of problem. It has various reporting and graphical utilities, and extensive help facilities. The SPENVIS radiation module features models of the proton and electron radiation belts, as well as solar energetic particle and cosmic ray models. The particle spectra serve as input to models of ionising dose (SHIELDOSE), Non-Ionising Energy Loss (NIEL), and Single Event Upsets (CREME). Material shielding is taken into account for all these models, either as a set of user-defined shielding thicknesses, or in combination with a sectoring analysis that produces a shielding distribution from a geometric description of the satellite system. A sequence of models, from orbit generator to folding dose curves with a shielding distribution, can be run as one process, which minimizes user interaction and facilitates multiple runs with different orbital or shielding configurations. SPENVIS features a number of models and tools for evaluating spacecraft charging. The DERA DICTAT tool for evaluation of internal charging calculates the electron current that passes through a conductive shield and becomes deposited inside a dielectric, and predicts whether an electrostatic discharge will occur. SPENVIS has implemented the DERA EQUIPOT non-geometrical tool for assessing material susceptibility to charging in typical orbital environments, including polar and GEO environments. SPENVIS also includes SOLARC, for assessment of the current collection and the floating potential of solar arrays in LEO. Finally, the system features access to data from surface charging events on CRRES and the Russian Gorizont spacecraft, in the form of spectrograms and double Maxwellian fit parameters. SPENVIS also contains an active, integrated version of the ECSS Space Environment Standard, and access to in-flight data. Apart from radiation and plasma environments, SPENVIS includes meteoroid and debris models, atmospheric models (including atomic oxygen), and magnetic field models implemented by means of the UNILIB library for magnetic coordinate evaluation, magnetic field line tracing and drift shell tracing. The UNILIB library is freely accessible from the Web (http://www.magnet.oma.be/unilib/) for downloading in the form of a Fortran object library for different platforms (DecAlpha, SunOS, HPUX and PC/MS-Windows).
The Medical Library Association Benchmarking Network: development and implementation.
Dudden, Rosalind Farnam; Corcoran, Kate; Kaplan, Janice; Magouirk, Jeff; Rand, Debra C; Smith, Bernie Todd
2006-04-01
This article explores the development and implementation of the Medical Library Association (MLA) Benchmarking Network from the initial idea and test survey, to the implementation of a national survey in 2002, to the establishment of a continuing program in 2004. Started as a program for hospital libraries, it has expanded to include other nonacademic health sciences libraries. The activities and timelines of MLA's Benchmarking Network task forces and editorial board from 1998 to 2004 are described. The Benchmarking Network task forces successfully developed an extensive questionnaire with parameters of size and measures of library activity and published a report of the data collected by September 2002. The data were available to all MLA members in the form of aggregate tables. Utilization of Web-based technologies proved feasible for data intake and interactive display. A companion article analyzes and presents some of the data. MLA has continued to develop the Benchmarking Network with the completion of a second survey in 2004. The Benchmarking Network has provided many small libraries with comparative data to present to their administrators. It is a challenge for the future to convince all MLA members to participate in this valuable program.
Vriamont, Nicolas; Govaerts, Bernadette; Grenouillet, Pierre; de Bellefon, Claude; Riant, Olivier
2009-06-15
A library of catalysts was designed for asymmetric hydrogen transfer to acetophenone. First, the whole library was evaluated using high-throughput experiments (HTE). The catalysts were ranked in ascending order of performance, and the best catalysts were identified. In the second step, various simulated evolution experiments based on a genetic algorithm were applied to this library: a small part of the library, called the mother generation (G0), evolved from generation to generation. The goal was to use our collection of HTE data to adjust the parameters of the genetic algorithm in order to obtain a maximum of the best catalysts within a minimal number of generations. It was found, in particular, that the results of simulated evolution depended on the selection of G0 and that a random G0 should be preferred. We also demonstrated that it was possible to find 5 to 6 of the ten best catalysts while investigating only 10% of the library. Moreover, we developed a double algorithm that makes this result still achievable when the evolution starts with one of the worst G0.
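A hedged sketch of the simulated-evolution loop on a pre-screened library: since HTE scores for every member are known, a genetic algorithm (selection, crossover, mutation) can be replayed in silico and tuned to find top catalysts while "testing" as few members as possible. The bit-string encoding and all parameter choices below are illustrative assumptions.

```python
import random

random.seed(0)
LIB_SIZE, N_BITS = 1024, 10          # catalyst = combination of 10 binary building-block choices
score = [random.random() for _ in range(LIB_SIZE)]    # stand-in for the HTE data

def crossover(a, b):
    cut = random.randrange(1, N_BITS)     # one-point crossover on the bit strings
    mask = (1 << cut) - 1
    return (a & mask) | (b & ~mask)

def mutate(ind, rate=0.05):
    for bit in range(N_BITS):
        if random.random() < rate:
            ind ^= 1 << bit               # flip one building-block choice
    return ind

pop = random.sample(range(LIB_SIZE), 16)  # random mother generation G0
tested = set(pop)
for generation in range(6):
    pop.sort(key=lambda i: score[i], reverse=True)
    parents = pop[:8]                     # truncation selection on measured scores
    pop = [mutate(crossover(random.choice(parents), random.choice(parents)))
           for _ in range(16)]
    tested.update(pop)

best10 = set(sorted(range(LIB_SIZE), key=score.__getitem__, reverse=True)[:10])
print(f"tested {len(tested)}/{LIB_SIZE} catalysts, "
      f"found {len(best10 & tested)} of the top 10")
```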
Evaluating digital libraries in the health sector. Part 1: measuring inputs and outputs.
Cullen, Rowena
2003-12-01
This is the first part of a two-part paper that explores methods for evaluating digital libraries in the health sector. In this first part, some approaches to evaluation proposed for mainstream digital information services are examined for their suitability as models for the health sector. The paper summarizes some major national and collaborative initiatives to develop measures for digital libraries and analyses these approaches in terms of their relationship to traditional measures of library performance, which focus on inputs and outputs, and their relevance to current debates among health information specialists. The second part looks more specifically at evaluative models based on outcomes and at models being developed in the health sector.
Full stellar kinematical profiles of central parts of nearby galaxies
NASA Astrophysics Data System (ADS)
Vudragović, A.; Samurović, S.; Jovanović, M.
2016-09-01
Context. We present the largest catalog of detailed stellar kinematics of the central parts of nearby galaxies, which includes higher moments of the line-of-sight velocity distribution (LOSVD) represented by the Gauss-Hermite series. The kinematics is measured on a sample of galaxies selected from the Arecibo Legacy Fast ALFA (ALFALFA) survey using spectroscopy from the Sloan Digital Sky Survey (SDSS DR7). Aims: The SDSS DR7 offers measurements of the LOSVD based on the assumption of a pure Gaussian shape of the broadening function caused by the combination of rotational and random motion of the stars in galaxies. We discuss the consequences of this oversimplification, since the velocity dispersion, one of the measured quantities, often serves as the proxy for important modeling parameters such as the black-hole mass and the virial mass of galaxies. Methods: The publicly available pPXF code is used to calculate the full kinematical profile for the sample galaxies, including higher moments of their LOSVD. Both observed and synthetic stellar libraries were used, and the related template mismatch problem is discussed. Results: For the whole sample of 2180 nearby galaxies, reflecting the morphological distribution characteristic of the local Universe, we successfully recovered the stellar kinematics of their central parts, including higher-order moments of the LOSVD, for signal-to-noise ratios above 50. Conclusions: We show, for both the empirical and the synthetic stellar library, the consequences for the velocity dispersion of oversimplifying the LOSVD with a Gaussian. For the empirical stellar library, this approximation leads to an increase in the virial mass of 13% on average, while for the synthetic library the effect is weaker, with an increase of 9% on average. Systematic erroneous estimates of the velocity dispersion come from the use of the synthetic stellar library instead of the empirical one and are much larger than those imposed by the use of the Gaussian function. Only after a careful analysis of the template mismatch problem does one need to address the issue of the deviation of the LOSVD from a Gaussian. We also show that the kurtotic parameter describing symmetrical departures from the Gaussian seems to increase along the continuous morphological sequence from late- to early-type galaxies. The catalog is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/593/A40
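For reference, the Gauss-Hermite parameterization of the LOSVD used in this kind of analysis (the standard van der Marel & Franx form, quoted here as background rather than from the paper's text) is:

```latex
\mathcal{L}(v) = \frac{e^{-y^{2}/2}}{\sigma\sqrt{2\pi}}
\left[ 1 + \sum_{m=3}^{M} h_{m} H_{m}(y) \right],
\qquad y = \frac{v - V}{\sigma},
```

where V and σ are the mean velocity and velocity dispersion, H_m are Hermite polynomials, and h3 and h4 quantify the asymmetric and symmetric (kurtotic) departures from a Gaussian, respectively.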
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martz, Roger L.
The Revised Eolus Grid Library (REGL) is a mesh-tracking library that was developed for use with the MCNP6™ computer code so that (radiation) particles can track on an unstructured mesh. The unstructured mesh is a finite element representation of any geometric solid model created with a state-of-the-art CAE/CAD tool. The mesh-tracking library is written using modern Fortran and programming standards; the library is Fortran 2003 compliant. The library was created with a defined application programmer interface (API) so that it could easily integrate with other particle tracking/transport codes. The library does not itself handle parallel processing via the Message Passing Interface (MPI), but has been used successfully where the host code handles the MPI calls. The library is thread-safe and supports the OpenMP paradigm. As a library, all features are available through the API, and overall a tight coupling between it and the host code is required. Features of the library include the following: it can accommodate first- and second-order 4-, 5-, and 6-sided polyhedra; any combination of element types may appear in a single geometry model; parts may not contain tetrahedra mixed with other element types; pentahedra and hexahedra can be together in the same part; robust handling of overlaps and gaps; tracking element-to-element to produce path-length results at the element level; finding element numbers for a given mesh location; finding intersection points on element faces for the particle tracks; producing a data file for post-processing results analysis; reading Abaqus .inp (ASCII) input files to obtain information for the global mesh model; parallel input processing via MPI; and parallel particle transport via both MPI and OpenMP.
ERIC Educational Resources Information Center
Horrocks, Norman
1984-01-01
Reports on conference convened by Association for Library and Information Science Education for discussion of library school accreditation by 17 library-related associations and agencies. Highlights include accreditation models, accrediting information science, records management, special librarians, certification for archivists, M.L.S. in…
A Model for Service to the Elderly by the Small/Medium Sized Public Library.
ERIC Educational Resources Information Center
Jeffries, Stephen R.
Citing the American Library Association's 1964 statement of libraries' responsibilities to the aging, this paper presents a needs assessment of the elderly and suggests methods by which public libraries can attempt to meet some of these needs, e.g., barrier-free design, programs, materials, and services. The needs assessment of older users…
Marketing & Libraries Do Mix: A Handbook for Libraries and Information Centers.
ERIC Educational Resources Information Center
Tenney, H. Baird; And Others
This handbook offers a practical set of ideas to help all types of libraries in the task of marketing their services in an increasingly competitive economy and provides a model program as urged by the White House Conference on Library and Information Services. It is aimed at adult information services in particular, with passing references to…
A Study of Organization and Governance of Alabama State Library Systems.
ERIC Educational Resources Information Center
Public Administration Service, Washington, DC.
In order to provide the citizens of Alabama with the best possible library service for a given level of funding, this study recommends a model for the organization and funding of multi-type library cooperation in the state, based on a review of past developments and current conditions, together with proposed changes in state library legislation…
ERIC Educational Resources Information Center
Chen, Ching-chih; Hernon, Peter
This two-part publication reports on a study of consumer information delivery by library and non-library networks, which involved an extensive literature review, a telephone survey of 620 library networks, the development of an assessment model for the effectiveness of network information delivery, the development of an in-depth guide for…
Atmospheric particulate analysis using angular light scattering
NASA Technical Reports Server (NTRS)
Hansen, M. Z.
1980-01-01
Using the light scattering matrix elements measured by a polar nephelometer, a procedure for estimating the characteristics of atmospheric particulates was developed. A theoretical library data set of scattering matrices derived from Mie theory was tabulated for a range of values of the size parameter and refractive index typical of atmospheric particles. Integration over the size parameter yielded the scattering matrix elements for a variety of hypothesized particulate size distributions. A least squares curve fitting technique was used to find a best fit from the library data for the experimental measurements. This was used as a first guess for a nonlinear iterative inversion of the size distributions. A real index of 1.50 and an imaginary index of -0.005 are representative of the smoothed inversion results for the near ground level atmospheric aerosol in Tucson.
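A sketch of the library-matching step: precompute scattering signatures over a grid of refractive indices, then pick the least-squares best fit to the measured angular scattering as the first guess for the iterative inversion. Henyey-Greenstein toy phase functions stand in here for the Mie-derived library entries, which would normally come from a Mie code; the grid values are assumptions.

```python
import numpy as np

angles = np.radians(np.arange(10, 171, 5))      # polar nephelometer angles

def toy_phase(g):
    # Henyey-Greenstein phase function: a stand-in for Mie-computed entries.
    return (1 - g**2) / (1 + g**2 - 2 * g * np.cos(angles)) ** 1.5

# "Library": phase functions tabulated over a grid of asymmetry parameters,
# each tagged with the (real, imaginary) refractive index it represents.
library = {(1.50, -0.005 * k): toy_phase(0.5 + 0.04 * k) for k in range(10)}

rng = np.random.default_rng(0)
measured = toy_phase(0.54) * (1 + 0.02 * rng.normal(size=angles.size))

# Least-squares match in log space (scattering spans orders of magnitude).
def misfit(model):
    return np.sum((np.log(measured) - np.log(model)) ** 2)

best = min(library, key=lambda idx: misfit(library[idx]))
print("best-fit (real, imaginary) index:", best)   # first guess for inversion
```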
ERIC Educational Resources Information Center
Carlson, Lawrence O.
The project, intended to design and field test models of specialized library services for older adults, was conducted in two parts. Phase 1 consisted of collecting and evaluating data for use in designing models in Louisville, Lexington, Somerset, and Hazard, Kentucky. Data was collected by search of the literature, personal interviews, a…
Amber Plug-In for Protein Shop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliva, Ricardo
2004-05-10
The Amber Plug-in for ProteinShop has two main components: an AmberEngine library to compute the protein energy models, and a module to solve the energy minimization problem using an optimization algorithm from the OPT++ library. Together, these components allow the visualization of the protein folding process in ProteinShop. AmberEngine is an object-oriented library to compute molecular energies based on the Amber model. The main class is called ProteinEnergy. Its main interface methods are (1) "init", to initialize internal variables needed to compute the energy, and (2) "eval", to evaluate the total energy given a vector of coordinates. Additional methods allow the user to evaluate the individual components of the energy model (bond, angle, dihedral, non-bonded 1-4, and non-bonded energies) and to obtain the energy of each individual atom. The AmberEngine library source code includes examples and test routines that illustrate the use of the library in stand-alone programs. The energy minimization module uses the AmberEngine library and the nonlinear optimization library OPT++. OPT++ is open-source software available under the GNU Lesser General Public License. The minimization module currently uses the LBFGS optimization algorithm in OPT++ to perform the energy minimization. Future releases may give the user a choice of other algorithms available in OPT++.
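The interface shape described above can be mocked in a few lines of Python (the real class is C++ and evaluates the full Amber force field; this mock sketches only the bond term, and all names besides `init`/`eval` are assumptions):

```python
import numpy as np

class ProteinEnergy:
    """Mock of the AmberEngine interface: init caches parameters,
    eval returns a total energy for a flat coordinate vector."""

    def init(self, topology):
        # Cache per-term parameters (force constants, equilibrium lengths...).
        self.k_bond, self.r0 = topology["k_bond"], topology["r0"]
        self.bonds = topology["bonds"]

    def eval(self, coords):
        # Only the harmonic bond term is sketched; the angle, dihedral,
        # and non-bonded terms of the Amber model are analogous sums.
        xyz = coords.reshape(-1, 3)
        e = 0.0
        for i, j in self.bonds:
            r = np.linalg.norm(xyz[i] - xyz[j])
            e += self.k_bond * (r - self.r0) ** 2
        return e

engine = ProteinEnergy()
engine.init({"k_bond": 300.0, "r0": 1.53, "bonds": [(0, 1), (1, 2)]})
coords = np.array([0, 0, 0, 1.53, 0, 0, 3.1, 0, 0], dtype=float)
print(engine.eval(coords))   # small positive strain from the stretched bond
```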
Optimization Methods in Sherpa
NASA Astrophysics Data System (ADS)
Siemiginowska, Aneta; Nguyen, Dan T.; Doe, Stephen M.; Refsdal, Brian L.
2009-09-01
Forward fitting is a standard technique used to model X-ray data. A statistic, usually weighted chi^2 or a Poisson likelihood (e.g., Cash), is minimized in the fitting process to obtain a set of best-fit model parameters. Astronomical models often have complex forms with many parameters that can be correlated (e.g., an absorbed power law). Minimization is not trivial in such a setting, as the statistical parameter space becomes multimodal and finding the global minimum is hard. Standard minimization algorithms can be found in many libraries of scientific functions, but they are usually focused on specific functions. Sherpa, however, designed as a general fitting and modeling application, requires very robust optimization methods that can be applied to a variety of astronomical data (X-ray spectra, images, timing, optical data, etc.). We developed several optimization algorithms in Sherpa targeting a wide range of minimization problems. Two local minimization methods were built: a Levenberg-Marquardt algorithm, obtained from the MINPACK subroutine LMDIF and modified to achieve the required robustness, and a Nelder-Mead simplex method implemented in-house based on variations of the algorithm described in the literature. A global-search Monte Carlo method has been implemented following the differential evolution algorithm presented by Storn and Price (1997). We will present the methods in Sherpa and discuss their usage cases, focusing on the application to Chandra data with both 1D and 2D examples. This work is supported by NASA contract NAS8-03060 (CXC).
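The local-versus-global tradeoff on a multimodal statistic can be illustrated with SciPy's implementations of the same two algorithm families (a Nelder-Mead local search versus Storn-Price differential evolution); this is a generic illustration with a made-up statistic, not Sherpa code.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# A multimodal "fit statistic": many local minima, global minimum near (1, 2).
def stat(p):
    x, y = p
    return ((x - 1) ** 2 + (y - 2) ** 2
            + 5 * np.sin(3 * x) ** 2 + 5 * np.sin(3 * y) ** 2)

# Local Nelder-Mead simplex from a poor starting point: may stall nearby.
local = minimize(stat, x0=[4.0, -3.0], method="Nelder-Mead")

# Global search with differential evolution (Storn & Price 1997).
global_ = differential_evolution(stat, bounds=[(-5, 5), (-5, 5)], seed=0)

print("local  :", local.x, local.fun)
print("global :", global_.x, global_.fun)
```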
Bleckenwegner, Petra; Mardare, Cezarina Cela; Cobet, Christoph; Kollender, Jan Philipp; Hassel, Achim Walter; Mardare, Andrei Ionut
2017-02-13
Optical bandgap mapping of Nb-Ti mixed oxides anodically grown on a thin-film parent metallic combinatorial library was performed via variable angle spectroscopic ellipsometry (VASE). A wide Nb-Ti compositional spread ranging from Nb-90 at.% Ti to Nb-15 at.% Ti deposited by cosputtering was used for this purpose. The Nb-Ti library was stepwise anodized at potentials up to 10 V (SHE), and the optical properties of the anodic oxides were mapped along the Nb-Ti library with 2 at.% resolution. The surface dissimilarities along the Nb-Ti compositional gradient were minimized by tuning the deposition parameters, thus allowing a description of the mixed Nb-Ti oxides based on a single Tauc-Lorentz oscillator for data fitting. Mapping of the optical bandgap of the Nb-Ti oxides along the entire compositional spread showed a clear deviation from the linear model based on mixing individual Nb and Ti electronegativities in proportion to their atomic fractions. This is attributed to the strong amorphization and an in-depth compositional gradient of the mixed oxides. A systematic optical bandgap decrease toward values as low as 2.0 eV was identified at approximately 50 at.% Nb. Mixing of Nb2O5 and TiO2 with both amorphous and crystalline phases is concluded, whereas the formation of a complex Nb_aTi_bO_y oxide during anodization is unlikely.
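For context, the single Tauc-Lorentz oscillator used in such fits models the imaginary part of the dielectric function as (standard Jellison-Modine form, quoted as background rather than from the paper):

```latex
\varepsilon_{2}(E) =
\begin{cases}
\dfrac{1}{E}\cdot\dfrac{A\,E_{0}\,C\,(E - E_{g})^{2}}{\left(E^{2} - E_{0}^{2}\right)^{2} + C^{2}E^{2}}, & E > E_{g},\\[2ex]
0, & E \le E_{g},
\end{cases}
```

where E_g is the Tauc gap, E_0 the peak transition energy, C the broadening, and A the amplitude; the real part follows by Kramers-Kronig integration.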
TRAC Searchable Research Library
2016-05-01
A network-accessible document repository for technical documents and similar document artifacts. We used a model-based approach based on the Vector Directional Data Model for demonstration and model refinement. Subject terms: Knowledge Management, Document Repository, Digital Library, Vector Directional Data Model.
GeoTess: A generalized Earth model software utility
Ballard, Sanford; Hipp, James; Kraus, Brian; ...
2016-03-23
GeoTess is a model parameterization and software support library that manages the construction, population, storage, and interrogation of data stored in 2D and 3D Earth models. Here, the software is available in Java and C++, with a C interface to the C++ library.
STANLEY (Sandia Text ANaLysis Extensible librarY) Ver. 1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
BENZ, ZACHARY; APODACA, VINCENT; BASILICO, JUSTIN
2009-11-10
Reusable, extensible text analysis library. This library forms the basis for the automated generation of cognitive models from text for Sandia's Cognition program. It also is the basis for the suite of underlying, related applications.
Graded nanowell arrays: a fine plasmonic "library" with an adjustable spectral range.
Xue, Peihong; Ye, Shunsheng; Su, Hongyang; Wang, Shuli; Nan, Jingjie; Chen, Xingchi; Ruan, Weidong; Zhang, Junhu; Cui, Zhanchen; Yang, Bai
2017-05-25
We present an effective approach for fabricating graded plasmonic arrays based on ordered micro-/nanostructures with a geometric gradient. Ag nanowell arrays with graded geometric parameters were fabricated and systematically investigated. The order of the graded plasmonic arrays is generated by colloidal lithography, while the geometric gradient is the result of inclined reactive ion etching. The surface plasmon resonance (SPR) peaks were measured at different positions, which move gradually along the Ag nanowell arrays with a geometric gradient. Such micro-/nanostructure arrays with graded and integrated SPR peaks can work as a fine plasmonic "library" (FPL), and the spectral range can be controlled using a "coarse adjustment knob" (lattice constant) and a "fine adjustment knob" (pore diameter). Additionally, the spectral resolution of the FPL is high, which benefits from the high value of the full height/full width at half-maximum and the small step size of the wavelength shift (0.5 nm). Meanwhile, the FPL could be effectively applied as a well-defined model to verify the plasmonic enhancement in surface enhanced Raman scattering. As the FPL is an integrated optical material with graded individual SPR peaks, it can not only be a theoretical model for fundamental research, but also has great potential in high-throughput screening of optical materials, multiplex sensors, etc.
Evaluation of the Pool Critical Assembly Benchmark with Explicitly-Modeled Geometry using MCNP6
Kulesza, Joel A.; Martz, Roger Lee
2017-03-01
Despite being one of the most widely used benchmarks for qualifying light water reactor (LWR) radiation transport methods and data, no benchmark calculation of the Oak Ridge National Laboratory (ORNL) Pool Critical Assembly (PCA) pressure vessel wall benchmark facility (PVWBF) using MCNP6 with explicitly modeled core geometry exists. As such, this paper provides results for such an analysis. First, a criticality calculation is used to construct the fixed source term. Next, ADVANTG-generated variance reduction parameters are used within the final MCNP6 fixed source calculations. These calculations provide unadjusted dosimetry results using three sets of dosimetry reaction cross sections of varying ages (those packaged with MCNP6, from the IRDF-2002 multi-group library, and from the ACE-formatted IRDFF v1.05 library). These results are then compared to two different sets of measured reaction rates. The comparison agrees in an overall sense within 2% and on a specific reaction- and dosimetry-location basis within 5%. Except for the neptunium dosimetry, the individual foil raw calculation-to-experiment comparisons usually agree within 10% but are typically greater than unity. Finally, in the course of developing these calculations, geometry that has previously not been completely specified is provided herein for the convenience of future analysts.
Essential issues in the design of shared document/image libraries
NASA Astrophysics Data System (ADS)
Gladney, Henry M.; Mantey, Patrick E.
1990-08-01
We consider what is needed to create electronic document libraries which mimic physical collections of books, papers, and other media. The quantitative measures of merit for personal workstations - cost, speed, size of volatile and persistent storage - will improve by at least an order of magnitude in the next decade. Every professional worker will be able to afford a very powerful machine, but databases and libraries are not really economical and useful unless they are shared. We therefore see a two-tier world emerging, in which custodians of information make it available to network-attached workstations. A client-server model is the natural description of this world. In collaboration with several state governments, we have considered what would be needed to replace paper-based record management for a dozen different applications. We find that a professional worker can anticipate most data needs and that (s)he is interested in each clump of data for a period of days to months. We further find that only a small fraction of any collection will be used in any period. Given expected bandwidths, data sizes, search times and costs, and other such parameters, an effective strategy to support user interaction is to bring large clumps from their sources, to transform them into convenient representations, and only then start whatever investigation is intended. A system-managed hierarchy of caches and archives is indicated. Each library is a combination of a catalog and a collection, and each stored item has a primary instance which is the standard by which the correctness of any copy is judged. Catalog records mostly refer to 1 to 3 stored items. Weighted by the number of bytes to be stored, immutable data dominate collections. These characteristics affect how consistency, currency, and access control of replicas distributed in the network should be managed. We present the large features of a design for network document/image library services. A prototype is being built for State of California pilot applications. The design allows library servers in any environment with an ANSI SQL database; clients execute in any environment; communications are with either TCP/IP or SNA LU 6.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sublet, J.-Ch., E-mail: jean-christophe.sublet@ukaea.uk; Eastwood, J.W.; Morgan, J.G.
Fispact-II is a code system and library database for modelling activation-transmutation processes, depletion-burn-up, time-dependent inventory, and radiation damage source terms caused by nuclear reactions and decays. The Fispact-II code, written in object-style Fortran, follows the evolution of material irradiated by neutrons, alphas, gammas, protons, or deuterons, and provides a wide range of derived radiological output quantities to satisfy most needs for nuclear applications. It can be used with any ENDF-compliant group library data for nuclear reactions, particle-induced and spontaneous fission yields, and radioactive decay (including but not limited to TENDL-2015, ENDF/B-VII.1, JEFF-3.2, JENDL-4.0u, CENDL-3.1 processed into fine-group-structure files, GEFY-5.2 and UKDD-16), as well as resolved and unresolved resonance range probability tables for self-shielding corrections and updated radiological hazard indices. The code has many novel features, including: extension of the energy range up to 1 GeV; additional neutron physics including self-shielding effects, temperature dependence, and thin and thick target yields; pathway analysis; and sensitivity and uncertainty quantification and propagation using full covariance data. The latest ENDF libraries such as TENDL encompass thousands of target isotopes. Nuclear data libraries for Fispact-II are prepared from these using the processing codes PREPRO, NJOY and CALENDF. These data include resonance parameters, cross sections with covariances, probability tables in the resonance ranges, PKA spectra, kerma, dpa, gas and radionuclide production, and energy-dependent fission yields, supplemented with all 27 decay types. All such data for the five most important incident particles are provided in evaluated data tables. The Fispact-II simulation software is described in detail in this paper, together with the nuclear data libraries. The Fispact-II system also includes several utility programs for code-use optimisation, visualisation and production of secondary radiological quantities. Included in the paper are summaries of results from the suite of verification and validation reports available with the code.
Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars
2016-04-12
A building energy analysis system includes: (1) a building component library configured to store a plurality of building components; (2) a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and selected building components from the library; (3) a building analysis engine configured to operate the building model, generate a baseline energy model of the building under analysis, and apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models; and (4) a recommendation tool configured to assess the optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
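A minimal sketch, with invented component names and numbers, of the workflow the abstract describes: a baseline model assembled from library components, energy conservation measures applied to it, and recommendations generated by comparing each variant against the baseline.

```python
# Toy baseline-vs-ECM comparison; the component library, factors, and the
# multiplicative "energy model" are all illustrative assumptions.
component_library = {"window_single": 1.00, "window_low_e": 0.93,
                     "hvac_standard": 1.00, "hvac_high_eff": 0.85}

def annual_energy(components, base_use=100_000.0):
    """Toy energy model: baseline use scaled by component factors (kWh/yr)."""
    factor = 1.0
    for c in components:
        factor *= component_library[c]
    return base_use * factor

baseline = annual_energy(["window_single", "hvac_standard"])
ecms = {"low-e windows": ["window_low_e", "hvac_standard"],
        "high-eff HVAC": ["window_single", "hvac_high_eff"]}
for name, comps in ecms.items():                 # recommend measures that beat baseline
    savings = baseline - annual_energy(comps)
    print(f"{name}: saves {savings:,.0f} kWh/yr")
```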
Analysis of Network Topologies Underlying Ethylene Growth Response Kinetics
Prescott, Aaron M.; McCollough, Forest W.; Eldreth, Bryan L.; Binder, Brad M.; Abel, Steven M.
2016-01-01
Most models for ethylene signaling involve a linear pathway. However, measurements of seedling growth kinetics when ethylene is applied and removed have resulted in more complex network models that include coherent feedforward, negative feedback, and positive feedback motifs. The dynamical responses of the proposed networks have not been explored in a quantitative manner. Here, we explore (i) whether any of the proposed models are capable of producing growth-response behaviors consistent with experimental observations and (ii) what mechanistic roles various parts of the network topologies play in ethylene signaling. To address this, we used computational methods to explore two general network topologies: The first contains a coherent feedforward loop that inhibits growth and a negative feedback from growth onto itself (CFF/NFB). In the second, ethylene promotes the cleavage of EIN2, with the product of the cleavage inhibiting growth and promoting the production of EIN2 through a positive feedback loop (PFB). Since few network parameters for ethylene signaling are known in detail, we used an evolutionary algorithm to explore sets of parameters that produce behaviors similar to experimental growth response kinetics of both wildtype and mutant seedlings. We generated a library of parameter sets by independently running the evolutionary algorithm many times. Both network topologies produce behavior consistent with experimental observations, and analysis of the parameter sets allows us to identify important network interactions and parameter constraints. We additionally screened these parameter sets for growth recovery in the presence of sub-saturating ethylene doses, which is an experimentally-observed property that emerges in some of the evolved parameter sets. Finally, we probed simplified networks maintaining key features of the CFF/NFB and PFB topologies. From this, we verified observations drawn from the larger networks about mechanisms underlying ethylene signaling. Analysis of each network topology results in predictions about changes that occur in network components that can be experimentally tested to give insights into which, if either, network underlies ethylene responses. PMID:27625669
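As a concrete illustration of the parameter-search strategy, here is a minimal sketch of a simple evolutionary algorithm evolving parameters of a toy growth model toward a target response curve; the exponential "growth model", the fitness function, and all rates are illustrative stand-ins for the paper's CFF/NFB and PFB network models.

```python
# Evolutionary search over model parameters against target kinetics (toy example).
import numpy as np

rng = np.random.default_rng(42)
t = np.linspace(0.0, 10.0, 50)
target = 1.0 - np.exp(-0.8 * t)          # stand-in for measured growth kinetics

def simulate(params):
    rate, scale = params
    return scale * (1.0 - np.exp(-rate * t))

def fitness(params):
    return -np.sum((simulate(params) - target) ** 2)   # higher is better

pop = rng.uniform(0.1, 2.0, size=(100, 2))             # initial population
for generation in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[-20:]]            # truncation selection
    children = parents[rng.integers(0, 20, size=80)]
    children += rng.normal(0.0, 0.05, size=children.shape)  # mutation
    pop = np.vstack([parents, np.abs(children)])

best = pop[np.argmax([fitness(p) for p in pop])]
print("best parameters:", best)                        # rate near 0.8, scale near 1.0
```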
Fernández-San-Martín, Maria Isabel; Martín-López, Luis Miguel; Masa-Font, Roser; Olona-Tabueña, Noemí; Roman, Yuani; Martin-Royo, Jaume; Oller-Canet, Silvia; González-Tejón, Susana; San-Emeterio, Luisa; Barroso-Garcia, Albert; Viñas-Cabrera, Lidia; Flores-Mateo, Gemma
2014-01-01
Patients with severe mental illness have a higher prevalence of cardiovascular risk factors (CRF). The objective is to determine whether interventions to modify lifestyles in these patients reduce anthropometric and analytical parameters related to CRF in comparison with routine clinical practice. Systematic review of controlled clinical trials with lifestyle interventions in Medline, the Cochrane Library, Embase, PsycINFO, and CINAHL. Outcomes were change in body mass index, waist circumference, cholesterol, triglycerides, and blood sugar. Meta-analyses were performed using random-effects models to estimate the weighted mean difference. Heterogeneity was assessed using the I² statistic and subgroup analyses. Twenty-six studies were selected. Lifestyle interventions decreased anthropometric and analytical parameters at 3 months of follow-up. At 6 and 12 months, the differences between the intervention and control groups were maintained, although with less precision. More studies with larger samples and long-term follow-up are needed.
NASA Technical Reports Server (NTRS)
Brackett, Robert A.; Arvidson, Raymond E.
1993-01-01
A technique is presented that allows extraction of compositional and textural information from visible, near-infrared, and thermal infrared remotely sensed data. Using a library of both emissivity and reflectance spectra, endmember abundances and endmember thermal inertias are extracted from AVIRIS (Airborne Visible and Infrared Imaging Spectrometer) and TIMS (Thermal Infrared Multispectral Scanner) data over Lunar Crater Volcanic Field, Nevada, using a dual inversion. The inversion technique is motivated by upcoming Mars Observer data and the need to separate composition and texture parameters from subpixel mixtures of bedrock and dust. The model employed offers the opportunity to extract compositional and textural information for a variety of endmembers within a given pixel. Geologic inferences concerning grain size, abundance, and source of endmembers can be made directly from the inverted data. These parameters are of direct relevance to Mars exploration, both for Mars Observer and for follow-on missions.
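A minimal sketch of the linear unmixing at the heart of such an inversion: a pixel spectrum is modeled as a nonnegative linear mixture of library endmember spectra, and abundances are recovered by least squares. The synthetic endmember matrix and noise level are illustrative; the paper's dual inversion (abundances plus thermal inertias) is more elaborate.

```python
# Linear spectral unmixing by nonnegative least squares (synthetic stand-ins).
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_bands, n_endmembers = 50, 3
E = rng.uniform(0.1, 0.9, size=(n_bands, n_endmembers))   # library endmember spectra
true_abund = np.array([0.6, 0.3, 0.1])
pixel = E @ true_abund + rng.normal(0, 0.005, n_bands)    # mixed pixel + noise

abund, residual = nnls(E, pixel)          # nonnegative least-squares abundances
abund /= abund.sum()                      # renormalize to sum to one
print("estimated abundances:", abund)     # close to (0.6, 0.3, 0.1)
```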
Fundamental Design Principles for Transcription-Factor-Based Metabolite Biosensors.
Mannan, Ahmad A; Liu, Di; Zhang, Fuzhong; Oyarzún, Diego A
2017-10-20
Metabolite biosensors are central to current efforts toward precision engineering of metabolism. Although most research has focused on building new biosensors, their tunability remains poorly understood and is fundamental for their broad applicability. Here we asked how genetic modifications shape the dose-response curve of biosensors based on metabolite-responsive transcription factors. Using the lac system in Escherichia coli as a model system, we built promoter libraries with variable operator sites that reveal interdependencies between biosensor dynamic range and response threshold. We developed a phenomenological theory to quantify such design constraints in biosensors with various architectures and tunable parameters. Our theory reveals a maximal achievable dynamic range and exposes tunable parameters for orthogonal control of dynamic range and response threshold. Our work sheds light on fundamental limits of synthetic biology designs and provides quantitative guidelines for biosensor design in applications such as dynamic pathway control, strain optimization, and real-time monitoring of metabolism.
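To make the tunability discussion concrete, the sketch below evaluates a generic Hill-type dose-response curve whose parameters map onto the quantities analyzed above (basal output, dynamic range, response threshold, cooperativity); the functional form and all numbers are illustrative assumptions, not the paper's fitted lac-system parameters.

```python
# Generic biosensor dose-response curve (Hill form); all values illustrative.
def dose_response(ligand, basal=10.0, dyn_range=20.0, K=50.0, n=2.0):
    """Promoter output vs. metabolite concentration (arbitrary units).
    K sets the response threshold; dyn_range sets fold-induction at saturation."""
    induction = ligand ** n / (K ** n + ligand ** n)
    return basal * (1.0 + dyn_range * induction)

for c in [0, 10, 50, 250]:
    print(f"{c:5d} uM -> output {dose_response(c):7.1f}")
```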
McGill Library Makes E-Books Portable: E-Reader Loan Service in a Canadian Academic Library
ERIC Educational Resources Information Center
Savova, Maria; Garsia, Matthew
2012-01-01
E-readers are increasingly popular personal devices, but can they be effectively used for the needs of academic libraries' clients? This paper employs an evidence-based approach that examines the role and efficacy of implementing an E-reader Loan Service at McGill University Library. Suggestions are offered as to what lending model and device…
UNSW Library's Outreach Librarian Service: What They Need before They Want It!
ERIC Educational Resources Information Center
Hinsch, Neil; Dunn, Kate
2009-01-01
At the end of 2006 the UNSW Library restructured its five special libraries into a new division called the Information Services Department. It was a radical restructure and provides a new model through which the Library delivers services in both the physical and online environments in a flexible and proactive manner. A pivotal role within this new…
ERIC Educational Resources Information Center
Ross, Lyman; Sennyey, Pongracz
2008-01-01
As a direct consequence of the digital revolution, academic libraries today face competition as information providers. Using Richard N. Foster's technology S curves as the analytical model, this article shows that academic libraries are in the midst of discontinuous change by questioning a number of assumptions that support the current practice of…
ERIC Educational Resources Information Center
Bidwell, Charles M.; Auricchio, Dominick
The project set out to establish an operational film scheduling network to improve service to New York State teachers using 16mm educational films. The Network is designed to serve local libraries located in Boards of Cooperative Educational Services (BOCES), regional libraries, and a statewide Syracuse University Film Rental Library (SUFRL). The…
A Historical Introduction to Library Education: Problems and Progress to 1951.
ERIC Educational Resources Information Center
White, Carl M.
The growth of libraries and of technical education in the middle of the 19th century led to the organization of Melvil Dewey's School of Library Economy in 1887. The School offered a technical course to replace the apprenticeships then in favor. Its curriculum persisted as the model for library education through 1920. A break with the early form…
ERIC Educational Resources Information Center
Morag, Azriel
Libraries faced with the challenge of cooperative cataloging must maintain a high degree of unification within the library network (consortium) without compromising local libraries' independence. This paper compares a traditional model for cooperative catalogs achieved by means of a Union Catalog that depends entirely on replication of data…
Pierce, M L; Ruffner, D E
1998-01-01
Antisense-mediated gene inhibition uses short complementary DNA or RNA oligonucleotides to block expression of any mRNA of interest. A key parameter in the success or failure of an antisense therapy is the identification of a suitable target site on the chosen mRNA. Ultimately, the accessibility of the target to the antisense agent determines target suitability. Since accessibility is a function of many complex factors, it is currently beyond our ability to predict. Consequently, identification of the most effective target(s) requires examination of every site. Towards this goal, we describe a method to construct directed ribozyme libraries against any chosen mRNA. The library contains nearly equal amounts of ribozymes targeting every site on the chosen transcript and the library only contains ribozymes capable of binding to that transcript. Expression of the ribozyme library in cultured cells should allow identification of optimal target sites under natural conditions, subject to the complexities of a fully functional cell. Optimal target sites identified in this manner should be the most effective sites for therapeutic intervention. PMID:9801305
Hole filling and library optimization: application to commercially available fragment libraries.
An, Yuling; Sherman, Woody; Dixon, Steven L
2012-09-15
Compound libraries comprise an integral component of drug discovery in the pharmaceutical and biotechnology industries. While in-house libraries often contain millions of molecules, this number pales in comparison to the accessible space of drug-like molecules. Therefore, care must be taken when adding new compounds to an existing library in order to ensure that unexplored regions in the chemical space are filled efficiently while not needlessly increasing the library size. In this work, we present an automated method to fill holes in an existing library using compounds from an external source and apply it to commercially available fragment libraries. The method, called Canvas HF, uses distances computed from 2D chemical fingerprints and selects compounds that fill vacuous regions while not suffering from the problem of selecting only compounds at the edge of the chemical space. We show that the method is robust with respect to different databases and the number of requested compounds to retrieve. We also present an extension of the method where chemical properties can be considered simultaneously with the selection process to bias the compounds toward a desired property space without imposing hard property cutoffs. We compare the results of Canvas HF to those obtained with a standard sphere exclusion method and with random compound selection and find that Canvas HF performs favorably. Overall, the method presented here offers an efficient and effective hole-filling strategy to augment compound libraries with compounds from external sources. The method does not have any fit parameters and therefore it should be applicable in most hole-filling applications. Copyright © 2012 Elsevier Ltd. All rights reserved.
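The following is a minimal sketch of the general hole-filling idea (greedy max-min selection over fingerprint Tanimoto distances) rather than Schrödinger's actual Canvas HF algorithm, which additionally avoids over-selecting edge-of-space compounds; the random bit-vector fingerprints are stand-ins for real 2D chemical fingerprints.

```python
# Greedy hole-filling: pick external compounds far from everything selected so far.
import numpy as np

rng = np.random.default_rng(1)
library = rng.integers(0, 2, size=(200, 256))    # existing library fingerprints
pool = rng.integers(0, 2, size=(500, 256))       # external vendor pool

def tanimoto_distance(a, B):
    """Tanimoto distance from one fingerprint to each row of B."""
    inter = (a & B).sum(axis=1)
    union = (a | B).sum(axis=1)
    return 1.0 - inter / np.maximum(union, 1)

selected = []
current = library.copy()
for _ in range(10):                              # request 10 hole-filling picks
    # score each pool compound by its distance to the nearest current compound
    scores = [tanimoto_distance(c, current).min() for c in pool]
    best = int(np.argmax(scores))
    selected.append(best)
    current = np.vstack([current, pool[best]])   # the pick now occupies that hole
print("picked pool indices:", selected)
```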
Allocating the Book Budget: A Model
ERIC Educational Resources Information Center
Kohut, Joseph D.
1974-01-01
Inflation is currently affecting library book budgets, particularly the acquisition of serials. This model would balance the purchase of serials against the purchase of monographs by individual funding units within the academic library. Consideration is given to inflation as a cost factor. The model is applied to a specific example. (Author/LS)
Landlab: an Open-Source Python Library for Modeling Earth Surface Dynamics
NASA Astrophysics Data System (ADS)
Gasparini, N. M.; Adams, J. M.; Hobley, D. E. J.; Hutton, E.; Nudurupati, S. S.; Istanbulluoglu, E.; Tucker, G. E.
2016-12-01
Landlab is an open-source Python modeling library that enables users to easily build unique models to explore earth surface dynamics. The Landlab library provides a number of tools and functionalities that are common to many earth surface models, thus eliminating the need for a user to recode fundamental model elements each time she explores a new problem. For example, Landlab provides a gridding engine so that a user can build a uniform or nonuniform grid in one line of code. The library has tools for setting boundary conditions, adding data to a grid, and performing basic operations on the data, such as calculating gradients and curvature. The library also includes a number of process components, which are numerical implementations of physical processes. To create a model, a user creates a grid and couples together process components that act on grid variables. The current library has components for modeling a diverse range of processes, from overland flow generation to bedrock river incision, from soil wetting and drying to vegetation growth, succession and death. The code is freely available for download (https://github.com/landlab/landlab) or can be installed as a Python package. Landlab models can also be built and run on Hydroshare (www.hydroshare.org), an online collaborative environment for sharing hydrologic data, models, and code. Tutorials illustrating a wide range of Landlab capabilities such as building a grid, setting boundary conditions, reading in data, plotting, using components and building models are also available (https://github.com/landlab/tutorials). The code is also comprehensively documented both online and natively in Python. In this presentation, we illustrate the diverse capabilities of Landlab. We highlight existing functionality by illustrating outcomes from a range of models built with Landlab - including applications that explore landscape evolution and ecohydrology. Finally, we describe the range of resources available for new users.
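A minimal sketch of the Landlab workflow described above, assuming Landlab is installed (pip install landlab): a raster grid is built in one line, an elevation field is attached, and a process component is stepped through time. The grid dimensions, uplift rate, and diffusivity are illustrative.

```python
# Build a grid, attach data, and run a process component with Landlab.
import numpy as np
from landlab import RasterModelGrid
from landlab.components import LinearDiffuser

grid = RasterModelGrid((25, 40), xy_spacing=10.0)        # one line builds the grid
z = grid.add_zeros("topographic__elevation", at="node")  # attach data to the grid
z += np.random.rand(grid.number_of_nodes)                # small initial roughness

diffuser = LinearDiffuser(grid, linear_diffusivity=0.01) # hillslope diffusion component
for _ in range(1000):                                    # couple uplift with diffusion
    z[grid.core_nodes] += 0.001 * 10.0                   # uplift per 10-yr step
    diffuser.run_one_step(10.0)
print("mean elevation:", z.mean())
```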
NASA Astrophysics Data System (ADS)
Shipp, S.; Nelson, B.; Stockman, S.; Weir, H.; Carter, B.; Bleacher, L.
2008-07-01
Libraries are vibrant learning places, seeking partners in science programming. LPI's Explore! program offers a model for public engagement in lunar exploration in libraries, as shown by materials created collaboratively with the LRO E/PO team.
Lavine, Barry K; White, Collin G; Allen, Matthew D; Fasasi, Ayuba; Weakley, Andrew
2016-10-01
A prototype library search engine has been further developed to search the infrared spectral libraries of the paint data query database to identify the line and model of a vehicle from the clear coat, surfacer-primer, and e-coat layers of an intact paint chip. For this study, search prefilters were developed from 1181 automotive paint systems spanning 3 manufacturers: General Motors, Chrysler, and Ford. The best match between each unknown and the spectra in the hit list generated by the search prefilters was identified using a cross-correlation library search algorithm that performed both a forward and backward search. In the forward search, spectra were divided into intervals and further subdivided into windows (which corresponds to the time lag for the comparison) within those intervals. The top five hits identified in each search window were compiled; a histogram was computed that summarized the frequency of occurrence for each library sample, with the IR spectra most similar to the unknown flagged. The backward search computed the frequency and occurrence of each line and model without regard to the identity of the individual spectra. Only those lines and models with a frequency of occurrence greater than or equal to 20% were included in the final hit list. If there was agreement between the forward and backward search results, the specific line and model common to both hit lists was always the correct assignment. Samples assigned to the same line and model by both searches are always well represented in the library and correlate well on an individual basis to specific library samples. For these samples, one can have confidence in the accuracy of the match. This was not the case for the results obtained using commercial library search algorithms, as the hit quality index scores for the top twenty hits were always greater than 99%. Copyright © 2016 Elsevier B.V. All rights reserved.
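Below is a minimal sketch of the forward-search logic described above (windowed comparison, top-five tallies per window, frequency histogram), using Pearson correlation on synthetic spectra; it illustrates the scheme only, not the authors' implementation, which uses cross-correlation with explicit time lags within each interval.

```python
# Windowed spectral library search with a hit-frequency histogram (toy data).
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)
library = rng.normal(size=(50, 600))             # 50 library IR spectra
unknown = library[17] + rng.normal(0, 0.2, 600)  # noisy copy of library entry 17

def window_corr(a, b):
    """Pearson correlation between two spectral windows."""
    a = a - a.mean(); b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

hits = Counter()
for start in range(0, 600, 100):                 # six non-overlapping windows
    window = slice(start, start + 100)
    scores = [window_corr(unknown[window], s[window]) for s in library]
    for idx in np.argsort(scores)[-5:]:          # top five hits in this window
        hits[int(idx)] += 1

print(hits.most_common(3))                       # library entry 17 should dominate
```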
Optimized multiple quantum MAS lineshape simulations in solid state NMR
NASA Astrophysics Data System (ADS)
Brouwer, William J.; Davis, Michael C.; Mueller, Karl T.
2009-10-01
The majority of nuclei available for study in solid state Nuclear Magnetic Resonance have half-integer spin I > 1/2, with a corresponding electric quadrupole moment. As such, they may couple with a surrounding electric field gradient. This effect introduces anisotropic line broadening to spectra, arising from distinct chemical species within polycrystalline solids. In Multiple Quantum Magic Angle Spinning (MQMAS) experiments, a second frequency dimension is created, devoid of quadrupolar anisotropy. As a result, the center of gravity of peaks in the high resolution dimension is a function of isotropic second order quadrupole and chemical shift alone. However, for complex materials, these parameters take on a stochastic nature due in turn to structural and chemical disorder. Lineshapes may still overlap in the isotropic dimension, complicating the task of assignment and interpretation. A distributed computational approach is presented here which permits simulation of the two-dimensional MQMAS spectrum, generated by random variates from model distributions of isotropic chemical and quadrupole shifts. Owing to the non-convex nature of the residual sum of squares (RSS) function between experimental and simulated spectra, simulated annealing is used to optimize the simulation parameters. In this manner, local chemical environments for disordered materials may be characterized, and, via a re-sampling approach, error estimates for parameters produced.
Program summary
Program title: mqmasOPT
Catalogue identifier: AEEC_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEC_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 3650
No. of bytes in distributed program, including test data, etc.: 73 853
Distribution format: tar.gz
Programming language: C, OCTAVE
Computer: UNIX/Linux
Operating system: UNIX/Linux
Has the code been vectorised or parallelized?: Yes
RAM: Example: (1597 powder angles) × (200 samples) × (81 F2 frequency pts) × (31 F1 frequency pts) = 3.5 MB, SMP AMD Opteron
Classification: 2.3
External routines: OCTAVE (http://www.gnu.org/software/octave/), GNU Scientific Library (http://www.gnu.org/software/gsl/), OPENMP (http://openmp.org/wp/)
Nature of problem: The optimal simulation and modeling of multiple quantum magic angle spinning NMR spectra, for general systems, especially those with mild to significant disorder. The approach outlined and implemented in C and OCTAVE also produces model parameter error estimates.
Solution method: A model for each distinct chemical site is first proposed, for the individual contribution of crystallite orientations to the spectrum. This model is averaged over all powder angles [1], as well as the (stochastic) parameters: isotropic chemical shift and quadrupole coupling constant. The latter is accomplished via sampling from a bi-variate Gaussian distribution, using the Box-Muller algorithm to transform Sobol (quasi) random numbers [2]. A simulated annealing optimization is performed, and finally the non-linear jackknife [3] is applied in developing model parameter error estimates.
Additional comments: The distribution contains a script, mqmasOpt.m, which runs in the OCTAVE language workspace.
Running time: Example: (1597 powder angles) × (200 samples) × (81 F2 frequency pts) × (31 F1 frequency pts) = 58.35 seconds, SMP AMD Opteron.
References: [1] S.K. Zaremba, Annali di Matematica Pura ed Applicata 73 (1966) 293. [2] H. Niederreiter, Random Number Generation and Quasi-Monte Carlo Methods, SIAM, 1992. [3] T. Fox, D. Hinkley, K. Larntz, Technometrics 22 (1980) 29.
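A minimal sketch of the sampling step named in the solution method: quasi-random (Sobol) points mapped through the Box-Muller transform to draw correlated (isotropic chemical shift, quadrupole coupling) pairs from a bivariate Gaussian. The means, widths, and correlation are illustrative assumptions, not values from the program.

```python
# Sobol quasi-random points -> Box-Muller -> correlated bivariate Gaussian draws.
import numpy as np
from scipy.stats import qmc

sobol = qmc.Sobol(d=2, scramble=True, seed=0)
u = sobol.random(256)                                  # uniforms in (0,1)^2
u = np.clip(u, 1e-12, 1 - 1e-12)

# Box-Muller: two uniforms -> two independent standard normals
z0 = np.sqrt(-2 * np.log(u[:, 0])) * np.cos(2 * np.pi * u[:, 1])
z1 = np.sqrt(-2 * np.log(u[:, 0])) * np.sin(2 * np.pi * u[:, 1])

# correlate and shift: isotropic chemical shift (ppm) and Cq (MHz), illustrative
rho, mu, sigma = 0.3, np.array([-10.0, 3.0]), np.array([2.0, 0.4])
shift = mu[0] + sigma[0] * z0
cq = mu[1] + sigma[1] * (rho * z0 + np.sqrt(1 - rho ** 2) * z1)
print(shift[:3], cq[:3])
```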
NASA Astrophysics Data System (ADS)
Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey
2015-04-01
A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created using dedicated structures such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical-dimension scanning electron microscope images and for extracting specific layout parameters, which were then input to SPICE compact-model simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to insufficient N-well to P-well spacing. Based on this example, for process monitoring and variability analyses, we used this method extensively to analyze transistor gates having different shapes. In addition, an analysis of a large area of a high-density standard cell library was performed. Another set of monitors focused on a high-density SRAM array is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified asymmetric transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was performed on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted based on contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data demonstrate the successful in-field implementation of our methodology as a useful process monitoring method.
An Old Story in the Parallel Synthesis World: An Approach to Hydantoin Libraries.
Bogolubsky, Andrey V; Moroz, Yurii S; Savych, Olena; Pipko, Sergey; Konovets, Angelika; Platonov, Maxim O; Vasylchenko, Oleksandr V; Hurmach, Vasyl V; Grygorenko, Oleksandr O
2018-01-08
An approach to the parallel synthesis of hydantoin libraries by reaction of in situ generated 2,2,2-trifluoroethylcarbamates and α-amino esters was developed. To demonstrate the utility of the method, a library of 1158 hydantoins designed according to lead-likeness criteria (MW 200-350, cLogP 1-3) was prepared. The success rate of the method was analyzed as a function of the physicochemical parameters of the products, and it was found that the method can be considered a tool for lead-oriented synthesis. A hydantoin-bearing submicromolar primary hit acting as an Aurora kinase A inhibitor was discovered through a combination of rational design, parallel synthesis using the procedures developed, and in silico and in vitro screening.
Climate Change for Agriculture, Forest Cover and 3d Urban Models
NASA Astrophysics Data System (ADS)
Kapoor, M.; Bassir, D.
2014-11-01
This research demonstrates the important role of remote sensing in identifying the parameters behind agricultural crop change, forest cover change, and 3D urban models. Standalone software was developed to view and analyze the different factors affecting changes in crop production. Open-source libraries from the Open Source Geospatial Foundation were used for the development of the shapefile viewer. The software can be used to retrieve attribute information and to scale, zoom in/out, and pan the shapefiles. Environmental changes due to pollution and population growth are increasing urbanization and decreasing forest cover on the earth. Satellite imagery from Landsat 5 (1984) to Landsat 8 TIRS (2014), the Landsat Data Continuity Mission (LDCM), and NDVI are used to analyse the different parameters affecting agricultural crop production change and forest change. For the development of good-quality NDVI and forest cover maps, it is advisable to use data processed with the same methods for the complete region. Management practices have been developed from the analysed data for the improvement of crops and the preservation of forest cover.
NASA Astrophysics Data System (ADS)
Kaliuzhnyi, M. P.; Bushuev, F. I.; Sibiriakova, Ye. S.; Shulga, O. V.; Shakun, L. S.; Bezrukovs, V.; Kulishenko, V. F.; Moskalenko, S. S.; Malynovsky, Ye. V.; Balagura, O. A.
2017-02-01
The results of determining the orbital position of the geostationary satellite "Eutelsat-13B" during 2015-2016, obtained using a European network of stations receiving DVB-S signals from the satellite, are presented. The network consists of five stations located in Ukraine and Latvia. The stations are equipped with a radio engineering complex developed by the RI "MAO". The measured parameter is the time difference of arrival (TDOA) of the DVB-S signals at the stations of the network. The errors of TDOA determination and of the satellite coordinates, obtained using a numerical model of satellite motion, are ±2.6 m and ±35 m, respectively. The software implementation of the numerical model is taken from the free space dynamics library OREKIT.
NASA Technical Reports Server (NTRS)
Howard, Ayanna
2005-01-01
The Fuzzy Logic Engine is a software package that enables users to embed fuzzy-logic modules into their application programs. Fuzzy logic is useful as a means of formulating human expert knowledge and translating it into software to solve problems. Fuzzy logic provides flexibility for modeling relationships between input and output information and is distinguished by its robustness with respect to noise and variations in system parameters. In addition, linguistic fuzzy sets and conditional statements allow systems to make decisions based on imprecise and incomplete information. The user of the Fuzzy Logic Engine need not be an expert in fuzzy logic: it suffices to have a basic understanding of how linguistic rules can be applied to the user's problem. The Fuzzy Logic Engine is divided into two modules: (1) a graphical-interface software tool for creating linguistic fuzzy sets and conditional statements and (2) a fuzzy-logic software library for embedding fuzzy processing capability into current application programs. The graphical- interface tool was developed using the Tcl/Tk programming language. The fuzzy-logic software library was written in the C programming language.
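As an illustration of the kind of processing the fuzzy-logic library supports, here is a minimal generic sketch (not the Fuzzy Logic Engine's actual API): triangular membership functions, two linguistic rules, and a weighted-average defuzzification step. All variable names, set boundaries, and rule outputs are invented.

```python
# Generic fuzzy inference: fuzzify, apply linguistic rules, defuzzify.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def infer(temperature):
    # linguistic sets over temperature (deg C)
    cold = tri(temperature, -10.0, 0.0, 15.0)
    hot = tri(temperature, 10.0, 25.0, 40.0)
    # rules: IF cold THEN heater high (80); IF hot THEN heater low (10)
    rules = [(cold, 80.0), (hot, 10.0)]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0     # centroid-style defuzzification

print(infer(5.0))                        # mostly "cold" -> heater output near 80
```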
Detail view of lamp in law library; Jennewein modeled symbols ...
Detail view of lamp in law library; Jennewein modeled symbols of the four seasons on the lamp's aluminum supports - United States Department of Justice, Constitution Avenue between Ninth & Tenth Streets, Northwest, Washington, District of Columbia, DC
Theories of learning: models of good practice for evidence-based information skills teaching.
Spring, Hannah
2010-12-01
This feature considers models of teaching and learning and how these can be used to support evidence based practice. © 2010 The authors. Health Information and Libraries Journal © 2010 Health Libraries Group.
Henry, Kevin A; Tanha, Jamshid
2018-05-01
Fully human synthetic single-domain antibodies (sdAbs) are desirable therapeutic molecules but their development is a considerable challenge. Here, using a retrospective analysis of in-house historical data, we examined the parameters that impact the outcome of screening phage-displayed synthetic human sdAb libraries to discover antigen-specific binders. We found no evidence for a differential effect of domain type (VH or VL), library randomization strategy, incorporation of a stabilizing disulfide linkage or sdAb display format (monovalent vs. multivalent) on the probability of obtaining any antigen-binding human sdAbs, instead finding that the success of library screens was primarily related to properties of target antigens, especially molecular mass. The solubility and binding affinity of sdAbs isolated from successful screens depended both on properties of the sdAb libraries (primarily domain type) and the target antigens. Taking attrition of sdAbs with major manufacturability concerns (aggregation; low expression) and sdAbs that do not recognize native cell-surface antigens as independent probabilities, we calculate the overall likelihood of obtaining ≥1 antigen-binding human sdAb from a single library-target screen as ~24%. Successful library-target screens should be expected to yield ~1.3 human sdAbs on average, each with an average binding affinity of ~2 μM. Copyright © 2018 Elsevier B.V. All rights reserved.
LESTO: an Open Source GIS-based toolbox for LiDAR analysis
NASA Astrophysics Data System (ADS)
Franceschi, Silvia; Antonello, Andrea; Tonon, Giustino
2015-04-01
During the last five years, different research institutes and private companies started to implement new algorithms to analyze and extract features from LiDAR data, but only a few of them also released publicly available software. In the field of forestry there are different examples of software that can be used to extract vegetation parameters from LiDAR data; unfortunately most of them are closed source (even if free), which means that the source code is not shared with the public for anyone to look at or make changes to. In 2014 we started the development of the library LESTO (LiDAR Empowered Sciences Toolbox Opensource): a set of modules for the analysis of LiDAR point clouds with an Open Source approach, with the aim of improving the performance of the extraction of the volume of biomass and other vegetation parameters over large areas for mixed forest structures. LESTO contains a set of modules for data handling and analysis implemented within the JGrassTools spatial processing library. The main subsections are dedicated to: 1) preprocessing of LiDAR raw data, mainly in LAS format (utilities and filtering); 2) creation of raster derived products; 3) flight-line identification and normalization of the intensity values; 4) tools for extraction of vegetation and buildings. The core of the LESTO library is the extraction of the vegetation parameters. We decided to follow the single-tree-based approach, starting with the implementation of some of the most used algorithms in the literature. These have been tweaked and applied to LiDAR-derived raster datasets (DTM, DSM) as well as point clouds of raw data. The methods range from the simple extraction of tops and crowns from local maxima, through the region growing and watershed methods, to individual tree segmentation on point clouds. The validation procedure consists of finding the matching between field and LiDAR-derived measurements at the individual tree and plot level. An automatic validation procedure has been developed, considering an optimizer based on Particle Swarm (PS) and a matching procedure which takes the position and height of the extracted trees with respect to the measured ones and iteratively tries to improve the candidate solution by changing the models' parameters. Examples of application of the LESTO tools will be presented on test sites. The test area consists of a series of circular sampling plots randomly selected from a 50x50 m regular grid within a buffer zone of 150 m from the forest road. Other studies on the same sites provide reference measurements of position, diameter, species and height, and proposed allometric relationships. These allometric relationships were obtained for each species by deriving the stem volume of single trees based on height and diameter at breast height. LESTO is integrated in the JGrassTools project and available for download at www.jgrasstools.org. A simple and easy-to-use graphical interface to run the models is available at https://github.com/moovida/STAGE/releases.
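A minimal sketch of the particle-swarm step in the validation procedure described above: particles search a two-dimensional parameter space to minimize a matching error. The quadratic objective is a stand-in for LESTO's real position/height tree-matching score, and all swarm constants are illustrative.

```python
# Basic particle swarm optimization over a toy matching-error surface.
import numpy as np

rng = np.random.default_rng(7)

def matching_error(params):
    """Stand-in objective with a minimum at (1.5, 0.8)."""
    return (params[0] - 1.5) ** 2 + (params[1] - 0.8) ** 2

n, dim = 30, 2
pos = rng.uniform(0.0, 3.0, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_val = np.array([matching_error(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(100):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([matching_error(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best parameters:", gbest)          # converges near (1.5, 0.8)
```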
The AHEC library program and consortia development in California.
Jensen, M A; Maddalena, B
1986-07-01
A brief history of the first Area Health Education Center (AHEC) Library Program in California is presented, with a description of methodology and results. The goals of this program were to develop and improve hospital library resources and services, to train hospital library personnel, and to promote resource sharing in a medically underserved area. The health sciences library consortium that evolved became a model for the ten other library consortia in the state. Based on AHEC's twelve years' experience with consortia, from 1973 to 1985, recommendations are made as to size, composition, leadership, outside funding, group participation, publicity, and linkages.
The Marketing of Public Library Services.
ERIC Educational Resources Information Center
Dragon, Andrea C.
1983-01-01
Defines the concept of marketing and relates models involving the exchanges and transactions of markets and charities to services offered by libraries. Market segmentation, understanding the behavior of markets, competition, and movement toward a market-oriented library are highlighted. Nineteen references are cited. (EJS)
Damsel: A Data Model Storage Library for Exascale Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok; Liao, Wei-keng
Computational science applications have been described as having one of seven motifs (the "seven dwarfs"), each having a particular pattern of computation and communication. From a storage and I/O perspective, these applications can also be grouped into a number of data model motifs describing the way data is organized and accessed during simulation, analysis, and visualization. Major storage data models developed in the 1990s, such as the Network Common Data Format (netCDF) and Hierarchical Data Format (HDF) projects, created support for more complex data models. Development of both netCDF and HDF5 was influenced by multi-dimensional dataset storage requirements, but their access models and formats were designed with sequential storage in mind (e.g., a POSIX I/O model). Although these and other high-level I/O libraries have had a beneficial impact on large parallel applications, they do not always attain a high percentage of peak I/O performance due to fundamental design limitations, and they do not address the full range of current and future computational science data models. The goal of this project is to enable exascale computational science applications to interact conveniently and efficiently with storage through abstractions that match their data models. The project consists of three major activities: (1) identifying major data model motifs in computational science applications and developing representative benchmarks; (2) developing a data model storage library, called Damsel, that supports these motifs, provides efficient storage data layouts, incorporates optimizations to enable exascale operation, and is tolerant to failures; and (3) productizing Damsel and working with computational scientists to encourage adoption of this library by the scientific community. The product of this project, the Damsel library, is openly available for download from http://cucis.ece.northwestern.edu/projects/DAMSEL. Several case studies and an application programming interface reference are also available to assist new users in learning to use the library.
ERIC Educational Resources Information Center
Dearnley, James; McKnight, Cliff; Morris, Anne
2004-01-01
This article reports on one aspect of a Laser Foundation-funded research project that tested different models of e-book delivery and offered guidelines for developing e-book collections in UK public libraries. An e-book collection was offered to library users (primarily, users relying on a mobile library service) on Personal Digital Assistant…
ERIC Educational Resources Information Center
Chressanthis, George A.; Chressanthis, June D.
1994-01-01
Provides regression-based empirical evidence of the effects of variations in exchange rate risk on 1985 library prices of the top-ranked 99 journals in economics. The relationship between individual journal prices and library prices is shown, and other factors associated with increases and decreases in library journal prices are given. (Contains…
ERIC Educational Resources Information Center
Lo, Patrick; Chiu, Dickson K. W.; Chu, Wilson
2013-01-01
The Hong Kong Design Institute (HKDI) is a leading design education institute in Hong Kong under the Vocational Training Council (VTC) group. Opened in September 2010, the HKDI Learning Resources Centre is a specialist library for the study of art and design. The mission of the HKDI Library is to support and promote the academic goals of the…
2015-01-01
The 5-hydroxytryptamine 1A (5-HT1A) serotonin receptor has been an attractive target for treating mood and anxiety disorders such as schizophrenia. We have developed binary classification quantitative structure-activity relationship (QSAR) models of 5-HT1A receptor binding activity using data retrieved from the PDSP Ki database. The prediction accuracy of these models was estimated by external 5-fold cross-validation as well as with an additional validation set comprising 66 structurally distinct compounds from the World of Molecular Bioactivity database. These validated models were then used to mine three major types of chemical screening libraries, i.e., drug-like libraries, GPCR-targeted libraries, and diversity libraries, to identify novel computational hits. The five best hits from each class of libraries were chosen for further experimental testing in radioligand binding assays, and nine of the 15 hits were confirmed to be active experimentally, with binding affinity better than 10 μM. The most active compound, lysergol, from the diversity library showed a very high binding affinity (Ki) of 2.3 nM against the 5-HT1A receptor. The novel 5-HT1A actives identified with the QSAR-based virtual screening approach could potentially be developed as novel anxiolytics or antischizophrenic drugs. PMID:24410373
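A minimal sketch of the binary-classification QSAR workflow with 5-fold external cross-validation; scikit-learn and random bit-vector "fingerprints" stand in for the authors' descriptors and modeling stack, so the numbers are illustrative only.

```python
# Binary QSAR classification with 5-fold cross-validation (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 1024)).astype(float)   # compound fingerprints
w = rng.normal(size=1024)
y = (X @ w > np.median(X @ w)).astype(int)               # binder / non-binder labels

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5)              # 5-fold cross-validation
print("accuracy per fold:", np.round(scores, 2))
```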
NASA Astrophysics Data System (ADS)
Zou, Wen-bo; Chong, Xiao-meng; Wang, Yan; Hu, Chang-qin
2018-05-01
The accuracy of NIR quantitative models depends on calibration samples with concentration variability. Conventional sample collection methods have shortcomings, especially being time-consuming, which remains a bottleneck in the application of NIR models for Process Analytical Technology (PAT) control. A study was performed to solve the problem of sample selection and collection for the construction of NIR quantitative models. Amoxicillin and potassium clavulanate oral dosage forms were used as examples. The aim was to find a general approach to rapidly construct NIR quantitative models using an NIR spectral library, based on the idea of a universal model [2021]. The NIR spectral library of amoxicillin and potassium clavulanate oral dosage forms consisted of spectra of 377 batches of samples produced by 26 domestic pharmaceutical companies, including tablets, dispersible tablets, chewable tablets, oral suspensions, and granules. The correlation coefficient (rT) was used to indicate the similarity of the spectra. The calibration sets were selected from the spectral library according to the median rT of the samples to be analyzed. The rT of the selected samples was close to the median rT; the difference in rT of those samples was 1.0% to 1.5%. We concluded that sample selection is not a problem when constructing NIR quantitative models using a spectral library, in contrast to conventional methods of determining universal models. Sample spectra with a suitable concentration range for the NIR models were collected quickly. In addition, the models constructed through this method were more easily targeted.
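A minimal sketch of the library-based calibration-set selection described above, under the assumption that selection keeps library spectra whose correlation (rT) to the sample's spectrum lies in a narrow band around the median; the synthetic spectra and the band width are illustrative.

```python
# Select a calibration set from a spectral library by correlation to the sample.
import numpy as np

rng = np.random.default_rng(5)
library = rng.normal(size=(377, 700))            # library spectra (one per batch)
sample = library[42] + rng.normal(0, 0.1, 700)   # spectrum of sample to analyze

def corr(a, b):
    """Pearson correlation between two spectra."""
    a = a - a.mean(); b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rT = np.array([corr(sample, s) for s in library])
median = np.median(rT)
keep = np.where(np.abs(rT - median) <= 0.015)[0]  # ~1.0-1.5% band around the median
print(f"selected {keep.size} library spectra for the calibration set")
```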
Redesign of Library Workflows: Experimental Models for Electronic Resource Description.
ERIC Educational Resources Information Center
Calhoun, Karen
This paper explores the potential for and progress of a gradual transition from a highly centralized model for cataloging to an iterative, collaborative, and broadly distributed model for electronic resource description. The purpose is to alert library managers to some experiments underway and to help them conceptualize new methods for defining,…
Forecasting Techniques and Library Circulation Operations: Implications for Management.
ERIC Educational Resources Information Center
Ahiakwo, Okechukwu N.
1988-01-01
Causal regression and time series models were developed using six years of data for home borrowing, average readership, and books consulted at a university library. The models were tested for efficacy in producing short-term planning and control data. Combined models were tested in establishing evaluation measures. (10 references) (Author/MES)
ERIC Educational Resources Information Center
Maddox, Alexia; Zhao, Linlin
2017-01-01
This case study presents a conceptual model of researcher performance developed by Deakin University Library, Australia. The model aims to organize research performance data into meaningful researcher profiles, referred to as researcher typologies, which support the demonstration of research impact and value. Three dimensions shaping researcher…
Journal selection decisions: a biomedical library operations research model. I. The framework.
Kraft, D H; Polacsek, R A; Soergel, L; Burns, K; Klair, A
1976-01-01
The problem of deciding which journal titles to select for acquisition in a biomedical library is modeled. The approach taken is based on cost/benefit ratios. Measures of journal worth, methods of data collection, and journal cost data are considered. The emphasis is on the development of a practical process for selecting journal titles, based on the objectivity and rationality of the model, and on the collection of the appropriate data and library statistics in a reasonable manner. The implications of this process for an overall management information system (MIS) for biomedical serials handling are discussed. PMID:820391
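A minimal sketch of cost/benefit selection in the spirit of the framework above: journals are ranked by worth-to-cost ratio and acquired until the budget is exhausted. All titles, worth scores, costs, and the budget are invented for illustration; the paper's actual worth measures are more elaborate.

```python
# Rank journals by worth/cost and select within a budget (invented numbers).
journals = [
    ("J. Biomed. A", 120.0, 300.0),   # (title, worth score, annual cost)
    ("J. Biomed. B", 45.0, 60.0),
    ("J. Biomed. C", 200.0, 800.0),
    ("J. Biomed. D", 80.0, 150.0),
]
budget = 550.0

ranked = sorted(journals, key=lambda j: j[1] / j[2], reverse=True)
selected, spent = [], 0.0
for title, worth, cost in ranked:
    if spent + cost <= budget:        # greedy selection by cost/benefit ratio
        selected.append(title)
        spent += cost
print(selected, spent)
```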
Designing Multi-target Compound Libraries with Gaussian Process Models.
Bieler, Michael; Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Kriegl, Jan M; Schneider, Gisbert
2016-05-01
We present the application of machine learning models to selecting G protein-coupled receptor (GPCR)-focused compound libraries. The library design process was realized by ant colony optimization. A proprietary Boehringer Ingelheim reference set consisting of 3519 compounds tested in dose-response assays at 11 GPCR targets served as training data for machine learning and activity prediction. We compared the usability of the proprietary data with a public data set from ChEMBL. Gaussian process models were trained to prioritize compounds from a virtual combinatorial library. We obtained meaningful models for three of the targets (5-HT2c, MCH, A1), which were experimentally confirmed for 12 of 15 selected and synthesized or purchased compounds. Overall, the models trained on the public data predicted the observed assay results more accurately. The results of this study motivate the use of Gaussian process regression on public data for virtual screening and target-focused compound library design. © 2016 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution Non-Commercial NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.
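A minimal sketch of the core modeling step, Gaussian process regression for activity prediction, using scikit-learn on synthetic fingerprint/activity data; it illustrates the technique only and does not reproduce the authors' descriptors, kernel choices, or the ant colony library-design loop.

```python
# Gaussian process regression for compound activity (synthetic stand-in data).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(300, 128)).astype(float)   # compound fingerprints
w = rng.normal(size=128)
y = X @ w / 8.0 + rng.normal(0, 0.3, 300)               # activity-like responses

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X[:250], y[:250])                                # train on 250 compounds
mean, std = gp.predict(X[250:], return_std=True)        # predictions + uncertainty
print("first predictions:", np.round(mean[:3], 2), "+/-", np.round(std[:3], 2))
```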
Exact diagonalization library for quantum electron models
NASA Astrophysics Data System (ADS)
Iskakov, Sergei; Danilov, Michael
2018-04-01
We present an exact diagonalization C++ template library (EDLib) for solving quantum electron models, including the single-band finite Hubbard cluster and the multi-orbital impurity Anderson model. The observables that can be computed using EDLib are single particle Green's functions and spin-spin correlation functions. This code provides three different types of Hamiltonian matrix storage that can be chosen based on the model.
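To illustrate the kind of computation EDLib performs, here is a minimal NumPy sketch of exact diagonalization for a two-site, two-electron Hubbard model in the Sz = 0 sector; EDLib itself is a C++ template library, so this example mirrors the physics, not its API. The hopping t and repulsion U are illustrative.

```python
# Exact diagonalization of a 2-site Hubbard model in the Sz = 0 sector.
# Basis: {|up,dn>, |dn,up>, |updn,0>, |0,updn>}.
import numpy as np

t, U = 1.0, 4.0   # hopping and on-site repulsion (illustrative values)

H = np.array([
    [0.0, 0.0, -t,  -t ],
    [0.0, 0.0,  t,   t ],
    [-t,  t,   U,   0.0],
    [-t,  t,   0.0, U  ],
])

evals, evecs = np.linalg.eigh(H)
print("ground-state energy:", evals[0])
# exact singlet ground state for comparison: E0 = U/2 - sqrt((U/2)^2 + 4 t^2)
print("analytic:", U / 2 - np.sqrt((U / 2) ** 2 + 4 * t ** 2))
```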
NASA Astrophysics Data System (ADS)
Guerrero, C.; Zornoza, R.; Gómez, I.; Mataix-Solera, J.; Navarro-Pedreño, J.; Mataix-Beneyto, J.; García-Orenes, F.
2009-04-01
Near infrared (NIR) reflectance spectroscopy offers important advantages because it is a non-destructive technique, the pre-treatments needed for samples are minimal, and the spectrum of a sample is obtained in less than one minute without the need for chemical reagents. For these reasons, NIR is a fast and cost-effective method. Moreover, NIR allows the analysis of several constituents or parameters simultaneously from the same spectrum once it is obtained. For this, a necessary step is the development of soil spectral libraries (sets of samples analysed and scanned) and calibrations (using multivariate techniques). The calibrations should contain the variability of the target-site soils in which the calibration is to be used. Often this premise is not easy to fulfil, especially in recently developed libraries. A classical way to solve this problem is through the repopulation of libraries and the subsequent recalibration of the models. In this work we studied the changes in the accuracy of the predictions as a consequence of the successive addition of samples for repopulation. In general, calibrations with a large number of samples and high diversity are desired. But we hypothesized that calibrations with fewer samples (smaller size) will absorb the spectral characteristics of the target site more easily. Thus, we suspected that the size of the calibration (model) to be repopulated could be important, so we also studied its effect on the accuracy of predictions of the repopulated models. In this study we used those spectra of our library which contained data on soil Kjeldahl nitrogen (NKj) content (nearly 1500 samples). First, the spectra from the target site were removed from the spectral library. Then, different quantities of samples from the library were selected (representing 5, 10, 25, 50, 75, and 100% of the total library). These samples were used to develop calibrations of different sizes. We used partial least squares regression and leave-one-out cross-validation as calibration methods. Two methods were used to select the different quantities (model sizes) of samples: (1) Based on Characteristics of Spectra (BCS), and (2) Based on NKj Values of Samples (BVS). Both methods tried to select representative samples. Each of the calibrations (containing 5, 10, 25, 50, 75, or 100% of the total samples of the library) was repopulated with samples from the target site and then recalibrated (by leave-one-out cross-validation). This procedure was sequential: in each step, two samples from the target site were added to the models, which were then recalibrated. This process was repeated ten times, for a total of 20 added samples. A local model was also created with the 20 samples used for repopulation. The repopulated, non-repopulated, and local calibrations were used to predict the NKj content in those samples from the target site not included in the repopulations. To measure the accuracy of the predictions, the r2, RMSEP, and slopes were calculated by comparing predicted with analysed NKj values. This scheme was repeated for each of the four target sites studied. In general, few differences were found between the results obtained with the BCS and BVS models. We observed that the repopulation of models increased the r2 of the predictions in sites 1 and 3. Repopulation caused little change in the r2 of the predictions in sites 2 and 4, maybe due to the high initial values (r2 > 0.90 using non-repopulated models).
As a consequence of repopulation, the RMSEP decreased in all sites except site 2, where a very low RMSEP had been obtained before repopulation (0.4 g kg-1). The slopes tended to approach 1, but this value was reached only in site 4, and only after repopulation with 20 samples. In sites 3 and 4, accurate predictions were obtained using the local models. Predictions obtained with models of similar size (similar %) were averaged in order to describe the main patterns. The r2 of predictions obtained with larger models was no better than that obtained with smaller models. After repopulation, the RMSEP of predictions using smaller models (5, 10 and 25% of the library samples) was lower than the RMSEP obtained with larger ones (75 and 100%), indicating that small models can more easily integrate the variability of the soils of the target site. The results suggest that small calibrations could be repopulated and thereby "converted" into local calibrations. Accordingly, most of the effort can be focused on obtaining highly accurate analytical values for a reduced set of samples (including some samples from the target sites). The patterns observed here run counter to the idea of global models. These results could encourage the expansion of this technique, because very large databases seem not to be needed. Future studies with very different samples will help to confirm the robustness of the observed patterns. The authors thank "Bancaja-UMH" for financial support of the project "NIRPROS".
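For illustration, a minimal sketch of the sequential repopulation-and-recalibration loop described above, using scikit-learn's PLS regression with leave-one-out cross-validation on synthetic arrays (all sizes and variable names are illustrative, not the study's data):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    def recalibrate(X, y, n_components=10):
        """Fit a PLS calibration and report its leave-one-out RMSECV."""
        pls = PLSRegression(n_components=n_components)
        y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut())
        pls.fit(X, y)
        return pls, float(np.sqrt(mean_squared_error(y, y_cv)))

    # X_lib, y_lib: a library subset (e.g. 5%); X_new, y_new: 20 analysed
    # target-site samples, added two at a time before each recalibration.
    rng = np.random.default_rng(0)
    X_lib, y_lib = rng.normal(size=(75, 700)), rng.normal(size=75)
    X_new, y_new = rng.normal(size=(20, 700)), rng.normal(size=20)
    X, y = X_lib, y_lib
    for step in range(10):
        X = np.vstack([X, X_new[2 * step:2 * step + 2]])
        y = np.concatenate([y, y_new[2 * step:2 * step + 2]])
        model, rmsecv = recalibrate(X, y)
        print(f"step {step + 1}: n={len(y)} RMSECV={rmsecv:.3f}")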
A Model Assessing Relevant Factors in Building Minority Library Service.
ERIC Educational Resources Information Center
Bonin, Kenneth Roy
1983-01-01
Presents research design applicable to definition of minority library service needs for any minority language group in Canada, focusing on French-speaking population outside Quebec. Profiles of the target group's population, culture, needs, and library services are highlighted. Five sources are given. (EJS)
At the Threshold of a Library Network.
ERIC Educational Resources Information Center
Khalid, Farooq A.
1996-01-01
Highlights both the benefits and the problems associated with networking in libraries and discusses circumstances that are forcing information centers in the Arabian Gulf region to begin thinking about library networking. Topics include governing models, resource sharing, timeliness, cost effectiveness, currency, reliability, and a union catalog…
From Information Center to Discovery System: Next Step for Libraries?
ERIC Educational Resources Information Center
Marcum, James W.
2001-01-01
Proposes a discovery system model to guide technology integration in academic libraries that fuses organizational learning, systems learning, and knowledge creation techniques with constructivist learning practices to suggest possible future directions for digital libraries. Topics include accessing visual and continuous media; information…
Information Technology and Disabilities, 1997.
ERIC Educational Resources Information Center
McNulty, Tom, Ed.
1997-01-01
Articles published during 1997 include: "The Multi-Disability Workstation for Small Libraries" (Dick Banks and Steve Noble); "Talking Books: Toward a Digital Model" (John Cookson and others); "World Wide Access: Focus on Libraries" (Sheryl Burgstahler); "The Virtual Library: Collaborative Data Exchange and Electronic Text Delivery" (Steve Noble);…
Standards for Community College Library Facilities.
ERIC Educational Resources Information Center
California State Postsecondary Education Commission, Sacramento.
This report contains proposed standards for community college library facilities developed by the California Postsecondary Education Commission. Formulae for calculating stack space, staff space, reader station space, and total space are included in the report. Three alternative models for revising the present library standards were considered:…
Educational Technology Funding Models
ERIC Educational Resources Information Center
Mark, Amy E.
2008-01-01
Library and cross-disciplinary literature all stress the increasing importance of instructional technology in higher education. However, there is a dearth of articles detailing funding for library instructional technology. The bulk of library literature on funding for these projects focuses on one-time grant opportunities and on the architecture…
Kim, Stephanie; Eliot, Melissa; Koestler, Devin C; Houseman, Eugene A; Wetmur, James G; Wiencke, John K; Kelsey, Karl T
2016-09-01
We examined whether variation in blood-based epigenome-wide association studies could be more completely explained by augmenting existing reference DNA methylation libraries. We compared existing and enhanced libraries in predicting variability in three publicly available 450K methylation datasets that collected whole-blood samples. Models were fit separately to each CpG site and used to estimate the additional variability explained when adjustments for cell composition were made with each library. Calculation of the mean difference in the CpG-specific residual sums of squares between models for an arthritis, an aging and a metabolic syndrome dataset indicated that an enhanced library explained significantly more variation across all three datasets (p < 10^-3). Pathologically important immune cell subtypes can explain important variability in epigenome-wide association studies done in blood.
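For illustration, the per-CpG comparison can be sketched with numpy: regress each CpG on the cell-type proportions estimated with each reference library and compare the residual sums of squares (dimensions and variable names below are illustrative only, not the study's data):

    import numpy as np

    def rss_per_cpg(meth, covars):
        """Residual sum of squares for each CpG after OLS adjustment
        for estimated cell-type proportions (columns of `covars`)."""
        X = np.column_stack([np.ones(len(covars)), covars])
        beta, *_ = np.linalg.lstsq(X, meth, rcond=None)
        resid = meth - X @ beta
        return (resid ** 2).sum(axis=0)

    # meth: samples x CpGs beta-values; props6 / props12: cell proportions
    # estimated with the existing and the enhanced reference library.
    rng = np.random.default_rng(1)
    meth = rng.uniform(size=(200, 1000))
    props6 = rng.dirichlet(np.ones(6), 200)
    props12 = rng.dirichlet(np.ones(12), 200)
    delta = rss_per_cpg(meth, props6) - rss_per_cpg(meth, props12)
    print("mean RSS reduction with enhanced library:", delta.mean())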
NASA Astrophysics Data System (ADS)
Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.
2018-01-01
The aim of this research was to create an initial design of the CSE-UCLA evaluation model, modified with Weighted Product, for evaluating digital library service at Computer Colleges in Bali. The method used in this research was the developmental research method, following the Borg and Gall model design. The result obtained from the research conducted early in the study was a rough sketch of the Weighted Product-based CSE-UCLA evaluation model; the design was able to provide a general overview of the stages of the Weighted Product-based CSE-UCLA evaluation model used to optimize digital library services at the Computer Colleges in Bali.
Hsu, Chih-Yuan; Pan, Zhen-Ming; Hu, Rei-Hsing; Chang, Chih-Chun; Cheng, Hsiao-Chun; Lin, Che; Chen, Bor-Sen
2015-01-01
In this study, robust biological filters with an external control to match a desired input/output (I/O) filtering response are engineered based on well-characterized promoter-RBS libraries and a cascade gene circuit topology. In the field of synthetic biology, the biological filter system serves as a powerful detector or sensor to sense different molecular signals and produces a specific output response only if the concentration of the input molecular signal is higher or lower than a specified threshold. The proposed systematic design method of robust biological filters is summarized in three steps. Firstly, several well-characterized promoter-RBS libraries are established for biological filter design by identifying and collecting the quantitative and qualitative characteristics of their promoter-RBS components via a nonlinear parameter estimation method. Then, the topology of the synthetic biological filter is decomposed into three cascade gene regulatory modules, and an appropriate promoter-RBS library is selected for each module to achieve the desired I/O specification of a biological filter. Finally, based on the proposed systematic method, a robust externally tunable biological filter is engineered by searching the promoter-RBS component libraries and a control inducer concentration library to achieve the optimal reference match for the specified I/O filtering response.
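As a schematic of the thresholding behaviour such a cascade produces (not the authors' model; the Hill parameters here are arbitrary stand-ins for promoter-RBS strengths chosen from the component libraries):

    import numpy as np

    def hill_act(x, K, n):
        """Activating Hill function: promoter activity vs. input signal."""
        return x**n / (K**n + x**n)

    def hill_rep(x, K, n):
        """Repressing Hill function."""
        return K**n / (K**n + x**n)

    # Three-module cascade: inducer -> TF1 -| TF2 -| output reporter.
    # The double inversion restores the sign, giving a sharp threshold.
    u = np.logspace(-3, 2, 200)            # input molecular signal
    tf1 = hill_act(u, K=0.5, n=2)          # module 1
    tf2 = hill_rep(tf1, K=0.3, n=4)        # module 2 (inverter)
    out = hill_rep(tf2, K=0.2, n=4)        # module 3 (inverter)
    print("output spans", out.min().round(3), "to", out.max().round(3))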
NASA Technical Reports Server (NTRS)
Jackson, Mariea Dunn; Dischinger, Charles; Stambolian, Damon; Henderson, Gena
2012-01-01
Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineers in the future to infuse real-to-life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the SLS and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors analysis requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.
Postures and Motions Library Development for Verification of Ground Crew Human Factors Requirements
NASA Technical Reports Server (NTRS)
Stambolian, Damon; Henderson, Gena; Jackson, Mariea Dunn; Dischinger, Charles
2013-01-01
Spacecraft and launch vehicle ground processing activities require a variety of unique human activities. These activities are being documented in a primitive motion capture library. The library will be used by human factors engineering analysts to infuse real to life human activities into the CAD models to verify ground systems human factors requirements. As the primitive models are being developed for the library, the project has selected several current human factors issues to be addressed for the Space Launch System (SLS) and Orion launch systems. This paper explains how the motion capture of unique ground systems activities is being used to verify the human factors engineering requirements for ground systems used to process the SLS and Orion vehicles, and how the primitive models will be applied to future spacecraft and launch vehicle processing.
Critical Features of Fragment Libraries for Protein Structure Prediction
Trevizani, Raphael; Custódio, Fábio Lima; dos Santos, Karina Baptista; Dardenne, Laurent Emmanuel
2017-01-01
The use of fragment libraries is a popular approach among protein structure prediction methods and has proven to substantially improve the quality of predicted structures. However, some vital aspects of a fragment library that influence the accuracy of modeling a native structure remain to be determined. This study investigates some of these features. In particular, we analyze the effect of using secondary structure prediction to guide fragment selection, the effect of different fragment sizes, and the effect of structural clustering of fragments within libraries. To have a clearer view of how these factors affect protein structure prediction, we isolated the process of model building by fragment assembly from some common limitations associated with prediction methods, e.g., imprecise energy functions and optimization algorithms, by employing an exact structure-based objective function under a greedy algorithm. Our results indicate that shorter fragments reproduce the native structure more accurately than longer ones. Libraries composed of multiple fragment lengths generate even better structures, with longer fragments proving more useful at the beginning of the simulations. The use of many different fragment sizes shows little improvement when compared to predictions carried out with libraries that comprise only three different fragment sizes. Models obtained from libraries built using only sequence similarity are, on average, better than those built with a secondary structure prediction bias. However, we found that the use of secondary structure prediction allows greater reduction of the search space, which is invaluable for prediction methods. The results of this study can serve as critical guidelines for the use of fragment libraries in protein structure prediction. PMID:28085928
Using CellML with OpenCMISS to Simulate Multi-Scale Physiology
Nickerson, David P.; Ladd, David; Hussan, Jagir R.; Safaei, Soroush; Suresh, Vinod; Hunter, Peter J.; Bradley, Christopher P.
2014-01-01
OpenCMISS is an open-source modeling environment aimed, in particular, at the solution of bioengineering problems. OpenCMISS consists of two main parts: a computational library (OpenCMISS-Iron) and a field manipulation and visualization library (OpenCMISS-Zinc). OpenCMISS is designed for the solution of coupled multi-scale, multi-physics problems in a general-purpose parallel environment. CellML is an XML format designed to encode biophysically based systems of ordinary differential equations and both linear and non-linear algebraic equations. A primary design goal of CellML is to allow mathematical models to be encoded in a modular and reusable format to aid reproducibility and interoperability of modeling studies. In OpenCMISS, we make use of CellML models to enable users to configure various aspects of their multi-scale physiological models. This avoids the need for users to be familiar with the OpenCMISS internal code in order to perform customized computational experiments. Examples of this are: cellular electrophysiology models embedded in tissue electrical propagation models; material constitutive relationships for mechanical growth and deformation simulations; time-varying boundary conditions for various problem domains; and fluid constitutive relationships and lumped-parameter models. In this paper, we provide implementation details describing how CellML models are integrated into multi-scale physiological models in OpenCMISS. The external interface OpenCMISS presents to users is also described, including specific examples exemplifying the extensibility and usability these tools provide to the physiological modeling and simulation community. We conclude with some thoughts on future extension of OpenCMISS to make use of other community developed information standards, such as FieldML, SED-ML, and BioSignalML. Plans for the integration of accelerator code (graphical processing unit and field programmable gate array) generated from CellML models are also discussed. PMID:25601911
NASA Astrophysics Data System (ADS)
Chou, H. K.; Ochoa-Tocachi, B. F.; Buytaert, W.
2017-12-01
Community land surface models such as JULES are increasingly used for hydrological assessment because of their state-of-the-art representation of land-surface processes. However, a major weakness of JULES and other land surface models is the limited number of land surface parameterizations that is available. Therefore, this study explores the use of data from a network of catchments under homogeneous land-use to generate parameter "libraries" to extend the land surface parameterizations of JULES. The network (called iMHEA) is part of a grassroots initiative to characterise the hydrological response of different Andean ecosystems, and collects data on streamflow, precipitation, and several weather variables at a high temporal resolution. The tropical Andes are a useful case study because of the complexity of meteorological and geographical conditions combined with extremely heterogeneous land-use, which results in a wide range of hydrological responses. We then calibrated JULES for each land-use represented in the iMHEA dataset. For the individual land-use types, the results show improved simulations of streamflow when using the calibrated parameters with respect to default values. In particular, the partitioning between surface and subsurface flows can be improved. On a regional scale as well, hydrological modelling benefited greatly from constraining parameters using such distributed, citizen-science-generated streamflow data. This study demonstrates regional hydrological modelling and prediction that integrates citizen science with a land surface model, showing that such a framework can indeed alleviate data scarcity in hydrological studies. The improved predictions could be leveraged by catchment managers to guide watershed interventions, to evaluate their effectiveness, and to minimize risks.
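A schematic of the per-catchment calibration loop (a generic random search against Nash-Sutcliffe efficiency, not the study's actual calibration machinery; `run_model` stands in for a configured JULES run on one iMHEA catchment):

    import numpy as np

    def nse(obs, sim):
        """Nash-Sutcliffe efficiency of simulated vs. observed streamflow."""
        return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

    def calibrate(run_model, bounds, obs, n_iter=500, seed=0):
        """Random-search calibration: keep the parameter set with best NSE."""
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds).T
        best_p, best_s = None, -np.inf
        for _ in range(n_iter):
            p = rng.uniform(lo, hi)
            s = nse(obs, run_model(p))
            if s > best_s:
                best_p, best_s = p, s
        return best_p, best_s

    # Toy stand-in for a model run: two-parameter recession curve.
    t = np.arange(100.0)
    run_model = lambda p: p[1] * np.exp(-t / p[0])
    obs = run_model([20.0, 5.0]) + 0.1
    print(calibrate(run_model, [(1, 50), (0, 10)], obs))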
A Model for Designing Library Instruction for Distance Learning
ERIC Educational Resources Information Center
Rand, Angela Doucet
2013-01-01
Providing library instruction in distance learning environments presents a unique set of challenges for instructional librarians. Innovations in computer-mediated communication and advances in cognitive science research provide the opportunity for designing library instruction that meets a variety of student information seeking needs. Using a…
Creating an Online Library To Support a Virtual Learning Community.
ERIC Educational Resources Information Center
Sandelands, Eric
1998-01-01
International Management Centres (IMC), an independent business school, and Anbar Electronic Intelligence (AEI), a database publisher, have created a virtual library for IMC's virtual business school. Topics discussed include action learning; IMC's partnership with AEI; the virtual university model; designing virtual library resources; and…
Virtual Reference Transcript Analysis: A Few Models.
ERIC Educational Resources Information Center
Smyth, Joanne
2003-01-01
Describes the introduction of virtual, or digital, reference service at the University of New Brunswick libraries. Highlights include analyzing transcripts from LIVE (Library Information in a Virtual Environment); reference question types; ACRL (Association of College and Research Libraries) information literacy competency standards; and the Big 6…
ERIC Educational Resources Information Center
Conable, Gordon
1990-01-01
Discusses investigation of librarians by the FBI (Federal Bureau of Investigation) and the status of the Library Awareness Program. An American Library Association (ALA) resolution opposing FBI activities in libraries and demanding release of individual files to persons subjected to checks is described. A model letter for individuals to use in…
Determining and Communicating the Value of the Special Library.
ERIC Educational Resources Information Center
Matthews, Joseph R.
2003-01-01
Discusses performance measures for libraries that will indicate the goodness of the library and its services. Highlights include a general evaluation model that includes input, process, output, and outcome measures; balanced scorecard approach that includes financial perspectives; focusing on strategy; strategies for change; user criteria for…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haan, J.F. de; Kokke, J.M.M.; Hoogenboom, H.J.
1997-06-01
Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air-water interface correction, and application of water quality algorithms. A prototype version of an integrated software environment has recently been developed that enables the user to perform and control these processing steps. Major parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code, (ii) a database of water quality algorithms, and (iii) a spectral library of Dutch coastal and inland waters, containing subsurface irradiance reflectance spectra and associated water quality parameters. The atmospheric correction part of this environment is discussed here. It is shown that this part can be used to accurately retrieve spectral signatures of inland water for wavelengths between 450 and 750 nm, provided in situ measurements are used to determine atmospheric model parameters. Assessment of the usefulness of the completely integrated software system in an operational environment requires a revised version that is presently being developed.
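As a schematic of the correction step (a heavily simplified at-sensor radiance model, not the prototype's actual algorithms; 0.54 is the commonly quoted water-to-air radiance transmission factor, and the numbers in the example are made up):

    import numpy as np

    def subsurface_reflectance(L_sensor, L_path, t_up, E_down, f=0.54):
        """Invert L_sensor = L_path + t_up * L_w for the water-leaving
        radiance L_w, then convert to subsurface irradiance reflectance
        R(0-) assuming L_w = f * R(0-) * E_down / pi. L_path, t_up and
        E_down would come from a radiative transfer code such as MODTRAN,
        constrained by in situ measurements."""
        L_w = (L_sensor - L_path) / t_up
        return np.pi * L_w / (f * E_down)

    print(subsurface_reflectance(L_sensor=32.0, L_path=18.0,
                                 t_up=0.85, E_down=900.0))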
PAL: A Positional Astronomy Library
NASA Astrophysics Data System (ADS)
Jenness, T.; Berry, D. S.
2013-10-01
PAL is a new positional astronomy library written in C that attempts to retain the SLALIB API but is distributed with an open source GPL license. The library depends on the IAU SOFA library wherever a SOFA routine exists and uses the most recent nutation and precession models. Currently about 100 of the 200 SLALIB routines are available. Interfaces are also available from Perl and Python. PAL is freely available via github.
NASA Astrophysics Data System (ADS)
Mohamed, Hassan; Lindley, Benjamin; Parks, Geoffrey
2017-01-01
Nuclear data consists of measured or evaluated probabilities of various fundamental physical interactions involving the nuclei of atoms and their properties. Most fluoride salt-cooled high-temperature reactor (FHR) studies that were reviewed do not give detailed information on the data libraries used in their assessments. Therefore, the main objective of this data library comparison study is to investigate whether there are any significant discrepancies between the main data libraries, namely ENDF/B-VII, JEFF-3.1 and JEF-2.2. Knowing the discrepancies, and especially their magnitude, is important and relevant for readers in deciding whether further caution is necessary in any future verification or validation processes when modelling an FHR. The study is performed using AMEC's reactor physics software tool, WIMS. The WIMS calculation is simply a 2-D infinite-lattice fuel assembly calculation. The comparison between the data libraries in terms of the infinite multiplication factor (k_inf) and the pin power map is presented. Results show that the discrepancy between the JEFF-3.1 and ENDF/B-VII libraries is reasonably small but increases as the fuel depletes, due to data library uncertainties that accumulate at each burnup step. Additionally, there are large discrepancies between JEF-2.2 and ENDF/B-VII because of the inadequacy of the JEF-2.2 library.
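For reference, k_inf discrepancies between libraries are commonly quoted as reactivity differences in pcm; a one-liner (the example values are made up):

    def reactivity_diff_pcm(k1, k2):
        """Reactivity difference between two k_inf values, in pcm:
        (1/k1 - 1/k2) * 1e5."""
        return (1.0 / k1 - 1.0 / k2) * 1e5

    # e.g. JEFF-3.1 vs ENDF/B-VII at one burnup step (illustrative numbers)
    print(reactivity_diff_pcm(1.32510, 1.32655))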
STAR Library Education Network: a hands-on learning program for libraries and their communities
NASA Astrophysics Data System (ADS)
Dusenbery, P.
2010-12-01
Science and technology are widely recognized as major drivers of innovation and industry (e.g. Rising above the Gathering Storm, 2006). While the focus for education reform is on school improvement, there is considerable research that supports the role that out-of-school experiences can play in student achievement and public understanding of STEM disciplines. Libraries provide an untapped resource for engaging underserved youth and their families in fostering an appreciation and deeper understanding of science and technology topics. Designed spaces, like libraries, allow lifelong, life-wide, and life-deep learning to take place, though the research basis for learning in libraries is not as well developed as for other informal settings like science centers. The Space Science Institute’s National Center for Interactive Learning (NCIL) in partnership with the American Library Association (ALA), the Lunar and Planetary Institute (LPI), and the National Girls Collaborative Project (NGCP) have received funding from NSF to develop a national education project called the STAR Library Education Network: a hands-on learning program for libraries and their communities (or STAR-Net for short). STAR stands for Science-Technology, Activities and Resources. The overarching goal of the project is to reach underserved youth and their families with informal STEM learning experiences. This project will deepen our knowledge of informal/lifelong learning that takes place in libraries and establish a learning model that can be compared to the more established free-choice learning model for science centers and museums. The project includes the development of two STEM hands-on exhibits on topics that are of interest to library staff and their patrons: Discover Earth and Discover Tech. In addition, the project will produce resources and inquiry-based activities that libraries can use to enrich the exhibit experience. Additional resources will be provided through partnerships with relevant professional science and technology organizations (e.g. American Geophysical Union; National Academy of Engineering) that will provide speakers for host library events and webinars. Online and in-person workshops will be conducted for library staff with a focus on increasing content knowledge and improving facilitation expertise. This presentation will report on strategic planning activities for STAR-Net, a Community of Practice model, and the evaluation/research components of this national education program.
NASA Astrophysics Data System (ADS)
Kordy, M. A.; Wannamaker, P. E.; Maris, V.; Cherkaev, E.; Hill, G. J.
2014-12-01
We have developed an algorithm for 3D simulation and inversion of magnetotelluric (MT) responses using deformable hexahedral finite elements that permits incorporation of topography. Direct solvers parallelized on symmetric multiprocessor (SMP), single-chassis workstations with large RAM are used for the forward solution, parameter jacobians, and model update. The forward simulator, jacobian calculations, and synthetic and real data inversions are presented. We use first-order edge elements to represent the secondary electric field (E), yielding accuracy O(h) for E and its curl (magnetic field). For very low frequency or small material admittivity, the E-field requires divergence correction. Using Hodge decomposition, the correction may be applied after the forward solution is calculated; this permits accurate E-field solutions in dielectric air. The system matrix factorization is computed using the MUMPS library, which shows moderately good scalability through 12 processor cores but limited gains beyond that. The factored matrix is used to calculate the forward response as well as the jacobians of field and MT responses using the reciprocity theorem. Comparison with other codes demonstrates the accuracy of our forward calculations. We consider a popular conductive/resistive double brick structure and several topographic models. In particular, the ability of finite elements to represent smooth topographic slopes permits accurate simulation of refraction of electromagnetic waves normal to the slopes at high frequencies. Run-time tests indicate that for meshes as large as 150x150x60 elements, the MT forward response and jacobians can be calculated in ~2.5 hours per frequency. For inversion, we implemented a data-space Gauss-Newton method, which offers a reduction in memory requirements and a significant speedup of the parameter step versus the model-space approach. For dense matrix operations we use the tiling approach of the PLASMA library, which shows very good scalability. In synthetic inversions we examine the importance of including topography in the inversion and we test different regularization schemes using a weighted second norm of the model gradient, as well as inverting for a static distortion matrix following the Miensopust/Avdeeva approach. We also apply our algorithm to invert MT data collected at Mt St Helens.
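For reference, the data-space Gauss-Newton update that motivates this memory saving is the standard Occam-type formulation (written here in our own notation and omitting the prior-model term; the authors' exact scheme may differ). It solves an N_data-sized rather than N_model-sized system:

    \delta m = C_m J^{\top} \beta, \qquad
    \left( J C_m J^{\top} + \lambda C_d \right) \beta = d_{\mathrm{obs}} - F(m),

where J is the jacobian, C_m and C_d are the model and data covariances, F(m) the forward response, and \lambda the regularization parameter; the matrix to factor is N_data x N_data instead of N_model x N_model, a large saving when parameters far outnumber data.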
Foight, Glenna Wink; Chen, T. Scott; Richman, Daniel; Keating, Amy E.
2017-01-01
Peptide reagents with high affinity or specificity for their target protein interaction partner are of utility for many important applications. Optimization of peptide binding by screening large libraries is a proven and powerful approach. Libraries designed to be enriched in peptide sequences that are predicted to have desired affinity or specificity characteristics are more likely to yield success than random mutagenesis. We present a library optimization method in which the choice of amino acids to encode at each peptide position can be guided by available experimental data or structure-based predictions. We discuss how to use analysis of predicted library performance to inform rounds of library design. Finally, we include protocols for more complex library design procedures that consider the chemical diversity of the amino acids at each peptide position and optimize a library score based on a user-specified input model. PMID:28236241
Brassolatti, Patricia; de Andrade, Ana Laura Martins; Bossini, Paulo Sérgio; Otterço, Albaiza Nicoletti; Parizotto, Nivaldo Antônio
2018-05-05
A burn is defined as a traumatic injury of thermal origin, which affects the organic tissue. Low-level laser therapy (LLLT) has gained great prominence as a treatment for this type of injury; however, the application parameters are still controversial in the literature. The aims of this study were to review studies that used LLLT as a treatment for burns in experimental models, discuss the main parameters used, and highlight the benefits found, in order to choose an appropriate therapeutic window to be applied to this type of injury. The selection of studies related to the theme was carried out in the main databases (PubMed, Cochrane Library, LILACS, Web of Science, and Scopus) for the period from 2001 to 2017. Articles that met the previously established inclusion criteria were then chosen. In the end, 22 were evaluated, and the main parameters were presented. The analyzed studies used LLLT in both continuous and pulsed modes. Differences between the parameters used (power, fluence, and total energy) were observed. In addition, the protocols differ in the type of injury and the number of treatment sessions. Among the results obtained by the authors are improvements in local microcirculation and cellular proliferation; however, one study reported no effect of LLLT as a treatment. LLLT is effective in accelerating the healing process. However, there is great difficulty in establishing the most adequate protocol, owing to the large discrepancies in the applied dosimetry values.
Reproducible Hydrogeophysical Inversions through the Open-Source Library pyGIMLi
NASA Astrophysics Data System (ADS)
Wagner, F. M.; Rücker, C.; Günther, T.
2017-12-01
Many tasks in applied geosciences cannot be solved by a single measurement method and require the integration of geophysical, geotechnical and hydrological methods. In the emerging field of hydrogeophysics, researchers strive to gain quantitative information on process-relevant subsurface parameters by means of multi-physical models, which simulate the dynamic process of interest as well as its geophysical response. However, such endeavors are associated with considerable technical challenges, since they require coupling of different numerical models. This represents an obstacle for many practitioners and students. Even technically versatile users tend to build individually tailored solutions by coupling different (and potentially proprietary) forward simulators at the cost of scientific reproducibility. We argue that the reproducibility of studies in computational hydrogeophysics, and therefore the advancement of the field itself, requires versatile open-source software. To this end, we present pyGIMLi - a flexible and computationally efficient framework for modeling and inversion in geophysics. The object-oriented library provides management for structured and unstructured meshes in 2D and 3D, finite-element and finite-volume solvers, various geophysical forward operators, as well as Gauss-Newton based frameworks for constrained, joint and fully-coupled inversions with flexible regularization. In a step-by-step demonstration, it is shown how the hydrogeophysical response of a saline tracer migration can be simulated. Tracer concentration data from boreholes and measured voltages at the surface are subsequently used to estimate the hydraulic conductivity distribution of the aquifer within a single reproducible Python script.
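For illustration, the core of a constrained Gauss-Newton update of the kind pyGIMLi's inversion frameworks implement (a generic numpy sketch under our own notation, not pyGIMLi's API):

    import numpy as np

    def gauss_newton_step(J, d_obs, d_sim, m, m_ref, lam, W_m):
        """One regularized Gauss-Newton update: minimize
        ||d_obs - F(m)||^2 + lam * ||W_m (m - m_ref)||^2, linearized at m."""
        R = W_m.T @ W_m
        A = J.T @ J + lam * R
        b = J.T @ (d_obs - d_sim) - lam * R @ (m - m_ref)
        return m + np.linalg.solve(A, b)

Each iteration recomputes the forward response and jacobian around the updated model; the operator W_m encodes the smoothness (or other) constraint and lam the regularization strength.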
Open Source Tools for Seismicity Analysis
NASA Astrophysics Data System (ADS)
Powers, P.
2010-12-01
The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori Law and the Gutenberg Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The Javascript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
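For illustration, two of the standard estimators such libraries implement, in Python (the formulas — Aki's maximum-likelihood b-value with Utsu's binning correction, and the modified Omori law — are standard; the code is a sketch, not the USGS implementation):

    import numpy as np

    def b_value(mags, mc, dm=0.1):
        """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965),
        with Utsu's correction for magnitudes binned to width dm."""
        m = np.asarray(mags)
        m = m[m >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    def omori_rate(t, K, c, p):
        """Modified Omori law: aftershock rate K / (t + c)**p."""
        return K / (t + c) ** p

    rng = np.random.default_rng(42)
    mags = 0.5 + rng.exponential(1.0 / np.log(10), size=5000)  # true b = 1
    print(b_value(mags, mc=0.5, dm=0.0))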
Ervik, Åsmund; Mejía, Andrés; Müller, Erich A
2016-09-26
Coarse-grained molecular simulation has become a popular tool for modeling simple and complex fluids alike. The defining aspects of a coarse grained model are the force field parameters, which must be determined for each particular fluid. Because the number of molecular fluids of interest in nature and in engineering processes is immense, constructing force field parameter tables by individually fitting to experimental data is a futile task. A step toward solving this challenge was taken recently by Mejía et al., who proposed a correlation that provides SAFT-γ Mie force field parameters for a fluid provided one knows the critical temperature, the acentric factor and a liquid density, all relatively accessible properties. Building on this, we have applied the correlation to more than 6000 fluids, and constructed a web application, called "Bottled SAFT", which makes this data set easily searchable by CAS number, name or chemical formula. Alternatively, the application allows the user to calculate parameters for components not present in the database. Once the intermolecular potential has been found through Bottled SAFT, code snippets are provided for simulating the desired substance using the "raaSAFT" framework, which leverages established molecular dynamics codes to run the simulations. The code underlying the web application is written in Python using the Flask microframework; this allows us to provide a modern high-performance web app while also making use of the scientific libraries available in Python. Bottled SAFT aims at taking the complexity out of obtaining force field parameters for a wide range of molecular fluids, and facilitates setting up and running coarse-grained molecular simulations. The web application is freely available at http://www.bottledsaft.org . The underlying source code is available on Bitbucket under a permissive license.
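For reference, the parameters such a database provides (well depth, size, and repulsive/attractive exponents) define the Mie pair potential; a small sketch with illustrative values:

    def mie_potential(r, eps, sigma, lr, la=6.0):
        """Mie pair potential between two coarse-grained beads:
        u(r) = C * eps * ((sigma/r)**lr - (sigma/r)**la), with the
        prefactor C chosen so that the well depth equals eps."""
        c = (lr / (lr - la)) * (lr / la) ** (la / (lr - la))
        return c * eps * ((sigma / r) ** lr - (sigma / r) ** la)

    # e.g. a bead with eps = 350 (k_B units) and sigma = 0.44 nm
    print(mie_potential(0.5, eps=350.0, sigma=0.44, lr=15.0))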
NASA Astrophysics Data System (ADS)
Brockmann, J. M.; Schuh, W.-D.
2011-07-01
The estimation of the global Earth's gravity field, parametrized as a finite spherical harmonic series, is computationally demanding. The computational effort depends on the one hand on the maximal resolution of the spherical harmonic expansion (i.e. the number of parameters to be estimated) and on the other hand on the number of observations (several million for, e.g., observations from the GOCE satellite mission). To circumvent these restrictions, massively parallel software based on high-performance computing (HPC) libraries such as ScaLAPACK, PBLAS and BLACS was designed in the context of GOCE HPF WP6000 and the GOCO consortium. A prerequisite for the use of these libraries is that all matrices are block-cyclically distributed on a processor grid composed of a large number of (distributed-memory) computers. Using this set of standard HPC libraries has the benefit that, once the matrices are distributed across the computer cluster, a huge set of efficient and highly scalable linear algebra operations can be used.
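For illustration, the 2-D block-cyclic mapping these libraries assume can be written in a few lines (a sketch of the indexing rule only, not the BLACS API):

    def block_cyclic_owner(i, j, nb, P, Q):
        """Process-grid coordinates owning global entry (i, j) of a matrix
        distributed in nb x nb blocks over a P x Q process grid, as in
        ScaLAPACK's 2-D block-cyclic layout."""
        return (i // nb) % P, (j // nb) % Q

    # e.g. 2x2 blocks on a 2x3 grid: which process holds entry (5, 7)?
    print(block_cyclic_owner(5, 7, nb=2, P=2, Q=3))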
A Geometric Model for Specularity Prediction on Planar Surfaces with Multiple Light Sources.
Morgand, Alexandre; Tamaazousti, Mohamed; Bartoli, Adrien
2018-05-01
Specularities are often problematic in computer vision since they impact the dynamic range of the image intensity. A natural approach would be to predict and discard them using computer graphics models. However, these models depend on parameters which are difficult to estimate (light sources, objects' material properties and camera). We present a geometric model called JOLIMAS: JOint LIght-MAterial Specularity, which predicts the shape of specularities. JOLIMAS is reconstructed from images of specularities observed on a planar surface. It implicitly includes light and material properties, which are intrinsic to specularities. This model was motivated by the observation that specularities have a conic shape on planar surfaces. The conic shape is obtained by projecting a fixed quadric on the planar surface. JOLIMAS thus predicts the specularity using a simple geometric approach with static parameters (object material and light source shape). It is adapted to indoor light sources such as light bulbs and fluorescent lamps. The prediction has been tested on synthetic and real sequences. It works in a multi-light context by reconstructing a quadric for each light source with special cases such as lights being switched on or off. We also used specularity prediction for dynamic retexturing and obtained convincing rendering results. Further results are presented as supplementary video material, which can be found on the Computer Society Digital Library at http://doi.ieeecomputersociety.org/10.1109/TVCG.2017.2677445.
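For reference, the projection of a quadric to a conic that underlies such a construction is the standard multiple-view-geometry result, stated here in dual form and in our own notation (the paper projects onto the planar surface; the general rule for a projection matrix P is the same):

    C^{*} = P\, Q^{*}\, P^{\top},

where P is the 3x4 projection matrix, Q^{*} the 4x4 dual quadric, and C^{*} the 3x3 dual conic whose envelope is the predicted specularity boundary.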
Multimedia in German Libraries--Aspects of Cooperation and Integration.
ERIC Educational Resources Information Center
Cremer, Monika
This paper on multimedia in German libraries begins with an introduction to multimedia. Initiatives of the federal government and in the Laender (federal states) are then described, including: a 1997 symposium organized by the university library of Goettingen that presented several multimedia models developed in universities; the multimedia…
Circulation Clusters--An Empirical Approach to Decentralization of Academic Libraries.
ERIC Educational Resources Information Center
McGrath, William E.
1986-01-01
Discusses the issue of centralization or decentralization of academic library collections, and describes a statistical analysis of book circulation at the University of Southwestern Louisiana that yielded subject area clusters as a compromise solution to the problem. Applications of the cluster model for all types of library catalogs are…
Reviews, Holdings, and Presses and Publishers in Academic Library Book Acquisitions.
ERIC Educational Resources Information Center
Calhoun, John C.
2001-01-01
Discussion of academic library book acquisition reviews pertinent literature on book reviews, book selection, and evaluation and proposes a model for academic library book acquisition using a two-year relational database file of approval plan records. Defines a core collection for the California State University system, and characterizes…
Five Aspects of Current Trends in German Library Science
ERIC Educational Resources Information Center
Steierwald, Ulrike
2006-01-01
The specialisation Library Science at the Hochschule Darmstadt/University of Applied Science Darmstadt is the newest academic program in Germany for the higher education of librarians. Five current trends in library science in Germany reflect the new "Darmstadt Model": (1) The delimitation of a specific professional field…
Competency-Based Education Programs: A Library Perspective
ERIC Educational Resources Information Center
Sanders, Colleen
2015-01-01
Competency-based education (CBE) is an emerging model for higher education designed to reduce certain barriers to educational attainment. This essay describes CBE and the challenges and opportunities for academic librarians desiring to serve students and faculty in Library and Information Management Master of Library Science (MLS) programs. Every…
Convergence and Professional Identity in the Academic Library
ERIC Educational Resources Information Center
Wilson, Kerry M.; Halpin, Eddie
2006-01-01
This paper discusses the effects of operational convergence, and the subsequent growth of the hybrid library model, upon the professional self-identity of academic library staff. The role of professionalism as a concept and motivational driver within contemporary academic librarianship is examined. Main themes of investigation include the extent…
E-Books in Public School Libraries: Are We There Yet?
ERIC Educational Resources Information Center
Rothman, Allison
2017-01-01
Demands for school technology innovations, implementation of 1:1 device models, and increased interest in digital media highlight complicated issues such as funding, equity, and decision making for e-book collection development and programming in school libraries. School librarians considering purchase of e-books for school libraries still cannot…
Librarians and Libraries Supporting Open Access Publishing
ERIC Educational Resources Information Center
Richard, Jennifer; Koufogiannakis, Denise; Ryan, Pam
2009-01-01
As new models of scholarly communication emerge, librarians and libraries have responded by developing and supporting new methods of storing and providing access to information and by creating new publishing support services. This article will examine the roles of libraries and librarians in developing and supporting open access publishing…
Building Bridges: A Research Library Model for Technology-Based Partnerships
ERIC Educational Resources Information Center
Snyder, Carolyn A.; Carter, Howard; Soltys, Mickey
2005-01-01
The nature of technology-based collaboration is affected by the changing goals and priorities, budgetary considerations, staff expertise, and leadership of each of the organizations involved in the partnership. In the context of a national research library, this article will describe Southern Illinois University Carbondale Library Affairs'…
A Management Information System in a Library Environment.
ERIC Educational Resources Information Center
Sutton, Michael J.; Black, John B.
More effective use of diminishing resources was needed to provide the best possible services at the University of Guelph (Ontario, Canada) library. This required the improved decision-making processes of a Library Management Information System (LMIS) to provide systematic information analysis. An information flow model was created, and an…
Hiring and Recruitment Practices in Academic Libraries: Problems and Solutions.
ERIC Educational Resources Information Center
Raschke, Gregory K.
2003-01-01
Academic libraries need to change their recruiting and hiring procedures to stay competitive in today's changing marketplace. To be more competitive and effective in their recruitment and hiring processes, academic libraries must foster manageable internal solutions, look to other professions for effective hiring techniques and models, and employ…
Library 2.0: Service for the Next-Generation Library
ERIC Educational Resources Information Center
Casey, Michael E.; Savastinuk, Laura C.
2006-01-01
Libraries are changing. Funding limits and customer demands are transforming staffing levels, service models, access to resources, and services to the public. Administrators and taxpayers are seeking more efficient ways of delivering services to achieve greater returns on financial investments. In this article, the author discusses the benefits of…
ERIC Educational Resources Information Center
Bensman, Stephen J.
2000-01-01
This speculative historiographic essay attempts to fix the present position of library and information science within the context of the probabilistic revolution that has been encompassing all of science. Comprises a guide to statistical research in library and information science, discussing skewed distributions, biostatistics, stochastic models,…
Wireless Computing in the Library: A Successful Model at St. Louis Community College.
ERIC Educational Resources Information Center
Patton, Janice K.
2001-01-01
Describes the St. Louis Community College (Missouri) library's use of laptop computers in the instruction lab as a way to save space and wiring costs. Discusses the pros and cons of wireless library instruction; advantages include its flexibility and its ability to eliminate cabling. (NB)
Organizational Effectiveness in Libraries: A Review and Some Suggestions.
ERIC Educational Resources Information Center
Aversa, Elizabeth
1981-01-01
Reviews some approaches to organizational effectiveness suggested by organizational theorists, reports on the applications of these theories in libraries, develops some hypotheses regarding the assessment of performance in libraries, and describes a model which synthesizes some of the approaches. A 52-item reference list is attached. (Author/JL)
Pricing Models and Payment Schemes for Library Collections.
ERIC Educational Resources Information Center
Stern, David
2002-01-01
Discusses new pricing and payment options for libraries in light of online products. Topics include alternative cost models rather than traditional subscriptions; use-based pricing; changes in scholarly communication due to information technology; methods to determine appropriate charges for different organizations; consortial plans; funding; and…
New Consortial Model for E-Books Acquisitions
ERIC Educational Resources Information Center
Swindler, Luke
2016-01-01
E-books constitute major challenges for library collections generally and present fundamental problems for consortial collection development specifically. The Triangle Research Libraries Network (TRLN) and Oxford University Press (OUP) have created a mutually equitable and financially sustainable model for the consortial acquisition of e-books…
Eliminating traditional reference services in an academic health sciences library: a case study
Schulte, Stephanie J
2011-01-01
Question: How were traditional librarian reference desk services successfully eliminated at one health sciences library? Setting: The analysis was done at an academic health sciences library at a major research university. Method: A gap analysis was performed, evaluating changes in the first eleven months through analysis of reference transaction and instructional session data. Main Results: Substantial increases were seen in the overall number of specialized reference transactions and in librarian-conducted transactions lasting more than thirty minutes. The number of reference transactions overall increased after implementing the new model. Several new small-scale instructional initiatives began, though perhaps not directly related to the new model. Conclusion: Traditional reference desk services were eliminated at one academic health sciences library without negative impact on reference and instructional statistics. Eliminating the tie to the physical library imposed by staffing reference desk hours removed one significant barrier to a more proactive liaison program. PMID:22022221
SBSI: an extensible distributed software infrastructure for parameter estimation in systems biology.
Adams, Richard; Clark, Allan; Yamaguchi, Azusa; Hanlon, Neil; Tsorman, Nikos; Ali, Shakir; Lebedeva, Galina; Goltsov, Alexey; Sorokin, Anatoly; Akman, Ozgur E; Troein, Carl; Millar, Andrew J; Goryanin, Igor; Gilmore, Stephen
2013-03-01
Complex computational experiments in Systems Biology, such as fitting model parameters to experimental data, can be challenging to perform. Not only do they frequently require a high level of computational power, but the software needed to run the experiment needs to be usable by scientists with varying levels of computational expertise, and modellers need to be able to obtain up-to-date experimental data resources easily. We have developed a software suite, the Systems Biology Software Infrastructure (SBSI), to facilitate the parameter-fitting process. SBSI is a modular software suite composed of three major components: SBSINumerics, a high-performance library containing parallelized algorithms for performing parameter fitting; SBSIDispatcher, a middleware application to track experiments and submit jobs to back-end servers; and SBSIVisual, an extensible client application used to configure optimization experiments and view results. Furthermore, we have created a plugin infrastructure to enable project-specific modules to be easily installed. Plugin developers can take advantage of the existing user-interface and application framework to customize SBSI for their own uses, facilitated by SBSI's use of standard data formats. All SBSI binaries and source-code are freely available from http://sourceforge.net/projects/sbsi under an Apache 2 open-source license. The server-side SBSINumerics runs on any Unix-based operating system; both SBSIVisual and SBSIDispatcher are written in Java and are platform independent, allowing use on Windows, Linux and Mac OS X. The SBSI project website at http://www.sbsi.ed.ac.uk provides documentation and tutorials.
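For illustration, the type of objective minimized during such parameter fitting can be sketched in a few lines (a generic sum-of-squares cost under our own names, not SBSI's API; the toy model below is made up):

    import numpy as np

    def sse_cost(params, simulate, t_obs, y_obs):
        """Sum-of-squared-residuals distance between a model simulation
        and experimental time-course data."""
        return float(np.sum((y_obs - simulate(params, t_obs)) ** 2))

    # Toy example: exponential decay with unknown amplitude and rate.
    t = np.linspace(0.0, 10.0, 50)
    data = 2.0 * np.exp(-0.3 * t)
    decay = lambda p, t: p[0] * np.exp(-p[1] * t)
    print(sse_cost([2.0, 0.3], decay, t, data))  # ~0 at the true parameters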
Thermodynamic properties and atomic structure of Ca-based liquid alloys
NASA Astrophysics Data System (ADS)
Poizeau, Sophie
To identify the most promising positive electrodes for Ca-based liquid metal batteries, the thermodynamic properties of diverse Ca-based liquid alloys were investigated. The thermodynamic properties of Ca-Sb alloys were determined by emf measurements. It was found that Sb as the positive electrode would provide the highest voltage for Ca-based liquid metal batteries (1 V). The price of such a battery would be competitive for the grid-scale energy storage market. The impact of Pb, a natural impurity of Sb, was predicted successfully and confirmed via electrochemical measurements. It was shown that the impact on the open circuit voltage would be minor. Indeed, thermodynamic modeling demonstrated that the interaction between Ca and Sb is much stronger than between Ca and Pb, which explains why the partial thermodynamic properties of Ca would not vary much with the addition of Pb to Sb. However, the usage of the positive electrode would be reduced, which would limit the appeal of a Pb-Sb positive electrode. Throughout this work, the molecular interaction volume model (MIVM) was used for the first time for alloys with thermodynamic properties showing strong negative deviation from ideality. This model showed that systems such as Ca-Sb have strong short-range order: Ca is most stable when its first nearest neighbors are Sb. This is consistent with what the more traditional thermodynamic model, the regular association model, would predict. The advantages of the MIVM are the absence of assumptions regarding the composition of an associate, and the reduced number of fitting parameters (2 instead of 5). Based on the parameters derived from the thermodynamic modeling using the MIVM, a new potential of mixing for liquid alloys was defined to compare the strength of interaction in different Ca-based alloys. Comparing this trend with the strength of interaction in the solid state of these systems (assessed by the energy of formation of the intermetallics), the systems with the most stable intermetallics were found to have the strongest interaction in the liquid state. Eventually, a new criterion was formulated to select electrode materials for liquid metal batteries: systems with the most stable intermetallics, whose stability can be evaluated by their enthalpy of formation, will yield the highest voltage when assembled as positive and negative electrodes in a liquid metal battery. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)
NASA Astrophysics Data System (ADS)
Burello, E.; Bologa, C.; Frecer, V.; Miertus, S.
Combinatorial chemistry and technologies have been developed to a stage where synthetic schemes are available for the generation of a large variety of organic molecules. The innovative concept of combinatorial design assumes that screening a large and diverse library of compounds will increase the probability of finding an active analogue among the compounds tested. Since the rate at which libraries are screened for activity currently constitutes a limitation to the use of combinatorial technologies, it is important to be selective about the number of compounds to be synthesized. Early experience with combinatorial chemistry indicated that chemical diversity alone did not result in a significant increase in the number of generated lead compounds. Emphasis has therefore increasingly been put on the use of computer-assisted combinatorial chemical techniques. Computational methods are valuable in the design of virtual libraries of molecular models. Selection strategies based on computed physicochemical properties of the models or of a target compound are introduced to reduce the time and costs of library synthesis and screening. In addition, computational structure-based library focusing methods can be used to perform in silico screening of the activity of compounds against a target receptor by docking the ligands into the receptor model. Three case studies are discussed dealing with the design of targeted combinatorial libraries of inhibitors of HIV-1 protease, P. falciparum plasmepsin and human urokinase as potential antiviral, antimalarial and anticancer drugs. These illustrate library focusing strategies.
Developing Information Power Grid Based Algorithms and Software
NASA Technical Reports Server (NTRS)
Dongarra, Jack
1998-01-01
This exploratory study initiated our effort to understand performance modeling on parallel systems. The basic goal of performance modeling is to understand and predict the performance of a computer program or set of programs on a computer system. Performance modeling has numerous applications, including evaluation of algorithms, optimization of code implementations, parallel library development, comparison of system architectures, parallel system design, and procurement of new systems. Our work lays the basis for the construction of parallel libraries that allow for the reconstruction of application codes on several distinct architectures so as to assure performance portability. Following our strategy, once the requirements of applications are well understood, one can then construct a library in a layered fashion. The top level of this library will consist of architecture-independent geometric, numerical, and symbolic algorithms that are needed by the sample of applications. These routines should be written in a language that is portable across the targeted architectures.
The Managerial Roles of Academic Library Directors: The Mintzberg Model.
ERIC Educational Resources Information Center
Moskowitz, Michael Ann
1986-01-01
A study based on a model developed by Henry Mintzberg examined the internal and external managerial roles of 126 New England college and university library directors. Survey results indicate that the 97 responding directors were primarily involved with internal managerial roles and work contacts. (CDD)
Air Pollution and Quality of Sperm: A Meta-Analysis
Fathi Najafi, Tahereh; Latifnejad Roudsari, Robab; Namvar, Farideh; Ghavami Ghanbarabadi, Vahid; Hadizadeh Talasaz, Zahra; Esmaeli, Mahin
2015-01-01
Context: Air pollution is common in all countries and affects reproductive functions in men and women. It particularly impacts sperm parameters in men. This meta-analysis aimed to examine the impact of air pollution on the quality of sperm. Evidence Acquisition: The scientific databases of Medline, PubMed, Scopus, Google Scholar, the Cochrane Library, and Elsevier were searched to identify relevant articles published between 1978 and 2013. In the first step, 76 articles were selected. These studies were ecological correlation, cohort, retrospective, cross-sectional, and case-control ones that were found through electronic and hand searches of references about air pollution and male infertility. The outcome measure was the change in sperm parameters. A total of 11 articles were ultimately included in a meta-analysis to examine the impact of air pollution on sperm parameters. The authors applied meta-analysis sheets from the Cochrane Library; the extracted data, including the means and standard deviations of the sperm parameters, were then pooled, and their confidence intervals (CIs) were compared to the CIs of standard parameters. Results: The CIs for the pooled means were as follows: 2.68 ± 0.32 for ejaculation volume (mL), 62.1 ± 15.88 for sperm concentration (million per milliliter), 39.4 ± 5.52 for sperm motility (%), 23.91 ± 13.43 for sperm morphology (%), and 49.53 ± 11.08 for sperm count. Conclusions: The results of this meta-analysis showed that air pollution reduces sperm motility but has no impact on the other sperm parameters of the spermogram. PMID:26023349
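For readers unfamiliar with how pooled means like those above are formed, the following is a minimal fixed-effect (inverse-variance) pooling sketch; the per-study numbers are invented for illustration and are not taken from the included articles.

```python
import math

def pooled_mean_ci(means, sds, ns, z=1.96):
    """Fixed-effect (inverse-variance) pooled mean with a 95% CI."""
    weights = [n / sd**2 for sd, n in zip(sds, ns)]  # weight = 1 / var(mean)
    pooled = sum(w * m for w, m in zip(weights, means)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - z * se, pooled + z * se)

# Hypothetical per-study sperm-motility data (%): mean, SD, sample size.
means = [41.0, 37.5, 39.8]
sds = [9.0, 11.0, 8.5]
ns = [120, 85, 150]
mean, (lo, hi) = pooled_mean_ci(means, sds, ns)
print(f"pooled motility: {mean:.1f}% (95% CI {lo:.1f}-{hi:.1f})")
```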
n+235U resonance parameters and neutron multiplicities in the energy region below 100 eV
NASA Astrophysics Data System (ADS)
Pigni, Marco T.; Capote, Roberto; Trkov, Andrej; Pronyaev, Vladimir G.
2017-09-01
In August 2016, following the recent effort within the Collaborative International Evaluated Library Organization (CIELO) pilot project to improve the neutron cross sections of 235U, Oak Ridge National Laboratory (ORNL) collaborated with the International Atomic Energy Agency (IAEA) to release a resonance parameter evaluation. This evaluation restores the performance of the evaluated cross sections for the thermal- and above-thermal-solution benchmarks on the basis of newly evaluated thermal neutron constants (TNCs) and thermal prompt fission neutron spectra (PFNS). Performed with support from the US Nuclear Criticality Safety Program (NCSP) in an effort to provide the highest-fidelity general purpose nuclear database for nuclear criticality applications, the resonance parameter evaluation was submitted as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The resonance parameter evaluation methodology used the Reich-Moore approximation of the R-matrix formalism implemented in the code SAMMY to fit the available time-of-flight (TOF) measured data for the n+235U cross sections from thermal energies up to 100 eV. While maintaining reasonably good agreement with the experimental data, the validation analysis focused on restoring the benchmark performance for 235U solutions by combining changes to the resonance parameters and to the prompt fission neutron multiplicity ν̅ below 100 eV.
NASA Astrophysics Data System (ADS)
Kuntoro, Iman; Sembiring, T. M.; Susilo, Jati; Deswandri; Sunaryo, G. R.
2018-02-01
Criticality calculations of the AP1000 core with new editions of the nuclear data library, namely ENDF/B-VII and ENDF/B-VII.1, have been performed. This work aims to assess the accuracy of ENDF/B-VII and ENDF/B-VII.1, compared with ENDF/B-VI.8, in determining the criticality parameter of the AP1000. The analysis was applied to the core at cold zero power (CZP) conditions. The calculations were carried out by means of the MCNP computer code in three-dimensional geometry. The results show that the criticality parameter, namely the effective multiplication factor of the AP1000 core, is higher than that obtained with ENDF/B-VI.8, with relative differences of 0.39% for ENDF/B-VII and 0.34% for ENDF/B-VII.1.
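The quoted 0.39% and 0.34% figures are simple relative differences of the effective multiplication factor against the ENDF/B-VI.8 result. A one-function sketch, with hypothetical k-eff values chosen only to reproduce the order of magnitude:

```python
def relative_difference_pct(k_new: float, k_ref: float) -> float:
    """Relative difference (%) of a multiplication factor against a reference."""
    return 100.0 * (k_new - k_ref) / k_ref

# Hypothetical CZP eigenvalues; only the ~0.39% spread mirrors the abstract.
k_b68, k_b70 = 0.99620, 1.00009
print(f"{relative_difference_pct(k_b70, k_b68):.2f}%")  # -> 0.39%
```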
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arion is a library and tool set that enables researchers to holistically define test system models. Defining a complex system for testing an algorithm or control requires expertise across multiple domains. Simulating a complex system requires the integration of multiple simulators and test hardware, each with their own specification languages and concepts, which demands an extensive set of knowledge and capabilities. Arion was developed to alleviate this challenge. Arion is a set of Java libraries that abstracts the concepts from supported simulators into a cohesive model language, allowing someone to build models to their needed level of fidelity and expertise. Arion is also a software tool that translates the user's model back into the specification languages of the simulators and test hardware needed for execution.
Discrete event simulation tool for analysis of qualitative models of continuous processing systems
NASA Technical Reports Server (NTRS)
Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)
1990-01-01
An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
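The core of any such simulator is the loop that pops time-stamped events until the queue empties. The sketch below is a generic minimal event-queue core in Python, not the disclosed tool's implementation; the pump and sensor events are invented stand-ins for the invocation and effect statements described above.

```python
import heapq
from typing import Callable, List, Tuple

class Simulator:
    """Minimal discrete-event core: run time-stamped events until the queue empties."""
    def __init__(self) -> None:
        self.now = 0.0
        self._queue: List[Tuple[float, int, Callable[["Simulator"], None]]] = []
        self._seq = 0  # tie-breaker so equal-time events keep insertion order

    def schedule(self, delay: float, action: Callable[["Simulator"], None]) -> None:
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self) -> None:
        while self._queue:  # execute events until the event queue is emptied
            self.now, _, action = heapq.heappop(self._queue)
            action(self)

# A continuous behavior modeled discretely: a mode transition after a time delay.
sim = Simulator()
sim.schedule(5.0, lambda s: print(f"t={s.now}: pump enters 'degraded' mode"))
sim.schedule(2.0, lambda s: print(f"t={s.now}: sensor effect statement fires"))
sim.run()
```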
Simulation of rarefied low pressure RF plasma flow around the sample
NASA Astrophysics Data System (ADS)
Zheltukhin, V. S.; Shemakhin, A. Yu
2017-01-01
The paper describes a mathematical model of the flow of radio frequency plasma at low pressure. The hybrid mathematical model includes the Boltzmann equation for the neutral component of the RF plasma and the continuity and thermal equations for the charged component. Initial and boundary conditions for the corresponding equations are described. The electron temperature in the calculations is 1-4 eV, the atom temperature is (3-4)·10³ K in the plasma clot and (3.2-10)·10² K in the plasma jet, the degree of ionization is 10⁻⁷-10⁻⁵, and the electron density is 10¹⁵-10¹⁹ m⁻³. To calculate the plasma parameters, a software package was developed in the C++ programming language using the OpenFOAM library. Simulations of the flow around a sample in the vacuum chamber and of the free jet flow were carried out.
TAP 2: A finite element program for thermal analysis of convectively cooled structures
NASA Technical Reports Server (NTRS)
Thornton, E. A.
1980-01-01
A finite element computer program (TAP 2) for steady-state and transient thermal analyses of convectively cooled structures is presented. The program has a finite element library of six elements: two conduction/convection elements to model heat transfer in a solid, two convection elements to model heat transfer in a fluid, and two integrated conduction/convection elements to represent combined heat transfer in tubular and plate/fin fluid passages. Nonlinear thermal analysis due to temperature-dependent thermal parameters is performed using the Newton-Raphson iteration method. Transient analyses are performed using an implicit Crank-Nicolson time integration scheme with consistent or lumped capacitance matrices as an option. Program output includes nodal temperatures and element heat fluxes. Pressure drops in fluid passages may be computed as an option. User instructions and sample problems are presented in appendixes.
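To make the named time-integration scheme concrete, here is a minimal Crank-Nicolson step for the semi-discrete heat equation C dT/dt + K T = Q, applied to a two-node conduction element; the matrices are illustrative and unrelated to TAP 2's actual element library.

```python
import numpy as np

def crank_nicolson_step(T, K, C, dt, Q):
    """One Crank-Nicolson step of C dT/dt + K T = Q (lumped capacitance C)."""
    A = C / dt + 0.5 * K             # implicit (new-time) side
    b = (C / dt - 0.5 * K) @ T + Q   # explicit (old-time) side plus load
    return np.linalg.solve(A, b)

# Two-node conduction element: conductance 10 W/K, nodal capacitances 100 J/K.
K = np.array([[10.0, -10.0], [-10.0, 10.0]])
C = np.diag([100.0, 100.0])
T = np.array([300.0, 400.0])
for _ in range(5):
    T = crank_nicolson_step(T, K, C, dt=1.0, Q=np.zeros(2))
print(T)  # nodal temperatures relax toward a common value
```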
Mr.CAS-A minimalistic (pure) Ruby CAS for fast prototyping and code generation
NASA Astrophysics Data System (ADS)
Ragni, Matteo
There are Computer Algebra Systems (CAS) on the market with complete solutions for the manipulation of analytical models. But exporting a model that implements specific algorithms on specific platforms, for target languages or for a particular numerical library, is often a rigid procedure that requires manual post-processing. This work presents a Ruby library that exposes core CAS capabilities, i.e. simplification, substitution, evaluation, etc. The library aims at programmers who need to rapidly prototype and generate numerical code for different target languages, while keeping the mathematical expressions separate from the code generation rules, where best practices for numerical conditioning are implemented. The library is written in pure Ruby and is compatible with most Ruby interpreters.
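A minimal Python analogue of the core capabilities the abstract lists (substitution and evaluation over an expression tree) is sketched below; it is an illustrative stand-in, not Mr.CAS's Ruby API.

```python
from dataclasses import dataclass
from typing import Dict, Union

Expr = Union["Var", "Num", "Add", "Mul"]

@dataclass(frozen=True)
class Var: name: str
@dataclass(frozen=True)
class Num: value: float
@dataclass(frozen=True)
class Add: left: Expr; right: Expr
@dataclass(frozen=True)
class Mul: left: Expr; right: Expr

def substitute(e: Expr, env: Dict[str, Expr]) -> Expr:
    """Core CAS operation: replace variables by sub-expressions."""
    if isinstance(e, Var):
        return env.get(e.name, e)
    if isinstance(e, Num):
        return e
    return type(e)(substitute(e.left, env), substitute(e.right, env))

def evaluate(e: Expr, env: Dict[str, float]) -> float:
    """Numerically evaluate an expression tree against a variable binding."""
    if isinstance(e, Var):
        return env[e.name]
    if isinstance(e, Num):
        return e.value
    l, r = evaluate(e.left, env), evaluate(e.right, env)
    return l + r if isinstance(e, Add) else l * r

# (x + 2) * y, with x := 3, y := 4  ->  20.0
expr = Mul(Add(Var("x"), Num(2.0)), Var("y"))
print(evaluate(expr, {"x": 3.0, "y": 4.0}))
```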
Evaluating digital libraries in the health sector. Part 2: measuring impacts and outcomes.
Cullen, Rowena
2004-03-01
This is the second part of a two-part paper which explores methods that can be used to evaluate digital libraries in the health sector. Part 1 focuses on approaches to evaluation that have been proposed for mainstream digital information services. This paper investigates evaluative models developed for some innovative digital library projects, and some major national and international electronic health information projects. The value of ethnographic methods to provide qualitative data to explore outcomes, adding to quantitative approaches based on inputs and outputs is discussed. The paper concludes that new 'post-positivist' models of evaluation are needed to cover all the dimensions of the digital library in the health sector, and some ways of doing this are outlined.
ERIC Educational Resources Information Center
Dewey, Patrick R.
1986-01-01
The history of patron access microcomputers in libraries is described as carrying on a tradition that information and computer power should be shared. Questions that all types of libraries need to ask in planning microcomputer centers are considered and several model centers are described. (EM)
ERIC Educational Resources Information Center
McGervey, Teresa
2000-01-01
Discusses the concept of Earth's Largest Library (ELL), a mega-virtual library based on the Amazon.com model. Topics include who will be included; privacy; censorship; scope of the collection; costs; legal aspects; collection development; personnel management; access; the concept of community; public service; lending policies; technical…
Quality Management and Self Assessment Tools for Public Libraries.
ERIC Educational Resources Information Center
Evans, Margaret Kinnell
This paper describes a two-year study by the British Library Research and Innovation Centre that examined the potential of self-assessment for public library services. The approaches that formed the basis for the investigation were the Business Excellence Model, the Quality Framework, and the Democratic Approach. Core values were identified by…
Personnel for Research Libraries; Qualifications, Responsibilities and Use. Final Report.
ERIC Educational Resources Information Center
Clark, Philip M.
The project was conceived to examine the current manpower situation in research libraries and to develop a methodological model for projecting future personnel needs. Eight academic research libraries were selected for investigation and three instruments developed to gather data toward these ends. A personal interview format was used to interview…
Personalized Boutique Service: Critical to Academic Library Success?
ERIC Educational Resources Information Center
Tilley, Elizabeth
2013-01-01
An academic library that focuses on delivering a personalized service is examined within the context of the boutique library model. It is suggested that a critical success factor in adopting a personalized, boutique-style service is acquiring knowledge and insight of our users. This, together with appropriate evaluation, will assist with providing…
ERIC Educational Resources Information Center
Chao, Sheau-yueh J.; Evans, Beth; Phillips, Ryan; Polger, Mark Aaron; Posner, Beth; Sexton, Ellen
2013-01-01
This paper describes the City University of New York (CUNY)-Shanghai Librarian Faculty Exchange Program. By observing and working in academic library services at CUNY, Shanghai University (SU), and Shanghai Normal University (SNU), participants were able to teach and learn from their colleagues, bringing their experiences back to further share…
Circulating a Good Service Model at Its Core: Circulation!
ERIC Educational Resources Information Center
Hernandez, Edmee Sofia; Germain, Carol Anne, Ed.
2009-01-01
Circulation is the library's tireless foot soldier: it serves as the front gate to the library's services and resources. This service point is where most patrons enter and leave; and experience their first and last impressions--impressions that linger. In an age when academic libraries are facing meager budgets and declining usage statistics, this…
Dispensing with the DVD Circulation Dilemma
ERIC Educational Resources Information Center
Ellis, Mark
2008-01-01
Richmond Public Library (RPL) is a four-branch suburban library with the highest per capita circulation of any comparable library in Canada. While DVDs naturally fit into RPL's emphasis on popular material, circulating them using the standard model proved problematic: Long hold queues built up, DVDs idled on the hold shelves, and the circulation…
A Basic Hybrid Library Support Model to Distance Learners in Sudan
ERIC Educational Resources Information Center
Abdelrahman, Omer Hassan
2012-01-01
Distance learning has flourished in Sudan during the last two decades; more and more higher education institutions offer distance learning programmes to off-campus students. Like on-campus students, distance learners should have access to appropriate library and information support services. They also have specific needs for library and…
Questioning LibQUAL+[TM]: Expanding Its Assessment of Academic Library Effectiveness
ERIC Educational Resources Information Center
Edgar, William B.
2006-01-01
This article examines LibQUAL+[TM]'s instrument, fundamental assumption, and research approach and proposes a functional/technical model of academic library effectiveness. This expanded view of library effectiveness complements LibQUAL+[TM], emphasizing it to be dependent upon users' experience of service delivery, as LibQUAL+[TM] recognizes.…
ERIC Educational Resources Information Center
Tuai, Cameron K.
2011-01-01
The integration of librarians and technologists to deliver information services represents a new and potentially costly organizational challenge for many library administrators. To understand better how to control the costs of integration, the research presented here will use structural contingency theory to study the coordination of librarians…
Social Work Information Center 2.0: A Case Study
ERIC Educational Resources Information Center
Xu, F. Grace
2009-01-01
The social work library at USC provides a case study of an academic library's transition to an information center service model. Analysis of the collection, user community, Web 2.0 applications, and Web usage data demonstrates how the changes facilitated library services and information literacy instruction. (Contains 6 tables and 3 figures.)
Toward a User-Centered Academic Library Home Page
ERIC Educational Resources Information Center
McHale, Nina
2008-01-01
In the past decade, academic libraries have struggled with the design of an effective library home page. Since librarians' mental models of information architecture differ from those of their patrons, usability assessments are necessary in designing a user-centered home page. This study details a usability sequence of card sort and paper and…
galario: Gpu Accelerated Library for Analyzing Radio Interferometer Observations
NASA Astrophysics Data System (ADS)
Tazzari, Marco; Beaujean, Frederik; Testi, Leonardo
2017-10-01
The galario library exploits the computing power of modern graphic cards (GPUs) to accelerate the comparison of model predictions to radio interferometer observations. It speeds up the computation of the synthetic visibilities given a model image (or an axisymmetric brightness profile) and their comparison to the observations.
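Conceptually, the accelerated operation is: Fourier-transform the model image, sample the transform at the observed (u, v) points, and accumulate a chi-square against the measured visibilities. The NumPy sketch below shows that pipeline with crude nearest-grid-point sampling; it is a stand-in for, not a reproduction of, galario's interpolating GPU implementation.

```python
import numpy as np

def chi2_visibilities(image, dxy, u, v, vis_obs, weights):
    """FFT a model image, sample it at (u, v) [in wavelengths], compare to data."""
    n = image.shape[0]
    vis_grid = np.fft.fftshift(np.fft.fft2(np.fft.fftshift(image))) * dxy**2
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=dxy))
    # Nearest-grid-point sampling; galario interpolates, this is a crude stand-in.
    iu = np.clip(np.searchsorted(freqs, u), 0, n - 1)
    iv = np.clip(np.searchsorted(freqs, v), 0, n - 1)
    vis_model = vis_grid[iv, iu]
    return float(np.sum(weights * np.abs(vis_obs - vis_model) ** 2))

# Toy example: a Gaussian source observed at two (u, v) points.
n, dxy = 256, 1e-7  # pixels, pixel size in radians
x = (np.arange(n) - n / 2) * dxy
X, Y = np.meshgrid(x, x)
image = np.exp(-(X**2 + Y**2) / (2 * (5 * dxy) ** 2))
u = np.array([1e5, 2e5]); v = np.array([0.0, 1e5])
vis_obs = np.ones(2, dtype=complex)
print(chi2_visibilities(image, dxy, u, v, vis_obs, np.ones(2)))
```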
Model Acts and Regulations. Metrodocs Monograph Two.
ERIC Educational Resources Information Center
Petty, Johnese, Comp.
Metrodocs, an informal library cooperative in the Oklahoma City Metropolitan area consisting of eight academic and public depository libraries, produced this compilation of model acts, legislation, statutes, and ordinances--i.e., those that have been promulgated in order to satisfy a demand for legislation covering a particular subject in a…
Virtual Libraries: Interactive Support Software and an Application in Chaotic Models.
ERIC Educational Resources Information Center
Katsirikou, Anthi; Skiadas, Christos; Apostolou, Apostolos; Rompogiannakis, Giannis
This paper begins with a discussion of the characteristics and the singularity of chaotic systems, including dynamic systems theory, chaotic orbit, fractals, chaotic attractors, and characteristics of chaotic systems. The second section addresses the digital libraries (DL) concept and the appropriateness of chaotic models, including definition and…
Virtual Reference, Real Money: Modeling Costs in Virtual Reference Services
ERIC Educational Resources Information Center
Eakin, Lori; Pomerantz, Jeffrey
2009-01-01
Libraries nationwide are in yet another phase of belt tightening. Without an understanding of the economic factors that influence library operations, however, controlling costs and performing cost-benefit analyses on services is difficult. This paper describes a project to develop a cost model for collaborative virtual reference services. This…
NASA Astrophysics Data System (ADS)
Chen, Xuelong; Su, Bob
2017-04-01
Remote sensing provides an opportunity to observe the Earth's land surface at a much higher resolution than any GCM simulation. Due to the scarcity of information on land surface physical parameters, up-to-date GCMs still have large uncertainties in coupled land surface process modeling. One critical issue is the large number of parameters used in their land surface models. Remote sensing of land surface spectral information can thus be used to provide information on these parameters, or be assimilated to decrease the model uncertainties. Satellite imagers observe the Earth's land surface in optical, thermal, and microwave bands. Some basic land surface state variables (land surface temperature, canopy height, canopy leaf area index, soil moisture, etc.) have been produced with remote sensing techniques, which already help scientists understand land-atmosphere interaction more precisely. However, there are some challenges when applying remote sensing variables to calculate global land-air heat and water exchange fluxes. Firstly, a global turbulent exchange parameterization scheme needs to be developed and verified, especially for the calculation of global momentum and heat roughness lengths from remote sensing information. Secondly, a compromise needs to be devised to overcome the spatial-temporal gaps in remote sensing variables, to make remote-sensing-based land surface fluxes applicable for GCM model verification or comparison. A flux network data library (more than 200 flux towers) was collected to verify the designed method. Important progress in remote sensing of global land flux and evaporation will be presented, and its benefits for GCM models will also be discussed. Some in-situ studies on the Tibetan Plateau and problems of land surface process simulation will also be discussed.
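As an example of the bulk-transfer form such a turbulent exchange scheme evaluates, sensible heat flux is commonly written H = ρ c_p C_H U (T_s - T_a), where the exchange coefficient C_H encodes the momentum and heat roughness lengths the abstract highlights. A minimal sketch with hypothetical inputs:

```python
def sensible_heat_flux(ts, ta, wind, ch, rho=1.2, cp=1004.0):
    """Bulk-transfer sensible heat flux H = rho * cp * C_H * U * (Ts - Ta), W m-2."""
    return rho * cp * ch * wind * (ts - ta)

# Hypothetical inputs: LST 305 K, air temperature 298 K, 3 m/s wind, and
# C_H = 2e-3, a value that depends on the roughness lengths the scheme supplies.
print(f"{sensible_heat_flux(305.0, 298.0, 3.0, 2e-3):.1f} W/m2")  # ~50.6
```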
Study of XAFS of some Fe compounds and determination of first shell radial distance
NASA Astrophysics Data System (ADS)
Parsai, Neetu; Mishra, Ashutosh
2017-05-01
The X-ray absorption fine structure (XAFS) of some Fe compounds has been studied using the latest XAFS analysis software, Demeter with Strawberry Perl. The processed XAFS data of the Fe compounds were taken from an available model compound library. The XAFS data were processed to plot the µ(E) versus E spectra. These spectra were converted into k-space, R-space, and q-space. The R-space spectra were used to obtain the first-shell radial distance in the Fe compounds. Structural parameters like the first-shell radial distance are useful in determining bond lengths in Fe compounds. Hence the study plays an important role in biological applications.
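The k-space to R-space step is a windowed Fourier transform of the k-weighted fine structure χ(k); the position of the first peak in |χ(R)| indicates the (phase-shift-uncorrected) first-shell radial distance. A minimal NumPy sketch with a synthetic single-shell signal, standing in for what Demeter performs:

```python
import numpy as np

def chi_to_r_space(k, chi, kweight=2, rmax=6.0, nfft=2048):
    """Fourier transform k-weighted chi(k) to R space; peak ~ first-shell distance."""
    dk = k[1] - k[0]
    window = np.hanning(len(k))          # simple apodization window
    signal = chi * k**kweight * window
    chir = np.fft.fft(signal, n=nfft) * dk / np.sqrt(np.pi)
    r = np.pi * np.arange(nfft // 2) / (dk * nfft)
    mask = r <= rmax
    return r[mask], np.abs(chir[: nfft // 2])[mask]

# Synthetic single-shell chi(k) with an apparent 2.0 Angstrom path length.
k = np.linspace(2.0, 12.0, 400)
chi = np.sin(2 * 2.0 * k) / k**2
r, mag = chi_to_r_space(k, chi)
print(f"first-shell peak near R = {r[np.argmax(mag)]:.2f} Angstrom")
```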
Tools for Modeling & Simulation of Molecular and Nanoelectronics Devices
2012-06-14
A prototype DFT simulation software was implemented using two different open-source Finite Element (FE) libraries: deal.II and FEniCS. In the first part of this Phase I project, these two candidate finite element libraries, deal.II and FEniCS/dolfin, were investigated for use as back-ends to a finite element DFT in ATK (Quantum Insight and QuantumWise A/S, October 2011).
ERIC Educational Resources Information Center
Bollen, Johan; Vemulapalli, Soma Sekara; Xu, Weining; Luce, Rick; Marcum, Deanna; Friedlander, Amy; Tenopir, Carol; Grayson, Matt; Zhang, Yan; Ebuen, Mercy; King, Donald W.; Boyce, Peter; Rogers, Clare; Kirriemuir, John; Tanner, Simon; Deegan, Marilyn; Marcum, James W.
2003-01-01
Includes six articles that discuss use analysis and research trends in digital libraries; library history and digital preservation; journal use by scientists; a content management system-based Web site for higher education in the United Kingdom; cost studies for transitioning to digitized collections in European cultural institutions; and the…
Simple Activities for Powerful Impact
NASA Astrophysics Data System (ADS)
LaConte, K.; Shupla, C. B.; Dusenbery, P.; Harold, J. B.; Holland, A.
2016-12-01
STEM education is having a transformational impact on libraries across the country. The STAR Library Education Network (STAR_Net) provides free Science-Technology Activities & Resources that are helping libraries to engage their communities in STEM learning experiences. Hear the results of a national 2015 survey of library and STEM professionals and learn what STEM programming is currently in place in public libraries and how libraries approach and implement STEM programs. Experience hands-on space science activities that are being used in library programs with multiple age groups. Through these hands-on activities, learners explore the nature of science and employ science and engineering practices, including developing and using models, planning and carrying out investigations, and engaging in argument from evidence (NGSS Lead States, 2013). Learn how STAR_Net can help you print (free!) mini-exhibits and educator guides. Join STAR_Net's online community and access STEM resources and webinars to work with libraries in your local community.
An innovative use of instant messaging technology to support a library's single-service point.
Horne, Andrea S; Ragon, Bart; Wilson, Daniel T
2012-01-01
A library service model that provides reference and instructional services by summoning reference librarians from a single service point is described. The system utilizes Libraryh3lp, an open-source, multioperator instant messaging system. The selection and refinement of this solution and technical challenges encountered are explored, as is the design of public services around this technology, usage of the system, and best practices. This service model, while a major cultural and procedural change at first, is now a routine aspect of customer service for this library.
Sustaining librarian vitality: embedded librarianship model for health sciences libraries.
Wu, Lin; Mi, Misa
2013-01-01
With biomedical information widely accessible from anywhere at any time, health sciences libraries have become less centralized, and they are challenged to stay relevant and vital to the mission and strategic goals of their home institution. One solution is to embed librarians at strategic points in health professions' education, research, and patient care. This article discusses a proposed five-level model of embedded librarianship within the context of health sciences libraries and describes different roles, knowledge, and skills desirable for health sciences librarians working as embedded librarians.
2007-12-01
This work prototypes an architectural design for an agent-based modeling library that is generalizable, reusable, and extensible. An initial set of model elements, including a Behavioral model, was created, and a small agent-based model was built using the component architecture to demonstrate the library's functionality.
NASA Astrophysics Data System (ADS)
Verbeke, Jérôme M.; Petit, Odile; Chebboubi, Abdelhazize; Litaize, Olivier
2018-01-01
Fission modeling in general-purpose Monte Carlo transport codes often relies on average nuclear data provided by international evaluation libraries. As such, only average fission multiplicities are available, and correlations between fission neutrons and photons are missing. Whereas uncorrelated fission physics is usually sufficient for standard reactor core and radiation shielding calculations, correlated fission secondaries are required for specialized nuclear instrumentation and detector modeling. For coincidence counting detector optimization, for instance, precise simulation of fission neutrons and photons that remain correlated in time from birth to detection is essential. New developments were recently integrated into the Monte Carlo transport code TRIPOLI-4 to model fission physics more precisely, the purpose being to access fission events event by event from two different fission models: FREYA and FIFRELIN. TRIPOLI-4 simulations can now be performed either by connecting via an API to the LLNL fission library, including FREYA, or by reading external fission event data files produced by FIFRELIN beforehand. These new capabilities enable us to easily compare results from Monte Carlo transport calculations using the two fission models in a nuclear instrumentation application. In the first part of this paper, the broad underlying principles of the two fission models are recalled. We then present experimental measurements of neutron angular correlations for 252Cf(sf) and 240Pu(sf). The correlations were measured for several neutron kinetic energy thresholds. In the latter part of the paper, simulation results are compared to experimental data. Spontaneous fissions in 252Cf and 240Pu are modeled by FREYA or FIFRELIN. Emitted neutrons and photons are subsequently transported to an array of scintillators by TRIPOLI-4 in analog mode to preserve their correlations. Angular correlations between fission neutrons obtained independently from these TRIPOLI-4 simulations, using either FREYA or FIFRELIN, are compared to experimental results. For 240Pu(sf), the measured correlations were used to tune the model parameters.
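On the analysis side, the angular correlation is built by histogramming the opening angle between fission-neutron pairs that pass a kinetic-energy threshold. A minimal sketch over hypothetical event-by-event data (not the FREYA or FIFRELIN output formats):

```python
import numpy as np

def pair_opening_angles(events, e_threshold):
    """Cosine of the opening angle for all neutron pairs above a KE threshold.

    events: list of (energies, directions) per fission; directions are unit vectors.
    """
    cosines = []
    for energies, directions in events:
        keep = directions[np.asarray(energies) >= e_threshold]
        for i in range(len(keep)):
            for j in range(i + 1, len(keep)):
                cosines.append(float(np.dot(keep[i], keep[j])))
    return np.asarray(cosines)

# Two toy fission events: (neutron energies in MeV, emission unit vectors).
events = [
    (np.array([1.2, 2.5]), np.array([[0, 0, 1.0], [0, 0, -1.0]])),
    (np.array([0.4, 1.8, 3.0]),
     np.array([[1.0, 0, 0], [0, 1.0, 0], [-1.0, 0, 0]])),
]
cos_th = pair_opening_angles(events, e_threshold=0.5)
hist, edges = np.histogram(cos_th, bins=4, range=(-1, 1))
print(hist)  # back-to-back pairs pile up near cos(theta) = -1
```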
A Python Interface for the Dakota Iterative Systems Analysis Toolkit
NASA Astrophysics Data System (ADS)
Piper, M.; Hutton, E.; Syvitski, J. P.
2016-12-01
Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user must not only understand the structure and syntax of the Dakota input file, but also develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented, with examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method. Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and cumulative distribution functions for the response.
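A central piece of such a setup is the analysis driver Dakota calls for each sample: it reads a parameters file, runs the model, and writes a results file. The sketch below shows that pattern with a stand-in response in place of HydroTrend; the parameter-file parsing assumes Dakota's standard (non-aprepro) format, and the driver is illustrative rather than the CDI's actual generic driver.

```python
#!/usr/bin/env python
"""Minimal Dakota-style analysis driver (illustrative, not the CDI's driver).

Dakota invokes the driver as: driver.py <params_file> <results_file>.
"""
import sys

def read_params(path):
    """Pull 'value descriptor' pairs from a standard-format params file."""
    params = {}
    with open(path) as f:
        n = int(f.readline().split()[0])  # header line: '<n> variables'
        for _ in range(n):
            value, name = f.readline().split()[:2]
            params[name] = float(value)
    return params

def model(T, P):
    """Stand-in response: suspended sediment load as a function of T and P."""
    return 120.0 + 8.0 * T + 45.0 * P

if __name__ == "__main__":
    p = read_params(sys.argv[1])
    qs = model(p["T"], p["P"])          # 'T' and 'P' are hypothetical descriptors
    with open(sys.argv[2], "w") as f:
        f.write(f"{qs} Qs\n")           # one response value per line
```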
NASA Astrophysics Data System (ADS)
Wankhede, Mamta
Functional vasculature is vital for tumor growth, proliferation, and metastasis. Many tumor-specific vascular targeting agents (VTAs) aim to destroy this essential tumor vasculature to induce indirect tumor cell death via oxygen and nutrient deprivation. The tumor angiogenesis-inhibiting anti-angiogenics (AIs) and the established tumor vessel targeting vascular disrupting agents (VDAs) are the two major players in the vascular targeting field. Combinations of VTAs with conventional therapies, or with each other, have been shown to have additive or supra-additive effects on tumor control and treatment. Pathophysiological changes post-VTA treatment, in terms of structural and vessel function changes, are important parameters for characterizing treatment efficacy. Despite the abundance of information regarding these parameters acquired using various techniques, there remains a need for quantitative, real-time, and direct observation of these phenomena in live animals. Through this research we aspired to develop a spectral imaging-based mouse tumor system for real-time in vivo measurements of microvessel structure and function for VTA characterization. A model tumor system for window chamber studies was identified, and the combinatorial effects of a VDA and an AI were then characterized in this model tumor system. (Full text of this dissertation may be available via the University of Florida Libraries web site. Please check http://www.uflib.ufl.edu/etd.html)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Da Cruz, D. F.; Rochman, D.; Koning, A. J.
2012-07-01
This paper discusses the uncertainty analysis on reactivity and inventory for a typical PWR fuel element as a result of uncertainties in 235,238U nuclear data. A typical Westinghouse 3-loop fuel assembly fuelled with UO2 fuel with 4.8% enrichment has been selected. The Total Monte Carlo (TMC) method has been applied using the deterministic transport code DRAGON. This code allows the generation of the few-groups nuclear data libraries by directly using data contained in the nuclear data evaluation files. The nuclear data used in this study are from the JEFF3.1 evaluation, and the nuclear data files for 238U and 235U (randomized for the generation of the various DRAGON libraries) are taken from the nuclear data library TENDL. The total uncertainty (obtained by randomizing all 238U and 235U nuclear data in the ENDF files) on the reactor parameters has been split into different components (different nuclear reaction channels). Results show that the TMC method in combination with a deterministic transport code constitutes a powerful tool for performing uncertainty and sensitivity analysis of reactor physics parameters. (authors)
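In TMC, the nuclear-data uncertainty is read off the spread of the output quantity over the runs with randomized libraries (for a Monte Carlo transport code, the statistical component is first subtracted in quadrature; it vanishes for a deterministic code like DRAGON). A minimal sketch with hypothetical k-eff values:

```python
import numpy as np

# Hypothetical k-eff results, one per DRAGON run with a randomized TENDL library.
k_eff = np.array([1.31207, 1.30954, 1.31488, 1.31102, 1.31330, 1.30871])
sigma_stat = 0.0  # deterministic runs carry no Monte Carlo statistical noise

spread = k_eff.std(ddof=1)
sigma_nd = np.sqrt(max(spread**2 - sigma_stat**2, 0.0))  # quadrature subtraction
print(f"k-eff = {k_eff.mean():.5f} +/- {sigma_nd:.5f} (nuclear-data uncertainty)")
```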
Statistical molecular design of balanced compound libraries for QSAR modeling.
Linusson, A; Elofsson, M; Andersson, I E; Dahlgren, M K
2010-01-01
A fundamental step in preclinical drug development is the computation of quantitative structure-activity relationship (QSAR) models, i.e. models that link chemical features of compounds with activities towards a target macromolecule associated with the initiation or progression of a disease. QSAR models are computed by combining information on the physicochemical and structural features of a library of congeneric compounds, typically assembled from two or more building blocks, and biological data from one or more in vitro assays. Since the models provide information on features affecting the compounds' biological activity they can be used as guides for further optimization. However, in order for a QSAR model to be relevant to the targeted disease, and drug development in general, the compound library used must contain molecules with balanced variation of the features spanning the chemical space believed to be important for interaction with the biological target. In addition, the assays used must be robust and deliver high quality data that are directly related to the function of the biological target and the associated disease state. In this review, we discuss and exemplify the concept of statistical molecular design (SMD) in the selection of building blocks and final synthetic targets (i.e. compounds to synthesize) to generate information-rich, balanced libraries for biological testing and computation of QSAR models.
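One common SMD recipe is to compress the descriptor table by principal component analysis and then pick a space-filling subset of building blocks in the score space. The sketch below implements that recipe with a max-min distance rule over random, hypothetical descriptors; it is one illustrative variant, not the specific designs reviewed here.

```python
import numpy as np

def select_balanced_subset(X, k):
    """Pick k building blocks spread evenly over descriptor space.

    PCA (via SVD) compresses correlated descriptors; a max-min distance
    rule then selects a space-filling, balanced subset.
    """
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, _ = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :2] * S[:2]  # coordinates on the first two principal components
    # Start from the most central candidate, then repeatedly add the candidate
    # farthest from everything chosen so far.
    chosen = [int(np.argmin(np.linalg.norm(scores - scores.mean(axis=0), axis=1)))]
    while len(chosen) < k:
        d = np.linalg.norm(scores[:, None, :] - scores[chosen], axis=2).min(axis=1)
        chosen.append(int(np.argmax(d)))
    return chosen

# Hypothetical descriptor table (MW, logP, PSA, ...) for 8 candidate blocks.
rng = np.random.default_rng(1)
descriptors = rng.normal(size=(8, 5))
print(select_balanced_subset(descriptors, k=3))
```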