Sample records for density parameter library

  1. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    NASA Astrophysics Data System (ADS)

    Capote, R.; Herman, M.; Obložinský, P.; Young, P. G.; Goriely, S.; Belgya, T.; Ignatyuk, A. V.; Koning, A. J.; Hilaire, S.; Plujko, V. A.; Avrigeanu, M.; Bersillon, O.; Chadwick, M. B.; Fukahori, T.; Ge, Zhigang; Han, Yinlu; Kailas, S.; Kopecky, J.; Maslov, V. M.; Reffo, G.; Sin, M.; Soukhovitskii, E. Sh.; Talou, P.

    2009-12-01

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input; the International Atomic Energy Agency (IAEA) has therefore worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models, and microscopic calculations based on a realistic microscopic single-particle level scheme. Partial level-density formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.
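
The phenomenological level-density models in the LEVEL DENSITIES segment can be illustrated with the standard back-shifted Fermi gas formula, ρ(U) = exp(2√(aU)) / (12√2 σ a^(1/4) U^(5/4)) with U = E − Δ. The sketch below uses invented parameter values for a mid-mass nucleus purely for illustration; RIPL tabulates evaluated values instead.

```python
import math

def fermi_gas_level_density(E, a, delta, sigma):
    """Back-shifted Fermi gas total level density (levels per MeV).

    E     : excitation energy (MeV)
    a     : level-density parameter (1/MeV)
    delta : back-shift (pairing) energy (MeV)
    sigma : spin cut-off parameter
    """
    U = E - delta
    if U <= 0.0:
        return 0.0  # below the back-shifted threshold
    return math.exp(2.0 * math.sqrt(a * U)) / (
        12.0 * math.sqrt(2.0) * sigma * a ** 0.25 * U ** 1.25)

# Illustrative (not evaluated) parameters for a mid-mass nucleus:
rho = fermi_gas_level_density(E=8.0, a=12.0, delta=0.5, sigma=4.0)
```

The density grows roughly exponentially with √(aU), which is why evaluated level densities must be pinned to both discrete levels and neutron-resonance spacings, as the abstract notes.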

  2. RIPL - Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capote, R.; Herman, M.; Oblozinsky, P.

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input; the International Atomic Energy Agency (IAEA) has therefore worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models, and microscopic calculations based on a realistic microscopic single-particle level scheme. Partial level-density formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.

  3. RIPL-Reference Input Parameter Library for Calculation of Nuclear Reactions and Nuclear Data Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Capote, R.; Herman, M.

    We describe the physics and data included in the Reference Input Parameter Library, which is devoted to input parameters needed in calculations of nuclear reactions and nuclear data evaluations. Advanced modelling codes require substantial numerical input; the International Atomic Energy Agency (IAEA) has therefore worked extensively since 1993 on a library of validated nuclear-model input parameters, referred to as the Reference Input Parameter Library (RIPL). A final RIPL coordinated research project (RIPL-3) was brought to a successful conclusion in December 2008, after 15 years of challenging work carried out through three consecutive IAEA projects. The RIPL-3 library was released in January 2009, and is available on the Web through http://www-nds.iaea.org/RIPL-3/. This work and the resulting database are extremely important to theoreticians involved in the development and use of nuclear reaction modelling (ALICE, EMPIRE, GNASH, UNF, TALYS) both for theoretical research and nuclear data evaluations. The numerical data and computer codes included in RIPL-3 are arranged in seven segments: MASSES contains ground-state properties for about 9000 nuclei, including three theoretical predictions of masses and the evaluated experimental masses of Audi et al. (2003). DISCRETE LEVELS contains 117 datasets (one for each element) with all known level schemes, electromagnetic and γ-ray decay probabilities available from ENSDF in October 2007. NEUTRON RESONANCES contains average resonance parameters prepared on the basis of the evaluations performed by Ignatyuk and Mughabghab. OPTICAL MODEL contains 495 sets of phenomenological optical model parameters defined in a wide energy range. When there are insufficient experimental data, the evaluator has to resort to either global parameterizations or microscopic approaches. Radial density distributions to be used as input for microscopic calculations are stored in the MASSES segment. LEVEL DENSITIES contains phenomenological parameterizations based on the modified Fermi gas and superfluid models, and microscopic calculations based on a realistic microscopic single-particle level scheme. Partial level-density formulae are also recommended. All tabulated total level densities are consistent with both the recommended average neutron resonance parameters and discrete levels. GAMMA contains parameters that quantify giant resonances, experimental gamma-ray strength functions and methods for calculating gamma emission in statistical model codes. The experimental GDR parameters are represented by Lorentzian fits to the photo-absorption cross sections for 102 nuclides ranging from 51V to 239Pu. FISSION includes global prescriptions for fission barriers and nuclear level densities at fission saddle points based on microscopic HFB calculations constrained by experimental fission cross sections.

  4. DiSCaMB: a software library for aspherical atom model X-ray scattering factor calculations with CPUs and GPUs.

    PubMed

    Chodkiewicz, Michał L; Migacz, Szymon; Rudnicki, Witold; Makal, Anna; Kalinowski, Jarosław A; Moriarty, Nigel W; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Adams, Paul D; Dominiak, Paulina Maria

    2018-02-01

    It has been recently established that the accuracy of structural parameters from X-ray refinement of crystal structures can be improved by using a bank of aspherical pseudoatoms instead of the classical spherical model of atomic form factors. This comes, however, at the cost of increased complexity of the underlying calculations. In order to facilitate the adoption of this more advanced electron density model by the broader community of crystallographers, a new software implementation called DiSCaMB, 'densities in structural chemistry and molecular biology', has been developed. It addresses the challenge of providing high performance on modern computing architectures. With parallelization options for both multi-core processors and graphics processing units (using CUDA), the library features calculation of X-ray scattering factors and their derivatives with respect to structural parameters, gives access to intermediate steps of the scattering factor calculations (thus allowing for experimentation with modifications of the underlying electron density model), and provides tools for basic structural crystallographic operations. Permissively (MIT) licensed, DiSCaMB is an open-source C++ library that can be embedded in both academic and commercial tools for X-ray structure refinement.

  5. Impact of New Nuclear Data Libraries on Small Sized Long Life CANDLE HTGR Design Parameters

    NASA Astrophysics Data System (ADS)

    Liem, Peng Hong; Hartanto, Donny; Tran, Hoai Nam

    2017-01-01

    The impact of new evaluated nuclear data libraries (JENDL-4.0, ENDF/B-VII.0 and JEFF-3.1) on the core characteristics of small-sized long-life CANDLE High Temperature Gas-Cooled Reactors (HTGRs) with uranium and thorium fuel cycles was investigated. The most important parameters of the CANDLE core characteristics investigated here covered (1) the infinite multiplication factor of the fresh fuel containing burnable poison, (2) the effective multiplication factor of the equilibrium core, (3) the moving velocity of the burning region, (4) the attained discharge burnup, and (5) the maximum power density. The reference case was taken from the current JENDL-3.3 results. For the uranium fuel cycle, the impact of the new libraries was small, while a significant impact was found for the thorium fuel cycle. The findings indicate the need for more accurate nuclear data libraries for nuclides involved in the thorium fuel cycle.

  6. An Integrated Nonlinear Analysis library - (INA) for solar system plasma turbulence

    NASA Astrophysics Data System (ADS)

    Munteanu, Costel; Kovacs, Peter; Echim, Marius; Koppan, Andras

    2014-05-01

    We present an integrated software library dedicated to the analysis of time series recorded in space and adapted to investigate turbulence, intermittency and multifractals. The library is written in MATLAB and provides a graphical user interface (GUI) customized for the analysis of space physics data available online, such as: Coordinated Data Analysis Web (CDAWeb), Automated Multi Dataset Analysis system (AMDA), Planetary Science Archive (PSA), World Data Center Kyoto (WDC), Ulysses Final Archive (UFA) and Cluster Active Archive (CAA). Three main modules are already implemented in INA: the Power Spectral Density (PSD) analysis, the Wavelet and Intermittency analysis, and the Probability Density Function (PDF) analysis. The layered structure of the software allows the user to easily switch between different modules/methods while retaining the same time interval for the analysis. The wavelet analysis module includes algorithms to compute and analyse the PSD, the Scalogram, the Local Intermittency Measure (LIM) or the Flatness parameter. The PDF analysis module includes algorithms for computing the PDFs for a range of scales and parameters fully customizable by the user; it also computes the Flatness parameter and enables fast comparison with standard PDF profiles such as the Gaussian PDF. The library has already been tested on Cluster and Venus Express data and we will show relevant examples. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM, and a grant of the Romanian Ministry of National Education, CNCS UEFISCDI, project number PN-II-ID PCE-2012-4-0418.
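
Two of the quantities INA computes, a periodogram estimate of the PSD and the scale-dependent flatness of signal increments (a standard intermittency diagnostic), can be sketched in a few lines. This is a generic illustration of the methods named above, not INA's MATLAB implementation; the synthetic Gaussian signal is an assumption.

```python
import numpy as np

def periodogram_psd(x, dt=1.0):
    """Simple periodogram estimate of the power spectral density."""
    n = len(x)
    X = np.fft.rfft(x - np.mean(x))          # remove the mean first
    freqs = np.fft.rfftfreq(n, d=dt)
    psd = (np.abs(X) ** 2) * dt / n          # one common normalization
    return freqs, psd

def flatness(x, scale):
    """Flatness (normalized fourth moment) of increments at a given scale.

    Values near 3 indicate Gaussian statistics; significantly larger
    values signal intermittency."""
    dx = x[scale:] - x[:-scale]
    return np.mean(dx ** 4) / np.mean(dx ** 2) ** 2

rng = np.random.default_rng(0)
signal = rng.standard_normal(4096)           # synthetic non-intermittent signal
freqs, psd = periodogram_psd(signal)
F = flatness(signal, scale=4)                # expect a value close to 3
```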

  7. Impact of nuclear data on sodium-cooled fast reactor calculations

    NASA Astrophysics Data System (ADS)

    Aures, Alexander; Bostelmann, Friederike; Zwermann, Winfried; Velkov, Kiril

    2016-03-01

    Neutron transport and depletion calculations are performed in combination with various nuclear data libraries in order to assess the impact of nuclear data on safety-relevant parameters of sodium-cooled fast reactors. These calculations are supplemented by systematic uncertainty analyses with respect to nuclear data. Analysed quantities are the multiplication factor and nuclide densities as a function of burn-up, and the Doppler and Na-void reactivity coefficients at beginning of cycle. While ENDF/B-VII.0 and ENDF/B-VII.1 yield rather consistent results, larger discrepancies are observed between the JEFF libraries. The newest evaluation, JEFF-3.2, agrees with the ENDF/B-VII libraries, whereas JEFF-3.1.2 yields significantly larger multiplication factors.

  8. Biomolecular Force Field Parameterization via Atoms-in-Molecule Electron Density Partitioning.

    PubMed

    Cole, Daniel J; Vilseck, Jonah Z; Tirado-Rives, Julian; Payne, Mike C; Jorgensen, William L

    2016-05-10

    Molecular mechanics force fields, which are commonly used in biomolecular modeling and computer-aided drug design, typically treat nonbonded interactions using a limited library of empirical parameters that are developed for small molecules. This approach does not account for polarization in larger molecules or proteins, and the parametrization process is labor-intensive. Using linear-scaling density functional theory and atoms-in-molecule electron density partitioning, environment-specific charges and Lennard-Jones parameters are derived directly from quantum mechanical calculations for use in biomolecular modeling of organic and biomolecular systems. The proposed methods significantly reduce the number of empirical parameters needed to construct molecular mechanics force fields, naturally include polarization effects in charge and Lennard-Jones parameters, and scale well to systems comprised of thousands of atoms, including proteins. The feasibility and benefits of this approach are demonstrated by computing free energies of hydration, properties of pure liquids, and the relative binding free energies of indole and benzofuran to the L99A mutant of T4 lysozyme.
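
As a hedged sketch of the volume-partitioning idea, the Tkatchenko-Scheffler-style relations below rescale free-atom dispersion properties by the atoms-in-molecule (AIM) effective-volume ratio. These relations and the carbon-like placeholder numbers are illustrative assumptions, not necessarily the paper's exact parameterization scheme.

```python
def scale_dispersion(c6_free, alpha_free, v_eff, v_free):
    """Scale free-atom dispersion properties by the AIM volume ratio.

    c6_free    : free-atom C6 dispersion coefficient (a.u.)
    alpha_free : free-atom polarizability (a.u.)
    v_eff      : AIM effective atomic volume in the molecule
    v_free     : free-atom reference volume
    """
    ratio = v_eff / v_free
    return ratio ** 2 * c6_free, ratio * alpha_free

def lj_well_depth(c6_eff, sigma):
    """For a 12-6 Lennard-Jones potential the -C6/r^6 tail fixes the well
    depth once sigma is chosen, since C6 = 4 * eps * sigma**6."""
    return c6_eff / (4.0 * sigma ** 6)

# Carbon-like placeholder numbers (atomic units; invented for illustration):
c6_eff, alpha_eff = scale_dispersion(c6_free=46.6, alpha_free=12.0,
                                     v_eff=38.0, v_free=35.0)
```

An atom compressed by its molecular environment (v_eff < v_free) thus gets a smaller C6 and polarizability, which is how environment-specific nonbonded parameters emerge without per-molecule fitting.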

  9. A modern Monte Carlo investigation of the TG-43 dosimetry parameters for an {sup 125}I seed already having AAPM consensus data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aryal, Prakash; Molloy, Janelle A.; Rivard, Mark J., E-mail: mark.j.rivard@gmail.com

    2014-02-15

    Purpose: To investigate potential causes for differences in TG-43 brachytherapy dosimetry parameters in the existing literature for the model IAI-125A 125I seed and to propose new standard dosimetry parameters. Methods: The MCNP5 code was used for Monte Carlo (MC) simulations. Sensitivity of dose distributions, and subsequently TG-43 dosimetry parameters, was explored to reproduce historical methods upon which American Association of Physicists in Medicine (AAPM) consensus data are based. Twelve simulation conditions varying 125I coating thickness, coating mass density, photon interaction cross-section library, and photon emission spectrum were examined. Results: Varying 125I coating thickness, coating mass density, photon cross-section library, and photon emission spectrum for the model IAI-125A seed changed the dose-rate constant by up to 0.9%, about 1%, about 3%, and 3%, respectively, in comparison to the proposed standard value of 0.922 cGy h−1 U−1. The dose-rate constant values by Solberg et al. ["Dosimetric parameters of three new solid core 125I brachytherapy sources," J. Appl. Clin. Med. Phys. 3, 119–134 (2002)], Meigooni et al. ["Experimental and theoretical determination of dosimetric characteristics of IsoAid ADVANTAGE™ 125I brachytherapy source," Med. Phys. 29, 2152–2158 (2002)], and Taylor and Rogers ["An EGSnrc Monte Carlo-calculated database of TG-43 parameters," Med. Phys. 35, 4228–4241 (2008)] for the model IAI-125A seed and Kennedy et al. ["Experimental and Monte Carlo determination of the TG-43 dosimetric parameters for the model 9011 THINSeed™ brachytherapy source," Med. Phys. 37, 1681–1688 (2010)] for the model 6711 seed were +4.3% (0.962 cGy h−1 U−1), +6.2% (0.98 cGy h−1 U−1), +0.3% (0.925 cGy h−1 U−1), and −0.2% (0.921 cGy h−1 U−1), respectively, in comparison to the proposed standard value. Differences in the radial dose functions between the current study and both Solberg et al. and Meigooni et al. were <10% for r ≤ 5 cm, and increased for r > 5 cm with a maximum difference of 29% at r = 9 cm. In comparison to Taylor and Rogers, these differences were lower (maximum of 2% at r = 9 cm). For the similarly designed model 6711 125I seed, differences did not exceed 0.5% for 0.5 ≤ r ≤ 10 cm. Radial dose function values varied by 1% as coating thickness and coating density were changed. Varying the cross-section library and source spectrum altered the radial dose function by 25% and 12%, respectively, but these differences occurred at r = 10 cm where the dose rates were very low. The 2D anisotropy function results were most similar to those of Solberg et al. and most different from those of Meigooni et al. The observed order of simulation condition variables from most to least important for influencing the 2D anisotropy function was spectrum, coating thickness, coating density, and cross-section library. Conclusions: Several MC radiation transport codes are available for calculation of the TG-43 dosimetry parameters for brachytherapy seeds. The physics models in these codes and their related cross-section libraries have been updated and improved since publication of the 2007 AAPM TG-43U1S1 report. Results using modern data indicated statistically significant differences in these dosimetry parameters in comparison to data recommended in the TG-43U1S1 report. Therefore, professional societies such as the AAPM should consider reevaluating the consensus data for this and other seeds and establishing a process of regular evaluations in which consensus data are based upon methods that remain state-of-the-art.
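
The quantities discussed above (dose-rate constant, radial dose function, anisotropy) combine in the TG-43 1D point-source formalism, D(r) = S_K Λ (r0/r)² g(r) φan(r). The sketch below uses the proposed dose-rate constant of 0.922 cGy h−1 U−1, but the g(r) and φan(r) tables are invented, non-consensus numbers purely for illustration.

```python
import numpy as np

# Illustrative (NOT consensus) radial dose function g(r) and 1D anisotropy
# factor phi_an(r) for a low-energy seed; real values come from TG-43 reports.
R_TAB   = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 7.0, 10.0])   # cm
G_TAB   = np.array([1.08, 1.00, 0.83, 0.67, 0.40, 0.24, 0.12])
PHI_TAB = np.array([0.97, 0.95, 0.93, 0.93, 0.92, 0.91, 0.90])

def dose_rate_1d(r, sk, lam, r0=1.0):
    """TG-43 1D formalism with a point-source geometry function (1/r^2).

    r   : radial distance (cm)
    sk  : air-kerma strength (U)
    lam : dose-rate constant (cGy / (h U))
    Returns the dose rate in cGy/h."""
    g   = np.interp(r, R_TAB, G_TAB)     # radial dose function
    phi = np.interp(r, R_TAB, PHI_TAB)   # 1D anisotropy factor
    return sk * lam * (r0 / r) ** 2 * g * phi

# e.g. a seed with S_K = 1 U and the proposed dose-rate constant:
d = dose_rate_1d(2.0, sk=1.0, lam=0.922)
```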

  10. Latin Hypercube Sampling (LHS) UNIX Library/Standalone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-05-13

    The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions; LHS UNIX Library/Standalone thus provides a way to generate multi-variate samples. The LHS samples can be generated either as a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, then the values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
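
The stratification-and-random-pairing procedure described above can be sketched as follows. This is a generic illustration, not the LHS UNIX Library code; the `latin_hypercube` helper, its seeding, and the example normal distribution are assumptions.

```python
import numpy as np
from statistics import NormalDist

def latin_hypercube(n_samples, n_vars, rng=None):
    """Draw a Latin hypercube sample on [0, 1) per variable.

    The range of each variable is divided into n_samples equal-probability
    intervals; one point is drawn uniformly inside each interval, and the
    interval order is permuted independently per variable (random pairing)."""
    rng = np.random.default_rng(rng)
    # one stratified draw per interval, per variable
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_vars))) / n_samples
    # random pairing: permute each column independently
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])
    return u

samples = latin_hypercube(10, 3, rng=0)
# Map the uniform strata through an inverse CDF to sample any distribution,
# e.g. a normal input parameter:
normal_draws = [NormalDist(mu=10.0, sigma=2.0).inv_cdf(v)
                for v in samples[:, 0]]
```

Because every interval contributes exactly one point per variable, far fewer samples are needed than with plain Monte Carlo to cover each marginal distribution evenly.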

  11. Charge-density analysis of a protein structure at subatomic resolution: the human aldose reductase case.

    PubMed

    Guillot, Benoît; Jelsch, Christian; Podjarny, Alberto; Lecomte, Claude

    2008-05-01

    The valence electron density of the protein human aldose reductase was analyzed at 0.66 angstroms resolution. The methodological developments in the software MoPro to adapt standard charge-density techniques from small molecules to macromolecular structures are described. The deformation electron density visible in initial residual Fourier difference maps was significantly enhanced after high-order refinement. The protein structure was refined after transfer of the experimental library multipolar atom model (ELMAM). The effects on the crystallographic statistics, on the atomic thermal displacement parameters and on the structure stereochemistry are analyzed. Constrained refinements of the transferred valence populations Pval and multipoles Plm were performed against the X-ray diffraction data on a selected substructure of the protein with low thermal motion. The resulting charge densities are of good quality, especially for chemical groups with many copies present in the polypeptide chain. To check the effect of the starting point on the result of the constrained multipolar refinement, the same charge-density refinement strategy was applied but using an initial neutral spherical atom model, i.e. without transfer from the ELMAM library. The best starting point for a protein multipolar refinement is the structure with the electron density transferred from the database. This can be assessed by the crystallographic statistical indices, including Rfree, and the quality of the static deformation electron-density maps, notably on the oxygen electron lone pairs. The analysis of the main-chain bond lengths suggests that stereochemical dictionaries would benefit from a revision based on recently determined unrestrained atomic resolution protein structures.

  12. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    NASA Astrophysics Data System (ADS)

    Hartini, Entin; Andiwijayakusuma, Dinan

    2014-09-01

    This research developed a code for uncertainty analysis based on a statistical approach for assessing the uncertainty of input parameters. In the burn-up calculation of fuel, uncertainty analysis was performed for the input parameters fuel density, coolant density and fuel temperature. The calculation is performed during irradiation using the Monte Carlo N-Particle Transport code. The uncertainty method is based on the probability density function. The code is written as a Python script that couples with MCNPX for criticality and burn-up calculations. The simulation models the geometry of a PWR core with MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section data library ENDF/B-VI. MCNPX requires nuclear data in ACE format; interfaces were developed for obtaining nuclear data in ACE format from ENDF through dedicated NJOY processing calculations over a range of temperatures.
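
A minimal sketch of the statistical sampling step is shown below: perturbed fuel density, coolant density and fuel temperature are drawn from normal distributions and substituted into an input template. The nominal values, standard deviations and template format are invented placeholders, and no actual MCNPX run is performed here.

```python
import random

# Nominal values and assumed relative standard deviations for the three
# uncertain inputs (illustrative numbers, not the paper's):
NOMINALS = {
    "fuel_density":     (10.4, 0.02),   # g/cm3, 2% rel. std. dev.
    "coolant_density":  (0.71, 0.03),   # g/cm3
    "fuel_temperature": (900.0, 0.05),  # K
}

def sample_inputs(rng):
    """Draw one perturbed input set from independent normal distributions."""
    return {name: rng.gauss(mu, rel * mu)
            for name, (mu, rel) in NOMINALS.items()}

def write_mcnpx_deck(template, values):
    """Fill a (hypothetical) input template; a real workflow would then run
    MCNPX on each deck and parse k-eff from the output."""
    return template.format(**values)

rng = random.Random(42)
TEMPLATE = ("m1 density={fuel_density:.4f} tmp={fuel_temperature:.1f}\n"
            "mw coolant={coolant_density:.4f}")
decks = [write_mcnpx_deck(TEMPLATE, sample_inputs(rng)) for _ in range(5)]
```

Running the code on each deck and collecting the outputs yields the distribution of k-eff (or nuclide densities) from which the input-driven uncertainty is estimated.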

  13. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartini, Entin, E-mail: entin@batan.go.id; Andiwijayakusuma, Dinan, E-mail: entin@batan.go.id

    2014-09-30

    This research developed a code for uncertainty analysis based on a statistical approach for assessing the uncertainty of input parameters. In the burn-up calculation of fuel, uncertainty analysis was performed for the input parameters fuel density, coolant density and fuel temperature. The calculation is performed during irradiation using the Monte Carlo N-Particle Transport code. The uncertainty method is based on the probability density function. The code is written as a Python script that couples with MCNPX for criticality and burn-up calculations. The simulation models the geometry of a PWR core with MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section data library ENDF/B-VI. MCNPX requires nuclear data in ACE format; interfaces were developed for obtaining nuclear data in ACE format from ENDF through dedicated NJOY processing calculations over a range of temperatures.

  14. High Density Data Storage, the SONY Data DiscMan Electronic Book, and the Unfolding Multi-Media Revolution.

    ERIC Educational Resources Information Center

    Kountz, John

    1991-01-01

    Description of high density data storage (HDDS) devices focuses on CD-ROMs and explores their impact on libraries, publishing, education, and library communications. Highlights include costs; technical standards; reading devices; authoring systems; robotics; the influence of new technology on the role of libraries; and royalty and copyright issues…

  15. Wyoming: Open Range for Library Technology.

    ERIC Educational Resources Information Center

    Maul, Helen Meadors

    1996-01-01

    Describes the development of library technology and the need for telecommunications in a state with a lack of population density. Topics include the state library's role; shared library resources and library networks; government information; the Wyoming State Home Page on the World Wide Web; Ariel software; network coordinating; and central…

  16. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    NASA Astrophysics Data System (ADS)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created by dedicated structures such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design rule checking run-set. The resulting database was then used as an input for choosing locations for critical dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact modeling simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: a transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well placed too close to a P-well. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, analysis of a large area of a high-density standard cell library was done. Another set of monitoring focused on a high-density SRAM array is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified unsymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted based on contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.

  17. PAR -- Interface to the ADAM Parameter System

    NASA Astrophysics Data System (ADS)

    Currie, Malcolm J.; Chipperfield, Alan J.

    PAR is a library of Fortran subroutines that provides convenient mechanisms for applications to exchange information with the outside world, through input-output channels called parameters. Parameters enable a user to control an application's behaviour. PAR supports numeric, character, and logical parameters, and is currently implemented only on top of the ADAM parameter system. The PAR library permits parameter values to be obtained, with or without a variety of constraints. Results may be put into parameters to be passed on to other applications. Other facilities include setting a prompt string and suggested defaults. This document also introduces a preliminary C interface for the PAR library; this may be subject to change in the light of experience.
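
As a conceptual illustration only (this mirrors the idea of constrained parameter access with a suggested default, not the actual PAR Fortran interface), such a lookup might look like:

```python
class ParameterError(ValueError):
    """Raised when a parameter is missing or violates its constraints."""

def get_numeric_parameter(store, name, default=None, lo=None, hi=None):
    """Obtain a numeric parameter with an optional suggested default and
    optional range constraints (a Python analogue of the idea, not PAR's API)."""
    value = store.get(name, default)
    if value is None:
        raise ParameterError(f"no value supplied for parameter '{name}'")
    if lo is not None and value < lo:
        raise ParameterError(f"'{name}'={value} is below the minimum {lo}")
    if hi is not None and value > hi:
        raise ParameterError(f"'{name}'={value} is above the maximum {hi}")
    return value

params = {"sigma": 2.5}                       # values supplied by the "user"
sigma = get_numeric_parameter(params, "sigma", default=1.0, lo=0.0)
niter = get_numeric_parameter(params, "niter", default=100, lo=1, hi=10000)
```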

  18. Facile synthesis of semi-library of low charge density cationic polyesters from poly(alkylene maleate)s for efficient local gene delivery.

    PubMed

    Yan, Huijie; Zhu, Dingcheng; Zhou, Zhuxian; Liu, Xin; Piao, Ying; Zhang, Zhen; Liu, Xiangrui; Tang, Jianbin; Shen, Youqing

    2018-03-30

    Cationic polymers are one of the main non-viral vectors for gene therapy, but their applications are hindered by toxicity and inefficient transfection, particularly in the presence of serum or other biological fluids. While rational design based on the current understanding of the gene delivery process has produced various cationic polymers with improved overall transfection, high-throughput parallel synthesis of libraries of cationic polymers seems a more effective strategy to screen out efficacious polymers. Herein, we demonstrate a novel platform for parallel synthesis of low cationic charge-density polyesters for efficient gene delivery. The unsaturated polyester poly(alkylene maleate) (PAM) readily underwent Michael-addition reactions with various mercaptamines to produce polyester backbones with pendant amine groups, poly(alkylene maleate mercaptamine)s (PAMAs). Variations of the alkylenes in the backbone and the mercaptamines on the side chain produced PAMAs with tunable hydrophobicity and DNA-condensation ability, the key parameters dominating the transfection efficiency of the resulting polymer/DNA complexes (polyplexes). A semi-library of such PAMAs was assembled from 7 alkylenes and 18 mercaptamines, from which a lead PAMA, G-1, synthesized from poly(1,4-phenylene bis(methylene) maleate) and N,N-dimethylcysteamine, showed remarkable transfection efficiency even in the presence of serum, owing to its efficient lysosome-circumventing cellular uptake. Furthermore, G-1 polyplexes efficiently delivered the suicide gene pTRAIL to intraperitoneal tumors and elicited effective anticancer activity.

  19. Library Off-Site Shelving: Guide for High-Density Facilities.

    ERIC Educational Resources Information Center

    Nitecki, Danuta A., Ed.; Kendrick, Curtis L., Ed.

    This collection of essays addresses the planning, construction, and operating issues relating to high-density library shelving facilities. The volume covers essential topics that address issues relating to the building, its operations, and serving the collections. It begins with an introduction by the volume's editors, "The Paradox and…

  20. Status of MAPA (Modular Accelerator Physics Analysis) and the Tech-X Object-Oriented Accelerator Library

    NASA Astrophysics Data System (ADS)

    Cary, J. R.; Shasharina, S.; Bruhwiler, D. L.

    1998-04-01

    The MAPA code is a fully interactive accelerator modeling and design tool consisting of a GUI and two object-oriented C++ libraries: a general library suitable for treatment of any dynamical system, and an accelerator library including many element types plus an accelerator class. The accelerator library inherits directly from the system library, which uses hash tables to store any relevant parameters or strings. The GUI can access these hash tables in a general way, allowing the user to invoke a window displaying all relevant parameters for a particular element type or for the accelerator class, with the option to change those parameters. The system library can advance an arbitrary number of dynamical variables through an arbitrary mapping. The accelerator class inherits this capability and overloads the relevant functions to advance the phase space variables of a charged particle through a string of elements. Among other things, the GUI makes phase space plots and finds fixed points of the map. We discuss the object hierarchy of the two libraries and use of the code.
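
    The class design described above can be illustrated with a small sketch. This is not the actual MAPA/Tech-X C++ code; the `DynamicalSystem` and `DriftElement` names are invented. It only shows the pattern the abstract describes: a general base class keeps named parameters in a hash table, and an element subclass overloads the advance function for charged-particle phase space.

```python
class DynamicalSystem:
    """General base class: parameters live in a hash table (dict in Python)."""

    def __init__(self, **params):
        self.params = dict(params)          # named parameters, GUI-inspectable

    def advance(self, state, mapping):
        """Advance the state variables through one application of `mapping`."""
        return mapping(state, self.params)


class DriftElement(DynamicalSystem):
    """Accelerator element overloading advance: x -> x + L*x', x' unchanged."""

    def advance(self, state, mapping=None):
        x, xp = state
        return (x + self.params["length"] * xp, xp)


drift = DriftElement(length=2.0)
print(drift.advance((0.1, 0.05)))   # -> (0.2, 0.05)
```

    A string of elements would then be modeled by applying each element's advance in turn to the same phase space tuple.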

  1. Image-analysis library

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The MATHPAC image-analysis library is a collection of general-purpose mathematical and statistical routines and special-purpose data-analysis and pattern-recognition routines for image analysis. The MATHPAC library consists of Linear Algebra, Optimization, Statistical-Summary, Densities and Distribution, Regression, and Statistical-Test packages.

  2. HSE12 implementation in libxc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moussa, Jonathan E.

    2013-05-13

    This piece of software is a new feature implemented inside an existing open-source library. Specifically, it is a new implementation of a density functional (HSE, short for Heyd-Scuseria-Ernzerhof) for a repository of density functionals, the libxc library. It fixes some numerical problems with existing implementations, as outlined in a scientific paper recently submitted for publication. Density functionals are components of electronic structure simulations, which model properties of electrons inside molecules and crystals.

  3. 40 CFR 75.6 - Incorporation by reference.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., phone: 610-832-9585, http://www.astm.org/DIGITAL_LIBRARY/index.shtml. (1) ASTM D129-00, Standard Test... Information Reference Unit of the U.S. EPA, 401 M St., SW., Washington, DC and at the Library (MD-35), U.S... D4052-96 (Reapproved 2002), Standard Test Method for Density and Relative Density of Liquids by Digital...

  4. 40 CFR 75.6 - Incorporation by reference.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., phone: 610-832-9585, http://www.astm.org/DIGITAL_LIBRARY/index.shtml. (1) ASTM D129-00, Standard Test... Information Reference Unit of the U.S. EPA, 401 M St., SW., Washington, DC and at the Library (MD-35), U.S... D4052-96 (Reapproved 2002), Standard Test Method for Density and Relative Density of Liquids by Digital...

  5. 40 CFR 75.6 - Incorporation by reference.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., phone: 610-832-9585, http://www.astm.org/DIGITAL_LIBRARY/index.shtml. (1) ASTM D129-00, Standard Test... Information Reference Unit of the U.S. EPA, 401 M St., SW., Washington, DC and at the Library (MD-35), U.S... D4052-96 (Reapproved 2002), Standard Test Method for Density and Relative Density of Liquids by Digital...

  6. 40 CFR 75.6 - Incorporation by reference.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., phone: 610-832-9585, http://www.astm.org/DIGITAL_LIBRARY/index.shtml. (1) ASTM D129-00, Standard Test... Information Reference Unit of the U.S. EPA, 401 M St., SW., Washington, DC and at the Library (MD-35), U.S... D4052-96 (Reapproved 2002), Standard Test Method for Density and Relative Density of Liquids by Digital...

  7. Public Library Service to Children in Oklahoma.

    ERIC Educational Resources Information Center

    Wentroth, Mary Ann

    Because of the low density of its population and subsequent low property tax support, library service in Oklahoma is based on the multicounty library operating as a single unit. With the help of federal funds, such units now cover one-third of the state and 60 percent of its population utilizing branch libraries and bookmobile service. Service to…

  8. Economical analysis of saturation mutagenesis experiments

    PubMed Central

    Acevedo-Rocha, Carlos G.; Reetz, Manfred T.; Nov, Yuval

    2015-01-01

    Saturation mutagenesis is a powerful technique for engineering proteins, metabolic pathways and genomes. In spite of its numerous applications, creating high-quality saturation mutagenesis libraries remains a challenge, as various experimental parameters influence the resulting diversity in complex ways. We explore various aspects of saturation mutagenesis library preparation from an economic perspective: We introduce a cheaper and faster control for assessing library quality based on liquid media; analyze the role of primer purity and supplier in libraries with and without redundancy; compare library quality, yield, randomization efficiency, and annealing bias using traditional and emergent randomization schemes based on mixtures of mutagenic primers; and establish a methodology for choosing the most cost-effective randomization scheme given the screening costs and other experimental parameters. We show that by carefully considering these parameters, laboratory expenses can be significantly reduced. PMID:26190439
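
    One experimental parameter of the kind discussed above, the oversampling needed for a given library coverage, follows from a standard combinatorial argument. The sketch below is a generic illustration of that textbook approximation, not code from the paper.

```python
import math

def clones_needed(variants, coverage):
    """Clones to screen so the expected fraction of distinct variants observed
    reaches `coverage`, assuming all codons are equally probable:
    E[fraction] = 1 - (1 - 1/V)**T ~ 1 - exp(-T/V)  =>  T = -V * ln(1 - coverage)."""
    return math.ceil(-variants * math.log(1.0 - coverage))

# NNK randomization at a single site encodes 32 distinct codons; 95% expected
# coverage then requires the familiar ~3-fold oversampling:
print(clones_needed(32, 0.95))   # -> 96
```
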

  9. EMPIRE: Nuclear Reaction Model Code System for Data Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herman, M.; Capote, R.; Carlson, B.V.

    EMPIRE is a modular system of nuclear reaction codes, comprising various nuclear models, and designed for calculations over a broad range of energies and incident particles. A projectile can be a neutron, proton, any ion (including heavy ions) or a photon. The energy range extends from the beginning of the unresolved resonance region for neutron-induced reactions (~keV) and goes up to several hundred MeV for heavy-ion induced reactions. The code accounts for the major nuclear reaction mechanisms, including direct, pre-equilibrium and compound nucleus ones. Direct reactions are described by a generalized optical model (ECIS03) or by the simplified coupled-channels approach (CCFUS). The pre-equilibrium mechanism can be treated by a deformation-dependent multi-step direct (ORION + TRISTAN) model, by a NVWY multi-step compound one, by a pre-equilibrium exciton model with cluster emission (PCROSS), or by another with full angular momentum coupling (DEGAS). Finally, the compound nucleus decay is described by the full-featured Hauser-Feshbach model with γ-cascade and width fluctuations. Advanced treatment of the fission channel takes into account transmission through a multiple-humped fission barrier with absorption in the wells. The fission probability is derived in the WKB approximation within the optical model of fission. Several options for nuclear level densities include the EMPIRE-specific approach, which accounts for the effects of the dynamic deformation of a fast rotating nucleus, the classical Gilbert-Cameron approach, and pre-calculated tables obtained with a microscopic model based on HFB single-particle level schemes with collective enhancement. A comprehensive library of input parameters covers nuclear masses, optical model parameters, ground state deformations, discrete levels and decay schemes, level densities, fission barriers, moments of inertia and γ-ray strength functions.
The results can be converted into ENDF-6 formatted files using the accompanying code EMPEND and completed with neutron resonances extracted from the existing evaluations. The package contains the full EXFOR (CSISRS) library of experimental reaction data that are automatically retrieved during the calculations. Publication-quality graphs can be obtained using the powerful and flexible plotting package ZVView. The graphical user interface, written in Tcl/Tk, provides for easy operation of the system. This paper describes the capabilities of the code, outlines the physical models, and indicates the parameter libraries used by EMPIRE to predict reaction cross sections and spectra, mainly for nucleon-induced reactions. Selected applications of EMPIRE are discussed, the most important being an extensive use of the code in evaluations of neutron reactions for the new US library ENDF/B-VII.0. Future extensions of the system are outlined, including a neutron resonance module as well as capabilities for generating covariances, using both KALMAN and Monte Carlo methods, that are still being advanced and refined.

  10. Monte Carlo Calculation of Thermal Neutron Inelastic Scattering Cross Section Uncertainties by Sampling Perturbed Phonon Spectra

    NASA Astrophysics Data System (ADS)

    Holmes, Jesse Curtis

    Nuclear data libraries provide fundamental reaction information required by nuclear system simulation codes. The inclusion of data covariances in these libraries allows the user to assess uncertainties in system response parameters as a function of uncertainties in the nuclear data. Formats and procedures are currently established for representing covariances for various types of reaction data in ENDF libraries. This covariance data is typically generated utilizing experimental measurements and empirical models, consistent with the method of parent data production. However, ENDF File 7 thermal neutron scattering library data is, by convention, produced theoretically through fundamental scattering physics model calculations. Currently, there is no published covariance data for ENDF File 7 thermal libraries. Furthermore, no accepted methodology exists for quantifying or representing uncertainty information associated with this thermal library data. The quality of thermal neutron inelastic scattering cross section data can be of high importance in reactor analysis and criticality safety applications. These cross sections depend on the material's structure and dynamics. The double-differential scattering law, S(alpha, beta), tabulated in ENDF File 7 libraries contains this information. For crystalline solids, S(alpha, beta) is primarily a function of the material's phonon density of states (DOS). Published ENDF File 7 libraries are commonly produced by calculation and processing codes, such as the LEAPR module of NJOY, which utilize the phonon DOS as the fundamental input for inelastic scattering calculations to directly output an S(alpha, beta) matrix. To determine covariances for the S(alpha, beta) data generated by this process, information about uncertainties in the DOS is required. The phonon DOS may be viewed as a probability density function of atomic vibrational energy states that exist in a material. 
Probable variation in the shape of this spectrum may be established that depends on uncertainties in the physics models and methodology employed to produce the DOS. Through Monte Carlo sampling of perturbations from the reference phonon spectrum, an S(alpha, beta) covariance matrix may be generated. In this work, density functional theory and lattice dynamics in the harmonic approximation are used to calculate the phonon DOS for hexagonal crystalline graphite. This form of graphite is used as an example material for the purpose of demonstrating procedures for analyzing, calculating and processing thermal neutron inelastic scattering uncertainty information. Several sources of uncertainty in thermal neutron inelastic scattering calculations are examined, including sources which cannot be directly characterized through a description of the phonon DOS uncertainty, and their impacts are evaluated. Covariances for hexagonal crystalline graphite S(alpha, beta) data are quantified by coupling the standard methodology of LEAPR with a Monte Carlo sampling process. The mechanics of efficiently representing and processing this covariance information is also examined. Finally, with appropriate sensitivity information, it is shown that an S(alpha, beta) covariance matrix can be propagated to generate covariance data for integrated cross sections, secondary energy distributions, and coupled energy-angle distributions. This approach enables a complete description of thermal neutron inelastic scattering cross section uncertainties which may be employed to improve the simulation of nuclear systems.
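
    The sampling loop described above reduces to a generic skeleton: perturb a reference spectrum, push each realization through the model, and form the output covariance from the ensemble. The toy model below merely stands in for the real DOS-to-S(alpha, beta) chain (LEAPR); the grid, spectrum shape, and 5% perturbation level are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

energies = np.linspace(0.0, 0.2, 50)               # toy energy grid (eV)
reference = np.exp(-(energies - 0.1)**2 / 5e-4)    # toy reference phonon "DOS"
de = energies[1] - energies[0]

def model(dos):
    """Placeholder for the DOS -> observable calculation (LEAPR in the real chain)."""
    dos = dos / (dos.sum() * de)                   # renormalize like a PDF
    return np.cumsum(dos) * de                     # toy integrated observable

samples = []
for _ in range(500):                               # Monte Carlo perturbation loop
    perturbed = reference * (1.0 + 0.05 * rng.standard_normal(reference.shape))
    samples.append(model(np.clip(perturbed, 0.0, None)))

cov = np.cov(np.array(samples), rowvar=False)      # output covariance matrix
print(cov.shape)   # -> (50, 50)
```

    In the real workflow each column of `samples` would be an S(alpha, beta) evaluation, and the resulting covariance could then be propagated to integrated cross sections with sensitivity information.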

  11. Recent developments in LIBXC - A comprehensive library of functionals for density functional theory

    NASA Astrophysics Data System (ADS)

    Lehtola, Susi; Steigemann, Conrad; Oliveira, Micael J. T.; Marques, Miguel A. L.

    2018-01-01

    LIBXC is a library of exchange-correlation functionals for density-functional theory. We are concerned with semi-local functionals (or the semi-local part of hybrid functionals), namely local-density approximations, generalized-gradient approximations, and meta-generalized-gradient approximations. Currently we include around 400 functionals for the exchange, correlation, and the kinetic energy, spanning more than 50 years of research. Moreover, LIBXC is by now used by more than 20 codes, not only from the atomic, molecular, and solid-state physics communities, but also from quantum chemistry.

  12. C library for topological study of the electronic charge density.

    PubMed

    Vega, David; Aray, Yosslen; Rodríguez, Jesús

    2012-12-05

    The topological study of the electronic charge density is useful for obtaining information about the kinds of bonds (ionic or covalent) and the atomic charges in a molecule or crystal. For this study, it is necessary to calculate, at every point in space, the electronic density and its derivatives up to second order. In this work, a grid-based method for these calculations is described. The library, implemented for three dimensions, is based on multidimensional Lagrange interpolation in a regular grid; by differentiating the resulting polynomial, formulas for the gradient vector, the Hessian matrix, and the Laplacian are obtained at every point in space. More complex functions, such as the Newton-Raphson method (to find the critical points, where the gradient is null) and the Cash-Karp Runge-Kutta method (used to trace the gradient paths), were also programmed. As the unit cell in some crystals has angles different from 90°, the library includes linear transformations to correct the gradient and Hessian when the grid is distorted (inclined). Functions were also developed to handle grid files (grd from the DMol® program, CUBE from the Gaussian® program, and CHGCAR from the VASP® program). Each of these files contains the data for a molecular or crystal electronic property (such as charge density, spin density, electrostatic potential, and others) on a three-dimensional (3D) grid. The library can be adapted to perform the topological study on any regular 3D grid by modifying the code of these functions. Copyright © 2012 Wiley Periodicals, Inc.
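
    The grid-derivative idea can be demonstrated on a toy density. The described C library differentiates a Lagrange interpolating polynomial; the central differences used by `np.gradient` below are exactly what differentiating the 3-point Lagrange polynomial gives at a grid point, so this is a simplified stand-in, with all values invented for illustration.

```python
import numpy as np

ax = np.linspace(-1.0, 1.0, 21)                   # regular 1D axis
h = ax[1] - ax[0]
X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
rho = np.exp(-(X**2 + Y**2 + Z**2))               # toy "charge density"

grad = np.gradient(rho, h, h, h)                  # d/dx, d/dy, d/dz arrays
lap = sum(np.gradient(g, h, axis=k) for k, g in enumerate(grad))

c = len(ax) // 2                                  # grid point at the origin
print([round(float(g[c, c, c]), 6) for g in grad])  # -> [0.0, 0.0, 0.0] at the maximum
print(round(float(lap[c, c, c]), 2))              # ~ -6, the analytic value at r = 0
```

    A critical-point search (Newton-Raphson) would then look for grid regions where all three gradient components vanish, as at the density maximum above.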

  13. libSRES: a C library for stochastic ranking evolution strategy for parameter estimation.

    PubMed

    Ji, Xinglai; Xu, Ying

    2006-01-01

    Estimation of kinetic parameters in a biochemical pathway or network represents a common problem in systems studies of biological processes. We have implemented a C library, named libSRES, to facilitate a fast implementation of computer software for study of non-linear biochemical pathways. This library implements a (mu, lambda)-ES evolutionary optimization algorithm that uses stochastic ranking as the constraint handling technique. Considering the amount of computing time it might require to solve a parameter-estimation problem, an MPI version of libSRES is provided for parallel implementation, as well as a simple user interface. libSRES is freely available and could be used directly in any C program as a library function. We have extensively tested the performance of libSRES on various pathway parameter-estimation problems and found its performance to be satisfactory. The source code (in C) is free for academic users at http://csbl.bmb.uga.edu/~jix/science/libSRES/
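
    The (mu, lambda) selection scheme at the heart of such a strategy fits in a few lines. The sketch below is a generic toy: it omits libSRES's stochastic ranking constraint handling and MPI parallelism, and the function names and step-size schedule are invented for illustration.

```python
import random

def es_minimize(fitness, dim, mu=5, lam=30, sigma=0.3, generations=200, seed=1):
    """Toy (mu, lambda) evolution strategy: only the lam offspring are ranked
    each generation, and the best mu become the next parents."""
    rng = random.Random(seed)
    parents = [[rng.uniform(-2, 2) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for _ in range(lam):                       # lam children per generation
            base = rng.choice(parents)
            offspring.append([x + rng.gauss(0.0, sigma) for x in base])
        offspring.sort(key=fitness)                # rank children only
        parents = offspring[:mu]                   # comma selection: best mu survive
        sigma *= 0.98                              # crude step-size decay
    return parents[0]

sphere = lambda x: sum(v * v for v in x)           # classic test objective
best = es_minimize(sphere, dim=3)
print(sphere(best))                                # small value near the optimum at 0
```

    libSRES replaces the plain sort with stochastic ranking, so that constraint violations and objective value are traded off probabilistically during selection.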

  14. Towards a library of synthetic galaxy spectra and preliminary results of classification and parametrization of unresolved galaxies for Gaia. II

    NASA Astrophysics Data System (ADS)

    Tsalmantza, P.; Kontizas, M.; Rocca-Volmerange, B.; Bailer-Jones, C. A. L.; Kontizas, E.; Bellas-Velidis, I.; Livanou, E.; Korakitis, R.; Dapergolas, A.; Vallenari, A.; Fioc, M.

    2009-09-01

    Aims: This paper is the second in a series, implementing a classification system for Gaia observations of unresolved galaxies. Our goals are to determine spectral classes and estimate intrinsic astrophysical parameters via synthetic templates. Here we describe (1) a new extended library of synthetic galaxy spectra; (2) its comparison with various observations; and (3) first results of classification and parametrization experiments using simulated Gaia spectrophotometry of this library. Methods: Using the PÉGASE.2 code, based on galaxy evolution models that take account of metallicity evolution, extinction correction, and emission lines (with stellar spectra based on the BaSeL library), we improved our first library and extended it to cover the domain of most of the SDSS catalogue. Our classification and regression models were support vector machines (SVMs). Results: We produce an extended library of 28 885 synthetic galaxy spectra at zero redshift covering four general Hubble types of galaxies, over the wavelength range between 250 and 1050 nm at a sampling of 1 nm or less. The library is also produced for 4 random values of redshift in the range of 0-0.2. It is computed on a random grid of four key astrophysical parameters (infall timescale and 3 parameters defining the SFR) and, depending on the galaxy type, on two values of the age of the galaxy. The synthetic library was compared and found to be in good agreement with various observations. The first results from the SVM classifiers and parametrizers are promising, indicating that Hubble types can be reliably predicted and several parameters estimated with low bias and variance.

  15. General Economic and Demographic Background and Projections for Indiana Library Services.

    ERIC Educational Resources Information Center

    Foust, James D.; Tower, Carl B.

    Before future library needs can be estimated, economic and demographic variables that influence the demand for library services must be projected and estimating equations relating library needs to economic and demographic parameters developed. This study considers the size, location and age-sex characteristics of Indiana's current population and…

  16. Editorial Library: User Survey.

    ERIC Educational Resources Information Center

    Surace, Cecily J.

    This report presents the findings of a survey conducted by the editorial library of the Los Angeles Times to measure usage and satisfaction with library service, provide background information on library user characteristics, collect information on patterns of use of the Times' clipping files, relate data on usage and satisfaction parameters to…

  17. Combinatorial Optimization of Heterogeneous Catalysts Used in the Growth of Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Cassell, Alan M.; Verma, Sunita; Delzeit, Lance; Meyyappan, M.; Han, Jie

    2000-01-01

    Libraries of liquid-phase catalyst precursor solutions were printed onto iridium-coated silicon substrates and evaluated for their effectiveness in catalyzing the growth of multi-walled carbon nanotubes (MWNTs) by chemical vapor deposition (CVD). The catalyst precursor solutions were composed of inorganic salts and a removable tri-block copolymer (EO)20(PO)70(EO)20 (EO = ethylene oxide, PO = propylene oxide) structure-directing agent (SDA), dissolved in ethanol/methanol mixtures. Sample libraries were quickly assayed using scanning electron microscopy after CVD growth to identify active catalysts and CVD conditions. Composition libraries and focus libraries were then constructed around the active spots identified in the discovery libraries to understand how catalyst precursor composition affects the yield, density, and quality of the nanotubes. Successful implementation of combinatorial optimization methods in the development of highly active, carbon nanotube catalysts is demonstrated, as well as the identification of catalyst formulations that lead to varying densities and shapes of aligned nanotube towers.

  18. Overview of refinement procedures within REFMAC5: utilizing data from different sources.

    PubMed

    Kovalevskiy, Oleg; Nicholls, Robert A; Long, Fei; Carlon, Azzurra; Murshudov, Garib N

    2018-03-01

    Refinement is a process that involves bringing into agreement the structural model, available prior knowledge and experimental data. To achieve this, the refinement procedure optimizes a posterior conditional probability distribution of model parameters, including atomic coordinates, atomic displacement parameters (B factors), scale factors, parameters of the solvent model and twin fractions in the case of twinned crystals, given observed data such as observed amplitudes or intensities of structure factors. A library of chemical restraints is typically used to ensure consistency between the model and the prior knowledge of stereochemistry. If the observation-to-parameter ratio is small, for example when diffraction data only extend to low resolution, the Bayesian framework implemented in REFMAC5 uses external restraints to inject additional information extracted from structures of homologous proteins, prior knowledge about secondary-structure formation and even data obtained using different experimental methods, for example NMR. The refinement procedure also generates the `best' weighted electron-density maps, which are useful for further model (re)building. Here, the refinement of macromolecular structures using REFMAC5 and related tools distributed as part of the CCP4 suite is discussed.

  19. Bone Density: MedlinePlus Health Topic

    MedlinePlus

    ... Articles References and abstracts from MEDLINE/PubMed (National Library of Medicine) Article: Associations between bone-alkaline phosphatase ...

  20. A statistical kinematic source inversion approach based on the QUESO library for uncertainty quantification and prediction

    NASA Astrophysics Data System (ADS)

    Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo

    2014-05-01

    Seismic data, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the question of which of the provided kinematic source inversion solutions is most reliable and most robust, and, more generally, how accurate fault parameterization and solution predictions are. These issues are not addressed in "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based (a) on a forward-modeling scheme that computes synthetic (body-)waves for a given kinematic rupture model, and (b) on the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library, which uses MCMC algorithms and Bayes' theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e. for which the exact rupture history is known) in an attempt to identify the cross-over at which further model discretization (spatial and temporal resolution of the parameter space) no longer yields a decreasing misfit. Identification of this cross-over is important because it reveals the resolution power of the studied data set (i.e. teleseismic body waves), enabling one to constrain kinematic earthquake rupture histories of real earthquakes at a resolution that is supported by the data. In addition, the Bayesian approach allows for mapping complete posterior probability density functions of the desired kinematic source parameters, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
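
    The Bayesian machinery described above can be caricatured in a few lines: synthetic "observed" data with a known truth, a forward model, and a Metropolis sampler exploring the posterior. QUESO supplies this machinery at scale; the one-parameter linear "slip" model, noise level, and prior bounds below are all invented for illustration.

```python
import math, random

rng = random.Random(42)
true_slip = 2.0
# Synthetic observations: forward model slip*t plus Gaussian noise (sd = 0.1).
data = [true_slip * t + rng.gauss(0.0, 0.1) for t in range(1, 11)]

def log_posterior(slip):
    if not 0.0 < slip < 10.0:                       # uniform prior bounds
        return -math.inf
    misfit = sum((d - slip * t) ** 2 for t, d in zip(range(1, 11), data))
    return -misfit / (2 * 0.1 ** 2)                 # Gaussian likelihood

chain, current = [], 5.0                            # deliberately poor start
for _ in range(20000):
    proposal = current + rng.gauss(0.0, 0.05)       # symmetric random-walk proposal
    if math.log(rng.random() + 1e-300) < log_posterior(proposal) - log_posterior(current):
        current = proposal
    chain.append(current)

posterior_mean = sum(chain[5000:]) / len(chain[5000:])   # discard burn-in
print(round(posterior_mean, 2))                     # recovers the true slip of ~2.0
```

    Beyond the mean, the retained chain samples approximate the full posterior density, which is exactly the "complete posterior probability density functions" payoff noted in the abstract.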

  1. Alignment-Independent Comparisons of Human Gastrointestinal Tract Microbial Communities in a Multidimensional 16S rRNA Gene Evolutionary Space▿

    PubMed Central

    Rudi, Knut; Zimonja, Monika; Kvenshagen, Bente; Rugtveit, Jarle; Midtvedt, Tore; Eggesbø, Merete

    2007-01-01

    We present a novel approach for comparing 16S rRNA gene clone libraries that is independent of both DNA sequence alignment and definition of bacterial phylogroups. These steps are the major bottlenecks in current microbial comparative analyses. We used direct comparisons of taxon density distributions in an absolute evolutionary coordinate space. The coordinate space was generated by using alignment-independent bilinear multivariate modeling. Statistical analyses for clone library comparisons were based on multivariate analysis of variance, partial least-squares regression, and permutations. Clone libraries from both adult and infant gastrointestinal tract microbial communities were used as biological models. We reanalyzed a library consisting of 11,831 clones covering complete colons from three healthy adults in addition to a smaller 390-clone library from infant feces. We show that it is possible to extract detailed information about microbial community structures using our alignment-independent method. Our density distribution analysis is also very efficient with respect to computer operation time, meeting the future requirements of large-scale screenings to understand the diversity and dynamics of microbial communities. PMID:17337554

  2. X-ray Reflected Spectra from Accretion Disk Models. III. A Complete Grid of Ionized Reflection Calculations

    NASA Technical Reports Server (NTRS)

    Garcia, J.; Dauser, T.; Reynolds, C. S.; Kallman, T. R.; McClintock, J. E.; Wilms, J.; Ekmann, W.

    2013-01-01

    We present a new and complete library of synthetic spectra for modeling the component of emission that is reflected from an illuminated accretion disk. The spectra were computed using an updated version of our code xillver that incorporates new routines and a richer atomic data base. We offer in the form of a table model an extensive grid of reflection models that cover a wide range of parameters. Each individual model is characterized by the photon index Gamma of the illuminating radiation, the ionization parameter zeta at the surface of the disk (i.e., the ratio of the X-ray flux to the gas density), and the iron abundance A(sub Fe) relative to the solar value. The ranges of the parameters covered are: 1.2 <= Gamma <= 3.4, 1 <= zeta <= 10^4, and 0.5 <= A(sub Fe) <= 10. These ranges capture the physical conditions typically inferred from observations of active galactic nuclei, and also stellar-mass black holes in the hard state. This library is intended for use when the thermal disk flux is faint compared to the incident power-law flux. The models are expected to provide an accurate description of the Fe K emission line, which is the crucial spectral feature used to measure black hole spin. A total of 720 reflection spectra are provided in a single FITS file suitable for the analysis of X-ray observations via the atable model in xspec. Detailed comparisons with previous reflection models illustrate the improvements incorporated in this version of xillver.
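
    The way such a table model is consumed can be sketched as bilinear interpolation over the parameter grid, which is what XSPEC's atable machinery does internally for tabulated spectra. The grid sizes match the abstract's parameter ranges, but the spectra below are invented power laws, not xillver output.

```python
import numpy as np

gammas = np.linspace(1.2, 3.4, 12)          # photon index grid
logxis = np.linspace(0.0, 4.0, 15)          # log10 of the ionization parameter
energy = np.logspace(-1, 2, 100)            # energy grid in keV

# Toy tabulated spectra: power laws with an ionization-dependent normalization.
table = np.array([[(10 ** (0.1 * lx)) * energy ** (-g) for lx in logxis]
                  for g in gammas])

def interp_spectrum(gamma, logxi):
    """Bilinearly interpolate the tabulated spectra at (gamma, logxi)."""
    i = int(np.clip(np.searchsorted(gammas, gamma) - 1, 0, len(gammas) - 2))
    j = int(np.clip(np.searchsorted(logxis, logxi) - 1, 0, len(logxis) - 2))
    u = (gamma - gammas[i]) / (gammas[i + 1] - gammas[i])
    v = (logxi - logxis[j]) / (logxis[j + 1] - logxis[j])
    return ((1 - u) * (1 - v) * table[i, j] + u * (1 - v) * table[i + 1, j]
            + (1 - u) * v * table[i, j + 1] + u * v * table[i + 1, j + 1])

spec = interp_spectrum(2.0, 3.1)            # spectrum at off-grid parameters
print(spec.shape)   # -> (100,)
```
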

  3. An open library of relativistic core electron density function for the QTAIM analysis with pseudopotentials.

    PubMed

    Zou, Wenli; Cai, Ziyu; Wang, Jiankang; Xin, Kunyu

    2018-04-29

    Based on two-component relativistic atomic calculations, a free electron density function (EDF) library has been developed for nearly all the known ECPs of the elements Li (Z = 3) up to Ubn (Z = 120), which can be interfaced into modern quantum chemistry programs to save the .wfx wavefunction file. The applicability of this EDF library is demonstrated by quantum theory of atoms in molecules (QTAIM) analyses and other real-space functions on HeCuF, PtO4^2+, OgF4, and TlCl3(DMSO)2. When a large-core ECP is used, it is shown that the corrections by EDF may significantly improve the properties of some density-derived real-space functions, but they are invalid for the wavefunction-dependent real-space functions. To classify different chemical bonds, and especially some nonclassical interactions, a list of universal criteria has also been proposed. © 2018 Wiley Periodicals, Inc.

  4. Principal component analysis as a tool for library design: a case study investigating natural products, brand-name drugs, natural product-like libraries, and drug-like libraries.

    PubMed

    Wenderski, Todd A; Stratton, Christopher F; Bauer, Renato A; Kopp, Felix; Tan, Derek S

    2015-01-01

    Principal component analysis (PCA) is a useful tool in the design and planning of chemical libraries. PCA can be used to reveal differences in structural and physicochemical parameters between various classes of compounds by displaying them in a convenient graphical format. Herein, we demonstrate the use of PCA to gain insight into structural features that differentiate natural products, synthetic drugs, natural product-like libraries, and drug-like libraries, and show how the results can be used to guide library design.
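
    The PCA workflow described can be sketched with a toy descriptor matrix (rows = compounds, columns = physicochemical descriptors such as molecular weight or logP). The data here are random stand-ins, not the libraries analyzed in the paper; the point is only the standardize/diagonalize/project pipeline behind such plots.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(60, 8))                 # 60 compounds, 8 descriptors (toy)

Xc = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize each descriptor
cov = np.cov(Xc, rowvar=False)               # descriptor covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)       # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]            # sort components by variance
scores = Xc @ eigvecs[:, order[:2]]          # project onto the first two PCs

explained = eigvals[order[:2]].sum() / eigvals.sum()
print(scores.shape)                          # -> (60, 2): coordinates to plot
```

    Plotting `scores` for several compound classes in the same axes gives exactly the kind of graphical comparison of chemical space the abstract describes.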

  5. Principal Component Analysis as a Tool for Library Design: A Case Study Investigating Natural Products, Brand-Name Drugs, Natural Product-Like Libraries, and Drug-Like Libraries

    PubMed Central

    Wenderski, Todd A.; Stratton, Christopher F.; Bauer, Renato A.; Kopp, Felix; Tan, Derek S.

    2015-01-01

    Principal component analysis (PCA) is a useful tool in the design and planning of chemical libraries. PCA can be used to reveal differences in structural and physicochemical parameters between various classes of compounds by displaying them in a convenient graphical format. Herein, we demonstrate the use of PCA to gain insight into structural features that differentiate natural products, synthetic drugs, natural product-like libraries, and drug-like libraries, and show how the results can be used to guide library design. PMID:25618349

  6. A Multi-User Microcomputer System for Small Libraries.

    ERIC Educational Resources Information Center

    Leggate, Peter

    1988-01-01

    Describes the development of Bookshelf, a multi-user microcomputer system for small libraries that uses an integrated software package. The discussion covers the design parameters of the package, which were based on a survey of seven small libraries, and some characteristics of the software. (three notes with references) (CLB)

  7. IDENTIFYING COMPOUNDS USING SOURCE CID ON AN ORTHOGONAL ACCELERATION TIME-OF-FLIGHT MASS SPECTROMETER

    EPA Science Inventory

    Exact mass libraries of ESI and APCI mass spectra are not commercially available. In-house libraries are dependent on CID parameters and are instrument-specific. The ability to identify compounds without reliance on mass spectral libraries is therefore more crucial for liquid sam...

  8. Macro and Microenvironments at the British Library.

    ERIC Educational Resources Information Center

    Shenton, Helen

    This paper describes the storage of the 12 million items that have just been moved into the new British Library building. The specifications for the storage and environmental conditions for different types of library and archive material are explained. The varying environmental parameters for storage areas and public areas, including reading rooms…

  9. Microbiological quality of indoor air in university libraries.

    PubMed

    Hayleeyesus, Samuel Fekadu; Manaye, Abayneh Melaku

    2014-05-01

    To evaluate the concentration of bacteria and fungi in the indoor environment of Jimma University libraries, so as to estimate the health hazard and to inform standards for indoor air quality control. The microbial quality of the indoor air of eight libraries of Jimma University was determined. The settle plate method, using open Petri dishes containing different culture media, was employed to collect samples twice daily. Isolates were identified according to standard methods. The concentrations of bacterial and fungal aerosols in the indoor environment of the university libraries ranged between 367 and 2595 CFU/m³. According to the sanitary standards classification of the European Commission, the indoor air of almost all the Jimma University libraries was heavily contaminated with bacteria and fungi. In spite of the difference in their major sources, the average fungal density found in the indoor air of the libraries followed the same trend as the bacterial density (P=0.001). The bacterial isolates included Micrococcus sp., Staphylococcus aureus, Streptococcus pyogenes, Bacillus sp. and Neisseria sp., while Cladosporium sp., Alternaria sp., Penicillium sp. and Aspergillus sp. were the most frequently isolated fungi. The indoor air of all the libraries was above the heavily contaminated range according to the European Commission classification, and most of the isolates are considered potential candidates in the establishment of sick building syndrome, often associated with clinical manifestations such as allergy, rhinitis, asthma and conjunctivitis. Thus, attention must be given to controlling the environmental factors that favor the growth and multiplication of microbes in the indoor environment of libraries, to safeguard the health of users and workers.

  10. A Computerized Library and Evaluation System for Integral Neutron Experiments.

    ERIC Educational Resources Information Center

    Hampel, Viktor E.; And Others

    A computerized library of references to integral neutron experiments has been developed at the Lawrence Radiation Laboratory at Livermore. This library serves as a data base for the systematic retrieval of documents describing diverse critical and bulk nuclear experiments. The evaluation and reduction of the physical parameters of the experiments…

  11. X-RAY REFLECTED SPECTRA FROM ACCRETION DISK MODELS. III. A COMPLETE GRID OF IONIZED REFLECTION CALCULATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, J.; McClintock, J. E.; Dauser, T.

    2013-05-10

    We present a new and complete library of synthetic spectra for modeling the component of emission that is reflected from an illuminated accretion disk. The spectra were computed using an updated version of our code XILLVER that incorporates new routines and a richer atomic database. We offer in the form of a table model an extensive grid of reflection models that cover a wide range of parameters. Each individual model is characterized by the photon index Γ of the illuminating radiation, the ionization parameter ξ at the surface of the disk (i.e., the ratio of the X-ray flux to the gas density), and the iron abundance A_Fe relative to the solar value. The ranges of the parameters covered are 1.2 ≤ Γ ≤ 3.4, 1 ≤ ξ ≤ 10^4, and 0.5 ≤ A_Fe ≤ 10. These ranges capture the physical conditions typically inferred from observations of active galactic nuclei, and also stellar-mass black holes in the hard state. This library is intended for use when the thermal disk flux is faint compared to the incident power-law flux. The models are expected to provide an accurate description of the Fe K emission line, which is the crucial spectral feature used to measure black hole spin. A total of 720 reflection spectra are provided in a single FITS file (http://hea-www.cfa.harvard.edu/~javier/xillver/) suitable for the analysis of X-ray observations via the atable model in XSPEC. Detailed comparisons with previous reflection models illustrate the improvements incorporated in this version of XILLVER.
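The ionization parameter named in the abstract is conventionally ξ = 4πF_x/n, the ratio of illuminating X-ray flux to gas density; a few lines suffice to compute it and check coverage by the tabulated grid. The flux and density values below are illustrative, not drawn from the XILLVER tables.

```python
import math

# Ionization parameter xi = 4*pi*F_x / n (erg cm s^-1): ratio of the
# illuminating X-ray flux to the gas density, as in the abstract.
def ionization_parameter(flux_erg_cm2_s, density_cm3):
    return 4.0 * math.pi * flux_erg_cm2_s / density_cm3

xi = ionization_parameter(1.0e15, 1.0e15)   # illustrative values
in_grid = 1.0 <= xi <= 1.0e4                # within the tabulated range
```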

  12. Library fingerprints: a novel approach to the screening of virtual libraries.

    PubMed

    Klon, Anthony E; Diller, David J

    2007-01-01

    We propose a novel method to prioritize libraries for combinatorial synthesis and high-throughput screening that assesses the viability of a particular library on the basis of the aggregate physical-chemical properties of the compounds using a naïve Bayesian classifier. This approach prioritizes collections of related compounds according to the aggregate values of their physical-chemical parameters in contrast to single-compound screening. The method is also shown to be useful in screening existing noncombinatorial libraries when the compounds in these libraries have been previously clustered according to their molecular graphs. We show that the method used here is comparable or superior to the single-compound virtual screening of combinatorial libraries and noncombinatorial libraries and is superior to the pairwise Tanimoto similarity searching of a collection of combinatorial libraries.
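A naïve Bayesian classifier over aggregate physical-chemical properties, as the abstract describes, can be sketched with a Gaussian class-conditional model. Everything below (feature choice, training values, labels) is invented for illustration; it is not the authors' trained model.

```python
import math

# Aggregate property vectors (e.g. mean MW, mean clogP, mean TPSA) for
# training libraries labeled "good"/"bad"; illustrative numbers only.
train = [
    ([320.0, 2.1, 75.0], "good"),
    ([310.0, 2.4, 70.0], "good"),
    ([540.0, 5.2, 130.0], "bad"),
    ([560.0, 4.8, 140.0], "bad"),
]

def gaussian_nb_fit(data):
    """Per-class feature means/variances plus class priors (naive Bayes)."""
    groups = {}
    for vec, label in data:
        groups.setdefault(label, []).append(vec)
    model = {}
    for label, vecs in groups.items():
        n = len(vecs)
        means = [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]
        varis = [max(sum((v[i] - means[i]) ** 2 for v in vecs) / n, 1e-6)
                 for i in range(len(vecs[0]))]
        model[label] = (means, varis, n / len(data))
    return model

def gaussian_nb_predict(model, vec):
    best, best_lp = None, -math.inf
    for label, (means, varis, prior) in model.items():
        lp = math.log(prior)
        for x, m, s2 in zip(vec, means, varis):
            lp += -0.5 * math.log(2 * math.pi * s2) - (x - m) ** 2 / (2 * s2)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = gaussian_nb_fit(train)
label = gaussian_nb_predict(model, [330.0, 2.0, 80.0])
```

The key point matching the abstract: the classifier scores a candidate library by its aggregate property vector, so whole collections can be prioritized without scoring each member compound individually.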

  13. Comparison Of A Neutron Kinetics Parameter For A Polyethylene Moderated Highly Enriched Uranium System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKenzie, IV, George Espy; Goda, Joetta Marie; Grove, Travis Justin

    This paper examines the capability of the MCNP® code to calculate kinetics parameters effectively for a thermal system containing highly enriched uranium (HEU). The Rossi-α parameter was chosen for this examination because it is relatively easy to measure as well as easy to calculate using MCNP®'s kopts card. The Rossi-α also incorporates many other parameters of interest in nuclear kinetics, most of which are more difficult to measure precisely. The comparison considers two different nuclear data libraries against the experimental data: ENDF/B-VI (.66c) and ENDF/B-VII (.80c).
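In point kinetics the Rossi-α is commonly written as α = (ρ - β_eff)/Λ, which is why it bundles together the reactivity, the effective delayed-neutron fraction, and the neutron generation time mentioned in the abstract. The sketch below uses that standard relation with illustrative values; β_eff and Λ are assumptions, not measured data from this experiment.

```python
# Rossi-alpha from point kinetics: alpha = (rho - beta_eff) / Lambda,
# where rho is reactivity, beta_eff the effective delayed-neutron
# fraction, and Lambda the neutron generation time.
def rossi_alpha(rho, beta_eff, gen_time_s):
    return (rho - beta_eff) / gen_time_s

beta_eff = 0.0072     # assumed effective delayed-neutron fraction
gen_time = 4.0e-5     # assumed generation time (s) for a thermal system
alpha_dc = rossi_alpha(0.0, beta_eff, gen_time)   # at delayed critical
```

At delayed critical (ρ = 0) this reduces to α = -β_eff/Λ, the quantity a Rossi-α measurement recovers from prompt-neutron correlations.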

  14. Mapping forest canopy fuels in Yellowstone National Park using lidar and hyperspectral data

    NASA Astrophysics Data System (ADS)

    Halligan, Kerry Quinn

    The severity and size of wildland fires in the forested western U.S. have increased in recent years despite improvements in fire suppression efficiency. This, along with increased density of homes in the wildland-urban interface, has resulted in high costs for fire management and increased risks to human health, safety and property. Crown fires, in comparison to surface fires, pose an especially high risk due to their intensity and high rate of spread. Crown fire models require a range of quantitative fuel parameters which can be difficult and costly to obtain, but advances in lidar and hyperspectral sensor technologies hold promise for delivering these inputs. Further research is needed, however, to assess the strengths and limitations of these technologies and the most appropriate analysis methodologies for estimating crown fuel parameters from these data. This dissertation focuses on retrieving critical crown fuel parameters, including canopy height, canopy bulk density and proportion of dead canopy fuel, from airborne lidar and hyperspectral data. Remote sensing data were used in conjunction with detailed field data on forest parameters and surface reflectance measurements. A new method was developed for retrieving Digital Surface Models (DSM) and Digital Canopy Models (DCM) from first-return lidar data. Validation data on individual tree heights demonstrated the high accuracy (r² = 0.95) of the DCMs developed via this new algorithm. Lidar-derived DCMs were used to estimate critical crown fire parameters, including available canopy fuel, canopy height and canopy bulk density, with linear regression model r² values ranging from 0.75 to 0.85. Hyperspectral data were used in conjunction with Spectral Mixture Analysis (SMA) to assess fuel quality in the form of live versus dead canopy proportions. Severity and stage of insect-caused forest mortality were estimated using the fractional abundance of green vegetation, non-photosynthetic vegetation and shade obtained from SMA. Proportion of insect attack was estimated with a linear model producing an r² of 0.6 using SMA and bark endmembers from image and reference libraries. Fraction of red attack, with a possible link to increased crown fire risk, was estimated with an r² of 0.45.
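Two of the steps above have a simple computational core: a canopy model derived as first-return elevation minus ground elevation, and an ordinary least-squares fit of a field-measured fuel parameter against a lidar height metric. The numbers below are invented for illustration, not taken from the dissertation.

```python
# (1) Digital canopy model: first-return surface minus ground elevation.
first_return = [212.0, 215.5, 220.3, 209.8]      # metres
ground       = [200.0, 201.5, 202.3, 200.8]
dcm = [fr - g for fr, g in zip(first_return, ground)]   # canopy heights

# (2) OLS regression of canopy bulk density (kg/m^3) on mean lidar
# canopy height per plot (synthetic, perfectly linear for clarity).
heights = [8.0, 10.0, 12.0, 14.0]
cbd     = [0.06, 0.08, 0.10, 0.12]

def ols(x, y):
    """Slope, intercept and r^2 of a simple linear regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

slope, intercept, r2 = ols(heights, cbd)
```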

  15. Real time method and computer system for identifying radioactive materials from HPGe gamma-ray spectroscopy

    DOEpatents

    Rowland, Mark S.; Howard, Douglas E.; Wong, James L.; Jessup, James L.; Bianchini, Greg M.; Miller, Wayne O.

    2007-10-23

    A real-time method and computer system for identifying radioactive materials which collects gamma count rates from a HPGe gamma-radiation detector to produce a high-resolution gamma-ray energy spectrum. A library of nuclear material definitions ("library definitions") is provided, with each uniquely associated with a nuclide or isotope material and each comprising at least one logic condition associated with a spectral parameter of a gamma-ray energy spectrum. The method determines whether the spectral parameters of said high-resolution gamma-ray energy spectrum satisfy all the logic conditions of any one of the library definitions, and subsequently uniquely identifies the material type as that nuclide or isotope material associated with the satisfied library definition. The method is iteratively repeated to update the spectrum and identification in real time.
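The matching step the patent describes (every logic condition of a library definition must hold before a material is identified) can be sketched directly. The definitions below are hypothetical stand-ins, not entries from the actual library.

```python
# Each "library definition" is a list of logic conditions on spectral
# parameters; a nuclide is identified only if ALL conditions hold.
library_definitions = {
    "Cs-137": [
        lambda s: any(abs(e - 661.7) < 1.0 for e in s["peaks_keV"]),
        lambda s: s["count_rate"] > 10.0,
    ],
    "Co-60": [
        lambda s: any(abs(e - 1173.2) < 1.0 for e in s["peaks_keV"]),
        lambda s: any(abs(e - 1332.5) < 1.0 for e in s["peaks_keV"]),
    ],
}

def identify(spectrum):
    """Return all materials whose every logic condition is satisfied."""
    return [name for name, conds in library_definitions.items()
            if all(cond(spectrum) for cond in conds)]

matches = identify({"peaks_keV": [661.6, 32.1], "count_rate": 250.0})
```

In the real system this check is re-run as the spectrum accumulates, so the identification updates in real time.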

  16. ELSI: A unified software interface for Kohn–Sham electronic structure solvers

    DOE PAGES

    Yu, Victor Wen-zhe; Corsetti, Fabiano; Garcia, Alberto; ...

    2017-09-15

    Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. As a result, comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.
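The generalized eigenproblem H c = ε S c that ELSI routes to its solvers is typically reduced to standard form via a Cholesky factorization of the overlap matrix, as dense solvers such as ELPA do internally. Here is a 2×2 toy sketch of that reduction; the matrices are invented, not a real Hamiltonian or overlap.

```python
import math

# Toy generalized eigenproblem H c = eps S c.
H = [[1.0, 0.2], [0.2, 2.0]]
S = [[1.0, 0.1], [0.1, 1.0]]

# Cholesky factorization S = L L^T (2x2 case).
l11 = math.sqrt(S[0][0])
l21 = S[1][0] / l11
l22 = math.sqrt(S[1][1] - l21 ** 2)

# A = L^-1 H L^-T is symmetric and shares eigenvalues with the
# generalized problem; form it explicitly for 2x2.
inv = [[1 / l11, 0.0], [-l21 / (l11 * l22), 1 / l22]]
tmp = [[sum(inv[i][k] * H[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
A = [[sum(tmp[i][k] * inv[j][k] for k in range(2)) for j in range(2)]
     for i in range(2)]

# Closed-form eigenvalues of a symmetric 2x2 matrix.
mean = (A[0][0] + A[1][1]) / 2
rad = math.sqrt(((A[0][0] - A[1][1]) / 2) ** 2 + A[0][1] ** 2)
eigenvalues = [mean - rad, mean + rad]
```

Each eigenvalue satisfies det(H - εS) = 0, which is what makes the reduction equivalent to the original generalized problem.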

  17. ELSI: A unified software interface for Kohn-Sham electronic structure solvers

    NASA Astrophysics Data System (ADS)

    Yu, Victor Wen-zhe; Corsetti, Fabiano; García, Alberto; Huhn, William P.; Jacquelin, Mathias; Jia, Weile; Lange, Björn; Lin, Lin; Lu, Jianfeng; Mi, Wenhui; Seifitokaldani, Ali; Vázquez-Mayagoitia, Álvaro; Yang, Chao; Yang, Haizhao; Blum, Volker

    2018-01-01

    Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. Comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.

  18. StarSmasher: Smoothed Particle Hydrodynamics code for smashing stars and planets

    NASA Astrophysics Data System (ADS)

    Gaburov, Evghenii; Lombardi, James C., Jr.; Portegies Zwart, Simon; Rasio, F. A.

    2018-05-01

    Smoothed Particle Hydrodynamics (SPH) is a Lagrangian particle method that approximates a continuous fluid as discrete nodes, each carrying various parameters such as mass, position, velocity, pressure, and temperature. In an SPH simulation the resolution scales with the particle density; StarSmasher is able to handle both equal-mass and equal number-density particle models. StarSmasher solves for hydro forces by calculating the pressure for each particle as a function of the particle's properties - density, internal energy, and internal properties (e.g. temperature and mean molecular weight). The code implements variational equations of motion and libraries to calculate the gravitational forces between particles using direct summation on NVIDIA graphics cards. Using a direct summation instead of a tree-based algorithm for gravity increases the accuracy of the gravity calculations at the cost of speed. The code uses a cubic spline for the smoothing kernel and an artificial viscosity prescription coupled with a Balsara Switch to prevent unphysical interparticle penetration. The code also implements an artificial relaxation force to the equations of motion to add a drag term to the calculated accelerations during relaxation integrations. Initially called StarCrash, StarSmasher was developed originally by Rasio.
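The abstract notes that the code uses a cubic spline smoothing kernel. Below is the common Monaghan form of that kernel (3-D normalization 1/π) and the standard SPH density sum; StarSmasher's exact conventions may differ, so treat this as a generic sketch rather than the code's implementation.

```python
import math

# Cubic spline smoothing kernel (Monaghan form, 3-D normalization).
def w_cubic_spline(r, h):
    q = r / h
    sigma = 1.0 / (math.pi * h ** 3)
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q ** 2 + 0.75 * q ** 3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q) ** 3
    return 0.0   # compact support: zero beyond 2h

# SPH density estimate at particle i: rho_i = sum_j m_j W(|r_i - r_j|, h)
def density(i, positions, masses, h):
    xi, yi, zi = positions[i]
    rho = 0.0
    for (x, y, z), m in zip(positions, masses):
        r = math.sqrt((x - xi) ** 2 + (y - yi) ** 2 + (z - zi) ** 2)
        rho += m * w_cubic_spline(r, h)
    return rho
```

The compact support (zero beyond 2h) is what makes the resolution scale with local particle density, as the abstract states.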

  19. ELSI: A unified software interface for Kohn–Sham electronic structure solvers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Victor Wen-zhe; Corsetti, Fabiano; Garcia, Alberto

    Solving the electronic structure from a generalized or standard eigenproblem is often the bottleneck in large scale calculations based on Kohn-Sham density-functional theory. This problem must be addressed by essentially all current electronic structure codes, based on similar matrix expressions, and by high-performance computation. We here present a unified software interface, ELSI, to access different strategies that address the Kohn-Sham eigenvalue problem. Currently supported algorithms include the dense generalized eigensolver library ELPA, the orbital minimization method implemented in libOMM, and the pole expansion and selected inversion (PEXSI) approach with lower computational complexity for semilocal density functionals. The ELSI interface aims to simplify the implementation and optimal use of the different strategies, by offering (a) a unified software framework designed for the electronic structure solvers in Kohn-Sham density-functional theory; (b) reasonable default parameters for a chosen solver; (c) automatic conversion between input and internal working matrix formats, and in the future (d) recommendation of the optimal solver depending on the specific problem. As a result, comparative benchmarks are shown for system sizes up to 11,520 atoms (172,800 basis functions) on distributed memory supercomputing architectures.

  20. Material Identification and Quantification in Spectral X-ray Micro-CT

    NASA Astrophysics Data System (ADS)

    Holmes, Thomas Wesley

    The identification and quantification of all the voxels within a reconstructed microCT image was made possible by comparing the attenuation profile of an unknown voxel with precalculated signatures of known materials. This was accomplished through simulations with the MCNP6 general-purpose radiation-transport package, which modeled a CdTe detector array consisting of 200 elements able to differentiate between 100 separate energy bins over the entire range of the emitted 110 kVp tungsten x-ray spectrum. The information from each of the separate energy bins was then used to create a single reconstructed image, which was grouped back together to produce a final image in which each voxel had a corresponding attenuation profile. A library of known attenuation profiles was created for each of the materials expected to be within an object with otherwise unknown parameters. A least squares analysis was performed, and comparisons were then made between each voxel's attenuation profile in the unknown object and each possible combination of library attenuation profiles. Combinations failing predetermined thresholds were removed. Of the remaining combinations, a voting system based on statistical evaluations of the fits selected the material combination most appropriate to the unknown input voxel. This was performed over all of the voxels in the reconstructed image, and a final material map was produced. The materials at these locations were then quantified by fitting an equation to the response from several different densities of the same material and recording the response of the base library. This entire process, called the All Combinations Library Least Squares (ACLLS) analysis, was used to test several different models. 
These models investigated a range of densities for the x-ray contrast agents gold and gadolinium, which are used in many medical applications, as well as a range of bone densities to test the ability of ACLLS to be used for bone density estimation. A final test used a model with five different materials present within the object and consisted of two features with mixtures of three materials: one with gold, iodine and water, and another with gadolinium, iodine and water. The remaining four features were all mixtures of water with bone, gold, gadolinium, and iodine. All of the various material mixtures were successfully identified and quantified using the ACLLS analysis package within an acceptable statistical range. The ACLLS method has proven to be a viable analysis tool for determining both the physical locations and the amounts of all the materials present within a given object. This tool could be implemented in the future to assist medical practitioners in diagnosing a subject by reducing ambiguities in an image and providing a quantifiable solution for every voxel.
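The core library least-squares step can be illustrated with a toy two-material case: express an unknown voxel's attenuation profile as a weighted sum of known material profiles and solve the normal equations. The profiles below are invented, not MCNP6-derived signatures.

```python
# Known library attenuation profiles vs. energy bin (arbitrary units);
# invented for illustration.
gold  = [5.0, 3.0, 2.0, 1.5]
water = [1.0, 0.8, 0.6, 0.5]

# Synthetic "unknown" voxel built as 0.3*gold + 2.0*water.
unknown = [0.3 * g + 2.0 * w for g, w in zip(gold, water)]

def solve_two_material(p1, p2, y):
    """Solve the 2x2 least-squares normal equations for the weights of
    two library profiles that best reproduce profile y."""
    a = sum(x * x for x in p1)
    b = sum(x * z for x, z in zip(p1, p2))
    c = sum(x * x for x in p2)
    d1 = sum(x * z for x, z in zip(p1, y))
    d2 = sum(x * z for x, z in zip(p2, y))
    det = a * c - b * b
    return (c * d1 - b * d2) / det, (a * d2 - b * d1) / det

w_gold, w_water = solve_two_material(gold, water, unknown)
```

The full ACLLS method repeats this fit for every candidate combination of library profiles and then votes on the statistically best one; the sketch shows only a single fit.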

  1. Parameterizable Library Components for SAW Devices

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2006-01-01

    To facilitate quick fabrication of Surface Acoustic Wave (SAW) sensors we have found it necessary to develop a library of parameterizable components. This library is the first module in our strategy towards a design tool that is integrated into existing Electronic Design Automation (EDA) tools. This library is similar to the standard cell libraries found in digital design packages. The library cells allow the user to input the design parameters which automatically generate a detailed layout of the SAW component. This paper presents the results of our development of parameterizable cells for an InterDigitated Transducer (IDT), reflector, SAW delay line, and both one and two port resonators.
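A parameterizable IDT cell ultimately reduces to geometry generated from a few design parameters. The hypothetical sketch below derives finger pitch from the SAW velocity and design frequency (pitch = λ/2, 50% metallization); it is not the paper's actual cell generator, and the velocity value is merely representative of a YZ-LiNbO3-like substrate.

```python
# Hypothetical parameterized IDT cell: emit (x-position, width) for each
# finger from SAW velocity, center frequency, and finger count.
def idt_fingers(v_saw_m_s, f0_hz, n_fingers):
    wavelength = v_saw_m_s / f0_hz
    pitch = wavelength / 2.0        # one finger per half wavelength
    width = pitch / 2.0             # 50% metallization ratio
    return [(i * pitch, width) for i in range(n_fingers)], wavelength

fingers, lam = idt_fingers(3158.0, 100e6, 4)   # assumed substrate velocity
```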

  2. Online plasma calculator

    NASA Astrophysics Data System (ADS)

    Wisniewski, H.; Gourdain, P.-A.

    2017-10-01

    APOLLO is an online, Linux-based plasma calculator. Users can input variables that correspond to their specific plasma, such as ion and electron densities, temperatures, and external magnetic fields. The system is based on a web server where a FastCGI protocol computes key plasma parameters, including frequencies, lengths, velocities, and dimensionless numbers. FastCGI was chosen to overcome security problems caused by Java-based plugins; it also speeds up calculations over PHP-based systems. APOLLO is built upon the Wt library, which turns any web browser into a versatile, fast graphical user interface. All values with units are expressed in SI units except temperature, which is in electron-volts. SI units were chosen over cgs units because of the gradual shift to SI units within the plasma community. APOLLO is intended to be a fast calculator that also provides the user with the equations used to calculate the plasma parameters. The system is intended for undergraduates taking plasma courses as well as graduate students and researchers who need a quick reference calculation.
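Two of the quantities such a calculator reports can be written directly in SI units with temperature in eV, matching the convention described above. The density and temperature inputs below are merely typical laboratory-plasma values, not APOLLO output.

```python
import math

# CODATA constants (SI).
E0 = 8.8541878128e-12    # vacuum permittivity (F/m)
QE = 1.602176634e-19     # elementary charge (C)
ME = 9.1093837015e-31    # electron mass (kg)

def electron_plasma_frequency(n_e):
    """Angular electron plasma frequency (rad/s); n_e in m^-3."""
    return math.sqrt(n_e * QE ** 2 / (E0 * ME))

def debye_length(n_e, t_e_eV):
    """Electron Debye length (m); n_e in m^-3, T_e in eV (k_B*T = T_eV*e)."""
    return math.sqrt(E0 * t_e_eV / (n_e * QE))

w_pe = electron_plasma_frequency(1.0e18)   # ~5.6e10 rad/s
l_d  = debye_length(1.0e18, 10.0)          # ~2.4e-5 m
```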

  3. An implementation of the NiftyRec medical imaging library for PIXE-tomography reconstruction

    NASA Astrophysics Data System (ADS)

    Michelet, C.; Barberet, P.; Desbarats, P.; Giovannelli, J.-F.; Schou, C.; Chebil, I.; Delville, M.-H.; Gordillo, N.; Beasley, D. G.; Devès, G.; Moretto, P.; Seznec, H.

    2017-08-01

    A new development of the TomoRebuild software package is presented, including "thick sample" corrections for nonlinear X-ray production (NLXP) and X-ray absorption (XA). As in the previous versions, C++ with standard libraries was used for easier portability. Data reduction requires different steps, which may be run either from a command line instruction or via a user-friendly interface developed as a portable Java plugin in ImageJ. All experimental and reconstruction parameters can be easily modified, either directly in the ASCII parameter files or via the ImageJ interface. A detailed user guide in English is provided. Sinograms and final reconstructed images are generated in common binary formats that can be read by most public-domain graphics software. New MLEM and OSEM methods are proposed, using optimized methods from the NiftyRec medical imaging library. An overview of the different medical imaging methods that have been used for ion beam microtomography applications is presented. In TomoRebuild, PIXET data reduction is performed for each chemical element independently and separately from STIMT, except for two steps where the fusion of STIMT and PIXET data is required: the calculation of the correction matrix and the normalization of PIXET data to obtain mass fraction distributions. Correction matrices for NLXP and XA are calculated using procedures extracted from the DISRA code, taking into account a large X-ray detection solid angle. For this, the 3D STIMT mass density distribution is used, assuming a homogeneous global composition. A first example of a PIXET experiment using two detectors is presented. Reconstruction results are compared and found to be in good agreement between different codes: FBP, NiftyRec MLEM and OSEM of the TomoRebuild software package, the original DISRA, its accelerated version provided in JPIXET, and the accelerated MLEM version of JPIXET, with or without correction.
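The MLEM algorithm mentioned above has a compact multiplicative update, x ← x · Aᵀ(y/Ax) / Aᵀ1. The toy below runs it on an invented 3-measurement, 2-voxel system; a real sinogram geometry is of course much larger, and this is not the NiftyRec implementation.

```python
# Toy MLEM iteration: x <- x * A^T(y / Ax) / A^T 1.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # 3 measurements, 2 voxels
x_true = [2.0, 3.0]
y = [sum(aij * xj for aij, xj in zip(row, x_true)) for row in A]

x = [1.0, 1.0]                                        # uniform start
sens = [sum(row[j] for row in A) for j in range(2)]   # A^T 1
for _ in range(200):
    proj = [sum(aij * xj for aij, xj in zip(row, x)) for row in A]
    ratio = [yi / pi for yi, pi in zip(y, proj)]
    back = [sum(A[i][j] * ratio[i] for i in range(3)) for j in range(2)]
    x = [xj * bj / sj for xj, bj, sj in zip(x, back, sens)]
```

With noiseless, consistent data the iterates converge to the true voxel values; OSEM accelerates this by applying the same update over subsets of the measurements.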

  4. Chemical Space of DNA-Encoded Libraries.

    PubMed

    Franzini, Raphael M; Randolph, Cassie

    2016-07-28

    In recent years, DNA-encoded chemical libraries (DECLs) have attracted considerable attention as a potential discovery tool in drug development. Screening encoded libraries may offer advantages over conventional hit discovery approaches and has the potential to complement such methods in pharmaceutical research. As a result of the increased application of encoded libraries in drug discovery, a growing number of hit compounds are emerging in scientific literature. In this review we evaluate reported encoded library-derived structures and identify general trends of these compounds in relation to library design parameters. We in particular emphasize the combinatorial nature of these libraries. Generally, the reported molecules demonstrate the ability of this technology to afford hits suitable for further lead development, and on the basis of them, we derive guidelines for DECL design.

  5. Increasing leaf vein density by mutagenesis: laying the foundations for C4 rice.

    PubMed

    Feldman, Aryo B; Murchie, Erik H; Leung, Hei; Baraoidan, Marietta; Coe, Robert; Yu, Su-May; Lo, Shuen-Fang; Quick, William P

    2014-01-01

    A high leaf vein density is both an essential feature of C4 photosynthesis and a foundation trait to C4 evolution, ensuring the optimal proportion and proximity of mesophyll and bundle sheath cells for permitting the rapid exchange of photosynthates. Two rice mutant populations, a deletion mutant library with a cv. IR64 background (12,470 lines) and a T-DNA insertion mutant library with a cv. Tainung 67 background (10,830 lines), were screened for increases in vein density. A high throughput method with handheld microscopes was developed and its accuracy was supported by more rigorous microscopy analysis. Eight lines with significantly increased leaf vein densities were identified to be used as genetic stock for the global C4 Rice Consortium. The candidate population was shown to include both shared and independent mutations and so more than one gene controlled the high vein density phenotype. The high vein density trait was found to be linked to a narrow leaf width trait but the linkage was incomplete. The more genetically robust narrow leaf width trait was proposed to be used as a reliable phenotypic marker for finding high vein density variants in rice in future screens.

  6. Indoor air pollution and preventions in college libraries

    NASA Astrophysics Data System (ADS)

    Yang, Zengzhang

    2017-05-01

    The college library is a place with a comparatively high density of students, who often stay in it for long periods. The indoor air quality therefore directly affects the reading effectiveness and physical health of teachers and students in colleges and universities. This paper analyzes the factors influencing indoor air pollution in the library from six aspects: selecting green, environmentally friendly decorating materials and furniture; maintaining good ventilation; reducing electromagnetic radiation; regular disinfection; indoor greenery; and strengthening awareness of health and environmental protection. It puts forward ideas for the prevention of indoor air pollution and the construction of a green, low-carbon library.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sin, M.; Capote, R.; Herman, M. W.

    Comprehensive calculations of cross sections for neutron-induced reactions on 232–237U targets are performed in this paper in the 10 keV–30 MeV incident energy range with the code EMPIRE-3.2 Malta. The advanced modelling and consistent calculation scheme are aimed at improving our knowledge of the neutron scattering and emission cross sections, and at assessing the consistency of available evaluated libraries for light uranium isotopes. The reaction model considers a dispersive optical potential (RIPL 2408) that couples from five (even targets) to nine (odd targets) levels of the ground-state rotational band, and a triple-humped fission barrier with absorption in the wells described within the optical model for fission. A modified Lorentzian model (MLO) of the radiative strength function and Enhanced Generalized Superfluid Model nuclear level densities are used in Hauser-Feshbach calculations of the compound-nuclear decay that include width fluctuation corrections. The starting values for the model parameters are retrieved from RIPL. Excellent agreement with available experimental data for neutron emission and fission is achieved, giving confidence that the quantities for which there is no experimental information are also accurately predicted. Finally, deficiencies in existing evaluated libraries are highlighted.

  8. Development and validation of the European Cluster Assimilation Techniques run libraries

    NASA Astrophysics Data System (ADS)

    Facskó, G.; Gordeev, E.; Palmroth, M.; Honkonen, I.; Janhunen, P.; Sergeev, V.; Kauristie, K.; Milan, S.

    2012-04-01

    The European Commission funded the European Cluster Assimilation Techniques (ECLAT) project as a collaboration of five leading European universities and research institutes. A main contribution of the Finnish Meteorological Institute (FMI) is to provide a wide range of global MHD runs with the Grand Unified Magnetosphere Ionosphere Coupling simulation (GUMICS). The runs are divided into two categories: synthetic runs investigating the extent of solar wind drivers that can influence magnetospheric dynamics, and dynamic runs using measured solar wind data as input. Here we consider the first set of runs, with synthetic solar wind input. The solar wind density, velocity and interplanetary magnetic field had different magnitudes and orientations; furthermore, two F10.7 flux values were selected for solar radiation minimum and maximum. The solar wind parameter values were held constant so that a stable steady-state solution was achieved. All configurations were run several times with three different tilt angles (-15°, 0°, +15°) in the GSE X-Z plane. The results of the 192 simulations, forming the so-called "synthetic run library", were visualized and uploaded to the homepage of the FMI after validation. Here we present details of these runs.

  9. Characterization of Three Maize Bacterial Artificial Chromosome Libraries toward Anchoring of the Physical Map to the Genetic Map Using High-Density Bacterial Artificial Chromosome Filter Hybridization1

    PubMed Central

    Yim, Young-Sun; Davis, Georgia L.; Duru, Ngozi A.; Musket, Theresa A.; Linton, Eric W.; Messing, Joachim W.; McMullen, Michael D.; Soderlund, Carol A.; Polacco, Mary L.; Gardiner, Jack M.; Coe, Edward H.

    2002-01-01

    Three maize (Zea mays) bacterial artificial chromosome (BAC) libraries were constructed from inbred line B73. High-density filter sets from all three libraries, made using different restriction enzymes (HindIII, EcoRI, and MboI, respectively), were evaluated with a set of complex probes including the 185-bp knob repeat, ribosomal DNA, two telomere-associated repeat sequences, four centromere repeats, the mitochondrial genome, a multifragment chloroplast DNA probe, and bacteriophage λ. The results indicate that the libraries are of high quality, with low contamination by organellar and λ sequences. The use of libraries from multiple enzymes increased the chance of recovering each region of the genome. Ninety maize restriction fragment-length polymorphism core markers were hybridized to filters of the HindIII library, representing 6× coverage of the genome, to initiate development of a framework for anchoring BAC contigs to the intermated B73 × Mo17 genetic map and to mark the bin boundaries on the physical map. All of the clones used as hybridization probes detected at least three BACs. Twenty-two single-copy-number core markers identified an average of 7.4 ± 3.3 positive clones, consistent with the expectation of six clones. This information was integrated into fingerprinting data generated by the Arizona Genomics Institute to assemble the BAC contigs using fingerprint contigs and contributed to the process of physical map construction. PMID:12481051
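The "expectation of six clones" above follows from a standard Poisson model of library coverage: at 6× coverage a single-copy probe hits on average six clones, and the Clarke-Carbon formula gives the chance a locus is absent entirely. This is a generic coverage calculation, not data from the study.

```python
import math

coverage = 6.0                      # genome equivalents in the library
expected_hits = coverage            # mean of Poisson(coverage)
p_missing = math.exp(-coverage)     # Clarke-Carbon: locus unrepresented

def poisson_pmf(k, lam):
    """Probability of observing k positive clones for a single-copy probe."""
    return lam ** k * math.exp(-lam) / math.factorial(k)
```

At 6× coverage p_missing is about 0.25%, consistent with the observed 7.4 ± 3.3 positives per single-copy marker scattering around the mean of six.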

  10. Alzheimer's Disease Diagnosis in Individual Subjects using Structural MR Images: Validation Studies

    PubMed Central

    Vemuri, Prashanthi; Gunter, Jeffrey L.; Senjem, Matthew L.; Whitwell, Jennifer L.; Kantarci, Kejal; Knopman, David S.; Boeve, Bradley F.; Petersen, Ronald C.; Jack, Clifford R.

    2008-01-01

    OBJECTIVE To develop and validate a tool for Alzheimer's disease (AD) diagnosis in individual subjects using support vector machine (SVM) based classification of structural MR (sMR) images. BACKGROUND Libraries of sMR scans of clinically well-characterized subjects can be harnessed for diagnosing new incoming subjects. METHODS 190 patients with probable AD were age- and gender-matched with 190 cognitively normal (CN) subjects. Three classification models were implemented: Model I uses tissue densities obtained from sMR scans to give a STructural Abnormality iNDex (STAND) score; Models II and III use tissue densities as well as covariates (demographics and apolipoprotein E genotype) to give an adjusted STAND (aSTAND) score. Data from 140 AD and 140 CN subjects were used for training. SVM parameter optimization and training were done by four-fold cross-validation (CV). The remaining independent sample of 50 AD and 50 CN subjects was used to obtain a minimally biased estimate of the generalization error of the algorithm. RESULTS The CV accuracy of the Model II and Model III aSTAND scores was 88.5% and 89.3%, respectively, and the developed models generalized well on the independent test datasets. The anatomic patterns best differentiating the groups were consistent with the known distribution of neurofibrillary AD pathology. CONCLUSIONS This paper presents preliminary evidence that SVM-based classification of an individual sMR scan relative to a library of scans can provide useful information for diagnosis of AD in individual subjects. Including demographic and genetic information in the classification algorithm slightly improves diagnostic accuracy. PMID:18054253
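The four-fold cross-validation protocol described above can be sketched as a plain index partition; the interleaved fold assignment below is an assumption, and the SVM itself is not reproduced.

```python
def k_fold_indices(n_samples, k):
    """Partition sample indices into k disjoint folds; each fold serves
    once as the validation set while the remaining folds train the model."""
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    splits = []
    for i in range(k):
        val = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        splits.append((train, val))
    return splits

# 280 training subjects (140 AD + 140 CN), four folds as in the abstract
splits = k_fold_indices(280, 4)
assert all(len(val) == 70 for _, val in splits)
```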

  11. Effects of the ECVAM chemical validation library on differentiation using marker gene expression in mouse embryonic stem cells

    EPA Science Inventory

    The adherent cell differentiation and cytotoxicity (ACDC) assay was used to profile the effects of the ECVAM EST validation chemical library (19 compounds) on J1 mouse embryonic stem cells (mESC). PCR-based TaqMan Low Density Arrays (TLDA) provided a high-content assessment of al...

  12. Visualization Based Data Mining for Comparison Between Two Solar Cell Libraries.

    PubMed

    Yosipof, Abraham; Kaspi, Omer; Majhi, Koushik; Senderowitz, Hanoch

    2016-12-01

    Material informatics may provide meaningful insights and powerful predictions for the development of new and efficient metal oxide (MO) based solar cells. The main objective of this paper is to establish the usefulness of data reduction and visualization methods for analyzing data sets emerging from multiple all-MO solar cell libraries. For this purpose, two libraries, TiO2|Co3O4 and TiO2|Co3O4|MoO3, differing only by the presence of a MoO3 layer in the latter, were analyzed with Principal Component Analysis and Self-Organizing Maps. Both analyses suggest that the addition of the MoO3 layer to the TiO2|Co3O4 library affected the overall photovoltaic (PV) activity profile of the solar cells, making the two libraries clearly distinguishable from one another. Furthermore, while MoO3 had an overall favorable effect on PV parameters, a sub-population of cells was identified that was either indifferent to its presence or even demonstrated a reduction in several parameters.

  13. SAR target recognition using behaviour library of different shapes in different incidence angles and polarisations

    NASA Astrophysics Data System (ADS)

    Fallahpour, Mojtaba Behzad; Dehghani, Hamid; Jabbar Rashidi, Ali; Sheikhi, Abbas

    2018-05-01

    Target recognition is one of the most important issues in the interpretation of synthetic aperture radar (SAR) images. Modelling, analysis, and recognition of the effects of influential parameters in SAR can provide a better understanding of SAR imaging systems and therefore facilitate the interpretation of the produced images. Influential parameters in SAR images can be divided into five general categories: radar, radar platform, channel, imaging region, and processing section, each of which has different physical, structural, hardware, and software sub-parameters with clear roles in the final images. In this paper, for the first time, a behaviour library is extracted that includes the effects of polarisation, incidence angle, and target shape, as radar and imaging-region sub-parameters, on SAR images. This library shows that the pattern created by each of the cylindrical, conical, and cubic shapes is unique, and owing to these unique properties such shapes can be recognised in SAR images. The capability is applied to data acquired with the Canadian RADARSAT-1 satellite.

  14. A comparison of different functions for predicted protein model quality assessment.

    PubMed

    Li, Juan; Fang, Huisheng

    2016-07-01

    In protein structure prediction, a considerable number of models are usually produced by either the template-based method (TBM) or ab initio prediction. The purpose of this study is to find the critical parameter in assessing the quality of the predicted models. A non-redundant template library was developed and 138 target sequences were modeled. The target sequences were all distant from the proteins in the template library and were aligned with template library proteins on the basis of the transformation matrix. The quality of each model was first assessed with QMEAN and its six parameters: C_β interaction energy (C_beta), all-atom pairwise energy (PE), solvation energy (SE), torsion angle energy (TAE), secondary structure agreement (SSA), and solvent accessibility agreement (SAE). Finally, the alignment score (score) was also used to assess the quality of the model. Hence, a total of eight parameters (QMEAN, C_beta, PE, SE, TAE, SSA, SAE, score) were independently used to assess the quality of each model. The results indicate that SSA is the best parameter for estimating the quality of a model.

  15. Random sampling and validation of covariance matrices of resonance parameters

    NASA Astrophysics Data System (ADS)

    Plevnik, Lucijan; Zerovnik, Gašper

    2017-09-01

    Analytically exact methods for random sampling of arbitrarily correlated parameters are presented. Emphasis is given, on the one hand, to possible inconsistencies in the covariance data, concentrating on positive semi-definiteness and on consistent sampling of correlated, inherently positive parameters, and, on the other hand, to optimization of the implementation of the methods themselves. The methods have been applied in the program ENDSAM, written in Fortran, which takes the file of a chosen isotope in ENDF-6 format from a nuclear data library and produces an arbitrary number of new ENDF-6 files containing random samples of the resonance parameters (drawn in accordance with the corresponding covariance matrices) in place of the original values. The source code for ENDSAM is available from the OECD/NEA Data Bank. The program works in the following steps: it reads resonance parameters and their covariance data from the nuclear data library, checks whether the covariance data are consistent, and produces random samples of the resonance parameters. The code has been validated with both realistic and artificial data to show that the produced samples are statistically consistent. Additionally, the code was used to validate covariance data in existing nuclear data libraries. A list of inconsistencies observed in the covariance data of resonance parameters in ENDF/B-VII.1, JEFF-3.2 and JENDL-4.0 is presented. For now, the work has been limited to resonance parameters; however, the methods presented are general and can in principle be extended to the sampling and validation of any nuclear data.
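The two core operations, checking positive definiteness of a covariance matrix and drawing correlated Gaussian samples from it, can be sketched via a Cholesky factorization. This is a minimal illustration, not the ENDSAM code, and the parameter values are invented.

```python
import math, random

def cholesky(cov):
    """Cholesky factor L with cov = L L^T; raises ValueError if the
    matrix is not positive definite (a basic consistency check)."""
    n = len(cov)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = cov[i][i] - s
                if d <= 0.0:
                    raise ValueError("covariance matrix not positive definite")
                L[i][j] = math.sqrt(d)
            else:
                L[i][j] = (cov[i][j] - s) / L[j][j]
    return L

def sample(mean, cov, rng=random):
    """One correlated Gaussian sample: x = mean + L z, z ~ N(0, I)."""
    L = cholesky(cov)
    z = [rng.gauss(0.0, 1.0) for _ in mean]
    return [m + sum(L[i][k] * z[k] for k in range(len(z)))
            for i, m in enumerate(mean)]

# Two correlated "resonance parameters" (illustrative values only):
# 20% and 15% relative uncertainty with correlation 0.3
mean = [1.0, 2.0]
cov = [[0.04, 0.018], [0.018, 0.09]]
print(sample(mean, cov))
```

For inherently positive parameters, the same machinery is commonly applied to the logarithms of the parameters, which guarantees positive samples.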

  16. Hypercluster parallel processing library user's manual

    NASA Technical Reports Server (NTRS)

    Quealy, Angela

    1990-01-01

    This User's Manual describes the Hypercluster Parallel Processing Library, composed of FORTRAN-callable subroutines which enable a FORTRAN programmer to manipulate and transfer information throughout the Hypercluster at NASA Lewis Research Center. Each subroutine and its parameters are described in detail. A simple heat flow application using Laplace's equation is included to demonstrate the use of some of the library's subroutines. The manual can be used initially as an introduction to the parallel features provided by the library. Thereafter it can be used as a reference when programming an application.
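The heat-flow demonstration mentioned above solves Laplace's equation. A serial Jacobi-iteration sketch of the same problem is shown below; the Hypercluster subroutines themselves are not reproduced, since their API is not given here.

```python
def jacobi_laplace(nx, ny, top=100.0, other=0.0, iters=2000):
    """Solve Laplace's equation on a rectangular plate with fixed boundary
    temperatures (heated top edge) by Jacobi iteration: each interior
    point is repeatedly replaced by the average of its four neighbours."""
    T = [[other] * nx for _ in range(ny)]
    T[0] = [top] * nx                      # heated top edge
    for _ in range(iters):
        new = [row[:] for row in T]
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                new[i][j] = 0.25 * (T[i-1][j] + T[i+1][j] +
                                    T[i][j-1] + T[i][j+1])
        T = new
    return T

T = jacobi_laplace(10, 10)
# By the maximum principle, all temperatures lie between the boundary extremes
assert all(0.0 <= T[i][j] <= 100.0 for i in range(10) for j in range(10))
```

In the parallel version described by the manual, each processor would hold a strip of the grid and exchange boundary rows with its neighbours each iteration.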

  17. Archimedes' principle for characterisation of recombinant whole cell biocatalysts.

    PubMed

    Schmitt, Steven; Walser, Marcel; Rehmann, Michael; Oesterle, Sabine; Panke, Sven; Held, Martin

    2018-02-14

    The ability of whole cells to catalyse multistep reactions, often yielding synthetically demanding compounds later used by industrial biotech or pharma, makes them an indispensable tool of synthetic chemistry. The complex reaction network employed by cellular catalysts and the still only moderate predictive power of modelling approaches leave this tool challenging to engineer. Frequently, large libraries of semi-rationally generated variants are sampled in high-throughput mode in order to then identify improved catalysts. We present a method for space- and time-efficient processing of very large libraries (10⁷) of recombinant cellular catalysts, in which the phenotypic characterisation and the isolation of positive variants for the entire library are done within one minute in a single, highly parallelized operation. Specifically, product formation in nanolitre-sized cultivation vessels is sensed and translated into the formation of catalase as a reporter protein. Exposure to hydrogen peroxide leads to oxygen gas formation and thus to a density shift of the cultivation vessel. Exploiting Archimedes' principle, this density shift and the resulting upward buoyancy force can be used for batch-wise library sampling. We demonstrate the potential of the method for both screening and selection protocols, and envision a wide applicability of the system for biosensor-based assays.

  18. Computer Simulation of the Circulation Subsystem of a Library

    ERIC Educational Resources Information Center

    Shaw, W. M., Jr.

    1975-01-01

    When circulation data are used as input parameters for a computer simulation of a library's circulation subsystem, the results of the simulation provide information on book availability and delays. The model may be used to simulate alternative loan policies. (Author/LS)

  19. Chemoinformatic Analysis of Combinatorial Libraries, Drugs, Natural Products and Molecular Libraries Small Molecule Repository

    PubMed Central

    Singh, Narender; Guha, Rajarshi; Giulianotti, Marc; Pinilla, Clemencia; Houghten, Richard; Medina-Franco, Jose L.

    2009-01-01

    A multiple criteria approach is presented, that is used to perform a comparative analysis of four recently developed combinatorial libraries to drugs, Molecular Libraries Small Molecule Repository (MLSMR) and natural products. The compound databases were assessed in terms of physicochemical properties, scaffolds and fingerprints. The approach enables the analysis of property space coverage, degree of overlap between collections, scaffold and structural diversity and overall structural novelty. The degree of overlap between combinatorial libraries and drugs was assessed using the R-NN curve methodology, which measures the density of chemical space around a query molecule embedded in the chemical space of a target collection. The combinatorial libraries studied in this work exhibit scaffolds that were not observed in the drug, MLSMR and natural products collections. The fingerprint-based comparisons indicate that these combinatorial libraries are structurally different to current drugs. The R-NN curve methodology revealed that a proportion of molecules in the combinatorial libraries are located within the property space of the drugs. However, the R-NN analysis also showed that there are a significant number of molecules in several combinatorial libraries that are located in sparse regions of the drug space. PMID:19301827
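An R-NN curve simply counts the members of a target collection that fall within an increasing radius R of a query molecule in descriptor space, as a proxy for local chemical-space density. A minimal sketch with hypothetical two-dimensional descriptors (the actual study used molecular fingerprints and properties):

```python
import math

def rnn_curve(query, collection, radii):
    """Number of collection members within distance R of the query,
    for each R in radii; a dense region yields an early, steep rise."""
    dists = [math.dist(query, x) for x in collection]
    return [sum(d <= r for d in dists) for r in radii]

# Hypothetical 2-D descriptor space standing in for a compound collection
library = [(0.1, 0.2), (0.3, 0.1), (1.5, 1.4), (2.0, 2.2), (0.2, 0.25)]
curve = rnn_curve((0.0, 0.0), library, radii=[0.5, 1.0, 2.0, 4.0])
print(curve)   # counts are non-decreasing in R
```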

  20. ZASPE: A Code to Measure Stellar Atmospheric Parameters and their Covariance from Spectra

    NASA Astrophysics Data System (ADS)

    Brahm, Rafael; Jordán, Andrés; Hartman, Joel; Bakos, Gáspár

    2017-05-01

    We describe the Zonal Atmospheric Stellar Parameters Estimator (zaspe), a new algorithm, and its associated code, for determining precise stellar atmospheric parameters and their uncertainties from high-resolution echelle spectra of FGK-type stars. zaspe estimates stellar atmospheric parameters by comparing the observed spectrum against a grid of synthetic spectra only in the most sensitive spectral zones to changes in the atmospheric parameters. Realistic uncertainties in the parameters are computed from the data itself, by taking into account the systematic mismatches between the observed spectrum and the best-fitting synthetic one. The covariances between the parameters are also estimated in the process. zaspe can in principle use any pre-calculated grid of synthetic spectra, but unbiased grids are required to obtain accurate parameters. We tested the performance of two existing libraries, and we concluded that neither is suitable for computing precise atmospheric parameters. We describe a process to synthesize a new library of synthetic spectra that was found to generate consistent results when compared with parameters obtained with different methods (interferometry, asteroseismology, equivalent widths).
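The core of such a zonal grid search, minimizing χ² against precomputed synthetic spectra only over the most parameter-sensitive pixels, can be sketched as follows. The grid values and "spectra" below are hypothetical, not zaspe's library.

```python
def chi_square(obs, model, zones):
    """Chi-square restricted to the most parameter-sensitive pixels."""
    return sum((obs[i] - model[i]) ** 2 for i in zones)

def best_grid_point(obs, grid, zones):
    """Return the grid parameter whose synthetic spectrum minimizes the
    zonal chi-square against the observed spectrum."""
    return min(grid, key=lambda p: chi_square(obs, grid[p], zones))

# Hypothetical 3-point Teff grid of 5-pixel "spectra"
grid = {
    5500: [1.0, 0.8, 0.6, 0.9, 1.0],
    5750: [1.0, 0.7, 0.5, 0.85, 1.0],
    6000: [1.0, 0.6, 0.4, 0.8, 1.0],
}
observed = [1.0, 0.71, 0.52, 0.84, 1.0]
print(best_grid_point(observed, grid, zones=[1, 2, 3]))   # 5750
```

In the full method the grid is multi-dimensional (Teff, log g, [M/H], v sin i) and the residuals of the best match feed the uncertainty estimate.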

  1. Development of the FITS tools package for multiple software environments

    NASA Technical Reports Server (NTRS)

    Pence, W. D.; Blackburn, J. K.

    1992-01-01

    The HEASARC is developing a package of general purpose software for analyzing data files in FITS format. This paper describes the design philosophy which makes the software both machine-independent (it runs on VAXs, Suns, and DEC-stations) and software environment-independent. Currently the software can be compiled and linked to produce IRAF tasks, or alternatively, the same source code can be used to generate stand-alone tasks using one of two implementations of a user-parameter interface library. The machine independence of the software is achieved by writing the source code in ANSI standard Fortran or C, using the machine-independent FITSIO subroutine interface for all data file I/O, and using a standard user-parameter subroutine interface for all user I/O. The latter interface is based on the Fortran IRAF Parameter File interface developed at STScI. The IRAF tasks are built by linking to the IRAF implementation of this parameter interface library. Two other implementations of this parameter interface library, which have no IRAF dependencies, are now available which can be used to generate stand-alone executable tasks. These stand-alone tasks can simply be executed from the machine operating system prompt either by supplying all the task parameters on the command line or by entering the task name after which the user will be prompted for any required parameters. A first release of this FTOOLS package is now publicly available. The currently available tasks are described, along with instructions on how to obtain a copy of the software.

  2. AQUATOX Data Sources Documents

    EPA Pesticide Factsheets

    Contains the data sources for parameter values of the AQUATOX model including: a bibliography for the AQUATOX data libraries and the compendia of parameter values for US Army Corps of Engineers models.

  3. Relationships between bacterial diversity and environmental variables in a tropical marine environment, Rio de Janeiro.

    PubMed

    Vieira, Ricardo P; Gonzalez, Alessandra M; Cardoso, Alexander M; Oliveira, Denise N; Albano, Rodolpho M; Clementino, Maysa M; Martins, Orlando B; Paranhos, Rodolfo

    2008-01-01

    This study is the first to apply a comparative analysis of environmental chemistry, microbiological parameters and bacterioplankton 16S rRNA clone libraries from different areas of a 50 km transect along a trophic gradient in the tropical Guanabara Bay ecosystem. Higher bacterial diversity was found in the coastal area, whereas lower richness was observed in the more polluted inner bay water. The significance of differences between clone libraries was examined with LIBSHUFF statistics. Paired reciprocal comparisons indicated that each of the libraries differs significantly from the others, and this is in agreement with direct interpretation of the phylogenetic tree. Furthermore, correspondence analyses showed that some taxa are related to specific abiotic, trophic and microbiological parameters in Guanabara Bay estuarine system.

  4. Bishydrazinium and Diammonium Salts of 4,4’,5,5’-Tetranitro-2,2’-biimidazolate (TNBI): Synthesis and Properties

    DTIC Science & Technology

    2015-01-01

    detonation were all calculated with CHEETAH 6.0 using the exp 6.3 library. The per- formance values as reported for NTO and RDX are provided for comparison...purposes and were generated using the CHEETAH 6.0 reactant library values for heat of formation and density for these materials. 3.2 Synthetic

  5. VizieR Online Data Catalog: Brussels nuclear reaction rate library (Aikawa+, 2005)

    NASA Astrophysics Data System (ADS)

    Aikawa, M.; Arnould, M.; Goriely, S.; Jorissen, A.; Takahashi, K.

    2005-07-01

    The present data is part of the Brussels nuclear reaction rate library (BRUSLIB) for astrophysics applications and concerns nuclear reaction rate predictions calculated within the statistical Hauser-Feshbach approximation and making use of global and coherent microscopic nuclear models for the quantities (nuclear masses, nuclear structure properties, nuclear level densities, gamma-ray strength functions, optical potentials) entering the rate calculations. (4 data files).

  6. Improving hot region prediction by parameter optimization of density clustering in PPI.

    PubMed

    Hu, Jing; Zhang, Xiaolong

    2016-11-01

    This paper proposes an optimized algorithm that combines density clustering with parameter selection and feature-based classification for hot region prediction. First, all residues are classified by SVM to remove non-hot-spot residues; then density clustering with selected parameters is used to find hot regions. For the density clustering, this paper studies how to select the input parameters. Density-based incremental clustering has two parameters, radius and density. We first fix the density and enumerate the radius to find the pair of parameters that leads to the maximum number of clusters, and then fix the radius and enumerate the density to find another such pair. Experimental results show that the proposed method using both pairs of parameters provides better prediction performance than the alternatives; comparing the two searches, fixing the radius and enumerating the density gives slightly higher prediction accuracy than fixing the density and enumerating the radius.
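The fix-one-parameter, enumerate-the-other search can be sketched with a minimal DBSCAN-style density clustering. The paper's incremental clustering variant, SVM stage, and PPI residue features are not reproduced here; the points and parameter ranges below are hypothetical.

```python
import math

def density_clusters(points, radius, min_pts):
    """Minimal DBSCAN-style clustering. A core point has at least min_pts
    neighbours (itself included) within radius; clusters are connected
    core points plus their border points. Returns the cluster count."""
    n = len(points)
    neigh = [[j for j in range(n)
              if math.dist(points[i], points[j]) <= radius]
             for i in range(n)]
    core = [len(neigh[i]) >= min_pts for i in range(n)]
    label = [None] * n
    clusters = 0
    for i in range(n):
        if label[i] is not None or not core[i]:
            continue
        clusters += 1
        stack = [i]
        while stack:
            p = stack.pop()
            if label[p] is not None:
                continue
            label[p] = clusters
            if core[p]:                  # only core points expand the cluster
                stack.extend(q for q in neigh[p] if label[q] is None)
    return clusters

# Two well-separated triads of points (hypothetical residue coordinates)
points = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1)]

# Fix the density parameter (min_pts) and enumerate the radius, keeping
# the radius that yields the most clusters, as in the paper's first search.
best_r = max([0.05, 0.2, 1.0, 10.0],
             key=lambda r: density_clusters(points, r, min_pts=3))
print(best_r, density_clusters(points, best_r, 3))
```

The second search would symmetrically fix `radius` and enumerate `min_pts`.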

  7. The Library of the Institute of Theoretical Astronomy of the R.A.S. (1924-1994). History, Present State, Perspectives for Future

    NASA Astrophysics Data System (ADS)

    Lapteva, M. V.

    The building up of the specialized collection of the Library of the Institute of Theoretical Astronomy of the Russian Academy of Sciences, from the foundation of the Library (1924) up to the present time, is considered in historical perspective. The main acquisition sources, stock figures, and various parameters of the collection composition, including information on rare foreign editions, are also dealt with. The data on the existing retrieval systems and the prospects for developing computerized, problem-directed reference bibliographic complexes are also considered.

  8. MatProps: Material Properties Database and Associated Access Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Durrenberger, J K; Becker, R C; Goto, D M

    2007-08-13

    Coefficients for analytic constitutive and equation of state models (EOS), which are used by many hydro codes at LLNL, are currently stored in a legacy material database (Steinberg, UCRL-MA-106349). Parameters for numerous materials are available through this database, and include Steinberg-Guinan and Steinberg-Lund constitutive models for metals, JWL equations of state for high explosives, and Mie-Grüneisen equations of state for metals. These constitutive models are used in most of the simulations done by ASC codes today at Livermore. Analytic EOSs are also still used, but have been superseded in many cases by tabular representations in LEOS (http://leos.llnl.gov). Numerous advanced constitutive models have been developed and implemented into ASC codes over the past 20 years. These newer models have more physics and better representations of material strength properties than their predecessors, and therefore more model coefficients. However, a material database of these coefficients is not readily available. Therefore incorporating these coefficients with those of the legacy models into a portable database that could be shared amongst codes would be most welcome. The goal of this paper is to describe the MatProp effort at LLNL to create such a database and associated access library that could be used by codes throughout the DOE complex and beyond. We have written an initial version of the MatProp database and access library, and our DOE/ASC code ALE3D (Nichols et al., UCRL-MA-152204) is able to import information from the database. The database, a link to which exists on the Sourceforge server at LLNL, contains coefficients for many materials and models (see Appendix), and includes material parameters in the following categories: flow stress, shear modulus, strength, damage, and equation of state.
Future versions of the MatProp database and access library will include the ability to read and write material descriptions that can be exchanged between codes. They will also include an ability to do unit changes, i.e. have the library return parameters in user-specified unit systems. In addition, further material categories can be added (e.g., phase change kinetics, etc.). The MatProp database and access library is part of a larger set of tools used at LLNL for assessing material model behavior. One of these is MSlib, a shared constitutive material model library. Another is the Material Strength Database (MSD), which allows users to compare parameter fits for specific constitutive models to available experimental data. Together with MatProp, these tools create a suite of capabilities that provide state-of-the-art models and parameters for those models to integrated simulation codes. This document is broken into several appendices. Appendix A contains a code example to retrieve several material coefficients. Appendix B contains the API for the MatProp data access library. Appendix C contains a list of the material names and model types currently available in the MatProp database. Appendix D contains a list of the parameter names for the currently recognized model types. Appendix E contains a full xml description of the material Tantalum.
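Since the report stores a full XML description per material, a coefficient lookup can be sketched with a standard XML parser. The tag names, attribute names, and values below are assumptions for illustration, not the actual MatProp schema or API.

```python
import xml.etree.ElementTree as ET

# Hypothetical material description; the real MatProp schema differs.
MATERIAL_XML = """
<material name="Tantalum">
  <model type="SteinbergGuinan">
    <param name="G0" units="Mbar">0.69</param>
    <param name="Y0" units="Mbar">0.0077</param>
  </model>
</material>
"""

def get_coefficient(xml_text, model_type, param_name):
    """Look up one model coefficient from a material XML description."""
    root = ET.fromstring(xml_text)
    for model in root.findall("model"):
        if model.get("type") == model_type:
            for p in model.findall("param"):
                if p.get("name") == param_name:
                    return float(p.text)
    raise KeyError((model_type, param_name))

print(get_coefficient(MATERIAL_XML, "SteinbergGuinan", "G0"))   # 0.69
```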

  9. Ab Initio Calculation of XAFS Debye-Waller Factors for Crystalline Materials

    NASA Astrophysics Data System (ADS)

    Dimakis, Nicholas

    2007-02-01

    A direct and accurate technique for calculating the thermal X-ray absorption fine structure (XAFS) Debye-Waller factors (DWF) for materials of crystalline structure is presented. Using Density Functional Theory (DFT) under the hybrid X3LYP functional, a library of spin-optimized MnO clusters is built and their phonon spectrum properties are calculated; these properties, in the form of normal-mode eigenfrequencies and eigenvectors, are in turn used to calculate the single- and multiple-scattering XAFS DWF. DWF obtained via this technique are temperature-dependent expressions and can be used to substantially reduce the number of fitting parameters when experimental spectra are fitted with a hypothetical structure, without any ad hoc assumptions. Owing to the high computational demand, a hybrid approach is presented that mixes the DFT-calculated DWF for inner shells with the correlated Debye model for outer shells. DFT-obtained DWF are compared with corresponding values from experimental XAFS spectra of manganosite. The effects of cluster size and of the spin parameter on the DFT-calculated DWF are discussed.

  10. Genetics of Bone Density

    MedlinePlus


  11. Simulation of rarefied low pressure RF plasma flow around the sample

    NASA Astrophysics Data System (ADS)

    Zheltukhin, V. S.; Shemakhin, A. Yu

    2017-01-01

    The paper describes a mathematical model of the flow of radio-frequency plasma at low pressure. The hybrid mathematical model includes the Boltzmann equation for the neutral component of the RF plasma and the continuity and thermal equations for the charged component. Initial and boundary conditions for the corresponding equations are described. In the calculations the electron temperature is 1-4 eV, the atom temperature is (3-4)·10³ K in the plasma clot and (3.2-10)·10² K in the plasma jet, the degree of ionization is 10⁻⁷-10⁻⁵, and the electron density is 10¹⁵-10¹⁹ m⁻³. To calculate the plasma parameters, a software package was developed in C++ using the OpenFOAM library. Simulations of the flow around a sample in the vacuum chamber and of the free jet flow were carried out.

  12. Kinetic Parameter Measurements in the MINERVE Reactor

    NASA Astrophysics Data System (ADS)

    Perret, Grégory; Geslot, Benoit; Gruel, Adrien; Blaise, Patrick; Di-Salvo, Jacques; De Izarra, Grégoire; Jammes, Christian; Hursin, Mathieu; Pautz, Andréas

    2017-01-01

    In the framework of an international collaboration, teams of the PSI and CEA research institutes measure the critical decay constant (α0 = β/A), delayed neutron fraction (β) and generation time (A) of the Minerve reactor using the Feynman-α, Power Spectral Density and Rossi-α neutron noise measurement techniques. These measurements contribute to the experimental database of kinetic parameters used to improve nuclear data files and validate modern methods in Monte Carlo codes. Minerve is a zero-power pool reactor composed of a central experimental test lattice surrounded by a large aluminum buffer and four high-enriched driver regions. Measurements are performed in three slightly subcritical configurations (-2 cents to -30 cents) using two high-efficiency 235U fission chambers in the driver regions. Measurements of α0 and β obtained by the two institutes with the different techniques are consistent for the configurations envisaged. Slight increases of the β values are observed with the subcriticality level. Best-estimate values are obtained with the Cross-Power Spectral Density technique at -2 cents, and are: β = 716.9±9.0 pcm, α0 = 79.0±0.6 s-1 and A = 90.7±1.4 μs. The kinetic parameters are predicted with MCNP5-v1.6 and TRIPOLI4.9 and the JEFF-3.1/3.1.1 and ENDF/B-VII.1 nuclear data libraries. The predictions for β and α0 overestimate the experimental results by 3-5% and 10-12%, respectively; that for A underestimates the experimental result by 6-7%. The discrepancies are suspected to come from the driven-system nature of Minerve and the location of the detectors in the driver regions, which prevent accounting for the full reactor.
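The Feynman-α technique fits the excess variance-to-mean ratio Y(T) of detector counts as a function of gate width T. A sketch of the standard point-kinetics model curve, using the best-estimate α0 above with an arbitrary normalization (the dead-time and delayed-neutron corrections of a real analysis are omitted):

```python
import math

def feynman_y(T, alpha, y_inf):
    """Variance-to-mean excess Y(T) for gate width T in the standard
    point-kinetics Feynman-alpha model:
    Y(T) = Y_inf * (1 - (1 - exp(-alpha*T)) / (alpha*T))."""
    return y_inf * (1.0 - (1.0 - math.exp(-alpha * T)) / (alpha * T))

alpha0 = 79.0    # s^-1, best-estimate prompt decay constant from the abstract
y_inf = 1.0      # asymptotic value (arbitrary normalization, assumed)

# Y grows with gate width and saturates at y_inf once alpha*T >> 1;
# fitting measured Y(T) against this curve yields alpha0.
for T in (0.001, 0.01, 0.1, 1.0):
    print(T, round(feynman_y(T, alpha0, y_inf), 4))
```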

  13. SP_Ace: a new code to derive stellar parameters and elemental abundances

    NASA Astrophysics Data System (ADS)

    Boeche, C.; Grebel, E. K.

    2016-03-01

    Context. Ongoing and future massive spectroscopic surveys will collect large numbers (10⁶-10⁷) of stellar spectra that need to be analyzed. Highly automated software is needed to derive stellar parameters and chemical abundances from these spectra. Aims: We developed a new method of estimating the stellar parameters Teff, log g, [M/H], and elemental abundances. This method was implemented in a new code, SP_Ace (Stellar Parameters And Chemical abundances Estimator). This is a highly automated code suitable for analyzing the spectra of large spectroscopic surveys with low or medium spectral resolution (R = 2000-20 000). Methods: After the astrophysical calibration of the oscillator strengths of 4643 absorption lines covering the wavelength ranges 5212-6860 Å and 8400-8924 Å, we constructed a library that contains the equivalent widths (EW) of these lines for a grid of stellar parameters. The EWs of each line are fit by a polynomial function that describes the EW of the line as a function of the stellar parameters. The coefficients of these polynomial functions are stored in a library called the "GCOG library". SP_Ace, a code written in FORTRAN95, uses the GCOG library to compute the EWs of the lines, constructs models of spectra as a function of the stellar parameters and abundances, and searches for the model that minimizes the χ² deviation when compared to the observed spectrum. The code has been tested on synthetic and real spectra for a wide range of signal-to-noise and spectral resolutions. Results: SP_Ace derives stellar parameters such as Teff, log g, [M/H], and chemical abundances of up to ten elements for low to medium resolution spectra of FGK-type stars with precision comparable to the one usually obtained with spectra of higher resolution. Systematic errors in stellar parameters and chemical abundances are presented and identified with tests on synthetic and real spectra. 
Stochastic errors are automatically estimated by the code for all the parameters. A simple Web front end of SP_Ace can be found at http://dc.g-vo.org/SP_ACE while the source code will be published soon. Full Tables D.1-D.3 are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/587/A2

  14. Architectural Optimization of Digital Libraries

    NASA Technical Reports Server (NTRS)

    Biser, Aileen O.

    1998-01-01

    This work investigates performance and scaling issues relevant to large scale distributed digital libraries. Presently, performance and scaling studies focus on specific implementations of production or prototype digital libraries. Although useful information is gained to aid these designers and other researchers with insights to performance and scaling issues, the broader issues relevant to very large scale distributed libraries are not addressed. Specifically, no current studies look at the extreme or worst case possibilities in digital library implementations. A survey of digital library research issues is presented. Scaling and performance issues are mentioned frequently in the digital library literature but are generally not the focus of much of the current research. In this thesis a model for a Generic Distributed Digital Library (GDDL) and nine cases of typical user activities are defined. This model is used to facilitate some basic analysis of scaling issues. Specifically, the calculation of Internet traffic generated for different configurations of the study parameters and an estimate of the future bandwidth needed for a large scale distributed digital library implementation. This analysis demonstrates the potential impact a future distributed digital library implementation would have on the Internet traffic load and raises questions concerning the architecture decisions being made for future distributed digital library designs.

  15. Changes of the Bacterial Abundance and Communities in Shallow Ice Cores from Dunde and Muztagata Glaciers, Western China

    PubMed Central

    Chen, Yong; Li, Xiang-Kai; Si, Jing; Wu, Guang-Jian; Tian, Li-De; Xiang, Shu-Rong

    2016-01-01

    In this study, six bacterial community structures were analyzed from the Dunde ice core (9.5 m long) using 16S rRNA gene clone library technology. Compared to the Muztagata mountain ice core (37 m long), the Dunde ice core has different dominant community structures, with five genus-related groups (Blastococcus sp./Propionibacterium, Cryobacterium-related, Flavobacterium sp., Pedobacter sp., and Polaromonas sp.) frequently found in the six tested ice layers from 1990 to 2000. Live and total microbial density patterns were examined and related to the dynamics of physical-chemical parameters, mineral particle concentrations, and stable isotopic ratios in the precipitation recorded in both the Muztagata and Dunde ice cores. The Muztagata ice core revealed seasonal response patterns for both live and total cell density, with high cell density occurring in the warming spring and summer months as indicated by the proxy value of the stable isotopic ratios. Seasonal analysis of live cell density for the Dunde ice core was not successful due to the limitations of sampling resolution. Both ice cores showed that the cell density peaks were frequently associated with high concentrations of particles. A comparison of microbial communities in the Dunde and Muztagata glaciers showed that similar taxonomic members exist in the related ice cores, but the composition of the prevalent genus-related groups is largely different between the two geographically different glaciers. This indicates that the micro-biogeography associated with geographic differences was mainly influenced by a few dominant taxonomic groups. PMID:27847503

  16. Modelling Neutron-induced Reactions on 232–237U from 10 keV up to 30 MeV

    DOE PAGES

    Sin, M.; Capote, R.; Herman, M. W.; ...

    2017-01-17

    Comprehensive calculations of cross sections for neutron-induced reactions on 232–237U targets are performed in this paper in the 10 keV–30 MeV incident energy range with the code EMPIRE-3.2 Malta. The advanced modelling and consistent calculation scheme are aimed at improving our knowledge of the neutron scattering and emission cross sections, and at assessing the consistency of available evaluated libraries for light uranium isotopes. The reaction model considers a dispersive optical potential (RIPL 2408) that couples from five (even targets) to nine (odd targets) levels of the ground-state rotational band, and a triple-humped fission barrier with absorption in the wells described within the optical model for fission. A modified Lorentzian model (MLO) of the radiative strength function and Enhanced Generalized Superfluid Model nuclear level densities are used in Hauser-Feshbach calculations of the compound-nuclear decay that include width fluctuation corrections. The starting values for the model parameters are retrieved from RIPL. Excellent agreement with available experimental data for neutron emission and fission is achieved, giving confidence that the quantities for which there is no experimental information are also accurately predicted. Finally, deficiencies in existing evaluated libraries are highlighted.

  17. The Pain in Storage: Work Safety in a High-Density Shelving Facility

    ERIC Educational Resources Information Center

    Atkins, Stephanie A.

    2005-01-01

    An increasing number of academic and research libraries have built high-density shelving facilities to address overcrowding conditions in their regular stacks. However, the work performed in these facilities is physically strenuous and highly repetitive in nature and may require the use of potentially dangerous equipment. This article will examine…

  18. Multi-registration of software library resources

    DOEpatents

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-04-05

    Data communications, including issuing, by an application program to a high level data communications library, a request for initialization of a data communications service; issuing to a low level data communications library a request for registration of data communications functions; registering the data communications functions, including instantiating a factory object for each of the one or more data communications functions; issuing by the application program an instruction to execute a designated data communications function; issuing, to the low level data communications library, an instruction to execute the designated data communications function, including passing to the low level data communications library a call parameter that identifies a factory object; creating with the identified factory object the data communications object that implements the data communications function according to the protocol; and executing by the low level data communications library the designated data communications function.
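The registration-and-dispatch mechanism the claim describes can be sketched as follows. This is a hedged illustration with hypothetical names, not the patented implementation: factory objects are registered under keys, and an execution request passes a call parameter identifying the factory, which instantiates the object implementing the function.

```python
class BroadcastFunction:
    """Toy data-communications function object (hypothetical)."""
    def execute(self, payload):
        return f"broadcast:{payload}"

class LowLevelLibrary:
    """Holds one factory object per registered data communications function."""
    def __init__(self):
        self._factories = {}

    def register(self, name, factory):
        # "registering the data communications functions, including
        # instantiating a factory object for each"
        self._factories[name] = factory

    def execute(self, name, payload):
        # the call parameter `name` identifies the factory object, which
        # creates the object that implements the designated function
        func = self._factories[name]()
        return func.execute(payload)

lib = LowLevelLibrary()
lib.register("broadcast", BroadcastFunction)
print(lib.execute("broadcast", "hello"))  # broadcast:hello
```

The indirection lets the application issue a designated function by key while the low-level library controls instantiation and protocol selection.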

  19. TRIQS: A toolbox for research on interacting quantum systems

    NASA Astrophysics Data System (ADS)

    Parcollet, Olivier; Ferrero, Michel; Ayral, Thomas; Hafermann, Hartmut; Krivenko, Igor; Messio, Laura; Seth, Priyanka

    2015-11-01

    We present the TRIQS library, a Toolbox for Research on Interacting Quantum Systems. It is an open-source, computational physics library providing a framework for the quick development of applications in the field of many-body quantum physics, and in particular, strongly-correlated electronic systems. It supplies components to develop codes in a modern, concise and efficient way: e.g. Green's function containers, a generic Monte Carlo class, and simple interfaces to HDF5. TRIQS is a C++/Python library that can be used from either language. It is distributed under the GNU General Public License (GPLv3). State-of-the-art applications based on the library, such as modern quantum many-body solvers and interfaces between density-functional-theory codes and dynamical mean-field theory (DMFT) codes are distributed along with it.

  20. Multigroup cross section library for GFR2400

    NASA Astrophysics Data System (ADS)

    Čerba, Štefan; Vrban, Branislav; Lüley, Jakub; Haščík, Ján; Nečas, Vladimír

    2017-09-01

    In this paper the development and optimization of the SBJ_E71 multigroup cross-section library for GFR2400 applications is discussed. A cross-section processing scheme merging Monte Carlo and deterministic codes was developed. Several fine and coarse group structures and two weighting-flux options were analysed through 18 benchmark experiments selected from the ICSBEP handbook on the basis of similarity assessments. The performance of the collapsed version of the SBJ_E71 library was compared with MCNP5 continuous-energy ENDF/B-VII.1 and the Korean KAFAX-E70 library. The comparison was based on integral parameters of calculations performed on full-core homogeneous models.

  1. Collection Development Policy: Academic Library, St. Mary's University. Revised.

    ERIC Educational Resources Information Center

    Sylvia, Margaret

    This guide spells out the collection development policy of the library of St. Mary's University in San Antonio, Texas. The guide is divided into the following five topic areas: (1) introduction to the community served, parameters of the collection, cooperation in collection development, and priorities of the collection; (2) considerations in…

  2. Smoothing Forecasting Methods for Academic Library Circulations: An Evaluation and Recommendation.

    ERIC Educational Resources Information Center

    Brooks, Terrence A.; Forys, John W., Jr.

    1986-01-01

    Circulation time-series data from 50 midwestern academic libraries were used to test 110 variants of 8 smoothing forecasting methods. Data, methodologies, and illustrations of the two recommended methods, single exponential smoothing and Brown's one-parameter linear exponential smoothing, are given. Eight references are cited. (EJS)
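The two recommended methods are standard textbook forms, sketched below in Python; the smoothing constant and circulation counts are invented for illustration and are not the study's data.

```python
def single_exponential(series, alpha):
    """Single exponential smoothing: s_t = alpha*x_t + (1-alpha)*s_{t-1}.
    Returns the one-step-ahead forecast (the last smoothed value)."""
    s = series[0]
    for x in series[1:]:
        s = alpha * x + (1 - alpha) * s
    return s

def brown_linear(series, alpha):
    """Brown's one-parameter linear (double) exponential smoothing:
    two smoothing passes share a single constant alpha."""
    s1 = s2 = series[0]
    for x in series[1:]:
        s1 = alpha * x + (1 - alpha) * s1   # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2  # second smoothing
    level = 2 * s1 - s2
    trend = alpha / (1 - alpha) * (s1 - s2)
    return level + trend  # one-step-ahead forecast

circulation = [120, 130, 125, 140, 150, 160]  # made-up monthly loan counts
print(round(single_exponential(circulation, 0.3), 2))
print(round(brown_linear(circulation, 0.3), 2))
```

For a trending series like this one, Brown's method forecasts higher than single smoothing because it extrapolates the estimated trend rather than lagging behind it.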

  3. INTRIGOSS: A new Library of High Resolution Synthetic Spectra

    NASA Astrophysics Data System (ADS)

    Franchini, Mariagrazia; Morossi, Carlo; Di Marcantonio, Paolo; Chavez, Miguel; GES-Builders

    2018-01-01

    INTRIGOSS (INaf Trieste Grid Of Synthetic Spectra) is a new high-resolution (HiRes) synthetic spectral library designed for studying F, G, and K stars. The library is based on atmosphere models computed with specified individual element abundances via the ATLAS12 code. Normalized SPectra (NSP) and surface Flux SPectra (FSP), in the 4800-5400 Å wavelength range, were computed by means of the SPECTRUM code. The synthetic spectra are computed with an atomic and diatomic molecular line list including "bona fide" Predicted Lines (PLs), built by tuning log gf values to reproduce a very high SNR solar spectrum and the UVES-U580 spectra of five cool giants extracted from the Gaia-ESO Survey (GES). The astrophysical gf-values were then assessed by using more than 2000 stars with homogeneous and accurate atmosphere parameters and detailed chemical composition from GES. The validity and greater accuracy of INTRIGOSS NSPs and FSPs with respect to other available spectral libraries are discussed. INTRIGOSS will be available on the web and will be a valuable tool for both stellar atmospheric parameter and stellar population studies.

  4. Chlorination pattern effect on thermodynamic parameters and environmental degradability for C₁₀-SCCPs: Quantum chemical calculation based on virtual combinational library.

    PubMed

    Sun, Yuzhen; Pan, Wenxiao; Lin, Yuan; Fu, Jianjie; Zhang, Aiqian

    2016-01-01

    Short-chain chlorinated paraffins (SCCPs) are still controversial candidates for inclusion in the Stockholm Convention. The inherent mixture nature of SCCPs makes it rather difficult to explore their environmental behaviors. A virtual molecule library of 42,720 C10-SCCP congeners covering the full structure spectrum was constructed. We explored the structural effects on the thermodynamic parameters and environmental degradability of C10-SCCPs through semi-empirical quantum chemical calculations. The thermodynamic properties were acquired using the AM1 method, and frontier molecular orbital analysis was carried out to obtain the E(HOMO), E(LUMO) and E(LUMO)-E(HOMO) for degradability exploration at the same level. The influence of the chlorination degree (N(Cl)) on the relative stability and environmental degradation was elucidated. A novel structural descriptor, μ, was proposed to measure the dispersion of the chlorine atoms within a molecule. There were significant correlations between thermodynamic values and N(Cl), while the reported N(Cl)-dependent pollution profile of C10-SCCPs in environmental samples was basically consistent with the predicted order of formation stability of C10-SCCP congeners. In addition, isomers with large μ showed higher relative stability than those with small μ. This could be further verified by the relationship between μ and the reactivity of nucleophilic substitution and OH attack respectively. The C10-SCCP congeners with less Cl substitution and lower dispersion degree are susceptible to environmental degradation via nucleophilic substitution and hydroxyl radical attack, while direct photolysis of C10-SCCP congeners cannot readily occur due to the large E(LUMO)-E(HOMO) values. The chlorination effect and the conclusions were further checked with appropriate density functional theory (DFT) calculations. Copyright © 2015. Published by Elsevier B.V.

  5. Reevaluation of the AAPM TG-43 brachytherapy dosimetry parameters for an 125I seed, and the influence of eye plaque design on dose distributions and dose-volume histograms

    NASA Astrophysics Data System (ADS)

    Aryal, Prakash

    The TG-43 dosimetry parameters of the Advantage™ 125I model IAI-125A brachytherapy seed were studied. An investigation using the modern MCNP radiation transport code with updated cross-section libraries was performed. Twelve different simulation conditions were studied for a single seed by varying the coating thickness, mass density, photon energy spectrum, and cross-section library. The dose rate was found to be 6.3% lower at 1 cm in comparison to published results. New TG-43 dosimetry parameters are proposed. The dose distribution for a brachytherapy eye plaque, model EP917, was investigated, including the effects of collimation from high-Z slots. Dose distributions for 26 slot designs were determined using Monte Carlo methods and compared among the published literature, a clinical treatment planning system, and physical measurements. The dosimetric effect of the composition and mass density of the gold backing was shown to be less than 3%. Slot depth, width, and length changed the central axis (CAX) dose distributions by < 1% per 0.1 mm of design variation. Seed shifts in the slot towards the eye and shifts of the 125I-laden silver rod within the seed had the greatest impact on the CAX dose distribution, changing it by 14%, 9%, 4.3%, and 2.7% at 1, 2, 5, and 10 mm, respectively, from the inner scleral surface. The measured, full plaque slot geometry delivered a 2.4% ± 1.1% higher dose along the plaque's CAX than the geometry provided by the manufacturer and 2.2% ± 2.3% higher than the Plaque Simulator™ (PS) treatment planning software (version 5.7.6). The D10 for the simulated tumor, inner sclera, and outer sclera was 9%, 10%, and 19% higher, respectively, for the measured slot geometry than for the manufacturer-provided slot design. In comparison to the measured plaque design, a theoretical plaque having narrow and deep slots delivered 30%, 37%, and 62% lower D10 doses to the tumor, inner sclera, and outer sclera, respectively. 
CAX doses at -1, 0, 1, and 2 mm were also lower, by factors of 2.6, 1.72, 1.50, and 1.39, respectively. The study identified substantial sensitivity of the EP917 plaque dose distributions to slot design. KEYWORDS: Monte Carlo methods, dosimetry, 125I, TG-43, eye plaque brachytherapy.

  6. Restoring Bone Density in Women with Ovarian Disorder

    MedlinePlus


  7. Collection Metadata Solutions for Digital Library Applications

    NASA Technical Reports Server (NTRS)

    Hill, Linda L.; Janee, Greg; Dolin, Ron; Frew, James; Larsgaard, Mary

    1999-01-01

    Within a digital library, collections may range from an ad hoc set of objects that serve a temporary purpose to established library collections intended to persist through time. The objects in these collections vary widely, from library and data center holdings to pointers to real-world objects, such as geographic places, and the various metadata schemas that describe them. The key to integrated use of such a variety of collections in a digital library is collection metadata that represents the inherent and contextual characteristics of a collection. The Alexandria Digital Library (ADL) Project has designed and implemented collection metadata for several purposes: in XML form, the collection metadata "registers" the collection with the user interface client; in HTML form, it is used for user documentation; eventually, it will be used to describe the collection to network search agents; and it is used for internal collection management, including mapping the object metadata attributes to the common search parameters of the system.

  8. Glaucoma Diagnostic Ability of the Optical Coherence Tomography Angiography Vessel Density Parameters.

    PubMed

    Chung, Jae Keun; Hwang, Young Hoon; Wi, Jae Min; Kim, Mijin; Jung, Jong Jin

    2017-11-01

    To investigate the glaucoma diagnostic abilities of vessel density parameters as determined by optical coherence tomography (OCT) angiography in different stages of glaucoma. A total of 113 healthy eyes and 140 glaucomatous eyes were enrolled. Diagnostic abilities of the OCT vessel density parameters in the optic nerve head (ONH), peripapillary, and macular regions were evaluated by calculating the areas under the receiver operating characteristic curves (AUCs). AUCs of the peripapillary vessel density parameters and circumpapillary retinal nerve fiber layer (RNFL) thickness were compared. OCT angiography vessel densities in the ONH, peripapillary, and macular regions in the glaucomatous eyes were significantly lower than those in the healthy eyes (P < 0.05). Among the vessel density parameters, the average peripapillary vessel density showed a higher AUC than those of the ONH and macular regions (AUCs: 0.807, 0.566, and 0.651, respectively) for glaucoma detection. The peripapillary vessel density parameters showed AUCs similar to those of the corresponding sectoral RNFL thickness (P > 0.05). However, in the early stage of glaucoma, the AUCs of the inferotemporal and temporal peripapillary vessel densities were significantly lower than that of the RNFL thickness (P < 0.05). The glaucomatous eyes showed decreased vessel density as determined by OCT angiography. Although the peripapillary vessel density parameters showed glaucoma diagnostic ability similar to that of circumpapillary RNFL thickness, in the early stage the vessel density parameters showed limited clinical value.
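The AUCs above come from receiver operating characteristic analysis of a single diagnostic measurement per eye. A minimal sketch of how such an AUC can be computed via the Mann-Whitney U statistic follows; the vessel-density values are invented for illustration and are not the study's data.

```python
def auc(diseased, healthy):
    """P(random diseased value < random healthy value), assuming the
    marker is *lower* in disease, as vessel density is in glaucoma."""
    wins = ties = 0
    for d in diseased:
        for h in healthy:
            if d < h:
                wins += 1
            elif d == h:
                ties += 1
    return (wins + 0.5 * ties) / (len(diseased) * len(healthy))

# Hypothetical peripapillary vessel densities (%) for two small groups.
glaucoma_vd = [48.1, 50.3, 54.5, 49.5]
healthy_vd = [55.2, 54.0, 56.8, 53.9]
print(auc(glaucoma_vd, healthy_vd))  # 0.875
```

An AUC of 0.5 means the marker does not separate the groups at all; 1.0 means perfect separation, so the reported 0.807 for average peripapillary vessel density indicates good but imperfect discrimination.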

  9. TEMPy: a Python library for assessment of three-dimensional electron microscopy density fits.

    PubMed

    Farabella, Irene; Vasishtan, Daven; Joseph, Agnel Praveen; Pandurangan, Arun Prasad; Sahota, Harpal; Topf, Maya

    2015-08-01

    Three-dimensional electron microscopy is currently one of the most promising techniques used to study macromolecular assemblies. Rigid and flexible fitting of atomic models into density maps is often essential to gain further insights into the assemblies they represent. Currently, tools that facilitate the assessment of fitted atomic models and maps are needed. TEMPy (template and electron microscopy comparison using Python) is a toolkit designed for this purpose. The library includes a set of methods to assess density fits in intermediate-to-low resolution maps, both globally and locally. It also provides procedures for single-fit assessment, ensemble generation of fits, clustering, and multiple and consensus scoring, as well as plots and output files for visualization purposes to help the user in analysing rigid and flexible fits. The modular nature of TEMPy helps the integration of scoring and assessment of fits into large pipelines, making it a tool suitable for both novice and expert structural biologists.

  10. Statins: MedlinePlus Health Topic

    MedlinePlus


  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perret, Gregory

    The critical decay constant (B/A), delayed neutron fraction (B), and generation time (A) of the Minerve reactor were measured by the Paul Scherrer Institut (PSI) and the Commissariat à l'Énergie Atomique (CEA) in September 2014 using the Feynman-alpha and Power Spectral Density neutron noise measurement techniques. Three slightly subcritical configurations were measured using two 1-g 235U fission chambers. This paper reports on the results obtained by PSI in the near-critical configuration (-2g). The most reliable and precise results were obtained with the Cross-Power Spectral Density technique: B = 708.4±9.2 pcm, B/A = 79.0±0.6 s⁻¹, and A = 89.7±1.4 µs. Predictions of the same kinetic parameters were obtained with MCNP5-v1.6 and the JEFF-3.1 and ENDF/B-VII.1 nuclear data libraries. On average the predictions for B and B/A overestimate the experimental results by 5% and 11%, respectively. The discrepancy is suspected to come from either a corruption of the data or from the inadequacy of the point kinetics equations to interpret the measurements in the Minerve driven system. (authors)
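The three quoted kinetic parameters are mutually consistent, since the generation time must equal the delayed neutron fraction divided by the critical decay constant, A = B/(B/A). A quick arithmetic check (not part of the noise analysis itself):

```python
beta = 708.4e-5          # delayed neutron fraction B = 708.4 pcm
beta_over_lam = 79.0     # critical decay constant B/A, in 1/s

gen_time_us = beta / beta_over_lam * 1e6  # generation time A, microseconds
print(round(gen_time_us, 1))  # 89.7, matching the quoted A = 89.7±1.4 µs
```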

  12. Psi4 1.1: An Open-Source Electronic Structure Program Emphasizing Automation, Advanced Libraries, and Interoperability.

    PubMed

    Parrish, Robert M; Burns, Lori A; Smith, Daniel G A; Simmonett, Andrew C; DePrince, A Eugene; Hohenstein, Edward G; Bozkaya, Uğur; Sokolov, Alexander Yu; Di Remigio, Roberto; Richard, Ryan M; Gonthier, Jérôme F; James, Andrew M; McAlexander, Harley R; Kumar, Ashutosh; Saitow, Masaaki; Wang, Xiao; Pritchard, Benjamin P; Verma, Prakash; Schaefer, Henry F; Patkowski, Konrad; King, Rollin A; Valeev, Edward F; Evangelista, Francesco A; Turney, Justin M; Crawford, T Daniel; Sherrill, C David

    2017-07-11

    Psi4 is an ab initio electronic structure program providing methods such as Hartree-Fock, density functional theory, configuration interaction, and coupled-cluster theory. The 1.1 release represents a major update meant to automate complex tasks, such as geometry optimization using complete-basis-set extrapolation or focal-point methods. Conversion of the top-level code to a Python module means that Psi4 can now be used in complex workflows alongside other Python tools. Several new features have been added with the aid of libraries providing easy access to techniques such as density fitting, Cholesky decomposition, and Laplace denominators. The build system has been completely rewritten to simplify interoperability with independent, reusable software components for quantum chemistry. Finally, a wide range of new theoretical methods and analyses have been added to the code base, including functional-group and open-shell symmetry adapted perturbation theory, density-fitted coupled cluster with frozen natural orbitals, orbital-optimized perturbation and coupled-cluster methods (e.g., OO-MP2 and OO-LCCD), density-fitted multiconfigurational self-consistent field, density cumulant functional theory, algebraic-diagrammatic construction excited states, improvements to the geometry optimizer, and the "X2C" approach to relativistic corrections, among many other improvements.

  13. A Method of Predicting Queuing at Library Online PCs

    ERIC Educational Resources Information Center

    Beranek, Lea G.

    2006-01-01

    On-campus networked personal computer (PC) usage at La Trobe University Library was surveyed during September 2005. The survey's objectives were to confirm peak usage times, to measure some of the relevant parameters of online PC usage, and to determine the effect that 24 new networked PCs had on service quality. The survey found that clients…

  14. A new stellar spectrum interpolation algorithm and its application to Yunnan-III evolutionary population synthesis models

    NASA Astrophysics Data System (ADS)

    Cheng, Liantao; Zhang, Fenghui; Kang, Xiaoyu; Wang, Lang

    2018-05-01

    In evolutionary population synthesis (EPS) models, we need to convert stellar evolutionary parameters into spectra via interpolation in a stellar spectral library. For theoretical stellar spectral libraries, the spectrum grid is homogeneous on the effective-temperature and gravity plane for a given metallicity. It is relatively easy to derive stellar spectra. For empirical stellar spectral libraries, stellar parameters are irregularly distributed and the interpolation algorithm is relatively complicated. In those EPS models that use empirical stellar spectral libraries, different algorithms are used and the codes are often not released. Moreover, these algorithms are often complicated. In this work, based on a radial basis function (RBF) network, we present a new spectrum interpolation algorithm and its code. Compared with the other interpolation algorithms that are used in EPS models, it can be easily understood and is highly efficient in terms of computation. The code is written in MATLAB scripts and can be used on any computer system. Using it, we can obtain the interpolated spectra from a library or a combination of libraries. We apply this algorithm to several stellar spectral libraries (such as MILES, ELODIE-3.1 and STELIB-3.2) and give the integrated spectral energy distributions (ISEDs) of stellar populations (with ages from 1 Myr to 14 Gyr) by combining them with Yunnan-III isochrones. Our results show that the differences caused by the adoption of different EPS model components are less than 0.2 dex. All data about the stellar population ISEDs in this work and the RBF spectrum interpolation code can be obtained by request from the first author or downloaded from http://www1.ynao.ac.cn/˜zhangfh.
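The RBF idea above can be illustrated with a minimal Gaussian-kernel interpolator in plain Python. The authors' code is in MATLAB, so this is only a sketch of the technique with invented one-dimensional data, not their implementation: the interpolated value at arbitrary stellar parameters is a weighted sum of kernels centred on the library's parameter points, with weights chosen so the library spectra are reproduced exactly at the nodes.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(points, values, eps=1.0):
    """Exact Gaussian-RBF interpolant through (points, values)."""
    phi = lambda a, b: math.exp(-(eps * (a - b)) ** 2)
    A = [[phi(p, q) for q in points] for p in points]
    w = solve(A, values)
    return lambda x: sum(wi * phi(x, p) for wi, p in zip(w, points))

# Hypothetical example: a "flux" value sampled at a few Teff nodes (in kK).
teff = [4.0, 5.0, 6.0, 7.0]
flux = [0.8, 1.0, 1.3, 1.1]
interp = rbf_fit(teff, flux)
print(round(interp(5.0), 6))  # reproduces the node value 1.0
```

The same construction extends to irregularly distributed points in the (Teff, log g, [M/H]) space simply by using a multivariate distance in the kernel, which is what makes RBF networks attractive for empirical libraries.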

  15. Measurements of neutron capture cross sections on 70Zn at 0.96 and 1.69 MeV

    NASA Astrophysics Data System (ADS)

    Punte, L. R. M.; Lalremruata, B.; Otuka, N.; Suryanarayana, S. V.; Iwamoto, Y.; Pachuau, Rebecca; Satheesh, B.; Thanga, H. H.; Danu, L. S.; Desai, V. V.; Hlondo, L. R.; Kailas, S.; Ganesan, S.; Nayak, B. K.; Saxena, A.

    2017-02-01

    The cross sections of the 70Zn(n,γ)71mZn (T1/2 = 3.96 ± 0.05 h) reaction have been measured relative to the 197Au(n,γ)198Au cross sections at 0.96 and 1.69 MeV using a 7Li(p,n)7Be neutron source and the activation technique. The cross section of this reaction has been measured for the first time in the MeV region. The new experimental cross sections have been compared with the theoretical predictions of TALYS-1.6 with various level-density models and γ-ray strength functions, as well as with the TENDL-2015 library. The TALYS-1.6 calculation with the generalized superfluid level-density model and the Kopecky-Uhl generalized Lorentzian γ-ray strength function reproduced the new experimental cross sections at both incident energies. The 70Zn(n,γ)71g+mZn total capture cross sections have also been derived by applying the evaluated isomeric ratios in the TENDL-2015 library to the measured partial capture cross sections. The spectrum-averaged total capture cross sections derived in the present paper agree well with the JENDL-4.0 library at 0.96 MeV, whereas they lie between the TENDL-2015 and JENDL-4.0 libraries at 1.69 MeV.

  16. Multifractal Characterization of Geologic Noise for Improved UXO Detection and Discrimination

    DTIC Science & Technology

    2008-03-01

    Recovery of the universal multifractal parameters ... dipole-model to each magnetic anomaly and compares the extracted model parameters with a library of UXO items. They found that remnant magnetization ... the survey parameters, and the geologic environment. In this pilot study we have focused on the multifractal representation of natural variations

  17. The relative pose estimation of aircraft based on contour model

    NASA Astrophysics Data System (ADS)

    Fu, Tai; Sun, Xiangyi

    2017-02-01

    This paper proposes a relative pose estimation approach based on an object contour model. The first step is to obtain two-dimensional (2D) projections of the three-dimensional (3D) model of the target, which are divided into 40 forms by clustering and LDA analysis. We then extract the target contour in each image and compute its Pseudo-Zernike Moments (PZM), so that a model library is constructed in an offline mode. Next, using the PZMs, we select from the model library the projection contour that most resembles the target silhouette in the current image; similarity transformation parameters are then generated by applying shape context matching to the silhouette sampling locations, from which the identification parameters of the target can be derived. The identification parameters are converted to relative pose parameters, which serve as the initial values for an iterative refinement algorithm, since they lie in the neighborhood of the actual pose. Finally, Distance Image Iterative Least Squares (DI-ILS) is employed to acquire the final relative pose parameters.

  18. Validation of Hansen-Roach library for highly enriched uranium metal systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenz, T.R.; Busch, R.D.

    The Hansen-Roach 16-group cross-section library has been validated for use in pure uranium metal systems by modeling the Godiva critical assembly using the neutron transport theory code ONEDANT to perform effective multiplication factor (keff) calculations. The cross-section library used contains data for 118 isotopes (34 unique elements), including the revised cross sections for 235U and 238U. The Godiva critical assembly is a 17.4-cm sphere composed of 93.7 wt% 235U, 1.0 wt% 234U, and 5.3 wt% 238U with an effective homogeneous density of 18.7 g/cm³.

  19. The Essential Genome of Escherichia coli K-12.

    PubMed

    Goodall, Emily C A; Robinson, Ashley; Johnston, Iain G; Jabbari, Sara; Turner, Keith A; Cunningham, Adam F; Lund, Peter A; Cole, Jeffrey A; Henderson, Ian R

    2018-02-20

    Transposon-directed insertion site sequencing (TraDIS) is a high-throughput method coupling transposon mutagenesis with short-fragment DNA sequencing. It is commonly used to identify essential genes. Single-gene deletion libraries are considered the gold standard for identifying essential genes. Currently, the TraDIS method has not been benchmarked against such libraries, and therefore it remains unclear whether the two methodologies are comparable. To address this, a high-density transposon library was constructed in Escherichia coli K-12. Essential genes predicted from sequencing of this library were compared to existing essential gene databases. To decrease false-positive identification of essential genes, statistical data analysis included corrections for both gene length and genome length. Through this analysis, new essential genes and genes previously incorrectly designated essential were identified. We show that manual analysis of TraDIS data reveals novel features that would not have been detected by statistical analysis alone. Examples include short essential regions within genes, orientation-dependent effects, and fine-resolution identification of genome and protein features. Recognition of these insertion profiles in transposon mutagenesis data sets will assist genome annotation of less well characterized genomes and provides new insights into bacterial physiology and biochemistry. IMPORTANCE Incentives to define lists of genes that are essential for bacterial survival include the identification of potential targets for antibacterial drug development, genes required for rapid growth for exploitation in biotechnology, and discovery of new biochemical pathways. To identify essential genes in Escherichia coli, we constructed a transposon mutant library of unprecedented density. Initial automated analysis of the resulting data revealed many discrepancies compared to the literature. 
We now report more extensive statistical analysis supported by both literature searches and detailed inspection of high-density TraDIS sequencing data for each putative essential gene of the E. coli model laboratory organism. This paper is important because it provides a better understanding of the essential genes of E. coli, reveals the limitations of relying on automated analysis alone, and provides a new standard for the analysis of TraDIS data. Copyright © 2018 Goodall et al.
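The gene-length correction mentioned above matters because, under a simple Poisson model of uniformly random insertions, the probability that a non-essential gene escapes insertion purely by chance decays exponentially with its length; short genes therefore dominate false-positive "essential" calls. A hedged sketch of that reasoning (the library and genome sizes are illustrative assumptions, not the paper's numbers or its statistical model):

```python
import math

def p_zero_insertions(gene_len_bp, n_insertions, genome_len_bp):
    """P(a gene receives no insertions), assuming insertions land
    uniformly at random along the genome (Poisson approximation)."""
    expected = n_insertions * gene_len_bp / genome_len_bp
    return math.exp(-expected)

GENOME = 4_600_000   # approximate E. coli K-12 genome length (bp)
LIBRARY = 100_000    # hypothetical number of unique insertion sites

for length in (150, 300, 900):
    print(length, round(p_zero_insertions(length, LIBRARY, GENOME), 4))
```

At this hypothetical density a 150-bp gene has a few-percent chance of zero insertions by luck alone, while a 900-bp gene has an astronomically small one, which is why uncorrected calls systematically over-report short genes as essential.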

  20. Planetary Image Geometry Library

    NASA Technical Reports Server (NTRS)

    Deen, Robert C.; Pariser, Oleg

    2010-01-01

    The Planetary Image Geometry (PIG) library is a multi-mission library used for projecting images (EDRs, or Experiment Data Records) and managing their geometry for in-situ missions. A collection of models describes cameras and their articulation, allowing application programs such as mosaickers, terrain generators, and pointing correction tools to be written in a multi-mission manner, without any knowledge of parameters specific to the supported missions. Camera model objects allow transformation of image coordinates to and from view vectors in XYZ space. Pointing models, specific to each mission, describe how to orient the camera models based on telemetry or other information. Surface models describe the surface in general terms. Coordinate system objects manage the various coordinate systems involved in most missions. File objects manage access to metadata (labels, including telemetry information) in the input EDRs and RDRs (Reduced Data Records). Label models manage metadata information in output files. Site objects keep track of different locations where the spacecraft might be at a given time. Radiometry models allow correction of radiometry for an image. Mission objects contain basic mission parameters. Pointing adjustment ("nav") files allow pointing to be corrected. The object-oriented structure (C++) makes it easy to subclass just the pieces of the library that are truly mission-specific. Typically, this involves just the pointing model and coordinate systems, and parts of the file model. Once the library was developed (initially for Mars Polar Lander, MPL), adding new missions ranged from two days to a few months, resulting in significant cost savings as compared to rewriting all the application programs for each mission. Currently supported missions include Mars Pathfinder (MPF), MPL, Mars Exploration Rover (MER), Phoenix, and Mars Science Lab (MSL). Applications based on this library create the majority of operational image RDRs for those missions. 
A Java wrapper around the library allows parts of it to be used from Java code (via a native JNI interface). Future conversions of all or part of the library to Java are contemplated.
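    The subclassing pattern described for PIG can be sketched as follows (in Python rather than the library's C++, and with hypothetical class, method, and telemetry-field names; the real library's interfaces are not reproduced here):

```python
from abc import ABC, abstractmethod

# Illustrative sketch of the PIG-style architecture: applications are written
# against multi-mission base classes, and only the truly mission-specific
# pieces (here, the pointing model) are subclassed per mission.

class PointingModel(ABC):
    """Orients a camera model from telemetry (mission-specific)."""
    @abstractmethod
    def point_camera(self, telemetry: dict) -> tuple:
        ...

class MSLPointingModel(PointingModel):
    """Hypothetical example: reads mast azimuth/elevation from telemetry."""
    def point_camera(self, telemetry: dict) -> tuple:
        return (telemetry["azimuth_deg"], telemetry["elevation_deg"])

def build_mosaic(pointing: PointingModel, frames: list) -> list:
    # Multi-mission application code: works with any PointingModel subclass,
    # with no knowledge of mission-specific parameters.
    return [pointing.point_camera(t) for t in frames]

pointings = build_mosaic(MSLPointingModel(),
                         [{"azimuth_deg": 90.0, "elevation_deg": -10.0}])
```

    Adding a mission then means writing one small subclass rather than rewriting each application, which is the cost saving the abstract describes.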

  1. X-ray Pulsars Across the Parameter Space of Luminosity, Accretion Mode, and Spin

    NASA Astrophysics Data System (ADS)

    Laycock, Silas; Yang, Jun; Christodoulou, Dimitris; Coe, Malcolm; Cappallo, Rigel; Zezas, Andreas; Ho, Wynn C. G.; Hong, JaeSub; Fingerman, Samuel; Drake, Jeremy J.; Kretschmar, Peter; Antoniou, Vallia

    2017-08-01

    We present our multi-satellite library of X-ray pulsar observations to the community and highlight recent science results. Available at www.xraypulsars.space, the library provides a range of high-level data products, including activity histories, pulse profiles, phased event files, and a unique pulse-profile modeling interface. The initial release (v1.0) contains some 15 years of RXTE-PCA, Chandra ACIS-I, and XMM-PN observations of the Small Magellanic Cloud, creating a valuable record of pulsar behavior. Our library is intended to enable new progress on fundamental neutron-star parameters and accretion physics. The major motivations are (1) to assemble a large homogeneous sample enabling population statistics, which has so far been used to map the propeller transition and to explore the role of retrograde and prograde accretion disks; and (2) to obtain pulse profiles for the same pulsars on many different occasions, at different luminosities and states, in order to break model degeneracies, an effort that has led to preliminary measurements of the offsets between magnetic and spin axes. With the addition of other satellites and Galactic pulsars, the library will cover the entire available range of luminosity, variability timescales, and accretion regimes.

  2. Optimization and Simulation of SLM Process for High Density H13 Tool Steel Parts

    NASA Astrophysics Data System (ADS)

    Laakso, Petri; Riipinen, Tuomas; Laukkanen, Anssi; Andersson, Tom; Jokinen, Antero; Revuelta, Alejandro; Ruusuvuori, Kimmo

    This paper demonstrates the successful printing and processing-parameter optimization of high-strength H13 tool steel by Selective Laser Melting (SLM). A D-optimal Design of Experiments (DOE) approach is used to optimize laser power, scanning speed, and hatch width. With 50 test samples (1×1×1 cm) we establish parameter windows for these three parameters in relation to part density. The calculated numerical model is found to be in good agreement with the density data obtained from the samples by image analysis. A thermomechanical finite element model of the SLM process is constructed and validated by comparing the densities retrieved from the model with the experimentally determined densities. With this simulation tool one can explore the effect of different parameters on density before printing any samples. Establishing a parameter window gives the user freedom in parameter selection, such as choosing the parameters that yield the fastest print speed.

  3. Polycrystalline CVD diamond device level modeling for particle detection applications

    NASA Astrophysics Data System (ADS)

    Morozzi, A.; Passeri, D.; Kanxheri, K.; Servoli, L.; Lagomarsino, S.; Sciortino, S.

    2016-12-01

    Diamond is a promising material whose excellent physical properties foster its use for radiation detection applications, in particular in hostile operating environments where the behavior of silicon-based detectors is limited by the high radiation fluence. Within this framework, Technology Computer Aided Design (TCAD) simulation tools are highly desirable for the study, optimization, and predictive analysis of sensing devices. Because diamond is a newcomer to electronics, the material is not included in the libraries of commercial, state-of-the-art TCAD software tools. In this work, we propose the development, application, and validation of numerical models to simulate the electrical behavior of polycrystalline (pc)CVD diamond sensors for particle detection. The model focuses on a physically based characterization of the pcCVD diamond bandgap, taking into account deep-level defects acting as recombination centers and/or trap states. While a definitive picture of the polycrystalline diamond bandgap is still debated, the effect of the main parameters (e.g., trap densities and capture cross-sections) can be investigated in depth with the simulation approach. The charge collection efficiency under β-particle irradiation of diamond materials from different vendors and with different electrode configurations was selected as the figure of merit for model validation. The good agreement between measurements and simulation, with the trap density as the only fitting parameter, establishes the suitability of the TCAD modeling approach as a predictive tool for the design and optimization of diamond-based radiation detectors.

  4. Bottled SAFT: A Web App Providing SAFT-γ Mie Force Field Parameters for Thousands of Molecular Fluids.

    PubMed

    Ervik, Åsmund; Mejía, Andrés; Müller, Erich A

    2016-09-26

    Coarse-grained molecular simulation has become a popular tool for modeling simple and complex fluids alike. The defining aspects of a coarse-grained model are its force field parameters, which must be determined for each particular fluid. Because the number of molecular fluids of interest in nature and in engineering processes is immense, constructing force field parameter tables by individually fitting to experimental data is a futile task. A step toward solving this challenge was taken recently by Mejía et al., who proposed a correlation that provides SAFT-γ Mie force field parameters for a fluid from its critical temperature, acentric factor, and a liquid density, all relatively accessible properties. Building on this, we have applied the correlation to more than 6000 fluids and constructed a web application, called "Bottled SAFT", which makes this data set easily searchable by CAS number, name, or chemical formula. Alternatively, the application allows the user to calculate parameters for components not present in the database. Once the intermolecular potential has been found through Bottled SAFT, code snippets are provided for simulating the desired substance using the "raaSAFT" framework, which leverages established molecular dynamics codes to run the simulations. The code underlying the web application is written in Python using the Flask microframework; this allows us to provide a modern high-performance web app while also making use of the scientific libraries available in Python. Bottled SAFT aims to take the complexity out of obtaining force field parameters for a wide range of molecular fluids, and facilitates setting up and running coarse-grained molecular simulations. The web application is freely available at http://www.bottledsaft.org. The underlying source code is available on Bitbucket under a permissive license.
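    The parameters served by Bottled SAFT define a Mie (λr, λa) pair potential. A minimal sketch of the standard Mie form follows (the prefactor C normalizes the well depth to exactly −ε; the parameter values used here are placeholders, not entries from the database):

```python
import math

def mie_potential(r, sigma, epsilon, lambda_r=12.0, lambda_a=6.0):
    """Mie (lambda_r, lambda_a) pair potential, the form used by
    SAFT-gamma Mie force fields.

    C = (lr/(lr-la)) * (lr/la)**(la/(lr-la)) makes the minimum depth -epsilon.
    For (12, 6) this reduces to the familiar Lennard-Jones prefactor 4.
    """
    c = (lambda_r / (lambda_r - lambda_a)) * (lambda_r / lambda_a) ** (
        lambda_a / (lambda_r - lambda_a))
    return c * epsilon * ((sigma / r) ** lambda_r - (sigma / r) ** lambda_a)

# Location of the minimum for a Mie (12, 6) fluid, in units of sigma:
r_min = (12.0 / 6.0) ** (1.0 / 6.0)
```

    The potential crosses zero at r = σ and reaches −ε at r_min, which is a quick sanity check on any parameter set pulled from the database.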

  5. Drafting Recommendations for a Shared Statewide High-Density Storage Facility: Experiences with the State University Libraries of Florida Proposal

    ERIC Educational Resources Information Center

    Walker, Ben

    2008-01-01

    In August 2007, an $11.2 million proposal for a shared statewide high-density storage facility was submitted to the Board of Governors, the governing body of the State University System in Florida. The project was subsequently approved at a slightly lower level and funding was delayed until 2010/2011. The experiences of coordinating data…

  6. Technical Considerations for Reduced Representation Bisulfite Sequencing with Multiplexed Libraries

    PubMed Central

    Chatterjee, Aniruddha; Rodger, Euan J.; Stockwell, Peter A.; Weeks, Robert J.; Morison, Ian M.

    2012-01-01

    Reduced representation bisulfite sequencing (RRBS), which couples bisulfite conversion with next-generation sequencing, is an innovative method that specifically enriches genomic regions with a high density of potential methylation sites and enables investigation of DNA methylation at single-nucleotide resolution. Recent advances in the Illumina DNA sample preparation protocol and sequencing technology have vastly improved sequencing throughput. Although the new Illumina technology is now widely used, the unique challenges associated with multiplexed RRBS libraries on this platform have not been previously described. We have made modifications to the RRBS library preparation protocol to sequence multiplexed libraries on a single flow-cell lane of the Illumina HiSeq 2000. Furthermore, our analysis incorporates a bioinformatics pipeline specifically designed to process bisulfite-converted sequencing reads and to evaluate the output and quality of the sequencing data generated from the multiplexed libraries. We obtained an average of 42 million paired-end reads per sample for each flow-cell lane, with a high unique mapping efficiency to the reference human genome. Here we provide a roadmap of the modifications, strategies, and troubleshooting approaches we implemented to optimize sequencing of multiplexed libraries in an RRBS context. PMID:23193365

  7. FreeSASA: An open source C library for solvent accessible surface area calculations.

    PubMed

    Mitternacht, Simon

    2016-01-01

    Calculating solvent accessible surface areas (SASA) is a run-of-the-mill calculation in structural biology. Although there are many programs available for this calculation, there are no free-standing, open-source tools designed for easy tool-chain integration. FreeSASA is an open source C library for SASA calculations that provides both command-line and Python interfaces in addition to its C API. The library implements both Lee and Richards' and Shrake and Rupley's approximations, and is highly configurable to allow the user to control molecular parameters, accuracy and output granularity. It only depends on standard C libraries and should therefore be easy to compile and install on any platform. The library is well-documented, stable and efficient. The command-line interface can easily replace closed source legacy programs, with comparable or better accuracy and speed, and with some added functionality.
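    Shrake and Rupley's approximation, one of the two algorithms FreeSASA implements, can be sketched in pure Python: scatter test points on each atom's solvent-expanded sphere and count those not buried inside any neighbor (this is the generic algorithm, not FreeSASA's C implementation; the radii and probe size below are illustrative):

```python
import math

def sphere_points(n):
    """Roughly uniform points on the unit sphere (golden-spiral lattice)."""
    pts, golden = [], math.pi * (3.0 - math.sqrt(5.0))
    for i in range(n):
        y = 1.0 - 2.0 * i / (n - 1)
        r = math.sqrt(max(0.0, 1.0 - y * y))
        theta = golden * i
        pts.append((r * math.cos(theta), y, r * math.sin(theta)))
    return pts

def shrake_rupley(atoms, radii, probe=1.4, n_points=256):
    """SASA per atom: fraction of test points on the expanded sphere
    (atom radius + probe radius) not inside any neighboring sphere."""
    pts = sphere_points(n_points)
    areas = []
    for i, (xi, yi, zi) in enumerate(atoms):
        ri = radii[i] + probe
        exposed = 0
        for (px, py, pz) in pts:
            tx, ty, tz = xi + ri * px, yi + ri * py, zi + ri * pz
            buried = False
            for j, (xj, yj, zj) in enumerate(atoms):
                if j == i:
                    continue
                rj = radii[j] + probe
                if (tx - xj) ** 2 + (ty - yj) ** 2 + (tz - zj) ** 2 < rj * rj:
                    buried = True
                    break
            if not buried:
                exposed += 1
        areas.append(4.0 * math.pi * ri * ri * exposed / n_points)
    return areas
```

    An isolated atom recovers the full sphere area 4π(r + probe)², and accuracy is controlled by `n_points`, mirroring the accuracy/granularity knobs the abstract mentions.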

  8. CHEMICAL EVOLUTION LIBRARY FOR GALAXY FORMATION SIMULATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saitoh, Takayuki R., E-mail: saitoh@elsi.jp

    We have developed a software library for chemical evolution simulations of galaxy formation under the simple stellar population (SSP) approximation. In this library, all of the necessary components of chemical evolution, such as initial mass functions, stellar lifetimes, yields from Type II and Type Ia supernovae, asymptotic giant branch stars, and neutron star mergers, are compiled from the literature. Various models are pre-implemented so that users can choose their favorite combination. Subroutines of the library return the released energy and the masses of individual elements for a given event type. Since the manner in which these quantities are redistributed depends on the implementation of the user's simulation code, the library leaves redistribution to the simulation code. As demonstrations, we carry out both one-zone closed-box simulations and 3D simulations of a collapsing gas and dark matter system using this library. In these simulations, we can easily compare the impact of individual models on the chemical evolution of galaxies, just by changing the control flags and parameters of the library. Since the library only deals with chemical evolution under the SSP approximation, any simulation code that uses the SSP approximation (namely, particle-based and mesh codes, as well as semianalytical models) can use it. The library is named "CELib", after "Chemical Evolution Library", and is made available to the community.
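    The SSP bookkeeping described above can be illustrated with a small calculation (this does not reproduce CELib's actual interface): the mass fraction of a stellar population born into an assumed Type II supernova progenitor range of 8-100 solar masses, under an assumed Salpeter initial mass function with slope 2.35:

```python
def imf_mass_fraction(m_lo, m_hi, m_min=0.1, m_max=100.0, slope=2.35):
    """Mass fraction of an SSP born with masses in [m_lo, m_hi],
    for a power-law IMF xi(m) ~ m**-slope over [m_min, m_max].

    The mass-weighted integrand m * m**-slope integrates analytically.
    """
    def integral(a, b):
        p = 2.0 - slope
        return (b ** p - a ** p) / p
    return integral(m_lo, m_hi) / integral(m_min, m_max)

# Fraction of the SSP's initial mass in SN II progenitors (assumed 8-100 Msun):
snii_fraction = imf_mass_fraction(8.0, 100.0)
```

    A library such as the one described would combine weights like this with stellar lifetimes and yield tables to return, per event type, the energy and element masses released by an SSP particle.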

  9. DNAism: exploring genomic datasets on the web with Horizon Charts.

    PubMed

    Rio Deiros, David; Gibbs, Richard A; Rogers, Jeffrey

    2016-01-27

    Computational biologists face a daily need to explore massive amounts of genomic data. New visualization techniques can help researchers navigate and understand these large datasets. Horizon Charts are a relatively new visualization method that, under the right circumstances, maximizes data density without losing graphical perception. Horizon Charts have been successfully applied to understanding multi-metric time series data. We have adapted an existing JavaScript library (Cubism) that implements Horizon Charts for the time-series domain so that it works effectively with genomic datasets. We call this new library DNAism. Horizon Charts can be an effective visual tool for exploring complex and large genomic datasets. Researchers can use our library to leverage these techniques and extract additional insights from their own datasets.
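    The core horizon-chart transform, independent of the D3/Cubism machinery DNAism builds on, folds a series into stacked bands so that tall peaks reuse the same vertical space. A minimal sketch (positive values only; real horizon charts also mirror negative values into the same bands):

```python
def horizon_bands(values, band_height, n_bands):
    """For each value, return the fill fraction (0..1) of each stacked band.

    Band k covers the value range [k * band_height, (k+1) * band_height];
    overplotting the bands yields the compact horizon-chart display.
    """
    out = []
    for v in values:
        fills = []
        for k in range(n_bands):
            lo = k * band_height
            fills.append(max(0.0, min(1.0, (v - lo) / band_height)))
        out.append(fills)
    return out
```

    A value of 2.5 with band height 1 saturates the first two bands and half-fills the third, which is why the display stays legible while using a third of the height.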

  10. Library of Giant Planet Reflection Spectra for WFIRST and Future Space Telescopes

    NASA Astrophysics Data System (ADS)

    Smith, Adam J. R. W.; Fortney, Jonathan; Morley, Caroline; Batalha, Natasha E.; Lewis, Nikole K.

    2018-01-01

    Future large space telescopes will be able to directly image exoplanets in optical light. The optical light of a resolved planet is stellar flux reflected by Rayleigh scattering or cloud scattering, with absorption features imprinted by molecular bands in the planetary atmosphere. To aid in the design of such missions, and to better understand a wide range of giant planet atmospheres, we have built a library of model giant planet reflection spectra, both for determining effective methods of spectral analysis and for comparison with actual imaged objects. The library covers a wide range of parameters: objects are modeled at ten orbital distances between 0.5 AU and 5.0 AU, ranging from planets too warm for water clouds out to true Jupiter analogs. The calculations include six metallicities between solar and 100× solar, a variety of cloud thickness parameters, and all possible phase angles.

  11. Elucidation of mechanisms of actions of thymoquinone-enriched methanolic and volatile oil extracts from Nigella sativa against cardiovascular risk parameters in experimental hyperlipidemia.

    PubMed

    Ahmad, Shafeeque; Beg, Zafarul H

    2013-06-13

    Nigella sativa, of the family Ranunculaceae, has reportedly been used for thousands of years as a protective and curative traditional medicine against a number of diseases. GC-MS analysis of the methanolic extract (ME) and volatile oil (VO) from Nigella sativa seed oil was performed against two mass spectrometry libraries, WILEY8 and NIST05s. The cholesterol-lowering and antioxidant actions of the VO and ME fractions were investigated in rats fed an atherogenic suspension. Four groups of male Wistar rats were studied for 30 days: normolipidemic control (NLP-C), hyperlipidemic control (HLP-C), methanolic-extract-treated (HLP-ME), and volatile-oil-treated (HLP-VO). A P value < 0.05 was considered significant. Administration of the atherogenic suspension for 30 days markedly increased plasma triglycerides and total cholesterol and significantly changed plasma lipoprotein levels, along with a decrease in antioxidant arylesterase activity, in the HLP-C group. Oral feeding of 100 mg ME or 20 mg VO per rat per day effectively reduced plasma triglycerides to near-normal levels, while high density lipoprotein cholesterol and its subfraction, along with arylesterase activity, were significantly increased. The test fractions elicited a significant decrease in hepatic HMG-CoA reductase activity. The fractions significantly blocked ex vivo basal and in vitro maximal formation of conjugated diene and malondialdehyde, and lengthened the lag times of low density lipoprotein, small dense low density lipoprotein, and large buoyant low density lipoprotein. ME, containing the active compounds ω-6 linoleic acid and palmitic acid, was more effective than the VO extract, which contains the phenolic antioxidants thymol and isothymol (the phenolic compound thymoquinone is common to both extracts), acting through reduction of hepatic HMG-CoA reductase activity as well as antioxidant mechanisms. Both extracts, especially ME, significantly improved cardiovascular risk parameters in treated rats and may be useful in reactive-oxygen-species-related disorders such as cardiovascular disease.

  12. Revived STIS. II. Properties of Stars in the Next Generation Spectral Library

    NASA Technical Reports Server (NTRS)

    Heap, Sara R.; Lindler, D.

    2010-01-01

    Spectroscopic surveys of galaxies at high redshift will bring the rest-frame ultraviolet into view of large, ground-based telescopes. The UV-blue spectral region is rich in diagnostics, but these diagnostics have not yet been calibrated in terms of the properties of the responsible stellar population(s). Such calibrations are now possible with Hubble's Next Generation Spectral Library (NGSL). The NGSL contains UV-optical spectra (0.2 - 1.0 microns) of 374 stars having a wide range in temperature, luminosity, and metallicity. We will describe our work to derive basic stellar parameters from NGSL spectra using modern model spectra and to use these stellar parameters to develop UV-blue spectral diagnostics.

  13. The UF/NCI family of hybrid computational phantoms representing the current US population of male and female children, adolescents, and adults—application to CT dosimetry

    NASA Astrophysics Data System (ADS)

    Geyer, Amy M.; O'Reilly, Shannon; Lee, Choonsik; Long, Daniel J.; Bolch, Wesley E.

    2014-09-01

    Substantial increases in pediatric and adult obesity in the US have prompted a major revision to the current UF/NCI (University of Florida/National Cancer Institute) family of hybrid computational phantoms to more accurately reflect current trends in larger body morphometry. A decision was made to construct the new library in a gridded fashion by height/weight without further reference to age-dependent weight/height percentiles as these become quickly outdated. At each height/weight combination, circumferential parameters were defined and used for phantom construction. All morphometric data for the new library were taken from the CDC NHANES survey data over the time period 1999-2006, the most recent reported survey period. A subset of the phantom library was then used in a CT organ dose sensitivity study to examine the degree to which body morphometry influences the magnitude of organ doses for patients that are underweight to morbidly obese in body size. Using primary and secondary morphometric parameters, grids containing 100 adult male height/weight bins, 93 adult female height/weight bins, 85 pediatric male height/weight bins and 73 pediatric female height/weight bins were constructed. These grids served as the blueprints for construction of a comprehensive library of patient-dependent phantoms containing 351 computational phantoms. At a given phantom standing height, normalized CT organ doses were shown to linearly decrease with increasing phantom BMI for pediatric males, while curvilinear decreases in organ dose were shown with increasing phantom BMI for adult females. These results suggest that one very useful application of the phantom library would be the construction of a pre-computed dose library for CT imaging as needed for patient dose-tracking.

  14. A novel patterning control strategy based on real-time fingerprint recognition and adaptive wafer level scanner optimization

    NASA Astrophysics Data System (ADS)

    Cekli, Hakki Ergun; Nije, Jelle; Ypma, Alexander; Bastani, Vahid; Sonntag, Dag; Niesing, Henk; Zhang, Linmiao; Ullah, Zakir; Subramony, Venky; Somasundaram, Ravin; Susanto, William; Matsunobu, Masazumi; Johnson, Jeff; Tabery, Cyrus; Lin, Chenxi; Zou, Yi

    2018-03-01

    In addition to lithography process- and equipment-induced variations, processes such as etching, annealing, film deposition, and planarization exhibit variations, each having its own intrinsic characteristics and leaving an effect, a "fingerprint", on the wafers. With ever tighter requirements for CD and overlay, controlling these process-induced variations is both increasingly important and increasingly challenging in advanced integrated circuit (IC) manufacturing. For example, the on-product overlay (OPO) requirement for future nodes is approaching <3 nm, leaving an extremely small allowable budget for process-induced variance. Process-variance control is seen as a bottleneck to further shrink, which drives the need for more sophisticated process control strategies. In this context we developed a novel "computational process control" strategy that provides proactive control of each individual wafer with the aim of maximizing yield, without significant impact on metrology requirements, cycle time, or productivity. The complexity of the wafer process is addressed by characterizing the full wafer stack and building a fingerprint library containing key patterning performance parameters such as overlay and focus. Historical wafer metrology is decomposed into dominant fingerprints using Principal Component Analysis. By associating observed fingerprints with their origins, e.g. process steps, tools, and variables, we can give an inline assessment of the strength and origin of the fingerprints on every wafer. Once the fingerprint library is established, wafer-specific fingerprint correction recipes can be determined from each wafer's processing history. Data science techniques are used in real time to ensure that the library is adaptive. To realize this concept, ASML TWINSCAN scanners play a vital role with their on-board full-wafer detection and exposure correction capabilities.
High density metrology data is created by the scanner for each wafer and on every layer during the lithography steps. This metrology data will be used to obtain the process fingerprints. Also, the per exposure and per wafer correction potential of the scanners will be utilized for improved patterning control. Additionally, the fingerprint library will provide early detection of excursions for inline root cause analysis and process optimization guidance.
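    For the first principal component, the decomposition step described above (extracting the dominant wafer fingerprint from historical metrology) reduces to power iteration on the mean-centered metrology matrix. A minimal pure-Python sketch with toy data, not actual scanner metrology:

```python
import math

def first_fingerprint(rows, iters=200):
    """First principal component ("dominant fingerprint") via power iteration.

    rows: wafers x measurement-sites matrix of metrology values.
    Columns are mean-centered, then v is iterated as v <- X^T X v / |X^T X v|.
    """
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    v = [1.0] * d
    for _ in range(iters):
        scores = [sum(xi[j] * v[j] for j in range(d)) for xi in x]   # X v
        w = [sum(scores[i] * x[i][j] for i in range(n)) for j in range(d)]  # X^T X v
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]
    return v
```

    Each wafer's score along this component measures the fingerprint's strength on that wafer, which is the inline per-wafer assessment the abstract refers to.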

  15. Atmospheric and Fundamental Parameters of Stars in Hubble's Next Generation Spectral Library

    NASA Technical Reports Server (NTRS)

    Heap, Sally

    2010-01-01

    Hubble's Next Generation Spectral Library (NGSL) consists of R ≈ 1000 spectra of 374 stars of assorted temperature, gravity, and metallicity. We are presently working to determine the atmospheric and fundamental parameters of the stars from the NGSL spectra themselves, via full-spectrum fitting of model spectra to the observed (extinction-corrected) spectrum over the full wavelength range, 0.2-1.0 micron. We use two grids of model spectra for this purpose: the very-low-resolution spectral grid of Castelli-Kurucz (2004) and the MARCS grid (2008). Both the observed spectrum and the MARCS spectra are first degraded in resolution to match the very low resolution of the Castelli-Kurucz models, so that the fitting technique is the same for both model grids. We will present our preliminary results together with a comparison against the Sloan/SEGUE Stellar Parameter Pipeline, ELODIE, MILES, etc.
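    Full-spectrum fitting against a model grid amounts to minimizing chi-squared over the grid. A toy sketch of that selection step (the "spectra" below are synthetic stand-in lists keyed by effective temperature, not Castelli-Kurucz or MARCS models):

```python
def fit_grid(observed, grid, sigma=1.0):
    """Return the grid label whose model spectrum minimizes chi-squared
    against the observed spectrum (uniform uncertainty sigma assumed)."""
    def chi2(model):
        return sum((o - m) ** 2 for o, m in zip(observed, model)) / sigma ** 2
    return min(grid, key=lambda label: chi2(grid[label]))

# Hypothetical grid keyed by Teff, with made-up normalized flux vectors.
toy_grid = {
    5000: [1.0, 0.8, 0.6],
    6000: [1.0, 0.9, 0.8],
    7000: [1.0, 1.0, 1.0],
}
best = fit_grid([1.0, 0.9, 0.8], toy_grid)
```

    In practice the observed and model spectra are first degraded to a common resolution, as the abstract describes, so the same comparison applies to both grids.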

  16. CHEMKIN2. General Gas-Phase Chemical Kinetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rupley, F.M.

    1992-01-24

    CHEMKIN is a high-level tool with which chemists can describe arbitrary gas-phase chemical reaction mechanisms and their systems of governing equations. It remains, however, for the user to select and implement a solution method; this is not provided. CHEMKIN consists of two major components: the Interpreter and the Gas-Phase Subroutine Library. The Interpreter reads a symbolic description of an arbitrary, user-specified chemical reaction mechanism and generates a data file which forms a link to the Gas-Phase Subroutine Library, a collection of about 200 modular subroutines which may be called to return thermodynamic properties, chemical production rates, derivatives of thermodynamic properties, derivatives of chemical production rates, or sensitivity parameters. Both single- and double-precision versions of CHEMKIN are included. Also provided is a set of FORTRAN subroutines for evaluating gas-phase transport properties such as thermal conductivities, viscosities, and diffusion coefficients. These properties are an important part of any computational simulation of a chemically reacting flow. The transport-property subroutines are designed to be used in conjunction with the CHEMKIN Subroutine Library. The transport properties depend on the state of the gas and on certain molecular parameters: the Lennard-Jones potential well depth and collision diameter, the dipole moment, the polarizability, and the rotational relaxation collision number.
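    The elementary rate constants behind CHEMKIN-style chemical production rates follow the modified Arrhenius form k = A·T^β·exp(−E/RT). A minimal sketch (the coefficient values below are placeholders, not taken from a real mechanism; activation energy is assumed in cal/mol, a common CHEMKIN convention):

```python
import math

R_CAL = 1.987  # gas constant in cal/(mol K)

def arrhenius(temp_k, a, beta, e_act):
    """Modified Arrhenius rate constant: k = A * T**beta * exp(-E/(R*T)).

    a:     pre-exponential factor
    beta:  temperature exponent
    e_act: activation energy in cal/mol (assumed convention)
    """
    return a * temp_k ** beta * math.exp(-e_act / (R_CAL * temp_k))
```

    With β = 0 and E = 0 the rate reduces to the pre-exponential factor, and for E > 0 the rate grows with temperature, which are the two sanity checks worth making on any mechanism entry.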

  17. Scanning electron microscope measurement of width and shape of 10nm patterned lines using a JMONSEL-modeled library.

    PubMed

    Villarrubia, J S; Vladár, A E; Ming, B; Kline, R J; Sunday, D F; Chawla, J S; List, S

    2015-07-01

    The width and shape of 10 nm to 12 nm wide lithographically patterned SiO2 lines were measured in the scanning electron microscope by fitting the measured intensity vs. position to a physics-based model in which the lines' widths and shapes are parameters. The approximately 32 nm pitch sample was patterned at Intel using a state-of-the-art pitch-quartering process. The lines' narrow widths and asymmetrical shapes are representative of near-future-generation transistor gates. These pose a challenge: the narrowness because electrons landing near one edge may scatter out of the other, so that the intensity profile at each edge becomes width-dependent, and the asymmetry because the shape requires more parameters to describe and measure. Modeling was performed with JMONSEL (Java Monte Carlo Simulation of Secondary Electrons), which produces a predicted yield vs. position for a given sample shape and composition. The simulator produces a library of predicted profiles for varying sample geometry; shape parameter values are adjusted until interpolation of the library with those values best matches the measured image. Profiles determined in this way agreed with those determined by transmission electron microscopy and critical-dimension small-angle x-ray scattering to better than 1 nm. Published by Elsevier B.V.

  18. SELECTIVE DISSEMINATION OF INFORMATION--REVIEW OF SELECTED SYSTEMS AND A DESIGN FOR ARMY TECHNICAL LIBRARIES. FINAL REPORT. ARMY TECHNICAL LIBRARY IMPROVEMENT STUDIES (ATLIS), REPORT NO. 8.

    ERIC Educational Resources Information Center

    BIVONA, WILLIAM A.

    THIS REPORT PRESENTS AN ANALYSIS OF OVER EIGHTEEN SMALL, INTERMEDIATE, AND LARGE SCALE SYSTEMS FOR THE SELECTIVE DISSEMINATION OF INFORMATION (SDI). SYSTEMS ARE COMPARED AND ANALYZED WITH RESPECT TO DESIGN CRITERIA AND THE FOLLOWING NINE SYSTEM PARAMETERS--(1) INFORMATION INPUT, (2) METHODS OF INDEXING AND ABSTRACTING, (3) USER INTEREST PROFILE…

  19. AQUATOX Frequently Asked Questions

    EPA Pesticide Factsheets

    Capabilities, Installation, Source Code, Example Study Files, Biotic State Variables, Initial Conditions, Loadings, Volume, Sediments, Parameters, Libraries, Ecotoxicology, Waterbodies, Link to Watershed Models, Output, Metals, Troubleshooting

  20. Computer Center CDC Libraries/NSRD (Subprograms).

    DTIC Science & Technology

    1984-06-01

    VALUES Y - ARRAY OF CORRESPONDING Y-VALUES N - NUMBER OF VALUES CM REQUIRED: IOOB ERROR MESSAGE ' L=XXXXX, X=X.XXXXXXX E+YY, X NOT MONOTONE STOP SELF ...PARAMETERS (SUBSEQUENT REPORTS MAY BE UNSOLICITED) . PCRTP1 - REQUEST TERMINAL PARAMETERS (SUBSEQUENT REPORTS ONLY IN RESPONSE TO HOST REQUEST) DA - REQUEST

  1. Balancing novelty with confined chemical space in modern drug discovery.

    PubMed

    Medina-Franco, José L; Martinez-Mayorga, Karina; Meurice, Nathalie

    2014-02-01

    The concept of chemical space has broad applications in drug discovery. In response to the needs of drug discovery campaigns, different approaches are followed to efficiently populate, mine, and select relevant chemical spaces that overlap with biologically relevant chemical spaces. This paper reviews major trends in current drug discovery and their impact on the mining and population of chemical space. We also survey different approaches to developing screening libraries with confined chemical spaces that balance physicochemical properties. In this context, the confinement is guided by criteria that can be divided into two broad categories: i) library design focused on a relevant therapeutic target or disease, and ii) library design focused on the chemistry or a desired molecular function. The design and development of chemical libraries should be associated with the specific purpose of the library and the project goals. The high complexity of drug discovery and the inherent imperfection of individual experimental and computational technologies prompt the integration of complementary library design and screening approaches to expedite the identification of new and better drugs. Library design approaches including diversity-oriented synthesis, biology-oriented synthesis, or combinatorial library design, to name a few, and the design of focused libraries driven by target/disease, chemical structure, or molecular function are more efficient if they are guided by multi-parameter optimization. In this context, consideration of pharmaceutically relevant properties is essential for balancing novelty with chemical space in drug discovery.

  2. Kernel-density estimation and approximate Bayesian computation for flexible epidemiological model fitting in Python.

    PubMed

    Irvine, Michael A; Hollingsworth, T Déirdre

    2018-05-26

    Fitting complex models to epidemiological data is a challenging problem: methodologies can be inaccessible to all but specialists, there may be challenges in adequately describing uncertainty in model fitting, the complex models may take a long time to run, and it can be difficult to fully capture the heterogeneity in the data. We develop an adaptive approximate Bayesian computation scheme to fit a variety of epidemiologically relevant data with minimal hyper-parameter tuning by using an adaptive tolerance scheme. We implement a novel kernel density estimation scheme to capture both dispersed and multi-dimensional data, and directly compare this technique to standard Bayesian approaches. We then apply the procedure to a complex individual-based simulation of lymphatic filariasis, a human parasitic disease. The procedure and examples are released alongside this article as an open access library, with examples to aid researchers to rapidly fit models to data. This demonstrates that an adaptive ABC scheme with a general summary and distance metric is capable of performing model fitting for a variety of epidemiological data. It also does not require significant theoretical background to use and can be made accessible to the diverse epidemiological research community. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
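    The adaptive-tolerance idea can be sketched with plain rejection ABC on a toy Gaussian-mean problem (this is not the lymphatic filariasis model, and it does not reproduce the released library's API; each round keeps the closest-matching candidates, so the effective tolerance tightens automatically):

```python
import random

random.seed(1)

def simulate(theta, n=50):
    """Toy model: the summary statistic is the mean of n noisy draws."""
    return sum(random.gauss(theta, 1.0) for _ in range(n)) / n

def abc_round(candidates, s_obs, keep_frac=0.2):
    """Rejection ABC round: keep the candidates whose simulated summaries
    are closest to the observed summary; the achieved tolerance is the
    largest accepted distance."""
    scored = sorted((abs(simulate(th) - s_obs), th) for th in candidates)
    kept = scored[: int(len(scored) * keep_frac)]
    return [th for _, th in kept], kept[-1][0]

s_obs = 3.0                                  # observed summary statistic
prior = [random.uniform(-10, 10) for _ in range(500)]
round1, tol1 = abc_round(prior, s_obs)
# Adaptive step: perturb the survivors and select again; the effective
# tolerance shrinks as the population concentrates near the posterior.
round2, tol2 = abc_round(
    [random.gauss(th, 0.5) for th in round1 for _ in range(5)], s_obs)
posterior_mean = sum(round2) / len(round2)
```

    The paper's scheme replaces the fixed keep-fraction with an adaptive tolerance schedule and the naive distance with a kernel-density-based comparison, but the accept/shrink loop has this shape.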

  3. Hubble's Next Generation Spectral Library

    NASA Astrophysics Data System (ADS)

    Heap, Sara R.; Lindler, D.

    2008-03-01

    Spectroscopic surveys of galaxies at z ≈ 1 or more bring the rest-frame ultraviolet into view of large, ground-based telescopes. This spectral region is rich in diagnostics, but these diagnostics have not yet been calibrated in terms of the properties of the responsible stellar population(s). Such calibrations are now possible with Hubble's Next Generation Spectral Library (NGSL). This library contains UV-optical spectra (0.2-1.0 microns) of 378 stars having a wide range in temperature, luminosity, and metallicity. We have derived basic stellar parameters from the optical spectral region (0.35-1.0 microns) and are using them to calibrate UV spectral diagnostic indices and colors.

  4. Modeling of frequency agile devices: development of PKI neuromodeling library based on hierarchical network structure

    NASA Astrophysics Data System (ADS)

    Sanchez, P.; Hinojosa, J.; Ruiz, R.

    2005-06-01

    Recently, neuromodeling methods of microwave devices have been developed. These methods are suitable for the model generation of novel devices. They allow fast and accurate simulations and optimizations. However, developing libraries with these methods is a formidable task, since they require massive input-output data provided by an electromagnetic simulator or measurements, and repeated artificial neural network (ANN) training. This paper presents a strategy that reduces the cost of library development while retaining the advantages of neuromodeling methods: high accuracy, a large range of geometrical and material parameters, and reduced CPU time. The library models are developed from a set of base prior knowledge input (PKI) models, which take into account the characteristics common to all the models in the library, and high-level ANNs which give the library model outputs from base PKI models. This technique is illustrated for a microwave multiconductor tunable phase shifter using anisotropic substrates. Closed-form relationships have been developed and are presented in this paper. The results show good agreement with expected values.

  5. Quantum mechanical energy-based screening of combinatorially generated library of tautomers. TauTGen: a tautomer generator program.

    PubMed

    Harańczyk, Maciej; Gutowski, Maciej

    2007-01-01

    We describe a procedure for finding low-energy tautomers of a molecule. The procedure consists of (i) combinatorial generation of a library of tautomers, (ii) screening based on the results of geometry optimization of initial structures performed at the density functional level of theory, and (iii) final refinement of geometry for the top hits at the second-order Møller-Plesset level of theory followed by single-point energy calculations at the coupled cluster level of theory with single, double, and perturbative triple excitations. The library of initial structures of various tautomers is generated with TauTGen, a tautomer generator program. The procedure proved successful for molecular systems for which common chemical knowledge had not been sufficient to predict the most stable structures.
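Step (i), combinatorial generation of the tautomer library, can be sketched as below. The site names and per-site energies are hypothetical placeholders standing in for the DFT/MP2/CCSD(T) energies of the real screening, and TauTGen's chemical filtering rules are not reproduced; this only illustrates the enumerate-then-rank pattern.

```python
from itertools import combinations

def generate_tautomers(sites, n_protons):
    # Combinatorially distribute n_protons over candidate heavy-atom sites;
    # each tautomer is represented by its tuple of proton-bearing sites
    return list(combinations(sites, n_protons))

def toy_energy(tautomer, site_energy):
    # Placeholder for the DFT/MP2/CCSD(T) energies of the real screening
    return sum(site_energy[s] for s in tautomer)

sites = ["N1", "N3", "N7", "N9", "O2"]          # hypothetical sites
site_energy = {"N1": 0.0, "N3": 0.4, "N7": 0.1, "N9": 0.2, "O2": 0.9}

library = generate_tautomers(sites, 2)           # C(5,2) = 10 structures
ranked = sorted(library, key=lambda t: toy_energy(t, site_energy))
print(len(library), ranked[0])
```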

  6. Assessment of antibody library diversity through next generation sequencing and technical error compensation

    PubMed Central

    Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino

    2017-01-01

    Antibody libraries are important resources to derive antibodies to be used for a wide range of applications, from structural and functional studies to intracellular protein interference studies to developing new diagnostics and therapeutics. Whatever the goal, the key parameter for an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody against a given antigen, of sufficiently high affinity. Quantitative evaluation of antibody library complexity and quality has long been inadequately addressed, owing to the high similarity and length of the library sequences. Complexity was usually inferred from the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sampling is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle the assessment of antibody library complexity and quality. However, much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable estimate of antibody library complexity, here we present a new, PCR-free, NGS approach to sequence antibody libraries on the Illumina platform, coupled to a new bioinformatic analysis and software (Diversity Estimator of Antibody Library, DEAL) that allows the complexity to be estimated reliably while taking sequencing error into consideration. PMID:28505201

  7. Assessment of antibody library diversity through next generation sequencing and technical error compensation.

    PubMed

    Fantini, Marco; Pandolfini, Luca; Lisi, Simonetta; Chirichella, Michele; Arisi, Ivan; Terrigno, Marco; Goracci, Martina; Cremisi, Federico; Cattaneo, Antonino

    2017-01-01

    Antibody libraries are important resources to derive antibodies to be used for a wide range of applications, from structural and functional studies to intracellular protein interference studies to developing new diagnostics and therapeutics. Whatever the goal, the key parameter for an antibody library is its complexity (also known as diversity), i.e. the number of distinct elements in the collection, which directly reflects the probability of finding in the library an antibody against a given antigen, of sufficiently high affinity. Quantitative evaluation of antibody library complexity and quality has long been inadequately addressed, owing to the high similarity and length of the library sequences. Complexity was usually inferred from the transformation efficiency and tested by fingerprinting and/or sequencing of a few hundred random library elements. Inferring complexity from such a small sampling is, however, very rudimentary and gives limited information about the real diversity, because complexity does not scale linearly with sample size. Next-generation sequencing (NGS) has opened new ways to tackle the assessment of antibody library complexity and quality. However, much remains to be done to fully exploit the potential of NGS for the quantitative analysis of antibody repertoires and to overcome current limitations. To obtain a more reliable estimate of antibody library complexity, here we present a new, PCR-free, NGS approach to sequence antibody libraries on the Illumina platform, coupled to a new bioinformatic analysis and software (Diversity Estimator of Antibody Library, DEAL) that allows the complexity to be estimated reliably while taking sequencing error into consideration.
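The idea of counting distinct sequences while compensating for sequencing error can be illustrated with a deliberately crude heuristic: absorb rare reads that sit within one mismatch of a more abundant read. This is an assumption-laden sketch of the general principle, not DEAL's algorithm.

```python
from collections import Counter

def hamming(a, b):
    # Number of mismatches between two equal-length sequences
    return sum(x != y for x, y in zip(a, b))

def estimate_diversity(reads, max_err=1):
    # Count distinct sequences, absorbing reads that sit within max_err
    # mismatches of a more abundant read (crude error compensation)
    counts = Counter(reads)
    centers = []
    for seq, n in counts.most_common():          # most abundant first
        if any(hamming(seq, c) <= max_err for c in centers):
            continue                             # likely a sequencing error
        centers.append(seq)
    return len(centers)

# Toy reads: three true clones plus two single-mismatch error reads
reads = ["ACGT"] * 10 + ["ACGA"] + ["TTTT"] * 5 + ["TTTA"] + ["GGGG"] * 3
print(estimate_diversity(reads))
```

A naive distinct count would report 5 here; the error-aware count collapses the two singletons onto their abundant neighbors.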

  8. Hierarchy and extremes in selections from pools of randomized proteins

    PubMed Central

    Boyer, Sébastien; Biswas, Dipanwita; Kumar Soshee, Ananda; Scaramozzino, Natale; Nizak, Clément; Rivoire, Olivier

    2016-01-01

    Variation and selection are the core principles of Darwinian evolution, but quantitatively relating the diversity of a population to its capacity to respond to selection is challenging. Here, we examine this problem at a molecular level in the context of populations of partially randomized proteins selected for binding to well-defined targets. We built several minimal protein libraries, screened them in vitro by phage display, and analyzed their response to selection by high-throughput sequencing. A statistical analysis of the results reveals two main findings. First, libraries with the same sequence diversity but built around different “frameworks” typically have vastly different responses; second, the distribution of responses of the best binders in a library follows a simple scaling law. We show how an elementary probabilistic model based on extreme value theory rationalizes the latter finding. Our results have implications for designing synthetic protein libraries, estimating the density of functional biomolecules in sequence space, characterizing diversity in natural populations, and experimentally investigating evolvability (i.e., the potential for future evolution). PMID:26969726

  9. Hierarchy and extremes in selections from pools of randomized proteins.

    PubMed

    Boyer, Sébastien; Biswas, Dipanwita; Kumar Soshee, Ananda; Scaramozzino, Natale; Nizak, Clément; Rivoire, Olivier

    2016-03-29

    Variation and selection are the core principles of Darwinian evolution, but quantitatively relating the diversity of a population to its capacity to respond to selection is challenging. Here, we examine this problem at a molecular level in the context of populations of partially randomized proteins selected for binding to well-defined targets. We built several minimal protein libraries, screened them in vitro by phage display, and analyzed their response to selection by high-throughput sequencing. A statistical analysis of the results reveals two main findings. First, libraries with the same sequence diversity but built around different "frameworks" typically have vastly different responses; second, the distribution of responses of the best binders in a library follows a simple scaling law. We show how an elementary probabilistic model based on extreme value theory rationalizes the latter finding. Our results have implications for designing synthetic protein libraries, estimating the density of functional biomolecules in sequence space, characterizing diversity in natural populations, and experimentally investigating evolvability (i.e., the potential for future evolution).
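The scaling-law analysis rests on extreme value theory: the best binder in each library is a maximum over many variants, and maxima of light-tailed response distributions follow a generalized extreme value (GEV) law. A sketch with synthetic exponential "responses" (invented for illustration, not the sequencing data of the study):

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)

# Synthetic responses: 500 libraries of 1000 variants each; selection
# reports the best (maximum) response per library
responses = rng.exponential(1.0, size=(500, 1000))
best = responses.max(axis=1)

# Maxima of exponential tails converge to a Gumbel law: GEV shape near 0,
# location near log(1000), scale near 1; the fit recovers that regime
shape, loc, scale = genextreme.fit(best)
print(shape, loc, scale)
```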

  10. Screening and identification of genetic loci involved in producing more/denser inclusion bodies in Escherichia coli

    PubMed Central

    2013-01-01

    Background Many proteins and peptides have been used in therapeutic or industrial applications. They are often produced in microbial production hosts by fermentation. Robust protein production in the hosts and efficient downstream purification are two critical factors that could significantly reduce the cost of microbial protein production by fermentation. Producing proteins/peptides as inclusion bodies in the hosts has the potential to achieve both high titers in fermentation and cost-effective downstream purification. Manipulation of the host cells, such as overexpression/deletion of certain genes, could lead to producing more and/or denser inclusion bodies. However, few screening methods are available to help identify beneficial genetic changes that yield more protein production and/or denser inclusion bodies. Results We report development and optimization of a simple density gradient method that can be used for distinguishing and sorting E. coli cells with different buoyant densities. We demonstrate utilization of the method to screen genetic libraries to identify a) expression of glyQS loci on a plasmid that increased expression of a peptide of interest as well as the buoyant density of inclusion-body-producing E. coli cells; and b) deletion of a host gltA gene that increased the buoyant density of the inclusion body produced in the E. coli cells. Conclusion A novel density gradient sorting method was developed to screen genetic libraries. Beneficial host genetic changes could be exploited to improve recombinant protein expression as well as downstream protein purification. PMID:23638724

  11. Screening and identification of genetic loci involved in producing more/denser inclusion bodies in Escherichia coli.

    PubMed

    Pandey, Neeraj; Sachan, Annapurna; Chen, Qi; Ruebling-Jass, Kristin; Bhalla, Ritu; Panguluri, Kiran Kumar; Rouviere, Pierre E; Cheng, Qiong

    2013-05-02

    Many proteins and peptides have been used in therapeutic or industrial applications. They are often produced in microbial production hosts by fermentation. Robust protein production in the hosts and efficient downstream purification are two critical factors that could significantly reduce the cost of microbial protein production by fermentation. Producing proteins/peptides as inclusion bodies in the hosts has the potential to achieve both high titers in fermentation and cost-effective downstream purification. Manipulation of the host cells, such as overexpression/deletion of certain genes, could lead to producing more and/or denser inclusion bodies. However, few screening methods are available to help identify beneficial genetic changes that yield more protein production and/or denser inclusion bodies. We report development and optimization of a simple density gradient method that can be used for distinguishing and sorting E. coli cells with different buoyant densities. We demonstrate utilization of the method to screen genetic libraries to identify a) expression of glyQS loci on a plasmid that increased expression of a peptide of interest as well as the buoyant density of inclusion-body-producing E. coli cells; and b) deletion of a host gltA gene that increased the buoyant density of the inclusion body produced in the E. coli cells. A novel density gradient sorting method was developed to screen genetic libraries. Beneficial host genetic changes could be exploited to improve recombinant protein expression as well as downstream protein purification.

  12. Flight Software Math Library

    NASA Technical Reports Server (NTRS)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. This library only has dependencies on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas/code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter order, naming conventions, etc. Changing the utilities for each mission introduces risks and costs. The obvious risks and costs are that the utilities must be coded and revalidated. The hidden risks and costs arise in miscommunication between engineers. These utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation and Control (GN&C) FSW components. A GN&C FSW library cannot be created unless a standardized math basis is created. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design. This allows the library to be maintained with the same strategy used in its initial development, which supports a library of reusable GN&C FSW components. The FSW math library is written for an embedded software environment in C. This places restrictions on the language features that can be used by the library. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments like the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.
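As an illustration only, the consistency policies described above (one naming scheme, one parameter order, one error-handling strategy) might look like the following. The function names, status codes, and out-parameter convention are hypothetical, not the actual FSW math library API, and the sketch is in Python rather than the library's ANSI C.

```python
import math

# Hypothetical conventions: one naming scheme (Vec3_*), one parameter
# order (result first), one error policy (status codes, never exceptions)
OK, ERR_DEGENERATE = 0, 1

def Vec3_Add(result, a, b):
    for i in range(3):
        result[i] = a[i] + b[i]
    return OK

def Vec3_Normalize(result, a):
    mag = math.sqrt(sum(x * x for x in a))
    if mag == 0.0:
        return ERR_DEGENERATE   # consistent error handling, no exception
    for i in range(3):
        result[i] = a[i] / mag
    return OK

out = [0.0, 0.0, 0.0]
status = Vec3_Add(out, [1, 2, 3], [4, 5, 6])
print(status, out)
```

The point is not the arithmetic but the uniformity: when every utility follows the same calling pattern, FSW engineers and GN&C analysts can read each other's code without a per-mission glossary.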

  13. SP_Ace: A new code to estimate Teff, log g, and elemental abundances

    NASA Astrophysics Data System (ADS)

    Boeche, C.

    2016-09-01

    SP_Ace is a FORTRAN95 code that derives stellar parameters and elemental abundances from stellar spectra. To derive these parameters, SP_Ace does not measure equivalent widths of lines, nor does it use templates of synthetic spectra; instead it employs a new method based on a library of General Curve-Of-Growths. To date SP_Ace works on the wavelength ranges 5212-6860 Å and 8400-8921 Å, and at spectral resolutions R=2000-20000. Extensions of these limits are possible. SP_Ace is a highly automated code suitable for application to large spectroscopic surveys. A web front end to this service is publicly available at http://dc.g-vo.org/SP_ACE together with the library and the binary code.

  14. The HST/STIS Next Generation Spectral Library

    NASA Technical Reports Server (NTRS)

    Gregg, M. D.; Silva, D.; Rayner, J.; Worthey, G.; Valdes, F.; Pickles, A.; Rose, J.; Carney, B.; Vacca, W.

    2006-01-01

    During Cycles 10, 12, and 13, we obtained STIS G230LB, G430L, and G750L spectra of 378 bright stars covering a wide range in abundance, effective temperature, and luminosity. This HST/STIS Next Generation Spectral Library was scheduled to reach its goal of 600 targets by the end of Cycle 13 when STIS came to an untimely end. Even at 2/3 complete, the library significantly improves the sampling of stellar atmosphere parameter space compared to most other spectral libraries by including the near-UV and significant numbers of metal poor and super-solar abundance stars. Numerous calibration challenges have been encountered, some expected, some not; these arise from the use of the E1 aperture location, non-standard wavelength calibration, and, most significantly, the serious contamination of the near-UV spectra by red light. Maximizing the utility of the library depends directly on overcoming or at least minimizing these problems, especially correcting the UV spectra.

  15. PrecisePrimer: an easy-to-use web server for designing PCR primers for DNA library cloning and DNA shuffling.

    PubMed

    Pauthenier, Cyrille; Faulon, Jean-Loup

    2014-07-01

    PrecisePrimer is a web-based primer design software made to assist experimentalists in any repetitive primer design task such as preparing, cloning and shuffling DNA libraries. Unlike other popular primer design tools, it is conceived to generate primer libraries with popular PCR polymerase buffers proposed as pre-set options. PrecisePrimer is also meant to design primers in batches, such as for DNA library creation or DNA shuffling experiments, and to have the simplest interface possible. It integrates the most up-to-date melting temperature algorithms validated with experimental data, and cross-validated with other computational tools. We generated a library of primers for the extraction and cloning of 61 genes from yeast DNA genomic extract using default parameters. All primer pairs efficiently amplified their target without any optimization of the PCR conditions. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
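For context, the simplest melting-temperature estimate is the classic Wallace rule for short primers. PrecisePrimer itself integrates more accurate, experimentally validated algorithms (nearest-neighbor models with buffer corrections), so the sketch below is only the back-of-the-envelope formula, not the server's method.

```python
def wallace_tm(primer):
    # Wallace rule of thumb for short (< ~14 nt) primers:
    # Tm = 2*(A+T) + 4*(G+C), in degrees Celsius
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

print(wallace_tm("ATGCATGCATGC"))
```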

  16. Raster graphics display library

    NASA Technical Reports Server (NTRS)

    Grimsrud, Anders; Stephenson, Michael B.

    1987-01-01

    The Raster Graphics Display Library (RGDL) is a high-level subroutine package that gives the advanced raster graphics display capabilities needed. The RGDL uses FORTRAN source code routines to build subroutines modular enough to use as stand-alone routines in a black-box type of environment. Six examples are presented which will teach the use of RGDL in the fastest, most complete way possible. Routines within the display library that are used to produce raster graphics are presented in alphabetical order, each on a separate page. Each user-callable routine is described by function and calling parameters. All common blocks that are used in the display library are listed, and the use of each variable within each common block is discussed. A reference on the include files that are necessary to compile the display library is included. Each include file and its purpose are listed. The link map for MOVIE.BYU version 6, a general-purpose computer graphics display system that uses RGDL software, is also included.

  17. micrOMEGAs 2.0.7: a program to calculate the relic density of dark matter in a generic model

    NASA Astrophysics Data System (ADS)

    Bélanger, G.; Boudjema, F.; Pukhov, A.; Semenov, A.

    2007-12-01

    micrOMEGAs2.0.7 is a code which calculates the relic density of a stable massive particle in an arbitrary model. The underlying assumption is that there is a conservation law like R-parity in supersymmetry which guarantees the stability of the lightest odd particle. The new physics model must be incorporated in the notation of CalcHEP, a package for the automatic generation of squared matrix elements. Once this is done, all annihilation and coannihilation channels are included automatically in any model. Cross-sections at v=0, relevant for indirect detection of dark matter, are also computed automatically. The package includes three sample models: the minimal supersymmetric standard model (MSSM), the MSSM with complex phases and the NMSSM. Extension to other models, including non supersymmetric models, is described. Program summaryTitle of program:micrOMEGAs2.0.7 Catalogue identifier:ADQR_v2_1 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADQR_v2_1.html Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland Licensing provisions:Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.:216 529 No. of bytes in distributed program, including test data, etc.:1 848 816 Distribution format:tar.gz Programming language used:C and Fortran Computer:PC, Alpha, Mac, Sun Operating system:UNIX (Linux, OSF1, SunOS, Darwin, Cygwin) RAM:17 MB depending on the number of processes required Classification:1.9, 11.6 Catalogue identifier of previous version:ADQR_v2_0 Journal version of previous version:Comput. Phys. Comm. 176 (2007) 367 Does the new version supersede the previous version?:Yes Nature of problem:Calculation of the relic density of the lightest stable particle in a generic new model of particle physics. Solution method:In numerically solving the evolution equation for the density of dark matter, relativistic formulae for the thermal average are used. 
All tree-level processes for annihilation and coannihilation of new particles in the model are included. The cross-sections for all processes are calculated exactly with CalcHEP after definition of a model file. Higher-order QCD corrections to Higgs couplings to quark pairs are included. Reasons for new version:The main changes in this new version consist, on the one hand, in improvements of the user interface and treatment of error codes when using spectrum calculators in the MSSM and, on the other hand, in a completely revised code for the calculation of the relic density in the NMSSM based on the code NMSSMTools1.0.2 for the computation of the spectrum. Summary of revisions:The version of CalcHEP was updated to CalcHEP 2.4. The procedure for shared library generation has been improved. Now the libraries are recalculated each time the model is modified. The default value for the top quark mass has been set to 171.4 GeV. Changes specific to the MSSM model. The deltaMb correction is now included in the b-t-H vertex and is always included for other Higgs vertices. In case of a fatal error in an RGE program, micrOMEGAs now continues operation while issuing a warning that the given point is not valid. This is important when running scans over parameter space. However this means that the standard ^C command that could be used to cancel a job now only cancels the RGE program. To cancel a job, use "kill -9 -N" where N is the micrOMEGAs process id; all child processes launched by micrOMEGAs will be killed at once. Following the last SLHA2 release, we use the key=26 item of the EXTPAR block for the pole mass of the CP-odd Higgs so that micrOMEGAs can now use SoftSUSY for spectrum calculation with EWSB input. The Isajet interface was corrected too, so the user has to recompile the isajet_slha executable. For SuSpect we still support an old "wrong" interface where key=24 is used for the mass of the CP-odd Higgs.
In the non-universal SUGRA model, we set the value of M0 (M1/2, A0) to the value of the largest subset of equal parameters among scalar masses (gaugino masses, trilinear couplings). In the previous version these parameters were set arbitrarily to be equal to MH2, MG2 and At respectively. The spectrum calculators need an input value for M0, M1/2 and A0 for initialisation purposes. We have removed bugs in the micrOMEGAs-Isajet interface in case of non-universal SUGRA. $(FFLAGS) is added to the compilation instruction of suspect.exe. It was omitted in version 2.0. The treatment of errors in reading of the LesHouches accord file is improved. Now, if the SPINFO block is absent in the SLHA output it is considered as a fatal error. Instructions for calculation of Δρ, (g-2)μ, Br(b→sγ) and Br(B→μμ) constraints are included in EWSB sample main programs omg.c/omg.cpp/omg.F. We have corrected the name of the library for neutralino-neutralino annihilation in our sample files MSSM/cs_br.*. Changes specific to the NMSSM model. The NMSSM has been completely revised. Now it is based on NMSSMTools_1.0.2. The deltaMb corrections in the NMSSM are included in the Higgs potential. CP violation model. We have included in our package the MSSM with CP violation. Our implementation was described in Phys. Rev. D 73 (2006) 115007. It is based on the CPSUPERH package published in Comput. Phys. Comm. 156 (2004) 283. Unusual features:Depending on the parameters of the model, the program generates additional new code, compiles it and loads it dynamically. Running time:0.2 seconds
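The relic-density calculation that micrOMEGAs performs can be caricatured by the standard freeze-out Boltzmann equation with a constant thermally averaged cross-section folded into one parameter. micrOMEGAs itself solves the full problem with exact relativistic thermal averages and every (co)annihilation channel; the lam value and the Yeq prefactor below are toy assumptions made for this sketch.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Schematic freeze-out: dY/dx = -(lam/x^2) * (Y^2 - Yeq^2), with x = m/T.
# A constant <sigma v> is folded into the single toy parameter lam.
lam = 1.0e7

def Yeq(x):
    # Non-relativistic equilibrium yield (toy normalization)
    return 0.145 * x**1.5 * np.exp(-x)

def rhs(x, Y):
    return [-(lam / x**2) * (Y[0]**2 - Yeq(x)**2)]

# Stiff problem: the yield tracks equilibrium closely until freeze-out,
# so an implicit method (Radau) is used
sol = solve_ivp(rhs, (1.0, 100.0), [Yeq(1.0)], method="Radau",
                rtol=1e-8, atol=1e-30)
Y_inf = sol.y[0, -1]
print(Y_inf, Yeq(100.0))
```

The surviving yield Y_inf freezes out many orders of magnitude above the exponentially suppressed equilibrium value, which is the qualitative behaviour the full code quantifies.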

  18. The structure of a thermophilic kinase shapes fitness upon random circular permutation

    PubMed Central

    Jones, Alicia M.; Mehta, Manan M.; Thomas, Emily E.; Atkinson, Joshua T.; Segall-Shapiro, Thomas H.; Liu, Shirley; Silberg, Jonathan J.

    2016-01-01

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement where native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein’s functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AK with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and it reveals a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection. PMID:26976658

  19. The Structure of a Thermophilic Kinase Shapes Fitness upon Random Circular Permutation.

    PubMed

    Jones, Alicia M; Mehta, Manan M; Thomas, Emily E; Atkinson, Joshua T; Segall-Shapiro, Thomas H; Liu, Shirley; Silberg, Jonathan J

    2016-05-20

    Proteins can be engineered for synthetic biology through circular permutation, a sequence rearrangement in which native protein termini become linked and new termini are created elsewhere through backbone fission. However, it remains challenging to anticipate a protein's functional tolerance to circular permutation. Here, we describe new transposons for creating libraries of randomly circularly permuted proteins that minimize peptide additions at their termini, and we use transposase mutagenesis to study the tolerance of a thermophilic adenylate kinase (AK) to circular permutation. We find that libraries expressing permuted AKs with either short or long peptides amended to their N-terminus yield distinct sets of active variants and present evidence that this trend arises because permuted protein expression varies across libraries. Mapping all sites that tolerate backbone cleavage onto AK structure reveals that the largest contiguous regions of sequence that lack cleavage sites are proximal to the phosphotransfer site. A comparison of our results with a range of structure-derived parameters further showed that retention of function correlates to the strongest extent with the distance to the phosphotransfer site, amino acid variability in an AK family sequence alignment, and residue-level deviations in superimposed AK structures. Our work illustrates how permuted protein libraries can be created with minimal peptide additions using transposase mutagenesis, and it reveals a challenge of maintaining consistent expression across permuted variants in a library that minimizes peptide additions. Furthermore, these findings provide a basis for interpreting responses of thermophilic phosphotransferases to circular permutation by calibrating how different structure-derived parameters relate to retention of function in a cellular selection.
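Enumerating circular permutants of a sequence, the combinatorial operation underlying the libraries described above, is straightforward: join the native termini and reopen the backbone before each residue. The example sequence and "GS" linker are hypothetical, and the real libraries are of course built at the DNA level by transposase mutagenesis rather than in silico.

```python
def circular_permutants(seq, linker=""):
    # Join the native termini (optionally through a linker) and reopen
    # the backbone before each residue to get every circular permutant
    return [seq[i:] + linker + seq[:i] for i in range(len(seq))]

perms = circular_permutants("MKLVN", linker="GS")   # toy sequence/linker
print(len(perms), perms[2])
```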

  20. Microcomputing.

    ERIC Educational Resources Information Center

    Beiser, Karl

    1986-01-01

    Describes a product--BiblioFile, Library Corporation's catalog production system--and a service--reproduction of public domain software on CD-ROM for sale to those interested--which revolve around the ultra-high-density storage capacity of CD-ROM discs. Criteria for selecting microcomputers are briefly reviewed. (MBR)

  1. Determination of elastomeric foam parameters for simulations of complex loading.

    PubMed

    Petre, M T; Erdemir, A; Cavanagh, P R

    2006-08-01

    Finite element (FE) analysis has shown promise for the evaluation of elastomeric foam personal protection devices. Although appropriate representation of foam materials is necessary in order to obtain realistic simulation results, material definitions used in the literature vary widely and often fail to account for the multi-mode loading experienced by these devices. This study aims to provide a library of elastomeric foam material parameters that can be used in FE simulations of complex loading scenarios. Twelve foam materials used in footwear were tested in uni-axial compression, simple shear and volumetric compression. For each material, parameters for a common compressible hyperelastic material model used in FE analysis were determined using: (a) compression; (b) compression and shear data; and (c) data from all three tests. Material parameters and Drucker stability limits for the best fits are provided with their associated errors. The material model was able to reproduce deformation modes for which data was provided during parameter determination but was unable to predict behavior in other deformation modes. Simulation results were found to be highly dependent on the extent of the test data used to determine the parameters in the material definition. This finding calls into question the many published results of simulations of complex loading that use foam material parameters obtained from a single mode of testing. The library of foam parameters developed here presents associated errors in three deformation modes that should provide for a more informed selection of material parameters.
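Case (c), fitting one material model against data from several deformation modes simultaneously, can be sketched by stacking the residuals from each mode into a single least-squares problem. The model forms and the "data" below are synthetic toys invented for this sketch, not the compressible hyperelastic foam law or the measurements from the study.

```python
import numpy as np
from scipy.optimize import least_squares

def comp_model(p, strain):
    # Toy uniaxial compression response sharing parameters E, A
    E, A = p
    return E * strain + A * strain**2

def shear_model(p, strain):
    # Toy simple-shear response built from the same two parameters
    E, A = p
    return 0.35 * E * strain + 0.1 * A * strain**2

eps = np.linspace(0.0, 0.5, 25)
true_p = np.array([1.2, 3.0])
comp_data = comp_model(true_p, eps) + 0.002 * np.sin(40 * eps)   # "noise"
shear_data = shear_model(true_p, eps) - 0.002 * np.sin(40 * eps)

def residuals(p):
    # Stack both deformation modes so the fit must honor each of them
    return np.concatenate([comp_model(p, eps) - comp_data,
                           shear_model(p, eps) - shear_data])

fit = least_squares(residuals, x0=[1.0, 1.0])
print(fit.x)
```

Fitting against only one of the two residual blocks would also converge, but nothing would then constrain the model's behaviour in the omitted mode, which is exactly the pitfall the abstract warns about.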

  2. The U. S. Geological Survey, Digital Spectral Library: Version 1 (0.2 to 3.0um)

    USGS Publications Warehouse

    Clark, Roger N.; Swayze, Gregg A.; Gallagher, Andrea J.; King, Trude V.V.; Calvin, Wendy M.

    1993-01-01

    We have developed a digital reflectance spectral library, with management and spectral analysis software. The library includes 498 spectra of 444 samples (some samples include a series of grain sizes) measured from approximately 0.2 to 3.0 um. The spectral resolution (Full Width Half Maximum) of the reflectance data is <= 4 nm in the visible (0.2-0.8 um) and <= 10 nm in the NIR (0.8-2.35 um). All spectra were corrected to absolute reflectance using a NIST Halon standard. Library management software lets users search on parameters (e.g. chemical formulae, chemical analyses, purity of samples, mineral groups, etc.) as well as spectral features. Minerals from borate, carbonate, chloride, element, halide, hydroxide, nitrate, oxide, phosphate, sulfate, sulfide, sulfosalt, and the silicate (cyclosilicate, inosilicate, nesosilicate, phyllosilicate, sorosilicate, and tectosilicate) classes are represented. X-Ray and chemical analyses are tabulated for many of the entries, and all samples have been evaluated for spectral purity. The library also contains end and intermediate members for the olivine, garnet, scapolite, montmorillonite, muscovite, jarosite, and alunite solid-solution series. We have included representative spectra of H2O ice, kerogen, ammonium-bearing minerals, rare-earth oxides, desert varnish coatings, kaolinite crystallinity series, kaolinite-smectite series, zeolite series, and an extensive evaporite series. Because of the importance of vegetation to climate-change studies, we have included 17 spectra of tree leaves, bushes, and grasses. The library and software are available as a series of U.S.G.S. Open File reports. PC user software is available to convert the binary data to ascii files (a separate U.S.G.S. open file report). Additionally, binary data files are online at the U.S.G.S. in Denver for anonymous ftp by users on the Internet. The library search software enables a user to search on documentation parameters as well as spectral features.
The analysis system includes general spectral analysis routines, plotting packages, radiative transfer software for computing intimate mixtures, routines to derive optical constants from reflectance spectra, tools to analyze spectral features, and the capability to access imaging spectrometer data cubes for spectral analysis. Users may build customized libraries (at specific wavelengths and spectral resolution) for their own instruments using the library software. We are currently extending spectral coverage to 150 um. The libraries (original and convolved) will be made available in the future on a CD-ROM.
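    Building a customized library at an instrument's wavelengths and resolution amounts to convolving each library spectrum with the instrument's band response. A minimal sketch of this step, assuming a Gaussian response (this is an illustration, not the USGS library software itself):

```python
import numpy as np

def convolve_to_instrument(wl, refl, centers, fwhm):
    """Resample a library spectrum onto an instrument's band centers by
    weighting with a Gaussian band response of the given FWHM (same units
    as wl)."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # FWHM -> Gaussian sigma
    out = np.empty(len(centers))
    for i, c in enumerate(centers):
        w = np.exp(-0.5 * ((wl - c) / sigma) ** 2)     # band response weights
        out[i] = np.sum(w * refl) / np.sum(w)          # weighted mean reflectance
    return out

# Sanity check: a flat 50% reflectance spectrum stays flat after convolution
wl = np.linspace(0.4, 2.5, 2101)        # micrometres, ~1 nm sampling
refl = np.full_like(wl, 0.5)
bands = convolve_to_instrument(wl, refl, np.array([0.55, 0.86, 1.6]), fwhm=0.01)
```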

  3. Absolute determination of power density in the VVER-1000 mock-up on the LR-0 research reactor.

    PubMed

    Košt'ál, Michal; Švadlenková, Marie; Milčák, Ján

    2013-08-01

    The work presents a detailed comparison of calculated and experimentally determined net peak areas of selected fission-product gamma lines. The fission products were induced during a 2.5 h irradiation at a power level of 9.5 W in selected fuel pins of the VVER-1000 mock-up. The calculations were done with deterministic and stochastic (Monte Carlo) methods. The effects of the different nuclear data libraries used for the calculations are discussed as well. The Net Peak Area (NPA) may be used for the determination of fission density across the mock-up; this fission density is practically identical to the power density. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Quantitative structure activity relationships of some pyridine derivatives as corrosion inhibitors of steel in acidic medium.

    PubMed

    El Ashry, El Sayed H; El Nemr, Ahmed; Ragab, Safaa

    2012-03-01

    Quantum chemical calculations using density functional theory (B3LYP/6-31G DFT) and semi-empirical AM1 methods were performed on ten pyridine derivatives used as corrosion inhibitors for mild steel in acidic medium, to determine the relationship between molecular structure and inhibition efficiency. Quantum chemical parameters such as the total negative charge (TNC) on the molecule, the energy of the highest occupied molecular orbital (E(HOMO)), the energy of the lowest unoccupied molecular orbital (E(LUMO)), and the dipole moment (μ), as well as the linear solvation energy terms, molecular volume (Vi) and dipolar-polarization (π), were correlated with the corrosion inhibition efficiencies of the ten pyridine derivatives. A possible correlation between corrosion inhibition efficiencies and structural properties was sought in order to reduce the number of compounds to be selected for testing from a library of compounds. The theoretical data were found to support the experimental results. The results were used to predict the corrosion inhibition of 24 related pyridine derivatives.
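    The descriptor-efficiency correlation step can be sketched as an ordinary least-squares fit. The descriptor values and efficiencies below are invented for illustration only, not the paper's data:

```python
import numpy as np

# Hypothetical descriptor matrix: columns are E(HOMO) (eV), E(LUMO) (eV),
# and dipole moment (D) for five pyridine derivatives, with measured
# inhibition efficiencies (%). All values are illustrative.
X = np.array([
    [-9.1,  0.2, 2.1],
    [-8.9,  0.1, 2.8],
    [-8.7,  0.0, 3.3],
    [-8.5, -0.1, 3.9],
    [-8.3, -0.3, 4.4],
])
y = np.array([62.0, 68.0, 74.0, 81.0, 88.0])

# Multiple linear regression y ~ X @ coef + intercept
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

    A high r2 on the training set would then motivate using the fitted model to screen untested derivatives, as the authors do for 24 related pyridines.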

  5. CCFpams: Atmospheric stellar parameters from cross-correlation functions

    NASA Astrophysics Data System (ADS)

    Malavolta, Luca; Lovis, Christophe; Pepe, Francesco; Sneden, Christopher; Udry, Stephane

    2017-07-01

    CCFpams allows the measurement of stellar temperature, metallicity, and gravity within a few seconds and in a completely automated fashion. Rather than performing comparisons with spectral libraries, the technique is based on the determination of several cross-correlation functions (CCFs) obtained by including spectral features with different sensitivities to the photospheric parameters. Literature stellar parameters for high signal-to-noise ratio (SNR), high-resolution HARPS spectra of FGK main-sequence stars are used to calibrate the stellar parameters as a function of the CCF areas.
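    The quantity being calibrated is the CCF area. For a Gaussian-shaped CCF the area has a closed form, as this sketch with assumed dip parameters (not CCFpams calibration values) shows:

```python
import numpy as np

# Synthetic Gaussian-shaped CCF: continuum at 1.0 with an absorption dip.
rv = np.linspace(-20.0, 20.0, 401)           # radial-velocity grid, km/s
contrast, center, sigma = 0.3, 1.5, 4.0      # assumed dip parameters
ccf = 1.0 - contrast * np.exp(-0.5 * ((rv - center) / sigma) ** 2)

# CCF "area" = integral of the dip below the continuum; for a Gaussian dip
# it equals contrast * sigma * sqrt(2*pi).
dip = 1.0 - ccf
area_numeric = float(np.sum(0.5 * (dip[1:] + dip[:-1]) * np.diff(rv)))
area_analytic = contrast * sigma * np.sqrt(2.0 * np.pi)
```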

  6. The insertion torque-depth curve integral as a measure of implant primary stability: An in vitro study on polyurethane foam blocks.

    PubMed

    Di Stefano, Danilo Alessio; Arosio, Paolo; Gastaldi, Giorgio; Gherlone, Enrico

    2017-07-08

    Recent research has suggested that dynamic parameters related to insertion energy, that is, the total work needed to place an implant into its site, might convey more reliable information about immediate implant primary stability at insertion than the commonly used insertion torque (IT), reverse torque (RT), or implant stability quotient (ISQ). Yet knowledge of these dynamic parameters is still limited. The purpose of this in vitro study was to evaluate whether an energy-related parameter, the torque-depth curve integral (I), could be a reliable measure of primary stability. This was done by assessing whether (I) measurement was operator independent, by investigating its correlation with other known primary stability parameters (IT, RT, and ISQ), by quantifying the (I) average error, and by correlating (I), IT, RT, and ISQ variations with bone density. Five operators placed 200 implants in polyurethane foam blocks of different densities using a micromotor that calculated (I) during implant placement. Primary implant stability was assessed by measuring the ISQ, IT, and RT. ANOVA tests were used to evaluate whether measurements were operator independent (P>.05 in all cases). A correlation analysis was performed between (I) and IT, ISQ, and RT. The (I) average error was calculated and compared with that of the other parameters by ANOVA. (I)-density, IT-density, ISQ-density, and RT-density plots were drawn, and their slopes were compared by ANCOVA. The (I) measurements were operator independent and correlated with IT, ISQ, and RT. The average error of these parameters was not significantly different (P>.05 in all cases). The (I)-density, IT-density, ISQ-density, and RT-density curves were linear in the 0.16 to 0.49 g/cm³ range, with the (I)-density curves having a significantly greater slope than those of the other parameters (P≤.001 in all cases). 
The torque-depth curve integral (I) provides a reliable assessment of primary stability and shows a greater sensitivity to density variations than other known primary stability parameters. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
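    The (I) parameter is the integral of the recorded torque-depth curve. A minimal sketch of its computation by the trapezoidal rule, with an illustrative linear curve (units and values assumed for the example):

```python
import numpy as np

def torque_depth_integral(depth_mm, torque_ncm):
    """Integral of the insertion torque-depth curve (trapezoidal rule).
    With depth in mm and torque in N*cm, the result is a proxy for the
    total insertion energy expended while seating the implant."""
    d = np.asarray(depth_mm, dtype=float)
    t = np.asarray(torque_ncm, dtype=float)
    return float(np.sum(0.5 * (t[1:] + t[:-1]) * np.diff(d)))

# Illustrative curve: torque rising linearly from 0 to 40 N*cm over 10 mm,
# so the integral is the triangle area 0.5 * 10 * 40 = 200.
depth = np.linspace(0.0, 10.0, 101)
torque = 4.0 * depth
I = torque_depth_integral(depth, torque)
```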

  7. Experimental Research on Selective Laser Melting AlSi10Mg Alloys: Process, Densification and Performance

    NASA Astrophysics Data System (ADS)

    Chen, Zhen; Wei, Zhengying; Wei, Pei; Chen, Shenggui; Lu, Bingheng; Du, Jun; Li, Junfeng; Zhang, Shuzhe

    2017-12-01

    In this work, a set of experiments was designed to investigate the effect of process parameters on the relative density of AlSi10Mg parts manufactured by SLM. The influence of the laser scan speed v, laser power P, and hatch spacing H, considered the dominant parameters, on the powder melting and densification behavior was also studied experimentally. In addition, the laser energy density was introduced to evaluate the combined effect of these dominant parameters, so as to control the SLM process as a whole. As a result, a high relative density (> 97%) was obtained by SLM at an optimized laser energy density of 3.5-5.5 J/mm2. Moreover, a parameter-densification map was established to visually select the optimum process parameters for SLM-processed AlSi10Mg parts with elevated density and the required mechanical properties. The results provide important experimental guidance for obtaining AlSi10Mg components with full density and gradient functional porosity by SLM.
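    The areal laser energy density combining the three dominant parameters is commonly written E = P / (v · H). A minimal sketch, with an assumed parameter set (the specific values are illustrative, not taken from the paper):

```python
def areal_energy_density(power_w, scan_speed_mm_s, hatch_mm):
    """Areal laser energy density E = P / (v * H) in J/mm^2, combining the
    dominant SLM parameters: laser power, scan speed, and hatch spacing."""
    return power_w / (scan_speed_mm_s * hatch_mm)

# Illustrative parameter set (assumed values):
E = areal_energy_density(power_w=350.0, scan_speed_mm_s=1000.0, hatch_mm=0.08)
in_window = 3.5 <= E <= 5.5   # the reported optimal window for >97% density
```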

  8. The Medical Library Association Benchmarking Network: development and implementation.

    PubMed

    Dudden, Rosalind Farnam; Corcoran, Kate; Kaplan, Janice; Magouirk, Jeff; Rand, Debra C; Smith, Bernie Todd

    2006-04-01

    This article explores the development and implementation of the Medical Library Association (MLA) Benchmarking Network from the initial idea and test survey, to the implementation of a national survey in 2002, to the establishment of a continuing program in 2004. Started as a program for hospital libraries, it has expanded to include other nonacademic health sciences libraries. The activities and timelines of MLA's Benchmarking Network task forces and editorial board from 1998 to 2004 are described. The Benchmarking Network task forces successfully developed an extensive questionnaire with parameters of size and measures of library activity and published a report of the data collected by September 2002. The data were available to all MLA members in the form of aggregate tables. Utilization of Web-based technologies proved feasible for data intake and interactive display. A companion article analyzes and presents some of the data. MLA has continued to develop the Benchmarking Network with the completion of a second survey in 2004. The Benchmarking Network has provided many small libraries with comparative data to present to their administrators. It is a challenge for the future to convince all MLA members to participate in this valuable program.

  10. Design of a genetic algorithm for the simulated evolution of a library of asymmetric transfer hydrogenation catalysts.

    PubMed

    Vriamont, Nicolas; Govaerts, Bernadette; Grenouillet, Pierre; de Bellefon, Claude; Riant, Olivier

    2009-06-15

    A library of catalysts was designed for asymmetric hydrogen transfer to acetophenone. First, the whole library was evaluated using high-throughput experiments (HTE); the catalysts were ranked in ascending order of performance, and the best catalysts were identified. In the second step, various simulated evolution experiments, based on a genetic algorithm, were applied to this library. A small part of the library, called the mother generation (G0), thus evolved from generation to generation. The goal was to use our collection of HTE data to adjust the parameters of the genetic algorithm, in order to obtain a maximum of the best catalysts within a minimal number of generations. In particular, it was found that the simulated evolution results depended on the selection of G0 and that a random G0 should be preferred. We also demonstrated that it was possible to obtain 5 to 6 of the ten best catalysts while investigating only 10% of the library. Moreover, we developed a double algorithm that makes this result achievable even when the evolution starts from one of the worst G0.
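    The selection/crossover/mutation loop of such a genetic algorithm can be sketched as follows. The encoding (a catalyst as a ligand/metal index pair) and the fitness function are invented stand-ins for the HTE measurements, not the paper's data:

```python
import random

random.seed(0)

# Hypothetical encoding: a catalyst is a (ligand_idx, metal_idx) pair; its
# fitness is a made-up smooth score with the "best catalyst" placed at (7, 3).
N_LIG, N_MET = 12, 8

def fitness(cat):
    l, m = cat
    return -((l - 7) ** 2 + (m - 3) ** 2)

def evolve(pop, generations=15, mut_rate=0.3):
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: len(pop) // 2]          # truncation selection (elitist)
        children = []
        while len(children) < len(pop) - len(parents):
            a, b = random.sample(parents, 2)
            child = (a[0], b[1])                # one-point crossover
            if random.random() < mut_rate:      # point mutation
                child = (random.randrange(N_LIG), random.randrange(N_MET))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

g0 = [(random.randrange(N_LIG), random.randrange(N_MET)) for _ in range(8)]
init_best = max(g0, key=fitness)
best = evolve(list(g0))   # elitism guarantees the best never degrades
```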

  11. Differentiating aquatic plant communities in a eutrophic river using hyperspectral and multispectral remote sensing

    USGS Publications Warehouse

    Tian, Y.Q.; Yu, Q.; Zimmerman, M.J.; Flint, S.; Waldron, M.C.

    2010-01-01

    This study evaluates the efficacy of remote sensing technology to monitor species composition, areal extent, and density of aquatic plants (macrophytes and filamentous algae) in impoundments where their presence may violate water-quality standards. Multispectral satellite (IKONOS) images and more than 500 in situ hyperspectral samples were acquired to map aquatic plant distributions. By analyzing field measurements, we created a library of hyperspectral signatures for a variety of aquatic plant species, associations, and densities. We also used three vegetation indices: the Normalized Difference Vegetation Index (NDVI), the near-infrared (NIR)-Green Angle Index (NGAI), and the normalized water absorption depth (DH), at wavelengths 554, 680, 820 and 977 nm, to differentiate among aquatic plant species composition, areal density, and thickness in cases where hyperspectral analysis yielded potentially ambiguous interpretations. We compared the NDVI derived from IKONOS imagery with the in situ, hyperspectral-derived NDVI. The IKONOS-based images were also compared to data obtained through routine visual observations. Our results confirmed that aquatic species composition alters spectral signatures and affects the accuracy of remote sensing of aquatic plant density. The results also demonstrated that the NGAI has apparent advantages over the NDVI and the DH in estimating density. In the feature space of the three indices, 3D scatter plot analysis revealed that hyperspectral data can differentiate several aquatic plant associations. High-resolution multispectral imagery provided useful information to distinguish among biophysical aquatic plant characteristics. Classification analysis indicated that using satellite imagery to assess Lemna coverage yielded an overall agreement of 79% with visual observations and >90% agreement for the densest aquatic plant coverages. 
Interpretation of biophysical parameters derived from high-resolution satellite or airborne imagery should prove to be a valuable approach for assessing the effectiveness of management practices for controlling aquatic plant growth in inland waters, as well as for routine monitoring of aquatic plants in lakes and suitable lentic environments. © 2010 Blackwell Publishing Ltd.
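    Of the three indices, the NDVI has the simplest closed form. A minimal sketch using the red (~680 nm) and NIR (~820 nm) bands mentioned in the abstract, with illustrative reflectance values:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red (~680 nm) and
    near-infrared (~820 nm) reflectances."""
    return (nir - red) / (nir + red)

# Dense vegetation reflects strongly in the NIR and weakly in the red,
# giving a high NDVI; open water gives a negative NDVI. Values illustrative.
dense = ndvi(red=0.05, nir=0.45)   # vigorous canopy
water = ndvi(red=0.04, nir=0.02)   # open water
```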

  12. Hardrock Elastic Physical Properties: Birch's Seismic Parameter Revisited

    NASA Astrophysics Data System (ADS)

    Wu, M.; Milkereit, B.

    2014-12-01

    Identifying rock composition and properties is imperative in a variety of fields, including geotechnical engineering, mining, and petroleum exploration, in order to make accurate petrophysical calculations. Density, in particular, is an important parameter that allows us to differentiate between lithologies and to estimate or calculate other petrophysical properties. It is well established that the compressional and shear wave velocities of common crystalline rocks increase with increasing density (i.e. the Birch and Nafe-Drake relationships). Conventional empirical relations, however, do not take S-wave velocity into account, and the physical properties of Fe-oxides and massive sulfides differ significantly from the empirical velocity-density relationships. Currently, acquiring in-situ density data is challenging and problematic, and therefore developing an approximation for density based on seismic wave velocity and elastic moduli would be beneficial. With the goal of finding other possible or better relationships between density and the elastic moduli, a database of density, P-wave velocity, S-wave velocity, bulk modulus, shear modulus, Young's modulus, and Poisson's ratio was compiled from a multitude of lab samples. The database is comprised of isotropic, non-porous metamorphic rocks. Multi-parameter cross plots of the various elastic parameters have been analyzed in order to find a suitable parameter combination that reduces high-density outliers. As expected, the P-wave to S-wave velocity ratios show no correlation with density. However, Birch's seismic parameter, along with the bulk modulus, shows promise in providing a link between observed compressional and shear wave velocities and rock densities, including those of massive sulfides and Fe-oxides.
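    Birch's seismic parameter links the two velocities to density through the bulk modulus: Φ = Vp² − (4/3)Vs² = K/ρ. A minimal sketch with illustrative crustal-rock values (the specific numbers are assumptions, not from the database described above):

```python
def seismic_parameter(vp_km_s, vs_km_s):
    """Birch's seismic parameter Phi = Vp^2 - (4/3) Vs^2 = K / rho,
    i.e. the bulk modulus divided by density."""
    return vp_km_s ** 2 - (4.0 / 3.0) * vs_km_s ** 2

def density_from_bulk_modulus(k_gpa, vp_km_s, vs_km_s):
    """Density (g/cm^3) from bulk modulus (GPa) and velocities (km/s):
    rho = K / Phi, since 1 GPa / (1 g/cm^3) = 1 (km/s)^2."""
    return k_gpa / seismic_parameter(vp_km_s, vs_km_s)

# Illustrative values: Vp = 6.0 km/s, Vs = 3.5 km/s, K = 55 GPa
phi = seismic_parameter(6.0, 3.5)                 # (km/s)^2
rho = density_from_bulk_modulus(55.0, 6.0, 3.5)   # ~2.8 g/cm^3
```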

  13. Atmospheric particulate analysis using angular light scattering

    NASA Technical Reports Server (NTRS)

    Hansen, M. Z.

    1980-01-01

    Using the light scattering matrix elements measured by a polar nephelometer, a procedure for estimating the characteristics of atmospheric particulates was developed. A theoretical library data set of scattering matrices derived from Mie theory was tabulated for a range of values of the size parameter and refractive index typical of atmospheric particles. Integration over the size parameter yielded the scattering matrix elements for a variety of hypothesized particulate size distributions. A least squares curve fitting technique was used to find a best fit from the library data for the experimental measurements. This was used as a first guess for a nonlinear iterative inversion of the size distributions. A real index of 1.50 and an imaginary index of -0.005 are representative of the smoothed inversion results for the near ground level atmospheric aerosol in Tucson.
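    The first-guess step, picking the best-fitting library entry by least squares, can be sketched as follows. Toy curves stand in for the tabulated Mie scattering matrices:

```python
import numpy as np

def best_library_fit(measured, library):
    """Return the index of the library entry (a row of `library`) that
    minimizes the sum of squared residuals against the measured curve:
    the first guess for a subsequent nonlinear iterative inversion."""
    residuals = library - measured            # broadcast over library rows
    chi2 = np.sum(residuals ** 2, axis=1)
    return int(np.argmin(chi2))

# Toy library of three angular "scattering curves" and a noisy measurement
# that actually corresponds to entry 1.
angles = np.linspace(0.0, np.pi, 50)
library = np.stack([np.cos(angles) ** 2,
                    np.sin(angles) ** 2,
                    np.ones_like(angles)])
rng = np.random.default_rng(42)
measured = np.sin(angles) ** 2 + 0.01 * rng.standard_normal(angles.size)
idx = best_library_fit(measured, library)
```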

  14. Prototype of Multifunctional Full-text Library in the Architecture Web-browser / Web-server / SQL-server

    NASA Astrophysics Data System (ADS)

    Lyapin, Sergey; Kukovyakin, Alexey

    Within the framework of the research program "Textaurus", an operational prototype of the multifunctional library T-Libra v.4.1 has been created that makes flexible, parametrizable search within a full-text database possible. The information system is realized in the architecture Web-browser / Web-server / SQL-server. This makes it possible to combine universality and efficiency of text processing, on the one hand, with convenience and minimal cost for the end user (thanks to the use of a standard Web-browser as the client application), on the other. The following principles underlie the information system: a) multifunctionality, b) intelligence, c) multilingual primary texts and full-text searching, d) development of the digital library (DL) by a user (an "administrative client"), and e) multi-platform operation. A "library of concepts", i.e. a block of functional models of semantic (concept-oriented) searching, together with a closely connected subsystem of parametrizable queries to the full-text database, serves as the conceptual basis of the multifunctionality and "intelligence" of the DL T-Libra v.4.1. An author's paragraph is the unit of full-text searching in the suggested technology. The "logic" of an educational or scientific topic or problem can thus be built into a multilevel, flexible query structure and into the "library of concepts", which can be extended by developers and experts. About 10 queries of various levels of complexity and conceptuality are realized in the current version of the information system: from simple terminological searching (taking into account the lexical and grammatical paradigms of Russian) to several kinds of explication of terminological fields and adjustable two-parameter thematic searching (the parameters being a [set of terms] and a [distance between terms] within an author's paragraph).

  15. High-resolution neutron and X-ray diffraction room-temperature studies of an H-FABP-oleic acid complex: study of the internal water cluster and ligand binding by a transferred multipolar electron-density distribution.

    PubMed

    Howard, E I; Guillot, B; Blakeley, M P; Haertlein, M; Moulin, M; Mitschler, A; Cousido-Siah, A; Fadel, F; Valsecchi, W M; Tomizaki, Takashi; Petrova, T; Claudot, J; Podjarny, A

    2016-03-01

    Crystal diffraction data of heart fatty acid binding protein (H-FABP) in complex with oleic acid were measured at room temperature with high-resolution X-ray and neutron protein crystallography (0.98 and 1.90 Å resolution, respectively). These data provided very detailed information about the cluster of water molecules and the bound oleic acid in the H-FABP large internal cavity. The jointly refined X-ray/neutron structure of H-FABP was complemented by a transferred multipolar electron-density distribution using the parameters of the ELMAMII library. The resulting electron density allowed a precise determination of the electrostatic potential in the fatty acid (FA) binding pocket. Bader's quantum theory of atoms in molecules was then used to study interactions involving the internal water molecules, the FA and the protein. This approach showed H⋯H contacts of the FA with highly conserved hydrophobic residues known to play a role in the stabilization of long-chain FAs in the binding cavity. The determination of water hydrogen (deuterium) positions allowed the analysis of the orientation and electrostatic properties of the water molecules in the very ordered cluster. As a result, a significant alignment of the permanent dipoles of the water molecules with the protein electrostatic field was observed. This can be related to the dielectric properties of hydration layers around proteins, where the shielding of electrostatic interactions depends directly on the rotational degrees of freedom of the water molecules in the interface.

  16. Community Composition and Density of Methanogens in the Foregut of the Tammar Wallaby (Macropus eugenii)▿

    PubMed Central

    Evans, Paul N.; Hinds, Lyn A.; Sly, Lindsay I.; McSweeney, Christopher S.; Morrison, Mark; Wright, André-Denis G.

    2009-01-01

    The composition of the methanogenic archaeal community in the foregut contents of Tammar wallabies (Macropus eugenii) was studied using 16S rRNA and methyl coenzyme M reductase subunit A (mcrA) gene clone libraries. Methanogens belonging to the Methanobacteriales and a well-supported cluster of uncultivated archaeon sequences previously observed in the ovine and bovine rumens were found. Methanogen densities ranged from 7.0 × 10(5) to 3.9 × 10(6) cells per gram of wet weight. PMID:19218421

  17. A Hierarchical Bayesian Model for Calibrating Estimates of Species Divergence Times

    PubMed Central

    Heath, Tracy A.

    2012-01-01

    In Bayesian divergence time estimation methods, incorporating calibrating information from the fossil record is commonly done by assigning prior densities to ancestral nodes in the tree. Calibration prior densities are typically parametric distributions offset by minimum age estimates provided by the fossil record. Specification of the parameters of calibration densities requires the user to quantify his or her prior knowledge of the age of the ancestral node relative to the age of its calibrating fossil. The values of these parameters can, potentially, result in biased estimates of node ages if they lead to overly informative prior distributions. Accordingly, determining parameter values that lead to adequate prior densities is not straightforward. In this study, I present a hierarchical Bayesian model for calibrating divergence time analyses with multiple fossil age constraints. This approach applies a Dirichlet process prior as a hyperprior on the parameters of calibration prior densities. Specifically, this model assumes that the rate parameters of exponential prior distributions on calibrated nodes are distributed according to a Dirichlet process, whereby the rate parameters are clustered into distinct parameter categories. Both simulated and biological data are analyzed to evaluate the performance of the Dirichlet process hyperprior. Compared with fixed exponential prior densities, the hierarchical Bayesian approach results in more accurate and precise estimates of internal node ages. When this hyperprior is applied using Markov chain Monte Carlo methods, the ages of calibrated nodes are sampled from mixtures of exponential distributions and uncertainty in the values of calibration density parameters is taken into account. PMID:22334343
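    The calibration prior described above, a node age equal to the fossil minimum plus an exponentially distributed offset, can be sketched as follows. The hierarchical model clusters the rate parameters into shared categories via a Dirichlet process; the fixed two-category mixture below merely mimics that clustering, and all numbers are illustrative:

```python
import random

random.seed(1)

# Offset-exponential calibration prior: node age = fossil minimum age plus
# an Exponential(rate) offset. Rates drawn from a small set of shared
# categories stand in for the Dirichlet-process clustering (illustrative).
rate_categories = [0.05, 0.5]    # slow vs fast decay, 1/Myr (assumed)
weights = [0.5, 0.5]

def sample_node_age(fossil_min):
    rate = random.choices(rate_categories, weights)[0]
    return fossil_min + random.expovariate(rate)

ages = [sample_node_age(fossil_min=100.0) for _ in range(10000)]
# Expected mean: 100 + 0.5*(1/0.05) + 0.5*(1/0.5) = 111 Myr
mean_age = sum(ages) / len(ages)
```

    The node age can never fall below the fossil minimum, which is the defining property of offset calibration densities.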

  18. Construction of a directed hammerhead ribozyme library: towards the identification of optimal target sites for antisense-mediated gene inhibition.

    PubMed Central

    Pierce, M L; Ruffner, D E

    1998-01-01

    Antisense-mediated gene inhibition uses short complementary DNA or RNA oligonucleotides to block expression of any mRNA of interest. A key parameter in the success or failure of an antisense therapy is the identification of a suitable target site on the chosen mRNA. Ultimately, the accessibility of the target to the antisense agent determines target suitability. Since accessibility is a function of many complex factors, it is currently beyond our ability to predict. Consequently, identification of the most effective target(s) requires examination of every site. Towards this goal, we describe a method to construct directed ribozyme libraries against any chosen mRNA. The library contains nearly equal amounts of ribozymes targeting every site on the chosen transcript and the library only contains ribozymes capable of binding to that transcript. Expression of the ribozyme library in cultured cells should allow identification of optimal target sites under natural conditions, subject to the complexities of a fully functional cell. Optimal target sites identified in this manner should be the most effective sites for therapeutic intervention. PMID:9801305

  19. Hole filling and library optimization: application to commercially available fragment libraries.

    PubMed

    An, Yuling; Sherman, Woody; Dixon, Steven L

    2012-09-15

    Compound libraries comprise an integral component of drug discovery in the pharmaceutical and biotechnology industries. While in-house libraries often contain millions of molecules, this number pales in comparison to the accessible space of drug-like molecules. Therefore, care must be taken when adding new compounds to an existing library in order to ensure that unexplored regions in the chemical space are filled efficiently while not needlessly increasing the library size. In this work, we present an automated method to fill holes in an existing library using compounds from an external source and apply it to commercially available fragment libraries. The method, called Canvas HF, uses distances computed from 2D chemical fingerprints and selects compounds that fill vacuous regions while not suffering from the problem of selecting only compounds at the edge of the chemical space. We show that the method is robust with respect to different databases and the number of requested compounds to retrieve. We also present an extension of the method where chemical properties can be considered simultaneously with the selection process to bias the compounds toward a desired property space without imposing hard property cutoffs. We compare the results of Canvas HF to those obtained with a standard sphere exclusion method and with random compound selection and find that Canvas HF performs favorably. Overall, the method presented here offers an efficient and effective hole-filling strategy to augment compound libraries with compounds from external sources. The method does not have any fit parameters and therefore it should be applicable in most hole-filling applications. Copyright © 2012 Elsevier Ltd. All rights reserved.
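    A greedy max-min selection over fingerprint distances illustrates the general hole-filling idea: repeatedly pick the external compound whose nearest library neighbour is farthest away. This is a simplified stand-in, not the Canvas HF algorithm itself, with toy fingerprints represented as sets of "on" bit indices:

```python
def tanimoto_distance(fp_a, fp_b):
    """1 - Tanimoto similarity between two fingerprints given as sets of
    'on' bit indices."""
    inter = len(fp_a & fp_b)
    union = len(fp_a | fp_b)
    return 1.0 - (inter / union if union else 1.0)

def greedy_hole_fill(library, candidates, n_pick):
    """Greedy max-min stand-in for hole filling: at each step, select the
    candidate sitting in the biggest 'hole', i.e. whose nearest neighbour
    among the library plus already-selected compounds is farthest away."""
    selected = []
    pool = list(candidates)
    for _ in range(min(n_pick, len(pool))):
        def gap(c):
            return min(tanimoto_distance(c, f) for f in library + selected)
        best = max(pool, key=gap)
        selected.append(best)
        pool.remove(best)
    return selected

# Toy fingerprints: the library clusters around bits 0-4; candidate B shares
# nothing with it, so it fills the largest hole and is picked first.
library = [{0, 1, 2}, {1, 2, 3}, {2, 3, 4}]
cand_a = {0, 1, 2, 5}        # close to the existing library
cand_b = {10, 11, 12}        # far from the existing library
picked = greedy_hole_fill(library, [cand_a, cand_b], n_pick=1)
```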

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singleton, Jr., Robert; Israel, Daniel M.; Doebling, Scott William

    For code verification, one compares the code output against known exact solutions. There are many standard test problems used in this capacity, such as the Noh and Sedov problems. ExactPack is a utility that integrates many of these exact solution codes into a common API (application program interface), and can be used as a stand-alone code or as a Python package. ExactPack consists of Python driver scripts that access a library of exact solutions written in Fortran or Python. The spatial profiles of the relevant physical quantities, such as the density, fluid velocity, sound speed, or internal energy, are returned at a time specified by the user. The solution profiles can be viewed and examined by a command line interface or a graphical user interface, and a number of analysis tools and unit tests are also provided. We have documented the physics of each problem in the solution library, and provided complete documentation on how to extend the library to include additional exact solutions. ExactPack’s code architecture makes it easy to extend the solution-code library to include additional exact solutions in a robust, reliable, and maintainable manner.

  1. Double level selection in a constitutional dynamic library of coordination driven supramolecular polygons.

    PubMed

    Rancan, Marzio; Tessarolo, Jacopo; Casarin, Maurizio; Zanonato, Pier Luigi; Quici, Silvio; Armelao, Lidia

    2014-07-21

    A constitutional dynamic library (CDL) of Cu(II) metallo-supramolecular polygons has been studied as a bench test to examine an interesting selection case based on molecular recognition. Sorting of the CDL polygons is achieved through a proper guest that is hosted in the triangular metallo-macrocycle constituent. Two selection mechanisms are observed: a guest-induced path and a guest-templated self-assembly (virtual library approach). Remarkably, the triangular host can accommodate several guests with a degree of selectivity ranging from ∼1 to ∼10(4) for all possible guest pairs. A double level of selection operates: guests drive the CDL toward the triangular polygon, and, at the same time, this polygon is able to pick a specific guest from a set of competitive molecules, according to a selectivity-affinity correlation. Association constants of the host-guest systems have been determined. Guest competition and exchange studies have been analyzed through variable temperature UV-Vis absorption spectroscopy and single crystal X-ray diffraction studies. Molecular structures and electronic properties of the triangular polygon and of the host-guest systems have also been studied by means of all-electron density functional theory (DFT) and time-dependent density functional theory (TDDFT) calculations including dispersive contributions. The DFT outcomes ultimately indicate the dispersive nature of the host-guest interactions, while the TDDFT results allow a thorough assignment of the host and host-guest spectral features.

  2. Electron lithography STAR design guidelines. Part 2: The design of a STAR for space applications

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Newman, W.

    1982-01-01

    The STAR design system developed by NASA enables any user with a logic diagram to design a semicustom digital MOS integrated circuit. The system is comprised of a library of standard logic cells and computer programs to place, route, and display designs implemented with cells from the library. Also described is the development of a radiation-hard array designed for the STAR system. The design is based on the CMOS silicon gate technology developed by SANDIA National Laboratories. The design rules used are given, as well as the model parameters developed for the basic array element. Library cells of the CMOS metal gate and CMOS silicon gate technologies were simulated using SPICE, and the results are shown and compared.

  3. An Old Story in the Parallel Synthesis World: An Approach to Hydantoin Libraries.

    PubMed

    Bogolubsky, Andrey V; Moroz, Yurii S; Savych, Olena; Pipko, Sergey; Konovets, Angelika; Platonov, Maxim O; Vasylchenko, Oleksandr V; Hurmach, Vasyl V; Grygorenko, Oleksandr O

    2018-01-08

    An approach to the parallel synthesis of hydantoin libraries by the reaction of in situ generated 2,2,2-trifluoroethyl carbamates and α-amino esters was developed. To demonstrate the utility of the method, a library of 1158 hydantoins designed according to lead-likeness criteria (MW 200-350, cLogP 1-3) was prepared. The success rate of the method was analyzed as a function of the physicochemical parameters of the products, and it was found that the method can be considered a tool for lead-oriented synthesis. A hydantoin-bearing submicromolar primary hit acting as an Aurora kinase A inhibitor was discovered through a combination of rational design, parallel synthesis using the procedures developed, and in silico and in vitro screening.

  4. Crystallographic Fragment Based Drug Discovery: Use of a Brominated Fragment Library Targeting HIV Protease

    PubMed Central

    Tiefenbrunn, Theresa; Forli, Stefano; Happer, Meaghan; Gonzalez, Ana; Tsai, Yingssu; Soltis, Michael; Elder, John H.; Olson, Arthur J.; Stout, C. David

    2013-01-01

    A library of 68 brominated fragments was screened against a new crystal form of inhibited HIV-1 protease in order to probe surface sites in soaking experiments. Often fragments are weak binders with partial occupancy, resulting in weak, difficult-to-fit electron density. The use of a brominated fragment library addresses this challenge, as bromine can be located unequivocally via anomalous scattering. Data collection was carried out in an automated fashion using AutoDrug at SSRL. Novel hits were identified in the known surface sites: 3-bromo-2,6-dimethoxybenzoic acid (Br6) in the flap site, and 1-bromo-2-naphthoic acid (Br27) in the exosite, expanding the chemistry of known fragments for development of higher affinity potential allosteric inhibitors. At the same time, mapping the binding sites of a number of weaker binding Br-fragments provides further insight into the nature of these surface pockets. PMID:23998903

  5. Adiabatic quantum-flux-parametron cell library designed using a 10 kA cm⁻² niobium fabrication process

    NASA Astrophysics Data System (ADS)

    Takeuchi, Naoki; Nagasawa, Shuichi; China, Fumihiro; Ando, Takumi; Hidaka, Mutsuo; Yamanashi, Yuki; Yoshikawa, Nobuyuki

    2017-03-01

    Adiabatic quantum-flux-parametron (AQFP) logic is an energy-efficient superconductor logic with zero static power consumption and very small switching energy. In this paper, we report a new AQFP cell library designed using the AIST 10 kA cm⁻² Nb high-speed standard process (HSTP), which is a high-critical-current-density version of the AIST 2.5 kA cm⁻² Nb standard process (STP2). Since the intrinsic damping of the Josephson junctions (JJs) of HSTP is relatively strong, shunt resistors for JJs were removed and the energy efficiency improved significantly. Also, excitation transformers in the new cells were redesigned so that the cells can operate in a four-phase excitation mode. We describe HSTP and the AQFP cell library designed using it in detail, and show experimental results for cell test circuits.

  6. Extending a Tandem Mass Spectral Library to Include MS2 Spectra of Fragment Ions Produced In-Source and MSn Spectra.

    PubMed

    Yang, Xiaoyu; Neta, Pedatsur; Stein, Stephen E

    2017-11-01

    Tandem mass spectral library searching is finding increased use as an effective means of determining chemical identity in mass spectrometry-based omics studies. We previously reported on constructing a tandem mass spectral library that includes spectra for multiple precursor ions for each analyte. Here we report our method for expanding this library to include MS2 spectra of fragment ions generated during the ionization process (in-source fragment ions) as well as MS3 and MS4 spectra. These can assist the chemical identification process. A simple density-based clustering algorithm was used to cluster all significant precursor ions from MS1 scans for an analyte acquired during an infusion experiment. The MS2 spectra associated with these precursor ions were grouped into the same precursor clusters. Subsequently, a new top-down hierarchical divisive clustering algorithm was developed for clustering the spectra from fragmentation of ions in each precursor cluster, including the MS2 spectra of the original precursors and of the in-source fragments as well as the MSn spectra. This algorithm starts with all the spectra of one precursor in one cluster and then separates them into sub-clusters of similar spectra based on the fragment patterns. Herein, we describe the algorithms and spectral evaluation methods for extending the library. The new library features were demonstrated by searching the high resolution spectra of E. coli extracts against the extended library, allowing identification of compounds and their in-source fragment ions in a manner that was not possible before.
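
The density-based grouping of precursor ions described above can be illustrated with a minimal one-dimensional clustering sketch; this is a toy illustration, not the authors' actual algorithm, and the m/z values and tolerance are hypothetical:

```python
def cluster_mz(mz_values, tol=0.5):
    """Group m/z values into clusters: walk the sorted list and start a
    new cluster whenever the gap to the previous value exceeds `tol`."""
    clusters = []
    for mz in sorted(mz_values):
        if clusters and mz - clusters[-1][-1] <= tol:
            clusters[-1].append(mz)   # close enough: same dense region
        else:
            clusters.append([mz])     # gap too large: start a new cluster
    return clusters

# Three well-separated precursor groups emerge from five hypothetical ions.
print(cluster_mz([100.0, 100.2, 150.1, 150.3, 300.0]))
```

Spectra associated with each precursor would then be assigned to the cluster containing that precursor's m/z.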

  7. Quantitative screening of yeast surface-displayed polypeptide libraries by magnetic bead capture.

    PubMed

    Yeung, Yik A; Wittrup, K Dane

    2002-01-01

    Magnetic bead capture is demonstrated here to be a feasible alternative for quantitative screening of favorable mutants from a cell-displayed polypeptide library. Flow cytometric sorting with fluorescent probes has been employed previously for high-throughput screening for either novel binders or improved mutants. However, many laboratories do not have ready access to this technology as a result of the limited availability and high cost of cytometers, restricting the use of cell-displayed libraries. Using streptavidin-coated magnetic beads and biotinylated ligands, an alternative approach to cell-based library screening for improved mutants was developed. The magnetic bead capture probability of labeled cells is shown to be closely correlated with the surface ligand density. A single-pass enrichment ratio of 9400 ± 1800-fold, at the expense of 85 ± 6% binder losses, is achieved from screening a library that contains one antibody-displaying cell (binder) per 1.1 × 10⁵ nondisplaying cells. Additionally, in a kinetic screen with an initial high-affinity to low-affinity (7.7-fold lower) mutant ratio of 1:95,000, the magnetic bead capture method attains a single-pass enrichment ratio of 600 ± 200-fold with a 75 ± 24% probability of loss for the higher-affinity mutant. The observed high loss probabilities can be straightforwardly compensated for by library oversampling, given the inherently parallel nature of the screen. Overall, these results demonstrate that magnetic beads are capable of quantitatively screening for novel binders and improved mutants. The described methods are directly analogous to procedures in common use for phage display and should lower the barriers to entry for use of cell surface display libraries.
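
Under one common definition (an assumption here, not necessarily the exact metric used by the authors), a single-pass enrichment ratio is the fold increase in the binder fraction from the input population to the captured population:

```python
def enrichment_ratio(binders_in, total_in, binders_out, total_out):
    """Fold increase in binder fraction from the input population
    (before capture) to the recovered population (after capture)."""
    fraction_before = binders_in / total_in
    fraction_after = binders_out / total_out
    return fraction_after / fraction_before

# Hypothetical numbers: 1 binder per 100,000 cells in, 1 per 10 out.
print(enrichment_ratio(1, 100_000, 1, 10))  # -> 10000.0
```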

  8. Description of a novel allelic “thick leafed” mutant of sorghum

    USDA-ARS?s Scientific Manuscript database

    An allelic sorghum [Sorghum bicolor (L.) Moench] mutant with thick and narrow erect leaves (thl) and reduced adaxial stomatal density was isolated from the Annotated Individually pedigreed Mutagenized Sorghum (AIMS) mutant library developed at the Plant Stress and Germplasm Development Unit at Lubbo...

  9. Extending Your Reach.

    ERIC Educational Resources Information Center

    Batterman, Christopher T.

    2002-01-01

    High-density mobile storage (storage units mounted on carriages and rails which move and compact to utilize wasted space) can double the capacity of an existing school library facility. This article describes the benefits of going mobile and looks at the advantages of powered, programmable mobile storage systems. A sidebar describes Michigan…

  10. Developing an academic medical library core journal collection in the (almost) post-print era: the Florida State University College of Medicine Medical Library experience

    PubMed Central

    Shearer, Barbara S.; Nagy, Suzanne P.

    2003-01-01

    The Florida State University (FSU) College of Medicine Medical Library is the first academic medical library to be established since the Web's dramatic appearance during the 1990s. A large customer base for electronic medical information resources is both comfortable with and eager to migrate to the electronic format completely, and vendors are designing radical pricing models that make print journal cancellations economically advantageous. In this (almost) post-print environment, the new FSU Medical Library is being created and will continue to evolve. By analyzing print journal subscription lists of eighteen academic medical libraries with similar missions to the community-based FSU College of Medicine and by entering these and selected quality indicators into a Microsoft Access database, a core list was created. This list serves as a selection guide, as a point for discussion with faculty and curriculum leaders when creating budgets, and for financial negotiations in a broader university environment. After journal titles specific to allied health sciences, veterinary medicine, dentistry, pharmacy, library science, and nursing were eliminated from the list, 4,225 unique journal titles emerged. Based on a ten-point scale including SERHOLD holdings and DOCLINE borrowing activity, a list of 449 core titles is identified. The core list has been saved in spreadsheet format for easy sorting by a number of parameters. PMID:12883565

  11. Performance evaluation of phage-displayed synthetic human single-domain antibody libraries: A retrospective analysis.

    PubMed

    Henry, Kevin A; Tanha, Jamshid

    2018-05-01

    Fully human synthetic single-domain antibodies (sdAbs) are desirable therapeutic molecules but their development is a considerable challenge. Here, using a retrospective analysis of in-house historical data, we examined the parameters that impact the outcome of screening phage-displayed synthetic human sdAb libraries to discover antigen-specific binders. We found no evidence for a differential effect of domain type (VH or VL), library randomization strategy, incorporation of a stabilizing disulfide linkage, or sdAb display format (monovalent vs. multivalent) on the probability of obtaining any antigen-binding human sdAbs, instead finding that the success of library screens was primarily related to properties of target antigens, especially molecular mass. The solubility and binding affinity of sdAbs isolated from successful screens depended both on properties of the sdAb libraries (primarily domain type) and the target antigens. Taking attrition of sdAbs with major manufacturability concerns (aggregation; low expression) and sdAbs that do not recognize native cell-surface antigens as independent probabilities, we calculate the overall likelihood of obtaining ≥1 antigen-binding human sdAb from a single library-target screen as ~24%. Successful library-target screens should be expected to yield ~1.3 human sdAbs on average, each with an average binding affinity of ~2 μM. Copyright © 2018 Elsevier B.V. All rights reserved.
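
The composition of independent attrition probabilities described in the abstract can be sketched as follows; the component values below are purely illustrative (the record quotes only the ~24% product), chosen so that their product comes out to 0.24:

```python
# Hypothetical component probabilities (illustrative only), treated as
# independent events as the retrospective analysis describes.
p_screen_success = 0.60   # screen yields >= 1 antigen-binding sdAb
p_manufacturable = 0.50   # sdAb survives aggregation/expression attrition
p_native_binding = 0.80   # sdAb recognizes the native cell-surface antigen

# Independence means the overall likelihood is the product of the parts.
p_overall = p_screen_success * p_manufacturable * p_native_binding
print(p_overall)  # ~0.24, i.e. roughly a 24% chance per library-target screen
```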

  12. Developing an academic medical library core journal collection in the (almost) post-print era: the Florida State University College of Medicine Medical Library experience.

    PubMed

    Shearer, Barbara S; Nagy, Suzanne P

    2003-07-01

    The Florida State University (FSU) College of Medicine Medical Library is the first academic medical library to be established since the Web's dramatic appearance during the 1990s. A large customer base for electronic medical information resources is both comfortable with and eager to migrate to the electronic format completely, and vendors are designing radical pricing models that make print journal cancellations economically advantageous. In this (almost) post-print environment, the new FSU Medical Library is being created and will continue to evolve. By analyzing print journal subscription lists of eighteen academic medical libraries with similar missions to the community-based FSU College of Medicine and by entering these and selected quality indicators into a Microsoft Access database, a core list was created. This list serves as a selection guide, as a point for discussion with faculty and curriculum leaders when creating budgets, and for financial negotiations in a broader university environment. After journal titles specific to allied health sciences, veterinary medicine, dentistry, pharmacy, library science, and nursing were eliminated from the list, 4,225 unique journal titles emerged. Based on a ten-point scale including SERHOLD holdings and DOCLINE borrowing activity, a list of 449 core titles is identified. The core list has been saved in spreadsheet format for easy sorting by a number of parameters.

  13. Measuring Protoplanetary Disk Gas Surface Density Profiles with ALMA

    NASA Astrophysics Data System (ADS)

    Williams, Jonathan P.; McPartland, Conor

    2016-10-01

    The gas and dust are spatially segregated in protoplanetary disks due to the vertical settling and radial drift of large grains. A fuller accounting of the mass content and distribution in disks therefore requires spectral line observations. We extend the modeling approach presented in Williams & Best to show that gas surface density profiles can be measured from high fidelity 13CO integrated intensity images. We demonstrate the methodology by fitting ALMA observations of the HD 163296 disk to determine a gas mass, Mgas = 0.048 M⊙, and accretion disk characteristic size Rc = 213 au and gradient γ = 0.39. The same parameters match the C18O 2-1 image and indicate an abundance ratio [12CO]/[C18O] of 700 independent of radius. To test how well this methodology can be applied to future line surveys of smaller, lower mass T Tauri disks, we create a large 13CO 2-1 image library and fit simulated data. For disks with gas masses 3-10 MJup at 150 pc, ALMA observations with a resolution of 0.″2-0.″3 and integration times of ˜20 minutes allow reliable estimates of Rc to within about 10 au and γ to within about 0.2. Economic gas imaging surveys are therefore feasible and offer the opportunity to open up a new dimension for studying disk structure and its evolution toward planet formation.
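
The characteristic size Rc and gradient γ quoted above parametrize a self-similar accretion-disk surface density profile; assuming the standard form used in this family of models (an inference from the quoted parameters, not spelled out in the record):

```latex
\Sigma_{\mathrm{gas}}(r) \;=\; \Sigma_c \left(\frac{r}{R_c}\right)^{-\gamma}
\exp\!\left[-\left(\frac{r}{R_c}\right)^{2-\gamma}\right]
```

where the normalization Σc is fixed by the total gas mass Mgas.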

  14. Comparative Study on Various Geometrical Core Design of 300 MWth Gas Cooled Fast Reactor with UN-PuN Fuel Longlife without Refuelling

    NASA Astrophysics Data System (ADS)

    Dewi Syarifah, Ratna; Su'ud, Zaki; Basar, Khairul; Irwanto, Dwi

    2017-07-01

    Nuclear power has seen progressive improvement in the operating performance of existing reactors, ensuring the economic competitiveness of nuclear electricity around the world. The GFR uses a gas coolant and a fast neutron spectrum. This research uses a helium coolant, which has low neutron moderation, is chemically inert, and remains single phase. A comparative study of various geometrical core designs for a modular GFR with UN-PuN fuel, operating long-life without refuelling, has been performed. The calculations use the SRAC2006 code, both the PIJ and CITATION calculations, with the JENDL-4.0 data library. The fuel fraction is varied from 40% to 65%. In this research, we varied the reactor core geometry to find the optimum design: first, a balanced cylinder, in which the active-core diameter (D) equals the active-core height (H); second, a pancake cylinder (D > H); and third, a tall cylinder (D < H).

  15. Measuring Two Key Parameters of H3 Color Centers in Diamond

    NASA Technical Reports Server (NTRS)

    Roberts, W. Thomas

    2005-01-01

    A method of measuring two key parameters of H3 color centers in diamond has been created as part of a continuing effort to develop tunable, continuous-wave, visible lasers that would utilize diamond as the lasing medium. (An H3 color center in a diamond crystal lattice comprises two nitrogen atoms substituted for two carbon atoms bonded to a third carbon atom. H3 color centers can be induced artificially; they also occur naturally. If present in sufficient density, they impart a yellow hue.) The method may also be applicable to the corresponding parameters of other candidate lasing media. One of the parameters is the number density of color centers, which is needed for designing an efficient laser. The other parameter is an optical-absorption cross section, which, as explained below, is needed for determining the number density. The present method represents an improvement over prior methods in which optical-absorption measurements have been used to determine absorption cross sections or number densities. Heretofore, in order to determine a number density from such measurements, it has been necessary to know the applicable absorption cross section; alternatively, to determine the absorption cross section from such measurements, it has been necessary to know the number density. If, as in this case, both the number density and the absorption cross section are initially unknown, then it is impossible to determine either parameter in the absence of additional information.

  16. A reliable computational workflow for the selection of optimal screening libraries.

    PubMed

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

    The experimental screening of compound collections is a common starting point in many drug discovery projects. Successes of such screening campaigns critically depend on the quality of the screened library. Many libraries are currently available from different vendors, yet selecting the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented, and a new blood-brain barrier permeation model was developed and validated (85% and 74% success rates for the training and test sets, respectively). Diversity and similarity descriptors that demonstrated the best performance in their ability to select either diverse or focused sets of compounds from three databases (DrugBank, CMC and ChEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library, providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have thus devised and tested a new workflow for the rational selection of screening libraries under different scenarios.
The workflow was implemented using the Pipeline Pilot software; because it uses generic components, it can easily be adapted and reproduced by computational groups interested in the rational selection of screening libraries, and it can readily be modified to include additional components. It has been used routinely in our laboratory for library selection in multiple projects and consistently selects libraries that are well balanced across multiple parameters.
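
The rule-based ADME/T filtering in step (2) can be sketched generically. This is a plain rule-of-five check, not the workflow's actual Pipeline Pilot component, and the function and parameter names are assumptions for illustration:

```python
def passes_lipinski(mw, clogp, h_donors, h_acceptors):
    """Lipinski rule of five: a compound is considered drug-like
    if it violates at most one of the four property limits."""
    violations = sum([
        mw > 500,          # molecular weight (Da)
        clogp > 5,         # calculated octanol-water partition coefficient
        h_donors > 5,      # hydrogen-bond donors
        h_acceptors > 10,  # hydrogen-bond acceptors
    ])
    return violations <= 1
```

A candidate library could then be reduced to its rule-of-five-compliant subset with a single filter pass over per-compound property tuples.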

  17. Environment Modeling Using Runtime Values for JPF-Android

    NASA Technical Reports Server (NTRS)

    van der Merwe, Heila; Tkachuk, Oksana; Nel, Seal; van der Merwe, Brink; Visser, Willem

    2015-01-01

    Software applications are developed to be executed in a specific environment. This environment includes external native libraries that add functionality to the application and drivers that fire the application's execution. For testing and verification, the environment of an application is abstracted using simplified models or stubs. Empty stubs, returning default values, are simple to generate automatically, but they do not perform well when the application expects specific return values. Symbolic execution is used to find input parameters for drivers and return values for library stubs, but it struggles to detect the values of complex objects. In this work-in-progress paper, we explore an approach to generating drivers and stubs based on values collected during runtime instead of using default values. Entry points and methods that need to be modeled are instrumented to log their parameters and return values. The instrumented applications are then executed using a driver and instrumented libraries. The values collected during runtime are used to generate driver and stub values on-the-fly that improve coverage during verification by enabling the execution of code that previously crashed or was missed. We are implementing this approach to improve the environment model of JPF-Android, our model checking and analysis tool for Android applications.

  18. Effects of LiDAR point density, sampling size and height threshold on estimation accuracy of crop biophysical parameters.

    PubMed

    Luo, Shezhou; Chen, Jing M; Wang, Cheng; Xi, Xiaohuan; Zeng, Hongcheng; Peng, Dailiang; Li, Dong

    2016-05-30

    Vegetation leaf area index (LAI), height, and aboveground biomass are key biophysical parameters. Corn is an important and globally distributed crop, and reliable estimations of these parameters are essential for corn yield forecasting, health monitoring and ecosystem modeling. Light Detection and Ranging (LiDAR) is considered an effective technology for estimating vegetation biophysical parameters. However, the estimation accuracies of these parameters are affected by multiple factors. In this study, we first estimated corn LAI, height and biomass (R² = 0.80, 0.874 and 0.838, respectively) using the original LiDAR data (7.32 points/m²), and the results showed that LiDAR data could accurately estimate these biophysical parameters. Second, comprehensive research was conducted on the effects of LiDAR point density, sampling size and height threshold on the estimation accuracy of LAI, height and biomass. Our findings indicated that LiDAR point density had an important effect on the estimation accuracy of vegetation biophysical parameters; however, high point density did not always produce highly accurate estimates, and reduced point density could deliver reasonable estimation results. Furthermore, the results showed that sampling size and height threshold were additional key factors that affect the estimation accuracy of biophysical parameters. Therefore, the optimal sampling size and height threshold should be determined to improve the estimation accuracy of biophysical parameters. Our results also implied that a higher LiDAR point density, larger sampling size and higher height threshold were required to obtain accurate corn LAI estimates than for height and biomass estimates. In general, our results provide valuable guidance for LiDAR data acquisition and the estimation of vegetation biophysical parameters using LiDAR data.

  19. A technical assessment of the porcine ejaculated spermatozoa for a sperm-specific RNA-seq analysis.

    PubMed

    Gòdia, Marta; Mayer, Fabiana Quoos; Nafissi, Julieta; Castelló, Anna; Rodríguez-Gil, Joan Enric; Sánchez, Armand; Clop, Alex

    2018-04-26

    The study of the boar sperm transcriptome by RNA-seq can provide relevant information on sperm quality and fertility and might contribute to animal breeding strategies. However, the analysis of spermatozoa RNA is challenging, as these cells harbor very low amounts of highly fragmented RNA, and ejaculates also contain other cell types with larger amounts of non-fragmented RNA. Here, we describe a strategy for successful boar sperm purification, RNA extraction and RNA-seq library preparation. Using these approaches, our objectives were: (i) to evaluate the sperm recovery rate (SRR) after boar spermatozoa purification by density centrifugation using the non-porcine-specific commercial reagent BoviPure™; (ii) to assess the correlation between SRR and sperm quality characteristics; (iii) to evaluate the relationship between sperm cell RNA load and sperm quality traits; and (iv) to compare different library preparation kits for both total RNA-seq (SMARTer Universal Low Input RNA and TruSeq RNA Library Prep kit) and small RNA-seq (NEBNext Small RNA and TailorMix miRNA Sample Prep v2) for high-throughput sequencing. Our results show that pig SRR (~22%) is lower than in other mammalian species and that it is not significantly dependent on the sperm quality parameters analyzed in our study. Moreover, no relationship between the RNA yield per sperm cell and sperm phenotypes was found. We compared an RNA-seq library preparation kit optimized for low amounts of fragmented RNA with a standard kit designed for high amounts of high-quality input RNA and found that for sperm, a protocol designed to work on low-quality RNA is essential. We also compared two small RNA-seq kits and did not find substantial differences in their performance. We propose the described methodological workflow for the RNA-seq screening of the boar spermatozoa transcriptome.
Abbreviations: FPKM: fragments per kilobase of transcript per million mapped reads; KRT1: keratin 1; miRNA: micro-RNA; miscRNA: miscellaneous RNA; Mt rRNA: mitochondrial ribosomal RNA; Mt tRNA: mitochondrial transfer RNA; OAZ3: ornithine decarboxylase antizyme 3; ORT: osmotic resistance test; piRNA: Piwi-interacting RNA; PRM1: protamine 1; PTPRC: protein tyrosine phosphatase receptor type C; rRNA: ribosomal RNA; snoRNA: small nucleolar RNA; snRNA: small nuclear RNA; SRR: sperm recovery rate; tRNA: transfer RNA.
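
The FPKM unit expanded in the abbreviation list is a simple length-and-depth normalization; a generic sketch of the standard definition (not anything specific to this study's pipeline):

```python
def fpkm(fragments, transcript_length_bp, total_mapped_fragments):
    """Fragments Per Kilobase of transcript per Million mapped reads:
    normalize a transcript's fragment count by its length (in kb) and
    by the library's depth (in millions of mapped fragments)."""
    length_kb = transcript_length_bp / 1000.0
    depth_millions = total_mapped_fragments / 1e6
    return fragments / (length_kb * depth_millions)

# 500 fragments on a 2 kb transcript in a 10-million-fragment library.
print(fpkm(500, 2000, 10_000_000))  # -> 25.0
```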

  20. Detection of Volatile Organic Compounds by Self-assembled Monolayer Coated Sensor Array with Concentration-independent Fingerprints

    PubMed Central

    Chang, Ye; Tang, Ning; Qu, Hemi; Liu, Jing; Zhang, Daihua; Zhang, Hao; Pang, Wei; Duan, Xuexin

    2016-01-01

    In this paper, we have modeled and analyzed the affinities and kinetics of volatile organic compound (VOC) adsorption (and desorption) on various surface chemical groups using a film bulk acoustic resonator (FBAR) array functionalized with multiple self-assembled monolayers (SAMs). The high-frequency, micro-scale resonator provides improved sensitivity in the detection of VOCs at trace levels. From the study of affinities and kinetics, three concentration-independent intrinsic parameters (monolayer adsorption capacity, adsorption energy constant and desorption rate) of gas-surface interactions are obtained to contribute to a multi-parameter fingerprint library of VOC analytes. Effects of the functional groups' properties on gas-surface interactions are also discussed. The proposed sensor array with its concentration-independent fingerprint library shows potential as a portable electronic nose (e-nose) system for VOC discrimination and gas-sensitive material selection. PMID:27045012

  1. One Size Does Not Fit All: The Effect of Chain Length and Charge Density of Poly(ethylene imine) Based Copolymers on Delivery of pDNA, mRNA, and RepRNA Polyplexes.

    PubMed

    Blakney, Anna K; Yilmaz, Gokhan; McKay, Paul F; Becer, C Remzi; Shattock, Robin J

    2018-05-03

    Nucleic acid delivery systems are commonly translated between different modalities, such as DNA and RNA of varying length and structure, despite physical differences in these molecules that yield disparate delivery efficiency with the same system. Here, we synthesized a library of poly(2-ethyl-2-oxazoline)/poly(ethylene imine) copolymers with varying molar mass and charge density in order to probe how pDNA, mRNA, and RepRNA polyplex characteristics affect transfection efficiency. The library was utilized in a full factorial design of experiments (DoE) screen, with outputs of luciferase expression, particle size, surface charge, and particle concentration. The optimal copolymer molar mass and charge density were found to be 83 kDa/100%, 72 kDa/100%, and 45 kDa/80% for pDNA, RepRNA, and mRNA, respectively. While 10 of the synthesized copolymers enhanced the transfection efficiency of pDNA and mRNA, only 2 copolymers enhanced RepRNA transfection efficiency, indicating a narrower and more stringent design space for RepRNA. These findings suggest that there is no "one size fits all" polymer for different nucleic acid species.
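
A full factorial DoE simply tests every combination of factor levels. A minimal sketch, where the level sets are illustrative (borrowed from the optima quoted in the abstract, not the screen's actual design):

```python
from itertools import product

# Hypothetical factor levels: copolymer molar mass (kDa),
# PEI charge density (%), and nucleic acid payload type.
molar_masses = [45, 72, 83]
charge_densities = [80, 100]
payloads = ["pDNA", "mRNA", "RepRNA"]

# Full factorial: every combination of levels is one run (3 * 2 * 3 = 18).
design = list(product(molar_masses, charge_densities, payloads))
```

Each run in `design` would then be scored on the measured outputs (luciferase expression, size, charge, concentration) to locate the optimum per payload.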

  2. Double density dynamics: realizing a joint distribution of a physical system and a parameter system

    NASA Astrophysics Data System (ADS)

    Fukuda, Ikuo; Moritsugu, Kei

    2015-11-01

    To perform a variety of types of molecular dynamics simulations, we created a deterministic method termed ‘double density dynamics’ (DDD), which simultaneously realizes an arbitrary distribution for both the physical variables and their associated parameters. Specifically, we constructed an ordinary differential equation that has an invariant density corresponding to a joint distribution of the physical system and the parameter system. A generalized density function leads to a physical system that develops under a nonequilibrium environment described by superstatistics. The joint distribution density of the physical system and the parameter system appears as the Radon-Nikodym derivative of a distribution created by a scaled long-time average, generated from the flow of the differential equation under an ergodic assumption. The general mathematical framework is fully discussed to address the theoretical soundness of our method, and a numerical example of a 1D harmonic oscillator is provided to validate the method as applied to the temperature parameters.

  3. Anisotropy of the angular distribution of fission fragments in heavy-ion fusion-fission reactions: The influence of the level-density parameter and the neck thickness

    NASA Astrophysics Data System (ADS)

    Naderi, D.; Pahlavani, M. R.; Alavi, S. A.

    2013-05-01

    Using the Langevin dynamical approach, the neutron multiplicity and the anisotropy of the angular distribution of fission fragments in heavy-ion fusion-fission reactions were calculated. We applied one- and two-dimensional Langevin equations to study the decay of a hot excited compound nucleus. The influence of the level-density parameter on the neutron multiplicity and the anisotropy of the angular distribution of fission fragments was investigated. We used level-density parameters based on the liquid drop model with two different prescriptions, the Bartel approach and the Pomorska approach. Our calculations show that the anisotropy and neutron multiplicity are affected by the level-density parameter and the neck thickness. The calculations were performed for the 16O+208Pb and 20Ne+209Bi reactions. Results obtained with the two-dimensional Langevin equations and the level-density parameter based on the approach of Bartel and co-workers are in better agreement with the experimental data.

  4. Systematic Biological Filter Design with a Desired I/O Filtering Response Based on Promoter-RBS Libraries.

    PubMed

    Hsu, Chih-Yuan; Pan, Zhen-Ming; Hu, Rei-Hsing; Chang, Chih-Chun; Cheng, Hsiao-Chun; Lin, Che; Chen, Bor-Sen

    2015-01-01

    In this study, robust biological filters with an external control that match a desired input/output (I/O) filtering response are engineered based on well-characterized promoter-RBS libraries and a cascade gene circuit topology. In the field of synthetic biology, a biological filter serves as a powerful detector or sensor that senses different molecular signals and produces a specific output response only if the concentration of the input molecular signal is higher or lower than a specified threshold. The proposed systematic design method for robust biological filters is summarized in three steps. First, several well-characterized promoter-RBS libraries are established for biological filter design by identifying and collecting the quantitative and qualitative characteristics of their promoter-RBS components via a nonlinear parameter estimation method. Then, the topology of the synthetic biological filter is decomposed into three cascaded gene regulatory modules, and an appropriate promoter-RBS library is selected for each module to achieve the desired I/O specification. Finally, based on the proposed systematic method, a robust externally tunable biological filter is engineered by searching the promoter-RBS component libraries and a control inducer concentration library to achieve the optimal match to the specified I/O filtering response.

  5. Sequenced sorghum mutant library- an efficient platform for discovery of causal gene mutations

    USDA-ARS?s Scientific Manuscript database

    Ethyl methanesulfonate (EMS) efficiently generates high-density mutations in genomes. We applied whole-genome sequencing to 256 phenotyped mutant lines of sorghum (Sorghum bicolor L. Moench) to 16x coverage. Comparisons with the reference sequence revealed >1.8 million canonical EMS-induced G/C to A...

  6. Genome-wide annotation of mutations in a phenotyped mutant library provides an efficient platform for discovery of causal gene mutations

    USDA-ARS?s Scientific Manuscript database

    Ethyl methanesulfonate (EMS) efficiently generates high-density mutations in genomes. Conventionally, these mutations are identified by techniques that can detect single-nucleotide mismatches in heteroduplexes of individual PCR amplicons. We applied whole-genome sequencing to 256 phenotyped mutant l...

  7. 77 FR 5564 - Tehachapi Uplands Multiple Species Habitat Conservation Plan; Kern County, CA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-03

    ... Library, Frazier Park Branch, 3732 Park Drive, Frazier Park, CA 93225. FOR FURTHER INFORMATION CONTACT... and future low density residential and commercial development activities on a portion of the Tejon...,533 acres of mountain resort and other development within and adjacent to the Interstate 5 corridor...

  8. Use of Massive Parallel Computing Libraries in the Context of Global Gravity Field Determination from Satellite Data

    NASA Astrophysics Data System (ADS)

    Brockmann, J. M.; Schuh, W.-D.

    2011-07-01

    The estimation of the global Earth gravity field, parametrized as a finite spherical harmonic series, is computationally demanding. The computational effort depends on the one hand on the maximal resolution of the spherical harmonic expansion (i.e. the number of parameters to be estimated) and on the other hand on the number of observations (several millions, e.g. for observations from the GOCE satellite mission). To cope with these demands, massively parallel software based on high-performance computing (HPC) libraries such as ScaLAPACK, PBLAS and BLACS was designed in the context of GOCE HPF WP6000 and the GOCO consortium. A prerequisite for the use of these libraries is that all matrices are block-cyclically distributed on a processor grid composed of a large number of (distributed-memory) computers. Using this set of standard HPC libraries has the benefit that, once the matrices are distributed across the computer cluster, a huge set of efficient and highly scalable linear algebra operations can be used.
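    The block-cyclic distribution required by ScaLAPACK-family libraries follows a simple index arithmetic: global indices are grouped into blocks, blocks are dealt out cyclically over the process grid, and rows and columns are mapped independently. A small sketch of this standard mapping (not code from the software described above):

```python
def block_cyclic_owner(global_idx, block_size, n_procs):
    """1-D block-cyclic mapping as used by ScaLAPACK: return which process
    owns a global row/column index and at which local index it is stored."""
    block = global_idx // block_size          # which block the index falls in
    proc = block % n_procs                    # blocks are dealt out cyclically
    local_block = block // n_procs            # position among this proc's blocks
    local_idx = local_block * block_size + global_idx % block_size
    return proc, local_idx

def owner_2d(i, j, mb, nb, p_rows, p_cols):
    """2-D block-cyclic mapping: rows and columns are mapped independently
    onto a p_rows x p_cols process grid with mb x nb blocks."""
    pr, li = block_cyclic_owner(i, mb, p_rows)
    pc, lj = block_cyclic_owner(j, nb, p_cols)
    return (pr, pc), (li, lj)
```

    For example, with block size 2 and 3 processes, global index 5 lives on process 2 at local index 1, and global index 6 wraps back to process 0.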

  9. Empirical calibration of the near-infrared Ca ii triplet - I. The stellar library and index definition

    NASA Astrophysics Data System (ADS)

    Cenarro, A. J.; Cardiel, N.; Gorgas, J.; Peletier, R. F.; Vazdekis, A.; Prada, F.

    2001-09-01

    A new stellar library at the near-IR spectral region, developed for the empirical calibration of the Ca ii triplet and stellar population synthesis modelling, is presented. The library covers the range λλ8348-9020 at 1.5-Å (FWHM) spectral resolution, and consists of 706 stars spanning a wide range of atmospheric parameters. We have defined a new set of near-IR indices, CaT*, CaT and PaT, which largely overcome the limitations of previous definitions, the first being especially suited for measuring the Ca ii triplet strength corrected for contamination from Paschen lines. We also present a comparative study of the new and previous Ca indices, as well as the transformations between the different systems. A thorough analysis of the sources of index errors and the procedure for calculating them is given. Finally, index and error measurements for the whole stellar library are provided together with the final spectra.

  10. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open-source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbor lists are widely used. Several serial Voronoi tessellation codes exist; however, no open-source, parallel implementation has been available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT computation uses the Qhull library. The domain decomposition takes into account consistent boundary computation between tasks, and includes periodic conditions. In addition, the code computes neighbor lists, the Voronoi density, the Voronoi cell volume, the density gradient for each particle, and densities on a regular grid. The code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
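    The notion of a Voronoi density (inverse cell volume) can be illustrated without Qhull by assigning grid cells to their nearest seed; this nearest-seed partition is exactly the Voronoi tessellation sampled on a grid. A toy pure-Python sketch (PARAVT itself computes exact cells via Qhull and distributes the work over MPI tasks):

```python
def voronoi_density_grid(seeds, grid_n=50, box=1.0):
    """Approximate 2-D Voronoi densities: each grid cell is assigned to its
    nearest seed, a seed's cell volume is (cell count) * dx^2, and its
    Voronoi density is 1 / volume."""
    counts = [0] * len(seeds)
    dx = box / grid_n
    for ix in range(grid_n):
        for iy in range(grid_n):
            x, y = (ix + 0.5) * dx, (iy + 0.5) * dx
            nearest = min(range(len(seeds)),
                          key=lambda s: (seeds[s][0] - x)**2 + (seeds[s][1] - y)**2)
            counts[nearest] += 1
    volumes = [c * dx * dx for c in counts]
    return [1.0 / v if v > 0 else float('inf') for v in volumes]

# Two symmetric seeds split the unit box roughly in half,
# so each Voronoi density is close to 1 / 0.5 = 2.
dens = voronoi_density_grid([(0.25, 0.25), (0.75, 0.75)])
```

    In a clustered particle distribution, seeds in dense regions get small cells and hence large Voronoi densities, which is why VT densities are a popular adaptive density estimator in astrophysics.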

  11. Air Pollution and Quality of Sperm: A Meta-Analysis

    PubMed Central

    Fathi Najafi, Tahereh; Latifnejad Roudsari, Robab; Namvar, Farideh; Ghavami Ghanbarabadi, Vahid; Hadizadeh Talasaz, Zahra; Esmaeli, Mahin

    2015-01-01

    Context: Air pollution is common in all countries and affects reproductive functions in men and women. It particularly impacts sperm parameters in men. This meta-analysis aimed to examine the impact of air pollution on the quality of sperm. Evidence Acquisition: The scientific databases of Medline, PubMed, Scopus, Google Scholar, the Cochrane Library, and Elsevier were searched to identify relevant articles published between 1978 and 2013. In the first step, 76 articles were selected. These were ecological correlation, cohort, retrospective, cross-sectional, and case-control studies found through electronic and hand searches of references on air pollution and male infertility. The outcome measure was the change in sperm parameters. A total of 11 articles were ultimately included in a meta-analysis examining the impact of air pollution on sperm parameters. The authors applied meta-analysis sheets from the Cochrane Library; the extracted data, including the mean and standard deviation of each sperm parameter, were then pooled, and the resulting confidence intervals (CIs) were compared to the CIs of the standard parameters. Results: The CIs for the pooled means were as follows: 2.68 ± 0.32 for ejaculation volume (mL), 62.1 ± 15.88 for sperm concentration (million per milliliter), 39.4 ± 5.52 for sperm motility (%), 23.91 ± 13.43 for sperm morphology (%) and 49.53 ± 11.08 for sperm count. Conclusions: The results of this meta-analysis showed that air pollution reduces sperm motility, but has no impact on the other sperm parameters of the spermogram. PMID:26023349

  12. Air pollution and quality of sperm: a meta-analysis.

    PubMed

    Fathi Najafi, Tahereh; Latifnejad Roudsari, Robab; Namvar, Farideh; Ghavami Ghanbarabadi, Vahid; Hadizadeh Talasaz, Zahra; Esmaeli, Mahin

    2015-04-01

    Air pollution is common in all countries and affects reproductive functions in men and women. It particularly impacts sperm parameters in men. This meta-analysis aimed to examine the impact of air pollution on the quality of sperm. The scientific databases of Medline, PubMed, Scopus, Google Scholar, the Cochrane Library, and Elsevier were searched to identify relevant articles published between 1978 and 2013. In the first step, 76 articles were selected. These were ecological correlation, cohort, retrospective, cross-sectional, and case-control studies found through electronic and hand searches of references on air pollution and male infertility. The outcome measure was the change in sperm parameters. A total of 11 articles were ultimately included in a meta-analysis examining the impact of air pollution on sperm parameters. The authors applied meta-analysis sheets from the Cochrane Library; the extracted data, including the mean and standard deviation of each sperm parameter, were then pooled, and the resulting confidence intervals (CIs) were compared to the CIs of the standard parameters. The CIs for the pooled means were as follows: 2.68 ± 0.32 for ejaculation volume (mL), 62.1 ± 15.88 for sperm concentration (million per milliliter), 39.4 ± 5.52 for sperm motility (%), 23.91 ± 13.43 for sperm morphology (%) and 49.53 ± 11.08 for sperm count. The results of this meta-analysis showed that air pollution reduces sperm motility, but has no impact on the other sperm parameters of the spermogram.
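    The pooling step of such a meta-analysis can be sketched as an inverse-variance weighted mean over study-level means and standard deviations. The function below is a schematic fixed-effect version with made-up study data, not the authors' actual worksheet:

```python
import math

def pooled_mean_ci(means, sds, ns, z=1.96):
    """Fixed-effect (inverse-variance) pooled mean with an approximate 95% CI.
    Each study contributes weight n / sd^2, i.e. 1 / SE^2."""
    weights = [n / sd**2 for sd, n in zip(sds, ns)]
    pooled = sum(w * m for w, m in zip(weights, means)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))          # SE of the pooled estimate
    return pooled, (pooled - z * se, pooled + z * se)

# Hypothetical ejaculation-volume data (mL) from three studies.
mean, (lo, hi) = pooled_mean_ci(means=[2.5, 2.8, 2.7],
                                sds=[0.4, 0.5, 0.3],
                                ns=[50, 80, 60])
```

    Comparing such a pooled CI against the reference range of a sperm parameter is the kind of comparison the abstract describes.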

  13. Influence of additive laser manufacturing parameters on surface using density of partially melted particles

    NASA Astrophysics Data System (ADS)

    Rosa, Benoit; Brient, Antoine; Samper, Serge; Hascoët, Jean-Yves

    2016-12-01

    Mastering the surface produced by additive laser manufacturing is a real challenge and would allow functional surfaces to be obtained without finishing. Direct Metal Deposition (DMD) surfaces are composed of directional and chaotic textures that are directly linked to the process principles. The aim of this work is to obtain desired surface topographies by mastering the operating process parameters. Based on an experimental investigation, the influence of the operating parameters on the surface finish has been modeled. Topography parameters and multi-scale analysis have been used to characterize the DMD surfaces. This study also proposes a methodology for characterizing the DMD chaotic texture through topography filtering and 3D image treatment. In parallel, a new parameter is proposed: the density of particles (Dp). Finally, this study proposes a regression model linking the process parameters to the density-of-particles parameter.
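    One plausible reading of a density-of-particles parameter is the count of distinct particle regions detected on a (thresholded) topography map divided by the inspected area. The sketch below counts connected regions in a binary image with a simple flood fill; it is purely illustrative of the image-treatment step, since the paper's pipeline works on filtered 3-D topographies:

```python
def particle_density(binary, pixel_area=1.0):
    """Count 4-connected particle regions in a binary image and divide by
    the total surface area: a toy density-of-particles (Dp) estimate."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1                       # new particle found
                stack = [(r, c)]
                while stack:                     # flood-fill its pixels
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and binary[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return count / (rows * cols * pixel_area)

img = [[0, 1, 1, 0, 0],
       [0, 1, 0, 0, 1],
       [0, 0, 0, 0, 1],
       [1, 0, 0, 0, 0]]
dp = particle_density(img)   # 3 particles over 20 pixels -> 0.15
```

    A regression model would then relate process parameters (laser power, scan speed, powder feed) to this Dp value.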

  14. n+235U resonance parameters and neutron multiplicities in the energy region below 100 eV

    NASA Astrophysics Data System (ADS)

    Pigni, Marco T.; Capote, Roberto; Trkov, Andrej; Pronyaev, Vladimir G.

    2017-09-01

    In August 2016, following the recent effort within the Collaborative International Evaluated Library Organization (CIELO) pilot project to improve the neutron cross sections of 235U, Oak Ridge National Laboratory (ORNL) collaborated with the International Atomic Energy Agency (IAEA) to release a resonance parameter evaluation. This evaluation restores the performance of the evaluated cross sections for the thermal- and above-thermal-solution benchmarks on the basis of newly evaluated thermal neutron constants (TNCs) and thermal prompt fission neutron spectra (PFNS). Performed with support from the US Nuclear Criticality Safety Program (NCSP) in an effort to provide the highest-fidelity general purpose nuclear database for nuclear criticality applications, the resonance parameter evaluation was submitted as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The resonance parameter evaluation methodology used the Reich-Moore approximation of the R-matrix formalism, implemented in the code SAMMY, to fit the available time-of-flight (TOF) data for the neutron-induced cross sections of n+235U up to 100 eV. While maintaining reasonably good agreement with the experimental data, the validation analysis focused on restoring the benchmark performance for 235U solutions by combining changes to the resonance parameters and to the prompt resonance v̅

  15. micrOMEGAs 2.0: A program to calculate the relic density of dark matter in a generic model

    NASA Astrophysics Data System (ADS)

    Bélanger, G.; Boudjema, F.; Pukhov, A.; Semenov, A.

    2007-03-01

    micrOMEGAs 2.0 is a code which calculates the relic density of a stable massive particle in an arbitrary model. The underlying assumption is that there is a conservation law, like R-parity in supersymmetry, which guarantees the stability of the lightest odd particle. The new physics model must be incorporated in the notation of CalcHEP, a package for the automatic generation of squared matrix elements. Once this is done, all annihilation and coannihilation channels are included automatically in any model. Cross-sections at v=0, relevant for indirect detection of dark matter, are also computed automatically. The package includes three sample models: the minimal supersymmetric standard model (MSSM), the MSSM with complex phases and the NMSSM. Extension to other models, including non-supersymmetric models, is described.
    Program summary
    Title of program: micrOMEGAs 2.0
    Catalogue identifier: ADQR_v2_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADQR_v2_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers for which the program is designed and others on which it has been tested: PC, Alpha, Mac, Sun
    Operating systems under which the program has been tested: UNIX (Linux, OSF1, SunOS, Darwin, Cygwin)
    Programming language used: C and Fortran
    Memory required to execute with typical data: 17 MB depending on the number of processes required
    No. of processors used: 1
    Has the code been vectorized or parallelized: no
    No. of lines in distributed program, including test data, etc.: 91 778
    No. of bytes in distributed program, including test data, etc.: 1 306 726
    Distribution format: tar.gz
    External routines/libraries used: no
    Catalogue identifier of previous version: ADQR_v1_3
    Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 577
    Does the new version supersede the previous version: yes
    Nature of physical problem: Calculation of the relic density of the lightest stable particle in a generic new model of particle physics.
    Method of solution: In numerically solving the evolution equation for the density of dark matter, relativistic formulae for the thermal average are used. All tree-level processes for annihilation and coannihilation of new particles in the model are included. The cross-sections for all processes are calculated exactly with CalcHEP after definition of a model file. Higher-order QCD corrections to Higgs couplings to quark pairs are included.
    Reasons for the new version: There are many models of new physics that propose a candidate for dark matter besides the much-studied minimal supersymmetric standard model. This new version not only incorporates extensions of the MSSM, such as the MSSM with complex phases, or the NMSSM which contains an extra singlet superfield, but also gives the user the possibility to incorporate a new model easily. For this the user only needs to redefine appropriately a new model file.
    Summary of revisions: Possibility to include in the package any particle physics model with a discrete symmetry that guarantees the stability of the cold dark matter candidate (LOP) and to compute the relic density of CDM. Compute automatically the cross-sections for annihilation of the LOP at small velocities into SM final states and provide the energy spectra for γ, e, p¯, ν final states. For the MSSM with input parameters defined at the GUT scale, the interface with any of the spectrum calculator codes reads an input file in the SUSY Les Houches Accord format (SLHA). Implementation of the MSSM with complex parameters (CPV-MSSM) with an interface to CPsuperH to calculate the spectrum. Routine to calculate the electric dipole moment of the electron in the CPV-MSSM. In the NMSSM, a new interface compatible with NMHDECAY2.1.
    Typical running time: 0.2 sec
    Unusual features of the program: Depending on the parameters of the model, the program generates additional new code, compiles it and loads it dynamically.
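    The evolution equation solved in such relic-density codes is the standard freeze-out equation for the comoving yield Y as a function of x = m/T. A toy one-dimensional integration is sketched below; the coupling constant and the Maxwell-Boltzmann equilibrium yield are arbitrary illustrations, whereas micrOMEGAs uses full relativistic thermal averages and exact CalcHEP cross-sections:

```python
import math

def relic_yield(lam=1.0e3, x_start=1.0, x_end=100.0, steps=200000):
    """Integrate the schematic freeze-out equation
        dY/dx = -(lam / x^2) * (Y^2 - Yeq^2),   x = m/T,
    with a toy equilibrium yield Yeq ~ x^{3/2} e^{-x} (normalization 1).
    lam bundles the annihilation cross-section and cosmological factors;
    its value here is arbitrary."""
    dx = (x_end - x_start) / steps
    x = x_start
    y = x**1.5 * math.exp(-x)          # start in thermal equilibrium
    for _ in range(steps):
        yeq = x**1.5 * math.exp(-x)
        y += -lam / x**2 * (y * y - yeq * yeq) * dx   # explicit Euler step
        x += dx
    return y

y_inf = relic_yield()
# The yield tracks equilibrium, then "freezes out" once annihilation
# becomes slow; a larger lam (stronger annihilation) gives a smaller relic yield.
```

    The frozen-out yield, multiplied by the particle mass and today's entropy density, gives the relic density Ωh² that codes like micrOMEGAs report.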

  16. Development of Probabilistic Socio-Economic Emissions Scenarios (2012)

    EPA Pesticide Factsheets

    The purpose of this analysis is to help overcome these limitations through the development of a publically available library of socio-economic-emissions projections derived from a systematic examination of uncertainty in key underlying model parameters, w

  17. Effects of the Application of the New Nuclear Data Library ENDF/B to the Criticality Analysis of AP1000

    NASA Astrophysics Data System (ADS)

    Kuntoro, Iman; Sembiring, T. M.; Susilo, Jati; Deswandri; Sunaryo, G. R.

    2018-02-01

    Calculations of the criticality of the AP1000 core using new editions of the nuclear data library, namely ENDF/B-VII and ENDF/B-VII.1, have been performed. This work aims to assess the accuracy of ENDF/B-VII.1 compared to ENDF/B-VII and ENDF/B-VI.8 in determining the criticality parameters of the AP1000. The analysis was performed for the core at cold zero power (CZP) conditions. The calculations were carried out with the MCNP computer code in three-dimensional geometry. The results show that the criticality parameter, namely the effective multiplication factor of the AP1000 core, is higher than that obtained with ENDF/B-VI.8, with relative differences of 0.39% for ENDF/B-VII and 0.34% for ENDF/B-VII.1.

  18. SU-F-T-366: Dosimetric Parameters Enhancement of 120-Leaf Millennium MLC Using EGSnrc and IAEA Phase-Space Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haddad, K; Alopoor, H

    Purpose: Recently, multileaf collimators (MLCs) have become an important part of LINAC collimation systems because they reduce treatment planning time and improve conformity. Important factors that affect an MLC's collimation performance are the leaf material composition and thickness. In this study, we investigate the main dosimetric parameters of the 120-leaf Millennium MLC, including the dose at the buildup point, the physical penumbra, and the average and end-leaf leakages. Effects of the leaf geometry and density on these parameters are evaluated. Methods: The BEAMnrc and DOSXYZnrc modules of the EGSnrc Monte Carlo code are used to evaluate the dosimetric parameters of a water phantom exposed to a Varian xi at 100 cm SSD. Using the IAEA phase-space data just above the MLC (Z=46 cm) and BEAMnrc, a new phase-space file at Z=52 cm is produced for the modified 120-leaf Millennium MLC. The MLC is modified in both leaf thickness and material composition. The EGSgui code generates a 521ICRU library for tungsten alloys. DOSXYZnrc with the new phase space evaluates the dose distribution in a water phantom of 60×60×20 cm3 with a voxel size of 4×4×2 mm3. Using the DOSXYZnrc dose distributions for open and closed beams, together with the leakage definitions, the end leakage, average leakage and physical penumbra are evaluated. Results: A new MLC with improved dosimetric parameters is proposed. The physical penumbra for the proposed MLC is 4.7 mm compared to 5.16 mm for the Millennium. The average leakage in our design is reduced to 1.16% compared to 1.73% for the Millennium, and the end-leaf leakage of the suggested design is also reduced, to 4.86% compared to 7.26% for the Millennium. Conclusion: The results show that the proposed MLC with enhanced dosimetric parameters could improve the conformity of treatment planning.

  19. The Essential Genome of Escherichia coli K-12

    PubMed Central

    2018-01-01

    ABSTRACT Transposon-directed insertion site sequencing (TraDIS) is a high-throughput method coupling transposon mutagenesis with short-fragment DNA sequencing. It is commonly used to identify essential genes. Single gene deletion libraries are considered the gold standard for identifying essential genes. Currently, the TraDIS method has not been benchmarked against such libraries, and therefore, it remains unclear whether the two methodologies are comparable. To address this, a high-density transposon library was constructed in Escherichia coli K-12. Essential genes predicted from sequencing of this library were compared to existing essential gene databases. To decrease false-positive identification of essential genes, statistical data analysis included corrections for both gene length and genome length. Through this analysis, new essential genes and genes previously incorrectly designated essential were identified. We show that manual analysis of TraDIS data reveals novel features that would not have been detected by statistical analysis alone. Examples include short essential regions within genes, orientation-dependent effects, and fine-resolution identification of genome and protein features. Recognition of these insertion profiles in transposon mutagenesis data sets will assist genome annotation of less well characterized genomes and provides new insights into bacterial physiology and biochemistry. PMID:29463657
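    The gene-length correction mentioned above can be illustrated with a simple null model: if transposon insertions fell uniformly across the genome, the number of hits in a gene would be Poisson with mean proportional to gene length, so long genes with few hits are stronger essentiality candidates than short ones. A toy version (the study's actual statistics are more involved):

```python
import math

def poisson_pmf(k, mu):
    """Poisson probability of exactly k events with mean mu."""
    return math.exp(-mu) * mu**k / math.factorial(k)

def essentiality_p(insertions, gene_len, genome_len, total_insertions):
    """P(observing <= `insertions` transposon hits in a gene) under a
    uniform-insertion Poisson null model. Low values flag candidate
    essential genes; the expected count scales with gene length, which
    is the gene-length correction."""
    mu = total_insertions * gene_len / genome_len   # expected hits in this gene
    return sum(poisson_pmf(k, mu) for k in range(insertions + 1))

# Hypothetical numbers: a 900 bp gene in a 4.6 Mb genome with 500,000 insertions.
p_zero = essentiality_p(0, 900, 4_600_000, 500_000)   # no hits: very unlikely by chance
```

    A gene with zero hits under such a dense library is therefore a strong essentiality call, whereas the same observation in a very short gene would be unremarkable.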

  20. Recent advances in PC-Linux systems for electronic structure computations by optimized compilers and numerical libraries.

    PubMed

    Yu, Jen-Shiang K; Yu, Chin-Hui

    2002-01-01

    One of the most frequently used packages for electronic structure research, GAUSSIAN 98, is compiled on Linux systems with various hardware configurations, including AMD Athlon (with the "Thunderbird" core), AthlonMP, and AthlonXP (with the "Palomino" core) systems as well as the Intel Pentium 4 (with the "Willamette" core) machines. The default PGI FORTRAN compiler (pgf77) and the Intel FORTRAN compiler (ifc) are respectively employed with different architectural optimization options to compile GAUSSIAN 98 and test the performance improvement. In addition to the BLAS library included in revision A.11 of this package, the Automatically Tuned Linear Algebra Software (ATLAS) library is linked against the binary executables to improve the performance. Various Hartree-Fock, density-functional theory, and MP2 calculations are performed for benchmarking purposes. It is found that the combination of ifc with the ATLAS library gives the best performance for GAUSSIAN 98 on all of these PC-Linux computers, including AMD and Intel CPUs. Even on AMD systems, the Intel FORTRAN compiler invariably produces binaries with better performance than pgf77. The enhancement provided by the ATLAS library is more significant for post-Hartree-Fock calculations. The performance on one single CPU is potentially as good as that on an Alpha 21264A workstation or an SGI supercomputer. The floating-point marks by SpecFP2000 show similar trends to the results of the GAUSSIAN 98 package.

  1. The effect of call libraries and acoustic filters on the identification of bat echolocation.

    PubMed

    Clement, Matthew J; Murray, Kevin L; Solick, Donald I; Gruver, Jeffrey C

    2014-09-01

    Quantitative methods for species identification are commonly used in acoustic surveys for animals. While various identification models have been studied extensively, there has been little study of methods for selecting calls prior to modeling or methods for validating results after modeling. We obtained two call libraries with a combined 1556 pulse sequences from 11 North American bat species. We used four acoustic filters to automatically select and quantify bat calls from the combined library. For each filter, we trained a species identification model (a quadratic discriminant function analysis) and compared the classification ability of the models. In a separate analysis, we trained a classification model using just one call library. We then compared a conventional model assessment that used the training library against an alternative approach that used the second library. We found that filters differed in the share of known pulse sequences that were selected (68 to 96%), the share of non-bat noises that were excluded (37 to 100%), their measurement of various pulse parameters, and their overall correct classification rate (41% to 85%). Although the top two filters did not differ significantly in overall correct classification rate (85% and 83%), rates differed significantly for some bat species. In our assessment of call libraries, overall correct classification rates were significantly lower (15% to 23% lower) when tested on the second call library instead of the training library. Well-designed filters obviated the need for subjective and time-consuming manual selection of pulses. Accordingly, researchers should carefully design and test filters and include adequate descriptions in publications. Our results also indicate that it may not be possible to extend inferences about model accuracy beyond the training library. If so, the accuracy of acoustic-only surveys may be lower than commonly reported, which could affect ecological understanding or management decisions based on acoustic surveys.
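    The classifier named above, quadratic discriminant analysis, picks the class whose fitted Gaussian gives the highest log-likelihood, with a separate variance per class (hence "quadratic" decision boundaries). A toy one-feature version is sketched below; the species names and call frequencies are hypothetical stand-ins for the multi-parameter pulse measurements used in the study:

```python
import math

def fit_gaussian(values):
    """Sample mean and (unbiased) variance of a class's training values."""
    m = sum(values) / len(values)
    v = sum((x - m) ** 2 for x in values) / (len(values) - 1)
    return m, v

def qda_classify(x, params):
    """Assign x to the class with the highest Gaussian log-likelihood.
    With class-specific variances this is 1-D quadratic discriminant analysis."""
    best, best_ll = None, -math.inf
    for label, (m, v) in params.items():
        ll = -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
        if ll > best_ll:
            best, best_ll = label, ll
    return best

# Hypothetical characteristic call frequencies (kHz) for two species.
params = {"M. lucifugus": fit_gaussian([39, 41, 40, 42, 38]),
          "L. noctivagans": fit_gaussian([26, 28, 27, 25, 29])}
species = qda_classify(40.5, params)
```

    The paper's point about call libraries applies directly here: `params` fitted on one library need not generalize to pulses recorded under different conditions.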

  2. The effect of call libraries and acoustic filters on the identification of bat echolocation

    PubMed Central

    Clement, Matthew J; Murray, Kevin L; Solick, Donald I; Gruver, Jeffrey C

    2014-01-01

    Quantitative methods for species identification are commonly used in acoustic surveys for animals. While various identification models have been studied extensively, there has been little study of methods for selecting calls prior to modeling or methods for validating results after modeling. We obtained two call libraries with a combined 1556 pulse sequences from 11 North American bat species. We used four acoustic filters to automatically select and quantify bat calls from the combined library. For each filter, we trained a species identification model (a quadratic discriminant function analysis) and compared the classification ability of the models. In a separate analysis, we trained a classification model using just one call library. We then compared a conventional model assessment that used the training library against an alternative approach that used the second library. We found that filters differed in the share of known pulse sequences that were selected (68 to 96%), the share of non-bat noises that were excluded (37 to 100%), their measurement of various pulse parameters, and their overall correct classification rate (41% to 85%). Although the top two filters did not differ significantly in overall correct classification rate (85% and 83%), rates differed significantly for some bat species. In our assessment of call libraries, overall correct classification rates were significantly lower (15% to 23% lower) when tested on the second call library instead of the training library. Well-designed filters obviated the need for subjective and time-consuming manual selection of pulses. Accordingly, researchers should carefully design and test filters and include adequate descriptions in publications. Our results also indicate that it may not be possible to extend inferences about model accuracy beyond the training library. If so, the accuracy of acoustic-only surveys may be lower than commonly reported, which could affect ecological understanding or management decisions based on acoustic surveys. PMID:25535563

  3. The effect of call libraries and acoustic filters on the identification of bat echolocation

    USGS Publications Warehouse

    Clement, Matthew; Murray, Kevin L; Solick, Donald I; Gruver, Jeffrey C

    2014-01-01

    Quantitative methods for species identification are commonly used in acoustic surveys for animals. While various identification models have been studied extensively, there has been little study of methods for selecting calls prior to modeling or methods for validating results after modeling. We obtained two call libraries with a combined 1556 pulse sequences from 11 North American bat species. We used four acoustic filters to automatically select and quantify bat calls from the combined library. For each filter, we trained a species identification model (a quadratic discriminant function analysis) and compared the classification ability of the models. In a separate analysis, we trained a classification model using just one call library. We then compared a conventional model assessment that used the training library against an alternative approach that used the second library. We found that filters differed in the share of known pulse sequences that were selected (68 to 96%), the share of non-bat noises that were excluded (37 to 100%), their measurement of various pulse parameters, and their overall correct classification rate (41% to 85%). Although the top two filters did not differ significantly in overall correct classification rate (85% and 83%), rates differed significantly for some bat species. In our assessment of call libraries, overall correct classification rates were significantly lower (15% to 23% lower) when tested on the second call library instead of the training library. Well-designed filters obviated the need for subjective and time-consuming manual selection of pulses. Accordingly, researchers should carefully design and test filters and include adequate descriptions in publications. Our results also indicate that it may not be possible to extend inferences about model accuracy beyond the training library. If so, the accuracy of acoustic-only surveys may be lower than commonly reported, which could affect ecological understanding or management decisions based on acoustic surveys.

  4. Effects of varying densities on serum reproductive parameters in pen-reared juvenile female rainbow trout Oncorhynchus mykiss farms

    NASA Astrophysics Data System (ADS)

    Hou, Zhishuai; Wen, Haishen; Li, Jifang; He, Feng; Liu, Qun; Wang, Jinhuan; Guan, Biao; Wang, Qinglong

    2017-01-01

    The primary goal of this study was to assess the effect of varying densities on serum reproductive parameters of immature rainbow trout Oncorhynchus mykiss. Experimental trout were maintained in intensive, pen-reared farms for 300 days in freshwater reservoirs. Initial densities were 4.6, 6.6, and 8.6 kg/m3 (40, 60, 80 ind./m3), denoted SD1, SD2 and SD3, and final densities were 31.1, 40.6 and 49.3 kg/m3, respectively. Ovarian stages were determined by histological examination. Serum E2 (estradiol) and T (testosterone) were evaluated by radioimmunoassay, and FSH (follicle-stimulating hormone), LH (luteinizing hormone), vitellogenin and 17α,20β-P (17α,20β-dihydroxy-4-pregnen-3-one) were measured by enzyme-linked immunosorbent assay. Our findings demonstrated that ovarian development was retarded (from stage III to stage IV) at the highest rearing density (SD3) after 180 days of intensive culture (over 40.6 kg/m3). In addition, we observed an inverse relationship between serum reproductive parameters and rearing density. Furthermore, compared to the serum reproductive parameters of SD1, the E2, T, FSH, vitellogenin, 17α,20β-P, GSI and LH of the two higher-density groups first decreased significantly at 60 (over 15.9 kg/m3), 180 (over 31.7 kg/m3), 180 (over 40.6 kg/m3), 240 (over 36 kg/m3), 240 (over 36 kg/m3), 240 (over 45 kg/m3) and 300 (over 49.3 kg/m3) days, respectively. Comparing serum reproductive parameters within the same ovarian development stage across densities revealed that higher population density also led to significantly lower overall serum reproductive parameters. Overall, this study presents the reproductive and endocrinological parameters of juvenile female rainbow trout at high rearing densities and indicates that rainbow trout (114.44±5.21 g, 19.69±0.31 cm) initially stocked at 6.6 or 8.6 kg/m3 should be graded and subdivided into lower-density groups after 180 days of farming (not over 31.7 kg/m3).

  5. A two-population sporadic meteoroid bulk density distribution and its implications for environment models

    NASA Astrophysics Data System (ADS)

    Moorhead, Althea V.; Blaauw, Rhiannon C.; Moser, Danielle E.; Campbell-Brown, Margaret D.; Brown, Peter G.; Cooke, William J.

    2017-12-01

    The bulk density of a meteoroid affects its dynamics in space, its ablation in the atmosphere, and the damage it does to spacecraft and lunar or planetary surfaces. Meteoroid bulk densities are also notoriously difficult to measure, and we are typically forced to assume a density or attempt to measure it via a proxy. In this paper, we construct a density distribution for sporadic meteoroids based on existing density measurements. We considered two possible proxies for density: the KB parameter introduced by Ceplecha, and the Tisserand parameter, TJ. Although KB is frequently cited as a proxy for meteoroid material properties, we find that it is poorly correlated with ablation-model-derived densities. We therefore follow the example of Kikwaya et al. in associating density with the Tisserand parameter. We fit two density distributions to meteoroids originating from Halley-type comets (TJ < 2) and those originating from all other parent bodies (TJ > 2); the resulting two-population density distribution is the most detailed sporadic meteoroid density distribution justified by the available data. Finally, we discuss the implications for meteoroid environment models and spacecraft risk assessments. We find that correcting for density increases the fraction of meteoroid-induced spacecraft damage attributed to the helion/antihelion source.

  6. Global asymptotic stability of density dependent integral population projection models.

    PubMed

    Rebarber, Richard; Tenhumberg, Brigitte; Townley, Stuart

    2012-02-01

    Many stage-structured density-dependent populations with a continuum of stages can be naturally modeled using nonlinear integral projection models. In this paper, we prove a trichotomy of global stability results for a class of density-dependent systems, which includes a Platte thistle model. Specifically, we identify those system parameters for which zero is globally asymptotically stable, parameters for which there is a positive asymptotically stable equilibrium, and parameters for which there is no asymptotically stable equilibrium. Copyright © 2011 Elsevier Inc. All rights reserved.

  7. Swift Foxes and Ideal Free Distribution: Relative Influence of Vegetation and Rodent Prey Base on Swift Fox Survival, Density, and Home Range Size

    DTIC Science & Technology

    2012-01-01

    ... of exploiting a wide range of habitats, reported population parameters such as density and survival vary widely, indicating variation in habitat quality ... more strongly influenced by the “riskiness” of the habitat than by resource availability [8]. Swift fox population parameters in different landscapes ... we explored the effects of landscape heterogeneity on population parameters likely to reflect habitat quality, such as population density, home range ...

  8. Uncertainty analysis on reactivity and discharged inventory for a pressurized water reactor fuel assembly due to {sup 235,238}U nuclear data uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Da Cruz, D. F.; Rochman, D.; Koning, A. J.

    2012-07-01

    This paper discusses the uncertainty analysis on reactivity and inventory for a typical PWR fuel element as a result of uncertainties in ²³⁵U and ²³⁸U nuclear data. A typical Westinghouse 3-loop fuel assembly fuelled with UO₂ fuel at 4.8% enrichment has been selected. The Total Monte Carlo method has been applied using the deterministic transport code DRAGON. This code allows the generation of few-group nuclear data libraries by directly using data contained in the nuclear data evaluation files. The nuclear data used in this study are from the JEFF-3.1 evaluation, and the nuclear data files for ²³⁸U and ²³⁵U (randomized for the generation of the various DRAGON libraries) are taken from the nuclear data library TENDL. The total uncertainty (obtained by randomizing all ²³⁸U and ²³⁵U nuclear data in the ENDF files) on the reactor parameters has been split into different components (different nuclear reaction channels). Results show that the TMC method in combination with a deterministic transport code constitutes a powerful tool for performing uncertainty and sensitivity analysis of reactor physics parameters. (authors)
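    As a toy illustration of the Total Monte Carlo idea (not the actual DRAGON/TENDL workflow), one can randomize each "reaction channel" separately, take the spread of the response as that channel's uncertainty component, and check that for independent channels the components add in quadrature to the total. The response function and sigma values below are invented for the sketch.

    ```python
    import random

    random.seed(42)

    def k_eff(capture, fission):
        # hypothetical smooth response of a reactor parameter to perturbed
        # nuclear data; the real study runs DRAGON on randomized TENDL files
        return 1.0 + 0.8 * fission - 0.5 * capture

    N = 20000
    sig_c, sig_f = 0.02, 0.03  # assumed 1-sigma relative data uncertainties

    def spread(randomize_capture, randomize_fission):
        """Std. dev. of k when only the selected channels are randomized."""
        ks = [k_eff(random.gauss(0, sig_c) if randomize_capture else 0.0,
                    random.gauss(0, sig_f) if randomize_fission else 0.0)
              for _ in range(N)]
        m = sum(ks) / N
        return (sum((k - m) ** 2 for k in ks) / N) ** 0.5

    s_total = spread(True, True)    # all data randomized at once
    s_cap = spread(True, False)     # capture channel only
    s_fis = spread(False, True)     # fission channel only
    # for independent channels the component variances sum to the total
    print(s_total, (s_cap ** 2 + s_fis ** 2) ** 0.5)
    ```

    This quadrature check is what makes the channel-by-channel split of the total uncertainty meaningful.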

  9. Equation of state for dense nucleonic matter from metamodeling. I. Foundational aspects

    NASA Astrophysics Data System (ADS)

    Margueron, Jérôme; Hoffmann Casali, Rudiney; Gulminelli, Francesca

    2018-02-01

    Metamodeling for the nucleonic equation of state (EOS), inspired by a Taylor expansion around the saturation density of symmetric nuclear matter, is proposed and parameterized in terms of the empirical parameters. The present knowledge of the nuclear empirical parameters is first reviewed in order to estimate their average values and associated uncertainties, thus defining the parameter space of the metamodeling. They are divided into isoscalar and isovector types, and ordered according to their power in the density expansion. The goodness of the metamodeling is analyzed against the predictions of the original models. In addition, since no correlation among the empirical parameters is assumed a priori, all arbitrary density dependences can be explored, which might not be accessible in existing functionals. Spurious correlations due to the assumed functional form are also removed. This meta-EOS allows direct relations between the uncertainties on the empirical parameters and the density dependence of the nuclear equation of state and its derivatives, and the mapping between the two can be done with standard Bayesian techniques. A sensitivity analysis shows that the most influential empirical parameters are the isovector parameters Lsym and Ksym, and that laboratory constraints at supersaturation densities are essential to reduce the present uncertainties. The present metamodeling of the EOS for nuclear matter is proposed for further applications to neutron stars and supernova matter.
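    The Taylor-expansion structure the abstract refers to can be sketched as follows; the coefficient names follow the usual empirical-parameter conventions and are reproduced here as an illustration of the idea, not quoted from the paper:

    ```latex
    % energy per nucleon around saturation density n_sat, with
    % x = (n - n_sat)/(3 n_sat) and isospin asymmetry delta = (n_n - n_p)/n
    \begin{align}
      e(n,\delta) &\simeq e_{\mathrm{sat}}(n) + e_{\mathrm{sym}}(n)\,\delta^2 \\
      e_{\mathrm{sat}}(n) &= E_{\mathrm{sat}} + \tfrac{1}{2}K_{\mathrm{sat}}x^2
          + \tfrac{1}{6}Q_{\mathrm{sat}}x^3 + \tfrac{1}{24}Z_{\mathrm{sat}}x^4 \\
      e_{\mathrm{sym}}(n) &= E_{\mathrm{sym}} + L_{\mathrm{sym}}x
          + \tfrac{1}{2}K_{\mathrm{sym}}x^2 + \tfrac{1}{6}Q_{\mathrm{sym}}x^3
          + \tfrac{1}{24}Z_{\mathrm{sym}}x^4
    \end{align}
    ```

    The linear isoscalar term is absent because the pressure of symmetric matter vanishes at saturation; the isovector slope Lsym survives at first order, which is consistent with the sensitivity analysis finding Lsym and Ksym to be the most influential parameters.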

  10. Convergent development of anodic bacterial communities in microbial fuel cells.

    PubMed

    Yates, Matthew D; Kiely, Patrick D; Call, Douglas F; Rismani-Yazdi, Hamid; Bibby, Kyle; Peccia, Jordan; Regan, John M; Logan, Bruce E

    2012-11-01

    Microbial fuel cells (MFCs) are often inoculated from a single wastewater source. The extent to which the inoculum affects community development or power production is unknown. The stable anodic microbial communities in MFCs were examined using three inocula: a wastewater treatment plant sample known to produce consistent power densities, a second wastewater treatment plant sample, and an anaerobic bog sediment. The bog-inoculated MFCs initially produced higher power densities than the wastewater-inoculated MFCs, but after 20 cycles all MFCs on average converged to similar voltages (470±20 mV) and maximum power densities (590±170 mW m⁻²). The power output from replicate bog-inoculated MFCs was not significantly different, but one wastewater-inoculated MFC (UAJA3 (UAJA, University Area Joint Authority Wastewater Treatment Plant)) produced substantially less power. Denaturing gradient gel electrophoresis profiling showed a stable exoelectrogenic biofilm community in all samples after 11 cycles. After 16 cycles the predominance of Geobacter spp. in anode communities was identified using 16S rRNA gene clone libraries (58±10%), fluorescence in situ hybridization (FISH) (63±6%) and pyrosequencing (81±4%). While the clone library analysis for the underperforming UAJA3 had a significantly lower percentage of Geobacter spp. sequences (36%), suggesting that a predominance of this microbe was needed for convergent power densities, the lower percentage of this species was not verified by FISH or pyrosequencing analyses. These results show that the predominance of Geobacter spp. in acetate-fed systems was consistent with good MFC performance and independent of the inoculum source.

  11. mr: A C++ library for the matching and running of the Standard Model parameters

    NASA Astrophysics Data System (ADS)

    Kniehl, Bernd A.; Pikelner, Andrey F.; Veretin, Oleg L.

    2016-09-01

    We present the C++ program library mr that allows us to reliably calculate the values of the running parameters in the Standard Model at high energy scales. The initial conditions are obtained by relating the running parameters in the MS bar renormalization scheme to observables at lower energies with full two-loop precision. The evolution is then performed in accordance with the renormalization group equations with full three-loop precision. Pure QCD corrections to the matching and running are included through four loops. We also provide a Mathematica interface for this program library. Catalogue identifier: AFAI_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AFAI_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License, version 3 No. of lines in distributed program, including test data, etc.: 517613 No. of bytes in distributed program, including test data, etc.: 2358729 Distribution format: tar.gz Programming language: C++. Computer: IBM PC. Operating system: Linux, Mac OS X. RAM: 1 GB Classification: 11.1. External routines: TSIL [1], OdeInt [2], boost [3] Nature of problem: The running parameters of the Standard Model renormalized in the MS bar scheme at some high renormalization scale, which is chosen by the user, are evaluated in perturbation theory as precisely as possible in two steps. First, the initial conditions at the electroweak energy scale are evaluated from the Fermi constant GF and the pole masses of the W, Z, and Higgs bosons and the bottom and top quarks including the full two-loop threshold corrections. Second, the evolution to the high energy scale is performed by numerically solving the renormalization group evolution equations through three loops. Pure QCD corrections to the matching and running are included through four loops. 
Solution method: Numerical integration of analytic expressions. Additional comments: Available for download from URL: http://apik.github.io/mr/. The MathLink interface is tested to work with Mathematica 7-9 and, with an additional flag, also with Mathematica 10 under Linux and with Mathematica 10 under Mac OS X. Running time: less than 1 second. References: [1] S. P. Martin and D. G. Robertson, Comput. Phys. Commun. 174 (2006) 133-151 [hep-ph/0501132]. [2] K. Ahnert and M. Mulansky, AIP Conf. Proc. 1389 (2011) 1586-1589 [arXiv:1110.3397 [cs.MS]].

  12. X-ray Pulsars Across the Parameter Space of Luminosity, Accretion Mode, and Spin

    NASA Astrophysics Data System (ADS)

    Laycock, Silas

    We propose to expand the scope of our successful project providing a multi-satellite library of X-ray Pulsar observations to the community. The library provides high-level products, activity monitoring, pulse-profiles, phased event files, spectra, and a unique pulse-profile modeling interface. The library's scientific footprint will expand in 4 key directions: (1) Update, by processing all new XMM-Newton and Chandra observations (2015-2017) of X-ray Binary Pulsars in the Magellanic Clouds. (2) Expand, by including all archival Suzaku, Swift and NuStar observations, and including Galactic pulsars. (3) Improve, by offering innovative data products that provide deeper insight. (4) Advance, by implementing a new generation of physically motivated emission and pulse-profile models. The library currently includes some 2000 individual RXTE-PCA, 200 Chandra ACIS-I, and 120 XMM-PN observations of the SMC spanning 15 years, creating an unrivaled record of pulsar temporal behavior. In Phase-2, additional observations of SMC pulsars will be added: 221 Chandra (ACIS-S and ACIS-I), 22 XMM-PN, 142 XMM-MOS, 92 Suzaku, 25 NuSTAR, and >10,000 Swift; leveraging our pipeline and analysis techniques already developed. With the addition of 7 Galactic pulsars each having many hundred multisatellite observations, these datasets cover the entire range of variability timescales and accretion regimes. We will model the pulse-profiles using state of the art techniques to parameterize their morphology and obtain the distribution of offsets between magnetic and spin axes, and create samples of profiles under specific accretion modes (whether pencil-beam or fan-beam dominated). These products are needed for the next generation of advances in neutron star theory and modeling. 
The long duration of the dataset and the "whole-galaxy" nature of the SMC sample make possible a new statistical approach to uncover the duty-cycle distribution and hence population demographics of transient High-Mass X-ray Binary (HMXB) populations. Our unique library is already fueling progress on fundamental NS parameters and accretion physics.

  13. Shape Memory Micro- and Nanowire Libraries for the High-Throughput Investigation of Scaling Effects.

    PubMed

    Oellers, Tobias; König, Dennis; Kostka, Aleksander; Xie, Shenqie; Brugger, Jürgen; Ludwig, Alfred

    2017-09-11

    The scaling behavior of Ti-Ni-Cu shape memory thin-film micro- and nanowires of different geometry is investigated with respect to its influence on the martensitic transformation properties. Two processes for the high-throughput fabrication of Ti-Ni-Cu micro- to nanoscale thin-film wire libraries and the subsequent investigation of the transformation properties are reported. The libraries are fabricated with compositional and geometrical (wire width) variations to investigate the influence of these parameters on the transformation properties. Notable behaviors were observed: phase transformation temperatures change in the range from 1 to 72 °C (austenite finish, Af) and from 13 to 66 °C (martensite start, Ms), and the thermal hysteresis from -3.5 to 20 K. It is shown that a vanishing hysteresis can be achieved for special combinations of sample geometry and composition.

  14. CARS Spectral Fitting with Multiple Resonant Species using Sparse Libraries

    NASA Technical Reports Server (NTRS)

    Cutler, Andrew D.; Magnotti, Gaetano

    2010-01-01

    The dual-pump CARS technique is often used in the study of turbulent flames. Fast and accurate algorithms are needed for fitting dual-pump CARS spectra for temperature and multiple chemical species. This paper describes the development of such an algorithm. The algorithm employs sparse libraries, whose size grows much more slowly with the number of species than a conventional library. The method was demonstrated by fitting synthetic "experimental" spectra containing four resonant species (N2, O2, H2 and CO2), both with and without noise, and by fitting experimental spectra from a H2-air flame produced by a Hencken burner. In both studies, weighted least-squares fitting of the signal, as opposed to unweighted least-squares fitting of the signal or of its square root, was shown to produce the least random error and to minimize bias error in the fitted parameters.
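    The weighted-versus-unweighted comparison can be illustrated with a toy heteroscedastic fit: a linear model rather than an actual CARS spectrum, with an assumed noise level that scales with the signal, as in shot-noise-limited measurements. Weighting each residual by 1/sigma reduces the random error of the fitted parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "spectrum": a linear signal whose noise std scales with the signal.
    x = np.linspace(0.0, 1.0, 200)
    true_slope, true_icpt = 2.0, 0.5
    signal = true_slope * x + true_icpt
    sigma = 0.05 * signal                   # heteroscedastic noise model

    err_plain, err_w = [], []
    for _ in range(500):                    # repeat the "experiment" many times
        y = signal + rng.normal(0.0, sigma)
        err_plain.append(np.polyfit(x, y, 1)[0] - true_slope)
        # np.polyfit weights multiply the residuals: pass 1/sigma, not 1/sigma**2
        err_w.append(np.polyfit(x, y, 1, w=1.0 / sigma)[0] - true_slope)

    print(np.std(err_plain), np.std(err_w))  # weighted scatter is smaller
    ```

    The weighted estimator is the best linear unbiased one when the per-point noise levels are known, which is why the paper's weighted fit minimizes random error.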

  15. Engineering emergent multicellular behavior through synthetic adhesion

    NASA Astrophysics Data System (ADS)

    Glass, David; Riedel-Kruse, Ingmar

    Over more than a decade, synthetic biology has developed increasingly robust gene networks within single cells, but has constructed very few systems that demonstrate multicellular spatio-temporal dynamics. We are filling this gap in synthetic biology's toolbox by developing an E. coli self-assembly platform based on modular cell-cell adhesion. We developed a system in which adhesive selectivity is provided by a library of outer-membrane-displayed peptides with intra-library specificities, while affinity is provided by consistent expression across the entire library. We further provide a biophysical model to help understand the parameter regimes in which this tool can be used to self-assemble cells into clusters, filaments, or meshes. The combined platform will enable future development of synthetic multicellular systems for use in consortia-based metabolic engineering, in living materials, and in the controlled study of minimal multicellular systems. Stanford Bio-X Bowes Fellowship.

  16. Crystallographic fragment-based drug discovery: use of a brominated fragment library targeting HIV protease.

    PubMed

    Tiefenbrunn, Theresa; Forli, Stefano; Happer, Meaghan; Gonzalez, Ana; Tsai, Yingssu; Soltis, Michael; Elder, John H; Olson, Arthur J; Stout, Charles D

    2014-02-01

    A library of 68 brominated fragments was screened against a new crystal form of inhibited HIV-1 protease in order to probe surface sites in soaking experiments. Often, fragments are weak binders with partial occupancy, resulting in weak, difficult-to-fit electron density. The use of a brominated fragment library addresses this challenge, as bromine can be located unequivocally via anomalous scattering. Data collection was carried out in an automated fashion using AutoDrug at SSRL. Novel hits were identified in the known surface sites: 3-bromo-2,6-dimethoxybenzoic acid (Br6) in the flap site and 1-bromo-2-naphthoic acid (Br27) in the exosite, expanding the chemistry of known fragments for development of higher affinity potential allosteric inhibitors. At the same time, mapping the binding sites of a number of weaker binding Br-fragments provides further insight into the nature of these surface pockets. © 2013 John Wiley & Sons A/S.

  17. Proportionality between Doppler noise and integrated signal path electron density validated by differenced S-X range

    NASA Technical Reports Server (NTRS)

    Berman, A. L.

    1977-01-01

    Observations of Viking differenced S-band/X-band (S-X) range are shown to correlate strongly with Viking Doppler noise. A ratio of proportionality between downlink S-band plasma-induced range error and two-way Doppler noise is calculated. A new parameter (similar to the parameter epsilon, which defines the ratio of local electron density fluctuations to mean electron density) is defined as a function of the observed data sample interval (tau), where the time scale of the observations is 15 tau. This parameter is interpreted to yield the ratio of net observed phase (or electron density) fluctuations to integrated electron density (in RMS meters/meter). Using this parameter and the thin phase-changing screen approximation, a value for the scale size L is calculated. To be consistent with the Doppler noise observations, L must be proportional to the closest-approach distance a and must be a strong function of the observed data sample interval, and hence of the time scale of the observations.

  18. Macromolecular refinement by model morphing using non-atomic parameterizations.

    PubMed

    Cowtan, Kevin; Agirre, Jon

    2018-02-01

    Refinement is a critical step in the determination of a model which explains the crystallographic observations and thus best accounts for the missing phase components. The scattering density is usually described in terms of atomic parameters; however, in macromolecular crystallography the resolution of the data is generally insufficient to determine the values of these parameters for individual atoms. Stereochemical and geometric restraints are used to provide additional information, but produce interrelationships between parameters which slow convergence, resulting in longer refinement times. An alternative approach is proposed in which parameters are not attached to atoms, but to regions of the electron-density map. These parameters can move the density or change the local temperature factor to better explain the structure factors. Varying the size of the region which determines the parameters at a particular position in the map allows the method to be applied at different resolutions without the use of restraints. Potential applications include initial refinement of molecular-replacement models with domain motions, and potentially the use of electron density from other sources such as electron cryo-microscopy (cryo-EM) as the refinement model.

  19. Macroscopically constrained Wang-Landau method for systems with multiple order parameters and its application to drawing complex phase diagrams

    NASA Astrophysics Data System (ADS)

    Chan, C. H.; Brown, G.; Rikvold, P. A.

    2017-05-01

    A generalized approach to Wang-Landau simulations, macroscopically constrained Wang-Landau, is proposed to simulate the density of states of a system with multiple macroscopic order parameters. The method breaks a multidimensional random-walk process in phase space into many separate, one-dimensional random-walk processes in well-defined subspaces. Each of these random walks is constrained to a different set of values of the macroscopic order parameters. When the multivariable density of states is obtained for one set of values of fieldlike model parameters, the density of states for any other values of these parameters can be obtained by a simple transformation of the total system energy. All thermodynamic quantities of the system can then be rapidly calculated at any point in the phase diagram. We demonstrate how to use the multivariable density of states to draw the phase diagram, as well as order-parameter probability distributions at specific phase points, for a model spin-crossover material: an antiferromagnetic Ising model with ferromagnetic long-range interactions. The fieldlike parameters in this model are an effective magnetic field and the strength of the long-range interaction.
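    The single-order-parameter Wang-Landau iteration that the constrained method builds on can be sketched in a few lines. The example below estimates the density of states g(E) of a toy 8-spin Ising ring, for which the exact degeneracies (2, 56, 140, 56, 2) are known; the step counts and the flatness criterion are illustrative choices, not taken from the paper.

    ```python
    import math
    import random

    random.seed(1)

    N = 8                                   # spins on a ring; small enough
    spins = [1] * N                         # that the exact g(E) is known

    def energy(s):
        """E = -sum_i s_i s_{i+1} with periodic boundary conditions."""
        return -sum(s[i] * s[(i + 1) % N] for i in range(N))

    levels = [-8, -4, 0, 4, 8]              # attainable energies for N = 8
    lng = {E: 0.0 for E in levels}          # running estimate of ln g(E)
    hist = {E: 0 for E in levels}           # visit histogram for flatness test

    E = energy(spins)
    lnf = 1.0                               # ln of the modification factor f
    while lnf > 1e-4:                       # refine until f is close to 1
        for _ in range(20000):
            i = random.randrange(N)
            spins[i] *= -1                  # propose a single-spin flip
            E_new = energy(spins)
            # accept with probability min(1, g(E_old)/g(E_new))
            if math.log(random.random() + 1e-300) < lng[E] - lng[E_new]:
                E = E_new
            else:
                spins[i] *= -1              # reject: undo the flip
            lng[E] += lnf                   # penalize the visited level
            hist[E] += 1
        # crude flatness criterion before shrinking the modification factor
        if min(hist.values()) > 0.8 * sum(hist.values()) / len(hist):
            lnf /= 2.0
            hist = {k: 0 for k in hist}

    # normalize the estimate so the total number of states is 2**N
    shift = max(lng.values())
    total = sum(math.exp(v - shift) for v in lng.values())
    g = {E: math.exp(lng[E] - shift) * 2 ** N / total for E in levels}
    print(g)  # exact degeneracies: {-8: 2, -4: 56, 0: 140, 4: 56, 8: 2}
    ```

    The macroscopically constrained variant of the paper replaces this single walk with many such one-dimensional walks, each restricted to fixed values of the macroscopic order parameters.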

  20. Total-energy Assisted Tight-binding Method Based on Local Density Approximation of Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Fujiwara, Takeo; Nishino, Shinya; Yamamoto, Susumu; Suzuki, Takashi; Ikeda, Minoru; Ohtani, Yasuaki

    2018-06-01

    A novel tight-binding method is developed, based on the extended Hückel approximation and charge self-consistency, with reference to the band structure and total energy from the local density approximation of density functional theory. The parameters are adjusted computationally so that the results reproduce the band structure and the total energy, and an algorithm for determining the parameters is established. The resulting set of parameters is applicable to a variety of crystalline compounds and to changes of lattice constants; in other words, it is transferable. Examples are demonstrated for Si crystals of several crystalline structures with varying lattice constants. Since the set of parameters is transferable, the present tight-binding method may also be applicable to molecular dynamics simulations of large-scale systems and long-time dynamical processes.

  1. AGAMA: Action-based galaxy modeling framework

    NASA Astrophysics Data System (ADS)

    Vasiliev, Eugene

    2018-05-01

    The AGAMA library models galaxies. It computes gravitational potential and forces, performs orbit integration and analysis, and can convert between position/velocity and action/angle coordinates. It offers a framework for finding best-fit parameters of a model from data and self-consistent multi-component galaxy models, and contains useful auxiliary utilities such as various mathematical routines. The core of the library is written in C++, and there are Python and Fortran interfaces. AGAMA may be used as a plugin for the stellar-dynamical software packages galpy (ascl:1411.008), AMUSE (ascl:1107.007), and NEMO (ascl:1010.051).

  2. Z-Scan Analysis: a New Method to Determine the Oxidative State of Low-Density Lipoprotein and Its Association with Multiple Cardiometabolic Biomarkers

    NASA Astrophysics Data System (ADS)

    de Freitas, Maria Camila Pruper; Figueiredo Neto, Antonio Martins; Giampaoli, Viviane; da Conceição Quintaneiro Aubin, Elisete; de Araújo Lima Barbosa, Milena Maria; Damasceno, Nágila Raquel Teixeira

    2016-04-01

    The great atherogenic potential of oxidized low-density lipoprotein has been widely described in the literature. The objective of this study was to investigate whether the oxidative state of low-density lipoprotein in human plasma, measured by the Z-scan technique, is associated with different cardiometabolic biomarkers. Total cholesterol, high-density lipoprotein cholesterol, triacylglycerols, apolipoprotein A-I and apolipoprotein B, paraoxonase-1, and glucose were analyzed using standard commercial kits, and low-density lipoprotein cholesterol was estimated using the Friedewald equation. A sandwich enzyme-linked immunosorbent assay was used to detect electronegative low-density lipoprotein. Low-density lipoprotein and high-density lipoprotein sizes were determined by the Lipoprint® system. The Z-scan technique was used to measure the non-linear optical response of the low-density lipoprotein solution. Principal component analysis and correlations were used, respectively, to reduce the dimensionality of the sample data and to test the association between the θ parameter, measured with the Z-scan technique, and the principal components. A total of 63 individuals of both sexes, with a mean age of 52 (±11) years, who were overweight and had high levels of total cholesterol and low levels of high-density lipoprotein cholesterol, were enrolled in this study. A positive correlation was found between the θ parameter and a more anti-atherogenic pattern of cardiometabolic biomarkers, together with a negative correlation with an atherogenic pattern. Regarding the parameters related to an atherogenic low-density lipoprotein profile, the θ parameter was negatively correlated with a more atherogenic pattern. Using Z-scan measurements, we were able to find an association between the oxidized low-density lipoprotein state and multiple cardiometabolic biomarkers in samples from individuals with different cardiovascular risk factors.

  3. Optimization of the Hot Forging Processing Parameters for Powder Metallurgy Fe-Cu-C Connecting Rods Based on Finite Element Simulation

    NASA Astrophysics Data System (ADS)

    Li, Fengxian; Yi, Jianhong; Eckert, Jürgen

    2017-12-01

    Powder forged connecting rods have the problem of non-uniform density distributions because of their complex geometric shape. The densification behaviors of powder metallurgy (PM) connecting rod preforms during hot forging processes play a significant role in optimizing the connecting rod quality. The deformation behaviors of a connecting rod preform, a Fe-3Cu-0.5C (wt pct) alloy compacted and sintered by the powder metallurgy route (PM Fe-Cu-C), were investigated using the finite element method, while damage and friction behaviors of the material were considered in the complicated forging process. The calculated results agree well with the experimental results. The relationship between the processing parameters of hot forging and the relative density of the connecting rod was revealed. The results showed that the relative density of the hot forged connecting rod at the central shank changed significantly compared with the relative density at the big end and at the small end. Moreover, the relative density of the connecting rod was sensitive to the processing parameters such as the forging velocity and the initial density of the preform. The optimum forging processing parameters were determined and presented by using an orthogonal design method. This work suggests that the processing parameters can be optimized to prepare a connecting rod with uniform density distribution and can help to better meet the requirements of the connecting rod industry.

  4. A Computational Study on Porosity Evolution in Parts Produced by Selective Laser Melting

    NASA Astrophysics Data System (ADS)

    Tan, J. L.; Tang, C.; Wong, C. H.

    2018-06-01

    Selective laser melting (SLM) is a powder-bed additive manufacturing process that uses a laser to melt powder, layer by layer, to generate a functional 3D part. There are many different parameters, such as laser power, scanning speed, and layer thickness, which play a role in determining the quality of the printed part. These parameters contribute to the energy density applied to the powder bed. Defects arise when insufficient or excess energy density is applied; a common defect in these cases is porosity. This paper studies the formation of porosities when inappropriate energy densities are used. A computational model was developed to simulate the melting and solidification of SS316L powders in the SLM process. Three different sets of process parameters were used to produce 800-µm-long melt tracks, and the characteristics of the porosities were analyzed. It was found that when low-energy-density parameters were used, the pores were irregular in shape and were located near the top surface of the powder bed. However, when high-energy-density parameters were used, the pores were either elliptical or spherical in shape and were usually located near the bottom of the keyholes.
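    How the process parameters combine into an energy density is commonly expressed as the volumetric energy density E = P / (v · h · t); the sketch below uses this standard metric with hypothetical parameter values, not the three sets from the study.

    ```python
    def volumetric_energy_density(power_w, speed_mm_s, hatch_mm, layer_mm):
        """Commonly used SLM process metric E = P / (v * h * t), in J/mm^3:
        laser power P over scan speed v, hatch spacing h, layer thickness t."""
        return power_w / (speed_mm_s * hatch_mm * layer_mm)

    # three hypothetical parameter sets: low, nominal, and high energy input
    for name, p, v in [("low", 100, 1500), ("nominal", 200, 1000), ("high", 300, 400)]:
        e = volumetric_energy_density(p, v, hatch_mm=0.10, layer_mm=0.03)
        print(f"{name:8s} {e:6.1f} J/mm^3")
    ```

    Under this metric, the low-energy case corresponds to the irregular lack-of-fusion pores near the top surface, and the high-energy case to the keyhole-type pores at depth.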

  5. A framework linkage map of perennial ryegrass based on SSR markers

    Treesearch

    G.P. Gill; P.L. Wilcox; D.J. Whittaker; R.A. Winz; P. Bickerstaff; Craig E. Echt; J. Kent; M.O. Humphreys; K.M. Elborough; R.C. Gardner

    2006-01-01

    A moderate-density linkage map for Lolium perenne L. has been constructed based on 376 simple sequence repeat (SSR) markers. Approximately one third (124) of the SSR markers were developed from GeneThresher libraries that preferentially select genomic DNA clones from the gene-rich unmethylated portion of the genome. The remaining SSR marker loci...

  6. NanoTopoChip: High-throughput nanotopographical cell instruction.

    PubMed

    Hulshof, Frits F B; Zhao, Yiping; Vasilevich, Aliaksei; Beijer, Nick R M; de Boer, Meint; Papenburg, Bernke J; van Blitterswijk, Clemens; Stamatialis, Dimitrios; de Boer, Jan

    2017-10-15

    Surface topography is able to influence cell phenotype in numerous ways and offers opportunities to manipulate cells and tissues. In this work, we develop the Nano-TopoChip and study the cell-instructive effects of nanoscale topographies. A combination of deep-UV projection lithography and conventional lithography was used to fabricate a library of more than 1200 different defined nanotopographies. To illustrate the cell-instructive effects of nanotopography, actin-RFP-labeled U2OS osteosarcoma cells were cultured and imaged on the Nano-TopoChip. Automated image analysis shows that, of the many cell morphological parameters, cell spreading, cell orientation and actin morphology are the most affected by the nanotopographies. Additionally, by using modeling, changes in cell morphological parameters could be predicted from several feature shape parameters, such as lateral size and spacing. This work overcomes the technological challenges of fabricating high-quality defined nanoscale features on unprecedentedly large surface areas of a material relevant for tissue culture, such as polystyrene (PS), and the screening system is able to infer nanotopography-cell morphology relationships. Our screening platform provides opportunities to identify and study nanotopographies with beneficial properties for the culture of various cell types. The nanotopography of biomaterial surfaces can be modified to influence adhering cells, with the aim of improving the performance of medical implants and tissue culture substrates. However, the necessary knowledge of the underlying mechanisms remains incomplete. One reason for this is the limited availability of high-resolution nanotopographies on relevant biomaterials suitable for conducting systematic biological studies. The present study shows the fabrication of a library of nano-sized surface topographies with high fidelity.
The potential of this library, called the 'NanoTopoChip' is shown in a proof of principle HTS study which demonstrates how cells are affected by nanotopographies. The large dataset, acquired by quantitative high-content imaging, allowed us to use predictive modeling to describe how feature dimensions affect cell morphology. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  7. Empirical Performance Model-Driven Data Layout Optimization and Library Call Selection for Tensor Contraction Expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Qingda; Gao, Xiaoyang; Krishnamoorthy, Sriram

    Empirical optimizers like ATLAS have been very effective in optimizing computational kernels in libraries. The best choice of parameters such as tile size and degree of loop unrolling is determined by executing different versions of the computation. In contrast, optimizing compilers use a model-driven approach to program transformation. While the model-driven approach of optimizing compilers is generally orders of magnitude faster than ATLAS-like library generators, its effectiveness can be limited by the accuracy of the performance models used. In this paper, we describe an approach where a class of computations is modeled in terms of constituent operations that are empirically measured, thereby allowing modeling of the overall execution time. The performance model with empirically determined cost components is used to perform data layout optimization together with the selection of library calls and layout transformations in the context of the Tensor Contraction Engine, a compiler for a high-level domain-specific language for expressing computational models in quantum chemistry. The effectiveness of the approach is demonstrated through experimental measurements on representative computations from quantum chemistry.

  8. The WAGGS project - I. The WiFeS Atlas of Galactic Globular cluster Spectra

    NASA Astrophysics Data System (ADS)

    Usher, Christopher; Pastorello, Nicola; Bellstedt, Sabine; Alabi, Adebusola; Cerulo, Pierluigi; Chevalier, Leonie; Fraser-McKelvie, Amelia; Penny, Samantha; Foster, Caroline; McDermid, Richard M.; Schiavon, Ricardo P.; Villaume, Alexa

    2017-07-01

    We present the WiFeS Atlas of Galactic Globular cluster Spectra, a library of integrated spectra of Milky Way and Local Group globular clusters. We used the WiFeS integral field spectrograph on the Australian National University 2.3 m telescope to observe the central regions of 64 Milky Way globular clusters and 22 globular clusters hosted by the Milky Way's low-mass satellite galaxies. The spectra have wider wavelength coverage (3300-9050 Å) and higher spectral resolution (R = 6800) than existing spectral libraries of Milky Way globular clusters. By including Large and Small Magellanic Cloud star clusters, we extend the coverage of parameter space of existing libraries towards young and intermediate ages. While testing stellar population synthesis models and analysis techniques is the main aim of this library, the observations may also further our understanding of the stellar populations of Local Group globular clusters and make possible the direct comparison of extragalactic globular cluster integrated light observations with well-understood globular clusters in the Milky Way. The integrated spectra are publicly available via the project website.

  9. Influence of the volume and density functions within geometric models for estimating trunk inertial parameters.

    PubMed

    Wicke, Jason; Dumas, Genevieve A

    2010-02-01

    The geometric method combines a volume and a density function to estimate body segment parameters and has the best opportunity for developing the most accurate models. In the trunk, there are many different tissues that greatly differ in density (e.g., bone versus lung). Thus, the density function for the trunk must be particularly sensitive to capture this diversity, such that accurate inertial estimates are possible. Three different models were used to test this hypothesis by estimating trunk inertial parameters of 25 female and 24 male college-aged participants. The outcome of this study indicates that the inertial estimates for the upper and lower trunk are most sensitive to the volume function and not very sensitive to the density function. Although it appears that the uniform density function has a greater influence on inertial estimates in the lower trunk region than in the upper trunk region, this is likely due to the (overestimated) density value used. When geometric models are used to estimate body segment parameters, care must be taken in choosing a model that can accurately estimate segment volumes. Researchers wanting to develop accurate geometric models should focus on the volume function, especially in unique populations (e.g., pregnant or obese individuals).
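
The geometric method described above combines a volume function with a density function. A minimal sketch, assuming the trunk is modeled as a stack of elliptical slices with a uniform density (the function name and slice parameterization are illustrative, not the models compared in the study):

```python
import math

def segment_mass(slice_widths, slice_depths, slice_height, density):
    """Geometric model: the volume function stacks elliptical slices
    (width and depth are the ellipse axes), and the density function
    here is uniform, so mass = density * total volume."""
    volume = sum(math.pi * (w / 2) * (d / 2) * slice_height
                 for w, d in zip(slice_widths, slice_depths))
    return density * volume

# Ten identical slices, 3 cm tall, uniform density of 1000 kg/m^3.
mass = segment_mass([0.30] * 10, [0.20] * 10, 0.03, 1000.0)
```

A more sensitive density function would assign each slice its own density (e.g. lower for slices containing lung tissue), which is exactly the refinement the study evaluates.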

  10. Diffraction of cosine-Gaussian-correlated Schell-model beams.

    PubMed

    Pan, Liuzhan; Ding, Chaoliang; Wang, Haixia

    2014-05-19

    The expression for the spectral density of cosine-Gaussian-correlated Schell-model (CGSM) beams diffracted by an aperture is derived and used to study the changes in the spectral density distribution of CGSM beams upon propagation, with emphasis on the effect of aperture diffraction. It is shown that, compared with that of Gaussian Schell-model (GSM) beams, the spectral density distribution of CGSM beams diffracted by an aperture exhibits a dip and shows a dark-hollow intensity distribution when the order parameter n is large enough. The central intensity increases with increasing truncation parameter of the aperture. A comparative study of the spectral density distributions of CGSM beams with and without an aperture is performed. Furthermore, the effect of the order parameter n and the spatial coherence of CGSM beams on the spectral density distribution is discussed in detail. The results obtained may be useful in optical particulate manipulation.

  11. The Herschel Virgo Cluster Survey. XIX. Physical properties of low luminosity FIR sources at z < 0.5

    NASA Astrophysics Data System (ADS)

    Pappalardo, Ciro; Bizzocchi, Luca; Fritz, Jacopo; Boselli, Alessandro; Boquien, Mederic; Boissier, Samuel; Baes, Maarten; Ciesla, Laure; Bianchi, Simone; Clemens, Marcel; Viaene, Sebastien; Bendo, George J.; De Looze, Ilse; Smith, Matthew W. L.; Davies, Jonathan

    2016-05-01

    Context. The star formation rate is a crucial parameter for the investigation of galaxy evolution. At low redshift the cosmic star formation rate density declines smoothly, and massive active galaxies become passive, reducing their star formation activity. This implies that the bulk of the star formation rate density at low redshift is mainly driven by low mass objects. Aims: We investigate the properties of a sample of low luminosity far-infrared sources selected at 250 μm. We have collected data from the ultraviolet to the far-infrared in order to perform a multiwavelength analysis. The main goal is to investigate the correlation between star formation rate, stellar mass, and dust mass for a galaxy population with a wide range in dust content and stellar mass, including the low mass regime that most probably dominates the star formation rate density at low redshift. Methods: We define a main sample of ~800 sources with full spectral energy distribution coverage between 0.15 < λ < 500 μm and an extended sample of ~5000 sources in which we remove the constraints on the ultraviolet and near-infrared bands. We analyze both samples with two different spectral energy distribution fitting methods: MAGPHYS and CIGALE, which interpret a galaxy spectral energy distribution as a combination of different simple stellar population libraries and dust emission templates. Results: In the star formation rate versus stellar mass plane our samples occupy a region between local spirals and higher redshift star forming galaxies. These galaxies represent the population that at z < 0.5 quenches its star formation activity and reduces its contribution to the cosmic star formation rate density. The subsample of galaxies with the highest masses (M∗ > 3 × 10^10 M⊙) does not lie on the main sequence, but shows a small offset as a consequence of the decreased star formation. 
Low mass galaxies (M∗ < 1 × 10^10 M⊙) settle on the main sequence with star formation rate and stellar mass consistent with local spirals. Conclusions: Deep Herschel data allow the identification of a mixed galaxy population, with galaxies still in an assembly phase or galaxies at the beginning of their passive evolution. We find that the dust luminosity is the parameter that allows us to discriminate between these two galaxy populations. The median spectral energy distribution shows that even at low star formation rate our galaxy sample has a higher mid-infrared emission than previously predicted. Herschel is an ESA space observatory with science instruments provided by European-led principal investigator consortia and with important participation from NASA.

  12. A technique for routinely updating the ITU-R database using radio occultation electron density profiles

    NASA Astrophysics Data System (ADS)

    Brunini, Claudio; Azpilicueta, Francisco; Nava, Bruno

    2013-09-01

    Well credited and widely used ionospheric models, such as the International Reference Ionosphere (IRI) or NeQuick, describe the variation of the electron density with height by means of a piecewise profile tied to the F2-peak parameters: the electron density, NmF2, and the height, hmF2. Accurate values of these parameters are crucial for retrieving reliable electron density estimations from those models. When direct measurements of these parameters are not available, the models compute them using the so-called ITU-R database, which was established in the early 1960s. This paper presents a technique aimed at routinely updating the ITU-R database using radio occultation electron density profiles derived from GPS measurements gathered from low Earth orbit satellites. Before being used, these radio occultation profiles are validated by fitting an electron density model to them. A re-weighted Least Squares algorithm is used to down-weight unreliable measurements (occasionally, entire profiles) and to retrieve NmF2 and hmF2 values—together with their error estimates—from the profiles. These values are used to update the database monthly; the database consists of two sets of ITU-R-like coefficients that can easily be implemented in the IRI or NeQuick models. The technique was tested with radio occultation electron density profiles that are delivered to the community by the COSMIC/FORMOSAT-3 mission team. Tests were performed for solstice and equinox seasons in high and low solar activity conditions. The global mean error of the resulting maps—estimated by the Least Squares technique—is equivalent to about 7% of the value of the estimated parameter for the F2-peak electron density, and ranges from 2.0 to 5.6 km for the height (2%).
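
The re-weighted Least Squares idea used above for down-weighting unreliable measurements can be sketched as follows: refit repeatedly, setting each point's weight inversely proportional to its current residual, so outliers lose influence. This is a generic IRLS sketch on a straight-line model, not the authors' profile model.

```python
import numpy as np

def irls_line(x, y, n_iter=20, eps=1e-6):
    """Re-weighted least squares fit of y ~ a + b*x.
    Weights ~ 1/|residual| progressively down-weight outlying points."""
    X = np.column_stack([np.ones_like(x), x])
    w = np.ones_like(y)
    for _ in range(n_iter):
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # weighted normal equations
        r = y - X @ beta
        w = 1.0 / np.maximum(np.abs(r), eps)              # down-weight large residuals
    return beta

x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 0.5 * x
y[10] += 30.0              # one corrupted measurement (an "unreliable profile")
a, b = irls_line(x, y)     # recovers intercept ~2 and slope ~0.5 despite the outlier
```

An ordinary least squares fit of the same data would be visibly pulled toward the corrupted point; the re-weighting suppresses it.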

  13. Exciton scattering approach for optical spectra calculations in branched conjugated macromolecules

    NASA Astrophysics Data System (ADS)

    Li, Hao; Wu, Chao; Malinin, Sergey V.; Tretiak, Sergei; Chernyak, Vladimir Y.

    2016-12-01

    The exciton scattering (ES) technique is a multiscale approach based on the concept of a particle in a box and developed for efficient calculations of excited-state electronic structure and optical spectra in low-dimensional conjugated macromolecules. Within the ES method, electronic excitations in a molecular structure are attributed to standing waves representing quantum quasi-particles (excitons), which reside on a graph whose edges and nodes stand for the molecular linear segments and vertices, respectively. Exciton propagation on the linear segments is characterized by the exciton dispersion, whereas exciton scattering at the branching centers is determined by energy-dependent scattering matrices. Using these ES energetic parameters, the excitation energies are then found by solving a set of generalized "particle in a box" problems on the graph that represents the molecule. Similarly, unique energy-dependent ES dipolar parameters permit calculation of the corresponding oscillator strengths, thus completing the optical spectra modeling. Both the energetic and dipolar parameters can be extracted from quantum-chemical computations on small molecular fragments and tabulated in the ES library for further applications. Subsequently, spectroscopic modeling for any macrostructure within a considered molecular family can be performed with negligible numerical effort. We demonstrate the application of the ES method to molecular families of branched conjugated phenylacetylenes and ladder poly-para-phenylenes, as well as structures with electron donor and acceptor chemical substituents. Time-dependent density functional theory (TD-DFT) is used as a reference model for the electronic structure. The ES calculations accurately reproduce the optical spectra compared to the reference quantum chemistry results, and make it possible to predict spectra of complex macromolecules for which conventional electronic structure calculations are unfeasible.

  14. Integrated platform for genome-wide screening and construction of high-density genetic interaction maps in mammalian cells

    PubMed Central

    Kampmann, Martin; Bassik, Michael C.; Weissman, Jonathan S.

    2013-01-01

    A major challenge of the postgenomic era is to understand how human genes function together in normal and disease states. In microorganisms, high-density genetic interaction (GI) maps are a powerful tool to elucidate gene functions and pathways. We have developed an integrated methodology based on pooled shRNA screening in mammalian cells for genome-wide identification of genes with relevant phenotypes and systematic mapping of all GIs among them. We recently demonstrated the potential of this approach in an application to pathways controlling the susceptibility of human cells to the toxin ricin. Here we present the complete quantitative framework underlying our strategy, including experimental design, derivation of quantitative phenotypes from pooled screens, robust identification of hit genes using ultra-complex shRNA libraries, parallel measurement of tens of thousands of GIs from a single double-shRNA experiment, and construction of GI maps. We describe the general applicability of our strategy. Our pooled approach enables rapid screening of the same shRNA library in different cell lines and under different conditions to determine a range of different phenotypes. We illustrate this strategy here for single- and double-shRNA libraries. We compare the roles of genes for susceptibility to ricin and Shiga toxin in different human cell lines and reveal both toxin-specific and cell line-specific pathways. We also present GI maps based on growth and ricin-resistance phenotypes, and we demonstrate how such a comparative GI mapping strategy enables functional dissection of physical complexes and context-dependent pathways. PMID:23739767

  15. A method to describe inelastic gamma field distribution in neutron gamma density logging.

    PubMed

    Zhang, Feng; Zhang, Quanying; Liu, Juntao; Wang, Xinguang; Wu, He; Jia, Wenbao; Ti, Yongzhou; Qiu, Fei; Zhang, Xiaoyang

    2017-11-01

    Pulsed neutron gamma density logging (NGD) is of great significance for radioprotection and density measurement in logging-while-drilling (LWD); however, current methods have difficulty with quantitative calculation and single-factor analysis of the inelastic gamma field distribution. In order to clarify the NGD mechanism, a new method is developed to describe the inelastic gamma field distribution. Based on fast-neutron scattering and gamma attenuation, the inelastic gamma field distribution is characterized by the inelastic scattering cross section, fast-neutron scattering free path, formation density and other parameters, and the contribution of the formation parameters to the field distribution is quantitatively analyzed. The results show that the contribution of density attenuation is opposite to that of the inelastic scattering cross section and fast-neutron scattering free path, and that as the detector spacing increases, density attenuation gradually plays a dominant role in the gamma field distribution, which means a large detector spacing is more favorable for the density measurement. In addition, the relationship between density sensitivity and detector spacing was studied according to this gamma field distribution, and the spacing of the near and far gamma ray detectors was thereby determined. The research provides theoretical guidance for tool parameter design and density determination in the pulsed neutron gamma density logging technique. Copyright © 2017 Elsevier Ltd. All rights reserved.
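
The competing contributions described above (a neutron-transport term building the gamma source, a density-attenuation term removing counts) can be illustrated with a toy response model. The functional form and all parameter values below are assumptions for illustration, not the paper's characterization; in this toy exponential model the density sensitivity grows linearly with detector spacing, consistent with the abstract's conclusion that larger spacing favors density measurement.

```python
import math

def gamma_count(spacing, rho, sigma_in=0.1, free_path=10.0, mu=0.05):
    """Hypothetical NGD count rate at a detector: the inelastic gamma source
    scales with the inelastic cross section and decays with the fast-neutron
    scattering free path, while gamma attenuation scales with density."""
    return sigma_in * math.exp(-spacing / free_path) * math.exp(-mu * rho * spacing)

def density_sensitivity(spacing, rho, drho=0.01):
    """Relative change in count rate per unit change in formation density."""
    c0 = gamma_count(spacing, rho)
    c1 = gamma_count(spacing, rho + drho)
    return abs(math.log(c1 / c0)) / drho

near = density_sensitivity(30.0, 2.5)   # near detector (cm, g/cm^3; illustrative)
far = density_sensitivity(60.0, 2.5)    # far detector is twice as sensitive here
```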

  16. Metrology of deep trench etched memory structures using 3D scatterometry

    NASA Astrophysics Data System (ADS)

    Reinig, Peter; Dost, Rene; Moert, Manfred; Hingst, Thomas; Mantz, Ulrich; Moffitt, Jasen; Shakya, Sushil; Raymond, Christopher J.; Littau, Mike

    2005-05-01

    Scatterometry is receiving considerable attention as an emerging optical metrology in the silicon industry. One area of progress in deploying these powerful measurements in process control is performing measurements on real device structures, as opposed to limiting scatterometry measurements to periodic structures, such as line-space gratings, placed in the wafer scribe. In this work we will discuss applications of 3D scatterometry to the measurement of advanced trench memory devices. This is a challenging and complex scatterometry application that requires exceptionally high-performance computational abilities. In order to represent the physical device, the relatively tall structures require a high number of slices in the rigorous coupled wave analysis (RCWA) theoretical model. This is complicated further by the presence of an amorphous silicon hard mask on the surface, which is highly sensitive to reflectance scattering and therefore needs to be modeled in detail. The overall structure is comprised of several layers, with the trenches presenting a complex bow-shape sidewall that must be measured. Finally, the double periodicity in the structures demands significantly greater computational capabilities. Our results demonstrate that angular scatterometry is sensitive to the key parameters of interest. The influence of further model parameters and parameter cross correlations have to be carefully taken into account. Profile results obtained by non-library optimization methods compare favorably with cross-section SEM images. Generating a model library suitable for process control, which is preferred for precision, presents numerical throughput challenges. Details will be discussed regarding library generation approaches and strategies for reducing the numerical overhead. Scatterometry and SEM results will be compared, leading to conclusions about the feasibility of this advanced application.

  17. Effect of q-nonextensive parameter and saturation time on electron density steepening in electron-positron-ion plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hashemzadeh, M., E-mail: hashemzade@gmail.com

    2015-11-15

    The effect of the q-nonextensive parameter and saturation time on electron density steepening in electron-positron-ion plasmas is studied by the particle-in-cell method. Phase space diagrams show that the size of the holes, and consequently the number of trapped particles, strongly depends on the q-parameter and saturation time. Furthermore, the mechanism of the instability and the exchange of energy between electron-positron pairs and the electric field is explained by the profiles of the energy density. Moreover, it is found that the q-parameter, saturation time, and electron and positron velocities affect the nonlinear evolution of the electron density, which leads to the steepening of its structure. The q-nonextensive parameter, or degree of nonextensivity, relates the temperature gradient to the potential energy of the system. Therefore, the deviation of the q-parameter from unity indicates the degree of inhomogeneity of the temperature, or deviation from equilibrium. Finally, using kinetic theory, a generalized q-dispersion relation is presented for electron-positron-ion plasma systems. It is found that the simulation results in the linear regime are in good agreement with the growth rate results obtained from kinetic theory.

  18. Theoretical Calculation of the Electron Transport Parameters and Energy Distribution Function for CF3I with noble gases mixtures using Monte Carlo simulation program

    NASA Astrophysics Data System (ADS)

    Jawad, Enas A.

    2018-05-01

    In this paper, a Monte Carlo simulation program has been used to calculate the electron energy distribution function (EEDF) and electron transport parameters for gas mixtures of trifluoroiodomethane (CF3I), an environmentally friendly gas, with the noble gases argon, helium, krypton, neon and xenon. The electron transport parameters are assessed in the range of E/N (where E is the electric field and N is the number density of background gas molecules) between 100 and 2000 Td (1 Townsend = 10^-17 V cm2) at room temperature. These parameters are the electron mean energy (ε), the density-normalized longitudinal diffusion coefficient (NDL) and the density-normalized mobility (μN). The impact of CF3I in the noble gas mixtures is strongly apparent in the values of the electron mean energy, the density-normalized longitudinal diffusion coefficient and the density-normalized mobility. The calculated results agree well with experimental results.

  19. Surface density: a new parameter in the fundamental metallicity relation of star-forming galaxies

    NASA Astrophysics Data System (ADS)

    Hashimoto, Tetsuya; Goto, Tomotsugu; Momose, Rieko

    2018-04-01

    Star-forming galaxies display a close relation among stellar mass, metallicity, and star formation rate (or molecular-gas mass). This is known as the fundamental metallicity relation (FMR) (or molecular-gas FMR), and it has a profound implication on models of galaxy evolution. However, there still remains a significant residual scatter around the FMR. We show here that a fourth parameter, the surface density of stellar mass, reduces the dispersion around the molecular-gas FMR. In a principal component analysis of 29 physical parameters of 41 338 star-forming galaxies, the surface density of stellar mass is found to be the fourth most important parameter. The new 4D fundamental relation forms a tighter hypersurface that reduces the metallicity dispersion to 50 per cent of that of the molecular-gas FMR. We suggest that future analyses and models of galaxy evolution should consider the FMR in a 4D space that includes surface density. The dilution time-scale of gas inflow and the star-formation efficiency could explain the observational dependence on surface density of stellar mass.
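
The ranking step described above, a principal component analysis over many physical parameters, can be sketched on synthetic data. The three-column "catalogue" below is an illustrative stand-in (two correlated parameters plus one unrelated one), not the 29-parameter sample of the paper; PCA then reports how much variance each component explains.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
latent = rng.normal(size=n)                   # hidden driver of two observables
data = np.column_stack([
    latent + 0.1 * rng.normal(size=n),        # e.g. stellar mass (illustrative)
    0.8 * latent + 0.3 * rng.normal(size=n),  # e.g. SFR, correlated with mass
    rng.normal(size=n),                       # an unrelated parameter
])

# Standardize each parameter, then diagonalize the correlation matrix (PCA).
z = (data - data.mean(axis=0)) / data.std(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(z, rowvar=False))
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()    # variance fraction per component
```

In the paper's analysis the loadings of each parameter on the leading components are what single out surface density of stellar mass as the fourth most important parameter.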

  20. Joint constraints on galaxy bias and σ{sub 8} through the N-pdf of the galaxy number density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnalte-Mur, Pablo; Martínez, Vicent J.; Vielva, Patricio

    We present a full description of the N-probability density function of the galaxy number density fluctuations. This N-pdf is given in terms, on the one hand, of the cold dark matter correlations and, on the other hand, of the galaxy bias parameter. The method relies on the commonly adopted assumption that the dark matter density fluctuations follow a local non-linear transformation of the initial energy density perturbations. The N-pdf of the galaxy number density fluctuations allows for an optimal estimation of the bias parameter (e.g., via maximum-likelihood estimation, or Bayesian inference if there exists any a priori information on the bias parameter), and of those parameters defining the dark matter correlations, in particular its amplitude (σ{sub 8}). It also provides the proper framework to perform model selection between two competitive hypotheses. The parameter estimation capabilities of the N-pdf are proved by SDSS-like simulations (both ideal log-normal simulations and mocks obtained from Las Damas simulations), showing that our estimator is unbiased. We apply our formalism to the 7th release of the SDSS main sample (for a volume-limited subset with absolute magnitudes M{sub r} ≤ −20). We obtain b̂ = 1.193 ± 0.074 and σ̂{sub 8} = 0.862 ± 0.080, for galaxy number density fluctuations in cells of the size of 30h{sup −1} Mpc. Different model selection criteria show that galaxy biasing is clearly favoured.
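
The maximum-likelihood route mentioned above can be illustrated with a drastically simplified toy version of the problem: if galaxy fluctuations were just a linear bias times the matter fluctuations plus Gaussian noise, the likelihood in b would be Gaussian and its maximizer closed-form. The model, noise level and sample below are assumptions for illustration, not the paper's N-pdf.

```python
import numpy as np

def neg_log_like(b, delta_g, delta_m, sigma_n=0.1):
    """Toy Gaussian likelihood: delta_g = b * delta_m + noise(sigma_n)."""
    r = delta_g - b * delta_m
    return 0.5 * np.sum(r**2) / sigma_n**2

rng = np.random.default_rng(1)
delta_m = rng.normal(0.0, 0.5, size=2000)                 # "matter" fluctuations
delta_g = 1.2 * delta_m + rng.normal(0.0, 0.1, size=2000) # biased "galaxy" field

# For this Gaussian toy model the ML estimate of the bias is closed-form:
b_hat = np.sum(delta_g * delta_m) / np.sum(delta_m**2)
```

The actual N-pdf of the paper is non-Gaussian (a local non-linear transformation of the initial perturbations), so the real estimator has no such closed form, but the principle, maximizing the pdf over b and σ{sub 8}, is the same.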

  1. Exact and Approximate Statistical Inference for Nonlinear Regression and the Estimating Equation Approach.

    PubMed

    Demidenko, Eugene

    2017-09-01

    The exact density distribution of the nonlinear least squares estimator in the one-parameter regression model is derived in closed form and expressed through the cumulative distribution function of the standard normal variable. Several proposals to generalize this result are discussed. The exact density is extended to the estimating equation (EE) approach and the nonlinear regression with an arbitrary number of linear parameters and one intrinsically nonlinear parameter. For a very special nonlinear regression model, the derived density coincides with the distribution of the ratio of two normally distributed random variables previously obtained by Fieller (1932), unlike other approximations previously suggested by other authors. Approximations to the density of the EE estimators are discussed in the multivariate case. Numerical complications associated with the nonlinear least squares are illustrated, such as nonexistence and/or multiple solutions, as major factors contributing to poor density approximation. The nonlinear Markov-Gauss theorem is formulated based on the near exact EE density approximation.
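
The Fieller (1932) special case mentioned above, the ratio of two normally distributed variables, is easy to probe numerically: for two independent zero-mean, unit-variance normals the ratio follows a standard Cauchy distribution, so P(|X/Y| < 1) = 1/2 exactly. A quick Monte Carlo check (the sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = rng.standard_normal(200_000)
ratio = x / y                                  # standard Cauchy distributed

# The Cauchy CDF gives P(|ratio| < 1) = (2/pi) * arctan(1) = 0.5 exactly.
frac_inside = np.mean(np.abs(ratio) < 1.0)
```

The heavy tails of this exact density are precisely why normal approximations to nonlinear least squares estimators can fail badly, the theme of the paper.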

  2. Mathematical modeling of a thermovoltaic cell

    NASA Technical Reports Server (NTRS)

    White, Ralph E.; Kawanami, Makoto

    1992-01-01

    A new type of battery named 'Vaporvolt' cell is in the early stage of its development. A mathematical model of a CuO/Cu 'Vaporvolt' cell is presented that can be used to predict the potential and the transport behavior of the cell during discharge. A sensitivity analysis of the various transport and electrokinetic parameters indicates which parameters have the most influence on the predicted energy and power density of the 'Vaporvolt' cell. This information can be used to decide which parameters should be optimized or determined more accurately through further modeling or experimental studies. The optimal thicknesses of electrodes and separator, the concentration of the electrolyte, and the current density are determined by maximizing the power density. These parameter sensitivities and optimal design parameter values will help in the development of a better CuO/Cu 'Vaporvolt' cell.

  3. Effect of the medium's density on the hydrocyclonic separation of waste plastics with different densities.

    PubMed

    Fu, Shuangcheng; Fang, Yong; Yuan, Huixin; Tan, Wanjiang; Dong, Yiwen

    2017-09-01

    Hydrocyclones can be applied to recycle waste plastics of different densities by separating the plastics based on their density differences. In this process, the medium density is one of the key parameters, and its optimum value is not simply the average of the densities of the two kinds of plastics being separated. Based on a force analysis and the equation of motion of particles in the hydrocyclone, a formula to calculate the optimum separation medium density has been derived. This optimum medium density is a function of various parameters including the diameter, density, radial position and tangential velocity of the particles, and the viscosity of the medium. Tests of the separation performance of the hydrocyclone have been conducted with PET and PVC particles. The theoretical result is in good agreement with the experimental results. Copyright © 2017 Elsevier Ltd. All rights reserved.
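
Why the optimum medium density is not a plain average can be seen from a simplified force-balance argument. Assuming Stokes drag in the centrifugal field and requiring the two particle species to drift radially at equal and opposite rates, d1²(ρ1 − ρm) = −d2²(ρ2 − ρm), gives a diameter-weighted density. This condition is a simplification of the paper's derivation (which also involves radial position, tangential velocity and medium viscosity); the numbers below are illustrative.

```python
def optimum_medium_density(d1, rho1, d2, rho2):
    """Equal-and-opposite Stokes drift condition in a centrifugal field:
    d1^2 * (rho1 - rho_m) = -d2^2 * (rho2 - rho_m).
    Solving for rho_m gives a diameter^2-weighted mean of the densities."""
    return (d1**2 * rho1 + d2**2 * rho2) / (d1**2 + d2**2)

# Equal particle sizes reduce to the plain average of the two densities...
rho_equal = optimum_medium_density(1.0, 1380.0, 1.0, 1400.0)   # kg/m^3
# ...but unequal sizes skew the optimum toward the larger particles' density.
rho_skewed = optimum_medium_density(2.0, 1380.0, 1.0, 1400.0)
```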

  4. Quantitative photoacoustic imaging in the acoustic regime using SPIM

    NASA Astrophysics Data System (ADS)

    Beigl, Alexander; Elbau, Peter; Sadiq, Kamran; Scherzer, Otmar

    2018-05-01

    While in standard photoacoustic imaging the propagation of sound waves is modeled by the standard wave equation, our approach is based on a generalized wave equation with variable sound speed and material density. In this paper we present an approach for photoacoustic imaging which, in addition to recovering the absorption density parameter, the imaging parameter of standard photoacoustics, also allows us to reconstruct the spatially varying sound speed and density of the medium. We provide analytical reconstruction formulas for all three parameters in a linearized model, based on single plane illumination microscopy (SPIM) techniques.

  5. Optimizing Power Density and Efficiency of a Double-Halbach Array Permanent-Magnet Ironless Axial-Flux Motor

    NASA Technical Reports Server (NTRS)

    Duffy, Kirsten P.

    2016-01-01

    NASA Glenn Research Center is investigating hybrid electric and turboelectric propulsion concepts for future aircraft to reduce fuel burn, emissions, and noise. Systems studies show that the weight and efficiency of the electric system components need to be improved for this concept to be feasible. This effort aims to identify design parameters that affect power density and efficiency for a double-Halbach array permanent-magnet ironless axial flux motor configuration. These parameters include both geometrical and higher-order parameters, including pole count, rotor speed, current density, and geometries of the magnets, windings, and air gap.

  6. The Structure of Dark Matter Halos in Dwarf Galaxies

    NASA Astrophysics Data System (ADS)

    Burkert, A.

    1995-07-01

    Recent observations indicate that dark matter halos have flat central density profiles. Cosmological simulations with nonbaryonic dark matter, however, predict self-similar halos with central density cusps. This contradiction has led to the conclusion that dark matter must be baryonic. Here it is shown that the dark matter halos of dwarf spiral galaxies represent a one-parameter family with self-similar density profiles. The observed global halo parameters are coupled with each other through simple scaling relations which can be explained by the standard cold dark matter model if one assumes that all the halos formed from density fluctuations with the same primordial amplitude. We find that the finite central halo densities correlate with the other global parameters. This result rules out scenarios where the flat halo cores formed subsequently through violent dynamical processes in the baryonic component. These cores instead provide important information on the origin and nature of dark matter in dwarf galaxies.

  7. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
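
The two-step search described above, a cheap low-fidelity model pruning candidate branches before an expensive model refines the survivors, can be sketched generically. The part library, both scoring functions and the pruning fraction below are hypothetical stand-ins, not the paper's characterized models.

```python
import itertools

# Hypothetical two-level part library: each circuit is one promoter strength
# paired with one RBS strength.
parts = {"promoter": [1.0, 2.0, 3.0], "rbs": [0.5, 1.0, 1.5]}

def coarse(p, r):
    """Fast, low-complexity surrogate used to bound branch quality."""
    return p * r

def fine(p, r):
    """Slower, nonlinear model (stand-in) used only on surviving branches."""
    return p * r - 0.1 * (p - r) ** 2

candidates = list(itertools.product(parts["promoter"], parts["rbs"]))

# Step 1: rank branches by the coarse bound and prune the weaker half.
scored = sorted(candidates, key=lambda c: coarse(*c), reverse=True)
survivors = scored[: len(scored) // 2]

# Step 2: fine-grained evaluation restricted to the reduced solution space.
best = max(survivors, key=lambda c: fine(*c))
```

The speed-up comes from the exponential shrinkage of the space the expensive model ever sees; branch-and-bound additionally guarantees no pruned branch could beat the surviving ones when the coarse model is a true upper bound.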

  8. Design and implementation of a cloud based lithography illumination pupil processing application

    NASA Astrophysics Data System (ADS)

    Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie

    2017-02-01

    Pupil parameters are important parameters for evaluating the quality of a lithography illumination system. In this paper, a cloud based full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the WebSocket protocol and JSON format are used for communication between the client and the server, and the computing part is implemented on the server side, where the application integrates a variety of high quality professional libraries, such as the image processing libraries libvips and ImageMagick and the automatic reporting system LaTeX, to support the program. The cloud based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared to the traditional software model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud based approach, with no installation and easy use and maintenance, opens up a new way. Cloud based applications may well be the future of software development.

  9. Gestational dyslipidaemia and adverse birthweight outcomes: a systematic review and meta-analysis.

    PubMed

    Wang, J; Moore, D; Subramanian, A; Cheng, K K; Toulis, K A; Qiu, X; Saravanan, P; Price, M J; Nirantharakumar, K

    2018-05-21

    Low and high birthweight are known to increase the risk of acute and longer-term adverse outcomes, such as stillbirth, infant mortality, obesity, type 2 diabetes and cardiovascular diseases. Gestational dyslipidaemia is associated with a number of adverse birth outcomes, but evidence regarding birthweight is still too inconsistent to reliably inform clinical practice and treatment recommendations. The aim of this study was to explore the relationship between maternal gestational dyslipidaemia and neonatal health outcomes, namely, birthweight, metabolic factors and inflammatory parameters. We systematically searched Embase, MEDLINE, PubMed, CINAHL Plus and the Cochrane Library up to 1 August 2016 (with an updated search in MEDLINE at the end of July 2017) for longitudinal studies that assessed the association of maternal lipid levels during pregnancy with neonatal birthweight, or metabolic and inflammatory parameters up to 3 years old. Data from 46 publications including 31,402 pregnancies suggest that maternal high triglyceride and low high-density-lipoprotein cholesterol levels throughout pregnancy are associated with increased birthweight, higher risk of large-for-gestational-age and macrosomia, and lower risk of small-for-gestational-age. The findings were consistent across the studied populations, but stronger associations were observed in women who were overweight or obese prior to pregnancy. This meta-analysis suggests that the potentially under-recognized adverse effects of intrauterine exposure to maternal dyslipidaemia warrant further investigation of the relationship between maternal dyslipidaemia and birthweight in large prospective cohorts or in randomized trials. © 2018 World Obesity Federation.

  10. A theoretical-electron-density databank using a model of real and virtual spherical atoms.

    PubMed

    Nassour, Ayoub; Domagala, Slawomir; Guillot, Benoit; Leduc, Theo; Lecomte, Claude; Jelsch, Christian

    2017-08-01

    A database describing the electron density of common chemical groups using combinations of real and virtual spherical atoms is proposed, as an alternative to the multipolar atom modelling of the molecular charge density. Theoretical structure factors were computed from periodic density functional theory calculations on 38 crystal structures of small molecules and the charge density was subsequently refined using a density model based on real spherical atoms and additional dummy charges on the covalent bonds and on electron lone-pair sites. The electron-density parameters of real and dummy atoms present in a similar chemical environment were averaged on all the molecules studied to build a database of transferable spherical atoms. Compared with the now-popular databases of transferable multipolar parameters, the spherical charge modelling needs fewer parameters to describe the molecular electron density and can be more easily incorporated in molecular modelling software for the computation of electrostatic properties. The construction method of the database is described. In order to analyse to what extent this modelling method can be used to derive meaningful molecular properties, it has been applied to the urea molecule and to biotin/streptavidin, a protein/ligand complex.

  11. Inference of reaction rate parameters based on summary statistics from experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khalil, Mohammad; Chowdhary, Kamaljit Singh; Safta, Cosmin

    Here, we present the results of an application of Bayesian inference and maximum entropy methods for the estimation of the joint probability density of the Arrhenius rate parameters of the rate coefficient of the H2/O2-mechanism chain-branching reaction H + O2 → OH + O. Available published data are in the form of summary statistics, namely nominal values and error bars of the rate coefficient of this reaction at a number of temperatures, obtained from shock-tube experiments. Our approach relies on generating data, in this case OH concentration profiles, consistent with the given summary statistics, using Approximate Bayesian Computation methods and a Markov chain Monte Carlo procedure. The approach permits the forward propagation of parametric uncertainty through the computational model in a manner that is consistent with the published statistics. A consensus joint posterior on the parameters is obtained by pooling the posterior parameter densities given each consistent data set. To expedite this process, we construct efficient surrogates for the OH concentration using a combination of Padé and polynomial approximants. These surrogate models adequately represent forward-model observables and their dependence on input parameters and are computationally efficient enough to allow their use in the Bayesian inference procedure. We also utilize Gauss-Hermite quadrature with Gaussian proposal probability density functions for moment computation, resulting in orders-of-magnitude speedup in data likelihood evaluation. Despite the strong non-linearity in the model, the consistent data sets all result in nearly Gaussian conditional parameter probability density functions. The technique also accounts for nuisance parameters in the form of Arrhenius parameters of other rate coefficients with prescribed uncertainty.
    The resulting pooled parameter probability density function is propagated through stoichiometric hydrogen-air auto-ignition computations to illustrate the need to account for correlation among the Arrhenius rate parameters of one reaction and across rate parameters of different reactions.
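For orientation, the inferred parameters enter the modified Arrhenius form k(T) = A T^n exp(-Ea/(R T)). The sketch below evaluates it with illustrative parameter values of roughly the right order for H + O2 → OH + O; these are not the posterior estimates from the study.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(T, A=3.5e16, n=-0.7, Ea=71.4e3):
    """Modified Arrhenius rate coefficient k(T) = A * T**n * exp(-Ea / (R*T)).

    Default parameter values are illustrative (roughly the right order for
    H + O2 -> OH + O), not the evaluated values from the study.
    """
    return A * T**n * math.exp(-Ea / (R * T))

# The paper's summary statistics are nominal k values with error bars at a
# set of shock-tube temperatures; here we simply evaluate k at a few of them.
for T in (1000.0, 1500.0, 2000.0):
    print(f"T = {T:6.0f} K   k(T) = {arrhenius(T):.3e}")
```

Because the data constrain k(T) only through such pointwise statistics, the joint posterior over (A, n, Ea) is strongly correlated, which is exactly the correlation the pooled density above is meant to capture.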

  12. Inference of reaction rate parameters based on summary statistics from experiments

    DOE PAGES

    Khalil, Mohammad; Chowdhary, Kamaljit Singh; Safta, Cosmin; ...

    2016-10-15

    Here, we present the results of an application of Bayesian inference and maximum entropy methods for the estimation of the joint probability density of the Arrhenius rate parameters of the rate coefficient of the H2/O2-mechanism chain-branching reaction H + O2 → OH + O. Available published data are in the form of summary statistics, namely nominal values and error bars of the rate coefficient of this reaction at a number of temperatures, obtained from shock-tube experiments. Our approach relies on generating data, in this case OH concentration profiles, consistent with the given summary statistics, using Approximate Bayesian Computation methods and a Markov chain Monte Carlo procedure. The approach permits the forward propagation of parametric uncertainty through the computational model in a manner that is consistent with the published statistics. A consensus joint posterior on the parameters is obtained by pooling the posterior parameter densities given each consistent data set. To expedite this process, we construct efficient surrogates for the OH concentration using a combination of Padé and polynomial approximants. These surrogate models adequately represent forward-model observables and their dependence on input parameters and are computationally efficient enough to allow their use in the Bayesian inference procedure. We also utilize Gauss-Hermite quadrature with Gaussian proposal probability density functions for moment computation, resulting in orders-of-magnitude speedup in data likelihood evaluation. Despite the strong non-linearity in the model, the consistent data sets all result in nearly Gaussian conditional parameter probability density functions. The technique also accounts for nuisance parameters in the form of Arrhenius parameters of other rate coefficients with prescribed uncertainty.
    The resulting pooled parameter probability density function is propagated through stoichiometric hydrogen-air auto-ignition computations to illustrate the need to account for correlation among the Arrhenius rate parameters of one reaction and across rate parameters of different reactions.

  13. All about Properties of Matter. Physical Science for Children[TM]. Schlessinger Science Library. [Videotape].

    ERIC Educational Resources Information Center

    2000

    Most children know how to describe an object--by color, size, and shape. Here they'll learn that all objects are made of matter and that all matter can be described with basic scientific properties--mass, weight, volume and density. Each of these properties is described using fun, real-life examples. With clear illustrations and hands-on…

  14. Properties of Matter. Physical Science in Action[TM]. Schlessinger Science Library. [Videotape].

    ERIC Educational Resources Information Center

    2000

    Everything in the universe consists of matter. So how can the differences between various types of matter be distinguished? Besides color, shape and size, there are more detailed properties that are used to define matter. Mass, weight, volume and density are all related to tell a great deal about an object or substance. Students will learn about…

  15. Changes in Properties of Matter. Physical Science in Action[TM]. Schlessinger Science Library. [Videotape].

    ERIC Educational Resources Information Center

    2000

    All matter possesses certain properties--mass, weight, volume and density. But what happens to these properties when the matter changes form? How does wood become ash when it burns? And how does ice cream change when it melts? Students will learn the difference between chemical and physical changes in this excellent introduction to the changes of…

  16. The U.S. national nuclear forensics library, nuclear materials information program, and data dictionary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamont, Stephen Philip; Brisson, Marcia; Curry, Michael

    2011-02-17

    Nuclear forensics assessments to determine material process history require careful comparison of sample data to both measured and modeled nuclear material characteristics. Developing centralized databases, or nuclear forensics libraries, to house this information is an important step to ensure all relevant data will be available for comparison during a nuclear forensics analysis and to help expedite the assessment of material history. The approach most widely accepted by the international community at this time is the implementation of National Nuclear Forensics libraries, which would be developed and maintained by individual nations. This is an attractive alternative to an international database since it provides an understanding that each country has data on materials produced and stored within its borders, but eliminates the need to reveal any proprietary or sensitive information to other nations. To support the concept of National Nuclear Forensics libraries, the United States Department of Energy has developed a model library based on a data dictionary, or set of parameters, designed to capture all nuclear-forensics-relevant information about a nuclear material. Specifically, the information includes material identification, collection background and current location, analytical laboratories where measurements were made, material packaging and container descriptions, physical characteristics including mass and dimensions, chemical and isotopic characteristics, particle morphology or metallurgical properties, process history including facilities, and measurement quality assurance information. While not necessarily required, it may also be valuable to store modeled data sets including reactor burn-up or enrichment cascade data for comparison.
    It is fully expected that only a subset of this information is available or relevant for many materials, and much of the data populating a National Nuclear Forensics library would be process analytical or material accountability measurement data as opposed to a complete forensic analysis of each material in the library.

  17. PySpike - A Python library for analyzing spike train synchrony

    NASA Astrophysics Data System (ADS)

    Mulansky, Mario; Kreuz, Thomas

    Understanding how the brain functions is one of the biggest challenges of our time. The analysis of experimentally recorded neural firing patterns (spike trains) plays a crucial role in addressing this problem. Here, the PySpike library is introduced, a Python package for spike train analysis providing parameter-free and time-scale-independent measures of spike train synchrony. It allows users to compute similarity and dissimilarity profiles, averaged values and distance matrices. Although mainly focused on neuroscience, PySpike can also be applied in other contexts like climate research or the social sciences. The package is available as open source on GitHub and PyPI.
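The ISI-distance idea behind such parameter-free measures can be sketched in a few lines of plain Python (a conceptual illustration, not PySpike's actual API): at each time point, compare the current interspike intervals of the two trains and average the normalized difference, so no time scale has to be chosen by the user.

```python
import bisect

def current_isi(spikes, t):
    # Length of the inter-spike interval containing time t
    # (spike times must be sorted and bracket t).
    i = bisect.bisect_right(spikes, t)
    return spikes[i] - spikes[i - 1]

def isi_distance(spikes1, spikes2, t0, t1, samples=1000):
    # Time average of |I1 - I2| / max(I1, I2): zero for identical trains,
    # approaching 1 for very dissimilar interval structure.
    total = 0.0
    for k in range(samples):
        t = t0 + (k + 0.5) * (t1 - t0) / samples
        i1, i2 = current_isi(spikes1, t), current_isi(spikes2, t)
        total += abs(i1 - i2) / max(i1, i2)
    return total / samples

# A regular train at 1 Hz vs. one at 2 Hz gives a constant ISI ratio of 0.5.
slow = [0.0, 1.0, 2.0, 3.0, 4.0]
fast = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
```

PySpike itself computes such profiles exactly (piecewise) rather than by sampling, and adds the SPIKE-distance and SPIKE-synchronization measures.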

  18. Scaling behavior of immersed granular flows

    NASA Astrophysics Data System (ADS)

    Amarsid, L.; Delenne, J.-Y.; Mutabaruka, P.; Monerie, Y.; Perales, F.; Radjai, F.

    2017-06-01

    The shear behavior of granular materials immersed in a viscous fluid depends on fluid properties (viscosity, density), particle properties (size, density) and boundary conditions (shear rate, confining pressure). Using computational fluid dynamics simulations coupled with molecular dynamics for granular flow, and exploring a broad range of parameter values, we show that the parameter space can be reduced to a single parameter that controls the packing fraction and the effective friction coefficient. This control parameter is a modified inertial number that incorporates viscous effects.
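The dimensionless numbers involved have standard definitions, sketched below; the combined form shown is one proposed in the literature with an empirical constant alpha, and the exact combination used by the authors is not reproduced here. All numeric values are illustrative.

```python
import math

def inertial_number(shear_rate, d, rho_p, pressure):
    # Dry inertial number I = gamma_dot * d * sqrt(rho_p / P)  (standard definition)
    return shear_rate * d * math.sqrt(rho_p / pressure)

def viscous_number(shear_rate, eta_f, pressure):
    # Viscous number J = eta_f * gamma_dot / P  (standard definition)
    return eta_f * shear_rate / pressure

def modified_inertial_number(I, J, alpha=0.6):
    # One combined form proposed in the literature: I_m = sqrt(I^2 + alpha*J);
    # alpha is an empirical constant, and this is NOT necessarily the paper's form.
    return math.sqrt(I * I + alpha * J)

# Illustrative values: 1 mm grains of density 2500 kg/m^3 in a water-like
# fluid, shear rate 10 1/s, confining pressure 1 kPa.
I = inertial_number(shear_rate=10.0, d=1e-3, rho_p=2500.0, pressure=1000.0)
J = viscous_number(shear_rate=10.0, eta_f=1e-3, pressure=1000.0)
```

In the limit J → 0 the dry inertial number is recovered, while for small I the viscous term dominates, which is how a single control parameter can span both regimes.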

  19. A local leaky-box model for the local stellar surface density-gas surface density-gas phase metallicity relation

    NASA Astrophysics Data System (ADS)

    Zhu, Guangtun Ben; Barrera-Ballesteros, Jorge K.; Heckman, Timothy M.; Zakamska, Nadia L.; Sánchez, Sebastian F.; Yan, Renbin; Brinkmann, Jonathan

    2017-07-01

    We revisit the relation between the stellar surface density, the gas surface density and the gas-phase metallicity of typical disc galaxies in the local Universe with the SDSS-IV/MaNGA survey, using the star formation rate surface density as an indicator for the gas surface density. We show that these three local parameters form a tight relationship, confirming previous works (e.g. by the PINGS and CALIFA surveys), but with a larger sample. We present a new local leaky-box model, assuming star-formation history and chemical evolution are localized except for outflowing materials. We derive closed-form solutions for the evolution of stellar surface density, gas surface density and gas-phase metallicity, and show that these parameters form a tight relation independent of initial gas density and time. We show that, with canonical values of the model parameters, this predicted relation matches the observed one well. In addition, we briefly describe a pathway to improving the current semi-analytic models of galaxy formation by incorporating the local leaky-box model in the cosmological context, which can potentially explain simultaneously multiple properties of Milky Way-type disc galaxies, such as size growth and the global stellar mass-gas metallicity relation.
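For orientation, the textbook leaky-box solution (outflow proportional to the star-formation rate with mass-loading factor eta, recycling neglected) is shown below; this is a standard simplification for context, not the closed-form solution derived in the paper.

```python
import math

def leaky_box_metallicity(y, eta, gas_depletion):
    """Gas-phase metallicity of a leaky box with yield y, outflow mass-loading
    factor eta (outflow rate = eta * SFR) and gas depletion Mg0/Mg:

        Z = y / (1 + eta) * ln(Mg0 / Mg)

    A textbook simplification shown for orientation, not the paper's model.
    """
    return y / (1.0 + eta) * math.log(gas_depletion)

# eta = 0 recovers the closed-box solution Z = y * ln(Mg0 / Mg); larger eta
# lowers the effective yield, mimicking metal loss through outflows.
```

The key qualitative feature carries over to the local model: the metallicity depends on the gas depletion and the outflow loading, not on the initial gas density or elapsed time separately.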

  20. Topology of Surface Ligands on Liposomes: Characterization Based on the Terms, Incorporation Ratio, Surface Anchor Density, and Reaction Yield.

    PubMed

    Lee, Shang-Hsuan; Sato, Yusuke; Hyodo, Mamoru; Harashima, Hideyoshi

    2016-01-01

    The surface topology of ligands on liposomes is an important factor in active targeting in drug delivery systems. Accurately evaluating the density of anchors and bioactive functional ligands on a liposomal surface is critical for ensuring the efficient delivery of liposomes. For evaluating surface ligand density, it is necessary to recognize that on ligand-modified liposomal surfaces some anchors are attached to ligands and some are not. To distinguish between these situations, a key parameter, surface anchor density, was introduced to specify the total amount of anchors on the liposomal surface. Second, the parameter reaction yield was introduced to identify the fraction of ligand-attached anchors among the total anchors, since the conjugation efficiency is neither always the same nor 100%. Combining these independent parameters, we derived: incorporation ratio = surface anchor density × reaction yield. The term incorporation ratio defines the surface ligand density. Since the surface anchor density in most cases represents the density of polyethylene glycol (PEG) on the surface, it also determines liposomal function. This approach makes it possible to accurately characterize various PEG and ligand densities and to define the surface topologies. In conclusion, this quantitative methodology can standardize the liposome preparation process and qualify the modified liposomal surfaces.
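The defining relation is simple enough to state as code; the numbers below are illustrative, not from the study.

```python
def incorporation_ratio(surface_anchor_density, reaction_yield):
    # incorporation ratio = surface anchor density x reaction yield
    return surface_anchor_density * reaction_yield

# Illustrative example: 5 mol% PEG anchors on the liposome surface, of which
# 60% carry a conjugated ligand -> 3 mol% surface ligand density.
ligand_density = incorporation_ratio(0.05, 0.60)
```

Keeping the two factors separate is the point of the parameterization: the same ligand density can arise from many anchors with poor conjugation or few anchors with efficient conjugation, and the two surfaces behave differently.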

  1. Orbitally limited pair-density-wave phase of multilayer superconductors

    NASA Astrophysics Data System (ADS)

    Möckli, David; Yanase, Youichi; Sigrist, Manfred

    2018-04-01

    We investigate the magnetic field dependence of an ideal superconducting vortex lattice in the parity-mixed pair-density-wave phase of multilayer superconductors within a circular cell Ginzburg-Landau approach. In multilayer systems, due to local inversion symmetry breaking, a Rashba spin-orbit coupling is induced at the outer layers. This, combined with a perpendicular paramagnetic (Pauli) limiting magnetic field, stabilizes a staggered, layer-dependent pair-density-wave phase in the superconducting singlet channel. The high-field pair-density-wave phase is separated from the low-field BCS phase by a first-order phase transition. The guiding question of this paper is: what is the minimal Maki parameter αM necessary for the appearance of the pair-density-wave phase in a superconducting trilayer system? To address this problem we generalize the circular cell method for the regular flux-line lattice of a type-II superconductor to include paramagnetic depairing effects. We then apply the model to the trilayer system, where each of the layers is characterized by a Ginzburg-Landau parameter κ0 and a Maki parameter αM. We find that when the Rashba spin-orbit interaction is comparable to the superconducting condensation energy, the orbitally limited pair-density-wave phase stabilizes for Maki parameters αM > 10.

  2. Double-hybrid density-functional theory with meta-generalized-gradient approximations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Souvi, Sidi M. O., E-mail: sidi.souvi@irsn.fr; Sharkas, Kamal; Toulouse, Julien, E-mail: julien.toulouse@upmc.fr

    2014-02-28

    We extend the previously proposed one-parameter double-hybrid density-functional theory [K. Sharkas, J. Toulouse, and A. Savin, J. Chem. Phys. 134, 064113 (2011)] to meta-generalized-gradient-approximation (meta-GGA) exchange-correlation density functionals. We construct several variants of one-parameter double-hybrid approximations using the Tao-Perdew-Staroverov-Scuseria (TPSS) meta-GGA functional and test them on test sets of atomization energies and reaction barrier heights. The most accurate variant uses the uniform coordinate scaling of the density and of the kinetic energy density in the correlation functional, and improves over both standard Kohn-Sham TPSS and second-order Møller-Plesset calculations.

  3. Concrete density estimation by rebound hammer method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ismail, Mohamad Pauzi bin, E-mail: pauzi@nm.gov.my; Masenwat, Noor Azreen bin; Sani, Suhairy bin

    Concrete is the most common and cheapest material for radiation shielding. Compressive strength is the main parameter checked when determining concrete quality; for shielding purposes, however, density is the parameter that needs to be considered. X-rays and gamma rays are effectively absorbed by materials with high atomic number and high density, such as concrete. High strength normally implies higher density in concrete, but this is not always true. This paper explains and discusses the correlation between rebound hammer testing and density for concrete containing hematite aggregates. A comparison is also made with normal concrete, i.e. concrete containing crushed granite.
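A rebound-number-to-density correlation of the kind discussed here is typically a simple least-squares calibration; the sketch below fits a line through hypothetical calibration points (the paper's actual measurements are not reproduced).

```python
def linear_fit(xs, ys):
    # Ordinary least-squares fit of y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical calibration points (rebound number, density in kg/m^3) for a
# hematite-aggregate concrete; values are illustrative only.
rebound = [25.0, 30.0, 35.0, 40.0, 45.0]
density = [2650.0, 2780.0, 2900.0, 3050.0, 3180.0]
a, b = linear_fit(rebound, density)
```

Once calibrated for a given aggregate type, the fit lets a non-destructive rebound reading stand in for a density measurement; the calibration does not transfer between hematite and granite concretes, which is why the paper compares the two.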

  4. Effects of Density Fluctuations on Weakly Nonlinear Alfven Waves: An IST Perspective

    NASA Astrophysics Data System (ADS)

    Hamilton, R.; Hadley, N.

    2012-12-01

    The effects of random density fluctuations on oblique, 1D, weakly nonlinear Alfven waves are examined through a numerical study of an analytical model developed by Ruderman [M.S. Ruderman, Phys. Plasmas, 9 (7), pp. 2940-2945, (2002)]. Consistent with Ruderman's application to the one-parameter dark soliton, the effects on one-parameter bright and dark solitons, the two-parameter soliton, and pairs of one-parameter solitons were similar to those of Ohmic dissipation found by Hamilton et al. [R. Hamilton, D. Peterson, and S. Libby, J. Geophys. Res. 114, A03104, doi:10.1029/2008JA013582 (2009)]. In all cases where bright or two-parameter solitons are present initially, density fluctuations result in the eventual damping of such compressive wave forms and the formation of a train of dark solitons, or magnetic depressions.

  5. Mass-number and excitation-energy dependence of the spin cutoff parameter

    DOE PAGES

    Grimes, S. M.; Voinov, A. V.; Massey, T. N.

    2016-07-12

    Here, the spin cutoff parameter determining the nuclear level density spin distribution ρ(J) is defined through the spin projection as <Jz^2>^(1/2) or, equivalently for spherical nuclei, (<J(J+1)>/3)^(1/2). It is needed to divide the total level density into levels as a function of J. The spin cutoff parameter is also needed to obtain the total level density at the neutron binding energy from the s-wave resonance count. The spin cutoff parameter has been calculated as a function of excitation energy and mass with a superconducting Hamiltonian. Calculations have been compared with two commonly used semiempirical formulas. A need for further measurements is also observed. Some complications for deformed nuclei are discussed. The quality of spin cutoff parameter data derived from isomeric ratio measurements is examined.
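The role of the spin cutoff parameter sigma can be illustrated with the standard Bethe form of the spin distribution, f(J) = (2J+1)/(2 sigma^2) * exp(-(J+1/2)^2 / (2 sigma^2)), which splits the total level density over J (a textbook form shown for orientation, not the paper's calculation).

```python
import math

def spin_fraction(J, sigma):
    """Fraction of levels with spin J for spin cutoff parameter sigma:
        f(J) = (2J + 1) / (2 sigma^2) * exp(-(J + 1/2)^2 / (2 sigma^2))
    (standard Bethe form of the level-density spin distribution)."""
    return (2 * J + 1) / (2 * sigma**2) * math.exp(-(J + 0.5) ** 2 / (2 * sigma**2))

# With sigma = 4 the distribution peaks near J ~ sigma - 1/2 and the
# fractions sum to (approximately) one over all spins.
sigma = 4.0
dist = [spin_fraction(J, sigma) for J in range(40)]
```

A larger sigma (higher excitation energy, heavier nucleus) spreads the level density over more spins, which is why ρ(J) at the neutron binding energy cannot be reconstructed from an s-wave resonance count without knowing sigma.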

  6. Open-Source Python Tools for Deploying Interactive GIS Dashboards for a Billion Datapoints on a Laptop

    NASA Astrophysics Data System (ADS)

    Steinberg, P. D.; Bednar, J. A.; Rudiger, P.; Stevens, J. L. R.; Ball, C. E.; Christensen, S. D.; Pothina, D.

    2017-12-01

    The rich variety of software libraries available in the Python scientific ecosystem provides a flexible and powerful alternative to traditional integrated GIS (geographic information system) programs. Each such library focuses on doing a certain set of general-purpose tasks well, and Python makes it relatively simple to glue the libraries together to solve a wide range of complex, open-ended problems in Earth science. However, choosing an appropriate set of libraries can be challenging, and it is difficult to predict how much "glue code" will be needed for any particular combination of libraries and tasks. Here we present a set of libraries that have been designed to work well together to build interactive analyses and visualizations of large geographic datasets, in standard web browsers. The resulting workflows run on ordinary laptops even for billions of data points, and easily scale up to larger compute clusters when available. The declarative top-level interface used in these libraries means that even complex, fully interactive applications can be built and deployed as web services using only a few dozen lines of code, making it simple to create and share custom interactive applications even for datasets too large for most traditional GIS systems. The libraries we will cover include GeoViews (HoloViews extended for geographic applications) for declaring visualizable/plottable objects, Bokeh for building visual web applications from GeoViews objects, Datashader for rendering arbitrarily large datasets faithfully as fixed-size images, Param for specifying user-modifiable parameters that model your domain, Xarray for computing with n-dimensional array data, Dask for flexibly dispatching computational tasks across processors, and Numba for compiling array-based Python code down to fast machine code. 
We will show how to use the resulting workflow with static datasets and with simulators such as GSSHA or AdH, allowing you to deploy flexible, high-performance web-based dashboards for your GIS data or simulations without needing major investments in code development or maintenance.
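The core idea behind the Datashader step in this stack, rasterizing arbitrarily many points into a fixed-size aggregate grid, can be sketched in a few lines of plain Python (a conceptual illustration, not Datashader's API):

```python
def aggregate_counts(xs, ys, width, height, x_range, y_range):
    # Bin points into a fixed width x height grid of counts: however many
    # points come in, the output "image" size stays constant.
    (x0, x1), (y0, y1) = x_range, y_range
    grid = [[0] * width for _ in range(height)]
    for x, y in zip(xs, ys):
        if x0 <= x < x1 and y0 <= y < y1:
            col = int((x - x0) / (x1 - x0) * width)
            row = int((y - y0) / (y1 - y0) * height)
            grid[row][col] += 1
    return grid

# Shading (e.g. log-scaled color mapping of the counts) is then applied to
# the aggregate, not to the raw points, so overplotting cannot hide density.
```

Datashader does the same thing vectorized over chunks (via Numba and Dask), which is what makes billion-point interactivity feasible on a laptop.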

  7. Efficient Computation of Sparse Matrix Functions for Large-Scale Electronic Structure Calculations: The CheSS Library.

    PubMed

    Mohr, Stephan; Dawson, William; Wagner, Michael; Caliste, Damien; Nakajima, Takahito; Genovese, Luigi

    2017-10-10

    We present CheSS, the "Chebyshev Sparse Solvers" library, which has been designed to solve typical problems arising in large-scale electronic structure calculations using localized basis sets. The library is based on a flexible and efficient expansion in terms of Chebyshev polynomials and presently features the calculation of the density matrix, the calculation of matrix powers for arbitrary powers, and the extraction of eigenvalues in a selected interval. CheSS is able to exploit the sparsity of the matrices and scales linearly with respect to the number of nonzero entries, making it well-suited for large-scale calculations. The approach is particularly adapted for setups leading to small spectral widths of the involved matrices and outperforms alternative methods in this regime. By coupling CheSS to the DFT code BigDFT, we show that such a favorable setup is indeed possible in practice. In addition, the approach based on Chebyshev polynomials can be massively parallelized, and CheSS exhibits excellent scaling up to thousands of cores even for relatively small matrix sizes.
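The Chebyshev machinery underlying CheSS can be illustrated on scalars (a sketch of the technique, not the CheSS API): compute expansion coefficients of a function on [-1, 1], then evaluate via the Clenshaw recurrence. For a matrix function such as the density matrix, x would be the sparse Hamiltonian scaled into [-1, 1], f the Fermi function, and each recurrence step a sparse matrix product, which is why small spectral widths (low required polynomial order) pay off.

```python
import math

def cheb_coeffs(f, order):
    # Chebyshev coefficients of f on [-1, 1] from values at Chebyshev nodes.
    N = order + 1
    fvals = [f(math.cos(math.pi * (j + 0.5) / N)) for j in range(N)]
    return [2.0 / N * sum(fvals[j] * math.cos(math.pi * k * (j + 0.5) / N)
                          for j in range(N))
            for k in range(N)]

def cheb_eval(c, x):
    # Clenshaw recurrence: f(x) ~ c[0]/2 + sum_k c[k] * T_k(x).
    b1 = b2 = 0.0
    for ck in reversed(c[1:]):
        b1, b2 = 2.0 * x * b1 - b2 + ck, b1
    return x * b1 - b2 + 0.5 * c[0]

# For a smooth function the coefficients decay fast, so a low order suffices.
c = cheb_coeffs(math.exp, 12)
```

The same recurrence applied to a matrix argument needs only products with the sparse matrix, never its diagonalization, preserving linear scaling in the number of nonzero entries.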

  8. Reactivity effects in VVER-1000 of the third unit of the Kalinin nuclear power plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N., E-mail: zizin@adis.vver.kiae.ru

    2010-12-15

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity, as well as the reactivity coefficient due to the concentration of boric acid in the reactor, were computed additionally. Results of the computations are compared with the experiment.

  9. Reactivity effects in VVER-1000 of the third unit of the Kalinin nuclear power plant at physical start-up. Computations in ShIPR intellectual code system with library of two-group cross sections generated by UNK code

    NASA Astrophysics Data System (ADS)

    Zizin, M. N.; Zimin, V. G.; Zizina, S. N.; Kryakvin, L. V.; Pitilimov, V. A.; Tereshonok, V. A.

    2010-12-01

    The ShIPR intellectual code system for mathematical simulation of nuclear reactors includes a set of computing modules implementing the preparation of macro cross sections on the basis of the two-group library of neutron-physics cross sections obtained for the SKETCH-N nodal code. This library is created by using the UNK code for 3D diffusion computation of first VVER-1000 fuel loadings. Computation of neutron fields in the ShIPR system is performed using the DP3 code in the two-group diffusion approximation in 3D triangular geometry. The efficiency of all groups of control rods for the first fuel loading of the third unit of the Kalinin Nuclear Power Plant is computed. The temperature, barometric, and density effects of reactivity as well as the reactivity coefficient due to the concentration of boric acid in the reactor were computed additionally. Results of computations are compared with the experiment.

  10. Analysis of Brown camera distortion model

    NASA Astrophysics Data System (ADS)

    Nowakowski, Artur; Skarbek, Władysław

    2013-10-01

    Contemporary image acquisition devices introduce optical distortion into images. This results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze orthogonality with regard to radius for its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of distortion parameter estimation is evaluated.
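The Brown model referred to above maps undistorted normalized image coordinates to distorted ones through radial (k1, k2) and decentering/tangential (p1, p2) terms; the sketch below uses the standard model form (matching e.g. OpenCV's convention up to higher-order terms), with illustrative coefficients.

```python
def brown_distort(x, y, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    # Radial factor (1 + k1*r^2 + k2*r^4) plus decentering (tangential) terms.
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    yd = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return xd, yd

# With all coefficients zero the mapping is the identity; a positive k1
# pushes points radially outward (pincushion-like displacement).
```

Calibration estimates (k1, k2, p1, p2) jointly with the camera intrinsics, which is why the stability of the distortion parameter estimates matters in practice.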

  11. The US Geological Survey, digital spectral reflectance library: version 1: 0.2 to 3.0 microns

    NASA Technical Reports Server (NTRS)

    Clark, Roger N.; Swayze, Gregg A.; King, Trude V. V.; Gallagher, Andrea J.; Calvin, Wendy M.

    1993-01-01

    We have developed a digital reflectance spectral library, with management and spectral analysis software. The library includes 500 spectra of 447 samples (some samples include a series of grain sizes) measured from approximately 0.2 to 3.0 microns. The spectral resolution (Full Width Half Maximum) of the reflectance data is less than or equal to 4 nm in the visible (0.2-0.8 microns) and less than or equal to 10 nm in the NIR (0.8-2.35 microns). All spectra were corrected to absolute reflectance using an NBS Halon standard. Library management software lets users search on parameters (e.g. chemical formulae, chemical analyses, purity of samples, mineral groups, etc.) as well as spectral features. Minerals from sulfide, oxide, hydroxide, halide, carbonate, nitrate, borate, phosphate, and silicate groups are represented. X-ray and chemical analyses are tabulated for many of the entries, and all samples have been evaluated for spectral purity. The library also contains end and intermediate members for the olivine, garnet, scapolite, montmorillonite, muscovite, jarosite, and alunite solid-solution series. We have included representative spectra of H2O ice, kerogen, ammonium-bearing minerals, rare-earth oxides, desert varnish coatings, kaolinite crystallinity series, kaolinite-smectite series, zeolite series, and an extensive evaporite series. Because of the importance of vegetation to climate-change studies, we have included 17 spectra of tree leaves, bushes, and grasses.

  12. Assessment of various parameters to improve MALDI-TOF MS reference spectra libraries constructed for the routine identification of filamentous fungi.

    PubMed

    Normand, Anne-Cécile; Cassagne, Carole; Ranque, Stéphane; L'ollivier, Coralie; Fourquet, Patrick; Roesems, Sam; Hendrickx, Marijke; Piarroux, Renaud

    2013-04-08

    The poor reproducibility of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) spectra limits the effectiveness of the MALDI-TOF MS-based identification of filamentous fungi with highly heterogeneous phenotypes in routine clinical laboratories. This study aimed to enhance the MALDI-TOF MS-based identification of filamentous fungi by assessing several architectures of reference spectrum libraries. We established reference spectrum libraries that included 30 filamentous fungus species with various architectures characterized by distinct combinations of the following: i) technical replicates, i.e., the number of analyzed deposits for each culture used to build a reference meta-spectrum (RMS); ii) biological replicates, i.e., the number of RMS derived from distinct subcultures of each strain; and iii) the number of distinct strains of a given species. We then compared the effectiveness of each library in the identification of 200 prospectively collected clinical isolates, including 38 species in 28 genera. Identification effectiveness was improved by increasing the number of both RMS per strain (p < 10^-4) and strains for a given species (p < 10^-4) in a multivariate analysis. Addressing the heterogeneity of MALDI-TOF spectra derived from filamentous fungi by increasing the number of RMS obtained from distinct subcultures of strains included in the reference spectrum library markedly improved the effectiveness of the MALDI-TOF MS-based identification of clinical filamentous fungi.

  13. The study on the effect of pattern density distribution on the STI CMP process

    NASA Astrophysics Data System (ADS)

    Sub, Yoon Myung; Hian, Bernard Yap Tzen; Fong, Lee It; Anak, Philip Menit; Minhar, Ariffin Bin; Wui, Tan Kim; Kim, Melvin Phua Twang; Jin, Looi Hui; Min, Foo Thai

    2017-08-01

    The effects of pattern density on CMP characteristics were investigated using a specially designed wafer for the characterization of pattern dependencies in STI CMP [1]. The purpose of this study is to investigate the planarization behavior of a direct STI CMP process using a cerium oxide (CeO2)-based slurry system in terms of pattern density variation. The minimum design rule (DR) of the 180 nm technology node was adopted for the mask layout, and the mask was successfully applied to the evaluation of a CeO2-abrasive-based direct STI CMP process. We describe the planarization behavior and the loading effects of pattern density variation, characterized with layout pattern density and pitch variations using the masks mentioned above. Furthermore, the pattern dependence of the post-CMP remaining thickness on feature dimensions and spacing was analyzed and evaluated. The goal was to establish the concept of a library method that can be used to generate design rules reducing the probability of CMP-related failures. The characterization covered various layouts spanning different pattern density ranges, and the effects of pattern density on STI CMP are discussed in this paper.

  14. Size-density scaling in protists and the links between consumer-resource interaction parameters.

    PubMed

    DeLong, John P; Vasseur, David A

    2012-11-01

    Recent work indicates that the interaction between body-size-dependent demographic processes can generate macroecological patterns such as the scaling of population density with body size. In this study, we evaluate this possibility for grazing protists and also test whether demographic parameters in these models are correlated after controlling for body size. We compiled data on the body-size dependence of consumer-resource interactions and population density for heterotrophic protists grazing algae in laboratory studies. We then used nested dynamic models to predict both the height and slope of the scaling relationship between population density and body size for these protists. We also controlled for consumer size and assessed links between model parameters. Finally, we used the models and the parameter estimates to assess the individual- and population-level dependence of resource use on body size and prey-size selection. The predicted size-density scaling for all models matched the observed scaling closely, and the simplest model was sufficient to predict the pattern. Variation around the mean size-density scaling relationship may be generated by variation in prey productivity and area of capture, but residuals are relatively insensitive to variation in prey-size selection. After controlling for body size, many consumer-resource interaction parameters were correlated, and a positive correlation between residual prey-size selection and conversion efficiency neutralizes the apparent fitness advantage of taking large prey. Our results indicate that widespread community-level patterns can be explained with simple population models that apply consistently across a range of sizes. They also indicate that the parameter space governing the dynamics and the steady states in these systems is structured such that some parts of the parameter space are unlikely to represent real systems. Finally, predator-prey size ratios represent a kind of conundrum, because they are widely observed but apparently have little influence on population size and fitness, at least at this level of organization. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.

  15. Uncertainty quantification of voice signal production mechanical model and experimental updating

    NASA Astrophysics Data System (ADS)

    Cataldo, E.; Soize, C.; Sampaio, R.

    2013-11-01

    The aim of this paper is to analyze uncertainty quantification in a voice production mechanical model and to update the probability density function of the tension parameter using the Bayes method and experimental data. Three parameters of the voice production mechanical model are considered uncertain: the tension parameter, the neutral glottal area and the subglottal pressure. The tension parameter of the vocal folds is mainly responsible for changes in the fundamental frequency of a voice signal generated by a mechanical/mathematical model for producing voiced sounds. The three uncertain parameters are modeled by random variables. The probability density function related to the tension parameter is considered uniform, and the probability density functions related to the neutral glottal area and the subglottal pressure are constructed using the Maximum Entropy Principle. The output of the stochastic computational model is the random voice signal, and the Monte Carlo method is used to solve the stochastic equations, allowing realizations of the random voice signals to be generated. For each realization of the random voice signal, the corresponding realization of the random fundamental frequency is calculated, and the prior pdf of this random fundamental frequency is then estimated. Experimental data are available for the fundamental frequency, and the posterior probability density function of the random tension parameter is then estimated using the Bayes method. In addition, an application is performed considering a case with a pathology in the vocal folds. The strategy developed here is important for two main reasons. The first is the possibility of updating the probability density function of a parameter, the tension parameter of the vocal folds, which cannot be measured directly; the second is the construction of the likelihood function. In general, the likelihood function is predefined using a known pdf; here, it is constructed in a new and different manner, using the considered system itself.

  16. Automatically Generated Vegetation Density Maps with LiDAR Survey for Orienteering Purpose

    NASA Astrophysics Data System (ADS)

    Petrovič, Dušan

    2018-05-01

    The focus of our research was to automatically generate vegetation density maps adequate for orienteering purposes. The application Karttapullautin, which requires LiDAR data as input, was used for the automated generation of vegetation density maps. A part of the orienteering map of the Kazlje-Tomaj area was used to compare the graphical display of vegetation density. By changing the parameter settings in the Karttapullautin application, we altered how the vegetation density of the automatically generated map was presented, and tried to match it as closely as possible to the orienteering map of Kazlje-Tomaj. By comparing several generated vegetation density maps, we also propose the most suitable parameter settings for automatically generating maps of other areas.

  17. A predictive model for the tokamak density limit

    DOE PAGES

    Teng, Q.; Brennan, D. P.; Delgado-Aparicio, L.; ...

    2016-07-28

    We reproduce the Greenwald density limit in all tokamak experiments by using a phenomenologically correct model with parameters in the range of experiments. A simple model of equilibrium evolution and local power balance inside the island has been implemented to calculate the radiation-driven thermo-resistive tearing mode growth and explain the density limit. Strong destabilization of the tearing mode, due to an imbalance of local Ohmic heating and radiative cooling in the island, predicts the density limit within a few percent. Furthermore, we find that the density limit is a local edge limit that depends only weakly on impurity densities. Our results are robust to a substantial variation in model parameters within the range of experiments.

  18. Implementation of Open-Source Web Mapping Technologies to Support Monitoring of Governmental Schemes

    NASA Astrophysics Data System (ADS)

    Pulsani, B. R.

    2015-10-01

    Several schemes are undertaken by the government to uplift the social and economic condition of people. The monitoring of these schemes is done through information technology, but the involvement of Geographic Information Systems (GIS) is lacking. To demonstrate the benefits of thematic mapping as a tool for assisting officials in making decisions, a web mapping application was built for three government programs: the Mother and Child Tracking System (MCTS), the Telangana State Housing Corporation Limited (TSHCL), and Ground Water Quality Mapping (GWQM). The three applications depicted the distribution of various parameters thematically and helped in identifying areas with higher and weaker distributions. Based on the three applications, the study finds that many government schemes share characteristics suited to thematic mapping, and it proposes implementing this kind of approach for other schemes as well. These applications were developed using the SharpMap C# library, a free and open-source mapping library for developing geospatial applications. The study highlights the cost benefits of SharpMap, brings out the advantages of this library over proprietary vendors, and further discusses its advantages over other open-source libraries as well.

  19. Automation and hypermedia technology applications

    NASA Technical Reports Server (NTRS)

    Jupin, Joseph H.; Ng, Edward W.; James, Mark L.

    1993-01-01

    This paper is a progress report on HyLite (Hypermedia Library Technology): a research and development activity to produce a versatile system as part of NASA's technology thrusts in automation, information sciences, and communications. HyLite can be used as a system or tool to facilitate the creation and maintenance of large distributed electronic libraries. The contents of such a library may be software components, hardware parts or designs, scientific data sets or databases, configuration management information, etc. The proliferation of computer use has made the diversity and quantity of information too large for any single user to sort, process, and utilize effectively. In response to this information deluge, we have created HyLite to enable the user to organize relevant information more efficiently for presentation, retrieval, and readability. To accomplish this end, we have incorporated various AI techniques into the HyLite hypermedia engine. The proposed techniques include intelligent search tools for the libraries, intelligent retrieval, and navigational assistance based on user histories. HyLite itself is based on an earlier project, the Encyclopedia of Software Components (ESC), which used hypermedia to facilitate and encourage software reuse.

  20. Optimization of the genotyping-by-sequencing strategy for population genomic analysis in conifers.

    PubMed

    Pan, Jin; Wang, Baosheng; Pei, Zhi-Yong; Zhao, Wei; Gao, Jie; Mao, Jian-Feng; Wang, Xiao-Ru

    2015-07-01

    Flexibility and low cost make genotyping-by-sequencing (GBS) an ideal tool for population genomic studies of nonmodel species. However, to utilize the potential of the method fully, many parameters affecting library quality and single nucleotide polymorphism (SNP) discovery require optimization, especially for conifer genomes with a high repetitive DNA content. In this study, we explored strategies for effective GBS analysis in pine species. We constructed GBS libraries using HpaII, PstI and EcoRI-MseI digestions with different multiplexing levels and examined the effect of restriction enzymes on library complexity and the impact of sequencing depth and size selection of restriction fragments on sequence coverage bias. We tested and compared UNEAK, Stacks and GATK pipelines for the GBS data, and then developed a reference-free SNP calling strategy for haploid pine genomes. Our GBS procedure proved to be effective in SNP discovery, producing 7000-11 000 and 14 751 SNPs within and among three pine species, respectively, from a PstI library. This investigation provides guidance for the design and analysis of GBS experiments, particularly for organisms for which genomic information is lacking. © 2014 John Wiley & Sons Ltd.

  1. [Microbial diversity and ammonia-oxidizing microorganism of a soil sample near an acid mine drainage lake].

    PubMed

    Liu, Ying; Wang, Li-Hua; Hao, Chun-Bo; Li, Lu; Li, Si-Yuan; Feng, Chuan-Ping

    2014-06-01

    The main physicochemical parameters of a soil sample collected near an acid mine drainage reservoir in Anhui province were analyzed. The microbial diversity and community structure were studied through the construction of bacterial and archaeal 16S rRNA gene clone libraries and an archaeal ammonia monooxygenase gene clone library. The functional groups responsible for the process of ammonia oxidation were also discussed. The results indicated that the soil sample had an extremely low pH value (pH < 3) and high ion concentrations, being influenced by the acid mine drainage (AMD). All the 16S rRNA gene sequences of the bacterial clone library fell into 11 phyla, and Acidobacteria played the most significant role in the ecosystem, followed by Verrucomicrobia. A great number of acidophilic bacteria existed in the soil sample, such as Candidatus Koribacter versatilis and Holophaga sp. The archaeal clone library consisted of 2 phyla (Thaumarchaeota and Euryarchaeota), with the abundance of Thaumarchaeota remarkably higher than that of Euryarchaeota. The ammonia oxidation in this soil environment was probably driven by ammonia-oxidizing archaea, and new species of ammonia-oxidizing archaea existed in the soil sample.

  2. ParFit: A Python-Based Object-Oriented Program for Fitting Molecular Mechanics Parameters to ab Initio Data

    DOE PAGES

    Zahariev, Federico; De Silva, Nuwan; Gordon, Mark S.; ...

    2017-02-23

    Here, a newly created object-oriented program for automating the process of fitting molecular-mechanics parameters to ab initio data, termed ParFit, is presented. ParFit uses a hybrid of deterministic and stochastic genetic algorithms. ParFit can simultaneously handle several molecular-mechanics parameters in multiple molecules and can also apply symmetric and antisymmetric constraints on the optimized parameters. The simultaneous handling of several molecules enhances the transferability of the fitted parameters. ParFit is written in Python, uses a rich set of standard and nonstandard Python libraries, and can be run in parallel on multicore computer systems. As an example, a series of phosphine oxides, important for metal extraction chemistry, are parametrized using ParFit.
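
    The hybrid deterministic/stochastic genetic-algorithm idea can be sketched in miniature. This is not ParFit's code: the toy below fits harmonic bond parameters k and r0 so that E(r) = k(r - r0)^2 matches synthetic reference energies, with deterministic rank selection and stochastic Gaussian mutation. All numbers are invented.

```python
import random

# Toy genetic-algorithm parameter fit (illustrative, not ParFit itself).
random.seed(0)

# reference (bond length, energy) pairs, generated with k = 12.5, r0 = 1.1
REF = [(0.9, 0.500), (1.0, 0.125), (1.1, 0.000), (1.2, 0.125), (1.3, 0.500)]

def error(params):
    """Sum of squared deviations between model and reference energies."""
    k, r0 = params
    return sum((k * (r - r0) ** 2 - e) ** 2 for r, e in REF)

def evolve(pop, gens=200):
    for _ in range(gens):
        pop.sort(key=error)
        survivors = pop[: len(pop) // 2]          # deterministic selection
        children = [(k + random.gauss(0, 0.1),    # stochastic mutation
                     r0 + random.gauss(0, 0.01)) for k, r0 in survivors]
        pop = survivors + children
    return min(pop, key=error)

pop = [(random.uniform(5, 20), random.uniform(0.8, 1.4)) for _ in range(20)]
k_fit, r0_fit = evolve(pop)
print(round(k_fit, 1), round(r0_fit, 2))  # converges near k = 12.5, r0 = 1.1
```

    Because the survivors are carried over unmutated, the best error is non-increasing across generations, which is what makes the hybrid scheme robust.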

  3. ParFit: A Python-Based Object-Oriented Program for Fitting Molecular Mechanics Parameters to ab Initio Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zahariev, Federico; De Silva, Nuwan; Gordon, Mark S.

    Here, a newly created object-oriented program for automating the process of fitting molecular-mechanics parameters to ab initio data, termed ParFit, is presented. ParFit uses a hybrid of deterministic and stochastic genetic algorithms. ParFit can simultaneously handle several molecular-mechanics parameters in multiple molecules and can also apply symmetric and antisymmetric constraints on the optimized parameters. The simultaneous handling of several molecules enhances the transferability of the fitted parameters. ParFit is written in Python, uses a rich set of standard and nonstandard Python libraries, and can be run in parallel on multicore computer systems. As an example, a series of phosphine oxides, important for metal extraction chemistry, are parametrized using ParFit.

  4. A Fortran 90 Hartree-Fock program for one-dimensional periodic π-conjugated systems using Pariser-Parr-Pople model

    NASA Astrophysics Data System (ADS)

    Kondayya, Gundra; Shukla, Alok

    2012-03-01

    The Pariser-Parr-Pople (P-P-P) model Hamiltonian is employed frequently to study the electronic structure and optical properties of π-conjugated systems. In this paper we describe a Fortran 90 computer program which uses the P-P-P model Hamiltonian to solve the Hartree-Fock (HF) equation for infinitely long, one-dimensional, periodic, π-electron systems. The code is capable of computing the band structure, as well as the linear optical absorption spectrum, using the tight-binding and the HF methods. Furthermore, using our program the user can solve the HF equation in the presence of a finite external electric field, thereby allowing the simulation of gated systems. We apply our code to compute various properties of polymers such as trans-polyacetylene, poly-para-phenylene, and armchair and zigzag graphene nanoribbons, in the infinite length limit.

    Program summary
    Program title: ppp_bulk.x
    Catalogue identifier: AEKW_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKW_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 87 464
    No. of bytes in distributed program, including test data, etc.: 2 046 933
    Distribution format: tar.gz
    Programming language: Fortran 90
    Computer: PCs and workstations
    Operating system: Linux. The code was developed and tested on various recent 64-bit versions of Fedora, including Fedora 14 (kernel version 2.6.35.12-90).
    Classification: 7.3
    External routines: This program needs to link with LAPACK/BLAS libraries compiled with the same compiler as the program. For the Intel Fortran Compiler we used the ACML library version 4.4.0, while for the gfortran compiler we used the libraries supplied with the Fedora distribution.
    Nature of problem: The electronic structure of one-dimensional periodic π-conjugated systems is an intense area of research at present because of the tremendous interest in the physics of conjugated polymers and graphene nanoribbons. The computer program described in this paper provides an efficient way of solving the Hartree-Fock equations for such systems within the P-P-P model. In addition to the Bloch orbitals, band structure, and density of states, the program can also compute quantities such as the linear absorption spectrum and the electro-absorption spectrum of these systems.
    Solution method: For a one-dimensional periodic π-conjugated system lying in the xy-plane, the single-particle Bloch orbitals are expressed as linear combinations of the p-orbitals of the individual atoms. Then, using the various parameters defining the P-P-P Hamiltonian, the Hartree-Fock equations are set up as a matrix eigenvalue problem in k-space. Their solutions are obtained in a self-consistent manner, using an iterative diagonalization technique at several k points. The band structure and the corresponding Bloch orbitals thus obtained are used to perform a variety of calculations, such as the density of states, the linear optical absorption spectrum, the electro-absorption spectrum, etc.
    Running time: Most of the examples provided take only a few seconds to run. For a large system, however, depending on the system size, the run time may be a few minutes to a few hours.
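
    The tight-binding half of the solution method (omitting the Hartree-Fock self-consistency and the P-P-P interaction terms) can be sketched for a dimerized chain such as trans-polyacetylene: express the Bloch Hamiltonian as a small matrix at each k and diagonalize it. The hopping values below are illustrative, not the paper's parameters.

```python
import numpy as np

# Band structure of a dimerized (two-site) chain: diagonalize the 2x2
# Bloch Hamiltonian at each k point across the Brillouin zone.
t1, t2 = 2.5, 2.9                       # intra- and inter-cell hoppings (eV), illustrative
ks = np.linspace(-np.pi, np.pi, 101)    # ka sampled over the Brillouin zone

bands = []
for k in ks:
    off = t1 + t2 * np.exp(1j * k)      # <A|H|B> Bloch matrix element
    H = np.array([[0, off], [np.conj(off), 0]])
    bands.append(np.linalg.eigvalsh(H)) # valence and conduction energies
bands = np.array(bands)                 # shape (101, 2)

gap = bands[:, 1].min() - bands[:, 0].max()   # direct gap, at ka = ±pi
print(round(gap, 2))  # 2*|t1 - t2| = 0.8
```

    The full program replaces the fixed hoppings with the P-P-P Fock matrix and iterates this diagonalization to self-consistency; the k-space eigenvalue structure is the same.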

  5. Inter-eye Asymmetry of Optical Coherence Tomography Angiography Vessel Density in Bilateral Glaucoma, Glaucoma Suspect, and Healthy Eyes.

    PubMed

    Hou, Huiyuan; Moghimi, Sasan; Zangwill, Linda M; Shoji, Takuhei; Ghahari, Elham; Manalastas, Patricia Isabel C; Penteado, Rafaella C; Weinreb, Robert N

    2018-03-23

    To investigate inter-eye retinal vessel density asymmetry in healthy, glaucoma suspect, and mild to moderate glaucoma subjects, and its potential utility for early detection of glaucomatous damage. Cross-sectional study. 153 subjects, including 55 healthy, 32 glaucoma suspect, and 66 glaucoma subjects, enrolled in the Diagnostic Innovations in Glaucoma Study (DIGS). Vessel density was obtained from optical coherence tomography angiography (OCT-A) macular and optic nerve head scans. Thickness of the peripapillary retinal nerve fiber layer (RNFL) and macular ganglion cell complex (mGCC) was measured with spectral-domain optical coherence tomography (SD-OCT) scans. Inter-eye asymmetry was calculated as the absolute value of the difference in vessel density and thickness between the right and left eyes. Inter-eye retinal vessel density asymmetry parameters were significantly different among the three groups. Glaucoma suspects had significantly higher peripapillary and macular inter-eye vessel density asymmetries than the healthy group in univariate (1.1% vs. 2.0%, P=0.014 and 1.2% vs. 2.5%, P=0.027, respectively) and multivariate analyses (P=0.007 and 0.038, respectively). No significant differences in asymmetry of thickness parameters were found between the glaucoma suspect and healthy groups (all P > 0.718). However, significant differences in asymmetry of thickness parameters between glaucoma suspects and glaucoma patients (P < 0.01) were found for all parameters. Inter-eye vessel density asymmetry can be quantified by OCT-A measurement. Glaucoma suspects have significantly greater vessel density asymmetry than healthy eyes. Longitudinal studies are needed to better characterize the relationship of vessel density asymmetry with the development and progression of glaucoma. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. [Is there a relation between weight in rats, bone density, ash weight and histomorphometric indicators of trabecular volume and thickness in the bones of extremities?].

    PubMed

    Zák, J; Kapitola, J; Povýsil, C

    2003-01-01

    The authors address the question of whether bone histological structure (described by the histomorphometric parameters of trabecular bone volume and trabecular thickness) can be inferred from bone density, ash weight, or even from the weight of the animal (rat). Both tibias of each of 30 intact male rats, 90 days old, were processed. The left tibia was used to determine the histomorphometric parameters of undecalcified bone tissue specimens by automatic image analysis. The right tibia was used to determine bone density values using Archimedes' principle. The values of bone density, ash weight, ash weight related to bone volume, and animal weight were correlated with the histomorphometric parameters (trabecular bone volume, trabecular thickness) by Pearson's correlation test. One could presume a relation between the data describing bone mass at the histological level (trabecular bone of the tibia) and the data describing the mass of the whole bone or even of the animal, but no statistically significant correlation was found. The reason for this result could lie in variations of trabecular density within the tibial marrow. Because trabecular bone density is higher in the metaphyseal and epiphyseal regions, histomorphometric analysis of trabecular bone is preferentially done in these areas. It is possible that this irregularity of trabecular density in the tibia is the source of deviations that influenced the correlations determined. The values of bone density, ash weight and animal weight do not predict trabecular bone volume, and vice versa: the static histomorphometric parameters of trabecular bone do not reflect bone density, ash weight or the weight of the animal.

  7. Fast clustering using adaptive density peak detection.

    PubMed

    Wang, Xiao-Feng; Xu, Yifan

    2017-12-01

    Common limitations of clustering methods include slow algorithm convergence, instability with respect to the pre-specification of a number of intrinsic parameters, and a lack of robustness to outliers. A recent clustering approach proposed a fast search algorithm for cluster centers based on their local densities. However, the selection of the key intrinsic parameters in the algorithm was not systematically investigated. It is relatively difficult to estimate the "optimal" parameters since the original definition of the local density in the algorithm is based on a truncated counting measure. In this paper, we propose a clustering procedure with adaptive density peak detection, where the local density is estimated through nonparametric multivariate kernel estimation. The model parameter can then be calculated from equations with statistical theoretical justification. We also develop an automatic cluster centroid selection method through maximizing an average silhouette index. The advantages and flexibility of the proposed method are demonstrated through simulation studies and the analysis of a few benchmark gene expression data sets. The method needs to perform only a single step without any iteration and thus is fast and has great potential for application to big data analysis. A user-friendly R package, ADPclust, is developed for public use.
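
    The core idea (density peaks found where a smooth local density and the distance to any denser point are both large) can be sketched as follows. This is a generic illustration with a Gaussian kernel density estimate standing in for the truncated counting measure, not the ADPclust package's code; the two synthetic blobs are invented data.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Density-peak clustering sketch with a kernel density estimate.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (40, 2)),   # blob around (0, 0)
                 rng.normal(4, 0.3, (40, 2))])  # blob around (4, 4)

dens = gaussian_kde(pts.T)(pts.T)               # smooth local density per point
dist = np.linalg.norm(pts[:, None] - pts[None], axis=2)

# delta_i: distance to the nearest point of strictly higher density
delta = np.empty(len(pts))
for i in range(len(pts)):
    higher = dens > dens[i]
    delta[i] = dist[i, higher].min() if higher.any() else dist[i].max()

centers = np.argsort(dens * delta)[-2:]         # two largest gamma = density * delta
labels = np.argmin(dist[:, centers], axis=1)    # assign points to nearest center
print(len(set(labels[:40])), len(set(labels[40:])))  # each blob maps to one cluster
```

    No iteration is needed: one density pass, one delta pass, and one assignment, which is why the approach scales to large data sets.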

  8. Computing mammographic density from a multiple regression model constructed with image-acquisition parameters from a full-field digital mammographic unit

    PubMed Central

    Lu, Lee-Jane W.; Nishino, Thomas K.; Khamapirad, Tuenchit; Grady, James J; Leonard, Morton H.; Brunder, Donald G.

    2009-01-01

    Breast density (the percentage of fibroglandular tissue in the breast) has been suggested to be a useful surrogate marker for breast cancer risk. It is conventionally measured on screen-film mammographic images by a labor-intensive histogram segmentation method (HSM). We have adapted and modified the HSM for measuring breast density from raw digital mammograms acquired by full-field digital mammography. Multiple regression model analyses showed that many of the instrument parameters for acquiring the screening mammograms (e.g., breast compression thickness, radiological thickness, radiation dose, compression force, etc.) and the image pixel intensity statistics of the imaged breasts were strong predictors of the observed threshold values (model R2 = 0.93) and %-density (R2 = 0.84). The intra-class correlation coefficient of the %-density for duplicate images was estimated to be 0.80 using the regression model-derived threshold values, and 0.94 if estimated directly from the parameter estimates of the %-density prediction regression model. Therefore, with additional research, these mathematical models could be used to compute breast density objectively and automatically, bypassing the HSM step, and could greatly facilitate breast cancer research studies. PMID:17671343
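
    The modeling step is ordinary multiple regression: predict the segmentation threshold from acquisition parameters and image statistics, then check the model R2. The sketch below uses invented predictor names, coefficients, and synthetic data purely to illustrate the workflow, not the paper's actual model.

```python
import numpy as np

# Ordinary-least-squares multiple regression on synthetic acquisition data.
rng = np.random.default_rng(0)
n = 200
thickness = rng.uniform(3, 8, n)       # breast compression thickness (cm), invented
dose = rng.uniform(1, 4, n)            # radiation dose (mGy), invented
mean_pix = rng.uniform(200, 800, n)    # mean pixel intensity, invented

# synthetic "observed" thresholds from a known linear rule plus noise
threshold = (50 + 12 * thickness - 8 * dose + 0.05 * mean_pix
             + rng.normal(0, 2, n))

X = np.column_stack([np.ones(n), thickness, dose, mean_pix])
beta, *_ = np.linalg.lstsq(X, threshold, rcond=None)  # fitted coefficients

pred = X @ beta
ss_res = ((threshold - pred) ** 2).sum()
ss_tot = ((threshold - threshold.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
print(round(r2, 2))   # high R2, analogous in spirit to the reported fits
```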

  9. Tolerance to structural disorder and tunable mechanical behavior in self-assembled superlattices of polymer-grafted nanocrystals

    DOE PAGES

    Gu, X. Wendy; Ye, Xingchen; Koshy, David M.; ...

    2017-02-27

    Large, freestanding membranes with remarkably high elastic modulus (>10 GPa) have been fabricated through the self-assembly of ligand-stabilized inorganic nanocrystals, even though these nanocrystals are connected only by soft organic ligands (e.g., dodecanethiol or DNA) that are not cross-linked or entangled. Recent developments in the synthesis of polymer-grafted nanocrystals have greatly expanded the library of accessible superlattice architectures, which allows superlattice mechanical behavior to be linked to specific structural features. Here, colloidal self-assembly is used to organize polystyrene-grafted Au nanocrystals at a fluid interface to form ordered solids with sub-10-nm periodic features. We used thin-film buckling and nanoindentation to evaluate the mechanical behavior of polymer-grafted nanocrystal superlattices while exploring the role of polymer structural conformation, nanocrystal packing, and superlattice dimensions. Superlattices containing 3-20 vol % Au are found to have an elastic modulus of ~6-19 GPa, and hardness of ~120-170 MPa. We also found that rapidly self-assembled superlattices have the highest elastic modulus, despite containing significant structural defects. Polymer extension, interdigitation, and grafting density are determined to be critical parameters that govern superlattice elastic and plastic deformation.

  10. Yttrium-90 microspheres for the treatment of hepatocellular carcinoma: A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salem, Riad; Hunter, Russell D.

    2006-10-01

    To present a critical review of yttrium-90 (TheraSphere) for the treatment of hepatocellular carcinoma (HCC). Medical literature databases (Medline, Cochrane Library, and CANCERLIT) were searched for available literature concerning the treatment of HCC with TheraSphere. These publications were reviewed for scientific and clinical validity. Studies pertaining to the use of yttrium-90 for HCC date back to the 1960s. The results from the early animal safety studies established a radiation exposure range of 50-100 Gy to be used in human studies. Phase I dose escalation studies followed, which were instrumental in delineating radiation dosimetry and safety parameters in humans. These early studies emphasized the importance of the differential arteriolar density between hypervascular HCC and the surrounding liver parenchyma. Current trends in research have focused on advancing techniques to safely implement this technology as an alternative to traditional methods of treating unresectable HCC, such as external beam radiotherapy, conformal beam radiotherapy, ethanol ablation, trans-arterial chemoembolization, and radiofrequency ablation. Yttrium-90 (TheraSphere) is an outpatient treatment option for HCC. Current and future research should focus on implementing multicenter phase II and III trials comparing TheraSphere with other therapies for HCC.

  11. Review of the progress toward achieving heat confinement-the holy grail of photothermal therapy

    NASA Astrophysics Data System (ADS)

    Sheng, Wangzhong; He, Sha; Seare, William J.; Almutairi, Adah

    2017-08-01

    Photothermal therapy (PTT) involves the application of normally benign light wavelengths in combination with efficient photothermal (PT) agents that convert the absorbed light to heat to ablate selected cancers. The major challenge in PTT is the ability to confine heating and thus direct cellular death to precisely where PT agents are located. The dominant strategy in the field has been to create large libraries of PT agents with increased absorption capabilities and to enhance their delivery and accumulation to achieve sufficiently high concentrations in the tissue targets of interest. While the challenge of material confinement is important for achieving "heat and lethality confinement," this review article suggests another key prospective strategy to make this goal a reality. In this approach, equal emphasis is placed on selecting parameters of light exposure, including wavelength, duration, power density, and total power supplied, based on the intrinsic properties and geometry of tissue targets that influence heat dissipation, to truly achieve heat confinement. This review highlights significant milestones researchers have achieved, as well as examples that suggest future research directions, in this promising technique, as it becomes more relevant in clinical cancer therapy and other noncancer applications.

  12. Tolerance to structural disorder and tunable mechanical behavior in self-assembled superlattices of polymer-grafted nanocrystals

    NASA Astrophysics Data System (ADS)

    Gu, X. Wendy; Ye, Xingchen; Koshy, David M.; Vachhani, Shraddha; Hosemann, Peter; Alivisatos, A. Paul

    2017-03-01

    Large, freestanding membranes with remarkably high elastic modulus (>10 GPa) have been fabricated through the self-assembly of ligand-stabilized inorganic nanocrystals, even though these nanocrystals are connected only by soft organic ligands (e.g., dodecanethiol or DNA) that are not cross-linked or entangled. Recent developments in the synthesis of polymer-grafted nanocrystals have greatly expanded the library of accessible superlattice architectures, which allows superlattice mechanical behavior to be linked to specific structural features. Here, colloidal self-assembly is used to organize polystyrene-grafted Au nanocrystals at a fluid interface to form ordered solids with sub-10-nm periodic features. Thin-film buckling and nanoindentation are used to evaluate the mechanical behavior of polymer-grafted nanocrystal superlattices while exploring the role of polymer structural conformation, nanocrystal packing, and superlattice dimensions. Superlattices containing 3-20 vol % Au are found to have an elastic modulus of ˜6-19 GPa, and hardness of ˜120-170 MPa. We find that rapidly self-assembled superlattices have the highest elastic modulus, despite containing significant structural defects. Polymer extension, interdigitation, and grafting density are determined to be critical parameters that govern superlattice elastic and plastic deformation.

  13. A tunable electron beam source using trapping of electrons in a density down-ramp in laser wakefield acceleration.

    PubMed

    Ekerfelt, Henrik; Hansson, Martin; Gallardo González, Isabel; Davoine, Xavier; Lundh, Olle

    2017-09-25

One challenge in the development of laser wakefield accelerators is to demonstrate sufficient control and reproducibility of the parameters of the generated bunches of accelerated electrons. Here we report on a numerical study, where we demonstrate that trapping using density down-ramps allows for tuning of several electron bunch parameters by varying the properties of the density down-ramp. We show that the electron bunch length is determined by the difference in density before and after the ramp. Furthermore, the transverse emittance of the bunch is controlled by the steepness of the ramp. Finally, the amount of trapped charge depends both on the density difference and on the steepness of the ramp. We emphasize that both parameters of the density ramp are feasible to vary experimentally. We therefore conclude that this tunability makes the electron accelerator suitable for a wide range of applications, from those requiring short pulse length and low emittance, such as free-electron lasers, to those requiring high-charge, large-emittance bunches to maximize betatron X-ray generation.

  14. Parametric dependence of density limits in the Tokamak Experiment for Technology Oriented Research (TEXTOR): Comparison of thermal instability theory with experiment

    NASA Astrophysics Data System (ADS)

    Kelly, F. A.; Stacey, W. M.; Rapp, J.

    2001-11-01

The observed dependence of the TEXTOR [Tokamak Experiment for Technology Oriented Research: E. Hintz, P. Bogen, H. A. Claassen et al., Contributions to High Temperature Plasma Physics, edited by K. H. Spatschek and J. Uhlenbusch (Akademie Verlag, Berlin, 1994), p. 373] density limit on global parameters (I, B, P, etc.) and wall conditioning is compared with the predicted density limit parametric scaling of thermal instability theory. It is necessary first to relate the edge parameters of the thermal instability theory to n¯ and the other global parameters. The observed parametric dependence of the density limit in TEXTOR is generally consistent with the predicted density limit scaling of thermal instability theory. The observed wall conditioning dependence of the density limit can be reconciled with the theory in terms of the radiative emissivity temperature dependence of different impurities in the plasma edge. The thermal instability theory also provides an explanation of why symmetric detachment precedes radiative collapse for most low power shots, while a multifaceted asymmetric radiation from the edge (MARFE) precedes detachment for most high power shots.

  15. The Chromosome Microdissection and Microcloning Technique.

    PubMed

    Zhang, Ying-Xin; Deng, Chuan-Liang; Hu, Zan-Min

    2016-01-01

Chromosome microdissection followed by microcloning is an efficient tool combining cytogenetics and molecular genetics that can be used for the construction of high-density molecular-marker linkage maps and fine physical maps, the generation of probes for chromosome painting, and the localization and cloning of important genes. Here, we describe a modified technique to microdissect a single chromosome, paint individual chromosomes, and construct single-chromosome DNA libraries.

  16. A Library of Infectious Hepatitis C Viruses with Engineered Mutations in the E2 Gene Reveals Growth-Adaptive Mutations That Modulate Interactions with Scavenger Receptor Class B Type I.

    PubMed

    Zuiani, Adam; Chen, Kevin; Schwarz, Megan C; White, James P; Luca, Vincent C; Fremont, Daved H; Wang, David; Evans, Matthew J; Diamond, Michael S

    2016-12-01

    While natural hepatitis C virus (HCV) infection results in highly diverse quasispecies of related viruses over time, mutations accumulate more slowly in tissue culture, in part because of the inefficiency of replication in cells. To create a highly diverse population of HCV particles in cell culture and identify novel growth-enhancing mutations, we engineered a library of infectious HCV with all codons represented at most positions in the ectodomain of the E2 gene. We identified many putative growth-adaptive mutations and selected nine highly represented E2 mutants for further study: Q412R, T416R, S449P, T563V, A579R, L619T, V626S, K632T, and L644I. We evaluated these mutants for changes in particle-to-infectious-unit ratio, sensitivity to neutralizing antibody or CD81 large extracellular loop (CD81-LEL) inhibition, entry factor usage, and buoyant density profiles. Q412R, T416R, S449P, T563V, and L619T were neutralized more efficiently by anti-E2 antibodies and T416R, T563V, and L619T by CD81-LEL. Remarkably, all nine variants showed reduced dependence on scavenger receptor class B type I (SR-BI) for infection. This shift from SR-BI usage did not correlate with a change in the buoyant density profiles of the variants, suggesting an altered E2-SR-BI interaction rather than changes in the virus-associated lipoprotein-E2 interaction. Our results demonstrate that residues influencing SR-BI usage are distributed across E2 and support the development of large-scale mutagenesis studies to identify viral variants with unique functional properties. Characterizing variant viruses can reveal new information about the life cycle of HCV and the roles played by different viral genes. However, it is difficult to recapitulate high levels of diversity in the laboratory because of limitations in the HCV culture system. To overcome this limitation, we engineered a library of mutations into the E2 gene in the context of an infectious clone of the virus. 
We used this library of viruses to identify nine mutations that enhance the growth rate of HCV. These growth-enhancing mutations reduced the dependence on a key entry receptor, SR-BI. By generating a highly diverse library of infectious HCV, we mapped regions of the E2 protein that influence a key virus-host interaction and provide proof of principle for the generation of large-scale mutant libraries for the study of pathogens with great sequence variability. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  17. Density-dependent home-range size revealed by spatially explicit capture–recapture

    USGS Publications Warehouse

    Efford, M.G.; Dawson, Deanna K.; Jhala, Y.V.; Qureshi, Q.

    2016-01-01

    The size of animal home ranges often varies inversely with population density among populations of a species. This fact has implications for population monitoring using spatially explicit capture–recapture (SECR) models, in which both the scale of home-range movements σ and population density D usually appear as parameters, and both may vary among populations. It will often be appropriate to model a structural relationship between population-specific values of these parameters, rather than to assume independence. We suggest re-parameterizing the SECR model using kp = σp √Dp, where kp relates to the degree of overlap between home ranges and the subscript p distinguishes populations. We observe that kp is often nearly constant for populations spanning a range of densities. This justifies fitting a model in which the separate kp are replaced by the single parameter k and σp is a density-dependent derived parameter. Continuous density-dependent spatial variation in σ may also be modelled, using a scaled non-Euclidean distance between detectors and the locations of animals. We illustrate these methods with data from automatic photography of tigers (Panthera tigris) across India, in which the variation is among populations, from mist-netting of ovenbirds (Seiurus aurocapilla) in Maryland, USA, in which the variation is within a single population over time, and from live-trapping of brushtail possums (Trichosurus vulpecula) in New Zealand, modelling spatial variation within one population. Possible applications and limitations of the methods are discussed. A model in which kp is constant, while density varies, provides a parsimonious null model for SECR. The parameter k of the null model is a concise summary of the empirical relationship between home-range size and density that is useful in comparative studies. We expect deviations from this model, particularly the dependence of kp on covariates, to be biologically interesting.
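The re-parameterization can be made concrete with a minimal numeric sketch. Under the null model, k_p = σ_p√D_p is held constant across populations, so the movement scale σ becomes a density-dependent derived parameter. The value of k and the densities below are purely illustrative assumptions, not values from the study:

```python
import math

# Null-model re-parameterization from the abstract: k = sigma * sqrt(D).
# Holding k constant across populations makes the home-range movement
# scale sigma a derived, density-dependent parameter.

def sigma_from_density(k, density):
    """Derived movement scale: sigma = k / sqrt(D)."""
    return k / math.sqrt(density)

k = 1.0  # illustrative constant overlap parameter
for d in (0.5, 2.0, 8.0):  # illustrative densities (animals per unit area)
    print(f"D = {d:4.1f}  ->  sigma = {sigma_from_density(k, d):.3f}")
```

Quadrupling density halves σ, which is exactly the inverse home-range/density relationship the constant-k null model encodes.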

  18. Measurement of carrier transport and recombination parameter in heavily doped silicon

    NASA Technical Reports Server (NTRS)

    Swanson, Richard M.

    1986-01-01

The minority carrier transport and recombination parameters in heavily doped bulk silicon were measured. Both Si:P and Si:B with bulk dopings from 10^17 to 10^20 cm^-3 were studied. It is shown that three parameters characterize transport in bulk heavily doped Si: the minority carrier lifetime tau, the minority carrier mobility mu, and the equilibrium minority carrier density, n_0 or p_0 (in p-type and n-type Si, respectively). However, dc current-voltage measurements can never measure all three of these parameters, and some ac or time-transient experiment is required to obtain their values as a function of dopant density. Using both dc electrical measurements on bipolar transistors with heavily doped base regions and transient optical measurements on heavily doped bulk and epitaxially grown samples, lifetime, mobility, and bandgap narrowing were measured as a function of both p- and n-type dopant densities. Best fits of minority carrier mobility, bandgap narrowing, and lifetime as a function of doping density (in the heavily doped range) were constructed to allow accurate modeling of minority carrier transport in heavily doped Si.

  19. Cable Overheating Risk Warning Method Based on Impedance Parameter Estimation in Distribution Network

    NASA Astrophysics Data System (ADS)

    Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao

    2017-05-01

Cable overheating reduces the cable insulation level, accelerates insulation aging, and can even cause short-circuit faults. Identification and early warning of cable overheating risk are therefore necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system, to improve the accuracy of the impedance parameter estimation. Secondly, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from forecast data supplied by the distribution SCADA system. Thirdly, a rules library for cable overheating risk warning is established; the forecast value of cable impedance and its rate of change are analysed, and the overheating risk of the cable line is flagged according to the rules library, based on the relationship between impedance and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in the distribution network. The overheating risk warning results can provide a decision basis for operation, maintenance, and repair.
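The least-squares impedance-estimation step can be sketched in miniature. This is a hedged toy model, not the paper's algorithm: a single series resistance is fitted to invented SCADA-style current/voltage-drop samples, and all names and values here are assumptions:

```python
import random

# Toy least-squares estimate of a cable's series resistance R from noisy
# measurements, assuming the linear model V_drop = R * I.  A rising trend
# of the estimated impedance against its historical threshold would then
# trigger an overheating warning.

def estimate_resistance(currents, voltage_drops):
    """Least-squares slope through the origin: R = sum(I*V) / sum(I*I)."""
    num = sum(i * v for i, v in zip(currents, voltage_drops))
    den = sum(i * i for i in currents)
    return num / den

random.seed(0)
true_r = 0.25                                    # ohms (illustrative)
currents = [50.0 + 10.0 * k for k in range(20)]  # amperes
drops = [true_r * i + random.gauss(0, 0.1) for i in currents]  # volts, noisy
r_hat = estimate_resistance(currents, drops)
print(f"estimated R = {r_hat:.4f} ohm (true {true_r})")
```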

  20. Performance of conversion efficiency of a crystalline silicon solar cell with base doping density

    NASA Astrophysics Data System (ADS)

    Sahin, Gokhan; Kerimli, Genber; Barro, Fabe Idrissa; Sane, Moustapha; Alma, Mehmet Hakkı

In this study, we theoretically investigate the electrical parameters of a crystalline silicon solar cell in steady state. Based on a one-dimensional model of the cell, the short-circuit current density, the open-circuit voltage, the shunt and series resistances, and the conversion efficiency are calculated, taking into account the base doping density. The I-V characteristic, series resistance, shunt resistance, and conversion efficiency are each determined and studied versus base doping density. The aim of this work is to show how the short-circuit current density, open-circuit voltage, and parasitic resistances are related to the base doping density, and to exhibit the role played by those parasitic resistances in the conversion efficiency of the crystalline silicon solar cell.
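The interplay of series and shunt resistance with the I-V characteristic can be illustrated with the standard single-diode equation. This is a generic sketch with assumed parameter values, not the paper's one-dimensional model; in the paper these parameters follow from the base doping density:

```python
import math

# Generic single-diode solar-cell model (all parameter values assumed):
#   I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
# solved for the terminal current I by fixed-point iteration.

def cell_current(v, iph=8.0, i0=1e-10, n=1.2, vt=0.02585, rs=0.01, rsh=50.0):
    """Terminal current (A) at terminal voltage v (V)."""
    i = iph  # start the iteration from the photocurrent
    for _ in range(200):
        i = iph - i0 * math.expm1((v + i * rs) / (n * vt)) - (v + i * rs) / rsh
    return i

isc = cell_current(0.0)  # short-circuit current
print(f"Isc = {isc:.4f} A, I(0.5 V) = {cell_current(0.5):.4f} A")
```

With these assumed values the computed short-circuit current sits just below the photocurrent; the small deficit is the leak through the shunt resistance, which is how the parasitic resistances erode conversion efficiency.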

  1. Three-dimensional neutronics optimization of helium-cooled blanket for multi-functional experimental fusion-fission hybrid reactor (FDS-MFX)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, J.; Yuan, B.; Jin, M.

    2012-07-01

Three-dimensional neutronics optimization calculations were performed to analyse the Tritium Breeding Ratio (TBR) and maximum average power density (PDmax) of the blanket of a helium-cooled multi-functional experimental fusion-fission hybrid reactor named FDS (Fusion-Driven hybrid System)-MFX (Multi-Functional eXperimental). Three-stage tests will be carried out successively, in which a tritium breeding blanket, a uranium-fueled blanket, and a spent-fuel-fueled blanket will be utilized, respectively. The main goal of the FDS-MFX blanket is to achieve a PDmax of about 100 MW/m³ with self-sustaining tritium (TBR ≥ 1.05) in the second-stage test with the uranium-fueled blanket, in order to check and validate demonstration-reactor blanket technologies based on viable fusion and fission technologies. Four different uranium fuels were considered in evaluating PDmax in the subcritical blanket: (i) natural uranium, (ii) 3.2% enriched uranium, (iii) 19.75% enriched uranium, and (iv) 64.4% enriched uranium carbide. The calculations and analyses were performed using the home-developed code VisualBUS and the Hybrid Evaluated Nuclear Data Library (HENDL). The results showed that the performance of the blanket loaded with 64.4% enriched uranium carbide was the most attractive: it could achieve tritium self-sufficiency (TBR ≥ 1.05) and a high maximum average power density (~100 MW/m³) when the blanket was loaded with about 1 ton of ²³⁵U. (authors)

  2. Alteration of lipid profile in subclinical hypothyroidism: a meta-analysis.

    PubMed

    Liu, Xiao-Li; He, Shan; Zhang, Shao-Fang; Wang, Jun; Sun, Xiu-Fa; Gong, Chun-Mei; Zheng, Shi-Jie; Zhou, Ji-Chang; Xu, Jian

    2014-08-14

Previous studies yielded controversial results about the alteration of lipid profiles in patients with subclinical hypothyroidism. We performed a meta-analysis to investigate the association between subclinical hypothyroidism and lipid profiles. We searched PubMed, the Cochrane Library, and the China National Knowledge Infrastructure for articles published from January 1990 through January 2014. Dissertation databases (PQDT and CDMD) were searched for additional unpublished articles. We included articles reporting the relationship between subclinical hypothyroidism and at least one parameter of the lipid profile, and calculated the overall weighted mean difference (WMD) with a random-effects model. Meta-regression was used to explore the sources of heterogeneity among studies, and the Egger test, Begg test, and the trim-and-fill method were used to assess potential publication bias. Sixteen observational studies were included in our analysis. Meta-analysis suggested that serum total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C), and total triglyceride levels were significantly increased in patients with subclinical hypothyroidism compared with euthyroid individuals; the WMDs were 12.17 mg/dl, 7.01 mg/dl, and 13.19 mg/dl, respectively (P<0.001 for all). No significant difference was observed for serum high-density lipoprotein cholesterol (HDL-C). Matching strategy was the main source of heterogeneity among studies in the TC and LDL-C analyses. Potential publication bias was found in the TC and LDL-C analyses by the Egger or Begg test but was not confirmed by the trim-and-fill method. Subclinical hypothyroidism may correlate with an altered lipid profile. Previous studies had limitations in the control of potential confounding factors, and further studies should consider those factors.
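The pooled WMD under a random-effects model is commonly computed with the DerSimonian-Laird estimator. The sketch below shows that calculation on invented per-study data; the effects, variances, and the choice of estimator are illustrative assumptions, not the studies or software of this meta-analysis:

```python
# DerSimonian-Laird random-effects pooling of per-study weighted mean
# differences (WMDs).  The three effects/variances below are invented.

def dersimonian_laird(effects, variances):
    w = [1.0 / v for v in variances]                             # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw        # fixed-effect pooled mean
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)                # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]                 # random-effects weights
    return sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)

wmd = dersimonian_laird([10.0, 14.5, 11.8], [4.0, 6.5, 5.0])  # mg/dl, invented
print(f"pooled WMD = {wmd:.2f} mg/dl")
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, the between-study variance estimate is zero and the random-effects result collapses to the fixed-effect pooled mean, as it does for this toy data.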

  3. MEASURING PROTOPLANETARY DISK GAS SURFACE DENSITY PROFILES WITH ALMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Jonathan P.; McPartland, Conor, E-mail: jpw@ifa.hawaii.edu

    2016-10-10

The gas and dust are spatially segregated in protoplanetary disks due to the vertical settling and radial drift of large grains. A fuller accounting of the mass content and distribution in disks therefore requires spectral line observations. We extend the modeling approach presented in Williams and Best to show that gas surface density profiles can be measured from high-fidelity ¹³CO integrated intensity images. We demonstrate the methodology by fitting ALMA observations of the HD 163296 disk to determine a gas mass M_gas = 0.048 M_⊙, an accretion disk characteristic size R_c = 213 au, and gradient γ = 0.39. The same parameters match the C¹⁸O 2–1 image and indicate an abundance ratio [¹²CO]/[C¹⁸O] of 700, independent of radius. To test how well this methodology can be applied to future line surveys of smaller, lower-mass T Tauri disks, we create a large ¹³CO 2–1 image library and fit simulated data. For disks with gas masses of 3–10 M_Jup at 150 pc, ALMA observations with a resolution of 0.″2–0.″3 and integration times of ∼20 minutes allow reliable estimates of R_c to within about 10 au and γ to within about 0.2. Economic gas imaging surveys are therefore feasible and offer the opportunity to open up a new dimension for studying disk structure and its evolution toward planet formation.
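The characteristic size R_c and gradient γ quoted in the abstract are the parameters of the widely used self-similar tapered power-law surface density profile. As a hedged sketch (this exact functional form is an assumption here, and the normalization follows from the analytic integral 2π∫Σ R dR = M_gas):

```python
import math

# Self-similar disk profile (assumed form, common in this modelling):
#   Sigma(R) = Sigma_c * (R/Rc)^(-gamma) * exp(-(R/Rc)^(2-gamma)),
# with Sigma_c = (2-gamma) * M_gas / (2*pi*Rc^2) so the profile
# integrates to the total gas mass.

AU_CM = 1.496e13    # centimetres per au
MSUN_G = 1.989e33   # grams per solar mass

def sigma_profile(r_au, m_gas_msun, r_c_au, gamma):
    """Gas surface density in g/cm^2 at radius r_au (au)."""
    sigma_c = (2 - gamma) * m_gas_msun * MSUN_G / (
        2 * math.pi * (r_c_au * AU_CM) ** 2)
    x = r_au / r_c_au
    return sigma_c * x ** (-gamma) * math.exp(-x ** (2 - gamma))

# best-fit values quoted in the abstract for the HD 163296 disk
for r in (10, 100, 213):
    print(f"Sigma({r:3d} au) = {sigma_profile(r, 0.048, 213.0, 0.39):.3f} g/cm^2")
```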

  4. Density perturbation in the models reconstructed from jerk parameter

    NASA Astrophysics Data System (ADS)

    Sinha, Srijita; Banerjee, Narayan

    2018-06-01

    The present work deals with the late time evolution of the linear density contrast in the dark energy models reconstructed from the jerk parameter. It is found that the non-interacting models are favoured compared to the models where an interaction is allowed in the dark sector.

  5. Free-Spinning Wind-Tunnel Tests of a Low-Wing Monoplane with Systematic Changes in Wings and Tails V : Effect of Airplane Relative Density

    NASA Technical Reports Server (NTRS)

    Seidman, Oscar; Neihouse, A I

    1940-01-01

    The reported tests are a continuation of an NACA investigation being made in the free-spinning wind tunnel to determine the effects of independent variations in load distribution, wing and tail arrangement, and control disposition on the spin characteristics of airplanes. The standard series of tests was repeated to determine the effect of airplane relative density. Tests were made at values of the relative-density parameter of 6.8, 8.4 (basic), and 12.0; and the results were analyzed. The tested variations in the relative-density parameter may be considered either as variations in the wing loading of an airplane spun at a given altitude, with the radii of gyration kept constant, or as a variation of the altitude at which the spin takes place for a given airplane. The lower values of the relative-density parameter correspond to the lower wing loadings or to the lower altitudes of the spin.

  6. Tunable non-interacting free-energy functionals: development and applications to low-density aluminum

    NASA Astrophysics Data System (ADS)

    Trickey, Samuel; Karasiev, Valentin

We introduce the concept of tunable orbital-free non-interacting free-energy density functionals and present a generalized gradient approximation (GGA) with a subset of parameters defined from constraints and a few free parameters. Those free parameters are tuned to reproduce reference Kohn-Sham (KS) static-lattice pressures for Al at T=8 kK for bulk densities between 0.6 and 2 g/cm3. The tuned functional then is used in OF molecular dynamics (MD) simulations for Al with densities between 0.1 and 2 g/cm3 and T between 6 and 50 kK to calculate the equation of state and generate configurations for electrical conductivity calculations. The tunable functional produces accurate results. Computationally it is very effective, especially at elevated temperature. Kohn-Sham calculations for such low densities are affordable only up to T=10 kK, while other OF approximations, including two-point functionals, fail badly in that regime. Work supported by US DoE Grant DE-SC0002139.

  7. GPU Linear Algebra Libraries and GPGPU Programming for Accelerating MOPAC Semiempirical Quantum Chemistry Calculations.

    PubMed

    Maia, Julio Daniel Carvalho; Urquiza Carvalho, Gabriel Aires; Mangueira, Carlos Peixoto; Santana, Sidney Ramos; Cabral, Lucidio Anjos Formiga; Rocha, Gerd B

    2012-09-11

    In this study, we present some modifications in the semiempirical quantum chemistry MOPAC2009 code that accelerate single-point energy calculations (1SCF) of medium-size (up to 2500 atoms) molecular systems using GPU coprocessors and multithreaded shared-memory CPUs. Our modifications consisted of using a combination of highly optimized linear algebra libraries for both CPU (LAPACK and BLAS from Intel MKL) and GPU (MAGMA and CUBLAS) to hasten time-consuming parts of MOPAC such as the pseudodiagonalization, full diagonalization, and density matrix assembling. We have shown that it is possible to obtain large speedups just by using CPU serial linear algebra libraries in the MOPAC code. As a special case, we show a speedup of up to 14 times for a methanol simulation box containing 2400 atoms and 4800 basis functions, with even greater gains in performance when using multithreaded CPUs (2.1 times in relation to the single-threaded CPU code using linear algebra libraries) and GPUs (3.8 times). This degree of acceleration opens new perspectives for modeling larger structures which appear in inorganic chemistry (such as zeolites and MOFs), biochemistry (such as polysaccharides, small proteins, and DNA fragments), and materials science (such as nanotubes and fullerenes). In addition, we believe that this parallel (GPU-GPU) MOPAC code will make it feasible to use semiempirical methods in lengthy molecular simulations using both hybrid QM/MM and QM/QM potentials.

  8. High precision series solutions of differential equations: Ordinary and regular singular points of second order ODEs

    NASA Astrophysics Data System (ADS)

    Noreen, Amna; Olaussen, Kåre

    2012-10-01

A subroutine for a very-high-precision numerical solution of a class of ordinary differential equations is provided. For a given evaluation point and equation parameters the memory requirement scales linearly with the precision P, and the number of algebraic operations scales roughly linearly with P when P becomes sufficiently large. We discuss results from extensive tests of the code, and how one, for a given evaluation point and equation parameters, may estimate precision loss and computing time in advance. Program summary: Program title: seriesSolveOde1. Catalogue identifier: AEMW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 991. No. of bytes in distributed program, including test data, etc.: 488116. Distribution format: tar.gz. Programming language: C++. Computer: PCs or higher-performance computers. Operating system: Linux and MacOS. RAM: few to many megabytes (problem dependent). Classification: 2.7, 4.3. External routines: CLN — Class Library for Numbers [1] built with the GNU MP library [2], and GSL — GNU Scientific Library [3] (only for time measurements). Nature of problem: The differential equation
−s²( d²/dz² + ((1 − ν₊ − ν₋)/z) d/dz + ν₊ν₋/z² )ψ(z) + (1/z) Σ_{n=0..N} vₙ zⁿ ψ(z) = 0
is solved numerically to very high precision. The evaluation point z and some or all of the equation parameters may be complex numbers; some or all of them may be represented exactly in terms of rational numbers. Solution method: The solution ψ(z), and optionally ψ′(z), is evaluated at the point z by executing the recursion
A_{m+1}(z) = [ s⁻² / ((m + 1 + ν − ν₊)(m + 1 + ν − ν₋)) ] Σ_{n=0..N} Vₙ(z) A_{m−n}(z),
ψ(z) ← ψ(z) + A_{m+1}(z),
to sufficiently large m. Here ν is either ν₊ or ν₋, and Vₙ(z) = vₙ z. The recursion is initialized by A₋ₙ(z) = δ_{n0} z^ν for n = 0, 1, …, N, and ψ(z) = A₀(z).
Restrictions: No solution is computed if z = 0, or s = 0, or if ν = ν₋ (assuming Re ν₊ ≥ Re ν₋) with ν₊ − ν₋ an integer, except when ν₊ − ν₋ = 1 and v₀ = 0 (i.e. when z = 0 is an ordinary point of the equation for zψ(z)). Additional comments: The code of the main algorithm is in the file seriesSolveOde1.cc, which "#include"s the file checkForBreakOde1.cc. Programs using these routines must "#include" the file seriesSolveOde1.cc. Running time: On a Linux PC that is a few years old, evaluating the ground-state wavefunction of the anharmonic oscillator at y=√10 (with the eigenvalue known in advance; cf. Eq. (6)) takes about 2 ms to an accuracy of P=200 decimal digits, and about 40 min at an accuracy of P=100000 decimal digits. References: [1] B. Haible and R.B. Kreckel, CLN — Class Library for Numbers, http://www.ginac.de/CLN/ [2] T. Granlund and collaborators, GMP — The GNU Multiple Precision Arithmetic Library, http://gmplib.org/ [3] M. Galassi et al., GNU Scientific Library Reference Manual (3rd Ed.), ISBN 0954612078, http://www.gnu.org/software/gsl/
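The series-recursion idea can be illustrated in miniature. The sketch below is an assumption-laden analogue, not the distributed C++ code: it applies the same term-by-term recursion strategy, in exact rational arithmetic, to the Airy-type equation ψ'' = zψ at the ordinary point z = 0, where the Taylor coefficients obey a_{m+3} = a_m / ((m+3)(m+2)):

```python
from fractions import Fraction

# Miniature analogue (not seriesSolveOde1): series solution of
# psi'' = z * psi with psi(0) = 1, psi'(0) = 0.  Exact rational
# arithmetic here plays the role the CLN bignum library plays in the
# C++ program: terms are accumulated without rounding error.

def airy_series(z, terms=60):
    """Sum the Taylor series sum_m a_m z^m through m <= terms."""
    z = Fraction(z)
    psi = Fraction(1)    # a_0 * z^0
    coeff = Fraction(1)  # current nonzero coefficient a_m
    power = Fraction(1)  # current power z^m
    m = 0
    while m < terms:
        coeff /= (m + 3) * (m + 2)  # a_{m+3} = a_m / ((m+3)(m+2))
        power *= z ** 3
        m += 3
        psi += coeff * power
    return psi

print(float(airy_series(1)))  # exact rational sum, converted only at the end
```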

  9. LibKiSAO: a Java library for Querying KiSAO.

    PubMed

    Zhukova, Anna; Adams, Richard; Laibe, Camille; Le Novère, Nicolas

    2012-09-24

The Kinetic Simulation Algorithm Ontology (KiSAO) supplies information about existing algorithms available for the simulation of Systems Biology models, their characteristics, parameters and inter-relationships. KiSAO enables the unambiguous identification of algorithms from simulation descriptions. Information about analogous methods having similar characteristics and about algorithm parameters incorporated into KiSAO is desirable for simulation tools. To retrieve this information programmatically, an application programming interface (API) for KiSAO is needed. We developed libKiSAO, a Java library that enables querying of the Kinetic Simulation Algorithm Ontology. It implements methods to retrieve information about simulation algorithms stored in KiSAO, their characteristics and parameters, and methods to query the algorithm hierarchy and search for similar algorithms providing comparable results for the same simulation set-up. Using libKiSAO, simulation tools can make logical inferences based on this knowledge and choose the most appropriate algorithm to perform a simulation. LibKiSAO also enables simulation tools to handle a wider range of simulation descriptions by determining which of the available methods are similar and can be used instead of the one indicated in the simulation description if that one is not implemented. LibKiSAO enables Java applications to easily access information about simulation algorithms, their characteristics and parameters stored in the OWL-encoded Kinetic Simulation Algorithm Ontology. LibKiSAO can be used by simulation description editors and simulation tools to improve the reproducibility of computational simulation tasks and facilitate model re-use.

  10. Pair density waves in superconducting vortex halos

    NASA Astrophysics Data System (ADS)

    Wang, Yuxuan; Edkins, Stephen D.; Hamidian, Mohammad H.; Davis, J. C. Séamus; Fradkin, Eduardo; Kivelson, Steven A.

    2018-05-01

We analyze the interplay between a uniform d-wave superconducting order parameter and a pair-density-wave (PDW) order parameter in the neighborhood of a vortex. We develop a phenomenological nonlinear sigma model, solve the saddle-point equation for the order-parameter configuration, and compute the resulting local density of states in the vortex halo. The intertwining of the two superconducting orders leads to a charge density modulation with the same periodicity as the PDW, which is twice the period of the charge density wave that arises as a second harmonic of the PDW itself. We discuss key features of the charge density modulation that can be directly compared with recent results from scanning tunneling microscopy and speculate on the role PDW order may play in the global phase diagram of the hole-doped cuprates.

  11. Mass spectrometric screening of ligands with lower off-rate from a clicked-based pooled library.

    PubMed

    Arai, Satoshi; Hirosawa, Shota; Oguchi, Yusuke; Suzuki, Madoka; Murata, Atsushi; Ishiwata, Shin'ichi; Takeoka, Shinji

    2012-08-13

    This paper describes a convenient screening method using ion trap electrospray ionization mass spectrometry to classify ligands to a target molecule in terms of kinetic parameters. We demonstrate this method in the screening of ligands to a hexahistidine tag from a pooled library synthesized by click chemistry. The ion trap mass spectrometry analysis revealed that higher stabilities of ligand-target complexes in the gas phase were related to lower dissociation rate constants, i.e., off-rates in solution. Finally, we prepared a fluorescent probe utilizing the ligand with lowest off-rate and succeeded in performing single molecule observations of hexahistidine-tagged myosin V walking on actin filaments.

  12. An improved model to estimate trapping parameters in polymeric materials and its application on normal and aged low-density polyethylenes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ning, E-mail: nl4g12@soton.ac.uk; He, Miao; Alghamdi, Hisham

    2015-08-14

Trapping parameters can be considered among the important attributes describing polymeric materials. In the present paper, a more accurate charge dynamics model has been developed, which takes charge dynamics in both the voltage-on and voltage-off stages into account in the simulation. By fitting the measured charge data with the highest R-square value, the trapping parameters and the injection barriers of both normal and aged low-density polyethylene samples were estimated using the improved model. The results show that, after a long-term ageing process, the injection barriers for both electrons and holes are lowered, the overall trap depth is shallower, and the trap density becomes much greater. Additionally, the parameters for electrons are more sensitive to ageing than those for holes.

  13. Combinatorial investigation of Fe–B thin-film nanocomposites

    PubMed Central

    Brunken, Hayo; Grochla, Dario; Savan, Alan; Kieschnick, Michael; Meijer, Jan D; Ludwig, Alfred

    2011-01-01

    Combinatorial magnetron sputter deposition from elemental targets was used to create Fe–B composition spread type thin film materials libraries on thermally oxidized 4-in. Si wafers. The materials libraries consisting of wedge-type multilayer thin films were annealed at 500 or 700 °C to transform the multilayers into multiphase alloys. The libraries were characterized by nuclear reaction analysis, Rutherford backscattering, nanoindentation, vibrating sample magnetometry, x-ray diffraction (XRD) and transmission electron microscopy (TEM). Young's modulus and hardness values were related to the annealing parameters, structure and composition of the films. The magnetic properties of the films were improved by annealing in a H2 atmosphere, showing a more than tenfold decrease in the coercive field values in comparison to those of the vacuum-annealed films. The hardness values increased from 8 to 18 GPa when the annealing temperature was increased from 500 to 700 °C. The appearance of Fe2B phases, as revealed by XRD and TEM, had a significant effect on the mechanical properties of the films. PMID:27877435

  14. Genome-scale CRISPR-Cas9 Knockout and Transcriptional Activation Screening

    PubMed Central

    Joung, Julia; Konermann, Silvana; Gootenberg, Jonathan S.; Abudayyeh, Omar O.; Platt, Randall J.; Brigham, Mark D.; Sanjana, Neville E.; Zhang, Feng

    2017-01-01

    Forward genetic screens are powerful tools for the unbiased discovery and functional characterization of specific genetic elements associated with a phenotype of interest. Recently, the RNA-guided endonuclease Cas9 from the microbial CRISPR (clustered regularly interspaced short palindromic repeats) immune system has been adapted for genome-scale screening by combining Cas9 with pooled guide RNA libraries. Here we describe a protocol for genome-scale knockout and transcriptional activation screening using the CRISPR-Cas9 system. Custom- or ready-made guide RNA libraries are constructed and packaged into lentiviral vectors for delivery into cells for screening. As each screen is unique, we provide guidelines for determining screening parameters and maintaining sufficient coverage. To validate candidate genes identified from the screen, we further describe strategies for confirming the screening phenotype as well as genetic perturbation through analysis of indel rate and transcriptional activation. Beginning with library design, a genome-scale screen can be completed in 9–15 weeks followed by 4–5 weeks of validation. PMID:28333914
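    The guideline above about "maintaining sufficient coverage" comes down to a simple calculation. As a hedged illustration (the helper name and the example numbers below are hypothetical, not taken from the protocol), the number of cells to transduce is commonly estimated from the library size, the desired per-guide coverage, and the multiplicity of infection (MOI):

```python
def cells_to_transduce(n_guides, coverage, moi):
    """Estimate how many cells to infect so each guide is represented
    ~`coverage` times among transduced cells.

    n_guides: number of distinct guide RNAs in the library
    coverage: desired average number of cells per guide (e.g. 300-1000x)
    moi:      multiplicity of infection (fraction of cells transduced)
    """
    return int(n_guides * coverage / moi)

# e.g. a hypothetical 70,000-guide genome-scale library at 500x coverage, MOI 0.3
print(cells_to_transduce(70_000, 500, 0.3))  # ~117 million cells
```

    The same coverage target also sets the minimum number of cells to carry through each passage of the screen, which is why coverage is decided before library delivery.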

  15. Validation of tungsten cross sections in the neutron energy region up to 100 keV

    NASA Astrophysics Data System (ADS)

    Pigni, Marco T.; Žerovnik, Gašper; Leal, Luiz C.; Trkov, Andrej

    2017-09-01

    Following a series of recent cross section evaluations on tungsten isotopes performed at Oak Ridge National Laboratory (ORNL), this paper presents the validation work carried out to test the performance of the evaluated cross sections based on lead-slowing-down (LSD) benchmarks conducted in Grenoble. ORNL completed the resonance parameter evaluation of four tungsten isotopes - 182,183,184,186W - in August 2014 and submitted it as an ENDF-compatible file to be part of the next release of the ENDF/B-VIII.0 nuclear data library. The evaluations were performed with support from the US Nuclear Criticality Safety Program in an effort to provide improved tungsten cross section and covariance data for criticality safety sensitivity analyses. The validation analysis based on the LSD benchmarks showed improved agreement with the experimental response when the ORNL tungsten evaluations were included in the ENDF/B-VII.1 library. Comparisons with the results obtained with the JEFF-3.2 nuclear data library are also discussed.

  16. Genome-scale CRISPR-Cas9 knockout and transcriptional activation screening.

    PubMed

    Joung, Julia; Konermann, Silvana; Gootenberg, Jonathan S; Abudayyeh, Omar O; Platt, Randall J; Brigham, Mark D; Sanjana, Neville E; Zhang, Feng

    2017-04-01

    Forward genetic screens are powerful tools for the unbiased discovery and functional characterization of specific genetic elements associated with a phenotype of interest. Recently, the RNA-guided endonuclease Cas9 from the microbial CRISPR (clustered regularly interspaced short palindromic repeats) immune system has been adapted for genome-scale screening by combining Cas9 with pooled guide RNA libraries. Here we describe a protocol for genome-scale knockout and transcriptional activation screening using the CRISPR-Cas9 system. Custom- or ready-made guide RNA libraries are constructed and packaged into lentiviral vectors for delivery into cells for screening. As each screen is unique, we provide guidelines for determining screening parameters and maintaining sufficient coverage. To validate candidate genes identified by the screen, we further describe strategies for confirming the screening phenotype, as well as genetic perturbation, through analysis of indel rate and transcriptional activation. Beginning with library design, a genome-scale screen can be completed in 9-15 weeks, followed by 4-5 weeks of validation.

  17. GuiTope: an application for mapping random-sequence peptides to protein sequences.

    PubMed

    Halperin, Rebecca F; Stafford, Phillip; Emery, Jack S; Navalkar, Krupa Arun; Johnston, Stephen Albert

    2012-01-03

    Random-sequence peptide libraries are a commonly used tool to identify novel ligands for binding antibodies, other proteins, and small molecules. It is often of interest to compare the selected peptide sequences to the natural protein binding partners to infer the exact binding site or the importance of particular residues. The ability to search a set of sequences for similarity to a set of peptides may sometimes enable the prediction of an antibody epitope or a novel binding partner. We have developed a software application designed specifically for this task. GuiTope provides a graphical user interface for aligning peptide sequences to protein sequences. All alignment parameters are accessible to the user including the ability to specify the amino acid frequency in the peptide library; these frequencies often differ significantly from those assumed by popular alignment programs. It also includes a novel feature to align di-peptide inversions, which we have found improves the accuracy of antibody epitope prediction from peptide microarray data and shows utility in analyzing phage display datasets. Finally, GuiTope can randomly select peptides from a given library to estimate a null distribution of scores and calculate statistical significance. GuiTope provides a convenient method for comparing selected peptide sequences to protein sequences, including flexible alignment parameters, novel alignment features, ability to search a database, and statistical significance of results. The software is available as an executable (for PC) at http://www.immunosignature.com/software and ongoing updates and source code will be available at sourceforge.net.
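    The key point above, that alignment scoring should reflect the amino acid frequencies of the peptide library rather than natural-protein background frequencies, can be sketched with a toy ungapped aligner. This is not GuiTope's algorithm; the scoring scheme, function name, and frequencies below are illustrative assumptions:

```python
import math

def best_match(peptide, protein, background):
    """Slide the peptide along the protein and score ungapped windows.
    Matches are rewarded by -log(background frequency), so matching a
    residue that is rare in the library counts for more; mismatches
    get a flat penalty. Returns (best score, start index)."""
    best = (float("-inf"), -1)
    for start in range(len(protein) - len(peptide) + 1):
        score = 0.0
        for p, q in zip(peptide, protein[start:start + len(peptide)]):
            if p == q:
                score += -math.log(background.get(p, 0.05))
            else:
                score -= 1.0
        best = max(best, (score, start))
    return best

# hypothetical skewed library composition (only a few residues shown)
bg = {"A": 0.2, "G": 0.2, "W": 0.01, "K": 0.05}
print(best_match("GWK", "AAGWKAA", bg))  # best window starts at index 2
```

    Under this scheme the tryptophan match dominates the score precisely because W is assumed rare in the library, which is the intuition behind making library composition a user-settable parameter.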

  18. Autosophy information theory provides lossless data and video compression based on the data content

    NASA Astrophysics Data System (ADS)

    Holtz, Klaus E.; Holtz, Eric S.; Holtz, Diana

    1996-09-01

    A new autosophy information theory provides an alternative to the classical Shannon information theory. Using the new theory in communication networks provides both a high degree of lossless compression and virtually unbreakable encryption codes for network security. The bandwidth in a conventional Shannon communication is determined only by the data volume and the hardware parameters, such as image size, resolution, or frame rate in television. The data content, or what is shown on the screen, is irrelevant. In contrast, the bandwidth in autosophy communication is determined only by the data content, such as novelty and movement in television images; it is the data volume and hardware parameters that become irrelevant. Basically, the new communication methods use prior 'knowledge' of the data, stored in a library, to encode subsequent transmissions. The more 'knowledge' stored in the libraries, the higher the potential compression ratio. 'Information' is redefined as that which is not already known by the receiver. Everything already known is redundant and need not be re-transmitted. In a perfect communication each transmission code, called a 'tip,' creates a new 'engram' of knowledge in the library, so that each tip transmission can represent any amount of data. Autosophy theories provide six separate learning modes, or omni-dimensional networks, all of which can be used for data compression. The new information theory reveals the theoretical flaws of other data compression methods, including the Huffman, Ziv-Lempel, and LZW codes, and commercial compression standards such as V.42bis and MPEG-2.
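    The library-based encoding idea above, where codes refer to 'knowledge' the receiver already shares and the dictionary grows with each transmission, is mechanically similar to the classical dictionary coders the authors criticize. A textbook LZW sketch (not the autosophy algorithm itself) illustrates the mechanism:

```python
def lzw_compress(data):
    """Textbook LZW: the dictionary plays the role of a growing 'library';
    each emitted code stands for a phrase the receiver already knows."""
    # start with single-character 'knowledge' shared by sender and receiver
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    phrase, out = "", []
    for ch in data:
        if phrase + ch in dictionary:
            phrase += ch                         # extend the known phrase
        else:
            out.append(dictionary[phrase])       # emit code for known prefix
            dictionary[phrase + ch] = next_code  # learn a new phrase
            next_code += 1
            phrase = ch
    if phrase:
        out.append(dictionary[phrase])
    return out

print(lzw_compress("abababab"))  # 8 input symbols become 5 output codes
```

    As with the 'tip'/'engram' scheme described above, later codes can stand for arbitrarily long learned phrases, so the compression ratio grows with the accumulated dictionary.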

  19. The association between lipid parameters and obesity in university students.

    PubMed

    Hertelyova, Z; Salaj, R; Chmelarova, A; Dombrovsky, P; Dvorakova, M C; Kruzliak, P

    2016-07-01

    Abdominal obesity is associated with high plasma triglyceride and low plasma high-density lipoprotein cholesterol levels. The objective of the study was to find an association between plasma lipid and lipoprotein levels and anthropometric parameters of abdominal obesity in Slovakian university students. The lipid profile and anthropometric parameters of obesity were studied in a sample of 419 probands, including 137 men and 282 women. Males had higher values of non-high-density lipoprotein cholesterol (non-HDL-C), low-density lipoprotein cholesterol (LDL-C), triglycerides (TG) and very low-density lipoprotein cholesterol (VLDL-C) than females, but these differences were not significant. Females had significantly higher TC (P < 0.05) and HDL-C (P < 0.001) than males. In comparison, all anthropometric parameters in the males were significantly (P < 0.001) higher than in the females. A positive correlation between non-HDL-C, TG, VLDL-C and anthropometric parameters (BMI, WC, WHR, WHtR) was found at P < 0.001. LDL was positively correlated with BMI, WCF and WHtR, and TC with BMI and WHtR, at P < 0.001. We also observed correlations between TC-WCF and LDL-WHR at P < 0.01. A negative correlation was found between HDL and all monitored anthropometric parameters at P < 0.001. On the other hand, no correlation between TC and WHR was detected. This study shows an association between plasma lipid and lipoprotein levels and anthropometric parameters of abdominal obesity in young people, predominantly university students.

  20. The Shannon entropy information for mixed Manning Rosen potential in D-dimensional Schrodinger equation

    NASA Astrophysics Data System (ADS)

    Suparmi, A.; Cari, C.; Nur Pratiwi, Beta; Arya Nugraha, Dewanta

    2017-01-01

    The D-dimensional Schrodinger equation for the mixed Manning-Rosen potential was investigated using supersymmetric quantum mechanics. We obtained the energy eigenvalues from the radial solution and the wavefunctions from the radial and angular solutions. From the lowest radial wavefunctions, we evaluated the Shannon information entropy using Matlab software. The graphed entropy densities show that the position-space information entropy density shifts to the right as the potential parameter q increases, and to the left as the parameter α increases. The momentum-space information entropy densities were also plotted; their amplitude increases with increasing q and α.

  1. Evaluation of stream water quality in Atlanta, Georgia, and the surrounding region (USA)

    USGS Publications Warehouse

    Peters, N.E.; Kandell, S.J.

    1999-01-01

    A water-quality index (WQI) was developed from historical data (1986-1995) for streams in the Atlanta Region and augmented with 'new' and generally more comprehensive biweekly data on four small urban streams, representing an industrial area, a developed medium-density residential area, and developing and developed low-density residential areas. Parameter WQIs were derived from percentile ranks of individual water-quality parameter values for each site by normalizing the constituent ranks against values from all sites in the area for a base period (1990-1995). WQIs were developed primarily for nutrient-related parameters because of data availability. Site WQIs, computed by averaging the parameter WQIs, range from 0.2 (good quality) to 0.8 (poor quality) and increased downstream of known nutrient sources. Annual site WQIs also decreased from 1986 to 1995 at most long-term monitoring sites. Annual site WQIs for individual parameters correlated with annual hydrological characteristics, particularly runoff, precipitation quantity, and water yield, reflecting the effect of dilution on parameter values. The WQIs of the four small urban streams were evaluated for the core nutrient-related parameters, parameters for specific dissolved trace metal concentrations and sediment characteristics, and a species diversity index for the macro-invertebrate taxa. The site WQI for the core nutrient-related parameters used in the retrospective analysis was, as expected, worst for the industrial area and best for the low-density residential areas. However, macro-invertebrate data indicate that although the species at the medium-density residential site were diverse, the taxa present were species tolerant of degraded water quality.
Furthermore, although a species-diversity index indicates no substantial difference between the two low-density residential areas, the number of macro-invertebrates in the developing area was much lower than in the developed area, consistent with observations of recent sediment problems probably associated with construction in the basin. However, sediment parameters were similar for the two sites, suggesting that the routine biweekly measurements may not capture the short-term increases in sediment transport associated with rainstorms. The WQI technique is limited by the number and types of parameters included in it, by the general conditions of those parameters over the range of conditions in area streams, and by the effects of external factors such as hydrology, and therefore should be used with caution.
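    The percentile-rank normalization and averaging described above can be sketched as follows. This is an illustrative reading of the method, not the authors' code, and the parameter names and concentrations are invented:

```python
def percentile_rank(value, base_values):
    """Fraction of base-period values at or below `value`; for
    concentration-type parameters, a higher rank means poorer quality."""
    return sum(1 for v in base_values if v <= value) / len(base_values)

def site_wqi(parameter_values, base_data):
    """Average per-parameter percentile ranks into one site index
    (0.2 ~ good quality, 0.8 ~ poor, as in the study's scale)."""
    ranks = [percentile_rank(parameter_values[p], base_data[p])
             for p in parameter_values]
    return sum(ranks) / len(ranks)

# hypothetical nutrient concentrations (mg/L): one site vs. the base period
base = {"nitrate": [0.2, 0.5, 0.8, 1.1, 1.5],
        "phosphate": [0.01, 0.03, 0.05, 0.08, 0.1]}
site = {"nitrate": 1.1, "phosphate": 0.05}
print(round(site_wqi(site, base), 2))  # 0.7, i.e. toward the poor end
```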

  2. High throughput screening using acoustic droplet ejection to combine protein crystals and chemical libraries on crystallization plates at high density

    DOE PAGES

    Teplitsky, Ella; Joshi, Karan; Ericson, Daniel L.; ...

    2015-07-01

    We describe a high throughput method for screening up to 1728 distinct chemicals with protein crystals on a single microplate. Acoustic droplet ejection (ADE) was used to co-position 2.5 nL of protein, precipitant, and chemicals on a MiTeGen in situ-1 crystallization plate™ for screening by co-crystallization or soaking. ADE-transferred droplets follow a precise trajectory which allows all components to be transferred through small apertures in the microplate lid. The apertures were large enough for 2.5 nL droplets to pass through them, but small enough that they did not disrupt the internal environment created by the mother liquor. Using this system, thermolysin and trypsin crystals were efficiently screened for binding to a heavy-metal mini-library. Fluorescence and X-ray diffraction were used to confirm that each chemical in the heavy-metal library was correctly paired with the intended protein crystal. A fragment mini-library was also screened to observe two known lysozyme ligands using both co-crystallization and soaking, and a similar approach was used to identify multiple novel thaumatin binding sites for ascorbic acid. This technology pushes towards a faster, automated, and more flexible strategy for high throughput screening of chemical libraries (such as fragment libraries) using as little as 2.5 nL of each component.

  3. The Relationship between Ionospheric Slab Thickness and the Peak Density Height, hmF2

    NASA Astrophysics Data System (ADS)

    Meehan, J.; Sojka, J. J.

    2017-12-01

    The electron density profile is one of the most critical elements in ionospheric modelling applications today. The ionospheric parameters hmF2, the height of the peak density layer, and slab thickness, the ratio of the total electron content (TEC) to the peak density value (NmF2), are generally obtained from any global sounding observation network and are easily incorporated into theoretical or empirical models as numerical representations. Slab thickness is a convenient one-parameter summary of the electron density profile and relates a variety of elements that affect the overall profile shape, such as the neutral and ionospheric temperatures and gradients, the ionospheric composition, and dynamics. Using data from the 2002 Millstone Hill incoherent scatter radar (ISR) campaign, we found, for the first time, slab thickness to be correlated with hmF2. From this, we introduce a new ionospheric index, k, which relates electron density parameters and can be a very useful tool for describing the shape of the topside ionosphere. Our study is an initial one-location, one-season, 30-day study, and future work is needed to verify the robustness of our claim. Generally, describing the ionospheric profile shape requires knowledge of several ionospheric parameters - electron, ion and neutral temperatures, ion composition, electric fields, and neutral winds - and depends on season, local time, location, and the level of solar and geomagnetic activity; with this new index, however, only readily available ionospheric density information is needed. Such information, as used in this study, is obtained from a bottomside electron density profile provided by an ionosonde and TEC data provided by a local, collocated GPS receiver.
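    Slab thickness as defined above is simply TEC divided by NmF2 and has units of length. A minimal sketch with typical mid-latitude magnitudes (the example values are illustrative, not from the study):

```python
TECU = 1e16  # 1 TEC unit = 1e16 electrons per m^2

def slab_thickness_km(tec_tecu, nmf2_per_m3):
    """Ionospheric slab thickness tau = TEC / NmF2, returned in km."""
    return (tec_tecu * TECU / nmf2_per_m3) / 1000.0

# e.g. a TEC of 20 TECU over a peak density of 1e12 electrons/m^3
print(slab_thickness_km(20.0, 1e12))   # 200.0 km
```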

  4. Micro-computed tomography assessment of human alveolar bone: bone density and three-dimensional micro-architecture.

    PubMed

    Kim, Yoon Jeong; Henkin, Jeffrey

    2015-04-01

    Micro-computed tomography (micro-CT) is a valuable means of evaluating and securing information related to bone density and quality in human necropsy samples and small live animals. The aim of this study was to assess the bone density of the alveolar jaw bones in human cadavers using micro-CT, and to evaluate the correlation between bone density and the three-dimensional micro-architecture of trabecular bone. Thirty-four human cadaver jaw bone specimens were harvested. Each specimen was scanned with micro-CT at a resolution of 10.5 μm. The bone volume fraction (BV/TV) and the bone mineral density (BMD) within a volume of interest were measured, and the three-dimensional micro-architecture of trabecular bone was assessed. All parameters in the maxilla and the mandible were compared. The variables for bone density and three-dimensional micro-architecture were analyzed for nonparametric correlation using Spearman's rho at a significance level of p < .05. A wide range of bone density was observed, with a significant difference between the maxilla and mandible. All micro-architecture parameters were consistently higher in the mandible, up to 3.3 times greater than those in the maxilla. The strongest linear correlation was observed between BV/TV and BMD, with Spearman's rho = 0.99 (p = .01). Both BV/TV and BMD were highly correlated with all micro-architecture parameters, with Spearman's rho above 0.74 (p = .01). The two measures of bone density obtained with micro-CT, BV/TV and BMD, are thus highly correlated with the three-dimensional micro-architecture parameters that represent the quality of trabecular bone. This noninvasive method may adequately enhance evaluation of the alveolar bone. © 2013 Wiley Periodicals, Inc.
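    The Spearman rank correlations reported above can be reproduced on any paired dataset. A self-contained sketch (the BV/TV and BMD values are hypothetical; for simplicity the implementation assumes no tied ranks, in which case the classic d² formula applies):

```python
def spearman_rho(x, y):
    """Spearman's rank correlation for samples without ties:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1))."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# hypothetical paired bone measurements, monotonically related
bvtv = [0.12, 0.18, 0.25, 0.31, 0.40]   # bone volume fraction
bmd  = [210.0, 250.0, 300.0, 330.0, 380.0]  # mineral density
print(spearman_rho(bvtv, bmd))   # 1.0: perfectly concordant ranks
```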

  5. Individual Colorimetric Observer Model

    PubMed Central

    Asano, Yuta; Fairchild, Mark D.; Blondé, Laurent

    2016-01-01

    This study proposes a vision model for individual colorimetric observers. The proposed model can be beneficial in many color-critical applications such as color grading and soft proofing to assess ranges of color matches instead of a single average match. We extended the CIE 2006 physiological observer by adding eight additional physiological parameters to model individual color-normal observers. These eight parameters control lens pigment density, macular pigment density, optical densities of L-, M-, and S-cone photopigments, and λmax shifts of L-, M-, and S-cone photopigments. By identifying the variability of each physiological parameter, the model can simulate color matching functions among color-normal populations using Monte Carlo simulation. The variabilities of the eight parameters were identified through two steps. In the first step, extensive reviews of past studies were performed for each of the eight physiological parameters. In the second step, the obtained variabilities were scaled to fit a color matching dataset. The model was validated using three different datasets: traditional color matching, applied color matching, and Rayleigh matches. PMID:26862905
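    The Monte Carlo step described above amounts to drawing a vector of deviations for the eight physiological parameters for each simulated observer. A minimal sketch of that sampling step only (the standard deviations below are placeholder values, not the variabilities the paper derives, and the full model would feed each sample into the CIE 2006 observer to produce color matching functions):

```python
import random

# Placeholder standard deviations for the eight physiological parameters
# (hypothetical values; the study fits its own variabilities to data).
PARAM_SD = {
    "lens_density": 0.18, "macular_density": 0.37,
    "od_L": 0.15, "od_M": 0.15, "od_S": 0.13,
    "shift_L_nm": 1.5, "shift_M_nm": 1.5, "shift_S_nm": 1.3,
}

def sample_observer(rng=random):
    """Draw one simulated color-normal observer: a Gaussian deviation
    from the average observer for each physiological parameter."""
    return {name: rng.gauss(0.0, sd) for name, sd in PARAM_SD.items()}

observers = [sample_observer() for _ in range(1000)]  # Monte Carlo population
```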

  6. Application of a high-energy-density permanent magnet material in underwater systems

    NASA Astrophysics Data System (ADS)

    Cho, C. P.; Egan, C.; Krol, W. P.

    1996-06-01

    This paper addresses the application of high-energy-density permanent magnet (PM) technology to (1) the brushless, axial-field PM motor and (2) the integrated electric motor/pump system for underwater applications. Finite-element analysis and lumped-parameter magnetic circuit analysis were used to calculate motor parameters and performance characteristics and to conduct tradeoff studies. Compact, efficient, reliable, and quiet underwater systems are attainable with the development of high-energy-density PM materials, power electronic devices, and power integrated-circuit technology.

  7. Validation of photosynthetic-fluorescence parameters as biomarkers for isoproturon toxic effect on alga Scenedesmus obliquus.

    PubMed

    Dewez, David; Didur, Olivier; Vincent-Héroux, Jonathan; Popovic, Radovan

    2008-01-01

    Photosynthetic-fluorescence parameters were investigated as potential biomarkers of toxicity when the alga Scenedesmus obliquus was exposed to isoproturon [3-(4-isopropylphenyl)-1,1-dimethylurea]. Chlorophyll fluorescence induction of algal cells treated with isoproturon showed inactivation of photosystem II (PSII) reaction centers and strong inhibition of PSII electron transport. A linear correlation was found (R² ≥ 0.861) between the change in cell density caused by isoproturon and the change in the effective PSII quantum yield (ΦM'), photochemical quenching (qP) and relative photochemical quenching (qP(rel)) values. Cell density was also linearly dependent (R² = 0.838) on the relative unquenched fluorescence parameter (UQF(rel)). A non-linear correlation was found (R² = 0.937) only between cell density and the energy transfer efficiency from absorbed light to the PSII reaction center (ABS/RC). The order of sensitivity determined by the EC-50% was: UQF(rel) > ΦM' > qP > qP(rel) > ABS/RC. The correlations between cell density and these photosynthetic-fluorescence parameters provide supporting evidence for their use as biomarkers of toxicity for environmental pollutants.

  8. High-Content Surface and Total Expression siRNA Kinase Library Screen with VX-809 Treatment Reveals Kinase Targets that Enhance F508del-CFTR Rescue.

    PubMed

    Perkins, Lydia A; Fisher, Gregory W; Naganbabu, Matharishwan; Schmidt, Brigitte F; Mun, Frederick; Bruchez, Marcel P

    2018-03-05

    The most promising F508del-CFTR corrector, VX-809, has been unsuccessful as an effective, stand-alone treatment for CF patients, but its rescue effect in combination with other drugs may confer an acceptable level of therapeutic benefit. Targeting cellular factors that modify trafficking may enhance the cell surface density of F508del-CFTR under VX-809 correction. Our goal was to identify druggable kinases that enhance F508del-CFTR rescue and stabilization at the cell surface beyond what is achievable with the VX-809 corrector alone. To achieve this goal, we implemented a new high-throughput screening paradigm that quickly and quantitatively measures surface density and total protein in the same cells, allowing rapid screening for increased surface targeting and proteostatic regulation. The assay utilizes fluorogen-activating-protein (FAP) technology with cell-excluded and cell-permeant fluorogenic dyes in a quick, wash-free fluorescent plate reader format on live cells to first measure F508del-CFTR expressed on the surface and then the total amount of F508del-CFTR protein present. To screen for kinase targets, we used Dharmacon's ON-TARGET plus SMARTpool siRNA kinase library (715 target kinases) with and without 10 μM VX-809 treatment, in triplicate, at 37 °C. We identified several targets whose siRNA knockdown showed a significant interaction with VX-809 treatment in enhancing surface density. Select small-molecule inhibitors of the kinase targets demonstrated augmented surface expression with VX-809 treatment.

  9. Seasonal and spatial variations in microbial activity at various phylogenetic resolutions at a groundwater - surface water interface.

    PubMed

    Yu, Ran; Smets, Barth F; Gan, Ping; MacKay, Allison A; Graf, Joerg

    2014-05-01

    We investigated the seasonal and spatial variation in activity and density of the metabolically active in situ microbial community (AIMC) at a landfill leachate-impacted groundwater - surface water interface (GSI). A series of AIMC traps were designed and implemented for AIMC sampling and for examinations of microbial activity and density. Measurements were made not only at the level of the bacterial domain but also at the levels of the alphaproteobacterial order Rhizobiales and the gammaproteobacterial genus Pseudomonas, both of which include a large number of iron-oxidizing bacteria, as revealed by previous analysis. Consistently higher microbial activities with less variation in depth were measured in the AIMC traps than in the ambient sediments. Flood disturbance appeared to control AIMC activity distributions at the gradually elevated GSI. The highest AIMC activities were generally obtained from locations closest to the free surface water boundary, except during the dry season when microbial activities were similar across the entire GSI. A clone library of AIMC 16S rRNA genes was constructed, and it confirmed the predominant role of the targeted alphaproteobacterial group in AIMC activity and composition. This taxon constituted 2%-14% of all bacteria, with similar activity distribution profiles. The Pseudomonas group occupied only 0.1‰-0.5‰ of the total bacterial density, but its activity was 27 times higher than the bacterial average. Of the 16S rRNA sequences in the AIMC clone library, 7.5% were phylogenetically related to putative iron-oxidizing bacteria (IOB), supporting the occurrence and persistence of active microbial iron oxidation across the studied iron-rich GSI ecosystem.

  10. Equation of State for Detonation Product Gases

    NASA Astrophysics Data System (ADS)

    Nagayama, Kunihito; Kubota, Shiro

    2013-06-01

    Based on the empirical linear relationship between detonation velocity and loading density, an approximate description of the Chapman-Jouguet (CJ) state for the detonation product gases of solid-phase high explosives has been developed. Provided that the Grüneisen parameter is a function only of volume, a systematic and closed system of equations for the Grüneisen parameter and the CJ volume has been formulated. These equations were obtained by combining this approximation with the Jones-Stanyukovich-Manson relation together with the JWL isentrope for detonation of crystal-density PETN. A thermodynamic identity between the Grüneisen parameter and another non-dimensional material parameter introduced by Wu and Jing can be used to derive the enthalpy-pressure-volume equation of state for detonation gases. This Wu-Jing parameter is found to be the ratio of the Grüneisen parameter to the adiabatic index. The behavior of this parameter as a function of pressure was calculated and revealed that its variation with pressure is very gradual. Using this equation of state, several isentropes descending from the Chapman-Jouguet states reached by PETN at four different lower initial densities have been calculated and compared with available cylinder expansion tests.
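    The relations invoked above can be summarized in their standard textbook forms (the symbols are conventional choices, not necessarily the paper's notation): the empirical detonation velocity-loading density line, the CJ pressure for a polytropic products gas, and the Wu-Jing parameter as the ratio of the Grüneisen parameter to the adiabatic index.

```latex
D = a + b\,\rho_0
  \quad\text{(linear $D$--$\rho_0$ relation, fitted constants $a$, $b$)}

p_{\mathrm{CJ}} = \frac{\rho_0 D^2}{\gamma + 1}
  \quad\text{(CJ pressure for a polytropic products gas of index $\gamma$)}

\lambda = \frac{\Gamma}{\gamma}
  \quad\text{(Wu--Jing parameter as the ratio found in the paper)}
```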

  11. Relating saturation capacity to charge density in strong cation exchangers.

    PubMed

    Steinebach, Fabian; Coquebert de Neuville, Bertrand; Morbidelli, Massimo

    2017-07-21

    In this work the relation between physical and chemical resin characteristics and the total amount of adsorbed protein (saturation capacity) for ion-exchange resins is discussed. Eleven different packing materials with a sulfo-functionalization and one multimodal resin were analyzed in terms of their porosity, pore size distribution, ligand density and binding capacity. By normalizing the ligand density and binding capacity by the total and accessible surface area, two different groups of resins were identified: below a ligand density of approximately 2.5 μmol/m², the ligand density controls the saturation capacity, while above this limit the accessible surface area becomes the limiting factor. This results in a maximum protein uptake of around 2.5 mg/m² of accessible surface area. The obtained results allow the saturation capacity to be estimated from independent resin characteristics: it mainly depends on "library data" such as the accessible and total surface area and the charge density. Hence these results give an insight into the fundamentals of protein adsorption and help in finding suitable resins, thus limiting the experimental effort in early process development stages. Copyright © 2017 Elsevier B.V. All rights reserved.
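    The two-regime behaviour described above can be phrased as a simple threshold model. The functional form, the per-ligand uptake factor, and the function name below are an illustrative reading of the finding, not the authors' fitted model:

```python
def saturation_capacity_mg(ligand_density_umol_m2, accessible_area_m2,
                           uptake_per_ligand_mg_umol=1.0,   # hypothetical factor
                           area_limit_mg_m2=2.5,
                           ligand_threshold_umol_m2=2.5):
    """Below ~2.5 umol/m^2 the ligand density limits protein uptake;
    above it, the accessible surface area caps uptake at ~2.5 mg/m^2."""
    if ligand_density_umol_m2 < ligand_threshold_umol_m2:
        per_area = uptake_per_ligand_mg_umol * ligand_density_umol_m2
    else:
        per_area = area_limit_mg_m2
    return per_area * accessible_area_m2

# sparse ligands: ligand-limited; dense ligands: area-limited
print(saturation_capacity_mg(1.0, 10.0))   # 10.0 mg
print(saturation_capacity_mg(5.0, 10.0))   # 25.0 mg (capped at 2.5 mg/m^2)
```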

  12. A two-parameter family of double-power-law biorthonormal potential-density expansions

    NASA Astrophysics Data System (ADS)

    Lilley, Edward J.; Sanders, Jason L.; Evans, N. Wyn

    2018-07-01

    We present a two-parameter family of biorthonormal double-power-law potential-density expansions. Both the potential and density are given in a closed analytic form and may be rapidly computed via recurrence relations. We show that this family encompasses all the known analytic biorthonormal expansions: the Zhao expansions (themselves generalizations of ones found earlier by Hernquist & Ostriker and by Clutton-Brock) and the recently discovered Lilley et al. expansion. Our new two-parameter family includes expansions based around many familiar spherical density profiles as zeroth-order models, including the γ models and the Jaffe model. It also contains a basis expansion that reproduces the famous Navarro-Frenk-White (NFW) profile at zeroth order. The new basis expansions have been found via a systematic methodology which has wide applications in finding other new expansions. In the process, we also uncovered a novel integral transform solution to Poisson's equation.
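    As a concrete zeroth-order example of the kind of closed-form pair such expansions are built around, the Hernquist model (the γ = 1 member of the γ-model family mentioned above) has the potential-density pair

```latex
\rho(r) = \frac{M a}{2\pi\, r\,(r+a)^{3}}, \qquad
\Phi(r) = -\frac{G M}{r + a},
```

    which satisfy Poisson's equation ∇²Φ = 4πGρ; a biorthonormal basis expansion surrounds such a pair with higher-order terms so that arbitrary nearby density profiles can be represented.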

  13. A two-parameter family of double-power-law biorthonormal potential-density expansions

    NASA Astrophysics Data System (ADS)

    Lilley, Edward J.; Sanders, Jason L.; Evans, N. Wyn

    2018-05-01

    We present a two-parameter family of biorthonormal double-power-law potential-density expansions. Both the potential and density are given in closed analytic form and may be rapidly computed via recurrence relations. We show that this family encompasses all the known analytic biorthonormal expansions: the Zhao expansions (themselves generalizations of ones found earlier by Hernquist & Ostriker and by Clutton-Brock) and the recently discovered Lilley et al. (2017a) expansion. Our new two-parameter family includes expansions based around many familiar spherical density profiles as zeroth-order models, including the γ models and the Jaffe model. It also contains a basis expansion that reproduces the famous Navarro-Frenk-White (NFW) profile at zeroth order. The new basis expansions have been found via a systematic methodology which has wide applications in finding other new expansions. In the process, we also uncovered a novel integral transform solution to Poisson's equation.

  15. Task-oriented comparison of power spectral density estimation methods for quantifying acoustic attenuation in diagnostic ultrasound using a reference phantom method.

    PubMed

    Rosado-Mendez, Ivan M; Nam, Kibo; Hall, Timothy J; Zagzebski, James A

    2013-07-01

    Reported here is a phantom-based comparison of methods for determining the power spectral density (PSD) of ultrasound backscattered signals. Those power spectral density values are then used to estimate parameters describing α(f), the frequency dependence of the acoustic attenuation coefficient. Phantoms were scanned with a clinical system equipped with a research interface to obtain radiofrequency echo data. Attenuation, modeled as a power law α(f) = α0·f^β, was estimated using a reference phantom method. The power spectral density was estimated using the short-time Fourier transform (STFT), Welch's periodogram, and Thomson's multitaper technique, and performance was analyzed when limiting the size of the parameter estimation region. Errors were quantified by the bias and standard deviation of the α0 and β estimates, and by the overall power-law fit error (FE). For parameter estimation regions larger than ~34 pulse lengths (~1 cm for this experiment), an overall power-law FE of 4% was achieved with all spectral estimation methods. With smaller parameter estimation regions, as in parametric image formation, the bias and standard deviation of the α0 and β estimates depended on the size of the parameter estimation region. Here, the multitaper method reduced the standard deviation of the α0 and β estimates compared with those using the other techniques. The results provide guidance for choosing methods for estimating the power spectral density in quantitative ultrasound methods.
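
    The two estimation steps the abstract names can be sketched as follows: a Welch-style averaged periodogram (a simplified numpy stand-in for scipy.signal.welch or the multitaper method) and a log-log least-squares fit of the power law α(f) = α0·f^β. Function names and the synthetic values are illustrative, not from the paper:

```python
import numpy as np

def welch_psd(x, fs, nseg=256):
    """Welch-style averaged periodogram (Hann window, 50% overlap).
    A minimal stand-in for scipy.signal.welch."""
    step = nseg // 2
    win = np.hanning(nseg)
    norm = fs * np.sum(win**2)
    segs = [x[i:i + nseg] for i in range(0, len(x) - nseg + 1, step)]
    psd = np.mean([np.abs(np.fft.rfft(win * s))**2 / norm for s in segs], axis=0)
    f = np.fft.rfftfreq(nseg, d=1.0 / fs)
    return f, psd

def fit_power_law(f, alpha_f):
    """Fit alpha(f) = alpha0 * f**beta by linear regression in log-log space."""
    beta, log_a0 = np.polyfit(np.log(f), np.log(alpha_f), 1)
    return np.exp(log_a0), beta

# Synthetic check: recover a known power law from per-frequency attenuation values.
f = np.linspace(2e6, 8e6, 50)            # 2-8 MHz analysis band (illustrative)
a0, b = fit_power_law(f, 0.5 * f**1.1)   # should recover alpha0=0.5, beta=1.1
```

    In the reference phantom method the per-frequency attenuation values would come from the ratio of sample to reference PSDs; here they are generated synthetically so the fit can be verified exactly.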

  16. Effects of phenotypic plasticity on pathogen transmission in the field in a Lepidoptera-NPV system.

    PubMed

    Reeson, A F; Wilson, K; Cory, J S; Hankard, P; Weeks, J M; Goulson, D; Hails, R S

    2000-08-01

    In models of insect-pathogen interactions, the transmission parameter (ν) is the term that describes the efficiency with which pathogens are transmitted between hosts. There are two components to the transmission parameter, namely the rate at which the host encounters pathogens (contact rate) and the rate at which contact between host and pathogen results in infection (host susceptibility). Here it is shown that in larvae of Spodoptera exempta (Lepidoptera: Noctuidae), in which rearing density triggers the expression of one of two alternative phenotypes, the high-density morph is associated with an increase in larval activity. This response is likely to result in an increase in the contact rate between hosts and pathogens. Rearing density is also known to affect susceptibility of S. exempta to pathogens, with the high-density morph showing increased resistance to a baculovirus. In order to determine whether density-dependent differences observed in the laboratory might affect transmission in the wild, a field trial was carried out to estimate the transmission parameter for S. exempta and its nuclear polyhedrosis virus (NPV). The transmission parameter was found to be significantly higher among larvae reared in isolation than among those reared in crowds. Models of insect-pathogen interactions, in which the transmission parameter is assumed to be constant, will therefore not fully describe the S. exempta-NPV system. The finding that crowding can influence transmission in this way has major implications for both the long-term population dynamics and the invasion dynamics of insect-pathogen systems.
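
    The transmission parameter in such models usually enters through a mass-action term, dS/dt = -ν·S·P. Assuming constant pathogen density P over the exposure window (our own simplification, not the authors' field estimator), ν can be estimated from the surviving fraction of hosts:

```python
import math

def transmission_parameter(s0, s_end, pathogen_density, t):
    """Estimate nu from the mass-action model dS/dt = -nu * S * P.

    For constant pathogen density P this integrates to
        S(t) = S0 * exp(-nu * P * t),
    so  nu = -ln(S_end / S0) / (P * t)."""
    return -math.log(s_end / s0) / (pathogen_density * t)
```

    The abstract's point is precisely that ν estimated this way is not a constant: it differed significantly between solitary- and crowd-reared larvae.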

  17. Science Library of Test Items. Volume Twelve. Mastery Testing Programme. [Mastery Tests Series 4.] Tests M39-M50.

    ERIC Educational Resources Information Center

    New South Wales Dept. of Education, Sydney (Australia).

    As part of a series of tests to measure mastery of specific skills in the natural sciences, copies of tests 39 through 50 include: (39) using a code; (40) naming the parts of a microscope; (41) calculating density and predicting flotation; (42) estimating metric length; (43) using SI symbols; (44) using s=vt; (45) applying a novel theory; (46)…

  18. Validation of Ionosonde Electron Density Reconstruction Algorithms with IONOLAB-RAY in Central Europe

    NASA Astrophysics Data System (ADS)

    Gok, Gokhan; Mosna, Zbysek; Arikan, Feza; Arikan, Orhan; Erdem, Esra

    2016-07-01

    Ionospheric observation is essentially accomplished by specialized radar systems called ionosondes. The time delay between the transmitted and received signals versus frequency is measured by the ionosondes, and the received signals are processed to generate ionogram plots, which show the time delay or reflection height of signals with respect to transmitted frequency. The critical frequencies of ionospheric layers and the virtual heights, which provide useful information about ionospheric structure, can be extracted from ionograms. Ionograms also indicate the amount of variability or disturbance in the ionosphere. With special inversion algorithms and tomographic methods, electron density profiles can also be estimated from the ionograms. Although structural pictures of the ionosphere in the vertical direction can be observed from ionosonde measurements, some errors may arise from inaccuracies in signal propagation, modeling, data processing, and tomographic reconstruction algorithms. Recently, the IONOLAB group (www.ionolab.org) developed a new algorithm for effective and accurate extraction of ionospheric parameters and reconstruction of the electron density profile from ionograms. The electron density reconstruction algorithm applies advanced optimization techniques to calculate the parameters of any existing analytical function that defines electron density with respect to height, using ionogram measurement data. The process of reconstructing electron density with respect to height is known as ionogram scaling or true-height analysis. The IONOLAB-RAY algorithm is a tool to investigate the propagation path and parameters of HF waves in the ionosphere. The algorithm models the wave propagation using ray representation under the geometrical optics approximation. In the algorithm, the structural ionospheric characteristics are represented as realistically as possible, including anisotropy, inhomogeneity, and time dependence, in a 3-D voxel structure. 
The algorithm is also used for various purposes, including calculation of actual height and generation of ionograms. In this study, the performance of the electron density reconstruction algorithm of the IONOLAB group and the standard electron density profile algorithms of ionosondes are compared with IONOLAB-RAY wave propagation simulation at near-vertical incidence. The electron density reconstruction and parameter extraction algorithms of ionosondes are validated with the IONOLAB-RAY results for both quiet and disturbed ionospheric states in Central Europe using ionosonde stations such as Pruhonice and Juliusruh. It is observed that the IONOLAB ionosonde parameter extraction and electron density reconstruction algorithm performs significantly better than the standard algorithms, especially for disturbed ionospheric conditions. IONOLAB-RAY provides an efficient and reliable tool to investigate and validate ionosonde electron density reconstruction algorithms, especially in determination of the reflection height (true height) of signals and the critical parameters of the ionosphere. This study is supported by TUBITAK 114E541, 115E915 and Joint TUBITAK 114E092 and AS CR 14/001 projects.
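
    For orientation, electron density and plasma (critical) frequency are linked by f_p ≈ 8.98·√Ne in SI units, and analytic profiles such as the Chapman layer are common parametric forms in true-height analysis. A small illustrative sketch (not the IONOLAB algorithm; names and values are ours):

```python
import numpy as np

def chapman_layer(h, nmax, hmax, H):
    """Chapman electron density profile Ne(h), a standard analytic model
    often used as the parametric form in ionogram true-height inversion."""
    z = (h - hmax) / H
    return nmax * np.exp(0.5 * (1.0 - z - np.exp(-z)))

def plasma_frequency(ne):
    """Plasma frequency in Hz from electron density in m^-3 (f_p ~ 8.98*sqrt(Ne))."""
    return 8.98 * np.sqrt(ne)

# Illustrative F2 layer: peak density 1e12 m^-3 at 300 km, scale height 50 km.
h = np.linspace(100e3, 600e3, 1001)
ne = chapman_layer(h, nmax=1e12, hmax=300e3, H=50e3)
foF2 = plasma_frequency(ne.max())   # critical frequency of the layer
```

    A true-height inversion would adjust (nmax, hmax, H) until the virtual heights computed from such a profile match the measured ionogram; only the forward profile is sketched here.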

  19. Demand-Adjusted Shelf Availability Parameters: A Second Look.

    ERIC Educational Resources Information Center

    Schwarz, Philip

    1983-01-01

    Data gathered in application of Paul Kantor's demand-adjusted shelf availability model to medium-sized academic library indicate significant differences in shelf availability when data are analyzed by last circulation date, acquisition date, and imprint date, and when they are gathered during periods of low and high use. Ten references are cited.…

  20. A library least-squares approach for scatter correction in gamma-ray tomography

    NASA Astrophysics Data System (ADS)

    Meric, Ilker; Anton Johansen, Geir; Valgueiro Malta Moreira, Icaro

    2015-03-01

    Scattered radiation is known to lead to distortion in reconstructed images in Computed Tomography (CT). The effects of scattered radiation are especially more pronounced in non-scanning, multiple source systems which are preferred for flow imaging where the instantaneous density distribution of the flow components is of interest. In this work, a new method based on a library least-squares (LLS) approach is proposed as a means of estimating the scatter contribution and correcting for this. The validity of the proposed method is tested using the 85-channel industrial gamma-ray tomograph previously developed at the University of Bergen (UoB). The results presented here confirm that the LLS approach can effectively estimate the amounts of transmission and scatter components in any given detector in the UoB gamma-ray tomography system.
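
    The library least-squares idea can be illustrated as an ordinary linear least-squares fit of a measured spectrum to precomputed component spectra. The sketch below is a toy version with made-up, noise-free spectra, not the UoB system's calibrated libraries:

```python
import numpy as np

# Hypothetical two-component library for one detector: a transmission
# (photopeak-like) spectrum and a scatter (low-energy continuum) spectrum.
channels = 64
ch = np.arange(channels)
transmission = np.exp(-0.5 * ((ch - 45) / 3.0) ** 2)   # photopeak-like shape
scatter = np.exp(-ch / 20.0)                           # low-energy continuum
library = np.column_stack([transmission, scatter])     # (channels, components)

# A "measured" spectrum: an unknown mixture of the library components.
true_w = np.array([3.0, 1.5])
measured = library @ true_w

# Library least-squares: weights minimizing ||library @ w - measured||^2.
w, *_ = np.linalg.lstsq(library, measured, rcond=None)
```

    With noisy data one would typically constrain the weights to be non-negative (e.g. scipy.optimize.nnls); this noise-free toy recovers the mixing weights exactly.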

  1. Model development and validation of geometrically complex eddy current coils using finite element methods

    NASA Astrophysics Data System (ADS)

    Brown, Alexander; Eviston, Connor

    2017-02-01

    Multiple FEM models of complex eddy current coil geometries were created and validated to calculate the change of impedance due to the presence of a notch. Realistic simulations of eddy current inspections are required for model-assisted probability of detection (MAPOD) studies, inversion algorithms, experimental verification, and tailored probe design for NDE applications. An FEM solver was chosen to model complex real-world situations, including varying probe dimensions and orientations along with complex probe geometries. This will also enable creation of a probe model library database with variable parameters. Verification and validation were performed using other commercially available eddy current modeling software as well as experimentally collected benchmark data. Data analysis and comparison showed that the created models were able to correctly model the probe and conductor interactions and accurately calculate the change in impedance in several experimental scenarios with acceptable error. The promising results of the models enabled the start of an eddy current probe model library to give experimenters easy access to powerful parameter-based eddy current models for alternate project applications.

  2. An algorithm-based topographical biomaterials library to instruct cell fate

    PubMed Central

    Unadkat, Hemant V.; Hulsman, Marc; Cornelissen, Kamiel; Papenburg, Bernke J.; Truckenmüller, Roman K.; Carpenter, Anne E.; Wessling, Matthias; Post, Gerhard F.; Uetz, Marc; Reinders, Marcel J. T.; Stamatialis, Dimitrios; van Blitterswijk, Clemens A.; de Boer, Jan

    2011-01-01

    It is increasingly recognized that material surface topography is able to evoke specific cellular responses, endowing materials with instructive properties that were formerly reserved for growth factors. This opens a window to improving, in a cost-effective manner, the biological performance of any surface used in the human body. Unfortunately, the interplay between surface topographies and cell behavior is complex and still incompletely understood. Rational approaches to searching for bioactive surfaces will therefore miss previously unperceived interactions. Hence, in the present study, we use mathematical algorithms to design nonbiased, random surface features and produce chips of poly(lactic acid) with 2,176 different topographies. With human mesenchymal stromal cells (hMSCs) grown on the chips and using high-content imaging, we reveal unique, formerly unknown, surface topographies that are able to induce MSC proliferation or osteogenic differentiation. Moreover, we correlate parameters of the mathematical algorithms to cellular responses, which yields novel design criteria for these particular parameters. In conclusion, we demonstrate that randomized libraries of surface topographies can be broadly applied to unravel the interplay between cells and surface topography and to find improved material surfaces. PMID:21949368

  3. Analyzing Spacecraft Telecommunication Systems

    NASA Technical Reports Server (NTRS)

    Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric

    2004-01-01

    Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
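
    A link analysis of the kind MMTAT automates boils down, in its simplest form, to a dB-domain budget: received power = transmit power + antenna gains - path loss - miscellaneous losses. The sketch below is a generic free-space budget of our own, not MMTAT's computational models, which are far more detailed:

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def link_margin_db(tx_power_dbw, tx_gain_db, rx_gain_db,
                   distance_m, freq_hz, required_dbw, misc_losses_db=0.0):
    """Link margin = received power (dBW) minus required power (dBW)."""
    received = (tx_power_dbw + tx_gain_db + rx_gain_db
                - free_space_path_loss_db(distance_m, freq_hz)
                - misc_losses_db)
    return received - required_dbw
```

    Plotting such a margin against distance or frequency gives the kind of "performance versus input parameter" curves the abstract describes.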

  4. Comparison of methods for library construction and short read annotation of shellfish viral metagenomes.

    PubMed

    Wei, Hong-Ying; Huang, Sheng; Wang, Jiang-Yong; Gao, Fang; Jiang, Jing-Zhe

    2018-03-01

    The emergence and widespread use of high-throughput sequencing technologies have promoted metagenomic studies of environmental and animal samples. Library construction for metagenome sequencing and annotation of the produced sequence reads are important steps in such studies and influence the quality of metagenomic data. In this study, we collected marine mollusk samples, such as Crassostrea hongkongensis, Chlamys farreri, and Ruditapes philippinarum, from coastal areas in South China. These samples were divided into two batches to compare two library construction methods for shellfish viral metagenomes. Our analysis showed that reverse-transcribing RNA into cDNA and then amplifying it simultaneously with DNA by whole genome amplification (WGA) yielded a larger amount of DNA than using only WGA or WTA (whole transcriptome amplification). Moreover, higher-quality libraries were obtained by agarose gel extraction than with AMPure bead size selection. However, the latter can also provide good results if combined with adjustment of the filter parameters. This, together with its simplicity, makes it a viable alternative. Finally, we compared three annotation tools (BLAST, DIAMOND, and Taxonomer) and two reference databases (NCBI's NR and UniProt's UniRef). Considering the limitations of computing resources and data transfer speed, we propose the use of DIAMOND with UniRef for annotating metagenomic short reads, as its running speed can guarantee a good annotation rate. This study may serve as a useful reference for selecting methods for shellfish viral metagenome library construction and read annotation.

  5. Continuous quality improvement using intelligent infusion pump data analysis.

    PubMed

    Breland, Burnis D

    2010-09-01

    The use of continuous quality-improvement (CQI) processes in the implementation of intelligent infusion pumps in a community teaching hospital is described. After the decision was made to implement intelligent i.v. infusion pumps in a 413-bed, community teaching hospital, drug libraries for use in the safety software had to be created. Before drug libraries could be created, it was necessary to determine the epidemiology of medication use in various clinical care areas. Standardization of medication administration was performed through the CQI process, using practical knowledge of clinicians at the bedside and evidence-based drug safety parameters in the scientific literature. Post-implementation, CQI allowed refinement of clinically important safety limits while minimizing inappropriate, meaningless soft limit alerts on a few select agents. Assigning individual clinical care areas (CCAs) to individual patient care units facilitated customization of drug libraries and identification of specific CCA compliance concerns. Between June 2007 and June 2008, there were seven library updates. These involved drug additions and deletions, customization of individual CCAs, and alterations of limits. Overall compliance with safety software use rose over time, from 33% in November 2006 to over 98% in December 2009. Many potentially clinically significant dosing errors were intercepted by the safety software, prompting edits by end users. Only 4-6% of soft limit alerts resulted in edits. Compliance rates for use of infusion pump safety software varied among CCAs over time. Education, auditing, and refinement of drug libraries led to improved compliance in most CCAs.

  6. Assessment of various parameters to improve MALDI-TOF MS reference spectra libraries constructed for the routine identification of filamentous fungi

    PubMed Central

    2013-01-01

    Background The poor reproducibility of matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) spectra limits the effectiveness of the MALDI-TOF MS-based identification of filamentous fungi with highly heterogeneous phenotypes in routine clinical laboratories. This study aimed to enhance the MALDI-TOF MS-based identification of filamentous fungi by assessing several architectures of reference spectrum libraries. Results We established reference spectrum libraries that included 30 filamentous fungus species with various architectures characterized by distinct combinations of the following: i) technical replicates, i.e., the number of analyzed deposits for each culture used to build a reference meta-spectrum (RMS); ii) biological replicates, i.e., the number of RMS derived from distinct subcultures of each strain; and iii) the number of distinct strains of a given species. We then compared the effectiveness of each library in the identification of 200 prospectively collected clinical isolates, including 38 species in 28 genera. Identification effectiveness was improved by increasing the number of both RMS per strain (p < 10⁻⁴) and strains for a given species (p < 10⁻⁴) in a multivariate analysis. Conclusion Addressing the heterogeneity of MALDI-TOF spectra derived from filamentous fungi by increasing the number of RMS obtained from distinct subcultures of the strains included in the reference spectrum library markedly improved the effectiveness of the MALDI-TOF MS-based identification of clinical filamentous fungi. PMID:23565856

  7. Construction of an 800-kb contig in the near-centromeric region of the rice blast resistance gene Pi-ta2 using a highly representative rice BAC library.

    PubMed

    Nakamura, S; Asakawa, S; Ohmido, N; Fukui, K; Shimizu, N; Kawasaki, S

    1997-05-01

    We constructed a rice Bacterial Artificial Chromosome (BAC) library from green leaf protoplasts of the cultivar Shimokita harboring the rice blast resistance gene Pi-ta. The average insert size of 155 kb and the library size of seven genome equivalents make it one of the most comprehensive BAC libraries available, and larger than many plant YAC libraries. The library clones were plated on seven high-density membranes of microplate size, enabling efficient colony identification in colony hybridization experiments. Seven percent of clones carried chloroplast DNA. By probing with markers close to the blast resistance genes Pi-ta2 (closely linked to Pi-ta) and Pi-b, respectively located in the centromeric region of chromosome 12 and near the telomeric end of chromosome 2, on average 2.2 +/- 1.3 and 8.0 +/- 2.6 BAC clones/marker were isolated. Differences in chromosomal structures may contribute to this wide variation in yield. A contig of about 800 kb, consisting of 19 clones, was constructed in the Pi-ta2 region. This region had a high frequency of repetitive sequences. To circumvent this difficulty, we devised a "two-step walking" method. The contig spanned a 300 kb region between markers located at 0 cM and 0.3 cM from Pi-ta. The ratio of physical to genetic distance (> 1,000 kb/cM) was more than three times larger than the rice average (300 kb/cM). The low recombination rate and high frequency of repetitive sequences may also be related to the near-centromeric character of this region. Fluorescent in situ hybridization (FISH) with a BAC clone from the Pi-b region yielded very clear signals on the long arm of chromosome 2, while a clone from the Pi-ta2 region showed various cross-hybridizing signals near the centromeric regions of all chromosomes.

  8. Characterization of zirconium carbides using electron microscopy, optical anisotropy, Auger depth profiles, X-ray diffraction, and electron density calculated by charge flipping method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chinthaka Silva, G.W., E-mail: chinthaka.silva@gmail.com; Kercher, Andrew A., E-mail: rokparent@comcast.net; Hunn, John D., E-mail: hunnjd@ornl.gov

    2012-10-15

    Samples with five different zirconium carbide compositions (C/Zr molar ratio = 0.84, 0.89, 0.95, 1.05, and 1.17) have been fabricated and studied using a variety of experimental techniques. Each sample was zone refined to ensure that the end product was polycrystalline with a grain size of 10-100 µm. It was found that the lattice parameter was largest for the x = 0.89 composition and smallest for the x = 1.17 total C/Zr composition, but the variation was not linear; this nonlinearity is possibly explained using electron densities calculated by the charge flipping technique. Among the five samples, the unit cell of the ZrC0.89 sample showed the highest electron density, corresponding to the highest carbon incorporation and the largest lattice parameter. The ZrC0.84 sample showed the lowest carbon incorporation, resulting in a larger number of carbon vacancies and resultant strain. Samples with larger carbon ratios (x = 0.95, 1.05, and 1.17) showed a slight decrease in lattice parameter, due to a decrease in electron density. Optical anisotropy measurements suggest that these three samples contained significant amounts of a graphitic carbon phase, not bonded to the Zr atoms. - Graphical abstract: Characterization of zirconium carbides using electron microscopy, optical anisotropy, Auger depth profiles, X-ray diffraction, and electron density calculated by the charge flipping method. Highlights: - The lattice parameter variation: ZrC0.89 > ZrC0.84 > ZrC0.95 > ZrC1.05 > ZrC1.17. - Surface oxygen with no correlation to the lattice parameter variation. - ZrC0.89 had the highest electron density, corresponding to the highest carbon incorporation. - Second-highest lattice parameter in ZrC0.84 due to strain. - Unit cell electron density order: ZrC0.95 > ZrC1.05 > ZrC1.17.

  9. Right adolescent idiopathic thoracic curve (Lenke 1 A and B): does cost of instrumentation and implant density improve radiographic and cosmetic parameters?

    PubMed

    Yang, Scott; Jones-Quaidoo, Sean M; Eager, Matthew; Griffin, Justin W; Reddi, Vasantha; Novicoff, Wendy; Shilt, Jeffrey; Bersusky, Ernesto; Defino, Helton; Ouellet, Jean; Arlet, Vincent

    2011-07-01

    In adolescent idiopathic scoliosis (AIS) there has been a shift towards increasing the number of implants and pedicle screws, which has not been proven to improve cosmetic correction. To evaluate whether increasing cost of instrumentation correlates with cosmetic correction using clinical photographs, 58 Lenke 1A and B cases from a multicenter AIS database with at least 3 months' follow-up of clinical photographs were used for analysis. Cosmetic parameters on PA and forward-bending photographs included angular measurements of trunk shift, shoulder balance, and rib hump, and ratio measurements of waistline asymmetry. Pre-op and follow-up X-rays were measured for coronal and sagittal deformity parameters. Cost density was calculated by dividing the total cost of instrumentation by the number of vertebrae being fused. Linear regression and Spearman's correlation were used to correlate cost density with X-ray and photo outcomes. Three independent observers verified radiographic and cosmetic parameters for inter/intraobserver variability analysis. Average pre-op Cobb angle and instrumented correction were 54° (SD 12.5) and 59% (SD 25), respectively. The average number of vertebrae fused was 10 (SD 1.9). The total cost of spinal instrumentation ranged from $6,769 to $21,274 (mean $12,662, SD $3,858). There was a weak positive and statistically significant correlation between Cobb angle correction and cost density (r = 0.33, p = 0.01), and no correlation between Cobb angle correction of the uninstrumented lumbar spine and cost density (r = 0.15, p = 0.26). There was no significant correlation between any sagittal X-ray measurements or any of the photo parameters and cost density. There was good to excellent inter/intraobserver variability for all photographic parameters based on the intraclass correlation coefficient (ICC 0.74-0.98). 
Our method for measuring cosmesis had good to excellent inter/intraobserver variability and may be an effective tool to objectively assess cosmesis from photographs. Since increasing cost density only mildly improves the Cobb angle correction of the main thoracic curve, and not the correction of the uninstrumented spine or any of the cosmetic parameters, one should consider the cost of increasing implant density in Lenke 1A and B curves. In an era of rationalization of health care expenses, this study demonstrates that increasing the number of implants does not improve any relevant cosmetic or radiographic outcomes.

  10. The Microbial Ferrous Wheel in a Neutral pH Groundwater Seep

    PubMed Central

    Roden, Eric E.; McBeth, Joyce M.; Blöthe, Marco; Percak-Dennett, Elizabeth M.; Fleming, Emily J.; Holyoke, Rebecca R.; Luther, George W.; Emerson, David; Schieber, Juergen

    2012-01-01

    Evidence for microbial Fe redox cycling was documented in a circumneutral pH groundwater seep near Bloomington, Indiana. Geochemical and microbiological analyses were conducted at two sites, a semi-consolidated microbial mat and a floating puffball structure. In situ voltammetric microelectrode measurements revealed steep opposing gradients of O2 and Fe(II) at both sites, similar to other groundwater seep and sedimentary environments known to support microbial Fe redox cycling. The puffball structure showed an abrupt increase in dissolved Fe(II) just at its surface (∼5 cm depth), suggesting an internal Fe(II) source coupled to active Fe(III) reduction. Most probable number enumerations detected microaerophilic Fe(II)-oxidizing bacteria (FeOB) and dissimilatory Fe(III)-reducing bacteria (FeRB) at densities of 10² to 10⁵ cells mL⁻¹ in samples from both sites. In vitro Fe(III) reduction experiments revealed the potential for immediate reduction (no lag period) of native Fe(III) oxides. Conventional full-length 16S rRNA gene clone libraries were compared with high-throughput barcode sequencing of the V1, V4, or V6 variable regions of 16S rRNA genes in order to evaluate the extent to which new sequencing approaches could provide enhanced insight into the composition of the Fe redox cycling microbial community. The composition of the clone libraries suggested a lithotroph-dominated microbial community centered around taxa related to known FeOB (e.g., Gallionella, Sideroxydans, Aquabacterium). Sequences related to recognized FeRB (e.g., Rhodoferax, Aeromonas, Geobacter, Desulfovibrio) were also well-represented. Overall, sequences related to known FeOB and FeRB accounted for 88 and 59% of total clone sequences in the mat and puffball libraries, respectively. Taxa identified in the barcode libraries showed partial overlap with the clone libraries, but were not always consistent across different variable regions and sequencing platforms. 
However, the barcode libraries provided confirmation of key clone library results (e.g., the predominance of Betaproteobacteria) and an expanded view of lithotrophic microbial community composition. PMID:22783228

  11. Spectral analysis of shielded gamma ray sources using precalculated library data

    NASA Astrophysics Data System (ADS)

    Holmes, Thomas Wesley; Gardner, Robin P.

    2015-11-01

    In this work, an approach has been developed for determining the intensity of a shielded source by first determining the thicknesses of three different shielding materials from a passively collected gamma-ray spectrum, by making comparisons with predetermined shielded spectra. These evaluations are dependent on the accuracy and validity of the predetermined library spectra, which were created by varying the thicknesses of the three chosen materials (lead, aluminum, and wood) that are used to simulate any actual shielding. Each of the spectra produced was generated using MCNP5 with a sufficiently large number of histories to ensure a low relative error at each channel. The materials were held in the same respective order from source to detector, where each material consisted of three individual thicknesses and a null condition. This then produced two separate data sets of 27 total shielding material situations, and subsequent predetermined libraries were created for each radionuclide source used. The technique used to calculate the thicknesses of the materials implements a Levenberg-Marquardt nonlinear search that employs a tri-linear interpolation within the respective predetermined libraries at each channel for the supplied unknown input spectrum. Given that the nonlinear parameters require an initial guess for the calculations, the approach demonstrates first that when the correct values are input, the correct thicknesses are found. It then demonstrates that when multiple trials of random values are input for each of the nonlinear parameters, the average of the calculated solutions that successfully converge also produces the correct thicknesses. Under situations with sufficient information known about the detection situation at hand, the method was shown to behave in a manner that produces reasonable results and can serve as a good preliminary solution. 
This technique can be used in a variety of full-spectrum inverse analysis problems, including homeland security applications.

  12. The statistics of primordial density fluctuations

    NASA Astrophysics Data System (ADS)

    Barrow, John D.; Coles, Peter

    1990-05-01

The statistical properties of the density fluctuations produced by power-law inflation are investigated. It is found that, even if the fluctuations present in the scalar field driving the inflation are Gaussian, the resulting density perturbations need not be, due to stochastic variations in the Hubble parameter. All the moments of the density fluctuations are calculated, and it is argued that, for realistic parameter choices, the departures from Gaussian statistics are small and would have a negligible effect on the large-scale structure produced in the model. On the other hand, the model predicts a power spectrum with n not equal to 1, and this could be good news for large-scale structure.

  13. Density-Aware Clustering Based on Aggregated Heat Kernel and Its Transformation

    DOE PAGES

    Huang, Hao; Yoo, Shinjae; Yu, Dantong; ...

    2015-06-01

Current spectral clustering algorithms suffer from sensitivity to noise and to parameter scaling, and may not be aware of different density distributions across clusters. If these problems are left untreated, the resulting clusters cannot accurately represent true data patterns, in particular for complex real-world datasets with heterogeneous densities. This paper aims to solve these problems by proposing a diffusion-based Aggregated Heat Kernel (AHK) to improve clustering stability, and a Local Density Affinity Transformation (LDAT) to correct the bias originating from different cluster densities. AHK statistically models the heat diffusion traces along the entire time scale, so it ensures robustness during the clustering process, while LDAT probabilistically reveals the local density of each instance and suppresses the local density bias in the affinity matrix. Our proposed framework integrates these two techniques systematically. As a result, it not only provides an advanced noise-resisting and density-aware spectral mapping of the original dataset, but also remains stable while tuning the scaling parameter (which usually controls the range of the neighborhood). Furthermore, our framework works well with the majority of similarity kernels, which ensures its applicability to many types of data and problem domains. Systematic experiments on different applications show that our proposed algorithms outperform state-of-the-art clustering algorithms on data with heterogeneous density distributions, and achieve robust clustering performance with respect to tuning the scaling parameter and handling various levels and types of noise.

  14. Effect of cement kiln dust and gamma irradiation on the ultrasonic parameters of HMO borate glasses

    NASA Astrophysics Data System (ADS)

    Abd elfadeel, G.; Saddeek, Yasser B.; Mohamed, Gehan Y.; Mostafa, A. M. A.; Shokry Hassan, H.

    2017-03-01

Glass samples with the chemical formula x CKD-(100 - x) (5Na2O-65B2O3-9Bi2O3-21PbO), (0 ⩽ x ⩽ 32 mol%) were prepared. The density and the ultrasonic parameters of the investigated glasses were measured at room temperature before and after exposure to two doses of gamma irradiation, to study the effects of both CKD and gamma radiation. It was found that the density and the ultrasonic parameters are sensitive to variations in the CKD content and to the effect of γ-radiation. Replacement of oxides with higher atomic weights, such as Bi2O3 and PbO, by CKD decreases the density. Analysis of the behavior of the ultrasonic parameters demonstrates that the creation of CaO6 and SiO4 units on one hand, and the transformation between BO4 and BO3 structural units on the other, drive the increase of the ultrasonic velocities and the elastic moduli. Moreover, the density and the ultrasonic parameters decrease somewhat with increasing doses of γ-irradiation. The variations of these physical parameters can be attributed to the creation of radiation-induced defects, which occupy the voids inside the glass structure.
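The elastic moduli reported in studies of this kind follow from the measured density and the two ultrasonic velocities via the standard isotropic relations. The relations below are textbook identities; the numerical inputs are representative heavy-metal-oxide glass values, not the paper's data.

```python
def elastic_moduli(rho, v_l, v_s):
    """Standard isotropic relations between density (kg/m^3), longitudinal and
    shear ultrasonic velocities (m/s), and the elastic moduli (Pa)."""
    L = rho * v_l**2                    # longitudinal modulus
    G = rho * v_s**2                    # shear modulus
    K = L - 4.0 * G / 3.0               # bulk modulus
    E = 9.0 * K * G / (3.0 * K + G)     # Young's modulus
    nu = (3.0 * K - 2.0 * G) / (2.0 * (3.0 * K + G))  # Poisson's ratio
    return L, G, K, E, nu

# Illustrative inputs only -- not values from the paper.
L, G, K, E, nu = elastic_moduli(rho=4000.0, v_l=5000.0, v_s=2800.0)
```

This is why a drop in density at fixed velocities lowers every modulus proportionally, while velocity changes enter quadratically.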

  15. Cross-species bacterial artificial chromosome (BAC) library screening via overgo-based hybridization and BAC-contig mapping of a yield enhancement quantitative trait locus (QTL) yld1.1 in the Malaysian wild rice Oryza rufipogon.

    PubMed

    Song, Beng-Kah; Nadarajah, Kalaivani; Romanov, Michael N; Ratnam, Wickneswari

    2005-01-01

The construction of BAC-contig physical maps is an important step towards a partial or ultimate genome sequence analysis. Here, we describe our initial efforts to apply an overgo approach to screen a BAC library of the Malaysian wild rice species, Oryza rufipogon. Overgo design is based on repetitive element masking and sequence uniqueness, and uses short probes (approximately 40 bp), making this method highly efficient and specific. Pairs of 24-bp oligos that contain an 8-bp overlap were developed from the publicly available genomic sequences of the cultivated rice, O. sativa, to generate 20 overgo probes for a 1-Mb region that encompasses the yield enhancement QTL yld1.1 in O. rufipogon. The advantages of high similarity in melting temperature, hybridization kinetics and specific activities of overgos further enabled a pooling strategy for library screening by filter hybridization. Two pools of ten overgos each were hybridized to high-density filters representing the O. rufipogon genomic BAC library. These screening tests succeeded in providing 69 PCR-verified positive hits from a total of 23,040 BAC clones of the entire O. rufipogon library. A minimal tiling path of clones was generated to contribute to a fully covered BAC-contig map of the targeted 1-Mb region. The developed protocol for overgo design based on O. sativa sequences as a comparative genomic framework, and the pooled overgo hybridization screening technique, are suitable means for high-resolution physical mapping and the identification of BAC candidates for sequencing.

  16. Error assessment in molecular dynamics trajectories using computed NMR chemical shifts.

    PubMed

    Koes, David R; Vries, John K

    2017-01-01

Accurate chemical shifts for the atoms in molecular dynamics (MD) trajectories can be obtained from quantum mechanical (QM) calculations that depend solely on the coordinates of the atoms in the localized regions surrounding the atoms of interest. If these coordinates are correct and the sample size is adequate, the ensemble average of these chemical shifts should equal the chemical shifts obtained from NMR spectroscopy. If this is not the case, the coordinates must be incorrect. We have utilized this fact to quantify the errors associated with the backbone atoms in MD simulations of proteins. A library of regional conformers containing 169,499 members was constructed from 6 model proteins. The chemical shifts associated with the backbone atoms in each of these conformers were obtained from QM calculations using density functional theory at the B3LYP level with a 6-311+G(2d,p) basis set. Chemical shifts were assigned to each backbone atom in each MD simulation frame using a template-matching approach. The ensemble average of these chemical shifts was compared to chemical shifts from NMR spectroscopy. A large systematic error was identified that affected the 1H atoms of the peptide bonds involved in hydrogen bonding with water molecules or peptide backbone atoms. This error was highly sensitive to changes in electrostatic parameters. Smaller errors affecting the 13Cα and 15N atoms were also detected. We believe these errors could be useful as metrics for comparing the force fields and parameter sets used in MD simulation because they are directly tied to errors in atomic coordinates.

  17. HEATING 7.1 user's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childs, K.W.

    1991-07-01

HEATING is a FORTRAN program designed to solve steady-state and/or transient heat conduction problems in one-, two-, or three-dimensional Cartesian, cylindrical, or spherical coordinates. A model may include multiple materials, and the thermal conductivity, density, and specific heat of each material may be both time- and temperature-dependent. The thermal conductivity may be anisotropic. Materials may undergo change of phase. Thermal properties of materials may be input directly or extracted from a material properties library. Heat generation rates may depend on time, temperature, and position, and boundary temperatures may be time- and position-dependent. The boundary conditions, which may be surface-to-boundary or surface-to-surface, may be specified temperatures or any combination of prescribed heat flux, forced convection, natural convection, and radiation. The boundary condition parameters may be time- and/or temperature-dependent. General graybody radiation problems may be modeled with user-defined factors for radiant exchange. The mesh spacing may be variable along each axis. HEATING is variably dimensioned and utilizes free-form input. Three steady-state solution techniques are available: point-successive-overrelaxation iteration with extrapolation, direct solution (for one-dimensional or two-dimensional problems), and conjugate gradient. Transient problems may be solved using one of several finite-difference schemes: Crank-Nicolson implicit, Classical Implicit Procedure (CIP), Classical Explicit Procedure (CEP), or the Levy explicit method (which in some circumstances allows a time step greater than the CEP stability criterion). The solution of the system of equations arising from the implicit techniques is accomplished by point-successive-overrelaxation iteration and includes procedures to estimate the optimum acceleration parameter.
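As a minimal illustration of one of the transient schemes listed, here is a one-dimensional Crank-Nicolson conduction step with fixed boundary temperatures. This is a toy sketch of the method, not HEATING's implementation, and it uses a dense solver where a production code would use the iterative techniques described above.

```python
import numpy as np

def crank_nicolson_step(T, alpha, dx, dt):
    """One Crank-Nicolson step for 1-D conduction dT/dt = alpha * d2T/dx2,
    holding both boundary temperatures fixed (Dirichlet conditions)."""
    n = len(T)
    r = alpha * dt / (2.0 * dx**2)
    A = np.eye(n) * (1.0 + 2.0 * r)     # implicit (left-hand) operator
    B = np.eye(n) * (1.0 - 2.0 * r)     # explicit (right-hand) operator
    for i in range(1, n - 1):
        A[i, i - 1] = A[i, i + 1] = -r
        B[i, i - 1] = B[i, i + 1] = r
    for j in (0, n - 1):                # boundary rows: temperature held fixed
        A[j, :] = 0.0; A[j, j] = 1.0
        B[j, :] = 0.0; B[j, j] = 1.0
    return np.linalg.solve(A, B @ T)

# Rod with ends held at 0 and 100: marches to the linear steady profile.
T = np.zeros(11); T[-1] = 100.0
for _ in range(2000):
    T = crank_nicolson_step(T, alpha=1e-4, dx=0.01, dt=0.5)
```

Because Crank-Nicolson is unconditionally stable, the time step here is not limited by the explicit (CEP) stability criterion, which is exactly the trade-off the manual describes.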

  18. Mass of materials: the impact of designers on construction ergonomics.

    PubMed

    Smallwood, John

    2012-01-01

Many construction injuries are musculoskeletal-related, in the form of sprains and strains arising from the handling of materials, which are specified by designers. The paper presents the results of a study conducted among delegates attending two 'designing for H&S' (DfH&S) seminars using a questionnaire. The salient findings include: the level of knowledge relative to the mass and density of materials is limited; designers generally do not consider the mass and density of materials when designing structures and elements and specifying materials; to a degree, designers appreciate that the mass and density of materials impact on construction ergonomics; designers rate their knowledge of the mass and density of materials as limited; and designers appreciate the potential of the consideration of the mass and density of materials to contribute to an improvement in construction ergonomics. Conclusions include: designers lack the requisite knowledge relative to the mass and density of materials; designers are thus precluded from conducting optimum design hazard identification and risk assessments; and tertiary built environment designer education does not enlighten designers relative to construction ergonomics. Recommendations include: tertiary built environment designer education should address construction ergonomics; professional associations should raise the level of awareness relative to construction ergonomics; and design practices should include a category 'mass and density of materials' in their practice libraries.

  19. Ideas for the rapid development of the structural models in mechanical engineering

    NASA Astrophysics Data System (ADS)

    Oanta, E.; Raicu, A.; Panait, C.

    2017-08-01

Conceiving computer-based instruments is a long-running concern of the authors. Some of the original solutions are: optimal processing of large matrices, interfaces between programming languages, approximation theory using spline functions, and increased-accuracy numerical programming based on extended arbitrary-precision libraries. For the rapid development of models we identified the following directions: atomization, ‘librarization’, parameterization, automatization and integration. Each of these directions has particular aspects depending on whether we approach mechanical design problems or software development. Atomization means a thorough top-down decomposition analysis which offers insight into the basic features of the phenomenon. Creation of libraries of reusable mechanical parts and libraries of programs (data types, functions) saves time, cost and effort when a new model must be conceived. Parameterization leads to flexible definition of mechanical parts, the values of the parameters being changed either by a dimensioning program or in accordance with other parts belonging to the same assembly. The resulting templates may also be included in libraries. Original software applications are useful for generating the model’s input data, for feeding the data into commercial CAD/FEA applications, and for integrating the data of the various types of studies included in the same project.

  20. An efficient platform for genetic selection and screening of gene switches in Escherichia coli

    PubMed Central

    Muranaka, Norihito; Sharma, Vandana; Nomura, Yoko; Yokobayashi, Yohei

    2009-01-01

    Engineered gene switches and circuits that can sense various biochemical and physical signals, perform computation, and produce predictable outputs are expected to greatly advance our ability to program complex cellular behaviors. However, rational design of gene switches and circuits that function in living cells is challenging due to the complex intracellular milieu. Consequently, most successful designs of gene switches and circuits have relied, to some extent, on high-throughput screening and/or selection from combinatorial libraries of gene switch and circuit variants. In this study, we describe a generic and efficient platform for selection and screening of gene switches and circuits in Escherichia coli from large libraries. The single-gene dual selection marker tetA was translationally fused to green fluorescent protein (gfpuv) via a flexible peptide linker and used as a dual selection and screening marker for laboratory evolution of gene switches. Single-cycle (sequential positive and negative selections) enrichment efficiencies of >7000 were observed in mock selections of model libraries containing functional riboswitches in liquid culture. The technique was applied to optimize various parameters affecting the selection outcome, and to isolate novel thiamine pyrophosphate riboswitches from a complex library. Artificial riboswitches with excellent characteristics were isolated that exhibit up to 58-fold activation as measured by fluorescent reporter gene assay. PMID:19190095

  1. XPI: The Xanadu Parameter Interface

    NASA Technical Reports Server (NTRS)

    White, N.; Barrett, P.; Oneel, B.; Jacobs, P.

    1992-01-01

XPI is a table-driven parameter interface which greatly simplifies both command-driven programs such as BROWSE and XIMAGE and stand-alone single-task programs. It moves all of the syntax and semantic parsing of commands and parameters out of the user's code into common code and externally defined tables. This allows the programmer to concentrate on writing the code unique to the application rather than reinventing the user interface, and allows external graphical interfaces to be attached with no changes to the command-driven program. XPI also includes a compatibility library which allows programs written using the IRAF host interface (Mandel and Roll) to use XPI in place of the IRAF host interface.
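The table-driven idea, where command syntax lives in data rather than in application code, can be sketched generically. The table layout and handler below are hypothetical illustrations of the pattern, not XPI's actual file format or API.

```python
# Hypothetical parameter table: name -> (type, default, help). In an
# XPI-style system this would live in an external file, not in the program.
TABLE = {
    "energy": (float, 2.0, "energy band in keV"),
    "smooth": (bool, False, "apply smoothing"),
}

def parse(tokens, table=TABLE):
    """Parse 'name=value' tokens against the table; unknown names and bad
    values are rejected by this common code, not by the application."""
    params = {name: default for name, (_, default, _) in table.items()}
    for tok in tokens:
        name, _, raw = tok.partition("=")
        if name not in table:
            raise ValueError(f"unknown parameter: {name}")
        typ = table[name][0]
        params[name] = (raw.lower() in ("yes", "true", "1")) if typ is bool else typ(raw)
    return params

p = parse(["energy=6.5", "smooth=yes"])
```

The application only ever sees the validated `params` dictionary; changing the accepted syntax means editing the table, not the program.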

  2. A microwave method for measuring moisture content, density, and grain angle of wood

    Treesearch

    W. L. James; Y.-H. Yen; R. J. King

    1985-01-01

    The attenuation, phase shift and depolarization of a polarized 4.81-gigahertz wave as it is transmitted through a wood specimen can provide estimates of the moisture content (MC), density, and grain angle of the specimen. Calibrations are empirical, and computations are complicated, with considerable interaction between parameters. Measured dielectric parameters,...

  3. Tradeoffs between hydraulic and mechanical stress responses of mature Norway spruce trunk wood.

    PubMed

    Rosner, Sabine; Klein, Andrea; Müller, Ulrich; Karlsson, Bo

    2008-08-01

    We tested the effects of growth characteristics and basic density on hydraulic and mechanical properties of mature Norway spruce (Picea abies (L.) Karst.) wood from six 24-year-old clones, grown on two sites in southern Sweden differing in water availability. Hydraulic parameters assessed were specific hydraulic conductivity at full saturation (ks100) and vulnerability to cavitation (Psi50), mechanical parameters included bending strength (sigma b), modulus of elasticity (MOE), compression strength (sigma a) and Young's modulus (E). Basic density, diameter at breast height, tree height, and hydraulic and mechanical parameters varied considerably among clones. Clonal means of hydraulic and mechanical properties were strongly related to basic density and to growth parameters across sites, especially to diameter at breast height. Compared with stem wood of slower growing clones, stem wood of rapidly growing clones had significantly lower basic density, lower sigma b, MOE, sigma a and E, was more vulnerable to cavitation, but had higher ks100. Basic density was negatively correlated to Psi50 and ks100. We therefore found a tradeoff between Psi50 and ks100. Clones with high basic density had significantly lower hydraulic vulnerability, but also lower hydraulic conductivity at full saturation and thus less rapid growth than clones with low basic density. This tradeoff involved a negative relationship between Psi50 and sigma b as well as MOE, and between ks100 and sigma b, MOE and sigma a. Basic density and Psi50 showed no site-specific differences, but tree height, diameter at breast height, ks100 and mechanical strength and stiffness were significantly lower at the drier site. Basic density had no influence on the site-dependent differences in hydraulic and mechanical properties, but was strongly negatively related to diameter at breast height. 
Selecting for growth may thus lead not only to a reduction in mechanical strength and stiffness but also to a reduction in hydraulic safety.

  4. Population Pharmacokinetic/Pharmacodynamic Analysis of Alirocumab in Healthy Volunteers or Hypercholesterolemic Subjects Using an Indirect Response Model to Predict Low-Density Lipoprotein Cholesterol Lowering: Support for a Biologics License Application Submission: Part II.

    PubMed

    Nicolas, Xavier; Djebli, Nassim; Rauch, Clémence; Brunet, Aurélie; Hurbin, Fabrice; Martinez, Jean-Marie; Fabre, David

    2018-05-03

    Alirocumab, a human monoclonal antibody against proprotein convertase subtilisin/kexin type 9 (PCSK9), significantly lowers low-density lipoprotein cholesterol levels. This analysis aimed to develop and qualify a population pharmacokinetic/pharmacodynamic model for alirocumab based on pooled data obtained from 13 phase I/II/III clinical trials. From a dataset of 2799 individuals (14,346 low-density lipoprotein-cholesterol values), individual pharmacokinetic parameters from the population pharmacokinetic model presented in Part I of this series were used to estimate alirocumab concentrations. As a second step, we then developed the current population pharmacokinetic/pharmacodynamic model using an indirect response model with a Hill coefficient, parameterized with increasing low-density lipoprotein cholesterol elimination, to relate alirocumab concentrations to low-density lipoprotein cholesterol values. The population pharmacokinetic/pharmacodynamic model allowed the characterization of the pharmacokinetic/pharmacodynamic properties of alirocumab in the target population and estimation of individual low-density lipoprotein cholesterol levels and derived pharmacodynamic parameters (the maximum decrease in low-density lipoprotein cholesterol values from baseline and the difference between baseline low-density lipoprotein cholesterol and the pre-dose value before the next alirocumab dose). Significant parameter-covariate relationships were retained in the model, with a total of ten covariates (sex, age, weight, free baseline PCSK9, total time-varying PCSK9, concomitant statin administration, total baseline PCSK9, co-administration of high-dose statins, disease status) included in the final population pharmacokinetic/pharmacodynamic model to explain between-subject variability. Nevertheless, the high number of covariates included in the model did not have a clinically meaningful impact on model-derived pharmacodynamic parameters. 
This model successfully allowed the characterization of the population pharmacokinetic/pharmacodynamic properties of alirocumab in its target population and the estimation of individual low-density lipoprotein cholesterol levels.

  5. Determination of buoyant density and sensitivity to chloroform and freon for the etiological agent of infectious salmonid anaemia

    USGS Publications Warehouse

Christie, K.E.; Hjeltnes, B.; Uglenes, I.; Winton, J.R.

    1993-01-01

Plasma was collected from Atlantic salmon Salmo salar with acute infectious salmon anaemia (ISA) and used to challenge Atlantic salmon parr by intraperitoneal injection. Treatment of plasma with the lipid solvent chloroform showed that the etiological agent of ISA contained essential lipids, probably as a viral envelope. Some infectivity remained following treatment with freon. Injection challenges using fractions from equilibrium density gradient centrifugation of plasma from fish with acute ISA revealed a band of infectivity in the range 1.184 to 1.262 g cm-3. The band was believed to contain both complete ISA-virus particles and infectious particles lacking a complete envelope, nucleocapsid or genome. Density gradient centrifugation of infectious plasma for enrichment of the putative ISA virus appeared to offer a suitable method for obtaining virus-specific nucleic acid for use in the construction of cDNA libraries.

  6. DNA stable-isotope probing (DNA-SIP).

    PubMed

    Dunford, Eric A; Neufeld, Josh D

    2010-08-02

    DNA stable-isotope probing (DNA-SIP) is a powerful technique for identifying active microorganisms that assimilate particular carbon substrates and nutrients into cellular biomass. As such, this cultivation-independent technique has been an important methodology for assigning metabolic function to the diverse communities inhabiting a wide range of terrestrial and aquatic environments. Following the incubation of an environmental sample with stable-isotope labelled compounds, extracted nucleic acid is subjected to density gradient ultracentrifugation and subsequent gradient fractionation to separate nucleic acids of differing densities. Purification of DNA from cesium chloride retrieves labelled and unlabelled DNA for subsequent molecular characterization (e.g. fingerprinting, microarrays, clone libraries, metagenomics). This JoVE video protocol provides visual step-by-step explanations of the protocol for density gradient ultracentrifugation, gradient fractionation and recovery of labelled DNA. The protocol also includes sample SIP data and highlights important tips and cautions that must be considered to ensure a successful DNA-SIP analysis.

  7. The effect of laser focus and process parameters on microstructure and mechanical properties of SLM Inconel 718

    NASA Astrophysics Data System (ADS)

    Bean, Glenn E.; Witkin, David B.; McLouth, Tait D.; Zaldivar, Rafael J.

    2018-02-01

Research on the selective laser melting (SLM) method of laser powder bed fusion additive manufacturing (AM) has shown that surface and internal quality of AM parts is directly related to machine settings such as laser energy density, scanning strategies, and atmosphere. To optimize laser parameters for improved component quality, the energy density is typically controlled via laser power, scanning rate, and scanning strategy, but can also be controlled by changing the spot size via laser focal plane shift. Present work being conducted by The Aerospace Corporation was initiated after observing inconsistent build quality of parts printed using OEM-installed settings. Initial builds of Inconel 718 witness geometries using OEM laser parameters were evaluated for surface roughness, density, and porosity while varying energy density via laser focus shift. Based on these results, hardware and laser parameter adjustments were conducted in order to improve build quality and consistency. Tensile testing was also conducted to investigate the effect of build plate location and laser settings on SLM 718. This work has provided insight into the limitations of OEM parameters compared with optimized parameters towards the goal of manufacturing aerospace-grade parts, and has led to the development of a methodology for laser parameter tuning that can be applied to other alloy systems. Additionally, evidence was found that for 718, which derives its strength from post-manufacturing heat treatment, there is a possibility that tensile testing may not be sensitive to defects which would reduce component performance. Ongoing research is being conducted towards identifying appropriate testing and analysis methods for screening and quality assurance.
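The "energy density" knob discussed above is usually summarized by the volumetric form E = P/(v·h·t). The parameter values below are typical published Inconel 718 settings, not this study's, and the formula deliberately ignores spot size, which is precisely why the focal-plane shift acts as an independent control.

```python
def volumetric_energy_density(power_w, speed_mm_s, hatch_mm, layer_mm):
    """Common SLM figure of merit E = P / (v * h * t) in J/mm^3. One of
    several conventions in the literature; spot-size (focal-shift) effects
    are not captured by this expression."""
    return power_w / (speed_mm_s * hatch_mm * layer_mm)

# Illustrative Inconel 718 settings (not the paper's values):
# 285 W, 960 mm/s scan speed, 0.11 mm hatch, 0.04 mm layer.
E = volumetric_energy_density(285.0, 960.0, 0.11, 0.04)
```

Two parameter sets with identical E can still melt differently if the spot diameter differs, which motivates treating focus shift as its own tuning variable.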

  8. Determination of remodeling parameters for a strain-adaptive finite element model of the distal ulna.

    PubMed

    Neuert, Mark A C; Dunning, Cynthia E

    2013-09-01

    Strain energy-based adaptive material models are used to predict bone resorption resulting from stress shielding induced by prosthetic joint implants. Generally, such models are governed by two key parameters: a homeostatic strain-energy state (K) and a threshold deviation from this state required to initiate bone reformation (s). A refinement procedure has been performed to estimate these parameters in the femur and glenoid; this study investigates the specific influences of these parameters on resulting density distributions in the distal ulna. A finite element model of a human ulna was created using micro-computed tomography (µCT) data, initialized to a homogeneous density distribution, and subjected to approximate in vivo loading. Values for K and s were tested, and the resulting steady-state density distribution compared with values derived from µCT images. The sensitivity of these parameters to initial conditions was examined by altering the initial homogeneous density value. The refined model parameters selected were then applied to six additional human ulnae to determine their performance across individuals. Model accuracy using the refined parameters was found to be comparable with that found in previous studies of the glenoid and femur, and gross bone structures, such as the cortical shell and medullary canal, were reproduced. The model was found to be insensitive to initial conditions; however, a fair degree of variation was observed between the six specimens. This work represents an important contribution to the study of changes in load transfer in the distal ulna following the implementation of commercial orthopedic implants.
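The two governing parameters can be made concrete with a minimal Huiskes-style update rule, where density adapts only when the stimulus leaves the "lazy zone" of half-width s around the homeostatic level K. The rate constant and density bounds below are illustrative assumptions, not the paper's calibrated values.

```python
def remodel(rho, U, K, s, B=1.0, dt=1.0, rho_min=0.01, rho_max=1.73):
    """One explicit update of a strain-adaptive remodeling rule: density
    changes only when the stimulus U/rho leaves the lazy zone K*(1 +/- s).
    Constants are illustrative, not values fitted in the study."""
    S = U / rho                           # strain energy per unit mass
    if S > K * (1.0 + s):                 # overloaded: deposit bone
        rho += B * (S - K * (1.0 + s)) * dt
    elif S < K * (1.0 - s):               # underloaded: resorb bone
        rho += B * (S - K * (1.0 - s)) * dt
    return min(max(rho, rho_min), rho_max)
```

Iterating this element-by-element from a homogeneous initial density, as the study does, drives overloaded regions toward cortical densities and underloaded regions toward resorption, reproducing structures such as the cortical shell and medullary canal.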

  9. Elastic full-waveform inversion and parameterization analysis applied to walk-away vertical seismic profile data for unconventional (heavy oil) reservoir characterization

    NASA Astrophysics Data System (ADS)

    Pan, Wenyong; Innanen, Kristopher A.; Geng, Yu

    2018-03-01

Seismic full-waveform inversion (FWI) methods hold strong potential to recover multiple subsurface elastic properties for hydrocarbon reservoir characterization. Simultaneously updating multiple physical parameters introduces the problem of interparameter tradeoff, arising from the covariance between different physical parameters, which increases nonlinearity and uncertainty of multiparameter FWI. The coupling effects of different physical parameters are significantly influenced by model parameterization and acquisition arrangement. An appropriate choice of model parameterization is critical to successful field data applications of multiparameter FWI. The objective of this paper is to examine the performance of various model parameterizations in isotropic-elastic FWI with walk-away vertical seismic profile (W-VSP) dataset for unconventional heavy oil reservoir characterization. Six model parameterizations are considered: velocity-density (α, β and ρ′), modulus-density (κ, μ and ρ), Lamé-density (λ, μ′ and ρ‴), impedance-density (IP, IS and ρ″), velocity-impedance-I (α′, β′ and I′P), and velocity-impedance-II (α″, β″ and I′S). We begin analyzing the interparameter tradeoff by making use of scattering radiation patterns, which is a common strategy for qualitative parameter resolution analysis. In this paper, we discuss the advantages and limitations of the scattering radiation patterns and recommend that interparameter tradeoffs be evaluated using interparameter contamination kernels, which provide quantitative, second-order measurements of the interparameter contaminations and can be constructed efficiently with an adjoint-state approach. Synthetic W-VSP isotropic-elastic FWI experiments in the time domain verify our conclusions about interparameter tradeoffs for various model parameterizations.
Density profiles are most strongly influenced by the interparameter contaminations; depending on model parameterization, the inverted density profile can be over-estimated, under-estimated or spatially distorted. Among the six cases, only the velocity-density parameterization provides stable and informative density features not included in the starting model. Field data applications of multicomponent W-VSP isotropic-elastic FWI in the time domain were also carried out. The heavy oil reservoir target zone, characterized by low α-to-β ratios and low Poisson's ratios, can be identified clearly with the inverted isotropic-elastic parameters.
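The six parameterizations are algebraic re-expressions of the same three isotropic-elastic fields. The standard conversions from the velocity-density class can be sketched as follows; the numerical values are illustrative, not from the study.

```python
def convert(alpha, beta, rho):
    """Standard isotropic-elastic identities linking the velocity-density
    class (alpha, beta, rho) to the moduli and impedance classes."""
    mu = rho * beta**2                       # shear modulus
    lam = rho * (alpha**2 - 2.0 * beta**2)   # first Lame parameter
    kappa = lam + 2.0 * mu / 3.0             # bulk modulus
    ip, is_ = rho * alpha, rho * beta        # P- and S-impedance
    return {"mu": mu, "lam": lam, "kappa": kappa, "Ip": ip, "Is": is_}

# Illustrative heavy-oil-sand values (m/s and kg/m^3), not the paper's model.
p = convert(alpha=2500.0, beta=1200.0, rho=2200.0)
```

Because the transformations are exact, the choice among them changes not the physics but how perturbations in one field scatter into the gradients of the others, which is what the contamination kernels quantify.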

  10. Elastic full-waveform inversion and parameterization analysis applied to walk-away vertical seismic profile data for unconventional (heavy oil) reservoir characterization

    DOE PAGES

    Pan, Wenyong; Innanen, Kristopher A.; Geng, Yu

    2018-03-06

We report that seismic full-waveform inversion (FWI) methods hold strong potential to recover multiple subsurface elastic properties for hydrocarbon reservoir characterization. Simultaneously updating multiple physical parameters introduces the problem of interparameter tradeoff, arising from the covariance between different physical parameters, which increases nonlinearity and uncertainty of multiparameter FWI. The coupling effects of different physical parameters are significantly influenced by model parameterization and acquisition arrangement. An appropriate choice of model parameterization is critical to successful field data applications of multiparameter FWI. The objective of this paper is to examine the performance of various model parameterizations in isotropic-elastic FWI with walk-away vertical seismic profile (W-VSP) dataset for unconventional heavy oil reservoir characterization. Six model parameterizations are considered: velocity-density (α, β and ρ'), modulus-density (κ, μ and ρ), Lamé-density (λ, μ' and ρ'''), impedance-density (IP, IS and ρ''), velocity-impedance-I (α', β' and I'P), and velocity-impedance-II (α'', β'' and I'S). We begin analyzing the interparameter tradeoff by making use of scattering radiation patterns, which is a common strategy for qualitative parameter resolution analysis. In this paper, we discuss the advantages and limitations of the scattering radiation patterns and recommend that interparameter tradeoffs be evaluated using interparameter contamination kernels, which provide quantitative, second-order measurements of the interparameter contaminations and can be constructed efficiently with an adjoint-state approach. Synthetic W-VSP isotropic-elastic FWI experiments in the time domain verify our conclusions about interparameter tradeoffs for various model parameterizations.
Density profiles are most strongly influenced by the interparameter contaminations; depending on model parameterization, the inverted density profile can be over-estimated, under-estimated or spatially distorted. Among the six cases, only the velocity-density parameterization provides stable and informative density features not included in the starting model. Field data applications of multicomponent W-VSP isotropic-elastic FWI in the time domain were also carried out. Finally, the heavy oil reservoir target zone, characterized by low α-to-β ratios and low Poisson’s ratios, can be identified clearly with the inverted isotropic-elastic parameters.« less
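
    The six parameterizations compared above are algebraic recombinations of the same three physical fields. As a hedged illustration (the function name and numeric values are ours, not the paper's), the standard isotropic-elastic conversions can be sketched as:

```python
# Illustrative conversions between the parameterization families discussed
# above: velocity-density (alpha, beta, rho) into Lame-density,
# modulus-density and impedance-density. Values are representative only.
def convert_parameterizations(alpha, beta, rho):
    mu = rho * beta**2                  # shear modulus
    lam = rho * alpha**2 - 2.0 * mu     # first Lame parameter
    kappa = lam + 2.0 * mu / 3.0        # bulk modulus
    ip = rho * alpha                    # P-impedance
    is_ = rho * beta                    # S-impedance
    return {"lame": (lam, mu, rho),
            "modulus": (kappa, mu, rho),
            "impedance": (ip, is_, rho)}

# Soft-sediment-like values in SI units (m/s, kg/m^3).
params = convert_parameterizations(alpha=2000.0, beta=900.0, rho=2100.0)
```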

  11. Elastic full-waveform inversion and parameterization analysis applied to walk-away vertical seismic profile data for unconventional (heavy oil) reservoir characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pan, Wenyong; Innanen, Kristopher A.; Geng, Yu

    Seismic full-waveform inversion (FWI) methods hold strong potential to recover multiple subsurface elastic properties for hydrocarbon reservoir characterization. Simultaneously updating multiple physical parameters introduces the problem of interparameter tradeoff, arising from the covariance between different physical parameters, which increases the nonlinearity and uncertainty of multiparameter FWI. The coupling effects of different physical parameters are significantly influenced by model parameterization and acquisition arrangement. An appropriate choice of model parameterization is critical to successful field data applications of multiparameter FWI. The objective of this paper is to examine the performance of various model parameterizations in isotropic-elastic FWI with a walk-away vertical seismic profile (W-VSP) dataset for unconventional heavy oil reservoir characterization. Six model parameterizations are considered: velocity-density (α, β and ρ′), modulus-density (κ, μ and ρ), Lamé-density (λ, μ′ and ρ‴), impedance-density (IP, IS and ρ″), velocity-impedance-I (α′, β′ and I′P), and velocity-impedance-II (α″, β″ and I′S). We begin analyzing the interparameter tradeoff by making use of scattering radiation patterns, which is a common strategy for qualitative parameter resolution analysis. In this paper, we discuss the advantages and limitations of the scattering radiation patterns and recommend that interparameter tradeoffs be evaluated using interparameter contamination kernels, which provide quantitative, second-order measurements of the interparameter contaminations and can be constructed efficiently with an adjoint-state approach. Synthetic W-VSP isotropic-elastic FWI experiments in the time domain verify our conclusions about interparameter tradeoffs for various model parameterizations.
Density profiles are most strongly influenced by the interparameter contaminations; depending on model parameterization, the inverted density profile can be overestimated, underestimated or spatially distorted. Among the six cases, only the velocity-density parameterization provides stable and informative density features not included in the starting model. Field data applications of multicomponent W-VSP isotropic-elastic FWI in the time domain were also carried out. Finally, the heavy oil reservoir target zone, characterized by low α-to-β ratios and low Poisson's ratios, can be identified clearly with the inverted isotropic-elastic parameters.

  12. Elastic full-waveform inversion and parametrization analysis applied to walk-away vertical seismic profile data for unconventional (heavy oil) reservoir characterization

    NASA Astrophysics Data System (ADS)

    Pan, Wenyong; Innanen, Kristopher A.; Geng, Yu

    2018-06-01

    Seismic full-waveform inversion (FWI) methods hold strong potential to recover multiple subsurface elastic properties for hydrocarbon reservoir characterization. Simultaneously updating multiple physical parameters introduces the problem of interparameter trade-off, arising from the simultaneous variations of different physical parameters, which increase the nonlinearity and uncertainty of multiparameter FWI. The coupling effects of different physical parameters are significantly influenced by model parametrization and acquisition arrangement. An appropriate choice of model parametrization is important to successful field data applications of multiparameter FWI. The objective of this paper is to examine the performance of various model parametrizations in isotropic-elastic FWI with walk-away vertical seismic profile (W-VSP) data for unconventional heavy oil reservoir characterization. Six model parametrizations are considered: velocity-density (α, β and ρ′), modulus-density (κ, μ and ρ), Lamé-density (λ, μ′ and ρ‴), impedance-density (IP, IS and ρ″), velocity-impedance-I (α′, β′ and I′P) and velocity-impedance-II (α″, β″ and I′S). We begin analysing the interparameter trade-off by making use of scattering radiation patterns, which is a common strategy for qualitative parameter resolution analysis. We discuss the advantages and limitations of the scattering radiation patterns and recommend that interparameter trade-offs be evaluated using interparameter contamination kernels, which provide quantitative, second-order measurements of the interparameter contaminations and can be constructed efficiently with an adjoint-state approach. Synthetic W-VSP isotropic-elastic FWI experiments in the time domain verify our conclusions about interparameter trade-offs for various model parametrizations.
Density profiles are most strongly influenced by the interparameter contaminations; depending on model parametrization, the inverted density profile can be overestimated, underestimated or spatially distorted. Among the six cases, only the velocity-density parametrization provides stable and informative density features not included in the starting model. Field data applications of multicomponent W-VSP isotropic-elastic FWI in the time domain were also carried out. The heavy oil reservoir target zone, characterized by low α-to-β ratios and low Poisson's ratios, can be identified clearly with the inverted isotropic-elastic parameters.

  13. Parameter dependences of the separatrix density in nitrogen seeded ASDEX Upgrade H-mode discharges

    NASA Astrophysics Data System (ADS)

    Kallenbach, A.; Sun, H. J.; Eich, T.; Carralero, D.; Hobirk, J.; Scarabosio, A.; Siccinio, M.; ASDEX Upgrade Team; EUROfusion MST1 Team

    2018-04-01

    The upstream separatrix electron density is an important interface parameter for core performance and divertor power exhaust. It has been measured in ASDEX Upgrade H-mode discharges by means of Thomson scattering using a self-consistent estimate of the upstream electron temperature under the assumption of Spitzer-Härm electron conduction. Its dependence on various plasma parameters has been tested for different plasma conditions in H-mode. The leading parameter determining n_e,sep was found to be the neutral divertor pressure, which can be considered an engineering parameter since it is determined mainly by the gas puff rate and the pumping speed. The experimentally found parameter dependence of n_e,sep, which is dominated by the divertor neutral pressure, could be approximately reconciled by two-point modelling.
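
    Scalings of this kind are typically expressed as power laws and fitted by linear regression in log-log space. A minimal sketch with synthetic data (the exponent 0.31 and prefactor are illustrative placeholders, not the published ASDEX Upgrade scaling):

```python
import numpy as np

# Synthetic, noise-free "measurements" following an assumed power law
# n_e,sep = c * p_div**a; exponent and prefactor are illustrative only.
p_div = np.array([0.5, 1.0, 2.0, 4.0])   # neutral divertor pressure (Pa)
n_sep = 1.8e19 * p_div**0.31             # separatrix density (m^-3)

# A linear fit in log-log space recovers the power-law parameters.
a, log_c = np.polyfit(np.log(p_div), np.log(n_sep), 1)
```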

  14. DAMQT: A package for the analysis of electron density in molecules

    NASA Astrophysics Data System (ADS)

    López, Rafael; Rico, Jaime Fernández; Ramírez, Guillermo; Ema, Ignacio; Zorrilla, David

    2009-09-01

    DAMQT is a package for the analysis of the electron density in molecules and the fast computation of the density, density deformations, electrostatic potential and field, and Hellmann-Feynman forces. The method is based on the partition of the electron density into atomic fragments by means of a least-deformation criterion. Each atomic fragment of the density is expanded in regular spherical harmonics times radial factors, which are piecewise represented in terms of analytical functions. This representation is used for the fast evaluation of the electrostatic potential and field generated by the electron density and nuclei, as well as for the computation of the Hellmann-Feynman forces on the nuclei. An analysis of the atomic and molecular deformations of the density can also be carried out, yielding a picture that connects with several concepts of empirical structural chemistry. Program summary: Program title: DAMQT1.0 Catalogue identifier: AEDL_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDL_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPLv3 No. of lines in distributed program, including test data, etc.: 278 356 No. of bytes in distributed program, including test data, etc.: 31 065 317 Distribution format: tar.gz Programming language: Fortran90 and C++ Computer: Any Operating system: Linux, Windows (XP, Vista) RAM: 190 Mbytes Classification: 16.1 External routines: Trolltech's Qt (4.3 or higher) (http://www.qtsoftware.com/products), OpenGL (1.1 or higher) (http://www.opengl.org/), GLUT 3.7 (http://www.opengl.org/resources/libraries/glut/). Nature of problem: Analysis of the molecular electron density and density deformations, including fast evaluation of the electrostatic potential, electric field and Hellmann-Feynman forces on nuclei.
Solution method: The method of Deformed Atoms in Molecules, reported elsewhere [1], is used for partitioning the molecular electron density into atomic fragments, which are further expanded in spherical harmonics times radial factors. The partition is used for defining molecular density deformations and for the fast calculation of several properties associated with the density. Restrictions: The current version is limited to 120 atoms, 2000 contracted functions, and l=5 in basis functions. The density must come from an LCAO calculation (any level) with spherical (not Cartesian) Gaussian functions. Unusual features: The program contains an OPEN statement to binary files (stream) in file GOPENMOL.F90. This statement does not have a standard syntax in Fortran 90. Two possibilities are considered in conditional compilation: Intel's ifort and the Fortran 2003 standard. The latter is applied to compilers other than ifort (gfortran uses this one, for instance). Additional comments: The distribution file for this program is over 30 Mbytes and therefore is not delivered directly when a download or e-mail is requested. Instead, an HTML file giving details of how the program can be obtained is sent. Running time: Largely dependent on the system size and the module run (from fractions of a second to hours). References: [1] J. Fernández Rico, R. López, I. Ema, G. Ramírez, J. Mol. Struct. (Theochem) 727 (2005) 115.

  15. libprofit: Image creation from luminosity profiles

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Taranu, D.; Tobar, R.

    2016-12-01

    libprofit is a C++ library for image creation based on different luminosity profiles. It offers fast and accurate two-dimensional integration for a useful number of profiles, including Sersic, Core-Sersic, broken-exponential, Ferrer, Moffat, empirical King, point-source and sky, with a simple mechanism for adding new profiles. libprofit provides a utility to read the model and profile parameters from the command-line and generate the corresponding image. It can output the resulting image as text values, a binary stream, or as a simple FITS file. It also provides a shared library exposing an API that can be used by any third-party application. R and Python interfaces are available: ProFit (ascl:1612.004) and PyProfit (ascl:1612.005).
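
    A minimal sketch of what such a profile image amounts to, using the Sersic profile as an example (NumPy rather than libprofit's C++ API; the b_n value is the common Ciotti & Bertin approximation, and libprofit's own integration is more careful near the centre):

```python
import numpy as np

# 2D Sersic surface-brightness image on a pixel grid.
def sersic_image(shape, xc, yc, re, n, ie=1.0):
    bn = 2.0 * n - 1.0 / 3.0 + 4.0 / (405.0 * n)   # approximate b_n
    y, x = np.indices(shape)
    r = np.hypot(x - xc, y - yc)                   # radius from the centre
    return ie * np.exp(-bn * ((r / re) ** (1.0 / n) - 1.0))

# A de Vaucouleurs-like (n = 4) profile with effective radius of 10 pixels.
img = sersic_image((64, 64), xc=32, yc=32, re=10.0, n=4.0)
```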

  16. Research and realization of key technology in HILS interactive system

    NASA Astrophysics Data System (ADS)

    Liu, Che; Lu, Huiming; Wang, Fankai

    2018-03-01

    This paper presents the design of an HILS (Hardware In the Loop Simulation) interactive system based on the xPC platform. Through the interface between C++ and the MATLAB engine, a seamless data connection is established between Simulink and the interactive system, completing data interaction between the system and Simulink and realizing model configuration, parameter modification and offline simulation. Data communication between the host and target machines is established over TCP/IP to support model download and real-time simulation. A database stores the simulation data, enabling real-time simulation monitoring and simulation data management. System functions are integrated using the Qt graphical interface library and dynamic link libraries. Finally, a typical control system is taken as an example to verify the feasibility of the HILS interactive system.

  17. Spiral Galaxy Lensing: A Model with Twist

    NASA Astrophysics Data System (ADS)

    Bell, Steven R.; Ernst, Brett; Fancher, Sean; Keeton, Charles R.; Komanduru, Abi; Lundberg, Erik

    2014-12-01

    We propose a single galaxy gravitational lensing model with a mass density that has a spiral structure. Namely, we extend the arcsine gravitational lens (a truncated singular isothermal elliptical model), adding an additional parameter that controls the amount of spiraling in the structure of the mass density. An important feature of our model is that, even though the mass density is sophisticated, we succeed in integrating the deflection term in closed form using a Gauss hypergeometric function. When the spiraling parameter is set to zero, this reduces to the arcsine lens.

  18. The effect of respiratory induced density variations on non-TOF PET quantitation in the lung.

    PubMed

    Holman, Beverley F; Cuplov, Vesna; Hutton, Brian F; Groves, Ashley M; Thielemans, Kris

    2016-04-21

    Accurate PET quantitation requires a matched attenuation map. Obtaining matched CT attenuation maps in the thorax is difficult due to the respiratory cycle which causes both motion and density changes. Unlike with motion, little attention has been given to the effects of density changes in the lung on PET quantitation. This work aims to explore the extent of the errors caused by pulmonary density attenuation map mismatch on dynamic and static parameter estimates. Dynamic XCAT phantoms were utilised using clinically relevant (18)F-FDG and (18)F-FMISO time activity curves for all organs within the thorax to estimate the expected parameter errors. The simulations were then validated with PET data from 5 patients suffering from idiopathic pulmonary fibrosis who underwent PET/Cine-CT. The PET data were reconstructed with three gates obtained from the Cine-CT and the average Cine-CT. The lung TACs clearly displayed differences between true and measured curves with error depending on global activity distribution at the time of measurement. The density errors from using a mismatched attenuation map were found to have a considerable impact on PET quantitative accuracy. Maximum errors due to density mismatch were found to be as high as 25% in the XCAT simulation. Differences in patient derived kinetic parameter estimates and static concentration between the extreme gates were found to be as high as 31% and 14%, respectively. Overall our results show that respiratory associated density errors in the attenuation map affect quantitation throughout the lung, not just regions near boundaries. The extent of this error is dependent on the activity distribution in the thorax and hence on the tracer and time of acquisition. Consequently there may be a significant impact on estimated kinetic parameters throughout the lung.
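
    The mechanism can be illustrated with a back-of-envelope attenuation-correction calculation (a toy model with made-up coefficients, not the paper's XCAT simulation):

```python
import math

# Percent bias in reconstructed activity when the attenuation map assumes a
# different linear attenuation coefficient than the true one along a path.
def quantitation_error(mu_true, mu_assumed, path_cm):
    acf_true = math.exp(mu_true * path_cm)        # true correction factor
    acf_assumed = math.exp(mu_assumed * path_cm)  # applied correction factor
    return (acf_assumed / acf_true - 1.0) * 100.0

# Illustrative lung values: denser (exhaled) tissue attenuates more, so a
# mismatched, less-dense map under-corrects the measured activity.
bias = quantitation_error(mu_true=0.03, mu_assumed=0.02, path_cm=20.0)
```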

  19. On the role of density and attenuation in 3D multi-parameter visco-acoustic VTI frequency-domain FWI: an OBC case study from the North Sea

    NASA Astrophysics Data System (ADS)

    Operto, S.; Miniussi, A.

    2018-03-01

    Three-dimensional frequency-domain full waveform inversion (FWI) is applied to North Sea wide-azimuth ocean-bottom cable data at low frequencies (≤ 10 Hz) to jointly update vertical wavespeed, density and quality factor Q in the visco-acoustic VTI approximation. We assess whether density and Q should be viewed as proxies that absorb artefacts resulting from approximate wave physics, or are valuable for interpretation in the presence of saturated sediments and gas. FWI is performed in the frequency domain to account for attenuation easily. Multi-parameter frequency-domain FWI is efficiently performed with a few discrete frequencies following multi-scale frequency continuation. However, grouping a few frequencies during each multi-scale step is necessary to mitigate acquisition footprint and match dispersive shallow guided waves. Q and density absorb a significant part of the acquisition footprint, thereby cleaning the velocity model of this pollution. Low Q perturbations correlate with low-velocity zones associated with soft sediments and the gas cloud. However, the amplitudes of the Q perturbations show significant variations when the inversion tuning is modified. This dispersion in the Q reconstructions is, however, not passed on to the velocity parameter, suggesting that cross-talk between first-order kinematic and second-order dynamic parameters is limited. The density model shows a good match with a well log at shallow depths. Moreover, the impedance built a posteriori from the FWI velocity and density models shows a well-focused image, albeit with local differences from the velocity model near the sea bed, where density might have absorbed elastic effects. The FWI models are finally assessed against time-domain synthetic seismograms computed with the same frequency-domain modelling engine used for FWI.

  20. The effect of respiratory induced density variations on non-TOF PET quantitation in the lung

    NASA Astrophysics Data System (ADS)

    Holman, Beverley F.; Cuplov, Vesna; Hutton, Brian F.; Groves, Ashley M.; Thielemans, Kris

    2016-04-01

    Accurate PET quantitation requires a matched attenuation map. Obtaining matched CT attenuation maps in the thorax is difficult due to the respiratory cycle which causes both motion and density changes. Unlike with motion, little attention has been given to the effects of density changes in the lung on PET quantitation. This work aims to explore the extent of the errors caused by pulmonary density attenuation map mismatch on dynamic and static parameter estimates. Dynamic XCAT phantoms were utilised using clinically relevant 18F-FDG and 18F-FMISO time activity curves for all organs within the thorax to estimate the expected parameter errors. The simulations were then validated with PET data from 5 patients suffering from idiopathic pulmonary fibrosis who underwent PET/Cine-CT. The PET data were reconstructed with three gates obtained from the Cine-CT and the average Cine-CT. The lung TACs clearly displayed differences between true and measured curves with error depending on global activity distribution at the time of measurement. The density errors from using a mismatched attenuation map were found to have a considerable impact on PET quantitative accuracy. Maximum errors due to density mismatch were found to be as high as 25% in the XCAT simulation. Differences in patient derived kinetic parameter estimates and static concentration between the extreme gates were found to be as high as 31% and 14%, respectively. Overall our results show that respiratory associated density errors in the attenuation map affect quantitation throughout the lung, not just regions near boundaries. The extent of this error is dependent on the activity distribution in the thorax and hence on the tracer and time of acquisition. Consequently there may be a significant impact on estimated kinetic parameters throughout the lung.

  1. Density of Spray-Formed Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin M. McHugh; Volker Uhlenwinkel; Nils Ellendr

    2008-06-01

    Spray forming is an advanced materials processing technology that transforms molten metal into a near-net-shape solid by depositing atomized droplets onto a substrate. Depending on the application, the spray-formed material may be used in the as-deposited condition or it may undergo post-deposition processing. Regardless, the density of the as-deposited material is an important issue. Porosity is detrimental because it can significantly reduce strength, toughness, hardness and other properties. While it is not feasible to achieve fully dense material in the as-deposited state, density greater than 99% of theoretical density is possible if the atomization and impact conditions are optimized. Thermal conditions at the deposit surface and droplet impact angle are key processing parameters that influence the density of the material. This paper examines the factors that contribute to porosity formation during spray forming and illustrates that very high as-deposited density is achieved by optimizing processing parameters.

  2. ParFit: A Python-Based Object-Oriented Program for Fitting Molecular Mechanics Parameters to ab Initio Data.

    PubMed

    Zahariev, Federico; De Silva, Nuwan; Gordon, Mark S; Windus, Theresa L; Dick-Perez, Marilu

    2017-03-27

    A newly created object-oriented program for automating the process of fitting molecular-mechanics parameters to ab initio data, termed ParFit, is presented. ParFit uses a hybrid of deterministic and stochastic genetic algorithms. ParFit can simultaneously handle several molecular-mechanics parameters in multiple molecules and can also apply symmetric and antisymmetric constraints on the optimized parameters. The simultaneous handling of several molecules enhances the transferability of the fitted parameters. ParFit is written in Python, uses a rich set of standard and nonstandard Python libraries, and can be run in parallel on multicore computer systems. As an example, a series of phosphine oxides, important for metal extraction chemistry, are parametrized using ParFit. ParFit is an open source program available for free on GitHub (https://github.com/fzahari/ParFit).
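
    As a hedged sketch of the general idea (a toy one-parameter problem in our own notation, not ParFit's actual algorithm or API), a genetic-style search for a force-field parameter against reference energies might look like:

```python
import random

random.seed(0)

def energy(theta0, theta):
    # Harmonic bending term with unit force constant (toy model).
    return (theta - theta0) ** 2

# "Ab initio" reference energies generated from a known optimum of 109.5 deg.
ref = [(th, energy(109.5, th)) for th in (100.0, 109.5, 120.0)]

def fitness(theta0):
    # Negative sum of squared errors against the reference energies.
    return -sum((energy(theta0, th) - e) ** 2 for th, e in ref)

# Elitist genetic search: keep the 10 best, refill with mutated copies.
pop = [random.uniform(90.0, 130.0) for _ in range(30)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [random.choice(parents) + random.gauss(0.0, 0.5)
                     for _ in range(20)]
best = max(pop, key=fitness)
```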

  3. A study of the 3D radiative transfer effect in cloudy atmospheres

    NASA Astrophysics Data System (ADS)

    Okata, M.; Teruyuki, N.; Suzuki, K.

    2015-12-01

    Evaluating the effect of clouds in the atmosphere is a significant problem in Earth's radiation budget studies, given the large uncertainties in cloud microphysics and optical properties. In this situation, we still need more investigations of 3D cloud radiative transfer problems using not only models but also satellite observational data. For this purpose, we have developed a 3D Monte Carlo radiative transfer code that is implemented with various functions compatible with the OpenCLASTR R-Star radiation code for radiance and flux computation, i.e. forward and backward tracing routines, a non-linear k-distribution parameterization (Sekiguchi and Nakajima, 2008) for broadband solar flux calculation, the DM method for flux and the TMS method for upward radiance (Nakajima and Tanaka, 1998). We also developed a Minimum cloud Information Deviation Profiling Method (MIDPM) for constructing a 3D cloud field from MODIS/AQUA and CPR/CloudSat data. We then selected a best-matched radar reflectivity factor profile from the library for each off-nadir MODIS pixel where no CPR profile is available, by minimizing the deviation between the library MODIS parameters and those at the pixel. In this study, we used three cloud microphysical parameters as key parameters for the MIDPM, i.e. effective particle radius, cloud optical thickness and cloud-top temperature, and estimated the 3D cloud radiation budget. We examined the discrepancies between satellite-observed and model-simulated radiances and the patterns of the three cloud microphysical parameters, to study the effects of cloud optical and microphysical properties on the radiation budget of cloud-laden atmospheres.
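
    A core building block of any such Monte Carlo radiative transfer code is free-path sampling from the Beer-Lambert law; a minimal sketch (illustrative only, unrelated to the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_free_paths(n, extinction):
    # Optical depth to the next interaction: tau = -ln(xi), xi uniform (0, 1].
    tau = -np.log(1.0 - rng.random(n))
    return tau / extinction            # geometric path length

# With extinction 0.01 m^-1 the mean free path should approach 100 m.
paths = sample_free_paths(100_000, extinction=0.01)
mean_path = paths.mean()
```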

  4. Comparison of TG-43 dosimetric parameters of brachytherapy sources obtained by three different versions of MCNP codes.

    PubMed

    Zaker, Neda; Zehtabian, Mehdi; Sina, Sedigheh; Koontz, Craig; Meigooni, Ali S

    2016-03-08

    Monte Carlo simulations are widely used for calculation of the dosimetric parameters of brachytherapy sources. MCNP4C2, MCNP5, MCNPX, EGS4, EGSnrc, PTRAN, and GEANT4 are among the most commonly used codes in this field. Each of these codes utilizes a cross-sectional library for the purpose of simulating different elements and materials with complex chemical compositions. The accuracies of the final outcomes of these simulations are very sensitive to the accuracies of the cross-sectional libraries. Several investigators have shown that inaccuracies of some of the cross section files have led to errors in 125I and 103Pd parameters. The purpose of this study is to compare the dosimetric parameters of sample brachytherapy sources, calculated with three different versions of the MCNP code - MCNP4C, MCNP5, and MCNPX. In these simulations for each source type, the source and phantom geometries, as well as the number of the photons, were kept identical, thus eliminating the possible uncertainties. The results of these investigations indicate that for low-energy sources such as 125I and 103Pd there are discrepancies in gL(r) values. Discrepancies up to 21.7% and 28% are observed between MCNP4C and other codes at a distance of 6 cm for 103Pd and 10 cm for 125I from the source, respectively. However, for higher energy sources, the discrepancies in gL(r) values are less than 1.1% for 192Ir and less than 1.2% for 137Cs between the three codes.
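
    For reference, the radial dose function compared above is obtained by normalizing the geometry-corrected dose rate at r0 = 1 cm; a sketch with synthetic numbers and a point-source geometry function (not actual MCNP output):

```python
import numpy as np

def radial_dose_function(r, dose, r0=1.0):
    g = 1.0 / r**2                          # point-source geometry function
    i0 = np.argmin(np.abs(r - r0))          # index of the reference distance
    return (dose / g) * (g[i0] / dose[i0])  # TG-43 gL(r), unity at r0

r = np.array([0.5, 1.0, 2.0, 5.0])          # distance from source (cm)
dose = np.exp(-0.1 * r) / r**2              # synthetic dose-rate table
gL = radial_dose_function(r, dose)
```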

  5. Density functional calculations of the Mössbauer parameters in hexagonal ferrite SrFe12O19

    NASA Astrophysics Data System (ADS)

    Ikeno, Hidekazu

    2018-03-01

    Mössbauer parameters in a magnetoplumbite-type hexagonal ferrite, SrFe12O19, are computed using the all-electron band structure calculation based on the density functional theory. The theoretical isomer shift and quadrupole splitting are consistent with experimentally obtained values. The absolute values of hyperfine splitting parameters are found to be underestimated, but the relative scale can be reproduced. The present results validate the site-dependence of Mössbauer parameters obtained by analyzing experimental spectra of hexagonal ferrites. The results also show the usefulness of theoretical calculations for increasing the reliability of interpretation of the Mössbauer spectra.

  6. COMSOL in the Academic Environment at USNA

    DTIC Science & Technology

    2009-10-01

    …figure shows the electric field calculated and the right shows the electron density at one point in time. 3.3 Acoustic Detection of Landmines … industries' heavy investment in computer graphics and modeling. Packages such as Maya, Zbrush, Mudbox and others excel at this type of modeling. … like Sketch-Up, Maya or AutoCAD. An extensive library of pre-built models would include all of the Platonic solids, combinations of Platonic…

  7. Polymer density functional theory approach based on scaling second-order direct correlation function.

    PubMed

    Zhou, Shiqi

    2006-06-01

    A second-order direct correlation function (DCF) obtained by solving the polymer-RISM integral equation is scaled up or down by an equation of state for the bulk polymer; the resultant scaled second-order DCF is in better agreement with corresponding simulation results than the unscaled second-order DCF. When the scaled second-order DCF is imported into a recently proposed LTDFA-based polymer DFT approach, an originally adjustable but mathematically meaningless parameter becomes mathematically meaningful, i.e., its numerical value now lies between 0 and 1. When the adjustable-parameter-free version of the LTDFA is used instead of the LTDFA, i.e., the adjustable parameter is fixed at 0.5, the resultant parameter-free version of the scaled LTDFA-based polymer DFT is also in good agreement with the corresponding simulation data for density profiles. The parameter-free version of the scaled LTDFA-based polymer DFT is employed to investigate the density profiles of a freely jointed tangent hard-sphere chain near a variably sized central hard sphere; again the predictions accurately reproduce the simulation results. The importance of the present adjustable-parameter-free version lies in its combination with a recently proposed universal theoretical way; in the resultant formalism, the contact theorem is still met by the adjustable parameter associated with the theoretical way.

  8. Two-parameter partially correlated ground-state electron density of some light spherical atoms from Hartree-Fock theory with nonintegral nuclear charge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cordero, Nicolas A.; March, Norman H.; Alonso, Julio A.

    2007-05-15

    Partially correlated ground-state electron densities for some light spherical atoms are calculated, for which nonrelativistic ionization potentials are essential input data. The nuclear cusp condition of Kato is satisfied precisely. The basic theoretical starting point, however, is Hartree-Fock (HF) theory for the N electrons under consideration but with a nonintegral nuclear charge Z′ slightly different from the atomic number Z (=N). This HF density is scaled with a parameter λ, close to unity, to preserve normalization. Finally, some tests are performed on the densities for the atoms Ne and Ar, as well as for Be and Mg.
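
    The normalization-preserving property of the λ scaling, ρ_λ(r) = λ³ρ(λr), can be checked numerically; a quick sketch using a hydrogenic 1s density as a stand-in for the HF density:

```python
import numpy as np

def electron_number(rho, r):
    # Trapezoidal integration of 4*pi*r^2*rho over the radial grid.
    integrand = 4.0 * np.pi * r**2 * rho
    return float(np.sum((integrand[1:] + integrand[:-1]) * np.diff(r)) / 2.0)

r = np.linspace(1e-6, 40.0, 200_000)        # radial grid (atomic units)
rho = np.exp(-2.0 * r) / np.pi              # 1s density for Z = 1, N = 1
lam = 1.02                                  # scaling parameter near unity
rho_scaled = lam**3 * np.exp(-2.0 * lam * r) / np.pi
```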

  9. Transport of magnetohydrodynamic nanomaterial in a stratified medium considering gyrotactic microorganisms

    NASA Astrophysics Data System (ADS)

    Waqas, M.; Hayat, T.; Shehzad, S. A.; Alsaedi, A.

    2018-01-01

    Impact of gyrotactic microorganisms on two-dimensional (2D) stratified flow of an Oldroyd-B nanomaterial is highlighted. Applied magnetic field along with mixed convection is considered in the formulation. Theory of microorganisms is utilized just to stabilize the suspended nanoparticles through bioconvection induced by combined effects of buoyancy forces and magnetic field. Convergent series solutions for the obtained nonlinear differential systems are derived. Impacts of different emerging parameters on velocity, temperature, concentration, motile microorganisms density, density number of motile microorganisms and local Nusselt and Sherwood numbers are graphically addressed. It is observed that thermal, concentration and motile density stratification parameters result in reduction of temperature, concentration and motile microorganisms density distributions respectively.

  10. Determination of Critical Parameters Based on the Intensity of Transmitted Light Around Gas-Liquid Interface: Critical Parameters of CO

    NASA Astrophysics Data System (ADS)

    Nakayama, Masaki; Katano, Hiroaki; Sato, Haruki

    2014-05-01

    A precise determination of the critical temperature and density of technically important fluids is possible on the basis of digital images for visual observation of the phase boundary in the vicinity of the critical point, since the sensitivity and resolution are higher than those of the naked eye. In addition, the digital image avoids observer-dependent uncertainty. A strong density gradient occurs in a sample cell at the critical point due to gravity. It was carefully assessed to determine the critical density, where the density profile in the sample cell can be observed from the luminance profile of a digital image. The density-gradient profile becomes symmetric at the critical point. Carbon dioxide would be one of the best fluids for this purpose, its thermodynamic properties having been measured with the highest reliability among technically important fluids. In order to confirm the reliability of the proposed method, the critical temperature and density of carbon dioxide were determined using the digital image. The critical temperature and density values of carbon dioxide are (…) K and (…) kg·m⁻³, respectively. The critical temperature and density values agree with the existing best values within the estimated uncertainties. The reliability of the method was confirmed. The critical pressure, 7.3795 MPa, corresponding to the determined critical temperature of 304.143 K, is also proposed. A new set of parameters for the vapor-pressure equation is also provided.

  11. Characteristic parameters of superconductor-coolant interaction including high Tc current density limits

    NASA Technical Reports Server (NTRS)

    Frederking, T. H. K.

    1989-01-01

    In the area of basic mechanisms of helium heat transfer and their influence on superconducting magnet stability, thermal boundary conditions are important constraints. Characteristic lengths are considered along with other parameters of the superconducting composite-coolant system. Based on developments in the helium temperature range, limiting critical current densities are assessed at low fields for high-transition-temperature superconductors.

  12. Investigation of the Specht density estimator

    NASA Technical Reports Server (NTRS)

    Speed, F. M.; Rydl, L. M.

    1971-01-01

    The feasibility of using the Specht density estimator function on the IBM 360/44 computer is investigated. Factors such as storage, speed, amount of calculations, size of the smoothing parameter and sample size have an effect on the results. The reliability of the Specht estimator for normal and uniform distributions and the effects of the smoothing parameter and sample size are investigated.
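Specht's density estimator is closely related to the Parzen (Gaussian-kernel) form, in which a single smoothing parameter controls the kernel width, as studied in the abstract. A minimal sketch of that form, under the assumption that this kernel variant is representative:

```python
import numpy as np

def specht_density(x, samples, sigma):
    """Gaussian-kernel (Parzen-type) density estimate of the kind
    underlying Specht's estimator; sigma is the smoothing parameter."""
    x = np.atleast_1d(x)
    d = x[:, None] - samples[None, :]
    k = np.exp(-0.5 * (d / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    return k.mean(axis=1)           # average kernel contribution per sample

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 500)          # training sample, N(0, 1)
est = specht_density(np.array([0.0]), samples, sigma=0.3)[0]
```

For standard-normal data the estimate at 0 should be near 1/sqrt(2*pi) ~ 0.4; varying `sigma` and the sample size reproduces the sensitivity the abstract investigates.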

  13. Required sample size for monitoring stand dynamics in strict forest reserves: a case study

    Treesearch

    Diego Van Den Meersschaut; Bart De Cuyper; Kris Vandekerkhove; Noel Lust

    2000-01-01

    Stand dynamics in European strict forest reserves are commonly monitored using inventory densities of 5 to 15 percent of the total surface. The assumption that these densities guarantee a representative image of certain parameters is critically analyzed in a case study for the parameters basal area and stem number. The required sample sizes for different accuracy and...
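The required-sample-size question posed above is conventionally answered with the textbook formula n ~ (t * CV / E)^2, where CV is the coefficient of variation of the parameter (e.g. basal area) among plots and E the target relative error. This sketch uses that standard formula as an assumption; it is not necessarily the authors' exact procedure:

```python
import math

def required_sample_size(cv, rel_error, t_value=1.96):
    """Number of plots needed so the estimated mean of a stand parameter
    (e.g. basal area) lies within rel_error of the true mean at roughly
    95% confidence (normal approximation). cv and rel_error as fractions."""
    return math.ceil((t_value * cv / rel_error) ** 2)

# e.g. CV of 40% among plots, target accuracy of +/-10% of the mean:
n = required_sample_size(0.40, 0.10)
```

Halving the tolerated error quadruples the required number of plots, which is why the inventory densities quoted in the abstract are so sensitive to the accuracy requirement.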

  14. An empirical model for parameters affecting energy consumption in boron removal from boron-containing wastewaters by electrocoagulation.

    PubMed

    Yilmaz, A Erdem; Boncukcuoğlu, Recep; Kocakerim, M Muhtar

    2007-06-01

    In this study, the parameters affecting energy consumption in boron removal from synthetically prepared boron-containing wastewaters via the electrocoagulation method were investigated. Solution pH, initial boron concentration, dose of supporting electrolyte, current density and solution temperature were selected as the experimental parameters affecting energy consumption. The experimental results showed that boron removal efficiency reached up to 99% under optimum conditions: solution pH 8.0, current density 6.0 mA/cm(2), initial boron concentration 100 mg/L and solution temperature 293 K. Current density was also an important parameter affecting energy consumption: a high current density applied to the electrocoagulation cell increased energy consumption. Increasing the solution temperature decreased energy consumption, because a high temperature lowers the potential required under constant current density. Increasing the initial boron concentration and the dose of supporting electrolyte increased the specific conductivity of the solution and thereby decreased energy consumption. Consequently, energy consumption for boron removal via electrocoagulation can be minimized at optimum conditions. An empirical model was fitted statistically, and the experimentally obtained values agreed with those predicted from the empirical model [formula in text]. Unfortunately, the conditions obtained for optimum boron removal were not those for minimum energy consumption. It was determined that supporting electrolyte must be used to increase boron removal and decrease electrical energy consumption.

  15. Selective laser melting of high-performance pure tungsten: parameter design, densification behavior and mechanical properties

    PubMed Central

    Zhou, Kesong; Ma, Wenyou; Attard, Bonnie; Zhang, Panpan; Kuang, Tongchun

    2018-01-01

    Selective laser melting (SLM) additive manufacturing of pure tungsten encounters nearly all of the intractable difficulties in the SLM of metals because of tungsten's intrinsic properties. The key factors for SLM of high-density tungsten, including powder characteristics, layer thickness, and laser parameters, are elucidated and discussed in detail. The main parameters were designed from theoretical calculations prior to the SLM process and optimized experimentally. Pure tungsten products with a density of 19.01 g/cm3 (98.50% of theoretical density) were produced by SLM with the optimized processing parameters. A high-density microstructure is formed without significant balling or macrocracks. The formation mechanisms of pores and the densification behavior are systematically elucidated. Electron backscatter diffraction analysis confirms that the columnar grains stretch across several layers, parallel to the maximum temperature gradient, which ensures good bonding between the layers. The mechanical properties of the SLM-produced tungsten are comparable to those of tungsten produced by conventional fabrication methods, with hardness values exceeding 460 HV0.05 and an ultimate compressive strength of about 1 GPa. This finding opens new potential applications of refractory metals in additive manufacturing. PMID:29707073

  16. Selective laser melting of high-performance pure tungsten: parameter design, densification behavior and mechanical properties.

    PubMed

    Tan, Chaolin; Zhou, Kesong; Ma, Wenyou; Attard, Bonnie; Zhang, Panpan; Kuang, Tongchun

    2018-01-01

    Selective laser melting (SLM) additive manufacturing of pure tungsten encounters nearly all of the intractable difficulties in the SLM of metals because of tungsten's intrinsic properties. The key factors for SLM of high-density tungsten, including powder characteristics, layer thickness, and laser parameters, are elucidated and discussed in detail. The main parameters were designed from theoretical calculations prior to the SLM process and optimized experimentally. Pure tungsten products with a density of 19.01 g/cm3 (98.50% of theoretical density) were produced by SLM with the optimized processing parameters. A high-density microstructure is formed without significant balling or macrocracks. The formation mechanisms of pores and the densification behavior are systematically elucidated. Electron backscatter diffraction analysis confirms that the columnar grains stretch across several layers, parallel to the maximum temperature gradient, which ensures good bonding between the layers. The mechanical properties of the SLM-produced tungsten are comparable to those of tungsten produced by conventional fabrication methods, with hardness values exceeding 460 HV0.05 and an ultimate compressive strength of about 1 GPa. This finding opens new potential applications of refractory metals in additive manufacturing.
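The laser parameters discussed above are commonly summarized by the volumetric energy density E = P / (v * h * t), i.e. laser power over the product of scan speed, hatch spacing and layer thickness. The sketch below uses illustrative values, not the parameters of this study:

```python
def volumetric_energy_density(power_w, scan_speed_mm_s, hatch_mm, layer_mm):
    """E = P / (v * h * t) in J/mm^3, a standard figure of merit for
    comparing SLM parameter sets (the values below are illustrative
    assumptions, not those reported for tungsten in this paper)."""
    return power_w / (scan_speed_mm_s * hatch_mm * layer_mm)

# illustrative parameter set: 300 W, 400 mm/s, 0.1 mm hatch, 30 um layer
E = volumetric_energy_density(power_w=300.0, scan_speed_mm_s=400.0,
                              hatch_mm=0.1, layer_mm=0.03)
```

Sweeping this single figure of merit is a common first step before refining individual parameters, since too low an E leaves pores while too high an E promotes balling and cracking.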

  17. Improving Measurement of Forest Structural Parameters by Co-Registering of High Resolution Aerial Imagery and Low Density LiDAR Data

    PubMed Central

    Huang, Huabing; Gong, Peng; Cheng, Xiao; Clinton, Nick; Li, Zengyuan

    2009-01-01

    Forest structural parameters, such as tree height and crown width, are indispensable for evaluating forest biomass or forest volume. LiDAR is a revolutionary technology for measurement of forest structural parameters, however, the accuracy of crown width extraction is not satisfactory when using a low density LiDAR, especially in high canopy cover forest. We used high resolution aerial imagery with a low density LiDAR system to overcome this shortcoming. A morphological filtering was used to generate a DEM (Digital Elevation Model) and a CHM (Canopy Height Model) from LiDAR data. The LiDAR camera image is matched to the aerial image with an automated keypoints search algorithm. As a result, a high registration accuracy of 0.5 pixels was obtained. A local maximum filter, watershed segmentation, and object-oriented image segmentation are used to obtain tree height and crown width. Results indicate that the camera data collected by the integrated LiDAR system plays an important role in registration with aerial imagery. The synthesis with aerial imagery increases the accuracy of forest structural parameter extraction when compared to only using the low density LiDAR data. PMID:22573971
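The local-maximum step named above can be sketched as follows; the window size and minimum-height threshold are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def detect_tree_tops(chm, window=3, min_height=2.0):
    """Local-maximum detection on a Canopy Height Model (CHM): a pixel is
    a candidate tree top if it equals the maximum of its neighbourhood
    and exceeds a minimum height threshold (both values assumed here)."""
    local_max = maximum_filter(chm, size=window) == chm
    return np.argwhere(local_max & (chm > min_height))

# toy CHM (metres) with two distinct crowns
chm = np.zeros((20, 20))
chm[5, 5], chm[14, 12] = 12.0, 9.0
tops = detect_tree_tops(chm)
```

In practice the detected tops seed the watershed segmentation that delineates individual crowns, from which crown width is then measured.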

  18. Parasitism alters three power laws of scaling in a metazoan community: Taylor’s law, density-mass allometry, and variance-mass allometry

    PubMed Central

    Lagrue, Clément; Poulin, Robert; Cohen, Joel E.

    2015-01-01

    How do the lifestyles (free-living unparasitized, free-living parasitized, and parasitic) of animal species affect major ecological power-law relationships? We investigated this question in metazoan communities in lakes of Otago, New Zealand. In 13,752 samples comprising 1,037,058 organisms, we found that species of different lifestyles differed in taxonomic distribution and body mass and were well described by three power laws: a spatial Taylor’s law (the spatial variance in population density was a power-law function of the spatial mean population density); density-mass allometry (the spatial mean population density was a power-law function of mean body mass); and variance-mass allometry (the spatial variance in population density was a power-law function of mean body mass). To our knowledge, this constitutes the first empirical confirmation of variance-mass allometry for any animal community. We found that the parameter values of all three relationships differed for species with different lifestyles in the same communities. Taylor's law and density-mass allometry accurately predicted the form and parameter values of variance-mass allometry. We conclude that species of different lifestyles in these metazoan communities obeyed the same major ecological power-law relationships but did so with parameters specific to each lifestyle, probably reflecting differences among lifestyles in population dynamics and spatial distribution. PMID:25550506

  19. Parasitism alters three power laws of scaling in a metazoan community: Taylor's law, density-mass allometry, and variance-mass allometry.

    PubMed

    Lagrue, Clément; Poulin, Robert; Cohen, Joel E

    2015-02-10

    How do the lifestyles (free-living unparasitized, free-living parasitized, and parasitic) of animal species affect major ecological power-law relationships? We investigated this question in metazoan communities in lakes of Otago, New Zealand. In 13,752 samples comprising 1,037,058 organisms, we found that species of different lifestyles differed in taxonomic distribution and body mass and were well described by three power laws: a spatial Taylor's law (the spatial variance in population density was a power-law function of the spatial mean population density); density-mass allometry (the spatial mean population density was a power-law function of mean body mass); and variance-mass allometry (the spatial variance in population density was a power-law function of mean body mass). To our knowledge, this constitutes the first empirical confirmation of variance-mass allometry for any animal community. We found that the parameter values of all three relationships differed for species with different lifestyles in the same communities. Taylor's law and density-mass allometry accurately predicted the form and parameter values of variance-mass allometry. We conclude that species of different lifestyles in these metazoan communities obeyed the same major ecological power-law relationships but did so with parameters specific to each lifestyle, probably reflecting differences among lifestyles in population dynamics and spatial distribution.
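Each of the three power laws above is conventionally fitted as a straight line in log-log space. A minimal sketch for Taylor's law, using synthetic densities rather than the Otago measurements:

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = a * x^b by least squares in log-log space, the standard
    procedure for Taylor's law (spatial variance vs. mean density),
    density-mass allometry, and variance-mass allometry."""
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), b

# synthetic populations obeying variance = 2 * mean^1.8 exactly
mean_density = np.array([0.5, 1.0, 4.0, 10.0, 50.0])
variance = 2.0 * mean_density ** 1.8
a, b = fit_power_law(mean_density, variance)
```

Fitting the same form separately to species grouped by lifestyle, as in the study, yields lifestyle-specific values of `a` and `b` that can then be compared.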

  20. EggLib: processing, analysis and simulation tools for population genetics and genomics

    PubMed Central

    2012-01-01

    Background With the considerable growth of available nucleotide sequence data over the last decade, integrated and flexible analytical tools have become a necessity. In particular, in the field of population genetics, there is a strong need for automated and reliable procedures to conduct repeatable and rapid polymorphism analyses, coalescent simulations, data manipulation and estimation of demographic parameters under a variety of scenarios. Results In this context, we present EggLib (Evolutionary Genetics and Genomics Library), a flexible and powerful C++/Python software package providing efficient and easy to use computational tools for sequence data management and extensive population genetic analyses on nucleotide sequence data. EggLib is a multifaceted project involving several integrated modules: an underlying computationally efficient C++ library (which can be used independently in pure C++ applications); two C++ programs; a Python package providing, among other features, a high level Python interface to the C++ library; and the egglib script which provides direct access to pre-programmed Python applications. Conclusions EggLib has been designed aiming to be both efficient and easy to use. A wide array of methods are implemented, including file format conversion, sequence alignment edition, coalescent simulations, neutrality tests and estimation of demographic parameters by Approximate Bayesian Computation (ABC). Classes implementing different demographic scenarios for ABC analyses can easily be developed by the user and included to the package. EggLib source code is distributed freely under the GNU General Public License (GPL) from its website http://egglib.sourceforge.net/ where a full documentation and a manual can also be found and downloaded. PMID:22494792

  1. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, Madeline Louise; McMath, Garrett Earl

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content exist around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. In principle, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha-transport capabilities of MCNP6 1.2 Beta with the TENDL 2012 alpha-transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  2. Verification of Plutonium Content in PuBe Sources Using MCNP® 6.2.0 Beta with TENDL 2012 Libraries

    DOE PAGES

    Lockhart, Madeline Louise; McMath, Garrett Earl

    2017-10-26

    Although the production of PuBe neutron sources has been discontinued, hundreds of sources with unknown or inaccurately declared plutonium content exist around the world. Institutions have undertaken the task of assaying these sources, measuring and calculating the isotopic composition, plutonium content, and neutron yield. The nominal plutonium content, based on the neutron yield per gram of pure 239Pu, has been shown to be highly inaccurate. New methods of measuring the plutonium content allow a more accurate estimate of the true Pu content, but these measurements need verification. Using the TENDL 2012 nuclear data libraries, MCNP6 has the capability to simulate the (α, n) interactions in a PuBe source. In principle, if the source is modeled according to the plutonium content, isotopic composition, and other source characteristics, the calculated neutron yield in MCNP can be compared to the experimental yield, offering an indication of the accuracy of the declared plutonium content. In this study, three sets of PuBe sources from various backgrounds were modeled in MCNP6 1.2 Beta, according to the source specifications dictated by the individuals who assayed the sources. Verification of the source parameters with MCNP6 also serves as a means to test the alpha-transport capabilities of MCNP6 1.2 Beta with the TENDL 2012 alpha-transport libraries. Finally, good agreement in the comparison would indicate the accuracy of the source parameters in addition to demonstrating MCNP's capabilities in simulating (α, n) interactions.

  3. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument that monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light-intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and concentration quantification based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions, with which test solutions can be discriminated. After the variety of a test solution is determined, a Spearman correlation test and principal component analysis are used to filter the eight characteristic values and reduce their dimensionality, producing a new representative parameter. A cubic spline interpolation function is built between this parameter and concentration, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as the experimental subjects in this paper. For each solution, nine or ten different concentrations form the standard library, and the other two concentrations compose the test group. Using the methods above, all eight test solutions are correctly identified and the average relative error of the quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and improves the precision of concentration quantification.
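The final quantification step, a cubic spline between the representative parameter and concentration, can be sketched as follows. The parameter values are hypothetical stand-ins for first-principal-component scores, not the paper's data:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical calibration data: one representative parameter (e.g. the
# first principal-component score of the eight COT features) per known
# concentration of a single solution type.
concentrations = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # e.g. percent
parameter = np.array([0.12, 0.25, 0.48, 0.90, 1.60])       # PC1 score

# Cubic spline mapping the representative parameter to concentration,
# mirroring the paper's last step; a test solution's parameter value is
# then evaluated through the spline.
spline = CubicSpline(parameter, concentrations)
test_conc = float(spline(0.48))   # a calibration point maps back exactly
```

Note that `CubicSpline` requires the parameter values to be strictly increasing, which is the usual situation when a single dominant component tracks concentration monotonically.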

  4. EggLib: processing, analysis and simulation tools for population genetics and genomics.

    PubMed

    De Mita, Stéphane; Siol, Mathieu

    2012-04-11

    With the considerable growth of available nucleotide sequence data over the last decade, integrated and flexible analytical tools have become a necessity. In particular, in the field of population genetics, there is a strong need for automated and reliable procedures to conduct repeatable and rapid polymorphism analyses, coalescent simulations, data manipulation and estimation of demographic parameters under a variety of scenarios. In this context, we present EggLib (Evolutionary Genetics and Genomics Library), a flexible and powerful C++/Python software package providing efficient and easy to use computational tools for sequence data management and extensive population genetic analyses on nucleotide sequence data. EggLib is a multifaceted project involving several integrated modules: an underlying computationally efficient C++ library (which can be used independently in pure C++ applications); two C++ programs; a Python package providing, among other features, a high level Python interface to the C++ library; and the egglib script which provides direct access to pre-programmed Python applications. EggLib has been designed aiming to be both efficient and easy to use. A wide array of methods are implemented, including file format conversion, sequence alignment edition, coalescent simulations, neutrality tests and estimation of demographic parameters by Approximate Bayesian Computation (ABC). Classes implementing different demographic scenarios for ABC analyses can easily be developed by the user and included to the package. EggLib source code is distributed freely under the GNU General Public License (GPL) from its website http://egglib.sourceforge.net/ where a full documentation and a manual can also be found and downloaded.
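For orientation, two of the classical population-genetic statistics that packages such as EggLib compute, nucleotide diversity and Watterson's estimator, can be written from scratch in a few lines. This is an independent sketch of the statistics themselves, not EggLib's own API:

```python
import itertools

def nucleotide_diversity(seqs):
    """Average pairwise difference per site (pi) for aligned sequences.
    A from-scratch illustration, NOT EggLib's interface."""
    L = len(seqs[0])
    pairs = list(itertools.combinations(seqs, 2))
    diffs = sum(sum(a != b for a, b in zip(s1, s2)) for s1, s2 in pairs)
    return diffs / (len(pairs) * L)

def watterson_theta(seqs):
    """Watterson's estimator per site: S / (a_n * L), where S is the
    number of segregating sites and a_n = sum_{i=1}^{n-1} 1/i."""
    L = len(seqs[0])
    segregating = sum(len({s[i] for s in seqs}) > 1 for i in range(L))
    a_n = sum(1.0 / i for i in range(1, len(seqs)))
    return segregating / (a_n * L)

# four toy aligned sequences with two segregating sites
seqs = ["ACGTACGT", "ACGTACGA", "ACCTACGT", "ACGTACGT"]
```

Comparing the two estimates is the basis of neutrality tests such as Tajima's D, one of the analyses the abstract lists.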

  5. Neutrino oscillation parameter sampling with MonteCUBES

    NASA Astrophysics Data System (ADS)

    Blennow, Mattias; Fernandez-Martinez, Enrique

    2010-01-01

    We present MonteCUBES ("Monte Carlo Utility Based Experiment Simulator"), a software package designed to sample the neutrino oscillation parameter space through Markov Chain Monte Carlo algorithms. MonteCUBES makes use of the GLoBES software so that the existing experiment definitions for GLoBES, describing long baseline and reactor experiments, can be used with MonteCUBES. MonteCUBES consists of two main parts: The first is a C library, written as a plug-in for GLoBES, implementing the Markov Chain Monte Carlo algorithm to sample the parameter space. The second part is a user-friendly graphical Matlab interface to easily read, analyze, plot and export the results of the parameter space sampling. Program summaryProgram title: MonteCUBES (Monte Carlo Utility Based Experiment Simulator) Catalogue identifier: AEFJ_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEFJ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public Licence No. of lines in distributed program, including test data, etc.: 69 634 No. of bytes in distributed program, including test data, etc.: 3 980 776 Distribution format: tar.gz Programming language: C Computer: MonteCUBES builds and installs on 32 bit and 64 bit Linux systems where GLoBES is installed Operating system: 32 bit and 64 bit Linux RAM: Typically a few MBs Classification: 11.1 External routines: GLoBES [1,2] and routines/libraries used by GLoBES Subprograms used:Cat Id ADZI_v1_0, Title GLoBES, Reference CPC 177 (2007) 439 Nature of problem: Since neutrino masses do not appear in the standard model of particle physics, many models of neutrino masses also induce other types of new physics, which could affect the outcome of neutrino oscillation experiments. 
In general, these new physics imply high-dimensional parameter spaces that are difficult to explore using classical methods such as multi-dimensional projections and minimizations, such as those used in GLoBES [1,2]. Solution method: MonteCUBES is written as a plug-in to the GLoBES software [1,2] and provides the necessary methods to perform Markov Chain Monte Carlo sampling of the parameter space. This allows an efficient sampling of the parameter space and has a complexity which does not grow exponentially with the parameter space dimension. The integration of the MonteCUBES package with the GLoBES software makes sure that the experimental definitions already in use by the community can also be used with MonteCUBES, while also lowering the learning threshold for users who already know GLoBES. Additional comments: A Matlab GUI for interpretation of results is included in the distribution. Running time: The typical running time varies depending on the dimensionality of the parameter space, the complexity of the experiment, and how well the parameter space should be sampled. The running time for our simulations [3] with 15 free parameters at a Neutrino Factory with O(10) samples varied from a few hours to tens of hours. References:P. Huber, M. Lindner, W. Winter, Comput. Phys. Comm. 167 (2005) 195, hep-ph/0407333. P. Huber, J. Kopp, M. Lindner, M. Rolinec, W. Winter, Comput. Phys. Comm. 177 (2007) 432, hep-ph/0701187. S. Antusch, M. Blennow, E. Fernandez-Martinez, J. Lopez-Pavon, arXiv:0903.3986 [hep-ph].
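The core algorithm, random-walk Metropolis sampling of a parameter space, can be sketched in a few lines. This is a generic Python illustration of the technique MonteCUBES implements in C, not MonteCUBES' API, and the two-parameter Gaussian "posterior" is purely illustrative:

```python
import numpy as np

def metropolis_hastings(log_post, theta0, n_steps, step=0.1, seed=0):
    """Random-walk Metropolis sampler: cost grows with chain length,
    not exponentially with the dimension of the parameter space."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        # accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)

# toy 2-D Gaussian "oscillation-parameter" posterior (illustrative only)
log_post = lambda t: -0.5 * np.sum(((t - [0.5, 2.0]) / [0.1, 0.4]) ** 2)
chain = metropolis_hastings(log_post, [0.0, 0.0], 20000, step=0.15)
```

After discarding a burn-in, the chain's histogram approximates the posterior; marginalizing is then just projecting the chain onto a parameter, which is the practical advantage over grid projections in high dimensions.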

  6. Relationships of storm-time changes in thermospheric mass density with solar wind/IMF parameters and ring current index of Sym-H

    NASA Astrophysics Data System (ADS)

    Zhou, Yunliang; Ma, S. Y.; Xiong, Chao; Luehr, Hermann

    Total air mass densities at about 500 km altitude are derived using SuperSTAR accelerometer measurements onboard the GRACE satellites for 25 great magnetic storms with minimum Dst below -100 nT during 2002 to 2006. Taking NRLMSISE-00 model-predicted densities without active ap-index input as a reference baseline of quiet-time mass density, the storm-time changes in upper-thermospheric mass density are obtained by subtraction for all the storm events and sorted into grids of latitude by local-time sector. The relationships of the storm-time density changes with various interplanetary parameters and the magnetospheric ring-current index Sym-H are statistically investigated. The parameters include the Akasofu energy coupling function, the merging electric field Em, and the magnitude of the IMF component in the GSM y-z plane, Byz, as calculated from OMNI data at 1 AU. It is found that the storm-time changes in upper-thermospheric mass density generally correlate best with the Sym-H index, with nearly zero time delay at low latitudes and a slight lead at high latitudes in most cases. Unexpectedly, Byz correlates with the storm-time mass density changes more closely than the Akasofu function and even Em, and the mass density changes lag behind Byz by about 1-4 hours in most cases at low latitudes. The correlations are local-time dependent, being lowest in the dusk sector. For the largest superstorm, of November 2003, the changes in mass density correlate very closely with Byz, Em, and the Sym-H index, with correlation coefficients averaged over all latitudes in the noon sector as high as 0.93, 0.91 and 0.90, respectively. The physical factors controlling the lag times between the mass density changes at middle and low latitudes and the interplanetary-parameter variations are also analyzed. The results of this study may provide useful guidance for establishing an empirical model to predict storm-time changes in upper-thermospheric mass density. This work is supported by NSFC (No. 40804049) and the Doctoral Fund of the Ministry of Education of China (No. 200804860012).
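The lag analysis described above, finding the delay that maximizes the correlation between an interplanetary driver and the density response, can be sketched as follows (synthetic sinusoidal series stand in for the GRACE/OMNI data):

```python
import numpy as np

def best_lag(driver, response, max_lag):
    """Pearson correlation of the response with the driver shifted by
    0..max_lag steps; returns the lag that maximises the correlation,
    mimicking the Byz-vs-density lag analysis."""
    corrs = []
    for lag in range(max_lag + 1):
        d = driver[: len(driver) - lag] if lag else driver
        r = response[lag:]
        corrs.append(np.corrcoef(d, r)[0, 1])
    return int(np.argmax(corrs)), corrs

# synthetic driver and a response delayed by 0.5 time units plus noise
t = np.linspace(0.0, 20.0, 400)           # step ~0.05 time units
driver = np.sin(t)
response = np.sin(t - 0.5) + 0.02 * np.random.default_rng(1).standard_normal(400)
lag, corrs = best_lag(driver, response, max_lag=30)
```

With a grid step of about 0.05, the recovered lag should be near 10 steps, i.e. the imposed 0.5-unit delay; applied per latitude band and local-time sector, this reproduces the sector-dependent lags reported in the abstract.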

  7. Understanding sources of uncertainty and bias error in models of human response to low amplitude sonic booms

    NASA Astrophysics Data System (ADS)

    Collmar, M.; Cook, B. G.; Cowart, R.; Freund, D.; Gavin, J.

    2015-10-01

    A pool of 240 subjects was exposed to a library of waveforms consisting of example signatures of low-boom aircraft. The signature library included intentional variations in both loudness and spectral content, and the waveforms were auralized using the Gulfstream SASS-II sonic boom simulator. Post-processing was used to quantify the impact of test-design decisions on the quality of the resultant database. Specific lessons learned from this study include insight regarding the potential for bias error due to variations in loudness or peak overpressure, the sources of uncertainty and their relative importance for objective measurements, and the robustness of individual metrics to wide variations in spectral content. The results provide clear guidance for the design of future large-scale community surveys, where one must optimize the complex tradeoffs among the size of the surveyed population, the spatial footprint of those participants, and the fidelity and density of objective measurements.

  8. Uncertainty quantification in LES of channel flow

    DOE PAGES

    Safta, Cosmin; Blaylock, Myra; Templeton, Jeremy; ...

    2016-07-12

    Here we present a Bayesian framework for estimating joint densities of large eddy simulation (LES) sub-grid-scale model parameters based on canonical forced isotropic-turbulence direct numerical simulation (DNS) data. The framework accounts for noise in the independent variables, and we present alternative formulations for accounting for discrepancies between model and data. To generate probability densities for flow characteristics, posterior densities for the sub-grid-scale model parameters are propagated forward through LES of channel flow and compared with DNS data. Synthesis of the calibration and prediction results demonstrates that the model parameters have an explicit filter-width dependence and are highly correlated. Discrepancies between DNS and calibrated LES results point to additional model-form inadequacies that need to be accounted for.

  9. Effects of plantation density on wood density and anatomical properties of red pine (Pinus resinosa Ait.)

    Treesearch

    J. Y. Zhu; C. Tim Scott; Karen L. Scallon; Gary C. Myers

    2007-01-01

    This study demonstrated that average ring width (or average annual radial growth rate) is a reliable parameter to quantify the effects of tree plantation density (growth suppression) on wood density and tracheid anatomical properties. The average ring width successfully correlated wood density and tracheid anatomical properties of red pines (Pinus resinosa Ait.) from a...

  10. Effects of molecular elongation on liquid crystalline phase behaviour: isotropic-nematic transition

    NASA Astrophysics Data System (ADS)

    Singh, Ram Chandra; Ram, Jokhan

    2003-08-01

    We present a density-functional approach to the isotropic-nematic transition and calculate the freezing parameters of the Gay-Berne liquid-crystal model, concentrating on the effects of varying the molecular elongation x0. To this end, we solved the Percus-Yevick integral equation to calculate the pair-correlation functions of a fluid whose molecules interact via the Gay-Berne pair potential. These results were used as input to density-functional theory to locate the isotropic-nematic transition and calculate the freezing parameters over the range of length-to-width ratios 3.0 ⩽ x0 ⩽ 4.0 at reduced temperatures 0.95 and 1.25. We observed that as x0 increases, the isotropic-nematic transition moves to lower density at a given temperature. We find that density-functional theory is well suited to studying freezing transitions in such fluids. We have also compared our results with computer simulation results wherever they are available.

  11. Stacking fault density and bond orientational order of fcc ruthenium nanoparticles

    NASA Astrophysics Data System (ADS)

    Seo, Okkyun; Sakata, Osami; Kim, Jae Myung; Hiroi, Satoshi; Song, Chulho; Kumara, Loku Singgappulige Rosantha; Ohara, Koji; Dekura, Shun; Kusada, Kohei; Kobayashi, Hirokazu; Kitagawa, Hiroshi

    2017-12-01

    We investigated deviations in the crystal structure of catalytic nanoparticles (NPs) using synchrotron powder X-ray diffraction. The samples were fcc ruthenium (Ru) NPs with diameters of 2.4, 3.5, 3.9, and 5.4 nm. We analyzed the average crystal structures by applying the line-profile method to a stacking-fault model, and the local crystal structures using bond-orientational-order (BOO) parameters. The reflection peaks shift according to rules specific to each type of stacking fault. We evaluated quantitative stacking-fault densities for the fcc Ru NPs, finding roughly one stacking fault per 2-4 layers, which is quite high. Our analysis shows that the 2.4 nm-diameter fcc Ru NPs have a considerably high stacking-fault density. The B factor tends to increase with increasing stacking-fault density. A structural parameter that we define from the BOO parameters deviates significantly from the ideal value for the fcc structure, indicating that the fcc Ru NPs are highly disordered.
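Bond-orientational-order analysis assigns each atom a rotationally invariant measure of the angular arrangement of its neighbors. As a low-dimensional illustration of the idea (the paper uses the 3-D Steinhardt parameters), the 2-D analogue psi_6 is:

```python
import numpy as np

def psi6(center, neighbors):
    """2-D bond-orientational order parameter |<exp(6 i theta)>|, a
    low-dimensional analogue of the Steinhardt BOO parameters: 1.0 for
    perfect sixfold (hexagonal) bond order, smaller for disorder."""
    vecs = np.asarray(neighbors, dtype=float) - np.asarray(center, dtype=float)
    angles = np.arctan2(vecs[:, 1], vecs[:, 0])
    return abs(np.mean(np.exp(6j * angles)))

# six neighbours on a perfect hexagon vs. six random neighbours
hexagon = [(np.cos(a), np.sin(a)) for a in np.linspace(0, 2 * np.pi, 7)[:-1]]
ordered = psi6((0.0, 0.0), hexagon)
rng = np.random.default_rng(2)
disordered = psi6((0.0, 0.0), rng.uniform(-1.0, 1.0, (6, 2)))
```

A structural parameter built from such invariants deviates from its ideal lattice value as local disorder grows, which is the logic behind the paper's BOO-based measure for the fcc NPs.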

  12. Application of densification process in organic waste management.

    PubMed

    Zafari, Abedin; Kianmehr, Mohammad Hossein

    2013-07-01

    Densification of biomass material, which usually has a low density, is a good way of increasing density, reducing the cost of transportation, and simplifying the storage and distribution of this material. The current study was conducted to investigate the influence of raw material parameters (moisture content and particle size) and densification process parameters (piston speed and die length) on the density and durability of pellets made from compost manure. A hydraulic press and a single pelleter were used to produce pellets under controlled conditions. Ground biomass samples were compressed at three levels of moisture content [35%, 40% and 45% (wet basis)], piston speed (2, 6 and 10 mm/s), die length (8, 10 and 12 mm) and particle size (0.3, 0.9 and 1.5 mm) to determine the density and durability of the pellets. A response surface methodology based on the Box-Behnken design was used to study the response patterns and to understand the influence of the parameters. The results revealed that all independent variables had significant (P < 0.01) effects on the studied responses.
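
    A Box-Behnken design like the one above can be generated in a few lines. A minimal sketch in coded levels (-1, 0, +1); the function name and the default of three center runs are illustrative assumptions, not the study's actual design matrix:

```python
from itertools import combinations, product

def box_behnken(k, n_center=3):
    """Generate a Box-Behnken design in coded units (-1, 0, +1).

    Each pair of factors is varied over a 2^2 factorial while the
    remaining factors are held at their mid-level, plus center runs.
    """
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            run = [0] * k
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([[0] * k for _ in range(n_center)])
    return runs

# Four factors as in the pellet study (moisture content, piston
# speed, die length, particle size) -> 4*C(4,2) + 3 = 27 runs.
design = box_behnken(4)
print(len(design))  # 27
```

    Each coded run is then mapped to the physical levels (e.g., -1/0/+1 moisture to 35/40/45%) before the experiment.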

  13. Determination of Process Parameters for High-Density, Ti-6Al-4V Parts Using Additive Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, C.

    In our earlier work, we described an approach for determining the process parameters that result in high-density parts manufactured using the additive-manufacturing process of selective laser melting (SLM). Our approach, which combines simple simulations and experiments, was demonstrated using 316L stainless steel. We have also used the approach successfully for several other materials. This short note summarizes the results of our work in determining process parameters for Ti-6Al-4V using a Concept Laser M2 system.

  14. The relationship between dental implant stability and trabecular bone structure using cone-beam computed tomography

    PubMed Central

    2016-01-01

    Purpose The objective of this study was to investigate the relationships between primary implant stability, as measured by impact response frequency, and the structural parameters of trabecular bone using cone-beam computed tomography (CBCT), excluding the effect of cortical bone thickness. Methods We measured the impact response of a dental implant placed into swine bone specimens composed of only trabecular bone, without the cortical bone layer, using an inductive sensor. The peak frequency of the impact response spectrum was taken as the implant stability criterion (SPF). The 3D microstructural parameters were calculated from CT images of the bone specimens obtained using both micro-CT and CBCT. Results SPF had significant positive correlations with trabecular bone structural parameters (BV/TV, BV, BS, BSD, Tb.Th, Tb.N, FD, and BS/BV) (P<0.01), while SPF demonstrated significant negative correlations with the other microstructural parameters (Tb.Sp, Tb.Pf, and SMI) using micro-CT and CBCT (P<0.01). Conclusions Implant stability prediction improved when BV/TV and SMI were combined in the stepwise forward regression analysis. Bone with high volume density and low surface density shows high implant stability. Well-connected thick bone with small marrow spaces also shows high implant stability. The combination of bone density and architectural parameters measured using CBCT can predict implant stability more accurately than density alone in clinical diagnoses. PMID:27127692
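
    The correlations reported above are plain Pearson coefficients between SPF and each microstructural parameter. A minimal stdlib sketch; the sample values below are invented for illustration, not the study's measurements:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Illustrative (made-up) data: peak frequency (SPF) vs bone volume
# fraction (BV/TV) for five specimens.
spf = [5.2, 6.1, 6.8, 7.4, 8.0]
bv_tv = [0.18, 0.22, 0.27, 0.31, 0.36]
print(round(pearson(spf, bv_tv), 3))
```

    A positive coefficient here mirrors the reported SPF-BV/TV relationship; the negative correlations (e.g., with Tb.Sp) would simply come out below zero.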

  15. A Python Interface for the Dakota Iterative Systems Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E.; Syvitski, J. P.

    2016-12-01

    Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user not only must understand the structure and syntax of the Dakota input file, but also must develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method.
Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and cumulative distribution functions for the response.
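
    An analysis driver of the kind the CDI generalizes boils down to three steps: parse the parameters file Dakota writes, run the model, and write a results file for Dakota to read back. A minimal sketch under assumed file layouts; the '<value> <descriptor>' line format is a simplification of Dakota's standard parameters format, and toy_model is a hypothetical stand-in for HydroTrend, not its real T/P response:

```python
import os, tempfile

def parse_params(path):
    """Parse '<value> <descriptor>' lines into a {descriptor: value} dict."""
    params = {}
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2:
                try:
                    params[parts[1]] = float(parts[0])
                except ValueError:
                    pass  # skip header/bookkeeping lines
    return params

def toy_model(T, P):
    # Hypothetical stand-in response: sediment load rises with
    # precipitation, weakly with temperature.
    return 5.0 * P + 0.1 * T

def run_driver(params_path, results_path):
    """One forward evaluation: read parameters, run, write results."""
    p = parse_params(params_path)
    qs = toy_model(p["T"], p["P"])
    with open(results_path, "w") as f:
        f.write("%.6e Qs\n" % qs)
    return qs

# Demonstration with a throwaway parameters file.
workdir = tempfile.mkdtemp()
pfile = os.path.join(workdir, "params.in")
rfile = os.path.join(workdir, "results.out")
with open(pfile, "w") as f:
    f.write("14.0 T\n1.5 P\n")
qs = run_driver(pfile, rfile)
print(qs)
```

    Dakota (or the CDI's generic driver) invokes this once per sample drawn from the T and P distributions, then assembles the moments and Sobol' indices from the collected Qs values.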

  16. Cell size and wall dimensions drive distinct variability of earlywood and latewood density in Northern Hemisphere conifers.

    PubMed

    Björklund, Jesper; Seftigen, Kristina; Schweingruber, Fritz; Fonti, Patrick; von Arx, Georg; Bryukhanova, Marina V; Cuny, Henri E; Carrer, Marco; Castagneri, Daniele; Frank, David C

    2017-11-01

    Interannual variability of wood density - an important plant functional trait and environmental proxy - in conifers is poorly understood. We therefore explored the anatomical basis of density. We hypothesized that earlywood density is determined by tracheid size and latewood density by wall dimensions, reflecting their different functional tasks. To determine general patterns of variability, density parameters from 27 species and 349 sites across the Northern Hemisphere were correlated to tree-ring width parameters and local climate. We performed the same analyses with density and width derived from anatomical data comprising two species and eight sites. The contributions of tracheid size and wall dimensions to density were disentangled with sensitivity analyses. Notably, correlations between density and width shifted from negative to positive moving from earlywood to latewood. Temperature responses of density varied intraseasonally in strength and sign. The sensitivity analyses revealed tracheid size as the main determinant of earlywood density, while wall dimensions become more influential for latewood density. Our novel approach of integrating detailed anatomical data with large-scale tree-ring data allowed us to contribute to an improved understanding of interannual variations of conifer growth and to illustrate how conifers balance investments in the competing xylem functions of hydraulics and mechanical support. © 2017 The Authors. New Phytologist © 2017 New Phytologist Trust.

  17. Alteration of Lipid Profile in Subclinical Hypothyroidism: A Meta-Analysis

    PubMed Central

    Liu, Xiao-Li; He, Shan; Zhang, Shao-Fang; Wang, Jun; Sun, Xiu-Fa; Gong, Chun-Mei; Zheng, Shi-Jie; Zhou, Ji-Chang; Xu, Jian

    2014-01-01

    Background Previous studies have yielded controversial results about the alteration of lipid profiles in patients with subclinical hypothyroidism. We performed a meta-analysis to investigate the association between subclinical hypothyroidism and lipid profiles. Material/Methods We searched PubMed, the Cochrane Library, and the China National Knowledge Infrastructure for articles published from January 1990 through January 2014. Dissertation databases (PQDT and CDMD) were searched for additional unpublished articles. We included articles reporting the relationship between subclinical hypothyroidism and at least 1 parameter of the lipid profile, and calculated the overall weighted mean difference (WMD) with a random-effects model. Meta-regression was used to explore the sources of heterogeneity among studies, and the Egger test, Begg test, and the trim-and-fill method were used to assess potential publication bias. Results Sixteen observational studies were included in our analysis. Meta-analysis suggested that serum total cholesterol (TC), low-density lipoprotein cholesterol (LDL-C), and total triglyceride levels were significantly increased in patients with subclinical hypothyroidism compared with euthyroid individuals; the WMDs were 12.17 mg/dl, 7.01 mg/dl, and 13.19 mg/dl, respectively (P<0.001 for all). No significant difference was observed for serum high-density lipoprotein cholesterol (HDL-C). Matching strategy was the main source of heterogeneity among studies in the TC and LDL-C analyses. Potential publication bias was found in the TC and LDL-C analyses by the Egger test or Begg test but was not confirmed by the trim-and-fill method. Conclusions Subclinical hypothyroidism may correlate with an altered lipid profile. Previous studies had limitations in the control of potential confounding factors, and further studies should consider those factors. PMID:25124461
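
    The pooled WMDs above come from a random-effects model; the classic computation is DerSimonian-Laird pooling. A minimal sketch, with per-study numbers invented for illustration (not the meta-analysis data):

```python
def dersimonian_laird(means, ses):
    """DerSimonian-Laird random-effects pooling of per-study mean
    differences; returns (pooled_estimate, tau_squared)."""
    w = [1.0 / se ** 2 for se in ses]          # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * m for wi, m in zip(w, means)) / sw
    q = sum(wi * (m - fixed) ** 2 for wi, m in zip(w, means))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(means) - 1)) / c)  # between-study variance
    wr = [1.0 / (se ** 2 + tau2) for se in ses]  # random-effects weights
    pooled = sum(wi * m for wi, m in zip(wr, means)) / sum(wr)
    return pooled, tau2

# Invented TC mean differences (mg/dl) and standard errors, 4 studies.
wmd, tau2 = dersimonian_laird([10.0, 14.0, 11.0, 13.0], [2.0, 3.0, 2.5, 4.0])
print(wmd, tau2)
```

    When the studies are perfectly homogeneous, tau_squared collapses to zero and the estimate reduces to the fixed-effect (inverse-variance) pooled mean.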

  18. Transcutaneous Raman Spectroscopy of Bone

    NASA Astrophysics Data System (ADS)

    Maher, Jason R.

    Clinical diagnoses of bone health and fracture risk typically rely upon measurements of bone density or structure, but the strength of a bone is also dependent upon its chemical composition. One technology that has been used extensively in ex vivo, exposed-bone studies to measure the chemical composition of bone is Raman spectroscopy. This spectroscopic technique provides chemical information about a sample by probing its molecular vibrations. In the case of bone tissue, Raman spectra provide chemical information about both the inorganic mineral and organic matrix components, which each contribute to bone strength. To explore the relationship between bone strength and chemical composition, our laboratory has contributed to ex vivo, exposed-bone animal studies of rheumatoid arthritis, glucocorticoid-induced osteoporosis, and prolonged lead exposure. All of these studies suggest that Raman-based predictions of biomechanical strength may be more accurate than those produced by the clinically-used parameter of bone mineral density. The utility of Raman spectroscopy in ex vivo, exposed-bone studies has inspired attempts to perform bone spectroscopy transcutaneously. Although the results are promising, further advancements are necessary to make non-invasive, in vivo measurements of bone that are of sufficient quality to generate accurate predictions of fracture risk. In order to separate the signals from bone and soft tissue that contribute to a transcutaneous measurement, we developed an overconstrained extraction algorithm that is based upon fitting with spectral libraries derived from separately-acquired measurements of the underlying tissue components. 
    This approach allows accurate spectral unmixing even though similar chemical components (e.g., type I collagen) are present in both soft tissue and bone. Applied to experimental data, it enabled the transcutaneous detection, to our knowledge for the first time, of age- and disease-related spectral differences in murine bone.
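
    At its core, fitting with separately acquired component libraries is a least-squares decomposition of the measured spectrum. A minimal two-component sketch via the 2x2 normal equations; the 'spectra' are toy vectors, not Raman data, and a real implementation would use many more components plus constraints such as non-negativity:

```python
def unmix_two(spectrum, bone, soft):
    """Least-squares weights (a, b) with spectrum ≈ a*bone + b*soft,
    solved from the 2x2 normal equations."""
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    g11, g12, g22 = dot(bone, bone), dot(bone, soft), dot(soft, soft)
    b1, b2 = dot(bone, spectrum), dot(soft, spectrum)
    det = g11 * g22 - g12 * g12
    a = (b1 * g22 - b2 * g12) / det
    b = (g11 * b2 - g12 * b1) / det
    return a, b

# Toy library spectra and a known 30/70 mixture to recover.
bone = [1.0, 0.0, 2.0]
soft = [0.0, 1.0, 1.0]
mixed = [0.3 * x + 0.7 * y for x, y in zip(bone, soft)]
print(unmix_two(mixed, bone, soft))
```

    The overconstrained variant described in the abstract adds extra library spectra and constraints so that shared chemical signatures are attributed correctly between the bone and soft-tissue components.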

  19. Model Considerations for Memory-based Automatic Music Transcription

    NASA Astrophysics Data System (ADS)

    Albrecht, Štěpán; Šmídl, Václav

    2009-12-01

    The problem of automatic music description is considered. The recorded music is modeled as a superposition of known sounds from a library, weighted by unknown weights. Similar observation models are commonly used in statistics and machine learning, and many methods for estimating the weights are available. These methods differ in the assumptions imposed on the weights. In the Bayesian paradigm, these assumptions are typically expressed in the form of a prior probability density function (pdf) on the weights. In this paper, commonly used assumptions about the music signal are summarized and complemented by a new assumption. These assumptions are translated into pdfs and combined into a single prior density. The validity of the model is tested in simulations using synthetic data.
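
    Under a Gaussian noise model with a zero-mean Gaussian prior on the weights, the MAP estimate is ridge-regularized least squares. A minimal two-sound sketch; the library and observation are toy vectors, and lam stands in for the prior precision (an illustrative assumption, not the paper's actual prior):

```python
def map_weights(obs, s1, s2, lam):
    """MAP weights for y = w1*s1 + w2*s2 + noise with a zero-mean
    Gaussian prior on (w1, w2): solve (S^T S + lam*I) w = S^T y."""
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    g11, g12 = dot(s1, s1) + lam, dot(s1, s2)
    g22 = dot(s2, s2) + lam
    b1, b2 = dot(s1, obs), dot(s2, obs)
    det = g11 * g22 - g12 * g12
    return ((b1 * g22 - b2 * g12) / det, (g11 * b2 - g12 * b1) / det)

# With lam -> 0 this reduces to plain least squares; a larger lam
# shrinks the weights toward the prior mean of zero.
print(map_weights([2.0, 3.0], [1.0, 0.0], [0.0, 1.0], 0.0))
print(map_weights([2.0, 3.0], [1.0, 0.0], [0.0, 1.0], 1.0))
```

    Different priors (e.g., sparsity-promoting ones) change only the penalty term, which is exactly how the assumptions discussed in the abstract enter the estimation.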

  20. Icing Analysis of a Swept NACA 0012 Wing Using LEWICE3D Version 3.48

    NASA Technical Reports Server (NTRS)

    Bidwell, Colin S.

    2014-01-01

    Icing calculations were performed for a NACA 0012 swept wing tip using LEWICE3D Version 3.48 coupled with the ANSYS CFX flow solver. The calculated ice shapes were compared to experimental data generated in the NASA Glenn Icing Research Tunnel (IRT). The IRT tests were designed to test the performance of the LEWICE3D ice void density model, which was developed to improve the prediction of swept-wing ice shapes. Icing tests were performed for a range of temperatures at two different droplet inertia parameters and two different sweep angles. The predicted mass agreed well with the experiment, with an average difference of 12%. The LEWICE3D ice void density model under-predicted void density by an average of 30% for the large inertia parameter cases and by 63% for the small inertia parameter cases. This under-prediction in void density resulted in an over-prediction of ice area by an average of 115%. The LEWICE3D ice void density model produced a larger average area difference from experiment than the standard LEWICE density model, which does not account for the voids in the swept-wing ice shape (115% and 75%, respectively), but it produced ice shapes that were deemed more appropriate because they were conservative (larger than experiment). Major contributors to the overly conservative ice shape predictions were deficiencies in the leading-edge heat transfer and the sensitivity of the void ice density model to the particle inertia parameter. The scallop features present on the ice shapes were thought to generate interstitial flow and horseshoe vortices that enhance the leading-edge heat transfer. A set of changes to improve the leading-edge heat transfer and the void density model was tested. The changes improved the ice shape predictions considerably. More work needs to be done to evaluate the performance of these modifications for a wider range of geometries and icing conditions.

  2. Comparison of dwarf bamboos (Indocalamus sp.) leaf parameters to determine relationship between spatial density of plants and total leaf area per plant.

    PubMed

    Shi, Pei-Jian; Xu, Qiang; Sandhu, Hardev S; Gielis, Johan; Ding, Yu-Long; Li, Hua-Rong; Dong, Xiao-Bo

    2015-10-01

    The relationship between spatial density and size of plants is an important topic in plant ecology. The self-thinning rule suggests a -3/2 power between average biomass and density, or a -1/2 power between stand yield and density. However, a self-thinning rule based on total leaf area per plant and plant density has been neglected, presumably for lack of a method that can accurately estimate the total leaf area per plant. We aimed to find the relationship between the spatial density of plants and total leaf area per plant. We also attempted to provide a novel model for accurately describing the leaf shape of bamboos. We proposed a simplified Gielis equation with only two parameters to describe the leaf shape of bamboos; one model parameter represents the overall ratio of leaf width to leaf length. Using this method, we compared several leaf parameters (leaf shape, number of leaves per plant, ratio of total leaf weight to aboveground weight per plant, and total leaf area per plant) of four bamboo species of the genus Indocalamus Nakai (I. pedalis (Keng) P.C. Keng, I. pumilus Q.H. Dai and C.F. Keng, I. barbatus McClure, and I. victorialis P.C. Keng). We also explored the possible correlation between spatial density and total leaf area per plant using log-linear regression. We found that the simplified Gielis equation fit the leaf shape of the four bamboo species very well. Although all four species belong to the same genus, there were still significant differences in leaf shape. Significant differences also existed in leaf area per plant, ratio of leaf weight to aboveground weight per plant, and leaf length. In addition, we found that total leaf area per plant decreased with increasing spatial density, directly demonstrating a self-thinning rule that improves light interception.
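
    The log-linear regression between density and total leaf area per plant reduces to an OLS slope on log-transformed data, with the slope as the self-thinning exponent. A minimal sketch; the data below follow an assumed -1/2 power law for illustration and are not the measured Indocalamus values:

```python
import math

def loglog_slope(density, leaf_area):
    """OLS slope of ln(leaf_area) on ln(density): the fitted
    self-thinning exponent."""
    x = [math.log(d) for d in density]
    y = [math.log(a) for a in leaf_area]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

# Illustrative data following leaf_area ∝ density^(-1/2).
dens = [10, 20, 40, 80]
area = [d ** -0.5 for d in dens]
print(loglog_slope(dens, area))  # ≈ -0.5
```

    A slope near -1/2 (or -3/2 when regressing biomass) is the signature of the self-thinning rule discussed in the abstract.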

  3. Regulating the surface poly(ethylene glycol) density of polymeric nanoparticles and evaluating its role in drug delivery in vivo.

    PubMed

    Du, Xiao-Jiao; Wang, Ji-Long; Liu, Wei-Wei; Yang, Jin-Xian; Sun, Chun-Yang; Sun, Rong; Li, Hong-Jun; Shen, Song; Luo, Ying-Li; Ye, Xiao-Dong; Zhu, Yan-Hua; Yang, Xian-Zhu; Wang, Jun

    2015-11-01

    Poly(ethylene glycol) (PEG) is commonly used to protect nanoparticles from rapid clearance in blood, and its effectiveness depends strongly on the surface PEG density of the nanoparticles. However, detailed and informative studies relating PEG density to in vivo drug delivery have been lacking, because it is technically demanding to precisely control the surface PEG density while keeping other nano-properties constant. Here, we regulated the size and surface PEG density of polymeric nanoparticles by incorporating poly(ε-caprolactone) (PCL) homopolymer into poly(ethylene glycol)-block-poly(ε-caprolactone) (PEG-PCL) and adjusting the mass ratio of PCL to PEG-PCL during nanoparticle preparation. We further developed a library of polymeric nanoparticles with different but controllable sizes and surface PEG densities by changing the molecular weight of the PCL block in PEG-PCL and tuning the molar ratio of the repeating units of PCL (CL) to those of PEG (EG). We thus obtained a group of nanoparticles with variable surface PEG densities but otherwise identical nano-properties, and investigated the effects of surface PEG density on the biological behavior of the nanoparticles in mice. We found that a high surface PEG density made the nanoparticles resistant to serum protein adsorption and macrophage uptake, leading to greater accumulation of nanoparticles in tumor tissue, which compensated for the decreased internalization by tumor cells and resulted in superior antitumor efficacy when carrying docetaxel. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Macromolecular ab initio phasing enforcing secondary and tertiary structure.

    PubMed

    Millán, Claudia; Sammito, Massimo; Usón, Isabel

    2015-01-01

    Ab initio phasing of macromolecular structures, from the native intensities alone with no experimental phase information or previous particular structural knowledge, has been the object of a long quest, limited by two main barriers: structure size and resolution of the data. Current approaches to extend the scope of ab initio phasing include use of the Patterson function, density modification and data extrapolation. The authors' approach relies on the combination of locating model fragments such as polyalanine α-helices with the program PHASER and density modification with the program SHELXE. Given the difficulties in discriminating correct small substructures, many putative groups of fragments have to be tested in parallel; thus calculations are performed in a grid or supercomputer. The method has been named after the Italian painter Arcimboldo, who used to compose portraits out of fruit and vegetables. With ARCIMBOLDO, most collections of fragments remain a 'still-life', but some are correct enough for density modification and main-chain tracing to reveal the protein's true portrait. Beyond α-helices, other fragments can be exploited in an analogous way: libraries of helices with modelled side chains, β-strands, predictable fragments such as DNA-binding folds or fragments selected from distant homologues up to libraries of small local folds that are used to enforce nonspecific tertiary structure; thus restoring the ab initio nature of the method. Using these methods, a number of unknown macromolecules with a few thousand atoms and resolutions around 2 Å have been solved. In the 2014 release, use of the program has been simplified. The software mediates the use of massive computing to automate the grid access required in difficult cases but may also run on a single multicore workstation (http://chango.ibmb.csic.es/ARCIMBOLDO_LITE) to solve straightforward cases.

  5. Relative effectiveness of kinetic analysis vs single point readings for classifying environmental samples based on community-level physiological profiles (CLPP)

    NASA Technical Reports Server (NTRS)

    Garland, J. L.; Mills, A. L.; Young, J. S.

    2001-01-01

    The relative effectiveness of average-well-color-development-normalized single-point absorbance readings (AWCD) vs the kinetic parameters μm, λ, A, and integrated area (AREA) of the modified Gompertz equation was compared for classifying environmental samples. The Gompertz equation was fitted to the color development curves resulting from the reduction of a redox-sensitive dye during microbial respiration of 95 separate sole carbon sources in microplate wells, for a dilution series of rhizosphere samples from hydroponically grown wheat and potato with inoculum densities of 1 x 10(4)-4 x 10(6) cells ml-1. Patterns generated with each parameter were analyzed using principal component analysis (PCA) and discriminant function analysis (DFA) to test relative resolving power. Samples of equivalent cell density (undiluted samples) were correctly classified by rhizosphere type for all parameters based on DFA of the first five PC scores. Analysis of undiluted and 1:4 diluted samples resulted in misclassification of at least two of the wheat samples for all parameters except the AWCD-normalized (0.50 abs. units) data, and analysis of undiluted, 1:4, and 1:16 diluted samples resulted in misclassification for all parameter types. Ordination of samples along the first principal component (PC) was correlated with inoculum density in analyses performed on all of the kinetic parameters, but no such influence was seen for the AWCD-derived results. The carbon sources responsible for classification differed among the variable types, with the exception of AREA and A, which were strongly correlated. These results indicate that the use of kinetic parameters for pattern analysis in CLPP may provide some additional information, but only if the influence of inoculum density is carefully considered. © 2001 Elsevier Science Ltd. All rights reserved.
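
    The modified Gompertz curve behind the kinetic parameters μm (maximum slope), λ (lag time) and A (asymptote) can be written down and checked directly. A sketch in the Zwietering parameterization, which is an assumption about the variant fitted here; the parameter values are arbitrary:

```python
import math

E = math.e

def gompertz(t, A, mu_m, lam):
    """Modified Gompertz curve (Zwietering form): asymptote A,
    maximum slope mu_m, lag time lam."""
    return A * math.exp(-math.exp(mu_m * E / A * (lam - t) + 1.0))

# The maximum slope equals mu_m, reached at the inflection point
# t* = lam + A/(mu_m*e); a central difference around t* recovers it.
A, mu_m, lam = 1.5, 0.2, 3.0
t_star = lam + A / (mu_m * E)
h = 1e-5
slope = (gompertz(t_star + h, A, mu_m, lam)
         - gompertz(t_star - h, A, mu_m, lam)) / (2 * h)
print(round(slope, 4))  # ≈ mu_m = 0.2
```

    Fitting this curve to each well's absorbance trace yields the per-substrate μm, λ, A (and AREA, the integral of the curve) that the classification analyses compare against the single-point AWCD readings.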

  6. Effects of hair removal alexandrite laser on biometric parameters of the skin.

    PubMed

    Alavi, Shiva; Abolhasani, Ehsan; Nilforoushzadeh, Mohammadali

    2016-04-01

    The effects of alexandrite laser (AL) hair removal on skin parameters such as melanin content, skin layer depth, elasticity, and density have not been investigated through biometric methods. We aimed to assess the effect of AL on these skin parameters with biometric devices to determine whether it has positive effects on the treated region. In this pretest-posttest study, we recruited patients who attended the Laser Clinic of the Skin and Stem Cell Research Center, Tehran University of Medical Sciences, Tehran, Iran, from January through December 2014. Patients had to be free of any dermatologic conditions or lesions at the site of treatment and of any contraindication to laser therapy. Baseline measurements were performed, and patients received four sessions of AL therapy (spot size, 12 mm; fluence, 12 J/cm(2); and pulse rate, 5 Hz) at 4-week intervals. Four weeks after the last treatment session, the same parameters were assessed, including skin color, transepidermal water loss (TEWL), dermis and epidermis density and depth (through skin ultrasonography), melanin content, erythema intensity, and skin elasticity. Biometric parameters of 33 patients (27 females [81.8%]), with a mean (SD) age of 35.7 (9.5) years, were evaluated. The mean percent changes of the skin parameters were as follows: skin color, 5.88% through the Visioface and 56.8% through the Colorimeter devices (became lighter); melanin content, -15.95%; TEWL, -2.96%; elasticity, +14.88%; dermis depth, -19.01%; and dermis density, +1580.11% (P < 0.001 for changes in each parameter). AL decreased the melanin content of the skin and made the skin thinner, while it increased the elasticity and density of the epidermis and dermis, which might indicate increased collagen content of the skin.

  7. Immunohistological features related to functional impairment in lymphangioleiomyomatosis.

    PubMed

    Nascimento, Ellen Caroline Toledo do; Baldi, Bruno Guedes; Mariani, Alessandro Wasum; Annoni, Raquel; Kairalla, Ronaldo Adib; Pimenta, Suzana Pinheiro; da Silva, Luiz Fernando Ferraz; Carvalho, Carlos Roberto Ribeiro; Dolhnikoff, Marisa

    2018-05-08

    Lymphangioleiomyomatosis (LAM) is a low-grade neoplasm characterized by the pulmonary infiltration of smooth muscle-like cells (LAM cells) and cystic destruction. Patients usually present with airway obstruction in pulmonary function tests (PFTs). Previous studies have shown correlations among histological parameters, lung function abnormalities and prognosis in LAM. We investigated the lung tissue expression of proteins related to the mTOR pathway, angiogenesis and enzymatic activity and its correlation with functional parameters in LAM patients. We analyzed morphological and functional parameters of thirty-three patients. Two groups of disease severity were identified according to FEV1 values. Lung tissue from open biopsies or lung transplants was immunostained for SMA, HMB-45, mTOR, VEGF-D, MMP-9 and D2-40. Density of cysts, density of nodules and protein expression were measured by image analysis and correlated with PFT parameters. There was no difference in the expression of D2-40 between the more severe and the less severe groups. All other immunohistological parameters showed significantly higher values in the more severe group (p ≤ 0.002). The expression of VEGF-D, MMP-9 and mTOR in LAM cells was associated with the density of both cysts and nodules. The density of cysts and nodules as well as the expression of MMP-9 and VEGF-D were associated with the impairment of PFT parameters. Severe LAM represents an active phase of the disease with high expression of VEGF-D, mTOR, and MMP-9, as well as LAM cell infiltration. Our findings suggest that the tissue expression levels of VEGF-D and MMP-9 are important parameters associated with the loss of pulmonary function and could be considered as potential severity markers in open lung biopsies of LAM patients.

  8. Multi-objective based spectral unmixing for hyperspectral images

    NASA Astrophysics Data System (ADS)

    Xu, Xia; Shi, Zhenwei

    2017-02-01

    Sparse hyperspectral unmixing assumes that each observed pixel can be expressed as a linear combination of a few pure spectra from an a priori library. Sparse unmixing is challenging, since it is usually transformed into an NP-hard l0-norm optimization problem. Existing methods usually relax the original l0 norm; however, the relaxation may introduce sensitive weighting parameters and additional calculation error. In this paper, we propose a novel multi-objective algorithm that solves the sparse unmixing problem without any relaxation. We transform sparse unmixing into a multi-objective optimization problem with two correlated objectives: minimizing the reconstruction error and controlling the endmember sparsity. To improve the efficiency of the multi-objective optimization, a population-based random flipping strategy is designed. Moreover, we theoretically prove that the proposed method recovers a guaranteed approximate solution from the spectral library within a limited number of iterations. The proposed method deals with the l0 norm directly via binary coding of the spectral signatures in the library. Experiments on both synthetic and real hyperspectral datasets demonstrate the effectiveness of the proposed method.
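
    The two competing objectives can be evaluated exactly on a binary support code, with no l0 relaxation: reconstruction error from least squares over the selected library columns, and sparsity as the plain count of ones. A minimal sketch with toy data; the paper's population-based flipping search itself is not reproduced:

```python
def lstsq_cols(cols, y):
    """Solve min ||y - cols·c|| via the normal equations with
    Gaussian elimination (cols: list of column vectors)."""
    k = len(cols)
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    G = [[dot(cols[i], cols[j]) for j in range(k)] for i in range(k)]
    b = [dot(cols[i], y) for i in range(k)]
    for i in range(k):                       # forward elimination
        p = max(range(i, k), key=lambda r: abs(G[r][i]))
        G[i], G[p], b[i], b[p] = G[p], G[i], b[p], b[i]
        for r in range(i + 1, k):
            f = G[r][i] / G[i][i]
            G[r] = [g - f * h for g, h in zip(G[r], G[i])]
            b[r] -= f * b[i]
    c = [0.0] * k
    for i in range(k - 1, -1, -1):           # back substitution
        c[i] = (b[i] - sum(G[i][j] * c[j] for j in range(i + 1, k))) / G[i][i]
    return c

def objectives(z, library, y):
    """(reconstruction error, l0 sparsity) for a binary support z."""
    cols = [library[i] for i, zi in enumerate(z) if zi]
    if not cols:
        return sum(v * v for v in y), 0
    c = lstsq_cols(cols, y)
    resid = [yi - sum(ci * col[j] for ci, col in zip(c, cols))
             for j, yi in enumerate(y)]
    return sum(r * r for r in resid), sum(z)

# Toy 3-atom library and a pixel using only the first two atoms.
library = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
pixel = [1.0, 2.0, 0.0]
print(objectives((1, 1, 0), library, pixel))
print(objectives((0, 0, 0), library, pixel))
```

    A multi-objective search then flips bits of z and keeps the Pareto-nondominated codes, trading reconstruction error against support size.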

  9. Scaling relations for a needle-like electron beam plasma from the self-similar behavior in beam propagation

    NASA Astrophysics Data System (ADS)

    Bai, Xiaoyan; Chen, Chen; Li, Hong; Liu, Wandong; Chen, Wei

    2017-10-01

    Scaling relations of the main parameters of a needle-like electron beam plasma (EBP) to the initial beam energy, beam current, and discharge pressure are presented. The relations characterize the main features of the plasma in the three-parameter space and can greatly simplify plasma design with electron beams. First, starting from the self-similar behavior of electron beam propagation, the energy and charge depositions during beam propagation were expressed analytically as functions of the three parameters. Second, from the complete coupled theoretical model of an EBP and appropriate assumptions, independent equations controlling the density and space charges were derived, and analytical expressions for the density and charges as functions of the energy and charge depositions were obtained. Finally, by combining the expressions derived in the two steps above, scaling relations of the density and potential to the three parameters were constructed. Numerical simulations were used to test part of the scaling relations.

  10. Calculation of Weibull strength parameters, Batdorf flaw density constants and related statistical quantities using PC-CARES

    NASA Technical Reports Server (NTRS)

    Szatmary, Steven A.; Gyekenyesi, John P.; Nemeth, Noel N.

    1990-01-01

    This manual describes the operation and theory of the PC-CARES (Personal Computer-Ceramic Analysis and Reliability Evaluation of Structures) computer program for the IBM PC and compatibles running the PC-DOS/MS-DOS or IBM/MS OS/2 (version 1.1 or higher) operating systems. The primary purpose of this code is to estimate Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities. The manual describes the calculation of the shape and scale parameters of the two-parameter Weibull distribution using least-squares analysis and maximum likelihood methods for volume- and surface-flaw-induced fracture in ceramics, with complete and censored samples. The methods for detecting outliers, for calculating the Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit statistics and 90 percent confidence bands about the Weibull line, and for calculating the Batdorf flaw-density constants are also described.
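
    The maximum-likelihood estimation described above can be illustrated with a minimal fit of the two-parameter Weibull distribution. This is a generic sketch (a damped fixed-point iteration on the standard MLE shape equation), not the PC-CARES implementation, and it omits censored samples and confidence bands.

```python
import numpy as np

def weibull_mle(x, iters=200):
    """Fit shape k and scale lam of a two-parameter Weibull by MLE.

    Solves 1/k = sum(x^k ln x)/sum(x^k) - mean(ln x) for k by damped
    fixed-point iteration, then recovers the scale in closed form.
    """
    x = np.asarray(x, dtype=float)
    logx = np.log(x)
    k = 1.0
    for _ in range(iters):
        xk = x ** k
        k_new = 1.0 / (np.sum(xk * logx) / np.sum(xk) - logx.mean())
        k = 0.5 * (k + k_new)          # damping for robust convergence
    lam = np.mean(x ** k) ** (1.0 / k)  # MLE scale given the shape
    return k, lam
```

    For censored fracture data the likelihood must be modified, which is where codes like PC-CARES go further.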

  11. Speech processing using conditional observable maximum likelihood continuity mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogden, John; Nix, David

    A computer implemented method enables the recognition of speech and speech characteristics. Parameters are initialized of first probability density functions that map between the symbols in the vocabulary of one or more sequences of speech codes that represent speech sounds and a continuity map. Parameters are also initialized of second probability density functions that map between the elements in the vocabulary of one or more desired sequences of speech transcription symbols and the continuity map. The parameters of the probability density functions are then trained to maximize the probabilities of the desired sequences of speech-transcription symbols. A new sequence of speech codes is then input to the continuity map having the trained first and second probability function parameters. A smooth path is identified on the continuity map that has the maximum probability for the new sequence of speech codes. The probability of each speech transcription symbol for each input speech code can then be output.

  12. Large-scale high density 3D AMT for mineral exploration — A case history from volcanic massive sulfide Pb-Zn deposit with 2000 AMT sites

    NASA Astrophysics Data System (ADS)

    Chen, R.; Chen, S.; He, L.; Yao, H.; Li, H.; Xi, X.; Zhao, X.

    2017-12-01

    The electromagnetic (EM) method plays a key role in exploring for volcanic massive sulfide (VMS) deposits, which are of high grade and high economic value. However, the performance of high-density 3D AMT in detecting deeply concealed VMS targets is not well established. A typical VMS target is smaller than 100 m x 100 m x 50 m, so finding one at large depth is a challenging task. We carried out a test over a VMS Pb-Zn deposit using high-density 3D AMT with a site spacing of 20 m and a profile spacing of 40 - 80 m. About 2000 AMT sites were acquired over an area of 2000 m x 1500 m. We then used a server with 8 CPUs (Intel Xeon E7-8880 v3, 2.3 GHz, 144 cores), 2048 GB of RAM, and a 40 TB disk array to invert the 3D AMT data using integral-equation forward modeling and re-weighted conjugate-gradient inversion. The VMS ore body lies at a depth of about 600 m and measures about 100 m x 100 m x 20 m, with a dip angle of about 45 degrees. We find that it is very hard to recover the location and shape of the ore body by 3D AMT inversion, even using the data from all AMT sites and frequencies. However, it is possible to recover the location and shape of the deeply concealed ore body if the inversion parameters are adjusted carefully. A new set of inversion parameters had to be found for the high-density 3D AMT data set: the parameters that work well for the Dublin Secret Model II (DSM 2) are not suitable for our real data, which may be due to the different data density and number of frequencies. We found a good set of inversion parameters by comparing the shape and location of the ore body with the inversion results over a range of trial parameters. Applying the new inversion parameters in a nearby area with high-density AMT sites greatly improved the inversion results.

  13. Optical coherence tomography angiography retinal vascular network assessment in multiple sclerosis.

    PubMed

    Lanzillo, Roberta; Cennamo, Gilda; Criscuolo, Chiara; Carotenuto, Antonio; Velotti, Nunzio; Sparnelli, Federica; Cianflone, Alessandra; Moccia, Marcello; Brescia Morra, Vincenzo

    2017-09-01

    Optical coherence tomography (OCT) angiography is a new method to assess the density of the vascular networks. Vascular abnormalities are thought to be involved in multiple sclerosis (MS) pathology. The aims were to assess the presence of vascular abnormalities in MS and to evaluate their correlation with disease features. A total of 50 MS patients with and without a history of optic neuritis (ON) and 46 healthy subjects were included. All underwent spectral domain (SD)-OCT and OCT angiography. Clinical history, Expanded Disability Status Scale (EDSS), Multiple Sclerosis Severity Score (MSSS) and disease duration were collected. Angio-OCT showed a vessel density reduction in the eyes of MS patients when compared to controls. A statistically significant reduction in all SD-OCT and OCT angiography parameters was noticed both in eyes with and without ON when compared with control eyes. We found an inverse correlation between SD-OCT parameters and MSSS (p = 0.003) and between vessel density parameters and EDSS (p = 0.007). We report a vessel density reduction in the retina of MS patients. We highlight the clinical correlation between vessel density and EDSS, suggesting that angio-OCT could be a good marker of disease and of disability in MS.

  14. Quantum crystallographic charge density of urea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wall, Michael E.

    Standard X-ray crystallography methods use free-atom models to calculate mean unit-cell charge densities. Real molecules, however, have shared charge that is not captured accurately using free-atom models. To address this limitation, a charge density model of crystalline urea was calculated using high-level quantum theory and was refined against publicly available ultra-high-resolution experimental Bragg data, including the effects of atomic displacement parameters. The resulting quantum crystallographic model was compared with models obtained using spherical atom or multipole methods. Despite using only the same number of free parameters as the spherical atom model, the agreement of the quantum model with the data is comparable to the multipole model. The static, theoretical crystalline charge density of the quantum model is distinct from the multipole model, indicating the quantum model provides substantially new information. Hydrogen thermal ellipsoids in the quantum model were very similar to those obtained using neutron crystallography, indicating that quantum crystallography can increase the accuracy of the X-ray crystallographic atomic displacement parameters. Lastly, the results demonstrate the feasibility and benefits of integrating fully periodic quantum charge density calculations into ultra-high-resolution X-ray crystallographic model building and refinement.

  15. Quantum crystallographic charge density of urea

    DOE PAGES

    Wall, Michael E.

    2016-06-08

    Standard X-ray crystallography methods use free-atom models to calculate mean unit-cell charge densities. Real molecules, however, have shared charge that is not captured accurately using free-atom models. To address this limitation, a charge density model of crystalline urea was calculated using high-level quantum theory and was refined against publicly available ultra-high-resolution experimental Bragg data, including the effects of atomic displacement parameters. The resulting quantum crystallographic model was compared with models obtained using spherical atom or multipole methods. Despite using only the same number of free parameters as the spherical atom model, the agreement of the quantum model with the data is comparable to the multipole model. The static, theoretical crystalline charge density of the quantum model is distinct from the multipole model, indicating the quantum model provides substantially new information. Hydrogen thermal ellipsoids in the quantum model were very similar to those obtained using neutron crystallography, indicating that quantum crystallography can increase the accuracy of the X-ray crystallographic atomic displacement parameters. Lastly, the results demonstrate the feasibility and benefits of integrating fully periodic quantum charge density calculations into ultra-high-resolution X-ray crystallographic model building and refinement.

  16. Data series embedding and scale invariant statistics.

    PubMed

    Michieli, I; Medved, B; Ristov, S

    2010-06-01

    Data sequences acquired from bio-systems such as human gait data, heart rate interbeat data, or DNA sequences exhibit complex dynamics that is frequently described by a long-memory or power-law decay of the autocorrelation function. One way of characterizing that dynamics is through scale-invariant statistics or "fractal-like" behavior. Several methods have been proposed for quantifying the scale-invariant parameters of physiological signals. Among them the most common are detrended fluctuation analysis, sample mean variance analyses, power spectral density analysis, R/S analysis, and, recently in the realm of the multifractal approach, wavelet analysis. In this paper it is demonstrated that embedding the time series data in a high-dimensional pseudo-phase space reveals scale-invariant statistics in a simple fashion. The procedure is applied to different stride-interval data sets from human gait measurement time series (PhysioBank data library). Results show that the introduced mapping adequately separates long-memory from random behavior. Smaller gait data sets were analyzed, and scale-free trends over limited scale intervals were successfully detected. The method was verified on artificially produced time series with known scaling behavior and with varying noise content. The possibility for the method to falsely detect long-range dependence in artificially generated short-range-dependence series was investigated. (c) 2009 Elsevier B.V. All rights reserved.
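
    A minimal sketch of the embedding step described above: mapping a 1-D series into a high-dimensional pseudo-phase space with delay coordinates. The embedding dimension and lag used below are illustrative assumptions; the paper's exact mapping parameters are not given in the abstract.

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    """Embed a 1-D series into a dim-dimensional pseudo-phase space
    using delay coordinates: row i is (x[i], x[i+tau], ..., x[i+(dim-1)tau])."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau          # number of complete delay vectors
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# e.g. embed a stride-interval series with hypothetical dim=5, tau=1:
# vectors = delay_embed(stride_intervals, dim=5, tau=1)
```

    Scale-invariant statistics are then read off from how point counts in the embedded cloud grow with scale.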

  17. Visualization for Molecular Dynamics Simulation of Gas and Metal Surface Interaction

    NASA Astrophysics Data System (ADS)

    Puzyrkov, D.; Polyakov, S.; Podryga, V.

    2016-02-01

    The development of methods, algorithms and applications for visualization of molecular dynamics simulation outputs is discussed. The visual analysis of the results of such calculations is a complex and topical problem, especially in the case of large-scale simulations. To solve this challenging task it is necessary to decide: 1) which data parameters to render, 2) which type of visualization to choose, and 3) which development tools to use. In the present work an attempt to answer these questions was made. For visualization we propose drawing particles at their 3D coordinates together with their velocity vectors, trajectories, and volume density rendered as isosurfaces or fog. We tested a post-processing and visualization approach based on the Python language with additional libraries. Parallel software was also developed that can process large volumes of data in the 3D regions of the examined system; it produces results in parallel with the calculations and finally assembles the received frames into a video file. The software package "Enthought Mayavi2" was used as the visualization tool. This application allowed us to study the interaction of a gas with a metal surface and to closely observe the adsorption effect.
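
    The volume-density rendering mentioned above presupposes binning particle positions onto a 3-D grid. A minimal numpy sketch of that binning step (not the authors' parallel pipeline) is shown below; the resulting field would then be handed to a tool such as Mayavi2 for isosurface or fog rendering.

```python
import numpy as np

def volume_density(points, bins=32, box=None):
    """Bin particle coordinates onto a 3-D grid to obtain a number
    density field suitable for isosurface or fog visualization."""
    points = np.asarray(points, dtype=float)
    if box is None:  # default to the particles' bounding box
        box = [(points[:, d].min(), points[:, d].max()) for d in range(3)]
    hist, edges = np.histogramdd(points, bins=bins, range=box)
    # divide counts by the cell volume to convert to a density
    cell_vol = np.prod([(hi - lo) / bins for lo, hi in box])
    return hist / cell_vol, edges
```

    With Mayavi2 one would typically pass such a field to a contour/isosurface pipeline; the exact calls the authors used are not specified in the abstract.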

  18. Theoretical Studies of Strongly Interacting Fine Particle Systems

    NASA Astrophysics Data System (ADS)

    Fearon, Michael

    Available from UMI in association with The British Library. A theoretical analysis of the time dependent behaviour of a system of fine magnetic particles as a function of applied field and temperature was carried out. The model used was based on a theory assuming Neel relaxation with a distribution of particle sizes. This theory predicted a linear variation of S_{max} with temperature and a finite intercept, which is not reflected by experimental observations. The remanence curves of strongly interacting fine-particle systems were also investigated theoretically. It was shown that the Henkel plot of the dc demagnetisation remanence vs the isothermal remanence is a useful representation of interactions. The form of the plot was found to be a reflection of the magnetic and physical microstructure of the material, which is consistent with experimental data. The relationship between the Henkel plot and the noise of a particulate recording medium, another property dependent on the microstructure, is also considered. The Interaction Field Factor (IFF), a single parameter characterising the non-linearity of the Henkel plot, is investigated. These results are consistent with a previous experimental study. Finally the results of the noise power spectral density for erased and saturated recording media are presented, so that characterisation of interparticle interactions may be carried out with greater accuracy.

  19. Optimization of the lithium/thionyl chloride battery

    NASA Technical Reports Server (NTRS)

    White, Ralph E.

    1989-01-01

    A 1-D mathematical model for the lithium/thionyl chloride primary cell is used in conjunction with a parameter estimation technique in order to estimate the electro-kinetic parameters of this electrochemical system. The electro-kinetic parameters include the anodic transfer coefficient and exchange current density of the lithium oxidation, alpha_{a,1} and i_{o,1,ref}; the cathodic transfer coefficient and the effective exchange current density of the thionyl chloride reduction, alpha_{c,2} and a^o i_{o,2,ref}; and a morphology parameter, xi. The parameter estimation is performed on simulated data first in order to gain confidence in the method. Data reported in the literature for a high-rate discharge of an experimental lithium/thionyl chloride cell are then used for the analysis.

  20. Effect of parity on bone mineral density: A systematic review and meta-analysis.

    PubMed

    Song, Seung Yeon; Kim, Yejee; Park, Hyunmin; Kim, Yun Joo; Kang, Wonku; Kim, Eun Young

    2017-08-01

    Parity has been suggested as a possible factor affecting bone health in women. However, study results on its association with bone mineral density are conflicting. PubMed, EMBASE, the Cochrane Library, and Korean online databases were searched using the terms "parity" and "bone mineral density", in May 2016. Two independent reviewers extracted the mean and standard deviation of bone mineral density measurements of the femoral neck, spine, and total hip in nulliparous and parous healthy women. Among the initial 10,146 studies, 10 articles comprising 24,771 women met the inclusion criteria. The overall effect of parity on bone mineral density was positive (mean difference = 5.97 mg/cm²; 95% CI 2.37 to 9.57; P = 0.001). The effect appears site-specific as parity was not significantly associated with the bone mineral density of the femoral neck (P = 0.09) and lumbar spine (P = 0.17), but parous women had significantly higher bone mineral density of the total hip compared to nulliparous women (mean difference = 5.98 mg/cm²; 95% CI 1.72 to 10.24; P = 0.006). No obvious heterogeneity existed among the included studies (femoral neck I² = 0%; spine I² = 31%; total hip I² = 0%). Parity has a positive effect on bone in healthy, community-dwelling women and its effect appears site-specific. Copyright © 2017 Elsevier Inc. All rights reserved.
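
    The pooling reported above (a pooled mean difference with 95% CI and an I² heterogeneity statistic) follows standard inverse-variance meta-analysis. A minimal fixed-effect sketch, with hypothetical inputs, is:

```python
import numpy as np

def fixed_effect_pool(md, se):
    """Inverse-variance (fixed-effect) pooling of per-study mean
    differences; returns the pooled estimate, 95% CI, and I^2 (%)."""
    md, se = np.asarray(md, float), np.asarray(se, float)
    w = 1.0 / se ** 2                       # inverse-variance weights
    pooled = np.sum(w * md) / np.sum(w)
    se_pool = np.sqrt(1.0 / np.sum(w))
    ci = (pooled - 1.96 * se_pool, pooled + 1.96 * se_pool)
    q = np.sum(w * (md - pooled) ** 2)      # Cochran's Q
    df = len(md) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, ci, i2
```

    Note the review's actual analysis details (fixed vs random effects, exact weights) are not specified in the abstract; this block only illustrates the standard calculation behind such numbers.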

  1. High-Density 16S Microarray and Clone Library-Based Microbial Community Composition of the Phoenix Spacecraft Assembly Clean Room

    NASA Astrophysics Data System (ADS)

    Vaishampayan, Parag; Osman, Shariff; Andersen, Gary; Venkateswaran, Kasthuri

    2010-06-01

    The bacterial diversity and comparative community structure of a clean room used for assembling the Phoenix spacecraft were characterized throughout the spacecraft assembly process by using 16S rRNA gene cloning/sequencing and DNA microarray (PhyloChip) technologies. Samples were collected from several locations of the clean room at three time points: before Phoenix's arrival (PHX-B), during hardware assembly (PHX-D), and after the spacecraft was removed for launch (PHX-A). The bacterial diversity of PHX-B, comprising all major bacterial phyla, was found to be statistically different from that of the PHX-D and PHX-A samples. Due to stringent cleaning and decontamination protocols during assembly, PHX-D bacterial diversity was dramatically reduced compared to the PHX-B and PHX-A samples. Comparative community analysis based on PhyloChip results revealed overall trends similar to those seen in the clone libraries, but the high-density phylogenetic microarray detected greater diversity in all sampling events. The decrease in community complexity in PHX-D compared to PHX-B, and the subsequent recurrence of these organisms in PHX-A, speaks to the effectiveness of NASA cleaning protocols. However, the persistence of a subset of bacterial signatures throughout all spacecraft assembly phases underscores the need for continued refinement of sterilization technologies and the implementation of safeguards that monitor and inventory microbial contaminants.

  2. High-density 16S microarray and clone library-based microbial community composition of the Phoenix spacecraft assembly clean room.

    PubMed

    Vaishampayan, Parag; Osman, Shariff; Andersen, Gary; Venkateswaran, Kasthuri

    2010-06-01

    The bacterial diversity and comparative community structure of a clean room used for assembling the Phoenix spacecraft were characterized throughout the spacecraft assembly process by using 16S rRNA gene cloning/sequencing and DNA microarray (PhyloChip) technologies. Samples were collected from several locations of the clean room at three time points: before Phoenix's arrival (PHX-B), during hardware assembly (PHX-D), and after the spacecraft was removed for launch (PHX-A). The bacterial diversity of PHX-B, comprising all major bacterial phyla, was found to be statistically different from that of the PHX-D and PHX-A samples. Due to stringent cleaning and decontamination protocols during assembly, PHX-D bacterial diversity was dramatically reduced compared to the PHX-B and PHX-A samples. Comparative community analysis based on PhyloChip results revealed overall trends similar to those seen in the clone libraries, but the high-density phylogenetic microarray detected greater diversity in all sampling events. The decrease in community complexity in PHX-D compared to PHX-B, and the subsequent recurrence of these organisms in PHX-A, speaks to the effectiveness of NASA cleaning protocols. However, the persistence of a subset of bacterial signatures throughout all spacecraft assembly phases underscores the need for continued refinement of sterilization technologies and the implementation of safeguards that monitor and inventory microbial contaminants.

  3. Spectral density method to Anderson-Holstein model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chebrolu, Narasimha Raju, E-mail: narasimharaju.phy@gmail.com; Chatterjee, Ashok

    The two-parameter spectral density function of a magnetic impurity electron in a non-magnetic metal is calculated within the framework of the Anderson-Holstein model using the spectral density approximation method. The effect of the electron-phonon interaction on the spectral function is investigated.

  4. Symmetry Parameter Constraints from a Lower Bound on Neutron-matter Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tews, Ingo; Lattimer, James M.; Ohnishi, Akira

    We propose the existence of a lower bound on the energy of pure neutron matter (PNM) on the basis of unitary-gas considerations. We discuss its justification from experimental studies of cold atoms as well as from theoretical studies of neutron matter. We demonstrate that this bound results in limits to the density-dependent symmetry energy, which is the difference between the energies of symmetric nuclear matter and PNM. In particular, this bound leads to a lower limit to the volume symmetry energy parameter S_0. In addition, for assumed values of S_0 above this minimum, this bound implies both upper and lower limits to the symmetry energy slope parameter L, which describes the lowest-order density dependence of the symmetry energy. A lower bound on neutron-matter incompressibility is also obtained. These bounds are found to be consistent with both recent calculations of the energies of PNM and constraints from nuclear experiments. Our results are significant because several equations of state that are currently used in astrophysical simulations of supernovae and neutron star mergers, as well as in nuclear physics simulations of heavy-ion collisions, have symmetry energy parameters that violate these bounds. Furthermore, below the nuclear saturation density, the bound on neutron-matter energies leads to a lower limit to the density-dependent symmetry energy, which leads to upper limits to the nuclear surface symmetry parameter and the neutron-star crust–core boundary. We also obtain a lower limit to the neutron-skin thicknesses of neutron-rich nuclei. Above the nuclear saturation density, the bound on neutron-matter energies also leads to an upper limit to the symmetry energy, with implications for neutron-star cooling via the direct Urca process.

  5. Ground reaction forces and bone parameters in females with tibial stress fracture.

    PubMed

    Bennell, Kim; Crossley, Kay; Jayarajan, Jyotsna; Walton, Elizabeth; Warden, Stuart; Kiss, Z Stephen; Wrigley, Tim

    2004-03-01

    Tibial stress fracture is a common overuse running injury that results from the interplay of repetitive mechanical loading and bone strength. This research project aimed to determine whether female runners with a history of tibial stress fracture (TSF) differ in ground reaction force (GRF) parameters during running, regional bone density, and tibial bone geometry from those who have never sustained a stress fracture (NSF). Thirty-six female running athletes (13 TSF; 23 NSF) ranging in age from 18 to 44 yr were recruited for this cross-sectional study. The groups were well matched for demographic, training, and menstrual parameters. A force platform measured selected GRF parameters (peak and time to peak for vertical impact and active forces, and horizontal braking and propulsive forces) during overground running at 4.0 m·s⁻¹. Lumbar spine, proximal femur, and distal tibial bone mineral density were assessed by dual energy x-ray absorptiometry. Tibial bone geometry (cross-sectional dimensions and areas, and second moments of area) was calculated from a computerized tomography scan at the junction of the middle and distal thirds. There were no significant differences between the groups for any of the GRF, bone density, or tibial bone geometric parameters (P > 0.05). Both TSF and NSF subjects had bone density levels that were average or above average compared with a young adult reference range. Factor analysis followed by discriminant function analysis did not find any combinations of variables that differentiated between TSF and NSF groups. These findings do not support a role for GRF, bone density, or tibial bone geometry in the development of tibial stress fractures, suggesting that other risk factors were more important in this cohort of female runners.

  6. Symmetry Parameter Constraints from a Lower Bound on Neutron-matter Energy

    NASA Astrophysics Data System (ADS)

    Tews, Ingo; Lattimer, James M.; Ohnishi, Akira; Kolomeitsev, Evgeni E.

    2017-10-01

    We propose the existence of a lower bound on the energy of pure neutron matter (PNM) on the basis of unitary-gas considerations. We discuss its justification from experimental studies of cold atoms as well as from theoretical studies of neutron matter. We demonstrate that this bound results in limits to the density-dependent symmetry energy, which is the difference between the energies of symmetric nuclear matter and PNM. In particular, this bound leads to a lower limit to the volume symmetry energy parameter S_0. In addition, for assumed values of S_0 above this minimum, this bound implies both upper and lower limits to the symmetry energy slope parameter L, which describes the lowest-order density dependence of the symmetry energy. A lower bound on neutron-matter incompressibility is also obtained. These bounds are found to be consistent with both recent calculations of the energies of PNM and constraints from nuclear experiments. Our results are significant because several equations of state that are currently used in astrophysical simulations of supernovae and neutron star mergers, as well as in nuclear physics simulations of heavy-ion collisions, have symmetry energy parameters that violate these bounds. Furthermore, below the nuclear saturation density, the bound on neutron-matter energies leads to a lower limit to the density-dependent symmetry energy, which leads to upper limits to the nuclear surface symmetry parameter and the neutron-star crust-core boundary. We also obtain a lower limit to the neutron-skin thicknesses of neutron-rich nuclei. Above the nuclear saturation density, the bound on neutron-matter energies also leads to an upper limit to the symmetry energy, with implications for neutron-star cooling via the direct Urca process.
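
    For context, the parameters S_0 and L referenced in this record are, by the standard convention (an assumption here, since the abstract does not write out the expansion), the leading coefficients of the symmetry-energy expansion about the nuclear saturation density n_0:

```latex
S(n) \simeq S_0 + \frac{L}{3}\,\frac{n-n_0}{n_0}
      + \frac{K_{\mathrm{sym}}}{18}\left(\frac{n-n_0}{n_0}\right)^{2} + \dots,
\qquad
S(n) \equiv \left.\frac{E}{A}\right|_{\mathrm{PNM}}(n)
          - \left.\frac{E}{A}\right|_{\mathrm{SNM}}(n),
```

    so a lower bound on the PNM energy translates directly into bounds on S_0 and L.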

  7. SU-E-T-212: Comparison of TG-43 Dosimetric Parameters of Low and High Energy Brachytherapy Sources Obtained by MCNP Code Versions of 4C, X and 5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zehtabian, M; Zaker, N; Sina, S

    2015-06-15

    Purpose: Different versions of the MCNP code are widely used for dosimetry purposes. The purpose of this study is to compare different versions of the MCNP codes in dosimetric evaluation of different brachytherapy sources. Methods: The TG-43 parameters such as dose rate constant, radial dose function, and anisotropy function of different brachytherapy sources, i.e. Pd-103, I-125, Ir-192, and Cs-137, were calculated in a water phantom. The results obtained by three versions of Monte Carlo codes (MCNP4C, MCNPX, MCNP5) were compared for low and high energy brachytherapy sources. Then the cross section library of the MCNP4C code was changed to ENDF/B-VI release 8, which is used in the MCNP5 and MCNPX codes. Finally, the TG-43 parameters obtained using the MCNP4C-revised code were compared with the other codes. Results: The results of these investigations indicate that for high energy sources, the differences in TG-43 parameters between the codes are less than 1% for Ir-192 and less than 0.5% for Cs-137. However for low energy sources like I-125 and Pd-103, large discrepancies are observed in the g(r) values obtained by MCNP4C and the two other codes. The differences between g(r) values calculated using MCNP4C and MCNP5 at the distance of 6 cm were found to be about 17% and 28% for I-125 and Pd-103 respectively. The results obtained with MCNP4C-revised and MCNPX were similar. However, the maximum difference between the results obtained with the MCNP5 and MCNP4C-revised codes was 2% at 6 cm. Conclusion: The results indicate that using the MCNP4C code for dosimetry of low energy brachytherapy sources can cause large errors in the results. Therefore it is recommended not to use this code for low energy sources, unless its cross section library is changed. Since the results obtained with MCNP4C-revised and MCNPX were similar, it is concluded that the difference between MCNP4C and MCNPX is their cross section libraries.
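
    For reference, the radial dose function g(r) compared in this record is defined in the AAPM TG-43 formalism at the reference point (r_0 = 1 cm, theta_0 = pi/2) as

```latex
g(r) = \frac{\dot{D}(r,\theta_0)}{\dot{D}(r_0,\theta_0)}\,
       \frac{G_L(r_0,\theta_0)}{G_L(r,\theta_0)},
\qquad
\Lambda = \frac{\dot{D}(r_0,\theta_0)}{S_K},
```

    where G_L is the line-source geometry function, Λ the dose rate constant, and S_K the air-kerma strength.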

  8. A Numerical Fit of Analytical to Simulated Density Profiles in Dark Matter Haloes

    NASA Astrophysics Data System (ADS)

    Caimmi, R.; Marmo, C.; Valentinuzzi, T.

    2005-06-01

    Analytical and geometrical properties of generalized power-law (GPL) density profiles are investigated in detail. In particular, a one-to-one correspondence is found between mathematical parameters (a scaling radius, r_0, a scaling density, rho_0, and three exponents, alpha, beta, gamma), and geometrical parameters (the coordinates of the intersection of the asymptotes, x_C, y_C, and three vertical intercepts, b, b_beta, b_gamma, related to the curve and the asymptotes, respectively): (r_0,rho_0,alpha,beta,gamma) <--> (x_C,y_C,b,b_beta,b_gamma). GPL density profiles are then compared with simulated dark halo (SDH) density profiles, and nonlinear least-absolute-values and least-squares fits involving the above-mentioned five parameters (RFSM5 method) are prescribed. More specifically, the sum of absolute values or squares of absolute logarithmic residuals, R_i = log rhoSDH(r_i) - log rhoGPL(r_i), is evaluated on 10^5 points forming a 5-dimensional hypergrid, through a few iterations. The grid size is progressively reduced around a fiducial minimum, and superpositions on nodes of earlier hypergrids are avoided. An application is made to a sample of 17 SDHs on the scale of clusters of galaxies, within a flat LambdaCDM cosmological model (Rasia et al. 2004). In dealing with the mean SDH density profile, a virial radius, rvir, averaged over the whole sample, is assigned, which allows the calculation of the remaining parameters. The RFSM5 method provides a better fit with respect to other methods. The geometrical parameters, averaged over the whole sample of best-fitting GPL density profiles, yield (alpha,beta,gamma) approx (0.6,3.1,1.0), to be compared with (alpha,beta,gamma) = (1,3,1), i.e. the NFW density profile (Navarro et al. 1995, 1996, 1997), (alpha,beta,gamma) = (1.5,3,1.5) (Moore et al. 1998, 1999), (alpha,beta,gamma) = (1,2.5,1) (Rasia et al.
2004); and, in addition, gamma approx 1.5 (Hiotelis 2003), deduced from the application of a RFSM5 method but using a different definition of scaled radius, or concentration; and gamma approx 1.2-1.3, deduced from more recent high-resolution simulations (Diemand et al. 2004, Reed et al. 2005). No evident correlation is found between SDH dynamical state (relaxed or merging) and the asymptotic inner slope of the fitting logarithmic density profile or (for SDHs of comparable virial mass) the scaled radius. Mean values and standard deviations of some parameters are calculated; in particular, the decimal logarithm of the scaled radius, xivir, reads < log xivir > = 0.74 and sigma_s log xivir = 0.15-0.17, consistent with previous results related to NFW density profiles. This provides additional support to the idea that NFW density profiles may be considered a convenient way to parametrize SDH density profiles, without implying that they necessarily produce the best possible fit (Bullock et al. 2001). A certain degree of degeneracy is found in fitting GPL to SDH density profiles. Whether it is intrinsic to the RFSM5 method or could be reduced by the next generation of high-resolution simulations remains an open question.
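
    The five GPL parameters above correspond, in the common Zhao-type parametrization (the abstract does not spell out the functional form, so this is an assumption), to

```latex
\rho_{\mathrm{GPL}}(r) = \frac{\rho_0}
{\left(\dfrac{r}{r_0}\right)^{\gamma}
 \left[\,1+\left(\dfrac{r}{r_0}\right)^{\alpha}\right]^{(\beta-\gamma)/\alpha}},
```

    with (alpha, beta, gamma) = (1, 3, 1) reproducing the NFW profile quoted in the comparison.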

  9. Classification of high-resolution multispectral satellite remote sensing images using extended morphological attribute profiles and independent component analysis

    NASA Astrophysics Data System (ADS)

    Wu, Yu; Zheng, Lijuan; Xie, Donghai; Zhong, Ruofei

    2017-07-01

    In this study, extended morphological attribute profiles (EAPs) and independent component analysis (ICA) were combined for feature extraction from high-resolution multispectral satellite remote sensing images, and the regularized least squares (RLS) approach with a radial basis function (RBF) kernel was applied for classification. Based on the two major independent components, geometrical features were extracted using the EAPs method. Three morphological attributes were calculated and extracted for each independent component: area, standard deviation, and moment of inertia. The extracted geometrical features were then classified using both the RLS approach and the commonly used LIB-SVM library of support vector machines. Worldview-3 and Chinese GF-2 multispectral images were tested, and the results showed that the features extracted by EAPs and ICA can effectively improve the accuracy of high-resolution multispectral image classification: 2% higher than the EAPs and principal component analysis (PCA) method, and 6% higher than APs applied to the original high-resolution multispectral data. Moreover, the results suggest that both the GURLS and LIB-SVM libraries are well suited for multispectral remote sensing image classification. The GURLS library is easy to use, with automatic parameter selection, but its computation time may be longer than that of the LIB-SVM library. This study should be helpful for classification applications of high-resolution multispectral satellite remote sensing images.
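
    The ICA-then-classify pipeline can be sketched with scikit-learn stand-ins (FastICA for the ICA step, an RBF-kernel SVC standing in for LIB-SVM; the data below are synthetic, and the EAP attribute-filtering step is omitted):

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # toy stand-in for pixel spectra: 200 "pixels" x 4 spectral bands
    X = rng.laplace(size=(200, 4))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic 2-class labels

    # step 1: reduce the multispectral bands to two independent components
    ica = FastICA(n_components=2, random_state=0)
    X_ic = ica.fit_transform(X)

    # step 2 (EAPs) would compute morphological attribute profiles per component;
    # here the components feed an RBF-kernel classifier directly
    clf = SVC(kernel="rbf").fit(X_ic, y)
    acc = clf.score(X_ic, y)
    ```

    In the paper's workflow the attribute profiles (area, standard deviation, moment of inertia) would be stacked onto `X_ic` before classification.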

  10. Semi-Automatic Modelling of Building Façades with Shape Grammars Using Historic Building Information Modelling

    NASA Astrophysics Data System (ADS)

    Dore, C.; Murphy, M.

    2013-02-01

    This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed, conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.

  11. Analysis of Total Electron Content and Electron Density Profile during Different Geomagnetic Storms

    NASA Astrophysics Data System (ADS)

    Chapagain, N. P.; Rana, B.; Adhikari, B.

    2017-12-01

    Total electron content (TEC) and electron density are key parameters in the mitigation of ionospheric effects on radio communication systems. A detailed study of TEC and electron density variations during geomagnetic storms, with longitude and latitude, was carried out for four different locations: (13˚N-17˚N, 88˚E-98˚E), (30˚N-50˚N, 120˚W-95˚W), (29˚S-26˚S, 167˚W-163˚W) and (60˚S-45˚S, 120˚W-105˚W), using Gravity Recovery and Climate Experiment (GRACE) satellite observations. To characterize the geomagnetic activity, solar wind parameters such as the north-south component of the interplanetary magnetic field (Bz), the solar wind velocity (Vsw), the flow pressure, and the AE, Dst and Kp indices were obtained from the Operating Missions as Nodes on the Internet (OMNI) web system. The geomagnetic indices were correlated with TEC and electron density for four geomagnetic storm events, on 6 April 2008, 27 March 2008, 4 September 2008, and 11 October 2008. The results illustrate that the observed TEC and electron density profiles vary significantly with longitude and latitude, and that the values of TEC and the vertical electron density profile are influenced by the solar wind parameters associated with solar activity. The peak values of electron density and TEC increase as the geomagnetic storms become stronger. Similarly, the electron density profile varies with altitude, peaking in the altitude range of about 250-350 km depending on the strength of the storm. The results clearly show that the peak electron density shifted to higher altitude (from about 250 km to 350 km) as the geomagnetic disturbances became stronger.

  12. Estimation of the Thickness and Emulsion Rate of Oil Spilled at Sea Using Hyperspectral Remote Sensing Imagery in the SWIR Domain

    NASA Astrophysics Data System (ADS)

    Sicot, G.; Lennon, M.; Miegebielle, V.; Dubucq, D.

    2015-08-01

    The thickness and the emulsion rate of an oil spill are two key parameters for designing a tailored response to an oil discharge. If estimated on a per-pixel basis at high spatial resolution, the oil thickness allows the volume of pollutant to be estimated; that volume is needed to evaluate the magnitude of the pollution and to determine the most suitable recovery means. The estimation of the spatial distribution of thicknesses also allows the guidance of recovery means at sea. The emulsion rate can guide the response strategy for an offshore oil spill: the efficiency of dispersants, for example, is not identical on pure oil and on an emulsion. Moreover, the thickness and emulsion rate together allow the amount of oil that has been discharged to be estimated. The shape of the reflectance spectrum of oil in the SWIR range (1000-2500 nm) varies according to the emulsion rate and the layer thickness. That shape still varies when the oil layer reaches a few millimetres, which is not the case in the visible range (400-700 nm), where the spectral variation saturates around 200 μm (the upper limit of the Bonn Agreement oil appearance code). In that context, hyperspectral imagery in the SWIR range shows high potential for describing and characterizing oil spills. Previous methods for estimating these two parameters are based on the use of a spectral library. In this paper, we present a method based on the inversion of a simple radiative transfer model in the oil layer. We show that the proposed method is robust against another parameter that affects the reflectance spectrum: the size of the water droplets in the emulsion. The method gives relevant results on laboratory measurements, equivalent to those obtained using spectral-library methods. It has the advantage of removing the need for a spectral library and of providing per-pixel maps of thickness and emulsion rate. The maps obtained are not composed of regions of thickness ranges, such as those obtained using discretized levels of measurements in a spectral library, or maps made from visual observations following the Bonn Agreement oil appearance code.

  13. Modelling of backscatter from vegetation layers

    NASA Technical Reports Server (NTRS)

    Van Zyl, J. J.; Engheta, N.; Papas, C. H.; Elachi, C.; Zebker, H.

    1985-01-01

    A simple way to build up a library of models that may be used to distinguish between different types of vegetation and ground surfaces by means of their backscatter properties is presented. The curve of constant power received by the antenna (the Gamma sphere) is calculated for a given Stokes scattering operator, and the model parameters of the most similar library-model Gamma sphere are adopted. Results calculated for a single-scattering model resembling coniferous trees are compared with the Gamma spheres of a model resembling tropical-region trees. The polarization that minimizes the effect of either the ground surface or the vegetation layer can be calculated and used to analyze the backscatter from the combined ground surface and vegetation layer, enhancing the power received from the desired part of the combination.

  14. Method for determining formation quality factor from well log data and its application to seismic reservoir characterization

    DOEpatents

    Walls, Joel; Taner, M. Turhan; Dvorkin, Jack

    2006-08-08

    A method for seismic characterization of subsurface Earth formations includes determining at least one of compressional velocity and shear velocity, and determining reservoir parameters of subsurface Earth formations, at least including density, from data obtained from a wellbore penetrating the formations. A quality factor for the subsurface formations is calculated from the velocity, the density and the water saturation. A synthetic seismogram is calculated from the calculated quality factor and from the velocity and density. The synthetic seismogram is compared to a seismic survey made in the vicinity of the wellbore. At least one parameter is adjusted. The synthetic seismogram is recalculated using the adjusted parameter, and the adjusting, recalculating and comparing are repeated until a difference between the synthetic seismogram and the seismic survey falls below a selected threshold.
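
    The iterate-until-converged loop described in the claim can be sketched generically. The forward model and update rule below are illustrative placeholders, not the patented Q-factor computation:

    ```python
    import numpy as np

    def match_synthetic(param, observed, forward_model, update, tol=1e-3, max_iter=200):
        """Sketch of the patent's loop: compute a synthetic seismogram, compare it
        to the survey, adjust a parameter, and repeat until the misfit drops
        below the threshold."""
        misfit = np.inf
        for _ in range(max_iter):
            synthetic = forward_model(param)
            misfit = np.linalg.norm(synthetic - observed)
            if misfit < tol:
                break
            param = update(param, synthetic, observed)
        return param, misfit

    # toy stand-in: a linear forward model and a gradient-descent update
    t = np.array([0.0, 0.5, 1.0])
    observed = 2.0 * t                                      # "survey" made with param = 2
    forward = lambda p: p * t
    step = lambda p, syn, obs: p - 0.1 * np.sum((syn - obs) * t)
    best, err = match_synthetic(0.0, observed, forward, step)
    ```

    In the patented method the forward model would be the synthetic-seismogram calculation driven by velocity, density and the computed quality factor.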

  15. Quantum Jeffreys prior for displaced squeezed thermal states

    NASA Astrophysics Data System (ADS)

    Kwek, L. C.; Oh, C. H.; Wang, Xiang-Bin

    1999-09-01

    It is known that, by extending the equivalence of the Fisher information matrix to its quantum version, the Bures metric, the quantum Jeffreys prior can be determined from the volume element of the Bures metric. We compute the Bures metric for the displaced squeezed thermal state and analyse the quantum Jeffreys prior and its marginal probability distributions. To normalize the marginal probability density function, it is necessary to provide a range of values of the squeezing parameter or the inverse temperature. We find that if the range of the squeezing parameter is kept narrow, there are significant differences in the marginal probability density functions in terms of the squeezing parameters for the displaced and undisplaced situations. However, these differences disappear as the range increases. Furthermore, marginal probability density functions against temperature are very different in the two cases.

  16. Emergent Topological order from Spin-Orbit Density wave

    NASA Astrophysics Data System (ADS)

    Gupta, Gaurav; Das, Tanmoy

    We study the emergence of a Z2-type topological order arising from a Landau-type symmetry-breaking order parameter. When two Rashba-type SOC bands of different chirality become nested by a magic wavevector [(0, π) or (π, 0)], the nesting introduces an inversion of chirality between different lattice sites. Such a density-wave state is known as a spin-orbit density wave. The resulting quantum order is associated with a topological order classified by a Z2 invariant, so the system can be classified simultaneously by a symmetry-breaking order parameter and the associated Z2 topological invariant. This order parameter can be realized or engineered in two- or quasi-two-dimensional fermionic lattices and quantum wires with tunable RSOC and correlation strength. The work was facilitated by the computer cluster facility at the Department of Physics, Indian Institute of Science.

  17. ωB97X-V: A 10-parameter, range-separated hybrid, generalized gradient approximation density functional with nonlocal correlation, designed by a survival-of-the-fittest strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardirossian, Narbe; Head-Gordon, Martin

    2013-12-18

    A 10-parameter, range-separated hybrid (RSH), generalized gradient approximation (GGA) density functional with nonlocal correlation (VV10) is presented in this paper. Instead of truncating the B97-type power-series inhomogeneity correction factors (ICF) for the exchange, same-spin correlation, and opposite-spin correlation functionals uniformly, all 16 383 combinations of the linear parameters up to fourth order (m = 4) are considered. These functionals are individually fit to a training set and the resulting parameters are validated on a primary test set in order to identify the 3 optimal ICF expansions. Through this procedure, it is discovered that the functional that performs best on the training and primary test sets has 7 linear parameters, with 3 additional nonlinear parameters from range-separation and nonlocal correlation. The resulting density functional, ωB97X-V, is further assessed on a secondary test set, the parallel-displaced coronene dimer, and several geometry datasets. Finally, the basis set dependence and integration grid sensitivity of ωB97X-V are analyzed and documented in order to facilitate the use of the functional.

  18. Large eddy simulation on Rayleigh–Bénard convection of cold water in the neighborhood of the maximum density

    NASA Astrophysics Data System (ADS)

    Huang, Xiao-Jie; Zhang, Li; Hu, Yu-Peng; Li, You-Rong

    2018-06-01

    In order to understand the effects of the Rayleigh number, the density inversion phenomenon and the aspect ratio on the flow patterns and heat transfer characteristics of Rayleigh–Bénard convection of cold water in the neighborhood of the maximum density, a series of large eddy simulations is conducted using the finite volume method. The Rayleigh number ranges between 10^6 and 10^9; the density inversion parameter and the aspect ratio are varied from 0 to 0.9 and from 0.4 to 2.5, respectively. The results indicate that reversal of the large-scale circulation (LSC) occurs as the Rayleigh number increases. When a density inversion phenomenon exists, the key driver for the LSC is hot plumes. When the density inversion parameter is large enough, a stagnant region is found near the top of the container, as the hot plumes cannot reach the top wall. The flow pattern structures depend mainly on the aspect ratio. When the aspect ratio is small, the rolls are vertically stacked and the flow keeps switching among different flow states. For a moderate aspect ratio, different long-lived roll states coexist at a fixed aspect ratio. For a larger aspect ratio, the flow state persists indefinitely. The number of rolls increases with increasing aspect ratio. Furthermore, the aspect ratio has only a slight influence on the time-averaged Nusselt number for all density inversion parameters.

  19. A long-range-corrected density functional that performs well for both ground-state properties and time-dependent density functional theory excitation energies, including charge-transfer excited states.

    PubMed

    Rohrdanz, Mary A; Martins, Katie M; Herbert, John M

    2009-02-07

    We introduce a hybrid density functional that asymptotically incorporates full Hartree-Fock exchange, based on the long-range-corrected exchange-hole model of Henderson et al. [J. Chem. Phys. 128, 194105 (2008)]. The performance of this functional, for ground-state properties and for vertical excitation energies within time-dependent density functional theory, is systematically evaluated, and optimal values are determined for the range-separation parameter, omega, and for the fraction of short-range Hartree-Fock exchange. We denote the new functional as LRC-omegaPBEh, since it reduces to the standard PBEh hybrid functional (also known as PBE0 or PBE1PBE) for a certain choice of its two parameters. Upon optimization of these parameters against a set of ground- and excited-state benchmarks, the LRC-omegaPBEh functional fulfills three important requirements: (i) It outperforms the PBEh hybrid functional for ground-state atomization energies and reaction barrier heights; (ii) it yields statistical errors comparable to PBEh for valence excitation energies in both small and medium-sized molecules; and (iii) its performance for charge-transfer excitations is comparable to its performance for valence excitations. LRC-omegaPBEh, with the parameters determined herein, is the first density functional that satisfies all three criteria. Notably, short-range Hartree-Fock exchange appears to be necessary in order to obtain accurate ground-state properties and vertical excitation energies using the same value of omega.

  20. Mapping the genome of meta-generalized gradient approximation density functionals: The search for B97M-V

    NASA Astrophysics Data System (ADS)

    Mardirossian, Narbe; Head-Gordon, Martin

    2015-02-01

    A meta-generalized gradient approximation density functional paired with the VV10 nonlocal correlation functional is presented. The functional form is selected from more than 10^10 choices carved out of a functional space of almost 10^40 possibilities. Raw data come from training a vast number of candidate functional forms on a comprehensive training set of 1095 data points and testing the resulting fits on a comprehensive primary test set of 1153 data points. Functional forms are ranked based on their ability to reproduce the data in both the training and primary test sets with minimum empiricism, and filtered based on a set of physical constraints and an often-overlooked condition of satisfactory numerical precision with medium-sized integration grids. The resulting optimal functional form has 4 linear exchange parameters, 4 linear same-spin correlation parameters, and 4 linear opposite-spin correlation parameters, for a total of 12 fitted parameters. The final density functional, B97M-V, is further assessed on a secondary test set of 212 data points, applied to several large systems including the coronene dimer and water clusters, tested for the accurate prediction of intramolecular and intermolecular geometries, verified to have a readily attainable basis set limit, and checked for grid sensitivity. Compared to existing density functionals, B97M-V is remarkably accurate for non-bonded interactions and very satisfactory for thermochemical quantities such as atomization energies, but inherits the demonstrable limitations of existing local density functionals for barrier heights.

  1. BADGER v1.0: A Fortran equation of state library

    NASA Astrophysics Data System (ADS)

    Heltemes, T. A.; Moses, G. A.

    2012-12-01

    The BADGER equation of state library was developed to enable inertial confinement fusion plasma codes to model plasmas in the high-density, low-temperature regime more accurately. The code has the capability to calculate 1- and 2-T plasmas using the Thomas-Fermi model and an individual electron accounting model. Ion equation of state data can be calculated using an ideal gas model or via a quotidian equation of state with scaled binding energies. Electron equation of state data can be calculated via the ideal gas model or with an adaptation of the screened hydrogenic model with ℓ-splitting. The ionization and equation of state calculations can be done in local thermodynamic equilibrium or in a non-LTE mode using a variant of the Busquet equivalent-temperature method. The code was written as a stand-alone Fortran library for ease of implementation by external codes. EOS results for aluminum are presented that show good agreement with the SESAME library, and ionization calculations show good agreement with the FLYCHK code.
    Program summary
    Program title: BADGERLIB v1.0
    Catalogue identifier: AEND_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEND_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 41 480
    No. of bytes in distributed program, including test data, etc.: 2 904 451
    Distribution format: tar.gz
    Programming language: Fortran 90
    Computer: 32- or 64-bit PC, or Mac
    Operating system: Windows, Linux, MacOS X
    RAM: 249.496 kB plus 195.630 kB per isotope record in memory
    Classification: 19.1, 19.7
    Nature of problem: Equation of state (EOS) calculations are necessary for the accurate simulation of high energy density plasmas. Historically, most EOS codes used in these simulations have relied on an ideal gas model. This model is inadequate for low-temperature, high-density plasma conditions; the gaseous and liquid phases; and the solid phase. The BADGER code was developed to give more realistic EOS data in these regimes.
    Solution method: BADGER has multiple, user-selectable models to treat the ions, the average-atom ionization state and the electrons. Ion models are the ideal gas and the quotidian equation of state (QEOS); ionization models are Thomas-Fermi and the individual electron accounting method (IEM) formulation of the screened hydrogenic model (SHM) with ℓ-splitting; electron models are the ideal gas and a Helmholtz free energy minimization method derived from the SHM. The default equation of state and ionization models are appropriate for plasmas in local thermodynamic equilibrium (LTE). The code can calculate non-LTE EOS and ionization data using a simplified form of the Busquet equivalent-temperature method.
    Restrictions: Physical data are only provided for elements Z=1 to Z=86. Multiple solid phases are not currently supported. Liquid, gas and plasma phases are combined into a generalized "fluid" phase.
    Unusual features: BADGER divorces the calculation of average-atom ionization from the electron equation of state model, allowing the user to select the ionization and electron EOS models most appropriate to the simulation. The included ion ideal gas model uses ground-state nuclear spin data to differentiate between isotopes of a given element.
    Running time: The example provided takes only a few seconds to run.
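
    As a point of reference for the simplest of BADGER's selectable models, the ideal-gas ion EOS reduces to pressure and energy density linear in temperature. This is the textbook relation in SI units, not BADGER's Fortran interface:

    ```python
    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def ideal_gas_eos(n_ion, temperature):
        """Ideal-gas EOS: pressure [Pa] and internal energy density [J/m^3]
        for ion number density n_ion [m^-3] and temperature [K]."""
        pressure = n_ion * K_B * temperature
        energy_density = 1.5 * n_ion * K_B * temperature
        return pressure, energy_density
    ```

    The inadequacy discussed above is that this relation ignores ionization, degeneracy and binding effects, which dominate in the low-temperature, high-density regime BADGER targets.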

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khomkin, A. L., E-mail: alhomkin@mail.ru; Shumikhin, A. S.

    The conductivity of metal vapors at and near the critical point has been considered. Liquid-metal conductivity originates in this region. The thermodynamic parameters of the critical point, the density of conduction electrons, and the conductivities of various metal vapors have been calculated within a unified approach. It is proposed to consider the conductivity at the critical point, the critical conductivity, as a fourth critical parameter in addition to the density, temperature, and pressure.

  3. Effect of Microstructural Parameters on the Relative Densities of Metal Foams

    NASA Technical Reports Server (NTRS)

    Raj, S. V.; Kerr, Jacob A.

    2010-01-01

    Detailed quantitative microstructural analyses of primarily open-cell FeCrAlY and 314 stainless steel metal foams with different relative densities and pores per inch (p.p.i.) were undertaken in the present investigation to determine the effect of microstructural parameters on the relative densities of metal foams. Several elements of the microstructure were measured, such as longitudinal and transverse cell sizes, cell areas and perimeters, ligament dimensions, cell shapes and volume fractions of closed and open cells. The cross-sections of the foam ligaments showed a large number of shrinkage cavities, and their circularity factors and average sizes were determined. The volume fractions of closed cells increased linearly with increasing relative density. In contrast, the volume fractions of the open cells and ligaments decreased with increasing relative density. The relative densities and p.p.i. were not significantly dependent on cell size, cell perimeter and ligament dimensions within the limits of experimental scatter. A phenomenological model is proposed to rationalize the present microstructural observations.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Constantin, Lucian A.; Fabiano, Eduardo; Della Sala, Fabio

    We introduce a novel non-local ingredient for the construction of exchange density functionals: the reduced Hartree parameter, which is invariant under the uniform scaling of the density and represents the exact exchange enhancement factor for one- and two-electron systems. The reduced Hartree parameter is used together with the conventional meta-generalized gradient approximation (meta-GGA) semilocal ingredients (i.e., the electron density, its gradient, and the kinetic energy density) to construct a new-generation exchange functional, termed u-meta-GGA. This u-meta-GGA functional is exact for the exchange of any one- and two-electron system, is size-consistent and non-empirical, satisfies the uniform density scaling relation, and recovers the modified gradient expansion derived from the semiclassical atom theory. For atoms, ions, jellium spheres, and molecules, it shows good accuracy, often better than meta-GGA exchange functionals. Our construction validates the use of the reduced Hartree ingredient in exchange-correlation functional development, opening the way to an additional rung in the Jacob's ladder classification of non-empirical density functionals.

  5. On the logistic equation subject to uncertainties in the environmental carrying capacity and initial population density

    NASA Astrophysics Data System (ADS)

    Dorini, F. A.; Cecconello, M. S.; Dorini, L. B.

    2016-04-01

    It is recognized that handling uncertainty is essential to obtain more reliable results in modeling and computer simulation. This paper aims to discuss the logistic equation subject to uncertainties in two parameters: the environmental carrying capacity, K, and the initial population density, N0. We first provide the closed-form results for the first probability density function of time-population density, N(t), and its inflection point, t*. We then use the Maximum Entropy Principle to determine both K and N0 density functions, treating such parameters as independent random variables and considering fluctuations of their values for a situation that commonly occurs in practice. Finally, closed-form results for the density functions and statistical moments of N(t), for a fixed t > 0, and of t* are provided, considering the uniform distribution case. We carried out numerical experiments to validate the theoretical results and compared them against that obtained using Monte Carlo simulation.
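
    The closed-form logistic solution and a uniform-distribution Monte Carlo check can be sketched as follows; the growth rate and the ranges chosen for K and N0 below are illustrative, not the paper's values:

    ```python
    import numpy as np

    def logistic(t, r, K, N0):
        """Closed-form solution of dN/dt = r N (1 - N/K)."""
        return K * N0 * np.exp(r * t) / (K + N0 * (np.exp(r * t) - 1.0))

    def inflection_time(r, K, N0):
        """Inflection point t* where N(t*) = K/2 (requires N0 < K/2)."""
        return np.log(K / N0 - 1.0) / r

    rng = np.random.default_rng(42)
    r, t = 1.0, 2.0
    K = rng.uniform(80.0, 120.0, size=100_000)   # uncertain carrying capacity
    N0 = rng.uniform(5.0, 15.0, size=100_000)    # uncertain initial density
    samples = logistic(t, r, K, N0)              # Monte Carlo draw of N(t)
    mean, std = samples.mean(), samples.std()
    ```

    Comparing `mean` and `std` against the paper's closed-form moments is exactly the validation the numerical experiments perform.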

  6. Wavefronts, actions and caustics determined by the probability density of an Airy beam

    NASA Astrophysics Data System (ADS)

    Espíndola-Ramos, Ernesto; Silva-Ortigoza, Gilberto; Sosa-Sánchez, Citlalli Teresa; Julián-Macías, Israel; de Jesús Cabrera-Rosas, Omar; Ortega-Vidals, Paula; Alejandro Juárez-Reyes, Salvador; González-Juárez, Adriana; Silva-Ortigoza, Ramón

    2018-07-01

    The main contribution of the present work is to use the probability density of an Airy beam to identify its maxima with the family of caustics associated with the wavefronts determined by the level curves of a one-parameter family of solutions to the Hamilton–Jacobi equation with a given potential. To this end, we give a classical-mechanics characterization of a solution of the one-dimensional Schrödinger equation in free space determined by a complete integral of the Hamilton–Jacobi and Laplace equations in free space. That is, with this type of solution, we associate a two-parameter family of wavefronts in spacetime, which are the level curves of a one-parameter family of solutions to the Hamilton–Jacobi equation with a determined potential, and a one-parameter family of caustics. The general results are applied to an Airy beam to show that the maxima of its probability density provide a discrete set of caustics, wavefronts and potentials. The results presented here are a natural generalization of those obtained by Berry and Balazs in 1979 for an Airy beam. Finally, we remark that, in a natural manner, each maximum of the probability density of an Airy beam determines a Hamiltonian system.

  7. Optimization of laser-plasma injector via beam loading effects using ionization-induced injection

    NASA Astrophysics Data System (ADS)

    Lee, P.; Maynard, G.; Audet, T. L.; Cros, B.; Lehe, R.; Vay, J.-L.

    2018-05-01

    Simulations of ionization-induced injection in a laser-driven plasma wakefield show that high-quality electron injectors in the 50-200 MeV range can be achieved in a gas cell with a tailored density profile. Using the PIC code Warp with parameters close to existing experimental conditions, we show that the concentration of N2 in a hydrogen plasma with a tailored density profile is an efficient parameter for tuning electron beam properties through the control of the interplay between beam loading effects and the varying accelerating field in the density profile. For a given laser plasma configuration, with moderate normalized laser amplitude a_0 = 1.6 and maximum electron plasma density n_e0 = 4 × 10^18 cm^-3, the optimum concentration yields a robust configuration generating electrons at 150 MeV with an rms energy spread of 4% and a spectral charge density of 1.8 pC/MeV.

  8. A new numerical benchmark of a freshwater lens

    NASA Astrophysics Data System (ADS)

    Stoeckl, L.; Walther, M.; Graf, T.

    2016-04-01

    A numerical benchmark for 2-D variable-density flow and solute transport in a freshwater lens is presented. The benchmark is based on results of laboratory experiments conducted by Stoeckl and Houben (2012) using a sand tank on the meter scale. The benchmark describes the formation and degradation over time of a freshwater lens of the kind found under real-world islands. An error analysis gave appropriate spatial and temporal discretizations of 1 mm and 8.64 s, respectively. The calibrated parameter set was obtained using the parameter estimation tool PEST. Comparing density-coupled and density-uncoupled results showed that the freshwater-saltwater interface position depends strongly on density differences. A benchmark that adequately represents saltwater intrusion and includes realistic features of coastal aquifers or freshwater lenses had been lacking; this new benchmark was thus developed, and it is demonstrated to be suitable for testing variable-density groundwater models applied to saltwater intrusion investigations.

  9. Carrier-density dependence of photoluminescence from localized states in InGaN/GaN quantum wells in nanocolumns and a thin film

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shimosako, N., E-mail: n-shimosako@sophia.jp; Inose, Y.; Satoh, H.

    2015-11-07

    We have measured and analyzed the carrier-density dependence of photoluminescence (PL) spectra and the PL efficiency of InGaN/GaN multiple quantum wells in nanocolumns and in a thin film over a wide excitation range. The localized-state parameters, such as the tailing parameter, the density and size of the localized states, and the mobility-edge density, are estimated. The spectral change and the reduction of PL efficiency are explained by filling of the localized states and population of the extended states around the mobility-edge density. We have also found that the nanocolumns have a narrower distribution of localized states and a higher PL efficiency than the film sample, although the In composition of the nanocolumns is higher than that of the film.

  10. Correlation of the tokamak H-mode density limit with ballooning stability at the separatrix

    NASA Astrophysics Data System (ADS)

    Eich, T.; Goldston, R. J.; Kallenbach, A.; Sieglin, B.; Sun, H. J.; ASDEX Upgrade Team; Contributors, JET

    2018-03-01

    We show for JET and ASDEX Upgrade, based on Thomson-scattering measurements, a clear correlation of the density limit of the tokamak H-mode high-confinement regime with the approach to the ideal ballooning instability threshold at the periphery of the plasma. It is shown that the MHD ballooning parameter at the separatrix position, α_sep, increases approximately linearly with the separatrix density normalized to the Greenwald density, n_e,sep/n_GW, for a wide range of discharge parameters in both devices. The observed operational space is found to reach at maximum n_e,sep/n_GW ≈ 0.4-0.5 at values of α_sep ≈ 2-2.5, in the range of theoretical predictions for ballooning instability. This work supports the hypothesis that the H-mode density limit may be set by ballooning stability at the separatrix.
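    The Greenwald density referred to above has a standard closed form, n_GW = I_p / (π a²), in units of 10^20 m^-3 with plasma current I_p in MA and minor radius a in m. The sketch below evaluates it for illustrative, JET-like parameter values chosen here for demonstration, not taken from the paper:

    ```python
    import math

    def greenwald_density(ip_ma, minor_radius_m):
        """Greenwald density limit n_GW = I_p / (pi * a^2),
        in units of 10^20 m^-3 for I_p in MA and a in m."""
        return ip_ma / (math.pi * minor_radius_m ** 2)

    # Hypothetical JET-like parameters (illustrative only):
    n_gw = greenwald_density(ip_ma=2.5, minor_radius_m=0.95)

    # A separatrix density of, say, 0.35e20 m^-3 then sits at this
    # fraction of the Greenwald limit:
    ratio = 0.35 / n_gw
    ```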

  11. Effects of Differing Energy Dependences in Three Level-Density Models on Calculated Cross Sections

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, C.Y.

    2000-07-15

    Three level-density formalisms commonly used for cross-section calculations are examined. Residual nuclides in neutron interaction with {sup 58}Ni are chosen to quantify the well-known differences in the energy dependences of the three formalisms. Level-density parameters for the Gilbert and Cameron model are determined from experimental information. Parameters for the back-shifted Fermi-gas and generalized superfluid models are obtained by fitting their level densities at two selected energies for each nuclide to those of the Gilbert and Cameron model, forcing the level densities of the three models to be as close as physically allowed. The remaining differences are in their energy dependences that, it is shown, can change the calculated cross sections and particle emission spectra significantly, in some cases or energy ranges by a factor of 2.
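    As a rough illustration of the fitting procedure described above, the sketch below implements the back-shifted Fermi-gas (BSFG) level density in a common textbook form, ρ(E) = exp(2√(aU)) / (12√2 σ a^{1/4} U^{5/4}) with U = E − Δ, and recovers the parameter a by bisection so that the density matches a target value at one energy. The parameter values are hypothetical, and the paper's actual procedure constrains the parameters at two energies per nuclide:

    ```python
    import math

    def bsfg_level_density(E, a, delta, sigma):
        """Back-shifted Fermi-gas level density (MeV^-1).

        E: excitation energy (MeV), a: level-density parameter (MeV^-1),
        delta: back-shift (MeV), sigma: spin-cutoff parameter.
        """
        U = E - delta
        if U <= 0:
            raise ValueError("effective excitation energy must be positive")
        return math.exp(2.0 * math.sqrt(a * U)) / (
            12.0 * math.sqrt(2.0) * sigma * a ** 0.25 * U ** 1.25)

    def fit_a_at_energy(E, rho_target, delta, sigma, lo=0.5, hi=30.0):
        """Solve for a so the BSFG density matches rho_target at energy E.

        Bisection works because the density is monotone in a over
        this bracket for U > 0.
        """
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if bsfg_level_density(E, mid, delta, sigma) < rho_target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Recover a known parameter: generate a "target" density with a = 6.0,
    # then fit a back from that single matching point.
    rho_ref = bsfg_level_density(8.0, 6.0, 1.0, 3.0)
    a_fit = fit_a_at_energy(8.0, rho_ref, delta=1.0, sigma=3.0)
    ```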

  12. Central depression in nucleonic densities: Trend analysis in the nuclear density functional theory approach

    NASA Astrophysics Data System (ADS)

    Schuetrumpf, B.; Nazarewicz, W.; Reinhard, P.-G.

    2017-08-01

    Background: The central depression of nucleonic density, i.e., a reduction of density in the nuclear interior, has been attributed to many factors. For instance, bubble structures in superheavy nuclei are believed to be due to the electrostatic repulsion. In light nuclei, the mechanism behind the density reduction in the interior has been discussed in terms of shell effects associated with occupations of s orbits. Purpose: The main objective of this work is to reveal mechanisms behind the formation of central depression in nucleonic densities in light and heavy nuclei. To this end, we introduce several measures of the internal nucleonic density. Through the statistical analysis, we study the information content of these measures with respect to nuclear matter properties. Method: We apply nuclear density functional theory with Skyrme functionals. Using the statistical tools of linear least square regression, we inspect correlations between various measures of central depression and model parameters, including nuclear matter properties. We study bivariate correlations with selected quantities as well as multiple correlations with groups of parameters. Detailed correlation analysis is carried out for 34Si for which a bubble structure has been reported recently, 48Ca, and N =82 , 126, and 184 isotonic chains. Results: We show that the central depression in medium-mass nuclei is very sensitive to shell effects, whereas for superheavy systems it is firmly driven by the electrostatic repulsion. An appreciable semibubble structure in proton density is predicted for 294Og, which is currently the heaviest nucleus known experimentally. Conclusion: Our correlation analysis reveals that the central density indicators in nuclei below 208Pb carry little information on parameters of nuclear matter; they are predominantly driven by shell structure. 
On the other hand, in the superheavy nuclei there exists a clear relationship between the central nucleonic density and symmetry energy.

  13. Influence of the initial parameters of the magnetic field and plasma on the spatial structure of the electric current and electron density in current sheets formed in helium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrovskaya, G. V., E-mail: galya-ostr@mail.ru; Markov, V. S.; Frank, A. G., E-mail: annfrank@fpl.gpi.ru

    The influence of the initial parameters of the magnetic field and plasma on the spatial structure of the electric current and electron density in current sheets formed in helium plasma in 2D and 3D magnetic configurations with X-type singular lines is studied by the methods of holographic interferometry and magnetic measurements. Significant differences in the structures of plasma and current sheets formed at close parameters of the initial plasma and similar configurations of the initial magnetic fields are revealed.

  14. Control over dark current densities and cutoff wavelengths of GaAs/AlGaAs QWIP grown by multi-wafer MBE reactor

    NASA Astrophysics Data System (ADS)

    Roodenko, K.; Choi, K. K.; Clark, K. P.; Fraser, E. D.; Vargason, K. W.; Kuo, J.-M.; Kao, Y.-C.; Pinsukanjana, P. R.

    2016-09-01

    Performance parameters of quantum well infrared photodetector (QWIP) devices, such as the detector cutoff wavelength and the dark current density, depend strongly on the quality and control of the epitaxial material growth. In this work, we report on a methodology to precisely control these critical material parameters for long-wavelength infrared (LWIR) GaAs/AlGaAs QWIP epi-wafers grown in a multi-wafer production molecular beam epitaxy (MBE) reactor. Critical growth parameters such as quantum well (QW) thickness, AlGaAs composition, and QW doping level are discussed.

  15. An Analytical Planning Model to Estimate the Optimal Density of Charging Stations for Electric Vehicles.

    PubMed

    Ahn, Yongjun; Yeo, Hwasoo

    2015-01-01

    The charging infrastructure location problem is becoming more significant due to the extensive adoption of electric vehicles. Efficient charging station planning can solve deeply rooted problems, such as driving-range anxiety and stagnation in the number of new electric vehicle consumers. In the initial stage of introducing electric vehicles, the allocation of charging stations is difficult to determine due to the uncertainty of candidate sites and unidentified charging demands, which are determined by diverse variables. This paper introduces the Estimating the Required Density of EV Charging (ERDEC) stations model, an analytical approach to estimating the optimal density of charging stations for certain urban areas, which are subsequently aggregated to city-level planning. The optimal charging-station density is derived to minimize the total cost. A numerical study is conducted to obtain the correlations among the various parameters in the proposed model, such as regional parameters, technological parameters, and coefficient factors. To investigate the effect of technological advances, the corresponding changes in the optimal density and total cost are also examined under various combinations of technological parameters. Daejeon city in South Korea is selected for the case study to examine the applicability of the model to real-world problems. With real taxi trajectory data, the optimal density map of charging stations is generated. These results can provide the optimal number of chargers for driving without range anxiety. In the initial planning phase of installing charging infrastructure, the proposed model can be applied to a relatively extensive area to encourage the usage of electric vehicles, especially in areas that lack information, such as exact candidate sites for charging stations and other data related to electric vehicles.
The methods and results of this paper can serve as a planning guideline to facilitate the extensive adoption of electric vehicles.
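    The abstract does not give the model's cost function, but the trade-off it describes (station installation cost rising with density, driver access cost falling with density) can be illustrated with a generic per-area model. Everything in this sketch, including the cost terms and coefficients, is a hypothetical stand-in, not the ERDEC formulation:

    ```python
    import math

    def total_cost(d, c_station, c_access, q):
        """Hypothetical per-area cost: installing stations at density d costs
        c_station * d, while drivers' access cost for charging demand q
        shrinks with density roughly as c_access * q / sqrt(d)."""
        return c_station * d + c_access * q / math.sqrt(d)

    def optimal_density(c_station, c_access, q):
        # Setting dC/dd = c_station - 0.5 * c_access * q * d**-1.5 = 0
        # gives the analytic minimizer below.
        return (c_access * q / (2.0 * c_station)) ** (2.0 / 3.0)

    d_star = optimal_density(c_station=100.0, c_access=5.0, q=400.0)

    # Cross-check the analytic optimum against a brute-force grid search.
    grid = [0.01 * k for k in range(1, 5000)]
    d_grid = min(grid, key=lambda d: total_cost(d, 100.0, 5.0, 400.0))
    ```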

  16. Off-premise alcohol outlet characteristics and violence.

    PubMed

    Snowden, Aleksandra J; Pridemore, William Alex

    2014-07-01

    There is considerable evidence of an association between alcohol outlet density and violence. Although prior research reveals the importance of specific characteristics of bars to this association, and suggests that the relationship between bar density and violence may be moderated by these characteristics, there are few similar studies of the characteristics of off-premise outlets (e.g., liquor and convenience stores). We examined whether the immediate environment, business practice, staff, and patron characteristics of off-premise alcohol outlets are associated with simple and aggravated assault density. We used a cross-sectional design with aggregate data from 65 census block groups in a non-metropolitan college town, systematic social observation, and spatial modeling techniques. We found limited effects of immediate environment, business practice, staff, and patron characteristics on simple assault density and no effect on aggravated assault density. Only two of 17 characteristics were associated with simple assault density (a nearby library and male patrons). This is the first study to examine the association between several off-premise alcohol outlet characteristics and assault. Our findings suggest that where off-premise outlets are located, how well the immediate environment is maintained, what types of beverages the outlets sell, who visits them, and who works there matter little in their association with violence. This suggests the importance of outlet density itself as a primary driver of any association with violence. Public policies aimed at reducing alcohol outlet density or clustering may be useful for reducing violence.

  17. Library based x-ray scatter correction for dedicated cone beam breast CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Linxi; Zhu, Lei, E-mail: leizhu@gatech.edu

    Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the GEANT4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. 
    On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in sagittal views. Conclusions: The library-based scatter correction does not require increase in radiation dose or hardware modifications, and it improves over the existing methods on implementation simplicity and computational efficiency. As demonstrated through patient studies, the authors’ approach is effective and stable, and is therefore clinically attractive for CBBCT imaging.
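    The library lookup-and-subtract step described in the Methods can be sketched schematically. The diameters, scatter profiles, and 1-D projection row below are toy values invented for illustration; the actual method operates on full 2-D projections with spatial translation of the selected library entry:

    ```python
    # Hypothetical scatter library keyed by breast diameter (cm); each entry
    # is a precomputed scatter profile on the projection grid (toy 1-D row).
    scatter_library = {
        10.0: [4.0, 6.0, 6.0, 4.0],
        12.0: [5.0, 8.0, 8.0, 5.0],
        14.0: [7.0, 11.0, 11.0, 7.0],
    }

    def correct_projection(measured, diameter_cm):
        """Pick the library entry with the nearest diameter and subtract it,
        clamping at zero so the corrected projection stays non-negative."""
        nearest = min(scatter_library, key=lambda d: abs(d - diameter_cm))
        scatter = scatter_library[nearest]
        return [max(m - s, 0.0) for m, s in zip(measured, scatter)]

    # Breast diameter estimated from a first-pass reconstruction (toy value);
    # 11.2 cm selects the 12.0 cm library entry.
    corrected = correct_projection([20.0, 30.0, 29.0, 21.0], diameter_cm=11.2)
    ```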

  18. Library based x-ray scatter correction for dedicated cone beam breast CT

    PubMed Central

    Shi, Linxi; Karellas, Andrew; Zhu, Lei

    2016-01-01

    Purpose: The image quality of dedicated cone beam breast CT (CBBCT) is limited by substantial scatter contamination, resulting in cupping artifacts and contrast-loss in reconstructed images. Such effects obscure the visibility of soft-tissue lesions and calcifications, which hinders breast cancer detection and diagnosis. In this work, we propose a library-based software approach to suppress scatter on CBBCT images with high efficiency, accuracy, and reliability. Methods: The authors precompute a scatter library on simplified breast models with different sizes using the geant4-based Monte Carlo (MC) toolkit. The breast is approximated as a semiellipsoid with homogeneous glandular/adipose tissue mixture. For scatter correction on real clinical data, the authors estimate the breast size from a first-pass breast CT reconstruction and then select the corresponding scatter distribution from the library. The selected scatter distribution from simplified breast models is spatially translated to match the projection data from the clinical scan and is subtracted from the measured projection for effective scatter correction. The method performance was evaluated using 15 sets of patient data, with a wide range of breast sizes representing about 95% of general population. Spatial nonuniformity (SNU) and contrast to signal deviation ratio (CDR) were used as metrics for evaluation. Results: Since the time-consuming MC simulation for library generation is precomputed, the authors’ method efficiently corrects for scatter with minimal processing time. Furthermore, the authors find that a scatter library on a simple breast model with only one input parameter, i.e., the breast diameter, sufficiently guarantees improvements in SNU and CDR. For the 15 clinical datasets, the authors’ method reduces the average SNU from 7.14% to 2.47% in coronal views and from 10.14% to 3.02% in sagittal views. 
On average, the CDR is improved by a factor of 1.49 in coronal views and 2.12 in sagittal views. Conclusions: The library-based scatter correction does not require increase in radiation dose or hardware modifications, and it improves over the existing methods on implementation simplicity and computational efficiency. As demonstrated through patient studies, the authors’ approach is effective and stable, and is therefore clinically attractive for CBBCT imaging. PMID:27487870

  19. Universal functions of nuclear proximity potential for Skyrme nucleus-nucleus interaction in a semiclassical approach

    NASA Astrophysics Data System (ADS)

    Gupta, Raj K.; Singh, Dalip; Kumar, Raj; Greiner, Walter

    2009-07-01

    The universal function of the nuclear proximity potential is obtained for the Skyrme nucleus-nucleus interaction in the semiclassical extended Thomas-Fermi (ETF) approach. It is obtained as a sum of the spin-orbit-density-independent and spin-orbit-density-dependent parts of the Hamiltonian density, since the two terms behave differently: the spin-orbit-density-independent part is mainly attractive and the spin-orbit-density-dependent part mainly repulsive. The semiclassical expansions of the kinetic energy density and spin-orbit density are carried to second order, and the two-parameter Fermi density, with its parameters fitted to experiments, is used for the nuclear density. The universal functions of the resulting nuclear proximity potential reproduce the 'exact' Skyrme nucleus-nucleus interaction potential in the semiclassical approach to within about 1 MeV, both at the maximum attraction and in the surface region. An application of the resulting interaction potential to fusion excitation functions shows clearly that the parameterized universal functions of the nuclear proximity potential can fully substitute for the 'exact' potential in the Skyrme energy density formalism based on the semiclassical ETF method, including the modification of interaction barriers at sub-barrier energies through changes to the constants of the universal functions.
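    The two-parameter Fermi density mentioned above has the standard form ρ(r) = ρ0 / (1 + exp((r − R)/a)) with half-density radius R = r0 A^{1/3}. The sketch below normalizes ρ0 so the density integrates to A nucleons; the constants r0 and a are typical textbook values, not the fitted parameters used in the paper:

    ```python
    import math

    def fermi_density_profile(A, r0=1.12, a=0.54, n=4000):
        """Two-parameter Fermi density rho(r) = rho0 / (1 + exp((r - R)/a)),
        with R = r0 * A**(1/3) in fm; rho0 (fm^-3) is fixed by requiring
        4*pi * integral of rho(r) r^2 dr = A nucleons."""
        R = r0 * A ** (1.0 / 3.0)
        r_max = R + 10.0 * a  # density is negligible beyond this radius
        dr = r_max / n
        # Trapezoidal integral of the unnormalized shape 4*pi*r^2*f(r).
        shape = [4.0 * math.pi * (k * dr) ** 2
                 / (1.0 + math.exp((k * dr - R) / a))
                 for k in range(n + 1)]
        integral = dr * (sum(shape) - 0.5 * (shape[0] + shape[-1]))
        rho0 = A / integral
        return rho0, R

    rho0, R = fermi_density_profile(208)
    ```

    By construction the density at r = R is exactly ρ0/2, and for heavy nuclei the normalization yields a central density near the familiar saturation value of roughly 0.16 fm^-3.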

  20. Study on the effect of hydrogen addition on the variation of plasma parameters of argon-oxygen magnetron glow discharge for synthesis of TiO{sub 2} films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saikia, Partha, E-mail: partha.008@gmail.com; Institute of Physics, Pontificia Universidad Católica de Chile, Av. Vicuña Mackenna 4860, Santiago; Saikia, Bipul Kumar

    2016-04-15

    We report the effect of hydrogen addition on the plasma parameters of an argon-oxygen magnetron glow discharge used in the synthesis of H-doped TiO{sub 2} films. The parameters of the hydrogen-added Ar/O{sub 2} plasma influence the properties and the structural phases of the deposited TiO{sub 2} film. Therefore, the variation of plasma parameters such as electron temperature (T{sub e}), electron density (n{sub e}), ion density (n{sub i}), degree of ionization of Ar, and degree of dissociation of H{sub 2} as a function of hydrogen content in the discharge is studied. A Langmuir probe and optical emission spectroscopy are used to characterize the plasma. On the basis of the different reactions in the gas phase of the magnetron discharge, the variation of plasma parameters and sputtering rate is explained. It is observed that the electron and heavy-ion densities decline with gradual addition of hydrogen to the discharge. Hydrogen addition significantly changes the degree of ionization of Ar, which influences the structural phases of the TiO{sub 2} film.
