Science.gov

Sample records for mcnp-based depletion codes

  1. SMITHERS: An object-oriented modular mapping methodology for MCNP-based neutronic–thermal hydraulic multiphysics

    SciTech Connect

    Richard, Joshua; Galloway, Jack; Fensin, Michael; Trellue, Holly

    2015-04-04

    A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.
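
    The abstract describes the mapping methodology without implementation detail. As a purely illustrative sketch of the core idea, redistributing cell-wise power tallies from the depletion solver's combinatorial geometry onto coarser thermal-hydraulic channels, the Python fragment below uses hypothetical names (map_powers_to_channels, cell_to_channel) that are not part of the actual SMITHERS code.

    ```python
    # Illustrative sketch only: redistributes cell-wise power tallies onto
    # thermal-hydraulic channels, assuming a user-supplied cell-to-channel map.
    from collections import defaultdict

    def map_powers_to_channels(cell_powers, cell_to_channel):
        """cell_powers: {cell_id: power [W]} from the depletion solver's tallies.
        cell_to_channel: {cell_id: channel_id} supplied by the user/geometry scan.
        Returns {channel_id: total power [W]} for the thermal-hydraulic solver."""
        channel_powers = defaultdict(float)
        for cell, power in cell_powers.items():
            channel_powers[cell_to_channel[cell]] += power
        return dict(channel_powers)

    if __name__ == "__main__":
        # Two fuel cells feeding channel "A", one feeding channel "B".
        cell_powers = {10: 1.2e4, 11: 1.1e4, 20: 0.9e4}
        cell_to_channel = {10: "A", 11: "A", 20: "B"}
        print(map_powers_to_channels(cell_powers, cell_to_channel))
    ```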

  2. SMITHERS: An object-oriented modular mapping methodology for MCNP-based neutronic–thermal hydraulic multiphysics

    DOE PAGES

    Richard, Joshua; Galloway, Jack; Fensin, Michael; ...

    2015-04-04

    A novel object-oriented modular mapping methodology for externally coupled neutronics–thermal hydraulics multiphysics simulations was developed. The Simulator using MCNP with Integrated Thermal-Hydraulics for Exploratory Reactor Studies (SMITHERS) code performs on-the-fly mapping of material-wise power distribution tallies implemented by MCNP-based neutron transport/depletion solvers for use in estimating coolant temperature and density distributions with a separate thermal-hydraulic solver. The key development of SMITHERS is that it reconstructs the hierarchical geometry structure of the material-wise power generation tallies from the depletion solver automatically, with only a modicum of additional information required from the user. In addition, it performs the basis mapping from the combinatorial geometry of the depletion solver to the required geometry of the thermal-hydraulic solver in a generalizable manner, such that it can transparently accommodate varying levels of thermal-hydraulic solver geometric fidelity, from the nodal geometry of multi-channel analysis solvers to the pin-cell level of discretization for sub-channel analysis solvers.

  3. ALEPH2 - A general purpose Monte Carlo depletion code

    SciTech Connect

    Stankovskiy, A.; Van Den Eynde, G.; Baeten, P.; Trakas, C.; Demy, P. M.; Villatte, L.

    2012-07-01

    The Monte Carlo burn-up code ALEPH has been under development at SCK-CEN since 2004. A previous version of the code implemented the coupling between Monte Carlo transport (any version of MCNP or MCNPX) and the 'deterministic' depletion code ORIGEN-2.2, but had important deficiencies in nuclear data treatment and limitations inherent to ORIGEN-2.2. A new version of the code, ALEPH2, has several unique features that make it stand out among other depletion codes. The most important feature is full data consistency between the steady-state Monte Carlo and the time-dependent depletion calculations. The latest-generation general-purpose nuclear data libraries (JEFF-3.1.1, ENDF/B-VII and JENDL-4) are fully implemented, including special-purpose activation, spontaneous fission, fission product yield and radioactive decay data. The built-in depletion algorithm eliminates the uncertainties associated with obtaining the time-dependent nuclide concentrations. A predictor-corrector mechanism and calculations of nuclear heating, decay heat and decay neutron sources are available as well. The code has been validated against the results of the REBUS experimental program. ALEPH2 has shown better agreement with the measured data than other depletion codes. (authors)
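
    The predictor-corrector mechanism mentioned above is a standard burnup-stepping device; the minimal sketch below shows its bare structure. The functions transport_flux and deplete are placeholders standing in for the Monte Carlo and depletion solvers, not ALEPH2 routines.

    ```python
    # Minimal predictor-corrector depletion step (illustrative, not ALEPH2 code).
    def predictor_corrector_step(n0, dt, transport_flux, deplete):
        """n0: nuclide concentrations at start of step.
        transport_flux(n): runs a transport calculation and returns flux/spectrum.
        deplete(n, flux, dt): integrates the depletion equations over dt."""
        flux_0 = transport_flux(n0)          # flux at beginning of step
        n_pred = deplete(n0, flux_0, dt)     # predictor: deplete with initial flux
        flux_1 = transport_flux(n_pred)      # flux re-evaluated at end of step
        n_corr = deplete(n0, flux_1, dt)     # corrector: deplete with end-of-step flux
        # average predictor and corrector end-of-step compositions
        return {iso: 0.5 * (n_pred[iso] + n_corr[iso]) for iso in n0}

    if __name__ == "__main__":
        # Trivial stand-ins: constant flux, linear burn of a single nuclide.
        demo = predictor_corrector_step(
            {"U235": 1.0}, 10.0,
            transport_flux=lambda n: 1.0,
            deplete=lambda n, flux, dt: {k: v * (1.0 - 0.01 * flux * dt)
                                         for k, v in n.items()},
        )
        print(demo)
    ```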

  4. The New MCNP6 Depletion Capability

    SciTech Connect

    Fensin, Michael Lorne; James, Michael R.; Hendricks, John S.; Goorley, John T.

    2012-06-19

    The first MCNP-based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial-geometry-based, continuous-energy Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between the two codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capabilities of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code, from the official RSICC release MCNPX 2.6.0, reported previously, to the current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) a new performance-enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability and chart a path forward for future development to improve the usefulness of the technology.

  5. The new MCNP6 depletion capability

    SciTech Connect

    Fensin, M. L.; James, M. R.; Hendricks, J. S.; Goorley, J. T.

    2012-07-01

    The first MCNP-based in-line Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. Both the MCNP5 and MCNPX codes have historically provided a successful combinatorial-geometry-based, continuous-energy Monte Carlo radiation transport solution for advanced reactor modeling and simulation. However, due to separate development pathways, useful simulation capabilities were dispersed between the two codes and not unified in a single technology. MCNP6, the next evolution in the MCNP suite of codes, now combines the capabilities of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. We describe here the new capabilities of the MCNP6 depletion code, from the official RSICC release MCNPX 2.6.0, reported previously, to the current state of MCNP6. NEA/OECD benchmark results are also reported. The MCNP6 depletion capability enhancements beyond MCNPX 2.6.0 reported here include: (1) a new performance-enhancing parallel architecture that implements both shared and distributed memory constructs; (2) enhanced memory management that maximizes calculation fidelity; and (3) improved burnup physics for better nuclide prediction. MCNP6 depletion enables complete, relatively easy-to-use depletion calculations in a single Monte Carlo code. The enhancements described here help provide a powerful capability and chart a path forward for future development to improve the usefulness of the technology. (authors)

  6. ORIGEN2: a revised and updated version of the Oak Ridge isotope generation and depletion code

    SciTech Connect

    Croff, A.G.

    1980-07-01

    ORIGEN2 is a versatile point depletion and decay computer code for use in simulating nuclear fuel cycles and calculating the nuclide compositions of materials contained therein. It is a revision and update of the original ORIGEN computer code, which has been distributed worldwide since the early 1970s. This report gives a summary description of the revised and updated version, designated ORIGEN2, followed by a detailed description of the code. The methods used by ORIGEN2 to solve the nuclear depletion and decay equations are included, and input information necessary to use ORIGEN2 that has not been documented in supporting reports is documented here.
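
    For context, the point depletion and decay equations that codes of this family solve can be written in the generic form below (generic notation, not necessarily ORIGEN2's own); ORIGEN-type codes solve the resulting linear system with a matrix exponential method.

    ```latex
    \frac{dN_i}{dt} = \sum_{j} \ell_{ij}\,\lambda_j N_j
                    + \Phi \sum_{k} f_{ik}\,\sigma_k N_k
                    - \left(\lambda_i + \Phi\,\sigma_i\right) N_i,
    \qquad
    \mathbf{N}(t) = e^{\mathbf{A}t}\,\mathbf{N}(0)
    ```

    Here N_i is the atom density of nuclide i, λ the decay constants, σ the spectrum-averaged one-group cross sections, Φ the flux, and ℓ_ij, f_ik the decay and reaction branching fractions into nuclide i.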

  7. DANDE: a linked code system for core neutronics/depletion analysis

    SciTech Connect

    LaBauve, R.J.; England, T.R.; George, D.C.; MacFarlane, R.E.; Wilson, W.B.

    1985-06-01

    This report describes DANDE, a modular neutronics/depletion code system for reactor analysis. It consists of nuclear data processing, core physics, and fuel depletion modules, and allows diffusion and transport methods to be used interchangeably in core neutronics calculations. This latter capability is especially important in the design of small modular cores. Additional unique features include the capability of updating the nuclear data file during a calculation; a detailed treatment of the depletion of burnable poisons as well as fuel; and the ability to make geometric changes, such as control rod repositioning and fuel relocation, in the course of a calculation. The detailed treatment of reactor fuel burnup, fission-product creation and decay, and inventories of higher-order actinides is a necessity when predicting the behavior of reactor fuel under increased burnup conditions. The operation of the code system is illustrated in this report by a sample problem.

  8. Validation of depletion codes for burnup credit evaluation of LWR assemblies

    SciTech Connect

    Ranta-aho, A.

    2006-07-01

    This paper reports the comparison of CASMO-4E predictions with radiochemical assay data from assemblies irradiated in the Takahama-3 PWR and the Fukushima-Daini-2 BWR, and with the most recently reported spent fuel data from a VVER-440 assembly irradiated in Novovoronezh-4. Some of the calculations were repeated with the ABURN burnup code, which couples the MCNP4C Monte Carlo code with the ORIGEN2 depletion code. The cross-section libraries applied were based on ENDF/B-VI and JEF-2.2 data. (authors)

  9. Improvements of MCOR: A Monte Carlo depletion code system for fuel assembly reference calculations

    SciTech Connect

    Tippayakul, C.; Ivanov, K.; Misu, S.

    2006-07-01

    This paper presents improvements to MCOR, a Monte Carlo depletion code system for fuel assembly reference calculations. The improvements were initiated through a cooperation between Penn State Univ. and AREVA NP to enhance the original Penn State Univ. MCOR version for use as a new Monte Carlo depletion analysis tool. Essentially, a new depletion module based on KORIGEN replaces the existing ORIGEN-S depletion module in MCOR. Furthermore, the improved version generates burnup cross sections online from the Monte Carlo calculation instead of using a burnup cross-section library pre-generated by a transport code. Other features have also been added to make the new MCOR version easier to use. In addition, this paper presents comparisons of the original and improved MCOR versions against CASMO-4 and OCTOPUS. The comparisons show quite significant improvements in the results in terms of k-infinity, fission rate distributions and isotopic contents. (authors)
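
    The "online burnup cross section generation" mentioned above amounts, generically, to collapsing the continuous-energy reaction rates tallied by the Monte Carlo code into one-group cross sections for the depletion module, e.g. (a generic collapse formula, not the specific MCOR implementation):

    ```latex
    \bar{\sigma}_{x,i} = \frac{\int_0^{\infty} \sigma_{x,i}(E)\,\phi(E)\,dE}
                              {\int_0^{\infty} \phi(E)\,dE}
    ```

    evaluated per depletion region for each reaction x and nuclide i from the flux and reaction-rate tallies.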

  10. MCNP-based computational model for the Leksell Gamma Knife

    SciTech Connect

    Trnka, Jiri; Novotny, Josef Jr.; Kluson, Jaroslav

    2007-01-15

    We have focused on the usage of the MCNP code for calculation of Gamma Knife radiation field parameters with a homogeneous polystyrene phantom. We have investigated several parameters of the Leksell Gamma Knife radiation field and compared the results with other studies based on the EGS4 and PENELOPE codes as well as with the Leksell Gamma Knife treatment planning system Leksell GammaPlan (LGP). The current model describes all 201 radiation beams together and simulates all the sources at the same time. Within each beam, it considers the technical construction of the source, the source holder, the collimator system, the spherical phantom, and the surrounding material. We have calculated output factors for various sizes of scoring volumes, relative dose distributions along the basic planes including linear dose profiles, integral doses in various volumes, and differential dose volume histograms. All the parameters have been calculated for each collimator size and for the isocentric configuration of the phantom. We have found the calculated output factors to be in agreement with other authors' works except for the 4 mm collimator, where averaging over the scoring volume and statistical uncertainties strongly influence the calculated results. In general, all the results depend on the choice of the scoring volume. The calculated linear dose profiles and relative dose distributions also match independent studies and the Leksell GammaPlan, but care must be taken with the fluctuations within the plateau, which can influence the normalization, and with the accuracy in determining the isocenter position, which is important when comparing different dose profiles. The calculated differential dose volume histograms and integral doses have been compared with data provided by the Leksell GammaPlan. The dose volume histograms are in good agreement, as are the integral doses calculated in small calculation matrix volumes. However, deviations in integral doses up to 50% can be observed for large
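
    For readers unfamiliar with the term, the output factor of a Gamma Knife collimator is conventionally the dose rate at the phantom centre for that collimator relative to the largest (18 mm) collimator; an illustrative definition (not quoted from this paper) is

    ```latex
    \mathrm{OF}(c) = \frac{\dot{D}_{c}(\text{isocentre})}{\dot{D}_{18\,\mathrm{mm}}(\text{isocentre})}
    ```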

  11. The Modeling of Advanced BWR Fuel Designs with the NRC Fuel Depletion Codes PARCS/PATHS

    SciTech Connect

    Ward, Andrew; Downar, Thomas J.; Xu, Y.; March-Leuba, Jose A; Thurston, Carl; Hudson, Nathanael H.; Ireland, A.; Wysocki, A.

    2015-04-22

    The PATHS (PARCS Advanced Thermal Hydraulic Solver) code was developed at the University of Michigan in support of U.S. Nuclear Regulatory Commission research to solve the steady-state, two-phase, thermal-hydraulic equations for a boiling water reactor (BWR) and to provide thermal-hydraulic feedback for BWR depletion calculations with the neutronics code PARCS (Purdue Advanced Reactor Core Simulator). The simplified solution methodology, including a three-equation drift flux formulation and an optimized iteration scheme, yields very fast run times in comparison to conventional thermal-hydraulic systems codes used in the industry, while still retaining sufficient accuracy for applications such as BWR depletion calculations. Lastly, the capability to model advanced BWR fuel designs with part-length fuel rods and heterogeneous axial channel flow geometry has been implemented in PATHS, and the code has been validated against previously benchmarked advanced core simulators as well as BWR plant and experimental data. We describe the modifications to the codes and the results of the validation in this paper.
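
    The "three-equation drift flux formulation" refers to solving mixture mass, momentum and energy equations and recovering the void fraction algebraically from a drift-flux correlation; the standard Zuber-Findlay relation is shown here for context (PATHS's exact closure relations are not given in the abstract):

    ```latex
    \langle \alpha \rangle = \frac{\langle j_g \rangle}{C_0\,\langle j \rangle + V_{gj}}
    ```

    where j_g is the superficial vapor velocity, j the total superficial velocity, C_0 the distribution parameter, and V_gj the drift velocity.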

  12. The Modeling of Advanced BWR Fuel Designs with the NRC Fuel Depletion Codes PARCS/PATHS

    DOE PAGES

    Ward, Andrew; Downar, Thomas J.; Xu, Y.; ...

    2015-04-22

    The PATHS (PARCS Advanced Thermal Hydraulic Solver) code was developed at the University of Michigan in support of U.S. Nuclear Regulatory Commission research to solve the steady-state, two-phase, thermal-hydraulic equations for a boiling water reactor (BWR) and to provide thermal-hydraulic feedback for BWR depletion calculations with the neutronics code PARCS (Purdue Advanced Reactor Core Simulator). The simplified solution methodology, including a three-equation drift flux formulation and an optimized iteration scheme, yields very fast run times in comparison to conventional thermal-hydraulic systems codes used in the industry, while still retaining sufficient accuracy for applications such as BWR depletion calculations. Lastly, the capability to model advanced BWR fuel designs with part-length fuel rods and heterogeneous axial channel flow geometry has been implemented in PATHS, and the code has been validated against previously benchmarked advanced core simulators as well as BWR plant and experimental data. We describe the modifications to the codes and the results of the validation in this paper.

  13. Probabilistic approach for decay heat uncertainty estimation using URANIE platform and MENDEL depletion code

    NASA Astrophysics Data System (ADS)

    Tsilanizara, A.; Gilardi, N.; Huynh, T. D.; Jouanne, C.; Lahaye, S.; Martinez, J. M.; Diop, C. M.

    2014-06-01

    Knowledge of the decay heat and its associated uncertainties is an important issue for the safety of nuclear facilities. Many codes are available to estimate decay heat; ORIGEN, FISPACT and DARWIN/PEPIN2 are among them. MENDEL is a new depletion code developed at CEA, with a new software architecture, devoted to the calculation of physical quantities related to fuel cycle studies, in particular decay heat. The purpose of this paper is to present a probabilistic approach to assessing the decay heat uncertainty due to decay data uncertainties from nuclear data evaluations such as JEFF-3.1.1 or ENDF/B-VII.1. This probabilistic approach is based on both the MENDEL code and the URANIE software, a CEA uncertainty analysis platform. As preliminary applications, a single thermal fission of uranium-235 and of plutonium-239 and a PWR UOx spent fuel cell are investigated.
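
    A probabilistic (sampling-based) propagation of decay data uncertainties to decay heat can be sketched as follows; the nuclide data below are placeholders, and the snippet is a generic illustration rather than the MENDEL/URANIE implementation.

    ```python
    # Generic Monte Carlo propagation of decay-data uncertainty to decay heat.
    import random, statistics

    # Placeholder inventory: nuclide -> (atoms N, decay constant lambda [1/s],
    # mean energy per decay Q [MeV], relative 1-sigma uncertainties on lambda and Q).
    inventory = {
        "nuclide_A": (1.0e20, 2.9e-7, 0.50, 0.01, 0.05),
        "nuclide_B": (5.0e19, 7.3e-10, 0.18, 0.02, 0.10),
    }
    MEV_TO_J = 1.602e-13  # J per MeV, so lambda*N*Q*MEV_TO_J is in watts

    def sample_decay_heat():
        heat = 0.0
        for N, lam, q, d_lam, d_q in inventory.values():
            lam_s = random.gauss(lam, d_lam * lam)   # sampled decay constant
            q_s = random.gauss(q, d_q * q)           # sampled energy per decay
            heat += lam_s * N * q_s * MEV_TO_J
        return heat

    samples = [sample_decay_heat() for _ in range(10000)]
    print(f"decay heat = {statistics.mean(samples):.3e} W "
          f"+/- {statistics.stdev(samples):.1e} W (1 sigma)")
    ```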

  14. KRAM, A lattice physics code for modeling the detailed depletion of gadolinia isotopes in BWR lattice designs

    SciTech Connect

    Knott, D.; Baratta, A. )

    1990-01-01

    Lattice physics codes are used to deplete the burnable isotopes present in each lattice design, calculate the buildup of fission products, and generate the few-group cross-section data needed by the various nodal simulator codes. Normally, the detailed depletion of gadolinia isotopes is performed outside the lattice physics code in a one-dimensional environment using an onion-skin model, such as the method used in MICBURN. Results from the onion-skin depletion, in the form of effective microscopic absorption cross sections for the gadolinia, are then used by the lattice physics code during the lattice-depletion analysis. The reactivity of the lattice at any point in the cycle depends to a great extent on the amount of gadolinia present. In an attempt to improve the modeling of gadolinia depletion in fresh boiling water reactor (BWR) fuel designs, the Electric Power Research Institute (EPRI) lattice-physics code CPM-2 has been modified extensively. In this paper, the modified code KRAM is described, and results from various lattice-depletion analyses are discussed in comparison with results from standard CPM-2 and CASMO-2 analyses.

  15. A MCNP-based calibration method and a voxel phantom for in vivo monitoring of 241Am in skull

    NASA Astrophysics Data System (ADS)

    Moraleda, M.; Gómez-Ros, J. M.; López, M. A.; Navarro, T.; Navarro, J. F.

    2004-07-01

    Whole body counter (WBC) facilities are currently used for the assessment of internal radionuclide body burdens by directly measuring the radiation emitted from the body. Prior calibration of the detection devices requires the use of specific anthropomorphic phantoms. This paper describes the MCNP-based Monte Carlo technique developed for the calibration of the germanium detectors (Canberra LE Ge) used in the CIEMAT WBC for in vivo measurements of 241Am in the skull. The proposed method can also be applied to in vivo counting of different radionuclides distributed in other anatomical regions as well as to other detectors. Computer software was developed to automatically generate the input files for the MCNP code starting from any segmented human anatomy data. A specific model of a human head for the assessment of 241Am was built based on the tomographic phantom VOXELMAN of Yale University. The germanium detectors were carefully modelled from data provided by the manufacturer. This numerical technique has been applied to investigate the best counting geometry and the uncertainty due to improper positioning of the detectors.
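
    The paper's input-generation software is not described in detail; as a rough illustration of the kind of preprocessing involved, the sketch below maps a segmented anatomy array to per-voxel material identifiers and run-length encodes them (a common space-saving step for repeated-structure inputs). The actual formatting of MCNP cards is deliberately omitted, and all names here are hypothetical.

    ```python
    # Illustrative preprocessing of segmented voxel data for a Monte Carlo phantom.
    import numpy as np

    # Hypothetical mapping from segmentation labels to material identifiers.
    label_to_material = {0: 1, 1: 2, 2: 3}   # e.g. 0=air, 1=soft tissue, 2=bone

    def flatten_and_encode(segmented):
        """segmented: 3-D integer array of tissue labels (z, y, x).
        Returns a run-length-encoded list of (material_id, repeat_count)."""
        flat = [label_to_material[v] for v in segmented.ravel(order="C")]
        encoded, count = [], 1
        for prev, cur in zip(flat, flat[1:]):
            if cur == prev:
                count += 1
            else:
                encoded.append((prev, count))
                count = 1
        encoded.append((flat[-1], count))
        return encoded

    if __name__ == "__main__":
        demo = np.zeros((2, 2, 4), dtype=int)
        demo[:, :, 2:] = 1                      # half air, half soft tissue
        print(flatten_and_encode(demo))
    ```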

  16. Application of the RCP01 Code to Depletion of a PWR Spent Nuclear Fuel Sample

    SciTech Connect

    Joo, Hansem

    2002-01-01

    An essential component of a proposed burnup credit methodology for commercial PWR spent nuclear fuel (SNF) is the validation of the tools used for isotopic and criticality calculations. A number of benchmark experiments have been analyzed to establish the validation of the tools and to determine biases and corrections. To benchmark the RCP01 Monte Carlo computer code, an isotopic validation study was conducted for one of the benchmark experiments, a SNF sample taken from the Calvert Cliffs PWR Unit-1 (CCPU1). Modeling considerations and nuclear data associated with the RCP01 transport/depletion calculations are discussed. The accuracy of RCP01 calculations is demonstrated to be very good when RCP01 results are compared to destructive chemical assay data for major actinides and important fission products in the SNF sample.

  17. NEPHTIS: Core depletion validation relying on 2D transport core calculations with the APOLLO2 code

    SciTech Connect

    Damian, F.; Raepsaet, X.; Groizard, M.; Poinot, C.

    2006-07-01

    The CEA, in collaboration with EDF and AREVA-NP, is developing a core modelling tool called NEPHTIS (Neutronic Process for HTGR Innovating Systems), dedicated at present to prismatic block-type HTGRs (High Temperature Gas-Cooled Reactors). Due to the lack of usable HTGR experimental results, confidence in this neutronic computational tool relies essentially on comparisons to reference or best-estimate calculations. In the present analysis, the APOLLO2 deterministic transport code has been selected as the reference for validating core depletion simulations carried out with NEPHTIS. These reference calculations were performed on fully detailed 2D core configurations using the Method of Characteristics, which has been validated against the Monte Carlo method for different static core configurations [1], [2] and [3]. All the presented results come from an annular HTGR core loaded with uranium-based fuel (15% enrichment). During the core depletion validation, reactivity, reaction rate distributions and nuclide concentrations were compared. In addition, the impact of various physical and geometrical parameters, such as the core loading (once-through or batch-wise reloading) and the amount of burnable poison, was investigated during the validation phases. The results confirm that NEPHTIS is able to predict the core reactivity within ±350 pcm. At the end of the core irradiation, the U-235 consumption is calculated within ±0.7%, while the plutonium mass discharged from the core is calculated within ±1%. As far as core power distributions are concerned, small discrepancies (< 2.3%) are observed in the fuel block-averaged power distribution in the core. (authors)
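
    For reference, the reactivity discrepancies quoted in pcm above correspond to differences of the form

    ```latex
    \Delta\rho\ [\mathrm{pcm}] = 10^{5}\left(\frac{1}{k_{\mathrm{ref}}} - \frac{1}{k_{\mathrm{NEPHTIS}}}\right)
    ```

    so that ±350 pcm corresponds to a relative eigenvalue difference of roughly ±0.35% for k near unity.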

  18. Development of Depletion Code Surrogate Models for Uncertainty Propagation in Scenario Studies

    NASA Astrophysics Data System (ADS)

    Krivtchik, Guillaume; Coquelet-Pascal, Christine; Blaise, Patrick; Garzenne, Claude; Le Mer, Joël; Freynet, David

    2014-06-01

    Transition scenario studies, which enable the comparison of different options for reactor fleet evolution and for the management of future fuel cycle materials, support technical and economic feasibility studies. The COSI code, developed by CEA, is used to perform scenario calculations. It can model any fuel type, reactor fleet or fuel facility, and tracks U, Pu, minor actinide and fission product nuclides over large time scales. COSI is coupled with the CESAR code, which performs the depletion calculations based on one-group cross-section libraries and nuclear data. Different types of uncertainties have an impact on scenario studies: nuclear data and scenario assumptions. It is therefore necessary to evaluate their impact on the major scenario results. The methodology adopted to propagate these uncertainties throughout the scenario calculations is a stochastic approach. Considering the number of inputs to be sampled in order to perform a stochastic calculation of the propagated uncertainty, it is necessary to reduce the calculation time. Given that evolution calculations represent approximately 95% of the total scenario simulation time, an optimization can be made by developing and implementing a library of CESAR surrogate models in COSI. The input parameters of CESAR are sampled with URANIE, the CEA uncertainty platform, and for every sample the isotopic composition after evolution evaluated with CESAR is stored. Statistical analysis of the input and output tables then allows the behavior of CESAR to be modelled for each CESAR library, i.e. a surrogate model is built. Several quality tests are performed on each surrogate model to ensure that its predictive power is satisfactory. Afterward, a new routine implemented in COSI reads these surrogate models and uses them in place of CESAR calculations. A preliminary study of the calculation time gain shows that the use of surrogate models allows stochastic
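
    A surrogate model of a depletion code is, in the simplest case, just a fast regression fitted to sampled (input, output) pairs; the sketch below fits a linear-in-parameters model with ordinary least squares as a stand-in for whatever regression family the CESAR surrogates actually use. All data and the stand-in model here are synthetic.

    ```python
    # Illustrative surrogate fitting: replace an expensive depletion call with a
    # cheap regression trained on sampled inputs/outputs (synthetic data only).
    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_depletion_model(x):
        # Stand-in for a CESAR-like evaluation: x = (initial enrichment, burnup).
        enrich, burnup = x
        return 0.9 * enrich - 0.012 * burnup + 0.001 * enrich * burnup

    # 1) Sample the input space and run the "expensive" code on each sample.
    X = rng.uniform([3.0, 10.0], [5.0, 60.0], size=(200, 2))
    y = np.array([expensive_depletion_model(x) for x in X])

    # 2) Fit a quadratic-in-inputs surrogate by ordinary least squares.
    def features(X):
        e, b = X[:, 0], X[:, 1]
        return np.column_stack([np.ones_like(e), e, b, e * b, e**2, b**2])

    coeffs, *_ = np.linalg.lstsq(features(X), y, rcond=None)

    # 3) Check predictive quality on fresh samples before trusting the surrogate.
    X_test = rng.uniform([3.0, 10.0], [5.0, 60.0], size=(50, 2))
    y_test = np.array([expensive_depletion_model(x) for x in X_test])
    rms = np.sqrt(np.mean((features(X_test) @ coeffs - y_test) ** 2))
    print(f"surrogate RMS error on test set: {rms:.2e}")
    ```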

  19. FORIG: a computer code for calculating radionuclide generation and depletion in fusion and fission reactors. User's manual

    SciTech Connect

    Blink, J.A.

    1985-03-01

    In this manual we describe the use of the FORIG computer code to solve isotope-generation and depletion problems in fusion and fission reactors. FORIG runs on a Cray-1 computer and accepts more extensive activation cross sections than ORIGEN2 from which it was adapted. This report is an updated and a combined version of the previous ORIGEN2 and FORIG manuals. 7 refs., 15 figs., 13 tabs.

  20. A common class of transcripts with 5'-intron depletion, distinct early coding sequence features, and N(1)-methyladenosine modification.

    PubMed

    Cenik, Can; Chua, Hon Nian; Singh, Guramrit; Akef, Abdalla; Snyder, Michael P; Palazzo, Alexander F; Moore, Melissa J; Roth, Frederick P

    2017-03-01

    Introns are found in 5' untranslated regions (5'UTRs) for 35% of all human transcripts. These 5'UTR introns are not randomly distributed: Genes that encode secreted, membrane-bound and mitochondrial proteins are less likely to have them. Curiously, transcripts lacking 5'UTR introns tend to harbor specific RNA sequence elements in their early coding regions. To model and understand the connection between coding-region sequence and 5'UTR intron status, we developed a classifier that can predict 5'UTR intron status with >80% accuracy using only sequence features in the early coding region. Thus, the classifier identifies transcripts with 5' proximal-intron-minus-like-coding regions ("5IM" transcripts). Unexpectedly, we found that the early coding sequence features defining 5IM transcripts are widespread, appearing in 21% of all human RefSeq transcripts. The 5IM class of transcripts is enriched for non-AUG start codons, more extensive secondary structure both preceding the start codon and near the 5' cap, greater dependence on eIF4E for translation, and association with ER-proximal ribosomes. 5IM transcripts are bound by the exon junction complex (EJC) at noncanonical 5' proximal positions. Finally, N(1)-methyladenosines are specifically enriched in the early coding regions of 5IM transcripts. Taken together, our analyses point to the existence of a distinct 5IM class comprising ∼20% of human transcripts. This class is defined by depletion of 5' proximal introns, presence of specific RNA sequence features associated with low translation efficiency, N(1)-methyladenosines in the early coding region, and enrichment for noncanonical binding by the EJC. © 2017 Cenik et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
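
    The classifier itself is not specified in this abstract; a generic sequence-feature classifier of the same flavour (k-mer counts from the early coding region fed to a regularized linear model) could be sketched as below. The training data shown are synthetic placeholders, not the authors' feature set or model.

    ```python
    # Generic sketch: predict 5'UTR-intron status from early coding sequence
    # (synthetic toy data; not the authors' actual feature set or model).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy examples: (first ~18 nt of coding region, has_5utr_intron label)
    sequences = ["ATGGCGGCGGCGCTGCTG", "ATGAAAGAAAATTCAGAA",
                 "ATGGCGCCGCCGCTGCTT", "ATGAAAGATAATTCTGAA"]
    labels = [0, 1, 0, 1]

    model = make_pipeline(
        CountVectorizer(analyzer="char", ngram_range=(3, 3)),  # trinucleotide counts
        LogisticRegression(max_iter=1000),
    )
    model.fit(sequences, labels)
    print(model.predict(["ATGGCGGCGCCGCTGCTG"]))
    ```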

  1. Nitrogen depletion in the fission yeast Schizosaccharomyces pombe causes nucleosome loss in both promoters and coding regions of activated genes

    PubMed Central

    Kristell, Carolina; Orzechowski Westholm, Jakub; Olsson, Ida; Ronne, Hans; Komorowski, Jan; Bjerling, Pernilla

    2010-01-01

    Gene transcription is associated with local changes in chromatin, both in nucleosome positions and in chemical modifications of the histones. Chromatin dynamics has mostly been studied on a single-gene basis. Those genome-wide studies that have been made primarily investigated steady-state transcription. However, three studies of genome-wide changes in chromatin during the transcriptional response to heat shock in the budding yeast Saccharomyces cerevisiae revealed nucleosome eviction in promoter regions but only minor effects in coding regions. Here, we describe the short-term response to nitrogen starvation in the fission yeast Schizosaccharomyces pombe. Nitrogen depletion leads to a fast induction of a large number of genes in S. pombe and is thus suitable for genome-wide studies of chromatin dynamics during gene regulation. After 20 min of nitrogen removal, 118 transcripts were up-regulated. The distribution of regulated genes throughout the genome was not random; many up-regulated genes were found in clusters, while large parts of the genome were devoid of up-regulated genes. Surprisingly, this up-regulation was associated with nucleosome eviction of equal magnitudes in the promoters and in the coding regions. The nucleosome loss was not limited to induction by nitrogen depletion but also occurred during cadmium treatment. Furthermore, the lower nucleosome density persisted for at least 60 min after induction. Two highly induced genes, urg1+ and urg2+, displayed a substantial nucleosome loss, with only 20% of the nucleosomes being left in the coding region. We conclude that nucleosome loss during transcriptional activation is not necessarily limited to promoter regions. PMID:20086243

  2. Accelerated equilibrium core composition search using a new MCNP-based simulator

    NASA Astrophysics Data System (ADS)

    Seifried, Jeffrey E.; Gorman, Phillip M.; Vujic, Jasmina L.; Greenspan, Ehud

    2014-06-01

    MocDown is a new Monte Carlo depletion and recycling simulator which couples neutron transport with MCNP and transmutation with ORIGEN. This modular approach to depletion allows for flexible operation by incorporating the accelerated progression of a complex fuel processing scheme towards equilibrium and by allowing for the online coupling of thermo-fluids feedback. MocDown also accounts for the variation of decay heat with fuel isotopics evolution. In typical cases, MocDown requires just over a day to find the equilibrium core composition for a multi-recycling fuel cycle, with a self-consistent thermo-fluids solution, a task that required between one and two weeks using previous Monte Carlo-based approaches.
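
    The "accelerated progression towards equilibrium" can be pictured as a fixed-point iteration on the recycled fuel composition; the sketch below shows the bare structure of such a search, with irradiate_and_reprocess standing in for a full transport/depletion/recycling pass (a hypothetical name, not MocDown's API).

    ```python
    # Bare-bones equilibrium-composition search by fixed-point iteration
    # (illustrative only; not MocDown code).
    def find_equilibrium(initial_feed, irradiate_and_reprocess,
                         tol=1e-4, max_cycles=100):
        """initial_feed: {isotope: mass fraction} of the first loaded fuel.
        irradiate_and_reprocess(feed): returns the composition of the next
        cycle's fuel after transport, depletion, cooling and reprocessing."""
        feed = dict(initial_feed)
        for cycle in range(max_cycles):
            new_feed = irradiate_and_reprocess(feed)
            change = max(abs(new_feed[i] - feed.get(i, 0.0)) for i in new_feed)
            feed = new_feed
            if change < tol:          # compositions have stopped evolving
                return feed, cycle + 1
        raise RuntimeError("equilibrium not reached within max_cycles")
    ```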

  3. Performance upgrades to the MCNP6 burnup capability for large scale depletion calculations

    DOE PAGES

    Fensin, M. L.; Galloway, J. D.; James, M. R.

    2015-04-11

    The first MCNP-based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. With the merger of MCNPX and MCNP5, MCNP6 combined the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. The new MCNP6 depletion capability was first showcased at the International Congress on Advances in Nuclear Power Plants (ICAPP) meeting in 2012. At that conference the new capabilities addressed included the combined distributed and shared memory parallel architecture for the burnup capability, improved memory management, physics enhancements, and new predictability as compared to the H. B. Robinson benchmark. At Los Alamos National Laboratory, a special-purpose cluster named "tebow" was constructed to maximize available RAM per CPU, as well as to leverage swap space with solid-state hard drives, allowing larger scale depletion calculations (with significantly more burnable regions than previously examined). As the MCNP6 burnup capability was scaled to larger numbers of burnable regions, a noticeable slowdown was observed. This paper details two specific computational performance strategies for improving calculation speed: (1) retrieving cross sections during transport; and (2) tallying mechanisms specific to burnup in MCNP. To combat this slowdown, new performance upgrades were developed and integrated into MCNP6 1.2.

  4. Performance upgrades to the MCNP6 burnup capability for large scale depletion calculations

    SciTech Connect

    Fensin, M. L.; Galloway, J. D.; James, M. R.

    2015-04-11

    The first MCNP-based inline Monte Carlo depletion capability was officially released from the Radiation Safety Information and Computational Center as MCNPX 2.6.0. With the merger of MCNPX and MCNP5, MCNP6 combined the capability of both simulation tools, as well as providing new advanced technology, in a single radiation transport code. The new MCNP6 depletion capability was first showcased at the International Congress on Advances in Nuclear Power Plants (ICAPP) meeting in 2012. At that conference the new capabilities addressed included the combined distributed and shared memory parallel architecture for the burnup capability, improved memory management, physics enhancements, and new predictability as compared to the H. B. Robinson benchmark. At Los Alamos National Laboratory, a special-purpose cluster named "tebow" was constructed to maximize available RAM per CPU, as well as to leverage swap space with solid-state hard drives, allowing larger scale depletion calculations (with significantly more burnable regions than previously examined). As the MCNP6 burnup capability was scaled to larger numbers of burnable regions, a noticeable slowdown was observed. This paper details two specific computational performance strategies for improving calculation speed: (1) retrieving cross sections during transport; and (2) tallying mechanisms specific to burnup in MCNP. To combat this slowdown, new performance upgrades were developed and integrated into MCNP6 1.2.

  5. A common class of transcripts with 5′-intron depletion, distinct early coding sequence features, and N1-methyladenosine modification

    PubMed Central

    Cenik, Can; Chua, Hon Nian; Singh, Guramrit; Akef, Abdalla; Snyder, Michael P.; Palazzo, Alexander F.

    2017-01-01

    Introns are found in 5′ untranslated regions (5′UTRs) for 35% of all human transcripts. These 5′UTR introns are not randomly distributed: Genes that encode secreted, membrane-bound and mitochondrial proteins are less likely to have them. Curiously, transcripts lacking 5′UTR introns tend to harbor specific RNA sequence elements in their early coding regions. To model and understand the connection between coding-region sequence and 5′UTR intron status, we developed a classifier that can predict 5′UTR intron status with >80% accuracy using only sequence features in the early coding region. Thus, the classifier identifies transcripts with 5′ proximal-intron-minus-like-coding regions (“5IM” transcripts). Unexpectedly, we found that the early coding sequence features defining 5IM transcripts are widespread, appearing in 21% of all human RefSeq transcripts. The 5IM class of transcripts is enriched for non-AUG start codons, more extensive secondary structure both preceding the start codon and near the 5′ cap, greater dependence on eIF4E for translation, and association with ER-proximal ribosomes. 5IM transcripts are bound by the exon junction complex (EJC) at noncanonical 5′ proximal positions. Finally, N1-methyladenosines are specifically enriched in the early coding regions of 5IM transcripts. Taken together, our analyses point to the existence of a distinct 5IM class comprising ∼20% of human transcripts. This class is defined by depletion of 5′ proximal introns, presence of specific RNA sequence features associated with low translation efficiency, N1-methyladenosines in the early coding region, and enrichment for noncanonical binding by the EJC. PMID:27994090

  6. Depletion of the Trypanosome Pumilio Domain Protein PUF2 or of Some Other Essential Proteins Causes Transcriptome Changes Related to Coding Region Length

    PubMed Central

    Jha, Bhaskar Anand; Fadda, Abeer; Merce, Clementine; Mugo, Elisha; Droll, Dorothea

    2014-01-01

    Pumilio domain RNA-binding proteins are known mainly as posttranscriptional repressors of gene expression that reduce mRNA translation and stability. Trypanosoma brucei has 11 PUF proteins. We show here that PUF2 is in the cytosol, with roughly the same number of molecules per cell as there are mRNAs. Although PUF2 exhibits a low level of in vivo RNA binding, it is not associated with polysomes. PUF2 also decreased reporter mRNA levels in a tethering assay, consistent with a repressive role. Depletion of PUF2 inhibited growth of bloodstream-form trypanosomes, causing selective loss of mRNAs with long open reading frames and increases in mRNAs with shorter open reading frames. Reexamination of published RNASeq data revealed the same trend in cells depleted of some other proteins. We speculate that these length effects could be caused by inhibition of the elongation phase of transcription or by an influence of translation status or polysomal conformation on mRNA decay. PMID:24681684

  7. Specification for the VERA Depletion Benchmark Suite

    SciTech Connect

    Kim, Kang Seog

    2015-12-17

    The CASL neutronics simulator MPACT is under development for coupled neutronics and thermal-hydraulics simulation of the pressurized water reactor. MPACT includes the ORIGEN-API and an internal depletion module to perform depletion calculations based upon neutron-material reactions and radioactive decay. Validating the depletion capability is a challenge because of insufficient measured data. One alternative is to perform code-to-code comparisons for benchmark problems. In this study a depletion benchmark suite has been developed and a detailed guideline has been provided to obtain meaningful computational outcomes which can be used in the validation of the MPACT depletion capability.
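
    Code-to-code comparison of depletion results typically reduces to a few summary metrics, for example the eigenvalue difference in pcm and the spread of nuclide densities at each burnup point; a minimal sketch (generic, not part of the VERA/MPACT tool chain) follows.

    ```python
    # Minimal code-to-code comparison metrics for a depletion benchmark point
    # (generic illustration, not part of the VERA/MPACT tool chain).
    import math

    def delta_k_pcm(k_ref, k_test):
        """Reactivity difference in pcm between two eigenvalues."""
        return 1.0e5 * (1.0 / k_ref - 1.0 / k_test)

    def nuclide_rms_percent(ref, test):
        """RMS of relative nuclide-density differences, in percent."""
        rel = [(test[i] - ref[i]) / ref[i] for i in ref if ref[i] != 0.0]
        return 100.0 * math.sqrt(sum(r * r for r in rel) / len(rel))

    if __name__ == "__main__":
        print(delta_k_pcm(1.18650, 1.18710))                 # ~ +43 pcm
        print(nuclide_rms_percent({"U235": 4.1e-4, "Pu239": 1.2e-4},
                                  {"U235": 4.2e-4, "Pu239": 1.2e-4}))
    ```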

  8. CO depletion in the Gould Belt clouds

    NASA Astrophysics Data System (ADS)

    Christie, H.; Viti, S.; Yates, J.; Hatchell, J.; Fuller, G. A.; Duarte-Cabral, A.; Sadavoy, S.; Buckle, J. V.; Graves, S.; Roberts, J.; Nutter, D.; Davis, C.; White, G. J.; Hogerheijde, M.; Ward-Thompson, D.; Butner, H.; Richer, J.; Di Francesco, J.

    2012-05-01

    We present a statistical comparison of CO depletion in a set of local molecular clouds within the Gould Belt using Sub-millimetre Common User Bolometer Array (SCUBA) and Heterodyne Array Receiver Programme (HARP) data. This is the most wide-ranging study of depletion thus far within the Gould Belt. We estimate CO column densities assuming local thermodynamic equilibrium and, for a selection of sources, using the radiative transfer code RADEX in order to compare the two column density estimation methods. High levels of depletion are seen in the centres of several dust cores in all the clouds. We find that in the gas surrounding protostars, levels of depletion are somewhat lower than for starless cores with the exception of a few highly depleted protostellar cores in Serpens and NGC 2024. There is a tentative correlation between core mass and core depletion, particularly in Taurus and Serpens. Taurus has, on average, the highest levels of depletion. Ophiuchus has low average levels of depletion which could perhaps be related to the anomalous dust grain size distribution observed in this cloud. High levels of depletion are often seen around the edges of regions of optical emission (Orion) or in more evolved or less dynamic regions such as the bowl of L1495 in Taurus and the north-western region of Serpens.
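
    Depletion levels in studies of this kind are usually quantified by a depletion factor comparing the CO column expected from the dust-derived H2 column with the CO column actually observed; a generic definition (not necessarily this paper's exact convention) is

    ```latex
    f_D = \frac{N^{\mathrm{expected}}_{\mathrm{CO}}}{N^{\mathrm{observed}}_{\mathrm{CO}}}
        = \frac{x_{\mathrm{CO}}\,N_{\mathrm{H_2}}(\mathrm{dust})}{N_{\mathrm{CO}}(\mathrm{gas})}
    ```

    where x_CO is the canonical undepleted CO abundance relative to H2.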

  9. Water Depletion Threatens Agriculture

    NASA Astrophysics Data System (ADS)

    Brauman, K. A.; Richter, B. D.; Postel, S.; Floerke, M.; Malsy, M.

    2014-12-01

    Irrigated agriculture is the human activity that has by far the largest impact on water, constituting 85% of global water consumption and 67% of global water withdrawals. Much of this water use occurs in places where water depletion, the ratio of water consumption to water availability, exceeds 75% for at least one month of the year. Although only 17% of global watershed area experiences depletion at this level or more, nearly 30% of total cropland and 60% of irrigated cropland are found in these depleted watersheds. Staple crops are particularly at risk, with 75% of global irrigated wheat production and 65% of irrigated maize production found in watersheds that are at least seasonally depleted. Of importance to textile production, 75% of cotton production occurs in the same watersheds. For crop production in depleted watersheds, we find that one half to two-thirds of production occurs in watersheds that have not just seasonal but annual water shortages, suggesting that re-distributing water supply over the course of the year cannot be an effective solution to shortage. We explore the degree to which irrigated production in depleted watersheds reflects limitations in supply, a byproduct of the need for irrigation in perennially or seasonally dry landscapes, and identify heavy irrigation consumption that leads to watershed depletion in more humid climates. For watersheds that are not depleted, we evaluate the potential impact of an increase in irrigated production. Finally, we evaluate the benefits of irrigated agriculture in depleted and non-depleted watersheds, quantifying the fraction of irrigated production going to food production, animal feed, and biofuels.

  10. Halo Star Lithium Depletion

    SciTech Connect

    Pinsonneault, M. H.; Walker, T. P.; Steigman, G.; Narayanan, Vijay K.

    1999-12-10

    The depletion of lithium during the pre-main-sequence and main-sequence phases of stellar evolution plays a crucial role in the comparison of the predictions of big bang nucleosynthesis with the abundances observed in halo stars. Previous work has indicated a wide range of possible depletion factors, ranging from minimal in standard (nonrotating) stellar models to as much as an order of magnitude in models that include rotational mixing. Recent progress in the study of the angular momentum evolution of low-mass stars permits the construction of theoretical models capable of reproducing the angular momentum evolution of low-mass open cluster stars. The distribution of initial angular momenta can be inferred from stellar rotation data in young open clusters. In this paper we report on the application of these models to the study of lithium depletion in main-sequence halo stars. A range of initial angular momenta produces a range of lithium depletion factors on the main sequence. Using the distribution of initial conditions inferred from young open clusters leads to a well-defined halo lithium plateau with modest scatter and a small population of outliers. The mass-dependent angular momentum loss law inferred from open cluster studies produces a nearly flat plateau, unlike previous models that exhibited a downward curvature for hotter temperatures in the 7Li-Teff plane. The overall depletion factor for the plateau stars is sensitive primarily to the solar initial angular momentum used in the calibration for the mixing diffusion coefficients. Uncertainties remain in the treatment of the internal angular momentum transport in the models, and the potential impact of these uncertainties on our results is discussed. The 6Li/7Li depletion ratio is also examined. We find that the dispersion in the plateau and the 6Li/7Li depletion ratio scale with the absolute 7Li depletion in the plateau, and we use observational data to set bounds on the 7Li depletion in main-sequence halo

  11. Depleted Uranium: Technical Brief

    EPA Pesticide Factsheets

    This technical brief provides accepted data and references to additional sources for radiological and chemical characteristics, health risks and references for both the monitoring and measurement, and applicable treatment techniques for depleted uranium.

  12. Battery depletion monitor

    SciTech Connect

    Lee, Y.S.

    1982-01-26

    A CMOS inverter is used to compare the pacemaker battery voltage to a reference voltage. When the reference voltage exceeds the measured battery voltage, the inverter changes state to indicate battery depletion.

  13. Addressing Ozone Layer Depletion

    EPA Pesticide Factsheets

    Access information on EPA's efforts to address ozone layer depletion through regulations, collaborations with stakeholders, international treaties, partnerships with the private sector, and enforcement actions under Title VI of the Clean Air Act.

  14. Cholesterol depletion induces autophagy

    SciTech Connect

    Cheng, Jinglei; Ohsaki, Yuki; Tauchi-Sato, Kumi; Fujita, Akikazu; Fujimoto, Toyoshi . E-mail: tfujimot@med.nagoya-u.ac.jp

    2006-12-08

    Autophagy is a mechanism to digest cells' own components, and its importance in many physiological and pathological processes is being recognized. But the molecular mechanism that regulates autophagy is not understood in detail. In the present study, we found that cholesterol depletion induces macroautophagy. The cellular cholesterol in human fibroblasts was depleted either acutely using 5 mM methyl-β-cyclodextrin or 10-20 μg/ml nystatin for 1 h, or metabolically by 20 μM mevastatin and 200 μM mevalonolactone along with 10% lipoprotein-deficient serum for 2-3 days. By any of these protocols, marked increase of LC3-II was detected by immunoblotting and by immunofluorescence microscopy, and the increase was more extensive than that caused by amino acid starvation, i.e., incubation in Hanks' solution for several hours. The induction of autophagic vacuoles by cholesterol depletion was also observed in other cell types, and the LC3-positive membranes were often seen as long tubules, >50 μm in length. The increase of LC3-II by methyl-β-cyclodextrin was suppressed by phosphatidylinositol 3-kinase inhibitors and was accompanied by dephosphorylation of mammalian target of rapamycin. By electron microscopy, autophagic vacuoles induced by cholesterol depletion were indistinguishable from those seen after amino acid starvation. These results demonstrate that a decrease in cholesterol activates autophagy by a phosphatidylinositol 3-kinase-dependent mechanism.

  15. Depletion of Intense Fields

    NASA Astrophysics Data System (ADS)

    Seipt, D.; Heinzl, T.; Marklund, M.; Bulanov, S. S.

    2017-04-01

    The interaction of charged particles and photons with intense electromagnetic fields gives rise to multiphoton Compton and Breit-Wheeler processes. These are usually described in the framework of the external field approximation, where the electromagnetic field is assumed to have infinite energy. However, the multiphoton nature of these processes implies the absorption of a significant number of photons, which scales as the external field amplitude cubed. As a result, the interaction of a highly charged electron bunch with an intense laser pulse can lead to significant depletion of the laser pulse energy, thus rendering the external field approximation invalid. We provide relevant estimates for this depletion and find it to become important in the interaction between fields of amplitude a0 ∼ 10³ and electron bunches with charges of the order of 10 nC.
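
    The scaling quoted above translates into a simple order-of-magnitude depletion estimate (illustrative only, consistent with the abstract's argument rather than quoted from the paper): if each emission absorbs on the order of a0³ laser photons, a bunch of N_e electrons each radiating N_γ photons drains an energy of roughly

    ```latex
    \Delta E \sim N_e\,N_\gamma\,a_0^{3}\,\hbar\omega_L
    ```

    which becomes comparable to the pulse energy in the regime of a0 ∼ 10³ and bunch charges of tens of nC identified above.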

  16. Certain Adenylated Non-Coding RNAs, Including 5′ Leader Sequences of Primary MicroRNA Transcripts, Accumulate in Mouse Cells following Depletion of the RNA Helicase MTR4

    PubMed Central

    Dorweiler, Jane E.; Ni, Ting; Zhu, Jun; Munroe, Stephen H.; Anderson, James T.

    2014-01-01

    RNA surveillance plays an important role in posttranscriptional regulation. Seminal work in this field has largely focused on yeast as a model system, whereas exploration of RNA surveillance in mammals is only recently begun. The increased transcriptional complexity of mammalian systems provides a wider array of targets for RNA surveillance, and, while many questions remain unanswered, emerging data suggest the nuclear RNA surveillance machinery exhibits increased complexity as well. We have used a small interfering RNA in mouse N2A cells to target the homolog of a yeast protein that functions in RNA surveillance (Mtr4p). We used high-throughput sequencing of polyadenylated RNAs (PA-seq) to quantify the effects of the mMtr4 knockdown (KD) on RNA surveillance. We demonstrate that overall abundance of polyadenylated protein coding mRNAs is not affected, but several targets of RNA surveillance predicted from work in yeast accumulate as adenylated RNAs in the mMtr4KD. microRNAs are an added layer of transcriptional complexity not found in yeast. After Drosha cleavage separates the pre-miRNA from the microRNA's primary transcript, the byproducts of that transcript are generally thought to be degraded. We have identified the 5′ leading segments of pri-miRNAs as novel targets of mMtr4 dependent RNA surveillance. PMID:24926684

  17. Depleted uranium management alternatives

    SciTech Connect

    Hertzler, T.J.; Nishimoto, D.D.

    1994-08-01

    This report evaluates two management alternatives for Department of Energy depleted uranium: continued storage as uranium hexafluoride, and conversion to uranium metal and fabrication into shielding for spent nuclear fuel containers. The results will be used to compare the costs with those of other alternatives, such as disposal. Cost estimates for the continued storage alternative are based on a life cycle of 27 years, through the year 2020. Cost estimates for the recycle alternative are based on existing conversion process costs and capital costs for fabricating the containers. Additionally, the recycle alternative accounts for costs associated with intermediate product resale and secondary waste disposal for materials generated during the conversion process.

  18. Tank depletion flow controller

    DOEpatents

    Georgeson, Melvin A.

    1976-10-26

    A flow control system includes two bubbler tubes installed at different levels within a tank containing a liquid, such as a radioactive liquid. As the tank is depleted, a differential pressure transmitter monitors the pressure differences imparted by the two bubbler tubes at a remote, shielded location during uniform time intervals. At the end of each uniform interval, balance pots containing a dense liquid are valved together to equalize the pressures. The resulting sawtooth-shaped signal generated by the differential pressure transmitter is compared with a second sawtooth signal representing the desired flow rate during each time interval. Variations between the two signals are employed by a control instrument to regulate the flow rate.

  19. Depletion of intense fields

    NASA Astrophysics Data System (ADS)

    Bulanov, S. S.; Seipt, D.; Heinzl, T.; Marklund, M.

    2017-03-01

    The problem of backreaction of quantum processes on the properties of the background field still remains on the list of outstanding questions of high intensity particle physics. Usually, photon emission by an electron or positron, photon decay into electron-positron pairs in strong electromagnetic fields, or electron-positron pair production by such fields are described in the framework of the external field approximation. It is assumed that the external field has infinite energy and is not affected by these processes. However, the above-mentioned processes have a multi-photon nature, i.e., they occur with the absorption of a significant number of field photons. As a result, the interaction of an intense electromagnetic field with either a highly charged electron bunch or a fast growing population of electrons, positrons, and gamma photons (as in the case of an electromagnetic cascade) may lead to a depletion of the field energy, thus making the external field approximation invalid. Taking the multi-photon Compton process as an example, we estimate the threshold of depletion and find it to become significant at field strengths (a0 ∼ 10³) and electron bunch charge of about tens of nC.

  20. Depletion analysis of the UMLRR reactor core using MCNP6

    NASA Astrophysics Data System (ADS)

    Odera, Dim Udochukwu

    Accurate knowledge of the neutron flux and of the temporal nuclide inventory in reactor physics calculations is necessary for a variety of applications in nuclear engineering, such as criticality safety, safeguards, and spent fuel storage. The Monte Carlo N-Particle code (MCNP6), with its integrated depletion code (CINDER90), provides a high-fidelity tool that can be used to perform 3D, full-core simulations to evaluate fissile material utilization and nuclide inventories as a function of burnup. The University of Massachusetts Lowell Research Reactor (UMLRR) has previously been modeled with the deterministic code VENTURE and with an older version of MCNP (MCNP5); the MIT-developed MCODE (MCNP ORIGEN DEPLETION CODE) was used to perform some limited depletion calculations. This work chronicles the use of MCNP6, released in June 2013, to perform coupled neutronics and depletion calculations. The results are compared to previously benchmarked results. Furthermore, the code is used to determine the ratio of the fission products 134Cs and 137Cs (burnup indicators), and the resultant ratio is compared to the burnup of the UMLRR.
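
    Using the 134Cs/137Cs ratio as a burnup indicator amounts to interpolating the computed (or measured) ratio on a pre-calculated ratio-versus-burnup curve; a minimal sketch with made-up calibration points is shown below (the numbers are placeholders, not UMLRR data).

    ```python
    # Illustrative burnup lookup from a Cs-134/Cs-137 ratio.
    # The calibration points below are invented placeholders, not UMLRR results.
    import numpy as np

    burnup_grid = np.array([0.0, 5.0, 10.0, 20.0, 30.0])        # MWd/kgU
    ratio_grid  = np.array([0.00, 0.02, 0.05, 0.11, 0.17])      # Cs-134/Cs-137

    def burnup_from_cs_ratio(ratio):
        """Interpolate burnup from the monotonically increasing ratio curve."""
        return float(np.interp(ratio, ratio_grid, burnup_grid))

    print(burnup_from_cs_ratio(0.08))   # falls between the 10 and 20 MWd/kgU points
    ```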

  1. Stimulated Emission Depletion Microscopy.

    PubMed

    Blom, Hans; Widengren, Jerker

    2017-06-14

    Despite their short history, diffraction-unlimited fluorescence microscopy techniques have already made a substantial imprint on the biological sciences. In this review, we describe how stimulated emission depletion (STED) imaging originally evolved, how it compares to other optical super-resolution imaging techniques, and what advantages it provides compared to previous gold standards for biological microscopy, such as diffraction-limited optical microscopy and electron microscopy. We outline the prerequisites for successful STED imaging experiments, emphasizing the equally critical roles of instrumentation, sample preparation, and photophysics, and describe major evolving strategies for pushing the borders of STED imaging even further in the life sciences. Finally, we provide examples of how STED nanoscopy can be applied within three different fields with particular potential for STED imaging experiments: neuroscience, plasma membrane biophysics, and subcellular clinical diagnostics. In these areas, and in many more, STED imaging can be expected to play an increasingly important role in the future.

  2. Ozone Depletion by Hydrofluorocarbons

    NASA Astrophysics Data System (ADS)

    Hurwitz, M.; Fleming, E. L.; Newman, P. A.; Li, F.; Mlawer, E. J.; Cady-Pereira, K. E.; Bailey, R.

    2015-12-01

    Hydrofluorocarbons (HFCs) are second-generation replacements for the chlorofluorocarbons (CFCs), halons and other substances that caused the 'ozone hole'. Atmospheric concentrations of HFCs are projected to increase dramatically in the coming decades. Coupled chemistry-climate simulations forced by these projections show that HFCs will impact the global atmosphere in 2050. As strong radiative forcers, HFCs modulate atmospheric temperature, thereby changing ozone-destroying catalytic cycles and enhancing the stratospheric circulation. These changes lead to a weak depletion of stratospheric ozone. Sensitivity simulations with the NASA Goddard Space Flight Center (GSFC) 2D model show that HFC-125 is the most important contributor to atmospheric change in 2050, as compared with HFC-23, HFC-32, HFC-134a and HFC-143a. Incorporating the interactions between chemistry, radiation and dynamics, for a likely 2050 climate, ozone depletion potentials (ODPs) for HFCs range from 4.3×10⁻⁴ to 3.5×10⁻²; previously HFCs were assumed to have negligible ODPs since these species lack chlorine or bromine atoms. The ozone impacts of HFCs are further investigated with the Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM). The GEOSCCM is a three-dimensional, fully coupled ocean-atmosphere model with interactive stratospheric chemistry. Sensitivity simulations in which CO2, CFC-11 and HCFC-22 are enhanced individually are used as proxies for the atmospheric response to the HFC concentrations expected by the mid-21st century. Sensitivity simulations provide quantitative estimates of the impacts of these greenhouse gases on global total ozone, and can be used to assess their effects on the recovery of Antarctic ozone.

  3. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. Examinations for a clinical coding qualification began in 1999; in 2004, approximately 200 people took the qualification. Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships.

  4. A multi-platform linking code for fuel burnup and radiotoxicity analysis

    NASA Astrophysics Data System (ADS)

    Cunha, R.; Pereira, C.; Veloso, M. A. F.; Cardoso, F.; Costa, A. L.

    2014-02-01

    A linking code between ORIGEN2.1 and MCNP has been developed at the Departamento de Engenharia Nuclear/UFMG to calculate coupled neutronic/isotopic results for nuclear systems and to produce a large number of criticality, burnup and radiotoxicity results. In its previous version, it evaluated the isotopic composition evolution in a Heat Pipe Power System model as well as the radiotoxicity and radioactivity during lifetime cycles. In the new version, the code presents features such as multi-platform execution and automatic results analysis. Improvements made in the code allow it to perform simulations in a simpler and faster way without compromising accuracy. Initially, the code generates a new input for MCNP based on the decisions of the user. After that, MCNP is run and data, such as recoverable energy per prompt fission neutron, reaction rates and keff, are automatically extracted from the output and used to calculate neutron flux and cross sections. These data are then used to construct new ORIGEN inputs, one for each cell in the core. Each new input is run on ORIGEN and generates outputs that represent the complete isotopic composition of the core on that time step. The results show good agreement between GB (Coupled Neutronic/Isotopic code) and Monteburns (Automated, Multi-Step Monte Carlo Burnup Code System), developed by the Los Alamos National Laboratory.
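
    The workflow described above (write an MCNP input, run it, extract keff, fluxes, and reaction-rate data, then build and run one ORIGEN input per cell) can be sketched as a simple driver loop. The sketch below is a generic illustration under assumed file names and stubbed I/O helpers, not the GB code itself; the call to the transport executable is shown commented out.

        # Generic sketch of an MCNP/ORIGEN coupling loop like the one described above.
        # Executable names, file names, and the stub helpers are assumptions, not GB code.
        import subprocess

        def write_transport_input(step, compositions):
            path = f"step{step}.inp"
            # ... write an MCNP deck reflecting the current cell compositions (stub) ...
            return path

        def parse_transport_output(path):
            # ... extract keff, per-cell flux, and one-group cross sections (stub) ...
            return 1.0, {}, {}

        def deplete_cell(cell, flux, xs_1g, days):
            # ... build an ORIGEN input for this cell, run it, parse the new inventory (stub) ...
            return {}

        def burnup_step(step, cells, compositions, days):
            inp = write_transport_input(step, compositions)
            # subprocess.run(["mcnp6", f"i={inp}"], check=True)      # transport solve
            keff, flux, xs_1g = parse_transport_output(inp + ".out")
            new_comp = {c: deplete_cell(c, flux.get(c), xs_1g.get(c), days) for c in cells}
            return keff, new_comp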

  5. Depletion optimization of lumped burnable poisons in pressurized water reactors

    SciTech Connect

    Kodah, Z.H.

    1982-01-01

    Techniques were developed to construct a set of basic poison depletion curves which deplete in a monotonic manner. These curves were combined to match a required optimized depletion profile by utilizing either linear or non-linear programming methods. Three computer codes, LEOPARD, XSDRN, and EXTERMINATOR-2, were used in the analyses. A depletion routine was developed and incorporated into the XSDRN code to allow the depletion of fuel, fission products, and burnable poisons. The Three Mile Island Unit-1 reactor core was used in this work as a typical PWR core. Two fundamental burnable poison rod designs were studied. They are a solid cylindrical poison rod and an annular cylindrical poison rod with water filling the central region. These two designs have either a uniform mixture of burnable poisons or lumped spheroids of burnable poisons in the poison region. Boron and gadolinium are the two burnable poisons which were investigated in this project. Thermal self-shielding factor calculations for solid and annular poison rods were conducted. Expressions for overall thermal self-shielding factors for one or more size groups of poison spheroids inside solid and annular poison rods were also derived and studied. Poison spheroids deplete at a slower rate than the poison mixture because each spheroid exhibits some self-shielding effects of its own. The larger the spheroid, the higher the self-shielding effects due to the increase in poison concentration.
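
    As a rough illustration of the curve-combination step described above (not the LEOPARD/XSDRN/EXTERMINATOR-2 workflow itself), a set of monotonic basis depletion curves can be weighted by non-negative least squares so that their sum approximates a required depletion profile; the basis curves and target below are synthetic.

        # Minimal sketch: combine monotonic basis depletion curves to approximate a
        # target poison-worth profile using non-negative least squares.
        # The basis curves and target are synthetic placeholders.
        import numpy as np
        from scipy.optimize import nnls

        burnup = np.linspace(0.0, 1.0, 50)                                        # normalized cycle exposure
        basis = np.column_stack([np.exp(-k * burnup) for k in (1.0, 3.0, 9.0)])   # basic depletion curves
        target = 0.6 * np.exp(-2.0 * burnup)                                      # required depletion profile

        weights, residual = nnls(basis, target)                                   # weights constrained >= 0
        fit = basis @ weights
        print("weights:", weights, "rms error:", np.sqrt(np.mean((fit - target) ** 2)))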

  6. 12. VIEW OF DEPLETED URANIUM INGOT AND MOLDS. DEPLETED URANIUM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. VIEW OF DEPLETED URANIUM INGOT AND MOLDS. DEPLETED URANIUM CASTING OPERATIONS CEASED IN 1988. (11/14/57) - Rocky Flats Plant, Non-Nuclear Production Facility, South of Cottonwood Avenue, west of Seventh Avenue & east of Building 460, Golden, Jefferson County, CO

  7. Depleted Uranium in Repositories

    SciTech Connect

    Haire, M.J.; Croff, A.G.

    1997-12-31

    For uranium to be useful in most fission nuclear reactors, it must be enriched (i.e., the concentration of the fissile isotope 235U must be increased). Therefore, depleted uranium (DU), uranium with less than the naturally occurring concentration of 235U, is a co-product of the enrichment process. Four to six tons of DU exist for every ton of fresh light water reactor fuel. Approximately 407,000 metric tons (t) of DU were stored on U.S. Department of Energy (DOE) sites as of July 1993. If this DU were to be declared surplus, converted to a stable oxide form, and emplaced in a near surface disposal facility, the costs are estimated to be several billion dollars. However, the U.S. Nuclear Regulatory Commission has stated that near surface disposal of large quantities of DU tails is not appropriate. Thus, there is the possibility that disposition via disposal will be in a deep geological repository. One alternative that may significantly reduce the cost of DU disposition is to use it beneficially. In fact, DOE has begun the Beneficial Uses of DU Project to identify large scale uses of DU and to encourage its reuse. Several beneficial uses, many of which involve applications in the repository per se or in managing the wastes to go into the repository, are discussed in this report.
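
    The "four to six tons of DU per ton of fuel" figure follows from the standard enrichment mass balance F*x_f = P*x_p + W*x_w with F = P + W. The quick check below assumes typical feed and tails assays (0.711 wt% and 0.30 wt% 235U) and a few representative product assays; these assays are assumptions, not values from the record.

        # Enrichment mass balance: tails (DU) produced per unit of enriched product.
        # Assumed assays: natural feed 0.711 wt%, tails 0.30 wt% 235U (not from the record).
        def tails_per_product(x_p, x_f=0.00711, x_w=0.0030):
            """From F*x_f = P*x_p + W*x_w and F = P + W, return W/P."""
            feed_per_product = (x_p - x_w) / (x_f - x_w)
            return feed_per_product - 1.0

        for x_p in (0.025, 0.030, 0.035):
            print(f"product assay {x_p*100:.1f} wt%: {tails_per_product(x_p):.1f} t DU per t product")

    For these assumed assays the ratio comes out at roughly four to seven tons of DU per ton of product, consistent with the range quoted above.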

  8. Understanding the haling power depletion (HPD) method

    SciTech Connect

    Levine, S.; Blyth, T.; Ivanov, K.

    2012-07-01

    The Pennsylvania State Univ. (PSU) is using the university version of the Studsvik Scandpower Code System (CMS) for research and education purposes. Preparations have been made to incorporate the CMS into the PSU Nuclear Engineering graduate course 'Nuclear Fuel Management'. The information presented in this paper was developed during the preparation of the material for the course. The Haling Power Depletion (HPD) method was presented in the course for the first time. The HPD method has been criticized as not valid by many in the field even though it has been successfully applied at PSU for the past 20 years. It was noticed that the radial power distribution (RPD) for low leakage cores during depletion remained similar to that of the HPD during most of the cycle. Thus, the HPD may be used conveniently, mainly for low leakage cores. Studies were then made to better understand the HPD, and the results are presented in this paper. Many different core configurations can be computed quickly with the HPD without using Burnable Poisons (BP) to produce several excellent low leakage core configurations that are viable for power production. Once the HPD core configuration is chosen for further analysis, techniques are available for establishing the BP design to prevent violating any of the safety constraints in such HPD calculated cores. In summary, this paper shows that the HPD method can be used to guide the design of low leakage cores. (authors)

  9. The Toxicity of Depleted Uranium

    PubMed Central

    Briner, Wayne

    2010-01-01

    Depleted uranium (DU) is an emerging environmental pollutant that is introduced into the environment primarily by military activity. While depleted uranium is less radioactive than natural uranium, it still retains all the chemical toxicity associated with the original element. In large doses, the kidney is the target organ for the acute chemical toxicity of this metal, producing potentially lethal tubular necrosis. In contrast, chronic low dose exposure to depleted uranium may not produce a clear and defined set of symptoms. Chronic low-dose, or subacute, exposure to depleted uranium alters the appearance of milestones in developing organisms. Adult animals that were exposed to depleted uranium during development display persistent alterations in behavior, even after cessation of depleted uranium exposure. Adult animals exposed to depleted uranium demonstrate altered behaviors and a variety of alterations to brain chemistry. Despite its reduced level of radioactivity, evidence continues to accumulate that depleted uranium, if ingested, may pose a radiologic hazard. The current state of knowledge concerning DU is discussed. PMID:20195447

  10. Ego depletion impairs implicit learning.

    PubMed

    Thompson, Kelsey R; Sanchez, Daniel J; Wesley, Abigail H; Reber, Paul J

    2014-01-01

    Implicit skill learning occurs incidentally and without conscious awareness of what is learned. However, the rate and effectiveness of learning may still be affected by decreased availability of central processing resources. Dual-task experiments have generally found impairments in implicit learning; however, these studies have also shown that certain characteristics of the secondary task (e.g., timing) can complicate the interpretation of these results. To avoid this problem, the current experiments used a novel method to impose resource constraints prior to engaging in skill learning. Ego depletion theory states that humans possess a limited store of cognitive resources that, when depleted, results in deficits in self-regulation and cognitive control. In a first experiment, we used a standard ego depletion manipulation prior to performance of the Serial Interception Sequence Learning (SISL) task. Depleted participants exhibited poorer test performance than did non-depleted controls, indicating that reducing available executive resources may adversely affect implicit sequence learning, expression of sequence knowledge, or both. In a second experiment, depletion was administered either prior to or after training. Participants who reported higher levels of depletion before or after training again showed less sequence-specific knowledge on the post-training assessment. However, the results did not allow for clear separation of ego depletion effects on learning versus subsequent sequence-specific performance. These results indicate that performance on an implicitly learned sequence can be impaired by a reduction in executive resources, in spite of learning taking place outside of awareness and without conscious intent.

  11. Stratospheric ozone depletion.

    PubMed

    Rowland, F Sherwood

    2006-05-29

    Solar ultraviolet radiation creates an ozone layer in the atmosphere which in turn completely absorbs the most energetic fraction of this radiation. This process both warms the air, creating the stratosphere between 15 and 50 km altitude, and protects the biological activities at the Earth's surface from this damaging radiation. In the last half-century, the chemical mechanisms operating within the ozone layer have been shown to include very efficient catalytic chain reactions involving the chemical species HO, HO2, NO, NO2, Cl and ClO. The NOX and ClOX chains involve the emission at Earth's surface of stable molecules in very low concentration (N2O, CCl2F2, CCl3F, etc.) which wander in the atmosphere for as long as a century before absorbing ultraviolet radiation and decomposing to create NO and Cl in the middle of the stratospheric ozone layer. The growing emissions of synthetic chlorofluorocarbon molecules cause a significant diminution in the ozone content of the stratosphere, with the result that more solar ultraviolet-B radiation (290-320 nm wavelength) reaches the surface. This ozone loss occurs in the temperate zone latitudes in all seasons, and especially drastically since the early 1980s in the south polar springtime-the 'Antarctic ozone hole'. The chemical reactions causing this ozone depletion are primarily based on atomic Cl and ClO, the product of its reaction with ozone. The further manufacture of chlorofluorocarbons has been banned by the 1992 revisions of the 1987 Montreal Protocol of the United Nations. Atmospheric measurements have confirmed that the Protocol has been very successful in reducing further emissions of these molecules. Recovery of the stratosphere to the ozone conditions of the 1950s will occur slowly over the rest of the twenty-first century because of the long lifetime of the precursor molecules.

  12. Stratospheric ozone depletion

    PubMed Central

    Rowland, F. Sherwood

    2006-01-01

    Solar ultraviolet radiation creates an ozone layer in the atmosphere which in turn completely absorbs the most energetic fraction of this radiation. This process both warms the air, creating the stratosphere between 15 and 50 km altitude, and protects the biological activities at the Earth's surface from this damaging radiation. In the last half-century, the chemical mechanisms operating within the ozone layer have been shown to include very efficient catalytic chain reactions involving the chemical species HO, HO2, NO, NO2, Cl and ClO. The NOX and ClOX chains involve the emission at Earth's surface of stable molecules in very low concentration (N2O, CCl2F2, CCl3F, etc.) which wander in the atmosphere for as long as a century before absorbing ultraviolet radiation and decomposing to create NO and Cl in the middle of the stratospheric ozone layer. The growing emissions of synthetic chlorofluorocarbon molecules cause a significant diminution in the ozone content of the stratosphere, with the result that more solar ultraviolet-B radiation (290–320 nm wavelength) reaches the surface. This ozone loss occurs in the temperate zone latitudes in all seasons, and especially drastically since the early 1980s in the south polar springtime—the ‘Antarctic ozone hole’. The chemical reactions causing this ozone depletion are primarily based on atomic Cl and ClO, the product of its reaction with ozone. The further manufacture of chlorofluorocarbons has been banned by the 1992 revisions of the 1987 Montreal Protocol of the United Nations. Atmospheric measurements have confirmed that the Protocol has been very successful in reducing further emissions of these molecules. Recovery of the stratosphere to the ozone conditions of the 1950s will occur slowly over the rest of the twenty-first century because of the long lifetime of the precursor molecules. PMID:16627294

  13. CRDIAC: Coupled Reactor Depletion Instrument with Automated Control

    SciTech Connect

    Steven K. Logan

    2012-08-01

    When modeling the behavior of a nuclear reactor over time, it is important to understand how the isotopes in the reactor will change, or transmute, over that time. This is especially important in the reactor fuel itself. Many nuclear physics modeling codes model how particles interact in the system, but do not model this over time. Thus, another code is used in conjunction with the nuclear physics code to accomplish this. In this work, the Monte Carlo N-Particle (MCNP) code and the Multi Reactor Transmutation Analysis Utility (MRTAU) were chosen as the codes to use. In this way, MCNP produces the reaction rates in the different isotopes present, and MRTAU uses cross sections generated from these reaction rates to determine how the mass of each isotope is lost or gained. The information passed between these two codes must be reformatted for use by each; for this, a Python 2.7 script was developed to aid the user in getting the information into the correct forms. This newly developed methodology was called the Coupled Reactor Depletion Instrument with Automated Control (CRDIAC). As is the case for any newly developed methodology for modeling physical phenomena, CRDIAC needed to be verified against a similar methodology and validated against data taken from an experiment, in our case AFIP-3. AFIP-3 was a reduced-enrichment plate-type fuel experiment tested in the Advanced Test Reactor (ATR). We verified our methodology against the MCNP Coupled with ORIGEN2 (MCWO) method and validated our work against the Post Irradiation Examination (PIE) data. When compared to MCWO, the difference in concentration of U-235 throughout Cycle 144A was about 1%. When compared to the PIE data, the average bias for end-of-life U-235 concentration was about 2%. These results from CRDIAC therefore agree with the MCWO and PIE data, validating and verifying CRDIAC. CRDIAC provides an alternative to using ORIGEN-based methodology, which is useful because CRDIAC's depletion code, MRTAU, uses every available isotope in its depletion calculations.
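
    The coupling step described above, in which MCNP reaction rates are turned into cross sections for the depletion solver, rests on the one-group collapse sigma = R / (phi * N). The sketch below shows only that relation with placeholder numbers; it is not the CRDIAC script.

        # One-group cross-section collapse implied by the coupling described above:
        # sigma_1g = reaction rate / (scalar flux * number density).  Placeholder values.
        def one_group_xs(reaction_rate, flux, number_density):
            """reaction_rate [reactions/cm^3/s], flux [n/cm^2/s], number_density [atoms/cm^3]
            -> microscopic cross section [cm^2] (multiply by 1e24 for barns)."""
            return reaction_rate / (flux * number_density)

        sigma_cm2 = one_group_xs(reaction_rate=5.0e12, flux=1.0e14, number_density=1.0e21)
        print(f"{sigma_cm2 * 1e24:.1f} barns")   # 50.0 barns for these placeholder numbers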

  14. Testing fully depleted CCD

    NASA Astrophysics Data System (ADS)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick, fully depleted and back illuminated, with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE in the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges. The origin of this pattern can be related to the assembly process. A second one appears in the dark images, and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs exactly in the same position, so our guess is that the pattern is due to electrical fields. Finally, in just two devices, there is a wavelength-dependent spot whose origin could be the result of a defective coating process.
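
    Of the tests listed above, the photon transfer curve is the most algorithmic: the conversion gain in e-/ADU is the ratio of mean signal to shot-noise variance, commonly estimated from the difference of two flat-field frames. The sketch below runs on simulated frames and ignores read noise; it is an illustration of the standard technique, not the PAU calibration pipeline.

        # Minimal photon-transfer sketch: estimate conversion gain (e-/ADU) from a pair
        # of flat-field frames.  The frames are simulated, not PAU camera data.
        import numpy as np

        rng = np.random.default_rng(0)
        true_gain, mean_e = 2.0, 20000.0                   # e-/ADU and mean signal in electrons
        flat1 = rng.poisson(mean_e, (512, 512)) / true_gain
        flat2 = rng.poisson(mean_e, (512, 512)) / true_gain

        mean_adu = 0.5 * (flat1.mean() + flat2.mean())
        var_adu = np.var(flat1 - flat2) / 2.0              # differencing removes fixed-pattern structure
        print(f"gain ~ {mean_adu / var_adu:.2f} e-/ADU")   # recovers ~2.0 for this simulation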

  15. Transequatorial Propagation and Depletion Precursors

    NASA Astrophysics Data System (ADS)

    Miller, E. S.; Bust, G. S.; Kaeppler, S. R.; Frissell, N. A.; Paxton, L. J.

    2014-12-01

    The bottomside equatorial ionosphere in the afternoon and evening sector frequently evolves rapidly from smoothly stratified to violently unstable with large wedges of depleted plasma growing through to the topside on timescales of a few tens of minutes. These depletions have numerous practical impacts on radio propagation, including amplitude scintillation, field-aligned irregularity scatter, HF blackouts, and long-distance transequatorial propagation at frequencies above the MUF. Practical impacts notwithstanding, the pathways and conditions under which depletions form remain a topic of vigorous inquiry some 80 years after their first report. Structuring of the pre-sunset ionosphere---morphology of the equatorial anomalies and long-wavelength undulations of the isodensity contours on the bottomside---are likely to hold some clues to conditions that are conducive to depletion formation. The Conjugate Depletion Experiment is an upcoming transequatorial forward-scatter HF/VHF experiment to investigate pre-sunset undulations and their connection with depletion formation. We will present initial results from the Conjugate Depletion Experiment, as well as a companion analysis of a massive HF propagation data set.

  16. Ego Depletion Impairs Implicit Learning

    PubMed Central

    Thompson, Kelsey R.; Sanchez, Daniel J.; Wesley, Abigail H.; Reber, Paul J.

    2014-01-01

    Implicit skill learning occurs incidentally and without conscious awareness of what is learned. However, the rate and effectiveness of learning may still be affected by decreased availability of central processing resources. Dual-task experiments have generally found impairments in implicit learning; however, these studies have also shown that certain characteristics of the secondary task (e.g., timing) can complicate the interpretation of these results. To avoid this problem, the current experiments used a novel method to impose resource constraints prior to engaging in skill learning. Ego depletion theory states that humans possess a limited store of cognitive resources that, when depleted, results in deficits in self-regulation and cognitive control. In a first experiment, we used a standard ego depletion manipulation prior to performance of the Serial Interception Sequence Learning (SISL) task. Depleted participants exhibited poorer test performance than did non-depleted controls, indicating that reducing available executive resources may adversely affect implicit sequence learning, expression of sequence knowledge, or both. In a second experiment, depletion was administered either prior to or after training. Participants who reported higher levels of depletion before or after training again showed less sequence-specific knowledge on the post-training assessment. However, the results did not allow for clear separation of ego depletion effects on learning versus subsequent sequence-specific performance. These results indicate that performance on an implicitly learned sequence can be impaired by a reduction in executive resources, in spite of learning taking place outside of awareness and without conscious intent. PMID:25275517

  17. "When the going gets tough, who keeps going?" Depletion sensitivity moderates the ego-depletion effect.

    PubMed

    Salmon, Stefanie J; Adriaanse, Marieke A; De Vet, Emely; Fennis, Bob M; De Ridder, Denise T D

    2014-01-01

    Self-control relies on a limited resource that can get depleted, a phenomenon that has been labeled ego-depletion. We argue that individuals may differ in their sensitivity to depleting tasks, and that consequently some people deplete their self-control resource at a faster rate than others. In three studies, we assessed individual differences in depletion sensitivity, and demonstrate that depletion sensitivity moderates ego-depletion effects. The Depletion Sensitivity Scale (DSS) was employed to assess depletion sensitivity. Study 1 employs the DSS to demonstrate that individual differences in sensitivity to ego-depletion exist. Study 2 shows moderate correlations of depletion sensitivity with related self-control concepts, indicating that these scales measure conceptually distinct constructs. Study 3 demonstrates that depletion sensitivity moderates the ego-depletion effect. Specifically, participants who are sensitive to depletion performed worse on a second self-control task, indicating a stronger ego-depletion effect, compared to participants less sensitive to depletion.

  18. Ethical coding.

    PubMed

    Resnik, Barry I

    2009-01-01

    It is ethical, legal, and proper for a dermatologist to maximize income through proper coding of patient encounters and procedures. The overzealous physician can misinterpret reimbursement requirements or receive bad advice from other physicians and cross the line from aggressive coding to coding fraud. Several of the more common problem areas are discussed.

  19. Depleting depletion: Polymer swelling in poor solvent mixtures

    NASA Astrophysics Data System (ADS)

    Mukherji, Debashish; Marques, Carlos; Stuehn, Torsten; Kremer, Kurt

    A polymer collapses in a solvent when the solvent particles dislike monomers more than the repulsion between monomers. This leads to an effective attraction between monomers, also referred to as depletion induced attraction. This attraction is the key factor behind standard polymer collapse in poor solvents. Strikingly, even if a polymer exhibits poor solvent condition in two different solvents, it can also swell in mixtures of these two poor solvents. This collapse-swelling-collapse scenario is displayed by poly(methyl methacrylate) (PMMA) in aqueous alcohol. Using molecular dynamics simulations of a thermodynamically consistent generic model and theoretical arguments, we unveil the microscopic origin of this phenomenon. Our analysis suggests that a subtle interplay of the bulk solution properties and the local depletion forces reduces depletion effects, thus dictating polymer swelling in poor solvent mixtures.

  20. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to (1) show a plan for using uplink coding and describe its benefits; (2) define possible solutions and their applicability to different types of uplink, including emergency uplink; (3) concur with our conclusions so we can embark on a plan to use the proposed uplink system; (4) identify the need for the development of appropriate technology and its infusion in the DSN; and (5) gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14: Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  1. Fully depleted back illuminated CCD

    DOEpatents

    Holland, Stephen Edward

    2001-01-01

    A backside illuminated charge coupled device (CCD) is formed of a relatively thick high resistivity photon sensitive silicon substrate, with frontside electronic circuitry, and an optically transparent backside ohmic contact for applying a backside voltage which is at least sufficient to substantially fully deplete the substrate. A greater bias voltage which overdepletes the substrate may also be applied. One way of applying the bias voltage to the substrate is by physically connecting the voltage source to the ohmic contact. An alternate way of applying the bias voltage to the substrate is to physically connect the voltage source to the frontside of the substrate, at a point outside the depletion region. Thus both frontside and backside contacts can be used for backside biasing to fully deplete the substrate. Also, high resistivity gaps around the CCD channels and electrically floating channel stop regions can be provided in the CCD array around the CCD channels. The CCD array forms an imaging sensor useful in astronomy.

  2. Charge depletion in organic heterojunction

    NASA Astrophysics Data System (ADS)

    Ng, T. W.; Lo, M. F.; Lee, S. T.; Lee, C. S.

    2012-03-01

    Until now, two types of organic-organic heterojunction (OHJ) have been observed in P-N junctions formed between undoped organic semiconductors. Charge transfer across these OHJs is either negligible or involves electron transfer from the P-type to the N-type material, leading to charge accumulation near the interface. Here, we observed that the junction of 4,4',4''-tris(2-methylphenyl-phenylamino)triphenylamine (m-MTDATA)/bathocuproine (BCP) shows a third behavior. Electrons in BCP (N-type) transfer to m-MTDATA (P-type), leading to depletion of mobile majority carriers near the junction. While "depletion junctions" are typical in inorganic semiconductors, there have been no reports of them in undoped OHJs. The formation mechanism of depletion OHJs and fundamental differences between inorganic and organic HJs are discussed.

  3. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  4. DNA codes

    SciTech Connect

    Torney, D. C.

    2001-01-01

    We have begun to characterize a variety of codes, motivated by potential implementation as (quaternary) DNA n-sequences, with letters denoted A, C, G, and T. The first codes we studied are the most reminiscent of conventional group codes. For these codes, Hamming similarity was generalized so that the score for matched letters takes more than one value, depending upon which letters are matched [2]. These codes consist of n-sequences satisfying an upper bound on the similarities, summed over the letter positions, of distinct codewords. We chose similarity 2 for matches of letters A and T and 3 for matches of the letters C and G, providing a rough approximation to double-strand bond energies in DNA. An inherent novelty of DNA codes is 'reverse complementation'. The latter may be defined, as follows, not only for alphabets of size four, but, more generally, for any even-size alphabet. All that is required is a matching of the letters of the alphabet: a partition into pairs. Then, the reverse complement of a codeword is obtained by reversing the order of its letters and replacing each letter by its match. For DNA, the matching is AT/CG because these are the Watson-Crick bonding pairs. Reversal arises because two DNA sequences form a double strand with opposite relative orientations. Thus, as will be described in detail, because in vitro decoding involves the formation of double-stranded DNA from two codewords, it is reasonable to assume - for universal applicability - that the reverse complement of any codeword is also a codeword. In particular, self-reverse complementary codewords are expressly forbidden in reverse-complement codes. Thus, an appropriate distance between all pairs of codewords must, when large, effectively prohibit binding between the respective codewords: to form a double strand. Only reverse-complement pairs of codewords should be able to bind. For most applications, a DNA code is to be bi-partitioned, such that the reverse-complementary pairs are separated
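
    The two operations defined above, the weighted similarity (score 2 for matched A or T, 3 for matched C or G, summed over positions) and reverse complementation under the AT/CG pairing, are short enough to state directly; the sketch below is a plain illustration of those definitions, not the authors' code.

        # Sketch of the two operations described above: weighted similarity
        # (2 for matched A or T, 3 for matched C or G) and reverse complementation (AT/CG).
        MATCH_WEIGHT = {"A": 2, "T": 2, "C": 3, "G": 3}
        COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

        def similarity(u, v):
            """Sum of per-position match weights over two equal-length codewords."""
            return sum(MATCH_WEIGHT[a] for a, b in zip(u, v) if a == b)

        def reverse_complement(u):
            return "".join(COMPLEMENT[c] for c in reversed(u))

        print(similarity("ACGTAC", "ACGAAC"))   # 2+3+3+0+2+3 = 13
        print(reverse_complement("ACGTAC"))     # GTACGT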

  5. Sharing code

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing. PMID:25165519

  6. Ozone Depletion from Nearby Supernovae

    NASA Technical Reports Server (NTRS)

    Gehrels, Neil; Laird, Claude M.; Jackman, Charles H.; Cannizzo, John K.; Mattson, Barbara J.; Chen, Wan; Bhartia, P. K. (Technical Monitor)

    2002-01-01

    Estimates made in the 1970s indicated that a supernova occurring within tens of parsecs of Earth could have significant effects on the ozone layer. Since that time, improved tools for detailed modeling of atmospheric chemistry have been developed to calculate ozone depletion, and advances have also been made in theoretical modeling of supernovae and of the resultant gamma ray spectra. In addition, one now has better knowledge of the occurrence rate of supernovae in the galaxy, and of the spatial distribution of progenitors to core-collapse supernovae. We report here the results of two-dimensional atmospheric model calculations that take as input the spectral energy distribution of a supernova, adopting various distances from Earth and various latitude impact angles. In separate simulations we calculate the ozone depletion due to both gamma rays and cosmic rays. We find that for the combined ozone depletion from these effects to roughly double the 'biologically active' UV flux received at the surface of the Earth, the supernova must occur within approximately 8 parsecs.

  7. Ozone depletion, paradigms, and politics

    SciTech Connect

    Iman, R.L.

    1993-10-01

    The destruction of the Earth's protective ozone layer is a prime environmental concern. Industry has responded to this environmental problem by: implementing conservation techniques to reduce the emission of ozone-depleting chemicals (ODCs); using alternative cleaning solvents that have lower ozone depletion potentials (ODPs); developing new, non-ozone-depleting solvents, such as terpenes; and developing low-residue soldering processes. This paper presents an overview of a joint testing program at Sandia and Motorola to evaluate a low-residue (no-clean) soldering process for printed wiring boards (PWBs). Such processes are in widespread use in commercial applications because they eliminate the cleaning operation. The goal of this testing program was to develop a data base that could be used to support changes in the mil-specs. In addition, a joint task force involving industry and the military has been formed to conduct a follow-up evaluation of low-residue processes that encompass the concerns of the tri-services. The goal of the task force is to gain final approval of the low-residue technology for use in military applications.

  8. 26 CFR 1.613-7 - Application of percentage depletion rates provided in section 613(b) to certain taxable years...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TAXES (CONTINUED) Natural Resources § 1.613-7 Application of percentage depletion rates provided in... Code). In the case of mines, wells, or other natural deposits listed in section 613(b), the...

  9. Issues in Stratospheric Ozone Depletion.

    NASA Astrophysics Data System (ADS)

    Lloyd, Steven Andrew

    Following the announcement of the discovery of the Antarctic ozone hole in 1985, there have arisen a multitude of questions pertaining to the nature and consequences of polar ozone depletion. This thesis addresses several of these specific questions, using both computer models of chemical kinetics and the Earth's radiation field as well as laboratory kinetic experiments. A coupled chemical kinetic-radiative numerical model was developed to assist in the analysis of in situ field measurements of several radical and neutral species in the polar and mid-latitude lower stratosphere. Modeling was used in the analysis of enhanced polar ClO, mid-latitude diurnal variation of ClO, and simultaneous measurements of OH, HO2, H2O and O3. Most importantly, such modeling was instrumental in establishing the link between the observed ClO and BrO concentrations in the Antarctic polar vortex and the observed rate of ozone depletion. The principal medical concern of stratospheric ozone depletion is that ozone loss will lead to the enhancement of ground-level UV-B radiation. Global ozone climatology (40°S to 50°N latitude) was incorporated into a radiation field model to calculate the biologically accumulated dosage (BAD) of UV-B radiation, integrated over days, months, and years. The slope of the annual BAD as a function of latitude was found to correspond to epidemiological data for non-melanoma skin cancers for 30°-50°N. Various ozone loss scenarios were investigated. It was found that a small ozone loss in the tropics can provide as much additional biologically effective UV-B as a much larger ozone loss at higher latitudes. Also, for ozone depletions of > 5%, the BAD of UV-B increases exponentially with decreasing ozone levels. An important key player in determining whether polar ozone depletion can propagate into the populated mid-latitudes is chlorine nitrate, ClONO2. As yet this molecule is only indirectly accounted for in computer models and field measurements.

  10. Verification of a Depletion Method in SCALE for the Advanced High Temperature Reactor

    SciTech Connect

    KELLY, RYAN; Ilas, Dan

    2012-01-01

    This study describes a new method utilizing the Dancoff factor to model a non-standard TRISO fuel form characteristic of the AHTR reactor design concept for depletion analysis using the TRITON sequence of SCALE and the validation of this method by code-to-code comparisons. The fuel used in AHTR has the TRISO particles concentrated along the edges of a slab fuel element. This particular geometry prevented the use of a standard DOUBLEHET treatment, previously developed in SCALE to handle NGNP-designed fuel. The new method permits fuel depletion on complicated geometries that traditionally can be handled only by continuous energy based depletion code systems. The method was initially tested on a fuel design typical of the NGNP, where the DOUBLEHET treatment is available. A more comprehensive study was performed using the VESTA code that uses the continuous energy MCNP5 code as a transport solver and ORIGEN2.2 code for depletion calculations. Comparisons of the results indicate good agreement of whole core characteristics, such as the multiplication factor, and the isotopics, including their spatial distribution. Key isotopes analyzed included 235U, 239Pu, 240Pu and 241Pu. The results from this study indicate that the Dancoff factor method can generate estimates of core characteristics with reasonable precision for scoping studies of configurations where the DOUBLEHET treatment is unavailable.

  11. Verification of a Depletion Method in SCALE for the Advanced High Temperature Reactor

    SciTech Connect

    KELLY, RYAN; Ilas, Dan

    2013-01-01

    This study describes a new approach employing the Dancoff correction method to model the TRISO-based fuel form used by the Advanced High-Temperature Reactor (AHTR) reactor design concept. The Dancoff correction method is used to perform isotope depletion analysis using the TRITON sequence of SCALE and is verified by code-to-code comparisons. The current AHTR fuel design has TRISO particles concentrated along the edges of a slab fuel element. This geometry prevented the use of the DOUBLEHET treatment, previously developed in SCALE to model spherical and cylindrical fuel. The new method permits fuel depletion on complicated geometries that traditionally can be handled only by continuous energy based depletion code systems. The method was initially tested on a fuel configuration typical of the Next Generation Nuclear Plant (NGNP), where DOUBLEHET treatment is possible. A confirmatory study was performed on the AHTR reference core geometry using the VESTA code, which uses the continuous energy MCNP5 code as a transport solver and ORIGEN2.2 code for depletion calculations. Comparisons of the results indicate good agreement of whole core characteristics, such as the multiplication factor and the isotopics, including their spatial distribution. Key isotopes analyzed included 235U, 239Pu, 240Pu, and 241Pu. The results from this study indicate that the Dancoff factor method can generate estimates of core characteristics with reasonable precision for scoping studies of configurations where DOUBLEHET treatment cannot be performed.
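
    Both verification records above report code-to-code agreement for key isotopes (235U, 239Pu, 240Pu, 241Pu). A comparison of this kind typically reduces to relative differences per nuclide, as in the sketch below; the concentrations shown are placeholders, not TRITON or VESTA output.

        # Relative differences for a code-to-code isotopic comparison like the one above.
        # Concentrations (atoms/b-cm) are placeholders, not TRITON/VESTA results.
        triton = {"U-235": 3.10e-4, "Pu-239": 1.52e-4, "Pu-240": 6.03e-5, "Pu-241": 3.11e-5}
        vesta  = {"U-235": 3.07e-4, "Pu-239": 1.55e-4, "Pu-240": 5.95e-5, "Pu-241": 3.18e-5}

        for nuclide in triton:
            rel = 100.0 * (triton[nuclide] - vesta[nuclide]) / vesta[nuclide]
            print(f"{nuclide}: {rel:+.2f} %")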

  12. Physiological implications of anthropogenic environmental calcium depletion

    Treesearch

    Catherine H. Borer; Paul G. Schaberg; Donald H. DeHayes; Gary J. Hawley

    2001-01-01

    Recent evidence indicates that numerous anthropogenic factors can deplete calcium (Ca) from forested ecosystems. Although it is difficult to quantify the extent of this depletion, some reports indicate that the magnitude of Ca losses may be substantial. The potential for Ca depletion raises important questions about tree health. Only a fraction of foliar Ca is...

  13. Exposure to nature counteracts aggression after depletion.

    PubMed

    Wang, Yan; She, Yihan; Colarelli, Stephen M; Fang, Yuan; Meng, Hui; Chen, Qiuju; Zhang, Xin; Zhu, Hongwei

    2017-08-31

    Acts of self-control are more likely to fail after previous exertion of self-control, known as the ego depletion effect. Research has shown that depleted participants behave more aggressively than non-depleted participants, especially after being provoked. Although exposure to nature (e.g., a walk in the park) has been predicted to replenish resources common to executive functioning and self-control, the extent to which exposure to nature may counteract the depletion effect on aggression has yet to be determined. The present study investigated the effects of exposure to nature on aggression following depletion. Aggression was measured by the intensity of noise blasts participants delivered to an ostensible opponent in a competition reaction-time task. As predicted, an interaction occurred between depletion and environmental manipulations for provoked aggression. Specifically, depleted participants behaved more aggressively in response to provocation than non-depleted participants in the urban condition. However, provoked aggression did not differ between depleted and non-depleted participants in the natural condition. Moreover, within the depletion condition, participants in the natural condition had lower levels of provoked aggression than participants in the urban condition. This study suggests that a brief period of nature exposure may restore self-control and help depleted people regain control over aggressive urges. © 2017 Wiley Periodicals, Inc.

  14. Measured and calculated fast neutron spectra in a depleted uranium and lithium hydride shielded reactor

    NASA Technical Reports Server (NTRS)

    Lahti, G. P.; Mueller, R. A.

    1973-01-01

    Measurements of MeV neutron spectra were made at the surface of a lithium hydride and depleted uranium shielded reactor. Four shield configurations were considered: these were assembled progressively with cylindrical shells of 5-centimeter-thick depleted uranium, 13-centimeter-thick lithium hydride, 5-centimeter-thick depleted uranium, 13-centimeter-thick lithium hydride, 5-centimeter-thick depleted uranium, and 3-centimeter-thick depleted uranium. Measurements were made with an NE-218 scintillation spectrometer; proton pulse height distributions were differentiated to obtain neutron spectra. Calculations were made using the two-dimensional discrete ordinates code DOT and ENDF/B (version 3) cross sections. Good agreement between measured and calculated spectral shape was observed. Absolute measured and calculated fluxes were within 50 percent of one another; observed discrepancies in absolute flux may be due to cross section errors.
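
    The reduction described above, differentiating proton pulse-height distributions to obtain neutron spectra, relies on the fact that an idealized proton-recoil detector responds to monoenergetic neutrons with a flat recoil distribution from zero up to the neutron energy, so the incident spectrum is roughly proportional to -E times the slope of the pulse-height distribution. The sketch below applies only the differentiation and energy weighting to synthetic data; it omits the n-p cross-section, light-output, and resolution corrections a real NE-218 analysis requires.

        # Idealized proton-recoil reduction: neutron spectrum ~ -E * dM/dE.  The n-p
        # cross-section, light-output, and resolution corrections are omitted, and the
        # pulse-height distribution below is synthetic, not NE-218 data.
        import numpy as np

        energy = np.linspace(0.5, 10.0, 96)                                         # MeV grid
        pulse_height_dist = np.exp(-energy / 2.0) * (1.0 + 0.05 * np.sin(energy))   # synthetic M(E)

        neutron_spectrum = -energy * np.gradient(pulse_height_dist, energy)
        neutron_spectrum = np.clip(neutron_spectrum, 0.0, None)                     # suppress negative noise
        print(neutron_spectrum[:3])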

  15. The Case of Ozone Depletion

    NASA Technical Reports Server (NTRS)

    Lambright, W. Henry

    2005-01-01

    While the National Aeronautics and Space Administration (NASA) is widely perceived as a space agency, since its inception NASA has had a mission dedicated to the home planet. Initially, this mission involved using space to better observe and predict weather and to enable worldwide communication. Meteorological and communication satellites showed the value of space for earthly endeavors in the 1960s. In 1972, NASA launched Landsat, and the era of earth-resource monitoring began. At the same time, in the late 1960s and early 1970s, the environmental movement swept throughout the United States and most industrialized countries. The first Earth Day event took place in 1970, and the government generally began to pay much more attention to issues of environmental quality. Mitigating pollution became an overriding objective for many agencies. NASA's existing mission to observe planet Earth was augmented in these years and directed more toward environmental quality. In the 1980s, NASA sought to plan and establish a new environmental effort that eventuated in the 1990s with the Earth Observing System (EOS). The Agency was able to make its initial mark via atmospheric monitoring, specifically ozone depletion. An important policy stimulus in many respects, ozone depletion spawned the Montreal Protocol of 1987 (the most significant international environmental treaty then in existence). It also was an issue critical to NASA's history that served as a bridge linking NASA's weather and land-resource satellites to NASA's concern for the global changes affecting the home planet. Significantly, as a global environmental problem, ozone depletion underscored the importance of NASA's ability to observe Earth from space. Moreover, the NASA management team's ability to apply large-scale research efforts and mobilize the talents of other agencies and the private sector illuminated its role as a lead agency capable of crossing organizational boundaries as well as the science-policy divide.

  16. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques, and is often used interchangeably with speech coding, is voice coding. This term is more generic in the sense that the

  17. Nature's Code

    NASA Astrophysics Data System (ADS)

    Hill, Vanessa J.; Rowlands, Peter

    2008-10-01

    We propose that the mathematical structures related to the `universal rewrite system' define a universal process applicable to Nature, which we may describe as `Nature's code'. We draw attention here to such concepts as 4 basic units, 64- and 20-unit structures, symmetry-breaking and 5-fold symmetry, chirality, double 3-dimensionality, the double helix, the Van der Waals force and the harmonic oscillator mechanism, and our explanation of how they necessarily lead to self-aggregation, complexity and emergence in higher-order systems. Biological concepts, such as translation, transcription, replication, the genetic code and the grouping of amino acids appear to be driven by fundamental processes of this kind, and it would seem that the Platonic solids, pentagonal symmetry and Fibonacci numbers have significant roles in organizing `Nature's code'.

  18. Show Code.

    PubMed

    Shalev, Daniel

    2017-01-01

    "Let's get one thing straight: there is no such thing as a show code," my attending asserted, pausing for effect. "You either try to resuscitate, or you don't. None of this halfway junk." He spoke so loudly that the two off-service consultants huddled at computers at the end of the unit looked up… We did four rounds of compressions and pushed epinephrine twice. It was not a long code. We did good, strong compressions and coded this man in earnest until the end. Toward the final round, though, as I stepped up to do compressions, my attending looked at me in a deep way. It was a look in between willing me as some object under his command and revealing to me everything that lay within his brash, confident surface but could not be spoken. © 2017 The Hastings Center.

  19. Biomedical consequences of ozone depletion

    NASA Astrophysics Data System (ADS)

    Coohill, Thomas P.

    1994-07-01

    It is widely agreed that a portion of the earth's protective stratospheric ozone layer is being depleted. The major effect of this ozone loss will be an increase in the amount of ultraviolet radiation (UV) reaching the biosphere. This increase will be completely contained within the UVB (290 nm - 320 nm). It is imperative that assessments be made of the effects of this additional UVB on living organisms. This requires a detailed knowledge of the UVB photobiology of these life forms. One analytical technique to aid in the approximations is the construction of UV action spectra for such important biological end-points as human skin cancer, cataracts, immune suppression; plant photosynthesis and crop yields; and aquatic organism responses to UVB, especially the phytoplankton. Combining these action spectra with the known solar spectrum (and estimates for various ozone depletion scenarios) can give rise to a series of effectiveness spectra for these parameters. This manuscript gives a first-approximation, rough estimate of the effectiveness spectra for some of these bioresponses, and a series of crude temporary values for how a 10% ozone loss would affect the above end-points. These are not intended to masquerade as final answers, but rather, to serve as beginning attempts for a process which should be continually refined. It is hoped that these estimates will be of some limited use to agencies, such as government and industry, that have to plan now for changes in human activities that might alter future atmospheric chemistry in a beneficial manner.
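
    The weighting described above, in which an action spectrum is combined with the solar spectrum to produce an effectiveness spectrum, amounts to integrating the product of spectral irradiance and action spectrum over the UVB band. The sketch below uses synthetic spectra and a simple rectangle-rule sum; it illustrates the bookkeeping only, not the author's dose estimates.

        # Biologically effective dose = integral over wavelength of (spectral irradiance x
        # action spectrum).  Both spectra below are synthetic stand-ins.
        import numpy as np

        wavelength = np.linspace(290.0, 320.0, 301)           # nm, UVB band
        solar = np.exp((wavelength - 320.0) / 6.0)            # toy irradiance, rising toward 320 nm
        action = np.exp(-(wavelength - 290.0) / 8.0)          # toy action spectrum, falling with wavelength

        dw = wavelength[1] - wavelength[0]
        effective_dose = float(np.sum(solar * action) * dw)   # rectangle-rule integration
        print(f"relative biologically effective dose: {effective_dose:.3f}")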

  20. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  1. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  2. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  3. Depleted Argon from Underground Sources

    SciTech Connect

    Back, H. O.; Galbiati, C.; Goretti, A.; Loer, B.; Montanari, D.; Mosteiro, P.; Alexander, T.; Alton, A.; Rogers, H.; Kendziora, C.; Pordes, S.

    2011-04-27

    Argon is a strong scintillator and an ideal target for Dark Matter detection; however, 39Ar contamination in atmospheric argon from cosmic ray interactions limits the size of liquid argon dark matter detectors due to pile-up. Argon from deep underground is depleted in 39Ar due to the cosmic ray shielding of the earth. In Cortez, Colorado, a CO2 well has been discovered to contain approximately 600 ppm of argon as a contamination in the CO2. We first concentrate the argon locally to 3% in an Ar, N2, and He mixture, from the CO2 through chromatographic gas separation, and then the N2 and He will be removed by continuous distillation to purify the argon. We have collected 26 kg of argon from the CO2 facility and a cryogenic distillation column is under construction at Fermilab to further purify the argon.

  4. Depleted uranium disposal options evaluation

    SciTech Connect

    Hertzler, T.J.; Nishimoto, D.D.; Otis, M.D.

    1994-05-01

    The Department of Energy (DOE), Office of Environmental Restoration and Waste Management, has chartered a study to evaluate alternative management strategies for depleted uranium (DU) currently stored throughout the DOE complex. Historically, DU has been maintained as a strategic resource because of uses for DU metal and potential uses for further enrichment or for uranium oxide as breeder reactor blanket fuel. This study has focused on evaluating the disposal options for DU if it were considered a waste. This report is in no way declaring these DU reserves a "waste," but is intended to provide baseline data for comparison with other management options for use of DU. Topics considered in this report include retrievable disposal, permanent disposal, health hazards, radiation toxicity, and chemical toxicity.

  5. Method for depleting BWRs using optimal control rod patterns

    SciTech Connect

    Taner, M.S.; Levine, S.H. ); Hsiao, M.Y. )

    1991-01-01

    Control rod (CR) programming is an essential core management activity for boiling water reactors (BWRs). After establishing a core reload design for a BWR, CR programming is performed to develop a sequence of exposure-dependent CR patterns that assure the safe and effective depletion of the core through a reactor cycle. A time-variant target power distribution approach has been assumed in this study. The authors have developed OCTOPUS to implement a new two-step method for designing semioptimal CR programs for BWRs. The optimization procedure of OCTOPUS is based on the method of approximation programming and uses the SIMULATE-E code for nucleonics calculations.
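
    As a heavily simplified stand-in for the pattern-search step implied above (it is not the method of approximation programming used by OCTOPUS), one can score candidate control rod patterns at a given exposure point by the least-squares deviation of their predicted radial power distribution from the target; the predicted distributions below are placeholders for what a nodal code such as SIMULATE-E would supply.

        # Simplified pattern selection at one exposure step: choose the candidate control
        # rod pattern whose predicted radial power distribution is closest (least squares)
        # to the target.  Predicted RPDs are placeholders, not SIMULATE-E results.
        import numpy as np

        target_rpd = np.array([0.92, 1.05, 1.10, 0.93])            # target distribution, normalized
        candidates = {
            "pattern_A": np.array([0.95, 1.02, 1.08, 0.95]),
            "pattern_B": np.array([0.88, 1.10, 1.12, 0.90]),
            "pattern_C": np.array([0.93, 1.04, 1.11, 0.92]),
        }

        best = min(candidates, key=lambda p: np.sum((candidates[p] - target_rpd) ** 2))
        print("selected:", best)   # pattern_C for these placeholder numbers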

  6. High-voltage-compatible, fully depleted CCDs

    SciTech Connect

    Holland, Stephen E.; Bebek, Chris J.; Dawson, Kyle S.; Emes, JohnE.; Fabricius, Max H.; Fairfield, Jessaym A.; Groom, Don E.; Karcher, A.; Kolbe, William F.; Palaio, Nick P.; Roe, Natalie A.; Wang, Guobin

    2006-05-15

    We describe charge-coupled device (CCD) development activities at the Lawrence Berkeley National Laboratory (LBNL). Back-illuminated CCDs fabricated on 200-300 μm thick, fully depleted, high-resistivity silicon substrates are produced in partnership with a commercial CCD foundry. The CCDs are fully depleted by the application of a substrate bias voltage. Spatial resolution considerations require operation of thick, fully depleted CCDs at high substrate bias voltages. We have developed CCDs that are compatible with substrate bias voltages of at least 200 V. This improves spatial resolution for a given thickness, and allows for full depletion of thicker CCDs than previously considered. We have demonstrated full depletion of 650-675 μm thick CCDs, with potential applications in direct x-ray detection. In this work we discuss the issues related to high-voltage operation of fully depleted CCDs, as well as experimental results on high-voltage-compatible CCDs.
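
    The connection between substrate thickness and the bias needed for full depletion follows from the textbook relation V_fd ~ q*N*d^2/(2*eps_Si) for a uniformly doped substrate. The estimate below assumes a doping density of about 1e12 cm^-3, representative of high-resistivity silicon but not a number taken from the record.

        # Full-depletion voltage estimate for a uniformly doped silicon substrate:
        # V_fd ~ q * N_d * d^2 / (2 * eps_Si).  The doping density is an assumed
        # high-resistivity value (~1e12 cm^-3), not a number from the record.
        Q = 1.602e-19               # elementary charge, C
        EPS_SI = 11.7 * 8.854e-14   # silicon permittivity, F/cm

        def full_depletion_voltage(thickness_um, doping_cm3=1.0e12):
            d_cm = thickness_um * 1.0e-4
            return Q * doping_cm3 * d_cm**2 / (2.0 * EPS_SI)

        for t in (200.0, 300.0, 650.0):
            print(f"{t:.0f} um: ~{full_depletion_voltage(t):.0f} V")   # tens to hundreds of volts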

  7. Ego depletion increases risk-taking.

    PubMed

    Fischer, Peter; Kastenmüller, Andreas; Asal, Kathrin

    2012-01-01

    We investigated how the availability of self-control resources affects risk-taking inclinations and behaviors. We proposed that risk-taking often occurs from suboptimal decision processes and heuristic information processing (e.g., when a smoker suppresses or neglects information about the health risks of smoking). Research revealed that depleted self-regulation resources are associated with reduced intellectual performance and reduced abilities to regulate spontaneous and automatic responses (e.g., control aggressive responses in the face of frustration). The present studies transferred these ideas to the area of risk-taking. We propose that risk-taking is increased when individuals find themselves in a state of reduced cognitive self-control resources (ego-depletion). Four studies supported these ideas. In Study 1, ego-depleted participants reported higher levels of sensation seeking than non-depleted participants. In Study 2, ego-depleted participants showed higher levels of risk-tolerance in critical road traffic situations than non-depleted participants. In Study 3, we ruled out two alternative explanations for these results: neither cognitive load nor feelings of anger mediated the effect of ego-depletion on risk-taking. Finally, Study 4 clarified the underlying psychological process: ego-depleted participants feel more cognitively exhausted than non-depleted participants and thus are more willing to take risks. Discussion focuses on the theoretical and practical implications of these findings.

  8. Beneficial Uses of Depleted Uranium

    SciTech Connect

    Brown, C.; Croff, A.G.; Haire, M. J.

    1997-08-01

    Naturally occurring uranium contains 0.71 wt% {sup 235}U. In order for the uranium to be useful in most fission reactors, it must be enriched; that is, the concentration of the fissile isotope {sup 235}U must be increased. Depleted uranium (DU) is a co-product of the processing of natural uranium to produce enriched uranium, and DU has a {sup 235}U concentration of less than 0.71 wt%. In the United States, essentially all of the DU inventory is in the chemical form of uranium hexafluoride (UF{sub 6}) and is stored in large cylinders above ground. If this co-product material were to be declared surplus, converted to a stable oxide form, and disposed of, the costs are estimated to be several billion dollars. Only small amounts of DU have been beneficially reused to date. The U.S. Department of Energy (DOE) has begun the Beneficial Uses of DU Project to identify large-scale uses of DU and encourage its reuse, with the primary purpose of potentially reducing the cost and expediting the disposition of the DU inventory. This paper discusses the inventory of DU and its rate of increase; DU disposition options; beneficial use options; a preliminary cost analysis; and major technical, institutional, and regulatory issues to be resolved.

  9. Depleted argon from underground sources

    SciTech Connect

    Back, H.O.; Alton, A.; Calaprice, F.; Galbiati, C.; Goretti, A.; Kendziora, C.; Loer, B.; Montanari, D.; Mosteiro, P.; Pordes, S.; /Fermilab

    2011-09-01

    Argon is a powerful scintillator and an excellent medium for the detection of ionization. Its high discrimination power against minimum-ionizing tracks, in favor of the selection of nuclear recoils, makes it an attractive medium for the direct detection of WIMP dark matter. However, cosmogenic {sup 39}Ar contamination in atmospheric argon limits the size of liquid argon dark matter detectors due to pile-up. Because the earth shields cosmic rays, argon from deep underground is depleted in {sup 39}Ar. In Cortez, Colorado, a CO{sub 2} well has been discovered to contain approximately 500 ppm of argon as a contamination in the CO{sub 2}. In order to produce argon for dark matter detectors, we first concentrate the argon locally, by chromatographic gas separation from the CO{sub 2}, to 3-5% in an Ar, N{sub 2}, and He mixture. The N{sub 2} and He will be removed by continuous cryogenic distillation in the Cryogenic Distillation Column recently built at Fermilab. In this talk we will discuss the entire extraction and purification process, with emphasis on the recent commissioning and initial performance of the cryogenic distillation column purification.

  10. Development of the MCNPX depletion capability: A Monte Carlo linked depletion method that automates the coupling between MCNPX and CINDER90 for high fidelity burnup calculations

    NASA Astrophysics Data System (ADS)

    Fensin, Michael Lorne

    Monte Carlo-linked depletion methods have gained recent interest due to the ability to more accurately model complex 3-dimensional geometries and better track the evolution of the temporal nuclide inventory by simulating the actual physical process utilizing continuous-energy coefficients. The integration of CINDER90 into the MCNPX Monte Carlo radiation transport code provides a high-fidelity, completely self-contained, Monte-Carlo-linked depletion capability in a well-established, widely accepted Monte Carlo radiation transport code that is compatible with most nuclear criticality (KCODE) particle tracking features in MCNPX. MCNPX depletion tracks all necessary reaction rates and follows as many isotopes as cross section data permits in order to achieve a highly accurate temporal nuclide inventory solution. This work chronicles relevant nuclear history, surveys current methodologies of depletion theory, details the methodology applied in MCNPX and provides benchmark results for three independent OECD/NEA benchmarks. Relevant nuclear history, from the Oklo reactor two billion years ago to the current major United States nuclear fuel cycle development programs, is addressed in order to supply the motivation for the development of this technology. A survey of current reaction rate and temporal nuclide inventory techniques is then provided to offer justification for the depletion strategy applied within MCNPX. The MCNPX depletion strategy is then dissected and each code feature is detailed, chronicling the methodology development from the original linking of MONTEBURNS and MCNP to the most recent public release of the integrated capability (MCNPX 2.6.F). Calculation results of the OECD/NEA Phase IB benchmark, H. B. Robinson benchmark and OECD/NEA Phase IVB are then provided. The acceptable results of these calculations offer sufficient confidence in the predictive capability of the MCNPX depletion method. This capability sets up a significant foundation, in a well established
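
    As a rough illustration of the coupling pattern described above (tally fluxes and one-group reaction rates with the transport code, solve the nuclide-inventory equations over the burnup step, update compositions, repeat), the sketch below shows the generic loop with a simple matrix-exponential inventory solve. It is only a schematic under stated assumptions: run_transport is a placeholder for an MCNPX/KCODE calculation, and CINDER90's actual chain solver and data handling are not reproduced.

```python
# Schematic transport-depletion coupling loop (generic pattern only; not MCNPX's API).
import numpy as np
from scipy.linalg import expm

def deplete_step(n, flux, dt, lam, sigma_a, transfer):
    """One burnup step: solve dn/dt = A n with a matrix exponential.
    n        nuclide number densities (atoms/cm^3), 1-D array
    lam      decay constants (1/s), 1-D array
    sigma_a  one-group absorption cross sections (cm^2), collapsed from a tally
    transfer transfer[i][j] = fraction of removals from nuclide j that produce i
             (simplification: decay and absorption share one transfer matrix here)
    """
    k = len(n)
    removal = lam + sigma_a * flux                    # total removal rate per atom (1/s)
    A = np.zeros((k, k))
    for j in range(k):
        A[j, j] -= removal[j]
        for i in range(k):
            A[i, j] += transfer[i][j] * removal[j]    # diagonal of transfer should be zero
    return expm(A * dt) @ n

def couple(n0, n_steps, dt, run_transport, lam, transfer):
    """Generic coupling: tally, deplete, update compositions, repeat."""
    n = np.array(n0, dtype=float)
    for _ in range(n_steps):
        flux, sigma_a = run_transport(n)              # placeholder for a Monte Carlo run
        n = deplete_step(n, flux, dt, lam, sigma_a, transfer)
    return n
```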

  11. Benefits of the delta K of depletion benchmarks for burnup credit validation

    SciTech Connect

    Lancaster, D.; Machiels, A.

    2012-07-01

    Pressurized Water Reactor (PWR) burnup credit validation is demonstrated using the benchmarks for quantifying fuel reactivity decrements published as 'Benchmarks for Quantifying Fuel Reactivity Depletion Uncertainty,' EPRI Report 1022909 (August 2011). This demonstration uses the depletion module TRITON available in the SCALE 6.1 code system, followed by criticality calculations using KENO-Va. The difference between the predicted depletion reactivity and the benchmark's depletion reactivity is a bias for the criticality calculations. The uncertainty in the benchmarks is the depletion reactivity uncertainty. This depletion bias and uncertainty are used with the bias and uncertainty from fresh UO{sub 2} critical experiments to determine the criticality safety limits on the neutron multiplication factor, k{sub eff}. The analysis shows that SCALE 6.1 with the ENDF/B-VII 238-group cross section library supports the use of a depletion bias of only 0.0015 in delta k if cooling is ignored and 0.0025 if cooling is credited. The uncertainty in the depletion bias is 0.0064. Reliance on the ENDF/B-V cross section library produces much larger disagreement with the benchmarks. The analysis covers numerous combinations of depletion and criticality options. In all cases, the historical uncertainty of 5% of the delta k of depletion (the 'Kopp memo') was shown to be conservative for fuel with more than 30 GWD/MTU burnup. Since this historically assumed burnup uncertainty is not a function of burnup, the Kopp memo's recommended bias and uncertainty may be exceeded at low burnups, but its absolute magnitude there is small. (authors)
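
    For orientation, the way such a depletion bias and uncertainty typically enter a burnup-credit limit on the calculated multiplication factor can be sketched as follows; the exact combination rule (arithmetic versus statistical) depends on the licensing methodology and is not necessarily the report's formulation. The numerical values are the ones quoted above.

```latex
% Illustrative combination of fresh-fuel and depletion biases/uncertainties:
k_{\mathrm{eff}}^{\mathrm{calc}}
  + \Delta k_{\mathrm{fresh}} + \Delta k_{\mathrm{dep}}
  + \sqrt{\sigma_{\mathrm{fresh}}^{2} + \sigma_{\mathrm{dep}}^{2}}
  \;\le\; k_{\mathrm{limit}},
\qquad
\Delta k_{\mathrm{dep}} \approx 0.0015 \ \text{(no cooling credit)},\quad
\sigma_{\mathrm{dep}} = 0.0064 .
```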

  12. Depleted uranium--the growing concern.

    PubMed

    Abu-Qare, Aqel W; Abou-Donia, Mohamed B

    2002-01-01

    Recently, several studies have reported on the health and environmental consequences of the use of depleted uranium. Depleted uranium is a heavy metal that is also radioactive. It is commonly used in missiles as a counterweight because of its very high density (1.6 times that of lead). Immediate health risks associated with exposure to depleted uranium include kidney and respiratory problems, with conditions such as kidney stones, chronic cough and severe dermatitis. Long-term risks include lung and bone cancer. Several published reports have implicated exposure to depleted uranium in kidney damage, mutagenicity, cancer, inhibition of bone, neurological deficits, a significant decrease in the pregnancy rate in mice and adverse effects on the reproductive and central nervous systems. Acute poisoning with depleted uranium elicited renal failure that could lead to death. The environmental consequences of its residue will be felt for thousands of years. It is inhaled, absorbed through the skin and eyes, transferred through the placenta into the fetus, distributed into tissues and eliminated in urine. The use of depleted uranium during the Gulf and Kosovo Wars and the crash of a Boeing airplane carrying depleted uranium in Amsterdam in 1992 have been implicated in health concerns related to exposure to depleted uranium.

  13. Depletion GPT-free sensitivity analysis for reactor eigenvalue problems

    SciTech Connect

    Kennedy, C.; Abdel-Khalik, H.

    2013-07-01

    This manuscript introduces a novel approach to solving depletion perturbation theory problems without the need to set up or solve the generalized perturbation theory (GPT) equations. The approach, hereinafter denoted generalized perturbation theory free (GPT-Free), constructs a reduced order model (ROM) using methods based on perturbation theory and computes response sensitivity profiles in a manner that is independent of the number or type of responses, allowing for an efficient computation of sensitivities when many responses are required. Moreover, the reduction error from using the ROM is quantified in the GPT-Free approach by means of a Wilks' order statistics error metric denoted the K-metric. Traditional GPT has been recognized as the most computationally efficient approach for performing sensitivity analyses of models with many input parameters, e.g., when forward sensitivity analyses are computationally intractable. However, most neutronics codes that can solve the fundamental (homogeneous) adjoint eigenvalue problem do not have GPT capabilities unless envisioned during code development. The GPT-Free approach addresses this limitation by requiring only the ability to compute the fundamental adjoint. This manuscript demonstrates the GPT-Free approach for depletion reactor calculations performed in SCALE6 using the 7x7 UAM assembly model. A ROM is developed for the assembly over a time horizon of 990 days. The approach both calculates the reduction error over the lifetime of the simulation using the K-metric and benchmarks the obtained sensitivities using sample calculations. (authors)
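
    A loose numerical analogue of the ROM construction described above is sketched below: sample the forward model at randomly perturbed inputs, fit a linear surrogate, and extract an active subspace with an SVD, from which sensitivities for many responses are obtained at once. This only illustrates the general idea; run_model, the dimensions, and the toy model are hypothetical, and the Wilks'-based K-metric error check is not reproduced.

```python
# Loose sketch of a sampling-based reduced-order-model (active subspace) construction.
import numpy as np

rng = np.random.default_rng(0)

def rom_sketch(run_model, x0, n_samples=50, eps=1e-3, rank=5):
    """Perturb inputs randomly, fit a linear surrogate dR ~ dX @ B, and take an SVD of B."""
    r0 = run_model(x0)
    dX = eps * rng.standard_normal((n_samples, x0.size))     # input perturbations
    dR = np.array([run_model(x0 + dx) - r0 for dx in dX])    # response variations
    B, *_ = np.linalg.lstsq(dX, dR, rcond=None)              # B approximates the transposed Jacobian
    U, s, _ = np.linalg.svd(B, full_matrices=False)          # active directions in input space
    return B.T, U[:, :rank], s                               # sensitivity estimates, basis, spectrum

# Toy usage with a hypothetical 20-parameter, 8-response model.
A = rng.standard_normal((8, 20))
model = lambda x: A @ x + 0.1 * (A @ x) ** 2
sensitivities, basis, spectrum = rom_sketch(model, x0=np.ones(20))
```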

  14. GRESS translation of the ORIGEN2 code

    SciTech Connect

    Pin, F. G.; Wright, R. Q.

    1986-05-01

    ORIGEN2 is a widely used point-depletion and radioactive-decay computer code for use in simulating nuclear fuel cycles and/or spent fuel characteristics. In typical applications the code calculates chain buildup and decay of more than 1600 radionuclides and elements. This report describes the addition to the ORIGEN2 code of an automated sensitivity calculation capability by means of the GRESS precompiler. The GRESS precompiler uses computer calculus to automatically enhance FORTRAN computer codes with derivative-taking capabilities. From these derivatives generated concurrently with the normal results, sensitivities of any variable used in the code with respect to any other variable or input parameter can readily be obtained. The added sensitivity calculation capability is verified on a sample problem involving fuel burnup under specified power, radioactive decay, and material irradiation under specified flux. The accuracy of the GRESS results is demonstrated using comparisons with the results of perturbation analyses.
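
    The "computer calculus" that GRESS applies to FORTRAN source is forward-mode automatic differentiation: every intermediate value carries its derivative with respect to a chosen input, so sensitivities emerge alongside the normal results. A minimal dual-number sketch of the same idea, applied to a one-nuclide depletion expression with made-up constants, is shown below; it is illustrative only and does not reflect GRESS's actual implementation.

```python
# Minimal forward-mode automatic differentiation with dual numbers.
import math

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val, self.der * other.val + self.val * other.der)
    __rmul__ = __mul__
    def __neg__(self):
        return Dual(-self.val, -self.der)

def exp(x):
    """exp for Dual numbers (chain rule applied to the derivative part)."""
    return Dual(math.exp(x.val), math.exp(x.val) * x.der)

# Illustrative one-nuclide depletion N(t) = N0 * exp(-(lam + sigma*phi)*t), with a
# derivative seed of 1 on sigma so that dN/dsigma comes out alongside N itself.
N0, lam, t, phi = 1.0e24, 1.0e-9, 3.0e7, 1.0e14      # made-up constants
sigma = Dual(5.0e-22, 1.0)                            # cm^2; seeded input
N = N0 * exp(-(lam + sigma * phi) * t)
print(N.val, N.der)                                   # value and sensitivity dN/dsigma
```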

  15. High homocysteine induces betaine depletion.

    PubMed

    Imbard, Apolline; Benoist, Jean-François; Esse, Ruben; Gupta, Sapna; Lebon, Sophie; de Vriese, An S; de Baulny, Helene Ogier; Kruger, Warren; Schiff, Manuel; Blom, Henk J

    2015-04-28

    Betaine is the substrate of the liver- and kidney-specific betaine-homocysteine (Hcy) methyltransferase (BHMT), an alternate pathway for Hcy remethylation. We hypothesized that BHMT is a major pathway for homocysteine removal in cases of hyperhomocysteinaemia (HHcy). Therefore, we measured betaine in plasma and tissues from patients and animal models of HHcy of genetic and acquired cause. Plasma was collected from patients presenting with HHcy who were not receiving any Hcy-interfering treatment. Plasma and tissues were collected from rat models of HHcy induced by diet and from a mouse model of cystathionine β-synthase (CBS) deficiency. S-adenosyl-methionine (AdoMet), S-adenosyl-homocysteine (AdoHcy), methionine, betaine and dimethylglycine (DMG) were quantified by ESI-LC-MS/MS. mRNA expression was quantified using quantitative real-time (QRT)-PCR. For all patients with diverse causes of HHcy, plasma betaine concentrations were below the normal values of our laboratory. In the diet-induced HHcy rat model, betaine was decreased in all tissues analysed (liver, brain, heart). In the mouse CBS deficiency model, betaine was decreased in plasma, liver, heart and brain, but was conserved in kidney. Surprisingly, BHMT expression and activity were decreased in liver. However, in kidney, BHMT and SLC6A12 expression was increased in CBS-deficient mice. Chronic HHcy, irrespective of its cause, induces betaine depletion in plasma and tissues (liver, brain and heart), indicating a global decrease in the body betaine pool. In kidney, betaine concentrations were not affected, possibly due to overexpression of the betaine transporter SLC6A12, where betaine may be conserved because of its crucial role as an osmolyte.

  16. High homocysteine induces betaine depletion

    PubMed Central

    Imbard, Apolline; Benoist, Jean-François; Esse, Ruben; Gupta, Sapna; Lebon, Sophie; de Vriese, An S; de Baulny, Helene Ogier; Kruger, Warren; Schiff, Manuel; Blom, Henk J.

    2015-01-01

    Betaine is the substrate of the liver- and kidney-specific betaine-homocysteine (Hcy) methyltransferase (BHMT), an alternate pathway for Hcy remethylation. We hypothesized that BHMT is a major pathway for homocysteine removal in cases of hyperhomocysteinaemia (HHcy). Therefore, we measured betaine in plasma and tissues from patients and animal models of HHcy of genetic and acquired cause. Plasma was collected from patients presenting with HHcy who were not receiving any Hcy-interfering treatment. Plasma and tissues were collected from rat models of HHcy induced by diet and from a mouse model of cystathionine β-synthase (CBS) deficiency. S-adenosyl-methionine (AdoMet), S-adenosyl-homocysteine (AdoHcy), methionine, betaine and dimethylglycine (DMG) were quantified by ESI-LC-MS/MS. mRNA expression was quantified using quantitative real-time (QRT)-PCR. For all patients with diverse causes of HHcy, plasma betaine concentrations were below the normal values of our laboratory. In the diet-induced HHcy rat model, betaine was decreased in all tissues analysed (liver, brain, heart). In the mouse CBS deficiency model, betaine was decreased in plasma, liver, heart and brain, but was conserved in kidney. Surprisingly, BHMT expression and activity were decreased in liver. However, in kidney, BHMT and SLC6A12 expression was increased in CBS-deficient mice. Chronic HHcy, irrespective of its cause, induces betaine depletion in plasma and tissues (liver, brain and heart), indicating a global decrease in the body betaine pool. In kidney, betaine concentrations were not affected, possibly due to overexpression of the betaine transporter SLC6A12, where betaine may be conserved because of its crucial role as an osmolyte. PMID:26182429

  17. Gulf war depleted uranium risks.

    PubMed

    Marshall, Albert C

    2008-01-01

    US and British forces used depleted uranium (DU) in armor-piercing rounds to disable enemy tanks during the Gulf and Balkan Wars. Uranium particulate is generated by DU shell impact, and particulate entrained in air may be inhaled or ingested by troops and nearby civilian populations. As uranium is slightly radioactive and chemically toxic, a number of critics have asserted that DU exposure has resulted in a variety of adverse health effects for exposed veterans and nearby civilian populations. The study described in this paper used mathematical modeling to estimate health risks from exposure to DU during the 1991 Gulf War for both US troops and nearby Iraqi civilians. The analysis found that the risks of DU-induced leukemia or birth defects are far too small to result in an observable increase in these health effects among exposed veterans or Iraqi civilians. The analysis indicated that only a few (approximately 5) US veterans in vehicles accidentally targeted by US tanks received significant exposure levels, resulting in about a 1.4% lifetime risk of DU radiation-induced fatal cancer (compared with about a 24% risk of a fatal cancer from all other causes). These veterans may have also experienced temporary kidney damage. Iraqi children playing for 500 h in DU-destroyed vehicles are predicted to incur a cancer risk of about 0.4%. In vitro and animal tests suggest the possibility of chemically induced health effects from DU internalization, such as immune system impairment. Further study is needed to determine the applicability of these findings for Gulf War exposure to DU. Veterans and civilians who did not occupy DU-contaminated vehicles are unlikely to have internalized quantities of DU significantly in excess of normal internalization of natural uranium from the environment.

  18. Action orientation overcomes the ego depletion effect.

    PubMed

    Dang, Junhua; Xiao, Shanshan; Shi, Yucai; Mao, Lihua

    2015-04-01

    It has been consistently demonstrated that an initial exertion of self-control has a negative influence on people's performance on subsequent self-control tasks. This phenomenon is referred to as the ego depletion effect. Based on action control theory, the current research investigated whether the ego depletion effect could be moderated by individuals' action versus state orientation. Our results showed that only state-oriented individuals exhibited ego depletion. For individuals with action orientation, however, performance was not influenced by initial exertion of self-control. The beneficial effect of action orientation against ego depletion in our experiment results from its facilitation of adaptation to the depleting task. © 2014 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  19. Possible ozone depletions following nuclear explosions

    NASA Technical Reports Server (NTRS)

    Whitten, R. C.; Borucki, W. J.; Turco, R. P.

    1975-01-01

    The degree of depletion of the ozone layer ensuing after delivery of strategic nuclear warheads (5000 and 10,000 Mton) due to production of nitrogen oxides is theoretically assessed. Strong depletions are calculated for 16-km and 26-km altitudes, peaking 1-2 months after detonation and lasting for three years, while a significant depletion at 36 km would peak after one year. Assuming the explosions occur between 30 and 70 deg N, these effects should be much more pronounced in this region than over the Northern Hemisphere as a whole. It is concluded that Hampson's concern on this matter (1974) is well-founded.

  20. Fully Depleted Charge-Coupled Devices

    SciTech Connect

    Holland, Stephen E.

    2006-05-15

    We have developed fully depleted, back-illuminated CCDs that build upon earlier research and development efforts directed towards technology development of silicon-strip detectors used in high-energy-physics experiments. The CCDs are fabricated on the same type of high-resistivity, float-zone-refined silicon that is used for strip detectors. The use of high-resistivity substrates allows for thick depletion regions, on the order of 200-300 um, with corresponding high detection efficiency for near-infrared and soft x-ray photons. We compare the fully depleted CCD to the p-i-n diode upon which it is based, and describe the use of fully depleted CCDs in astronomical and x-ray imaging applications.

  2. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  3. Ray-Based Calculations with DEPLETE of Laser Backscatter in ICF Targets

    SciTech Connect

    Strozzi, D J; Williams, E; Hinkel, D; Froula, D; London, R; Callahan, D

    2008-05-19

    A steady-state model for Brillouin and Raman backscatter along a laser ray path is presented. The daughter plasma waves are treated in the strong damping limit, and have amplitudes given by the (linear) kinetic response to the ponderomotive drive. Pump depletion, inverse-bremsstrahlung damping, bremsstrahlung emission, Thomson scattering off density fluctuations, and whole-beam focusing are included. The numerical code Deplete, which implements this model, is described. The model is compared with traditional linear gain calculations, as well as 'plane-wave' simulations with the paraxial propagation code pF3D. Comparisons with Brillouin-scattering experiments at the Omega Laser Facility show that laser speckles greatly enhance the reflectivity over the Deplete results. An approximate upper bound on this enhancement is given by doubling the Deplete coupling coefficient. Analysis with Deplete of an ignition design for the National Ignition Facility (NIF), with a peak radiation temperature of 285 eV, shows encouragingly low reflectivity. Doubling the coupling to bracket speckle effects suggests a less optimistic picture. Re-absorption of Raman light is seen to be significant in this design.
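
    The generic steady-state form of such ray-path pump-depletion equations in the strongly damped limit can be written as below. This is an illustrative textbook form rather than Deplete's exact system, which additionally treats bremsstrahlung emission, Thomson scattering off density fluctuations, and whole-beam focusing.

```latex
% Pump (I_p, propagating in +z) and backscattered light (I_s, propagating in -z):
\frac{dI_{p}}{dz} = -\,G\,I_{p}I_{s} \;-\; \kappa_{p} I_{p},
\qquad
\frac{dI_{s}}{dz} = -\,G\,I_{p}I_{s} \;+\; \kappa_{s} I_{s},
```

    where G is the coupling coefficient set by the strongly damped (kinetic) plasma-wave response and kappa_p, kappa_s are the inverse-bremsstrahlung damping rates; the sign conventions reflect the counter-propagating seed.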

  4. A definition of depletion of fish stocks

    USGS Publications Warehouse

    Van Oosten, John

    1949-01-01

    Attention was focused on the need for a common and better understanding of the term depletion as applied to the fisheries, in order to eliminate, if possible, the existing inexactness of thought on the subject. Depletion has been confused at various times with at least ten different ideas associated with it but which, as has been pointed out, are not synonymous at all. In defining depletion we must recognize that the term represents a condition and must not be confounded with the cause (overfishing) that leads to this condition or with the symptoms that identify it. Depletion was defined as a reduction, through overfishing, in the level of abundance of the exploitable segment of a stock that prevents the realization of the maximum productive capacity.

  5. [Hepatomioneuropathy secondary to mitochondrial DNA depletion].

    PubMed

    Blanco-Barca, M O; Gómez-Lado, C; Campos-González, Y; Castro-Gago, M

    2007-04-01

    Mitochondrial DNA (mtDNA) depletion is a highly heterogeneous condition characterized by a decreased number of mtDNA copies. The patient is a 22-month-old girl with generalized hypotonia, marked weakness, respiratory failure, arterial hypertension, hyperlactacidemia, hepatosplenomegaly and mild hypertransaminasemia, without hepatic failure or hypoketotic hypoglycemia. Electromyographic findings were consistent with neuromyopathy and muscle biopsy suggested a neurogenic atrophy. Electron microscopy revealed lipid droplets, subsarcolemmal accumulation of mitochondria and glycogen granules. Respiratory chain enzyme activities were normal. A genetic study in muscle showed mtDNA depletion, and the diagnosis of spinal muscular atrophy caused by survival motor neuron gene deletion was excluded. This case might represent a novel phenotype of mtDNA depletion, which could be named the hepatomyoneuropathic form. A normal result for respiratory chain enzymes in muscle does not exclude mtDNA depletion.

  6. Polar stratospheric clouds and ozone depletion

    NASA Technical Reports Server (NTRS)

    Toon, Owen B.; Turco, Richard P.

    1991-01-01

    A review is presented of investigations into the correlation between the depletion of ozone and the formation of polar stratospheric clouds (PSCs). Satellite measurements from Nimbus 7 showed that over the years the depletion from austral spring to austral spring has generally worsened. Approximately 70 percent of the ozone above Antarctica, which equals about 3 percent of the earth's ozone, is lost during September and October. Various hypotheses for ozone depletion are discussed including the theory suggesting that chlorine compounds might be responsible for the ozone hole, whereby chlorine enters the atmosphere as a component of chlorofluorocarbons produced by humans. The three types of PSCs, nitric acid trihydrate, slowly cooling water-ice, and rapidly cooling water-ice clouds act as important components of the Antarctic ozone depletion. It is indicated that destruction of the ozone will be more severe each year for the next few decades, leading to a doubling in area of the Antarctic ozone hole.

  8. Effects of Riverbed Conductance on Stream Depletion

    NASA Astrophysics Data System (ADS)

    Lackey, G.; Neupauer, R. M.; Pitlick, J.

    2012-12-01

    In the western United States and other regions of the world where growing population and changing climates are threatening water supplies, accurate modeling of potential human impacts on water resources is becoming more important. Stream depletion, the reduction of surface water flow due to the extraction of groundwater from a hydraulically connected aquifer, is one of the more direct ways that development can alter water availability, degrade water quality and endanger aquatic habitats. These factors have made the accurate modeling of stream depletion an important step in the process of installing groundwater wells in regions that are susceptible to this phenomenon. Proper estimation of stream depletion requires appropriate parameterization of aquifer and streambed hydraulic properties. Although many studies have conducted numerical investigations to determine stream depletion at specific sites, they typically do not measure streambed hydraulic conductivity (Kr), but rather assume a representative value. In this work, we establish a hypothetical model aquifer that is 2000 m by 1600 m and has a meandering stream running through its center. The Kr of the model stream is varied from 1.0x10^-9 m s^-1 to 1.0x10^-2 m s^-1 in order to determine the sensitivity of the stream depletion calculations to this parameter. It was found that when Kr is in the lower part of this range, slight changes in Kr lead to significant impacts on the calculated stream depletion values. We vary Kr along the stream channel according to naturally occurring patterns and demonstrate that alterations of the parameter over a few orders of magnitude can affect the estimated stream depletion caused by a well at a specified location. The numerical simulations show that the mean value of Kr and its spatial variability along the channel should be realistic to develop an accurate model of stream depletion.
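
    For context, the classical analytical benchmark for this problem is the Glover-Balmer solution for a fully penetrating stream with no streambed resistance; a finite riverbed conductivity Kr, as varied in this study, lowers the depletion below this idealized fraction (Hunt-type solutions account for that resistance). The sketch below evaluates only the simpler Glover-Balmer fraction, with illustrative parameters that are not taken from the study.

```python
# Glover-Balmer stream-depletion fraction for a fully penetrating stream (no
# streambed resistance); illustrative parameters only.
from math import erfc, sqrt

def depletion_fraction(d, T, S, t):
    """Fraction of the pumping rate supplied by the stream after pumping time t.
    d: well-to-stream distance (m), T: transmissivity (m^2/s), S: storativity (-), t: time (s)."""
    return erfc(sqrt(S * d * d / (4.0 * T * t)))

# Example: d = 200 m, T = 5e-3 m^2/s, S = 0.1, after 30 and 365 days of pumping.
for days in (30, 365):
    t = days * 86400.0
    print(days, round(depletion_fraction(200.0, 5.0e-3, 0.1, t), 3))
```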

  9. Depleted uranium: A DOE management guide

    SciTech Connect

    1995-10-01

    The U.S. Department of Energy (DOE) has a management challenge and financial liability in the form of 50,000 cylinders containing 555,000 metric tons of depleted uranium hexafluoride (UF{sub 6}) that are stored at the gaseous diffusion plants. The annual storage and maintenance cost is approximately $10 million. This report summarizes several studies undertaken by the DOE Office of Technology Development (OTD) to evaluate options for long-term depleted uranium management. Based on studies conducted to date, the most likely use of the depleted uranium is for shielding of spent nuclear fuel (SNF) or vitrified high-level waste (HLW) containers. The alternative to finding a use for the depleted uranium is disposal as a radioactive waste. Estimated disposal costs, utilizing existing technologies, range between $3.8 and $11.3 billion, depending on factors such as applicability of the Resource Conservation and Recovery Act (RCRA) and the location of the disposal site. The cost of recycling the depleted uranium in a concrete based shielding in SNF/HLW containers, although substantial, is comparable to or less than the cost of disposal. Consequently, the case can be made that if DOE invests in developing depleted uranium shielded containers instead of disposal, a long-term solution to the UF{sub 6} problem is attained at comparable or lower cost than disposal as a waste. Two concepts for depleted uranium storage casks were considered in these studies. The first is based on standard fabrication concepts previously developed for depleted uranium metal. The second converts the UF{sub 6} to an oxide aggregate that is used in concrete to make dry storage casks.

  10. The influence of fog parameters on aerosol depletion measured in the KAEVER experiments

    SciTech Connect

    Poss, G.; Weber, D.; Fritsche, B.

    1995-12-31

    The release of radioactive aerosols into the environment is one of the most serious hazards in the case of an accident in a nuclear power plant. Many efforts have been made in the past, in numerous experimental programs like NSPP, DEMONA, VANAM, LACE and MARVIKEN, and others are still underway, to improve the knowledge of aerosol behavior and depletion in a reactor containment in order to estimate the possible source term and to validate computer codes. In the German single-compartment KAEVER facility, the influence of size distribution, morphology, composition and solubility on the aerosol behavior is investigated. One of the more specific items is to learn about 'wet depletion', meaning the aerosol depletion behavior in condensing atmospheres. There are no experiments known where the fog parameters, such as droplet size distribution and volume concentration (respectively the airborne liquid water content), have been measured in- and on-line explicitly. To the authors' knowledge, the use of the Battelle FASP photometer, which was developed especially for this reason, for the first time gives insight into condensation behavior under accident-typical thermal-hydraulic conditions. It delivers a basis for code validation in terms of a real comparison of measurements and calculations. The paper presents results from 'wet depletion' aerosol experiments demonstrating how the depletion velocity depends on the fog parameters, and where obviously critical fog parameters seem to change the regime from a 'pseudo dry depletion' at a relative humidity of 100% but with quasi no or very low airborne liquid water content, to a real 'wet depletion' in the presence of fogs with varying densities. Characteristics are outlined of how soluble and insoluble particles as well as aerosol mixtures behave under condensing conditions.

  11. Anatomy of Depleted Interplanetary Coronal Mass Ejections

    NASA Astrophysics Data System (ADS)

    Kocher, M.; Lepri, S. T.; Landi, E.; Zhao, L.; Manchester, W. B., IV

    2017-01-01

    We report a subset of interplanetary coronal mass ejections (ICMEs) containing distinct periods of anomalous heavy-ion charge state composition and peculiar ion thermal properties measured by ACE/SWICS from 1998 to 2011. We label them “depleted ICMEs,” identified by the presence of intervals where C6+/C5+ and O7+/O6+ depart from the direct correlation expected after their freeze-in heights. These anomalous intervals within the depleted ICMEs are referred to as “Depletion Regions.” We find that a depleted ICME would be indistinguishable from all other ICMEs in the absence of the Depletion Region, which has the defining property of significantly low abundances of fully charged species of helium, carbon, oxygen, and nitrogen. Similar anomalies in the slow solar wind were discussed by Zhao et al. We explore two possibilities for the source of the Depletion Region associated with magnetic reconnection in the tail of a CME, using CME simulations of the evolution of two Earth-bound CMEs described by Manchester et al.

  12. A new definition of maternal depletion syndrome.

    PubMed Central

    Winkvist, A; Rasmussen, K M; Habicht, J P

    1992-01-01

    BACKGROUND. Although the term "maternal depletion syndrome" has been commonly used to explain poor maternal and infant health, whether such a syndrome actually exists remains unclear. This uncertainty may be due to the lack of a clear definition of the syndrome and the absence of theoretical frameworks that account for the many factors related to reproductive nutrition. METHODS. We propose a new definition of maternal depletion syndrome within a framework that accounts for potential confounding factors. RESULTS. Our conceptual framework distinguishes between childbearing pattern and inadequate diet as causes of poor maternal health; hence, our definition of maternal depletion syndrome has both biological and practical meaning. The new definition is based on overall change in maternal nutritional status over one reproductive cycle in relation to possible depletion and repletion phases and in relation to initial nutritional status. CONCLUSIONS. The empirical application of this approach should permit the testing of the existence of maternal depletion syndrome in the developing world, and the distinction between populations where family planning will alleviate maternal depletion and those in which an improved diet is also necessary. PMID:1566948

  13. SCALE Continuous-Energy Monte Carlo Depletion with Parallel KENO in TRITON

    SciTech Connect

    Goluoglu, Sedat; Bekar, Kursat B; Wiarda, Dorothea

    2012-01-01

    The TRITON sequence of the SCALE code system is a powerful and robust tool for performing multigroup (MG) reactor physics analysis using either the 2-D deterministic solver NEWT or the 3-D Monte Carlo transport code KENO. However, as with all MG codes, the accuracy of the results depends on the accuracy of the MG cross sections that are generated and/or used. While SCALE resonance self-shielding modules provide rigorous resonance self-shielding, they are based on 1-D models and therefore 2-D or 3-D effects such as heterogeneity of the lattice structures may render final MG cross sections inaccurate. Another potential drawback to MG Monte Carlo depletion is the need to perform resonance self-shielding calculations at each depletion step for each fuel segment that is being depleted. The CPU time and memory required for self-shielding calculations can often eclipse the resources needed for the Monte Carlo transport. This summary presents the results of the new continuous-energy (CE) calculation mode in TRITON. With the new capability, accurate reactor physics analyses can be performed for all types of systems using the SCALE Monte Carlo code KENO as the CE transport solver. In addition, transport calculations can be performed in parallel mode on multiple processors.

  14. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: we show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs; we show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs; we find and classify all 2D homological stabilizer codes; and we find optimal codes among the homological stabilizer codes.
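
    As a concrete example of a code defined purely by the graph it resides on, the sketch below lists the qubit supports of the vertex (X-type) and plaquette (Z-type) stabilizers of Kitaev's toric code on an L x L periodic square lattice; the edge-indexing convention is our own choice for illustration.

```python
# Stabilizer supports of the toric code on an L x L torus. Qubits live on edges;
# edge (x, y, 0) is the horizontal edge leaving vertex (x, y), edge (x, y, 1) the
# vertical one. Each vertex gets an X-type star, each face a Z-type plaquette.
def toric_code_stabilizers(L):
    def edge(x, y, o):
        return 2 * ((x % L) * L + (y % L)) + o

    x_stabs, z_stabs = [], []
    for x in range(L):
        for y in range(L):
            # X on the four edges touching vertex (x, y)
            x_stabs.append({edge(x, y, 0), edge(x, y, 1),
                            edge(x - 1, y, 0), edge(x, y - 1, 1)})
            # Z on the four edges bounding the face whose lower-left corner is (x, y)
            z_stabs.append({edge(x, y, 0), edge(x, y, 1),
                            edge(x, y + 1, 0), edge(x + 1, y, 1)})
    return x_stabs, z_stabs

# 2*L*L qubits and 2*L*L - 2 independent stabilizers leave 2 logical qubits on the torus.
xs, zs = toric_code_stabilizers(4)
```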

  15. New Approach For Prediction Groundwater Depletion

    NASA Astrophysics Data System (ADS)

    Moustafa, Mahmoud

    2017-01-01

    Current approaches to quantifying groundwater depletion involve water balance and satellite gravity. However, the water balance technique includes uncertain estimation of parameters such as evapotranspiration and runoff, while the satellite method consumes time and effort. The work reported in this paper proposes using failure theory in a novel way to predict the depletion of groundwater saturated thickness. An important issue in the proposed failure theory is determining the failure point (the depletion case). The proposed technique uses depth of water, as the net result of recharge/discharge processes in the aquifer, to calculate the remaining saturated thickness resulting from the applied pumping rates in an area and thereby evaluate the groundwater depletion. The Weibull function and Bayesian analysis were used to model and analyze data collected from 1962 to 2009. The proposed methodology was tested in a nonrenewable aquifer with no recharge; consequently, the continuous decline in water depth has been the main criterion used to estimate the depletion. The value of the proposed approach is to predict the probable effect of the currently applied pumping rates on the saturated thickness, based on the remaining saturated thickness data. The limitation of the suggested approach is that it assumes the applied management practices are constant during the prediction period. The study predicted that after 300 years there would be an 80% probability that the saturated aquifer thickness would be depleted. Lifetime (failure) theory can thus give a simple alternative way to predict the depletion of the remaining saturated thickness without time-consuming processes or sophisticated software.
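
    One minimal way to exercise the lifetime idea, with synthetic data standing in for the 1962-2009 record and without the Bayesian step, is sketched below: fit a two-parameter Weibull distribution to depletion ("failure") times and read off the probability of depletion within a 300-year horizon. The numbers are illustrative only.

```python
# Two-parameter Weibull lifetime fit for aquifer depletion times (synthetic data).
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
failure_times_yr = weibull_min.rvs(c=2.0, scale=250.0, size=200, random_state=rng)

# Fit shape and scale with the location fixed at zero (two-parameter Weibull).
c_hat, _, scale_hat = weibull_min.fit(failure_times_yr, floc=0)

# Probability that depletion ("failure") occurs within a 300-year horizon.
p_300 = weibull_min.cdf(300.0, c_hat, loc=0, scale=scale_hat)
print(round(c_hat, 2), round(scale_hat, 1), round(p_300, 2))
```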

  16. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  17. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  18. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  19. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes, reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) properties of amino acids assigned to codons (classical phylogeny), coding stops as X (A1, antitermination/suppressor tRNAs insert unknown residues), or as gaps (A2, no translation, classical stop); and (B) considering only punctuation status (start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflects ribosomal translational dynamics); and 1, -1, and 0 (B3, starts/stops as opposites)). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. Firmicute bacteria Mycoplasma/Spiroplasma and Protozoan (and lower metazoan) mitochondria share codon-amino acid assignments. A1 clusters them with mitochondria, they cluster with the standard genetic code under A2: constraints on amino acid ambiguity versus punctuation-signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints.

  20. Groundwater depletion embedded in international food trade.

    PubMed

    Dalin, Carole; Wada, Yoshihide; Kastner, Thomas; Puma, Michael J

    2017-03-29

    Recent hydrological modelling and Earth observations have located and quantified alarming rates of groundwater depletion worldwide. This depletion is primarily due to water withdrawals for irrigation, but its connection with the main driver of irrigation, global food consumption, has not yet been explored. Here we show that approximately eleven per cent of non-renewable groundwater use for irrigation is embedded in international food trade, of which two-thirds are exported by Pakistan, the USA and India alone. Our quantification of groundwater depletion embedded in the world's food trade is based on a combination of global, crop-specific estimates of non-renewable groundwater abstraction and international food trade data. A vast majority of the world's population lives in countries sourcing nearly all their staple crop imports from partners who deplete groundwater to produce these crops, highlighting risks for global food and water security. Some countries, such as the USA, Mexico, Iran and China, are particularly exposed to these risks because they both produce and import food irrigated from rapidly depleting aquifers. Our results could help to improve the sustainability of global food production and groundwater resource management by identifying priority regions and agricultural products at risk as well as the end consumers of these products.

  1. Depletion sensitivity predicts unhealthy snack purchases.

    PubMed

    Salmon, Stefanie J; Adriaanse, Marieke A; Fennis, Bob M; De Vet, Emely; De Ridder, Denise T D

    2016-01-01

    The aim of the present research is to examine the relation between depletion sensitivity - a novel construct referring to the speed or ease by which one's self-control resources are drained - and snack purchase behavior. In addition, interactions between depletion sensitivity and the goal to lose weight on snack purchase behavior were explored. Participants included in the study were instructed to report every snack they bought over the course of one week. The dependent variables were the number of healthy and unhealthy snacks purchased. The results of the present study demonstrate that depletion sensitivity predicts the amount of unhealthy (but not healthy) snacks bought. The more sensitive people are to depletion, the more unhealthy snacks they buy. Moreover, there was some tentative evidence that this relation is more pronounced for people with a weak as opposed to a strong goal to lose weight, suggesting that a strong goal to lose weight may function as a motivational buffer against self-control failures. All in all, these findings provide evidence for the external validity of depletion sensitivity and the relevance of this construct in the domain of eating behavior. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Groundwater depletion embedded in international food trade

    NASA Astrophysics Data System (ADS)

    Dalin, Carole; Wada, Yoshihide; Kastner, Thomas; Puma, Michael J.

    2017-03-01

    Recent hydrological modelling and Earth observations have located and quantified alarming rates of groundwater depletion worldwide. This depletion is primarily due to water withdrawals for irrigation, but its connection with the main driver of irrigation, global food consumption, has not yet been explored. Here we show that approximately eleven per cent of non-renewable groundwater use for irrigation is embedded in international food trade, of which two-thirds are exported by Pakistan, the USA and India alone. Our quantification of groundwater depletion embedded in the world’s food trade is based on a combination of global, crop-specific estimates of non-renewable groundwater abstraction and international food trade data. A vast majority of the world’s population lives in countries sourcing nearly all their staple crop imports from partners who deplete groundwater to produce these crops, highlighting risks for global food and water security. Some countries, such as the USA, Mexico, Iran and China, are particularly exposed to these risks because they both produce and import food irrigated from rapidly depleting aquifers. Our results could help to improve the sustainability of global food production and groundwater resource management by identifying priority regions and agricultural products at risk as well as the end consumers of these products.

  3. To Code or Not To Code?

    ERIC Educational Resources Information Center

    Parkinson, Brian; Sandhu, Parveen; Lacorte, Manel; Gourlay, Lesley

    1998-01-01

    This article considers arguments for and against the use of coding systems in classroom-based language research and touches on some relevant considerations from ethnographic and conversational analysis approaches. The four authors each explain and elaborate on their practical decision to code or not to code events or utterances at a specific point…

  4. Ego depletion in visual perception: Ego-depleted viewers experience less ambiguous figure reversal.

    PubMed

    Wimmer, Marina C; Stirk, Steven; Hancock, Peter J B

    2017-02-22

    This study examined the effects of ego depletion on ambiguous figure perception. Adults (N = 315) received an ego depletion task and were subsequently tested on their inhibitory control abilities that were indexed by the Stroop task (Experiment 1) and their ability to perceive both interpretations of ambiguous figures that was indexed by reversal (Experiment 2). Ego depletion had a very small effect on reducing inhibitory control (Cohen's d = .15) (Experiment 1). Ego-depleted participants had a tendency to take longer to respond in Stroop trials. In Experiment 2, ego depletion had small to medium effects on the experience of reversal. Ego-depleted viewers tended to take longer to reverse ambiguous figures (duration to first reversal) when naïve of the ambiguity and experienced less reversal both when naïve and informed of the ambiguity. Together, findings suggest that ego depletion has small effects on inhibitory control and small to medium effects on bottom-up and top-down perceptual processes. The depletion of cognitive resources can reduce our visual perceptual experience.

  5. The modality effect of ego depletion: Auditory task modality reduces ego depletion.

    PubMed

    Li, Qiong; Wang, Zhenhong

    2016-08-01

    An initial act of self-control that impairs subsequent acts of self-control is called ego depletion. The ego depletion phenomenon has been observed consistently. The modality effect refers to the effect of the presentation modality on the processing of stimuli. The modality effect has also been robustly found in a large body of research. However, no study to date has examined the modality effects of ego depletion. This issue was addressed in the current study. In Experiment 1, after all participants completed a handgrip task, participants in one group completed a visual attention regulation task and those in the other group completed an auditory attention regulation task, and then all participants again completed a handgrip task. The ego depletion phenomenon was observed in both the visual and the auditory attention regulation task. Moreover, participants who completed the visual task performed worse on the handgrip task than participants who completed the auditory task, which indicated greater ego depletion in the visual task condition. In Experiment 2, participants completed an initial task that either did or did not deplete self-control resources, and then they completed a second visual or auditory attention control task. The results indicated that depleted participants performed better on the auditory attention control task than on the visual attention control task. These findings suggest that altering task modality may reduce ego depletion. © 2016 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  6. Ozone depletion and chlorine loading potentials

    NASA Technical Reports Server (NTRS)

    Pyle, John A.; Wuebbles, Donald J.; Solomon, Susan; Zvenigorodsky, Sergei; Connell, Peter; Ko, Malcolm K. W.; Fisher, Donald A.; Stordal, Frode; Weisenstein, Debra

    1991-01-01

    The recognition of the roles of chlorine and bromine compounds in ozone depletion has led to the regulation of their source gases. Some source gases are expected to be more damaging to the ozone layer than others, so that scientific guidance regarding their relative impacts is needed for regulatory purposes. Parameters used for this purpose include the steady-state and time-dependent chlorine loading potential (CLP) and the ozone depletion potential (ODP). Chlorine loading potentials depend upon the estimated value and accuracy of atmospheric lifetimes and are subject to significant (approximately 20-50 percent) uncertainties for many gases. Ozone depletion potentials depend on the same factors, as well as the evaluation of the release of reactive chlorine and bromine from each source gas and the corresponding ozone destruction within the stratosphere.
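
    The standard working definitions behind these two indices, as commonly used in assessment work (the time-dependent variants discussed above are more involved), can be sketched as:

```latex
\mathrm{ODP}_i =
\frac{\Delta[\mathrm{O_3}]\ \text{per unit mass emission of gas } i}
     {\Delta[\mathrm{O_3}]\ \text{per unit mass emission of CFC-11}},
\qquad
\mathrm{CLP}_i =
\frac{\tau_i}{\tau_{\text{CFC-11}}}\,
\frac{M_{\text{CFC-11}}}{M_i}\,
\frac{n_{\mathrm{Cl},i}}{3},
```

    where tau denotes the atmospheric lifetime, M the molar mass, and n_Cl the number of chlorine atoms in the source gas (three for CFC-11).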

  7. Depletion performance of layered reservoirs without crossflow

    SciTech Connect

    Fetkovich, M.J.; Works, A.M.; Thrasher, T.S. ); Bradley, M.D. )

    1990-09-01

    This paper presents a study of the rate/time and pressure/cumulative-production depletion performance of a two-layered gas reservoir producing without crossflow. The gas reservoir has produced for more than 20 years at an effectively constant wellbore pressure, thus giving continuously declining rate/time and pressure/cumulative-production data for analysis. The field data demonstrates that Arps depletion-decline exponents between 0.5 and 1 can be obtained with a no-crossflow, layered reservoir description. Rate-vs.-time and pressure-vs.-cumulative-production predictions were developed from both 2D numerical and simplified tank models of a two layered, no-crossflow system. These results demonstrate the effects of changes in reservoir layer volumes, permeability, and skin on the depletion performance.
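
    For reference, the Arps decline-curve family behind the quoted depletion-decline exponents has the standard hyperbolic form

```latex
q(t) = \frac{q_i}{\left(1 + b\,D_i\,t\right)^{1/b}}, \qquad 0 < b \le 1,
```

    with q_i the initial rate, D_i the initial decline rate, and b the decline exponent; b -> 0 recovers exponential decline and b = 1 gives harmonic decline. The paper's point is that composite b values between 0.5 and 1 can be obtained from a no-crossflow, layered reservoir description.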

  8. Neutral depletion and the helicon density limit

    SciTech Connect

    Magee, R. M.; Galante, M. E.; Carr, J. Jr.; Lusk, G.; McCarren, D. W.; Scime, E. E.

    2013-12-15

    It is straightforward to create fully ionized plasmas with modest rf power in a helicon. It is difficult, however, to create plasmas with density >10{sup 20} m{sup −3}, because neutral depletion leads to a lack of fuel. In order to address this density limit, we present fast (1 MHz), time-resolved measurements of the neutral density at and downstream from the rf antenna in krypton helicon plasmas. At the start of the discharge, the neutral density underneath the antenna is reduced to 1% of its initial value in 15 μs. The ionization rate inferred from these data implies that the electron temperature near the antenna is much higher than the electron temperature measured downstream. Neutral density measurements made downstream from the antenna show much slower depletion, requiring 14 ms to decrease by a factor of 1/e. Furthermore, the downstream depletion appears to be due to neutral pumping rather than ionization.

  9. Global Warming: Lessons from Ozone Depletion

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2010-11-01

    My teaching and textbook have always covered many physics-related social issues, including stratospheric ozone depletion and global warming. The ozone saga is an inspiring good-news story that's instructive for solving the similar but bigger problem of global warming. Thus, as soon as students in my physics literacy course at the University of Arkansas have developed a conceptual understanding of energy and of electromagnetism, including the electromagnetic spectrum, I devote a lecture (and a textbook section) to ozone depletion and another lecture (and section) to global warming. Humankind came together in 1986 and quickly solved, to the extent that humans can solve it, ozone depletion. We could do the same with global warming, but we haven't and as yet there's no sign that we will. The parallel between the ozone and global warming cases, and the difference in outcomes, are striking and instructive.

  10. Self-regulation, ego depletion, and inhibition.

    PubMed

    Baumeister, Roy F

    2014-12-01

    Inhibition is a major form of self-regulation. As such, it depends on self-awareness and comparing oneself to standards and is also susceptible to fluctuations in willpower resources. Ego depletion is the state of reduced willpower caused by prior exertion of self-control. Ego depletion undermines inhibition both because restraints are weaker and because urges are felt more intensely than usual. Conscious inhibition of desires is a pervasive feature of everyday life and may be a requirement of life in civilized, cultural society, and in that sense it goes to the evolved core of human nature. Intentional inhibition not only restrains antisocial impulses but can also facilitate optimal performance, such as during test taking. Self-regulation and ego depletion may also affect less intentional forms of inhibition, even chronic tendencies to inhibit. Broadly stated, inhibition is necessary for human social life and nearly all societies encourage and enforce it. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Bare Code Reader

    NASA Astrophysics Data System (ADS)

    Clair, Jean J.

    1980-05-01

    The bar code system will be used in every market and supermarket. The code, which is standardised in the US and Europe (the EAN code), gives information on price, storage and the nature of the item, and allows real-time management of the shop.

  12. Prediction of battery depletion in implanted pacemakers

    PubMed Central

    Davies, Geoffrey; Siddons, Harold

    1973-01-01

    By the use of a measuring oscilloscope and the standard electrocardiogram limb leads the degree of battery depletion in an implanted pacemaker can be estimated. A formula based on readings obtained by this means has been used to determine when Devices fixed rate pacemakers should be removed. Laboratory tests show that 90% of their useful life is obtained by this means and it proved possible to extend the period of implantation from an arbitrary 24 months to 25 to 34 months without failure from battery depletion. PMID:4731110

  13. Seasonal oxygen depletion in Chesapeake Bay

    SciTech Connect

    Taft, J.L.; Hartwig, E.O.; Loftus, R.

    1980-12-01

    The spring freshet increases density stratification in Chesapeake Bay and minimizes oxygen transfer from the surface to the deep layer so that waters below 10 m depth experience oxygen depletion which may lead to anoxia during June to September. Respiration in the water of the deep layer is the major factor contributing to oxygen depletion. Benthic respiration seems secondary. Organic matter from the previous year which has settled into the deep layer during winter provides most of the oxygen demand but some new production in the surface layer may sink and thus supplement the organic matter accumulated in the deep layer.

  14. Low Sulfur Depletion in Photodissociation Regions

    NASA Astrophysics Data System (ADS)

    Goicoechea, J. R.; Pety, J.; Gerin, M.; Teyssier, D.; Roueff, E.; Hily-Blant, P.

    2006-06-01

    Sulfur is an abundant element which remains undepleted in diffuse interstellar gas but it is historically assumed to deplete (by factors of ˜1000) on grains at higher densities. Photodissociation regions (PDRs) are an interesting intermediate medium between translucent and dark clouds where the energetics and dynamics are dominated by an illuminating FUV radiation field, and thus they can provide some new insights about the sulfur depletion problem. In this work we present our latest studies on CS and HCS^+ photochemistry, excitation and radiative transfer in the Horsehead PDR, allowing us to infer the sulfur abundance.

  15. Generalized concatenated quantum codes

    SciTech Connect

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-05-15

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematic way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.
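
    For context on the Hamming-bound claim, the quantum Hamming bound requires K multiplied by the sum over j <= t of 3^j C(n, j) to be at most 2^n for an ((n, K)) qubit code correcting t errors. A minimal sketch (illustrative only, not the construction of the paper) that checks the bound for a few parameter sets:

        from math import comb

        def quantum_hamming_ok(n, K, t=1):
            """Check K * sum_{j<=t} 3^j * C(n, j) <= 2^n for an ((n, K)) qubit code correcting t errors."""
            volume = sum(3 ** j * comb(n, j) for j in range(t + 1))
            return K * volume <= 2 ** n

        # The ((5, 2)) perfect code saturates the bound; the bound rules out a ((4, 2)) single-error-correcting code.
        for n, K in [(4, 2), (5, 2), (9, 12)]:
            print(f"(({n}, {K})): bound satisfied = {quantum_hamming_ok(n, K)}")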

  16. CESAR: A Code for Nuclear Fuel and Waste Characterisation

    SciTech Connect

    Vidal, J.M.; Grouiller, J.P.; Launay, A.; Berthion, Y.; Marc, A.; Toubon, H.

    2006-07-01

    CESAR (Simplified Evolution Code Applied to Reprocessing) is a depletion code developed through a joint program between CEA and COGEMA. In the late 1980s, the first use of this code dealt with nuclear measurement at the Laboratories of the La Hague reprocessing plant. The use of CESAR was then extended to characterizations of all entrance materials and for characterisation, via tracer, of all produced waste. The code can distinguish more than 100 heavy nuclides, 200 fission products and 100 activation products, and it can characterise both the fuel and the structural material of the fuel. CESAR can also make depletion calculations from 3 months to 1 million years of cooling time. Between 2003 and 2005, the fifth version of the code was developed. The modifications were related to the harmonisation of the code's nuclear data with the JEF2.2 nuclear data file. This paper describes the code and explains the extensive use of this code at the La Hague reprocessing plant and also for prospective studies. The second part focuses on the modifications of the latest version, and describes the application field and the qualification of the code. Many companies and the IAEA use CESAR today. CESAR offers a Graphical User Interface, which is very user-friendly. (authors)
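
    At its core, a depletion calculation of the kind CESAR performs integrates the Bateman equations dN/dt = A N, where the matrix A collects decay constants and transmutation rates and the vector N holds the nuclide inventories. A minimal two-nuclide decay-chain sketch (illustrative only; CESAR's actual solver, nuclear data, and nuclide set are far more extensive and are not reproduced here):

        import numpy as np
        from scipy.linalg import expm

        # Toy decay chain A -> B -> (stable); pure decay, no flux-dependent terms.
        lam_a, lam_b = 1e-3, 5e-4           # decay constants in 1/s (hypothetical values)
        A = np.array([[-lam_a, 0.0],
                      [ lam_a, -lam_b]])    # Bateman matrix: dN/dt = A @ N
        N0 = np.array([1.0e20, 0.0])        # initial atom inventories

        for t in (1e3, 1e4, 1e5):           # cooling times in seconds
            N = expm(A * t) @ N0            # matrix-exponential solution N(t) = exp(A t) N0
            print(f"t = {t:.0e} s   N_A = {N[0]:.3e}   N_B = {N[1]:.3e}")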

  17. 48 CFR 52.223-11 - Ozone-Depleting Substances.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Ozone-Depleting Substances....223-11 Ozone-Depleting Substances. As prescribed in 23.804(a), insert the following clause: Ozone-Depleting Substances (MAY 2001) (a) Definition. Ozone-depleting substance, as used in this clause, means...

  18. 48 CFR 52.223-11 - Ozone-Depleting Substances.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 2 2013-10-01 2013-10-01 false Ozone-Depleting Substances....223-11 Ozone-Depleting Substances. As prescribed in 23.804(a), insert the following clause: Ozone-Depleting Substances (MAY 2001) (a) Definition. Ozone-depleting substance, as used in this clause, means...

  19. 48 CFR 52.223-11 - Ozone-Depleting Substances.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Ozone-Depleting Substances....223-11 Ozone-Depleting Substances. As prescribed in 23.804(a), insert the following clause: Ozone-Depleting Substances (MAY 2001) (a) Definition. Ozone-depleting substance, as used in this clause, means...

  20. 48 CFR 52.223-11 - Ozone-Depleting Substances.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Ozone-Depleting Substances....223-11 Ozone-Depleting Substances. As prescribed in 23.804(a), insert the following clause: Ozone-Depleting Substances (MAY 2001) (a) Definition. Ozone-depleting substance, as used in this clause, means...

  1. 48 CFR 52.223-11 - Ozone-Depleting Substances.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 2 2012-10-01 2012-10-01 false Ozone-Depleting Substances....223-11 Ozone-Depleting Substances. As prescribed in 23.804(a), insert the following clause: Ozone-Depleting Substances (MAY 2001) (a) Definition. Ozone-depleting substance, as used in this clause, means...

  2. Numerical Study of Phase Conjugation in Stimulated Backscatter with Pump Depletion.

    DTIC Science & Technology

    1982-09-17

    The stimulated Brillouin scattering (SBS) of coherent beams was studied numerically, using a steady-state 2D propagation code (BOUNCE). The present work...treats phase conjugation of a focused aberrated beam, using a modified version of BOUNCE that has been extended to include pump depletion. In all of the...which is contained within a region z1 < z < z2. BOUNCE solves Eqs. (1), assuming the aberrated pump wave is incident at z2, while the backscatter grows

  3. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, thus belief propagation can be used for iterative decoding of ARA codes on a graph. The structure of the encoder for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where simply an accumulator is chosen as a precoder. Thus ARA codes have a simple and very fast encoder structure when representing LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that for a maximum variable node degree of 5, a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, based on a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate codes close to code rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
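
    To make the encoder structure concrete, the accumulate stage is simply a running modulo-2 sum (a rate-1 1/(1+D) convolution). A minimal repeat-accumulate sketch (illustrative only; actual ARA codes add a precoding accumulator, a designed interleaver, and puncturing, none of which are modeled here):

        import random

        def accumulate(bits):
            """Running XOR (1/(1+D)): out[i] = out[i-1] ^ in[i]."""
            out, state = [], 0
            for b in bits:
                state ^= b
                out.append(state)
            return out

        def ra_encode(info, q=3, seed=0):
            """Rate-1/q repeat-accumulate encoder: repeat each bit q times, interleave, accumulate."""
            repeated = [b for b in info for _ in range(q)]
            perm = list(range(len(repeated)))
            random.Random(seed).shuffle(perm)
            interleaved = [repeated[i] for i in perm]
            return accumulate(interleaved)

        print(ra_encode([1, 0, 1, 1]))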

  4. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  5. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for similar concatenated schemes that use a convolutional code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.
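
    The bandwidth-expansion comparison follows from the code rates alone: with TCM the modulation absorbs the trellis-code redundancy, so only the Reed-Solomon overhead expands the bandwidth. A back-of-the-envelope check assuming an RS(255,223) outer code and a rate-1/2 inner convolutional code for the conventional scheme (these parameters are illustrative and not taken from the report):

        # Bandwidth expansion ~ 1/overall_rate - 1, relative to uncoded transmission
        # at the same number of bits per modulation symbol.
        rs_rate = 223 / 255                     # assumed RS(255,223) outer code
        tcm_expansion = 1 / rs_rate - 1         # TCM redundancy is absorbed by the larger signal set
        conv_rate = 1 / 2                       # assumed inner convolutional code
        conv_expansion = 1 / (rs_rate * conv_rate) - 1

        print(f"TCM + RS expansion : {tcm_expansion:5.1%}")    # about 14%
        print(f"Conv + RS expansion: {conv_expansion:5.1%}")   # about 129%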

  6. Direct Visualization of an Impurity Depletion Zone

    NASA Technical Reports Server (NTRS)

    Chernov, Alex A.; Garcia-Ruiz, Juan Ma; Thomas, Bill R.

    2000-01-01

    When a crystal incorporates more impurity per unit of its volume than the impurity concentration in solution, the solution in the vicinity of the growing crystal is depleted with respect to the impurity [1,2]. With a stagnant solution, e.g. in microgravity or gels, an impurity depletion zone expands as the crystal grows and results in greater purity in most of the outer portion of the crystal than in the core. Crystallization in gel provides an opportunity to mimic microgravity conditions and visualize the impurity depletion zone. Colorless, transparent apoferritin (M ≅ 450 kDa) crystals were grown in the presence of red holoferritin dimer as a microheterogeneous impurity (M ≅ 900 kDa) within agarose gel by counterdiffusion with Cd(2+) precipitant. Preferential trapping of dimers (distribution coefficient K = 4 [1,2]) results in weaker red color around the crystals grown in the left tube in the figure as compared to the control middle tube without crystals. The left and the middle tubes contain colored ferritin dimers, the right tube contains colored trimers. The meniscus in the left tube separates gel (below) and liquid solution containing Cd(2+) (above). Similar solutions, though without precipitants, were present on top of the middle and right tubes, allowing diffusion of dimers and trimers. The area of weaker color intensity around crystals directly demonstrates overlapped impurity depletion zones.

  7. Global Warming: Lessons from Ozone Depletion

    ERIC Educational Resources Information Center

    Hobson, Art

    2010-01-01

    My teaching and textbook have always covered many physics-related social issues, including stratospheric ozone depletion and global warming. The ozone saga is an inspiring good-news story that's instructive for solving the similar but bigger problem of global warming. Thus, as soon as students in my physics literacy course at the University of…

  9. Kinetics of the depletion of trichloroethene

    SciTech Connect

    Barrio-Lage, G.; Parson, F.Z.; Nassar, R.S.

    1987-04-01

    The depletion of trichloroethene (TCE) was studied in microcosms containing water and three types of natural sediment ranging in composition from highly organic to a calcareous sedimentary rock. The depletion rates varied slightly in the different sediments. The first-order rate constant k{sub 1} for the depletion of TCE ranged from 8.7 x 10{sup -4} and 4.9 x 10{sup -4} h{sup -1} in soils contaminated with TCE prior to microcosm preparation to 3.4 x 10{sup -4} and 4.6 x 10{sup -4} h{sup -1} for soils with a large organic content to 3.2 x 10{sup -4} h{sup -1} for crushed rock microcosms. Depletion was found to follow nonlinear forms of Michaelis-Menten kinetics in the organic sediments; however, microcosms containing crushed rock and water followed a linear form of the equation. K{sub m} values were found to be dependent on the percent of total organic carbon in the sediment. 16 references, 4 figures, 2 tables.
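
    Because the kinetics are first order, each rate constant implies a half-life t1/2 = ln 2 / k1. A quick conversion of representative constants quoted in the abstract:

        from math import log

        # First-order rate constants reported in the abstract, in 1/h.
        rate_constants = {
            "previously contaminated soil": 8.7e-4,
            "organic-rich sediment":        3.4e-4,
            "crushed rock microcosm":       3.2e-4,
        }
        for name, k1 in rate_constants.items():
            half_life_days = log(2.0) / k1 / 24.0
            print(f"{name:30s} k1 = {k1:.1e} 1/h   t1/2 ~ {half_life_days:3.0f} days")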

  10. Neutral depletion versus repletion due to ionization

    SciTech Connect

    Fruchtman, A.; Makrinich, G.; Raimbault, J.-L.; Liard, L.; Rax, J.-M.; Chabert, P.

    2008-05-15

    Recent theoretical analyses that predicted unexpected effects of neutral depletion in both collisional and collisionless plasmas are reviewed. We focus on the depletion of collisionless neutrals induced by strong ionization of a collisionless plasma and contrast this depletion with the effect of strong ionization on thermalized neutrals. The collisionless plasma is analyzed employing a kinetic description. The collisionless neutrals and the plasma are coupled through volume ionization and wall recombination only. The profiles of density and pressure, both of the plasma and of the neutral gas, and the profile of the ionization rate are calculated. It is shown that for collisionless neutrals the ionization results in neutral depletion, while when neutrals are thermalized the ionization induces a maximal neutral density at the discharge center, which we call neutral repletion. The difference between the two cases stems from the relation between the neutral density and pressure. The pressure of the collisionless neutral gas turns out to be maximal where its density is minimal, in contrast to the case of a thermalized neutral gas.

  11. Soil moisture depletion patterns around scattered trees

    Treesearch

    Robert R. Ziemer

    1968-01-01

    Soil moisture was measured around an isolated mature sugar pine tree (Pinus lambertiana Dougl.) in the mixed conifer forest type of the north central Sierra Nevada, California, from November 1965 to October 1966. From a sequence of measurements, horizontal and vertical soil moisture profiles were developed. Estimated soil moisture depletion from the 61-foot radius plot...

  12. Demonstration of jackhammer incorporating depleted uranium

    SciTech Connect

    Fischer, L E; Hoard, R W; Carter, D L; Saculla, M D; Wilson, G V

    2000-04-01

    The United States Government currently has an abundance of depleted uranium (DU). This surplus of about 1 billion pounds is the result of an enrichment process using gaseous diffusion to produce enriched and depleted uranium. The enriched uranium has been used primarily for either nuclear weapons for the military or nuclear fuel for the commercial power industry. Most of the depleted uranium remains at the enrichment process plants in the form of depleted uranium hexafluoride (DUF{sub 6}). The Department of Energy (DOE) recently began a study to identify possible commercial applications for the surplus material. One of these potential applications is to use the DU in high-density strikers/hammers in pneumatically driven tools, such as jack hammers and piledrivers to improve their impulse performance. The use of DU could potentially increase tunneling velocity and excavation into target materials with improved efficiency. This report describes the efforts undertaken to analyze the particulars of using DU in two specific striking applications: the jackhammer and chipper tool.

  13. Dissolution Treatment of Depleted Uranium Waste

    SciTech Connect

    Gates-Anderson, D D; Laue, C A; Fitch, T E

    2004-02-09

    Researchers at LLNL have developed a 3-stage process that converts pyrophoric depleted uranium metal turnings to a solidified final product that can be transported to and buried at a permitted land disposal site. The three process stages are: (1) pretreatment; (2) dissolution; and (3) solidification. Each stage was developed following extensive experimentation. This report presents the results of their experimental studies.

  14. Contrasts between Antarctic and Arctic ozone depletion.

    PubMed

    Solomon, Susan; Portmann, Robert W; Thompson, David W J

    2007-01-09

    This work surveys the depth and character of ozone depletion in the Antarctic and Arctic using long-term balloon-borne and ground-based records that cover multiple decades. Such data reveal changes in the range of ozone values, including the extremes observed as polar air passes over the stations. Antarctic ozone observations reveal widespread and massive local depletion in the heart of the ozone "hole" region near 18 km, frequently exceeding 90%. Although some ozone losses are apparent in the Arctic during particular years, the depth of the ozone losses in the Arctic is considerably smaller, and their occurrence is far less frequent. Many Antarctic total integrated column ozone observations in spring since approximately the 1980s show values considerably below those ever observed in earlier decades. For the Arctic, there is evidence of some spring-season depletion of total ozone at particular stations, but the changes are much less pronounced compared with the range of past data. Thus, the observations demonstrate that the widespread and deep ozone depletion that characterizes the Antarctic ozone hole is a unique feature on the planet.

  15. Self-Regulatory Depletion Enhances Neural Responses to Rewards and Impairs Top-Down Control

    PubMed Central

    Wagner, Dylan D.; Altman, Myra; Boswell, Rebecca G.; Kelley, William M.; Heatherton, Todd F.

    2014-01-01

    To be successful at self-regulation, individuals must be able to resist impulses and desires. The strength model of self-regulation suggests that when self-regulatory capacity is depleted, self-control deficits result from a failure to engage top-down control mechanisms. Using functional neuroimaging, we examined changes in brain activity in response to viewing desirable foods among thirty-one chronic dieters, half of whom underwent self-regulatory depletion using a sequential task paradigm. Compared to non-depleted dieters, depleted dieters exhibited greater food cue-related activity in the orbitofrontal cortex, a brain area associated with coding the reward value and liking aspects of desirable foods and also showed decreased functional connectivity between this area and the inferior frontal gyrus, a region commonly implicated in self-control. These findings suggest that self-regulatory depletion provokes self-control failure by reducing connectivity between brain regions involved in cognitive control and those representing rewards thereby decreasing the capacity to resist temptations. PMID:24026225

  16. “When the going gets tough, who keeps going?” Depletion sensitivity moderates the ego-depletion effect

    PubMed Central

    Salmon, Stefanie J.; Adriaanse, Marieke A.; De Vet, Emely; Fennis, Bob M.; De Ridder, Denise T. D.

    2014-01-01

    Self-control relies on a limited resource that can get depleted, a phenomenon that has been labeled ego-depletion. We argue that individuals may differ in their sensitivity to depleting tasks, and that consequently some people deplete their self-control resource at a faster rate than others. In three studies, we assessed individual differences in depletion sensitivity, and demonstrate that depletion sensitivity moderates ego-depletion effects. The Depletion Sensitivity Scale (DSS) was employed to assess depletion sensitivity. Study 1 employs the DSS to demonstrate that individual differences in sensitivity to ego-depletion exist. Study 2 shows moderate correlations of depletion sensitivity with related self-control concepts, indicating that these scales measure conceptually distinct constructs. Study 3 demonstrates that depletion sensitivity moderates the ego-depletion effect. Specifically, participants who are sensitive to depletion performed worse on a second self-control task, indicating a stronger ego-depletion effect, compared to participants less sensitive to depletion. PMID:25009523

  17. CTCF depletion alters chromatin structure and transcription of myeloid-specific factors.

    PubMed

    Ouboussad, Lylia; Kreuz, Sarah; Lefevre, Pascal F

    2013-10-01

    Differentiation is a multistep process tightly regulated and controlled by complex transcription factor networks. Here, we show that the rate of differentiation of common myeloid precursor cells increases after depletion of CTCF, a protein emerging as a potential key factor regulating higher-order chromatin structure. We identified CTCF binding in the vicinity of important transcription factors regulating myeloid differentiation and showed that CTCF depletion impacts on the expression of these genes in concordance with the observed acceleration of the myeloid commitment. Furthermore, we observed a loss of the histone variant H2A.Z within the selected promoter regions and an increase in non-coding RNA transcription upstream of these genes. Both abnormalities suggest a global chromatin structure destabilization and an associated increase of non-productive transcription in response to CTCF depletion but do not drive the CTCF-mediated transcription alterations of the neighbouring genes. Finally, we detected a transient eviction of CTCF at the Egr1 locus in correlation with Egr1 peak of expression in response to lipopolysaccharide (LPS) treatment in macrophages. This eviction is also correlated with the expression of an antisense non-coding RNA transcribing through the CTCF-binding region indicating that non-coding RNA transcription could be the cause and the consequence of CTCF eviction.

  18. How Depleted is the MORB mantle?

    NASA Astrophysics Data System (ADS)

    Hofmann, A. W.; Hart, S. R.

    2015-12-01

    Knowledge of the degree of mantle depletion of highly incompatible elements is critically important for assessing Earth's internal heat production and Urey number. Current views of the degree of MORB source depletion are dominated by Salters and Stracke (2004), and Workman and Hart (2005). The first is based on an assessment of average MORB compositions, whereas the second considers trace element data of oceanic peridotites. Both require an independent determination of one absolute concentration, Lu (Salters & Stracke), or Nd (Workman & Hart). Both use parent-daughter ratios Lu/Hf, Sm/Nd, and Rb/Sr calculated from MORB isotopes combined with continental-crust extraction models, as well as "canonical" trace element ratios, to boot-strap the full range of trace element abundances. We show that the single most important factor in determining the ultimate degree of incompatible element depletion in the MORB source lies in the assumptions about the timing of continent extraction, exemplified by continuous extraction versus simple two-stage models. Continued crust extraction generates additional, recent mantle depletion, without affecting the isotopic composition of the residual mantle significantly. Previous emphasis on chemical compositions of MORB and/or peridotites has tended to obscure this. We will explore the effect of different continent extraction models on the degree of U, Th, and K depletion in the MORB source. Given the uncertainties of the two most popular models, the uncertainties of U and Th in DMM are at least ±50%, and this impacts the constraints on the terrestrial Urey ratio. Salters, F.J.M. and Stracke, A., 2004, Geochem. Geophys. Geosyst. 5, Q05004. Workman, R.K. and Hart, S.R., 2005, EPSL 231, 53-72.

  19. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error-correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  1. Bar Codes for Libraries.

    ERIC Educational Resources Information Center

    Rahn, Erwin

    1984-01-01

    Discusses the evolution of standards for bar codes (series of printed lines and spaces that represent numbers, symbols, and/or letters of the alphabet) and describes the two types most frequently adopted by libraries--Code-A-Bar and CODE 39. The format of the codes is illustrated. Six references and definitions of terminology are appended. (EJS)

  2. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  3. A worldwide view of groundwater depletion

    NASA Astrophysics Data System (ADS)

    van Beek, L. P.; Wada, Y.; van Kempen, C.; Reckman, J. W.; Vasak, S.; Bierkens, M. F.

    2010-12-01

    During the last decades, global water demand has increased two-fold due to increasing population, expanding irrigated area and economic development. Globally such demand can be met by surface water availability (i.e., water in rivers, lakes and reservoirs) but regional variations are large and the absence of sufficient rainfall and run-off increasingly encourages the use of groundwater resources, particularly in the (semi-)arid regions of the world. Excessive abstraction for irrigation frequently leads to overexploitation, i.e. if groundwater abstraction exceeds the natural groundwater recharge over extensive areas and prolonged times, persistent groundwater depletion may occur. Observations and various regional studies have revealed that groundwater depletion is a substantial issue in regions such as Northwest India, Northeast Pakistan, Central USA, Northeast China and Iran. Here we provide a global overview of groundwater depletion from the year 1960 to 2000 at a spatial resolution of 0.5 degree by assessing groundwater recharge with the global hydrological model PCR-GLOBWB and subtracting estimates of groundwater abstraction obtained from IGRAC-GGIS database. PCR-GLOBWB was forced by the CRU climate dataset downscaled to daily time steps using ERA40 re-analysis data. PCR-GLOBWB simulates daily global groundwater recharge (0.5 degree) while considering sub-grid variability of each grid cell (e.g., short and tall vegetation, different soil types, fraction of saturated soil). Country statistics of groundwater abstraction were downscaled to 0.5 degree by using water demand (i.e., agriculture, industry and domestic) as a proxy. To limit problems related to increased capture of discharge and increased recharge due to groundwater pumping, we restricted our analysis to sub-humid to arid areas. The uncertainty in the resulting estimates was assessed by a Monte Carlo analysis of 100 realizations of groundwater recharge and 100 realizations of groundwater abstraction
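
    Cell by cell, the depletion estimate described here amounts to abstraction minus simulated recharge (clipped at zero), with uncertainty propagated by pairing Monte Carlo realizations of both fields. A minimal sketch of that bookkeeping with made-up arrays (standing in for PCR-GLOBWB recharge output and the downscaled IGRAC-GGIS abstraction data, which are not reproduced here):

        import numpy as np

        rng = np.random.default_rng(42)
        n_real, ny, nx = 100, 4, 4                        # realizations and a toy grid of 0.5-degree cells

        # Hypothetical ensembles (km3/yr per cell).
        recharge    = rng.gamma(shape=2.0, scale=0.5, size=(n_real, ny, nx))
        abstraction = rng.gamma(shape=2.0, scale=0.6, size=(n_real, ny, nx))

        depletion = np.clip(abstraction - recharge, 0.0, None)    # count overexploitation only
        mean_map  = depletion.mean(axis=0)                        # ensemble-mean depletion per cell
        p05, p95  = np.percentile(depletion.sum(axis=(1, 2)), [5, 95])

        print("cell-mean depletion map (km3/yr):")
        print(mean_map.round(2))
        print(f"domain total, 5th-95th percentile: {p05:.1f} - {p95:.1f} km3/yr")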

  4. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  5. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  6. Iron isotope composition of depleted MORB

    NASA Astrophysics Data System (ADS)

    Labidi, J.; Sio, C. K. I.; Shahar, A.

    2015-12-01

    In terrestrial basalts, iron isotope ratios are observed to weakly fractionate as a function of olivine and pyroxene crystallization. However, a ~0.1‰ difference between chondrites and MORB had been reported (Dauphas et al. 2009, Teng et al. 2013 and ref. therein). This observation could illustrate an isotope fractionation occurring during partial melting, as a function of the Fe valence in melt versus crystals. Here, we present high-precision Fe isotopic data measured by MC-ICP-MS on well-characterized samples from the Pacific-Antarctic Ridge (PAR, n=9) and from the Garrett Transform Fault (n=8). These samples allow exploring the Fe isotope fractionation between melt and magnetite, and the role of partial melting on Fe isotope fractionation. Our average δ56Fe value is +0.095±0.013‰ (95% confidence, n=17), indistinguishable from a previous estimate of +0.105±0.006‰ (95% confidence, n=43, see ref. 2). Our δ56Fe values correlate weakly with MgO contents, and correlate positively with K/Ti ratios. PAC1 DR10 shows the largest Ti and Fe depletion after titanomagnetite fractionation, with a δ56Fe value of +0.076±0.036‰. This is ~0.05‰ below other samples at a given MgO. This may illustrate a significant Fe isotope fractionation between the melt and titanomagnetite, in agreement with experimental determination (Shahar et al. 2008). GN09-02, the most incompatible-element depleted sample, has a δ56Fe value of 0.037±0.020‰. This is the lowest high-precision δ56Fe value recorded for a MORB worldwide. This basalt displays an incompatible-element depletion consistent with re-melting beneath the transform fault of mantle source that was depleted during a first melting event, beneath the ridge axis (Wendt et al. 1999). The Fe isotope observation could indicate that its mantle source underwent 56Fe depletion after a first melting event. It could alternatively indicate a lower Fe isotope fractionation during re-melting, if the source was depleted of its Fe3

  7. Assessment of the mechanical performance of the Westinghouse BWR control rod CR 99 at high depletion levels

    SciTech Connect

    Seltborg, P.; Jinnestrand, M.

    2012-07-01

    A long-term program assessing the mechanical performance of the Westinghouse BWR control rod CR 99 at high depletion levels has been performed. The scope of the program has mainly been based on the operation of four CR 99 Generation 2 control rods in demanding positions during 6 and 7 cycles in the Leibstadt Nuclear Power Plant (KKL) and on the detailed visual inspections and blade wing thickness measurements that were performed after the rods were discharged. By statistically correlating the blade wing thickness measurements to the appearance of irradiation-assisted stress corrosion cracking (IASCC), the probability of IASCC appearance as a function of the blade wing swelling was estimated. In order to correlate the IASCC probability of a CR 99 to its depletion, the {sup 10}B depletion of the studied rods was calculated in detail on a local level with the stochastic Monte Carlo code MCNP in combination with the Westinghouse nodal code system PHOENIX4/POLCA7. Using this information coupled to the blade wing measurement data, a finite element model describing the blade wing swelling of an arbitrary CR 99 design as a function of {sup 10}B depletion could then be generated. In the final step, these relationships were used to quantify the probability of IASCC appearance as a function of the {sup 10}B depletion of the CR 99 Generations 2 and 3. Applying this detailed mapping of the CR 99 behavior at high depletion levels and using an on-line core monitoring system with explicit {sup 10}B depletion tracking capabilities will enable a reliable prediction of the probability of IASCC appearance, thus enhancing the optimized design and the sound operation of the CR 99 control rod. Another important outcome of the program was the clear demonstration that no significant boron leakage occurred through any of the detected IASCC cracks, despite the very high depletion levels achieved. (authors)
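
    The statistical step described above, estimating the probability of IASCC appearance as a function of blade wing swelling (and hence of {sup 10}B depletion), can be illustrated with a simple logistic fit. A hedged sketch with invented inspection data, not the Westinghouse data or methodology (a production analysis would use maximum-likelihood logistic regression rather than a least-squares fit):

        import numpy as np
        from scipy.optimize import curve_fit

        def iascc_probability(x, x50, k):
            """Logistic probability of IASCC appearance versus blade wing swelling (arbitrary units)."""
            return 1.0 / (1.0 + np.exp(-k * (x - x50)))

        # Hypothetical inspection results: swelling measurement, crack observed (1) or not (0).
        swelling = np.array([0.05, 0.08, 0.10, 0.12, 0.15, 0.18, 0.20, 0.25, 0.30, 0.35])
        cracked  = np.array([0,    0,    0,    0,    1,    0,    1,    1,    1,    1])

        (x50, k), _ = curve_fit(iascc_probability, swelling, cracked, p0=[0.2, 30.0], maxfev=5000)
        print(f"fitted 50% threshold = {x50:.3f}, steepness = {k:.1f}")
        print(f"P(IASCC) at swelling 0.22 ~ {iascc_probability(0.22, x50, k):.2f}")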

  8. Copenhagen delegates advance phaseout of ozone depleters

    SciTech Connect

    Kirschner, E.

    1992-12-09

    As expected, delegates at the United Nations Ozone Layer Conference in Copenhagen sped up ozone depleter phaseouts from the 1987 Montreal Protocol and the 1990 London amendments. The changes bring the worldwide production phaseout of chlorofluorocarbons (CFCs) and other ozone depleters in developed countries in line with U.S. and European plans announced earlier this year. Adjustments to the protocol, which are binding on the signatories, change the phaseout for CFC, carbon tetrachloride, and methyl chloroform production and consumption to January 1, 1996 from 2000. The 75% reduction of 1986 levels from CFCs by January 1, 1994 is a compromise between European pressure for an 85% cut and the US goal of 70%. Halon production is to end January 1, 1994, as anticipated. Developing countries continue to have a 10-year grace period. Friends of the Earth ozone campaign director Liz Cook counters that the phaseout dates were scheduled with concern for the chemical industry, not for the ozone layer.

  9. Replacements For Ozone-Depleting Foaming Agents

    NASA Technical Reports Server (NTRS)

    Blevins, Elana; Sharpe, Jon B.

    1995-01-01

    Fluorinated ethers used in place of chlorofluorocarbons and hydrochlorofluorocarbons. Replacement necessary because CFC's and HCFC's found to contribute to depletion of ozone from upper atmosphere, and manufacture and use of them by law phased out in near future. Two fluorinated ethers do not have ozone-depletion potential and used in existing foam-producing equipment, designed to handle liquid blowing agents soluble in chemical ingredients that mixed to make foam. Any polyurethane-based foams and several cellular plastics blown with these fluorinated ethers used in processes as diverse as small batch pours, large sprays, or double-band lamination to make insulation for private homes, commercial buildings, shipping containers, and storage tanks. Fluorinated ethers proved useful as replacements for CFC refrigerants and solvents.

  11. Ozone depletion in tropospheric volcanic plumes

    NASA Astrophysics Data System (ADS)

    Vance, Alan; McGonigle, Andrew J. S.; Aiuppa, Alessandro; Stith, Jeffrey L.; Turnbull, Kate; von Glasow, Roland

    2010-11-01

    We measured ozone (O3) concentrations in the atmospheric plumes of the volcanoes St. Augustine (1976), Mt. Etna (2004, 2009) and Eyjafjallajökull (2010) and found O3 to be strongly depleted compared to the background at each volcano. At Mt. Etna, O3 was depleted within tens of seconds of leaving the crater; the St. Augustine plumes were on the order of hours old; and the O3 destruction in the plume of Eyjafjallajökull was maintained in 1-9 day old plumes. The most likely cause of this O3 destruction is catalytic bromine reactions, as suggested by a model that manages to reproduce the very early destruction of O3 but also shows that O3 destruction is ongoing for several days. Given the observed rapid and sustained destruction of O3, heterogeneous loss of O3 on ash is unlikely to be important.

  12. Endoplasmic-Reticulum Calcium Depletion and Disease

    PubMed Central

    Mekahli, Djalila; Bultynck, Geert; Parys, Jan B.; De Smedt, Humbert; Missiaen, Ludwig

    2011-01-01

    The endoplasmic reticulum (ER) as an intracellular Ca2+ store not only sets up cytosolic Ca2+ signals, but, among other functions, also assembles and folds newly synthesized proteins. Alterations in ER homeostasis, including severe Ca2+ depletion, are an upstream event in the pathophysiology of many diseases. On the one hand, insufficient release of activator Ca2+ may no longer sustain essential cell functions. On the other hand, loss of luminal Ca2+ causes ER stress and activates an unfolded protein response, which, depending on the duration and severity of the stress, can reestablish normal ER function or lead to cell death. We will review these various diseases by mainly focusing on the mechanisms that cause ER Ca2+ depletion. PMID:21441595

  13. Assessment of Preferred Depleted Uranium Disposal Forms

    SciTech Connect

    Croff, A.G.; Hightower, J.R.; Lee, D.W.; Michaels, G.E.; Ranek, N.L.; Trabalka, J.R.

    2000-06-01

    The Department of Energy (DOE) is in the process of converting about 700,000 metric tons (MT) of depleted uranium hexafluoride (DUF6) containing 475,000 MT of depleted uranium (DU) to a stable form more suitable for long-term storage or disposal. Potential conversion forms include the tetrafluoride (DUF4), oxide (DUO2 or DU3O8), or metal. If worthwhile beneficial uses cannot be found for the DU product form, it will be sent to an appropriate site for disposal. The DU products are considered to be low-level waste (LLW) under both DOE orders and Nuclear Regulatory Commission (NRC) regulations. The objective of this study was to assess the acceptability of the potential DU conversion products at potential LLW disposal sites to provide a basis for DOE decisions on the preferred DU product form and a path forward that will ensure reliable and efficient disposal.

  14. Depleted uranium plasma reduction system study

    SciTech Connect

    Rekemeyer, P.; Feizollahi, F.; Quapp, W.J.; Brown, B.W.

    1994-12-01

    A system life-cycle cost study was conducted of a preliminary design concept for a plasma reduction process for converting depleted uranium to uranium metal and anhydrous HF. The plasma-based process is expected to offer significant economic and environmental advantages over present technology. Depleted uranium is currently stored in the form of solid UF{sub 6}, of which approximately 575,000 metric tons is stored at three locations in the U.S. The proposed system is preconceptual in nature, but includes all necessary processing equipment and facilities to perform the process. The study has identified a total processing cost of approximately $3.00/kg of UF{sub 6} processed. Based on the results of this study, the development of a laboratory-scale system (1 kg/h throughput of UF{sub 6}) is warranted. Further scaling of the process to pilot scale will be determined after laboratory testing is complete.

  15. Efficient entropy coding for scalable video coding

    NASA Astrophysics Data System (ADS)

    Choi, Woong Il; Yang, Jungyoup; Jeon, Byeungwoo

    2005-10-01

    The standardization of the scalable extension of H.264 has called for additional functionality, based on the H.264 standard, to support combined spatio-temporal and SNR scalability. For the entropy coding of the H.264 scalable extension, the Context-based Adaptive Binary Arithmetic Coding (CABAC) scheme has been considered so far. In this paper, we present a new context modeling scheme that uses inter-layer correlation between the syntax elements. As a result, it improves the coding efficiency of entropy coding in the H.264 scalable extension. Simulation results for encoding the syntax element mb_type show that the improvement in coding efficiency of the proposed method is up to 16% in terms of bit saving, owing to the estimation of a more adequate probability model.
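
    The idea behind inter-layer context modeling is that the probability model for an enhancement-layer syntax element is selected by the value of the co-located base-layer element, so strongly correlated layers cost fewer bits. A toy sketch that measures the ideal entropy-coding cost with and without such conditioning (illustrative only; this is neither the CABAC engine nor the actual H.264 SVC syntax):

        import random
        from math import log2

        random.seed(1)
        # Toy correlated base-layer / enhancement-layer binary symbols (think of a binarized mb_type flag).
        base = [random.randint(0, 1) for _ in range(20000)]
        enh  = [b if random.random() < 0.8 else 1 - b for b in base]   # 80% agreement between layers

        def adaptive_cost(symbols, contexts):
            """Ideal arithmetic-coding cost in bits with per-context adaptive counts."""
            counts, bits = {}, 0.0
            for s, c in zip(symbols, contexts):
                n0, n1 = counts.get(c, (0.5, 0.5))            # Krichevsky-Trofimov initialization
                bits -= log2((n1 if s else n0) / (n0 + n1))
                counts[c] = (n0 + (s == 0), n1 + (s == 1))
            return bits

        single_ctx  = adaptive_cost(enh, [0] * len(enh))      # one context for everything
        inter_layer = adaptive_cost(enh, base)                # context chosen from the base layer
        print(f"single context : {single_ctx / len(enh):.3f} bits/symbol")
        print(f"inter-layer ctx: {inter_layer / len(enh):.3f} bits/symbol")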

  16. The ultimate disposition of depleted uranium

    SciTech Connect

    Not Available

    1990-12-01

    Significant amounts of the depleted uranium (DU) created by past uranium enrichment activities have been sold, disposed of commercially, or utilized by defense programs. In recent years, however, the demand for DU has become quite small compared to quantities available, and within the US Department of Energy (DOE) there is concern for any risks and/or cost liabilities that might be associated with the ever-growing inventory of this material. As a result, Martin Marietta Energy Systems, Inc. (Energy Systems), was asked to review options and to develop a comprehensive plan for inventory management and the ultimate disposition of DU accumulated at the gaseous diffusion plants (GDPs). An Energy Systems task team, under the chairmanship of T. R. Lemons, was formed in late 1989 to provide advice and guidance for this task. This report reviews options and recommends actions and objectives in the management of working inventories of partially depleted feed (PDF) materials and for the ultimate disposition of fully depleted uranium (FDU). Actions that should be considered are as follows. (1) Inspect UF{sub 6} cylinders on a semiannual basis. (2) Upgrade cylinder maintenance and storage yards. (3) Convert FDU to U{sub 3}O{sub 8} for long-term storage or disposal. This will include provisions for partial recovery of costs to offset those associated with DU inventory management and the ultimate disposal of FDU. Another recommendation is to drop the term "tails" in favor of "depleted uranium" or "DU" because the "tails" label implies that it is "waste." 13 refs.

  17. Health Effects of Embedded Depleted Uranium Fragments.

    DTIC Science & Technology

    2011-03-24

    [Abstract not available: the record text consists only of fragmentary front matter, including a table-of-contents entry for "Feasibility Studies of a Method for Determining Depleted Uranium Deposited in Human Limbs" by Gary S. Kramer and Erin S. Niven and partial report citations (e.g., Report BRL-TR-3068; Limits for Intakes of Radionuclides by Workers, Pergamon Press, 1981).]

  18. Carbon sequestration in depleted oil shale deposits

    DOEpatents

    Burnham, Alan K; Carroll, Susan A

    2014-12-02

    A method and apparatus are described for sequestering carbon dioxide underground by mineralizing the carbon dioxide with coinjected fluids and the minerals remaining from the extraction of shale oil. In one embodiment, an illite-rich oil shale is heated to pyrolyze the shale underground, and carbon dioxide is provided to the remaining depleted oil shale while it is at an elevated temperature. Conditions are sufficient to mineralize the carbon dioxide.

  19. The ultimate disposition of depleted uranium

    SciTech Connect

    Lemons, T.R.

    1991-12-31

    Depleted uranium (DU) is produced as a by-product of the uranium enrichment process. Over 340,000 MTU of DU in the form of UF{sub 6} have been accumulated at the US government gaseous diffusion plants and the stockpile continues to grow. An overview of issues and objectives associated with the inventory management and the ultimate disposition of this material is presented.

  20. Depletion modeling of liquid dominated geothermal reservoirs

    SciTech Connect

    Olsen, G.

    1984-06-01

    Depletion models for liquid-dominated geothermal reservoirs are derived and presented. The depletion models are divided into two categories: confined and unconfined. For both cases, depletion models with no recharge (or influx), and depletion models including recharge, are used to match field data from the Svartsengi high-temperature geothermal field in Iceland. The influx models included with the mass and energy balances are adopted from the petroleum engineering literature. The match to production data from Svartsengi is improved when influx is included. The Schilthuis steady-state influx gives a satisfactory match. The finite aquifer method of Fetkovitch and the unsteady-state method of Hurst gave reasonable answers, but not as good a match. The best match is obtained using the Hurst simplified solution with lambda = 1.3 x 10{sup -4} m{sup -1}. From the match, the cross-sectional area of the aquifer was calculated as 3.6 km{sup 2}. The drawdown was predicted using the Hurst simplified method and compared with the predicted drawdown from a boiling model and an empirical log-log model. A large difference between the models was obtained. The predicted drawdown using the Hurst simplified method falls between the other two. Injection has been considered by defining the net rate as the production rate minus the injection rate. No thermal or transient effects were taken into account. Prediction using three different net rates shows that the pressure can be maintained using the Hurst simplified method if there is significant fluid reinjection. 32 refs., 44 figs., 2 tabs.
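
    The influx models referred to above couple a simple tank material balance to an aquifer term; with Schilthuis steady-state influx, the aquifer delivers water at a rate proportional to the current drawdown. A minimal sketch with assumed parameter values (not the Svartsengi match or the models of the paper):

        # Tank material balance with Schilthuis steady-state influx:
        #   c * V * dp/dt = -q_net + C * (p_i - p)
        # All parameter values below are illustrative assumptions.
        c_total = 1.0e-9       # total compressibility, 1/Pa
        volume  = 5.0e9        # reservoir pore volume, m3
        C_inf   = 5.0e-7       # Schilthuis influx constant, m3/(s*Pa)
        p_init  = 4.0e6        # initial reservoir pressure, Pa
        q_net   = 0.5          # net withdrawal (production minus reinjection), m3/s

        dt, t_end = 3600.0, 10 * 365 * 24 * 3600.0
        p, t = p_init, 0.0
        while t < t_end:                          # explicit Euler integration
            influx = C_inf * (p_init - p)
            p += (-q_net + influx) / (c_total * volume) * dt
            t += dt
        print(f"drawdown after 10 years ~ {(p_init - p) / 1e5:.2f} bar")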

  1. Depleted uranium residual radiological risk assessment for Kosovo sites.

    PubMed

    Durante, Marco; Pugliese, Mariagabriella

    2003-01-01

    During the recent conflict in Yugoslavia, depleted uranium rounds were employed and were left in the battlefield. Health concern is related to the risk arising from contamination of areas in Kosovo with depleted uranium penetrators and dust. Although chemical toxicity is the most significant health risk related to uranium, radiation exposure has been allegedly related to cancers among veterans of the Balkan conflict. Uranium munitions are considered to be a source of radiological contamination of the environment. Based on measurements and estimates from the recent Balkan Task Force UNEP mission in Kosovo, we have estimated effective doses to resident populations using a well-established food-web mathematical model (RESRAD code). The UNEP mission did not find any evidence of widespread contamination in Kosovo. Rather than the actual measurements, we elected to use a desk assessment scenario (Reference Case) proposed by the UNEP group as the source term for computer simulations. Specific applications to two Kosovo sites (Planeja village and Vranovac hill) are described. Results of the simulations suggest that radiation doses from water-independent pathways are negligible (annual doses below 30 microSv). A small radiological risk is expected from contamination of the groundwater in conditions of effective leaching and low distribution coefficient of uranium metal. Under the assumptions of the Reference Case, significant radiological doses (>1 mSv/year) might be achieved after many years from the conflict through water-dependent pathways. Even in this worst-case scenario, DU radiological risk would be far overshadowed by its chemical toxicity.

  2. Barium depletion in hollow cathode emitters

    NASA Astrophysics Data System (ADS)

    Polk, James E.; Mikellides, Ioannis G.; Capece, Angela M.; Katz, Ira

    2016-01-01

    Dispenser hollow cathodes rely on a consumable supply of Ba released by BaO-CaO-Al2O3 source material in the pores of a tungsten matrix to maintain a low work function surface. The examination of cathode emitters from long duration tests shows deposits of tungsten at the downstream end that appear to block the flow of Ba from the interior. In addition, a numerical model of Ba transport in the cathode plasma indicates that the Ba partial pressure in the insert may exceed the equilibrium vapor pressure of the dominant Ba-producing reaction, and it was postulated previously that this would suppress Ba loss in the upstream part of the emitter. New measurements of the Ba depletion depth from a cathode insert operated for 8200 h reveal that Ba loss is confined to a narrow region near the downstream end, confirming this hypothesis. The Ba transport model was modified to predict the depletion depth with time. A comparison of the calculated and measured depletion depths gives excellent qualitative agreement, and quantitative agreement was obtained assuming an insert temperature 70 °C lower than measured beginning-of-life values.

  3. Renal cortical pyruvate depletion during AKI.

    PubMed

    Zager, Richard A; Johnson, Ali C M; Becker, Kirsten

    2014-05-01

    Pyruvate is a key intermediary in energy metabolism and can exert antioxidant and anti-inflammatory effects. However, the fate of pyruvate during AKI remains unknown. Here, we assessed renal cortical pyruvate and its major determinants (glycolysis, gluconeogenesis, pyruvate dehydrogenase [PDH], and H2O2 levels) in mice subjected to unilateral ischemia (15-60 minutes; 0-18 hours of vascular reflow) or glycerol-induced ARF. The fate of postischemic lactate, which can be converted back to pyruvate by lactate dehydrogenase, was also addressed. Ischemia and glycerol each induced persistent pyruvate depletion. During ischemia, decreasing pyruvate levels correlated with increasing lactate levels. During early reperfusion, pyruvate levels remained depressed, but lactate levels fell below control levels, likely as a result of rapid renal lactate efflux. During late reperfusion and glycerol-induced AKI, pyruvate depletion corresponded with increased gluconeogenesis (pyruvate consumption). This finding was underscored by observations that pyruvate injection increased renal cortical glucose content in AKI but not normal kidneys. AKI decreased PDH levels, potentially limiting pyruvate to acetyl CoA conversion. Notably, pyruvate therapy mitigated the severity of AKI. This renoprotection corresponded with increases in cytoprotective heme oxygenase 1 and IL-10 mRNAs, selective reductions in proinflammatory mRNAs (e.g., MCP-1 and TNF-α), and improved tissue ATP levels. Paradoxically, pyruvate increased cortical H2O2 levels. We conclude that AKI induces a profound and persistent depletion of renal cortical pyruvate, which may induce additional injury.

  4. Barium depletion in hollow cathode emitters

    SciTech Connect

    Polk, James E.; Mikellides, Ioannis G.; Katz, Ira; Capece, Angela M.

    2016-01-14

    Dispenser hollow cathodes rely on a consumable supply of Ba released by BaO-CaO-Al{sub 2}O{sub 3} source material in the pores of a tungsten matrix to maintain a low work function surface. The examination of cathode emitters from long duration tests shows deposits of tungsten at the downstream end that appear to block the flow of Ba from the interior. In addition, a numerical model of Ba transport in the cathode plasma indicates that the Ba partial pressure in the insert may exceed the equilibrium vapor pressure of the dominant Ba-producing reaction, and it was postulated previously that this would suppress Ba loss in the upstream part of the emitter. New measurements of the Ba depletion depth from a cathode insert operated for 8200 h reveal that Ba loss is confined to a narrow region near the downstream end, confirming this hypothesis. The Ba transport model was modified to predict the depletion depth with time. A comparison of the calculated and measured depletion depths gives excellent qualitative agreement, and quantitative agreement was obtained assuming an insert temperature 70 °C lower than measured beginning-of-life values.

  5. Pumping test evaluation of stream depletion parameters.

    PubMed

    Lough, Hilary K; Hunt, Bruce

    2006-01-01

    Descriptions are given of a pumping test and a corresponding analysis that permit calculation of all five hydrogeological parameters appearing in the Hunt (2003) solution for stream depletion caused by ground water abstraction from a well beside a stream. This solution assumes that flow in the pumped aquifer is horizontal, flow in the overlying aquitard or system of aquitards is vertical, and the free surface in the top aquitard is allowed to draw down. The definition of an aquitard in this paper is any layer with a vertical hydraulic conductivity much lower than the horizontal hydraulic conductivity of the pumped aquifer. These "aquitards" may be reasonably permeable layers but are distinguished from the pumped aquifer by their hydraulic conductivity contrast. The pumping test requires a complete set of drawdown measurements from at least one observation well. This well must be deep enough to penetrate the pumped aquifer, and pumping must continue for a sufficient time to ensure that depleted streamflow becomes a significant portion of the well abstraction rate. Furthermore, two of the five parameters characterize an aquitard that overlies the pumped aquifer, and values for these parameters are seen to be dependent upon the initial water table elevation in the aquitard. The field test analyzed herein used a total of eight observation wells screened in the pumped aquifer, and measurements from these wells gave eight sets of parameters that are used in a sensitivity analysis to determine the relative importance of each parameter in the stream depletion calculations.
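
    For orientation only, the sketch below evaluates the classical stream-depletion fraction for a fully penetrating stream with no streambed resistance (the simple Glover-Balmer limiting case, not the five-parameter Hunt (2003) solution analysed in the paper); T is aquifer transmissivity, S storativity, and d the well-to-stream distance, in consistent units.

        from math import erfc, sqrt

        def stream_depletion_fraction(t, d, T, S):
            """Fraction of the pumping rate supplied by the stream after pumping time t.

            Glover-Balmer form: q/Q = erfc( sqrt(S*d**2 / (4*T*t)) ), valid for a fully
            penetrating stream with no streambed resistance.
            """
            return erfc(sqrt(S * d * d / (4.0 * T * t)))

        # Example: d = 100 m, T = 500 m^2/day, S = 0.1, after 30 days of pumping.
        print(stream_depletion_fraction(30.0, 100.0, 500.0, 0.1))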

  6. What is Code Biology?

    PubMed

    Barbieri, Marcello

    2017-10-06

    Various independent discoveries have shown that many organic codes exist in living systems, and this implies that they came into being during the history of life and contributed to that history. The genetic code appeared in a population of primitive systems that has been referred to as the common ancestor, and it has been proposed that three distinct signal processing codes gave origin to the three primary kingdoms of Archaea, Bacteria and Eukarya. After the genetic code and the signal processing codes, on the other hand, only the ancestors of the eukaryotes continued to explore the coding space and gave origin to splicing codes, histone code, tubulin code, compartment codes and many others. A first theoretical consequence of this historical fact is the idea that the Eukarya became increasingly more complex because they maintained the potential to bring new organic codes into existence. A second theoretical consequence comes from the fact that the evolution of the individual rules of a code can take an extremely long time, but the origin of a new organic code corresponds to the appearance of a complete set of rules and from a geological point of view this amounts to a sudden event. The great discontinuities of the history of life, in other words, can be explained as the result of the appearance of new codes. A third theoretical consequence comes from the fact that the organic codes have been highly conserved in evolution, which shows that they are the great invariants of life, the sole entities that have gone intact through billions of years while everything else has changed. This tells us that the organic codes are fundamental components of life and their study - the new research field of Code Biology - is destined to become an increasingly relevant part of the life sciences. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Assessing local planning to control groundwater depletion: California as a microcosm of global issues

    NASA Astrophysics Data System (ADS)

    Nelson, Rebecca L.

    2012-01-01

    Groundwater pumping has caused excessive groundwater depletion around the world, yet regulating pumping remains a profound challenge. California uses more groundwater than any other U.S. state, and serves as a microcosm of the adverse effects of pumping felt worldwide—land subsidence, impaired water quality, and damaged ecosystems, all against the looming threat of climate change. The state largely entrusts the control of depletion to the local level. This study uses internationally accepted water resources planning theories systematically to investigate three key aspects of controlling groundwater depletion in California, with an emphasis on local-level action: (a) making decisions and engaging stakeholders; (b) monitoring groundwater; and (c) using mandatory, fee-based and voluntary approaches to control groundwater depletion (e.g., pumping restrictions, pumping fees, and education about water conservation, respectively). The methodology used is the social science-derived technique of content analysis, which involves using a coding scheme to record these three elements in local rules and plans, and State legislation, then analyzing patterns and trends. The study finds that Californian local groundwater managers rarely use, or plan to use, mandatory and fee-based measures to control groundwater depletion. Most use only voluntary approaches or infrastructure to attempt to reduce depletion, regardless of whether they have more severe groundwater problems, or problems which are more likely to have irreversible adverse effects. The study suggests legal reforms to the local groundwater planning system, drawing upon its empirical findings. Considering the content of these recommendations may also benefit other jurisdictions that use a local groundwater management planning paradigm.

  8. DIANE multiparticle transport code

    NASA Astrophysics Data System (ADS)

    Caillaud, M.; Lemaire, S.; Ménard, S.; Rathouit, P.; Ribes, J. C.; Riz, D.

    2014-06-01

    DIANE is the general Monte Carlo code developed at CEA-DAM. DIANE is a 3D multiparticle multigroup code. DIANE includes automated biasing techniques and is optimized for massive parallel calculations.

  9. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  10. Honesty and Honor Codes.

    ERIC Educational Resources Information Center

    McCabe, Donald; Trevino, Linda Klebe

    2002-01-01

    Explores the rise in student cheating and evidence that students cheat less often at schools with an honor code. Discusses effective use of such codes and creation of a peer culture that condemns dishonesty. (EV)

  11. Cellulases and coding sequences

    SciTech Connect

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  12. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  13. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  14. Practices in Code Discoverability

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Allen, A.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Much of scientific progress now hinges on the reliability, falsifiability and reproducibility of computer source codes. Astrophysics in particular is a discipline that today leads other sciences in making useful scientific components freely available online, including data, abstracts, preprints, and fully published papers, yet even today many astrophysics source codes remain hidden from public view. We review the importance and history of source codes in astrophysics and previous efforts to develop ways in which information about astrophysics codes can be shared. We also discuss why some scientist coders resist sharing or publishing their codes, the reasons for and importance of overcoming this resistance, and alert the community to a reworking of one of the first attempts for sharing codes, the Astrophysics Source Code Library (ASCL). We discuss the implementation of the ASCL in an accompanying poster paper. We suggest that code could be given a similar level of referencing as data gets in repositories such as ADS.

  15. STEEP32 computer code

    NASA Technical Reports Server (NTRS)

    Goerke, W. S.

    1972-01-01

    A manual is presented as an aid in using the STEEP32 code. The code is the EXEC 8 version of the STEEP code (STEEP is an acronym for shock two-dimensional Eulerian elastic plastic). The major steps in a STEEP32 run are illustrated in a sample problem. There is a detailed discussion of the internal organization of the code, including a description of each subroutine.

  16. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    The software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression, in the sense that the coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.
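
    The FORTRAN package itself is not reproduced here; as a hedged illustration of the underlying idea, the sketch below implements simple Golomb-Rice coding of nonnegative integers, a representative lossless technique of the kind the subroutines provide (the parameter k is chosen by the caller).

        def rice_encode(values, k):
            """Encode nonnegative integers as a bit list with Golomb-Rice parameter k."""
            bits = []
            for n in values:
                q, r = n >> k, n & ((1 << k) - 1)
                bits.extend([1] * q + [0])                              # unary quotient
                bits.extend((r >> i) & 1 for i in reversed(range(k)))   # k-bit remainder
            return bits

        def rice_decode(bits, k, count):
            """Recover `count` integers from a bit list produced by rice_encode."""
            out, i = [], 0
            for _ in range(count):
                q = 0
                while bits[i] == 1:                                     # read unary quotient
                    q, i = q + 1, i + 1
                i += 1                                                  # skip terminating 0
                r = 0
                for _ in range(k):                                      # read k-bit remainder
                    r, i = (r << 1) | bits[i], i + 1
                out.append((q << k) | r)
            return out

        data = [0, 3, 5, 12, 7]
        assert rice_decode(rice_encode(data, 2), 2, len(data)) == data  # lossless round trip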

  17. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    The software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression, in the sense that the coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.

  18. Morse Code Activity Packet.

    ERIC Educational Resources Information Center

    Clinton, Janeen S.

    This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)

  19. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

    This paper examines the results of previous wire code research to determine the relationship between wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, biases toward producing a false positive association between high wire codes and childhood cancer were created by the selection procedure.

  20. Mapping Local Codes to Read Codes.

    PubMed

    Bonney, Wilfred; Galloway, James; Hall, Christopher; Ghattas, Mikhail; Tramma, Leandro; Nind, Thomas; Donnelly, Louise; Jefferson, Emily; Doney, Alexander

    2017-01-01

    Background & Objectives: Legacy laboratory test codes make it difficult to use clinical datasets for meaningful translational research, where populations are followed for disease risk and outcomes over many years. The Health Informatics Centre (HIC) at the University of Dundee hosts continuous biochemistry data from the clinical laboratories in Tayside and Fife dating back as far as 1987. However, the HIC-managed biochemistry dataset is coupled with incoherent sample types and unstandardised legacy local test codes, which increases the complexity of using the dataset for reasonable population health outcomes. The objective of this study was to map the legacy local test codes to the Scottish 5-byte Version 2 Read Codes using biochemistry data extracted from the repository of the Scottish Care Information (SCI) Store.
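
    The record describes a mapping exercise rather than an algorithm; purely as a hedged sketch, such a legacy-to-Read-code mapping reduces to a lookup table keyed by the local test code and, where needed, the sample type. The codes below are hypothetical placeholders, not actual HIC local codes or Read Codes.

        # Hypothetical mapping table: (legacy local test code, sample type) -> Read code.
        LOCAL_TO_READ = {
            ("K", "serum"): "XXXX1",      # placeholder for a serum potassium Read code
            ("CREAT", "serum"): "XXXX2",  # placeholder for a serum creatinine Read code
        }

        def map_local_code(local_code, sample_type):
            """Return the Read code for a legacy local test code, or None if unmapped."""
            key = (local_code.strip().upper(), sample_type.strip().lower())
            return LOCAL_TO_READ.get(key)

        print(map_local_code("creat ", "Serum"))  # -> "XXXX2" (placeholder)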

  1. 26 CFR 1.642(e)-1 - Depreciation and depletion.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Depreciation and depletion. 1.642(e)-1 Section 1... (CONTINUED) INCOME TAXES Estates, Trusts, and Beneficiaries § 1.642(e)-1 Depreciation and depletion. An estate or trust is allowed the deductions for depreciation and depletion, but only to the extent...

  2. 26 CFR 52.4682-1 - Ozone-depleting chemicals.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 26 Internal Revenue 17 2013-04-01 2013-04-01 false Ozone-depleting chemicals. 52.4682-1 Section 52... EXCISE TAXES (CONTINUED) ENVIRONMENTAL TAXES § 52.4682-1 Ozone-depleting chemicals. (a) Overview. This section provides rules relating to the tax imposed on ozone-depleting chemicals (ODCs) under section...

  3. 26 CFR 52.4682-1 - Ozone-depleting chemicals.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 17 2011-04-01 2011-04-01 false Ozone-depleting chemicals. 52.4682-1 Section 52... EXCISE TAXES (CONTINUED) ENVIRONMENTAL TAXES § 52.4682-1 Ozone-depleting chemicals. (a) Overview. This section provides rules relating to the tax imposed on ozone-depleting chemicals (ODCs) under section...

  4. 26 CFR 52.4682-1 - Ozone-depleting chemicals.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 26 Internal Revenue 17 2014-04-01 2014-04-01 false Ozone-depleting chemicals. 52.4682-1 Section 52... EXCISE TAXES (CONTINUED) ENVIRONMENTAL TAXES § 52.4682-1 Ozone-depleting chemicals. (a) Overview. This section provides rules relating to the tax imposed on ozone-depleting chemicals (ODCs) under section...

  5. 26 CFR 52.4682-1 - Ozone-depleting chemicals.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 26 Internal Revenue 17 2012-04-01 2012-04-01 false Ozone-depleting chemicals. 52.4682-1 Section 52... EXCISE TAXES (CONTINUED) ENVIRONMENTAL TAXES § 52.4682-1 Ozone-depleting chemicals. (a) Overview. This section provides rules relating to the tax imposed on ozone-depleting chemicals (ODCs) under section...

  6. Children's Models of the Ozone Layer and Ozone Depletion.

    ERIC Educational Resources Information Center

    Christidou, Vasilia; Koulaidis, Vasilis

    1996-01-01

    The views of 40 primary students on ozone and its depletion were recorded through individual, semi-structured interviews. The data analysis resulted in the formation of a limited number of models concerning the distribution and role of ozone in the atmosphere, the depletion process, and the consequences of ozone depletion. Identifies five target…

  7. Children's Models of the Ozone Layer and Ozone Depletion.

    ERIC Educational Resources Information Center

    Christidou, Vasilia; Koulaidis, Vasilis

    1996-01-01

    The views of 40 primary students on ozone and its depletion were recorded through individual, semi-structured interviews. The data analysis resulted in the formation of a limited number of models concerning the distribution and role of ozone in the atmosphere, the depletion process, and the consequences of ozone depletion. Identifies five target…

  8. Dopamine Depletion Impairs Bilateral Sensory Processing in the Striatum in a Pathway-Dependent Manner.

    PubMed

    Ketzef, Maya; Spigolon, Giada; Johansson, Yvonne; Bonito-Oliva, Alessandra; Fisone, Gilberto; Silberberg, Gilad

    2017-05-17

    Parkinson's disease (PD) is a movement disorder caused by the loss of dopaminergic innervation, particularly to the striatum. PD patients often exhibit sensory impairments, yet the underlying network mechanisms are unknown. Here we examined how dopamine (DA) depletion affects sensory processing in the mouse striatum. We used the optopatcher for online identification of direct and indirect pathway projection neurons (MSNs) during in vivo whole-cell recordings. In control mice, MSNs encoded the laterality of sensory inputs with larger and earlier responses to contralateral than ipsilateral whisker deflection. This laterality coding was lost in DA-depleted mice due to adaptive changes in the intrinsic and synaptic properties, mainly, of direct pathway MSNs. L-DOPA treatment restored laterality coding by increasing the separation between ipsilateral and contralateral responses. Our results show that DA depletion impairs bilateral tactile acuity in a pathway-dependent manner, thus providing unexpected insights into the network mechanisms underlying sensory deficits in PD. VIDEO ABSTRACT. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Potential For Stratospheric Ozone Depletion During Carboniferous

    NASA Astrophysics Data System (ADS)

    Bill, M.; Goldstein, A. H.

    Methyl bromide (CH3Br) constitutes the largest source of bromine atoms to the stratosphere, whereas methyl chloride (CH3Cl) is the most abundant halocarbon in the troposphere. Both gases play an important role in stratospheric ozone depletion. For instance, Br-coupled reactions are responsible for 30 to 50% of total ozone loss in the polar vortex. Currently, the largest natural sources of CH3Br and CH3Cl appear to be biological production in the oceans, inorganic production during biomass burning and plant production in salt marsh ecosystems. Variations of paleofluxes of CH3Br and CH3Cl can be estimated by analyses of oceanic paleoproductivity, stratigraphic analyses of frequency and distribution of fossil charcoal indicating the occurrence of wildfires, and/or by paleoreconstruction indicating the extent of salt marshes. During the lower Carboniferous time (Tournaisian-Visean), the southern margin of the Laurasian continent was characterized by charcoal deposits. Estimates of charcoal layer frequency indicate that wildfires occurred at intervals of 3-35 years (Falcon-Lang 2000). This suggests that biomass burning could be an important source of CH3Br and CH3Cl during Tournaisian-Visean time. From the Tournaisian until the Meramecian, carbon and oxygen isotope records show short-term oscillations (Bruckschen et al. 1999, Mii et al. 1999). Chesterian time (mid-Carboniferous) is marked by an increase in delta18O values (~2 permil) and an increase in glacial deposit frequency, suggesting lower temperatures. The occurrence of glacial deposits over the paleopole suggests polar conditions and the associated special features of polar meteorology, such as strong circumpolar wind in the stratosphere (polar vortex) and polar stratospheric clouds. Thus, conditions leading to polar stratospheric ozone depletion can be found. Simultaneously, an increase in delta13C values is documented. We interpret the positive shift in delta13C as a result of higher bioproductivity

  10. Evolution of the Depleted Mantle--Revisited

    NASA Astrophysics Data System (ADS)

    Vervoort, J. D.

    2011-12-01

    One of the basic tenets of radiogenic isotope geochemistry is that the Earth's crust has been extracted from the mantle, leaving the mantle depleted in incompatible elements. Two tracers that have been used to constrain depleted mantle (DM) evolution are the long-lived radiogenic isotope systems Sm-Nd and Lu-Hf. It has long been assumed that Nd evolution of the depleted mantle follows an approximately linear path from chondritic composition (ɛNd = 0) at ~4.5 Ga to depleted MORB mantle (DMM) today (ɛNd = +10) [e.g., 1]. The Hf isotope system, before the revision of the 176Lu decay constant [2,3], appeared to follow a similar evolution: chondritic at ~4.5 Ga to DMM today [4]. This relationship, however, changed radically with the revision of the 176Lu decay constant. Now, for the Lu-Hf isotope system, there is no evidence of a widespread depleted mantle prior to 3.5 Ga. Using the new decay constant and CHUR values [5], Hf data for juvenile, mantle-derived rocks through time define a trend that does not diverge from chondritic compositions until after 3.5 Ga. This new Hf DM evolution is defined by average DMM today (176Hf/177Hf = 0.28321, ɛHf ≈ +15) and uses 176Lu/177Hf = 0.0399 to calculate compositions back in time to CHUR at 3.5 Ga. The Hf-Nd isotope records, therefore, appear to indicate different evolutionary histories: Nd isotopes show evidence for widespread depleted mantle from the earliest preserved rocks [6], and Hf isotopes show only chondritic to enriched sources for the oldest preserved rocks [7]. This "Hf-Nd paradox", if the reference frames are correct, would imply that differentiation of the early Earth affected the Sm-Nd and Lu-Hf systems in fundamentally different ways, a premise that seems difficult to explain. A further complication in the Hf and Nd evolution of the Earth comes from the short-lived isotope system 146Sm-142Nd. An unavoidable consequence of the 142Nd data is that the silicate Earth sampled thus far has a super chondritic Sm/Nd composition [e

  11. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  12. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  13. A modern depleted uranium manufacturing facility

    SciTech Connect

    Zagula, T.A.

    1995-07-01

    The Specific Manufacturing Capabilities (SMC) Project located at the Idaho National Engineering Laboratory (INEL) and operated by Lockheed Martin Idaho Technologies Co. (LMIT) for the Department of Energy (DOE) manufactures depleted uranium for use in the U.S. Army M1A2 Abrams Heavy Tank Armor Program. Since 1986, SMC has fabricated more than 12 million pounds of depleted uranium (DU) products in a multitude of shapes and sizes with varying metallurgical properties while maintaining security, environmental, health and safety requirements. During initial facility design in the early 1980s, emphasis on employee safety, radiation control and environmental consciousness was gaining momentum throughout the DOE complex. This fact, coupled with security and production requirements, forced design efforts to focus on incorporating automation, local containment and computerized material accountability at all work stations. The result was a fully automated production facility engineered to manufacture DU armor packages with virtually no human contact while maintaining security, traceability and quality requirements. This hands-off approach to handling depleted uranium resulted in minimal radiation exposures and employee injuries. Construction of the manufacturing facility was complete in early 1986, with the first armor package certified in October 1986. Rolling facility construction was completed in 1987, with the first certified plate produced in the fall of 1988. Since 1988 the rolling and manufacturing facilities have delivered more than 2600 armor packages on schedule with 100% final product quality acceptance. During this period there was an annual average of only 2.2 lost-time incidents and a maximum individual radiation exposure of 150 mrem. SMC is an example of designing and operating a facility that meets regulatory requirements with respect to national security, radiation control and personnel safety while achieving production schedules and product quality.

  14. Pumping induced depletion from two streams

    NASA Astrophysics Data System (ADS)

    Sun, Dongmin; Zhan, Hongbin

    2007-04-01

    We have proved that Hantush's model [Hantush MS. Wells near streams with semipervious beds. J Geophys Res 1965;70:2829-38] in a half-domain can be extended to a whole-domain and becomes identical to that of Hunt [Hunt B. Unsteady stream depletion from ground water pumping. Ground Water 1999;37(1):98-102] for a shallow and infinitely narrow stream, provided that the Dupuit assumption is adopted. This proof helps correct a false concept that regards Hantush's model as less useful because of its fully penetrating stream assumption. This study deals with interaction of an aquifer with two parallel streams based on Hantush's model. Semi-analytical solutions are obtained based on a rigorous mass conservation requirement by maintaining continuity of flux and head along the aquifer-streambed boundaries. This study shows that the hydraulic conductivity ratio of the two streambeds appears to be the most important factor controlling the stream-aquifer interaction, followed by a less important role played by the thickness ratio of the two streambeds. When the low-permeability streambeds do not exist, the steady-state stream depletion from one stream is linearly proportional to the ratio of the shortest distance from the pumping well to the other stream over the shortest distance between the two streams. When the low-permeability streambeds are present, a similar conclusion can be drawn except that the stream depletion now also strongly depends on the hydraulic conductivity ratio of the two streambeds. When the values of the hydraulic conductivity of the two streambeds are different by an order of magnitude, the location of the pumping well that receives equal flux from the two streams can be off the middle-line between the two streams by nearly 90%.
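
    For the limiting case stated above (no low-permeability streambeds), the steady-state split of pumping between the two parallel streams depends only on geometry. A minimal sketch, with d1 and d2 the shortest distances from the well to streams 1 and 2:

        def steady_state_depletion_split(d1, d2):
            """Steady-state fractions of the pumping rate drawn from streams 1 and 2
            when both streambeds are fully permeable: each stream supplies a share
            proportional to the well's distance from the other stream."""
            L = d1 + d2  # shortest distance between the two parallel streams
            return d2 / L, d1 / L

        # Well 30 m from stream 1 and 70 m from stream 2: stream 1 supplies 70%.
        print(steady_state_depletion_split(30.0, 70.0))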

  15. Zn2+ depletion blocks endosome fusion.

    PubMed Central

    Aballay, A; Sarrouf, M N; Colombo, M I; Stahl, P D; Mayorga, L S

    1995-01-01

    Fusion among endosomes is an important step for transport and sorting of internalized macromolecules. Working in a cell-free system, we previously reported that endosome fusion requires cytosol and ATP, and is sensitive to N-ethylmaleimide. Fusion is regulated by monomeric and heterotrimeric GTP-binding proteins. We now report that fusion can proceed at very low Ca2+ concentrations, i.e. < 30 nM. Moreover, fusion is not affected when intravesicular Ca2+ is depleted by preincubation of vesicles with calcium ionophores (5 microM ionomycin or A23187) in the presence of calcium chelators (5 mM EGTA or 60 mM EDTA). The results indicate that fusion can proceed at extremely low concentrations of intravesicular and extravesicular Ca2+. However, BAPTA [1,2-bis-(o-aminophenoxy)ethane-N,N,N',N'-tetra-acetic acid], a relatively specific Ca2+ chelator, inhibits fusion. BAPTA binds other metals besides Ca2+. We present evidence that BAPTA inhibition is due not to Ca2+ chelation but to Zn2+ depletion. TPEN [N,N,N',N'-tetrakis-(2-pyridylmethyl) ethylenediamine], another metal-ion chelator with low affinity for Ca2+, also inhibited fusion. TPEN- and BAPTA-inhibited fusions were restored by addition of Zn2+. Zn(2+)-dependent fusion presents the same characteristics as control fusion. In intact cells, TPEN inhibited transport along the endocytic pathway. The results indicate that Zn2+ depletion blocks endosome fusion, suggesting that this ion is necessary for the function of one or more factors involved in the fusion process. PMID:8554539

  16. Rhenium Disulfide Depletion-Load Inverter

    NASA Astrophysics Data System (ADS)

    McClellan, Connor; Corbet, Chris; Rai, Amritesh; Movva, Hema C. P.; Tutuc, Emanuel; Banerjee, Sanjay K.

    2015-03-01

    Many semiconducting Transition Metal Dichalcogenide (TMD) materials have been effectively used to create Field-Effect Transistor (FET) devices but have yet to be used in logic designs. We constructed a depletion-load voltage inverter using ultrathin layers of Rhenium Disulfide (ReS2) as the semiconducting channel. This ReS2 inverter was fabricated on a single micromechanically-exfoliated flake of ReS2. Electron beam lithography and physical vapor deposition were used to construct Cr/Au electrical contacts, an Alumina top-gate dielectric, and metal top-gate electrodes. By using both low (Aluminum) and high (Palladium) work-function metals as two separate top-gates on a single ReS2 flake, we create a dual-gated depletion mode (D-mode) and enhancement mode (E-mode) FETs in series. Both FETs displayed current saturation in the output characteristics as a result of the FET ``pinch-off'' mechanism and On/Off current ratios of 105. Field-effect mobilities of 23 and 17 cm2V-1s-1 and subthreshold swings of 97 and 551 mV/decade were calculated for the E-mode and D-mode FETs, respectively. With a supply voltage of 1V, at low/negative input voltages the inverter output was at a high logic state of 900 mV. Conversely with high/positive input voltages, the inverter output was at a low logic state of 500 mV. The inversion of the input signal demonstrates the potential for using ReS2 in future integrated circuit designs and the versatility of depletion-load logic devices for TMD research. NRI SWAN Center and ARL STTR Program.

  17. Scientific assessment of ozone depletion: 1991

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Over the past few years, there have been highly significant advances in the understanding of the impact of human activities on the Earth's stratospheric ozone layer and the influence of changes in chemical composition on the radiative balance of the climate system. Specifically, since the last international scientific review (1989), there have been five major advances: (1) global ozone decreases; (2) polar ozone; (3) ozone and industrial halocarbons; (4) ozone and climate relations; and (5) ozone depletion potentials (ODP's) and global warming potentials (GWP's). These topics and others are discussed.

  18. Ozone Depletion and Biologically Relevant Ultraviolet Radiation.

    NASA Astrophysics Data System (ADS)

    Zeng, Jun

    1995-01-01

    An atmospheric radiative transfer model is used to calculate surface spectral ultraviolet irradiance under cloud-free conditions, and compared with measurements made at Lauder, New Zealand (45^circ{S }, 170^circ{E}) before and after the eruption of Mt. Pinatubo, and including a snow-covered surface. The ratios of diffuse to direct irradiance depend critically on solar elevation, surface albedo, and aerosol extinction. Ozone changes have pronounced effects on the global UVB irradiance, but have only a minor effect on these ratios. The comparison suggests that the ultraviolet radiation exposure can be computed with confidence for clear sky conditions, if the appropriate atmospheric pressure and temperature profiles, ozonesonde data, surface albedo, and aerosol optical properties are available. The total ozone abundances are derived by using ground-based UV irradiance measurements and compared with TOMS in Antarctica and the Arctic from 1990 to 1994. The comparisons show that they are generally in good agreement. Possible reasons for the discrepancies between the two methods are discussed. The equivalent cloud optical depths are also inferred from these data. Ozone depletion can also increase the penetration of ultraviolet radiation into the aquatic system. A coupled atmosphere-ocean radiative transfer model is used to investigate the effect of ozone depletion on UV penetration through the atmosphere and into the underlying water column. Comparisons between model computations and in situ measurements of irradiances made in Antarctic water show good agreement in the UV spectral range between 300 and 350 nm. The ratio of UVB (280-320 nm) to total (280-700 nm) irradiance also compared well. For a given ozone reduction the largest relative increase of UVB radiation arriving at the surface and penetrating to various depths in the ocean occurs at large solar zenith angles. At high latitudes the most pronounced increase in UVB exposure due to an ozone depletion occurs in the

  19. Depletion of the Outer Asteroid Belt

    NASA Technical Reports Server (NTRS)

    Liou, Jer-Chyi; Malhotra, Renu

    1997-01-01

    During the early history of the solar system, it is likely that the outer planets changed their distance from the sun, and hence, their influence on the asteroid belt evolved with time. The gravitational influence of Jupiter and Saturn on the orbital evolution of asteroids in the outer asteroid belt was calculated. The results show that the sweeping of mean motion resonances associated with planetary migration efficiently destabilizes orbits in the outer asteroid belt on a time scale of 10 million years. This mechanism provides an explanation for the observed depletion of asteroids in that region.

  20. Ozone depletion: implications for the veterinarian.

    PubMed

    Kopecky, K E

    1978-09-15

    Man has inadvertently modified the stratosphere. There is a good possibility that the ozone layer is being depleted by the use of jet aircraft (SST), chlorofluoromethane propellants, and nitrogen fertilizers. Under unpolluted conditions, the production of ozone equals its destruction. By man's intervention, however, the destruction may exceed the production. The potential outcome is increased intensity of solar ultraviolet (280-400 nm) radiation and penetration to the earth's surface of previously absorbed wavelengths below about 280 nm. The increased ultraviolet radiation would increase the likelihood of skin cancer in man and ocular squamous cell carcinoma in cattle. The climate also might be modified, possibly in an undesirable way.

  1. Depletion of the Outer Asteroid Belt

    PubMed

    Liou; Malhotra

    1997-01-17

    During the early history of the solar system, it is likely that the outer planets changed their distance from the sun, and hence, their influence on the asteroid belt evolved with time. The gravitational influence of Jupiter and Saturn on the orbital evolution of asteroids in the outer asteroid belt was calculated. The results show that the sweeping of mean motion resonances associated with planetary migration efficiently destabilizes orbits in the outer asteroid belt on a time scale of 10 million years. This mechanism provides an explanation for the observed depletion of asteroids in that region.

  2. Capstone Depleted Uranium Aerosols: Generation and Characterization

    SciTech Connect

    Parkhurst, MaryAnn; Szrom, Fran; Guilmette, Ray; Holmes, Tom; Cheng, Yung-Sung; Kenoyer, Judson L.; Collins, John W.; Sanderson, T. Ellory; Fliszar, Richard W.; Gold, Kenneth; Beckman, John C.; Long, Julie

    2004-10-19

    In a study designed to provide an improved scientific basis for assessing possible health effects from inhaling depleted uranium (DU) aerosols, a series of DU penetrators was fired at an Abrams tank and a Bradley fighting vehicle. A robust sampling system was designed to collect aerosols in this difficult environment and continuously monitor the sampler flow rates. Aerosols collected were analyzed for uranium concentration and particle size distribution as a function of time. They were also analyzed for uranium oxide phases, particle morphology, and dissolution in vitro. The resulting data provide input useful in human health risk assessments.

  3. Correlation between cosmic rays and ozone depletion.

    PubMed

    Lu, Q-B

    2009-03-20

    This Letter reports reliable satellite data in the period of 1980-2007 covering two full 11-yr cosmic ray (CR) cycles, clearly showing the correlation between CRs and ozone depletion, especially the polar ozone loss (hole) over Antarctica. The results provide strong evidence of the physical mechanism that the CR-driven electron-induced reaction of halogenated molecules plays the dominant role in causing the ozone hole. Moreover, this mechanism predicts one of the severest ozone losses in 2008-2009 and probably another large hole around 2019-2020, according to the 11-yr CR cycle.

  4. Commercialisation of full depletion scientific CCDs

    NASA Astrophysics Data System (ADS)

    Jorden, Paul; Ball, Kevin; Bell, Ray; Burt, David; Guyatt, Neil; Hadfield, Kevin; Jerram, Paul; Pool, Peter; Pike, Andrew; Holland, Andrew; Murray, Neil

    2006-06-01

    Following successful manufacture of small-format trial devices we have now designed and manufactured large-format scientific CCDs in high resistivity silicon ('high-rho'). These devices are intended for 'full depletion' operation as backside illuminated sensors for very high red wavelength sensitivity and X-ray imaging spectroscopy at extended energies. Devices of 2k*512 and 2k*4k format, with both single and dual stage output circuits have been manufactured and tested. Design considerations, test results, and commercial manufacturing considerations will be addressed.

  5. Coding for Electronic Mail

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    A scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth the current level. The coding scheme paves the way for true electronic mail, in which handwritten, typed, or printed messages or diagrams are sent virtually instantaneously - between buildings or between continents. The scheme, called the Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of the resulting delivered messages is improved over messages transmitted by conventional coding. The coding scheme is compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme is automatically translated to word-processor form.

  6. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  7. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
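
    The DLL itself and GoldSim's calling convention are not shown in the record; the following is a hedged Python sketch of the same wrapper pattern only (write an input file from a list of inputs, run the external application, read its outputs back). The executable and file names are purely hypothetical.

        import subprocess
        from pathlib import Path

        def run_external_code(inputs, workdir="run", exe="./external_app",
                              infile="inputs.txt", outfile="outputs.txt"):
            """Illustrative wrapper mimicking the DLLExternalCode pattern; this is not
            the GoldSim DLL interface itself, and all names here are placeholders."""
            wd = Path(workdir)
            wd.mkdir(exist_ok=True)
            # 1. Create the input file expected by the external application.
            (wd / infile).write_text("\n".join(f"{v:.6g}" for v in inputs))
            # 2. Run the external code (assumed to read infile and write outfile).
            subprocess.run([exe], cwd=wd, check=True)
            # 3. Read the outputs it produced and return them to the caller.
            return [float(line) for line in (wd / outfile).read_text().split()]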

  8. Defeating the coding monsters.

    PubMed

    Colt, Ross

    2007-02-01

    Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the complicated myriad of rules to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that can enable them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.
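
    As a heavily hedged sketch only: under the pre-2021 documentation rules assumed here, an established-patient 99214 visit generally required at least two of three key components at the stated levels (a detailed history, a detailed examination, and moderate-complexity medical decision making). The check below simply encodes that assumption and is not billing guidance.

        def qualifies_for_99214(detailed_history, detailed_exam, moderate_mdm):
            """True if at least two of the three key components meet the assumed
            99214 thresholds (established patient, pre-2021 E/M rules)."""
            return sum([detailed_history, detailed_exam, moderate_mdm]) >= 2

        print(qualifies_for_99214(True, False, True))   # True: history + MDM suffice
        print(qualifies_for_99214(False, False, True))  # False: only one component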

  9. Simulations and observations of plasma depletion, ion composition, and airglow emissions in two auroral ionospheric depletion experiments

    NASA Technical Reports Server (NTRS)

    Yau, A. W.; Whalen, B. A.; Harris, F. R.; Gattinger, R. L.; Pongratz, M. B.

    1985-01-01

    Observations of plasma depletion, ion composition modification, and airglow emissions in the Waterhole experiments are presented. The detailed ion chemistry and airglow emission processes related to the ionospheric hole formation in the experiment are examined, and observations are compared with computer simulation results. The latter indicate that the overall depletion rates in different parts of the depletion region are governed by different parameters.

  10. Decline and depletion rates of oil production: a comprehensive investigation.

    PubMed

    Höök, Mikael; Davidsson, Simon; Johansson, Sheshti; Tang, Xu

    2014-01-13

    Two of the most fundamental concepts in the current debate about future oil supply are oilfield decline rates and depletion rates. These concepts are related, but not identical. This paper clarifies the definitions of these concepts, summarizes the underlying theory and empirically estimates decline and depletion rates for different categories of oilfield. A database of 880 post-peak fields is analysed to determine typical depletion levels, depletion rates and decline rates. This demonstrates that the size of oilfields has a significant influence on decline and depletion rates, with generally high values for small fields and comparatively low values for larger fields. These empirical findings have important implications for oil supply forecasting.
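
    A minimal sketch of the two quantities as they are commonly defined in this literature (stated here as an assumption about the paper's exact conventions): the decline rate is the relative year-on-year drop in production, while the depletion rate expresses annual production as a fraction of the remaining recoverable resources.

        def decline_rate(q_prev, q_now):
            """Annual decline rate: relative drop in production from one year to the next."""
            return (q_prev - q_now) / q_prev

        def depletion_rate_of_remaining(q_now, urr, cumulative_produced):
            """Depletion rate: annual production divided by the remaining part of the
            ultimately recoverable resources (URR minus cumulative production)."""
            return q_now / (urr - cumulative_produced)

        print(decline_rate(100.0, 93.0))                          # 0.07 -> 7%/yr decline
        print(depletion_rate_of_remaining(50.0, 2000.0, 1200.0))  # 50/800 = 6.25%/yr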

  11. Ego depletion results in an increase in spontaneous false memories.

    PubMed

    Otgaar, Henry; Alberts, Hugo; Cuppens, Lesly

    2012-12-01

    The primary aim of the current study was to examine whether depleted cognitive resources might have ramifications for the formation of neutral and negative spontaneous false memories. To examine this, participants received neutral and negative Deese/Roediger-McDermott false memory wordlists. Also, for half of the participants, cognitive resources were depleted by use of an ego depletion manipulation (solving difficult calculations while being interfered with by auditory noise). Our chief finding was that depleted cognitive resources made participants more vulnerable to the production of false memories. Our results shed light on how depleted cognitive resources affect neutral and negative correct and errant memories.

  12. Halocarbon ozone depletion and global warming potentials

    NASA Technical Reports Server (NTRS)

    Cox, Richard A.; Wuebbles, D.; Atkinson, R.; Connell, Peter S.; Dorn, H. P.; Derudder, A.; Derwent, Richard G.; Fehsenfeld, F. C.; Fisher, D.; Isaksen, Ivar S. A.

    1990-01-01

    Concern over the global environmental consequences of fully halogenated chlorofluorocarbons (CFCs) has created a need to determine the potential impacts of other halogenated organic compounds on stratospheric ozone and climate. The CFCs, which do not contain an H atom, are not oxidized or photolyzed in the troposphere. These compounds are transported into the stratosphere where they decompose and can lead to chlorine catalyzed ozone depletion. The hydrochlorofluorocarbons (HCFCs or HFCs), in particular those proposed as substitutes for CFCs, contain at least one hydrogen atom in the molecule, which confers on these compounds a much greater sensitivity toward oxidation by hydroxyl radicals in the troposphere, resulting in much shorter atmospheric lifetimes than CFCs, and consequently lower potential for depleting ozone. The available information is reviewed which relates to the lifetime of these compounds (HCFCs and HFCs) in the troposphere, and up-to-date assessments are reported of the potential relative effects of CFCs, HCFCs, HFCs, and halons on stratospheric ozone and global climate (through 'greenhouse' global warming).
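
    One widely used semi-empirical scaling for steady-state ODPs, stated here as an assumption rather than as the assessment's own method, weights a compound's halogen delivery against that of CFC-11 using its atmospheric lifetime, molar mass, halogen count and a bromine effectiveness factor; the ratio of fractional halogen release factors is neglected below.

        def semi_empirical_odp(lifetime_yr, molar_mass, n_halogen, alpha=1.0,
                               cfc11_lifetime_yr=45.0, cfc11_molar_mass=137.4):
            """Approximate ODP relative to CFC-11 (three Cl atoms per molecule).

            alpha is ~1 for chlorine carriers and much larger (tens) for bromine
            carriers; the CFC-11 lifetime here is an assumed nominal value, and the
            fractional-release-factor ratio is taken as 1 for simplicity.
            """
            return (alpha * (lifetime_yr / cfc11_lifetime_yr)
                    * (cfc11_molar_mass / molar_mass) * (n_halogen / 3.0))

        # Example: a hypothetical HCFC with a 12-year lifetime, M = 120 g/mol, one Cl atom.
        print(semi_empirical_odp(12.0, 120.0, 1))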

  13. Convective Polymer Depletion on Pair Particle Interactions

    NASA Astrophysics Data System (ADS)

    Fan, Tai-Hsi; Taniguchi, Takashi; Tuinier, Remco

    2011-11-01

    Understanding the transport, reaction, aggregation, and viscoelastic properties of colloid-polymer mixtures is of great importance in the food, biomedical, and pharmaceutical sciences. In non-adsorbing polymer solutions, colloidal particles tend to aggregate due to the depletion-induced osmotic or entropic force. Our earlier treatment of the relative mobility of pair particles assumed that polymer reorganization around the particles is much faster than the particles' diffusive time, so that the coupling of diffusive and convective effects can be neglected. Here we present a nonequilibrium two-fluid (polymer and solvent) model to resolve the convective depletion effect. The theoretical framework is based on the ground-state approximation and accounts for the coupling of fluid flow and polymer transport to better describe pair particle interactions. The momentum and polymer transport, chemical potential, and local viscosity and osmotic pressure are simultaneously solved by numerical approximation. This investigation is essential for predicting the demixing kinetics in the pairwise regime for colloid-polymer mixtures. This work is supported by NSF CMMI 0952646.

  14. Imaging neurotransmitter uptake and depletion in astrocytes

    SciTech Connect

    Tan, W.; Haydon, P.G.; Yeung, E.S.

    1997-08-01

    An ultraviolet (UV) laser-based optical microscope and charge-coupled device (CCD) detection system was used to obtain chemical images of biological cells. Subcellular structures can be easily seen in both optical and fluorescence images. Laser-induced native fluorescence detection provides high sensitivity and low limits of detection, and it does not require coupling to fluorescent dyes. We were able to quantitatively monitor serotonin that has been taken up into and released from individual astrocytes on the basis of its native fluorescence. Different regions of the cells took up different amounts of serotonin with a variety of uptake kinetics. Similarly, we observed different serotonin depletion dynamics in different astrocyte regions. There were also some astrocyte areas where no serotonin uptake or depletion was observed. Potential applications include the mapping of other biogenic species in cells as well as the ability to image their release from specific regions of cells in response to external stimuli. © 1997 Society for Applied Spectroscopy

  15. Stratospheric ozone depletion and animal health.

    PubMed

    Mayer, S J

    1992-08-08

    There is an increasing concern over ozone depletion and its effects on the environment and human health. However, the increase in ultraviolet-B radiation (UV-B) that would result from significant losses of ozone is also potentially harmful to animals. Any increase in disease in domestic species would not only have serious animal welfare implications but may also be economically important. The diseases which are likely to increase if ozone depletion continues include the squamous cell carcinomas of the exposed, non-pigmented areas of cats, cattle, sheep and horses. Uberreiter's syndrome in dogs is also associated with exposure to UV-B and may be expected to increase, as may the severity of conditions such as infectious keratoconjunctivitis (New Forest eye) in cattle. Aquaculture systems in which fish often have little or no protection by shading may also be at risk. Cataracts and skin lesions have been associated with the exposure of farmed fish to ultraviolet radiation and have resulted in significant losses.

  16. Methods used to calculate doses resulting from inhalation of Capstone depleted uranium aerosols.

    PubMed

    Miller, Guthrie; Cheng, Yung Sung; Traub, Richard J; Little, Tom T; Guilmette, Raymond A

    2009-03-01

    The methods used to calculate radiological and toxicological doses to hypothetical persons inside either a U.S. Army Abrams tank or Bradley Fighting Vehicle that has been perforated by depleted uranium munitions are described. Data from time- and particle-size-resolved measurements of depleted uranium aerosol as well as particle-size-resolved measurements of aerosol solubility in lung fluids for aerosol produced in the breathing zones of the hypothetical occupants were used. The aerosol was approximated as a mixture of nine monodisperse (single particle size) components corresponding to particle size increments measured by the eight stages plus the backup filter of the cascade impactors used. A Markov Chain Monte Carlo Bayesian analysis technique was employed, which straightforwardly calculates the uncertainties in doses. Extensive quality control checking of the various computer codes used is described.
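
    A hedged sketch of the aerosol bookkeeping step only, not of the Bayesian dose reconstruction itself: the aerosol is treated as nine monodisperse components, one per impactor stage plus the backup filter, and a dose is accumulated as a mass-fraction-weighted sum of per-size dose coefficients. All numbers below are hypothetical placeholders.

        # Hypothetical mass fractions for eight impactor stages plus the backup filter,
        # and hypothetical per-size dose coefficients (dose per unit inhaled mass).
        mass_fractions = [0.05, 0.08, 0.12, 0.15, 0.18, 0.16, 0.12, 0.08, 0.06]
        dose_per_mg    = [0.2,  0.3,  0.5,  0.8,  1.0,  1.2,  1.0,  0.7,  0.5]
        inhaled_mass_mg = 10.0  # hypothetical total inhaled DU mass

        assert abs(sum(mass_fractions) - 1.0) < 1e-9  # the nine components span the aerosol

        dose = inhaled_mass_mg * sum(f * c for f, c in zip(mass_fractions, dose_per_mg))
        print(f"dose (arbitrary units): {dose:.2f}")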

  17. Stimulated emission depletion microscopy with optical fibers

    NASA Astrophysics Data System (ADS)

    Yan, Lu

    Imaging at the nanoscale and/or at remote locations holds great promise for studies in fields as disparate as the life sciences and materials sciences. One such microscopy technique, stimulated emission depletion (STED) microscopy, is one of several fluorescence based imaging techniques that offers resolution beyond the diffraction-limit. All current implementations of STED microscopy, however, involve the use of free-space beam shaping devices to achieve the Gaussian- and donut-shaped Orbital Angular Momentum (OAM) carrying beams at the desired colors -- a challenging prospect from the standpoint of device assembly and mechanical stability during operation. A fiber-based solution could address these engineering challenges, and perhaps more interestingly, it may facilitate endoscopic implementation of in vivo STED imaging, a prospect that has thus far not been realized because optical fibers were previously considered to be incapable of transmitting the OAM beams that are necessary for STED. In this thesis, we investigate fiber-based STED systems to enable endoscopic nanoscale imaging. We discuss the design and characteristics of a novel class of fibers supporting and stably propagating Gaussian and OAM modes. Optimization of the design parameters leads to stable excitation and depletion beams propagating in the same fiber in the visible spectral range, for the first time, with high efficiency (>99%) and mode purity (>98%). Using the fabricated vortex fiber, we demonstrate an all-fiber STED system with modes that are tolerant to perturbations, and we obtain naturally self-aligned PSFs for the excitation and depletion beams. Initial experiments of STED imaging using our device yields a 4-fold improvement in lateral resolution compared to confocal imaging. In an experiment in parallel, we show the means of using q-plates as free-space mode converters that yield alignment tolerant STED microscopy systems at wavelengths covering the entire visible spectrum, and hence

  18. More box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    A new investigation shows that, starting from the BCH (21,15;3) code represented as a 7 x 3 matrix and adding a row and column to add even parity, one obtains an 8 x 4 matrix (32,15;8) code. An additional dimension is obtained by specifying odd parity on the rows and even parity on the columns, i.e., adjoining to the 8 x 4 matrix, the matrix, which is zero except for the fourth column (of all ones). Furthermore, any seven rows and three columns will form the BCH (21,15;3) code. This box code has the same weight structure as the quadratic residue and BCH codes of the same dimensions. Whether there exists an algebraic isomorphism to either code is as yet unknown.
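
    A sketch of the parity-extension step only (generating the underlying BCH (21,15;3) codewords is outside this note): arrange 21 code bits as a 7 x 3 array, append an even-parity bit to each row and then an even-parity bit to each column, giving the 8 x 4 (32-bit) array described above.

        def extend_with_parity(bits21):
            """Extend a 21-bit word (7 x 3 array, row-major) to an 8 x 4 array with
            even parity on every row and column (the corner bit is the overall parity)."""
            assert len(bits21) == 21
            rows = [bits21[i * 3:(i + 1) * 3] for i in range(7)]
            rows = [r + [sum(r) % 2] for r in rows]                       # even parity per row
            rows.append([sum(r[j] for r in rows) % 2 for j in range(4)])  # even parity per column
            return rows

        for row in extend_with_parity([1, 0, 1] * 7):
            print(row)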

  19. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  20. Rewriting the Genetic Code.

    PubMed

    Mukai, Takahito; Lajoie, Marc J; Englert, Markus; Söll, Dieter

    2017-09-08

    The genetic code-the language used by cells to translate their genomes into proteins that perform many cellular functions-is highly conserved throughout natural life. Rewriting the genetic code could lead to new biological functions such as expanding protein chemistries with noncanonical amino acids (ncAAs) and genetically isolating synthetic organisms from natural organisms and viruses. It has long been possible to transiently produce proteins bearing ncAAs, but stabilizing an expanded genetic code for sustained function in vivo requires an integrated approach: creating recoded genomes and introducing new translation machinery that function together without compromising viability or clashing with endogenous pathways. In this review, we discuss design considerations and technologies for expanding the genetic code. The knowledge obtained by rewriting the genetic code will deepen our understanding of how genomes are designed and how the canonical genetic code evolved.

  1. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  2. Breaking the Neural Code

    DTIC Science & Technology

    2015-05-21

    This seedling proposed to use advanced imaging techniques to break the neuronal code that links the firing of neurons in... generating a closed-loop on-line experimental platform. We have completed all proposed tasks of the seedling and successfully completed preliminary

  3. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Order, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  4. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  5. Ptolemy Coding Style

    DTIC Science & Technology

    2014-09-05

    Technical report by Christopher Brooks and Edward A. Lee, Electrical Engineering and Computer Sciences, University of California at Berkeley. ...constraints, so such constraints are not new to the academic community. This document describes the coding style used in Ptolemy II, a package with

  6. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  7. Depletion of yeast PDK1 orthologs triggers a stress-like transcriptional response.

    PubMed

    Pastor-Flores, Daniel; Ferrer-Dalmau, Jofre; Bahí, Anna; Boleda, Martí; Biondi, Ricardo M; Casamayor, Antonio

    2015-09-21

    Pkh proteins are the PDK1 orthologs in S. cerevisiae. They have redundant and essential activity and are responsible for the phosphorylation of several members of the AGC family of protein kinases. Pkh proteins have been involved in several cellular functions, including cell wall integrity and endocytosis. However, the global expression changes caused by their depletion are still unknown. A doxycycline-repressible tetO7 promoter driving the expression of PKH2 in cells carrying deletions of the PKH1 and PKH3 genes allowed us to progressively deplete cells of Pkh proteins when treated with doxycycline. Global gene expression analysis indicates that depletion of Pkh results in the up-regulation of genes involved in the accumulation of glycogen and also of those related to stress responses. Moreover, genes involved in ion transport were quickly down-regulated when the levels of Pkh decreased. The reduction in the mRNA levels required for protein translation, however, was only observed after longer doxycycline treatment (24 h). We uncovered that Pkh is important for the proper transcriptional response to heat shock, and is mostly required for the effects driven by the transcription factors Hsf1 and Msn2/Msn4, but is not required for down-regulation of the mRNA coding for ribosomal proteins. By using the tetO7 promoter, we elucidated for the first time the transcriptomic changes directly or indirectly caused by progressive depletion of Pkh. Furthermore, this system enabled the characterization of the transcriptional response triggered by heat shock in wild-type and Pkh-depleted cells, showing that about 40% of the observed expression changes were, to some degree, dependent on Pkh.

  8. Seasonal iron depletion in temperate shelf seas

    NASA Astrophysics Data System (ADS)

    Birchill, Antony J.; Milne, Angela; Woodward, E. Malcolm S.; Harris, Carolyn; Annett, Amber; Rusiecka, Dagmara; Achterberg, Eric P.; Gledhill, Martha; Ussher, Simon J.; Worsfold, Paul J.; Geibert, Walter; Lohan, Maeve C.

    2017-09-01

    Our study followed the seasonal cycling of soluble (SFe), colloidal (CFe), dissolved (DFe), total dissolvable (TDFe), labile particulate (LPFe), and total particulate (TPFe) iron in the Celtic Sea (NE Atlantic Ocean). Preferential uptake of SFe occurred during the spring bloom, preceding the removal of CFe. Uptake and export of Fe during the spring bloom, coupled with a reduction in vertical exchange, led to Fe-depleted surface waters (<0.2 nM DFe; 0.11 nM LPFe, 0.45 nM TDFe, and 1.84 nM TPFe) during summer stratification. Below the seasonal thermocline, DFe concentrations increased from spring to autumn, mirroring NO3- and consistent with supply from remineralized sinking organic material, and cycled independently of particulate Fe over seasonal timescales. These results demonstrate that summer Fe availability is comparable to the seasonally Fe-limited Ross Sea shelf and therefore is likely low enough to affect phytoplankton growth and species composition.

  9. Chemical and radiological toxicity of depleted uranium.

    PubMed

    Sztajnkrycer, Matthew D; Otten, Edward J

    2004-03-01

    A by-product of the uranium enrichment process, depleted uranium (DU) contains approximately 40% of the radioactivity of natural uranium yet retains all of its chemical properties. After its use in the 1991 Gulf War, public concern increased regarding its potential radiotoxicant properties. Whereas in vitro and rodent data have suggested the potential for uranium-induced carcinogenesis, human cohort studies assessing the health effects of natural uranium and DU have failed to validate these findings. Heavy-metal nephrotoxicity has not been noted in either animal studies or Gulf War veteran cohort studies despite markedly elevated urinary uranium excretion. No significant residual environmental contamination has been found in geographical areas exposed to DU. As such, although continued surveillance of exposed cohorts and environments (particularly water sources) is recommended, current data would support the position that DU poses neither a radiological nor a chemical threat.

  10. Ozone depletion following future volcanic eruptions

    NASA Astrophysics Data System (ADS)

    Eric Klobas, J.; Wilmouth, David M.; Weisenstein, Debra K.; Anderson, James G.; Salawitch, Ross J.

    2017-07-01

    While explosive volcanic eruptions cause ozone loss in the current atmosphere due to an enhancement in the availability of reactive chlorine following the stratospheric injection of sulfur, future eruptions are expected to increase total column ozone as halogen loading approaches preindustrial levels. The timing of this shift in the impact of major volcanic eruptions on the thickness of the ozone layer is poorly known. Modeling four possible climate futures, we show that scenarios with the smallest increase in greenhouse gas concentrations lead to the greatest risk to ozone from heterogeneous chemical processing following future eruptions. We also show that the presence in the stratosphere of bromine from natural, very short-lived biogenic compounds is critically important for determining whether future eruptions will lead to ozone depletion. If volcanic eruptions inject hydrogen halides into the stratosphere, an effect not considered in current ozone assessments, potentially profound reductions in column ozone would result.

  11. Magnesium depletion enhances cisplatin-induced nephrotoxicity.

    PubMed

    Lajer, H; Kristensen, M; Hansen, H H; Nielsen, S; Frøkiaer, J; Ostergaard, L F; Christensen, S; Daugaard, G; Jonassen, T E N

    2005-11-01

    Nephrotoxicity and magnesium (Mg) depletion are well-known side effects of cisplatin (CP) treatment. The purpose of the present study was to investigate the role of Mg in CP-induced changes in renal function. CP-induced renal dysfunction was achieved by treatment with CP or vehicle (2.5 mg/kg) once weekly for 3 weeks. Since the CP-induced renal damage, including tubular reabsorption defects, is most prominent within the outer medulla (OM), changes in the expression pattern of OM aquaporins and sodium transporters, including the Na,K-ATPase (alpha-subunit), type III Na,H-exchanger (NHE3), aquaporin 1 (AQP1) and 2 (AQP2), and the Na,K,2Cl-cotransporter (NKCC2), were investigated by semi-quantitative Western blotting. Rats had access to either a diet with standard Mg or a Mg-depleted diet. Cisplatin was administered to female Wistar rats once a week for 3 weeks according to four regimens: (1) Cisplatin 2.5 mg/kg body weight i.p. to rats on a diet with standard Mg, (2) Cisplatin 2.5 mg/kg body weight i.p. to rats on a diet with low Mg, (3) Isotonic NaCl 2.5 ml/kg body weight i.p. to rats on a diet with standard Mg, (4) Isotonic NaCl 2.5 ml/kg body weight i.p. to rats on a diet with low Mg. CP had no effect on plasma creatinine or urea in rats with standard Mg intake, but the expression of all five transporters was significantly reduced when compared to vehicle-treated rats on standard Mg intake. Vehicle-treated rats on low Mg intake had a significant reduction in the expression of Na,K-ATPase, NHE3 and NKCC2, but unchanged expression levels of AQP1 or AQP2 when compared to standard-treated controls. Forty percent of the CP-treated rats on low Mg intake died during the experiment and the remaining animals had markedly increased plasma creatinine and urea. Furthermore, the Western blot analysis revealed an almost complete disappearance of all four transporters, suggesting a dramatic synergistic effect of CP and Mg depletion on renal function including the expression

  12. Processing depleted uranium quad alloy penetrator rods

    SciTech Connect

    Bokan, S.L.

    1987-02-19

    Two depleted uranium (DU) quad alloys were cast, extruded and rolled to produce penetrator rods. The two alloy combinations were (1) 1 wt % molybdenum (Mo), 1 wt % niobium (Nb), and 0.75 wt % titanium (Ti); and (2) 1 wt % tantalum (Ta), 1 wt % Nb, and 0.75 wt % Ti. This report covers the processing and results with limited metallographic information available. The two alloys were each vacuum induction melted (VIM) into an 8-in. log, extruded into a 3-in. log, then cut into 4 logs and extruded at 4 different temperatures into 0.8-in. bars. From the 8 conditions (2 alloys, 4 extrusion temperatures each), 10 to 13 16-in. rods were cut for rolling and swaging. Due to cracking problems, the final processing changed from rolling and swaging to limited rolling and heat treating. The contracted work was completed with the delivery of 88 rods to Dr. Zabielski. 28 figs.

  13. Tylosin depletion from edible pig tissues.

    PubMed

    Prats, C; El Korchi, G; Francesch, R; Arboix, M; Pérez, B

    2002-12-01

    The depletion of tylosin from edible pig tissues was studied following 5 days of intramuscular (i.m.) administration of 10 mg/kg of tylosin to 16 crossbreed pigs. Animals were slaughtered at intervals after treatment and samples of muscle, kidney, liver, skin+fat, and injection site were collected and analysed by high-performance liquid chromatography (HPLC). Seven days after the completion of treatment, the concentration of tylosin in kidney, skin+fat, and at the injection site was higher than the European Union maximal residue limit (MRL) of 100 microg/kg. Tylosin residues in all tissues were below the quantification limit (50 microg/kg) at 10 and 14 days post-treatment.

  14. Anxiety, ego depletion, and sports performance.

    PubMed

    Englert, Chris; Bertrams, Alex

    2012-10-01

    In the present article, we analyzed the role of self-control strength and state anxiety in sports performance. We tested the hypothesis that self-control strength and state anxiety interact in predicting sports performance on the basis of two studies, each using a different sports task (Study 1: performance in a basketball free throw task, N = 64; Study 2: performance in a dart task, N = 79). The patterns of results were as expected in both studies: Participants with depleted self-control strength performed worse in the specific tasks as their anxiety increased, whereas there was no significant relation for participants with fully available self-control strength. Furthermore, different degrees of available self-control strength did not predict performance in participants who were low in state anxiety, but did in participants who were high in state anxiety. Thus increasing self-control strength could reduce the negative anxiety effects in sports and improve athletes' performance under pressure.

  15. Arctic Ozone Depletion from UARS MLS Measurements

    NASA Technical Reports Server (NTRS)

    Manney, G. L.

    1995-01-01

    Microwave Limb Sounder (MLS) measurements of ozone during four Arctic winters are compared. The evolution of ozone in the lower stratosphere is related to temperature, chlorine monoxide (also measured by MLS), and the evolution of the polar vortex. Lagrangian transport calculations using winds from the United Kingdom Meteorological Office's Stratosphere-Troposphere Data Assimilation system are used to estimate to what extent the evolution of lower stratospheric ozone is controlled by dynamics. Observations, along with calculations of the expected dynamical behavior, show evidence for chemical ozone depletion throughout most of the Arctic lower stratospheric vortex during the 1992-93 middle and late winter, and during all of the 1994-95 winter that was observed by MLS. Both of these winters were unusually cold and had unusually strong Arctic polar vortices compared to meteorological data over the past 17 years.

  16. Policies on global warming and ozone depletion

    SciTech Connect

    Green, B.

    1987-04-01

    The recent discovery of a dramatic seasonal drop in the amount of ozone over Antarctica has catalyzed concern for protection of stratospheric ozone, the layer of gas that shields the entire planet from excess ultraviolet radiation. Conservative scientific models predict about a 5% reduction in the amount of global ozone by the middle of the next century, with large local variations. The predicted global warming from increased emissions of greenhouse gases will also have differing effects on local climate and weather conditions and consequently on agriculture. Although numerous uncertainties are associated with both ozone depletion and a global warming, there is a consensus that world leaders need to address the problems. The US Congress is now beginning to take note of the task. In this article, one representative outlines some perceptions of the problems and the policy options available to Congress.

  17. OrigenArp Primer: How to Perform Isotopic Depletion and Decay Calculations with SCALE/ORIGEN

    SciTech Connect

    Bowman, Stephen M; Gauld, Ian C

    2010-08-01

    The SCALE (Standardized Computer Analyses for Licensing Evaluation) computer software system developed at Oak Ridge National Laboratory is widely used and accepted around the world for nuclear analyses. ORIGEN-ARP is a SCALE isotopic depletion and decay analysis sequence used to perform point-depletion calculations with the well-known ORIGEN-S code using problem-dependent cross sections. Problem-dependent cross-section libraries are generated using the ARP (Automatic Rapid Processing) module using an interpolation algorithm that operates on pre-generated libraries created for a range of fuel properties and operating conditions. Methods are provided in SCALE to generate these libraries using one-, two-, and three-dimensional transport codes. The interpolation of cross sections for uranium-based fuels may be performed for the variables burnup, enrichment, and water density. An option is also available to interpolate cross sections for mixed-oxide (MOX) fuels using the variables burnup, plutonium content, plutonium isotopic vector, and water moderator density. This primer is designed to help a new user understand and use ORIGEN-ARP with the OrigenArp Windows graphical user interface in SCALE. It assumes that the user has a college education in a technical field. There is no assumption of familiarity with nuclear depletion codes in general or with SCALE/ORIGEN-ARP in particular. The primer is based on SCALE 6 but should be applicable to earlier or later versions of SCALE. Information is included to help new users, along with several sample problems that walk the user through the different input forms and menus and illustrate the basic features. References to related documentation are provided. The primer provides a starting point for the nuclear analyst who uses SCALE/ORIGEN-ARP. Complete descriptions are provided in the SCALE documentation. Although the primer is self-contained, it is intended as a companion volume to the SCALE documentation. The SCALE Manual is
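
    As a rough sketch of the interpolation idea behind ARP (not its actual algorithm, grid, or data), the snippet below interpolates a one-group cross section tabulated on a hypothetical (burnup, enrichment, moderator density) grid; every axis value and table entry is a placeholder.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical pre-generated library axes, loosely analogous to ARP's parameters.
burnup_gwd_t   = np.array([0.0, 10.0, 20.0, 40.0, 60.0])   # GWd/MTU
enrichment_pct = np.array([2.0, 3.0, 4.0, 5.0])            # wt% U-235
mod_density    = np.array([0.3, 0.5, 0.7, 0.9])            # g/cm^3

# Placeholder one-group cross sections tabulated on that grid (random numbers).
rng = np.random.default_rng(0)
sigma_table = rng.uniform(10.0, 50.0,
                          size=(burnup_gwd_t.size, enrichment_pct.size, mod_density.size))

interp = RegularGridInterpolator((burnup_gwd_t, enrichment_pct, mod_density),
                                 sigma_table)

# Problem-dependent cross section for one state point:
sigma = interp([[25.0, 4.2, 0.72]])[0]
print(f"interpolated one-group cross section: {sigma:.2f} (placeholder units)")
```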

  18. Transonic airfoil codes

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1979-01-01

    Computer codes for the design and analysis of transonic airfoils are considered. The design code relies on the method of complex characteristics in the hodograph plane to construct shockless airfoil. The analysis code uses artificial viscosity to calculate flows with weak shock waves at off-design conditions. Comparisons with experiments show that an excellent simulation of two dimensional wind tunnel tests is obtained. The codes have been widely adopted by the aircraft industry as a tool for the development of supercritical wing technology.

  19. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  20. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  1. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  2. Human podocyte depletion in association with older age and hypertension.

    PubMed

    Puelles, Victor G; Cullen-McEwen, Luise A; Taylor, Georgina E; Li, Jinhua; Hughson, Michael D; Kerr, Peter G; Hoy, Wendy E; Bertram, John F

    2016-04-01

    Podocyte depletion plays a major role in the development and progression of glomerulosclerosis. Many kidney diseases are more common in older age and often coexist with hypertension. We hypothesized that podocyte depletion develops in association with older age and is exacerbated by hypertension. Kidneys from 19 adult Caucasian American males without overt renal disease were collected at autopsy in Mississippi. Demographic data were obtained from medical and autopsy records. Subjects were categorized by age and hypertension as potential independent and additive contributors to podocyte depletion. Design-based stereology was used to estimate individual glomerular volume and total podocyte number per glomerulus, which allowed the calculation of podocyte density (number per volume). Podocyte depletion was defined as a reduction in podocyte number (absolute depletion) or podocyte density (relative depletion). The cortical location of glomeruli (outer or inner cortex) and presence of parietal podocytes were also recorded. Older age was an independent contributor to both absolute and relative podocyte depletion, featuring glomerular hypertrophy, podocyte loss, and thus reduced podocyte density. Hypertension was an independent contributor to relative podocyte depletion by exacerbating glomerular hypertrophy, mostly in glomeruli from the inner cortex. However, hypertension was not associated with podocyte loss. Absolute and relative podocyte depletion were exacerbated by the combination of older age and hypertension. The proportion of glomeruli with parietal podocytes increased with age but not with hypertension alone. These findings demonstrate that older age and hypertension are independent and additive contributors to podocyte depletion in white American men without kidney disease.

  3. Modelling chemical depletion profiles in regolith

    USGS Publications Warehouse

    Brantley, S.L.; Bandstra, J.; Moore, J.; White, A.F.

    2008-01-01

    Chemical or mineralogical profiles in regolith display reaction fronts that document depletion of leachable elements or minerals. A generalized equation employing lumped parameters was derived to model such ubiquitously observed patterns: C = C0 / {[(C0 - Cx=0) / Cx=0] · exp(λini · k̄ · x) + 1}. Here C, Cx=0, and C0 are the concentrations of an element at a given depth x, at the top of the reaction front, and in the parent, respectively; λini is the roughness of the dissolving mineral in the parent and k̄ is a lumped kinetic parameter. This kinetic parameter is an inverse function of the pore-fluid advective velocity and a direct function of the dissolution rate constant times mineral surface area per unit volume regolith. This model equation fits profiles of concentration versus depth for albite in seven weathering systems and is consistent with the interpretation that the surface area (m2 mineral per m3 bulk regolith) varies linearly with the concentration of the dissolving mineral across the front. Dissolution rate constants can be calculated from the lumped fit parameters for these profiles using observed values of weathering advance rate, the proton driving force, the geometric surface area per unit volume regolith and the parent concentration of albite. These calculated values of the dissolution rate constant compare favorably to literature values. The model equation, useful for reaction fronts in both steady-state erosional and quasi-stationary non-erosional systems, incorporates the variation of reaction affinity using pH as a master variable. Use of this model equation to fit depletion fronts for soils highlights the importance of buffering of pH in the soil system. Furthermore, the equation should allow better understanding of the effects of important environmental variables on weathering rates.
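
    A minimal Python sketch that simply evaluates the lumped-parameter profile quoted above; the parameter values and the sign given to the exponent are illustrative choices and are not fitted to any of the seven weathering systems.

```python
import numpy as np

def depletion_profile(x, c_parent, c_top, lam_k):
    """Concentration C(x) across a reaction front following the sigmoidal
    lumped-parameter form above; lam_k stands in for the product of the
    roughness (lambda_ini) and the lumped kinetic parameter k-bar."""
    ratio = (c_parent - c_top) / c_top
    return c_parent / (ratio * np.exp(lam_k * x) + 1.0)

# Illustrative albite-like profile: depleted at the front top, parent value at depth.
x = np.linspace(0.0, 10.0, 6)
print(depletion_profile(x, c_parent=0.20, c_top=0.02, lam_k=-0.8))
```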

  4. A fast and flexible reactor physics model for simulating neutron spectra and depletion in fast reactors

    NASA Astrophysics Data System (ADS)

    Recktenwald, Geoff; Deinert, Mark

    2010-03-01

    Determining the time-dependent concentration of isotopes within a nuclear reactor core is central to the analysis of nuclear fuel cycles. We present a fast, flexible tool for determining the time-dependent neutron spectrum within fast reactors. The code (VBUDS: visualization, burnup, depletion and spectra) uses a two-region, multigroup collision probability model to simulate the energy-dependent neutron flux and tracks the buildup and burnout of 24 actinides, as well as fission products. While originally developed for LWR simulations, the model is shown to produce fast reactor spectra that show a high degree of fidelity to available fast reactor benchmarks.
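
    To illustrate the kind of depletion bookkeeping such a tool performs (this is not VBUDS itself), here is a minimal matrix-exponential burnup step for a toy three-nuclide chain under a constant one-group flux; the cross sections, decay constants, flux, and time step are made-up numbers, not evaluated data.

```python
import numpy as np
from scipy.linalg import expm

phi = 3.0e14                                   # one-group neutron flux (n/cm^2/s)
sigma_a = np.array([5.0e-24, 2.0e-24, 0.0])    # absorption cross sections (cm^2)
lam = np.array([0.0, 1.0e-9, 0.0])             # decay constants (1/s)

# Burnup matrix for the chain A -(capture)-> B -(decay)-> C:
# losses on the diagonal, gains on the off-diagonals.
A = np.diag(-(sigma_a * phi + lam))
A[1, 0] = sigma_a[0] * phi                     # A captures into B
A[2, 1] = lam[1]                               # B decays into C

n0 = np.array([1.0e24, 0.0, 0.0])              # initial number densities (1/cm^3)
dt = 30 * 24 * 3600.0                          # one 30-day depletion step
n1 = expm(A * dt) @ n0                         # Bateman solution over the step
print(n1)
```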

  5. Alternatives for Disposal of Depleted Uranium Waste.

    DTIC Science & Technology

    1985-11-01

    ...induce these symptoms (Reference 2). This means that a 180-pound person would require a concentrated intravenous dose of 6-7 mg UO2(NO3... pyrophoric as is done in the case of zirconium and hafnium metals in 49 CFR 173.214. An interpretation was informally requested from representatives of the... 10 Code of Federal Regulations Part 20: Prior to 1981, Section 20.304 of the Standards for Protection Against Radiation, 10 CFR 20, provided general

  6. Brief mindfulness induction could reduce aggression after depletion.

    PubMed

    Yusainy, Cleoputri; Lawrence, Claire

    2015-05-01

    Many experiments have shown that one's ability to refrain from acting on aggressive impulses is likely to decrease following a prior act of self-control. This temporary state of self-control failure is known as ego-depletion. Although mindfulness is increasingly used to treat and manage aggressive behaviour, the extent to which mindfulness may counteract the depletion effect on aggression is yet to be determined. This study (N=110) investigated the effect of a one-time, laboratory-induced mindfulness meditation session on aggression following depletion. Aggression was assessed by the intensity of aversive noise blasts participants delivered to an opponent on a computerised task. Depleted participants who received the mindfulness induction behaved less aggressively than depleted participants with no mindfulness induction. Mindfulness also improved performance on a second measure of self-control (i.e., handgrip perseverance); however, this effect was independent of depletion condition. Motivational factors may help explain the dynamics of mindfulness, self-control, and aggression.

  7. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding.

    PubMed

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance; however, it also brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time: a novel low-complexity coding tree mechanism is proposed for fast HEVC coding unit (CU) encoding. Firstly, the paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting the CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address the probabilistic model distortion caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time, by 27% for lossy coding and 42% for visually lossless and lossless coding. The proposed low-complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions.
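
    The abstract does not give the model itself, so the sketch below is only a loose, hypothetical illustration of the general idea: a split-probability table keyed on depth and QP that is consulted for early termination and reset when the content changes. The class name, features, pseudo-counts, and threshold are all invented.

```python
from collections import defaultdict

class CUSplitPredictor:
    """Toy split-probability model for early CU termination (illustrative only)."""

    def __init__(self, threshold=0.1):
        self.threshold = threshold
        self.split = defaultdict(lambda: 1)    # pseudo-counts per (depth, QP bin)
        self.total = defaultdict(lambda: 2)

    def p_split(self, depth, qp):
        key = (depth, qp // 6)                 # coarse QP binning
        return self.split[key] / self.total[key]

    def should_try_split(self, depth, qp):
        # Skip evaluating deeper CUs when a split is historically unlikely.
        return self.p_split(depth, qp) >= self.threshold

    def update(self, depth, qp, was_split, content_changed=False):
        key = (depth, qp // 6)
        if content_changed:                    # crude reset when the content changes
            self.split[key], self.total[key] = 1, 2
        self.split[key] += int(was_split)
        self.total[key] += 1

# Usage sketch: consult before recursing, then record the encoder's decision.
pred = CUSplitPredictor()
if pred.should_try_split(depth=1, qp=32):
    pass                                       # evaluate the four sub-CUs here
pred.update(depth=1, qp=32, was_split=True)
```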

  8. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance; however, it also brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time: a novel low-complexity coding tree mechanism is proposed for fast HEVC coding unit (CU) encoding. Firstly, the paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting the CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address the probabilistic model distortion caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time, by 27% for lossy coding and 42% for visually lossless and lossless coding. The proposed low-complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions. PMID:26999741

  9. Insurance billing and coding.

    PubMed

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  10. Coding Acoustic Metasurfaces.

    PubMed

    Xie, Boyang; Tang, Kun; Cheng, Hua; Liu, Zhengyou; Chen, Shuqi; Tian, Jianguo

    2017-02-01

    Coding acoustic metasurfaces can combine simple logical bits to acquire sophisticated functions in wave control. The acoustic logical bits can achieve a phase difference of exactly π and a perfect match of the amplitudes for the transmitted waves. By programming the coding sequences, acoustic metasurfaces with various functions, including creating peculiar antenna patterns and wave focusing, have been demonstrated.
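
    As an illustration of how a 1-bit coding sequence shapes the transmitted field, the sketch below computes the far-field array factor of a periodic '01' phase coding (0 or π per element); the frequency, element pitch, and sequence are arbitrary choices, not values from the paper.

```python
import numpy as np

c = 343.0                       # speed of sound in air (m/s)
f = 8000.0                      # frequency (Hz)
k = 2 * np.pi * f / c           # wavenumber
d = 0.03                        # element pitch (m)
bits = np.array([0, 1] * 8)     # periodic '01' coding sequence, 16 elements

theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
n = np.arange(bits.size)
phase = np.pi * bits            # each '1' element adds a pi phase shift
af = np.abs(np.exp(1j * (k * d * np.outer(np.sin(theta), n) + phase)).sum(axis=1))

# A '01' coding with pi contrast suppresses the normal direction and throws two
# symmetric beams near sin(theta) = +/- lambda / (2 d).
print("peak response at (deg):", round(float(np.degrees(theta[np.argmax(af)])), 1))
```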

  11. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  12. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  13. Pseudonoise code tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T. (Inventor)

    1980-01-01

    A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.
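
    A generic early-late delay-locked-loop sketch, showing how the imbalance between early and late correlators can steer a local PN replica onto the incoming code phase; the code length, sampling rate, loop gain, and noise level are illustrative and do not reflect the patented design.

```python
import numpy as np

rng = np.random.default_rng(1)
chips = rng.choice([-1.0, 1.0], size=127)      # one period of a PN reference code
sps = 8                                        # samples per chip

def replica(delay_chips, n_samples):
    """Sampled PN replica delayed by a (possibly fractional) number of chips."""
    t = (np.arange(n_samples) / sps - delay_chips) % chips.size
    return chips[t.astype(int)]

n = 4 * chips.size * sps
incoming = replica(23.4, n) + 0.1 * rng.standard_normal(n)   # true delay: 23.4 chips

est, gain, spacing = 23.0, 0.7, 0.5            # initial estimate within one chip
for _ in range(30):
    early = np.dot(incoming, replica(est - spacing, n)) / n
    late = np.dot(incoming, replica(est + spacing, n)) / n
    est -= gain * (early - late)               # early/late imbalance steers the phase

print(f"estimated code delay: {est:.2f} chips (true 23.40)")
```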

  14. Modified JPEG Huffman coding.

    PubMed

    Lakhani, Gopal

    2003-01-01

    It is a well-observed characteristic that when a DCT block is traversed in the zigzag order, the AC coefficients generally decrease in size and the run-lengths of zero coefficients increase. This article presents a minor modification to the Huffman coding of the JPEG baseline compression algorithm to exploit this redundancy. For this purpose, DCT blocks are divided into bands so that each band can be coded using a separate code table. Three implementations are presented, which all move the end-of-block marker up in the middle of the DCT block and use it to indicate the band boundaries. Experimental results are presented to compare the reduction in code size obtained by our methods with the JPEG sequential-mode Huffman coding and arithmetic coding methods. The average code reduction relative to the total image code size for one of our methods is 4%. Our methods can also be used for progressive image transmission and hence, experimental results are also given to compare them with two-, three-, and four-band implementations of the JPEG spectral selection method.
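
    As a sketch of the band idea, the snippet below traverses an 8 x 8 block of DCT coefficients in zigzag order and splits the AC coefficients into bands, each of which could then be assigned its own Huffman table; the band boundaries are arbitrary, and the paper's actual tables and end-of-block handling are not reproduced.

```python
import numpy as np

def zigzag_indices(n=8):
    """Standard zigzag scan order for an n x n block."""
    return sorted(((i, j) for i in range(n) for j in range(n)),
                  key=lambda p: (p[0] + p[1], p[0] if (p[0] + p[1]) % 2 else p[1]))

def split_into_bands(dct_block, boundaries=(1, 6, 21, 64)):
    """Return the DC coefficient and the AC coefficients grouped into bands."""
    zz = [dct_block[i, j] for i, j in zigzag_indices()]
    bands = [zz[a:b] for a, b in zip(boundaries[:-1], boundaries[1:])]
    return zz[0], bands

block = np.round(np.random.default_rng(2).normal(0, 10, (8, 8))).astype(int)
dc, bands = split_into_bands(block)
print("DC:", dc, "| AC band sizes:", [len(b) for b in bands])
```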

  15. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  16. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  17. Lichenase and coding sequences

    SciTech Connect

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-β-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  18. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  19. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses.

  20. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  1. Energy Conservation Code Decoded

    SciTech Connect

    Cole, Pam C.; Taylor, Zachary T.

    2006-09-01

    Designing an energy-efficient, affordable, and comfortable home is a lot easier thanks to a slim, easier-to-read booklet, the 2006 International Energy Conservation Code (IECC), published in March 2006. States, counties, and cities have begun reviewing the new code as a potential upgrade to their existing codes. Maintained under the public consensus process of the International Code Council, the IECC is designed to do just what its title says: promote the design and construction of energy-efficient homes and commercial buildings. "Homes" in this case means traditional single-family homes, duplexes, condominiums, and apartment buildings having three or fewer stories. The U.S. Department of Energy, which played a key role in proposing the changes that resulted in the new code, is offering a free training course that covers the residential provisions of the 2006 IECC.

  2. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  3. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code, namely UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes in not a few organisms and a number of mitochondria, shows that the genetic code is not universal, and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory stating that all the code changes are non-disruptive without accompanied changes of amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287
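
    As a small illustration of what a codon reassignment means in practice, the sketch below translates the same made-up RNA with a standard-code table and with a deviant table in which the "universal" stop codon UGA is read as tryptophan, as in Mycoplasma capricolum; only a handful of codons are included for brevity.

```python
# Minimal codon tables: standard code vs. a deviant code with UGA -> Trp.
STANDARD = {"AUG": "M", "UUU": "F", "UGG": "W", "GGC": "G", "UGA": "*", "UAA": "*"}
DEVIANT = dict(STANDARD, UGA="W")      # codon reassignment: UGA read as tryptophan

def translate(rna, table):
    protein = []
    for i in range(0, len(rna) - 2, 3):
        aa = table.get(rna[i:i + 3], "X")
        if aa == "*":                  # stop codon terminates translation
            break
        protein.append(aa)
    return "".join(protein)

rna = "AUGUUUUGAGGCUAA"                # made-up message: AUG UUU UGA GGC UAA
print(translate(rna, STANDARD))        # 'MF'   -- UGA read as stop
print(translate(rna, DEVIANT))         # 'MFWG' -- UGA read as tryptophan
```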

  4. ELEMENTAL DEPLETIONS IN THE MAGELLANIC CLOUDS AND THE EVOLUTION OF DEPLETIONS WITH METALLICITY

    SciTech Connect

    Tchernyshyov, Kirill; Meixner, Margaret; Seale, Jonathan; Fox, Andrew; Friedman, Scott D.; Dwek, Eli; Galliano, Frédéric

    2015-10-01

    We present a study of the composition of gas and dust in the Large and Small Magellanic Clouds (LMC and SMC) using UV absorption spectroscopy. We measure P II and Fe II along 84 spatially distributed sightlines toward the MCs using archival Far Ultraviolet Spectroscopic Explorer observations. For 16 of those sightlines, we also measure Si II, Cr II, and Zn II from new Hubble Space Telescope Cosmic Origins Spectrograph observations. We analyze these spectra using a new spectral line analysis technique based on a semi-parametric Voigt profile model. We have combined these measurements with H I and H2 column densities and reference stellar abundances from the literature to derive gas-phase abundances, depletions, and gas-to-dust ratios (GDRs). Of our 84 P and 16 Zn measurements, 80 and 13, respectively, are depleted by more than 0.1 dex, suggesting that P and Zn abundances are not accurate metallicity indicators at and above the metallicity of the SMC. Si, Cr, and Fe are systematically less depleted in the SMC than in the Milky Way (MW) or LMC. The minimum Si depletion in the SMC is consistent with zero. We find GDR ranges of 190–565 in the LMC and 480–2100 in the SMC, which is broadly consistent with GDRs from the literature. These ranges represent actual location-to-location variation and are evidence of dust destruction and/or growth in the diffuse neutral phase of the interstellar medium. Where they overlap in metallicity, the gas-phase abundances of the MW, LMC, and SMC and damped Lyα systems evolve similarly with metallicity.
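
    The bookkeeping behind a gas-phase depletion can be written in a few lines: compare the measured column density ratio N(X)/N(H) with a reference abundance. The sketch below does this for a hypothetical Fe II sightline; the column densities and reference abundance are placeholders, not measurements from this survey.

```python
import numpy as np

def depletion_dex(logN_X, logN_H, log_ref_abundance):
    """delta(X) = log10[N(X)/N(H)] - log10(X/H)_ref, in dex (negative = depleted)."""
    return (logN_X - logN_H) - log_ref_abundance

# Hypothetical sightline: log N(Fe II) = 14.8, log N(H) = 21.3, log(Fe/H)_ref = -4.5.
delta_fe = depletion_dex(14.8, 21.3, -4.5)
fraction_in_dust = 1.0 - 10 ** delta_fe        # fraction of Fe locked in grains
print(f"Fe depletion: {delta_fe:.2f} dex; fraction in dust: {fraction_in_dust:.3f}")
```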

  5. Producing, Importing, and Exporting Ozone-Depleting Substances

    EPA Pesticide Factsheets

    Overview page provides links to information on producing, importing, and exporting ozone-depleting substances, including information about the HCFC allowance system, importing, labeling, recordkeeping and reporting.

  6. Gas generation matrix depletion quality assurance project plan

    SciTech Connect

    1998-05-01

    The Los Alamos National Laboratory (LANL) is to provide the necessary expertise, experience, equipment and instrumentation, and management structure to: Conduct the matrix depletion experiments using simulated waste for quantifying matrix depletion effects; and Conduct experiments on 60 cylinders containing simulated TRU waste to determine the effects of matrix depletion on gas generation for transportation. All work for the Gas Generation Matrix Depletion (GGMD) experiment is performed according to the quality objectives established in the test plan and under this Quality Assurance Project Plan (QAPjP).

  7. Glutathione depletion in tissues after administration of buthionine sulphoximine

    SciTech Connect

    Minchinton, A.I.; Rojas, A.; Smith, A.; Soranson, J.A.; Shrieve, D.C.; Jones, N.R.; Bremner, J.C.

    1984-08-01

    Buthionine sulphoximine (BSO) an inhibitor of glutathione (GSH) biosynthesis, was administered to mice in single and repeated doses. The resultant pattern of GSH depletion was studied in liver, kidney, skeletal muscle and three types of murine tumor. Liver and kidney exhibited a rapid depletion of GSH. Muscle was depleted to a similar level, but at a slower rate after a single dose. All three tumors required repeated administration of BSO over several days to obtain a similar degree of depletion to that shown in the other tissues.

  8. Bipolar optical pulse coding for performance enhancement in BOTDA sensors.

    PubMed

    Soto, Marcelo A; Le Floch, Sébastien; Thévenaz, Luc

    2013-07-15

    A pump signal based on bipolar pulse coding and single-sideband suppressed-carrier (SSB-SC) modulation is proposed for Brillouin optical time-domain analysis (BOTDA) sensors. Making sequential use of the Brillouin gain and loss spectra, the technique is experimentally validated using bipolar complementary-correlation Golay codes along a 100 km-long fiber and 2 m spatial resolution, fully resolving a 2 m hot-spot at the end of the sensing fiber with no distortion introduced by the decoding algorithm. Experimental results, in good agreement with the theory, indicate that bipolar Golay codes provide a higher signal-to-noise ratio enhancement and stronger robustness to pump depletion in comparison to optimum unipolar pulse codes known for BOTDA sensing.
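
    To illustrate why complementary (Golay) sequences are attractive for coded BOTDA, the sketch below builds a bipolar Golay pair with the standard recursive construction and checks that the sidelobes of the two autocorrelations cancel when summed; the code length is arbitrary and unrelated to the sequences used in the experiment.

```python
import numpy as np

def golay_pair(n_iterations):
    """Standard recursive construction of a bipolar Golay complementary pair."""
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(n_iterations):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

a, b = golay_pair(6)                              # length-64 complementary pair
acorr = lambda x: np.correlate(x, x, mode="full")
total = acorr(a) + acorr(b)

assert np.isclose(total[len(a) - 1], 2 * len(a))          # central peak = 2N
assert np.allclose(np.delete(total, len(a) - 1), 0.0)     # sidelobes cancel exactly
print("Golay pair of length", len(a), "- complementary autocorrelation verified")
```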

  9. Regret causes ego-depletion and finding benefits in the regrettable events alleviates ego-depletion.

    PubMed

    Gao, Hongmei; Zhang, Yan; Wang, Fang; Xu, Yan; Hong, Ying-Yi; Jiang, Jiang

    2014-01-01

    This study tested the hypotheses that experiencing regret would result in ego-depletion, while finding benefits (i.e., "silver linings") in the regret-eliciting events counteracted the ego-depletion effect. Using a modified gambling paradigm (Experiments 1, 2, and 4) and a retrospective method (Experiments 3 and 5), five experiments were conducted to induce regret. Results revealed that experiencing regret undermined performance on subsequent tasks, including a paper-and-pencil calculation task (Experiment 1), a Stroop task (Experiment 2), and a mental arithmetic task (Experiment 3). Furthermore, finding benefits in the regret-eliciting events improved subsequent performance (Experiments 4 and 5), and this improvement was mediated by participants' perceived vitality (Experiment 4). This study extended the depletion model of self-regulation by considering emotions with self-conscious components (in our case, regret). Moreover, it provided a comprehensive understanding of how people felt and performed after experiencing regret and after finding benefits in the events that caused the regret.

  10. If ego depletion cannot be studied using identical tasks, it is not ego depletion.

    PubMed

    Lange, Florian

    2015-01-01

    The hypothesis that human self-control capacities are fueled by glucose has been challenged on multiple grounds. A recent study by Lange and Eggert adds to this criticism by presenting two powerful but unsuccessful attempts to replicate the effect of sugar drinks on ego depletion. The dual-task paradigms employed in these experiments have been criticized for involving identical self-control tasks, a methodology that has been argued to reduce participants' willingness to exert self-control. The present article addresses this criticism by demonstrating that there is no indication to believe that the study of glucose effects on ego depletion should be restricted to paradigms using dissimilar acts of self-control. Failures to observe such effects in paradigms involving identical tasks pose a serious problem to the proposal that self-control exhaustion might be reversed by rinsing or ingesting glucose. In combination with analyses of statistical credibility, the experiments by Lange and Eggert suggest that the influence of sugar on ego depletion has been systematically overestimated.

  11. How Ego Depletion Affects Sexual Self-Regulation: Is It More Than Resource Depletion?

    PubMed

    Nolet, Kevin; Rouleau, Joanne-Lucine; Benbouriche, Massil; Carrier Emond, Fannie; Renaud, Patrice

    2015-12-21

    Rational thinking and decision making are impacted when in a state of sexual arousal. The inability to self-regulate arousal can be linked to numerous problems, like sexual risk taking, infidelity, and sexual coercion. Studies have shown that most men are able to exert voluntary control over their sexual excitation with various levels of success. Both situational and dispositional factors can influence self-regulation achievement. The goal of this research was to investigate how ego depletion, a state of low self-control capacity, interacts with personality traits (propensities for sexual excitation and inhibition) and cognitive absorption to cause sexual self-regulation failure. The sexual responses of 36 heterosexual males were assessed using penile plethysmography. They were asked to control their sexual arousal in two conditions, with and without ego depletion. Results suggest that ego depletion has opposite effects based on the trait of sexual inhibition, as moderately inhibited individuals showed an increase in performance while highly inhibited ones showed a decrease. These results challenge the limited resource model of self-regulation and point to the importance of considering how people adapt to acute and highly challenging conditions.

  12. Quantum convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng

    2014-12-01

    In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ≡ 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

  13. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.

  14. Embedded foveation image coding.

    PubMed

    Wang, Z; Bovik, A C

    2001-01-01

    The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.
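
    The foveation idea can be made concrete with a small sketch that builds a per-pixel importance map from an eccentricity-dependent cutoff frequency. This is not the authors' EFIC or FWQI implementation; the contrast-sensitivity parameters and viewing geometry below are assumed values of the kind used in the foveated-imaging literature.

```python
# Illustrative sketch (not the EFIC implementation): per-pixel foveation weights
# from an eccentricity-dependent cutoff frequency. Parameters are assumed values.
import numpy as np

def foveation_cutoff_map(height, width, fixation, viewing_dist_px,
                         alpha=0.106, e2=2.3, ct0=1.0 / 64.0):
    """Return the spatial cutoff frequency (cycles/degree) at every pixel."""
    y, x = np.mgrid[0:height, 0:width]
    dist_px = np.hypot(x - fixation[0], y - fixation[1])
    ecc_deg = np.degrees(np.arctan(dist_px / viewing_dist_px))   # eccentricity in degrees
    return e2 * np.log(1.0 / ct0) / (alpha * (ecc_deg + e2))     # cutoff frequency

def foveation_weights(height, width, fixation, viewing_dist_px, f_display_max=30.0):
    """Normalized importance weights: highest at fixation, falling off with eccentricity."""
    fc = np.minimum(foveation_cutoff_map(height, width, fixation, viewing_dist_px),
                    f_display_max)        # display resolution caps the usable cutoff
    return fc / fc.max()

if __name__ == "__main__":
    w = foveation_weights(512, 512, fixation=(256, 256), viewing_dist_px=1024.0)
    print(w[256, 256], w[0, 0])           # weight at fixation vs. at a corner
```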

  15. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on ANSI Z39.23-1983, Standard Technical Report Number (STRN): Format and Creation, of the American National Standards Institute. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: the report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report-issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.
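
    For illustration only, a trivial parser that separates an STRN-style identifier into the two parts described above (report code and sequential number). The separator convention (a double hyphen, else the last single hyphen) and the sample identifiers are assumptions made for this sketch, not taken from ANSI Z39.23 or from this compilation.

```python
# Minimal illustrative sketch: split an STRN-style identifier into a report code
# and a sequential number. Separator convention and examples are assumptions.
def split_strn(strn):
    if "--" in strn:
        code, seq = strn.split("--", 1)
    else:
        code, seq = strn.rsplit("-", 1)
    return code.strip(), seq.strip()

if __name__ == "__main__":
    for example in ["DOE/ER--0313/5", "NASA-TM-104606"]:   # hypothetical sample identifiers
        print(example, "->", split_strn(example))
```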

  16. Superluminal Labview Code

    SciTech Connect

    Wheat, Robert; Marksteiner, Quinn; Quenzer, Jonathan; Higginson, Ian

    2012-03-26

    This LabVIEW code is used to set the phases and amplitudes of the 72 antennas of the superluminal machine, and to map out the radiation pattern from the superluminal antenna. Each antenna radiates a modulated signal consisting of two separate frequencies, in the range of 2 GHz to 2.8 GHz. The phases and amplitudes from each antenna are controlled by a pair of AD8349 vector modulators (VMs). These VMs set the phase and amplitude of a high frequency signal using a set of four DC inputs, which are controlled by Linear Technologies LTC1990 digital to analog converters (DACs). The LabVIEW code controls these DACs through an 8051 microcontroller. This code also monitors the phases and amplitudes of the 72 channels. Near each antenna, there is a coupler that channels a portion of the power into a binary network. Through a LabVIEW-controlled switching array, any of the 72 coupled signals can be channeled into the Tektronix TDS 7404 digital oscilloscope. The LabVIEW code then takes an FFT of the signal, and compares it to the FFT of a reference signal in the oscilloscope to determine the magnitude and phase of each sideband of the signal. The code compensates for phase and amplitude errors introduced by differences in cable lengths. The LabVIEW code sets each of the 72 elements to a user-determined phase and amplitude. For each element, the code runs an iterative procedure, adjusting the DACs until the correct phases and amplitudes have been reached.
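
    The iterative set-measure-correct loop described in the record can be sketched as follows (in Python rather than LabVIEW). `write_dacs` and `measure_phase_amplitude` are hypothetical stubs standing in for the vector-modulator DAC interface and the oscilloscope FFT measurement; the control law is a simple proportional correction, not the actual algorithm.

```python
# Illustrative sketch of an iterative per-channel phase/amplitude calibration loop.
# The hardware-facing functions are hypothetical stubs.
import cmath

def _wrap_deg(d):
    """Wrap an angle difference into (-180, 180] degrees."""
    return (d + 180.0) % 360.0 - 180.0

def calibrate_channel(target_amp, target_phase_deg, measure_phase_amplitude, write_dacs,
                      gain=0.5, tol_amp=0.01, tol_phase_deg=1.0, max_iter=50):
    """Iteratively adjust one channel's complex drive until the measured signal matches."""
    target = cmath.rect(target_amp, cmath.pi * target_phase_deg / 180.0)
    drive = target                                   # initial guess: command the target directly
    for _ in range(max_iter):
        write_dacs(drive)                            # set the I/Q DAC levels for this channel
        amp, phase_deg = measure_phase_amplitude()   # from the scope FFT, cable-corrected
        if (abs(amp - target_amp) <= tol_amp
                and abs(_wrap_deg(phase_deg - target_phase_deg)) <= tol_phase_deg):
            return drive                             # converged
        measured = cmath.rect(amp, cmath.pi * phase_deg / 180.0)
        drive += gain * (target - measured)          # proportional correction toward the target
    raise RuntimeError("channel did not converge within max_iter iterations")
```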

  17. Thermal stress depletes energy reserves in Drosophila

    PubMed Central

    Klepsatel, Peter; Gáliková, Martina; Xu, Yanjun; Kühnlein, Ronald P.

    2016-01-01

    Understanding how environmental temperature affects metabolic and physiological functions is of crucial importance to assess the impacts of climate change on organisms. Here, we used different laboratory strains and a wild-caught population of the fruit fly Drosophila melanogaster to examine the effect of temperature on the body energy reserves of an ectothermic organism. We found that permanent ambient temperature elevation or transient thermal stress causes significant depletion of body fat stores. Surprisingly, transient thermal stress induces a lasting “memory effect” on body fat storage, which also reduces survivorship of the flies upon food deprivation later after stress exposure. Functional analyses revealed that an intact heat-shock response is essential to protect flies from temperature-dependent body fat decline. Moreover, we found that the temperature-dependent body fat reduction is caused at least in part by apoptosis of fat body cells, which might irreversibly compromise the fat storage capacity of the flies. Altogether, our results provide evidence that thermal stress has a significant negative impact on organismal energy reserves, which in turn might affect individual fitness. PMID:27641694

  18. Ozone depletion: 20 Years after the alarm

    SciTech Connect

    Not Available

    1994-08-15

    Scientific curiosity in 1973 led to the challenge of determining the ultimate atmospheric fate of the chlorofluoromethanes, CFC-11 (CCl3F) and CFC-12 (CCl2F2), whose presence at measurable levels in surface air had been detected only two years earlier. In retrospect, the decision to pursue the chemistry of CFC molecules to their final destruction and beyond foreordained an unusual outcome because CFCs are chemically inert and easily survive under almost all natural conditions. By midsummer 1994, the world is well on its way in transition to a CFC-free economy, although not yet to a CFC-free atmosphere. The rates of increase in atmospheric concentration for the three major CFCs (CFC-11, -12, and -113) have all slowed markedly in response to the restrictions of the revised Montreal protocol. Because of their long lifetimes, however, significant but gradually diminishing quantities of CFCs will remain in the atmosphere throughout the 21st century. Atomic chlorine will continue to be released into the stratosphere as long as CFCs persist, and ozone depletion will follow. The existence of the Montreal protocol and the agreement among industrial, governmental, and university scientists on its wisdom offers considerable promise for the handling of future global environmental problems.

  19. Recovery of Depleted Uranium Fragments from Soil

    SciTech Connect

    Farr, C.P.; Alecksen, T.J.; Heronimus, R.S.; Simonds, M.H.; Farrar, D.R.; Baker, K.R.; Miller, M.L.

    2008-07-01

    A cost-effective method was demonstrated for recovering depleted uranium (DU) fragments from soil. A compacted clean soil pad was prepared adjacent to a pile of soil containing DU fragments. Soil from the contaminated pile was placed on the pad in three-inch lifts using conventional construction equipment. Each lift was scanned with an automatic scanning system consisting of an array of radiation detectors coupled to a detector positioning system. The data were downloaded into ArcGIS for data presentation. Areas of the pad exhibiting scaler counts above the decision level were identified as likely locations of DU fragments. The coordinates of these locations were downloaded into a PDA that was wirelessly connected to the positioning system. The PDA guided technicians to the locations where hand-held trowels and shovels were used to remove the fragments. After DU removal, the affected areas were re-scanned and the new data patched into the data base to replace the original data. This new data set along with soil sample results served as final status survey data. (authors)
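
    A schematic sketch of the screening step described above, flagging scan positions whose counts exceed a decision level so their coordinates can be handed to the retrieval crew. The decision-level formula and the data layout are generic assumptions for illustration, not the survey's actual procedure.

```python
# Illustrative sketch: flag scan positions above a decision level and report their
# coordinates. The decision-level formula and the synthetic data are assumptions.
import numpy as np

def decision_level(background_counts, k=1.645):
    """Simple counting-statistics decision level above the mean background."""
    bkg = np.asarray(background_counts, dtype=float)
    return bkg.mean() + k * np.sqrt(bkg.mean())

def flag_locations(xy, counts, level):
    """Return coordinates of scan points whose gross counts exceed `level`."""
    counts = np.asarray(counts, dtype=float)
    return [tuple(p) for p, c in zip(xy, counts) if c > level]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    xy = [(x, y) for x in range(10) for y in range(10)]
    counts = rng.poisson(100, size=len(xy))      # hypothetical background-level scan
    counts[37] += 400                            # a buried-fragment signature
    level = decision_level(counts[:20])          # background estimate from a clean area
    print(flag_locations(xy, counts, level))
```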

  20. Levels of depleted uranium in Kosovo soils.

    PubMed

    Sansone, U; Stellato, L; Jia, G; Rosamilia, S; Gaudino, S; Barbizzi, S; Belli, M

    2001-01-01

    The United Nations Environment Programme (UNEP) has performed a field survey at 11 sites located in Kosovo, where depleted uranium (DU) ammunition was used by the North Atlantic Treaty Organization (NATO) during the last Balkans conflict (1999). Soil sampling was performed to assess the spread of DU ground contamination around and within the NATO target sites and the migration of DU along the soil profile. The 234U/238U and 235U/238U activity concentration ratios have been used as indicators of natural versus anthropogenic sources of uranium. The results show that levels of 238U activity concentrations in soils above 100 Bq kg(-1) can be considered a 'tracer' of the presence of DU in soils. The results also indicate that detectable ground surface contamination by DU is limited to areas within a few metres from localised points of concentrated contamination caused by penetrator impacts. Vertical distribution of DU along the soil profile is measurable up to a depth of 10-20 cm. This latter aspect is of particular relevance for the potential risk of future contamination of groundwater.
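
    A back-of-the-envelope calculation shows why the 235U/238U activity ratio separates natural uranium from DU. The half-lives and natural abundances are standard reference values; the 0.2% 235U assay assumed for DU is a typical value, not one reported in this study.

```python
# Rough check of the 235U/238U activity ratio for natural versus depleted uranium.
LN2 = 0.6931471805599453
HALF_LIFE_YR = {"U235": 7.04e8, "U238": 4.468e9}   # standard reference half-lives

def activity_ratio_235_238(atom_fraction_235, atom_fraction_238):
    """A235/A238 = (N235/N238) * (lambda235/lambda238), with lambda = ln2 / T_half."""
    lam235 = LN2 / HALF_LIFE_YR["U235"]
    lam238 = LN2 / HALF_LIFE_YR["U238"]
    return (atom_fraction_235 / atom_fraction_238) * (lam235 / lam238)

print("natural U:", round(activity_ratio_235_238(0.0072, 0.9927), 4))   # ~0.046
print("depleted U:", round(activity_ratio_235_238(0.0020, 0.9980), 4))  # ~0.013
```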

  1. Preventing NAD+ Depletion Protects Neurons against Excitotoxicity

    PubMed Central

    Liu, Dong; Pitta, Michael; Mattson, Mark P.

    2008-01-01

    Neurons are excitable cells that require large amounts of energy to support their survival and functions and are therefore prone to excitotoxicity, which involves energy depletion. By examining bioenergetic changes induced by glutamate, we found that the cellular nicotinamide adenine dinucleotide (NAD+) level is a critical determinant of neuronal survival. The bioenergetic effects of mitochondrial uncoupling and caloric restriction were also examined in cultured neurons and rodent brain. 2,4-dinitrophenol (DNP) is a chemical mitochondrial uncoupler that stimulates glucose uptake and oxygen consumption in cultured neurons, which accelerates oxidation of NAD(P)H to NAD+ in mitochondria. The NAD+-dependent histone deacetylase sirtuin 1 (SIRT1) and glucose transporter 1 (GLUT1) mRNA are upregulated in mouse brain under caloric restriction. To examine whether NAD+ mediates neuroprotective effects, nicotinamide, a precursor of NAD+ and inhibitor of SIRT1 and poly(ADP-ribose) polymerase 1 (PARP1) (two NAD+-dependent enzymes), was employed. Nicotinamide attenuated excitotoxic death and preserved cellular NAD+ levels to support SIRT1 and PARP1 activities. Our findings suggest that mild mitochondrial uncoupling and caloric restriction exert hormetic effects by stimulating bioenergetics in neurons, thereby increasing the tolerance of neurons to metabolic stress. PMID:19076449

  2. Sulfachlorpyrazine residues depletion in turkey edible tissues.

    PubMed

    Łebkowska-Wieruszewska, B I; Kowalski, C J

    2010-08-01

    Sulfachlorpyrazine (SCP) is currently used to treat coccidian infections in turkeys; however, there is no information available about the withdrawal period necessary for the turkey to be safe for human consumption. A high performance liquid chromatography method with ultraviolet-visible light detection was adapted and validated for the determination of SCP in turkey tissues. The procedure is based on isolation of the compound (SCP sodium) from edible turkey tissues (muscles, liver, kidneys, and fat with skin) with satisfactory recovery (72.80 +/- 1.40%) and specificity. A residue depletion study of SCP in turkeys was conducted after a dose of 50 mg/kg body weight/day had been administered orally for 3 days. After treatment had been discontinued, residue concentrations were detected in tissues on the 7th day. The highest SCP concentrations were measured in muscles. Based on the results presented in this study, it could be assumed that a withdrawal period of 21 days before medicated turkeys could be slaughtered would be sufficient to ensure consumer safety.

  3. Diallyl disulphide depletes glutathione in Candida albicans

    PubMed Central

    Lemar, Katey M.; Aon, Miguel A.; Cortassa, Sonia; O’Rourke, Brian; T. Müller, Carsten; Lloyd, David

    2008-01-01

    Using two-photon scanning laser microscopy, we investigated the effect of an Allium sativum (garlic) constituent, diallyl disulphide (DADS), on key physiological functions of the opportunistic pathogen Candida albicans. A short 30 min exposure to 0.5 mM DADS followed by removal induced 70% cell death (50% necrotic, 20% apoptotic) within 2 h, increasing to 75% after 4 h. The early intracellular events associated with DADS-induced cell death were monitored with two-photon fluorescence microscopy to track mitochondrial membrane potential (ΔΨm), reactive oxygen species (ROS) and NADH or reduced glutathione (GSH) under aerobic conditions. DADS treatment decreased intracellular GSH and elevated intracellular ROS levels. Additionally, DADS induced a marked decrease of ΔΨm and lowered respiration in cell suspensions and isolated mitochondria. In vitro kinetic experiments in cell-free extracts suggest that glutathione-S-transferase (GST) is one of the intracellular targets of DADS. Additional targets were also identified, including inhibition of a site or sites between complexes II-IV in the electron transport chain, as well as the mitochondrial ATP-synthase. The results indicate that DADS is an effective antifungal agent able to trigger cell death in Candida, most probably by eliciting oxidative stress as a consequence of thiol depletion and impaired mitochondrial function. PMID:17534841

  4. Ichnologic signature of oxygen-depleted deposits

    SciTech Connect

    Ekdale, A.A.; Mason, T.R.

    1987-05-01

    The sedimentologic record of oxygen-poor depositional environments commonly includes trace fossils, especially those produced by deposit-feeding organisms that must have had broad oxygen tolerances. Endostratal fodinichnial and pascichnial traces indicate lack of oxygen within the substrate. Complex fodinichnia, such as Chondrites and Zoophycos, may form in anoxic sediment some distance below the water-sediment interface. The deposit-feeding animals can circulate oxygenated bottom water from the sea floor down through semipermanent shafts to permit respiration while they feed on unoxidized organic matter in the subsurface. Endostratal pascichnia, such as Helminthoida and Spirophycus, typically lack a continuous connection with the water-sediment interface, so interstitial water cannot be totally devoid of oxygen or else the animals cannot respire. However, endostratal pascichnia normally do not occur in oxidized sediment where digestible organic detritus has decomposed completely. In totally oxidized substrates, which typify higher energy depositional environments, permanent dwellings (domichnia) of filter-feeding organisms predominate. The ichnologic signature of oxygen-depleted deposits is a very high-density, very low-diversity association of deposit-feeding trace fossils. They suggest an oxygen-controlled trace fossil model in which increasing oxygen concentration of the interstitial water parallels a transition from fodinichnia-dominated through pascichnia-dominated to domichnia-dominated trace fossil associations. This model provides an alternative to the more traditional depth-controlled trace fossil distribution model in certain situations.

  5. Supercontinuum Stimulated Emission Depletion Fluorescence Lifetime Imaging

    SciTech Connect

    Lesoine, Michael; Bose, Sayantan; Petrich, Jacob; Smith, Emily

    2012-06-13

    Supercontinuum (SC) stimulated emission depletion (STED) fluorescence lifetime imaging is demonstrated by using time-correlated single-photon counting (TCSPC) detection. The spatial resolution of the developed STED instrument was measured by imaging monodispersed 40-nm fluorescent beads and then determining their fwhm, and was 36 ± 9 and 40 ± 10 nm in the X and Y coordinates, respectively. The same beads measured by confocal microscopy were 450 ± 50 and 430 ± 30 nm, which is larger than the diffraction limit of light due to underfilling the microscope objective. Underfilling the objective and time gating the signal were necessary to achieve the stated STED spatial resolution. The same fluorescence lifetime (2.0 ± 0.1 ns) was measured for the fluorescent beads by using confocal or STED lifetime imaging. The instrument has been applied to study Alexa Fluor 594-phalloidin labeled F-actin-rich projections with dimensions smaller than the diffraction limit of light in cultured cells. Fluorescence lifetimes of the actin-rich projections range from 2.2 to 2.9 ns as measured by STED lifetime imaging.

  6. Thermal stress depletes energy reserves in Drosophila.

    PubMed

    Klepsatel, Peter; Gáliková, Martina; Xu, Yanjun; Kühnlein, Ronald P

    2016-09-19

    Understanding how environmental temperature affects metabolic and physiological functions is of crucial importance to assess the impacts of climate change on organisms. Here, we used different laboratory strains and a wild-caught population of the fruit fly Drosophila melanogaster to examine the effect of temperature on the body energy reserves of an ectothermic organism. We found that permanent ambient temperature elevation or transient thermal stress causes significant depletion of body fat stores. Surprisingly, transient thermal stress induces a lasting "memory effect" on body fat storage, which also reduces survivorship of the flies upon food deprivation later after stress exposure. Functional analyses revealed that an intact heat-shock response is essential to protect flies from temperature-dependent body fat decline. Moreover, we found that the temperature-dependent body fat reduction is caused at least in part by apoptosis of fat body cells, which might irreversibly compromise the fat storage capacity of the flies. Altogether, our results provide evidence that thermal stress has a significant negative impact on organismal energy reserves, which in turn might affect individual fitness.

  7. Too Depleted to Try? Testing the Process Model of Ego Depletion in the Context of Unhealthy Snack Consumption.

    PubMed

    Haynes, Ashleigh; Kemps, Eva; Moffitt, Robyn

    2016-11-01

    The process model proposes that the ego depletion effect is due to (a) an increase in motivation toward indulgence, and (b) a decrease in motivation to control behaviour following an initial act of self-control. In contrast, the reflective-impulsive model predicts that ego depletion results in behaviour that is more consistent with desires, and less consistent with motivations, rather than influencing the strength of desires and motivations. The current study sought to test these alternative accounts of the relationships between ego depletion, motivation, desire, and self-control. One hundred and fifty-six undergraduate women were randomised to complete a depleting e-crossing task or a non-depleting task, followed by a lab-based measure of snack intake, and self-report measures of motivation and desire strength. In partial support of the process model, ego depletion was related to higher intake, but only indirectly via the influence of lowered motivation. Motivation was more strongly predictive of intake for those in the non-depletion condition, providing partial support for the reflective-impulsive model. Ego depletion did not affect desire, nor did depletion moderate the effect of desire on intake, indicating that desire may be an appropriate target for reducing unhealthy behaviour across situations where self-control resources vary. © 2016 The International Association of Applied Psychology.

  8. Transient Treg depletion enhances therapeutic anti‐cancer vaccination

    PubMed Central

    Aston, Wayne J.; Chee, Jonathan; Khong, Andrea; Cleaver, Amanda L.; Solin, Jessica N.; Ma, Shaokang; Lesterhuis, W. Joost; Dick, Ian; Holt, Robert A.; Creaney, Jenette; Boon, Louis; Robinson, Bruce; Lake, Richard A.

    2016-01-01

    Abstract Introduction Regulatory T cells (Treg) play an important role in suppressing anti-tumor immunity, and their depletion has been linked to improved outcomes. To better understand the role of Treg in limiting the efficacy of anti-cancer immunity, we used a Diphtheria toxin (DTX) transgenic mouse model to specifically target and deplete Treg. Methods Tumor-bearing BALB/c FoxP3.dtr transgenic mice were subjected to different treatment protocols, with or without Treg depletion, and tumor growth and survival were monitored. Results DTX specifically depleted Treg in a transient, dose-dependent manner. Treg depletion correlated with delayed tumor growth, increased effector T cell (Teff) activation, and enhanced survival in a range of solid tumors. Tumor regression was dependent on Teffs as depletion of both CD4 and CD8 T cells completely abrogated any survival benefit. Severe morbidity following Treg depletion was only observed when consecutive doses of DTX were given during peak CD8 T cell activation, demonstrating that Treg can be depleted on multiple occasions, but only when CD8 T cell activation has returned to baseline levels. Finally, we show that even minimal Treg depletion is sufficient to significantly improve the efficacy of tumor-peptide vaccination. Conclusions BALB/c.FoxP3.dtr mice are an ideal model to investigate the full therapeutic potential of Treg depletion to boost anti-tumor immunity. DTX-mediated Treg depletion is transient, dose-dependent, and leads to strong anti-tumor immunity and complete tumor regression at high doses, while enhancing the efficacy of tumor-specific vaccination at low doses. Together, these data highlight the importance of Treg manipulation as a useful strategy for enhancing current and future cancer immunotherapies. PMID:28250921

  9. Podocyte Depletion in Thin GBM and Alport Syndrome

    PubMed Central

    Wang, Su Q.; Afshinnia, Farsad; Kershaw, David; Wiggins, Roger C.

    2016-01-01

    The proximate genetic cause of both Thin GBM and Alport Syndrome (AS) is abnormal α3, 4 and 5 collagen IV chains resulting in abnormal glomerular basement membrane (GBM) structure/function. We previously reported that podocyte detachment rate measured in urine is increased in AS, suggesting that podocyte depletion could play a role in causing progressive loss of kidney function. To test this hypothesis podometric parameters were measured in 26 kidney biopsies from 21 patients aged 2–17 years with a clinic-pathologic diagnosis including both classic Alport Syndrome with thin and thick GBM segments and lamellated lamina densa [n = 15] and Thin GBM cases [n = 6]. Protocol biopsies from deceased donor kidneys were used as age-matched controls. Podocyte depletion was present in AS biopsies prior to detectable histologic abnormalities. No abnormality was detected by light microscopy at <30% podocyte depletion, minor pathologic changes (mesangial expansion and adhesions to Bowman’s capsule) were present at 30–50% podocyte depletion, and FSGS was progressively present above 50% podocyte depletion. eGFR did not change measurably until >70% podocyte depletion. Low level proteinuria was an early event at about 25% podocyte depletion and increased in proportion to podocyte depletion. These quantitative data parallel those from model systems where podocyte depletion is the causative event. This result supports a hypothesis that in AS podocyte adherence to the GBM is defective resulting in accelerated podocyte detachment causing progressive podocyte depletion leading to FSGS-like pathologic changes and eventual End Stage Kidney Disease. Early intervention to reduce podocyte depletion is projected to prolong kidney survival in AS. PMID:27192434

  10. Podocyte Depletion in Thin GBM and Alport Syndrome.

    PubMed

    Wickman, Larysa; Hodgin, Jeffrey B; Wang, Su Q; Afshinnia, Farsad; Kershaw, David; Wiggins, Roger C

    2016-01-01

    The proximate genetic cause of both Thin GBM and Alport Syndrome (AS) is abnormal α3, 4 and 5 collagen IV chains resulting in abnormal glomerular basement membrane (GBM) structure/function. We previously reported that podocyte detachment rate measured in urine is increased in AS, suggesting that podocyte depletion could play a role in causing progressive loss of kidney function. To test this hypothesis podometric parameters were measured in 26 kidney biopsies from 21 patients aged 2-17 years with a clinic-pathologic diagnosis including both classic Alport Syndrome with thin and thick GBM segments and lamellated lamina densa [n = 15] and Thin GBM cases [n = 6]. Protocol biopsies from deceased donor kidneys were used as age-matched controls. Podocyte depletion was present in AS biopsies prior to detectable histologic abnormalities. No abnormality was detected by light microscopy at <30% podocyte depletion, minor pathologic changes (mesangial expansion and adhesions to Bowman's capsule) were present at 30-50% podocyte depletion, and FSGS was progressively present above 50% podocyte depletion. eGFR did not change measurably until >70% podocyte depletion. Low level proteinuria was an early event at about 25% podocyte depletion and increased in proportion to podocyte depletion. These quantitative data parallel those from model systems where podocyte depletion is the causative event. This result supports a hypothesis that in AS podocyte adherence to the GBM is defective resulting in accelerated podocyte detachment causing progressive podocyte depletion leading to FSGS-like pathologic changes and eventual End Stage Kidney Disease. Early intervention to reduce podocyte depletion is projected to prolong kidney survival in AS.

  11. Three-dimensional modeling of the neutral gas depletion effect in a helicon discharge plasma

    NASA Astrophysics Data System (ADS)

    Kollasch, Jeffrey; Schmitz, Oliver; Norval, Ryan; Reiter, Detlev; Sovinec, Carl

    2016-10-01

    Helicon discharges provide an attractive radio-frequency driven regime for plasma, but neutral-particle dynamics present a challenge to extending performance. A neutral gas depletion effect occurs when neutrals in the plasma core are not replenished at a sufficient rate to sustain a higher plasma density. The Monte Carlo neutral particle tracking code EIRENE was set up for the MARIA helicon experiment at UW Madison to study its neutral particle dynamics. Prescribed plasma temperature and density profiles similar to those in the MARIA device are used in EIRENE to investigate the main causes of the neutral gas depletion effect. The most dominant plasma-neutral interactions are included so far, namely electron impact ionization of neutrals, charge exchange interactions of neutrals with plasma ions, and recycling at the wall. Parameter scans show how the neutral depletion effect depends on parameters such as Knudsen number, plasma density and temperature, and gas-surface interaction accommodation coefficients. Results are compared to similar analytic studies in the low Knudsen number limit. Plans to incorporate a similar Monte Carlo neutral model into a larger helicon modeling framework are discussed. This work is funded by the NSF CAREER Award PHY-1455210.
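
    As a rough illustration of the kind of parameters scanned above, the sketch below estimates a neutral mean free path against electron-impact ionization and the corresponding Knudsen number. The rate coefficient, neutral temperature, and scale length are assumed example values (not MARIA parameters), and the study's own Knudsen-number definition may differ.

```python
# Rough illustrative estimate of a neutral ionization mean free path and Knudsen
# number. All input values are assumed examples, not parameters of the experiment.
import math

def ionization_mfp(n_e_m3, rate_coeff_m3s, t_neutral_eV, mass_amu=39.95):
    """Neutral mean free path against electron-impact ionization: v_n / (n_e * <sigma v>)."""
    v_n = math.sqrt(2.0 * t_neutral_eV * 1.602e-19 / (mass_amu * 1.661e-27))  # thermal speed, m/s
    return v_n / (n_e_m3 * rate_coeff_m3s)

def knudsen_number(mfp_m, scale_length_m):
    """Kn = mean free path / characteristic device scale length."""
    return mfp_m / scale_length_m

if __name__ == "__main__":
    mfp = ionization_mfp(n_e_m3=1e19, rate_coeff_m3s=1e-14, t_neutral_eV=0.03)
    print(f"mfp = {mfp:.3f} m, Kn = {knudsen_number(mfp, 0.05):.2f}")
```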

  12. Depletion of Ribosomal RNA Sequences from Single-Cell RNA-Sequencing Library.

    PubMed

    Fang, Nan; Akinci-Tolun, Rumeysa

    2016-07-01

    Recent advances in single-cell RNA sequencing technologies have revealed high heterogeneity of gene expression profiles in individual cells. However, most current single-cell RNA-seq methods use oligo-dT priming in the reverse transcription step and therefore detect only polyA-positive transcripts, not other important RNA species such as polyA-negative noncoding RNAs. Reverse transcription using random oligos enables detection of not only the noncoding RNA species without polyA tails, but also ribosomal RNA (rRNA). rRNA comprises more than 90% of the total RNA and should be depleted from the RNA-seq library to ensure efficient usage of the sequencing capacity. Commonly used hybridization-based rRNA depletion methods can preserve noncoding RNA in the standard RNA-seq library. However, such rRNA depletion methods require high input amounts of total RNA and do not work at the single-cell level or with limited amounts of input RNA. This unit describes a novel procedure for RNA-seq library construction from single cells or a minimal amount of RNA. A thermostable duplex-specific nuclease is used in this method to effectively remove ribosomal RNA sequences following whole-transcriptome amplification and sequencing library construction. © 2016 by John Wiley & Sons, Inc.

  13. Coding for surgical audit.

    PubMed

    Pettigrew, R A; van Rij, A M

    1990-05-01

    A simple system of codes for operations, diagnoses and complications, developed specifically for computerized surgical audit, is described. This arose following a review of our established surgical audit in which problems in the retrieval of data from the database were identified. Evaluation of current methods of classification of surgical data highlighted the need for a dedicated coding system that was suitable for classifying surgical audit data, enabling rapid retrieval from large databases. After 2 years of use, the coding system has been found to fulfil the criteria of being sufficiently flexible and specific for computerized surgical audit, yet simple enough for medical staff to use.

  14. Mcnp-Based Methodology to Calculate Helium Production in Bwr Shrouds

    NASA Astrophysics Data System (ADS)

    Sitaraman, S.; Chiang, R.-T.; Oliver, B. M.

    2003-06-01

    A three-dimensional computational method based on Monte Carlo radiation transport techniques was developed to calculate thermal and fast neutron fields in the downcomer region of a Boiling Water Reactor (BWR). This methodology was validated using measured data obtained from an operating BWR. The helium production was measured in stainless steel at locations near the shroud and compared with values from the Monte Carlo calculations. The methodology produced results that were in agreement with measurements, thereby providing a useful tool for the determination of helium levels in shroud components.
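
    The kind of flux-folding this methodology implies can be sketched as follows: the calculated neutron flux is folded with (n,alpha) cross sections and nuclide number densities to obtain a helium production rate. The two-group fluxes, cross sections, and number densities below are placeholders, not values from the paper or from any evaluated nuclear data library.

```python
# Schematic post-processing sketch: fold a group-wise neutron flux with (n,alpha)
# cross sections to estimate a helium production rate. All numbers are placeholders.
import numpy as np

BARN_TO_CM2 = 1.0e-24

def helium_production_rate(flux_per_group, xs_alpha_barn_by_nuclide, number_density_by_nuclide):
    """Sum_i N_i * Sum_g sigma_(n,alpha),i,g * phi_g  ->  He atoms / cm^3 / s."""
    rate = 0.0
    for nuclide, xs_groups in xs_alpha_barn_by_nuclide.items():
        n_i = number_density_by_nuclide[nuclide]                         # atoms / cm^3
        rate += n_i * np.dot(np.asarray(xs_groups) * BARN_TO_CM2, flux_per_group)
    return rate

if __name__ == "__main__":
    flux = np.array([1.0e13, 5.0e12])                    # thermal / fast group flux, n/cm^2/s
    xs = {"Ni59": [10.0, 0.05], "B10": [3840.0, 0.3]}    # placeholder (n,alpha) cross sections, barns
    dens = {"Ni59": 1.0e18, "B10": 5.0e16}               # placeholder number densities
    print(f"{helium_production_rate(flux, xs, dens):.3e} He atoms/cm^3/s")
```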

  15. Optimal Allocation of Sampling Effort in Depletion Surveys

    EPA Science Inventory

    We consider the problem of designing a depletion or removal survey as part of estimating animal abundance for populations with imperfect capture or detection rates. In a depletion survey, animals are captured from a given area, counted, and withheld from the population. This proc...
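
    To make the design problem concrete, here is the standard two-pass removal (depletion) estimator in its textbook Seber-Le Cren form, N = C1^2/(C1 - C2); it is included for context and is not necessarily the estimator analyzed in this record.

```python
# Generic two-pass removal (depletion) abundance estimator, for illustration.
def two_pass_removal_estimate(c1, c2):
    """Estimate abundance N and per-pass capture probability p from two removal passes."""
    if c1 <= c2:
        raise ValueError("estimator requires the second catch to be smaller (c1 > c2)")
    p_hat = (c1 - c2) / c1            # per-pass capture probability
    n_hat = c1 * c1 / (c1 - c2)       # N = c1^2 / (c1 - c2)
    return n_hat, p_hat

if __name__ == "__main__":
    n_hat, p_hat = two_pass_removal_estimate(120, 48)
    print(f"N ~ {n_hat:.0f}, p ~ {p_hat:.2f}")   # -> N ~ 200, p ~ 0.60
```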

  16. Whistler waves guided by density depletion ducts in a magnetoplasma

    SciTech Connect

    Bakharev, P. V.; Zaboronkova, T. M.; Kudrin, A. V.; Krafft, C.

    2010-11-15

    The guided propagation of whistler waves along cylindrical density depletion ducts in a magneto-plasma is studied. It is shown that, under certain conditions, such ducts can support volume and surface eigenmodes. The dispersion properties and field structure of whistler modes guided by density depletion ducts are analyzed. The effect of collisional losses in the plasma on the properties of modes is discussed.

  18. Auto-aligning stimulated emission depletion microscope using adaptive optics.

    PubMed

    Gould, Travis J; Kromann, Emil B; Burke, Daniel; Booth, Martin J; Bewersdorf, Joerg

    2013-06-01

    Stimulated emission depletion (STED) microscopy provides diffraction-unlimited resolution in fluorescence microscopy. Imaging at the nanoscale, however, requires precise alignment of the depletion and excitation laser foci of the STED microscope. We demonstrate here that adaptive optics can be implemented to automatically align STED and confocal images with a precision of 4.3 ± 2.3 nm.

  19. [Internal contamination with depleted uranium and health disorders].

    PubMed

    Pranjić, Nurka; Karamehić, Jasenko; Ljuca, Farid; Zigić, Zlata; Ascerić, Mensura

    2002-01-01

    In this review we used published experimental and epidemiological data on depleted uranium from the current literature. Depleted uranium is a toxic heavy metal that in high doses may cause poisoning and health effects similar to those caused by lead, mercury, and chromium. It is slightly radioactive. The aim of this review was to select, arrange, and summarise references from the scientific literature in order to give a comprehensive picture of the results of toxicological studies of depleted uranium in animals (including carcinogenic activity). We also drew on published epidemiological study results related to occupational and environmental exposure to depleted uranium. The toxicity of uranium has been studied extensively. The results of these studies point primarily to its chemical toxicity, particularly renal effects; depleted uranium is not a radiological hazard. Uranium has not been classified as a carcinogenic metal by the International Agency for Research on Cancer. Study of the military use of depleted uranium will give additional insight into its toxicology. The present controversy over the radiological and chemical toxicity of depleted uranium used in the Gulf War calls for further experimental and clinical investigation of its effects on the biosphere and on human beings.

  20. 26 CFR 1.613-1 - Percentage depletion; general rule.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 7 2011-04-01 2009-04-01 true Percentage depletion; general rule. 1.613-1 Section 1.613-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Natural Resources § 1.613-1 Percentage depletion;...

  1. SASSYS LMFBR systems code

    SciTech Connect

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time.

  2. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  3. Code Disentanglement: Initial Plan

    SciTech Connect

    Wohlbier, John Greaton; Kelley, Timothy M.; Rockefeller, Gabriel M.; Calef, Matthew Thomas

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
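
    The levelization property described here can be checked mechanically: package dependencies must form a directed acyclic graph, and each package's level is one more than the highest level among the packages it uses. The sketch below uses hypothetical package names and is not part of the EAP plan itself.

```python
# Minimal levelization check: dependencies must form a DAG, and each package's
# level follows from the packages it uses. Package names are hypothetical. Python 3.9+.
from graphlib import TopologicalSorter, CycleError

def assign_levels(deps):
    """deps: {package: set of packages it uses}. Returns {package: level}; raises on cycles."""
    order = list(TopologicalSorter(deps).static_order())   # raises CycleError if cyclic
    levels = {}
    for pkg in order:                                       # dependencies are visited first
        levels[pkg] = 1 + max((levels[d] for d in deps.get(pkg, ())), default=0)
    return levels

if __name__ == "__main__":
    deps = {"application": {"physics", "io"},
            "physics": {"mesh", "utils"},
            "io": {"utils"},
            "mesh": {"utils"},
            "utils": set()}
    try:
        print(assign_levels(deps))
    except CycleError as err:
        print("not levelizable:", err)
```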

  4. Critical Care Coding for Neurologists.

    PubMed

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  5. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  6. Depletion of Appalachian coal reserves - how soon?

    USGS Publications Warehouse

    Milici, R.C.

    2000-01-01

    Much of the coal consumed in the US since the end of the last century has been produced from the Pennsylvanian strata of the Appalachian basin. Even though quantities mined in the past are less than they are today, this basin yielded from 70% to 80% of the nation's annual coal production from the end of the last century until the early 1970s. During the last 25 years, the proportion of the nation's coal that was produced annually from the Appalachian basin has declined markedly, and today it is only about 40% of the total. The amount of coal produced annually in the Appalachian basin, however, has been rising slowly over the last several decades, and has ranged generally from 400 to 500 million tons (Mt) per year. A large proportion of Appalachian historical production has come from relatively few counties in southwestern Pennsylvania, northern and southern West Virginia, eastern Kentucky, Virginia and Alabama. Many of these counties are decades past their years of peak production and several are almost depleted of economic deposits of coal. Because the current major consumer of Appalachian coal is the electric power industry, coal quality, especially sulfur content, has a great impact on its marketability. High-sulfur coal deposits in western Pennsylvania and Ohio are in low demand when compared with the lower sulfur coals of Virginia and southern West Virginia. Only five counties in the basin that have produced 500 Mt or more exhibit increasing rates of production at relatively high levels. Of these, six are in the central part of the basin and only one, Greene County, Pennsylvania, is in the northern part of the basin. Decline rate models, based on production decline rates and the decline rate of the estimated, 'potential' reserve, indicate that Appalachian basin annual coal production will be 200 Mt or less by the middle of the next century. Published by Elsevier Science B.V.
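
    A purely illustrative exponential decline projection of the general kind the "decline rate model" language above refers to; the starting production level and decline rate below are assumed round numbers, not the fitted parameters of the USGS models.

```python
# Illustrative exponential decline projection. Inputs are assumed round numbers.
import math

def projected_production(p0_mt, decline_rate_per_yr, years_ahead):
    """Exponential decline: P(t) = P0 * exp(-d * t), in Mt/yr."""
    return p0_mt * math.exp(-decline_rate_per_yr * years_ahead)

if __name__ == "__main__":
    # e.g. 450 Mt/yr today with a 1.5%/yr decline falls to ~212 Mt/yr after 50 years
    print(round(projected_production(450.0, 0.015, 50), 1))
```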

  7. Removal of depleted uranium from contaminated soils.

    PubMed

    Choy, Christine Chin; Korfiatis, George P; Meng, Xiaoguang

    2006-08-10

    Contamination of soil and water with depleted uranium (DU) has increased public health concerns due to the chemical toxicity of DU at elevated dosages. For this reason, there is great interest in developing methods for DU removal from contaminated sources. Two DU laden soils, taken from U.S. Army sites, were characterized for particle size distribution, total uranium concentration and removable uranium. Soil A was found to be a well graded sand containing a total of 3210 mg/kg DU (3.99 x 10(4) Bq/kg, where a Becquerel (Bq) is a unit of radiation). About 83% of the DU in the fines fraction (particle diameter <0.075 mm, total DU 7732 mg/kg (9.61 x 10(4) Bq/kg)) was associated with the carbonate, iron and manganese oxide and organic matter fractions of the material. Soil B was classified as a sandy silt with total DU of 1560 mg/kg (1.94 x 10(4) Bq/kg). The DU content in the fines fraction was 5171 mg/kg (6.43 x 10(4) Bq/kg). Sequential extraction of the Soil B fines fraction indicated that 64% of the DU was present either as soluble U(VI) minerals or as insoluble U(IV). Citric acid, sodium bicarbonate and hydrogen peroxide were used in batch experiments to extract DU from the fines fraction of both soils. Citric acid and sodium bicarbonate were relatively successful for Soil A (50-60% DU removal), but not for Soil B (20-35% DU removal). Hydrogen peroxide was found to significantly increase DU extraction from both soils, attaining removals up to 60-80%.

  8. Impact of ozone depletion on immune function

    SciTech Connect

    Jeevan, A.; Kripke, M.L. . Dept. of Immunology)

    1993-06-01

    Depletion of stratospheric ozone is expected to lead to an increase in the amount of UV-B radiation present in sunlight. In addition to its well known ability to cause skin cancer, UV-B radiation has been shown to alter the immune system. The immune system is the body's primary defense mechanism against infectious diseases and protects against the development of certain types of cancer. Any impairment of immune function may jeopardize health by increasing susceptibility to infectious diseases, increasing the severity of infections, or delaying recovery from infections. In addition, impaired immune function can increase the incidence of certain cancers, particularly cancers of the skin. Research carried out with laboratory animals over the past 15 years has demonstrated that exposure of the skin to UV-B radiation can suppress certain types of immune responses. These include rejection of UV-induced skin cancers and melanomas, contact allergy reactions to chemicals, delayed-type hypersensitivity responses to microbial and other antigens, and phagocytosis and elimination of certain bacteria from lymphoid tissues. Recent studies with mycobacterial infection of mice demonstrated that exposure to UV-B radiation decreased the delayed hypersensitivity response to mycobacterial antigens and increased the severity of infection. In humans, UV-B radiation has also been shown to impair the contact allergy response. These studies demonstrate that UV radiation can decrease immune responses in humans and laboratory animals and raise the possibility that increased exposure to UV-B radiation could adversely affect human health by increasing the incidence or severity of certain infectious diseases.

  9. Recovery of depleted uranium fragments from soil.

    PubMed

    Farr, C P; Alecksen, T J; Heronimus, R S; Simonds, M H; Farrar, D R; Miller, M L; Baker, K R

    2010-02-01

    A "proof of concept" was conducted to determine the effectiveness of a survey method for cost-effective recovery of depleted uranium (DU) fragments from contaminated soil piles at Sandia National Laboratories. First, DU fragments ranging from less than a gram up to 48 g were covered by various thicknesses of soil and used for detector efficiency measurements. The efficiencies were measured for three different sodium iodide detectors: a 5.1-cm by 5.1-cm (2-inch by 2-inch) detector, a 7.6-cm by 7.6-cm (3-inch by 3-inch) detector, and a Field Instrument for the Detection of Low Energy Radiation (FIDLER) detector. The FIDLER detector was found to be superior to the other detectors in each measurement. Next, multiple 7.6-cm (3-inch) layers of soil, taken from the contaminated piles, were applied to a clean pad of soil. Each layer was scanned by an array of eight FIDLER detectors pulled by a tractor. The array, moving 10.2 to 12.7 cm s(-1) (4 to 5 inches per second), automatically recorded radiation count data along with associated detector coordinates at 3-s intervals. The DU fragments were located and identified with a handheld system consisting of a FIDLER detector and a positioning system and then removed. After DU removal, the affected areas were re-scanned and a new lift of contaminated soil was applied. The detection capability of the system as a function of DU fragment mass and burial depth was modeled and determined to be sufficient to ensure that the dose-based site concentration goals would be met. Finally, confirmation soil samples were taken from random locations and from decontaminated soil areas. All samples had concentrations of U that met the goal of 400-500 pCi g(-1).

  10. Barium Depletion in Hollow Cathode Emitters

    NASA Technical Reports Server (NTRS)

    Polk, James E.; Capece, Angela M.; Mikellides, Ioannis G.; Katz, Ira

    2009-01-01

    The effect of tungsten erosion, transport and redeposition on the operation of dispenser hollow cathodes was investigated in detailed examinations of the discharge cathode inserts from an 8200 hour and a 30,352 hour ion engine wear test. Erosion and subsequent re-deposition of tungsten in the electron emission zone at the downstream end of the insert reduces the porosity of the tungsten matrix, preventing the flow of barium from the interior. This inhibits the interfacial reactions of the barium-calcium-aluminate impregnant with the tungsten in the pores. A numerical model of barium transport in the internal xenon discharge plasma shows that the barium required to reduce the work function in the emission zone can be supplied from upstream through the gas phase. Barium that flows out of the pores of the tungsten insert is rapidly ionized in the xenon discharge and pushed back to the emitter surface by the electric field and drag from the xenon ion flow. This barium ion flux is sufficient to maintain a barium surface coverage at the downstream end greater than 0.6, even if local barium production at that point is inhibited by tungsten deposits. The model also shows that the neutral barium pressure exceeds the equilibrium vapor pressure of the impregnant decomposition reaction over much of the insert length, so the reactions are suppressed. Only a small region upstream of the zone blocked by tungsten deposits is active and supplies the required barium. These results indicate that hollow cathode failure models based on barium depletion rates in vacuum dispenser cathodes are very conservative.

  11. Ozone Depletion Potential of CH3Br

    NASA Technical Reports Server (NTRS)

    Sander, Stanley P.; Ko, Malcolm K. W.; Sze, Nien Dak; Scott, Courtney; Rodriquez, Jose M.; Weisenstein, Debra K.

    1998-01-01

    The ozone depletion potential (ODP) of methyl bromide (CH3Br) can be determined by combining the model-calculated bromine efficiency factor (BEF) for CH3Br and its atmospheric lifetime. This paper examines how changes in several key kinetic data affect BEF. The key reactions highlighted in this study include the reaction of BrO + HO2, the absorption cross section of HOBr, the absorption cross section and the photolysis products of BrONO2, and the heterogeneous conversion of BrONO2 to HOBr and HNO3 on aerosol particles. By combining the calculated BEF with the latest estimate of 0.7 year for the atmospheric lifetime of CH3Br, the likely value of ODP for CH3Br is 0.39. The model-calculated concentration of HBr (approximately 0.3 pptv) in the lower stratosphere is substantially smaller than the reported measured value of about 1 pptv. Recent publications suggested models can reproduce the measured value if one assumes a yield for HBr from the reaction of BrO + OH or from the reaction of BrO + HO2. Although the DeMore et al. evaluation concluded any substantial yield of HBr from BrO + HO2 is unlikely, for completeness, we calculate the effects of these assumed yields on BEF for CH3Br. Our calculations show that the effects are minimal: practically no impact for an assumed 1.3% yield of HBr from BrO + OH and 10% smaller for an assumed 0.6% yield from BrO + HO2.
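
    For context, one commonly used semi-empirical form relating an ODP to a bromine efficiency factor and an atmospheric lifetime is reproduced below. The CFC-11 reference values (three chlorine atoms, molar mass near 137.4 g/mol, lifetime of roughly 45-50 yr) are standard assumptions rather than numbers taken from this paper, and depending on how BEF is defined, fractional-release factors may already be folded into it.

```latex
% Semi-empirical relation between ODP, BEF, and lifetime (reference values assumed,
% not quoted from the paper being summarized).
\[
\mathrm{ODP}_{\mathrm{CH_3Br}} \;\approx\;
\mathrm{BEF} \times \frac{n_{\mathrm{Br}}}{3} \times
\frac{M_{\mathrm{CFC11}}}{M_{\mathrm{CH_3Br}}} \times
\frac{\tau_{\mathrm{CH_3Br}}}{\tau_{\mathrm{CFC11}}},
\qquad n_{\mathrm{Br}} = 1,\;\; M_{\mathrm{CH_3Br}} \approx 94.9\ \mathrm{g\,mol^{-1}}.
\]
```

    With the paper's 0.7-year lifetime and these reference values, a BEF of a few tens reproduces the quoted ODP of about 0.39.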

  12. Residue depletion of ampicillin in eggs.

    PubMed

    Zhao, M; Xie, K-Z; Guo, H-S; Li, A-H; Xie, X; Zhang, G-X; Dai, G-J; Wang, J-Y

    2015-10-01

    A residue depletion study of ampicillin (AMP) was performed after oral dosing (60.0 mg/kg and 120.0 mg/kg body weight once a day for 5 days) to laying hens, through the use of reversed-phase high-performance liquid chromatography with fluorescence detection (RP-HPLC-FLD) to achieve detection of ampicillin residue in eggs. Limit of detection was 0.5 ng/g, and limit of quantitation was 1.2 ng/g for ampicillin. Extraction recoveries of ampicillin from samples fortified at 5.0-125.0 ng/g levels ranged from 77.5% to 84.6% in albumen, 77.9% to 87.5% in yolk, and 77.9% to 88.6% in whole egg, with coefficients of variation ≤ 9.3%. The maximum concentrations of ampicillin in albumen, yolk, and whole egg were detected at 1, 2, and 1 day after the last administration of ampicillin, respectively. Ampicillin was not detectable in albumen at day 9 of withdrawal time, at day 10 and 11 in yolk, and day 9 and 11 in whole egg at each of those 2 dose levels. The theoretical withdrawal time of AMP in whole egg was 6.730 and 7.296 days for 60 and 120 mg/kg oral dosage, respectively. This method also proved to be suitable as a rapid and reliable method for the determination of ampicillin in other poultry eggs. © 2015 John Wiley & Sons Ltd.

  14. The timing and mechanism of depletion in Lewisian granulites

    NASA Technical Reports Server (NTRS)

    Cohen, A. S.; Onions, R. K.; Ohara, M. J.

    1988-01-01

    Large Ion Lithophile (LIL) depletion in Lewisian granulites is discussed. Severe depletions in U, Th, and other LIL have been well documented in Lewisan mafic and felsic gneisses, but new Pb isotopic analyses show little or no depletion in lithologies with high solidus temperatures, such as peridotite. This suggests that LIL transport in this terrane took place by removal of partial melts rather than by pervasive flooding with externally derived CO2. The Pb and Nd isotopic data gathered on these rocks show that the depletion and granulite metamorphism are distinct events about 250 Ma apart. Both fluid inclusions and cation exchange geothermometers date from the later metamorphic event and therefore have little bearing on the depletion event, suggesting a note of caution for interpretations of other granulite terranes.

  15. Caffeine expectancies but not caffeine reduce depletion-induced aggression.

    PubMed

    Denson, Thomas F; Jacobson, Mandi; von Hippel, William; Kemp, Richard I; Mak, Tinnie

    2012-03-01

    Caffeine is the most widely consumed central nervous system stimulant in the world, yet little is known about its effects on aggressive behavior. Individuals often consume caffeine to increase energy and ward off mental depletion. Because mental depletion increases aggression when people are provoked, caffeine might reduce aggression by ameliorating the negative effects of depletion. In 2 experiments, participants consumed a 200-mg caffeine tablet or a placebo, were mentally depleted or not, and then provoked and given the opportunity to retaliate with a blast of white noise. Results showed that consuming a placebo reduced aggression relative to both caffeine (Experiments 1 and 2) and a no-pill control condition (Experiment 2). These data suggest that expectancies about the effects of caffeine in the absence of the pharmacological effects of the drug can reduce aggression when mentally depleted.

  16. Extreme Vulnerability of IDH1 Mutant Cancers to NAD+ Depletion.

    PubMed

    Tateishi, Kensuke; Wakimoto, Hiroaki; Iafrate, A John; Tanaka, Shota; Loebel, Franziska; Lelic, Nina; Wiederschain, Dmitri; Bedel, Olivier; Deng, Gejing; Zhang, Bailin; He, Timothy; Shi, Xu; Gerszten, Robert E; Zhang, Yiyun; Yeh, Jing-Ruey J; Curry, William T; Zhao, Dan; Sundaram, Sudhandra; Nigim, Fares; Koerner, Mara V A; Ho, Quan; Fisher, David E; Roider, Elisabeth M; Kemeny, Lajos V; Samuels, Yardena; Flaherty, Keith T; Batchelor, Tracy T; Chi, Andrew S; Cahill, Daniel P

    2015-12-14

    Heterozygous mutation of IDH1 in cancers modifies IDH1 enzymatic activity, reprogramming metabolite flux and markedly elevating 2-hydroxyglutarate (2-HG). Here, we found that 2-HG depletion did not inhibit growth of several IDH1 mutant solid cancer types. To identify other metabolic therapeutic targets, we systematically profiled metabolites in endogenous IDH1 mutant cancer cells after mutant IDH1 inhibition and discovered a profound vulnerability to depletion of the coenzyme NAD+. Mutant IDH1 lowered NAD+ levels by downregulating the NAD+ salvage pathway enzyme nicotinate phosphoribosyltransferase (Naprt1), sensitizing to NAD+ depletion via concomitant nicotinamide phosphoribosyltransferase (NAMPT) inhibition. NAD+ depletion activated the intracellular energy sensor AMPK, triggered autophagy, and resulted in cytotoxicity. Thus, we identify NAD+ depletion as a metabolic susceptibility of IDH1 mutant cancers.

  17. The effect of ego depletion on sprint start reaction time.

    PubMed

    Englert, Chris; Bertrams, Alex

    2014-10-01

    In the current study, we consider that optimal sprint start performance requires the self-control of responses. Therefore, start performance should depend on athletes' self-control strength. We assumed that momentary depletion of self-control strength (ego depletion) would either speed up or slow down the initiation of a sprint start, where an initiation that was sped up would carry the increased risk of a false start. Applying a mixed between- (depletion vs. nondepletion) and within- (before vs. after manipulation of depletion) subjects design, we tested the start reaction times of 37 sport students. We found that participants' start reaction times slowed after finishing a depleting task, whereas they remained constant in the nondepletion condition. These results indicate that sprint start performance can be impaired by unrelated preceding actions that lower momentary self-control strength. We discuss practical implications in terms of optimizing sprint starts and related overall sprint performance.

  18. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.
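
    The nonlinear generalization above is beyond a short sketch, but the linear-code case it builds on can be illustrated directly: with noise covariance Sigma = Sigma0 + eps * f'f'^T, the linear Fisher information saturates at 1/eps instead of growing with population size. A minimal numerical illustration (all tuning parameters are illustrative, not taken from the paper):

        import numpy as np

        rng = np.random.default_rng(0)
        epsilon = 0.01   # strength of the information-limiting correlations (illustrative)

        for n in (10, 100, 1000, 2000):
            fprime = rng.normal(1.0, 0.2, size=n)               # tuning-curve derivatives at the stimulus
            sigma0 = np.diag(rng.uniform(0.5, 1.5, size=n))     # private, independent noise
            cov = sigma0 + epsilon * np.outer(fprime, fprime)   # information-limiting component
            fisher = fprime @ np.linalg.solve(cov, fprime)      # linear Fisher information
            print(n, round(fisher, 1), "bound:", 1.0 / epsilon)

    The information approaches the bound 1/epsilon rather than growing with n, which is the saturation behaviour such correlations impose.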

  19. Scalable motion vector coding

    NASA Astrophysics Data System (ADS)

    Barbarien, Joeri; Munteanu, Adrian; Verdicchio, Fabio; Andreopoulos, Yiannis; Cornelis, Jan P.; Schelkens, Peter

    2004-11-01

    Modern video coding applications require transmission of video data over variable-bandwidth channels to a variety of terminals with different screen resolutions and available computational power. Scalable video coding is needed to optimally support these applications. Recently proposed wavelet-based video codecs employing spatial domain motion compensated temporal filtering (SDMCTF) provide quality, resolution and frame-rate scalability while delivering compression performance comparable to that of the state-of-the-art non-scalable H.264-codec. These codecs require scalable coding of the motion vectors in order to support a large range of bit-rates with optimal compression efficiency. Scalable motion vector coding algorithms based on the integer wavelet transform followed by embedded coding of the wavelet coefficients were recently proposed. In this paper, a new and fundamentally different scalable motion vector codec (MVC) using median-based motion vector prediction is proposed. Extensive experimental results demonstrate that the proposed MVC systematically outperforms the wavelet-based state-of-the-art solutions. To be able to take advantage of the proposed scalable MVC, a rate allocation mechanism capable of optimally dividing the available rate among texture and motion information is required. Two rate allocation strategies are proposed and compared. The proposed MVC and rate allocation schemes are incorporated into an SDMCTF-based video codec and the benefits of scalable motion vector coding are experimentally demonstrated.
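
    The median predictor referred to above is not spelled out in the abstract; a common form (used in H.263/H.264-style coders) takes the component-wise median of the left, top, and top-right neighbouring motion vectors and entropy-codes only the residual. A minimal sketch of that idea, with illustrative neighbour values:

        import numpy as np

        def median_predict(mv_left, mv_top, mv_topright):
            """Component-wise median of three causal neighbouring motion vectors."""
            return np.median(np.array([mv_left, mv_top, mv_topright]), axis=0).astype(int)

        mv_current = np.array([5, -2])                       # current block's motion vector
        prediction = median_predict(np.array([4, -1]),
                                    np.array([6, -2]),
                                    np.array([3, 0]))
        residual = mv_current - prediction                   # only this residual is coded
        print("prediction:", prediction, "residual:", residual)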

  20. Adjoint-based uncertainty quantification and sensitivity analysis for reactor depletion calculations

    NASA Astrophysics Data System (ADS)

    Stripling, Hayes Franklin

    Depletion calculations for nuclear reactors model the dynamic coupling between the material composition and neutron flux and help predict reactor performance and safety characteristics. In order to be trusted as reliable predictive tools and inputs to licensing and operational decisions, the simulations must include an accurate and holistic quantification of errors and uncertainties in their outputs. Uncertainty quantification is a formidable challenge in large, realistic reactor models because of the large number of unknowns and myriad sources of uncertainty and error. We present a framework for performing efficient uncertainty quantification in depletion problems using an adjoint approach, with emphasis on high-fidelity calculations using advanced massively parallel computing architectures. This approach calls for a solution to two systems of equations: (a) the forward, engineering system that models the reactor, and (b) the adjoint system, which is mathematically related to but different from the forward system. We use the solutions of these systems to produce sensitivity and error estimates at a cost that does not grow rapidly with the number of uncertain inputs. We present the framework in a general fashion and apply it to both the source-driven and k-eigenvalue forms of the depletion equations. We describe the implementation and verification of solvers for the forward and adjoint equations in the PDT code, and we test the algorithms on realistic reactor analysis problems. We demonstrate a new approach for reducing the memory and I/O demands on the host machine, which can be overwhelming for typical adjoint algorithms. Our conclusion is that adjoint depletion calculations using full transport solutions are not only computationally tractable but also the most attractive option for performing uncertainty quantification on high-fidelity reactor analysis problems.
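
    The adjoint idea can be seen on a toy two-nuclide depletion chain dN/dt = A N with a response R = c . N(T): integrating the adjoint system dL/dt = -A^T L backwards from L(T) = c yields in L(0) the sensitivities dR/dN(0) for every initial number density at the cost of a single extra solve. A minimal sketch (the transition matrix, response weights, and times are illustrative, not a PDT model):

        import numpy as np
        from scipy.integrate import solve_ivp

        # Toy chain: nuclide 0 transmutes into nuclide 1; both are removed (rates in 1/s, illustrative)
        A = np.array([[-2.0e-5,  0.0],
                      [ 1.5e-5, -1.0e-5]])
        c = np.array([0.0, 1.0])     # response: amount of nuclide 1 at the final time
        T = 1.0e5                    # irradiation time, s

        # Adjoint system, integrated backwards in time from lambda(T) = c
        adj = solve_ivp(lambda t, lam: -A.T @ lam, (T, 0.0), c, rtol=1e-9, atol=1e-12)
        sens = adj.y[:, -1]          # lambda(0) = dR/dN(0), one entry per initial nuclide

        # Brute-force check with one forward solve per input
        for i in range(2):
            fwd = solve_ivp(lambda t, n: A @ n, (0.0, T), np.eye(2)[i], rtol=1e-9, atol=1e-12)
            print(f"dR/dN{i}(0): adjoint = {sens[i]:.6f}, forward = {c @ fwd.y[:, -1]:.6f}")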

  1. Cost estimate report for the long-term management of depleted uranium hexafluoride : storage of depleted uranium metal.

    SciTech Connect

    Folga, S.M.; Kier, P.H.; Thimmapuram, P.R.

    2001-01-24

    This report contains a cost analysis of the long-term storage of depleted uranium in the form of uranium metal. Three options are considered for storage of the depleted uranium. These options are aboveground buildings, partly underground vaults, and mined cavities. Three cases are presented. In the first case, all the depleted uranium metal that would be produced from the conversion of depleted uranium hexafluoride (UF₆) generated by the US Department of Energy (DOE) prior to July 1993 would be stored at the storage facility (100% Case). In the second case, half the depleted uranium metal would be stored at this storage facility (50% Case). In the third case, one-quarter of the depleted uranium metal would be stored at the storage facility (25% Case). The technical basis for the cost analysis presented in this report is principally found in the companion report, ANL/EAD/TM-100, "Engineering Analysis Report for the Long-Term Management of Depleted Uranium Hexafluoride: Storage of Depleted Uranium Metal", prepared by Argonne National Laboratory.

  2. GPT-Free Sensitivity Analysis for Reactor Depletion and Analysis

    NASA Astrophysics Data System (ADS)

    Kennedy, Christopher Brandon

    Introduced in this dissertation is a novel approach that forms a reduced-order model (ROM), based on subspace methods, that allows for the generation of response sensitivity profiles without the need to set up or solve the generalized inhomogeneous perturbation theory (GPT) equations. The new approach, denoted hereinafter as the generalized perturbation theory free (GPT-Free) approach, computes response sensitivity profiles in a manner that is independent of the number or type of responses, allowing for an efficient computation of sensitivities when many responses are required. Moreover, the reduction error associated with the ROM is quantified by means of a Wilks' order statistics error metric denoted by the kappa-metric. Traditional GPT has been recognized as the most computationally efficient approach for performing sensitivity analyses of models with many input parameters, e.g. when forward sensitivity analyses are computationally overwhelming. However, most neutronics codes that can solve the fundamental (homogeneous) adjoint eigenvalue problem do not have GPT (inhomogeneous) capabilities unless envisioned during code development. Additionally, codes that use a stochastic algorithm, i.e. Monte Carlo methods, may have difficult or undefined GPT equations. When GPT calculations are available through software, the aforementioned efficiency gained from the GPT approach diminishes when the model has both many output responses and many input parameters. The GPT-Free approach addresses these limitations, first by only requiring the ability to compute the fundamental adjoint from perturbation theory, and second by constructing a ROM from fundamental adjoint calculations, constraining input parameters to a subspace. This approach bypasses the requirement to perform GPT calculations while simultaneously reducing the number of simulations required. In addition to the reduction of simulations, a major benefit of the GPT-Free approach is explicit control of the reduced order
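
    A greatly simplified sketch of the subspace idea: treat the sensitivity information as a stack of response-gradient vectors (here synthetic), identify its dominant directions with an SVD, and then check on random held-out perturbations how much response variation the reduced subspace misses, a crude stand-in for the kappa-metric mentioned above. Everything below is a synthetic placeholder, not reactor data:

        import numpy as np

        rng = np.random.default_rng(1)
        n_params, true_rank, n_samples = 500, 5, 40

        # Synthetic stand-in for gradient vectors gathered from repeated fundamental-adjoint runs
        G_true = rng.normal(size=(true_rank, n_params))
        samples = rng.normal(size=(n_samples, true_rank)) @ G_true

        _, s, vt = np.linalg.svd(samples, full_matrices=False)
        r = int(np.sum(s > 1e-8 * s[0]))      # effective rank of the sampled gradients
        U = vt[:r]                            # rows span the active input-parameter subspace

        # Reduction-error check on held-out random parameter perturbations
        for _ in range(3):
            dp = rng.normal(size=n_params)
            dp_reduced = U.T @ (U @ dp)       # perturbation restricted to the subspace
            full, reduced = G_true @ dp, G_true @ dp_reduced
            print("relative reduction error:", np.linalg.norm(full - reduced) / np.linalg.norm(full))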

  3. Serotonin and social norms: tryptophan depletion impairs social comparison and leads to resource depletion in a multiplayer harvesting game.

    PubMed

    Bilderbeck, Amy C; Brown, Gordon D A; Read, Judi; Woolrich, Mark; Cowen, Phillip J; Behrens, Tim E J; Rogers, Robert D

    2014-07-01

    How do people sustain resources for the benefit of individuals and communities and avoid the tragedy of the commons, in which shared resources become exhausted? In the present study, we examined the role of serotonin activity and social norms in the management of depletable resources. Healthy adults, alongside social partners, completed a multiplayer resource-dilemma game in which they repeatedly harvested from a partially replenishable monetary resource. Dietary tryptophan depletion, leading to reduced serotonin activity, was associated with aggressive harvesting strategies and disrupted use of the social norms given by distributions of other players' harvests. Tryptophan-depleted participants more frequently exhausted the resource completely and also accumulated fewer rewards than participants who were not tryptophan depleted. Our findings show that rank-based social comparisons are crucial to the management of depletable resources, and that serotonin mediates responses to social norms.

  4. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniform randomly from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusual high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. This hybrid approach decides not only "how to encode
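
    A minimal sketch of the encoder described above: sample a degree from the Robust Soliton distribution and, whenever the degree is small, draw the neighbours from the high-priority prefix of the message block before falling back to uniform selection over the whole block; the code symbol is the XOR of the chosen information symbols. The parameter values, the priority threshold, and the symbol sizes are illustrative, not those of the reference implementation:

        import math, random

        def robust_soliton(k, c=0.1, delta=0.5):
            """Robust Soliton degree distribution over degrees 1..k."""
            S = c * math.log(k / delta) * math.sqrt(k)
            rho = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
            tau = [0.0] * (k + 1)
            pivot = int(round(k / S))
            for d in range(1, min(pivot, k + 1)):
                tau[d] = S / (k * d)
            if 1 <= pivot <= k:
                tau[pivot] = S * math.log(S / delta) / k
            z = sum(rho) + sum(tau)
            return [(rho[d] + tau[d]) / z for d in range(k + 1)]

        def encode_symbol(message, dist, n_high, low_degree=3):
            """One prioritized LT code symbol: low-degree symbols favour high-priority data."""
            k = len(message)
            degree = random.choices(range(len(dist)), weights=dist)[0]
            pool = range(min(n_high, k)) if degree <= low_degree else range(k)
            neighbours = random.sample(pool, min(degree, len(pool)))
            symbol = 0
            for i in neighbours:
                symbol ^= message[i]          # XOR of the selected information symbols
            return neighbours, symbol

        message = [random.randrange(256) for _ in range(20)]   # 20 one-byte information symbols
        print(encode_symbol(message, robust_soliton(len(message)), n_high=5))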

  5. Associative Interactions in Crowded Solutions of Biopolymers Counteract Depletion Effects.

    PubMed

    Groen, Joost; Foschepoth, David; te Brinke, Esra; Boersma, Arnold J; Imamura, Hiromi; Rivas, Germán; Heus, Hans A; Huck, Wilhelm T S

    2015-10-14

    The cytosol of Escherichia coli is an extremely crowded environment, containing high concentrations of biopolymers which occupy 20-30% of the available volume. Such conditions are expected to yield depletion forces, which strongly promote macromolecular complexation. However, crowded macromolecule solutions, like the cytosol, are very prone to nonspecific associative interactions that can potentially counteract depletion. It remains unclear how the cytosol balances these opposing interactions. We used a FRET-based probe to systematically study depletion in vitro in different crowded environments, including a cytosolic mimic, E. coli lysate. We also studied bundle formation of FtsZ protofilaments under identical crowded conditions as a probe for depletion interactions at much larger overlap volumes of the probe molecule. The FRET probe showed a more compact conformation in synthetic crowding agents, suggesting strong depletion interactions. However, depletion was completely negated in cell lysate and other protein crowding agents, where the FRET probe even occupied slightly more volume. In contrast, bundle formation of FtsZ protofilaments proceeded as readily in E. coli lysate and other protein solutions as in synthetic crowding agents. Our experimental results and model suggest that, in crowded biopolymer solutions, associative interactions counterbalance depletion forces for small macromolecules. Furthermore, the net effects of macromolecular crowding will be dependent on both the size of the macromolecule and its associative interactions with the crowded background.

  6. Adjoint simulation of stream depletion due to aquifer pumping.

    PubMed

    Neupauer, Roseanna M; Griebling, Scott A

    2012-01-01

    If an aquifer is hydraulically connected to an adjacent stream, a pumping well operating in the aquifer will draw some water from aquifer storage and some water from the stream, causing stream depletion. Several analytical, semi-analytical, and numerical approaches have been developed to estimate stream depletion due to pumping. These approaches are effective if the well location is known. If a new well is to be installed, it may be desirable to install the well at a location where stream depletion is minimal. If several possible locations are considered for the location of a new well, stream depletion would have to be estimated for all possible well locations, which can be computationally inefficient. The adjoint approach for estimating stream depletion is a more efficient alternative because with one simulation of the adjoint model, stream depletion can be estimated for pumping at a well at any location. We derive the adjoint equations for a coupled system with a confined aquifer, an overlying unconfined aquifer, and a river that is hydraulically connected to the unconfined aquifer. We assume that the stage in the river is known, and is independent of the stream depletion, consistent with the assumptions of the MODFLOW river package. We describe how the adjoint equations can be solved using MODFLOW. In an illustrative example, we show that for this scenario, the adjoint approach is as accurate as standard forward numerical simulation methods, and requires substantially less computational effort.

  7. Long-term groundwater depletion in the United States

    USGS Publications Warehouse

    Konikow, Leonard F.

    2015-01-01

    The volume of groundwater stored in the subsurface in the United States decreased by almost 1000 km³ during 1900–2008. The aquifer systems with the three largest volumes of storage depletion include the High Plains aquifer, the Mississippi Embayment section of the Gulf Coastal Plain aquifer system, and the Central Valley of California. Depletion rates accelerated during 1945–1960, averaging 13.6 km³/year during the last half of the century, and after 2000 increased again to about 24 km³/year. Depletion intensity is a new parameter, introduced here, to provide a more consistent basis for comparing storage depletion problems among various aquifers by factoring in time and areal extent of the aquifer. During 2001–2008, the Central Valley of California had the largest depletion intensity. Groundwater depletion in the United States can explain 1.4% of observed sea-level rise during the 108-year study period and 2.1% during 2001–2008. Groundwater depletion must be confronted on local and regional scales to help reduce demand (primarily in irrigated agriculture) and/or increase supply.
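
    As a rough consistency check on the 1.4% figure, assuming an ocean surface area of about 3.61 x 10^8 km² and an observed 20th-century rise of roughly 1.7 mm/yr (neither number is taken from the paper):

        depletion_km3 = 1000.0           # groundwater storage loss, 1900-2008
        ocean_area_km2 = 3.61e8          # approximate global ocean surface area (assumption)
        years = 108
        observed_rate_mm_per_yr = 1.7    # approximate observed sea-level rise (assumption)

        equivalent_rise_mm = depletion_km3 / ocean_area_km2 * 1.0e6   # km -> mm
        observed_rise_mm = observed_rate_mm_per_yr * years
        print(f"depletion-equivalent rise: {equivalent_rise_mm:.1f} mm")
        print(f"share of observed rise: {equivalent_rise_mm / observed_rise_mm:.1%}")

    This gives roughly 2.8 mm out of about 184 mm, i.e. about 1.5%, in line with the 1.4% quoted above.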

  8. Long-term groundwater depletion in the United States.

    PubMed

    Konikow, Leonard F

    2015-01-01

    The volume of groundwater stored in the subsurface in the United States decreased by almost 1000 km³ during 1900-2008. The aquifer systems with the three largest volumes of storage depletion include the High Plains aquifer, the Mississippi Embayment section of the Gulf Coastal Plain aquifer system, and the Central Valley of California. Depletion rates accelerated during 1945-1960, averaging 13.6 km³/year during the last half of the century, and after 2000 increased again to about 24 km³/year. Depletion intensity is a new parameter, introduced here, to provide a more consistent basis for comparing storage depletion problems among various aquifers by factoring in time and areal extent of the aquifer. During 2001-2008, the Central Valley of California had the largest depletion intensity. Groundwater depletion in the United States can explain 1.4% of observed sea-level rise during the 108-year study period and 2.1% during 2001-2008. Groundwater depletion must be confronted on local and regional scales to help reduce demand (primarily in irrigated agriculture) and/or increase supply.

  9. Unified transport scaling laws for plasma blobs and depletions

    NASA Astrophysics Data System (ADS)

    Wiesenberger, M.; Held, M.; Kube, R.; Garcia, O. E.

    2017-06-01

    We study the dynamics of seeded plasma blobs and depletions in an (effective) gravitational field. For incompressible flows, the radial center of mass velocity of blobs and depletions is proportional to the square root of their initial cross-field size and amplitude. If the flows are compressible, this scaling holds only for ratios of amplitude to size larger than a critical value. Otherwise, the maximum blob and depletion velocity depends linearly on the initial amplitude and is independent of size. In both cases, the acceleration of blobs and depletions depends on their initial amplitude relative to the background plasma density and is proportional to gravity and independent of their cross-field size. Due to their reduced inertia, plasma depletions accelerate more quickly than the corresponding blobs. These scaling laws are derived from the invariants of the governing drift-fluid equations for blobs and agree excellently with numerical simulations over five orders of magnitude for both blobs and depletions. We suggest an empirical model that unifies and correctly captures the radial acceleration and maximum velocities of both blobs and depletions.
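
    The square-root scaling stated above can be written, up to an order-unity prefactor fixed by the invariant analysis, as V ~ sqrt(g * l * dn/N) for a structure of cross-field size l and amplitude dn over a background density N. A small illustration of that scaling with placeholder values (the prefactor and parameters are not taken from the simulations):

        import math

        def max_radial_velocity(g, size, amplitude, background, prefactor=1.0):
            """Square-root scaling of the maximum radial blob/depletion velocity."""
            return prefactor * math.sqrt(g * size * amplitude / background)

        g, background = 1.0e3, 1.0       # effective gravity and background density (placeholders)
        for size in (0.5, 1.0, 2.0):
            for amplitude in (0.25, 1.0):
                print(size, amplitude, round(max_radial_velocity(g, size, amplitude, background), 1))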

  10. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
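
    The tilted-edge step mentioned above reduces to: differentiate the measured edge spread function to obtain the line spread function, then take the magnitude of its Fourier transform and normalize to the zero-frequency value to obtain the MTF. A minimal sketch on a synthetic, Gaussian-blurred edge (the blur width and pixel pitch are illustrative):

        import numpy as np
        from scipy.special import erf

        pixel_pitch = 0.05                                   # mm per sample (illustrative)
        x = np.arange(-5.0, 5.0, pixel_pitch)
        esf = 0.5 * (1.0 + erf(x / (0.2 * np.sqrt(2))))      # blurred step, sigma = 0.2 mm

        lsf = np.gradient(esf, pixel_pitch)                  # line spread function = d(ESF)/dx
        mtf = np.abs(np.fft.rfft(lsf))
        mtf /= mtf[0]                                        # normalize to unity at zero frequency
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch)     # cycles per mm

        cutoff = freqs[np.argmax(mtf < 0.1)]                 # first frequency with MTF below 10%
        print(f"MTF10 cutoff: {cutoff:.2f} cycles/mm")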

  11. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
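
    On the error-detection side, the 16-bit CRC usually paired with these schemes for space links is the CCITT polynomial x^16 + x^12 + x^5 + 1 with an all-ones preset; a bitwise sketch of that check (framing and byte-order details are omitted):

        def crc16_ccitt(data: bytes, poly: int = 0x1021, init: int = 0xFFFF) -> int:
            """Bitwise CRC-16 with the CCITT polynomial and all-ones initial register."""
            crc = init
            for byte in data:
                crc ^= byte << 8
                for _ in range(8):
                    crc = ((crc << 1) ^ poly) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
            return crc

        frame = b"telemetry frame payload"
        print(hex(crc16_ccitt(frame)))   # sent with the frame; the receiver recomputes and compares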

  12. Induction technology optimization code

    SciTech Connect

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-08-21

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. The Induction Technology Optimization Study (ITOS) was undertaken to examine viable combinations of a linear induction accelerator and a relativistic klystron (RK) for high power microwave production. It is proposed that microwaves from the RK will power a high-gradient accelerator structure for linear collider development. Previous work indicates that the RK will require a nominal 3-MeV, 3-kA electron beam with a 100-ns flat top. The proposed accelerator-RK combination will be a high average power system capable of sustained microwave output at a 300-Hz pulse repetition frequency. The ITOS code models many combinations of injector, accelerator, and pulse power designs that will supply an RK with the beam parameters described above.
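
    A toy version of that optimization loop: sweep the two free design choices, evaluate a cost model at each grid point, and keep the cheapest design. The cost model below is a made-up placeholder standing in for the ITOS cost factors:

        import itertools

        def system_cost(rise_time_ns, aspect_ratio):
            """Placeholder cost model (arbitrary units); the real code sums machined-part,
            raw-material, and component costs derived from the resulting design."""
            core_cost = 120.0 * aspect_ratio + 300.0 / aspect_ratio
            pulser_cost = 2500.0 / rise_time_ns + 4.0 * rise_time_ns
            return core_cost + pulser_cost

        designs = itertools.product(range(20, 201, 20), (1.5, 2.0, 2.5, 3.0))
        best = min(designs, key=lambda d: system_cost(*d))
        print("cheapest (rise time ns, aspect ratio):", best, "cost:", round(system_cost(*best), 1))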

  13. Coded source neutron imaging

    NASA Astrophysics Data System (ADS)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.

  14. Depletion of mesospheric sodium during extended period of pulsating aurora

    NASA Astrophysics Data System (ADS)

    Takahashi, T.; Hosokawa, K.; Nozawa, S.; Tsuda, T. T.; Ogawa, Y.; Tsutsumi, M.; Hiraki, Y.; Fujiwara, H.; Kawahara, T. D.; Saito, N.; Wada, S.; Kawabata, T.; Hall, C.

    2017-01-01

    We quantitatively evaluated the Na density depletion due to charge transfer reactions between Na atoms and molecular ions produced by high-energy electron precipitation during a pulsating aurora (PsA). An extended period of PsA was captured by an all-sky camera at the European Incoherent Scatter (EISCAT) radar Tromsø site (69.6°N, 19.2°E) during a 2 h interval from 00:00 to 02:00 UT on 25 January 2012. During this period, using the EISCAT very high frequency (VHF) radar, we detected three intervals of intense ionization below 100 km that were probably caused by precipitation of high-energy electrons during the PsA. In these intervals, the sodium lidar at Tromsø observed characteristic depletion of Na density at altitudes between 97 and 100 km. These Na density depletions lasted for 8 min and represented 5-8% of the background Na layer. To examine the cause of this depletion, we modeled the depletion rate based on charge transfer reactions with NO+ and O2+ while changing the R value, which is defined as the ratio of NO+ to O2+ densities, from 1 to 10. The correlation coefficients between observed and modeled Na density depletion calculated with the typical value R = 3 for time intervals T1, T2, and T3 were 0.66, 0.80, and 0.67, respectively. The observed Na density depletion rates fall within the range of modeled depletion rates calculated with R from 1 to 10. This suggests that the charge transfer reactions triggered by the auroral impact ionization at low altitudes are the predominant process responsible for Na density depletion during PsA intervals.
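
    A bare-bones version of the loss model: Na is removed by charge transfer with NO+ and O2+, so d[Na]/dt = -(k1[NO+] + k2[O2+])[Na], with the ion partitioning set by R = [NO+]/[O2+]. The rate constants and ion density below are placeholders chosen only to show the structure, not the values used in the study:

        import numpy as np

        k_no, k_o2 = 8.0e-10, 4.0e-10    # charge-transfer rate constants, cm^3/s (placeholders)
        n_ion_total = 1.0e5              # molecular-ion density below 100 km, cm^-3 (placeholder)
        duration = 8 * 60                # seconds of enhanced ionization, from the observations

        for R in (1, 3, 10):             # R = [NO+]/[O2+]
            n_no = n_ion_total * R / (1 + R)
            n_o2 = n_ion_total / (1 + R)
            loss_rate = k_no * n_no + k_o2 * n_o2          # 1/s
            depletion = 1.0 - np.exp(-loss_rate * duration)
            print(f"R = {R:2d}: fractional Na depletion = {depletion:.1%}")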

  15. A Multilab Preregistered Replication of the Ego-Depletion Effect.

    PubMed

    Hagger, Martin S; Chatzisarantis, Nikos L D; Alberts, Hugo; Anggono, Calvin Octavianus; Batailler, Cédric; Birt, Angela R; Brand, Ralf; Brandt, Mark J; Brewer, Gene; Bruyneel, Sabrina; Calvillo, Dustin P; Campbell, W Keith; Cannon, Peter R; Carlucci, Marianna; Carruth, Nicholas P; Cheung, Tracy; Crowell, Adrienne; De Ridder, Denise T D; Dewitte, Siegfried; Elson, Malte; Evans, Jacqueline R; Fay, Benjamin A; Fennis, Bob M; Finley, Anna; Francis, Zoë; Heise, Elke; Hoemann, Henrik; Inzlicht, Michael; Koole, Sander L; Koppel, Lina; Kroese, Floor; Lange, Florian; Lau, Kevin; Lynch, Bridget P; Martijn, Carolien; Merckelbach, Harald; Mills, Nicole V; Michirev, Alexej; Miyake, Akira; Mosser, Alexandra E; Muise, Megan; Muller, Dominique; Muzi, Milena; Nalis, Dario; Nurwanti, Ratri; Otgaar, Henry; Philipp, Michael C; Primoceri, Pierpaolo; Rentzsch, Katrin; Ringos, Lara; Schlinkert, Caroline; Schmeichel, Brandon J; Schoch, Sarah F; Schrama, Michel; Schütz, Astrid; Stamos, Angelos; Tinghög, Gustav; Ullrich, Johannes; vanDellen, Michelle; Wimbarti, Supra; Wolff, Wanja; Yusainy, Cleoputri; Zerhouni, Oulmann; Zwienenberg, Maria

    2016-07-01

    Good self-control has been linked to adaptive outcomes such as better health, cohesive personal relationships, success in the workplace and at school, and less susceptibility to crime and addictions. In contrast, self-control failure is linked to maladaptive outcomes. Understanding the mechanisms by which self-control predicts behavior may assist in promoting better regulation and outcomes. A popular approach to understanding self-control is the strength or resource depletion model. Self-control is conceptualized as a limited resource that becomes depleted after a period of exertion resulting in self-control failure. The model has typically been tested using a sequential-task experimental paradigm, in which people completing an initial self-control task have reduced self-control capacity and poorer performance on a subsequent task, a state known as ego depletion. Although a meta-analysis of ego-depletion experiments found a medium-sized effect, subsequent meta-analyses have questioned the size and existence of the effect and identified instances of possible bias. The analyses served as a catalyst for the current Registered Replication Report of the ego-depletion effect. Multiple laboratories (k = 23, total N = 2,141) conducted replications of a standardized ego-depletion protocol based on a sequential-task paradigm by Sripada et al. Meta-analysis of the studies revealed that the size of the ego-depletion effect was small with 95% confidence intervals (CIs) that encompassed zero (d = 0.04, 95% CI [-0.07, 0.15]). We discuss implications of the findings for the ego-depletion effect and the resource depletion model of self-control. © The Author(s) 2016.

  16. The 'depletion layer' of amorphous p-n junctions

    NASA Technical Reports Server (NTRS)

    Von Roos, O.

    1981-01-01

    It is shown that within reasonable approximations for the density-of-states distribution within the mobility gap of a:Si, a one-to-one correspondence exists between the electric field distribution in the transition region of an amorphous p-n junction and that in the depletion layer of a crystalline p-n junction. Thus it is inferred that the depletion layer approximation, which leads to a parabolic potential distribution within the depletion layer of crystalline junctions, also constitutes a fair approximation in the case of amorphous junctions. This fact greatly simplifies an analysis of solid-state electronic devices based on amorphous material (i.e., solar cells).
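
    For reference, the crystalline depletion-layer approximation invoked above gives, for an abrupt junction, a width W = sqrt(2 * eps_s * V_bi * (N_A + N_D) / (q * N_A * N_D)) and a parabolic potential profile inside the layer. A quick numerical illustration with silicon-like parameters (the doping levels and built-in potential are illustrative):

        import math

        q = 1.602e-19              # C
        eps_s = 11.7 * 8.854e-12   # F/m, silicon-like permittivity
        V_bi = 0.7                 # built-in potential, V (illustrative)
        N_A, N_D = 1e22, 1e21      # acceptor/donor densities, m^-3 (i.e. 1e16 and 1e15 cm^-3)

        W = math.sqrt(2 * eps_s * V_bi * (N_A + N_D) / (q * N_A * N_D))
        x_n = W * N_A / (N_A + N_D)    # portion of the layer on the lightly doped n side
        print(f"depletion width: {W * 1e6:.2f} um (n-side share {x_n * 1e6:.2f} um)")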

  17. Challenges dealing with depleted uranium in Germany - Reuse or disposal

    SciTech Connect

    Moeller, Kai D.

    2007-07-01

    During enrichment, large amounts of depleted uranium are produced. In Germany, 2,800 tons of depleted uranium are generated every year. In Germany depleted uranium is not classified as radioactive waste but as a resource for further enrichment. Therefore, since 1996 depleted uranium has been sent to ROSATOM in Russia. However, the second generation of depleted uranium still has to be dealt with. To evaluate the alternative actions in case a solution has to be found in Germany, several studies have been initiated by the Federal Ministry of the Environment. The work that has been carried out evaluated various possibilities to deal with depleted uranium. The international studies in this field and the situation in Germany have been analyzed. In case no further enrichment is planned, the depleted uranium has to be stored. In the enrichment process UF₆ is generated. It is an international consensus that for storage it should be converted to U₃O₈. The necessary technique is well established. If the depleted uranium had to be classified as radioactive waste, final disposal would become necessary. For the planned Konrad repository (a repository for non-heat-generating radioactive waste) the amount of uranium is limited by the licensing authority. The existing license would not allow the final disposal of large amounts of depleted uranium in the Konrad repository. The potential effect on the safety case has not yet been thoroughly analyzed. As a result it may be necessary to think about alternatives. Several possibilities for the use of depleted uranium in industry have been identified. Studies indicate that the properties of uranium would make it useful in some industrial fields. Nevertheless, many practical and legal questions are open. One further option may be its use as shielding, e.g. in casks for transport or disposal. Possible techniques for using depleted uranium as shielding are the use of the metallic uranium as well as the inclusion in concrete

  18. It Is Chloride Depletion Alkalosis, Not Contraction Alkalosis

    PubMed Central

    Galla, John H.

    2012-01-01

    Maintenance of metabolic alkalosis generated by chloride depletion is often attributed to volume contraction. In balance and clearance studies in rats and humans, we showed that chloride repletion in the face of persisting alkali loading, volume contraction, and potassium and sodium depletion completely corrects alkalosis by a renal mechanism. Nephron segment studies strongly suggest the corrective response is orchestrated in the collecting duct, which has several transporters integral to acid-base regulation, the most important of which is pendrin, a luminal Cl/HCO3− exchanger. Chloride depletion alkalosis should replace the notion of contraction alkalosis. PMID:22223876

  19. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  20. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  1. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.
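
    One lightweight way to approximate "query by example" over source code is to parse the example, replace identifiers and constants with placeholders, and search a code base for statements with the same normalized structure. A minimal Python sketch of that idea (a simplification, not the system described above):

        import ast, copy

        def fingerprint(node: ast.AST) -> str:
            """Structural fingerprint of a statement: names and constants become placeholders."""
            node = copy.deepcopy(node)
            for n in ast.walk(node):
                if isinstance(n, ast.Name):
                    n.id = "_"
                elif isinstance(n, ast.Constant):
                    n.value = None
                elif isinstance(n, ast.Attribute):
                    n.attr = "_"
            return ast.dump(node)

        def query_by_example(example: str, source: str):
            """Line numbers of statements in `source` that match the shape of `example`."""
            wanted = {fingerprint(stmt) for stmt in ast.parse(example).body}
            return [n.lineno for n in ast.walk(ast.parse(source))
                    if isinstance(n, ast.stmt) and fingerprint(n) in wanted]

        example = "ratio = total / count"
        source = "x = 1\ny = a / b\nprint(y)\nshare = hits / trials\n"
        print(query_by_example(example, source))    # lines whose statements have the same shape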

  2. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  3. Quantification of stochastic uncertainty propagation for Monte Carlo depletion methods in reactor analysis

    NASA Astrophysics Data System (ADS)

    Newell, Quentin Thomas

    The Monte Carlo method provides powerful geometric modeling capabilities for large problem domains in 3-D; therefore, the Monte Carlo method is becoming popular for 3-D fuel depletion analyses to compute quantities of interest in spent nuclear fuel including isotopic compositions. The Monte Carlo approach has not been fully embraced due to unresolved issues concerning the effect of Monte Carlo uncertainties on the predicted results. Use of the Monte Carlo method to solve the neutron transport equation introduces stochastic uncertainty in the computed fluxes. These fluxes are used to collapse cross sections, estimate power distributions, and deplete the fuel within depletion calculations; therefore, the predicted number densities contain random uncertainties from the Monte Carlo solution. These uncertainties can be compounded in time because of the extrapolative nature of depletion and decay calculations. The objective of this research was to quantify the stochastic uncertainty propagation of the flux uncertainty, introduced by the Monte Carlo method, to the number densities for the different isotopes in spent nuclear fuel due to multiple depletion time steps. The research derived a formula that calculates the standard deviation in the nuclide number densities based on propagating the statistical uncertainty introduced when using coupled Monte Carlo depletion computer codes. The research was developed with the use of the TRITON/KENO sequence of the SCALE computer code. The linear uncertainty nuclide group approximation (LUNGA) method developed in this research approximated the variance of the ψN term, which is the variance in the flux shape due to uncertainty in the calculated nuclide number densities. Three different example problems were used in this research to calculate the standard deviation in the nuclide number densities using the LUNGA method. The example problems showed that the LUNGA method is capable of calculating the standard deviation of the nuclide
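
    The propagation idea can be seen on a one-nuclide toy problem: over a step, N is multiplied by exp(-sigma * phi * dt), so a statistical flux uncertainty contributes a relative number-density uncertainty of roughly sigma * dt * delta_phi per step, and independent step contributions add in quadrature. A minimal sketch comparing that first-order formula with direct sampling (all parameter values are illustrative):

        import numpy as np

        rng = np.random.default_rng(2)
        sigma = 1.0e-24          # one-group cross section, cm^2 (illustrative)
        phi = 1.0e14             # mean flux, n/cm^2/s (illustrative)
        rel_flux_std = 0.01      # 1% Monte Carlo statistical uncertainty on the flux per step
        dt = 30 * 86400.0        # 30-day depletion steps
        n_steps = 12

        # First-order propagation: per-step relative variances add in quadrature
        analytic_rel_std = np.sqrt(n_steps) * sigma * dt * phi * rel_flux_std

        # Brute-force check: sample an independent flux realization for every step
        fluxes = phi * (1.0 + rel_flux_std * rng.standard_normal((20000, n_steps)))
        n_final = np.exp(-sigma * dt * fluxes.sum(axis=1))      # N/N0 after all steps
        sampled_rel_std = n_final.std() / n_final.mean()

        print(f"linear propagation: {analytic_rel_std:.4%}, sampled: {sampled_rel_std:.4%}")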

  4. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  5. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements

  6. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  7. Polar Code Validation

    DTIC Science & Technology

    1989-09-30

    Only fragments of this record's report documentation page and table of contents were captured. Approved for public release. Recoverable contents include: Summary of POLAR Achievements; POLAR Code Physical Models; Structure of the Bipolar Plasma Sheath Generated by SPEAR I; The POLAR Code Wake Model: Comparison with in Situ Observations.

  8. Depleted uranium human health risk assessment, Jefferson Proving Ground, Indiana

    SciTech Connect

    Ebinger, M.H.; Hansen, W.R.

    1994-04-29

    The risk to human health from fragments of depleted uranium (DU) at Jefferson Proving Ground (JPG) was estimated using two types of ecosystem pathway models. A steady-state model of the JPG area was developed to examine the effects of DU in soils, water, and vegetation on deer that were hunted and consumed by humans. The RESRAD code was also used to estimate the effects of farming the impact area and consuming the products derived from the farm. The steady-state model showed that minimal doses to humans are expected from consumption of deer that inhabit the impact area. Median values for doses to humans range from about 1 mrem (±2.4) to 0.04 mrem (±0.13) and translate to less than 1 × 10⁻⁶ detriments (excess cancers) in the population. Monte Carlo simulation of the steady-state model was used to derive the probability distributions from which the median values were drawn. Sensitivity analyses of the steady-state model showed that the amount of DU in airborne dust and, therefore, the amount of DU on the vegetation surface, controlled the amount of DU ingested by deer and by humans. Human doses from the RESRAD estimates ranged from less than 1 mrem/y to about 6.5 mrem/y in a hunting scenario and subsistence farming scenario, respectively. The human doses exceeded the 100 mrem/y dose limit when drinking water for the farming scenario was obtained from the on-site aquifer that was presumably contaminated with DU. The two farming scenarios were unrealistic land uses because the additional risk to humans due to unexploded ordnance in the impact area was not figured into the risk estimate. The doses estimated with RESRAD translated to less than 1 × 10⁻⁶ detriments to about 1 × 10⁻³ detriments. The higher risks were associated only with the farming scenario in which drinking water was obtained on-site.
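
    As a sanity check on the detriment figures, assuming a nominal risk coefficient of roughly 5 x 10^-2 detriments per sievert (5 x 10^-4 per rem), a value which is not taken from the report, one year at the quoted RESRAD doses gives:

        risk_per_rem = 5.0e-4      # nominal detriment coefficient (assumption, ~5e-2 per Sv)

        for label, dose_mrem in (("hunting scenario", 1.0), ("subsistence farming scenario", 6.5)):
            detriment = dose_mrem * 1.0e-3 * risk_per_rem        # mrem -> rem -> detriments
            print(f"{label}: {dose_mrem} mrem/y -> ~{detriment:.1e} detriments per year of exposure")

    The hunting-scenario value of about 5 x 10^-7 per year is consistent with the "less than 1 x 10^-6" figure quoted above; the larger values in the report correspond to the higher-dose farming scenarios.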

  9. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1984-01-01

    Several error control coding techniques for reliable satellite communications were investigated to find algorithms for fast decoding of Reed-Solomon codes in terms of dual basis. The decoding of the (255,223) Reed-Solomon code, which is used as the outer code in the concatenated TDRSS decoder, was of particular concern.
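
    A small bookkeeping example for this concatenated arrangement: the (255,223) Reed-Solomon outer code carries 223 information bytes per 255-byte codeword, and pairing it with a rate-1/2 inner convolutional code roughly halves the overall rate again (interleaving and framing overhead are ignored):

        n, k = 255, 223                  # Reed-Solomon outer code parameters
        rs_rate = k / n
        conv_rate = 1 / 2                # inner convolutional code
        overall = rs_rate * conv_rate

        print(f"RS rate: {rs_rate:.3f}, overall concatenated rate: {overall:.3f}")
        print(f"channel bits per information bit: {1 / overall:.2f}")
        print(f"correctable byte errors per RS codeword: {(n - k) // 2}")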

  10. STRATOSPHERIC OZONE DEPLETION: A FOCUS ON EPA'S RESEARCH

    EPA Science Inventory

    In September of 1987 the United States, along with 26 other countries, signed a landmark treaty to limit and subsequently, through revisions, phase out the production of all significant ozone depleting substances. Many researchers suspected that these chemicals, especially chl...

  11. Depletion in Antarctic Ozone and Associated Climatic Change,

    DTIC Science & Technology

    ANTARCTIC REGIONS, *CLIMATE, *DEPLETION, *OZONE, AGREEMENTS, ATMOSPHERES, ATMOSPHERICS, CARBON, CARBON DIOXIDE, COMPUTATIONS, DIOXIDES, GREENHOUSE EFFECT, GREENHOUSES, HIGH LATITUDES, LATITUDE, LOSSES, MEAN, METHANE, MODELS, NETS, NITROUS OXIDE, OBSERVATION, OXIDES, PERTURBATIONS, REGIONS, STEADY

  12. STRATOSPHERIC OZONE DEPLETION: A FOCUS ON EPA'S RESEARCH

    EPA Science Inventory

    In September of 1987 the United States, along with 26 other countries, signed a landmark treaty to limit and subsequently, through revisions, phase out the production of all significant ozone depleting substances. Many researchers suspected that these chemicals, especially chl...

  13. Stimulated Emission Depletion Lithography with Mercapto-Functional Polymers

    PubMed Central

    2016-01-01

    Surface reactive nanostructures were fabricated using stimulated emission depletion (STED) lithography. The functionalization of the nanostructures was realized by copolymerization of a bifunctional metal oxo cluster in the presence of a triacrylate monomer. Ligands of the cluster surface cross-link to the monomer during the lithographic process, whereas unreacted mercapto-functionalized ligands are transferred to the polymer and remain reactive on the surface of the nanostructure after polymer formation. The depletion efficiency was investigated as a function of the cluster loading, and full depletion (saturation of the STED effect) was observed at cluster loadings exceeding 4 wt %. A feature size of λ/11 was achieved by using a donut-shaped depletion beam. The reactivity of the mercapto groups on the surface of the nanostructure was tested by incubation with mercapto-reactive fluorophores. PMID:26816204
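
    For orientation, the resolution of a STED-type process is commonly written as d = lambda / (2 * NA * sqrt(1 + I/I_sat)), so the achievable feature size shrinks with the square root of the depletion intensity above saturation. A quick illustration (the wavelength, numerical aperture, and saturation ratios are illustrative, not taken from the experiment):

        import math

        wavelength_nm = 780.0      # depletion wavelength (illustrative)
        NA = 1.4                   # objective numerical aperture (illustrative)

        for saturation_ratio in (0, 1, 5, 15, 50):     # I / I_sat
            d = wavelength_nm / (2 * NA * math.sqrt(1 + saturation_ratio))
            print(f"I/I_sat = {saturation_ratio:3d}: feature size ~ {d:5.0f} nm "
                  f"(lambda/{wavelength_nm / d:.1f})")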

  14. Individual differences in dopamine level modulate the ego depletion effect.

    PubMed

    Dang, Junhua; Xiao, Shanshan; Liu, Ying; Jiang, Yumeng; Mao, Lihua

    2016-01-01

    Initial exertion of self-control impairs subsequent self-regulatory performance, which is referred to as the ego depletion effect. The current study examined how individual differences in dopamine level, as indexed by eye blink rate (EBR), would moderate ego depletion. An inverted-U-shaped relationship between EBR and subsequent self-regulatory performance was found when participants initially engaged in self-control, but such a relationship was absent in the control condition where there was no initial exertion, suggesting individuals with a medium dopamine level may be protected from the typical ego depletion effect. These findings are consistent with a cognitive explanation which considers ego depletion as a phenomenon similar to "switch costs" that would be neutralized by factors promoting flexible switching. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. 10. VIEW OF DEPLETED URANIUM INGOT AND MOLD IN FOUNDRY. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. VIEW OF DEPLETED URANIUM INGOT AND MOLD IN FOUNDRY. (11/11/56) - Rocky Flats Plant, Non-Nuclear Production Facility, South of Cottonwood Avenue, west of Seventh Avenue & east of Building 460, Golden, Jefferson County, CO

  16. Background suppression in fluorescence nanoscopy with stimulated emission double depletion

    NASA Astrophysics Data System (ADS)

    Gao, Peng; Prunsche, Benedikt; Zhou, Lu; Nienhaus, Karin; Nienhaus, G. Ulrich

    2017-01-01

    Stimulated emission depletion (STED) fluorescence nanoscopy is a powerful super-resolution imaging technique based on the confinement of fluorescence emission to the central subregion of an observation volume through de-excitation of fluorophores in the periphery via stimulated emission. Here, we introduce stimulated emission double depletion (STEDD) as a method to selectively remove artificial background intensity. In this approach, a first, conventional STED pulse is followed by a second, delayed Gaussian STED pulse that specifically depletes the central region, thus leaving only background. Thanks to time-resolved detection we can remove this background intensity voxel by voxel by taking the weighted difference of photons collected before and after the second STED pulse. STEDD thus yields background-suppressed super-resolved images as well as STED-based fluorescence correlation spectroscopy data. Furthermore, the proposed method is also beneficial when considering lower-power, less redshifted depletion pulses.
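
    The background-removal step described above amounts to a per-voxel weighted difference of the photon counts gathered before and after the second (central) depletion pulse. A minimal sketch on synthetic counts (the weighting factor and count levels are illustrative):

        import numpy as np

        rng = np.random.default_rng(3)
        signal = np.zeros((64, 64))
        signal[28:36, 28:36] = 40.0                 # structure confined to the STED center
        background = 10.0 * np.ones((64, 64))       # artificial, uniform background

        n1 = rng.poisson(signal + background)       # photons before the second depletion pulse
        n2 = rng.poisson(background)                # photons after it: center depleted, background remains
        gamma = 1.0                                 # gate-matching weight (illustrative)

        stedd = np.clip(n1 - gamma * n2, 0, None)   # weighted difference, negatives clipped
        print("residual background:", stedd[:10, :10].mean(), "vs raw:", n1[:10, :10].mean())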

  17. Hyperspectral stimulated emission depletion microscopy and methods of use thereof

    SciTech Connect

    Timlin, Jerilyn A; Aaron, Jesse S

    2014-04-01

    A hyperspectral stimulated emission depletion ("STED") microscope system for high-resolution imaging of samples labeled with multiple fluorophores (e.g., two to ten fluorophores). The hyperspectral STED microscope includes a light source, optical systems configured for generating an excitation light beam and a depletion light beam, optical systems configured for focusing the excitation and depletion light beams on a sample, and systems for collecting and processing data generated by interaction of the excitation and depletion light beams with the sample. Hyperspectral STED data may be analyzed using multivariate curve resolution analysis techniques to deconvolute emission from the multiple fluorophores. The hyperspectral STED microscope described herein can be used for multi-color, subdiffraction imaging of samples (e.g., materials and biological materials) and for analyzing a tissue by Forster Resonance Energy Transfer ("FRET").
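
    When reference emission spectra for the fluorophores are available, the unmixing step can be approximated by per-pixel non-negative least squares; the system described above uses multivariate curve resolution, so the sketch below is a simplified stand-in with synthetic spectra:

        import numpy as np
        from scipy.optimize import nnls

        channels = np.linspace(550, 750, 32)                 # detection channels, nm

        def gaussian(center, width=20.0):
            return np.exp(-0.5 * ((channels - center) / width) ** 2)

        # Synthetic reference emission spectra for three fluorophores (columns of S)
        S = np.column_stack([gaussian(580), gaussian(630), gaussian(690)])

        true_abundances = np.array([3.0, 0.0, 1.5])          # one pixel's fluorophore content
        pixel = S @ true_abundances + 0.02 * np.random.default_rng(4).standard_normal(channels.size)

        abundances, _ = nnls(S, pixel)                       # non-negative spectral unmixing
        print("recovered abundances:", np.round(abundances, 2))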

  18. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criterion which may be used in the design of such codes which is significantly different from that used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s_i intermediate outputs each, where the sum of all s_i is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
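
    Steps (b) through (d) can be illustrated with a toy mapper: a block of intermediate coded bits is split into k groups and each group is mapped with its own constellation, here QPSK for one group and 8-PSK for the other. The step-(a) encoder is abstracted away and the bit-to-symbol labeling is illustrative, not the patented assignment:

        import cmath, math

        def psk_point(index: int, order: int) -> complex:
            """Unit-energy PSK constellation point for a given symbol index."""
            return cmath.exp(2j * math.pi * index / order)

        def map_groups(intermediate_bits, group_sizes, orders):
            """Split s intermediate bits into k groups and map each with its own scheme."""
            symbols, pos = [], 0
            for size, order in zip(group_sizes, orders):
                group = intermediate_bits[pos:pos + size]
                pos += size
                index = int("".join(map(str, group)), 2)   # natural binary labeling (illustrative)
                symbols.append(psk_point(index, order))
            return symbols

        s_bits = [1, 0, 1, 1, 0]        # s = 5 intermediate outputs from the step-(a) encoder
        print(map_groups(s_bits, group_sizes=(2, 3), orders=(4, 8)))   # k = 2: one QPSK + one 8-PSK symbol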

  19. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel, distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. It supports large-scale electronic circuit simulation with shared-memory parallel processing and enhanced convergence, and includes Sandia-specific device models.

  20. Code Optimization Techniques

    SciTech Connect

    MAGEE,GLEN I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
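
    One concrete optimization of the kind described above is replacing repeated Galois-field multiplications with table lookups. The sketch below builds log/antilog tables for GF(256) using the primitive polynomial 0x11d, a common but assumed choice (the AURA encoder's actual field polynomial and code layout are not reproduced here), and verifies the fast multiply against a bitwise reference.

      # Table-driven GF(256) multiplication, a typical Reed-Solomon encoder optimization.
      PRIM_POLY = 0x11d   # x^8 + x^4 + x^3 + x^2 + 1 (assumed; implementations vary)

      def gf_mul_slow(a, b):
          """Reference shift-and-add multiplication in GF(2^8)."""
          result = 0
          while b:
              if b & 1:
                  result ^= a
              a <<= 1
              if a & 0x100:
                  a ^= PRIM_POLY
              b >>= 1
          return result

      # Precompute exponential and logarithm tables once.
      EXP = [0] * 512
      LOG = [0] * 256
      x = 1
      for i in range(255):
          EXP[i] = x
          LOG[x] = i
          x = gf_mul_slow(x, 2)     # 2 is a primitive element for this polynomial
      for i in range(255, 512):
          EXP[i] = EXP[i - 255]     # doubled table avoids a modulo in the hot path

      def gf_mul_fast(a, b):
          """Table-driven multiplication: two lookups and one table index."""
          if a == 0 or b == 0:
              return 0
          return EXP[LOG[a] + LOG[b]]

      # Spot-check that the optimized version agrees with the reference.
      assert all(gf_mul_fast(a, b) == gf_mul_slow(a, b)
                 for a in range(256) for b in range(256))
      print("table-driven GF(256) multiply verified")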

  1. Dress Codes. Legal Brief.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  2. Code of Ethics.

    ERIC Educational Resources Information Center

    American Sociological Association, Washington, DC.

    The American Sociological Association's code of ethics for sociologists is presented. For sociological research and practice, 10 requirements for ethical behavior are identified, including: maintaining objectivity and integrity; fully reporting findings and research methods, without omission of significant data; reporting fully all sources of…

  4. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  5. Student Dress Codes.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  6. Video Coding for ESL.

    ERIC Educational Resources Information Center

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  7. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  8. Dual Coding in Children.

    ERIC Educational Resources Information Center

    Burton, John K.; Wildman, Terry M.

    The purpose of this study was to test the applicability of the dual coding hypothesis to children's recall performance. The hypothesis predicts that visual interference will have a small effect on the recall of visually presented words or pictures, but that acoustic interference will cause a decline in recall of visually presented words and…

  9. Coding for urologic office procedures.

    PubMed

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff.

  10. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliability of the bits.
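
    To make the "repeat" and "accumulate" building blocks concrete, here is a minimal Python sketch of a plain repeat-accumulate encoder (repetition, a fixed pseudo-random interleaver, then a running mod-2 accumulator). It illustrates the constituent operations only; it is not the ARA construction itself, its puncturing, or its mapping onto a high-level constellation.

      import random

      def repeat_accumulate_encode(info_bits, repeat=3, seed=0):
          """Toy repeat-accumulate encoder: repeat, permute, accumulate mod 2."""
          repeated = [b for b in info_bits for _ in range(repeat)]
          rng = random.Random(seed)            # fixed interleaver shared with the decoder
          order = list(range(len(repeated)))
          rng.shuffle(order)
          interleaved = [repeated[i] for i in order]
          coded, acc = [], 0
          for b in interleaved:                # 1/(1+D) accumulator
              acc ^= b
              coded.append(acc)
          return coded

      print(repeat_accumulate_encode([1, 0, 1, 1], repeat=3))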

  11. Retrieval of buried depleted uranium from the T-1 trench

    SciTech Connect

    Burmeister, M.; Castaneda, N.; Greengard, T.; Hull, C.; Barbour, D.; Quapp, W.J.

    1998-07-01

    The Trench 1 remediation project will be conducted this year to retrieve depleted uranium and other associated materials from a trench at Rocky Flats Environmental Technology Site. The excavated materials will be segregated and stabilized for shipment. The depleted uranium will be treated at an offsite facility which utilizes a novel approach for waste minimization and disposal through utilization of a combination of uranium recycling and volume efficient uranium stabilization.

  12. Selective Dissolution and Recovery of Depleted Uranium from Armor Plate

    DTIC Science & Technology

    1987-05-05

    Targets used in testing high-density armor-piercing ammunition containing depleted uranium (DU) are subject to disposal as low-level radioactive waste...radioactive material. Decontamination testing on targets characterized as containing residual entrained penetrator fragments is necessary to...depleted uranium from the acid solvent. A conceptual flowsheet, based on preliminary test data, suggested that uranium could be extracted from...

  13. Observed and Simulated Depletion Layers with Southward IMF

    DTIC Science & Technology

    2007-11-02

    Characteristics of an event on 12 March 2001, in which a depletion layer was observed just...in the ionosphere, follow magnetic field lines to near the magnetopause...depletion layers...region. The second type inhibits dayside merging and is a possible mechanism for understanding the saturation of the ionospheric potential under strongly driven...ionosphere/thermosphere. The simulations discussed here contain specifically selected parameters and simplifying approximations...

  14. Lithium Depletion in the Beta Pictoris Moving Group

    NASA Astrophysics Data System (ADS)

    Yee, Jennifer C.; Jensen, E. L.; Reaser, B. E.

    2006-12-01

    We present a study of lithium depletion in twelve late-type pre-main-sequence stars in the coeval Beta Pictoris Moving Group (BPMG). The age of this group (~12 Myr) is well constrained because all of the stars in the sample have Hipparcos distances. We have determined Li abundances for these K and M stars using equivalent width measurements of the 6707.8 Angstrom Li I line from new high-resolution, high-S/N echelle spectra, and we compare these abundances to models of pre-main-sequence Li depletion by Baraffe et al. (1998), D'Antona & Mazzitelli (1997, 1998), and Siess, Dufour, & Forestini (2000). Significantly more lithium depletion is observed in the sample than is predicted for a group of this age. In particular, the discrepancy between the predicted and the observed lithium abundances increases with decreasing effective temperature, suggesting a problem with theories describing pre-main-sequence lithium depletion. Our data indicate that M stars deplete lithium more rapidly than predicted, which could make M-type post-T-Tauri stars difficult to identify. In addition, we compare our results to the work of Song, Bessell, & Zuckerman (2002) on HIP 112312. In contrast to that work, we did not observe the lithium depletion boundary of the BPMG; none of the three M4.5 stars in the sample showed evidence of lithium (log N(Li) < -0.5), indicating a lithium depletion boundary later than M4.5, further underscoring the gap between age estimates from lithium depletion and those from theoretical evolutionary tracks. We gratefully acknowledge the support of the National Science Foundation through grant AST-0307830.

  15. Mechanism of actin polymerization in cellular ATP depletion.

    PubMed

    Atkinson, Simon J; Hosford, Melanie A; Molitoris, Bruce A

    2004-02-13

    Cellular ATP depletion in diverse cell types results in the net conversion of monomeric G-actin to polymeric F-actin and is an important aspect of cellular injury in tissue ischemia. We propose that this conversion results from altering the ratio of ATP-G-actin and ADP-G-actin, causing a net decrease in the concentration of thymosin-actin complexes as a consequence of the differential affinity of thymosin beta4 for ATP- and ADP-G-actin. To test this hypothesis we examined the effect of ATP depletion induced by antimycin A and substrate depletion on actin polymerization, the nucleotide state of the monomer pool, and the association of actin monomers with thymosin and profilin in the kidney epithelial cell line LLC-PK1. ATP depletion for 30 min increased F-actin content to 145% of the levels under physiological conditions, accompanied by a corresponding decrease in G-actin content. Cytochalasin D treatment did not reduce F-actin formation during ATP depletion, indicating that it was predominantly not because of barbed end monomer addition. ATP-G-actin levels decreased rapidly during depletion, but there was no change in the concentration of ADP-G-actin monomers. The decrease in ATP-G-actin levels could be accounted for by dissociation of the thymosin-G-actin binary complex, resulting in a rise in the concentration of free thymosin beta4 from 4 to 11 microM. Increased detection of profilin-actin complexes during depletion indicated that profilin may participate in catalyzing nucleotide exchange during depletion. This mechanism provides a biochemical basis for the accumulation of F-actin aggregates in ischemic cells.

  16. Competition, traits and resource depletion in plant communities.

    PubMed

    Violle, Cyrille; Garnier, Eric; Lecoeur, Jérémie; Roumet, Catherine; Podeur, Cécile; Blanchard, Alain; Navas, Marie-Laure

    2009-07-01

    Although of primary importance to explain plant community structure, general relationships between plant traits, resource depletion and competitive outcomes remain to be quantified across species. Here, we used a comparative approach to test whether instantaneous measurements of plant traits can capture both the amount of resources depleted under plant cover over time (competitive effect) and the way competitors perceived this resource depletion (competitive response). We performed a large competition experiment in which phytometers from a single grass species were transplanted within 18 different monocultures grown in a common-garden experiment, with a time-integrative quantification of light and water depletion over the phytometers' growing season. Resource-capturing traits were measured on both phytometers (competitive response traits) and monocultures (competitive effect traits). The total amounts of depleted light and water availabilities over the season strongly differed among monocultures; they were best estimated by instantaneous measurements of height and rooting depth, respectively, performed when either light or water became limiting. Specific leaf area and leaf water potential, two competitive response traits measured at the leaf level, were good predictors of changes in phytometer performance under competition, and reflected the amount of light and water, respectively, perceived by plants throughout their lifespan. Our results demonstrated the relevance of instantaneous measures of plant traits as indicators of resource depletion over time, validating the trait-based approach for competition ecology.

  17. Coding Theory and Projective Spaces

    NASA Astrophysics Data System (ADS)

    Silberstein, Natalia

    2008-05-01

    The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We derive the properties of these codes, which can also be viewed as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction for two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code for a code in the projective space to obtain large codes which are not constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally we describe a search method for constant dimension lexicodes.
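
    For orientation, the metric normally used in this setting is the subspace distance d_S(U, V) = dim U + dim V - 2 dim(U ∩ V). The Python sketch below evaluates it for row-spans over GF(2) via the identity dim(U ∩ V) = dim U + dim V - dim(U + V); it illustrates the standard metric only, not the new distance formula derived in the work itself.

      def gf2_rank(rows):
          """Rank over GF(2); each row is an int bitmask of coordinates."""
          rank = 0
          rows = list(rows)
          while rows:
              pivot = rows.pop()
              if pivot == 0:
                  continue
              rank += 1
              lead = 1 << (pivot.bit_length() - 1)
              rows = [r ^ pivot if r & lead else r for r in rows]
          return rank

      def subspace_distance(U, V):
          """d_S(U, V) = 2*dim(U + V) - dim U - dim V for spanning sets over GF(2)."""
          dU, dV = gf2_rank(U), gf2_rank(V)
          d_sum = gf2_rank(list(U) + list(V))   # dim(U + V)
          return 2 * d_sum - dU - dV

      # Two 2-dimensional subspaces of GF(2)^4, rows written as bitmasks.
      U = [0b1000, 0b0100]            # span{e1, e2}
      V = [0b1000, 0b0010]            # span{e1, e3}
      print(subspace_distance(U, V))  # dim(U ∩ V) = 1, so the distance is 2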

  18. Fundamentals of coding and reimbursement.

    PubMed

    Price, Paula

    2002-01-01

    After completing this introduction to radiology coding and reimbursement, readers will: Understand how health care reimbursement evolved over the past 50 years. Know the importance of documenting the patient's history. Have an overall picture of the standardized numerical coding system. Understand how accurate coding affects reimbursement. Understand coding functions as they pertain to regulatory compliance in the radiology department. Be familiar with the U.S. Justice Department's use of coding in tracking health care fraud.

  19. High Dimensional Trellis Coded Modulation

    DTIC Science & Technology

    2002-03-01

    popular recently for the decoding of turbo codes (or parallel concatenated codes), which require an iteration between two permuted code sequences. The...nonsystematic constituent codes). Published descriptions of the implementation of turbo decoders refer to the permuted “common” or “extrinsic” information...invented based on that condition. With the recent development of turbo codes [4] and the requirement of short frame transmission [5] [6], trellis...

  20. Genetics Home Reference: MPV17-related hepatocerebral mitochondrial DNA depletion syndrome

    MedlinePlus

    MPV17-related hepatocerebral mitochondrial DNA depletion syndrome is an inherited disorder that can ...

  1. Engineering analysis report for the long-term management of depleted uranium hexafluoride : storage of depleted uranium metal.

    SciTech Connect

    Folga, S.M.; Kier, P.H.; Thimmapuram, P.R.

    2001-01-24

    This report contains an engineering analysis of long-term storage of uranium metal in boxes as an option for long-term management of depleted uranium hexafluoride (UF6). Three storage facilities are considered: buildings, vaults, and mined cavities. Three cases are considered: either all, half, or a quarter of the depleted uranium metal that would be produced from the conversion of depleted UF6 is stored at the facility. The analysis of these alternatives is based on a box design used in the Final Programmatic Environmental Impact Statement for Alternative Strategies for the Long-Term Management and Use of Depleted Uranium Hexafluoride, report DOE/EIS-0269, published in 1999 by the US Department of Energy. This box design does not appear to effectively use space within the box. Hence, an alternative box design that allows for a reduced storage area is addressed in the appendices for long-term storage in buildings.

  2. Quantum codes from linear codes over finite chain rings

    NASA Astrophysics Data System (ADS)

    Liu, Xiusheng; Liu, Hualu

    2017-10-01

    In this paper, we provide two methods of constructing quantum codes from linear codes over finite chain rings. The first one is derived from the Calderbank-Shor-Steane (CSS) construction applied to self-dual codes over finite chain rings. The second construction is derived from the CSS construction applied to Gray images of linear codes over the finite chain ring F_{p^{2m}} + uF_{p^{2m}}. Quantum codes with good parameters are obtained from cyclic codes over finite chain rings.
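
    As background for the constructions mentioned above, the standard CSS recipe over a field can be stated as follows (in LaTeX); the chain-ring and Gray-image refinements developed in the paper are not reproduced here.

      % Standard CSS construction, field case, stated for orientation only.
      C_2 \subseteq C_1 \subseteq \mathbb{F}_q^{\,n}, \qquad
      C_1 = [n, k_1, d_1]_q, \quad C_2 = [n, k_2, d_2]_q
      \;\Longrightarrow\;
      \exists \; [[\, n,\; k_1 - k_2,\; d \,]]_q \quad\text{with}\quad
      d \;\ge\; \min\{\, d(C_1),\, d(C_2^{\perp}) \,\}.
      % In particular, a dual-containing code C^{\perp} \subseteq C = [n, k, d]_q
      % yields a quantum code [[\, n,\; 2k - n,\; \ge d \,]]_q.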

  3. A semi-empirical model for the formation and depletion of the high burnup structure in UO2

    NASA Astrophysics Data System (ADS)

    Pizzocri, D.; Cappia, F.; Luzzi, L.; Pastore, G.; Rondinella, V. V.; Van Uffelen, P.

    2017-04-01

    In the rim zone of UO2 nuclear fuel pellets, the combination of high burnup and low temperature drives a microstructural change, leading to the formation of the high burnup structure (HBS). In this work, we propose a semi-empirical model to describe the formation of the HBS, which embraces the polygonisation/recrystallization process and the depletion of intra-granular fission gas, describing them as inherently related. For this purpose, we performed grain-size measurements on samples at radial positions in which the restructuring was incomplete. Based on these new experimental data, we infer an exponential reduction of the average grain size with local effective burnup, paired with a simultaneous depletion of intra-granular fission gas driven by diffusion. The comparison with currently used models indicates the applicability of the herein developed model within integral fuel performance codes.
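
    A minimal numerical sketch of the kind of relation described above is given below. The functional form (a simple exponential decay of the average grain size towards an asymptote, with the retained intra-granular gas falling off at the same rate) and all parameter values are placeholders chosen for illustration; they are not the fitted correlations of the published model.

      import math

      def average_grain_size(burnup_eff, d0=10.0, d_inf=0.2, k=0.02):
          """Illustrative exponential reduction of the average grain size (um)
          with local effective burnup (GWd/tHM); d0, d_inf and k are placeholders."""
          return d_inf + (d0 - d_inf) * math.exp(-k * burnup_eff)

      def intragranular_gas_fraction(burnup_eff, k=0.02):
          """Companion placeholder: fraction of intra-granular gas retained."""
          return math.exp(-k * burnup_eff)

      for bu in (0, 25, 50, 75, 100):
          print(bu, round(average_grain_size(bu), 2),
                round(intragranular_gas_fraction(bu), 2))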

  4. A semi-empirical model for the formation and depletion of the high burnup structure in UO 2

    DOE PAGES

    Pizzocri, D.; Cappia, F.; Luzzi, L.; ...

    2017-01-31

    In the rim zone of UO2 nuclear fuel pellets, the combination of high burnup and low temperature drives a microstructural change, leading to the formation of the high burnup structure (HBS). In this work, we propose a semi-empirical model to describe the formation of the HBS, which embraces the polygonisation/recrystallization process and the depletion of intra-granular fission gas, describing them as inherently related. To this end, we performed grain-size measurements on samples at radial positions in which the restructuring was incomplete. Moreover, based on these new experimental data, we assume an exponential reduction of the average grain size with local effective burnup, paired with a simultaneous depletion of intra-granular fission gas driven by diffusion. The comparison with currently used models indicates the applicability of the herein developed model within integral fuel performance codes.

  5. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

    Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called the Spectral Analysis Manager (SPAM), developed for remotely sensed imagery by Mazer et al. For a given spectral signature, SPAM calculates its spectral mean and inter-band spectral differences and uses them as thresholds to generate a binary code word for that particular signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods, along with SPAM, are evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced as a performance measure.
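
    The thresholding idea can be written down directly. The sketch below encodes one bit per band against the spectral mean plus one bit per adjacent-band difference, which is one plausible reading of SPAM-style coding; the exact bit layout used by SPAM and by the EPP/HP/MP variants is not reproduced here.

      import numpy as np

      def binary_code_word(spectrum):
          """Two-part binary code word for one pixel spectrum:
          amplitude bits  - 1 where the band value exceeds the spectral mean;
          difference bits - 1 where the band-to-band difference is non-negative."""
          spectrum = np.asarray(spectrum, dtype=float)
          amplitude_bits = (spectrum > spectrum.mean()).astype(int)
          difference_bits = (np.diff(spectrum) >= 0).astype(int)
          return np.concatenate([amplitude_bits, difference_bits])

      # Two invented five-band signatures, compared by Hamming distance.
      sig_a = [0.12, 0.15, 0.40, 0.42, 0.30]
      sig_b = [0.35, 0.30, 0.22, 0.18, 0.10]
      code_a, code_b = binary_code_word(sig_a), binary_code_word(sig_b)
      print(code_a, code_b, "Hamming distance:", int(np.sum(code_a != code_b)))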

  6. Epetra developers coding guidelines.

    SciTech Connect

    Heroux, Michael Allen; Sexton, Paul Michael

    2003-12-01

    Epetra is a package of classes for the construction and use of serial and distributed parallel linear algebra objects. It is one of the base packages in Trilinos. This document describes guidelines for Epetra coding style. The issues discussed here go beyond correct C++ syntax to address issues that make code more readable and self-consistent. The guidelines presented here are intended to aid current and future development of Epetra specifically. They reflect design decisions that were made in the early development stages of Epetra. Some of the guidelines are contrary to more commonly used conventions, but we choose to continue these practices for the purposes of self-consistency. These guidelines are intended to be complementary to policies established in the Trilinos Developers Guide.

  7. CTI Correction Code

    NASA Astrophysics Data System (ADS)

    Massey, Richard; Stoughton, Chris; Leauthaud, Alexie; Rhodes, Jason; Koekemoer, Anton; Ellis, Richard; Shaghoulian, Edgar

    2013-07-01

    Charge Transfer Inefficiency (CTI) due to radiation damage above the Earth's atmosphere creates spurious trailing in images from Charge-Coupled Device (CCD) imaging detectors. Radiation damage also creates unrelated warm pixels, which can be used to measure CTI. This code provides pixel-based correction for CTI and has proven effective in Hubble Space Telescope Advanced Camera for Surveys raw images, successfully reducing the CTI trails by a factor of ~30 everywhere in the CCD and at all flux levels. The core is written in java for speed, and a front-end user interface is provided in IDL. The code operates on raw data by returning individual electrons to pixels from which they were unintentionally dragged during readout. Correction takes about 25 minutes per ACS exposure, but is trivially parallelisable to multiple processors.

  8. The NIMROD Code

    NASA Astrophysics Data System (ADS)

    Schnack, D. D.; Glasser, A. H.

    1996-11-01

    NIMROD is a new code system that is being developed for the analysis of modern fusion experiments. It is being designed from the beginning to make the maximum use of massively parallel computer architectures and computer graphics. The NIMROD physics kernel solves the three-dimensional, time-dependent two-fluid equations with neo-classical effects in toroidal geometry of arbitrary poloidal cross section. The NIMROD system also includes a pre-processor, a grid generator, and a post processor. User interaction with NIMROD is facilitated by a modern graphical user interface (GUI). The NIMROD project is using Quality Function Deployment (QFD) team management techniques to minimize re-engineering and reduce code development time. This paper gives an overview of the NIMROD project. Operation of the GUI is demonstrated, and the first results from the physics kernel are given.

  9. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semianalytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection, designed to assist state and local technical staff with the task of Wellhead Protection Area (WHPA) delineation. A complete news item appeared in Eos, May 1, 1990, p. 690. The model consists of four independent, semianalytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  10. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area (WHPA) code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semi-analytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection. It is designed to assist state and local technical staff with the task of WHPA delineation.The model consists of four independent, semi-analytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  11. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
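
    The synthesis side of such a sine-wave representation is compact enough to sketch. The frame below is reconstructed as a sum of sinusoids from amplitude/frequency/phase triples; all parameter values are invented for illustration, and the coder's actual quantization tables, frame interpolation, and phase handling are not reproduced.

      import numpy as np

      def synthesize_frame(params, frame_len=160, fs=8000):
          """Reconstruct one speech frame as a sum of sine waves.
          params: iterable of (amplitude, frequency_hz, phase_rad) triples."""
          t = np.arange(frame_len) / fs
          frame = np.zeros(frame_len)
          for amplitude, freq_hz, phase in params:
              frame += amplitude * np.cos(2 * np.pi * freq_hz * t + phase)
          return frame

      # Invented sine-wave parameters for a single 20 ms frame at 8 kHz.
      sine_params = [(0.6, 200.0, 0.0), (0.3, 400.0, 1.2), (0.1, 600.0, -0.5)]
      print(synthesize_frame(sine_params)[:5])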

  12. Efficient convolutional sparse coding

    DOEpatents

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
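
    The enabling fact behind such a frequency-domain solver is that circular convolution becomes an element-wise product after an FFT. The short check below verifies that identity with numpy; it stands in for, and does not reproduce, the patented ADMM iteration or its frequency-domain linear solve.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 64                              # signal length
      d = rng.standard_normal(8)          # one dictionary filter
      x = rng.standard_normal(N)          # one coefficient map

      # Circular convolution computed in the Fourier domain (element-wise product).
      D = np.fft.fft(np.pad(d, (0, N - d.size)))
      via_fft = np.real(np.fft.ifft(D * np.fft.fft(x)))

      # The same circular convolution computed directly in the signal domain.
      dp = np.pad(d, (0, N - d.size))
      via_sum = np.array([sum(dp[k] * x[(n - k) % N] for k in range(N))
                          for n in range(N)])

      print(np.allclose(via_fft, via_sum))   # True: convolution is diagonal in Fourier space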

  13. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and reconstructing the shadow image into a 3-dimensional image of every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.
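
    The final correlation step can be illustrated with a toy two-dimensional example: a shadow image formed by two point sources is correlated with the aperture pattern, producing peaks at the source positions. The random binary aperture and the FFT-based correlation below are generic stand-ins; they are not the non-diffractive confocal aperture or the reconstruction procedure of the patent.

      import numpy as np

      rng = np.random.default_rng(1)
      N = 64
      aperture = (rng.random((N, N)) < 0.5).astype(float)   # toy binary coded aperture

      # The recorded shadow image is the sum of aperture copies shifted to each source.
      sources = [(10, 20), (40, 45)]
      shadow = np.zeros((N, N))
      for (r, c) in sources:
          shadow += np.roll(np.roll(aperture, r, axis=0), c, axis=1)

      # Correlate the shadow image with the aperture pattern (done via FFTs).
      corr = np.real(np.fft.ifft2(np.fft.fft2(shadow) * np.conj(np.fft.fft2(aperture))))
      rows, cols = np.unravel_index(np.argsort(corr, axis=None)[-2:], corr.shape)
      print(sorted(zip(rows, cols)))   # correlation peaks recover the source positions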

  14. A robust TEC depletion detector algorithm for satellite based navigation in Indian zone and depletion analysis for GAGAN

    NASA Astrophysics Data System (ADS)

    Dashora, Nirvikar

    2012-07-01

    Equatorial plasma bubbles (EPBs) and associated plasma irregularities are known to cause severe scintillation of satellite signals and to produce range errors, which eventually result either in loss of lock of the signal or in random fluctuations in TEC, respectively, affecting precise positioning and navigation solutions. The EPBs manifest as sudden reductions in line-of-sight TEC, more often called TEC depletions, which are spread over thousands of km in the meridional direction and a few hundred km in the zonal direction. They change shape and size while drifting from one longitude to another in the nighttime ionosphere. For a satellite-based navigation system like GAGAN in India, which depends upon (i) multiple satellites (i.e. GPS), (ii) multiple ground reference stations, and (iii) near-real-time data processing, such EPBs are of grave concern. A TEC model generally provides near-real-time grid-based ionospheric vertical errors (GIVEs) over hypothetical 5x5 degree latitude-longitude grid points. But on nights when a TEC depletion occurs in a given longitude sector, it is almost impossible for any system to give a forecast of GIVEs. If loss-of-lock events occur due to scintillation, there is no way to improve the situation. But when large and random depletions in TEC occur with scintillations and without loss-of-lock, low-latitude TEC is affected in two ways: (a) multiple satellites show depleted TEC which may be very different from model-TEC values, and hence the GIVE would be incorrect over various grid points; and (b) the user may be affected by depletions which are not sampled by reference stations, and hence the interpolated GIVE within one square would be grossly erroneous. The most general solution (and by far the most difficult) is having advance knowledge of the spatio-temporal occurrence and precise magnitude of such depletions. While forecasting TEC depletions in the spatio-temporal domain is a scientific challenge (as we show below), operational systems

  15. The Phantom SPH code

    NASA Astrophysics Data System (ADS)

    Price, Daniel; Wurster, James; Nixon, Chris

    2016-05-01

    I will present the capabilities of the Phantom SPH code for global simulations of dust and gas in protoplanetary discs. I will present our new algorithms for simulating both small and large grains in discs, as well as our progress towards simulating evolving grain populations and coupling with radiation. Finally, I will discuss our recent applications to HL Tau and the physics of dust gap opening.

  16. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and the graphical user interface.

  17. The Tau Code

    PubMed Central

    Avila, Jesús

    2009-01-01

    In this short review, I will focus on how a unique tau gene may produce many tau isoforms through alternative splicing and how the phosphorylation of these isoforms by different kinases may affect their activity and behaviour. Indeed, each of the different tau isoforms may play a distinct role under both physiological and pathological conditions. Thus, I will discuss whether a tau code exists that might explain the involvement of different tau isoforms in different cellular functions. PMID:20552052

  18. Trajectory Code Studies, 1987

    SciTech Connect

    Poukey, J.W.

    1988-01-01

    The trajectory code TRAJ has been used extensively to study nonimmersed foilless electron diodes. The basic goal of the research is to design low-emittance injectors for electron linacs and propagation experiments. Systems studied during 1987 include Delphi, Recirc, and Troll. We also discuss a partly successful attempt to extend the same techniques to high currents (tens of kA). 7 refs., 30 figs.

  19. The PHARO Code.

    DTIC Science & Technology

    1981-11-24

    Keywords: visible radiation; sensors; infrared radiation; line and band transitions; isophots; high-altitude nuclear data. ...radiation (watts/sr) in arbitrary wavelength intervals is determined. The results are a series of "isophot" plots for arbitrarily placed cameras or sensors...Section II. The output of the PHARO code consists of contour plots of radiative intensity (watts/cm ster) or "isophot" plots for arbitrarily placed sensors.

  20. HYCOM Code Development

    DTIC Science & Technology

    2003-02-10

    HYCOM code development. Alan J. Wallcraft, Naval Research Laboratory. Presented at the 2003 Layered Ocean Model Users' Workshop (LOM 2003), Miami, FL, February 10, 2003. Topics: Kraus-Turner mixed layer; Energy-Loan (passive) ice model; high-frequency atmospheric forcing; new I/O scheme (.a and .b files); scalability via...

  1. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  2. Ego depletion decreases trust in economic decision making

    PubMed Central

    Ainsworth, Sarah E.; Baumeister, Roy F.; Vohs, Kathleen D.; Ariely, Dan

    2014-01-01

    Three experiments tested the effects of ego depletion on economic decision making. Participants completed a task either requiring self-control or not. Then participants learned about the trust game, in which senders are given an initial allocation of $10 to split between themselves and another person, the receiver. The receiver receives triple the amount given and can send any, all, or none of the tripled money back to the sender. Participants were assigned the role of the sender and decided how to split the initial allocation. Giving less money, and therefore not trusting the receiver, is the safe, less risky response. Participants who had exerted self-control and were depleted gave the receiver less money than those in the non-depletion condition (Experiment 1). This effect was replicated and moderated in two additional experiments. Depletion again led to lower amounts given (less trust), but primarily among participants who were told they would never meet the receiver (Experiment 2) or who were given no information about how similar they were to the receiver (Experiment 3). Amounts given did not differ for depleted and non-depleted participants who either expected to meet the receiver (Experiment 2) or were led to believe that they were very similar to the receiver (Experiment 3). Decreased trust among depleted participants was strongest among neurotics. These results imply that self-control facilitates behavioral trust, especially when no other cues signal decreased social risk in trusting, such as if an actual or possible relationship with the receiver were suggested. PMID:25013237

  3. Global storm time depletion of the outer electron belt.

    PubMed

    Ukhorskiy, A Y; Sitnov, M I; Millan, R M; Kress, B T; Fennell, J F; Claudepierre, S G; Barnes, R J

    2015-04-01

    The outer radiation belt consists of relativistic (>0.5 MeV) electrons trapped on closed trajectories around Earth where the magnetic field is nearly dipolar. During increased geomagnetic activity, electron intensities in the belt can vary by orders of magnitude at different spatial and temporal scales. The main phase of geomagnetic storms often produces deep depletions of electron intensities over broad regions of the outer belt. Previous studies identified three possible processes that can contribute to the main-phase depletions: adiabatic inflation of electron drift orbits caused by the ring current growth, electron loss into the atmosphere, and electron escape through the magnetopause boundary. In this paper we investigate the relative importance of the adiabatic effect and magnetopause loss to the rapid depletion of the outer belt observed at the Van Allen Probes spacecraft during the main phase of 17 March 2013 storm. The intensities of >1 MeV electrons were depleted by more than an order of magnitude over the entire radial extent of the belt in less than 6 h after the sudden storm commencement. For the analysis we used three-dimensional test particle simulations of global evolution of the outer belt in the Tsyganenko-Sitnov (TS07D) magnetic field model with an inductive electric field. Comparison of the simulation results with electron measurements from the Magnetic Electron Ion Spectrometer experiment shows that magnetopause loss accounts for most of the observed depletion at L>5, while at lower L shells the depletion is adiabatic. Both magnetopause loss and the adiabatic effect are controlled by the change in global configuration of the magnetic field due to storm time development of the ring current; a simulation of electron evolution without a ring current produces a much weaker depletion.

  4. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  5. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic. Copyright © 2014. Published by Elsevier Masson SAS.

  6. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  7. Lithium Depletion is a Strong Test of Core-envelope Recoupling

    NASA Astrophysics Data System (ADS)

    Somers, Garrett; Pinsonneault, Marc H.

    2016-09-01

    Rotational mixing is a prime candidate for explaining the gradual depletion of lithium from the photospheres of cool stars during the main sequence. However, previous mixing calculations have relied primarily on treatments of angular momentum transport in stellar interiors incompatible with solar and stellar data in the sense that they overestimate the internal differential rotation. Instead, recent studies suggest that stars are strongly differentially rotating at young ages but approach a solid body rotation during their lifetimes. We modify our rotating stellar evolution code to include an additional source of angular momentum transport, a necessary ingredient for explaining the open cluster rotation pattern, and examine the consequences for mixing. We confirm that core-envelope recoupling with a ∼20 Myr timescale is required to explain the evolution of the mean rotation pattern along the main sequence, and demonstrate that it also provides a more accurate description of the Li depletion pattern seen in open clusters. Recoupling produces a characteristic pattern of efficient mixing at early ages and little mixing at late ages, thus predicting a flattening of Li depletion at a few Gyr, in agreement with the observed late-time evolution. Using Li abundances we argue that the timescale for core-envelope recoupling during the main sequence decreases sharply with increasing mass. We discuss the implications of this finding for stellar physics, including the viability of gravity waves and magnetic fields as agents of angular momentum transport. We also raise the possibility of intrinsic differences in initial conditions in star clusters using M67 as an example.

  8. An Automated, Multi-Step Monte Carlo Burnup Code System.

    SciTech Connect

    TRELLUE, HOLLY R.

    2003-07-14

    Version 02 MONTEBURNS Version 2 calculates coupled neutronic/isotopic results for nuclear systems and produces a large number of criticality and burnup results based on various material feed/removal specifications, power(s), and time intervals. MONTEBURNS is a fully automated tool that links the LANL MCNP Monte Carlo transport code with a radioactive decay and burnup code. Highlights on changes to Version 2 are listed in the transmittal letter. Along with other minor improvements in MONTEBURNS Version 2, the option was added to use CINDER90 instead of ORIGEN2 as the depletion/decay part of the system. CINDER90 is a multi-group depletion code developed at LANL and is not currently available from RSICC. This MONTEBURNS release was tested with various combinations of CCC-715/MCNPX 2.4.0, CCC-710/MCNP5, CCC-700/MCNP4C, CCC-371/ORIGEN2.2, ORIGEN2.1 and CINDER90. Perl is required software and is not included in this distribution. MCNP, ORIGEN2, and CINDER90 are not included.
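
    For readers new to this class of tools, the control flow that MONTEBURNS automates can be pictured as a loop that alternates a transport solve with a depletion solve. The sketch below is deliberately schematic: the "transport" and "depletion" steps are trivial single-nuclide placeholders (a flux proportional to the fissile inventory and an exponential burnout), and none of it reflects the actual MCNP, ORIGEN2, or CINDER90 interfaces, file formats, or feed/removal handling.

      import math

      def run_transport(n_fissile):
          """Placeholder 'transport' solve: flux and k-eff shrink as fuel is consumed."""
          flux = 1.0e14 * n_fissile        # n/cm^2/s, arbitrary scaling
          keff = 0.9 + 0.3 * n_fissile     # invented trend, for illustration only
          return flux, keff

      def run_depletion(n_fissile, flux, dt_seconds, sigma_a=5.0e-22):
          """Placeholder 'depletion' solve: exponential burnout under constant flux."""
          return n_fissile * math.exp(-sigma_a * flux * dt_seconds)

      n = 1.0                              # normalized fissile inventory
      for step in range(5):                # five burn steps of 30 days each
          flux, keff = run_transport(n)
          n = run_depletion(n, flux, 30 * 86400)
          print(f"step {step}: keff = {keff:.3f}, fissile remaining = {n:.3f}")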

  9. The triple distribution of codes and ordered codes

    PubMed Central

    Trinker, Horst

    2011-01-01

    We study the distribution of triples of codewords of codes and ordered codes. Schrijver [A. Schrijver, New code upper bounds from the Terwilliger algebra and semidefinite programming, IEEE Trans. Inform. Theory 51 (8) (2005) 2859–2866] used the triple distribution of a code to establish a bound on the number of codewords based on semidefinite programming. In the first part of this work, we generalize this approach for ordered codes. In the second part, we consider linear codes and linear ordered codes and present a MacWilliams-type identity for the triple distribution of their dual code. Based on the non-negativity of this linear transform, we establish a linear programming bound and conclude with a table of parameters for which this bound yields better results than the standard linear programming bound. PMID:22505770

  10. Computer-Based Coding of Occupation Codes for Epidemiological Analyses

    PubMed Central

    Russ, Daniel E.; Ho, Kwan-Yuet; Johnson, Calvin A.; Friesen, Melissa C.

    2014-01-01

    Mapping job titles to standardized occupation classification (SOC) codes is an important step in evaluating changes in health risks over time as measured in inspection databases. However, manual SOC coding is cost prohibitive for very large studies. Computer based SOC coding systems can improve the efficiency of incorporating occupational risk factors into large-scale epidemiological studies. We present a novel method of mapping verbatim job titles to SOC codes using a large table of prior knowledge available in the public domain that included detailed description of the tasks and activities and their synonyms relevant to each SOC code. Job titles are compared to our knowledge base to find the closest matching SOC code. A soft Jaccard index is used to measure the similarity between a previously unseen job title and the knowledge base. Additional information such as standardized industrial codes can be incorporated to improve the SOC code determination by providing additional context to break ties in matches. PMID:25221787
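
    The matching step lends itself to a compact illustration. The sketch below scores a job title against a tiny, invented knowledge base with a plain token-level Jaccard index; the "soft" variant described in the paper additionally allows approximate token matches, which is not reproduced here, and the SOC descriptions shown are placeholders rather than the real knowledge base.

      def jaccard(tokens_a, tokens_b):
          """Plain Jaccard index between two token sets."""
          a, b = set(tokens_a), set(tokens_b)
          if not a and not b:
              return 0.0
          return len(a & b) / len(a | b)

      # Invented, tiny stand-in for the public-domain SOC knowledge base.
      KNOWLEDGE_BASE = {
          "47-2031": "carpenter builds installs wooden structures frameworks",
          "29-1141": "registered nurse patient care medication clinical",
          "17-2071": "electrical engineer circuits power systems design",
      }

      def best_soc_match(job_title):
          """Return the SOC code whose description tokens best match the job title."""
          title_tokens = job_title.lower().split()
          scores = {code: jaccard(title_tokens, desc.split())
                    for code, desc in KNOWLEDGE_BASE.items()}
          return max(scores, key=scores.get), scores

      print(best_soc_match("nurse providing patient care"))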

  11. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The following codes are considered: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section, with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes. However, the codes have been further developed to extend their capabilities.

  12. Design of convolutional tornado code

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environment and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and reduce computational complexity by abrogating the multi-level structure. The simulation results show that cTN code can provide a better packet loss protection performance with lower computation complexity than tTN code.

  13. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    This paper investigates a class of decomposable codes and their distance and structural properties. It is shown that this class includes several classes of well-known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft-decision decoding scheme for decomposable codes, their translates, or unions of translates is devised. This two-stage soft-decision decoding is suboptimum and provides an excellent trade-off between error performance and decoding complexity for codes of moderate and long block length.

  14. Nucleosynthesis in the Hyades Open Cluster: Evidence for the Enhanced Depletion of 12C

    NASA Astrophysics Data System (ADS)

    Schuler, Simon C.; King, Jeremy R.; The, Lih-Sin

    2010-03-01

    We present the results of a light element abundance analysis of three solar-type main sequence (MS) dwarfs and three red giant branch (RGB) clump stars in the Hyades open cluster using high-resolution and high signal-to-noise spectroscopy. The CNO abundances of each group (MS or RGB) are in excellent star-to-star agreement and confirm that the giants have undergone first dredge-up mixing. The observed abundances are compared to predictions of a standard stellar model based on the Clemson-American University of Beirut (CAUB) stellar evolution code. The model reproduces the observed evolution of the N and O abundances, as well as the previously derived 12C/13C ratio, but it fails to predict the observed level of 12C depletion in the giants. More tellingly, the sum of the observed giant CNO abundances does not equal that of the dwarfs.

  15. Transcription impairment and cell migration defects in elongator-depleted cells: implication for familial dysautonomia.

    PubMed

    Close, Pierre; Hawkes, Nicola; Cornez, Isabelle; Creppe, Catherine; Lambert, Charles A; Rogister, Bernard; Siebenlist, Ulrich; Merville, Marie-Paule; Slaugenhaupt, Susan A; Bours, Vincent; Svejstrup, Jesper Q; Chariot, Alain

    2006-05-19

    Mutations in IKBKAP, encoding a subunit of Elongator, cause familial dysautonomia (FD), a severe neurodevelopmental disease with complex clinical characteristics. Elongator was previously linked not only with transcriptional elongation and histone acetylation but also with other cellular processes. Here, we used RNA interference (RNAi) and fibroblasts from FD patients to identify Elongator target genes and study the role of Elongator in transcription. Strikingly, whereas Elongator is recruited to both target and nontarget genes, only target genes display histone H3 hypoacetylation and progressively lower RNAPII density through the coding region in FD cells. Interestingly, several target genes encode proteins implicated in cell motility. Indeed, characterization of IKAP/hELP1 RNAi cells, FD fibroblasts, and neuronal cell-derived cells uncovered defects in this cellular function upon Elongator depletion. These results indicate that defects in Elongator function affect transcriptional elongation of several genes and that the ensuing cell motility deficiencies may underlie the neuropathology of FD patients.

  16. Construction of new quantum MDS codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes are constructed here. One set of parameters obtained in this paper achieves a much larger minimum distance than earlier work. The remaining quantum MDS codes constructed have large minimum distance and had not been explored previously.

  17. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.

  18. Depletion models can predict shorebird distribution at different spatial scales.

    PubMed

    Gill, J A; Sutherland, W J; Norris, K

    2001-02-22

    Predicting the impact of habitat change on populations requires an understanding of the number of animals that a given area can support. Depletion models enable predictions of the numbers of individuals an area can support from prey density and predator searching efficiency and handling time. Depletion models have been successfully employed to predict patterns of abundance over small spatial scales, but most environmental change occurs over large spatial scales. We test the ability of depletion models to predict abundance at a range of scales with black-tailed godwits, Limosa limosa islandica. From the type II functional response of godwits to their prey, we calculated the handling time and searching efficiency associated with these prey. These were incorporated in a depletion model, together with the density of available prey determined from surveys, in order to predict godwit abundance. Tests of these predictions with Wetland Bird Survey data from the British Trust for Ornithology showed significant correlations between predicted and observed densities at three scales: within mudflats, within estuaries and between estuaries. Depletion models can thus be powerful tools for predicting the population size that can be supported on sites at a range of scales. This greatly enhances our confidence in predictions of the consequences of environmental change.
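
    As an illustration of the machinery behind such depletion models, the sketch below combines a Holling type II functional response (searching efficiency a, handling time h) with a crude patch-depletion loop to estimate how many forager-days a patch can support. The functions and every parameter value are illustrative assumptions, not the quantities fitted in this study.

      # Minimal depletion-model sketch; all numbers are illustrative assumptions.
      def intake_rate(prey_density, a, h):
          """Holling type II functional response: prey items consumed per second."""
          return a * prey_density / (1.0 + a * h * prey_density)

      def forager_days_supported(p0, p_quit, a, h, area_m2, t_forage_s, n_birds):
          """Deplete prey density day by day until the giving-up density p_quit
          is reached; return the number of bird-days the patch supported."""
          p, bird_days = p0, 0
          while p > p_quit:
              eaten_per_bird = intake_rate(p, a, h) * t_forage_s   # items/bird/day
              p -= n_birds * eaten_per_bird / area_m2              # patch depletion
              bird_days += n_birds
          return bird_days

      if __name__ == "__main__":
          # Assumed values: 200 prey/m^2 start, quit at 50/m^2, a = 0.001 m^2/s,
          # h = 5 s per item, 1 ha patch, 6 h foraging per day, 20 birds.
          print(forager_days_supported(p0=200.0, p_quit=50.0, a=1e-3, h=5.0,
                                       area_m2=1e4, t_forage_s=6 * 3600, n_birds=20))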

  19. The association between controlled interpersonal affect regulation and resource depletion.

    PubMed

    Martínez-Íñigo, David; Poerio, Giulia Lara; Totterdell, Peter

    2013-07-01

    This investigation focuses on what occurs to individuals' self-regulatory resource during controlled Interpersonal Affect Regulation (IAR) which is the process of deliberately influencing the internal feeling states of others. Combining the strength model of self-regulation and the resources conservation model, the investigation tested whether: (1) IAR behaviors are positively related to ego-depletion because goal-directed behaviors demand self-regulatory processes, and (2) the use of affect-improving strategies benefits from a source of resource-recovery because it initiates positive feedback from targets, as proposed from a resource-conservation perspective. To test this, a lab study based on an experimental dual-task paradigm using a sample of pairs of friends in the UK and a longitudinal field study of a sample of healthcare workers in Spain were conducted. The experimental study showed a depleting effect of interpersonal affect-improving IAR on a subsequent self-regulation task. The field study showed that while interpersonal affect-worsening was positively associated with depletion, as indicated by the level of emotional exhaustion, interpersonal affect-improving was only associated with depletion after controlling for the effect of positive feedback from clients. The findings indicate that IAR does have implications for resource depletion, but that social reactions play a role in the outcome. © 2013 The Authors. Applied Psychology: Health and Well-Being © 2013 The International Association of Applied Psychology.

  20. Self-regulatory depletion increases emotional reactivity in the amygdala.

    PubMed

    Wagner, Dylan D; Heatherton, Todd F

    2013-04-01

    The ability to self-regulate can become impaired when people are required to engage in successive acts of effortful self-control, even when self-control occurs in different domains. Here, we used functional neuroimaging to test whether engaging in effortful inhibition in the cognitive domain would lead to putative dysfunction in the emotional domain. Forty-eight participants viewed images of emotional scenes during functional magnetic resonance imaging in two sessions that were separated by a challenging attention control task that required effortful inhibition (depletion group) or not (control group). Compared to the control group, depleted participants showed increased activity in the left amygdala to negative but not to positive or neutral scenes. Moreover, whereas the control group showed reduced amygdala activity to all scene types (i.e. habituation), the depletion group showed increased amygdala activity relative to their pre-depletion baseline; however this was only significant for negative scenes. Finally, depleted participants showed reduced functional connectivity between the left amygdala and ventromedial prefrontal cortex during negative scene processing. These findings demonstrate that consuming self-regulatory resources leads to an exaggerated neural response to emotional material that appears specific to negatively valenced stimuli and further suggests a failure to recruit top-down prefrontal regions involved in emotion regulation.

  1. Self-regulatory depletion increases emotional reactivity in the amygdala

    PubMed Central

    Heatherton, Todd F.

    2013-01-01

    The ability to self-regulate can become impaired when people are required to engage in successive acts of effortful self-control, even when self-control occurs in different domains. Here, we used functional neuroimaging to test whether engaging in effortful inhibition in the cognitive domain would lead to putative dysfunction in the emotional domain. Forty-eight participants viewed images of emotional scenes during functional magnetic resonance imaging in two sessions that were separated by a challenging attention control task that required effortful inhibition (depletion group) or not (control group). Compared to the control group, depleted participants showed increased activity in the left amygdala to negative but not to positive or neutral scenes. Moreover, whereas the control group showed reduced amygdala activity to all scene types (i.e. habituation), the depletion group showed increased amygdala activity relative to their pre-depletion baseline; however this was only significant for negative scenes. Finally, depleted participants showed reduced functional connectivity between the left amygdala and ventromedial prefrontal cortex during negative scene processing. These findings demonstrate that consuming self-regulatory resources leads to an exaggerated neural response to emotional material that appears specific to negatively valenced stimuli and further suggests a failure to recruit top–down prefrontal regions involved in emotion regulation. PMID:22842815

  2. Antarctic winter mercury and ozone depletion events over sea ice

    NASA Astrophysics Data System (ADS)

    Nerentorp Mastromonaco, M.; Gårdfeldt, K.; Jourdain, B.; Abrahamsson, K.; Granfors, A.; Ahnoff, M.; Dommergue, A.; Méjean, G.; Jacobi, H.-W.

    2016-03-01

    During atmospheric mercury and ozone depletion events in the springtime in polar regions, gaseous elemental mercury and ozone undergo rapid declines. Mercury is quickly transformed into oxidation products, which are subsequently removed by deposition. Here we show that such events also occur during Antarctic winter over sea ice areas, leading to additional deposition of mercury. Over four months in the Weddell Sea we measured gaseous elemental, oxidized, and particulate-bound mercury, as well as ozone in the troposphere and total and elemental mercury concentrations in snow, demonstrating a series of depletion and deposition events between July and September. The winter depletions in July were characterized by stronger correlations between mercury and ozone and larger formation of particulate-bound mercury in air compared to later spring events. It appears that light at large solar zenith angles is sufficient to initiate the photolytic formation of halogen radicals. We also propose a dark mechanism that could explain observed events in air masses coming from dark regions. Br2, which could be the main actor under dark conditions, was possibly formed at high concentrations in the dark marine boundary layer. These high concentrations may also have caused the formation of high concentrations of CHBr3 and CH2I2 in the top layers of the Antarctic sea ice observed during winter. These new findings show that the extent of depletion events is larger than previously believed and that winter depletions result in additional deposition of mercury that could be transferred to marine and terrestrial ecosystems.

  3. Cholesterol depletion impairs contractile machinery in neonatal rat cardiomyocytes

    PubMed Central

    Hissa, Barbara; Oakes, Patrick W.; Pontes, Bruno; Ramírez-San Juan, Guillermina; Gardel, Margaret L.

    2017-01-01

    Cholesterol regulates numerous cellular processes. Depleting its synthesis in skeletal myofibers induces vacuolization and contraction impairment. However, little is known about how cholesterol reduction affects cardiomyocyte behavior. Here, we deplete cholesterol by incubating neonatal cardiomyocytes with methyl-beta-cyclodextrin. Traction force microscopy shows that lowering cholesterol increases the rate of cell contraction and generates defects in cell relaxation. Cholesterol depletion also increases membrane tension, Ca2+ spike frequency and intracellular Ca2+ concentration. These changes can be correlated with modifications in caveolin-3 and L-type Ca2+ channel distributions across the sarcolemma. Channel regulation is also compromised, since cAMP-dependent PKA activity is enhanced, increasing the probability of L-type Ca2+ channel opening events. Immunofluorescence reveals that cholesterol depletion abrogates sarcomeric organization, changing the spacing and alignment of α-actinin bands due to an increase in the proteolytic activity of calpain. We propose a mechanism in which cholesterol depletion triggers a signaling cascade, culminating in contraction impairment and myofibril disruption in cardiomyocytes. PMID:28256617

  4. Global Depletion of Groundwater Resources: Past and Future Analyses

    NASA Astrophysics Data System (ADS)

    Bierkens, M. F.; de Graaf, I. E. M.; Van Beek, L. P.; Wada, Y.

    2014-12-01

    Globally, about 17% of the crops are irrigated, yet irrigation accounts for 40% of the global food production. As more than 40% of irrigation water comes from groundwater, groundwater abstraction rates are large and exceed natural recharge rates in many regions of the world, thus leading to groundwater depletion. In this paper we provide an overview of recent research on global groundwater depletion. We start by presenting various estimates of global groundwater depletion, obtained from both flux-based and volume-based methods. We also present estimates of the contribution of non-renewable groundwater to irrigation water consumption and how this contribution developed during the last 50 years. Next, using a flux-based method, we provide projections of groundwater depletion for the coming century under various socio-economic and climate scenarios. As groundwater depletion contributes to sea-level rise, we also provide estimates of this contribution from the past as well as for future scenarios. Finally, we show recent results of groundwater level changes and changes in river flow as a result of global groundwater abstractions as obtained from a global groundwater flow model.

  5. Recovery of the Ozone Layer: The Ozone Depleting Gas Index

    NASA Astrophysics Data System (ADS)

    Hofmann, David J.; Montzka, Stephen A.

    2009-01-01

    The stratospheric ozone layer, through absorption of solar ultraviolet radiation, protects all biological systems on Earth. In response to concerns over the depletion of the global ozone layer, the U.S. Clean Air Act as amended in 1990 mandates that NASA and NOAA monitor stratospheric ozone and ozone-depleting substances. This information is critical for assessing whether the Montreal Protocol on Substances That Deplete the Ozone Layer, an international treaty that entered into force in 1989 to protect the ozone layer, is having its intended effect of mitigating increases in harmful ultraviolet radiation. To provide the information necessary to satisfy this congressional mandate, both NASA and NOAA have instituted and maintained global monitoring programs to keep track of ozone-depleting gases as well as ozone itself. While data collected for the past 30 years have been used extensively in international assessments of ozone layer depletion science, the language of scientists often eludes the average citizen who has a considerable interest in the health of Earth's protective ultraviolet radiation shield. Are the ozone-destroying chemicals declining in the atmosphere? When will these chemicals decline to pre-ozone hole levels so that the Antarctic ozone hole might disappear? Will this timing be different in the stratosphere above midlatitudes?

  6. Wall depletion length of a channel-confined polymer

    NASA Astrophysics Data System (ADS)

    Cheong, Guo Kang; Li, Xiaolan; Dorfman, Kevin D.

    2017-02-01

    Numerous experiments have taken advantage of DNA as a model system to test theories for a channel-confined polymer. A tacit assumption in analyzing these data is the existence of a well-defined depletion length characterizing DNA-wall interactions such that the experimental system (a polyelectrolyte in a channel with charged walls) can be mapped to the theoretical model (a neutral polymer with hard walls). We test this assumption using pruned-enriched Rosenbluth method (PERM) simulations of a DNA-like semiflexible polymer confined in a tube. The polymer-wall interactions are modeled by augmenting a hard wall interaction with an exponentially decaying, repulsive soft potential. The free energy, mean span, and variance in the mean span obtained in the presence of a soft wall potential are compared to equivalent simulations in the absence of the soft wall potential to determine the depletion length. We find that the mean span and the variance about the mean span have the same depletion length for all soft potentials we tested. In contrast, the depletion length for the confinement free energy approaches that for the mean span only when the depletion length no longer depends on the channel size. The results have implications for the interpretation of DNA confinement experiments under low ionic strengths.

  7. Bond rupture between colloidal particles with a depletion interaction

    SciTech Connect

    Whitaker, Kathryn A.; Furst, Eric M.

    2016-05-15

    The force required to break the bonds of a depletion gel is measured by dynamically loading pairs of colloidal particles suspended in a solution of a nonadsorbing polymer. Sterically stabilized poly(methyl methacrylate) colloids that are 2.7 μm in diameter are brought into contact in a solvent mixture of cyclohexane-cyclohexyl bromide and polystyrene polymer depletant. The particle pairs are subjected to a tensile load at a constant loading rate over many approach-retraction cycles. The stochastic nature of the thermal rupture events results in a distribution of bond rupture forces whose average magnitude and variance increase with increasing depletant concentration. The measured force distribution is described by the flux of particle pairs sampling the energy barrier of the bond interaction potential based on the Asakura–Oosawa depletion model. A transition state model demonstrates the significance of lubrication hydrodynamic interactions and the effect of the applied loading rate on the rupture force of bonds in a depletion gel.
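
    For reference, the Asakura–Oosawa depletion model mentioned in this record has a standard closed form for the pair potential between two hard spheres in a dilute bath of non-adsorbing depletants; the sketch below evaluates that textbook form with illustrative parameter values, not those of the experiment.

      import math

      def ao_depletion_potential(r, R, delta, n_p, kT=1.0):
          """Asakura-Oosawa pair potential (in units of kT) between two hard spheres
          of radius R at centre-to-centre distance r, in a bath of depletants of
          radius delta and number density n_p."""
          if r < 2.0 * R:
              return math.inf                      # hard-core overlap
          if r >= 2.0 * (R + delta):
              return 0.0                           # depletion zones no longer overlap
          Rd = R + delta
          # Overlap (lens) volume of two spheres of radius R + delta at separation r
          v_overlap = (4.0 * math.pi / 3.0) * Rd**3 * (
              1.0 - 3.0 * r / (4.0 * Rd) + r**3 / (16.0 * Rd**3))
          return -n_p * kT * v_overlap             # attraction = -(osmotic pressure) x overlap

      if __name__ == "__main__":
          R, delta = 1.35e-6, 50e-9                # 2.7 um colloid; assumed 50 nm depletant
          n_p = 5e19                               # assumed depletant number density, 1/m^3
          for gap in (0.0, 25e-9, 50e-9):
              print(gap, ao_depletion_potential(2 * R + gap, R, delta, n_p))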

  8. A TEST OF PRE-MAIN-SEQUENCE LITHIUM DEPLETION MODELS

    SciTech Connect

    Yee, Jennifer C.; Jensen, Eric L. N.

    2010-03-01

    Despite the extensive study of lithium depletion during pre-main-sequence (PMS) contraction, studies of individual stars show discrepancies between ages determined from the Hertzsprung-Russell (H-R) diagram and ages determined from lithium depletion, indicating open questions in the PMS evolutionary models. To further test these models, we present high-resolution spectra for members of the beta Pictoris Moving Group (BPMG), which is young and nearby. We measure equivalent widths of the 6707.8 Å Li I line in these stars and use them to determine lithium abundances. We combine the lithium abundance with the predictions of PMS evolutionary models in order to calculate a lithium depletion age for each star. We compare this age to the age predicted by the H-R diagram of the same model. We find that the evolutionary models underpredict the amount of lithium depletion for the BPMG given its nominal H-R diagram age of ≈12 Myr, particularly for the mid-M stars, which have no observable Li I line. This results in systematically older ages calculated from lithium depletion isochrones than from the H-R diagram. We suggest that this discrepancy may be related to the discrepancy between measured M-dwarf radii and the smaller radii predicted by evolutionary models.

  9. Examining depletion theories under conditions of within-task transfer.

    PubMed

    Brewer, Gene A; Lau, Kevin K H; Wingert, Kimberly M; Ball, B Hunter; Blais, Chris

    2017-07-01

    In everyday life, mental fatigue can be detrimental across many domains including driving, learning, and working. Given the importance of understanding and accounting for the deleterious effects of mental fatigue on behavior, a growing body of literature has studied the role of motivational and executive control processes in mental fatigue. In typical laboratory paradigms, participants complete a task that places demand on these self-control processes and are later given a subsequent task. Generally speaking, decrements to subsequent task performance are taken as evidence that the initial task created mental fatigue through the continued engagement of motivational and executive functions. Several models have been developed to account for negative transfer resulting from this "ego depletion." In the current study, we provide a brief literature review, specify current theoretical approaches to ego-depletion, and report an empirical test of current models of depletion. Across 4 experiments we found minimal evidence for executive control depletion along with strong evidence for motivation mediated ego depletion. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. About ozone depletion in stratosphere over Brazil in last decade

    NASA Astrophysics Data System (ADS)

    Martin, Inácio M.; Imai, Takeshi; Seguchi, Tomio

    The depletion of stratospheric ozone, resulting from the emission of chlorofluorocarbons (CFCs), has become a major issue since 1980. The decrease in stratospheric ozone over the polar regions has been more pronounced at the South Pole than at the North Pole. In mid-latitude and equatorial regions, ozone depletion becomes less important; it depends on seasonal effects and on the characteristics of a particular region. The detailed mechanism by which the polar ozone holes form is different from that for the mid-latitude thinning, but the most important process in both trends is the catalytic destruction of ozone by atomic chlorine and bromine. The main source of these halogen atoms in the stratosphere is photodissociation of CFC compounds, commonly called freons, and of bromofluorocarbon compounds known as halons. These compounds are transported into the stratosphere after being emitted at the surface. Both ozone depletion mechanisms strengthened as emissions of CFCs and halons increased [1]. Measurements of stratospheric ozone carried out at several locations in Brazil and at the South Pole in the last decade (1996-2005), using detectors placed on the ground, on stratospheric balloons and on Earth Probe TOMS satellites, are presented here. A detailed series analysis from 1980 up to the present describes a mean ozone depletion of 4 [1]. [1] http://en.wikipedia.org/wiki/Ozone/depletion

  11. Surface depletion induced quantum confinement in CdS nanobelts.

    PubMed

    Li, Dehui; Zhang, Jun; Xiong, Qihua

    2012-06-26

    We investigate the surface depletion induced quantum confinement in CdS nanobelts beyond the quantum confinement regime, where the thickness is much larger than the bulk exciton Bohr radius. From room temperature to 77 K, the emission energy of free exciton A scales linearly with 1/L² when the thickness L is less than 100 nm, while a deviation occurs for belts thicker than 100 nm due to the reabsorption effect. The 1/L² dependence can be explained by surface depletion induced quantum confinement, which modifies the confinement potential, leading to a quasi-square potential well narrower than the geometric thickness of the nanobelts and giving rise to a confinement effect on exciton emission beyond the quantum confinement regime. The surface depletion is sensitive to carrier concentration and surface states. As the temperature decreases, the electrostatic potential drop in the surface depletion region decreases because of the decrease in carrier concentration, leading to weaker confinement. With a layer of polymethyl methacrylate (PMMA) passivation, PL spectra exhibit pronounced red shifts due to the decrease of the surface states at room temperature. No shift is found at 10 K either with or without PMMA passivation, suggesting a much weaker depletion field due to the freezing-out of donors.
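
    As a rough illustration of the 1/L² scaling discussed in this record, the sketch below evaluates the ground-state energy of an infinite square well of effective width L for an assumed carrier effective mass; both the effective mass and the widths are order-of-magnitude assumptions, not values from the paper.

      import math

      HBAR = 1.054571817e-34   # J*s
      M_E = 9.1093837015e-31   # kg
      EV = 1.602176634e-19     # J

      def confinement_energy_ev(L_eff_m, m_eff=0.2 * M_E):
          """Ground-state particle-in-a-box energy E = (hbar*pi)^2 / (2 m* L^2)."""
          return (HBAR * math.pi / L_eff_m) ** 2 / (2.0 * m_eff) / EV

      for L_nm in (20, 50, 100):
          print(L_nm, "nm ->", round(confinement_energy_ev(L_nm * 1e-9) * 1e3, 2), "meV")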

  12. Magnetic flux pileup and plasma depletion in Mercury's subsolar magnetosheath

    NASA Astrophysics Data System (ADS)

    Gershman, Daniel J.; Slavin, James A.; Raines, Jim M.; Zurbuchen, Thomas H.; Anderson, Brian J.; Korth, Haje; Baker, Daniel N.; Solomon, Sean C.

    2013-11-01

    Data from the Fast Imaging Plasma Spectrometer (FIPS) and Magnetometer (MAG) on the MErcury Surface, Space ENvironment, GEochemistry, and Ranging spacecraft during 40 orbits about Mercury are used to characterize the plasma depletion layer just exterior to the planet's dayside magnetopause. A plasma depletion layer forms at Mercury as a result of piled-up magnetic flux that is draped around the magnetosphere. The low average upstream Alfvénic Mach number (MA ~3-5) in the solar wind at Mercury often results in large-scale plasma depletion in the magnetosheath between the subsolar magnetopause and the bow shock. Flux pileup is observed to occur downstream under both quasi-perpendicular and quasi-parallel shock geometries for all orientations of the interplanetary magnetic field (IMF). Furthermore, little to no plasma depletion is seen during some periods with stable northward IMF. The consistently low value of plasma β, the ratio of plasma pressure to magnetic pressure, at the magnetopause associated with the low average upstream MA is believed to be the cause of the high average reconnection rate at Mercury, reported to be nearly 3 times that observed at Earth. Finally, a characteristic depletion length outward from the subsolar magnetopause of ~300 km is found for Mercury. This value scales among planetary bodies as the average standoff distance of the magnetopause.

  13. Calculating Time-Integral Quantities in Depletion Calculations

    SciTech Connect

    Isotalo, Aarno

    2016-06-02

    A method referred to as tally nuclides is presented for accurately and efficiently calculating the time-step averages and integrals of any quantities that are weighted sums of atomic densities with constant weights during the step. The method allows all such quantities to be calculated simultaneously as a part of a single depletion solution with existing depletion algorithms. Some examples of the results that can be extracted include step-average atomic densities and macroscopic reaction rates, the total number of fissions during the step, and the amount of energy released during the step. Furthermore, the method should be applicable with several depletion algorithms, and the integrals or averages should be calculated with an accuracy comparable to that reached by the selected algorithm for end-of-step atomic densities. The accuracy of the method is demonstrated in depletion calculations using the Chebyshev rational approximation method. Here, we demonstrate how the ability to calculate energy release in depletion calculations can be used to determine the accuracy of the normalization in a constant-power burnup calculation during the calculation without a need for a reference solution.
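
    The augmented-system idea described in this record can be pictured with a toy two-nuclide problem: append a pseudo-nuclide whose production rate is the constant-weight sum of the real densities, and a single solution of the extended system returns both the end-of-step densities and the time integral of that sum. In the sketch below a plain matrix exponential stands in for CRAM, and the burnup matrix, weights and step length are illustrative assumptions.

      import numpy as np
      from scipy.linalg import expm

      # Toy depletion system dN/dt = A N for two nuclides (rates in 1/s)
      A = np.array([[-1.0e-4,  0.0],
                    [ 1.0e-4, -2.0e-5]])
      N0 = np.array([1.0e20, 0.0])        # beginning-of-step atomic densities
      w = np.array([3.0, 1.5])            # constant weights (e.g. energy per reaction)

      # Augment with a "tally nuclide" row that accumulates w . N(t)
      A_aug = np.zeros((3, 3))
      A_aug[:2, :2] = A
      A_aug[2, :2] = w

      dt = 3.0e5                          # step length (s)
      y = expm(A_aug * dt) @ np.append(N0, 0.0)

      N_end = y[:2]                       # end-of-step densities
      integral = y[2]                     # integral of w . N over the step
      print(N_end, integral, integral / dt)   # last value is the step average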

  14. Neutron irradiation test of depleted CMOS pixel detector prototypes

    NASA Astrophysics Data System (ADS)

    Mandić, I.; Cindro, V.; Gorišek, A.; Hiti, B.; Kramberger, G.; Mikuž, M.; Zavrtanik, M.; Hemperek, T.; Daas, M.; Hügging, F.; Krüger, H.; Pohl, D.-L.; Wermes, N.; Gonella, L.

    2017-02-01

    Charge collection properties of depleted CMOS pixel detector prototypes produced on a p-type substrate of 2 kΩ cm initial resistivity (in the LFoundry 150 nm process) were studied using the Edge-TCT method before and after neutron irradiation. The test structures were produced for investigation of CMOS technology in tracking detectors for experiments at the HL-LHC upgrade. Measurements were made with passive detector structures in which current pulses induced on charge collecting electrodes could be directly observed. The thickness of the depleted layer was estimated and studied as a function of neutron irradiation fluence. An increase of the depletion thickness was observed after the first two irradiation steps to 1·10¹³ n/cm² and 5·10¹³ n/cm² and attributed to initial acceptor removal. At higher fluences the depletion thickness at a given voltage decreases with increasing fluence because of radiation-induced defects contributing to the effective space charge concentration. The behaviour is consistent with that of high-resistivity silicon used for standard particle detectors. The measured thickness of the depleted layer after irradiation with 1·10¹⁵ n/cm² is more than 50 μm at 100 V bias. This is sufficient to guarantee satisfactory signal/noise performance in the outer layers of pixel trackers in HL-LHC experiments.
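
    As background to the depletion-depth behaviour reported here, the sketch below combines the textbook one-sided abrupt-junction depletion width with a commonly used acceptor-removal parametrisation of the effective doping versus fluence. The initial doping, removal constant and introduction rate are assumptions chosen only to reproduce the qualitative trend (depth first increasing, then decreasing), not fitted values from this measurement.

      import math

      Q = 1.602e-19                 # C
      EPS_SI = 11.9 * 8.854e-14     # F/cm

      def depletion_depth_um(v_bias, n_eff_cm3):
          """One-sided junction: w = sqrt(2*eps*V / (q*|Neff|)), returned in micrometres."""
          return math.sqrt(2.0 * EPS_SI * v_bias / (Q * abs(n_eff_cm3))) * 1.0e4

      def n_eff(fluence, na0=7e12, n_removable=5e12, c=2e-14, g=0.02):
          """Assumed effective doping (1/cm^3) vs neutron fluence (1/cm^2):
          initial acceptor removal followed by stable acceptor introduction."""
          return na0 - n_removable * (1.0 - math.exp(-c * fluence)) + g * fluence

      for phi in (0.0, 1e13, 5e13, 1e15):
          print(f"{phi:9.1e} n/cm^2 -> w(100 V) ~ {depletion_depth_um(100.0, n_eff(phi)):5.0f} um")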

  15. Cadmium Depletion Impacts on Hardening Neutron Spectrum for Advanced Fuel Testing in ATR

    SciTech Connect

    Gray S. Chang

    2011-05-01

    The most effective way to transmute long-lived isotopes contained in spent nuclear fuel into shorter-lived fission products is in a fast neutron spectrum reactor. In the absence of a fast spectrum test reactor in the United States of America (USA), initial irradiation testing of candidate fuels can be performed in a thermal test reactor that has been modified to produce a test region with a hardened neutron spectrum. Such a test region is achieved with a cadmium (Cd) filter, which can harden the neutron spectrum to one similar to (although still somewhat softer than) that of the liquid metal fast breeder reactor (LMFBR). A fuel test loop with a Cd filter has been installed within the East Flux Trap (EFT) of the Advanced Test Reactor (ATR) at the Idaho National Laboratory (INL). Detailed comparison analyses between the Cd-filter-hardened neutron spectrum in the ATR and the LMFBR fast neutron spectrum have been performed using MCWO. MCWO is a set of scripting tools used to couple the Monte Carlo transport code MCNP with the isotope depletion and buildup code ORIGEN-2.2. The MCWO-calculated results indicate that the Cd filter can effectively flatten the rim effect and reduce the linear heat generation rate (LHGR) to meet the advanced fuel testing project requirements at the beginning of irradiation (BOI). However, the filtering capability of Cd as a strong absorber depletes quickly over time, and the Cd filter must be replaced after every two typical operating cycles within the EFT of the ATR. The designed Cd filter can effectively depress the LHGR in experimental fuels and harden the neutron spectrum enough to adequately flatten the rim effect in the test region.
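
    The transport/depletion coupling that tools such as MCWO automate can be pictured with the schematic loop below. The helper functions run_transport and run_depletion are hypothetical stubs standing in for MCNP and ORIGEN-2.2 runs (writing input decks, launching the codes and parsing their output); nothing here reproduces the actual MCWO scripts or the codes' input formats, and the material data are illustrative.

      def run_transport(materials):
          """Hypothetical stub for an MCNP run: return per-material flux tallies."""
          return {name: {"flux": 1.0e14} for name in materials}

      def run_depletion(composition, tallies, days):
          """Hypothetical stub for an ORIGEN-2.2 run: return updated isotopics."""
          burn = tallies["flux"] * days * 1.0e-22        # toy burnup proxy
          return {iso: n * (1.0 - burn) for iso, n in composition.items()}

      def couple(materials, step_days, n_steps):
          """Alternate transport and depletion solutions over n_steps burn steps."""
          for step in range(n_steps):
              tallies = run_transport(materials)                      # transport step
              materials = {name: run_depletion(comp, tallies[name], step_days)
                           for name, comp in materials.items()}       # depletion step
              print(f"after step {step + 1}:", materials)
          return materials

      if __name__ == "__main__":
          couple({"fuel": {"U235": 1.0e21, "U238": 2.0e22}}, step_days=50.0, n_steps=3)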

  16. A class of constacyclic BCH codes and new quantum codes

    NASA Astrophysics Data System (ADS)

    liu, Yang; Li, Ruihu; Lv, Liangdong; Ma, Yuena

    2017-03-01

    Constacyclic BCH codes have been widely studied in the literature and have been used to construct quantum codes in recent years. However, for the class of quantum codes of length n = q^{2m} + 1 over F_{q^2} with q an odd prime power, only codes of distance δ ≤ 2q^2 have been obtained in the literature. In this paper, through a detailed analysis of the properties of q^2-ary cyclotomic cosets, the maximum designed distance δ_{max} of a class of Hermitian dual-containing constacyclic BCH codes of length n = q^{2m} + 1 is determined; this class of constacyclic codes has characteristics analogous to those of primitive BCH codes over F_{q^2}. We then obtain a sequence of dual-containing constacyclic codes of designed distances 2q^2 < δ ≤ δ_{max}. Consequently, new quantum codes with distance d > 2q^2 can be constructed from these dual-containing codes via the Hermitian construction. These newly obtained quantum codes have better code rates than those constructed from primitive BCH codes.

  17. New optimal asymmetric quantum codes from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Zhang, Guanghui; Chen, Bocong; Li, Liangchen

    2014-06-01

    In this paper, we construct two classes of asymmetric quantum codes by using constacyclic codes. The first class is the asymmetric quantum codes with parameters [[q² + 1, q² + 1 - 2(t + k + 1), (2k + 2)/(2t + 2)

  18. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.

  19. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.

  20. Simulation of groundwater conditions and streamflow depletion to evaluate water availability in a Freeport, Maine, watershed

    USGS Publications Warehouse

    Nielsen, Martha G.; Locke, Daniel B.

    2012-01-01

    , the public-supply withdrawals (105.5 million gallons per year (Mgal/yr)) were much greater than those for any other category, being almost 7 times greater than all domestic well withdrawals (15.3 Mgal/yr). Industrial withdrawals in the study area (2.0 Mgal/yr) are mostly by a company that withdraws from an aquifer at the edge of the Merrill Brook watershed. Commercial withdrawals are very small (1.0 Mgal/yr), and no irrigation or other agricultural withdrawals were identified in this study area. A three-dimensional, steady-state groundwater-flow model was developed to evaluate stream-aquifer interactions and streamflow depletion from pumping, to help refine the conceptual model, and to predict changes in streamflow resulting from changes in pumping and recharge. Groundwater levels and flow in the Freeport aquifer study area were simulated with the three-dimensional, finite-difference groundwater-flow modeling code, MODFLOW-2005. Study area hydrology was simulated with a 3-layer model, under steady-state conditions. The groundwater model was used to evaluate changes that could occur in the water budgets of three parts of the local hydrologic system (the Harvey Brook watershed, the Merrill Brook watershed, and the buried aquifer from which pumping occurs) under several different climatic and pumping scenarios. The scenarios were (1) no pumping well withdrawals; (2) current (2009) pumping, but simulated drought conditions (20-percent reduction in recharge); (3) current (2009) recharge, but a 50-percent increase in pumping well withdrawals for public supply; and (4) drought conditions and increased pumping combined. In simulated drought situations, the overall recharge to the buried valley is about 15 percent less and the total amount of streamflow in the model area is reduced by about 19 percent. Without pumping, infiltration to the buried valley aquifer around the confining unit decreased by a small amount (0.05 million gallons per day (Mgal/d)), and discharge to the

  1. On lossless coding for HEVC

    NASA Astrophysics Data System (ADS)

    Gao, Wen; Jiang, Minqiang; Yu, Haoping

    2013-02-01

    In this paper, we first review the lossless coding mode in version 1 of the HEVC standard, which has recently been finalized. We then provide a performance comparison between the lossless coding modes in the HEVC and MPEG-AVC/H.264 standards and show that HEVC lossless coding has limited coding efficiency. To improve the performance of the lossless coding mode, several new coding tools that were contributed to JCT-VC but not adopted in version 1 of the HEVC standard are introduced. In particular, we discuss sample-based intra prediction and coding of residual coefficients in more detail. At the end, we briefly address a new class of coding tools, i.e., a dictionary-based coder, that is efficient in encoding screen content including graphics and text.

  2. Summary of 1990 Code Conference

    SciTech Connect

    Cooper, R.K.; Chan, Kwok-Chi D.

    1990-01-01

    The Conference on Codes and the Linear Accelerator Community was held in Los Alamos in January 1990 and had approximately 100 participants. This conference was the second in a series whose goal is the exchange of information about codes and code practices among those writing and actually using these codes for the design and analysis of linear accelerators and their components. The first conference was held in San Diego in January 1988 and concentrated on beam dynamics codes and Maxwell solvers. This most recent conference concentrated on 3-D codes and techniques to handle the large amounts of data required for three-dimensional problems. In addition to descriptions of codes, their algorithms and implementations, there were a number of papers describing the use of many of the codes. Proceedings of both these conferences are available. 3 refs., 2 tabs.

  3. ENSDF ANALYSIS AND UTILITY CODES.

    SciTech Connect

    BURROWS, T.

    2005-04-04

    The ENSDF analysis and checking codes are briefly described, along with their uses with various types of ENSDF datasets. For more information on the programs see the "Read Me" entries and other documentation associated with each code.

  4. Depletion induced clustering of red blood cells in microchannels

    NASA Astrophysics Data System (ADS)

    Wagner, Christian; Brust, Mathias; Podgorski, Thomas; Coupier, Gwennou

    2012-11-01

    The flow properties of blood are determined by the physical properties of its main constituents, the red blood cells (RBCs). At low shear rates RBCs form aggregates, so-called rouleaux. Higher shear rates can break them up, and the viscosity of blood shows a shear-thinning behavior. The physical origin of rouleaux formation is not yet fully resolved, and there are two competing models available. One predicts that the adhesion is induced by bridging of the plasma (macromolecular) proteins in between two RBCs. The other is based on the depletion effect and thus predicts the absence of macromolecules in between the cells of a rouleau. Recent single-cell force measurements using an AFM strongly support the depletion model. By varying the concentration of dextran at different molecular weights we can control the adhesion strength. Measurements at low hematocrit in a microfluidic channel show that the number and size of clusters are determined by the depletion-induced adhesion strength.

  5. Depleted uranium as a backfill for nuclear fuel waste package

    DOEpatents

    Forsberg, C.W.

    1998-11-03

    A method is described for packaging spent nuclear fuel for long-term disposal in a geological repository. At least one spent nuclear fuel assembly is first placed in an unsealed waste package and a depleted uranium fill material is added to the waste package. The depleted uranium fill material comprises flowable particles having a size sufficient to substantially fill any voids in and around the assembly and contains isotopically-depleted uranium in the +4 valence state in an amount sufficient to inhibit dissolution of the spent nuclear fuel from the assembly into a surrounding medium and to lessen the potential for nuclear criticality inside the repository in the event of failure of the waste package. Last, the waste package is sealed, thereby substantially reducing the release of radionuclides into the surrounding medium, while simultaneously providing radiation shielding and increased structural integrity of the waste package. 6 figs.

  6. Processable high internal phase Pickering emulsions using depletion attraction.

    PubMed

    Kim, KyuHan; Kim, Subeen; Ryu, Jiheun; Jeon, Jiyoon; Jang, Se Gyu; Kim, Hyunjun; Gweon, Dae-Gab; Im, Won Bin; Han, Yosep; Kim, Hyunjung; Choi, Siyoung Q

    2017-02-01

    High internal phase emulsions have been widely used as templates for various porous materials, but special strategies are required to form, in particular, particle-covered ones that have been more difficult to obtain. Here, we report a versatile strategy to produce a stable high internal phase Pickering emulsion by exploiting a depletion interaction between an emulsion droplet and a particle using water-soluble polymers as a depletant. This attractive interaction facilitates the adsorption of particles onto the droplet interface and simultaneously suppresses desorption once the particles are adsorbed. This technique can be universally applied to nearly any kind of particle to stabilize an interface with the help of various non- or weakly adsorbing polymers as a depletant, which can be solidified to provide porous materials for many applications.

  7. Depletion potential in colloidal mixtures of hard spheres and platelets.

    PubMed

    Harnau, L; Dietrich, S

    2004-05-01

    The depletion potential between two hard spheres in a solvent of thin hard disclike platelets is investigated by using either the Derjaguin approximation or density functional theory. Particular attention is paid to the density dependence of the depletion potential. A second-order virial approximation is applied, which yields nearly exact results for the bulk properties of the hard-platelet fluid at densities half that of the isotropic fluid at isotropic-nematic phase coexistence. As the platelet density increases, the attractive primary minimum of the depletion potential deepens and an additional small repulsive barrier develops at larger sphere separations. Upon decreasing the ratio of the sphere radius to the platelet radius, the primary minimum diminishes and the position of the small repulsive barrier shifts to smaller sphere separations.

  8. Processable high internal phase Pickering emulsions using depletion attraction

    NASA Astrophysics Data System (ADS)

    Kim, Kyuhan; Kim, Subeen; Ryu, Jiheun; Jeon, Jiyoon; Jang, Se Gyu; Kim, Hyunjun; Gweon, Dae-Gab; Im, Won Bin; Han, Yosep; Kim, Hyunjung; Choi, Siyoung Q.

    2017-02-01

    High internal phase emulsions have been widely used as templates for various porous materials, but special strategies are required to form, in particular, particle-covered ones that have been more difficult to obtain. Here, we report a versatile strategy to produce a stable high internal phase Pickering emulsion by exploiting a depletion interaction between an emulsion droplet and a particle using water-soluble polymers as a depletant. This attractive interaction facilitates the adsorption of particles onto the droplet interface and simultaneously suppresses desorption once the particles are adsorbed. This technique can be universally applied to nearly any kind of particle to stabilize an interface with the help of various non- or weakly adsorbing polymers as a depletant, which can be solidified to provide porous materials for many applications.

  9. Effect of Shim Arm Depletion in the NBSR

    SciTech Connect

    Hanson A. H.; Brown N.; Diamond, D.J.

    2013-02-22

    The cadmium shim arms in the NBSR undergo burnup during reactor operation and hence, require periodic replacement. Presently, the shim arms are replaced after every 25 cycles to guarantee they can maintain sufficient shutdown margin. Two prior reports document the expected change in the 113Cd distribution because of the shim arm depletion. One set of calculations was for the present high-enriched uranium fuel and the other for the low-enriched uranium fuel when it was in the COMP7 configuration (7 inch fuel length vs. the present 11 inch length). The depleted 113Cd distributions calculated for these cores were applied to the current design for an equilibrium low-enriched uranium core. This report details the predicted effects, if any, of shim arm depletion on the shim arm worth, the shutdown margin, power distributions and kinetics parameters.

  10. Depleted uranium as a backfill for nuclear fuel waste package

    SciTech Connect

    Forsberg, Charles W.

    1997-12-01

    A method is described for packaging spent nuclear fuel for long-term disposal in a geological repository. At least one spent nuclear fuel assembly is first placed in an unsealed waste package and a depleted uranium fill material is added to the waste package. The depleted uranium fill material comprises flowable particles having a size sufficient to substantially fill any voids in and around the assembly and contains isotopically-depleted uranium in the +4 valence state in an amount sufficient to inhibit dissolution of the spent nuclear fuel from the assembly into a surrounding medium and to lessen the potential for nuclear criticality inside the repository in the event of failure of the waste package. Last, the waste package is sealed, thereby substantially reducing the release of radionuclides into the surrounding medium, while simultaneously providing radiation shielding and increased structural integrity of the waste package.

  11. Coherent quantum depletion of an interacting atom condensate.

    PubMed

    Kira, M

    2015-03-13

    Sufficiently strong interactions promote coherent quantum transitions in spite of thermalization and losses, which are the adversaries of delicate effects such as reversibility and correlations. In atomic Bose-Einstein condensates (BECs), strong atom-atom interactions can eject atoms from the BEC to the normal component, yielding quantum depletion instead of temperature depletion. A recent experiment has already been verified to overcome losses. Here I show that it also achieves coherent quantum-depletion dynamics in a BEC swept fast enough from weak to strong atom-atom interactions. The elementary coherent process first excites the normal component into a liquid state that evolves into a spherical shell state, where the atom occupation peaks at a finite momentum to shield 50% of the BEC atoms from annihilation. The identified coherent processes resemble ultrafast semiconductor excitations expanding the scope of BEC explorations to many-body non-equilibrium studies.

  12. Coherent quantum depletion of an interacting atom condensate

    PubMed Central

    Kira, M.

    2015-01-01

    Sufficiently strong interactions promote coherent quantum transitions in spite of thermalization and losses, which are the adversaries of delicate effects such as reversibility and correlations. In atomic Bose–Einstein condensates (BECs), strong atom–atom interactions can eject atoms from the BEC to the normal component, yielding quantum depletion instead of temperature depletion. A recent experiment has already been verified to overcome losses. Here I show that it also achieves coherent quantum-depletion dynamics in a BEC swept fast enough from weak to strong atom–atom interactions. The elementary coherent process first excites the normal component into a liquid state that evolves into a spherical shell state, where the atom occupation peaks at a finite momentum to shield 50% of the BEC atoms from annihilation. The identified coherent processes resemble ultrafast semiconductor excitations expanding the scope of BEC explorations to many-body non-equilibrium studies. PMID:25767044

  13. Processable high internal phase Pickering emulsions using depletion attraction

    PubMed Central

    Kim, KyuHan; Kim, Subeen; Ryu, Jiheun; Jeon, Jiyoon; Jang, Se Gyu; Kim, Hyunjun; Gweon, Dae-Gab; Im, Won Bin; Han, Yosep; Kim, Hyunjung; Choi, Siyoung Q.

    2017-01-01

    High internal phase emulsions have been widely used as templates for various porous materials, but special strategies are required to form, in particular, particle-covered ones that have been more difficult to obtain. Here, we report a versatile strategy to produce a stable high internal phase Pickering emulsion by exploiting a depletion interaction between an emulsion droplet and a particle using water-soluble polymers as a depletant. This attractive interaction facilitates the adsorption of particles onto the droplet interface and simultaneously suppresses desorption once the particles are adsorbed. This technique can be universally applied to nearly any kind of particle to stabilize an interface with the help of various non- or weakly adsorbing polymers as a depletant, which can be solidified to provide porous materials for many applications. PMID:28145435

  14. Tuning of depletion interaction in nanoparticle-surfactant systems

    SciTech Connect

    Ray, D.; Aswal, V. K.

    2014-04-24

    The interaction of anionic silica nanoparticles (Ludox LS30) with the non-ionic surfactant decaethylene glycol monododecyl ether (C12E10), without and with the anionic surfactant sodium dodecyl sulfate (SDS), in aqueous electrolyte solution has been studied by small-angle neutron scattering (SANS). The measurements have been carried out for fixed concentrations of nanoparticles (1 wt%), surfactants (1 wt%) and electrolyte (0.1 M NaCl). Each of these nanoparticle-surfactant systems has been examined under different contrast conditions where the individual components (nanoparticle or surfactant) are made visible. It is observed that the nanoparticle-C12E10 system leads to depletion-induced aggregation of the nanoparticles. The system, however, behaves very differently on addition of SDS, where the depletion interaction is suppressed and aggregation of the nanoparticles can be prevented. We show that C12E10 and SDS form mixed micelles and that the charge on these micelles plays an important role in tuning the depletion interaction.

  15. Transient Effects And Pump Depletion In Stimulated Raman Scattering

    NASA Astrophysics Data System (ADS)

    Carlsten, J. L.; Wenzel, R. G.; Druhl, K.

    1983-11-01

    Stimulated rotational Raman scattering in a 300-K multipass cell filled with para-H2 and pumped by a single-mode CO2 laser is studied using a frequency-narrowed optical parametric oscillator (OPO) as a probe laser at the Stokes frequency of the S0(0) transition. Amplification and pump depletion are examined as a function of incident pump energy. The pump depletion shows clear evidence of transient behavior. A theoretical treatment of transient stimulated Raman scattering, including the effects of both pump depletion and medium saturation, is presented. In a first approximation, diffraction effects are neglected and only plane-wave interactions are considered. The theoretical results are compared to the experimental pulse shapes.
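
    For reference, a widely used steady-state, plane-wave description of stimulated Raman scattering with pump depletion (a simplification of the transient treatment in this record; losses and diffraction are neglected, and the symbols belong to this sketch rather than the paper) is

      \[
        \frac{dI_S}{dz} = g_R\, I_P I_S, \qquad
        \frac{dI_P}{dz} = -\frac{\omega_P}{\omega_S}\, g_R\, I_P I_S,
      \]

    where $I_P$ and $I_S$ are the pump and Stokes intensities and $g_R$ is the Raman gain coefficient; the factor $\omega_P/\omega_S$ accounts for the quantum defect. In the undepleted-pump limit the Stokes wave grows as $I_S(z) = I_S(0)\, e^{g_R I_P z}$, while transient effects add a time dependence that this steady-state form does not capture.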

  16. Programmable nanometer-scale electrolytic metal deposition and depletion

    DOEpatents

    Lee, James Weifu [Oak Ridge, TN]; Greenbaum, Elias [Oak Ridge, TN]

    2002-09-10

    A method of nanometer-scale deposition of a metal onto a nanostructure includes the steps of: providing a substrate having thereon at least two electrically conductive nanostructures spaced no more than about 50 µm apart; and depositing metal on at least one of the nanostructures by electric field-directed, programmable, pulsed electrolytic metal deposition. Moreover, a method of nanometer-scale depletion of a metal from a nanostructure includes the steps of providing a substrate having thereon at least two electrically conductive nanostructures spaced no more than about 50 µm apart, at least one of the nanostructures having a metal disposed thereon; and depleting at least a portion of the metal from the nanostructure by electric field-directed, programmable, pulsed electrolytic metal depletion. A bypass circuit enables ultra-finely controlled deposition.

  17. Depleted uranium as a backfill for nuclear fuel waste package

    DOEpatents

    Forsberg, Charles W.

    1998-01-01

    A method for packaging spent nuclear fuel for long-term disposal in a geological repository. At least one spent nuclear fuel assembly is first placed in an unsealed waste package and a depleted uranium fill material is added to the waste package. The depleted uranium fill material comprises flowable particles having a size sufficient to substantially fill any voids in and around the assembly and contains isotopically-depleted uranium in the +4 valence state in an amount sufficient to inhibit dissolution of the spent nuclear fuel from the assembly into a surrounding medium and to lessen the potential for nuclear criticality inside the repository in the event of failure of the waste package. Last, the waste package is sealed, thereby substantially reducing the release of radionuclides into the surrounding medium, while simultaneously providing radiation shielding and increased structural integrity of the waste package.

  18. International aspects of restrictions of ozone-depleting substances

    SciTech Connect

    McDonald, S.C.

    1989-10-01

    This report summarizes international efforts to protect stratospheric ozone. Also included in this report is a discussion of activities in other countries to meet restrictions in the production and use of ozone-depleting substances. Finally, there is a brief presentation of trade and international competitiveness issues relating to the transition to alternatives for the regulated chlorofluorocarbons (CFCs) and halons. The stratosphere knows no international borders. Just as the impact of reduced stratospheric ozone will be felt internationally, so protection of the ozone layer is properly an international effort. Unilateral action, even by a country that produces and uses large quantities of ozone-depleting substances, will not remedy the problem of ozone depletion if other countries do not follow suit. 32 refs., 7 tabs.

  19. Chemical Laser Computer Code Survey,

    DTIC Science & Technology

    1980-12-01

    DOCUMENTATION: Resonator Geometry Synthesis Code Requirement (V. L. Gamiz); Incorporate General Resonator into Ray Trace Code (W. H. Southwell); Synthesis Code Development (L. R. Stidhm); Optimization Algorithms and Equations. The remainder of the record is OCR residue of a survey table of code categories (optics, kinetics, gasdynamics) and model levels (e.g., simple Fabry-Perot, simple saturated gain).

  20. On quantum codes obtained from cyclic codes over A2

    NASA Astrophysics Data System (ADS)

    Dertli, Abdullah; Cengellenmis, Yasemin; Eren, Senol

    2015-05-01

    In this paper, quantum codes from cyclic codes over A2 = F2 + uF2 + vF2 + uvF2, u² = u, v² = v, uv = vu, for arbitrary length n have been constructed. It is shown that if C is self-orthogonal over A2, then so is Ψ(C), where Ψ is a Gray map. A necessary and sufficient condition for a cyclic code over A2 to contain its dual has also been given. Finally, the parameters of quantum error-correcting codes are obtained from cyclic codes over A2.

  1. Code stroke in Asturias.

    PubMed

    Benavente, L; Villanueva, M J; Vega, P; Casado, I; Vidal, J A; Castaño, B; Amorín, M; de la Vega, V; Santos, H; Trigo, A; Gómez, M B; Larrosa, D; Temprano, T; González, M; Murias, E; Calleja, S

    2016-04-01

    Intravenous thrombolysis with alteplase is an effective treatment for ischaemic stroke when applied during the first 4.5 hours, but less than 15% of patients have access to this technique. Mechanical thrombectomy is more frequently able to recanalise proximal occlusions in large vessels, but the infrastructure it requires makes it even less available. We describe the implementation of code stroke in Asturias, as well as the process of adapting various existing resources for urgent stroke care in the region. By considering these resources, and the demographic and geographic circumstances of our region, we examine ways of reorganising the code stroke protocol that would optimise treatment times and provide the most appropriate treatment for each patient. We distributed the 8 health districts in Asturias so as to permit referral of candidates for reperfusion therapies to either of the 2 hospitals with 24-hour stroke units and on-call neurologists and providing IV fibrinolysis. Hospitals were assigned according to proximity and stroke severity; the most severe cases were immediately referred to the hospital with on-call interventional neurology care. Patient triage was provided by pre-hospital emergency services according to the NIHSS score. Modifications to code stroke in Asturias have allowed us to apply reperfusion therapies with good results, while emphasising equitable care and managing the severity-time ratio to offer the best and safest treatment for each patient as soon as possible. Copyright © 2015 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  2. Prevalence of vitamin B12 depletion and deficiency in Liechtenstein.

    PubMed

    Koenig, Victoria; Stanga, Zeno; Zerlauth, Manfred; Bernasconi, Luca; Risch, Martin; Huber, Andreas; Risch, Lorenz

    2014-02-01

    Data about vitamin B12 (B12) deficiency in the general population are scarce. The present study was performed to determine the prevalence of B12 deficiency in the general population of the Principality of Liechtenstein, as well as to identify sub-populations potentially at high risk. Retrospective study. Ambulatory setting, population of the Principality of Liechtenstein. Seven thousand four hundred and twenty-four patients seeking medical attention whose serum samples were referred for routine work-up in an ambulatory setting were consecutively enrolled. Serum total B12 was determined in all patients in this cohort. In addition, for a subgroup of 1328 patients, serum holotranscobalamin was also measured. Prevalence of B12 deficiency was calculated. Further, multivariate logistical regression models were applied to identify covariates independently associated with B12 deficiency and depletion. Nearly 8% of the general population was suffering from either B12 depletion or deficiency. The ratio between B12 depletion and deficiency was 2:1 for all age ranges. Pathological changes were detected predominantly in older people. Female gender was a significant predictor of B12 depletion. In the cohort, nearly 40% exhibited either depletion or deficiency of B12. B12 depletion and deficiency are common in Liechtenstein, a Central European country. The measurement of biochemical markers represents a cost-efficient and valid assessment of the B12 state. When a deficiency of B12 is diagnosed at an early stage, many cases can be treated or prevented, with beneficial effects on individual outcomes and subsequent potential reductions in health-care costs.

  3. Auxin-inducible protein depletion system in fission yeast.

    PubMed

    Kanke, Mai; Nishimura, Kohei; Kanemaki, Masato; Kakimoto, Tatsuo; Takahashi, Tatsuro S; Nakagawa, Takuro; Masukata, Hisao

    2011-02-11

    Inducible inactivation of a protein is a powerful approach for analysis of its function within cells. Fission yeast is a useful model for studying fundamental mechanisms such as chromosome maintenance and the cell cycle. However, previously published strategies for protein depletion are successful only for some proteins under some specific conditions and still do not achieve depletion efficient enough to cause acute phenotypes such as immediate cell cycle arrest. The aim of this work was to construct a useful and powerful protein-depletion system in Schizosaccharomyces pombe. We constructed an auxin-inducible degron (AID) system, which utilizes auxin-dependent poly-ubiquitination of Aux/IAA proteins by SCF(TIR1) in plants, in fission yeast. Although expression of a plant F-box protein, TIR1, decreased the level of Mcm4-aid, a component of the MCM complex essential for DNA replication tagged with the Aux/IAA peptide, this depletion did not result in an evident growth defect. We successfully improved the degradation efficiency of Mcm4-aid by fusing TIR1 with fission yeast Skp1, a conserved F-box-interacting component of SCF (the improved-AID system; i-AID), and the cells showed a severe growth defect. The i-AID system induced degradation of Mcm4-aid in the chromatin-bound MCM complex as well as in soluble fractions. Using the i-AID system in conjunction with transcription repression (the off-AID system), we achieved more efficient depletion of other proteins, including Pol1 and Cdc45, causing early S-phase arrest. Improvement of the AID system allowed us to construct conditional null mutants of S. pombe. We propose that the off-AID system is a powerful method for in vivo protein depletion in fission yeast.

  4. Erythrocyte depletion from bone marrow: performance evaluation after 50 clinical-scale depletions with Spectra Optia BMC.

    PubMed

    Kim-Wanner, Soo-Zin; Bug, Gesine; Steinmann, Juliane; Ajib, Salem; Sorg, Nadine; Poppe, Carolin; Bunos, Milica; Wingenfeld, Eva; Hümmer, Christiane; Luxembourg, Beate; Seifried, Erhard; Bonig, Halvard

    2017-08-11

    Red blood cell (RBC) depletion is a standard graft manipulation technique for ABO-incompatible bone marrow (BM) transplants. The BM processing module for Spectra Optia, "BMC", was previously introduced. Here we report the largest series to date of routine quality data, from 50 clinical-scale RBC-depletions. Fifty successive RBC-depletions from autologous (n = 5) and allogeneic (n = 45) BM transplants were performed with the Spectra Optia BMC apheresis suite. Product quality was assessed before and after processing for volume, RBC and leukocyte content; RBC-depletion and stem cell (CD34+ cell) recovery were calculated therefrom. Clinical engraftment data were collected from 26/45 allogeneic recipients. Median RBC removal was 98.2% (range 90.8-99.1%), median CD34+ cell recovery was 93.6% (minimum 72%), and total product volume was reduced to 7.5% (range 4.7-23.0%) of the starting volume. Products engrafted with the expected probability and kinetics. Performance indicators were stable over time. Spectra Optia BMC is a robust and efficient technology for RBC-depletion and volume reduction of BM, providing near-complete RBC removal and excellent CD34+ cell recovery.
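
    The two performance metrics quoted above are simple ratios of pre- and post-processing counts. A minimal sketch, with made-up counts chosen only to illustrate the arithmetic:

      def rbc_depletion_pct(rbc_before: float, rbc_after: float) -> float:
          """Percentage of red blood cells removed by the processing run."""
          return 100.0 * (1.0 - rbc_after / rbc_before)

      def cd34_recovery_pct(cd34_before: float, cd34_after: float) -> float:
          """Percentage of CD34+ cells retained in the processed product."""
          return 100.0 * cd34_after / cd34_before

      # Hypothetical example counts (not from the paper):
      print(rbc_depletion_pct(250.0, 4.5))    # 98.2% removal
      print(cd34_recovery_pct(320.0, 300.0))  # 93.75% recovery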

  5. Noiseless Coding Of Magnetometer Signals

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1989-01-01

    Report discusses application of noiseless data-compression coding to digitized readings of spaceborne magnetometers for transmission back to Earth. The objective of such coding is to increase efficiency by decreasing the rate of transmission without sacrificing the integrity of the data. Adaptive coding compresses the data by factors ranging from 2 to 6.
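
    To make the idea concrete, the sketch below shows a plain Golomb-Rice encoder applied to first-order prediction residuals. It is a generic illustration of noiseless coding of slowly varying sensor data, not the adaptive algorithm described in the report; the parameter k = 2 and the 16-bit raw-sample assumption are arbitrary choices.

      def zigzag(x: int) -> int:
          """Map signed residuals to non-negative ints: 0, -1, 1, -2, 2 -> 0, 1, 2, 3, 4."""
          return 2 * x if x >= 0 else -2 * x - 1

      def rice_encode(value: int, k: int) -> str:
          """Golomb-Rice code with parameter k: unary quotient, then k-bit remainder."""
          q, r = value >> k, value & ((1 << k) - 1)
          return "1" * q + "0" + format(r, f"0{k}b")

      def encode_samples(samples, k=2):
          """First-order prediction (delta) followed by Rice coding of the residuals."""
          bits, prev = [], 0
          for s in samples:
              bits.append(rice_encode(zigzag(s - prev), k))
              prev = s
          return "".join(bits)

      stream = encode_samples([100, 101, 99, 100, 102], k=2)
      # Compare against raw transmission, assuming 16-bit samples.
      print(stream, len(stream), "bits vs", 5 * 16, "raw bits")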

  6. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory is generated from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  7. Improved code-tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T.

    1980-01-01

    Delay-locked loop tracks pseudonoise codes without introducing dc timing errors, because it is not sensitive to gain imbalance between signal-processing arms. "Early" and "late" reference codes pass in combined form through both arms, and each arm acts on both codes. Circuit accommodates input signals 1 dB weaker with tracking ability equal to that of tau-dither loops.
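
    For orientation, the sketch below implements a conventional non-coherent early-late discriminator for PN-code tracking. It illustrates the general early/late principle only and is not the combined-arm, dc-error-free architecture described in the report; the sequence length, samples per chip, and half-chip spacing are arbitrary choices.

      import numpy as np

      def pn_waveform(n_chips: int = 127, sps: int = 4, seed: int = 1) -> np.ndarray:
          """+/-1 pseudonoise chips as rectangular pulses, sps samples per chip (illustrative)."""
          rng = np.random.default_rng(seed)
          chips = rng.choice([-1.0, 1.0], size=n_chips)
          return np.repeat(chips, sps)

      def early_late_error(received: np.ndarray, local: np.ndarray, d: int = 2) -> float:
          """Normalized non-coherent early-late discriminator.

          Zero at lock; the sign tells the loop which way to slew the local code."""
          e = np.dot(received, np.roll(local, -d))
          l = np.dot(received, np.roll(local, +d))
          return (e**2 - l**2) / (e**2 + l**2 + 1e-12)

      w = pn_waveform()
      print(early_late_error(np.roll(w, 0), w))  # locked: 0
      print(early_late_error(np.roll(w, 1), w))  # one sample off: nonzero error signal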

  8. Validation of the BEPLATE code

    SciTech Connect

    Giles, G.E.; Bullock, J.S.

    1997-11-01

    The electroforming simulation code BEPLATE (Boundary Element-PLATE) has been developed and validated for specific applications at Oak Ridge. New areas of application are opening up and more validations are being performed. This paper reports the validation experience of the BEPLATE code on two types of electroforms and describes some recent applications of the code.

  9. Coding Major Fields of Study.

    ERIC Educational Resources Information Center

    Bobbitt, L. G.; Carroll, C. D.

    The National Center for Education Statistics conducts surveys which require the coding of the respondent's major field of study. This paper presents a new system for the coding of major field of study. It operates on-line in a Computer Assisted Telephone Interview (CATI) environment and allows conversational checks to verify coding directly from…

  10. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  12. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  14. Quantum Codes From Cyclic Codes Over The Ring R2

    NASA Astrophysics Data System (ADS)

    Altinel, Alev; Güzeltepe, Murat

    2016-10-01

    Let R_2 denote the ring F_2 + μF_2 + υF_2 + μυF_2 + wF_2 + μwF_2 + υwF_2 + μυwF_2. In this study, we construct quantum codes from cyclic codes over the ring R_2, for arbitrary length n, with the restrictions μ^2 = 0, υ^2 = 0, w^2 = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R_2 to contain their duals. Finally, we obtain the parameters of quantum error-correcting codes from cyclic codes over R_2 and give an example of such codes.
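
    The relations above describe the ring F_2[μ, υ, w] with μ^2 = υ^2 = w^2 = 0, so elements live on the eight monomials 1, μ, υ, μυ, w, μw, υw, μυw. The sketch below is not from the paper; it simply makes the multiplication rules concrete by encoding each monomial as a 3-bit mask over the generators.

      from itertools import product

      # Elements of R2 = F2[mu, v, w] / (mu^2, v^2, w^2) as frozensets of monomials;
      # each monomial is a 3-bit mask over the generators (mu, v, w).
      MU, V, W = 1, 2, 4

      def add(x: frozenset, y: frozenset) -> frozenset:
          """Addition over F2 is the symmetric difference of monomial sets."""
          return x ^ y

      def mul(x: frozenset, y: frozenset) -> frozenset:
          """Multiply term by term; any repeated generator kills the term (mu^2 = v^2 = w^2 = 0)."""
          acc = set()
          for a, b in product(x, y):
              if a & b:           # shared generator -> squared -> zero in R2
                  continue
              acc ^= {a | b}      # coefficients live in F2, so duplicate terms cancel
          return frozenset(acc)

      one = frozenset({0})
      mu, v, w = frozenset({MU}), frozenset({V}), frozenset({W})

      print(mul(mu, mu))                      # frozenset() -> mu^2 = 0
      print(mul(mu, v) == mul(v, mu))         # True -> commutativity
      print(mul(add(one, mu), add(one, mu)))  # (1 + mu)^2 = 1, since 2*mu = 0 and mu^2 = 0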

  15. Structured error recovery for code-word-stabilized quantum codes

    NASA Astrophysics Data System (ADS)

    Li, Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-01

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.
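
    A rough back-of-the-envelope count conveys the scale of the reduction: the number of n-qubit Pauli errors of weight at most t is a sum of binomials times 3^j, and dividing it by the abstract's ~3^t factor gives only the order of magnitude of the grouped measurement count, not the paper's exact figure. The values of n and t below are arbitrary.

      from math import comb

      def pauli_errors_up_to_weight(n: int, t: int) -> int:
          """Number of n-qubit Pauli errors of weight at most t (3 nontrivial Paulis per qubit)."""
          return sum(comb(n, j) * 3**j for j in range(t + 1))

      n, t = 10, 2
      brute_force = pauli_errors_up_to_weight(n, t)  # one measurement per error pattern
      grouped = brute_force // 3**t                  # ~3^t-fold reduction claimed in the abstract
      print(brute_force, grouped)                    # 436 vs ~48 for this toy case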

  16. The random coding bound is tight for the average code.

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.

    1973-01-01

    The random coding bound of information theory provides a well-known upper bound to the probability of decoding error for the best code of a given rate and block length. The bound is constructed by upper-bounding the average error probability over an ensemble of codes. The bound is known to give the correct exponential dependence of error probability on block length for transmission rates above the critical rate, but it gives an incorrect exponential dependence at rates below a second lower critical rate. Here we derive an asymptotic expression for the average error probability over the ensemble of codes used in the random coding bound. The result shows that the weakness of the random coding bound at rates below the second critical rate is due not to upper-bounding the ensemble average, but rather to the fact that the best codes are much better than the average at low rates.
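
    For reference, the bound under discussion is usually written in the following textbook form, for a discrete memoryless channel P(j|k) with input distribution Q; this is the conventional statement in Gallager's standard notation, not a formula reproduced from this article:

      \bar{P}_e \le e^{-n E_r(R)}, \qquad
      E_r(R) = \max_{0 \le \rho \le 1} \bigl[ E_0(\rho) - \rho R \bigr], \qquad
      E_0(\rho) = -\ln \sum_{j} \Bigl[ \sum_{k} Q(k)\, P(j \mid k)^{1/(1+\rho)} \Bigr]^{1+\rho}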

  17. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, new families of QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes and their primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  18. Structured error recovery for code-word-stabilized quantum codes

    SciTech Connect

    Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-15

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  19. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R = 0.82 and 0.875, with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures which allow for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which results in power and size benefits. These codes also have a large minimum distance, as large as d_min = 65, giving them powerful error-correcting capabilities and very low error floors. This paper will present the development of the LDPC flight encoder and decoder, its applications, and status.
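
    The decoding side of such codes is typically iterative processing over the parity-check matrix. As a minimal sketch of that family of algorithms, the following hard-decision bit-flipping decoder runs on a tiny toy parity-check matrix; the matrix, iteration limit, and example error are arbitrary choices and bear no relation to the EG-based flight codes described in the paper.

      import numpy as np

      # Toy parity-check matrix (a Hamming(7,4) code), used only to illustrate
      # hard-decision bit flipping; the flight codes are far larger EG-based LDPC codes.
      H = np.array([[1, 1, 0, 1, 1, 0, 0],
                    [1, 0, 1, 1, 0, 1, 0],
                    [0, 1, 1, 1, 0, 0, 1]], dtype=int)

      def bit_flip_decode(r: np.ndarray, H: np.ndarray, max_iters: int = 10) -> np.ndarray:
          """Gallager-style hard-decision bit flipping: repeatedly flip the bit that
          participates in the most unsatisfied parity checks."""
          c = r.copy()
          for _ in range(max_iters):
              syndrome = H @ c % 2
              if not syndrome.any():
                  break                           # all checks satisfied
              unsatisfied = syndrome @ H          # per-bit count of failed checks
              c[np.argmax(unsatisfied)] ^= 1      # flip the worst offender
          return c

      codeword = np.zeros(7, dtype=int)            # the all-zero word is always a codeword
      received = codeword.copy(); received[2] = 1  # inject a single bit error
      print(bit_flip_decode(received, H))          # -> all zeros restored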

  20. Ozone depletion. (Latest citations from the NTIS database). Published Search

    SciTech Connect

    Not Available

    1993-04-01

    The bibliography contains citations concerning studies of atmospheric chemistry and modeling of ozone depletion in Antarctica, and the consequences of the depletion for ultraviolet radiation levels. The studies involve chemical reactions in the atmosphere, including temperature dynamics, possible changes in solar insolation, and the effects of pollution from nitrogen, chlorofluorocarbons, carbon dioxide, and methane. The studies include references to observations of the ozonosphere and modeling of interactions worldwide, together with data on the sources of natural and man-made pollutants. (Contains a minimum of 173 citations and includes a subject term index and title list.)